Compare commits

189 Commits

Author SHA1 Message Date
Todd Gamblin
e3fc937b32 WIP 2023-12-14 14:27:59 -08:00
Todd Gamblin
977f6ce65c solver: refactor transforms in condition generation
- [x] allow caller of `condition()` to pass lists of transforms
- [x] all transform functions now take trigger *and* effect as parameters
- [x] add some utility functions to simplify `condition()`
2023-12-14 13:50:41 -08:00
Todd Gamblin
a690b8c27c Improve parsing of quoted flags and variants in specs (#41529)
This PR does several things:

- [x] Allow any character to appear in the quoted values of variants and flags.
- [x] Allow easier passing of quoted flags on the command line, e.g. `cflags="-O2 -g"`.
- [x] Handle quoting better in spec output, using single quotes around double 
      quotes and vice versa.
- [x] Disallow spaces around `=` and `==` when parsing variants and flags.

## Motivation

This PR is motivated by the issues above and by ORNL's 
[tips for launching at scale on Frontier](https://docs.olcf.ornl.gov/systems/frontier_user_guide.html#tips-for-launching-at-scale).
ORNL recommends using `sbcast --send-libs` to broadcast executables and their
libraries to compute nodes when running large jobs (e.g., 80k ranks). For an
executable named `exe`, `sbcast --send-libs` stores the needed libraries in a
directory alongside the executable called `exe_libs`. ORNL recommends pointing
`LD_LIBRARY_PATH` at that directory so that `exe` will find the local libraries and
not overwhelm the filesystem.

There are other ways to mitigate this problem:
* You could build with `RUNPATH` using `spack config add config:shared_linking:type:runpath`,
  which would make `LD_LIBRARY_PATH` take precedence over Spack's `RUNPATHs`.
  I don't recommend this one because `RUNPATH` can cause many other things to go wrong.
* You could use `spack config add config:shared_linking:bind:true`, added in #31948, which
  will greatly reduce the filesystem load for large jobs by pointing `DT_NEEDED` entries in
  ELF *directly* at the needed `.so` files instead of relying on `RPATH` search via soname.
  I have not experimented with this at 80,000 ranks, but it should help quite a bit.
* You could use [Spindle](https://github.com/hpc/Spindle) (as LLNL does on its machines)
  which should transparently fix this without any changes to your executable and without
  any need to use `sbcast` or other tools.

But we want to support the `sbcast` use case as well.

## `sbcast` and Spack

Spack's `RPATHs` break the `sbcast` fix because they're considered with higher precedence
than `LD_LIBRARY_PATH`. So Spack applications will still end up hitting the shared filesystem
when searching for libraries. We can avoid this by injecting some `ldflags` into the build, e.g.,
if we were going to launch, say, `LAMMPS` at scale, we could add another `RPATH`
specifically for use with `sbcast`:

    spack install lammps ldflags='-Wl,-rpath=$ORIGIN/lmp_libs'

This will put the `lmp_libs` directory alongside `LAMMPS`'s `lmp` executable first in the
`RPATH`, so it will be searched before any directories on the shared filesystem.

## Issues with quoting

Before this PR, the command above would've errored out for two reasons:

1. `$` wasn't an allowed character in our spec parser.
2. You would've had to double quote the flags to get them to pass through correctly:

       spack install lammps ldflags='"-Wl,-rpath=$ORIGIN/lmp_libs"'

This is ugly and I don't think many users will easily figure it out. The behavior was added in
#29282, and it improved parsing of specs passed as a single string, e.g.:

    spack install 'lammps ldflags="-Wl,-rpath=$ORIGIN/lmp_libs"'

but a lot of users are naturally going to try to quote arguments *directly* on the command
line, without quoting their entire spec. #29282 used a heuristic to detect unquoted flags
and warn the user, but the warning could be confusing. In particular, if you wrote
`cflags="-O2 -g"` on the command line, it would break the flags up, warn, and tell you
that you could fix the issue by writing `cflags="-O2 -g"` even though you just wrote
that. It's telling you to *quote* that value, but the user has to know to double quote.

## New heuristic for quoted arguments from the CLI

There are only two places where we allow arbitrary quoted strings in specs: flags and
variant values, so this PR adds a simpler heuristic to the CLI parser: if an argument in
`sys.argv` starts with `name=...`, then we assume the whole argument is quoted.
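
As a rough illustration, here is a minimal sketch of such a heuristic (the helper and names are hypothetical, not Spack's actual parser code). By the time arguments reach `sys.argv`, the shell has already stripped the user's quotes, so the trick is to re-quote any whole `name=value` argument:

```python
import re

# Sketch only: re-quote whole `name=value` CLI arguments so the spec parser
# sees each one as a single quoted token.
NAME_EQ = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_-]*==?")

def requote_kv_args(argv):
    out = []
    for arg in argv:
        match = NAME_EQ.match(arg)
        if match and " " in arg:
            prefix, value = match.group(0), arg[match.end():]
            quote = "'" if "'" not in value else '"'
            out.append(f"{prefix}{quote}{value}{quote}")
        else:
            out.append(arg)
    return out

# The shell turns cflags="-O2 -g" into the single argv entry 'cflags=-O2 -g':
print(requote_kv_args(["install", "bzip2", "cflags=-O2 -g"]))
# ['install', 'bzip2', "cflags='-O2 -g'"]
```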

This means you can write:

    spack install bzip2 cflags="-O2 -g"

directly on the command line, without multiple levels of quoting. This also works:

    spack install 'bzip2 cflags="-O2 -g"'

The only place where this heuristic runs into ambiguity is if you attempt to pass
anonymous specs that start with `name=...` as one large string; e.g., this will be
interpreted as one large flag value:

    spack find 'cflags="-O2 -g" ~bar +baz'

This sets `cflags` to `"-O2 -g" ~bar +baz`, which is likely not what you wanted. You
can fix this easily by either removing the quotes:

    spack find cflags="-O2 -g" ~bar +baz

Or by adding a space at the start, which has the same effect:

    spack find ' cflags="-O2 -g" ~bar +baz'

You may wonder why we don't just look for quotes inside of flag arguments, and the
reason is that you *might* want them there.  If you are passing arguments like:

    spack install zlib cppflags="-D DEBUG_MSG1='quick fox' -D DEBUG_MSG2='lazy dog'"

You *need* the quotes there. So we've opted for one potentially confusing but easily
fixed outcome rather than limiting what you can put in your quoted strings.

## Quotes in formatted spec output

In addition to being more lenient about characters accepted in quoted strings, this PR fixes
up spec formatting a bit. We now format quoted strings in specs with single quotes, unless
the string has a single quote in it, in which case we JSON-escape the string (i.e., we add
`\` before `"` and `\`).  

    zlib cflags='-D FOO="bar"'
    zlib cflags="-D FOO='bar'"
    zlib cflags="-D FOO='bar' BAR=\"baz\""
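
A minimal sketch of that quoting rule (hypothetical function name; the real logic lives in Spack's spec-formatting code):

```python
def quote_value(value: str) -> str:
    """Sketch of the rule described above, not Spack's actual implementation."""
    if "'" not in value:
        return f"'{value}'"  # default: wrap in single quotes
    # value contains a single quote: JSON-escape (backslash before " and \)
    escaped = value.replace("\\", "\\\\").replace('"', '\\"')
    return f'"{escaped}"'

print(quote_value('-D FOO="bar"'))              # '-D FOO="bar"'
print(quote_value("-D FOO='bar'"))              # "-D FOO='bar'"
print(quote_value("-D FOO='bar' BAR=\"baz\""))  # "-D FOO='bar' BAR=\"baz\""
```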
2023-12-13 16:36:22 -08:00
yizeyi18
a1fa862c3f camp: fixing build issue (#41400)
* adding necessary headers, to fix https://github.com/spack/spack/issues/41398

* deleting something imported by accident

* [@spackbot] updating style on behalf of yizeyi18

* undo commit 7688fed according to suggestion from @msimberg

* patching camp@:2022.10.1 for compatibility with gcc-13

* adding the patch

* fixing paths in the patch

* [@spackbot] updating style on behalf of yizeyi18

* Update camp patch using LLNL/camp@05e1c35

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* changing patch name

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-12-13 16:28:57 -08:00
Harmen Stoppels
80f31829a8 python: don't run mkdirp in setup_dependent_package (#41603)
`setup_dependent_package` is not a build phase, it should just set
globals for a package.

It's called during setup of the runtime environment of packages, and there
have been reports of it actually failing due to a read-only file system
(not sure under what exact conditions that is possible).
2023-12-13 23:54:28 +01:00
Arne Becker
84436f10ba perl-config-tiny: New package (#41584) 2023-12-13 14:43:03 -07:00
Arne Becker
660485709d perl-b-cow: New package (#41596) 2023-12-13 12:06:14 -08:00
Arne Becker
251dce05c9 perl-throwable: New package (#41597) 2023-12-13 12:05:17 -08:00
Arne Becker
ecd05fdfb4 perl-scope-guard: New package (#41598) 2023-12-13 12:02:50 -08:00
Arne Becker
9ffcf36444 perl-test-sharedfork: New package (#41599) 2023-12-13 12:01:54 -08:00
Arne Becker
07258a7c80 perl-safe-isa: New package (#41600) 2023-12-13 12:01:10 -08:00
Arne Becker
395e53a5e0 perl-proc-processtable: New package (#41601) 2023-12-13 11:59:05 -08:00
Arne Becker
77c331c753 perl-net-ip: New package (#41606) 2023-12-13 11:57:17 -08:00
Arne Becker
ee5481a861 perl-any-uri-escape: New package (#41607) 2023-12-13 11:56:11 -08:00
Arne Becker
f7ec061c64 perl-term-table: New package (#41608) 2023-12-13 11:47:08 -08:00
Arne Becker
7cb70ff4b1 perl-test-pod: New package (#41609) 2023-12-13 11:45:59 -08:00
Arne Becker
4a661f3255 perl-spiffy: New package (#41610) 2023-12-13 11:44:36 -08:00
Arne Becker
7037240879 perl-ipc-system-simple: New package (#41611) 2023-12-13 11:43:21 -08:00
Arne Becker
0e96dfaeef perl-mime-types: New package (#41612) 2023-12-13 11:41:48 -08:00
Arne Becker
a0a2cd6a1a perl-convert-nls-date-format: New package (#41613) 2023-12-13 11:39:11 -08:00
Arne Becker
170c05bebb perl-module-pluggable: New package (#41614) 2023-12-13 11:04:27 -08:00
Arne Becker
bdf68b7ac0 perl-email-date-format: New package (#41617) 2023-12-13 11:03:24 -08:00
Arne Becker
c176de94e2 perl-heap: New package (#41618) 2023-12-13 11:00:55 -08:00
Harmen Stoppels
f63dbbe75d spack mirror create --all: include patches (#41579) 2023-12-13 20:00:44 +01:00
Arne Becker
a0c7b10c76 perl-log-any: New package (#41619) 2023-12-13 10:59:54 -08:00
Arne Becker
2dc3bf0164 perl-http-cookiejar: New package (#41620) 2023-12-13 10:57:55 -08:00
Harmen Stoppels
9bf6e05d02 Revert "[protobuf] New versions, explicit cxxstd variant (#41459)" (#41635)
This reverts commit b82bd8e6b6.
2023-12-13 15:02:25 +01:00
Massimiliano Culpo
cd283846af mysql: add v8.0.35, fix build (#41602) 2023-12-13 12:10:33 +01:00
Taillefumier Mathieu
03625c1c95 Add pic variant when building the library (#41631)
* Add pic variant when building the library

* make pretty

* Probably better approach
2023-12-13 01:38:13 -07:00
Wouter Deconinck
f01774f1d4 hepmc3: fix from_variant -> self.define (#41605)
* hepmc3: fix from_variant -> self.define
* hepmc3: str on versions
2023-12-13 07:02:48 +01:00
Christopher Christofi
965860d1f8 perl-getopt-argvfile: add new package with version 1.11 (#41625) 2023-12-12 20:33:25 -07:00
Christopher Christofi
c4baf4e199 perl-parselex: add new package with version 2.21 (#41626) 2023-12-12 20:08:28 -07:00
Aiden Grossman
dd82227ae7 Remove MCT license annotation (#41593)
This license annotation is currently invalid as it specifies a URL
rather than an SPDX expression. Remove it for now until we have a
consensus on how to represent this case.
2023-12-12 17:52:42 -07:00
fpruvost
a9028630a5 pastix: new release v6.3.2 (#41585) 2023-12-12 15:57:11 -07:00
Arne Becker
789c85ed8b perl-clone-pp: New package (#41586) 2023-12-12 15:46:25 -07:00
James Taliaferro
cf9d36fd64 kakoune: add v2023.08.05 (#41443) 2023-12-12 23:02:20 +01:00
Arne Becker
ef7ce46649 perl-ipc-run3: New package (#41583) 2023-12-12 14:29:30 -07:00
pabloaledo
334a50662f Update bioconductor packages (#41227)
Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-12-12 22:04:45 +01:00
Dom Heinzeller
d68e73d006 New package Model Coupling Toolkit (MCT) (#41564)
* New package Model Coupling Toolkit (MCT)
* Remove ~mpi variant from mct, build is not working correctly
* Remove boilerplate stuff from var/spack/repos/builtin/packages/mct/package.py
2023-12-12 11:26:57 -08:00
Adam J. Stewart
7d7f097295 py-pyvista: add v0.42.3 (#41246) 2023-12-12 10:58:15 -08:00
Massimiliano Culpo
37cdcc7172 mysql: fix issue when using old core API call (#41573)
MySQL was performing a core API call to `Spec.flat_dependencies`
when setting up the build environment. This function is an
implementation detail of the old concretizer, where multiple nodes
from the same package are not allowed.

This PR uses a more idiomatic way to check if "python" is
in the DAG.

For reference, see #11356 to check why the call was introduced.
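
For illustration, a check along these lines would be the idiomatic form (a sketch; `"python" in spec` is a supported `Spec` membership test, but the exact code in the PR may differ, and the environment variable name below is made up):

```python
def setup_build_environment(self, env):
    # Ask the concretized DAG directly whether python is a node, instead of
    # calling the old-concretizer-only Spec.flat_dependencies().
    if "python" in self.spec:
        python_prefix = self.spec["python"].prefix
        env.set("MYSQL_PYTHON_PREFIX", str(python_prefix))  # hypothetical variable
```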
2023-12-12 10:40:31 -08:00
Thomas Madlener
0a40bb72e8 genfit: Add latest tags and update root dependency (#41572) 2023-12-12 10:30:06 -08:00
James Beal
24b6edac89 bowtie2 add latest version (#41580)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-12 10:23:17 -08:00
Arne Becker
3e7acf3e61 perl-type-tiny: New package (#41582) 2023-12-12 10:21:56 -08:00
Harmen Stoppels
ede36512e7 gcc: simplify patch when range (#41587) 2023-12-12 10:03:41 -07:00
jmuddnv
e06b169720 NVIDIA HPC SDK: add v23.11 (#41125) 2023-12-12 17:40:53 +01:00
Stephen Sachs
7ed968d42c clingo-bootstrap: use new Spack API for environment modifications (#41574) 2023-12-12 17:28:15 +01:00
Cameron Rutherford
c673b9245c exago: Add v1.2.0 and patches for builds without python or tests. (#41350)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-12-12 10:17:26 -06:00
Mikael Simberg
27c0dab5ca fmt: Add patch to allow compilation with clang in CUDA mode (#41578) 2023-12-12 08:33:44 -07:00
Chris Green
b82bd8e6b6 [protobuf] New versions, explicit cxxstd variant (#41459)
* [protobuf] New versions, explicit cxxstd variant
* New versions 3.15.8, 3.25.0, 3.25.1.
* New explicit variant `cxxstd` with support for older Protobuf
  versions.
* Support testing.
* Use Protobuf's `protobuf_BUILD_SHARED_LIBS` instead of
  `BUILD_SHARED_LIBS`.
* Support building with LLVM/Clang's `libc++`.
* Address audit issue
* Variant default does not honor `when` clause
* Use `self.spec.satisfies()` instead of `with when()`
* Fix silliness; improve consistency
* Today was apparently a go-back-to-bed day
2023-12-11 22:08:40 -07:00
Sreenivasa Murthy Kolam
5351382501 Bump up the version for ROCm-5.7.0 and ROCm-5.7.1 releases. (#40724)
* initial commit for rocm-5.7.0 and 5.7.1 releases
* bump up the version for 5.7.0 and 5.7.1 releases
* update recipes to support 5.7.0 and 5.7.1 releases
* bump up the version for ROCm 5.7.0 and ROCm-5.7.1 releases
* bump up the version for composable-kernel amd miopen-hip
* fix style errors
* fix style errors in hip etc
* renaming composable-kernel recipe
* changes for composable_kernel
* Revert "renaming composable-kernel recipe"
  This reverts commit 0cf6c6debf.
* Revert "changes for composable_kernel"
  This reverts commit 05272a10a7.
* bump up the version for hiprand
* using the checksum for hiprand-5.7.1
* bump up the version for 5.7.0 and 5.7.1 releases
* fix style errors
* fix merge conflicts with the develop.
* temp workaround for the error seen with rocm-5.7.0 when trying
  to generate the dependency file for runtime/legion/legion_redop.cu
* fix build issue(work around) with legion
* add patch for migraphx package to turn off ck
* update to  hip recipe
* fix hip-path detection inside llvm clang driver
* update llvm-amdgpu and rocm-validation-suite recipes
* fix style errors
* bump up the version for amdsmi for rocm-5.7.0 release
* add support for gfx941,gfx942 for rocm-5.7.0 release onwards
* revert changes to rocm.py file
* added gfx941 and gfx942 to rocm.py and added gfx942 to kokkos with a new checksum;
  the new version seems to support gfx942
* bump up the version for rccl for 5.7.1
* update the patch for rocm-openmp-extras for 5.7.0
* update mivisionx recipe for 5.7.0 release
* add new dependencies for rocfft tests
* port the fix for avx build, the start address of values_ buffer in KernelParameters is not
  correct as it is computed based on 16-byte alignment
* set HIP_PATH=ROCM_PATH for 5.7.0 onwards
* address review comments
* revert adding xnack- and xnack+ to gfx940,gfx941,gfx942 as the prechecks were failing
2023-12-11 14:49:19 -08:00
Harmen Stoppels
8c29e90fa9 Build cache: make signed/unsigned a mirror property (#41507)
* Add `signed` property to mirror config

* make unsigned a tri-state: true/false overrides mirror config, none takes mirror config

* test commands

* Document this

* add a test
2023-12-11 15:14:59 -06:00
Tamara Dahlgren
045f398f3d Add missing build-system/custom phases to the CDash map (#41439) 2023-12-11 11:06:19 -08:00
Chris Green
6986e70877 [abseil-cpp] New version 20230802.1 (#41457) 2023-12-11 11:04:46 -08:00
wspear
4d59e746fd otf2: add v3.0.3 (#41499)
* Add otf2 version 3.3
* Update var/spack/repos/builtin/packages/otf2/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-12-11 11:02:58 -08:00
Arne Becker
f6de34f9db New package: perl-test-file (#41510) 2023-12-11 10:52:24 -08:00
Arne Becker
0f0adb71d0 perl-class-tiny: New package (#41513) 2023-12-11 10:49:38 -08:00
Joe Schoonover
4ec958c5c6 Add new version of feq-parse (#41515)
The new feq-parse version includes fixes for ifort and ifx compilers.
Additionally, evaluation of parser objects with multidimensional arrays is
now supported.
2023-12-11 10:47:42 -08:00
Kyle Gerheiser
2aa07fa557 libfabric: Add ucx provider variant (#41524) 2023-12-11 10:46:18 -08:00
Arne Becker
239d343588 perl-test-nowarnings: New package (#41539)
* perl-test-nowarnings: New package
* Add myself as maintainer
2023-12-11 10:37:52 -08:00
Andrey Perestoronin
44604708ad intel-oneapi-compilers 2024.0.1: added new version to packages (#41555)
* add new packages

* fix version
2023-12-11 11:33:42 -07:00
Arne Becker
f0109e4afe perl-test-weaken: New package (#41540) 2023-12-11 10:33:04 -08:00
Arne Becker
b8f90e1bdc perl-test-without-module: New package (#41541) 2023-12-11 10:31:44 -08:00
James Beal
8b9064e5e4 samtools/htslib add latest version (#41545)
* samtools/htslib add latest version
* Given that I work at the same institute as the authors, I think it fair to say I am willing to review changes; if it's complex I can ask them over tea.

---------

Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-11 10:29:08 -08:00
James Beal
ce79785c10 igv add latest version (#41546)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-11 10:26:29 -08:00
Christopher Christofi
af378c7f31 perl-bio-db-hts: add new package with version 3.01 (#41554)
* perl-bio-db-hts: add new package with version 3.01
* fix styling
2023-12-11 11:25:20 -07:00
Dom Heinzeller
cf50bfb7c2 Add eckit@1.24.5, ecmwf-atlas@{0.35.0,0.35.1,0.36.0} (#41547)
* Add eckit@1.24.5
* Add ecmwf-atlas@0.35.1
* Add ecmwf-atlas@0.36.0
2023-12-11 10:20:39 -08:00
Dom Heinzeller
620e090ff5 Add @climbfuji to fms maintainers (#41550) 2023-12-11 10:18:05 -08:00
Massimiliano Culpo
c4d86a9c2e apple-clang: add new package (#41485)
* apple-clang: added new package
* Add a maintainer
* Use f-strings, remove leftover comment
2023-12-11 10:06:27 -08:00
Kyle Gerheiser
3b74b894c7 fabtests: Add versions and update maintainer (#41525) 2023-12-11 18:55:02 +01:00
Dan Lipsa
3fa8afc036 Add v5.0.0 to SENSEI (#41551)
Co-authored-by: Dan Lipsa <dan.lipsa@savannah.khq.kitware.com>
2023-12-11 11:12:47 -06:00
Garth N. Wells
60628075cb Update for v0.7.2 (#41393) 2023-12-11 09:10:52 -08:00
Chris Green
9e4fab277b [root] New variants, patches (#41548)
* New variants:
  - `tmvz-cpu`
  - `tmvz-gpu`
  - `tmvz-pymva`
  - `tmvz-sofie`

* Improve X-related dependencies.

* Improve TMVA-related dependencies with more specificity.

* Patch possible missing standard header include in Eve7.

* Patch Protobuf handling to support new Protobuf-provided CMake config
  files required to handle transitive `abseil-cpp` dependence.

* Add missing terminal newline to `webgui` patch to remove patch
  warning.

* Handle deprecated/removed build options.
2023-12-11 09:18:10 -07:00
Christopher Christofi
5588e328f7 fpocket: improve recipe (#41532) 2023-12-11 15:45:52 +01:00
Rocco Meli
93a1fc90c9 add cmake constraint (#41542) 2023-12-11 15:36:29 +01:00
Patrick Gartung
7297721e78 Revert "[root] New variants, checksum changes, sundry improvements (#41463)" (#41544)
This reverts commit 7d45e132a6.
2023-12-11 15:34:24 +01:00
Felix Werner
eb57d96ea9 geant4/geant4-data: add v10.0.4 (#41478)
* geant4/geant4-data: add builtin_clhep variant and v10.0.4.

* geant4: revert addition of builtin_clhep variant.

* geant4: fix vecgeom variant only being available for v10.3 and above.
2023-12-11 14:21:25 +01:00
Rocco Meli
ce09642922 netlib-scalapack: add git attribute and master version (#41537) 2023-12-11 03:38:28 -07:00
Arne Becker
cd88eb1ed0 kyotocabinet: add new package (#41512) 2023-12-11 03:38:12 -07:00
Thomas Helfer
826df84baf tfel: fix v3.4.5 checksum (#41523) 2023-12-11 03:27:59 -07:00
Alex Richert
0a4b365a7d fms: add v2023.04 (#41475) 2023-12-11 11:15:01 +01:00
Harmen Stoppels
a2ed4704e7 Link to GitHub Action spack/setup-spack in docs (#41509) 2023-12-11 11:12:40 +01:00
Harmen Stoppels
28b49d5d2f unit tests: replace /bin/bash with /bin/sh (#41495) 2023-12-11 11:11:27 +01:00
Adam J. Stewart
16bb4c360a PyTorch: disable sleef dep for now (#41508) 2023-12-11 11:00:24 +01:00
Thomas Gruber
cfd58bdafe Likwid: likwid-icx-mem-group-fix.patch only for 5.2.* versions (#41514) 2023-12-11 10:48:02 +01:00
Matthieu Dorier
53493ceab1 leveldb: turning benchmark and tests off (#41518) 2023-12-11 10:41:03 +01:00
Dave Keeshan
64cd429cc8 Fix filter_compiler_wrapper where compiler is None (#41502)
Fix filter_compiler_wrapper for cases where the compiler returned is None; this happens on some installed gcc systems that do not have Fortran built into them as standard, e.g. gcc@11.4.0 on Ubuntu 22.04
2023-12-11 10:31:56 +01:00
Harmen Stoppels
525809632e petsc: improve hipsparse compat (#40311)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-12-11 10:30:14 +01:00
Vanessasaurus
a6c32c80ab flux-core: add v0.57.0 (#41520)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-12-11 10:25:02 +01:00
Todd Gamblin
57ad848f47 commands: better install status help formatting (#41527)
Before (hard to read, doesn't fit on small terminals):
```console
  -I, --install-status  show install status of packages

                        packages can be: installed [+], missing and needed by an installed package [-], installed in an upstream instance [^], or not installed (no annotation)
```

After (fits in 80 columns):

```console
  -I, --install-status  show install status of packages
                        [+] installed       [^] installed in an upstream
                         -  not installed   [-] missing dep of installed package
```
2023-12-11 10:17:37 +01:00
Brian Spilner
15623d8077 cdo: add v2.3.0 (#41479) 2023-12-11 09:45:16 +01:00
Thomas Madlener
c352db7645 vecgeom: Use correct checksum for version 1.2.5 (#41530)
See https://gitlab.cern.ch/VecGeom/VecGeom/-/releases/v1.2.5
2023-12-11 08:53:44 +01:00
Brian Van Essen
5d999d0e4f Add logic to cache the RPATH variables in CachedCMakePackages. (#41417) 2023-12-08 09:27:44 -08:00
Seth R. Johnson
694a1ff340 celeritas: new version 0.4.1 (#41504)
* celeritas: new version 0.4.1

* Mark correct versions as deprecated
2023-12-08 09:56:18 +00:00
Richard Berger
4ec451cfed flecsi: remove ^legion network=gasnet restriction (#41494) 2023-12-07 19:03:04 -07:00
Julien Cortial
a77eca7f88 cdt: Add versions 1.3.6 and 1.4.0 (#41490) 2023-12-07 14:27:39 -07:00
Greg Becker
14ac2b063a cce compiler: remove vestigial compiler names (#41303) 2023-12-07 15:17:03 -06:00
Tamara Dahlgren
edf4d6659d add missing endtime property to CDash (#41498) 2023-12-07 22:06:46 +01:00
Lydéric Debusschère
6531fbf425 py-mpldock: new package (#41316)
* py-mpldock: new package

* py-mpldock: remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-07 21:24:05 +01:00
Victor Brunini
0a6045eadf Fix cdash reporter time stamps (#38825)
* Fix cdash reporter time stamps (#38818).
   The cdash reporter is created before packages are installed so save the
   starttime then instead of the endtime.
* Use endtime instead of starttime for the endtime of update

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2023-12-07 19:32:10 +00:00
Todd Gamblin
5722a13af0 Spack mailing list is now announcement-only (#41496)
Participation in the venerable Spack google group has dwindled, though we still have
540+ subscribers there.  I've made the mailing list announcement-only, and I've given
a few maintainers posting privileges.

This PR adds some notes to the README indicating that the mailing list is only for
announcements.
2023-12-07 19:22:14 +00:00
Brian Van Essen
9f1223e7a3 Bugfix spectrum-mpi module generation (#41466)
* Ensure that additional environment variables are set when a module
file is generated.

* Fixed the detection of the opal_prefix / MPI_ROOT field to use ompi_info.
---------

Co-authored-by: Greg Becker <becker33@llnl.gov>
2023-12-07 11:06:07 -08:00
Vijay M
5beef28444 Update homepage URL, add 5.5.1 version, remove bad version hashes, and other minor changes (#41492) 2023-12-07 10:52:06 -08:00
snehring
e618a93f3d iqtree2: add new version 2.2.2.7 and new variant lsd2 (#41467)
* iqtree2: add new version 2.2.2.7 and new variant lsd2
* iqtree2: reorder variant and resource
2023-12-07 11:30:14 -07:00
Garth N. Wells
3f0ec5c580 Update UFCx for v0.7.0. (#41392) 2023-12-07 10:20:34 -08:00
Kyle Knoepfel
14392efc6d Permit shared-library for libbacktrace (#41454) 2023-12-07 10:17:31 -08:00
John W. Parent
d7406aaaa5 CMake: v3.26.6 (#41282) 2023-12-07 10:25:42 -07:00
Harmen Stoppels
5a7e691ae2 freebsd (#41480) 2023-12-07 09:19:55 -08:00
Dave Keeshan
b9f63ab40b opensta: add new package (#41484)
* Add opensta; it allows 2 variants, zlib and cudd, but they are both enabled by default
* Remove unused import, os
2023-12-07 09:18:58 -08:00
Auriane R
4417b1f9ee Update pika package to use f-strings (#41483) 2023-12-07 10:14:33 -07:00
Adam J. Stewart
04f14166cb py-keras: add v3.0.1 (#41486) 2023-12-07 09:05:42 -08:00
Robert Cohn
223a54098e [intel-mkl,intel-ipp,intel-daal]: deprecate packages (#41488) 2023-12-07 09:01:02 -08:00
Satish Balay
ea505e2d26 petsc: add variant +zoltan (#41472) 2023-12-07 10:51:00 -06:00
Ataf Fazledin Ahamed
e2b51e01be traverse.py: use > 0 instead of >= 0 (#41482)
Signed-off-by: fazledyn-or <ataf@openrefactory.com>
2023-12-07 09:38:54 -07:00
jmlapre
a04ee77f77 trilinos: replace pytrilinos2 variant with python (#41435)
* depend_on python

There is an ill-named variant "python" that enables the pytrilinos1
variant.  This made it through our testing but broke on our actual
CI test machines.

* adjust "python" variant based on Trilinos version

For Trilinos <= 14, enable PyTrilinos(1). For later versions
of Trilinos, enable PyTrilinos2.

We still support directly enabling PyTrilinos2 via the "pytrilinos2"
variant.

* remove pytrilinos2 variant

* correct depends_on constraints
2023-12-07 10:28:13 -05:00
Jordan Galby
bb03ce7281 Do not use depfile in bootstrap (#41458)
- we don't have a fallback if make is not installed
- we assume file system locking works
- we don't verify that make is gnu make (bootstrapping fails on FreeBSD as a result)
- there are some weird race conditions in writing spack.yaml on concurrent spack install
- the view is updated after every package install instead of post environment install.
2023-12-07 10:09:49 +00:00
Massimiliano Culpo
31640652c7 audit: forbid nested dependencies in depends_on declarations (#41428)
Forbid nested dependencies in depends_on declarations, by running an audit in CI.

Fix the packages not passing the new audit:
- amd-aocl
- exago
- palace
- shapemapper
- xsdk-examples

ginkgo: add a commit sha to v1.5.0.glu_experimental
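
For illustration, a hedged sketch of the pattern the audit now forbids, in a hypothetical recipe (the package and spec strings below are made up, not from the PR):

```python
from spack.package import *

class Example(Package):
    """Hypothetical package illustrating the audited pattern."""

    # Forbidden by the new audit: constraining a transitive node inside a
    # single depends_on() spec:
    #
    #     depends_on("hiop ^magma+cuda")
    #
    # Allowed: one declaration per dependency edge.
    depends_on("hiop")
    depends_on("magma+cuda", when="^hiop")
```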
2023-12-07 10:21:01 +01:00
Samuel Browne
0ff0e8944e trilinos: add v15.0.0 (#41465) 2023-12-07 10:07:02 +01:00
Jack Morrison
a877d812d0 rdma-core: Add new versions 41.5, 42.5, 43.4, 44.4, 45.3, 46.2, 47.1, 49.0 (#41473) 2023-12-07 09:58:06 +01:00
dependabot[bot]
24a59ffd36 build(deps): bump actions/setup-python from 4.8.0 to 5.0.0 (#41474)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.8.0 to 5.0.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](b64ffcaf5b...0a5c615913)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-07 09:55:55 +01:00
Dave Keeshan
57f46f0375 cudd: add new package (#41476) 2023-12-07 09:37:22 +01:00
Chris Green
7d45e132a6 [root] New variants, checksum changes, sundry improvements (#41463)
* New variants:
  - `tmvz-cpu`
  - `tmvz-gpu`
  - `tmvz-pymva`
  - `tmvz-sofie`

* Improve X-related dependencies.

* Improve TMVA-related dependencies with more specificity.

* Patch possible missing standard header include in Eve7.

* Patch Protobuf handling to support new Protobuf-provided CMake config
  files required to handle transitive `abseil-cpp` dependence.

* Add missing terminal newline to `webgui` patch to remove patch
  warning.

* Handle deprecated/removed build options.

* Handle unwanted system paths in various `PATH`-like environment
  variables.
2023-12-06 18:39:51 -06:00
Vicente Bolea
e7ac676417 paraview: dropping patch since changes exists (#41462) 2023-12-06 16:18:46 -07:00
Adam J. Stewart
94ba152ef5 py-torchmetrics: add v1.2.1 (#41456) 2023-12-06 12:48:28 -07:00
Massimiliano Culpo
5404a5bb82 llvm: reformulate a when condition to avoid tautology (#41461)
The condition on swig can be interpreted as "true if true,
false if false" and gives clingo the option to add swig
or not.

If no other optimization criteria break the tie, then
the concretization is non-deterministic.
2023-12-06 19:12:42 +01:00
Dave Keeshan
b522d8f610 yosys: add new package (#41416)
* Add EDA tool yosys to Spack
* Add maintainers
* Move from format to f-strings
2023-12-06 09:47:58 -08:00
Thomas Madlener
2a57c11d28 lcio: add version 2.20.2, sio: add version 0.2 (#41451) 2023-12-06 09:46:18 -08:00
Thomas Madlener
1aa3a641ee edm4hep: Update cmake version dependency for newer versions (#41450) 2023-12-06 09:43:24 -08:00
yizeyi18
6feba1590c nwchem: add libxc/elpa support (#41376)
* added external libxc/elpa choice
* fixed formatting issues and 1 unused variant found by reviewer
* try to fix a string formatting issue
* try to fix some other string formatting issues
* fixed 1 flake8 style issue
* use explicit fftw-api@3
2023-12-06 09:38:46 -08:00
Chris Green
58a7912435 [catch2] Sundry improvements including C++20 support (#41199)
* Add `url_list` to facilitate finding new versions.

* `cxxstd` is not meaningful when `@:2.99.99` as it was a header-only
  package before v3.

* Support C++20/23, remove C++14 support.

* Add @greenc-FNAL to maintainers.

* Add CMake arguments to support testing, build of extras and examples.
2023-12-06 11:07:27 -06:00
yizeyi18
03ae2eb223 dla-future: add a patch (#41409)
* using std::int64_t needs include cstdint in gcc-13

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-12-06 16:55:40 +01:00
Julien Cortial
013f0d3a13 Only build tests for proj package if required (#41065)
* Only build tests for proj package if required

Even if tests are not explicitly required to be built, proj builds them
anyway and tries to download Google Test.

* proj: fix name of test activation flag

* proj: Always set test activation flag

* proj: Patch test activation logic for versions 5.x
2023-12-06 09:32:39 -06:00
Eric Berquist
3e68aa0b2f py-pre-commit: add 3.5.0 (#41438) 2023-12-06 09:11:32 -06:00
Gavin John
1da0d0342b Add new versions of py-quast (#40788)
* Add new versions of py-quast

* Update hashes

* Add joblib and simplejson

* Fat finger

* Update dependency type
2023-12-06 09:04:17 -06:00
Lydéric Debusschère
6f7d91aebf py-python-pptx: new package (#41315)
* py-python-pptx: new package

* py-python-pptx: use pil instead of pillow, remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:43:51 -06:00
Lydéric Debusschère
071c74d185 py-tldextract: new package (#41330)
* py-tldextract: new package

* py-tldextract: add version 5.1.1

* py-tldextract: fix version constraint on py-setuptools-scm

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:42:48 -06:00
Juan Miguel Carceller
51435d6d69 ruff: add version 0.1.6 (#41355)
* Add a new version of ruff

* Add a comment about where the dependency can be found

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-12-06 08:41:20 -06:00
Jordan Galby
8ce110e069 bootstrap: Don't catch Ctrl-C (#41449) 2023-12-06 14:58:14 +01:00
Auriane R
90aee11c33 Add pika 0.21.0 release (#41446) 2023-12-06 03:48:26 -07:00
Harmen Stoppels
4fc73bd7f3 minimal support for freebsd (#41434) 2023-12-06 10:27:22 +00:00
Dom Heinzeller
f7fc4b201d Update py-werkzeug version dependency for py-graphene-tornado@2.6.1 (#41426)
* Update py-werkzeug version dependency for py-graphene-tornado@2.6.1

* Add note on diverging version requirements for py-werkzeug in py-graphene-tornado
2023-12-06 10:28:58 +01:00
Jim Edwards
84999b6996 mpiserial: rework installation (#40762) 2023-12-06 09:39:08 +01:00
Harmen Stoppels
b0f193071d bootstrap status: no bash (#41431) 2023-12-06 09:20:59 +01:00
dependabot[bot]
d1c3374ccb build(deps): bump actions/setup-python from 4.7.1 to 4.8.0 (#41441)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.7.1 to 4.8.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](65d7f2d534...b64ffcaf5b)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-06 09:20:14 +01:00
Richard Berger
cba8ba0466 legion: correct cuda dependency for cr version (#41119)
* legion: correct cuda dependency for cr version
* Update var/spack/repos/builtin/packages/legion/package.py

---------

Co-authored-by: Davis Herring <herring@lanl.gov>
2023-12-05 18:36:50 -08:00
Wouter Deconinck
d50f8d7b19 root: sha256 change on latest versions (#41401)
* root: sha256 change on latest version
* root: sha256 change on 6.28.10
* root: replace 6.26.12 by 6.26.14
* root: hash for 6.26.14
2023-12-05 18:20:27 -08:00
kjrstory
969fbbfb5a foam-extend: add v5.0 (#40480)
* foam-extend: add new version
* deprecated versions
2023-12-05 18:11:59 -08:00
Robert Cohn
1cd5397b12 [intel-parallel-studio] Deprecate entire package (#41430) 2023-12-05 17:56:49 -08:00
psakievich
1829dbd7b6 CDash: Spack dumps stage errors to configure phase (#41436) 2023-12-05 22:05:39 +00:00
Philippe Virouleau
9e3b231e6f julia: fix LLVM paches hashes (#41410) 2023-12-05 22:53:40 +01:00
Alec Scott
5911a677d4 py-gidgethub: add new package (#41286)
* py-gidgethub: add new package

* Add main branch version and scope flit/flit-core dependency

* Update var/spack/repos/builtin/packages/py-gidgethub/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Add optional dependencies as variants of package

* Add git url for main version

* Fix variant and dependency ordering

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-05 21:26:42 +01:00
Alec Scott
bb60bb4f7a py-gidgetlab: add new package (#41338)
* gidgetlab: add new package

* Convert both cachetools and aiohttp to optional deps with variants

* Fix forgotten variant conditional on cachetools dependency

* Add git url and main version for dev workflows

* Fix variant and dependency ordering

* Remove cachetools variant and merge dependency with aiohttp variant
2023-12-05 21:24:38 +01:00
Victoria Cherkas
ddec75315e Add maintainers to fdb, eckit, ecbuild, metkit, eccodes (#41433)
* Add maintainers to fdb
* Add maintainers to eckit
* Add maintainers to metkit
* Add maintainers to eccodes
* Add maintainers to ecbuild
* Add climbfuji to eccodes maintainers
2023-12-05 13:18:41 -07:00
Veselin Dobrev
8bcb1f8766 MFEM: Add a patch to fix the +gslib+shared+miniapps build (#41399)
* [mfem] Add a patch to resolve issue #41382

* [mfem] Spack CI wants "patch URL must end with ?full_index=1"
2023-12-05 10:26:48 -08:00
Matthew Thompson
5a0ac4ba94 Update versions of GFE packages (#41429) 2023-12-05 10:24:51 -07:00
Lydéric Debusschère
673689d53b py-sphinx: add versions 7.2.4, 7.2.5 and 7.2.6 (#41411)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 15:07:22 +01:00
Brian Vanderwende
ace8e17f02 libdap4: add explicit RPC dependency (#40019) 2023-12-05 13:11:19 +01:00
Billae
eb9c63541a documentation: add instructions on how to use external opengl (#40987) 2023-12-05 12:59:41 +01:00
Kensuke WATANABE
b9f4d9f6fc sirius: fix build error with Fujitsu compiler (#41101) 2023-12-05 12:54:03 +01:00
Lydéric Debusschère
eda3522ce8 py-sphinxcontrib-moderncmakedomain: add new package (#41331)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 12:47:25 +01:00
Harmen Stoppels
3cefd73fcc spack buildcache check: use same interface as push (#41378) 2023-12-05 12:44:50 +01:00
Alberto Invernizzi
3547bcb517 openfoam-org: fix for being able to build manually checked out repository (#41000)
* grep WM_PROJECT_VERSION from etc/bashrc

This fixes the problem when building from a manually checked out repo
which might have a different version wrt the one defined in the spack
package (e.g. anything later than 5.0 is known as 5.x by the build
system)

* patch applies to just 5.0, in newer versions it is already addressed

In `5.20171030` there's a commit

c66fba323c

very similar (almost identical) to what the patch `50-etc.patch` does.

So the patch should not be applied to versions other than `5.0`, otherwise it errors.

References:
- https://github.com/OpenFOAM/OpenFOAM-5.x/commits/20171030/etc/bashrc
- 197d9d3bf2/etc/bashrc (L45-L47)
2023-12-05 12:37:57 +01:00
Todd Gamblin
53b528f649 bugfix: sort variants in spack info --variants-by-name (#41389)
This was missed while backporting the new `spack info` command from #40326.

Variants should be sorted by name when invoking `spack info --variants-by-name`.
2023-12-05 12:31:40 +01:00
Mark W. Krentel
798770f9e5 hpctoolkit: add conflict for recent intel-xed (#41413)
Intel made an incompatible change in XED in 2023.08.21 that breaks
hpctoolkit (at run time).  Hpctoolkit develop can adapt (soon will),
but older versions must use xed :2023.07.09.
2023-12-05 11:17:37 +01:00
Jim Edwards
4a920243a0 cprnc: update sha256 for github artifacts (#41418) 2023-12-05 11:13:14 +01:00
Ben Wibking
8727195b84 openblas: fix macOS build when using XCode 15 or newer (#41420)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-12-05 09:53:54 +01:00
Massimiliano Culpo
456f2ca40f extensions: improve docs, fix unit-tests (#41425) 2023-12-05 09:49:35 +01:00
dependabot[bot]
b4258aaa25 build(deps): bump docker/metadata-action from 5.2.0 to 5.3.0 (#41423)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.2.0 to 5.3.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](e6428a5c4e...31cebacef4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-05 09:48:29 +01:00
Mitch B
5d9647544a arrow: add versions up to v14.0.1 (#41424) 2023-12-05 09:48:03 +01:00
Brian Van Essen
1fdb6a3e7e Updating the LBANN, Hydrogen, and DiHydrogen recipes (#41390)
* Updating the LBANN, Hydrogen, and DiHydrogen recipes for both new
variants and to make sure that RPATHs are properly set up.

Co-authored-by: bvanessen <bvanessen@users.noreply.github.com>
2023-12-05 09:31:51 +01:00
Robert Cohn
7c77b3a4b2 [intel] deprecate all versions (#41412)
Deprecating intel package, which contains intel classic compilers. This package has not been updated in 3 years. Please use intel-oneapi-compilers instead.
2023-12-04 21:52:55 -07:00
Ye Luo
eb4b8292b6 A few changes to quantum-espresso (#41225)
* gipaw.x installed by cmake if version >= 5c4a4ce.
  gipaw.x will only be installed with cmake if the qe-gipaw version
  is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
  Here is a patch to the submodule_commit_hash_records to use a newer
  qe-gipaw version.
* Update package.py
* Delete var/spack/repos/builtin/packages/quantum-espresso/gipaw-eccee44.patch
* Update package.py
* Restoring gipaw-eccee44 patch
* Update package.py
* Add fox variant in quantum-espresso
* Fix an issue introduced in #36484. Patches are 7.1 only.
* Change plugin handling.
* formatting.
* Typo correction
* Refine conflict

---------

Co-authored-by: S. Alexis Paz <alexis.paz@gmail.com>
2023-12-04 11:37:05 -08:00
John Biddiscombe
16bc58ea49 Add EGL support to ParaView and Glew (#39800)
* Add EGL support to ParaView and Glew

add a package for egl that provides GL but also adds
EGL libs and headers for projects that need them

Fix a header problem with the opengl package

Format files using black

* better description for egl variant description

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* better check/setup of non egl variant dependencies

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* Add biddisco as maintainer

* Fix unused var style warning

* Add egl conflicts for other gl providers

---------

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>
2023-12-04 10:53:06 -06:00
Alec Scott
6028ce8bc1 direnv: add v2.33.0 (#41397) 2023-12-04 14:12:13 +01:00
Harmen Stoppels
349e7e4c37 zlib-ng: add v2.1.5 (#41402) 2023-12-04 12:44:10 +01:00
Harmen Stoppels
a982118c1f ci.py: fix missing import (#41391) 2023-12-04 12:14:59 +01:00
Adam J. Stewart
40d12ed7e2 PythonPackage: type hints (#40539)
* PythonPackage: nested config_settings, type hints

* No need to quote PythonPackage

* Use narrower types for now until needed
2023-12-04 10:53:53 +00:00
James Smillie
9e0720207a Windows: fix kit base path and reference to windows registry key (#41388)
* Proper handling of argument passed as semicolon-separated str
* Fix reference to windows registry key in win-wdk
2023-12-03 15:35:13 -08:00
Jaelyn Litzinger
88e738c343 Allow exago to use hiop@develop past v1.0.1 (#41384) 2023-12-02 14:06:22 -06:00
Cameron Rutherford
8bbc2e2ade resolve: add package with cuda and rocm support (#40871) 2023-12-01 20:49:11 -06:00
Julien Cortial
1509e54435 Add MUMPS versions 5.6.0, 5.6.1 and 5.6.2 (#41386)
The patch for version 5.5.x still applies to 5.6.x.
2023-12-01 18:48:00 -07:00
Dom Heinzeller
ca164d6619 Fix curl install using Intel compilers (#41380)
When using Intel to build curl, add 'CFLAGS=-we147' to the configure
args to fix error 'compiler does not halt on function prototype
mismatch'
2023-12-01 17:24:05 -07:00
Felix Werner
a632576231 Add XCDF. (#41379) 2023-12-01 14:56:18 -08:00
Felix Werner
70b16cfb59 Add PhotoSpline. (#41374) 2023-12-01 14:44:53 -08:00
Brian Vanderwende
1d89d4dc13 MET fixes for 11.1 and HDF4 support (#41372)
* MET fixes for 11.1 and HDF4 support
* Fix zlib reference in MET
2023-12-01 14:42:36 -08:00
Dewi
bc8a0f56ed removed cmake build version pointing to fork (#41368) 2023-12-01 14:39:45 -08:00
Jack Morrison
4e09396f8a Libfabric: Introduce OPX provider conflict for v1.20.0 (#41343)
* Libfabric: Introduce OPX provider conflict for v1.20.0
* Add message to libfabric 1.20.0 opx provider conflict
2023-12-01 14:35:54 -08:00
Weiqun Zhang
0d488c6e4f amrex: add v23.12 (#41385) 2023-12-01 22:45:58 +01:00
Erik Heeren
50e76bc3d3 py-pyglet: version bump (#41082)
* py-pyglet: version bump

* py-pyglet: use zip instead of whl, update dependencies

* py-pyglet: 2.0.9 and 2.0.10 zips should be downloaded from github

* py-pyglet: style

* py-pyglet: use virtual packages in dependencies

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-pyglet: doesn't depend on py-future any more

* py-pyglet: remove glx dependency

* py-pyglet: back to the pypi zipfiles with patch instead

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-01 21:44:30 +01:00
384 changed files with 6460 additions and 1990 deletions

View File

@@ -23,7 +23,7 @@ jobs:
         operating_system: ["ubuntu-latest", "macos-latest"]
     steps:
     - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
-    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
+    - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
      with:
        python-version: ${{inputs.python_version}}
    - name: Install Python packages

View File

@@ -159,7 +159,7 @@ jobs:
         brew install cmake bison@2.7 tree
     - name: Checkout
       uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
-    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
+    - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
      with:
        python-version: "3.12"
    - name: Bootstrap clingo

View File

@@ -57,7 +57,7 @@ jobs:
    - name: Checkout
      uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
-   - uses: docker/metadata-action@e6428a5c4e294a61438ed7f43155db912025b6b3
+   - uses: docker/metadata-action@31cebacef4805868f9ce9a0cb03ee36c32df2ac4
      id: docker_meta
      with:
        images: |

View File

@@ -17,7 +17,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
      with:
        python-version: 3.9
    - name: Install Python packages

View File

@@ -54,7 +54,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install System packages
@@ -101,7 +101,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
      with:
        python-version: '3.11'
    - name: Install System packages
@@ -159,7 +159,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
      with:
        python-version: '3.11'
    - name: Install System packages
@@ -194,7 +194,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install Python packages

View File

@@ -19,7 +19,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
      with:
        python-version: '3.11'
        cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
      with:
        python-version: '3.11'
        cache: 'pip'

View File

@@ -18,7 +18,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
      with:
        python-version: 3.9
    - name: Install Python packages
@@ -42,7 +42,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
      with:
        python-version: 3.9
    - name: Install Python packages
@@ -66,7 +66,7 @@ jobs:
    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
      with:
        fetch-depth: 0
-   - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
+   - uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
      with:
        python-version: 3.9
    - name: Install Python packages

View File

@@ -66,10 +66,11 @@ Resources:
 * **Matrix space**: [#spack-space:matrix.org](https://matrix.to/#/#spack-space:matrix.org):
   [bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
 * [**Github Discussions**](https://github.com/spack/spack/discussions):
-  not just for discussions, but also Q&A.
-* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
+  for Q&A and discussions. Note the pinned discussions for announcements.
 * **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
   `@mention` us!
+* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
+  only for announcements. Please use other venues for discussions.
 
 Contributing
 ------------------------

View File

@@ -153,7 +153,43 @@ keyring, and trusting all downloaded keys.
 List of popular build caches
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 * `Extreme-scale Scientific Software Stack (E4S) <https://e4s-project.github.io/>`_: `build cache <https://oaciss.uoregon.edu/e4s/inventory.html>`_
 
+-------------------
+Build cache signing
+-------------------
+
+By default, Spack will add a cryptographic signature to each package pushed to
+a build cache, and verifies the signature when installing from a build cache.
+
+Keys for signing can be managed with the :ref:`spack gpg <cmd-spack-gpg>` command,
+as well as ``spack buildcache keys`` as mentioned above.
+
+You can disable signing when pushing with ``spack buildcache push --unsigned``,
+and disable verification when installing from any build cache with
+``spack install --no-check-signature``.
+
+Alternatively, signing and verification can be enabled or disabled on a per build cache
+basis:
+
+.. code-block:: console
+
+   $ spack mirror add --signed <name> <url>    # enable signing and verification
+   $ spack mirror add --unsigned <name> <url>  # disable signing and verification
+
+   $ spack mirror set --signed <name>    # enable signing and verification for an existing mirror
+   $ spack mirror set --unsigned <name>  # disable signing and verification for an existing mirror
+
+Or you can directly edit the ``mirrors.yaml`` configuration file:
+
+.. code-block:: yaml
+
+   mirrors:
+     <name>:
+       url: <url>
+       signed: false # disable signing and verification
+
+See also :ref:`mirrors`.
+
 ----------
 Relocation
 
@@ -251,87 +287,13 @@ To significantly speed up Spack in GitHub Actions, binaries can be cached in
 GitHub Packages. This service is an OCI registry that can be linked to a GitHub
 repository.
 
-A typical workflow is to include a ``spack.yaml`` environment in your repository
-that specifies the packages to install, the target architecture, and the build
-cache to use under ``mirrors``:
-
-.. code-block:: yaml
-
-   spack:
-     specs:
-     - python@3.11
-     config:
-       install_tree:
-         root: /opt/spack
-         padded_length: 128
-     packages:
-       all:
-         require: target=x86_64_v2
-     mirrors:
-       local-buildcache: oci://ghcr.io/<organization>/<repository>
-
-A GitHub action can then be used to install the packages and push them to the
-build cache:
-
-.. code-block:: yaml
-
-   name: Install Spack packages
-
-   on: push
-
-   env:
-     SPACK_COLOR: always
-
-   jobs:
-     example:
-       runs-on: ubuntu-22.04
-       permissions:
-         packages: write
-       steps:
-         - name: Checkout
-           uses: actions/checkout@v3
-
-         - name: Checkout Spack
-           uses: actions/checkout@v3
-           with:
-             repository: spack/spack
-             path: spack
-
-         - name: Setup Spack
-           run: echo "$PWD/spack/bin" >> "$GITHUB_PATH"
-
-         - name: Concretize
-           run: spack -e . concretize
-
-         - name: Install
-           run: spack -e . install --no-check-signature
-
-         - name: Run tests
-           run: ./my_view/bin/python3 -c 'print("hello world")'
-
-         - name: Push to buildcache
-           run: |
-             spack -e . mirror set --oci-username ${{ github.actor }} --oci-password "${{ secrets.GITHUB_TOKEN }}" local-buildcache
-             spack -e . buildcache push --base-image ubuntu:22.04 --unsigned --update-index local-buildcache
-           if: ${{ !cancelled() }}
-
-The first time this action runs, it will build the packages from source and
-push them to the build cache. Subsequent runs will pull the binaries from the
-build cache. The concretizer will ensure that prebuilt binaries are favored
-over source builds.
-
-The build cache entries appear in the GitHub Packages section of your repository,
-and contain instructions for pulling and running them with ``docker`` or ``podman``.
-
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Using Spack's public build cache for GitHub Actions
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
 Spack offers a public build cache for GitHub Actions with a set of common packages,
 which lets you get started quickly. See the following resources for more information:
 
-* `spack/github-actions-buildcache <https://github.com/spack/github-actions-buildcache>`_
+* `spack/setup-spack <https://github.com/spack/setup-spack>`_ for setting up Spack in GitHub
+  Actions
+* `spack/github-actions-buildcache <https://github.com/spack/github-actions-buildcache>`_ for
+  more details on the public build cache
 
 .. _cmd-spack-buildcache:

View File

@@ -9,46 +9,42 @@
Custom Extensions
=================

*Spack extensions* allow you to extend Spack capabilities by deploying your
own custom commands or logic in an arbitrary location on your filesystem.
This might be extremely useful e.g. to develop and maintain a command whose purpose is
too specific to be considered for reintegration into the mainline or to
evolve a command through its early stages before starting a discussion to merge
it upstream.

From Spack's point of view an extension is any path in your filesystem which
respects the following naming and layout for files:

.. code-block:: console

   spack-scripting/ # The top level directory must match the format 'spack-{extension_name}'
   ├── pytest.ini # Optional file if the extension ships its own tests
   ├── scripting # Folder that may contain modules that are needed for the extension commands
   │   ├── cmd # Folder containing extension commands
   │   │   └── filter.py # A new command that will be available
   │   └── functions.py # Module with internal details
   ├── tests # Tests for this extension
   │   ├── conftest.py
   │   └── test_filter.py
   └── templates # Templates that may be needed by the extension

In the example above, the extension is named *scripting*. It adds an additional command
(``spack filter``) and unit tests to verify its behavior.

The extension can import any core Spack module in its implementation. When loaded by
the ``spack`` command, the extension itself is imported as a Python package in the
``spack.extensions`` namespace. In the example above, since the extension is named
"scripting", the corresponding Python module is ``spack.extensions.scripting``.

The code for this example extension can be obtained by cloning the corresponding git repository:

.. code-block:: console

   $ git -C /tmp clone https://github.com/spack/spack-scripting.git
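For orientation, here is a minimal sketch of what a command module like ``cmd/filter.py``
could contain. It follows the usual conventions for Spack command modules (a ``description``
string, ``section`` and ``level`` attributes, a ``setup_parser`` function, and a function named
after the command); the argument handling below is illustrative only, not the actual code from
the repository:

.. code-block:: python

   import spack.cmd

   description = "filter specs, printing only the ones that survive"
   section = "scripting"  # illustrative grouping for `spack help`
   level = "long"


   def setup_parser(subparser):
       # hypothetical single positional argument; the real command has more options
       subparser.add_argument("specs", nargs="+", help="specs to filter")


   def filter(parser, args):
       # parse the abstract specs given on the command line and echo them back
       for spec in spack.cmd.parse_specs(args.specs):
           print(spec)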
---------------------------------
Configure Spack to Use Extensions
@@ -61,7 +57,7 @@ paths to ``config.yaml``. In the case of our example this means ensuring that:
config:
  extensions:
  - /tmp/spack-scripting

is part of your configuration file. Once this is set up, any command that the extension provides
will be available from the command line:
@@ -86,37 +82,32 @@ will be available from the command line:
  --implicit       select specs that are not installed or were installed implicitly
  --output OUTPUT  where to dump the result
The corresponding unit tests can be run giving the appropriate options to ``spack unit-test``:

.. code-block:: console

   $ spack unit-test --extension=scripting

   ========================================== test session starts ===========================================
   platform linux -- Python 3.11.5, pytest-7.4.3, pluggy-1.3.0
   rootdir: /home/culpo/github/spack-scripting
   configfile: pytest.ini
   testpaths: tests
   plugins: xdist-3.5.0
   collected 5 items

   tests/test_filter.py .....                                                                          [100%]

   ========================================== slowest 30 durations ==========================================
   2.31s setup    tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
   0.57s call     tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
   0.56s call     tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
   0.54s call     tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
   0.54s call     tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
   0.48s call     tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
   0.01s setup    tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
   0.01s setup    tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
   0.01s setup    tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
   0.01s setup    tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]

   (5 durations < 0.005s hidden.  Use -vv to show these durations.)
   =========================================== 5 passed in 5.06s ============================================


@@ -111,3 +111,28 @@ CUDA is split into fewer components and is simpler to specify:
   prefix: /opt/cuda/cuda-11.0.2/

where ``/opt/cuda/cuda-11.0.2/lib/`` contains ``libcudart.so``.
-----------------------------------
Using an External OpenGL API
-----------------------------------
Depending on whether we have a graphics card or not, we may choose to use OSMesa or GLX to implement the OpenGL API.
If a graphics card is unavailable, OSMesa is recommended and can typically be built with Spack.
However, if we prefer to utilize the system GLX tailored to our graphics card, we need to declare it as an external. Here's how to do it:
.. code-block:: yaml
   packages:
     libglx:
       require: [opengl]
     opengl:
       buildable: false
       externals:
       - prefix: /usr/
         spec: opengl@4.6
Note that the prefix has to be the root of both the libraries and the headers, i.e. ``/usr``,
not the path to the ``lib`` directory.
To find out which ``opengl`` spec is available, use ``cd /usr/include/GL && grep -Ri gl_version``.


@@ -1047,9 +1047,9 @@ def __bool__(self):
"""Whether any exceptions were handled.""" """Whether any exceptions were handled."""
return bool(self.exceptions) return bool(self.exceptions)
def forward(self, context: str) -> "GroupedExceptionForwarder": def forward(self, context: str, base: type = BaseException) -> "GroupedExceptionForwarder":
"""Return a contextmanager which extracts tracebacks and prefixes a message.""" """Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self) return GroupedExceptionForwarder(context, self, base)
def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]): def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
self.exceptions.append((context, exc, tb)) self.exceptions.append((context, exc, tb))
@@ -1072,15 +1072,18 @@ class GroupedExceptionForwarder:
"""A contextmanager to capture exceptions and forward them to a """A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler.""" GroupedExceptionHandler."""
def __init__(self, context: str, handler: GroupedExceptionHandler): def __init__(self, context: str, handler: GroupedExceptionHandler, base: type):
self._context = context self._context = context
self._handler = handler self._handler = handler
self._base = base
def __enter__(self): def __enter__(self):
return None return None
def __exit__(self, exc_type, exc_value, tb): def __exit__(self, exc_type, exc_value, tb):
if exc_value is not None: if exc_value is not None:
if not issubclass(exc_type, self._base):
return False
self._handler._receive_forwarded(self._context, exc_value, traceback.format_tb(tb)) self._handler._receive_forwarded(self._context, exc_value, traceback.format_tb(tb))
# Suppress any exception from being re-raised: # Suppress any exception from being re-raised:
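To make the new ``base`` parameter concrete, here is a hedged usage sketch (the failing
callables are invented for illustration; names follow the diff above). Passing ``Exception``
as ``base`` means only ``Exception`` subclasses are grouped, while e.g. ``KeyboardInterrupt``
now propagates instead of being swallowed:

.. code-block:: python

   handler = GroupedExceptionHandler()

   for name, task in [("step1", lambda: 1 / 0), ("step2", lambda: None)]:
       # forward() returns a GroupedExceptionForwarder context manager
       with handler.forward(name, Exception):
           task()

   if handler:  # __bool__ is True if any exceptions were handled
       # grouped_message() is assumed from the handler's API elsewhere in this module
       print(handler.grouped_message())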


@@ -499,7 +499,7 @@ def _ensure_packages_are_pickeleable(pkgs, error_cls):
@package_properties
def _ensure_packages_are_unparseable(pkgs, error_cls):
    """Ensure that all packages can unparse and that unparsed code is valid Python"""
    import spack.util.package_hash as ph

    errors = []
    for pkg_name in pkgs:
@@ -646,11 +646,7 @@ def _linting_package_file(pkgs, error_cls):
        if pkg_cls.homepage.startswith("http://"):
            https = re.sub("http", "https", pkg_cls.homepage, 1)
            try:
                response = urlopen(https)
            except Exception as e:
                msg = 'Error with attempting https for "{0}": '
                errors.append(error_cls(msg.format(pkg_cls.name), [str(e)]))
@@ -730,13 +726,37 @@ def _unknown_variants_in_directives(pkgs, error_cls):
@package_directives
def _issues_in_depends_on_directive(pkgs, error_cls):
    """Reports issues with 'depends_on' directives.

    Issues might be unknown dependencies, unknown variants or variant values, or declaration
    of nested dependencies.
    """
    errors = []
    for pkg_name in pkgs:
        pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
        filename = spack.repo.PATH.filename_for_package_name(pkg_name)
        for dependency_name, dependency_data in pkg_cls.dependencies.items():
            # Check if there are nested dependencies declared. We don't want directives like:
            #
            #     depends_on('foo+bar ^fee+baz')
            #
            # but we'd like to have two dependencies listed instead.
            for when, dependency_edge in dependency_data.items():
                dependency_spec = dependency_edge.spec
                nested_dependencies = dependency_spec.dependencies()
                if nested_dependencies:
                    summary = (
                        f"{pkg_name}: invalid nested dependency "
                        f"declaration '{str(dependency_spec)}'"
                    )
                    details = [
                        f"split depends_on('{str(dependency_spec)}', when='{str(when)}') "
                        f"into {len(nested_dependencies) + 1} directives",
                        f"in {filename}",
                    ]
                    errors.append(error_cls(summary=summary, details=details))

            # No need to analyze virtual packages
            if spack.repo.PATH.is_virtual(dependency_name):
                continue
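To illustrate the fix this audit asks for, a hypothetical ``package.py`` with a nested
dependency would be split into one ``depends_on`` directive per node, for example:

.. code-block:: python

   # flagged by the audit: one directive declaring a nested dependency
   # depends_on("foo+bar ^fee+baz")

   # suggested form: one directive per dependency (2 directives for 1 nested dep)
   depends_on("foo+bar")
   depends_on("fee+baz")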


@@ -25,7 +25,7 @@
import warnings
from contextlib import closing, contextmanager
from gzip import GzipFile
from typing import Dict, Iterable, List, NamedTuple, Optional, Set, Tuple
from urllib.error import HTTPError, URLError

import llnl.util.filesystem as fsys
@@ -38,14 +38,12 @@
import spack.config as config
import spack.database as spack_db
import spack.error
import spack.hooks
import spack.hooks.sbang
import spack.mirror
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
import spack.platforms
import spack.relocate as relocate
import spack.repo
@@ -54,6 +52,7 @@
import spack.traverse as traverse
import spack.util.crypto
import spack.util.file_cache as file_cache
import spack.util.gpg
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
@@ -487,10 +486,7 @@ def _fetch_and_cache_index(self, mirror_url, cache_entry={}):
        scheme = urllib.parse.urlparse(mirror_url).scheme

        if scheme != "oci" and not web_util.url_exists(
            url_util.join(mirror_url, BUILD_CACHE_RELATIVE_PATH, "index.json")
        ):
            return False
@@ -536,9 +532,7 @@ def _fetch_and_cache_index(self, mirror_url, cache_entry={}):
def binary_index_location():
    """Set up a BinaryCacheIndex for remote buildcache dbs in the user's homedir."""
    cache_root = os.path.join(misc_cache_location(), "indices")
    return spack.util.path.canonicalize_path(cache_root)


#: Default binary cache index instance
@@ -833,7 +827,7 @@ def tarball_path_name(spec, ext):
def select_signing_key(key=None):
    if key is None:
        keys = spack.util.gpg.signing_keys()
        if len(keys) == 1:
            key = keys[0]
@@ -858,7 +852,7 @@ def sign_specfile(key, force, specfile_path):
        raise NoOverwriteException(signed_specfile_path)

    key = select_signing_key(key)
    spack.util.gpg.sign(key, specfile_path, signed_specfile_path, clearsign=True)


def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_dir, concurrency):
@@ -910,7 +904,6 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
        url_util.join(cache_prefix, "index.json"),
        keep_original=False,
        extra_args={"ContentType": "application/json", "CacheControl": "no-cache"},
    )

    # Push the hash
@@ -919,7 +912,6 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
        url_util.join(cache_prefix, "index.json.hash"),
        keep_original=False,
        extra_args={"ContentType": "text/plain", "CacheControl": "no-cache"},
    )
@@ -984,13 +976,9 @@ def _specs_from_cache_fallback(cache_prefix):
    def url_read_method(url):
        contents = None
        try:
            _, _, spec_file = web_util.read_from_url(url)
            contents = codecs.getreader("utf-8")(spec_file).read()
        except (URLError, web_util.SpackWebError) as url_err:
            tty.error("Error reading specfile: {0}".format(url))
            tty.error(url_err)
        return contents
@@ -998,9 +986,7 @@ def url_read_method(url):
    try:
        file_list = [
            url_util.join(cache_prefix, entry)
            for entry in web_util.list_url(cache_prefix)
            if entry.endswith("spec.json") or entry.endswith("spec.json.sig")
        ]
        read_fn = url_read_method
@@ -1098,9 +1084,7 @@ def generate_key_index(key_prefix, tmpdir=None):
    try:
        fingerprints = (
            entry[:-4]
            for entry in web_util.list_url(key_prefix, recursive=False)
            if entry.endswith(".pub")
        )
    except KeyError as inst:
@@ -1137,7 +1121,6 @@ def generate_key_index(key_prefix, tmpdir=None):
            url_util.join(key_prefix, "index.json"),
            keep_original=False,
            extra_args={"ContentType": "application/json"},
        )
    except Exception as err:
        msg = "Encountered problem pushing key index to {0}: {1}".format(key_prefix, err)
@@ -1381,18 +1364,10 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
    spackfile_path = os.path.join(cache_prefix, tarball_path_name(spec, ".spack"))
    remote_spackfile_path = url_util.join(out_url, os.path.relpath(spackfile_path, stage_dir))

    mkdirp(tarfile_dir)
    if web_util.url_exists(remote_spackfile_path):
        if options.force:
            web_util.remove_url(remote_spackfile_path)
        else:
            raise NoOverwriteException(url_util.format(remote_spackfile_path))
@@ -1412,13 +1387,12 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
    # If force and exists, overwrite. Otherwise raise exception on collision.
    if options.force:
        if web_util.url_exists(remote_specfile_path):
            web_util.remove_url(remote_specfile_path)
        if web_util.url_exists(remote_signed_specfile_path):
            web_util.remove_url(remote_signed_specfile_path)
    elif web_util.url_exists(remote_specfile_path) or web_util.url_exists(
        remote_signed_specfile_path
    ):
        raise NoOverwriteException(url_util.format(remote_specfile_path))
@@ -1452,17 +1426,11 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
        sign_specfile(key, options.force, specfile_path)

    # push tarball and signed spec json to remote mirror
    web_util.push_to_url(spackfile_path, remote_spackfile_path, keep_original=False)
    web_util.push_to_url(
        signed_specfile_path if not options.unsigned else specfile_path,
        remote_signed_specfile_path if not options.unsigned else remote_specfile_path,
        keep_original=False,
    )

    # push the key to the build cache's _pgp directory so it can be
@@ -1566,7 +1534,7 @@ def try_verify(specfile_path):
    suppress = config.get("config:suppress_gpg_warnings", False)

    try:
        spack.util.gpg.verify(specfile_path, suppress_warnings=suppress)
    except Exception:
        return False
@@ -1637,14 +1605,14 @@ def _get_valid_spec_file(path: str, max_supported_layout: int) -> Tuple[Dict, in
    return spec_dict, layout_version


def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=None):
    """
    Download binary tarball for given package into stage area, returning
    path to downloaded tarball if successful, None otherwise.

    Args:
        spec (spack.spec.Spec): Concrete spec
        unsigned: if ``True`` or ``False`` override the mirror signature verification defaults
        mirrors_for_spec (list): Optional list of concrete specs and mirrors
            obtained by calling binary_distribution.get_mirrors_for_spec().
            These will be checked in order first before looking in other
@@ -1665,7 +1633,9 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
"signature_verified": "true-if-binary-pkg-was-already-verified" "signature_verified": "true-if-binary-pkg-was-already-verified"
} }
""" """
configured_mirrors = spack.mirror.MirrorCollection(binary=True).values() configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
if not configured_mirrors: if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.") tty.die("Please add a spack mirror to allow download of pre-compiled packages.")
@@ -1683,8 +1653,16 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
    # mirror for the spec twice though.
    try_first = [i["mirror_url"] for i in mirrors_for_spec] if mirrors_for_spec else []
    try_next = [i.fetch_url for i in configured_mirrors if i.fetch_url not in try_first]
    mirror_urls = try_first + try_next

    # TODO: turn `mirrors_for_spec` into a list of Mirror instances, instead of doing that here.
    def fetch_url_to_mirror(url):
        for mirror in configured_mirrors:
            if mirror.fetch_url == url:
                return mirror
        return spack.mirror.Mirror(url)

    mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]

    tried_to_verify_sigs = []
@@ -1693,14 +1671,17 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
    # we remove support for deprecated spec formats and buildcache layouts.
    for try_signed in (True, False):
        for mirror in mirrors:
            # Override mirror's default if
            currently_unsigned = unsigned if unsigned is not None else not mirror.signed

            # If it's an OCI index, do things differently, since we cannot compose URLs.
            fetch_url = mirror.fetch_url

            # TODO: refactor this to some "nice" place.
            if fetch_url.startswith("oci://"):
                ref = spack.oci.image.ImageReference.from_string(
                    fetch_url[len("oci://") :]
                ).with_tag(spack.oci.image.default_tag(spec))

                # Fetch the manifest
                try:
@@ -1737,7 +1718,7 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
            except InvalidMetadataFile as e:
                tty.warn(
                    f"Ignoring binary package for {spec.name}/{spec.dag_hash()[:7]} "
                    f"from {fetch_url} due to invalid metadata file: {e}"
                )
                local_specfile_stage.destroy()
                continue
@@ -1759,13 +1740,16 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
"tarball_stage": tarball_stage, "tarball_stage": tarball_stage,
"specfile_stage": local_specfile_stage, "specfile_stage": local_specfile_stage,
"signature_verified": False, "signature_verified": False,
"signature_required": not currently_unsigned,
} }
else: else:
ext = "json.sig" if try_signed else "json" ext = "json.sig" if try_signed else "json"
specfile_path = url_util.join(mirror, BUILD_CACHE_RELATIVE_PATH, specfile_prefix) specfile_path = url_util.join(
fetch_url, BUILD_CACHE_RELATIVE_PATH, specfile_prefix
)
specfile_url = f"{specfile_path}.{ext}" specfile_url = f"{specfile_path}.{ext}"
spackfile_url = url_util.join(mirror, BUILD_CACHE_RELATIVE_PATH, tarball) spackfile_url = url_util.join(fetch_url, BUILD_CACHE_RELATIVE_PATH, tarball)
local_specfile_stage = try_fetch(specfile_url) local_specfile_stage = try_fetch(specfile_url)
if local_specfile_stage: if local_specfile_stage:
local_specfile_path = local_specfile_stage.save_filename local_specfile_path = local_specfile_stage.save_filename
@@ -1778,21 +1762,21 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
                    except InvalidMetadataFile as e:
                        tty.warn(
                            f"Ignoring binary package for {spec.name}/{spec.dag_hash()[:7]} "
                            f"from {fetch_url} due to invalid metadata file: {e}"
                        )
                        local_specfile_stage.destroy()
                        continue

                    if try_signed and not currently_unsigned:
                        # If we found a signed specfile at the root, try to verify
                        # the signature immediately. We will not download the
                        # tarball if we could not verify the signature.
                        tried_to_verify_sigs.append(specfile_url)
                        signature_verified = try_verify(local_specfile_path)
                        if not signature_verified:
                            tty.warn(f"Failed to verify: {specfile_url}")

                    if currently_unsigned or signature_verified or not try_signed:
                        # We will download the tarball in one of three cases:
                        # 1. user asked for --no-check-signature
                        # 2. user didn't ask for --no-check-signature, but we
@@ -1815,6 +1799,7 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
"tarball_stage": tarball_stage, "tarball_stage": tarball_stage,
"specfile_stage": local_specfile_stage, "specfile_stage": local_specfile_stage,
"signature_verified": signature_verified, "signature_verified": signature_verified,
"signature_required": not currently_unsigned,
} }
local_specfile_stage.destroy() local_specfile_stage.destroy()
@@ -2013,7 +1998,7 @@ def is_backup_file(file):
    relocate.relocate_text(text_names, prefix_to_prefix_text)


def _extract_inner_tarball(spec, filename, extract_to, signature_required: bool, remote_checksum):
    stagepath = os.path.dirname(filename)
    spackfile_name = tarball_name(spec, ".spack")
    spackfile_path = os.path.join(stagepath, spackfile_name)
@@ -2033,11 +2018,11 @@ def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum
    else:
        raise ValueError("Cannot find spec file for {0}.".format(extract_to))

    if signature_required:
        if os.path.exists("%s.asc" % specfile_path):
            suppress = config.get("config:suppress_gpg_warnings", False)
            try:
                spack.util.gpg.verify("%s.asc" % specfile_path, specfile_path, suppress)
            except Exception:
                raise NoVerifyException(
                    "Spack was unable to verify package "
@@ -2082,7 +2067,7 @@ def _tar_strip_component(tar: tarfile.TarFile, prefix: str):
            m.linkname = m.linkname[result.end() :]


def extract_tarball(spec, download_result, force=False, timer=timer.NULL_TIMER):
    """
    extract binary tarball for given package into install area
    """
@@ -2108,7 +2093,8 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
    bchecksum = spec_dict["binary_cache_checksum"]

    filename = download_result["tarball_stage"].save_filename
    signature_verified: bool = download_result["signature_verified"]
    signature_required: bool = download_result["signature_required"]
    tmpdir = None
@@ -2117,7 +2103,9 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
        # and another tarball containing the actual install tree.
        tmpdir = tempfile.mkdtemp()
        try:
            tarfile_path = _extract_inner_tarball(
                spec, filename, tmpdir, signature_required, bchecksum
            )
        except Exception as e:
            _delete_staged_downloads(download_result)
            shutil.rmtree(tmpdir)
@@ -2130,9 +2118,10 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
        # the tarball.
        tarfile_path = filename

        if signature_required and not signature_verified:
            raise UnsignedPackageException(
                "To install unsigned packages, use the --no-check-signature option, "
                "or configure the mirror with signed: false."
            )

        # compute the sha256 checksum of the tarball
@@ -2243,10 +2232,9 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
tty.debug("Verified SHA256 checksum of the build cache") tty.debug("Verified SHA256 checksum of the build cache")
# don't print long padded paths while extracting/relocating binaries # don't print long padded paths while extracting/relocating binaries
padding = spack.config.get("config:install_tree:padded_length", None) with spack.util.path.filter_padding():
with spack.util.path.filter_padding(padding=padding):
tty.msg('Installing "{0}" from a buildcache'.format(spec.format())) tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, unsigned, force) extract_tarball(spec, download_result, force)
spack.hooks.post_install(spec, False) spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout) spack.store.STORE.db.add(spec, spack.store.STORE.layout)
@@ -2283,20 +2271,12 @@ def try_direct_fetch(spec, mirrors=None):
            mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH, signed_specfile_name
        )
        try:
            _, _, fs = web_util.read_from_url(buildcache_fetch_url_signed_json)
            specfile_is_signed = True
        except (URLError, web_util.SpackWebError, HTTPError) as url_err:
            try:
                _, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
            except (URLError, web_util.SpackWebError, HTTPError) as url_err_x:
                tty.debug(
                    "Did not find {0} on {1}".format(
                        specfile_name, buildcache_fetch_url_signed_json
@@ -2400,19 +2380,10 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
tty.debug("Finding public keys in {0}".format(url_util.format(fetch_url))) tty.debug("Finding public keys in {0}".format(url_util.format(fetch_url)))
try: try:
_, _, json_file = web_util.read_from_url( _, _, json_file = web_util.read_from_url(keys_index)
keys_index,
verify_ssl=spack.config.get("config:verify_ssl", True),
timeout=spack.config.get("config:connect_timeout", 10),
)
json_index = sjson.load(codecs.getreader("utf-8")(json_file)) json_index = sjson.load(codecs.getreader("utf-8")(json_file))
except (URLError, web_util.WebError) as url_err: except (URLError, web_util.SpackWebError) as url_err:
if web_util.url_exists( if web_util.url_exists(keys_index):
keys_index,
fetch_method=spack.config.get("config:url_fetch_method", "urllib"),
verify_ssl=spack.config.get("config:verify_ssl"),
timeout=spack.config.get("config:connect_timeout", 10),
):
err_msg = [ err_msg = [
"Unable to find public keys in {0},", "Unable to find public keys in {0},",
" caught exception attempting to read from {1}.", " caught exception attempting to read from {1}.",
@@ -2443,7 +2414,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
tty.debug("Found key {0}".format(fingerprint)) tty.debug("Found key {0}".format(fingerprint))
if install: if install:
if trust: if trust:
spack.gpg.trust(stage.save_filename) spack.util.gpg.trust(stage.save_filename)
tty.debug("Added this key to trusted keys.") tty.debug("Added this key to trusted keys.")
else: else:
tty.debug( tty.debug(
@@ -2461,7 +2432,7 @@ def push_keys(*mirrors, **kwargs):
tmpdir = kwargs.get("tmpdir") tmpdir = kwargs.get("tmpdir")
remove_tmpdir = False remove_tmpdir = False
keys = spack.gpg.public_keys(*(keys or [])) keys = spack.util.gpg.public_keys(*(keys or []))
try: try:
for mirror in mirrors: for mirror in mirrors:
@@ -2493,7 +2464,7 @@ def push_keys(*mirrors, **kwargs):
                export_target = os.path.join(prefix, filename)

                # Export public keys (private is set to False)
                spack.util.gpg.export_keys(export_target, [fingerprint])

                # If mirror is local, the above export writes directly to the
                # mirror (export_target points directly to the mirror).
@@ -2502,10 +2473,7 @@ def push_keys(*mirrors, **kwargs):
                # uploaded to the mirror.
                if not keys_local:
                    spack.util.web.push_to_url(
                        export_target, url_util.join(keys_url, filename), keep_original=False
                    )

    if regenerate_index:
@@ -2539,12 +2507,7 @@ def needs_rebuild(spec, mirror_url):
    # Only check for the presence of the json version of the spec. If the
    # mirror only has the json version, or doesn't have the spec at all, we
    # need to rebuild.
    return not web_util.url_exists(specfile_path)


def check_specs_against_mirrors(mirrors, specs, output_file=None):
@@ -2710,11 +2673,7 @@ def get_remote_hash(self):
        # Failure to fetch index.json.hash is not fatal
        url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json.hash")
        try:
            response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
        except urllib.error.URLError:
            return None
@@ -2736,11 +2695,7 @@ def conditional_fetch(self) -> FetchIndexResult:
        url_index = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")

        try:
            response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
        except urllib.error.URLError as e:
            raise FetchIndexError("Could not fetch index from {}".format(url_index), e) from e
@@ -2788,11 +2743,7 @@ def conditional_fetch(self) -> FetchIndexResult:
        }

        try:
            response = self.urlopen(urllib.request.Request(url, headers=headers))
        except urllib.error.HTTPError as e:
            if e.getcode() == 304:
                # Not modified; that means fresh.


@@ -16,6 +16,7 @@
import llnl.util.filesystem as fs
from llnl.util import tty

import spack.platforms
import spack.store
import spack.util.environment
import spack.util.executable
@@ -206,17 +207,19 @@ def _root_spec(spec_str: str) -> str:
"""Add a proper compiler and target to a spec used during bootstrapping. """Add a proper compiler and target to a spec used during bootstrapping.
Args: Args:
spec_str (str): spec to be bootstrapped. Must be without compiler and target. spec_str: spec to be bootstrapped. Must be without compiler and target.
""" """
# Add a proper compiler hint to the root spec. We use GCC for # Add a compiler requirement to the root spec.
# everything but MacOS and Windows. platform = str(spack.platforms.host())
if str(spack.platforms.host()) == "darwin": if platform == "darwin":
spec_str += " %apple-clang" spec_str += " %apple-clang"
elif str(spack.platforms.host()) == "windows": elif platform == "windows":
# TODO (johnwparent): Remove version constraint when clingo patch is up # TODO (johnwparent): Remove version constraint when clingo patch is up
spec_str += " %msvc@:19.37" spec_str += " %msvc@:19.37"
else: elif platform == "linux":
spec_str += " %gcc" spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
target = archspec.cpu.host().family target = archspec.cpu.host().family
spec_str += f" target={target}" spec_str += f" target={target}"


@@ -45,8 +45,7 @@ def spec_for_current_python() -> str:
def root_path() -> str:
    """Root of all the bootstrap related folders"""
    return spack.util.path.canonicalize_path(
        spack.config.get("bootstrap:root", spack.paths.default_user_bootstrap_path)
    )
@@ -80,16 +79,12 @@ def spack_python_interpreter() -> Generator:
def _store_path() -> str:
    bootstrap_root_path = root_path()
    return spack.util.path.canonicalize_path(os.path.join(bootstrap_root_path, "store"))


def _config_path() -> str:
    bootstrap_root_path = root_path()
    return spack.util.path.canonicalize_path(os.path.join(bootstrap_root_path, "config"))


@contextlib.contextmanager


@@ -92,9 +92,7 @@ class Bootstrapper:
    def __init__(self, conf: ConfigDictionary) -> None:
        self.conf = conf
        self.name = conf["name"]
        self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])

        # Promote (relative) paths to file urls
        url = conf["info"]["url"]
@@ -388,7 +386,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
    exception_handler = GroupedExceptionHandler()

    for current_config in bootstrapping_sources():
        with exception_handler.forward(current_config["name"], Exception):
            source_is_enabled_or_raise(current_config)
            current_bootstrapper = create_bootstrapper(current_config)
            if current_bootstrapper.try_import(module, abstract_spec):
@@ -443,7 +441,7 @@ def ensure_executables_in_path_or_raise(
    exception_handler = GroupedExceptionHandler()

    for current_config in bootstrapping_sources():
        with exception_handler.forward(current_config["name"], Exception):
            source_is_enabled_or_raise(current_config)
            current_bootstrapper = create_bootstrapper(current_config)
            if current_bootstrapper.try_search_path(executables, abstract_spec):
@@ -587,9 +585,7 @@ def bootstrapping_sources(scope: Optional[str] = None):
    list_of_sources = []
    for entry in source_configs:
        current = copy.copy(entry)
        metadata_dir = spack.util.path.canonicalize_path(entry["metadata"])
        metadata_yaml = os.path.join(metadata_dir, METADATA_YAML_FILENAME)
        with open(metadata_yaml, encoding="utf-8") as stream:
            current.update(spack.util.spack_yaml.load(stream))


@@ -16,11 +16,9 @@
from llnl.util import tty

import spack.environment
import spack.tengine
import spack.util.cpus
import spack.util.executable

from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
@@ -51,8 +49,7 @@ def environment_root(cls) -> pathlib.Path:
environment_dir = f"{python_part}-{arch_part}-{interpreter_part}" environment_dir = f"{python_part}-{arch_part}-{interpreter_part}"
return pathlib.Path( return pathlib.Path(
spack.util.path.canonicalize_path( spack.util.path.canonicalize_path(
os.path.join(bootstrap_root_path, "environments", environment_dir), os.path.join(bootstrap_root_path, "environments", environment_dir)
replacements=spack.paths.path_replacements(),
) )
) )
@@ -88,12 +85,9 @@ def __init__(self) -> None:
        super().__init__(self.environment_root())

    def update_installations(self) -> None:
        """Update the installations of this environment."""
        log_enabled = tty.is_debug() or tty.is_verbose()
        with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
            specs = self.concretize()
        if specs:
            colorized_specs = [
@@ -102,10 +96,8 @@ def update_installations(self) -> None:
            ]
            tty.msg(f"[BOOTSTRAPPING] Installing dependencies ({', '.join(colorized_specs)})")
            self.write(regenerate=False)
            with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
                self.install_all()
            self.write(regenerate=True)

    def update_syspath_and_environ(self) -> None:
@@ -124,27 +116,6 @@ def update_syspath_and_environ(self) -> None:
            + [str(x) for x in self.pythonpaths()]
        )
    def _write_spack_yaml_file(self) -> None:
        tty.msg(
            "[BOOTSTRAPPING] Spack has missing dependencies, creating a bootstrapping environment"


@@ -66,7 +66,6 @@ def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
    "make": _missing("make", "required to build software from sources"),
    "patch": _missing("patch", "required to patch source code before building"),
    "tar": _missing("tar", "required to manage code archives"),
    "gzip": _missing("gzip", "required to compress/decompress code archives"),
    "unzip": _missing("unzip", "required to compress/decompress code archives"),


@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty

import spack.build_environment
import spack.builder

from .cmake import CMakeBuilder, CMakePackage
@@ -285,6 +286,19 @@ def initconfig_hardware_entries(self):
    def std_initconfig_entries(self):
        cmake_prefix_path_env = os.environ["CMAKE_PREFIX_PATH"]
        cmake_prefix_path = cmake_prefix_path_env.replace(os.pathsep, ";")
        cmake_rpaths_env = spack.build_environment.get_rpaths(self.pkg)
        cmake_rpaths_path = ";".join(cmake_rpaths_env)
        complete_rpath_list = cmake_rpaths_path
        if "SPACK_COMPILER_EXTRA_RPATHS" in os.environ:
            spack_extra_rpaths_env = os.environ["SPACK_COMPILER_EXTRA_RPATHS"]
            spack_extra_rpaths_path = spack_extra_rpaths_env.replace(os.pathsep, ";")
            complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_extra_rpaths_path)

        if "SPACK_COMPILER_IMPLICIT_RPATHS" in os.environ:
            spack_implicit_rpaths_env = os.environ["SPACK_COMPILER_IMPLICIT_RPATHS"]
            spack_implicit_rpaths_path = spack_implicit_rpaths_env.replace(os.pathsep, ";")
            complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_implicit_rpaths_path)

        return [
            "#------------------{0}".format("-" * 60),
            "# !!!! This is a generated file, edit at own risk !!!!",
@@ -292,6 +306,9 @@ def std_initconfig_entries(self):
"# CMake executable path: {0}".format(self.pkg.spec["cmake"].command.path), "# CMake executable path: {0}".format(self.pkg.spec["cmake"].command.path),
"#------------------{0}\n".format("-" * 60), "#------------------{0}\n".format("-" * 60),
cmake_cache_string("CMAKE_PREFIX_PATH", cmake_prefix_path), cmake_cache_string("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_INSTALL_RPATH_USE_LINK_PATH", "ON"),
cmake_cache_string("CMAKE_BUILD_RPATH", complete_rpath_list),
cmake_cache_string("CMAKE_INSTALL_RPATH", complete_rpath_list),
self.define_cmake_cache_from_variant("CMAKE_BUILD_TYPE", "build_type"), self.define_cmake_cache_from_variant("CMAKE_BUILD_TYPE", "build_type"),
] ]


@@ -6,13 +6,14 @@
import os
import re
import shutil
from typing import Iterable, List, Mapping, Optional

import archspec
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList

import spack.builder
import spack.config
@@ -25,14 +26,18 @@
from spack.directives import build_system, depends_on, extends, maintainers
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
from spack.util.prefix import Prefix

from ._checks import BaseBuilder, execute_install_time_tests


def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
    """Iterable that yields KEY=VALUE paths through a dictionary.

    Args:
        dictionary: Possibly nested dictionary of arbitrary keys and values.

    Yields:
        A single path through the dictionary.
    """
@@ -50,7 +55,7 @@ class PythonExtension(spack.package_base.PackageBase):
     maintainers("adamjstewart")

     @property
-    def import_modules(self):
+    def import_modules(self) -> Iterable[str]:
         """Names of modules that the Python package provides.

         These are used to test whether or not the installation succeeded.
@@ -65,7 +70,7 @@ def import_modules(self):
         detected, this property can be overridden by the package.

         Returns:
-            list: list of strings of module names
+            List of strings of module names.
         """
         modules = []
         pkg = self.spec["python"].package
@@ -102,14 +107,14 @@ def import_modules(self):
         return modules

     @property
-    def skip_modules(self):
+    def skip_modules(self) -> Iterable[str]:
         """Names of modules that should be skipped when running tests.

         These are a subset of import_modules. If a module has submodules,
         they are skipped as well (meaning a.b is skipped if a is contained).

         Returns:
-            list: list of strings of module names
+            List of strings of module names.
         """
         return []
@@ -185,12 +190,12 @@ def remove_files_from_view(self, view, merge_map):
         view.remove_files(to_remove)

-    def test_imports(self):
+    def test_imports(self) -> None:
         """Attempts to import modules of the installed package."""

         # Make sure we are importing the installed modules,
         # not the ones in the source directory
-        python = inspect.getmodule(self).python
+        python = inspect.getmodule(self).python  # type: ignore[union-attr]
         for module in self.import_modules:
             with test_part(
                 self,
@@ -315,24 +320,27 @@ class PythonPackage(PythonExtension):
     py_namespace: Optional[str] = None

     @lang.classproperty
-    def homepage(cls):
+    def homepage(cls) -> Optional[str]:  # type: ignore[override]
         if cls.pypi:
             name = cls.pypi.split("/")[0]
-            return "https://pypi.org/project/" + name + "/"
+            return f"https://pypi.org/project/{name}/"
+        return None

     @lang.classproperty
-    def url(cls):
+    def url(cls) -> Optional[str]:
         if cls.pypi:
-            return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
+            return f"https://files.pythonhosted.org/packages/source/{cls.pypi[0]}/{cls.pypi}"
+        return None

     @lang.classproperty
-    def list_url(cls):
+    def list_url(cls) -> Optional[str]:  # type: ignore[override]
         if cls.pypi:
             name = cls.pypi.split("/")[0]
-            return "https://pypi.org/simple/" + name + "/"
+            return f"https://pypi.org/simple/{name}/"
+        return None
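As a concrete illustration of the three classproperties above, this is what they derive from a hypothetical `pypi` attribute:

    pypi = "flake8/flake8-6.1.0.tar.gz"  # hypothetical `pypi` package attribute

    name = pypi.split("/")[0]
    print(f"https://pypi.org/project/{name}/")                                 # homepage
    print(f"https://files.pythonhosted.org/packages/source/{pypi[0]}/{pypi}")  # url
    print(f"https://pypi.org/simple/{name}/")                                  # list_url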
     @property
-    def headers(self):
+    def headers(self) -> HeaderList:
         """Discover header files in platlib."""

         # Remove py- prefix in package name
@@ -350,7 +358,7 @@ def headers(self):
         raise NoHeadersError(msg.format(self.spec.name, include, platlib))

     @property
-    def libs(self):
+    def libs(self) -> LibraryList:
         """Discover libraries in platlib."""

         # Remove py- prefix in package name
@@ -384,7 +392,7 @@ class PythonPipBuilder(BaseBuilder):
     install_time_test_callbacks = ["test"]

     @staticmethod
-    def std_args(cls):
+    def std_args(cls) -> List[str]:
         return [
             # Verbose
             "-vvv",
@@ -409,7 +417,7 @@ def std_args(cls):
         ]

     @property
-    def build_directory(self):
+    def build_directory(self) -> str:
         """The root directory of the Python package.

         This is usually the directory containing one of the following files:
@@ -420,51 +428,51 @@ def build_directory(self):
         """
         return self.pkg.stage.source_path

-    def config_settings(self, spec, prefix):
+    def config_settings(self, spec: Spec, prefix: Prefix) -> Mapping[str, object]:
         """Configuration settings to be passed to the PEP 517 build backend.

         Requires pip 22.1 or newer for keys that appear only a single time,
         or pip 23.1 or newer if the same key appears multiple times.

         Args:
-            spec (spack.spec.Spec): build spec
-            prefix (spack.util.prefix.Prefix): installation prefix
+            spec: Build spec.
+            prefix: Installation prefix.

         Returns:
-            dict: Possibly nested dictionary of KEY, VALUE settings
+            Possibly nested dictionary of KEY, VALUE settings.
         """
         return {}
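Package authors would override `config_settings` in a `package.py`; a hedged sketch (the package and the settings dictionary are illustrative, and pip 22.1+ is required as the docstring notes):

    from spack.package import *

    class PyExample(PythonPackage):
        """Hypothetical package passing PEP 517 config settings to its backend."""

        pypi = "example/example-1.0.tar.gz"

        def config_settings(self, spec, prefix):
            # Forwarded by pip as --config-settings KEY=VALUE pairs.
            return {"editable_mode": "compat"}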
-    def install_options(self, spec, prefix):
+    def install_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
         """Extra arguments to be supplied to the setup.py install command.

         Requires pip 23.0 or older.

         Args:
-            spec (spack.spec.Spec): build spec
-            prefix (spack.util.prefix.Prefix): installation prefix
+            spec: Build spec.
+            prefix: Installation prefix.

         Returns:
-            list: list of options
+            List of options.
         """
         return []

-    def global_options(self, spec, prefix):
+    def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
         """Extra global options to be supplied to the setup.py call before the install
         or bdist_wheel command.

         Deprecated in pip 23.1.

         Args:
-            spec (spack.spec.Spec): build spec
-            prefix (spack.util.prefix.Prefix): installation prefix
+            spec: Build spec.
+            prefix: Installation prefix.

         Returns:
-            list: list of options
+            List of options.
         """
         return []

-    def install(self, pkg, spec, prefix):
+    def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
         """Install everything from build directory."""
         args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]

View File

@@ -10,7 +10,6 @@
 import llnl.util.tty as tty

 import spack.builder
-import spack.config
 from spack.build_environment import SPACK_NO_PARALLEL_MAKE
 from spack.directives import build_system, extends, maintainers
 from spack.package_base import PackageBase
@@ -94,7 +93,7 @@ def install(self, pkg, spec, prefix):
             "--copy",
             "-i",
             "-j",
-            str(determine_number_of_jobs(parallel=parallel, config=spack.config.CONFIG)),
+            str(determine_number_of_jobs(parallel=parallel)),
             "--",
             os.getcwd(),
         ]

View File

@@ -108,6 +108,8 @@ class ROCmPackage(PackageBase):
         "gfx90a:xnack+",
         "gfx90c",
         "gfx940",
+        "gfx941",
+        "gfx942",
         "gfx1010",
         "gfx1011",
         "gfx1012",
@@ -168,6 +170,8 @@ def hip_flags(amdgpu_target):
     depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack-")
     depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack+")
     depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx940")
+    depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx941")
+    depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx942")
     depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1013")
     depends_on("llvm-amdgpu@3.8.0:", when="amdgpu_target=gfx1030")
     depends_on("llvm-amdgpu@3.9.0:", when="amdgpu_target=gfx1031")

View File

@@ -26,7 +26,7 @@ def misc_cache_location():
     providers and for which packages provide which tags.
     """
     path = spack.config.get("config:misc_cache", spack.paths.default_misc_cache_path)
-    return spack.util.path.canonicalize_path(path, replacements=spack.paths.path_replacements())
+    return spack.util.path.canonicalize_path(path)

 def _misc_cache():
@@ -49,7 +49,7 @@ def fetch_cache_location():
     path = spack.config.get("config:source_cache")
     if not path:
         path = spack.paths.default_fetch_cache_path
-    path = spack.util.path.canonicalize_path(path, replacements=spack.paths.path_replacements())
+    path = spack.util.path.canonicalize_path(path)
     return path
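The `canonicalize_path` calls above drop the explicit `replacements` argument. A hedged usage sketch (assuming the usual behavior of expanding `$spack`-style placeholders and environment variables before making the path absolute):

    import spack.util.path

    # Placeholders and environment variables are expanded, then the result
    # is made absolute (output depends on the installation).
    print(spack.util.path.canonicalize_path("$spack/var/spack/cache"))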

View File

@@ -31,13 +31,13 @@
 import spack.binary_distribution as bindist
 import spack.config as cfg
 import spack.environment as ev
-import spack.gpg
 import spack.main
 import spack.mirror
 import spack.paths
 import spack.repo
 import spack.spec
 import spack.util.git
+import spack.util.gpg as gpg_util
 import spack.util.spack_yaml as syaml
 import spack.util.url as url_util
 import spack.util.web as web_util
@@ -1454,13 +1454,13 @@ def can_sign_binaries():
     """Utility method to determine if this spack instance is capable of
     signing binary packages. This is currently only possible if the
     spack gpg keystore contains exactly one secret key."""
-    return len(spack.gpg.signing_keys()) == 1
+    return len(gpg_util.signing_keys()) == 1

 def can_verify_binaries():
     """Utility method to determine if this spack instance is capable (at
     least in theory) of verifying signed binaries."""
-    return len(spack.gpg.public_keys()) >= 1
+    return len(gpg_util.public_keys()) >= 1

 def _push_mirror_contents(input_spec, sign_binaries, mirror_url):
@@ -1756,11 +1756,7 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
     gpg_path = None
     if gpg_url:
-        gpg_path = web_util.fetch_url_text(
-            gpg_url,
-            dest_dir=os.path.join(work_dir, "_pgp"),
-            fetch_method=spack.config.get("config:url_fetch_method"),
-        )
+        gpg_path = web_util.fetch_url_text(gpg_url, dest_dir=os.path.join(work_dir, "_pgp"))
         rel_gpg_path = gpg_path.replace(work_dir, "").lstrip(os.path.sep)

     lock_file = fs.find(work_dir, "spack.lock")[0]
@@ -2117,11 +2113,7 @@ def write_broken_spec(url, pkg_name, stack_name, job_url, pipeline_url, spec_dic
         with open(file_path, "w") as fd:
             fd.write(syaml.dump(broken_spec_details))
         web_util.push_to_url(
-            file_path,
-            url,
-            keep_original=False,
-            extra_args={"ContentType": "text/plain"},
-            verify_ssl=spack.config.get("config:verify_ssl", True),
+            file_path, url, keep_original=False, extra_args={"ContentType": "text/plain"}
         )
     except Exception as err:
         # If there is an S3 error (e.g., access denied or connection
@@ -2138,12 +2130,8 @@ def read_broken_spec(broken_spec_url):
     object.
     """
     try:
-        _, _, fs = web_util.read_from_url(
-            broken_spec_url,
-            verify_ssl=cfg.get("config:verify_ssl", True),
-            timeout=cfg.get("config:connect_timeout", 10),
-        )
-    except (URLError, web_util.WebError, HTTPError):
+        _, _, fs = web_util.read_from_url(broken_spec_url)
+    except (URLError, web_util.SpackWebError, HTTPError):
         tty.warn("Unable to read broken spec from {0}".format(broken_spec_url))
         return None

View File

@@ -6,10 +6,8 @@
 import argparse
 import os
 import re
-import shlex
 import sys
-from textwrap import dedent
-from typing import List, Match, Tuple
+from typing import List, Union

 import llnl.string
 import llnl.util.tty as tty
@@ -147,90 +145,38 @@ def get_command(cmd_name):
     return getattr(get_module(cmd_name), pname)

-class _UnquotedFlags:
-    """Use a heuristic in `.extract()` to detect whether the user is trying to set
-    multiple flags like the docker ENV attribute allows (e.g. 'cflags=-Os -pipe').
-
-    If the heuristic finds a match (which can be checked with `__bool__()`), a warning
-    message explaining how to quote multiple flags correctly can be generated with
-    `.report()`.
-    """
-
-    flags_arg_pattern = re.compile(
-        r'^({0})=([^\'"].*)$'.format("|".join(spack.spec.FlagMap.valid_compiler_flags()))
-    )
-
-    def __init__(self, all_unquoted_flag_pairs: List[Tuple[Match[str], str]]):
-        self._flag_pairs = all_unquoted_flag_pairs
-
-    def __bool__(self) -> bool:
-        return bool(self._flag_pairs)
-
-    @classmethod
-    def extract(cls, sargs: str) -> "_UnquotedFlags":
-        all_unquoted_flag_pairs: List[Tuple[Match[str], str]] = []
-        prev_flags_arg = None
-        for arg in shlex.split(sargs):
-            if prev_flags_arg is not None:
-                all_unquoted_flag_pairs.append((prev_flags_arg, arg))
-            prev_flags_arg = cls.flags_arg_pattern.match(arg)
-        return cls(all_unquoted_flag_pairs)
-
-    def report(self) -> str:
-        single_errors = [
-            "({0}) {1} {2} => {3}".format(
-                i + 1,
-                match.group(0),
-                next_arg,
-                '{0}="{1} {2}"'.format(match.group(1), match.group(2), next_arg),
-            )
-            for i, (match, next_arg) in enumerate(self._flag_pairs)
-        ]
-        return dedent(
-            """\
-            Some compiler or linker flags were provided without quoting their arguments,
-            which now causes spack to try to parse the *next* argument as a spec component
-            such as a variant instead of an additional compiler or linker flag. If the
-            intent was to set multiple flags, try quoting them together as described below.
-
-            Possible flag quotation errors (with the correctly-quoted version after the =>):
-            {0}"""
-        ).format("\n".join(single_errors))
+def quote_kvp(string: str) -> str:
+    """For strings like ``name=value`` or ``name==value``, quote and escape the value if needed.
+
+    This is a compromise to respect quoting of key-value pairs on the CLI. The shell
+    strips quotes from quoted arguments, so we cannot know *exactly* how CLI arguments
+    were quoted. To compensate, we re-add quotes around anything starting with ``name=``
+    or ``name==``, and we assume the rest of the argument is the value. This covers the
+    common cases of passing flags, e.g., ``cflags="-O2 -g"`` on the command line.
+    """
+    match = re.match(spack.parser.SPLIT_KVP, string)
+    if not match:
+        return string
+
+    key, delim, value = match.groups()
+    return f"{key}{delim}{spack.parser.quote_if_needed(value)}"
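A usage sketch for `quote_kvp`, assuming `SPLIT_KVP` matches a leading `name=`/`name==` and `quote_if_needed` only adds quotes when the value contains characters such as spaces:

    quote_kvp("cflags=-O2 -g")  # -> cflags="-O2 -g" (value re-quoted)
    quote_kvp("foo==bar")       # the '==' delimiter is preserved
    quote_kvp("zlib@1.3")       # no key=value prefix; returned unchanged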
-def parse_specs(args, **kwargs):
+def parse_specs(
+    args: Union[str, List[str]], concretize: bool = False, tests: bool = False
+) -> List[spack.spec.Spec]:
     """Convenience function for parsing arguments from specs. Handles common
     exceptions and dies if there are errors.
     """
-    concretize = kwargs.get("concretize", False)
-    normalize = kwargs.get("normalize", False)
-    tests = kwargs.get("tests", False)
-
-    sargs = args
-    if not isinstance(args, str):
-        sargs = " ".join(args)
-    unquoted_flags = _UnquotedFlags.extract(sargs)
-
-    try:
-        specs = spack.parser.parse(sargs)
-        for spec in specs:
-            if concretize:
-                spec.concretize(tests=tests)  # implies normalize
-            elif normalize:
-                spec.normalize(tests=tests)
-        return specs
-    except spack.error.SpecError as e:
-        msg = e.message
-        if e.long_message:
-            msg += e.long_message
-        # Unquoted flags will be read as a variant or hash
-        if unquoted_flags and ("variant" in msg or "hash" in msg):
-            msg += "\n\n"
-            msg += unquoted_flags.report()
-        raise spack.error.SpackError(msg) from e
+    args = [args] if isinstance(args, str) else args
+    arg_string = " ".join([quote_kvp(arg) for arg in args])
+
+    specs = spack.parser.parse(arg_string)
+    for spec in specs:
+        if concretize:
+            spec.concretize(tests=tests)
+    return specs
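Tying this to the quoting fix, a hedged sketch of what the new `parse_specs` receives from a command line such as `spack install hdf5 cflags='-O2 -g'` (the package is hypothetical; the shell has already stripped the quotes):

    # argv elements as the shell delivers them:
    specs = parse_specs(["hdf5", "cflags=-O2 -g"])
    # quote_kvp re-quotes the value, so the parser effectively sees:
    #   hdf5 cflags="-O2 -g"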
def matching_spec_from_env(spec):
    """

View File

@@ -18,7 +18,6 @@
 import spack.config
 import spack.main
 import spack.mirror
-import spack.paths
 import spack.spec
 import spack.stage
 import spack.util.path
@@ -192,9 +191,7 @@ def _root(args):
     root = spack.config.get("bootstrap:root", default=None, scope=args.scope)
     if root:
-        root = spack.util.path.canonicalize_path(
-            root, replacements=spack.paths.path_replacements()
-        )
+        root = spack.util.path.canonicalize_path(root)
     print(root)
@@ -338,9 +335,7 @@ def _add(args):
         raise RuntimeError(msg.format(args.name))

     # Check that the metadata file exists
-    metadata_dir = spack.util.path.canonicalize_path(
-        args.metadata_dir, replacements=spack.paths.path_replacements()
-    )
+    metadata_dir = spack.util.path.canonicalize_path(args.metadata_dir)
     if not os.path.exists(metadata_dir) or not os.path.isdir(metadata_dir):
         raise RuntimeError('the directory "{0}" does not exist'.format(args.metadata_dir))
@@ -389,9 +384,7 @@ def _remove(args):
 def _mirror(args):
-    mirror_dir = spack.util.path.canonicalize_path(
-        os.path.join(args.root_dir, LOCAL_MIRROR_DIR), replacements=spack.paths.path_replacements()
-    )
+    mirror_dir = spack.util.path.canonicalize_path(os.path.join(args.root_dir, LOCAL_MIRROR_DIR))

     # TODO: Here we are adding gnuconfig manually, but this can be fixed
     # TODO: as soon as we have an option to add to a mirror all the possible
@@ -440,24 +433,9 @@ def write_metadata(subdir, metadata):
         instructions += cmd.format("local-sources", rel_directory)
     if args.binary_packages:
         abs_directory, rel_directory = write_metadata(subdir="binaries", metadata=BINARY_METADATA)
-        shutil.copy(
-            spack.util.path.canonicalize_path(
-                CLINGO_JSON, replacements=spack.paths.path_replacements()
-            ),
-            abs_directory,
-        )
-        shutil.copy(
-            spack.util.path.canonicalize_path(
-                GNUPG_JSON, replacements=spack.paths.path_replacements()
-            ),
-            abs_directory,
-        )
-        shutil.copy(
-            spack.util.path.canonicalize_path(
-                PATCHELF_JSON, replacements=spack.paths.path_replacements()
-            ),
-            abs_directory,
-        )
+        shutil.copy(spack.util.path.canonicalize_path(CLINGO_JSON), abs_directory)
+        shutil.copy(spack.util.path.canonicalize_path(GNUPG_JSON), abs_directory)
+        shutil.copy(spack.util.path.canonicalize_path(PATCHELF_JSON), abs_directory)
         instructions += cmd.format("local-binaries", rel_directory)
     print(instructions)

View File

@@ -76,7 +76,19 @@ def setup_parser(subparser: argparse.ArgumentParser):
     )
     push_sign = push.add_mutually_exclusive_group(required=False)
     push_sign.add_argument(
-        "--unsigned", "-u", action="store_true", help="push unsigned buildcache tarballs"
+        "--unsigned",
+        "-u",
+        action="store_false",
+        dest="signed",
+        default=None,
+        help="push unsigned buildcache tarballs",
+    )
+    push_sign.add_argument(
+        "--signed",
+        action="store_true",
+        dest="signed",
+        default=None,
+        help="push signed buildcache tarballs",
     )
     push_sign.add_argument(
         "--key", "-k", metavar="key", type=str, default=None, help="key for signing"
@@ -188,14 +200,16 @@ def setup_parser(subparser: argparse.ArgumentParser):
         default=lambda: spack.config.default_modify_scope(),
         help="configuration scope containing mirrors to check",
     )
-    check_spec_or_specfile = check.add_mutually_exclusive_group(required=True)
-    check_spec_or_specfile.add_argument(
+    # Unfortunately there are 3 ways to do the same thing here:
+    check_specs = check.add_mutually_exclusive_group()
+    check_specs.add_argument(
         "-s", "--spec", help="check single spec instead of release specs file"
     )
-    check_spec_or_specfile.add_argument(
+    check_specs.add_argument(
         "--spec-file",
         help="check single spec from json or yaml file instead of release specs file",
     )
+    arguments.add_common_arguments(check, ["specs"])

     check.set_defaults(func=check_fn)
@@ -326,17 +340,27 @@ def push_fn(args):
             "The flag `--allow-root` is the default in Spack 0.21, will be removed in Spack 0.22"
         )

+    mirror: spack.mirror.Mirror = args.mirror
+
     # Check if this is an OCI image.
     try:
-        image_ref = spack.oci.oci.image_from_mirror(args.mirror)
+        image_ref = spack.oci.oci.image_from_mirror(mirror)
     except ValueError:
         image_ref = None

+    push_url = mirror.push_url
+
+    # When neither --signed, --unsigned nor --key are specified, use the mirror's default.
+    if args.signed is None and not args.key:
+        unsigned = not mirror.signed
+    else:
+        unsigned = not (args.key or args.signed)
+
     # For OCI images, we require dependencies to be pushed for now.
     if image_ref:
         if "dependencies" not in args.things_to_install:
             tty.die("Dependencies must be pushed for OCI images.")
-        if not args.unsigned:
+        if not unsigned:
             tty.warn(
                 "Code signing is currently not supported for OCI images. "
                 "Use --unsigned to silence this warning."
@@ -349,12 +373,10 @@ def push_fn(args):
         dependencies="dependencies" in args.things_to_install,
     )

-    url = args.mirror.push_url
-
     # When pushing multiple specs, print the url once ahead of time, as well as how
     # many specs are being pushed.
     if len(specs) > 1:
-        tty.info(f"Selected {len(specs)} specs to push to {url}")
+        tty.info(f"Selected {len(specs)} specs to push to {push_url}")

     failed = []
@@ -371,10 +393,10 @@ def push_fn(args):
         try:
             bindist.push_or_raise(
                 spec,
-                url,
+                push_url,
                 bindist.PushOptions(
                     force=args.force,
-                    unsigned=args.unsigned,
+                    unsigned=unsigned,
                     key=args.key,
                     regenerate_index=args.update_index,
                 ),
@@ -382,7 +404,7 @@ def push_fn(args):
             msg = f"{_progress(i, len(specs))}Pushed {_format_spec(spec)}"
             if len(specs) == 1:
-                msg += f" to {url}"
+                msg += f" to {push_url}"
             tty.info(msg)

         except bindist.NoOverwriteException:
@@ -813,15 +835,24 @@ def check_fn(args: argparse.Namespace):
     exit code is non-zero, then at least one of the indicated specs needs to be rebuilt
     """
     if args.spec_file:
+        specs_arg = (
+            args.spec_file if os.path.sep in args.spec_file else os.path.join(".", args.spec_file)
+        )
         tty.warn(
             "The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
-            "Use --spec instead."
+            f"Use `spack buildcache check {specs_arg}` instead."
         )
+    elif args.spec:
+        specs_arg = args.spec
+        tty.warn(
+            "The flag `--spec` is deprecated and will be removed in Spack 0.23. "
+            f"Use `spack buildcache check {specs_arg}` instead."
+        )
+    else:
+        specs_arg = args.specs

-    specs = spack.cmd.parse_specs(args.spec or args.spec_file)
-    if specs:
-        specs = _matching_specs(specs)
+    if specs_arg:
+        specs = _matching_specs(spack.cmd.parse_specs(specs_arg))
     else:
         specs = spack.cmd.require_active_env("buildcache check").all_specs()
@@ -918,12 +949,7 @@ def copy_buildcache_file(src_url, dest_url, local_path=None):
     try:
         temp_stage.create()
         temp_stage.fetch()
-        web_util.push_to_url(
-            local_path,
-            dest_url,
-            keep_original=True,
-            verify_ssl=spack.config.get("config:verify_ssl", True),
-        )
+        web_util.push_to_url(local_path, dest_url, keep_original=True)
     except spack.error.FetchError as e:
         # Expected, since we have to try all the possible extensions
         tty.debug("no such file: {0}".format(src_url))

View File

@@ -11,7 +11,6 @@
 from llnl.util import tty

 import spack.cmd
-import spack.config
 import spack.repo
 import spack.spec
 import spack.stage
@@ -276,4 +275,4 @@ def add_versions_to_package(pkg: PackageBase, version_lines: str):
     tty.msg(f"Open {filename} to review the additions.")

     if sys.stdout.isatty():
-        editor(filename, debug=spack.config.get("config:debug"))
+        editor(filename)

View File

@@ -16,9 +16,10 @@
 import spack.cmd.buildcache as buildcache
 import spack.config as cfg
 import spack.environment as ev
-import spack.gpg
+import spack.environment.depfile
 import spack.hash_types as ht
 import spack.mirror
+import spack.util.gpg as gpg_util
 import spack.util.timer as timer
 import spack.util.url as url_util
 import spack.util.web as web_util
@@ -305,7 +306,7 @@ def ci_rebuild(args):
     # Fail early if signing is required but we don't have a signing key
     sign_binaries = require_signing is not None and require_signing.lower() == "true"
     if sign_binaries and not spack_ci.can_sign_binaries():
-        spack.gpg.list(False, True)
+        gpg_util.list(False, True)
         tty.die("SPACK_REQUIRE_SIGNING=True => spack must have exactly one signing key")

     # Construct absolute paths relative to current $CI_PROJECT_DIR
@@ -606,7 +607,9 @@ def ci_rebuild(args):
                 "SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
                 "-j$(nproc)",
                 "install-deps/{}".format(
-                    ev.depfile.MakefileSpec(job_spec).safe_format("{name}-{version}-{hash}")
+                    spack.environment.depfile.MakefileSpec(job_spec).safe_format(
+                        "{name}-{version}-{hash}"
+                    )
                 ),
             ],
             spack_cmd + ["install"] + root_install_args,
@@ -730,17 +733,10 @@ def ci_rebuild(args):
             broken_specs_url = ci_config["broken-specs-url"]
             just_built_hash = job_spec.dag_hash()
             broken_spec_path = url_util.join(broken_specs_url, just_built_hash)
-            if web_util.url_exists(
-                broken_spec_path,
-                fetch_method=cfg.get("config:url_fetch_method", "urllib"),
-                verify_ssl=cfg.get("config:verify_ssl"),
-                timeout=cfg.get("config:connect_timeout", 10),
-            ):
+            if web_util.url_exists(broken_spec_path):
                 tty.msg("Removing {0} from the list of broken specs".format(broken_spec_path))
                 try:
-                    web_util.remove_url(
-                        broken_spec_path, verify_ssl=cfg.get("config:verify_ssl", True)
-                    )
+                    web_util.remove_url(broken_spec_path)
                 except Exception as err:
                     # If there is an S3 error (e.g., access denied or connection
                     # error), the first non boto-specific class in the exception

View File

@@ -14,7 +14,6 @@
 import spack.caches
 import spack.cmd.test
 import spack.config
-import spack.paths
 import spack.repo
 import spack.stage
 import spack.store
@@ -134,9 +133,7 @@ def clean(parser, args):
         remove_python_cache()

     if args.bootstrap:
-        bootstrap_prefix = spack.util.path.canonicalize_path(
-            spack.config.get("bootstrap:root"), replacements=spack.paths.path_replacements()
-        )
+        bootstrap_prefix = spack.util.path.canonicalize_path(spack.config.get("bootstrap:root"))
         msg = 'Removing bootstrapped software and configuration in "{0}"'
         tty.msg(msg.format(bootstrap_prefix))
         llnl.util.filesystem.remove_directory_contents(bootstrap_prefix)

View File

@@ -384,10 +384,11 @@ def install_status():
         "--install-status",
         action="store_true",
         default=True,
-        help="show install status of packages\n\npackages can be: "
-        "installed [+], missing and needed by an installed package [-], "
-        "installed in an upstream instance [^], "
-        "or not installed (no annotation)",
+        help=(
+            "show install status of packages\n"
+            "[+] installed [^] installed in an upstream\n"
+            " - not installed [-] missing dep of installed package\n"
+        ),
     )

View File

@@ -180,7 +180,7 @@ def config_edit(args):
     if args.print_file:
         print(config_file)
     else:
-        editor(config_file, debug=spack.config.get("config:debug"))
+        editor(config_file)

 def config_list(args):

View File

@@ -11,7 +11,6 @@
 import llnl.util.tty as tty
 from llnl.util.filesystem import mkdirp

-import spack.config
 import spack.repo
 import spack.stage
 import spack.util.web
@@ -987,4 +986,4 @@ def create(parser, args):
     # Optionally open up the new package file in your $EDITOR
     if not args.skip_editor:
-        editor(pkg_path, debug=spack.config.get("config:debug"))
+        editor(pkg_path)

View File

@@ -8,7 +8,6 @@
 import llnl.util.tty as tty

 import spack.cmd
-import spack.paths
 import spack.spec
 import spack.util.path
 import spack.version
@@ -56,9 +55,7 @@ def develop(parser, args):
         # download all dev specs
         for name, entry in env.dev_specs.items():
             path = entry.get("path", name)
-            abspath = spack.util.path.canonicalize_path(
-                path, default_wd=env.path, replacements=spack.paths.path_replacements()
-            )
+            abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)

             if os.path.exists(abspath):
                 msg = "Skipping developer download of %s" % entry["spec"]
@@ -89,9 +86,7 @@ def develop(parser, args):
     # default path is relative path to spec.name
     path = args.path or spec.name
-    abspath = spack.util.path.canonicalize_path(
-        path, default_wd=env.path, replacements=spack.paths.path_replacements()
-    )
+    abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)

     # clone default: only if the path doesn't exist
     clone = args.clone

View File

@@ -9,7 +9,6 @@
 import llnl.util.tty as tty

 import spack.cmd
-import spack.config
 import spack.paths
 import spack.repo
 from spack.spec import Spec
@@ -46,7 +45,7 @@ def edit_package(name, repo_path, namespace):
     else:
         raise spack.repo.UnknownPackageError(spec.name)

-    editor(path, debug=spack.config.get("config:debug"))
+    editor(path)

 def setup_parser(subparser):

View File

@@ -7,9 +7,9 @@
 import os

 import spack.binary_distribution
-import spack.gpg
 import spack.mirror
 import spack.paths
+import spack.util.gpg
 import spack.util.url
 from spack.cmd.common import arguments
@@ -129,38 +129,40 @@ def setup_parser(subparser):
 def gpg_create(args):
     """create a new key"""
     if args.export or args.secret:
-        old_sec_keys = spack.gpg.signing_keys()
+        old_sec_keys = spack.util.gpg.signing_keys()

     # Create the new key
-    spack.gpg.create(name=args.name, email=args.email, comment=args.comment, expires=args.expires)
+    spack.util.gpg.create(
+        name=args.name, email=args.email, comment=args.comment, expires=args.expires
+    )
     if args.export or args.secret:
-        new_sec_keys = set(spack.gpg.signing_keys())
+        new_sec_keys = set(spack.util.gpg.signing_keys())
         new_keys = new_sec_keys.difference(old_sec_keys)
         if args.export:
-            spack.gpg.export_keys(args.export, new_keys)
+            spack.util.gpg.export_keys(args.export, new_keys)
         if args.secret:
-            spack.gpg.export_keys(args.secret, new_keys, secret=True)
+            spack.util.gpg.export_keys(args.secret, new_keys, secret=True)

 def gpg_export(args):
     """export a gpg key, optionally including secret key"""
     keys = args.keys
     if not keys:
-        keys = spack.gpg.signing_keys()
-    spack.gpg.export_keys(args.location, keys, args.secret)
+        keys = spack.util.gpg.signing_keys()
+    spack.util.gpg.export_keys(args.location, keys, args.secret)

 def gpg_list(args):
     """list keys available in the keyring"""
-    spack.gpg.list(args.trusted, args.signing)
+    spack.util.gpg.list(args.trusted, args.signing)

 def gpg_sign(args):
     """sign a package"""
     key = args.key
     if key is None:
-        keys = spack.gpg.signing_keys()
+        keys = spack.util.gpg.signing_keys()
         if len(keys) == 1:
             key = keys[0]
         elif not keys:
@@ -171,12 +173,12 @@ def gpg_sign(args):
     if not output:
         output = args.spec[0] + ".asc"
     # TODO: Support the package format Spack creates.
-    spack.gpg.sign(key, " ".join(args.spec), output, args.clearsign)
+    spack.util.gpg.sign(key, " ".join(args.spec), output, args.clearsign)

 def gpg_trust(args):
     """add a key to the keyring"""
-    spack.gpg.trust(args.keyfile)
+    spack.util.gpg.trust(args.keyfile)

 def gpg_init(args):
@@ -189,12 +191,12 @@ def gpg_init(args):
         for filename in filenames:
             if not filename.endswith(".key"):
                 continue
-            spack.gpg.trust(os.path.join(root, filename))
+            spack.util.gpg.trust(os.path.join(root, filename))

 def gpg_untrust(args):
     """remove a key from the keyring"""
-    spack.gpg.untrust(args.signing, *args.keys)
+    spack.util.gpg.untrust(args.signing, *args.keys)

 def gpg_verify(args):
@@ -203,7 +205,7 @@ def gpg_verify(args):
     signature = args.signature
     if signature is None:
         signature = args.spec[0] + ".asc"
-    spack.gpg.verify(signature, " ".join(args.spec))
+    spack.util.gpg.verify(signature, " ".join(args.spec))

 def gpg_publish(args):

View File

@@ -327,7 +327,7 @@ def _variants_by_name_when(pkg):
     """Adaptor to get variants keyed by { name: { when: { [Variant...] } }."""
     # TODO: replace with pkg.variants_by_name(when=True) when unified directive dicts are merged.
     variants = {}
-    for name, (variant, whens) in pkg.variants.items():
+    for name, (variant, whens) in sorted(pkg.variants.items()):
         for when in whens:
             variants.setdefault(name, {}).setdefault(when, []).append(variant)
     return variants

View File

@@ -162,8 +162,8 @@ def setup_parser(subparser):
         "--no-check-signature",
         action="store_true",
         dest="unsigned",
-        default=False,
-        help="do not check signatures of binary packages",
+        default=None,
+        help="do not check signatures of binary packages (override mirror config)",
     )
     subparser.add_argument(
         "--show-log-on-error",

View File

@@ -107,6 +107,23 @@ def setup_parser(subparser):
             "and source use `--type binary --type source` (default)"
         ),
     )
+    add_parser_signed = add_parser.add_mutually_exclusive_group(required=False)
+    add_parser_signed.add_argument(
+        "--unsigned",
+        help="do not require signing and signature verification when pushing and installing from "
+        "this build cache",
+        action="store_false",
+        default=None,
+        dest="signed",
+    )
+    add_parser_signed.add_argument(
+        "--signed",
+        help="require signing and signature verification when pushing and installing from this "
+        "build cache",
+        action="store_true",
+        default=None,
+        dest="signed",
+    )
     arguments.add_connection_args(add_parser, False)
     # Remove
     remove_parser = sp.add_parser("remove", aliases=["rm"], help=mirror_remove.__doc__)
@@ -157,6 +174,23 @@ def setup_parser(subparser):
         ),
     )
     set_parser.add_argument("--url", help="url of mirror directory from 'spack mirror create'")
+    set_parser_unsigned = set_parser.add_mutually_exclusive_group(required=False)
+    set_parser_unsigned.add_argument(
+        "--unsigned",
+        help="do not require signing and signature verification when pushing and installing from "
+        "this build cache",
+        action="store_false",
+        default=None,
+        dest="signed",
+    )
+    set_parser_unsigned.add_argument(
+        "--signed",
+        help="require signing and signature verification when pushing and installing from this "
+        "build cache",
+        action="store_true",
+        default=None,
+        dest="signed",
+    )
     set_parser.add_argument(
         "--scope",
         action=arguments.ConfigScope,
@@ -186,6 +220,7 @@ def mirror_add(args):
         or args.type
         or args.oci_username
         or args.oci_password
+        or args.signed is not None
     ):
         connection = {"url": args.url}
         if args.s3_access_key_id and args.s3_access_key_secret:
@@ -201,6 +236,8 @@ def mirror_add(args):
         if args.type:
             connection["binary"] = "binary" in args.type
             connection["source"] = "source" in args.type
+        if args.signed is not None:
+            connection["signed"] = args.signed
         mirror = spack.mirror.Mirror(connection, name=args.name)
     else:
         mirror = spack.mirror.Mirror(args.url, name=args.name)
@@ -233,6 +270,8 @@ def _configure_mirror(args):
         changes["endpoint_url"] = args.s3_endpoint_url
     if args.oci_username and args.oci_password:
         changes["access_pair"] = [args.oci_username, args.oci_password]
+    if getattr(args, "signed", None) is not None:
+        changes["signed"] = args.signed

     # argparse cannot distinguish between --binary and --no-binary when same dest :(
     # notice that set-url does not have these args, so getattr
@@ -495,9 +534,7 @@ def mirror_destroy(args):
     elif args.mirror_url:
         mirror_url = args.mirror_url

-    web_util.remove_url(
-        mirror_url, recursive=True, verify_ssl=spack.config.get("config:verify_ssl", True)
-    )
+    web_util.remove_url(mirror_url, recursive=True)

 def mirror(parser, args):

View File

@@ -12,10 +12,10 @@
 from llnl.util.tty.colify import colify

 import spack.cmd
-import spack.package_hash as ph
 import spack.paths
 import spack.repo
 import spack.util.executable as exe
+import spack.util.package_hash as ph
 from spack.cmd.common import arguments

 description = "query packages associated with particular git revisions"

View File

@@ -9,7 +9,6 @@
 import llnl.util.tty as tty

 import spack.config
-import spack.paths
 import spack.repo
 import spack.util.path
 from spack.cmd.common import arguments
@@ -84,9 +83,7 @@ def repo_add(args):
     path = args.path

     # real_path is absolute and handles substitution.
-    canon_path = spack.util.path.canonicalize_path(
-        path, replacements=spack.paths.path_replacements()
-    )
+    canon_path = spack.util.path.canonicalize_path(path)

     # check if the path exists
     if not os.path.exists(canon_path):
@@ -118,13 +115,9 @@ def repo_remove(args):
     namespace_or_path = args.namespace_or_path

     # If the argument is a path, remove that repository from config.
-    canon_path = spack.util.path.canonicalize_path(
-        namespace_or_path, replacements=spack.paths.path_replacements()
-    )
+    canon_path = spack.util.path.canonicalize_path(namespace_or_path)
     for repo_path in repos:
-        repo_canon_path = spack.util.path.canonicalize_path(
-            repo_path, replacements=spack.paths.path_replacements()
-        )
+        repo_canon_path = spack.util.path.canonicalize_path(repo_path)
         if canon_path == repo_canon_path:
             repos.remove(repo_path)
             spack.config.set("repos", repos, args.scope)

View File

@@ -11,9 +11,9 @@
 import spack
 import spack.config
-import spack.gpg
 import spack.paths
 import spack.util.git
+import spack.util.gpg
 from spack.cmd.common import arguments
 from spack.util.spack_yaml import syaml_dict
@@ -76,7 +76,7 @@ def tutorial(parser, args):
     spack.config.set("mirrors", mirror_config, scope="user")

     tty.msg("Ensuring that we trust tutorial binaries", f"spack gpg trust {tutorial_key}")
-    spack.gpg.trust(tutorial_key)
+    spack.util.gpg.trust(tutorial_key)

     # Note that checkout MUST be last. It changes Spack under our feet.
     # If you don't put this last, you'll get import errors for the code

View File

@@ -227,9 +227,7 @@ def unit_test(parser, args, unknown_args):
     # has been used, then test that extension.
     pytest_root = spack.paths.spack_root
     if args.extension:
-        target = args.extension
-        extensions = spack.extensions.get_extension_paths()
-        pytest_root = spack.extensions.path_for_extension(target, *extensions)
+        pytest_root = spack.extensions.load_extension(args.extension)

     # pytest.ini lives in the root of the spack repository.
     with llnl.util.filesystem.working_dir(pytest_root):

View File

@@ -20,16 +20,16 @@ def __init__(self, *args, **kwargs):
         self.version_argument = "-V"

     # Subclasses use possible names of C compiler
-    cc_names = ["craycc", "cc"]
+    cc_names = ["craycc"]

     # Subclasses use possible names of C++ compiler
-    cxx_names = ["crayCC", "CC"]
+    cxx_names = ["crayCC"]

     # Subclasses use possible names of Fortran 77 compiler
-    f77_names = ["crayftn", "ftn"]
+    f77_names = ["crayftn"]

     # Subclasses use possible names of Fortran 90 compiler
-    fc_names = ["crayftn", "ftn"]
+    fc_names = ["crayftn"]

     # MacPorts builds gcc versions with prefixes and -mp-X.Y suffixes.
     suffixes = [r"-mp-\d\.\d"]

View File

@@ -31,7 +31,6 @@
 import spack.config
 import spack.environment
 import spack.error
-import spack.paths
 import spack.platforms
 import spack.repo
 import spack.spec
@@ -92,9 +91,7 @@ def concretize_develop(self, spec):
         if not dev_info:
             return False

-        path = spack.util.path.canonicalize_path(
-            dev_info["path"], default_wd=env.path, replacements=spack.paths.path_replacements()
-        )
+        path = spack.util.path.canonicalize_path(dev_info["path"], default_wd=env.path)

         if "dev_path" in spec.variants:
             assert spec.variants["dev_path"].value == path

View File

@@ -1451,9 +1451,7 @@ def fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) ->
     def _fetch_file(url):
         raw = raw_github_gitlab_url(url)
         tty.debug("Reading config from url {0}".format(raw))
-        return web_util.fetch_url_text(
-            raw, dest_dir=dest_dir, fetch_method=CONFIG.get("config:url_fetch_method")
-        )
+        return web_util.fetch_url_text(raw, dest_dir=dest_dir)

     if not url:
         raise ConfigFileError("Cannot retrieve configuration without a URL")

View File

@@ -309,10 +309,14 @@ def find_windows_kit_roots() -> List[str]:
         return glob.glob(kit_base)

     @staticmethod
-    def find_windows_kit_bin_paths(kit_base: Optional[str] = None) -> List[str]:
+    def find_windows_kit_bin_paths(
+        kit_base: Union[Optional[str], Optional[list]] = None
+    ) -> List[str]:
         """Returns Windows kit bin directory per version"""
         kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
         assert kit_base, "Unexpectedly empty value for Windows kit base path"
+        if isinstance(kit_base, str):
+            kit_base = kit_base.split(";")
         kit_paths = []
         for kit in kit_base:
             kit_bin = os.path.join(kit, "bin")
@@ -320,10 +324,14 @@ def find_windows_kit_bin_paths(kit_base: Optional[str] = None) -> List[str]:
         return kit_paths

     @staticmethod
-    def find_windows_kit_lib_paths(kit_base: Optional[str] = None) -> List[str]:
+    def find_windows_kit_lib_paths(
+        kit_base: Union[Optional[str], Optional[list]] = None
+    ) -> List[str]:
         """Returns Windows kit lib directory per version"""
         kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
         assert kit_base, "Unexpectedly empty value for Windows kit base path"
+        if isinstance(kit_base, str):
+            kit_base = kit_base.split(";")
         kit_paths = []
         for kit in kit_base:
             kit_lib = os.path.join(kit, "Lib")
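The normalization both methods gain above, pulled out as a standalone sketch (the helper name and paths are hypothetical):

    from typing import List, Optional, Union

    def normalize_kit_base(kit_base: Union[Optional[str], Optional[list]]) -> List[str]:
        # A ";"-separated string (as Windows environment values often are)
        # becomes a list; a list passes through unchanged.
        if isinstance(kit_base, str):
            return kit_base.split(";")
        return list(kit_base or [])

    print(normalize_kit_base(r"C:\Kits\10;C:\Kits\11"))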

View File

@@ -90,8 +90,7 @@
 def env_root_path():
     """Override default root path if the user specified it"""
     return spack.util.path.canonicalize_path(
-        spack.config.get("config:environments_root", default=default_env_path),
-        replacements=spack.paths.path_replacements(),
+        spack.config.get("config:environments_root", default=default_env_path)
     )
@@ -479,9 +478,7 @@ def __init__(
     ):
         self.base = base_path
         self.raw_root = root
-        self.root = spack.util.path.canonicalize_path(
-            root, default_wd=base_path, replacements=spack.paths.path_replacements()
-        )
+        self.root = spack.util.path.canonicalize_path(root, default_wd=base_path)
         self.projections = projections
         self.select = select
         self.exclude = exclude
@@ -496,9 +493,7 @@ def exclude_fn(self, spec):
     def update_root(self, new_path):
         self.raw_root = new_path
-        self.root = spack.util.path.canonicalize_path(
-            new_path, default_wd=self.base, replacements=spack.paths.path_replacements()
-        )
+        self.root = spack.util.path.canonicalize_path(new_path, default_wd=self.base)

     def __eq__(self, other):
         return all(
@@ -990,9 +985,7 @@ def included_config_scopes(self):
         missing = []
         for i, config_path in enumerate(reversed(includes)):
             # allow paths to contain spack config/environment variables, etc.
-            config_path = substitute_path_variables(
-                config_path, replacements=spack.paths.path_replacements()
-            )
+            config_path = substitute_path_variables(config_path)

             include_url = urllib.parse.urlparse(config_path)
@@ -1303,9 +1296,7 @@ def develop(self, spec: Spec, path: str, clone: bool = False) -> bool:
         # to be created, then copy it afterwards somewhere else. It would be
         # better if we can create the `source_path` directly into its final
         # destination.
-        abspath = spack.util.path.canonicalize_path(
-            path, default_wd=self.path, replacements=spack.paths.path_replacements()
-        )
+        abspath = spack.util.path.canonicalize_path(path, default_wd=self.path)
         pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
         # We construct a package class ourselves, rather than asking for
         # Spec.package, since Spec only allows this when it is concrete

View File

@@ -6,17 +6,18 @@
 for Spack's command extensions.
 """
 import difflib
+import glob
 import importlib
 import os
 import re
 import sys
 import types
+from typing import List

 import llnl.util.lang

 import spack.config
 import spack.error
-import spack.paths
 import spack.util.path

 _extension_regexp = re.compile(r"spack-(\w[-\w]*)$")
@@ -76,6 +77,15 @@ def load_command_extension(command, path):
     if not os.path.exists(cmd_path):
         return None

+    ensure_extension_loaded(extension, path=path)
+
+    module = importlib.import_module(module_name)
+    sys.modules[module_name] = module
+    return module
+
+
+def ensure_extension_loaded(extension, *, path):
     def ensure_package_creation(name):
         package_name = "{0}.{1}".format(__name__, name)
         if package_name in sys.modules:
@@ -101,17 +111,28 @@ def ensure_package_creation(name):
     ensure_package_creation(extension)
     ensure_package_creation(extension + ".cmd")

-    module = importlib.import_module(module_name)
-    sys.modules[module_name] = module
-    return module
+
+def load_extension(name: str) -> str:
+    """Loads a single extension into the 'spack.extensions' package.
+
+    Args:
+        name: name of the extension
+    """
+    extension_root = path_for_extension(name, paths=get_extension_paths())
+    ensure_extension_loaded(name, path=extension_root)
+    commands = glob.glob(
+        os.path.join(extension_root, extension_name(extension_root), "cmd", "*.py")
+    )
+    commands = [os.path.basename(x).rstrip(".py") for x in commands]
+    for command in commands:
+        load_command_extension(command, extension_root)
+    return extension_root

 def get_extension_paths():
     """Return the list of canonicalized extension paths from config:extensions."""
     extension_paths = spack.config.get("config:extensions") or []
-    r = spack.paths.path_replacements()
-    paths = [spack.util.path.canonicalize_path(p, replacements=r) for p in extension_paths]
+    paths = [spack.util.path.canonicalize_path(p) for p in extension_paths]
     return paths
@@ -127,7 +148,7 @@ def get_command_paths():
return command_paths return command_paths
def path_for_extension(target_name, *paths): def path_for_extension(target_name: str, *, paths: List[str]) -> str:
"""Return the test root dir for a given extension. """Return the test root dir for a given extension.
Args: Args:

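The new `load_extension` entry point ties the pieces above together: it resolves the extension root from `config:extensions`, imports the extension package, and registers every command module found under the extension's `cmd` directory. A minimal usage sketch; the extension name `myext` is hypothetical and assumes a `spack-myext` directory exists on a configured extension path:

```python
import spack.extensions

# Loads the hypothetical "spack-myext" extension and all of its commands;
# per the new return annotation, the extension root directory comes back as a str.
root = spack.extensions.load_extension("myext")
print(root)
```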
View File

@@ -301,12 +301,7 @@ def fetch(self):
         url = None
         errors = []
         for url in self.candidate_urls:
-            if not web_util.url_exists(
-                url,
-                fetch_method=spack.config.get("config:url_fetch_method", "urllib"),
-                verify_ssl=spack.config.get("config:verify_ssl"),
-                timeout=spack.config.get("config:connect_timeout", 10),
-            ):
+            if not web_util.url_exists(url):
                 tty.debug("URL does not exist: " + url)
                 continue

@@ -343,12 +338,8 @@ def _fetch_urllib(self, url):
         # Run urllib but grab the mime type from the http headers
         try:
-            url, headers, response = web_util.read_from_url(
-                url,
-                verify_ssl=spack.config.get("config:verify_ssl", True),
-                timeout=spack.config.get("config:connect_timeout", 10),
-            )
-        except web_util.WebError as e:
+            url, headers, response = web_util.read_from_url(url)
+        except web_util.SpackWebError as e:
             # clean up archive on failure.
             if self.archive_file:
                 os.remove(self.archive_file)

@@ -394,15 +385,7 @@ def _fetch_curl(self, url):
         timeout = self.extra_options.get("timeout")

-        connect_timeout = spack.config.get("config:connect_timeout", 10)
-        if timeout:
-            timeout = max(int(timeout), int(connect_timeout))
-        else:
-            timeout = int(connect_timeout)
-
-        base_args = web_util.base_curl_fetch_args(
-            url, timeout=timeout, verify_ssl=spack.config.get("config:verify_ssl")
-        )
+        base_args = web_util.base_curl_fetch_args(url, timeout)
         curl_args = save_args + base_args + cookie_args

         # Run curl but grab the mime type from the http headers

@@ -420,7 +403,7 @@
         try:
             web_util.check_curl_code(curl.returncode)
-        except web_util.WebError as err:
+        except spack.error.FetchError as err:
             raise spack.fetch_strategy.FailedDownloadError(url, str(err))

         self._check_headers(headers)

@@ -480,10 +463,7 @@ def archive(self, destination):
             raise NoArchiveFileError("Cannot call archive() before fetching.")

         web_util.push_to_url(
-            self.archive_file,
-            url_util.path_to_file_url(destination),
-            keep_original=True,
-            verify_ssl=spack.config.get("config:verify_ssl", True),
+            self.archive_file, url_util.path_to_file_url(destination), keep_original=True
         )

     @_needs_stage

@@ -1350,11 +1330,7 @@ def fetch(self):
         basename = os.path.basename(parsed_url.path)

         with working_dir(self.stage.path):
-            _, headers, stream = web_util.read_from_url(
-                self.url,
-                verify_ssl=spack.config.get("config:verify_ssl", True),
-                timeout=spack.config.get("config:connect_timeout", 10),
-            )
+            _, headers, stream = web_util.read_from_url(self.url)

             with open(basename, "wb") as f:
                 shutil.copyfileobj(stream, f)

@@ -1401,11 +1377,7 @@ def fetch(self):
         basename = os.path.basename(parsed_url.path)

         with working_dir(self.stage.path):
-            _, headers, stream = web_util.read_from_url(
-                self.url,
-                verify_ssl=spack.config.get("config:verify_ssl", True),
-                timeout=spack.config.get("config:connect_timeout", 10),
-            )
+            _, headers, stream = web_util.read_from_url(self.url)

             with open(basename, "wb") as f:
                 shutil.copyfileobj(stream, f)

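The pattern across these hunks is that call sites stop threading SSL and timeout settings through every call; the removed keyword arguments suggest `spack.util.web` now resolves them from configuration internally. For reference, a sketch of the values the old call sites were forwarding, using the config keys visible in the removed lines:

```python
import spack.config

# These are the settings every call site used to pass explicitly; after this
# change they are presumably looked up once inside spack.util.web (assumption).
verify_ssl = spack.config.get("config:verify_ssl", True)
connect_timeout = spack.config.get("config:connect_timeout", 10)
```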
View File

@@ -9,7 +9,6 @@
 from llnl.util.filesystem import mkdirp
 from llnl.util.symlink import symlink

-import spack.config
 import spack.util.editor as ed

@@ -40,7 +39,7 @@ def set_up_license(pkg):
             write_license_file(pkg, license_path)
             # use spack.util.executable so the editor does not hang on return here
-            ed.editor(license_path, exec_fn=ed.executable, debug=spack.config.get("config:debug"))
+            ed.editor(license_path, exec_fn=ed.executable)
         else:
             # Use already existing license file
             tty.msg("Found already existing license %s" % license_path)

View File

@@ -5,18 +5,18 @@
 import os

-import spack.package_prefs as pp
+import spack.util.file_permissions as fp


 def post_install(spec, explicit=None):
     if not spec.external:
-        pp.set_permissions_by_spec(spec.prefix, spec)
+        fp.set_permissions_by_spec(spec.prefix, spec)

         # os.walk explicitly set not to follow links
         for root, dirs, files in os.walk(spec.prefix, followlinks=False):
             for d in dirs:
                 if not os.path.islink(os.path.join(root, d)):
-                    pp.set_permissions_by_spec(os.path.join(root, d), spec)
+                    fp.set_permissions_by_spec(os.path.join(root, d), spec)
             for f in files:
                 if not os.path.islink(os.path.join(root, f)):
-                    pp.set_permissions_by_spec(os.path.join(root, f), spec)
+                    fp.set_permissions_by_spec(os.path.join(root, f), spec)

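Only the import changes in this hook: `set_permissions_by_spec` moves out of `spack.package_prefs` (its old definition is removed later in this compare) into `spack.util.file_permissions`. For reference, a sketch of what the helper does, reconstructed from the removed lines; where the permission-lookup helpers live after the move is an assumption:

```python
import os

import spack.package_prefs as pp  # assumed home of the permission lookups
import spack.util.file_permissions as fp


def set_permissions_by_spec(path, spec):
    # directories and plain files get different permission masks for the spec
    if os.path.isdir(path):
        perms = pp.get_package_dir_permissions(spec)
    else:
        perms = pp.get_package_permissions(spec)
    fp.set_permissions(path, perms, pp.get_package_group(spec))
```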
View File

@@ -91,8 +91,7 @@ def get_test_stage_dir():
         the default test stage path
     """
     return spack.util.path.canonicalize_path(
-        spack.config.get("config:test_stage", spack.paths.default_test_path),
-        replacements=spack.paths.path_replacements(),
+        spack.config.get("config:test_stage", spack.paths.default_test_path)
     )

View File

@@ -357,6 +357,7 @@ def _print_installed_pkg(message: str) -> None:
     Args:
         message (str): message to be output
     """
-    print(colorize("@*g{[+]} ") + spack.util.path.debug_padded_filter(message))
+    if tty.msg_enabled():
+        print(colorize("@*g{[+]} ") + spack.util.path.debug_padded_filter(message))

@@ -380,7 +381,10 @@ def _print_timer(pre: str, pkg_id: str, timer: timer.BaseTimer) -> None:
 def _install_from_cache(
-    pkg: "spack.package_base.PackageBase", cache_only: bool, explicit: bool, unsigned: bool = False
+    pkg: "spack.package_base.PackageBase",
+    cache_only: bool,
+    explicit: bool,
+    unsigned: Optional[bool] = False,
 ) -> bool:
     """
     Extract the package from binary cache

@@ -390,8 +394,7 @@ def _install_from_cache(
         cache_only: only extract from binary cache
         explicit: ``True`` if installing the package was explicitly
             requested by the user, otherwise, ``False``
-        unsigned: ``True`` if binary package signatures to be checked,
-            otherwise, ``False``
+        unsigned: if ``True`` or ``False`` override the mirror signature verification defaults

     Return: ``True`` if the package was extract from binary cache, ``False`` otherwise
     """

@@ -461,7 +464,7 @@ def _process_external_package(pkg: "spack.package_base.PackageBase", explicit: b
 def _process_binary_cache_tarball(
     pkg: "spack.package_base.PackageBase",
     explicit: bool,
-    unsigned: bool,
+    unsigned: Optional[bool],
     mirrors_for_spec: Optional[list] = None,
     timer: timer.BaseTimer = timer.NULL_TIMER,
 ) -> bool:

@@ -471,8 +474,7 @@ def _process_binary_cache_tarball(
     Args:
         pkg: the package being installed
         explicit: the package was explicitly requested by the user
-        unsigned: ``True`` if binary package signatures to be checked,
-            otherwise, ``False``
+        unsigned: if ``True`` or ``False`` override the mirror signature verification defaults
         mirrors_for_spec: Optional list of concrete specs and mirrors
             obtained by calling binary_distribution.get_mirrors_for_spec().
         timer: timer to keep track of binary install phases.

@@ -491,11 +493,8 @@ def _process_binary_cache_tarball(
     tty.msg(f"Extracting {package_id(pkg)} from binary cache")

-    padding = spack.config.get("config:install_tree:padded_length", None)
-    with timer.measure("install"), spack.util.path.filter_padding(padding=padding):
-        binary_distribution.extract_tarball(
-            pkg.spec, download_result, unsigned=unsigned, force=False, timer=timer
-        )
+    with timer.measure("install"), spack.util.path.filter_padding():
+        binary_distribution.extract_tarball(pkg.spec, download_result, force=False, timer=timer)

     pkg.installed_from_binary_cache = True
     spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)

@@ -505,7 +504,7 @@ def _process_binary_cache_tarball(
 def _try_install_from_binary_cache(
     pkg: "spack.package_base.PackageBase",
     explicit: bool,
-    unsigned: bool = False,
+    unsigned: Optional[bool] = None,
     timer: timer.BaseTimer = timer.NULL_TIMER,
 ) -> bool:
     """

@@ -514,8 +513,7 @@ def _try_install_from_binary_cache(
     Args:
         pkg: package to be extracted from binary cache
         explicit: the package was explicitly requested by the user
-        unsigned: ``True`` if binary package signatures to be checked,
-            otherwise, ``False``
+        unsigned: if ``True`` or ``False`` override the mirror signature verification defaults
         timer: timer to keep track of binary install phases.
     """
     # Early exit if no binary mirrors are configured.

@@ -825,7 +823,7 @@ def _add_default_args(self) -> None:
             ("restage", False),
             ("skip_patch", False),
             ("tests", False),
-            ("unsigned", False),
+            ("unsigned", None),
             ("verbose", False),
         ]:
             _ = self.install_args.setdefault(arg, default)

@@ -1663,7 +1661,7 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
         use_cache = task.use_cache
         tests = install_args.get("tests", False)
         assert isinstance(tests, (bool, list))  # make mypy happy.
-        unsigned = bool(install_args.get("unsigned"))
+        unsigned: Optional[bool] = install_args.get("unsigned")

         pkg, pkg_id = task.pkg, task.pkg_id

@@ -2008,7 +2006,9 @@ def install(self) -> None:
         # Only enable the terminal status line when we're in a tty without debug info
         # enabled, so that the output does not get cluttered.
-        term_status = TermStatusLine(enabled=sys.stdout.isatty() and not tty.is_debug())
+        term_status = TermStatusLine(
+            enabled=sys.stdout.isatty() and tty.msg_enabled() and not tty.is_debug()
+        )

         while self.build_pq:
             task = self._pop_task()

@@ -2493,8 +2493,7 @@ def build_process(pkg: "spack.package_base.PackageBase", install_args: dict) ->
     installer = BuildProcessInstaller(pkg, install_args)

     # don't print long padded paths in executable debug output.
-    padding = spack.config.get("config:install_tree:padded_length", None)
-    with spack.util.path.filter_padding(padding=padding):
+    with spack.util.path.filter_padding():
         return installer.run()

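Throughout the installer, `unsigned` becomes tri-state: `None` now means "defer to the mirror's own `signed` setting" (see the new `Mirror.signed` property below), while `True`/`False` remain explicit overrides. One plausible reading of the resolution, as a sketch; the helper name is hypothetical:

```python
from typing import Optional


def should_verify(mirror_signed: bool, unsigned: Optional[bool]) -> bool:
    """Resolve the tri-state: None defers to the mirror, a bool overrides it."""
    if unsigned is None:
        return mirror_signed
    return not unsigned


assert should_verify(mirror_signed=True, unsigned=None) is True   # mirror default wins
assert should_verify(mirror_signed=True, unsigned=True) is False  # explicit opt-out
```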
View File

@@ -30,7 +30,6 @@
 import spack.fetch_strategy
 import spack.mirror
 import spack.oci.image
-import spack.paths
 import spack.spec
 import spack.util.path
 import spack.util.spack_json as sjson

@@ -52,11 +51,7 @@ def _url_or_path_to_url(url_or_path: str) -> str:
         return url_or_path

     # Otherwise we interpret it as path, and we should promote it to file:// URL.
-    return url_util.path_to_file_url(
-        spack.util.path.canonicalize_path(
-            url_or_path, replacements=spack.paths.path_replacements()
-        )
-    )
+    return url_util.path_to_file_url(spack.util.path.canonicalize_path(url_or_path))


 class Mirror:

@@ -138,6 +133,10 @@ def binary(self):
     def source(self):
         return isinstance(self._data, str) or self._data.get("source", True)

+    @property
+    def signed(self) -> bool:
+        return isinstance(self._data, str) or self._data.get("signed", True)
+
     @property
     def fetch_url(self):
         """Get the valid, canonicalized fetch URL"""

@@ -151,7 +150,7 @@ def push_url(self):
     def _update_connection_dict(self, current_data: dict, new_data: dict, top_level: bool):
         keys = ["url", "access_pair", "access_token", "profile", "endpoint_url"]
         if top_level:
-            keys += ["binary", "source"]
+            keys += ["binary", "source", "signed"]
         changed = False
         for key in keys:
             if key in new_data and current_data.get(key) != new_data[key]:

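Per the property above, a mirror configured as a bare URL string keeps signature verification on, and a dict-style entry can opt out with `signed: false`. The property's logic, restated as a standalone function for illustration (`data` stands in for the mirror's raw `_data` value):

```python
def signed(data) -> bool:
    """Standalone restatement of Mirror.signed: a bare URL means all defaults."""
    return isinstance(data, str) or data.get("signed", True)


assert signed("https://cache.example.com") is True                            # bare string
assert signed({"url": "https://cache.example.com"}) is True                   # default
assert signed({"url": "https://cache.example.com", "signed": False}) is False # opt-out
```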
View File

@@ -93,7 +93,7 @@ def _filter_compiler_wrappers_impl(pkg_or_builder):
     replacements = []

     for idx, (env_var, compiler_path) in enumerate(compiler_vars):
-        if env_var in os.environ:
+        if env_var in os.environ and compiler_path is not None:
             # filter spack wrapper and links to spack wrapper in case
             # build system runs realpath
             wrapper = os.environ[env_var]

View File

@@ -46,7 +46,6 @@
 import spack.environment
 import spack.error
 import spack.modules.common
-import spack.package_prefs as pp
 import spack.paths
 import spack.projections as proj
 import spack.repo

@@ -55,6 +54,7 @@
 import spack.store
 import spack.tengine as tengine
 import spack.util.environment
+import spack.util.file_permissions as fp
 import spack.util.path
 import spack.util.spack_yaml as syaml
 from spack.context import Context

@@ -225,7 +225,7 @@ def root_path(name, module_set_name):
     roots = spack.config.merge_yaml(defaults, roots)
     path = roots.get(name, os.path.join(spack.paths.share_path, name))
-    return spack.util.path.canonicalize_path(path, replacements=spack.paths.path_replacements())
+    return spack.util.path.canonicalize_path(path)


 def generate_module_index(root, modules, overwrite=False):

@@ -968,7 +968,7 @@ def write(self, overwrite=False):
         # Set the file permissions of the module to match that of the package
         if os.path.exists(self.layout.filename):
-            pp.set_permissions_by_spec(self.layout.filename, self.spec)
+            fp.set_permissions_by_spec(self.layout.filename, self.spec)

         # Symlink defaults if needed
         self.update_module_defaults()

View File

@@ -5,11 +5,20 @@
 from ._operating_system import OperatingSystem
 from .cray_backend import CrayBackend
 from .cray_frontend import CrayFrontend
+from .freebsd import FreeBSDOs
 from .linux_distro import LinuxDistro
 from .mac_os import MacOs
 from .windows_os import WindowsOs

-__all__ = ["OperatingSystem", "LinuxDistro", "MacOs", "CrayFrontend", "CrayBackend", "WindowsOs"]
+__all__ = [
+    "OperatingSystem",
+    "LinuxDistro",
+    "MacOs",
+    "CrayFrontend",
+    "CrayBackend",
+    "WindowsOs",
+    "FreeBSDOs",
+]

 #: List of all the Operating Systems known to Spack
-operating_systems = [LinuxDistro, MacOs, CrayFrontend, CrayBackend, WindowsOs]
+operating_systems = [LinuxDistro, MacOs, CrayFrontend, CrayBackend, WindowsOs, FreeBSDOs]

View File

@@ -0,0 +1,15 @@
+# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
+# Spack Project Developers. See the top-level COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import platform as py_platform
+
+from spack.version import Version
+
+from ._operating_system import OperatingSystem
+
+
+class FreeBSDOs(OperatingSystem):
+    def __init__(self):
+        release = py_platform.release().split("-", 1)[0]
+        super().__init__("freebsd", Version(release))

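The new operating-system class derives its version from the kernel release string. A quick illustration of the split it performs; the release value is representative of a FreeBSD host:

```python
# On FreeBSD, platform.release() looks like "13.2-RELEASE"; keeping only the
# part before the first "-" yields the version passed to the OS constructor.
release = "13.2-RELEASE".split("-", 1)[0]
assert release == "13.2"
```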
View File

@@ -4,7 +4,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

 # flake8: noqa: F401
-"""spack.package is a set of useful build tools and directives for packages.
+"""spack.util.package is a set of useful build tools and directives for packages.

 Everything in this module is automatically imported into Spack package files.
 """

@@ -101,6 +101,7 @@
     on_package_attributes,
 )
 from spack.spec import InvalidSpecDetected, Spec
+from spack.util.cpus import determine_number_of_jobs
 from spack.util.executable import *
 from spack.variant import (
     any_combination_of,

View File

@@ -63,9 +63,9 @@
     install_test_root,
 )
 from spack.installer import InstallError, PackageInstaller
-from spack.package_hash import package_hash
 from spack.stage import DIYStage, ResourceStage, Stage, StageComposite, compute_stage_name
 from spack.util.executable import ProcessError, which
+from spack.util.package_hash import package_hash
 from spack.version import GitVersion, StandardVersion, Version

 FLAG_HANDLER_RETURN_TYPE = Tuple[

@@ -829,9 +829,7 @@ def name(cls):
     @classproperty
     def global_license_dir(cls):
         """Returns the directory where license files for all packages are stored."""
-        return spack.util.path.canonicalize_path(
-            spack.config.get("config:license_dir"), replacements=spack.paths.path_replacements()
-        )
+        return spack.util.path.canonicalize_path(spack.config.get("config:license_dir"))

     @property
     def global_license_file(self):

@@ -987,12 +985,7 @@ def find_valid_url_for_version(self, version):
         urls = self.all_urls_for_version(version)

         for u in urls:
-            if spack.util.web.url_exists(
-                u,
-                fetch_method=spack.config.get("config:url_fetch_method", "urllib"),
-                verify_ssl=spack.config.get("config:verify_ssl"),
-                timeout=spack.config.get("config:connect_timeout", 10),
-            ):
+            if spack.util.web.url_exists(u):
                 return u

         return None

@@ -1042,15 +1035,26 @@ def _make_stage(self):
         # To fetch the current version
         source_stage = self._make_root_stage(self.fetcher)

-        # Extend it with all resources and patches
+        # all_stages is source + resources + patches
         all_stages = StageComposite()
         all_stages.append(source_stage)
         all_stages.extend(
             self._make_resource_stage(source_stage, r) for r in self._get_needed_resources()
         )
+        if self.spec.concrete:
             all_stages.extend(
                 p.stage for p in self.spec.patches if isinstance(p, spack.patch.UrlPatch)
             )
+        else:
+            # The only code path that gets here is spack mirror create --all which just needs all
+            # matching patches.
+            all_stages.extend(
+                p.stage
+                for when_spec, patch_list in self.patches.items()
+                if self.spec.intersects(when_spec)
+                for p in patch_list
+                if isinstance(p, spack.patch.UrlPatch)
+            )

         return all_stages

     @property

@@ -1750,28 +1754,16 @@ def _if_ninja_target_execute(self, target, *args, **kwargs):
         inspect.getmodule(self).ninja(target, *args, **kwargs)

     def _get_needed_resources(self):
-        resources = []
-        # Select the resources that are needed for this build
-        if self.spec.concrete:
-            for when_spec, resource_list in self.resources.items():
-                if when_spec in self.spec:
-                    resources.extend(resource_list)
-        else:
-            for when_spec, resource_list in self.resources.items():
-                # Note that variant checking is always strict for specs where
-                # the name is not specified. But with strict variant checking,
-                # only variants mentioned in 'other' are checked. Here we only
-                # want to make sure that no constraints in when_spec
-                # conflict with the spec, so we need to invoke
-                # when_spec.satisfies(self.spec) vs.
-                # self.spec.satisfies(when_spec)
-                if when_spec.intersects(self.spec):
-                    resources.extend(resource_list)
-        # Sorts the resources by the length of the string representing their
-        # destination. Since any nested resource must contain another
-        # resource's name in its path, it seems that should work
-        resources = sorted(resources, key=lambda res: len(res.destination))
-        return resources
+        # We use intersects here cause it would also work if self.spec is abstract
+        resources = [
+            resource
+            for when_spec, resource_list in self.resources.items()
+            if self.spec.intersects(when_spec)
+            for resource in resource_list
+        ]
+        # Sorts the resources by the length of the string representing their destination. Since any
+        # nested resource must contain another resource's path, that should work
+        return sorted(resources, key=lambda res: len(res.destination))

     def _resource_stage(self, resource):
         pieces = ["resource", resource.name, self.spec.dag_hash()]

View File

@@ -2,14 +2,11 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import os
 import stat
 import warnings

 import spack.error
-import spack.paths
 import spack.repo
-import spack.util.file_permissions as fp
 from spack.config import ConfigError
 from spack.util.path import canonicalize_path
 from spack.version import Version

@@ -180,9 +177,7 @@ def _package(maybe_abstract_spec):
         spec_str = entry["spec"]
         external_path = entry.get("prefix", None)
         if external_path:
-            external_path = canonicalize_path(
-                external_path, replacements=spack.paths.path_replacements()
-            )
+            external_path = canonicalize_path(external_path)
         external_modules = entry.get("modules", None)
         external_spec = spack.spec.Spec.from_detection(
             spack.spec.Spec(

@@ -299,16 +294,5 @@ def get_package_group(spec):
     return group


-def set_permissions_by_spec(path, spec):
-    # Get permissions for spec
-    if os.path.isdir(path):
-        perms = get_package_dir_permissions(spec)
-    else:
-        perms = get_package_permissions(spec)
-    group = get_package_group(spec)
-
-    fp.set_permissions(path, perms, group)
-
-
 class VirtualInPackagesYAMLError(spack.error.SpackError):
     """Raised when a disallowed virtual is found in packages.yaml"""

View File

@@ -58,6 +58,7 @@
 expansion when it is the first character in an id typed on the command line.
 """
 import enum
+import json
 import pathlib
 import re
 import sys

@@ -95,13 +96,55 @@
 else:
     FILENAME = WINDOWS_FILENAME

+#: These are legal values that *can* be parsed bare, without quotes on the command line.
 VALUE = r"(?:[a-zA-Z_0-9\-+\*.,:=\~\/\\]+)"
-QUOTED_VALUE = r"[\"']+(?:[a-zA-Z_0-9\-+\*.,:=\~\/\\\s]+)[\"']+"
+
+#: Variant/flag values that match this can be left unquoted in Spack output
+NO_QUOTES_NEEDED = r"^[a-zA-Z0-9,/_.-]+$"
+
+#: Quoted values can be *anything* in between quotes, including escaped quotes.
+QUOTED_VALUE = r"(?:'(?:[^']|(?<=\\)')*'|\"(?:[^\"]|(?<=\\)\")*\")"

 VERSION = r"=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\-\.]*\b)"
 VERSION_RANGE = rf"(?:(?:{VERSION})?:(?:{VERSION}(?!\s*=))?)"
 VERSION_LIST = rf"(?:{VERSION_RANGE}|{VERSION})(?:\s*,\s*(?:{VERSION_RANGE}|{VERSION}))*"

+#: Regex with groups to use for splitting (optionally propagated) key-value pairs
+SPLIT_KVP = rf"^({NAME})(==?)(.*)$"
+
+#: Regex to strip quotes. Group 2 will be the unquoted string.
+STRIP_QUOTES = r"^(['\"])(.*)\1$"
+
+
+def strip_quotes_and_unescape(string: str) -> str:
+    """Remove surrounding single or double quotes from string, if present."""
+    match = re.match(STRIP_QUOTES, string)
+    if not match:
+        return string
+
+    # replace any escaped quotes with bare quotes
+    quote, result = match.groups()
+    return result.replace(rf"\{quote}", quote)
+
+
+def quote_if_needed(value: str) -> str:
+    """Add quotes around the value if it requires quotes.
+
+    This will add quotes around the value unless it matches ``NO_QUOTES_NEEDED``.
+
+    This adds:
+    * single quotes by default
+    * double quotes around any value that contains single quotes
+
+    If double quotes are used, we json-escape the string. That is, we escape ``\\``,
+    ``"``, and control codes.
+    """
+    if re.match(spack.parser.NO_QUOTES_NEEDED, value):
+        return value
+
+    return json.dumps(value) if "'" in value else f"'{value}'"

@@ -138,8 +181,8 @@ class TokenType(TokenBase):
     # Variants
     PROPAGATED_BOOL_VARIANT = rf"(?:(?:\+\+|~~|--)\s*{NAME})"
     BOOL_VARIANT = rf"(?:[~+-]\s*{NAME})"
-    PROPAGATED_KEY_VALUE_PAIR = rf"(?:{NAME}\s*==\s*(?:{VALUE}|{QUOTED_VALUE}))"
-    KEY_VALUE_PAIR = rf"(?:{NAME}\s*=\s*(?:{VALUE}|{QUOTED_VALUE}))"
+    PROPAGATED_KEY_VALUE_PAIR = rf"(?:{NAME}==(?:{VALUE}|{QUOTED_VALUE}))"
+    KEY_VALUE_PAIR = rf"(?:{NAME}=(?:{VALUE}|{QUOTED_VALUE}))"
     # Compilers
     COMPILER_AND_VERSION = rf"(?:%\s*(?:{NAME})(?:[\s]*)@\s*(?:{VERSION_LIST}))"
     COMPILER = rf"(?:%\s*(?:{NAME}))"

@@ -351,12 +394,14 @@ def parse(
         # accept another package name afterwards in a node
         if self.ctx.accept(TokenType.UNQUALIFIED_PACKAGE_NAME):
             initial_spec.name = self.ctx.current_token.value
+
         elif self.ctx.accept(TokenType.FULLY_QUALIFIED_PACKAGE_NAME):
             parts = self.ctx.current_token.value.split(".")
             name = parts[-1]
             namespace = ".".join(parts[:-1])
             initial_spec.name = name
             initial_spec.namespace = namespace
+
         elif self.ctx.accept(TokenType.FILENAME):
             return FileParser(self.ctx).parse(initial_spec)

@@ -370,6 +415,7 @@ def parse(
                 compiler_name = self.ctx.current_token.value[1:]
                 initial_spec.compiler = spack.spec.CompilerSpec(compiler_name.strip(), ":")
                 self.has_compiler = True
+
             elif self.ctx.accept(TokenType.COMPILER_AND_VERSION):
                 if self.has_compiler:
                     raise spack.spec.DuplicateCompilerSpecError(

@@ -381,6 +427,7 @@ def parse(
                     compiler_name.strip(), compiler_version
                 )
                 self.has_compiler = True
+
             elif (
                 self.ctx.accept(TokenType.VERSION_HASH_PAIR)
                 or self.ctx.accept(TokenType.GIT_VERSION)

@@ -395,31 +442,39 @@ def parse(
                     )
                     initial_spec.attach_git_version_lookup()
                 self.has_version = True
+
             elif self.ctx.accept(TokenType.BOOL_VARIANT):
                 variant_value = self.ctx.current_token.value[0] == "+"
                 initial_spec._add_flag(
                     self.ctx.current_token.value[1:].strip(), variant_value, propagate=False
                 )
+
             elif self.ctx.accept(TokenType.PROPAGATED_BOOL_VARIANT):
                 variant_value = self.ctx.current_token.value[0:2] == "++"
                 initial_spec._add_flag(
                     self.ctx.current_token.value[2:].strip(), variant_value, propagate=True
                 )
+
             elif self.ctx.accept(TokenType.KEY_VALUE_PAIR):
-                name, value = self.ctx.current_token.value.split("=", maxsplit=1)
-                name = name.strip("'\" ")
-                value = value.strip("'\" ")
-                initial_spec._add_flag(name, value, propagate=False)
+                match = re.match(SPLIT_KVP, self.ctx.current_token.value)
+                assert match, "SPLIT_KVP and KEY_VALUE_PAIR do not agree."
+
+                name, delim, value = match.groups()
+                initial_spec._add_flag(name, strip_quotes_and_unescape(value), propagate=False)
+
             elif self.ctx.accept(TokenType.PROPAGATED_KEY_VALUE_PAIR):
-                name, value = self.ctx.current_token.value.split("==", maxsplit=1)
-                name = name.strip("'\" ")
-                value = value.strip("'\" ")
-                initial_spec._add_flag(name, value, propagate=True)
+                match = re.match(SPLIT_KVP, self.ctx.current_token.value)
+                assert match, "SPLIT_KVP and PROPAGATED_KEY_VALUE_PAIR do not agree."
+
+                name, delim, value = match.groups()
+                initial_spec._add_flag(name, strip_quotes_and_unescape(value), propagate=True)
+
             elif self.ctx.expect(TokenType.DAG_HASH):
                 if initial_spec.abstract_hash:
                     break
                 self.ctx.accept(TokenType.DAG_HASH)
                 initial_spec.abstract_hash = self.ctx.current_token.value[1:]
+
             else:
                 break

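Taken together, `SPLIT_KVP`, `strip_quotes_and_unescape`, and `quote_if_needed` give the round-trip this PR is after: values parse with surrounding quotes stripped, and re-quote on output only when necessary. A quick illustration, assuming the module layout shown above (`spack.parser`):

```python
from spack.parser import quote_if_needed, strip_quotes_and_unescape

assert quote_if_needed("-O2") == "-O2"            # matches NO_QUOTES_NEEDED, left bare
assert quote_if_needed("-O2 -g") == "'-O2 -g'"    # spaces force single quotes
assert quote_if_needed("it's") == "\"it's\""      # inner single quote -> double quotes
assert strip_quotes_and_unescape("'-O2 -g'") == "-O2 -g"
```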
View File

@@ -9,16 +9,11 @@
 throughout Spack and should bring in a minimal number of external
 dependencies.
 """
-import getpass
 import os
-import tempfile
-from datetime import date
 from pathlib import PurePath

 import llnl.util.filesystem

-from spack.util.path import NOMATCH
-
 #: This file lives in $prefix/lib/spack/spack/__file__
 prefix = str(PurePath(llnl.util.filesystem.ancestor(__file__, 4)))

@@ -141,50 +136,3 @@ def _get_system_config_path():

 #: System configuration location
 system_config_path = _get_system_config_path()
-
-
-def architecture():
-    # break circular import
-    import spack.platforms
-    import spack.spec
-
-    host_platform = spack.platforms.host()
-    host_os = host_platform.operating_system("default_os")
-    host_target = host_platform.target("default_target")
-    return spack.spec.ArchSpec((str(host_platform), str(host_os), str(host_target)))
-
-
-def get_user():
-    # User pwd where available because it accounts for effective uids when using ksu and similar
-    try:
-        # user pwd for unix systems
-        import pwd
-
-        return pwd.getpwuid(os.geteuid()).pw_name
-    except ImportError:
-        # fallback on getpass
-        return getpass.getuser()
-
-
-def path_replacements():
-    # break circular imports
-    import spack.environment as ev
-
-    arch = architecture()
-
-    return {
-        "spack": lambda: prefix,
-        "user": lambda: get_user(),
-        "tempdir": lambda: tempfile.gettempdir(),
-        "user_cache_path": lambda: user_cache_path,
-        "architecture": lambda: arch,
-        "arch": lambda: arch,
-        "platform": lambda: arch.platform,
-        "operating_system": lambda: arch.os,
-        "os": lambda: arch.os,
-        "target": lambda: arch.target,
-        "target_family": lambda: arch.target.microarchitecture.family,
-        "date": lambda: date.today().strftime("%Y-%m-%d"),
-        "env": lambda: ev.active_environment().path if ev.active_environment() else NOMATCH,
-    }

View File

@@ -8,6 +8,7 @@
 from ._platform import Platform
 from .cray import Cray
 from .darwin import Darwin
+from .freebsd import FreeBSD
 from .linux import Linux
 from .test import Test
 from .windows import Windows

@@ -17,6 +18,7 @@
     "Cray",
     "Darwin",
     "Linux",
+    "FreeBSD",
     "Test",
     "Windows",
     "platforms",

View File

@@ -10,12 +10,13 @@
 from .cray import Cray
 from .darwin import Darwin
+from .freebsd import FreeBSD
 from .linux import Linux
 from .test import Test
 from .windows import Windows

 #: List of all the platform classes known to Spack
-platforms = [Cray, Darwin, Linux, Windows, Test]
+platforms = [Cray, Darwin, Linux, Windows, FreeBSD, Test]


 @llnl.util.lang.memoized

View File

@@ -0,0 +1,37 @@
+# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
+# Spack Project Developers. See the top-level COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import platform
+
+import archspec.cpu
+
+import spack.target
+from spack.operating_systems.freebsd import FreeBSDOs
+
+from ._platform import Platform
+
+
+class FreeBSD(Platform):
+    priority = 102
+
+    def __init__(self):
+        super().__init__("freebsd")
+
+        for name in archspec.cpu.TARGETS:
+            self.add_target(name, spack.target.Target(name))
+
+        # Get specific default
+        self.default = archspec.cpu.host().name
+        self.front_end = self.default
+        self.back_end = self.default
+
+        os = FreeBSDOs()
+        self.default_os = str(os)
+        self.front_os = self.default_os
+        self.back_os = self.default_os
+        self.add_operating_system(str(os), os)
+
+    @classmethod
+    def detect(cls):
+        return platform.system().lower() == "freebsd"

View File

@@ -37,7 +37,6 @@
 import spack.config
 import spack.error
 import spack.patch
-import spack.paths
 import spack.provider_index
 import spack.spec
 import spack.tag

@@ -929,9 +928,7 @@ def __init__(self, root, cache=None):
         """
         # Root directory, containing _repo.yaml and package dirs
         # Allow roots to by spack-relative by starting with '$spack'
-        self.root = spack.util.path.canonicalize_path(
-            root, replacements=spack.paths.path_replacements()
-        )
+        self.root = spack.util.path.canonicalize_path(root)

         # check and raise BadRepoError on fail.
         def check(condition, msg):

@@ -1330,7 +1327,7 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
     If the namespace is not provided, use basename of root.
     Return the canonicalized path and namespace of the created repository.
     """
-    root = spack.util.path.canonicalize_path(root, replacements=spack.paths.path_replacements())
+    root = spack.util.path.canonicalize_path(root)
     if not namespace:
         namespace = os.path.basename(root)

View File

@@ -32,13 +32,26 @@
 from .extract import extract_test_parts

 # Mapping Spack phases to the corresponding CTest/CDash phase.
+# TODO: Some of the phases being lumped into configure in the CDash tables
+# TODO: really belong in a separate column, such as "Setup".
+# TODO: Would also be nice to have `stage` as a separate phase that could
+# TODO: be lumped into that new column instead of configure, for example.
 MAP_PHASES_TO_CDASH = {
-    "autoreconf": "configure",
-    "cmake": "configure",
-    "configure": "configure",
-    "edit": "configure",
+    "autoreconf": "configure",  # AutotoolsBuilder
+    "bootstrap": "configure",  # CMakeBuilder
     "build": "build",
+    "build_processes": "build",  # Openloops
+    "cmake": "configure",  # CMakeBuilder
+    "configure": "configure",
+    "edit": "configure",  # MakefileBuilder
+    "generate_luarocks_config": "configure",  # LuaBuilder
+    "hostconfig": "configure",  # Lvarray
+    "initconfig": "configure",  # CachedCMakeBuilder
     "install": "build",
+    "meson": "configure",  # MesonBuilder
+    "preprocess": "configure",  # LuaBuilder
+    "qmake": "configure",  # QMakeBuilder
+    "unpack": "configure",  # LuaBuilder
 }

 # Initialize data structures common to each phase's report.

@@ -92,11 +105,12 @@ def __init__(self, configuration: CDashConfiguration):
         self.osname = platform.system()
         self.osrelease = platform.release()
         self.target = spack.platforms.host().target("default_target")
-        self.endtime = int(time.time())
+        self.starttime = int(time.time())
+        self.endtime = self.starttime
         self.buildstamp = (
             configuration.buildstamp
             if configuration.buildstamp
-            else build_stamp(configuration.track, self.endtime)
+            else build_stamp(configuration.track, self.starttime)
         )
         self.buildIds: Dict[str, str] = {}
         self.revision = ""

@@ -125,7 +139,7 @@ def build_report_for_package(self, report_dir, package, duration):
             report_data[phase] = {}
             report_data[phase]["loglines"] = []
             report_data[phase]["status"] = 0
-            report_data[phase]["endtime"] = self.endtime
+            report_data[phase]["starttime"] = self.starttime

         # Track the phases we perform so we know what reports to create.
         # We always report the update step because this is how we tell CDash

@@ -153,6 +167,25 @@ def build_report_for_package(self, report_dir, package, duration):
             elif cdash_phase:
                 report_data[cdash_phase]["loglines"].append(xml.sax.saxutils.escape(line))

+        # something went wrong pre-cdash "configure" phase b/c we have an exception and only
+        # "update" was encountered.
+        # dump the report in the configure line so teams can see what the issue is
+        if len(phases_encountered) == 1 and package["exception"]:
+            # TODO this mapping is not ideal since these are pre-configure errors
+            # we need to determine if a more appropriate cdash phase can be utilized
+            # for now we will add a message to the log explaining this
+            cdash_phase = "configure"
+            phases_encountered.append(cdash_phase)
+            log_message = (
+                "Pre-configure errors occurred in Spack's process that terminated the "
+                "build process prematurely.\nSpack output::\n{0}".format(
+                    xml.sax.saxutils.escape(package["exception"])
+                )
+            )
+            report_data[cdash_phase]["loglines"].append(log_message)
+
         # Move the build phase to the front of the list if it occurred.
         # This supports older versions of CDash that expect this phase
         # to be reported before all others.

@@ -160,9 +193,9 @@ def build_report_for_package(self, report_dir, package, duration):
             build_pos = phases_encountered.index("build")
             phases_encountered.insert(0, phases_encountered.pop(build_pos))

-        self.starttime = self.endtime - duration
+        self.endtime = self.starttime + duration
         for phase in phases_encountered:
-            report_data[phase]["starttime"] = self.starttime
+            report_data[phase]["endtime"] = self.endtime
             report_data[phase]["log"] = "\n".join(report_data[phase]["loglines"])
             errors, warnings = parse_log_events(report_data[phase]["loglines"])

@@ -309,7 +342,7 @@ def test_report_for_package(self, report_dir, package, duration):
             self.buildname = "{0}-{1}".format(self.current_package_name, package["id"])
         else:
             self.buildname = self.report_build_name(self.current_package_name)
-        self.starttime = self.endtime - duration
+        self.endtime = self.starttime + duration
         report_data = self.initialize_report(report_dir)
         report_data["hostname"] = socket.gethostname()

@@ -354,7 +387,7 @@ def concretization_report(self, report_dir, msg):
         self.buildname = self.base_buildname
         report_data = self.initialize_report(report_dir)
         report_data["update"] = {}
-        report_data["update"]["starttime"] = self.endtime
+        report_data["update"]["starttime"] = self.starttime
         report_data["update"]["endtime"] = self.endtime
         report_data["update"]["revision"] = self.revision
         report_data["update"]["log"] = msg

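The timing fix flips the arithmetic: the reporter records `starttime` once at construction and derives `endtime = starttime + duration`, instead of back-computing `starttime` from a single `endtime`. In sketch form (the duration value is illustrative):

```python
import time

duration = 42  # e.g., a measured build duration in seconds

# before: starttime = endtime - duration, anchored to report-generation time
# after: one stable anchor recorded up front, end derived from it
starttime = int(time.time())
endtime = starttime + duration
```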
View File

@@ -42,6 +42,7 @@
                 "properties": {
                     "source": {"type": "boolean"},
                     "binary": {"type": "boolean"},
+                    "signed": {"type": "boolean"},
                     "fetch": fetch_and_push,
                     "push": fetch_and_push,
                     **connection,  # type: ignore

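With the schema extended, a per-mirror `signed` key validates alongside `source` and `binary`. An entry the new schema would accept, expressed as the equivalent Python data; the mirror name and URL are illustrative:

```python
mirrors = {
    "my-mirror": {
        "url": "https://cache.example.com",
        "binary": True,
        "signed": False,  # the key this change adds to the schema
    }
}
```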
View File

@@ -13,7 +13,7 @@
 import re
 import types
 import warnings
-from typing import Callable, Dict, List, NamedTuple, Optional, Sequence, Set, Tuple, Union
+from typing import Any, Callable, Dict, List, NamedTuple, Optional, Sequence, Set, Tuple, Union

 import archspec.cpu

@@ -41,7 +41,6 @@
 import spack.error
 import spack.package_base
 import spack.package_prefs
-import spack.paths
 import spack.platforms
 import spack.repo
 import spack.spec

@@ -312,6 +311,29 @@ def __call__(self, *args):
         """
         return AspFunction(self.name, self.args + args)

+    def match(self, pattern: "AspFunction"):
+        """Compare name and args of this ASP function to a match pattern.
+
+        Arguments of ``pattern`` function can be strings, arbitrary objects or ``any``:
+
+        * ``any`` matches any argument;
+        * ``str`` arguments are treated as regular expressions and match against the
+          string representation of the args of this function.
+        * any other object is compared with `==`.
+        """
+        if self.name != pattern.name or len(pattern.args) > len(self.args):
+            return False
+
+        for parg, arg in zip(pattern.args, self.args):
+            if parg is any:
+                continue
+            elif isinstance(parg, str) and not re.match(parg, str(arg)):
+                return False
+            elif parg != arg:
+                return False
+
+        return True
+
     def symbol(self, positive=True):
         def argify(arg):
             if isinstance(arg, bool):

@@ -339,12 +361,36 @@ def __getattr__(self, name):
 fn = AspFunctionBuilder()

-TransformFunction = Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]
+TransformFunction = Callable[
+    [spack.spec.Spec, spack.spec.Spec, List[AspFunction]], List[AspFunction]
+]


-def remove_node(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunction]:
+def transform(
+    required: spack.spec.Spec,
+    imposed: spack.spec.Spec,
+    clauses: List[AspFunction],
+    transformations: Optional[List[TransformFunction]],
+) -> List[AspFunction]:
+    """Apply a list of TransformFunctions in order."""
+    if transformations is None:
+        return clauses
+
+    for func in transformations:
+        clauses = func(required, imposed, clauses)
+    return clauses
+
+
+def cond_key(spec, transforms):
+    """Key generator for caching triggers and effects"""
+    return (str(spec), None) if transforms is None else (str(spec), tuple(transforms))
+
+
+def remove_node(
+    required: spack.spec.Spec, imposed: spack.spec.Spec, functions: List[AspFunction]
+) -> List[AspFunction]:
     """Transformation that removes all "node" and "virtual_node" from the input list of facts."""
-    return list(filter(lambda x: x.args[0] not in ("node", "virtual_node"), facts))
+    return [func for func in functions if func.args[0] not in ("node", "virtual_node")]


 def _create_counter(specs, tests):

@@ -1108,6 +1154,7 @@ def __init__(self, tests=False):
         self.possible_oses = set()
         self.variant_values_from_specs = set()
         self.version_constraints = set()
+        self.synced_version_constraints = set()
         self.target_constraints = set()
         self.default_targets = []
         self.compiler_version_constraints = set()

@@ -1502,14 +1549,26 @@ def variant_rules(self, pkg):
             self.gen.newline()

+    def _lookup_condition_id(self, condition, transforms, factory, cache_by_name):
+        """Look up or create a condition in a trigger/effect cache."""
+        key = cond_key(condition, transforms)
+        cache = cache_by_name[condition.name]
+        pair = cache.get(key)
+        if pair is None:
+            pair = cache[key] = (next(self._id_counter), factory())
+        id, _ = pair
+        return id
+
     def condition(
         self,
         required_spec: spack.spec.Spec,
         imposed_spec: Optional[spack.spec.Spec] = None,
         name: Optional[str] = None,
         msg: Optional[str] = None,
-        transform_required: Optional[TransformFunction] = None,
-        transform_imposed: Optional[TransformFunction] = remove_node,
+        transform_required: Optional[List[TransformFunction]] = None,
+        transform_imposed: Optional[List[TransformFunction]] = [remove_node],
     ):
         """Generate facts for a dependency or virtual provider condition.

@@ -1537,35 +1596,27 @@ def condition(
         self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition(condition_id)))
         self.gen.fact(fn.condition_reason(condition_id, msg))

-        cache = self._trigger_cache[named_cond.name]
-
-        named_cond_key = (str(named_cond), transform_required)
-        if named_cond_key not in cache:
-            trigger_id = next(self._id_counter)
-            requirements = self.spec_clauses(named_cond, body=True, required_from=name)
-
-            if transform_required:
-                requirements = transform_required(named_cond, requirements)
-
-            cache[named_cond_key] = (trigger_id, requirements)
-        trigger_id, requirements = cache[named_cond_key]
+        def make_requirements():
+            requirements = self.spec_clauses(named_cond, body=True, required_from=name)
+            return transform(named_cond, imposed_spec, requirements, transform_required)
+
+        trigger_id = self._lookup_condition_id(
+            named_cond, transform_required, make_requirements, self._trigger_cache
+        )
         self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_trigger(condition_id, trigger_id)))

         if not imposed_spec:
             return condition_id

-        cache = self._effect_cache[named_cond.name]
-        imposed_spec_key = (str(imposed_spec), transform_imposed)
-        if imposed_spec_key not in cache:
-            effect_id = next(self._id_counter)
-            requirements = self.spec_clauses(imposed_spec, body=False, required_from=name)
-
-            if transform_imposed:
-                requirements = transform_imposed(imposed_spec, requirements)
-
-            cache[imposed_spec_key] = (effect_id, requirements)
-        effect_id, requirements = cache[imposed_spec_key]
+        def make_impositions():
+            impositions = self.spec_clauses(imposed_spec, body=False, required_from=name)
+            return transform(named_cond, imposed_spec, impositions, transform_imposed)
+
+        effect_id = self._lookup_condition_id(
+            imposed_spec, transform_imposed, make_impositions, self._effect_cache
+        )
         self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_effect(condition_id, effect_id)))

         return condition_id

     def impose(self, condition_id, imposed_spec, node=True, name=None, body=False):

@@ -1604,6 +1655,27 @@ def package_provider_rules(self, pkg):
             )
             self.gen.newline()

+    def transform_my_version(
+        self, require: spack.spec.Spec, impose: spack.spec.Spec, funcs: List[AspFunction]
+    ) -> List[AspFunction]:
+        """Replace symbolic "my" version with reference to dependent's version."""
+        result = []
+        for f in funcs:
+            if not f.match(fn.attr("node_version_satisfies", any, r"^my\.version$")):
+                result.append(f)
+                continue
+
+            # get Version from version(Package, Version) and generate
+            # node_version_satisfies(dep, Version)
+            dep = f.args[1]
+            sync = fn.attr("sync", dep, "node_version_satisfies", fn.attr("version", require.name))
+            result.append(sync)
+
+            # remember to generate version_satisfies/3 for my.version constraints
+            self.synced_version_constraints.add((require.name, dep))
+
+        return result
+
     def package_dependencies_rules(self, pkg):
         """Translate 'depends_on' directives into ASP logic."""
         for _, conditions in sorted(pkg.dependencies.items()):

@@ -1628,14 +1700,12 @@ def package_dependencies_rules(self, pkg):
                 else:
                     pass

-                def track_dependencies(input_spec, requirements):
-                    return requirements + [fn.attr("track_dependencies", input_spec.name)]
+                def track_dependencies(required, imposed, requirements):
+                    return requirements + [fn.attr("track_dependencies", required.name)]

-                def dependency_holds(input_spec, requirements):
-                    return remove_node(input_spec, requirements) + [
-                        fn.attr(
-                            "dependency_holds", pkg.name, input_spec.name, dt.flag_to_string(t)
-                        )
+                def dependency_holds(required, imposed, impositions):
+                    return impositions + [
+                        fn.attr("dependency_holds", pkg.name, imposed.name, dt.flag_to_string(t))
                         for t in dt.ALL_FLAGS
                         if t & depflag
                     ]

@@ -1645,8 +1715,8 @@ def dependency_holds(input_spec, requirements):
                     dep.spec,
                     name=pkg.name,
                     msg=msg,
-                    transform_required=track_dependencies,
-                    transform_imposed=dependency_holds,
+                    transform_required=[track_dependencies],
+                    transform_imposed=[remove_node, dependency_holds, self.transform_my_version],
                 )

                 self.gen.newline()

@@ -1744,15 +1814,13 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
             try:
                 # With virtual we want to emit "node" and "virtual_node" in imposed specs
-                transform: Optional[TransformFunction] = remove_node
-                if virtual:
-                    transform = None
+                transform = None if virtual else [remove_node]

                 member_id = self.condition(
                     required_spec=when_spec,
                     imposed_spec=spec,
                     name=pkg_name,
-                    transform_imposed=transform,
+                    transform_imposed=[transform],
                 )
             except Exception as e:

@@ -1817,14 +1885,14 @@ def external_packages(self):
         for local_idx, spec in enumerate(external_specs):
             msg = "%s available as external when satisfying %s" % (spec.name, spec)

-            def external_imposition(input_spec, _):
-                return [fn.attr("external_conditions_hold", input_spec.name, local_idx)]
+            def external_imposition(required, imposed, _):
+                return [fn.attr("external_conditions_hold", imposed.name, local_idx)]

             self.condition(
                 spec,
                 spack.spec.Spec(spec.name),
                 msg=msg,
-                transform_imposed=external_imposition,
+                transform_imposed=[external_imposition],
             )
             self.possible_versions[spec.name].add(spec.version)
             self.gen.newline()

@@ -2415,6 +2483,15 @@ def generate_possible_compilers(self, specs):

     def define_version_constraints(self):
         """Define what version_satisfies(...) means in ASP logic."""
+        # quadratic for now b/c we're anticipating pkg_ver being an
+        # expression/range/etc. right now this only does exact matches until we can
+        # propagate an expression from depends_on to here.
+        for pkg, dep in self.synced_version_constraints:
+            for pkg_ver in self.possible_versions[pkg]:
+                for dep_ver in self.possible_versions[dep]:
+                    if dep_ver.satisfies(pkg_ver):
+                        self.gen.fact(fn.pkg_fact(dep, fn.version_satisfies(dep_ver, pkg_ver)))
+
         for pkg_name, versions in sorted(self.version_constraints):
             # generate facts for each package constraint and the version
             # that satisfies it

@@ -2603,11 +2680,7 @@ def setup(
         dev_specs = tuple(
             spack.spec.Spec(info["spec"]).constrained(
                 "dev_path=%s"
-                % spack.util.path.canonicalize_path(
-                    info["path"],
-                    default_wd=env.path,
-                    replacements=spack.paths.path_replacements(),
-                )
+                % spack.util.path.canonicalize_path(info["path"], default_wd=env.path)
             )
             for name, info in env.dev_specs.items()
         )

@@ -3124,9 +3197,7 @@ def _develop_specs_from_env(spec, env):
     if not dev_info:
         return

-    path = spack.util.path.canonicalize_path(
-        dev_info["path"], default_wd=env.path, replacements=spack.paths.path_replacements()
-    )
+    path = spack.util.path.canonicalize_path(dev_info["path"], default_wd=env.path)

     if "dev_path" in spec.variants:
         error_msg = (


@@ -366,6 +366,11 @@ attr(Name, node(X, A1), A2) :- impose(ID, PackageNode), imposed_constrai
attr(Name, node(X, A1), A2, A3) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3), imposed_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name).
attr(Name, node(X, A1), A2, A3, A4) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3, A4), imposed_nodes(ID, PackageNode, node(X, A1)).

+attr(DepAttrName, DepNode, Value)
+  :- depends_on(node(X, Package), DepNode),
+     attr(PackageAttrName, node(X, Package), Value),
+     attr("sync", DepNode, DepAttrName, attr(PackageAttrName, Package)).
+
% For node flag sources we need to look at the condition_set of the source, since it is the dependent
% of the package on which I want to impose the constraint
attr("node_flag_source", node(X, A1), A2, node(Y, A3))


@@ -910,19 +910,23 @@ def flags():
                yield flags

    def __str__(self):
-        sorted_keys = [k for k in sorted(self.keys()) if self[k] != []]
-        cond_symbol = " " if len(sorted_keys) > 0 else ""
-        return (
-            cond_symbol
-            + " ".join(
-                key
-                + ('=="' if True in [f.propagate for f in self[key]] else '="')
-                + " ".join(self[key])
-                + '"'
-                for key in sorted_keys
-            )
-            + cond_symbol
-        )
+        sorted_items = sorted((k, v) for k, v in self.items() if v)
+
+        result = ""
+        for flag_type, flags in sorted_items:
+            normal = [f for f in flags if not f.propagate]
+            if normal:
+                result += f" {flag_type}={spack.parser.quote_if_needed(' '.join(normal))}"
+
+            propagated = [f for f in flags if f.propagate]
+            if propagated:
+                result += f" {flag_type}=={spack.parser.quote_if_needed(' '.join(propagated))}"
+
+        # TODO: somehow add this space only if something follows in Spec.format()
+        if sorted_items:
+            result += " "
+
+        return result
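
The rewritten `__str__` renders normal flags with `=` and propagated flags with `==`, quoting values only when necessary. Below is a standalone sketch of that rendering rule, with a simplified stand-in for `spack.parser.quote_if_needed` (the real helper may quote in more cases):

    from typing import Dict, List, NamedTuple

    class Flag(NamedTuple):
        value: str
        propagate: bool = False

    def quote_if_needed(value: str) -> str:
        # simplified assumption: quote only values containing whitespace
        return f"'{value}'" if any(c.isspace() for c in value) else value

    def render(flags: Dict[str, List[Flag]]) -> str:
        result = ""
        for flag_type, members in sorted((k, v) for k, v in flags.items() if v):
            normal = [f.value for f in members if not f.propagate]
            if normal:
                result += f" {flag_type}={quote_if_needed(' '.join(normal))}"
            propagated = [f.value for f in members if f.propagate]
            if propagated:
                result += f" {flag_type}=={quote_if_needed(' '.join(propagated))}"
        # trailing space, as in the TODO above
        if result:
            result += " "
        return result

    # render({"cflags": [Flag("-O2"), Flag("-g")]}) == " cflags='-O2 -g' "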
def _sort_by_dep_types(dspec: DependencySpec):

@@ -3128,9 +3132,7 @@ def flat_dependencies(self, **kwargs):
        except spack.error.UnsatisfiableSpecError as e:
            # Here, the DAG contains two instances of the same package
-            # with inconsistent constraints. Users cannot produce
-            # inconsistent specs like this on the command line: the
-            # parser doesn't allow it. Spack must be broken!
+            # with inconsistent constraints.
            raise InconsistentSpecError("Invalid Spec DAG: %s" % e.message) from e

    def index(self, deptype="all"):


@@ -150,7 +150,7 @@ def _resolve_paths(candidates):
    Adjustments involve removing extra $user from $tempdir if $tempdir includes
    $user and appending $user if it is not present in the path.
    """
-    temp_path = sup.canonicalize_path("$tempdir", replacements=spack.paths.path_replacements())
+    temp_path = sup.canonicalize_path("$tempdir")

    user = getpass.getuser()
    tmp_has_usr = user in temp_path.split(os.path.sep)

@@ -162,7 +162,7 @@ def _resolve_paths(candidates):
            path = path.replace("/$user", "", 1)

        # Ensure the path is unique per user.
-        can_path = sup.canonicalize_path(path, replacements=spack.paths.path_replacements())
+        can_path = sup.canonicalize_path(path)
        # When multiple users share a stage root, we can avoid conflicts between
        # them by adding a per-user subdirectory.
        # Avoid doing this on Windows to keep stage absolute path as short as possible.

@@ -199,10 +199,9 @@ def get_stage_root():
def _mirror_roots():
    mirrors = spack.config.get("mirrors")
    return [
-        sup.substitute_path_variables(root, replacements=spack.paths.path_replacements())
+        sup.substitute_path_variables(root)
        if root.endswith(os.sep)
-        else sup.substitute_path_variables(root, replacemnts=spack.paths.path_replacements())
-        + os.sep
+        else sup.substitute_path_variables(root) + os.sep
        for root in mirrors.values()
    ]


@@ -77,9 +77,7 @@ def parse_install_tree(config_dict):
    if isinstance(install_tree, str):
        tty.warn("Using deprecated format for configuring install_tree")
        unpadded_root = install_tree
-        unpadded_root = spack.util.path.canonicalize_path(
-            unpadded_root, replacements=spack.paths.path_replacements()
-        )
+        unpadded_root = spack.util.path.canonicalize_path(unpadded_root)
        # construct projection from previous values for backwards compatibility
        all_projection = config_dict.get(
            "install_path_scheme", spack.directory_layout.default_projections["all"]

@@ -88,9 +86,7 @@ def parse_install_tree(config_dict):
        projections = {"all": all_projection}
    else:
        unpadded_root = install_tree.get("root", DEFAULT_INSTALL_TREE_ROOT)
-        unpadded_root = spack.util.path.canonicalize_path(
-            unpadded_root, replacements=spack.paths.path_replacements()
-        )
+        unpadded_root = spack.util.path.canonicalize_path(unpadded_root)

        padded_length = install_tree.get("padded_length", False)
        if padded_length is True:

@@ -271,9 +267,7 @@ def _construct_upstream_dbs_from_install_roots(
    for install_root in reversed(install_roots):
        upstream_dbs = list(accumulated_upstream_dbs)
        next_db = spack.database.Database(
-            spack.util.path.canonicalize_path(
-                install_root, replacements=spack.paths.path_replacements()
-            ),
+            spack.util.path.canonicalize_path(install_root),
            is_upstream=True,
            upstream_dbs=upstream_dbs,
        )


@@ -10,7 +10,6 @@
import spack.config
import spack.extensions
-import spack.paths
from spack.util.path import canonicalize_path

@@ -77,10 +76,7 @@ def make_environment(dirs: Optional[Tuple[str, ...]] = None):
        # Default directories where to search for templates
        builtins = spack.config.get("config:template_dirs", ["$spack/share/spack/templates"])
        extensions = spack.extensions.get_template_dirs()
-        r = spack.paths.path_replacements()
-        dirs = tuple(
-            canonicalize_path(d, replacements=r) for d in itertools.chain(builtins, extensions)
-        )
+        dirs = tuple(canonicalize_path(d) for d in itertools.chain(builtins, extensions))

    # Loader for the templates
    loader = jinja2.FileSystemLoader(dirs)


@@ -30,6 +30,8 @@ def current_host_platform():
        current_platform = spack.platforms.Darwin()
    elif "Windows" in platform.system():
        current_platform = spack.platforms.Windows()
+    elif "FreeBSD" in platform.system():
+        current_platform = spack.platforms.FreeBSD()
    return current_platform


@@ -25,12 +25,12 @@
import spack.caches
import spack.config
import spack.fetch_strategy
-import spack.gpg
import spack.hooks.sbang as sbang
import spack.main
import spack.mirror
import spack.repo
import spack.store
+import spack.util.gpg
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util

@@ -344,10 +344,10 @@ def test_push_and_fetch_keys(mock_gnupghome):
    # dir 1: create a new key, record its fingerprint, and push it to a new
    # mirror
-    with spack.gpg.gnupghome_override(gpg_dir1):
-        spack.gpg.create(name="test-key", email="fake@test.key", expires="0", comment=None)
+    with spack.util.gpg.gnupghome_override(gpg_dir1):
+        spack.util.gpg.create(name="test-key", email="fake@test.key", expires="0", comment=None)

-        keys = spack.gpg.public_keys()
+        keys = spack.util.gpg.public_keys()
        assert len(keys) == 1
        fpr = keys[0]

@@ -355,12 +355,12 @@ def test_push_and_fetch_keys(mock_gnupghome):
    # dir 2: import the key from the mirror, and confirm that its fingerprint
    # matches the one created above
-    with spack.gpg.gnupghome_override(gpg_dir2):
-        assert len(spack.gpg.public_keys()) == 0
+    with spack.util.gpg.gnupghome_override(gpg_dir2):
+        assert len(spack.util.gpg.public_keys()) == 0

        bindist.get_keys(mirrors=mirrors, install=True, trust=True, force=True)

-        new_keys = spack.gpg.public_keys()
+        new_keys = spack.util.gpg.public_keys()
        assert len(new_keys) == 1
        assert new_keys[0] == fpr

@@ -672,7 +672,7 @@ def test_etag_fetching_304():
    # Test conditional fetch with etags. If the remote hasn't modified the file
    # it returns 304, which is an HTTPError in urllib-land. That should be
    # handled as success, since it means the local cache is up-to-date.
-    def response_304(request: urllib.request.Request, verify_ssl=True, timeout=10):
+    def response_304(request: urllib.request.Request):
        url = request.get_full_url()
        if url == "https://www.example.com/build_cache/index.json":
            assert request.get_header("If-none-match") == '"112a8bbc1b3f7f185621c1ee335f0502"'

@@ -694,7 +694,7 @@ def response_304(request: urllib.request.Request, verify_ssl=True, timeout=10):
def test_etag_fetching_200():
    # Test conditional fetch with etags. The remote has modified the file.
-    def response_200(request: urllib.request.Request, verify_ssl=True, timeout=10):
+    def response_200(request: urllib.request.Request):
        url = request.get_full_url()
        if url == "https://www.example.com/build_cache/index.json":
            assert request.get_header("If-none-match") == '"112a8bbc1b3f7f185621c1ee335f0502"'

@@ -722,7 +722,7 @@ def response_200(request: urllib.request.Request, verify_ssl=True, timeout=10):
def test_etag_fetching_404():
    # Test conditional fetch with etags. The remote has modified the file.
-    def response_404(request: urllib.request.Request, verify_ssl=True, timeout=10):
+    def response_404(request: urllib.request.Request):
        raise urllib.error.HTTPError(
            request.get_full_url(),
            404,

@@ -745,7 +745,7 @@ def test_default_index_fetch_200():
    index_json = '{"Hello": "World"}'
    index_json_hash = bindist.compute_hash(index_json)

-    def urlopen(request: urllib.request.Request, **kwargs):
+    def urlopen(request: urllib.request.Request):
        url = request.get_full_url()
        if url.endswith("index.json.hash"):
            return urllib.response.addinfourl(  # type: ignore[arg-type]

@@ -784,7 +784,7 @@ def test_default_index_dont_fetch_index_json_hash_if_no_local_hash():
    index_json = '{"Hello": "World"}'
    index_json_hash = bindist.compute_hash(index_json)

-    def urlopen(request: urllib.request.Request, **kwargs):
+    def urlopen(request: urllib.request.Request):
        url = request.get_full_url()
        if url.endswith("index.json"):
            return urllib.response.addinfourl(

@@ -813,7 +813,7 @@ def test_default_index_not_modified():
    index_json = '{"Hello": "World"}'
    index_json_hash = bindist.compute_hash(index_json)

-    def urlopen(request: urllib.request.Request, **kwargs):
+    def urlopen(request: urllib.request.Request):
        url = request.get_full_url()
        if url.endswith("index.json.hash"):
            return urllib.response.addinfourl(

@@ -838,7 +838,7 @@ def test_default_index_invalid_hash_file(index_json):
    # Test invalid unicode / invalid hash type
    index_json_hash = bindist.compute_hash(index_json)

-    def urlopen(request: urllib.request.Request, **kwargs):
+    def urlopen(request: urllib.request.Request):
        return urllib.response.addinfourl(
            io.BytesIO(),
            headers={},  # type: ignore[arg-type]

@@ -858,7 +858,7 @@ def test_default_index_json_404():
    index_json = '{"Hello": "World"}'
    index_json_hash = bindist.compute_hash(index_json)

-    def urlopen(request: urllib.request.Request, **kwargs):
+    def urlopen(request: urllib.request.Request):
        url = request.get_full_url()
        if url.endswith("index.json.hash"):
            return urllib.response.addinfourl(


@@ -10,7 +10,6 @@
import spack.bootstrap.core
import spack.compilers
import spack.environment
-import spack.paths
import spack.store
import spack.util.path

@@ -82,9 +81,7 @@ def test_store_path_customization(config_value, expected, mutable_config):
    # Check the store path
    current = spack.bootstrap.config.store_path()
-    assert current == spack.util.path.canonicalize_path(
-        expected, replacements=spack.paths.path_replacements()
-    )
+    assert current == spack.util.path.canonicalize_path(expected)

def test_raising_exception_if_bootstrap_disabled(mutable_config):


@@ -16,9 +16,9 @@
import spack.config
import spack.environment as ev
import spack.error
-import spack.gpg
import spack.paths as spack_paths
import spack.util.git
+import spack.util.gpg
import spack.util.spack_yaml as syaml


@@ -25,7 +25,7 @@ def test_error_when_multiple_specs_are_given():
    assert "only takes one spec" in output

-@pytest.mark.parametrize("args", [("--", "/bin/bash", "-c", "echo test"), ("--",), ()])
+@pytest.mark.parametrize("args", [("--", "/bin/sh", "-c", "echo test"), ("--",), ()])
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_build_env_requires_a_spec(args):
    output = build_env(*args, fail_on_error=False)

@@ -35,7 +35,7 @@ def test_build_env_requires_a_spec(args):
_out_file = "env.out"

-@pytest.mark.parametrize("shell", ["pwsh", "bat"] if sys.platform == "win32" else ["bash"])
+@pytest.mark.parametrize("shell", ["pwsh", "bat"] if sys.platform == "win32" else ["sh"])
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_dump(shell_as, shell, tmpdir):
    with tmpdir.as_cwd():


@@ -331,3 +331,37 @@ def fake_push(node, push_url, options):
    # Ensure no duplicates
    assert len(set(packages_to_push)) == len(packages_to_push)

+@pytest.mark.parametrize("signed", [True, False])
+def test_push_and_install_with_mirror_marked_unsigned_does_not_require_extra_flags(
+    tmp_path, mutable_database, mock_gnupghome, signed
+):
+    """Tests whether marking a mirror as unsigned makes it possible to push and install to/from
+    it without requiring extra flags on the command line (and no signing keys configured)."""
+
+    # Create a named mirror with signed set to True or False
+    add_flag = "--signed" if signed else "--unsigned"
+    mirror("add", add_flag, "my-mirror", str(tmp_path))
+    spec = mutable_database.query_local("libelf", installed=True)[0]
+
+    # Push
+    if signed:
+        # Need to pass "--unsigned" to override the mirror's default
+        args = ["push", "--update-index", "--unsigned", "my-mirror", f"/{spec.dag_hash()}"]
+    else:
+        # No need to pass "--unsigned" if the mirror is unsigned
+        args = ["push", "--update-index", "my-mirror", f"/{spec.dag_hash()}"]
+    buildcache(*args)
+
+    # Install
+    if signed:
+        # Need to pass "--no-check-signature" to avoid install errors
+        kwargs = {"cache_only": True, "unsigned": True}
+    else:
+        # No need to pass "--no-check-signature" if the mirror is unsigned
+        kwargs = {"cache_only": True}
+
+    spec.package.do_uninstall(force=True)
+    spec.package.do_install(**kwargs)


@@ -18,11 +18,11 @@
import spack.ci as ci
import spack.config
import spack.environment as ev
-import spack.gpg
import spack.hash_types as ht
import spack.main
import spack.paths as spack_paths
import spack.repo as repo
+import spack.util.gpg
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
from spack.schema.buildcache_spec import schema as specfile_schema

@@ -2000,7 +2000,7 @@ def test_ci_reproduce(
    install_script = os.path.join(working_dir.strpath, "install.sh")
    with open(install_script, "w") as fd:
-        fd.write("#!/bin/bash\n\n#fake install\nspack install blah\n")
+        fd.write("#!/bin/sh\n\n#fake install\nspack install blah\n")

    spack_info_file = os.path.join(working_dir.strpath, "spack_info.txt")
    with open(spack_info_file, "w") as fd:


@@ -50,8 +50,8 @@ def test_negative_integers_not_allowed_for_parallel_jobs(job_parser):
    [
        (['coreutils cflags="-O3 -g"'], ["-O3", "-g"], [False, False], []),
        (['coreutils cflags=="-O3 -g"'], ["-O3", "-g"], [True, True], []),
-        (["coreutils", "cflags=-O3 -g"], ["-O3"], [False], ["g"]),
-        (["coreutils", "cflags==-O3 -g"], ["-O3"], [True], ["g"]),
+        (["coreutils", "cflags=-O3 -g"], ["-O3", "-g"], [False, False], []),
+        (["coreutils", "cflags==-O3 -g"], ["-O3", "-g"], [True, True], []),
        (["coreutils", "cflags=-O3", "-g"], ["-O3"], [False], ["g"]),
    ],
)


@@ -10,7 +10,6 @@
import llnl.util.filesystem as fs

import spack.environment as ev
-import spack.paths
import spack.spec
from spack.main import SpackCommand

@@ -107,9 +106,7 @@ def test_develop_canonicalize_path(self, monkeypatch, config):
        env("create", "test")
        with ev.read("test") as e:
            path = "../$user"
-            abspath = spack.util.path.canonicalize_path(
-                path, e.path, replacements=spack.paths.path_replacements()
-            )
+            abspath = spack.util.path.canonicalize_path(path, e.path)

            def check_path(stage, dest):
                assert dest == abspath

@@ -126,9 +123,7 @@ def test_develop_canonicalize_path_no_args(self, monkeypatch, config):
        env("create", "test")
        with ev.read("test") as e:
            path = "$user"
-            abspath = spack.util.path.canonicalize_path(
-                path, e.path, replacements=spack.paths.path_replacements()
-            )
+            abspath = spack.util.path.canonicalize_path(path, e.path)

            def check_path(stage, dest):
                assert dest == abspath


@@ -904,9 +904,7 @@ def test_env_with_included_config_var_path(tmpdir, packages_file):
    spack_yaml = env_path / ev.manifest_name
    spack_yaml.write_text(mpileaks_env_config(config_var_path))

-    config_real_path = substitute_path_variables(
-        config_var_path, replacements=spack.paths.path_replacements()
-    )
+    config_real_path = substitute_path_variables(config_var_path)
    shutil.move(included_file, config_real_path)
    assert os.path.exists(config_real_path)


@@ -10,9 +10,8 @@
import llnl.util.filesystem as fs

import spack.bootstrap
-import spack.gpg
-import spack.paths
import spack.util.executable
+import spack.util.gpg
from spack.main import SpackCommand
from spack.paths import mock_gpg_data_path, mock_gpg_keys_path
from spack.util.executable import ProcessError

@@ -46,19 +45,19 @@ def test_find_gpg(cmd_name, version, tmpdir, mock_gnupghome, monkeypatch):
    monkeypatch.setenv("PATH", str(tmpdir))

    if version == "undetectable" or version.endswith("1.3.4"):
-        with pytest.raises(spack.gpg.SpackGPGError):
-            spack.gpg.init(force=True, gpg_path=spack.paths.gpg_path)
+        with pytest.raises(spack.util.gpg.SpackGPGError):
+            spack.util.gpg.init(force=True)
    else:
-        spack.gpg.init(force=True, gpg_path=spack.paths.gpg_path)
-        assert spack.gpg.GPG is not None
-        assert spack.gpg.GPGCONF is not None
+        spack.util.gpg.init(force=True)
+        assert spack.util.gpg.GPG is not None
+        assert spack.util.gpg.GPGCONF is not None

def test_no_gpg_in_path(tmpdir, mock_gnupghome, monkeypatch, mutable_config):
    monkeypatch.setenv("PATH", str(tmpdir))
    bootstrap("disable")
    with pytest.raises(RuntimeError):
-        spack.gpg.init(force=True, gpg_path=spack.paths.gpg_path)
+        spack.util.gpg.init(force=True)

@pytest.mark.maybeslow

@@ -106,7 +105,7 @@ def test_gpg(tmpdir, mutable_config, mock_gnupghome):
        "Spack testing 1",
        "spack@googlegroups.com",
    )
-    keyfp = spack.gpg.signing_keys()[0]
+    keyfp = spack.util.gpg.signing_keys()[0]

    # List the keys.
    # TODO: Test the output here.


@@ -398,3 +398,12 @@ def test_mirror_set_2(mutable_config):
        "url": "http://example.com",
        "push": {"url": "http://example2.com", "access_pair": ["username", "password"]},
    }

+def test_mirror_add_set_signed(mutable_config):
+    mirror("add", "--signed", "example", "http://example.com")
+    assert spack.config.get("mirrors:example") == {"url": "http://example.com", "signed": True}
+    mirror("set", "--unsigned", "example")
+    assert spack.config.get("mirrors:example") == {"url": "http://example.com", "signed": False}
+    mirror("set", "--signed", "example")
+    assert spack.config.get("mirrors:example") == {"url": "http://example.com", "signed": True}


@@ -4,7 +4,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import re
-from textwrap import dedent

import pytest

@@ -74,41 +73,6 @@ def test_spec_parse_cflags_quoting():
    assert ["-flto", "-Os"] == gh_flagged.compiler_flags["cxxflags"]

-def test_spec_parse_unquoted_flags_report():
-    """Verify that a useful error message is produced if unquoted compiler flags are
-    provided."""
-    # This should fail during parsing, since /usr/include is interpreted as a spec hash.
-    with pytest.raises(spack.error.SpackError) as cm:
-        # We don't try to figure out how many following args were intended to be part of
-        # cflags, we just explain how to fix it for the immediate next arg.
-        spec("gcc cflags=-Os -pipe -other-arg-that-gets-ignored cflags=-I /usr/include")
-
-    # Verify that the generated error message is nicely formatted.
-    expected_message = dedent(
-        '''\
-        Some compiler or linker flags were provided without quoting their arguments,
-        which now causes spack to try to parse the *next* argument as a spec component
-        such as a variant instead of an additional compiler or linker flag. If the
-        intent was to set multiple flags, try quoting them together as described below.
-
-        Possible flag quotation errors (with the correctly-quoted version after the =>):
-        (1) cflags=-Os -pipe => cflags="-Os -pipe"
-        (2) cflags=-I /usr/include => cflags="-I /usr/include"'''
-    )
-
-    assert expected_message in str(cm.value)
-
-    # Verify that the same unquoted cflags report is generated in the error message even
-    # if it fails during concretization, not just during parsing.
-    with pytest.raises(spack.error.SpackError) as cm:
-        spec("gcc cflags=-Os -pipe")
-    cm = str(cm.value)
-    assert cm.startswith(
-        'trying to set variant "pipe" in package "gcc", but the package has no such variant'
-    )
-    assert cm.endswith('(1) cflags=-Os -pipe => cflags="-Os -pipe"')
-
def test_spec_yaml():
    output = spec("--yaml", "mpileaks")


@@ -253,8 +253,8 @@ def test_get_compiler_link_paths_load_env(working_env, monkeypatch, tmpdir):
    gcc = str(tmpdir.join("gcc"))
    with open(gcc, "w") as f:
        f.write(
-            """#!/bin/bash
-if [[ $ENV_SET == "1" && $MODULE_LOADED == "1" ]]; then
+            """#!/bin/sh
+if [ "$ENV_SET" = "1" ] && [ "$MODULE_LOADED" = "1" ]; then
    echo '"""
            + no_flag_output
            + """'

@@ -699,8 +699,8 @@ def test_compiler_get_real_version(working_env, monkeypatch, tmpdir):
    gcc = str(tmpdir.join("gcc"))
    with open(gcc, "w") as f:
        f.write(
-            """#!/bin/bash
-if [[ $CMP_ON == "1" ]]; then
+            """#!/bin/sh
+if [ "$CMP_ON" = "1" ]; then
    echo "$CMP_VER"
fi
"""

@@ -745,8 +745,8 @@ def test_compiler_get_real_version_fails(working_env, monkeypatch, tmpdir):
    gcc = str(tmpdir.join("gcc"))
    with open(gcc, "w") as f:
        f.write(
-            """#!/bin/bash
-if [[ $CMP_ON == "1" ]]; then
+            """#!/bin/sh
+if [ "$CMP_ON" = "1" ]; then
    echo "$CMP_VER"
fi
"""

@@ -799,7 +799,7 @@ def test_compiler_flags_use_real_version(working_env, monkeypatch, tmpdir):
    gcc = str(tmpdir.join("gcc"))
    with open(gcc, "w") as f:
        f.write(
-            """#!/bin/bash
+            """#!/bin/sh
echo "4.4.4"
"""
        )  # Version for which c++11 flag is -std=c++0x


@@ -338,51 +338,50 @@ def __init__(self, path):
def test_substitute_config_variables(mock_low_high_config, monkeypatch):
    prefix = spack.paths.prefix.lstrip("/")

-    r = spack.paths.path_replacements()
    assert cross_plat_join(
        os.sep + os.path.join("foo", "bar", "baz"), prefix
-    ) == spack_path.canonicalize_path("/foo/bar/baz/$spack", replacements=r)
+    ) == spack_path.canonicalize_path("/foo/bar/baz/$spack")

    assert cross_plat_join(
        spack.paths.prefix, os.path.join("foo", "bar", "baz")
-    ) == spack_path.canonicalize_path("$spack/foo/bar/baz/", replacements=r)
+    ) == spack_path.canonicalize_path("$spack/foo/bar/baz/")

    assert cross_plat_join(
        os.sep + os.path.join("foo", "bar", "baz"), prefix, os.path.join("foo", "bar", "baz")
-    ) == spack_path.canonicalize_path("/foo/bar/baz/$spack/foo/bar/baz/", replacements=r)
+    ) == spack_path.canonicalize_path("/foo/bar/baz/$spack/foo/bar/baz/")

    assert cross_plat_join(
        os.sep + os.path.join("foo", "bar", "baz"), prefix
-    ) == spack_path.canonicalize_path("/foo/bar/baz/${spack}", replacements=r)
+    ) == spack_path.canonicalize_path("/foo/bar/baz/${spack}")

    assert cross_plat_join(
        spack.paths.prefix, os.path.join("foo", "bar", "baz")
-    ) == spack_path.canonicalize_path("${spack}/foo/bar/baz/", replacements=r)
+    ) == spack_path.canonicalize_path("${spack}/foo/bar/baz/")

    assert cross_plat_join(
        os.sep + os.path.join("foo", "bar", "baz"), prefix, os.path.join("foo", "bar", "baz")
-    ) == spack_path.canonicalize_path("/foo/bar/baz/${spack}/foo/bar/baz/", replacements=r)
+    ) == spack_path.canonicalize_path("/foo/bar/baz/${spack}/foo/bar/baz/")

    assert cross_plat_join(
        os.sep + os.path.join("foo", "bar", "baz"), prefix, os.path.join("foo", "bar", "baz")
-    ) != spack_path.canonicalize_path("/foo/bar/baz/${spack/foo/bar/baz/", replacements=r)
+    ) != spack_path.canonicalize_path("/foo/bar/baz/${spack/foo/bar/baz/")

    # $env replacement is a no-op when no environment is active
    assert spack_path.canonicalize_path(
-        os.sep + os.path.join("foo", "bar", "baz", "$env"), replacements=r
+        os.sep + os.path.join("foo", "bar", "baz", "$env")
    ) == os.sep + os.path.join("foo", "bar", "baz", "$env")

    # Fake an active environment and $env is replaced properly
    fake_env_path = os.sep + os.path.join("quux", "quuux")
    monkeypatch.setattr(ev, "active_environment", lambda: MockEnv(fake_env_path))
-    assert spack_path.canonicalize_path("$env/foo/bar/baz", replacements=r) == os.path.join(
+    assert spack_path.canonicalize_path("$env/foo/bar/baz") == os.path.join(
        fake_env_path, os.path.join("foo", "bar", "baz")
    )

    # relative paths without source information are relative to cwd
-    assert spack_path.canonicalize_path(
-        os.path.join("foo", "bar", "baz"), replacements=r
-    ) == os.path.abspath(os.path.join("foo", "bar", "baz"))
+    assert spack_path.canonicalize_path(os.path.join("foo", "bar", "baz")) == os.path.abspath(
+        os.path.join("foo", "bar", "baz")
+    )

    # relative paths with source information are relative to the file
    spack.config.set(

@@ -390,19 +389,19 @@ def test_substitute_config_variables(mock_low_high_config, monkeypatch):
    )
    spack.config.CONFIG.clear_caches()
    path = spack.config.get("modules:default:roots:lmod")
-    assert spack_path.canonicalize_path(path, replacements=r) == os.path.normpath(
+    assert spack_path.canonicalize_path(path) == os.path.normpath(
        os.path.join(mock_low_high_config.scopes["low"].path, os.path.join("foo", "bar", "baz"))
    )

    # test architecture information is in replacements
    assert spack_path.canonicalize_path(
-        os.path.join("foo", "$platform", "bar"), replacements=r
+        os.path.join("foo", "$platform", "bar")
    ) == os.path.abspath(os.path.join("foo", "test", "bar"))

    host_target = spack.platforms.host().target("default_target")
    host_target_family = str(host_target.microarchitecture.family)
    assert spack_path.canonicalize_path(
-        os.path.join("foo", "$target_family", "bar"), replacements=r
+        os.path.join("foo", "$target_family", "bar")
    ) == os.path.abspath(os.path.join("foo", host_target_family, "bar"))

@@ -439,33 +438,28 @@ def test_substitute_user(mock_low_high_config):
    assert os.sep + os.path.join(
        "foo", "bar"
    ) + os.sep + user + os.sep + "baz" == spack_path.canonicalize_path(
-        os.sep + os.path.join("foo", "bar", "$user", "baz"),
-        replacements=spack.paths.path_replacements(),
+        os.sep + os.path.join("foo", "bar", "$user", "baz")
    )

def test_substitute_user_cache(mock_low_high_config):
    user_cache_path = spack.paths.user_cache_path
    assert user_cache_path + os.sep + "baz" == spack_path.canonicalize_path(
-        os.path.join("$user_cache_path", "baz"), replacements=spack.paths.path_replacements()
+        os.path.join("$user_cache_path", "baz")
    )

def test_substitute_tempdir(mock_low_high_config):
    tempdir = tempfile.gettempdir()
-    assert tempdir == spack_path.canonicalize_path(
-        "$tempdir", replacements=spack.paths.path_replacements()
-    )
+    assert tempdir == spack_path.canonicalize_path("$tempdir")
    assert tempdir + os.sep + os.path.join("foo", "bar", "baz") == spack_path.canonicalize_path(
-        os.path.join("$tempdir", "foo", "bar", "baz"), replacements=spack.paths.path_replacements()
+        os.path.join("$tempdir", "foo", "bar", "baz")
    )

def test_substitute_date(mock_low_high_config):
    test_path = os.path.join("hello", "world", "on", "$date")
-    new_path = spack_path.canonicalize_path(
-        test_path, replacements=spack.paths.path_replacements()
-    )
+    new_path = spack_path.canonicalize_path(test_path)
    assert "$date" in test_path
    assert date.today().strftime("%Y-%m-%d") in new_path


@@ -39,7 +39,6 @@
import spack.directory_layout
import spack.environment as ev
import spack.error
-import spack.gpg
import spack.package_base
import spack.package_prefs
import spack.paths

@@ -51,6 +50,7 @@
import spack.test.cray_manifest
import spack.util.executable
import spack.util.git
+import spack.util.gpg
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
from spack.fetch_strategy import URLFetchStrategy

@@ -1074,13 +1074,13 @@ def mock_gnupghome(monkeypatch):
    # This comes up because tmp paths on macOS are already long-ish, and
    # pytest makes them longer.
    try:
-        spack.gpg.init(gpg_path=spack.paths.gpg_path)
-    except spack.gpg.SpackGPGError:
-        if not spack.gpg.GPG:
+        spack.util.gpg.init()
+    except spack.util.gpg.SpackGPGError:
+        if not spack.util.gpg.GPG:
            pytest.skip("This test requires gpg")

    short_name_tmpdir = tempfile.mkdtemp()
-    with spack.gpg.gnupghome_override(short_name_tmpdir):
+    with spack.util.gpg.gnupghome_override(short_name_tmpdir):
        yield short_name_tmpdir

    # clean up, since we are doing this manually


@@ -321,3 +321,18 @@ def inner():
        """
    ).format(__file__)
    )

+def test_grouped_exception_base_type():
+    h = llnl.util.lang.GroupedExceptionHandler()
+
+    with h.forward("catch-runtime-error", RuntimeError):
+        raise NotImplementedError()
+
+    with pytest.raises(NotImplementedError):
+        with h.forward("catch-value-error", ValueError):
+            raise NotImplementedError()
+
+    message = h.grouped_message(with_tracebacks=False)
+    assert "catch-runtime-error" in message
+    assert "catch-value-error" not in message


@@ -21,10 +21,10 @@
import spack.binary_distribution as bindist
import spack.cmd.buildcache as buildcache
import spack.error
-import spack.gpg
import spack.package_base
import spack.repo
import spack.store
+import spack.util.gpg
import spack.util.url as url_util
from spack.fetch_strategy import URLFetchStrategy
from spack.paths import mock_gpg_keys_path

@@ -72,7 +72,7 @@ def test_buildcache(mock_archive, tmp_path, monkeypatch, mutable_config):
    create_args = ["create", "-f", "--rebuild-index", mirror_path, pkghash]
    # Create a private key to sign package with if gpg2 available
-    spack.gpg.create(
+    spack.util.gpg.create(
        name="test key 1",
        expires="0",
        email="spack@googlegroups.com",


@@ -2,6 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
import itertools
import os
import re

@@ -9,6 +10,7 @@
import pytest

+import spack.cmd
import spack.platforms.test
import spack.spec
import spack.variant

@@ -203,7 +205,8 @@ def _specfile_for(spec_str, filename):
        "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1~qt_4 debug=2 ^stackwalker@8.1_1e",
    ),
    (
-        "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1 cppflags=-O3 +debug~qt_4 ^stackwalker@8.1_1e",  # noqa: E501
+        "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1 cppflags=-O3 +debug~qt_4 "
+        "^stackwalker@8.1_1e",
        [
            Token(TokenType.UNQUALIFIED_PACKAGE_NAME, value="mvapich_foo"),
            Token(TokenType.DEPENDENCY, value="^"),

@@ -217,7 +220,8 @@ def _specfile_for(spec_str, filename):
            Token(TokenType.UNQUALIFIED_PACKAGE_NAME, value="stackwalker"),
            Token(TokenType.VERSION, value="@8.1_1e"),
        ],
-        'mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1 cppflags="-O3" +debug~qt_4 ^stackwalker@8.1_1e',  # noqa: E501
+        "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1 cppflags=-O3 +debug~qt_4 "
+        "^stackwalker@8.1_1e",
    ),
    # Specs containing YAML or JSON in the package name
    (

@@ -424,7 +428,7 @@ def _specfile_for(spec_str, filename):
    compiler_with_version_range("%gcc@10.1.0,12.2.1:"),
    compiler_with_version_range("%gcc@:8.4.3,10.2.1:12.1.0"),
    # Special key value arguments
-    ("dev_path=*", [Token(TokenType.KEY_VALUE_PAIR, value="dev_path=*")], "dev_path=*"),
+    ("dev_path=*", [Token(TokenType.KEY_VALUE_PAIR, value="dev_path=*")], "dev_path='*'"),
    (
        "dev_path=none",
        [Token(TokenType.KEY_VALUE_PAIR, value="dev_path=none")],

@@ -444,33 +448,28 @@ def _specfile_for(spec_str, filename):
    (
        "cflags=a=b=c",
        [Token(TokenType.KEY_VALUE_PAIR, value="cflags=a=b=c")],
-        'cflags="a=b=c"',
+        "cflags='a=b=c'",
    ),
    (
        "cflags=a=b=c",
        [Token(TokenType.KEY_VALUE_PAIR, value="cflags=a=b=c")],
-        'cflags="a=b=c"',
+        "cflags='a=b=c'",
    ),
    (
        "cflags=a=b=c+~",
        [Token(TokenType.KEY_VALUE_PAIR, value="cflags=a=b=c+~")],
-        'cflags="a=b=c+~"',
+        "cflags='a=b=c+~'",
    ),
    (
        "cflags=-Wl,a,b,c",
        [Token(TokenType.KEY_VALUE_PAIR, value="cflags=-Wl,a,b,c")],
-        'cflags="-Wl,a,b,c"',
+        "cflags=-Wl,a,b,c",
    ),
    # Multi quoted
-    (
-        "cflags=''-Wl,a,b,c''",
-        [Token(TokenType.KEY_VALUE_PAIR, value="cflags=''-Wl,a,b,c''")],
-        'cflags="-Wl,a,b,c"',
-    ),
    (
        'cflags=="-O3 -g"',
        [Token(TokenType.PROPAGATED_KEY_VALUE_PAIR, value='cflags=="-O3 -g"')],
-        'cflags=="-O3 -g"',
+        "cflags=='-O3 -g'",
    ),
    # Whitespace is allowed in version lists
    ("@1.2:1.4 , 1.6 ", [Token(TokenType.VERSION, value="@1.2:1.4 , 1.6")], "@1.2:1.4,1.6"),

@@ -484,22 +483,6 @@ def _specfile_for(spec_str, filename):
        ],
        "a@1:",
    ),
-    (
-        "@1.2: develop = foo",
-        [
-            Token(TokenType.VERSION, value="@1.2:"),
-            Token(TokenType.KEY_VALUE_PAIR, value="develop = foo"),
-        ],
-        "@1.2: develop=foo",
-    ),
-    (
-        "@1.2:develop = foo",
-        [
-            Token(TokenType.VERSION, value="@1.2:"),
-            Token(TokenType.KEY_VALUE_PAIR, value="develop = foo"),
-        ],
-        "@1.2: develop=foo",
-    ),
    (
        "% intel @ 12.1:12.6 + debug",
        [

@@ -587,8 +570,8 @@ def _specfile_for(spec_str, filename):
)
def test_parse_single_spec(spec_str, tokens, expected_roundtrip):
    parser = SpecParser(spec_str)
-    assert parser.tokens() == tokens
-    assert str(parser.next_spec()) == expected_roundtrip
+    assert tokens == parser.tokens()
+    assert expected_roundtrip == str(parser.next_spec())

@pytest.mark.parametrize(

@@ -654,19 +637,79 @@ def test_parse_multiple_specs(text, tokens, expected_specs):
        assert str(total_parser.next_spec()) == str(single_spec_parser.next_spec())

+@pytest.mark.parametrize(
+    "args,expected",
+    [
+        # Test that CLI-quoted flags/variant values are preserved
+        (["zlib", "cflags=-O3 -g", "+bar", "baz"], "zlib cflags='-O3 -g' +bar baz"),
+        # Test that CLI-quoted propagated flags/variant values are preserved
+        (["zlib", "cflags==-O3 -g", "+bar", "baz"], "zlib cflags=='-O3 -g' +bar baz"),
+        # An entire string passed on the CLI with embedded quotes also works
+        (["zlib cflags='-O3 -g' +bar baz"], "zlib cflags='-O3 -g' +bar baz"),
+        # Entire string *without* quoted flags splits -O3/-g (-g interpreted as a variant)
+        (["zlib cflags=-O3 -g +bar baz"], "zlib cflags=-O3 +bar~g baz"),
+        # If the entirety of "-O3 -g +bar baz" is quoted on the CLI, it's all taken as flags
+        (["zlib", "cflags=-O3 -g +bar baz"], "zlib cflags='-O3 -g +bar baz'"),
+        # If the string doesn't start with key=, it needs internal quotes for flags
+        (["zlib", " cflags=-O3 -g +bar baz"], "zlib cflags=-O3 +bar~g baz"),
+        # Internal quotes for quoted CLI args are considered part of *one* arg
+        (["zlib", 'cflags="-O3 -g" +bar baz'], """zlib cflags='"-O3 -g" +bar baz'"""),
+        # Use double quotes if internal single quotes are present
+        (["zlib", "cflags='-O3 -g' +bar baz"], '''zlib cflags="'-O3 -g' +bar baz"'''),
+        # Use single quotes and escape single quotes with internal single and double quotes
+        (["zlib", "cflags='-O3 -g' \"+bar baz\""], 'zlib cflags="\'-O3 -g\' \\"+bar baz\\""'),
+        # Ensure that empty strings are handled correctly on CLI
+        (["zlib", "ldflags=", "+pic"], "zlib+pic"),
+        # These flags are assumed to be quoted by the shell, but the space doesn't matter because
+        # flags are space-separated.
+        (["zlib", "ldflags= +pic"], "zlib ldflags='+pic'"),
+        (["ldflags= +pic"], "ldflags='+pic'"),
+        # If the name is not a flag name, the space is preserved verbatim, because variant values
+        # are comma-separated.
+        (["zlib", "foo= +pic"], "zlib foo=' +pic'"),
+        (["foo= +pic"], "foo=' +pic'"),
+        # You can ensure no quotes are added parse_specs() by starting your string with space,
+        # but you still need to quote empty strings properly.
+        ([" ldflags= +pic"], SpecTokenizationError),
+        ([" ldflags=", "+pic"], SpecTokenizationError),
+        ([" ldflags='' +pic"], "+pic"),
+        ([" ldflags=''", "+pic"], "+pic"),
+        # Ensure that empty strings are handled properly in quoted strings
+        (["zlib ldflags='' +pic"], "zlib+pic"),
+        # Ensure that $ORIGIN is handled correctly
+        (["zlib", "ldflags=-Wl,-rpath=$ORIGIN/_libs"], "zlib ldflags='-Wl,-rpath=$ORIGIN/_libs'"),
+        # Ensure that passing escaped quotes on the CLI raises a tokenization error
+        (["zlib", '"-g', '-O2"'], SpecTokenizationError),
+    ],
+)
+def test_cli_spec_roundtrip(args, expected):
+    if inspect.isclass(expected) and issubclass(expected, BaseException):
+        with pytest.raises(expected):
+            spack.cmd.parse_specs(args)
+        return
+
+    specs = spack.cmd.parse_specs(args)
+    output_string = " ".join(str(spec) for spec in specs)
+    assert expected == output_string

@pytest.mark.parametrize(
    "text,expected_in_error",
    [
-        ("x@@1.2", "x@@1.2\n ^^^^^"),
-        ("y ^x@@1.2", "y ^x@@1.2\n ^^^^^"),
-        ("x@1.2::", "x@1.2::\n ^"),
-        ("x::", "x::\n ^^"),
+        ("x@@1.2", r"x@@1.2\n ^"),
+        ("y ^x@@1.2", r"y ^x@@1.2\n ^"),
+        ("x@1.2::", r"x@1.2::\n ^"),
+        ("x::", r"x::\n ^^"),
+        ("cflags=''-Wl,a,b,c''", r"cflags=''-Wl,a,b,c''\n ^ ^ ^ ^^"),
+        ("@1.2: develop = foo", r"@1.2: develop = foo\n ^^"),
+        ("@1.2:develop = foo", r"@1.2:develop = foo\n ^^"),
    ],
)
def test_error_reporting(text, expected_in_error):
    parser = SpecParser(text)
    with pytest.raises(SpecTokenizationError) as exc:
        parser.tokens()

    assert expected_in_error in str(exc), parser.tokens()


@@ -734,7 +734,7 @@ def test_resolve_paths(self):
        assert spack.stage._resolve_paths(paths) == paths

        tempdir = "$tempdir"
-        can_tempdir = canonicalize_path(tempdir, replacements=spack.paths.path_replacements())
+        can_tempdir = canonicalize_path(tempdir)
        user = getpass.getuser()
        temp_has_user = user in can_tempdir.split(os.sep)
        paths = [

@@ -744,8 +744,7 @@ def test_resolve_paths(self):
            os.path.join(tempdir, "$user", "stage", "$user"),
        ]

-        r = spack.paths.path_replacements()
-        res_paths = [canonicalize_path(p, replacements=r) for p in paths]
+        res_paths = [canonicalize_path(p) for p in paths]
        if temp_has_user:
            res_paths[1] = can_tempdir
            res_paths[2] = os.path.join(can_tempdir, user)


@@ -7,7 +7,6 @@
import pytest

import spack.config
-import spack.paths
import spack.tengine as tengine
from spack.util.path import canonicalize_path

@@ -71,9 +70,8 @@ class TestTengineEnvironment:
    def test_template_retrieval(self):
        """Tests the template retrieval mechanism hooked into config files"""
        # Check the directories are correct
-        r = spack.paths.path_replacements()
        template_dirs = spack.config.get("config:template_dirs")
-        template_dirs = tuple([canonicalize_path(x, replacements=r) for x in template_dirs])
+        template_dirs = tuple([canonicalize_path(x) for x in template_dirs])
        assert len(template_dirs) == 3

        env = tengine.make_environment(template_dirs)


@@ -350,8 +350,8 @@ def _which(*args, **kwargs):
def test_url_fetch_text_without_url(tmpdir):
-    with pytest.raises(web_util.WebError, match="URL is required"):
-        web_util.fetch_url_text(None, fetch_method=spack.config.get("config:url_fetch_method"))
+    with pytest.raises(spack.error.FetchError, match="URL is required"):
+        web_util.fetch_url_text(None)

def test_url_fetch_text_curl_failures(tmpdir, monkeypatch):

@@ -367,20 +367,18 @@ def _which(*args, **kwargs):
    monkeypatch.setattr(spack.util.web, "which", _which)

    with spack.config.override("config:url_fetch_method", "curl"):
-        with pytest.raises(web_util.WebError, match="Missing required curl"):
-            web_util.fetch_url_text(
-                "https://github.com/", fetch_method=spack.config.get("config:url_fetch_method")
-            )
+        with pytest.raises(spack.error.FetchError, match="Missing required curl"):
+            web_util.fetch_url_text("https://github.com/")

def test_url_check_curl_errors():
    """Check that standard curl error returncodes raise expected errors."""
    # Check returncode 22 (i.e., 404)
-    with pytest.raises(web_util.WebError, match="not found"):
+    with pytest.raises(spack.error.FetchError, match="not found"):
        web_util.check_curl_code(22)

    # Check returncode 60 (certificate error)
-    with pytest.raises(web_util.WebError, match="invalid certificate"):
+    with pytest.raises(spack.error.FetchError, match="invalid certificate"):
        web_util.check_curl_code(60)

@@ -397,11 +395,8 @@ def _which(*args, **kwargs):
    monkeypatch.setattr(spack.util.web, "which", _which)

    with spack.config.override("config:url_fetch_method", "curl"):
-        with pytest.raises(web_util.WebError, match="Missing required curl"):
-            web_util.url_exists(
-                "https://github.com/",
-                fetch_method=spack.config.get("config:url_fetch_method", "urllib"),
-            )
+        with pytest.raises(spack.error.FetchError, match="Missing required curl"):
+            web_util.url_exists("https://github.com/")

def test_url_fetch_text_urllib_bad_returncode(tmpdir, monkeypatch):

@@ -415,20 +410,16 @@ def _read_from_url(*args, **kwargs):
    monkeypatch.setattr(spack.util.web, "read_from_url", _read_from_url)

    with spack.config.override("config:url_fetch_method", "urllib"):
-        with pytest.raises(web_util.WebError, match="failed with error code"):
-            web_util.fetch_url_text(
-                "https://github.com/", fetch_method=spack.config.get("config:url_fetch_method")
-            )
+        with pytest.raises(spack.error.FetchError, match="failed with error code"):
+            web_util.fetch_url_text("https://github.com/")

def test_url_fetch_text_urllib_web_error(tmpdir, monkeypatch):
    def _raise_web_error(*args, **kwargs):
-        raise web_util.WebError("bad url")
+        raise web_util.SpackWebError("bad url")

    monkeypatch.setattr(spack.util.web, "read_from_url", _raise_web_error)

    with spack.config.override("config:url_fetch_method", "urllib"):
-        with pytest.raises(web_util.WebError, match="fetch failed to verify"):
-            web_util.fetch_url_text(
-                "https://github.com/", fetch_method=spack.config.get("config:url_fetch_method")
-            )
+        with pytest.raises(spack.error.FetchError, match="fetch failed to verify"):
+            web_util.fetch_url_text("https://github.com/")


@@ -9,9 +9,9 @@
import pytest

import spack.directives
-import spack.package_hash as ph
import spack.paths
import spack.repo
+import spack.util.package_hash as ph
from spack.spec import Spec
from spack.util.unparse import unparse

Some files were not shown because too many files have changed in this diff.