Compare commits

...

951 Commits

Author SHA1 Message Date
Todd Gamblin
e3fc937b32 WIP 2023-12-14 14:27:59 -08:00
Todd Gamblin
977f6ce65c solver: refactor transforms in condition generation
- [x] allow caller of `condition()` to pass lists of transforms
- [x] all transform functions now take trigger *and* effect as parameters
- [x] add some utility functions to simplify `condition()`
2023-12-14 13:50:41 -08:00
Todd Gamblin
a690b8c27c Improve parsing of quoted flags and variants in specs (#41529)
This PR does several things:

- [x] Allow any character to appear in the quoted values of variants and flags.
- [x] Allow easier passing of quoted flags on the command line, e.g. `cflags="-O2 -g"`.
- [x] Handle quoting better in spec output, using single quotes around double 
      quotes and vice versa.
- [x] Disallow spaces around `=` and `==` when parsing variants and flags.

## Motivation

This PR is motivated by the issues above and by ORNL's 
[tips for launching at scale on Frontier](https://docs.olcf.ornl.gov/systems/frontier_user_guide.html#tips-for-launching-at-scale).
ORNL recommends using `sbcast --send-libs` to broadcast executables and their
libraries to compute nodes when running large jobs (e.g., 80k ranks). For an
executable named `exe`, `sbcast --send-libs` stores the needed libraries in a
directory alongside the executable called `exe_libs`. ORNL recommends pointing
`LD_LIBRARY_PATH` at that directory so that `exe` will find the local libraries and
not overwhelm the filesystem.
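
For illustration, here is a minimal sketch of that workflow (the paths, rank count, and
launcher arguments are placeholders, not taken verbatim from the ORNL docs):

    sbcast --send-libs ./exe /tmp/exe      # copy exe and its libraries to each compute node
    export LD_LIBRARY_PATH=/tmp/exe_libs   # sbcast places the libraries in <dest>_libs
    srun -n 80000 /tmp/exe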

There are other ways to mitigate this problem:
* You could build with `RUNPATH` using `spack config add config:shared_linking:type:runpath`,
  which would make `LD_LIBRARY_PATH` take precedence over Spack's `RUNPATHs`.
  I don't recommend this one because `RUNPATH` can cause many other things to go wrong.
* You could use `spack config add config:shared_linking:bind:true`, added in #31948, which
  will greatly reduce the filesystem load for large jobs by pointing `DT_NEEDED` entries in
  ELF *directly* at the needed `.so` files instead of relying on `RPATH` search via soname.
  I have not experimented with this at 80,000 ranks, but it should help quite a bit.
* You could use [Spindle](https://github.com/hpc/Spindle) (as LLNL does on its machines)
  which should transparently fix this without any changes to your executable and without
  any need to use `sbcast` or other tools.

But we want to support the `sbcast` use case as well.

## `sbcast` and Spack

Spack's `RPATHs` break the `sbcast` fix because they're considered with higher precedence
than `LD_LIBRARY_PATH`. So Spack applications will still end up hitting the shared filesystem
when searching for libraries. We can avoid this by injecting some `ldflags` into the build, e.g.,
if we were going to launch, say, `LAMMPS` at scale, we could add another `RPATH`
specifically for use with `sbcast`:

    spack install lammps ldflags='-Wl,-rpath=$ORIGIN/lmp_libs'

This will put the `lmp_libs` directory alongside `LAMMPS`'s `lmp` executable first in the
`RPATH`, so it will be searched before any directories on the shared filesystem.
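
To verify the ordering, something like this should show the `$ORIGIN/lmp_libs` entry ahead of
Spack's usual `RPATH` directories (a sketch; it assumes the executable is installed as `bin/lmp`,
and output formatting depends on your binutils):

    readelf -d $(spack location -i lammps)/bin/lmp | grep -iE 'rpath|runpath'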

## Issues with quoting

Before this PR, the command above would've errored out for two reasons:

1. `$` wasn't an allowed character in our spec parser.
2. You would've had to double quote the flags to get them to pass through correctly:

       spack install lammps ldflags='"-Wl,-rpath=$ORIGIN/lmp_libs"'

This is ugly and I don't think many users will easily figure it out. The behavior was added in
#29282, and it improved parsing of specs passed as a single string, e.g.:

    spack install 'lammps ldflags="-Wl,-rpath=$ORIGIN/lmp_libs"'

but a lot of users are naturally going to try to quote arguments *directly* on the command
line, without quoting their entire spec. #29282 used a heuristic to detect unquoted flags
and warn the user, but the warning could be confusing. In particular, if you wrote
`cflags="-O2 -g"` on the command line, it would break the flags up, warn, and tell you
that you could fix the issue by writing `cflags="-O2 -g"` even though you just wrote
that. It's telling you to *quote* that value, but the user has to know to double quote.

## New heuristic for quoted arguments from the CLI

There are only two places where we allow arbitrary quoted strings in specs: flags and
variant values, so this PR adds a simpler heuristic to the CLI parser: if an argument in
`sys.argv` starts with `name=...`, then we assume the whole argument is quoted.

This means you can write:

    spack install bzip2 cflags="-O2 -g"

directly on the command line, without multiple levels of quoting. This also works:

    spack install 'bzip2 cflags="-O2 -g"'

The only place where this heuristic runs into ambiguity is if you attempt to pass
anonymous specs that start with `name=...` as one large string. e.g., this will be
interpreted as one large flag value:

    spack find 'cflags="-O2 -g" ~bar +baz'

This sets `cflags` to `"-O2 -g" ~bar +baz`, which is likely not what you wanted. You
can fix this easily by either removing the quotes:

    spack find cflags="-O2 -g" ~bar +baz

Or by adding a space at the start, which has the same effect:

    spack find ' cflags="-O2 -g" ~bar +baz'

You may wonder why we don't just look for quotes inside of flag arguments, and the
reason is that you *might* want them there.  If you are passing arguments like:

    spack install zlib cppflags="-D DEBUG_MSG1='quick fox' -D DEBUG_MSG2='lazy dog'"

You *need* the quotes there. So we've opted for one potentially confusing, but easily
fixed outcome vs. limiting what you can put in your quoted strings.

## Quotes in formatted spec output

In addition to being more lenient about characters accepted in quoted strings, this PR fixes
up spec formatting a bit. We now format quoted strings in specs with single quotes, unless
the string has a single quote in it, in which case we JSON-escape the string (i.e., we add
`\` before `"` and `\`).  

    zlib cflags='-D FOO="bar"'
    zlib cflags="-D FOO='bar'"
    zlib cflags="-D FOO='bar' BAR=\"baz\""
2023-12-13 16:36:22 -08:00
yizeyi18
a1fa862c3f camp: fixing build issue (#41400)
* adding necessary headers, to fix https://github.com/spack/spack/issues/41398

* deleting something imported by accident

* [@spackbot] updating style on behalf of yizeyi18

* undo commit 7688fed according to suggestion from @msimberg

* patching camp@:2022.10.1 for compatibility with gcc-13

* adding the patch

* fixing paths in the patch

* [@spackbot] updating style on behalf of yizeyi18

* Update camp patch using LLNL/camp@05e1c35

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* changing patch name

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-12-13 16:28:57 -08:00
Harmen Stoppels
80f31829a8 python: don't run mkdirp in setup_dependent_package (#41603)
`setup_dependent_package` is not a build phase; it should just set
globals for a package.

It's called during setup of the runtime environment of packages, and there
have been reports of it actually failing due to a read-only file system
(it is not clear under what exact conditions that is possible).
2023-12-13 23:54:28 +01:00
Arne Becker
84436f10ba perl-config-tiny: New package (#41584) 2023-12-13 14:43:03 -07:00
Arne Becker
660485709d perl-b-cow: New package (#41596) 2023-12-13 12:06:14 -08:00
Arne Becker
251dce05c9 perl-throwable: New package (#41597) 2023-12-13 12:05:17 -08:00
Arne Becker
ecd05fdfb4 perl-scope-guard: New package (#41598) 2023-12-13 12:02:50 -08:00
Arne Becker
9ffcf36444 perl-test-sharedfork: New package (#41599) 2023-12-13 12:01:54 -08:00
Arne Becker
07258a7c80 perl-safe-isa: New package (#41600) 2023-12-13 12:01:10 -08:00
Arne Becker
395e53a5e0 perl-proc-processtable: New package (#41601) 2023-12-13 11:59:05 -08:00
Arne Becker
77c331c753 perl-net-ip: New package (#41606) 2023-12-13 11:57:17 -08:00
Arne Becker
ee5481a861 perl-any-uri-escape: New package (#41607) 2023-12-13 11:56:11 -08:00
Arne Becker
f7ec061c64 perl-term-table: New package (#41608) 2023-12-13 11:47:08 -08:00
Arne Becker
7cb70ff4b1 perl-test-pod: New package (#41609) 2023-12-13 11:45:59 -08:00
Arne Becker
4a661f3255 perl-spiffy: New package (#41610) 2023-12-13 11:44:36 -08:00
Arne Becker
7037240879 perl-ipc-system-simple: New package (#41611) 2023-12-13 11:43:21 -08:00
Arne Becker
0e96dfaeef perl-mime-types: New package (#41612) 2023-12-13 11:41:48 -08:00
Arne Becker
a0a2cd6a1a perl-convert-nls-date-format: New package (#41613) 2023-12-13 11:39:11 -08:00
Arne Becker
170c05bebb perl-module-pluggable: New package (#41614) 2023-12-13 11:04:27 -08:00
Arne Becker
bdf68b7ac0 perl-email-date-format: New package (#41617) 2023-12-13 11:03:24 -08:00
Arne Becker
c176de94e2 perl-heap: New package (#41618) 2023-12-13 11:00:55 -08:00
Harmen Stoppels
f63dbbe75d spack mirror create --all: include patches (#41579) 2023-12-13 20:00:44 +01:00
Arne Becker
a0c7b10c76 perl-log-any: New package (#41619) 2023-12-13 10:59:54 -08:00
Arne Becker
2dc3bf0164 perl-http-cookiejar: New package (#41620) 2023-12-13 10:57:55 -08:00
Harmen Stoppels
9bf6e05d02 Revert "[protobuf] New versions, explicit cxxstd variant (#41459)" (#41635)
This reverts commit b82bd8e6b6.
2023-12-13 15:02:25 +01:00
Massimiliano Culpo
cd283846af mysql: add v8.0.35, fix build (#41602) 2023-12-13 12:10:33 +01:00
Taillefumier Mathieu
03625c1c95 Add pic variant when building the library (#41631)
* Add pic variant when building the library

* make pretty

* Probably better approach
2023-12-13 01:38:13 -07:00
Wouter Deconinck
f01774f1d4 hepmc3: fix from_variant -> self.define (#41605)
* hepmc3: fix from_variant -> self.define
* hepmc3: str on versions
2023-12-13 07:02:48 +01:00
Christopher Christofi
965860d1f8 perl-getopt-argvfile: add new package with version 1.11 (#41625) 2023-12-12 20:33:25 -07:00
Christopher Christofi
c4baf4e199 perl-parselex: add new package with version 2.21 (#41626) 2023-12-12 20:08:28 -07:00
Aiden Grossman
dd82227ae7 Remove MCT license annotation (#41593)
This license annotation is currently invalid as it specifies a URL
rather than an SPDX expression. Remove it for now until we have a
consensus on how to represent this case.
2023-12-12 17:52:42 -07:00
fpruvost
a9028630a5 pastix: new release v6.3.2 (#41585) 2023-12-12 15:57:11 -07:00
Arne Becker
789c85ed8b perl-clone-pp: New package (#41586) 2023-12-12 15:46:25 -07:00
James Taliaferro
cf9d36fd64 kakoune: add v2023.08.05 (#41443) 2023-12-12 23:02:20 +01:00
Arne Becker
ef7ce46649 perl-ipc-run3: New package (#41583) 2023-12-12 14:29:30 -07:00
pabloaledo
334a50662f Update bioconductor packages (#41227)
Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-12-12 22:04:45 +01:00
Dom Heinzeller
d68e73d006 New package Model Coupling Toolkit (MCT) (#41564)
* New package Model Coupling Toolkit (MCT)
* Remove ~mpi variant from mct, build is not working correctly
* Remove boilerplate stuff from var/spack/repos/builtin/packages/mct/package.py
2023-12-12 11:26:57 -08:00
Adam J. Stewart
7d7f097295 py-pyvista: add v0.42.3 (#41246) 2023-12-12 10:58:15 -08:00
Massimiliano Culpo
37cdcc7172 mysql: fix issue when using old core API call (#41573)
MySQL was performing a core API call to `Spec.flat_dependencies`
when setting up the build environment. This function is an
implementation detail of the old concretizer, where multiple nodes
from the same package are not allowed.

This PR uses a more idiomatic way to check if "python" is
in the DAG.

For reference, see #11356 to check why the call was introduced.
2023-12-12 10:40:31 -08:00
Thomas Madlener
0a40bb72e8 genfit: Add latest tags and update root dependency (#41572) 2023-12-12 10:30:06 -08:00
James Beal
24b6edac89 bowtie2 add latest version (#41580)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-12 10:23:17 -08:00
Arne Becker
3e7acf3e61 perl-type-tiny: New package (#41582) 2023-12-12 10:21:56 -08:00
Harmen Stoppels
ede36512e7 gcc: simplify patch when range (#41587) 2023-12-12 10:03:41 -07:00
jmuddnv
e06b169720 NVIDIA HPC SDK: add v23.11 (#41125) 2023-12-12 17:40:53 +01:00
Stephen Sachs
7ed968d42c clingo-bootstrap: use new Spack API for environment modifications (#41574) 2023-12-12 17:28:15 +01:00
Cameron Rutherford
c673b9245c exago: Add v1.2.0 and patches for builds without python or tests. (#41350)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-12-12 10:17:26 -06:00
Mikael Simberg
27c0dab5ca fmt: Add patch to allow compilation with clang in CUDA mode (#41578) 2023-12-12 08:33:44 -07:00
Chris Green
b82bd8e6b6 [protobuf] New versions, explicit cxxstd variant (#41459)
* [protobuf] New versions, explicit cxxstd variant
* New versions 3.15.8, 3.25.0, 3.25.1.
* New explicit variant `cxxstd` with support for older Protobuf
  versions.
* Support testing.
* Use Protobuf's `protobuf_BUILD_SHARED_LIBS` instead of
  `BUILD_SHARED_LIBS`.
* Support building with LLVM/Clang's `libc++`.
* Address audit issue
* Variant default does not honor `when` clause
* Use `self.spec.satisfies()` instead of `with when()`
* Fix silliness; improve consistency
* Today was apparently a go-back-to-bed day
2023-12-11 22:08:40 -07:00
Sreenivasa Murthy Kolam
5351382501 Bump up the version for ROCm-5.7.0 and ROCm-5.7.1 releases. (#40724)
* initial commit for rocm-5.7.0 and 5.7.1 releases
* bump up the version for 5.7.0 and 5.7.1 releases
* update recipes to support 5.7.0 and 5.7.1 releases
* bump up the version for ROCm 5.7.0 and ROCm-5.7.1 releases
* bump up the version for composable-kernel amd miopen-hip
* fix style errors
* fix style errors in hip etc
* renaming composable-kernel recipe
* changes for composable_kernel
* Revert "renaming composable-kernel recipe"
  This reverts commit 0cf6c6debf.
* Revert "changes for composable_kernel"
  This reverts commit 05272a10a7.
* bump up the version for hiprand
* using the checksum for hiprand-5.7.1
* bump up the version for 5.7.0 and 5.7.1 releases
* fix style errors
* fix merge conflicts with the develop.
* temp workaround for the error seen with rocm-5.7.0 when trying
  to generate the dependency file for runtime/legion/legion_redop.cu
* fix build issue (workaround) with legion
* add patch for migraphx package to turn off ck
* update to  hip recipe
* fix hip-path detection inside llvm clang driver
* update llvm-amdgpu and rocm-validation-suite recipes
* fix style errors
* bump up the version for amdsmi for rocm-5.7.0 release
* add support for gfx941,gfx942 for rocm-5.7.0 release onwards
* revert changes to rocm.py file
* added gfx941 and gfx942 to rocm.py and added gfx942 to kokkos with a new checksum;
  the new version seems to support gfx942
* bump up the version for rccl for 5.7.1
* update the patch for rocm-openmp-extras for 5.7.0
* update mivisionx recipe for 5.7.0 release
* add new dependencies for rocfft tests
* port the fix for avx build, the start address of values_ buffer in KernelParameters is not
  correct as it is computed based on 16-byte alignment
* set HIP_PATH=ROCM_PATH for 5.7.0 onwards
* address review comments
* revert adding xnack- and xnack+ to gfx940,gfx941,gfx942 as the prechecks were failing
2023-12-11 14:49:19 -08:00
Harmen Stoppels
8c29e90fa9 Build cache: make signed/unsigned a mirror property (#41507)
* Add `signed` property to mirror config

* make unsigned a tri-state: true/false overrides mirror config, none takes mirror config (see the sketch below)

* test commands

* Document this

* add a test
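
A hedged sketch of the resulting tri-state behavior (mirror name and spec are placeholders;
`--unsigned` is the existing push flag):

```console
# explicit flag: always push unsigned, regardless of the mirror's `signed` setting
$ spack buildcache push --unsigned my-mirror zlib

# no flag: defer to the `signed` property configured for my-mirror in mirrors.yaml
$ spack buildcache push my-mirror zlib
```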
2023-12-11 15:14:59 -06:00
Tamara Dahlgren
045f398f3d Add missing build-system/custom phases to the CDash map (#41439) 2023-12-11 11:06:19 -08:00
Chris Green
6986e70877 [abseil-cpp] New version 20230802.1 (#41457) 2023-12-11 11:04:46 -08:00
wspear
4d59e746fd otf2: add v3.0.3 (#41499)
* Add otf2 version 3.0.3
* Update var/spack/repos/builtin/packages/otf2/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-12-11 11:02:58 -08:00
Arne Becker
f6de34f9db New package: perl-test-file (#41510) 2023-12-11 10:52:24 -08:00
Arne Becker
0f0adb71d0 perl-class-tiny: New package (#41513) 2023-12-11 10:49:38 -08:00
Joe Schoonover
4ec958c5c6 Add new version of feq-parse (#41515)
The new feq-parse version includes fixes for the ifort and ifx compilers.
Additionally, evaluation of parser objects with multidimensional arrays is
now supported.
2023-12-11 10:47:42 -08:00
Kyle Gerheiser
2aa07fa557 libfabric: Add ucx provider variant (#41524) 2023-12-11 10:46:18 -08:00
Arne Becker
239d343588 perl-test-nowarnings: New package (#41539)
* perl-test-nowarnings: New package
* Add myself as maintainer
2023-12-11 10:37:52 -08:00
Andrey Perestoronin
44604708ad intel-oneapi-compilers 2024.0.1: added new version to packages (#41555)
* add new packages

* fix version
2023-12-11 11:33:42 -07:00
Arne Becker
f0109e4afe perl-test-weaken: New package (#41540) 2023-12-11 10:33:04 -08:00
Arne Becker
b8f90e1bdc perl-test-without-module: New package (#41541) 2023-12-11 10:31:44 -08:00
James Beal
8b9064e5e4 samtools/htslib add latest version (#41545)
* samtools/htslib add latest version
* Given that I work at the same institute as the authors, I think it fair that I review changes; if it's complex I can ask them over tea.

---------

Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-11 10:29:08 -08:00
James Beal
ce79785c10 igv add latest version (#41546)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-11 10:26:29 -08:00
Christopher Christofi
af378c7f31 perl-bio-db-hts: add new package with version 3.01 (#41554)
* perl-bio-db-hts: add new package with version 3.01
* fix styling
2023-12-11 11:25:20 -07:00
Dom Heinzeller
cf50bfb7c2 Add eckit@1.24.5, ecmwf-atlas@{0.35.0,0.35.1,0.36.0} (#41547)
* Add eckit@1.24.5
* Add ecmwf-atlas@0.35.1
* Add ecmwf-atlas@0.36.0
2023-12-11 10:20:39 -08:00
Dom Heinzeller
620e090ff5 Add @climbfuji to fms maintainers (#41550) 2023-12-11 10:18:05 -08:00
Massimiliano Culpo
c4d86a9c2e apple-clang: add new package (#41485)
* apple-clang: added new package
* Add a maintainer
* Use f-strings, remove leftover comment
2023-12-11 10:06:27 -08:00
Kyle Gerheiser
3b74b894c7 fabtests: Add versions and update maintainer (#41525) 2023-12-11 18:55:02 +01:00
Dan Lipsa
3fa8afc036 Add v5.0.0 to SENSEI (#41551)
Co-authored-by: Dan Lipsa <dan.lipsa@savannah.khq.kitware.com>
2023-12-11 11:12:47 -06:00
Garth N. Wells
60628075cb Update for v0.7.2 (#41393) 2023-12-11 09:10:52 -08:00
Chris Green
9e4fab277b [root] New variants, patches (#41548)
* New variants:
  - `tmvz-cpu`
  - `tmvz-gpu`
  - `tmvz-pymva`
  - `tmvz-sofie`

* Improve X-related dependencies.

* Improve TMVA-related dependencies with more specificity.

* Patch possible missing standard header include in Eve7.

* Patch Protobuf handling to support new Protobuf-provided CMake config
  files required to handle transitive `abseil-cpp` dependence.

* Add missing terminal newline to `webgui` patch to remove patch
  warning.

* Handle deprecated/removed build options.
2023-12-11 09:18:10 -07:00
Christopher Christofi
5588e328f7 fpocket: improve recipe (#41532) 2023-12-11 15:45:52 +01:00
Rocco Meli
93a1fc90c9 add cmake constraint (#41542) 2023-12-11 15:36:29 +01:00
Patrick Gartung
7297721e78 Revert "[root] New variants, checksum changes, sundry improvements (#41463)" (#41544)
This reverts commit 7d45e132a6.
2023-12-11 15:34:24 +01:00
Felix Werner
eb57d96ea9 geant4/geant4-data: add v10.0.4 (#41478)
* geant4/geant4-data: add builtin_clhep variant and v10.0.4.

* geant4: revert addition of builtin_clhep variant.

* geant4: fix vecgeom variant only being available for v10.3 and above.
2023-12-11 14:21:25 +01:00
Rocco Meli
ce09642922 netlib-scalapack: add git attribute and master version (#41537) 2023-12-11 03:38:28 -07:00
Arne Becker
cd88eb1ed0 kyotocabinet: add new package (#41512) 2023-12-11 03:38:12 -07:00
Thomas Helfer
826df84baf tfel: fix v3.4.5 checksum (#41523) 2023-12-11 03:27:59 -07:00
Alex Richert
0a4b365a7d fms: add v2023.04 (#41475) 2023-12-11 11:15:01 +01:00
Harmen Stoppels
a2ed4704e7 Link to GitHub Action spack/setup-spack in docs (#41509) 2023-12-11 11:12:40 +01:00
Harmen Stoppels
28b49d5d2f unit tests: replace /bin/bash with /bin/sh (#41495) 2023-12-11 11:11:27 +01:00
Adam J. Stewart
16bb4c360a PyTorch: disable sleef dep for now (#41508) 2023-12-11 11:00:24 +01:00
Thomas Gruber
cfd58bdafe Likwid: likwid-icx-mem-group-fix.patch only for 5.2.* versions (#41514) 2023-12-11 10:48:02 +01:00
Matthieu Dorier
53493ceab1 leveldb: turning benchmark and tests off (#41518) 2023-12-11 10:41:03 +01:00
Dave Keeshan
64cd429cc8 Fix filter_compiler_wrapper where compiler is None (#41502)
Fix `filter_compiler_wrapper` for cases where the compiler returned is None. This happens on some installed gcc systems that do not have Fortran built into them as standard, e.g. gcc@11.4.0 on Ubuntu 22.04.
2023-12-11 10:31:56 +01:00
Harmen Stoppels
525809632e petsc: improve hipsparse compat (#40311)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-12-11 10:30:14 +01:00
Vanessasaurus
a6c32c80ab flux-core: add v0.57.0 (#41520)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-12-11 10:25:02 +01:00
Todd Gamblin
57ad848f47 commands: better install status help formatting (#41527)
Before (hard to read, doesn't fit on small terminals):
:
```console
  -I, --install-status  show install status of packages

                        packages can be: installed [+], missing and needed by an installed package [-], installed in an upstream instance [^], or not installed (no annotation)
```

After (fits in 80 columns):

```console
  -I, --install-status  show install status of packages
                        [+] installed       [^] installed in an upstream
                         -  not installed   [-] missing dep of installed package
```
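
A usage sketch (assuming `spack spec`, which accepts this flag; output omitted):

```console
# each line of the concretized DAG gets prefixed with one of the markers above
$ spack spec -I zlib
```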
2023-12-11 10:17:37 +01:00
Brian Spilner
15623d8077 cdo: add v2.3.0 (#41479) 2023-12-11 09:45:16 +01:00
Thomas Madlener
c352db7645 vecgeom: Use correct checksum for version 1.2.5 (#41530)
See https://gitlab.cern.ch/VecGeom/VecGeom/-/releases/v1.2.5
2023-12-11 08:53:44 +01:00
Brian Van Essen
5d999d0e4f Add logic to cache the RPATH variables in CachedCMakePackages. (#41417) 2023-12-08 09:27:44 -08:00
Seth R. Johnson
694a1ff340 celeritas: new version 0.4.1 (#41504)
* celeritas: new version 0.4.1

* Mark correct versions as deprecated
2023-12-08 09:56:18 +00:00
Richard Berger
4ec451cfed flecsi: remove ^legion network=gasnet restriction (#41494) 2023-12-07 19:03:04 -07:00
Julien Cortial
a77eca7f88 cdt: Add versions 1.3.6 and 1.4.0 (#41490) 2023-12-07 14:27:39 -07:00
Greg Becker
14ac2b063a cce compiler: remove vestigial compiler names (#41303) 2023-12-07 15:17:03 -06:00
Tamara Dahlgren
edf4d6659d add missing endtime property to CDash (#41498) 2023-12-07 22:06:46 +01:00
Lydéric Debusschère
6531fbf425 py-mpldock: new package (#41316)
* py-mpldock: new package

* py-mpldock: remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-07 21:24:05 +01:00
Victor Brunini
0a6045eadf Fix cdash reporter time stamps (#38825)
* Fix cdash reporter time stamps (#38818).
   The cdash reporter is created before packages are installed so save the
   starttime then instead of the endtime.
* Use endtime instead of starttime for the endtime of update

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2023-12-07 19:32:10 +00:00
Todd Gamblin
5722a13af0 Spack mailing list is now announcement-only (#41496)
Participation in the venerable Spack google group has dwindled, though we still have
540+ subscribers there.  I've made the mailing list announcement-only, and I've given
a few maintainers posting privileges.

This PR adds some notes to the README indicating that the mailing list is only for
announcements.
2023-12-07 19:22:14 +00:00
Brian Van Essen
9f1223e7a3 Bugfix spectrum-mpi module generation (#41466)
* Ensure that additional environment variables are set when a module
file is generated.

* Fixed the detection of the opal_prefix / MPI_ROOT field to use ompi_info.
---------

Co-authored-by: Greg Becker <becker33@llnl.gov>
2023-12-07 11:06:07 -08:00
Vijay M
5beef28444 Update homepage URL, add 5.5.1 version, remove bad version hashes, and other minor changes (#41492) 2023-12-07 10:52:06 -08:00
snehring
e618a93f3d iqtree2: add new version 2.2.2.7 and new variant lsd2 (#41467)
* iqtree2: add new version 2.2.2.7 and new variant lsd2
* iqtree2: reorder variant and resource
2023-12-07 11:30:14 -07:00
Garth N. Wells
3f0ec5c580 Update UFCx for v0.7.0. (#41392) 2023-12-07 10:20:34 -08:00
Kyle Knoepfel
14392efc6d Permit shared-library for libbacktrace (#41454) 2023-12-07 10:17:31 -08:00
John W. Parent
d7406aaaa5 CMake: v3.26.6 (#41282) 2023-12-07 10:25:42 -07:00
Harmen Stoppels
5a7e691ae2 freebsd (#41480) 2023-12-07 09:19:55 -08:00
Dave Keeshan
b9f63ab40b opensta: add new package (#41484)
* Add opensta; it allows 2 variants, zlib and cudd, but they are both enabled by default
* Remove unused import, os
2023-12-07 09:18:58 -08:00
Auriane R
4417b1f9ee Update pika package to use f-strings (#41483) 2023-12-07 10:14:33 -07:00
Adam J. Stewart
04f14166cb py-keras: add v3.0.1 (#41486) 2023-12-07 09:05:42 -08:00
Robert Cohn
223a54098e [intel-mkl,intel-ipp,intel-daal]: deprecate packages (#41488) 2023-12-07 09:01:02 -08:00
Satish Balay
ea505e2d26 petsc: add variant +zoltan (#41472) 2023-12-07 10:51:00 -06:00
Ataf Fazledin Ahamed
e2b51e01be traverse.py: use > 0 instead of >= 0 (#41482)
Signed-off-by: fazledyn-or <ataf@openrefactory.com>
2023-12-07 09:38:54 -07:00
jmlapre
a04ee77f77 trilinos: replace pytrilinos2 variant with python (#41435)
* depend_on python

There is an ill-named variant "python" that enables the pytrilinos1
variant.  This made it through our testing but broke on our actual
CI test machines.

* adjust "python" variant based on Trilinos version

For Trilinos <= 14, enable PyTrilinos(1). For later versions
of Trilinos, enable PyTrilinos2.

We still support directly enabling PyTrilinos2 via the "pytrilinos2"
variant.

* remove pytrilinos2 variant

* correct depends_on constraints
2023-12-07 10:28:13 -05:00
Jordan Galby
bb03ce7281 Do not use depfile in bootstrap (#41458)
- we don't have a fallback if make is not installed
- we assume file system locking works
- we don't verify that make is gnu make (bootstrapping fails on FreeBSD as a result)
- there are some weird race conditions in writing spack.yaml on concurrent spack install
- the view is updated after every package install instead of post environment install.
2023-12-07 10:09:49 +00:00
Massimiliano Culpo
31640652c7 audit: forbid nested dependencies in depends_on declarations (#41428)
Forbid nested dependencies in depends_on declarations, by running an audit in CI.

Fix the packages not passing the new audit:
- amd-aocl
- exago
- palace
- shapemapper
- xsdk-examples

ginkgo: add a commit sha to v1.5.0.glu_experimental
2023-12-07 10:21:01 +01:00
Samuel Browne
0ff0e8944e trilinos: add v15.0.0 (#41465) 2023-12-07 10:07:02 +01:00
Jack Morrison
a877d812d0 rdma-core: Add new versions 41.5, 42.5, 43.4, 44.4, 45.3, 46.2, 47.1, 49.0 (#41473) 2023-12-07 09:58:06 +01:00
dependabot[bot]
24a59ffd36 build(deps): bump actions/setup-python from 4.8.0 to 5.0.0 (#41474)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.8.0 to 5.0.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](b64ffcaf5b...0a5c615913)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-07 09:55:55 +01:00
Dave Keeshan
57f46f0375 cudd: add new package (#41476) 2023-12-07 09:37:22 +01:00
Chris Green
7d45e132a6 [root] New variants, checksum changes, sundry improvements (#41463)
* New variants:
  - `tmvz-cpu`
  - `tmvz-gpu`
  - `tmvz-pymva`
  - `tmvz-sofie`

* Improve X-related dependencies.

* Improve TMVA-related dependencies with more specificity.

* Patch possible missing standard header include in Eve7.

* Patch Protobuf handling to support new Protobuf-provided CMake config
  files required to handle transitive `abseil-cpp` dependence.

* Add missing terminal newline to `webgui` patch to remove patch
  warning.

* Handle deprecated/removed build options.

* Handle unwanted system paths in various `PATH`-like environment
  variables.
2023-12-06 18:39:51 -06:00
Vicente Bolea
e7ac676417 paraview: dropping patch since changes exists (#41462) 2023-12-06 16:18:46 -07:00
Adam J. Stewart
94ba152ef5 py-torchmetrics: add v1.2.1 (#41456) 2023-12-06 12:48:28 -07:00
Massimiliano Culpo
5404a5bb82 llvm: reformulate a when condition to avoid tautology (#41461)
The condition on swig can be interpreted as "true if true,
false if false" and gives clingo the option to add swig
or not.

If no other optimization criteria break the tie, then
the concretization is non-deterministic.
2023-12-06 19:12:42 +01:00
Dave Keeshan
b522d8f610 yosys: add new package (#41416)
* Add EDA tools, yosys to Spack
* Add maintainers
* Move from format to f-strings
2023-12-06 09:47:58 -08:00
Thomas Madlener
2a57c11d28 lcio: add version 2.20.2, sio: add version 0.2 (#41451) 2023-12-06 09:46:18 -08:00
Thomas Madlener
1aa3a641ee edm4hep: Update cmake version dependency for newer versions (#41450) 2023-12-06 09:43:24 -08:00
yizeyi18
6feba1590c nwchem: add libxc/elpa support (#41376)
* added external libxc/elpa choice
* fixed formatting issues and 1 unused variant found by reviewer
* try to fix a string formatting issue
* try to fix some other string formatting issues
* fixed 1 flake8 style issue
* use explicit fftw-api@3
2023-12-06 09:38:46 -08:00
Chris Green
58a7912435 [catch2] Sundry improvements including C++20 support (#41199)
* Add `url_list` to facilitate finding new versions.

* `cxxstd` is not meaningful when `@:2.99.99` as it was a header-only
  package before v3.

* Support C++20/23, remove C++14 support.

* Add @greenc-FNAL to maintainers.

* Add CMake arguments to support testing, build of extras and examples.
2023-12-06 11:07:27 -06:00
yizeyi18
03ae2eb223 dla-future: add a patch (#41409)
* using std::int64_t needs include cstdint in gcc-13

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-12-06 16:55:40 +01:00
Julien Cortial
013f0d3a13 Only build tests for proj package if required (#41065)
* Only build tests for proj package if required

Even if tests are not explicitly required to be built, proj builds them
anyway and tries to download Google Test.

* proj: fix name of test activation flag

* proj: Always set test activation flag

* proj: Patch test activation logic for versions 5.x
2023-12-06 09:32:39 -06:00
Eric Berquist
3e68aa0b2f py-pre-commit: add 3.5.0 (#41438) 2023-12-06 09:11:32 -06:00
Gavin John
1da0d0342b Add new versions of py-quast (#40788)
* Add new versions of py-quast

* Update hashes

* Add joblib and simplejson

* Fat finger

* Update dependency type
2023-12-06 09:04:17 -06:00
Lydéric Debusschère
6f7d91aebf py-python-pptx: new package (#41315)
* py-python-pptx: new package

* py-python-pptx: use pil instead of pillow, remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:43:51 -06:00
Lydéric Debusschère
071c74d185 py-tldextract: new package (#41330)
* py-tldextract: new package

* py-tldextract: add version 5.1.1

* py-tldextract: fix version constraint on py-setuptools-scm

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:42:48 -06:00
Juan Miguel Carceller
51435d6d69 ruff: add version 0.1.6 (#41355)
* Add a new version of ruff

* Add a comment about where the dependency can be found

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-12-06 08:41:20 -06:00
Jordan Galby
8ce110e069 bootstrap: Don't catch Ctrl-C (#41449) 2023-12-06 14:58:14 +01:00
Auriane R
90aee11c33 Add pika 0.21.0 release (#41446) 2023-12-06 03:48:26 -07:00
Harmen Stoppels
4fc73bd7f3 minimal support for freebsd (#41434) 2023-12-06 10:27:22 +00:00
Dom Heinzeller
f7fc4b201d Update py-werkzeug version dependency for py-graphene-tornado@2.6.1 (#41426)
* Update py-werkzeug version dependency for py-graphene-tornado@2.6.1

* Add note on diverging version requirements for py-werkzeug in py-graphene-tornado
2023-12-06 10:28:58 +01:00
Jim Edwards
84999b6996 mpiserial: rework installation (#40762) 2023-12-06 09:39:08 +01:00
Harmen Stoppels
b0f193071d bootstrap status: no bash (#41431) 2023-12-06 09:20:59 +01:00
dependabot[bot]
d1c3374ccb build(deps): bump actions/setup-python from 4.7.1 to 4.8.0 (#41441)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.7.1 to 4.8.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](65d7f2d534...b64ffcaf5b)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-06 09:20:14 +01:00
Richard Berger
cba8ba0466 legion: correct cuda dependency for cr version (#41119)
* legion: correct cuda dependency for cr version
* Update var/spack/repos/builtin/packages/legion/package.py

---------

Co-authored-by: Davis Herring <herring@lanl.gov>
2023-12-05 18:36:50 -08:00
Wouter Deconinck
d50f8d7b19 root: sha256 change on latest versions (#41401)
* root: sha256 change on latest version
* root: sha256 change on 6.28.10
* root: replace 6.26.12 by 6.26.14
* root: hash for 6.26.14
2023-12-05 18:20:27 -08:00
kjrstory
969fbbfb5a foam-extend: add v5.0 (#40480)
* foam-extend: add new version
* deprecated versions
2023-12-05 18:11:59 -08:00
Robert Cohn
1cd5397b12 [intel-parallel-studio] Deprecate entire package (#41430) 2023-12-05 17:56:49 -08:00
psakievich
1829dbd7b6 CDash: Spack dumps stage errors to configure phase (#41436) 2023-12-05 22:05:39 +00:00
Philippe Virouleau
9e3b231e6f julia: fix LLVM paches hashes (#41410) 2023-12-05 22:53:40 +01:00
Alec Scott
5911a677d4 py-gidgethub: add new package (#41286)
* py-gidgethub: add new package

* Add main branch version and scope flit/flit-core dependency

* Update var/spack/repos/builtin/packages/py-gidgethub/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Add optional dependencies as variants of package

* Add git url for main version

* Fix variant and dependency ordering

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-05 21:26:42 +01:00
Alec Scott
bb60bb4f7a py-gidgetlab: add new package (#41338)
* gidgetlab: add new package

* Convert both cachetools and aiohttp to optional deps with variants

* Fix forgotten variant conditional on cachetools dependency

* Add git url and main version for dev workflows

* Fix variant and dependency ordering

* Remove cachetools variant and merge dependency with aiohttp variant
2023-12-05 21:24:38 +01:00
Victoria Cherkas
ddec75315e Add maintainers to fdb, eckit, ecbuild, metkit, eccodes (#41433)
* Add maintainers to fdb
* Add maintainers to eckit
* Add maintainers to mekit
* Add maintainers to eccodes
* Add maintainers to ecbuild
* Add climbfuji to eccodes maintainers
2023-12-05 13:18:41 -07:00
Veselin Dobrev
8bcb1f8766 MFEM: Add a patch to fix the +gslib+shared+miniapps build (#41399)
* [mfem] Add a patch to resolve issue #41382

* [mfem] Spack CI wants "patch URL must end with ?full_index=1"
2023-12-05 10:26:48 -08:00
Matthew Thompson
5a0ac4ba94 Update versions of GFE packages (#41429) 2023-12-05 10:24:51 -07:00
Lydéric Debusschère
673689d53b py-sphinx: add versions 7.2.4, 7.2.5 and 7.2.6 (#41411)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 15:07:22 +01:00
Brian Vanderwende
ace8e17f02 libdap4: add explicit RPC dependency (#40019) 2023-12-05 13:11:19 +01:00
Billae
eb9c63541a documentation: add instructions on how to use external opengl (#40987) 2023-12-05 12:59:41 +01:00
Kensuke WATANABE
b9f4d9f6fc sirius: fix build error with Fujitsu compiler (#41101) 2023-12-05 12:54:03 +01:00
Lydéric Debusschère
eda3522ce8 py-sphinxcontrib-moderncmakedomain: add new package (#41331)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 12:47:25 +01:00
Harmen Stoppels
3cefd73fcc spack buildcache check: use same interface as push (#41378) 2023-12-05 12:44:50 +01:00
Alberto Invernizzi
3547bcb517 openfoam-org: fix for being able to build manually checked out repository (#41000)
* grep WM_PROJECT_VERSION from etc/bashrc

This fixes the problem when building from a manually checked-out repo,
which might have a different version from the one defined in the Spack
package (e.g. anything later than 5.0 is known as 5.x by the build
system).

* patch applies to just 5.0, in newer versions it is already addressed

In `5.20171030` there's a commit

c66fba323c

very similar (almost identical) to what the patch `50-etc.patch` does.

So the patch should not be applied to versions other than `5.0`, otherwise it errors.

References:
- https://github.com/OpenFOAM/OpenFOAM-5.x/commits/20171030/etc/bashrc
- 197d9d3bf2/etc/bashrc (L45-L47)
2023-12-05 12:37:57 +01:00
Todd Gamblin
53b528f649 bugfix: sort variants in spack info --variants-by-name (#41389)
This was missed while backporting the new `spack info` command from #40326.

Variants should be sorted by name when invoking `spack info --variants-by-name`.
2023-12-05 12:31:40 +01:00
Mark W. Krentel
798770f9e5 hpctoolkit: add conflict for recent intel-xed (#41413)
Intel made an incompatible change in XED in 2023.08.21 that breaks
hpctoolkit (at run time).  Hpctoolkit develop can adapt (soon will),
but older versions must use xed :2023.07.09.
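
A usage sketch of pinning the dependency by hand (illustrative; the spec is not taken from the PR):

```console
$ spack install hpctoolkit ^intel-xed@:2023.07.09
```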
2023-12-05 11:17:37 +01:00
Jim Edwards
4a920243a0 cprnc: update sha256 for github artifacts (#41418) 2023-12-05 11:13:14 +01:00
Ben Wibking
8727195b84 openblas: fix macOS build when using XCode 15 or newer (#41420)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-12-05 09:53:54 +01:00
Massimiliano Culpo
456f2ca40f extensions: improve docs, fix unit-tests (#41425) 2023-12-05 09:49:35 +01:00
dependabot[bot]
b4258aaa25 build(deps): bump docker/metadata-action from 5.2.0 to 5.3.0 (#41423)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.2.0 to 5.3.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](e6428a5c4e...31cebacef4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-05 09:48:29 +01:00
Mitch B
5d9647544a arrow: add versions up to v14.0.1 (#41424) 2023-12-05 09:48:03 +01:00
Brian Van Essen
1fdb6a3e7e Updating the LBANN, Hydrogen, and DiHydrogen recipes (#41390)
* Updating the LBANN, Hydrogen, and DiHydrogen recipes for both new
variants and to make sure that RPATHs are properly setup.

Co-authored-by: bvanessen <bvanessen@users.noreply.github.com>
2023-12-05 09:31:51 +01:00
Robert Cohn
7c77b3a4b2 [intel] deprecate all versions (#41412)
Deprecating intel package, which contains intel classic compilers. This package has not been updated in 3 years. Please use intel-oneapi-compilers instead.
2023-12-04 21:52:55 -07:00
Ye Luo
eb4b8292b6 A few changes to quantum-espresso (#41225)
* gipaw.x installed by cmake if version >= 5c4a4ce.
  gipaw.x will only be installed with cmake if the qe-gipaw version
  is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
  Here is a patch to the submodule_commit_hash_records to use a newer
  qe-gipaw version.
* Update package.py
* Delete var/spack/repos/builtin/packages/quantum-espresso/gipaw-eccee44.patch
* Update package.py
* Restoring gipaw-eccee44 patch
* Update package.py
* Add fox variant in quantum-espresso
* Fix an issue introduced in #36484. Patches are 7.1 only.
* Change plugin handling.
* formatting.
* Typo correction
* Refine conflict

---------

Co-authored-by: S. Alexis Paz <alexis.paz@gmail.com>
2023-12-04 11:37:05 -08:00
John Biddiscombe
16bc58ea49 Add EGL support to ParaView and Glew (#39800)
* Add EGL support to ParaView and Glew

add a package for egl that provides GL but also adds
EGL libs and headers for projects that need them

Fix a header problem with the opengl package

Format files using black

* better description for egl variant description

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* better check/setup of non egl variant dependencies

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* Add biddisco as maintainer

* Fix unused var style warning

* Add egl conflicts for other gl providers

---------

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>
2023-12-04 10:53:06 -06:00
Alec Scott
6028ce8bc1 direnv: add v2.33.0 (#41397) 2023-12-04 14:12:13 +01:00
Harmen Stoppels
349e7e4c37 zlib-ng: add v2.1.5 (#41402) 2023-12-04 12:44:10 +01:00
Harmen Stoppels
a982118c1f ci.py: fix missing import (#41391) 2023-12-04 12:14:59 +01:00
Adam J. Stewart
40d12ed7e2 PythonPackage: type hints (#40539)
* PythonPackage: nested config_settings, type hints

* No need to quote PythonPackage

* Use narrower types for now until needed
2023-12-04 10:53:53 +00:00
James Smillie
9e0720207a Windows: fix kit base path and reference to windows registry key (#41388)
* Proper handling of argument passed as semicolon-separated str
* Fix reference to windows registry key in win-wdk
2023-12-03 15:35:13 -08:00
Jaelyn Litzinger
88e738c343 Allow exago to use hiop@develop past v1.0.1 (#41384) 2023-12-02 14:06:22 -06:00
Cameron Rutherford
8bbc2e2ade resolve: add package with cuda and rocm support (#40871) 2023-12-01 20:49:11 -06:00
Julien Cortial
1509e54435 Add MUMPS versions 5.6.0, 5.6.1 and 5.6.2 (#41386)
The patch for version 5.5.x still applies to 5.6.x.
2023-12-01 18:48:00 -07:00
Dom Heinzeller
ca164d6619 Fix curl install using Intel compilers (#41380)
When using Intel to build curl, add 'CFLAGS=-we147' to the configure
args to fix the error 'compiler does not halt on function prototype
mismatch'.
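
Roughly the same effect could presumably be had by hand from the command line
(illustrative only, not from the PR; the package now injects the flag itself):

```console
$ spack install curl %intel cflags=-we147
```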
2023-12-01 17:24:05 -07:00
Felix Werner
a632576231 Add XCDF. (#41379) 2023-12-01 14:56:18 -08:00
Felix Werner
70b16cfb59 Add PhotoSpline. (#41374) 2023-12-01 14:44:53 -08:00
Brian Vanderwende
1d89d4dc13 MET fixes for 11.1 and HDF4 support (#41372)
* MET fixes for 11.1 and HDF4 support
* Fix zlib reference in MET
2023-12-01 14:42:36 -08:00
Dewi
bc8a0f56ed removed cmake build version pointing to fork (#41368) 2023-12-01 14:39:45 -08:00
Jack Morrison
4e09396f8a Libfabric: Introduce OPX provider conflict for v1.20.0 (#41343)
* Libfabric: Introduce OPX provider conflict for v1.20.0
* Add message to libfabric 1.20.0 opx provider conflict
2023-12-01 14:35:54 -08:00
Weiqun Zhang
0d488c6e4f amrex: add v23.12 (#41385) 2023-12-01 22:45:58 +01:00
Erik Heeren
50e76bc3d3 py-pyglet: version bump (#41082)
* py-pyglet: version bump

* py-pyglet: use zip instead of whl, update dependencies

* py-pyglet: 2.0.9 and 2.0.10 zips should be downloaded from github

* py-pyglet: style

* py-pyglet: use virtual packages in dependencies

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-pyglet: doesn't depend on py-future any more

* py-pyglet: remove glx dependency

* py-pyglet: back to the pypi zipfiles with patch instead

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-01 21:44:30 +01:00
Dave Keeshan
a543fd79f1 verible: add new package (#41270)
* Add initial version of verible to spack
* Update to use an explicit URL path for each release, as the release tag includes extra data; also added the lower bound for gcc, gcc9
2023-12-01 10:48:52 -08:00
Adam J. Stewart
ab50aa61db py-keras: add v3.0.0 (#41356)
* py-keras: add v3.0.0

* Older keras actually requires protobuf

* Correct url_for_version

* Capitalization is important

* Keep pil and pydot deps
2023-12-01 18:23:58 +00:00
Richard Berger
d06a102e69 Various FleCSI updates (#41068)
* flecsi: remove deprecated versions
* flecsi: add explicit conflict for backend=hpx +hdf5
* flecsi: propagate +openmp to kokkos and legion
* flecsi: remove doc variant prior to @2.2
   It wouldn't do anything meaningful and won't install the documentation.

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-12-01 09:41:33 -08:00
Rohit Goswami
b0f0d2f1fb dftbplus: Update and add upstream maintainer (#33243)
* dftbp: Update and add upstream maintainer
* dftbp: Trust in the hybrid cmake builds
* dftbp: Handle scalapack better
* dftbp: Refactor as per review
* dftbp: Build shared for python
* dftbp: Address review comments
* dftbp: Add another maintainer
* dftp: Fix typo
* dftbp: Arpack for serial builds only
* dftbp: Update option docs
* dftbp: Update documentation for elsi
* dftbp: Add comment for context
* dftbp: Tighter bounds on python
* dftbp: Add negf only when shared
* dftbp: Fix typo
* dftbp: Update sha256
* dftpb: Add when directive for cmake and ninja
* dftbp: Enforce comment

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Co-authored-by: awvwgk <awvwgk@users.noreply.github.com>
Co-authored-by: iamashwin99 <iamashwin99@users.noreply.github.com>
Co-authored-by: Ashwin Kumar Karnad <46030335+iamashwin99@users.noreply.github.com>
Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
Co-authored-by: tldahlgren <tldahlgren@users.noreply.github.com>
2023-12-01 09:34:11 -08:00
Victoria Cherkas
6029b600f0 eccodes: add v2.32.0, v2.31.0 (#40770)
* eccodes new versions and dependencies
* Suggested changes for multiple variant defaults
* Update var/spack/repos/builtin/packages/eccodes/package.py

---------

Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
2023-12-01 09:25:31 -08:00
dependabot[bot]
e6107e336c build(deps): bump docutils from 0.18.1 to 0.20.1 in /lib/spack/docs (#38174)
Bumps [docutils](https://docutils.sourceforge.io/) from 0.18.1 to 0.20.1.

---
updated-dependencies:
- dependency-name: docutils
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 17:16:34 +01:00
Lydéric Debusschère
9ef57c3c86 py-cma: new package (#41326)
* py-pycma: new package

* rename py-pycma in py-cma; py-cma: use pypi instead of github sources

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:54:02 -06:00
Lydéric Debusschère
3b021bb4ac py-mahotas: new package (#41329)
* py-mahotas: new package

* py-mahotas: relax version constraint on numpy

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:52:37 -06:00
Richard Berger
42bac83c2e LAMMPS updates (#40879)
* lammps: add new stable version 20230802.1

* lammps: add missing potential download for +mesont

* lammps: fix python package install

* Update var/spack/repos/builtin/packages/lammps/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* lammps: py-numpy and py-mpi4py should be build and run deps

* lammps: add new 20231121 release

- MPIIO package has been removed -> disable mpiio variant
- LAMMPS_EXCEPTIONS is now always on -> disable exceptions variant
- CMake 3.16+ is now required
- Kokkos 4.1.0 is now supported

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-01 07:13:56 -07:00
Lydéric Debusschère
7e38e9e515 py-xmlplain: new package (#41324)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:02:24 -06:00
Lydéric Debusschère
cf5ffedc23 py-requests-file: new package (#41328)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:00:23 -06:00
Lydéric Debusschère
3a9aea753d [add] py-autodocsumm: new package (#41309)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:53:57 -06:00
Lydéric Debusschère
c61da8381c py-sphinx-jinja2-compat: new package (#41310)
* [add] py-sphinx-jinja2-compat: new package

* py-whey-pth: new package, dependence of py-sphinx-jinja2-compat

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:53:02 -06:00
Lydéric Debusschère
8de814eddf py-sphinx-removed-in: new package (#41325)
* py-sphinx-removed-in: new package

* py-sphinx-removed-in: fix dependence

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:43:40 -06:00
Satish Balay
c9341a2532 petsc, py-petsc4py: add v3.20.2 (#41366) 2023-12-01 07:42:48 -06:00
Manuela Kuhn
8b202b3fb2 py-urllib3: add 2.1.0 and 2.0.7 (#41358) 2023-12-01 07:39:18 -06:00
Dom Heinzeller
2ecc260e0e Update py-cylc-flow (add version 8.2.3) (#41209)
* Add missing runtime dependency on py-colorama to py-ansimarkup

* Add py-metomi-isodatetime@3.1.0

* New package py-graphql-relay

* Update py-cylc-flow, add version 8.2.3

* Fix merge conflict

* Revert mistake in var/spack/repos/builtin/packages/py-cylc-flow/package.py

* Update py-metomi-isodatetime dependencies for py-cylc-flow

* Add 'climbfuji' to list of maintainers for py-cylc-flow
2023-12-01 07:33:34 -06:00
Christopher Christofi
6130fe8f57 py-beartype: new package with versions 0.15.0 and 0.16.2 (#39759)
* py-beartype: new package with version 0.15.0

* Update var/spack/repos/builtin/packages/py-beartype/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-beartype: depend on python 3.8 or higher

* py-beartype: add new version 0.16.2

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-01 07:29:30 -06:00
Adam J. Stewart
d3aa7a620e py-mpi4py: fix build with Apple Clang (#41362)
* py-mpi4py: fix build with Apple Clang

* [@spackbot] updating style on behalf of adamjstewart

---------

Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-12-01 05:33:14 -06:00
dependabot[bot]
2794e14870 build(deps): bump pygments from 2.17.1 to 2.17.2 in /lib/spack/docs (#41212)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.1 to 2.17.2.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/2.17.2/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.1...2.17.2)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 12:29:42 +01:00
dependabot[bot]
cc1e990c7e build(deps): bump sphinx-rtd-theme in /lib/spack/docs (#41305)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.3.0 to 2.0.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.3.0...2.0.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 12:29:22 +01:00
dependabot[bot]
59e6b0b100 build(deps): bump mypy from 1.7.0 to 1.7.1 in /lib/spack/docs (#41243)
Bumps [mypy](https://github.com/python/mypy) from 1.7.0 to 1.7.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.7.0...v1.7.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:22:23 +01:00
dependabot[bot]
ec53d02814 build(deps): bump docker/metadata-action from 5.0.0 to 5.2.0 (#41371)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.0.0 to 5.2.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](96383f4557...e6428a5c4e)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:21:56 +01:00
dependabot[bot]
389c77cf83 build(deps): bump mypy from 1.6.1 to 1.7.1 in /.github/workflows/style (#41242)
Bumps [mypy](https://github.com/python/mypy) from 1.6.1 to 1.7.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.6.1...v1.7.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:20:56 +01:00
Vicente Bolea
17c87b4c29 vtk-m: bump vtk-m 2.1.0 (#41351)
* vtk-m: bump vtk-m 2.1.0

* Update package.py

* Update package.py
2023-11-30 20:18:15 -08:00
Thomas Helfer
91453c5ba0 Add support for new versions of TFEL and MGIS (#41357)
* Add new versions to TFEL and MGIS
2023-11-30 19:47:57 -07:00
Robert Cohn
a587a10c86 [intel-mpi] deprecation (#41322) 2023-11-30 19:52:42 -05:00
Derek Ryan Strong
cfe77fcd90 r: add license and missing versions and fix rmath build directory (#41260)
* Add R license and missing versions
* Fix rmath build directory
2023-11-30 16:18:02 -08:00
Matthieu Dorier
6cf36a1817 mochi-margo: added version 0.15.0 (#41319) 2023-11-30 15:32:24 -08:00
Matthieu Dorier
ee40cfa830 mochi-thallium: adding a few new versions (#41323)
* mochi-thallium: added a few newer versions
* mochi-thallium: added constraint on the version of margo required
* removed thallium 0.12 which needs updating
* mochi-thallium: fixed hash for version 0.12.0
* removed thallium 0.12 which needs updating (again)
2023-11-30 15:19:32 -08:00
Harmen Stoppels
04f64d4ac6 tests: use temporary_store (#41369) 2023-11-30 15:11:35 -08:00
jmlapre
779fef7d41 trilinos: new pytrilinos2 variant (#40615) 2023-11-30 14:08:58 -07:00
Richard Berger
5be3ca396b Singularity-EOS update (#41333)
* singularity-eos: deprecate v1.6 versions and remove unused code
* singularity-eos: add v1.8.0
2023-11-30 12:55:25 -08:00
Dave Keeshan
e420441bc2 Fix flex for build and link, limit gcc to 7 or greater (#41335) 2023-11-30 12:51:48 -08:00
Dave Keeshan
a039dc16fa Move compiler renaming to filter_compiler_wrappers (#41336) 2023-11-30 12:48:59 -08:00
Christopher Christofi
b31c89b110 perl-carp-assert: add new package with version 0.22 (#41347)
* perl-carp-assert: add new package with version 0.22
* fix style
2023-11-30 12:46:24 -08:00
Christopher Christofi
bc4f3d6cbd perl-parse-yapp: add new package with version 1.21 (#41348) 2023-11-30 12:37:26 -08:00
Matthias Wolf
40209506b7 acfl: truncate version number (#41354)
When using `spack external find acfl`, we get the full version string
with 4 components in `packages.yaml`.  This PR truncates the version
number when finding the `armpl` component to be able to run without
intervention.
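
For context, a usage sketch of the detection step involved (the four-component version
in the comment is hypothetical):

```console
$ spack external find acfl
# packages.yaml then records something like acfl@23.04.1.0; only the truncated
# form is needed to locate the armpl component.
```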
2023-11-30 12:32:10 -08:00
Massimiliano Culpo
6ff07c7753 Fix issue with latest mypy (#41363) 2023-11-30 20:31:03 +01:00
Adam J. Stewart
d874c6d79c py-tensorflow-estimator: add new versions (#41364) 2023-11-30 10:54:11 -08:00
Adam J. Stewart
927e739e0a GDAL: add v3.8.1 (#41365) 2023-11-30 10:51:42 -08:00
Tom Scogland
dd607d11d5 developer tools stack try 2 (#40921)
* developer tools stack try 2

This version is actually in use locally and has largely stabilized, at
least on x86.  Some packages are still a challenge on ppc64le, but maybe
worth keeping this working as a set.

* add packages, try to get container with newer gcc

* remove reuse: true

* try to get cmake to build on medium, 25 minutes is too long

* add lsd package and add to dev tools stack

* clean up fzf dependency and sorting

* Update share/spack/gitlab/cloud_pipelines/stacks/developer_tools/spack.yaml

* cuda: add 12.3.0 (#40827)

* Switch to dashes

* yet more underscores

---------

Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
2023-11-30 18:32:21 +00:00
Harmen Stoppels
d436e97fc6 reuse concretization: allow externals from remote when locally configured (#35975)
This looks to me like the best compromise regarding externals in a
build cache. I wouldn't want `spack install` on my machine to install
specs that were marked external on another. At the same time there are
centers that control the target systems on which spack is used, and
would want to use externals in buildcaches.

As a solution, reuse concretization will now consider those externals
used in buildcaches that match a locally configured external in
packages.yaml.

So for example person A installs and pushes specs with this config:

```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6.0.12345 +feature
      prefix: /usr
```

and person B concretizes and installs using that buildcache with the
following config:

```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6
      prefix: /usr
```

the spec will be reused (or rather, will be considered for reuse...)
2023-11-30 09:38:05 -08:00
Harmen Stoppels
f3983d60c2 tests: add missing mutable db (#41359) 2023-11-30 18:37:35 +01:00
Harmen Stoppels
40e705d39e tests: fix side effects of default_config fixture (#41361)
* tests: default_config drop scope

* use default_config elsewhere

* use parse_install_tree for missing defaults in default config
2023-11-30 18:19:10 +01:00
Harmen Stoppels
d92457467a test_variant_propagation_with_unify_false: missing fixture (#41345) 2023-11-30 17:27:53 +01:00
Harmen Stoppels
4c2734fe14 --scope: lazy defaults (#41353) 2023-11-30 15:35:21 +01:00
Taillefumier Mathieu
34d791189d Update SIRIUS version for CP2K master (#41264)
* Update SIRIUS version for CP2K master

* Update var/spack/repos/builtin/packages/cp2k/package.py

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2023-11-30 04:08:22 -07:00
Vinícius
eec9eced1b simgrid: add v3.34 and v3.35 (#41340) 2023-11-30 11:23:57 +01:00
Christopher Christofi
3bc8a7aa5f use double quotes where spack style finds errors (#41349) 2023-11-30 09:46:02 +01:00
Massimiliano Culpo
3b045c289d Fix a typo in an integrity constraint (#41334) 2023-11-30 09:44:51 +01:00
Harmen Stoppels
4b93c57d44 argparse: make scope choices lazy s.t. validation in tests works (#41344) 2023-11-30 08:37:11 +01:00
Harmen Stoppels
377e7de0d2 tests: fix issue with os.environ binding (#41342) 2023-11-30 07:18:41 +01:00
Greg Sjaardema
0a9c84dd25 SEACAS: new release (#41273)
Replace last release due to build issues on windows. Tag was applied incorrectly
2023-11-29 21:38:05 -07:00
Chris White
2ececcd03e MFEM: add mpi link dir (#41337)
Also fix netcdf-c zlib reference
2023-11-29 21:32:50 -07:00
Adam J. Stewart
f4f67adf49 py-numba: add v0.58.1 (#41262)
* py-numba: add v0.58.1

* Passing tests
2023-11-30 00:17:34 +01:00
Adam J. Stewart
220898b4de py-pygeos: add v0.14 (#41248)
* py-pygeos: add v0.14

* Python is also a link dep
2023-11-30 00:15:07 +01:00
Luc Berger
450f938056 kokkos: add v4.2.00 (#41203)
* Kokkos: adding version 4.2.00 to the package
* Kokkos: adding AMD GPU arch
* kokkos@4.2.00 +sycl: patch numeric traits unit test

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-11-29 22:47:15 +00:00
Massimiliano Culpo
889b729e52 Refactor a test to not use the "working_env" fixture (#41308)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-29 22:14:57 +01:00
Sergio Alexis Paz
3987604b89 Add gipaw when building quantum-espresso with cmake (#41142)
* gipaw.x installed by cmake if version >= 5c4a4ce.

gipaw.x will only be installed with cmake if the qe-gipaw version
is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
Here is a patch to the submodule_commit_hash_records to use a newer
qe-gipaw version.
2023-11-29 12:16:27 -08:00
Andrey Perestoronin
b6f8cb821c intel-oneapi-ccl 2021.11.1: added new version to packages (#41300)
* added new packages

* align release with ccl patch

* revert version on compiler classics
2023-11-29 10:09:32 -07:00
Mikael Simberg
72216b503f dla-future: Add conflicts for compilation issues pre-0.3.1 (#41317)
* dla-future: Add conflict for pedantic warnings turning into errors

* dla-future: Add conflicts for nvcc/fmt/Umpire compilation issues
2023-11-29 15:10:27 +01:00
Harmen Stoppels
c06f353f55 Simplify _create_mock_configuration_scopes (#41318) 2023-11-29 14:06:41 +01:00
Mikael Simberg
367ca3f0ec fmt: Add git attribute (#41293) 2023-11-29 12:50:28 +01:00
Harmen Stoppels
2c4bc287b8 cuda: fix compiler conflicts (#41304) 2023-11-29 12:14:39 +01:00
Harmen Stoppels
fcb3d62093 unit tests: use --verbose to see order on macos (#41314) 2023-11-29 11:29:17 +01:00
John W. Parent
306377684b CMake: add v3.27.9 (#41301) 2023-11-29 02:19:29 -07:00
Massimiliano Culpo
29b75a7ace Fix an issue with deconcretization/reconcretization of environments (#41294) 2023-11-29 09:09:16 +01:00
Paul R. C. Kent
4b41b11c30 cuda: add 12.3.0 (#40827) 2023-11-28 13:28:28 -08:00
afzpatel
92e0d42b64 update hipblas rocalution, rocsolver, rocsparse to new syntax (#40135)
* initial commit to update hipblas rocalution, rocsolver, rocsparse to new syntax
* add rocblas test changes and fixes for hipblas and rocsolver tests
* fix styling
* remove updates for rocblas
2023-11-28 12:49:07 -08:00
afzpatel
1ebd37d20c fix hip tests and bump hip-examples to 5.6.1 (#40928)
* Initial commit to fix hip tests and bump hip-examples to 5.6.1
* fix styling
* add installation of hip samples to share folder
2023-11-28 12:17:10 -08:00
Adam J. Stewart
b719c905f1 apple-libuuid: update installation directory (#40416)
* apple-libuuid: update installation directory

Copy design of Apple GL
2023-11-28 10:13:55 -08:00
eugeneswalker
430b2dff5c e4s ci: disable gpu test stack (#41296) 2023-11-28 18:02:00 +01:00
Tom Payerle
ef8e6a969c vtk: Restrict application of patch xdmf2-hdf51.13.2.patch (#40266)
The changes in patch xdmf2-hdf51.13.2.patch have effectively
been added to vtk@9.2.3 (commit e81a2fe)

So restrict application of the patch to @9:9.2
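A minimal sketch (not the full vtk recipe) of how such a restriction looks as a Spack directive:

```python
from spack.package import *


class Vtk(CMakePackage):
    """Minimal sketch, not the actual vtk package."""

    # Apply the xdmf2/hdf5-1.13.2 fix only to the version range described above;
    # newer releases already contain the upstream change (commit e81a2fe).
    patch("xdmf2-hdf51.13.2.patch", when="@9:9.2")
```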
2023-11-28 11:27:58 -05:00
Massimiliano Culpo
0e65e84768 ASP-based solver: use a unique ID counter (#41290)
* solver: use a unique counter for condition, triggers and effects

* Do not reset counters when re-running setup

  What we need is just a unique ID, it doesn't need
  to start from zero every time.
2023-11-28 16:28:54 +01:00
Seth R. Johnson
e0da7154ad celeritas: new version 0.4.0 (#41288) 2023-11-28 14:44:15 +00:00
Juan Miguel Carceller
6f08daf670 root: add a webgui patch (#41289)
* root: add a webgui patch

* Update var/spack/repos/builtin/packages/root/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Add also the versions that don't need the webgui patch

* Fix hash

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-28 07:33:48 -07:00
Tom Scogland
c2d29ca38c libvips requires pkg-config to find glib (#41184) 2023-11-28 14:42:16 +01:00
Raffaele Solcà
f037ef7451 Fix elpa flags (missing optimization) (#41252)
Setting CFLAGS/FCFLAGS overrides the default optimization flags.

This commit brings them back.
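As an illustration only (this is not the actual elpa recipe), one way a package can re-inject default optimization flags when a user supplies their own cflags/fflags is via a `flag_handler`:

```python
from spack.package import *


class Elpa(AutotoolsPackage):
    """Minimal sketch, not the actual elpa package."""

    def flag_handler(self, name, flags):
        # Hypothetical: keep -O3 even when the user sets cflags/fflags on the
        # spec, so optimization is not silently dropped.
        if name in ("cflags", "fflags") and "-O3" not in flags:
            flags.append("-O3")
        return (flags, None, None)
```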
2023-11-28 11:31:30 +01:00
dependabot[bot]
f84557a81b build(deps): bump vermin from 1.5.2 to 1.6.0 in /.github/workflows/style (#41285)
Bumps [vermin](https://github.com/netromdk/vermin) from 1.5.2 to 1.6.0.
- [Release notes](https://github.com/netromdk/vermin/releases)
- [Commits](https://github.com/netromdk/vermin/compare/v1.5.2...v1.6.0)

---
updated-dependencies:
- dependency-name: vermin
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-28 09:38:38 +00:00
Alec Scott
18efd808da GoPackage: add new build system for Go packages (#41164)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-28 10:33:46 +01:00
Wouter Deconinck
5299b84319 qt-*: new versions 6.6.1 (#41281)
* qt-*: new versions 6.6.1

* qt-quick3d: fixup
2023-11-28 02:18:54 -07:00
Adam J. Stewart
70fb0b35e5 py-transformers: add v4.35.2 (#41266) 2023-11-28 10:17:52 +01:00
Alec Scott
4f7f3cbbdf smee-client: add new package (#41280) 2023-11-28 08:34:49 +01:00
Jose E. Roman
4205ac74e8 slepc: add v3.20.1 (#41274) 2023-11-27 20:53:45 -06:00
Jack Morrison
72ed14e4a9 libfabric: Add version 1.20.0 (#41277) 2023-11-27 19:04:44 -07:00
Dave Keeshan
ed54359454 Move compiler renaming to filter_compiler_wrappers (#41275) 2023-11-27 18:16:03 -07:00
Dave Keeshan
ea610d3fe2 iverilog-vpi: filter compiler wrappers from a few files (#41244) 2023-11-27 17:44:44 -07:00
John W. Parent
cd33becebc CMake: add version 3.27.8 (#41094) 2023-11-27 15:09:44 -07:00
Jim Edwards
8ccfe9f710 cprnc: add new package (#41237) 2023-11-27 14:02:04 -07:00
Eric Berquist
abc294e3a2 Update SST packages to 13.1.0 (#41220)
* Update SST packages to 13.1.0

* Allow mismatch between sst-core dependency and current macro version

* SST does not work with Python 3.12 yet

* Sanity check install binaries for sst-core

* Elements compiles with OTF2 but not OTF

* Version bounds in specs are inclusive

* Remove not-strictly-necessary file check
2023-11-27 14:55:22 -06:00
Alec Scott
c482534c1d CargoPackage: add new build system for Cargo packages (#41192)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-27 20:15:16 +00:00
Robert Cohn
fbec91e491 handle use of an unconfigured compiler (#41213) 2023-11-27 11:57:10 -07:00
Andrey Perestoronin
3d744e7c95 intel-oneapi 2024.0.0: added new version to packages (#41135)
* oneapi 2024.0.0 release

* oneapi v2 directory support and some cleanups

* sycl abi change requires 2024 compilers for packages that use sycl

---------

Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2023-11-27 13:13:04 -05:00
Dave Keeshan
b4bafbbf7e verilator: add v5.018 (#41256)
Add all versions since 4.108 and deprecate previous versions; fix issues with flex and
switch from veripool to GitHub for releases
2023-11-27 10:24:49 -07:00
Juan Miguel Carceller
bd3a1d28bf Simplify a few CMakePackages by removing redundant directives (#41163)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-27 17:43:39 +01:00
Harmen Stoppels
e0ef78b26e docs: refer to oci build cache from containers.rst (#41269) 2023-11-27 16:36:00 +00:00
Satish Balay
d768e6ea5c Balay/xsdk 1.0.0 updates (#41180)
* superlu-dist: add v8.2.1 for xsdk

* heffte, phist build fixes on tioga

* exago: build fixes on polaris

---------

Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>
2023-11-27 09:57:41 -06:00
Harmen Stoppels
8d0e0d5c77 tests: fix more cases of env variables (#41226) 2023-11-27 16:37:31 +01:00
Anton Kozhevnikov
13b711f620 [sirius] update spack recipe; add v7.5.0 (#41233)
* update spack recipe

* [@spackbot] updating style on behalf of toxa81

* change  from @develop to @7.5.0

* return dependency on boost_filesystem

* return dependency on boost_filesystem

* remove boost filesystem as agreed by @RMeli and  @simonpintarelli

---------

Co-authored-by: toxa81 <toxa81@users.noreply.github.com>
2023-11-27 16:28:19 +01:00
Nisarg Patel
d76a774957 libpsm3: add v11.5.1.1 (#41231) 2023-11-27 15:52:19 +01:00
Brian Vanderwende
ee8e40003b vapor: add new recipe (#40707) 2023-11-27 15:48:04 +01:00
Adam J. Stewart
dc715d9840 py-llvmlite: add new versions (#41247) 2023-11-27 15:43:49 +01:00
Derek Ryan Strong
89173b6d24 fpart: add license and variants (#41257) 2023-11-27 15:25:07 +01:00
stepanvanecek
848d270548 sys-sage: update repo url, rework recipe (#41005)
Co-authored-by: Stepan Vanecek <stepan.vanecek@tum.de>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-27 15:21:57 +01:00
fpruvost
ea347b6468 pastix: add v6.3.1 (#41265) 2023-11-27 15:03:04 +01:00
Dave Keeshan
b1b4ef6d1b Add patch so that ccache can compile with the standard gcc@12 version (#41249) 2023-11-27 14:33:30 +01:00
Thomas Madlener
c564b2d969 googletest: Add 1.13.0 and 1.14.0 tags (#41253)
* Add latest tags for googletest

* Implement proper url_for_version

* Fix hashes for older versions
2023-11-27 07:12:49 -05:00
Massimiliano Culpo
343517e794 Improve semantic for packages:all:require (#41239)
An `all` requirement is emitted for a package if all variants referenced are defined by it. Otherwise, the constraint is rejected.
2023-11-27 12:41:16 +01:00
Harmen Stoppels
6fff0d4aed libxsmm: relax arch requirement (#41193)
* libxsmm: relax arch requirement

* libxsmm: add a fixed commit from main
2023-11-27 11:55:33 +01:00
Alberto Invernizzi
34bce3f490 Remove old conflict with gcc@10.3.0 (#41254)
The conflict is captured in CudaPackage and redundant in umpire
2023-11-27 09:29:19 +01:00
Rocco Meli
7cb70e3258 force cp2k cuda/rocm variant on elpa (#41241) 2023-11-27 09:08:50 +01:00
Chris Richardson
df777dbbaa py-fenics-basix: update for main and future 0.8.0 (#40838)
* Update to latest version

* Add dependency

* revert

* address PR comments

* Correct dependencies for 0.7 to 0.8 transition

* Fix cmake line.

* Update nanobind dep

---------

Co-authored-by: Matt Archer <ma595@cam.ac.uk>
Co-authored-by: Jack S. Hale <mail@jackhale.co.uk>
Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
2023-11-25 16:42:54 -06:00
Morten Kristensen
f28ccae3df py-vermin: add latest version 1.6.0 (#41261) 2023-11-25 14:42:58 -07:00
svengoldberg
ecdc296ef8 py-heat: add new package (#39394)
* heat: Create new spack package

* t8code: Add maintainer

* t8code: Add variant descriptions

* t8code: Add second maintainer

* t8code: Add another maintainer

* heat: Changes after review

* heat: Fix style test error

* heat: Delete obsolete install_options and re-add package homepage

* heat: Add dependency on py-setuptools

---------

Co-authored-by: Sven Goldberg <sven.goldberg@dlr.de>
2023-11-25 08:56:27 -06:00
Erik Heeren
9b5c85e919 py-sqlalchemy-utils, py-sql-alchemy: version bump (#41081)
* py-sqlalchemy-utils, py-sql-alchemy: version bump

* Update var/spack/repos/builtin/packages/py-sqlalchemy-utils/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-11-25 08:53:00 -06:00
Erik Heeren
ea8dcb73db py-dictdiffer: version bump (#41080)
* py-dictdiffer: version bump

* py-dictdiffer: removed runtime py-setuptools dependency in 0.9.0

* Update var/spack/repos/builtin/packages/py-dictdiffer/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-11-25 08:51:13 -06:00
Moritz Kern
089e117904 Update py-neo (#39213)
* add 0.12.0

* remove whitespace

* update deps

* Update var/spack/repos/builtin/packages/py-neo/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add dep for python 3.8+

* add dep for python 3.8+ with 0.12.0

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-25 08:40:10 -06:00
Wouter Deconinck
1456d9b727 py-stashcp: new package (#41091)
* py-stashcp: new package

* py-stashcp: depends_on py-urllib3

* py-stashcp: comment as suggested in review

* Update var/spack/repos/builtin/packages/py-stashcp/package.py
2023-11-25 08:31:56 -06:00
Wouter Deconinck
c485709f62 iwyu: new versions up 0.21 (depends_on llvm-17) (#41235) 2023-11-24 10:09:24 -05:00
Michael Kuhn
7db386a018 Fix multi-word aliases (#41126)
PR #40929 reverted the argument parsing to make `spack --verbose
install` work again. It looks like `--verbose` is the only instance
where this kind of argument inheritance is used since all other commands
override arguments with the same name instead. For instance, `spack
--bootstrap clean` does not invoke `spack clean --bootstrap`.

Therefore, fix multi-word aliases again by parsing the resolved
arguments and explicitly passing down `args.verbose` to commands.
2023-11-24 15:56:42 +01:00
Massimiliano Culpo
92d076e683 spack graph: fix coloring with environments (#41240)
If we use all specs, we won't correctly color build-only dependencies
2023-11-24 10:08:21 +01:00
joscot-linaro
f70ef51f1a linaro-forge: update for 23.1 (#41236) 2023-11-24 09:15:47 +01:00
Jógvan Magnus Haugaard Olsen
e9d968d95f pfunit: add version 4.7.4 (#41232) 2023-11-24 09:04:37 +01:00
Nathalie Furmento
918d6baed4 starpu: add release 1.4.2 (#41238) 2023-11-24 09:02:51 +01:00
Loris Ercole
624df2a1bb nlcglib: pass cuda_arch setting to kokkos dependency (#39725)
When building with `+cuda`, the specified `cuda_arch` was not passed to kokkos,
leading to a wrong concretization.
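The usual Spack pattern for this kind of forwarding looks roughly like the sketch below (illustrative only, not the exact nlcglib recipe):

```python
from spack.package import *


class Nlcglib(CMakePackage, CudaPackage):
    """Minimal sketch, not the actual nlcglib package."""

    depends_on("kokkos")

    # Require the same cuda_arch on kokkos that was requested here, so the
    # dependency is concretized for the same GPU architecture.
    for arch in CudaPackage.cuda_arch_values:
        depends_on(
            "kokkos +cuda cuda_arch={0}".format(arch),
            when="+cuda cuda_arch={0}".format(arch),
        )
```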
2023-11-23 17:49:00 +01:00
Massimiliano Culpo
ee0d3a3be2 ASP-based solver: don't error for type mismatch on preferences (#41138)
This commit discards type mismatches or failures to validate a package preference during concretization. The values discarded are logged as debug level messages. It also adds a config audit to help users spot misconfigurations in packages.yaml preferences.
2023-11-23 11:30:39 +01:00
Mark Abraham
de64ce5541 rdma-core: add new variants for a library without Python dependencies (#41195)
These variants allow packages that use rdma-core as a library to avoid
dependencies on python infrastructure that is not useful to them.
2023-11-23 09:52:57 +01:00
Raffaele Solcà
f556e52bf6 Add dla-future 0.3.1 (#41219) 2023-11-23 09:23:20 +01:00
Todd Gamblin
81e7d39bd2 Update CHANGELOG.md from v0.21.0
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-22 14:39:05 -08:00
Harmen Stoppels
61055d9ee5 test_which: do not mutate os.environ 2023-11-22 14:22:37 -08:00
afzpatel
c1a8bb2a12 composable-kernel, migraphx: Fix build on CentOS8 (#41206) 2023-11-22 20:43:45 +01:00
Juan Miguel Carceller
7e9ddca0ff gloo: add a patch for building with gcc 12 (#41169) 2023-11-22 20:39:01 +01:00
Jen Herting
7cad4bb8d9 [nettle] depend on spack managed openssl (#40783) 2023-11-22 20:37:36 +01:00
Thomas Madlener
2d71c6bb8e dd4hep: add v1.27.1 (#41202)
* Make sure that geant4 comes with cxxstd that should be OK
2023-11-22 20:29:46 +01:00
Alec Scott
285de8ad4d fzf: add v0.44.1 (#41204) 2023-11-22 20:21:53 +01:00
Manuela Kuhn
89a0ea01a7 pulseaudio: add missing m4 dependency (#41216) 2023-11-22 20:20:59 +01:00
Stephen Sachs
e49f55ba53 aocc: help compiler find include paths and libstdc++.so (#40450)
Add --gcc-toolchain option by default.
Only add these paths if c++ libs and include files are available and the compiler was built with gcc
2023-11-22 09:59:20 -08:00
Thomas Madlener
3c54177c5d edm4hep: add latest tag for 0.10.2 (#41201) 2023-11-22 10:16:39 -07:00
Harmen Stoppels
24a38e6782 setup_platform_environment before package env mods (#41205)
This roughly restores the order of operation from Spack 0.20,
where `AutotoolsPackage.setup_build_environment` would
override the env variable set in `setup_platform_environment` on
macOS.
2023-11-22 17:32:13 +01:00
Massimiliano Culpo
3cf7f7b800 ASP-based solver: don't emit spurious debug output (#41218)
When improving the error message, we started #showing in the
answer set a lot more symbols - but we forgot to suppress the
debug messages warning about UNKNOWN SYMBOLs
2023-11-22 16:06:46 +01:00
LucaMarradi
d7e756a26b onnxruntime: fix the call to as_string() operator (#41087)
* onnxruntime: fix the call to as_string() operator

* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-onnxruntime: rm now-unused stringpiece_1_10.patch

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-22 07:23:52 -06:00
Tom Scogland
721f15bbeb hub: add v2.14.2, update to go module (#41183)
* packages/hub: add new version, update to module

Hub now uses a go module to build, needs different env vars, and we're
on a very, very old version before that.  Deprecate the old ones so we
can clean out that old build once we pass a spack version.

* cleanup suggested by @adamjstewart
2023-11-22 07:22:22 -06:00
Michael Kuhn
efa316aafa perl: add missing gmake dependency (#41210) 2023-11-22 11:17:47 +01:00
Michael Kuhn
6ac23545ec python: add missing gmake dependency (#41211) 2023-11-22 11:17:27 +01:00
Alex Richert
432f5d64e3 Add cxx17_flag to intel.py (#41207)
* Add cxx17_flag to intel.py
2023-11-21 18:08:02 -07:00
Chris Richardson
f2192a48ce Update py-scikit-build-core to version 0.6.1 (#40779)
* Update to latest version

* Fix linebreak

* Make suggested changes

* bumped to 0.6.1

* Update var/spack/repos/builtin/packages/py-scikit-build-core/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Chris Richardson <cnr12@cam.ac.uk>
Co-authored-by: Matt Archer <ma595@cam.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-21 13:17:07 -06:00
Alec Scott
70bed662fc gettext: add v0.22.4 (#41189) 2023-11-21 05:19:16 -07:00
dependabot[bot]
ae38987cb4 build(deps): bump pygments from 2.16.1 to 2.17.1 in /lib/spack/docs (#41191)
Bumps [pygments](https://github.com/pygments/pygments) from 2.16.1 to 2.17.1.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.16.1...2.17.1)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-21 01:12:50 -07:00
Massimiliano Culpo
b361ffbe22 spack style: fix isort on sl:7 (#41133)
Bump the minimum version required for isort. This should fix
an issue reported on Scientific Linux 7, which is due to:

https://github.com/PyCQA/isort/issues/1363
2023-11-21 06:24:37 +01:00
Mark W. Krentel
964440a08b elfutils: add version 0.190 (#41187) 2023-11-20 21:10:21 -07:00
Wouter Deconinck
aeb1bec8f3 qt-base: have QtBase provide qmake, not QtPackage (#41186) 2023-11-20 20:05:11 -07:00
Mark Abraham
cf163eecc5 gromacs: fix newly added variant (#41178)
In practice, one can only compile for the Intel Data Center Max GPU
via a SYCL build and the oneAPI compiler. This is unlikely to change,
so we can be explicit about that.
2023-11-20 19:49:34 -07:00
Harmen Stoppels
7ec62d117e py-grpcio* do not assume lib / header dir (#41182) 2023-11-20 15:34:29 -06:00
John W. Parent
5154d69629 MSVC preview version breaks clingo build (#41185)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-11-20 21:55:07 +01:00
Alec Scott
d272c49fb6 rust: add v1.73.0 and add support for external openssl certs (#41161)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-20 21:35:26 +01:00
Alec Scott
868a3c43e4 llvm: Remove python bindings when >= v17 (#41160)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-20 14:10:13 +01:00
Marc Perache
73858df14d Catch2: add variant to choose cxx standard (#40996) 2023-11-20 11:31:17 +01:00
Sergio Sánchez Ramírez
8003f18709 openblas: optimize flags for A64FX (#41093) 2023-11-20 11:20:11 +01:00
Adam J. Stewart
87a9b428e5 py-scipy: add v1.11.4 (#41158) 2023-11-20 11:10:28 +01:00
Scott Wittenburg
714a362f94 mpich: support ch3:sock for a non busy-polling option (#40964) 2023-11-20 11:00:39 +01:00
Alec Scott
4636d6ec62 restic: add v0.16.2 (#41168) 2023-11-20 10:46:59 +01:00
Alec Scott
a015078c36 bfs: add v3.0.4 (#41165) 2023-11-20 10:00:31 +01:00
Alec Scott
ec2a0c8847 rclone: add v1.64.2 (#41166) 2023-11-20 10:00:15 +01:00
Alec Scott
cfae42a514 glab: add v1.35.0 (#41167) 2023-11-20 09:59:55 +01:00
iarspider
2c74ac5b2b Remove a maintainer from CMS packages (#41170) 2023-11-20 01:27:38 -07:00
Wouter Deconinck
df1111c24a sherpa: only enable_or_disable in v3: (#41162) 2023-11-20 09:20:21 +01:00
Harmen Stoppels
55d2ee9160 docs: document how spack picks a version / variant (#41070) 2023-11-20 09:00:53 +01:00
Alec Scott
edda2ef419 npm: only depend on libvips when @6, remove deprecated versions (#41159)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-19 10:13:51 -07:00
Ethan Williams
6159168079 elbencho: add new version and git master branch (#41136)
* elbencho add new version and git master branch

* Update var/spack/repos/builtin/packages/elbencho/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* formatting fix requested by @alecbcs

* remove whitespace added in blank line by github auto resolve

---------

Co-authored-by: Ethan W <mail@ethanwilliams.xyz>
Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-19 08:24:06 -07:00
eugeneswalker
2870b6002c e4s oneapi stack: turn on +sycl: ginkgo, heffte, petsc, upcxx, warpx (#41157)
* e4s oneapi stack: turn on +sycl: ginkgo, heffte, petsc, upcxx, warpx

* comment out warpx; build fails; add note
2023-11-18 22:21:20 -08:00
Christoph Junghans
73a715ad75 votca: add v2023 (#41100) 2023-11-18 16:51:46 -07:00
Mark Abraham
6ca49549d9 gromacs: Add new variants and clarify existing ones (#41115)
* gromacs: Add new variants and clarify existing ones

Add new variants that reflect existing capabilities and defaults in
the upstream build system. Add other existing constraints that were
not yet specified.

* conform to style

* Fix missing hyphens

* Correct cmake variable names
2023-11-18 16:48:00 -07:00
Wouter Deconinck
50051b5619 geant4: new version 11.1.3 (#41112)
* geant4: new version 11.1.3

Release notes: https://geant4.web.cern.ch/download/release-notes/notes-v11.1.3.txt

* geant4: cmake patch with expat fix only until 11.1.2.
2023-11-18 15:39:28 -07:00
dependabot[bot]
3907838e1d build(deps): bump docker/build-push-action from 5.0.0 to 5.1.0 (#41149)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.0.0 to 5.1.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](0565240e2d...4a13e500e5)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-18 15:30:12 -07:00
Mark Abraham
f12b877e51 heffte: add sycl variant (#41132)
* heffte: add sycl variant

This targets the oneAPI SYCL compiler with oneMKL as FFT
implementation library.

* Require oneAPI compiler for sycl variant
2023-11-18 09:21:50 -07:00
Massimiliano Culpo
a701b24ad3 libksba: add v1.6.5 (#41129) 2023-11-18 09:18:19 -07:00
Adam J. Stewart
c60a806f0e py-matplotlib: add v3.7.4, v3.8.2 (#41156) 2023-11-18 09:16:51 -07:00
Vanessasaurus
df7747eb9a Automated deployment to update package flux-sched 2023-11-18 (#41153)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-11-18 09:13:49 -07:00
Vanessasaurus
81130274f4 Automated deployment to update package flux-core 2023-11-18 (#41154)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-11-18 09:13:06 -07:00
Vanessasaurus
2428c10703 Automated deployment to update package flux-security 2023-11-18 (#41152)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-11-18 09:11:36 -07:00
Wouter Deconinck
d171f314c7 py-pygithub: new versions, dependencies (#41072)
* py-pygithub: new versions, dependencies

* py-pygithub: reordered dependencies per requirements.txt

* py-pygithub: depends on py-setuptools-scm
2023-11-18 08:57:46 -07:00
Wouter Deconinck
16f4c53cd4 py-bokeh: new version 3.3.1, and supporting packages (#41089)
* py-bokeh: new version 3.3.1

* py-xyzservices: new package

* py-bokeh, py-xyzservices: update homepages

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-bokeh: depends on newer py-numpy

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 08:47:45 -07:00
Moritz Kern
e8f09713be Update py-elephant (#39200)
* add elephant version v0.12.0 and 0.13.0

* update copyright

* reformat according to black format errors

* restore maintainers directive

* Update var/spack/repos/builtin/packages/py-elephant/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add dependency python 3.8+

* sorted dependencies

* sort dependencies from newest to oldest

* add deps for @master

* removed dependency for master, since it is included in 0.12.0:

* removed dependency for python 3.7+ , since 3.7+ is the lowest supported version anyway

* removed specific deps for master, since master is always newer than all stable releases

* updated numpy dependency for Elephant 0.12.0:

* Update var/spack/repos/builtin/packages/py-elephant/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-elephant/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* removed upper bounds for py-quantities, omitting v0.14.0

* add elephant v0.14.0

* update required quantities version

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 07:42:57 -07:00
Lydéric Debusschère
063c28e559 py-cleo: add versions 2.0.0 2.0.1; add maintainers (#40611)
* py-cleo: add versions 2.0.0 2.0.1; add maintainers

* py-cleo: add forgotten dependence

* py-cleo: update from review: remove preferred version, remove a dependence, fix py-rapidfuzz version

* py-cleo: deprecated version 1.0.0a5; add version 1.0.0; update dependences

* py-cleo: add version 2.1.0; update version range of dependences

* py-crashtest: add version 0.4.1, dependence of py-cleo

* py-cleo: update dependence

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cleo: update dependence py-clikit

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cleo: update dependence py-rapidfuzz

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-rapidfuzz: add version 2.2.0 dependence of py-cleo@2

* py-cleo: fix version range of py-crashtest

* py-rapidfuzz: fix dependences; add py-rapidfuzz-capi and py-jarowinkler

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 08:28:56 -06:00
Hariharan Devarajan
31ec1be85f Release DLIO Profiler Py 0.0.2 (#41127) 2023-11-18 07:58:56 -06:00
Erik Heeren
7137c43407 py-jsonpath-ng: version bump (#41078)
* py-jsonpath-ng: version bump

* py-jsonpath-ng: fix typo in version number
2023-11-18 07:54:50 -06:00
Erik Heeren
f474c87814 py-gitpython: version bump (#41079) 2023-11-18 07:53:55 -06:00
Juan Miguel Carceller
edf872c94b py-pyh5py: reorder dependencies from newest version to oldest (#41137)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-18 07:48:34 -06:00
Michael Kuhn
223e5b8ca2 Fix invalid escape sequences (#41130)
Using Python 3.12 in a freshly cloned Spack repository results in
warnings such as this:
```
==> Warning: invalid escape sequence '\$'
==> Warning: invalid escape sequence '\('
==> Warning: invalid escape sequence '\.'
==> Warning: invalid escape sequence '\.'
```
These will turn into errors in 3.13, so fix them. All of them actually
do not need to be regexes, so convert them into normal strings.
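A small, self-contained illustration of the two possible fixes (the `\$` example is hypothetical):

```python
import re

# A plain string literal such as "\$" triggers
#   SyntaxWarning: invalid escape sequence '\$'
# on Python 3.12 and is slated to become an error.

# If the value really is a regular expression, use a raw string:
pattern = re.compile(r"\$")
assert pattern.search("price: 10$")

# If it is only used as ordinary text (the case for the strings fixed here),
# the backslash can simply be dropped:
text = "$"
assert text in "price: 10$"
```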
2023-11-18 07:43:35 -06:00
Moritz Kern
cb764ce41c Update py-quantities (#39201)
* add version 0.14.1

* formatting

* style checks

* fix style errors

* remove old versions

* fix typo

* style

* update maintainers directive

* sort dependencies from newest to oldest

* Update var/spack/repos/builtin/packages/py-quantities/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-quantities/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* sort dependencies from newest to oldest

* removed upper bounds for python version

* Update var/spack/repos/builtin/packages/py-quantities/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove dependency on Python 3.7 +, since 3.7 is the lowest supported version anyway

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 07:12:58 -06:00
Brian Van Essen
9383953f76 Hydrogen package: avoid newer openblas on power (#41143) 2023-11-17 14:20:33 -08:00
Stephen Herbener
9ad134c594 Add cxxstd variant to ecflow, update with latest ecflow version (#41120)
* Added recent versions to ecflow/package.py, as well as a cxxstd variant that is
needed to set BOOST_NO_CXX98_FUNCTION_BASE appropriately when building with the C++17 standard.
* Fixed pep8 style error in the ecflow package.py script.
* Removed the cxxstd variant since the ecflow cmake configuration was already specifying
the c++17 standard for newer versions. The use of the BOOST_NO_CXX98_FUNCTION_BASE
define is now triggered by the ecflow version.
2023-11-17 13:34:17 -08:00
Mark Abraham
ec8bd38c4e Permit packages that depend on Intel oneAPI packages to access sdk (#41117)
* Permit packages that depend on Intel oneAPI packages to access sdk

* Implement and use IntelOneapiLibraryPackageWithSdk

* Restore libs property to IntelOneapiLibraryPackage

* Conform to style

* Provide new class to infrastructure

* Treat sdk/include as the main include
2023-11-17 14:59:04 +00:00
Wouter Deconinck
81e73b4dd4 root: new version 6.30.00 (#41118)
* root: new version 6.30.00

There is a new release of ROOT, v6.30.00, with release notes at https://root.cern/doc/v630/release-notes.html.

In addition to some deprecations of build options, this updates the C++ standard to 17 or higher (well, 20), and increases the vc minimum version.

* vc: new version 1.4.4

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-11-17 08:49:01 -06:00
Thomas Madlener
53c7edb0ad lcio: Add latest tag 2.20.1 (#41102) 2023-11-17 08:24:43 -06:00
Massimiliano Culpo
54ab0872f2 py-archspec: add v0.2.2 (#41110) 2023-11-17 11:22:32 +01:00
Harmen Stoppels
1b66cbacf0 llvm: patch missing cstdint include (#41108)
* llvm: patch missing cstdint include
2023-11-17 10:17:25 +01:00
Kelly (KT) Thompson
a3d6714c8b [doxygen] Add versions 1.9.7 and 1.9.8. (#41123)
* [doxygen] Add versions 1.9.7 and 1.9.8.
* Fix hash for 1.9.8.
2023-11-16 19:44:17 -07:00
bk
da2e53b2ee r-rlang: add v1.1.1, v1.1.2 (#41114) 2023-11-16 19:24:20 -07:00
Weiqun Zhang
3e060cce60 Update amrex maintainers (#41122) 2023-11-16 15:06:54 -08:00
Wouter Deconinck
00df2368a3 clhep: new version 2.4.7.1 (#41113) 2023-11-16 15:57:29 -07:00
Massimiliano Culpo
ef689ea586 libgcrypt: add v1.10.3 (#41111) 2023-11-16 15:46:44 -07:00
Thomas Madlener
4991a60eac podio: Add latest tag 0.17.3 (#41103) 2023-11-16 14:41:24 -08:00
Tim Wickberg
67c2c80cf4 Use preferred capitalization of "Slurm" (#41109)
https://slurm.schedmd.com/faq.html#acronym
2023-11-16 22:40:54 +00:00
Massimiliano Culpo
0cde944ccc Improve the error message for deprecated preferences (#41075)
Improves the warning for deprecated preferences, and adds a configuration
audit to get files:lines details of the issues.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-11-16 23:30:29 +01:00
Adam J. Stewart
1927ca1f35 Update PyTorch ecosystem (#41105) 2023-11-16 10:59:13 -08:00
Adam J. Stewart
765df31381 py-lightning: add v2.1.2 (#41106) 2023-11-16 10:52:21 -08:00
Sinan
ba091e00b3 package/lemon: improve (#40971)
* package/lemon: improve
* fix bug
* final improvements
* use f strings for boolean options, add soplex as TODO
* leave +coin as TODO
* depends on bzip2 when +coin
* tidy

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2023-11-16 10:36:29 -08:00
afzpatel
8d2e76e8b5 enable rocAL and add MIVisionX tests (#39630)
* initial commit to enable rocAL and add MIVisionX tests
* fix styling
* updated checksum for libjpeg patches
* update for 5.6
* use satisfies for checking spec version
2023-11-16 10:29:00 -08:00
Martin Aumüller
0798bd0915 Updates for Ospray@3.0.0 (#41054)
* rkcommon: add v1.12.0
* openimagedenoise: add v2.1.0
* openvkl: 1.3.2 only compatible with rkcommon@:1.11
* openvkl: add v2.0.0
* ospray: add v3.0.0
* paraview: not yet compatible with ospray@3
2023-11-16 10:04:02 -08:00
Massimiliano Culpo
1e1cb68b84 Add audit check to spot when= arguments using wrong named specs (#41107)
* Add audit check to spot when= arguments using named specs

* Fix package issues caught by the new audit
2023-11-16 14:19:05 +01:00
Auriane R
495252f7f6 Add patch for libffi@3.4.4 since failing to install using clang@15 (#41083) 2023-11-16 10:04:46 +01:00
Sergio Sánchez Ramírez
66dea1d396 Update package.py (#41092) 2023-11-15 17:38:28 -07:00
Daniel Arndt
2f4046308f deal.II: Require at least taskflow 3.4 (#41095) 2023-11-15 16:16:53 -08:00
Alberto Sartori
95321f4f3a justbuild: add version v1.2.3 (#41084) 2023-11-15 15:57:06 -08:00
Satish Balay
6eae4b9714 taskflow: add v3.6.0 (#41098) 2023-11-15 15:20:38 -08:00
Harmen Stoppels
2f24aeb7f6 docs: packages config on separate page, demote bootstrapping (#41085) 2023-11-15 15:49:16 +00:00
Rocco Meli
1d30e78b54 cp2k: add hipfft and hipblas explicitly (#41074) 2023-11-15 01:44:38 -07:00
Jonathon Anderson
b3146559fb hpctoolkit: Add depends on autotools for @develop (#41067) 2023-11-15 09:19:02 +01:00
moloney
84e33b496f mrtrix3: fix some issues w/ 3.0.3 and add 3.0.4 (#41036) 2023-11-15 09:13:21 +01:00
Jonathon Anderson
de850e97e8 libevent: call autoreconf directly instead of via autogen.sh (#41057) 2023-11-15 09:11:49 +01:00
kwryankrattiger
c7157d13a8 ParaView: Add release candidate 5.12.0-RC1 (#41009)
* ParaView: Add release candidate 5.12.0-RC1

* [@spackbot] updating style on behalf of kwryankrattiger
2023-11-15 08:27:27 +01:00
Gerhard Theurich
d97d73fad1 esmf: add v8.6.0 (#41066) 2023-11-14 22:23:37 -07:00
Julien Cortial
9792625d1f Fix typo in mumps recipe (#41062)
* Fix typo in mumps recipe
* Adopt mumps package
2023-11-14 10:43:40 -07:00
Satish Balay
43a94e981a xsdk: add version 1.0.0 (#40825)
xsdk: add +sycl variant - with amrex, arborx, ginkgo, petsc, sundials

xsdk: add +pflotran variant

xsdk: enable hypre+rocm

xsdk: enable superlu-dist for GPU - but use trilinos~superlu-dist [as that breaks builds]

xsdk: dealii: disable oce as it can cause intel-tbb-2017.6 to be picked up for some builds (for ex: gcc=13) and result in subsequent build failures
2023-11-14 11:00:19 -06:00
Thomas-Ulrich
ee1a2d94ad bison: conflict %oneapi due to possible miscompilation (#40860) 2023-11-14 17:58:17 +01:00
Harmen Stoppels
25eca56909 gmake: fix bootstrap (#41060) 2023-11-14 17:44:48 +01:00
Massimiliano Culpo
2ac128a3ad Add papyrus to the list of broken tests (#40923)
* Disable papyrus in the neoverse v1 pipeline
   See https://gitlab.spack.io/spack/spack/-/jobs/8983875
   The job is hanging on tests for 6 hrs.
* Add papyrus to broken tests instead of removing it
2023-11-14 07:37:29 -08:00
Massimiliano Culpo
1255620a14 Fix infinite recursion when computing concretization errors (#41061) 2023-11-14 14:44:58 +01:00
Harmen Stoppels
18ebef60aa R: cleanup recipe and fix linking to lapack libraries (#41040) 2023-11-14 14:44:36 +01:00
Dennis Klein
6fc8679fb4 fairmq: add v1.8.1 (#41007) 2023-11-14 12:55:09 +01:00
Massimiliano Culpo
8a8dcb9479 modules: unit-tests without polluted user scope (#41041) 2023-11-14 10:29:28 +00:00
Adam J. Stewart
a6179f26b9 GDAL: add v3.8.0 (#41047) 2023-11-14 03:01:31 -06:00
Martin Aumüller
0dc73884c7 ispc: add v1.21 and v1.21.1 (#41053) 2023-11-14 09:38:08 +01:00
dependabot[bot]
a80b4fd20d build(deps): bump urllib3 from 2.0.7 to 2.1.0 in /lib/spack/docs (#41055)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.7 to 2.1.0.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.0.7...2.1.0)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-14 09:33:41 +01:00
Wouter Deconinck
c264cf12a2 dd4hep: avoid IndexError in setup_run_environment (#41051)
Some environments may have `dd4hep` as a concretized package without having it installed (yet). For those environments, the `libs` property of `dd4hep` is an empty list. Nevertheless, it can be added to a run environment (for example when `dd4hep` is part of an environment). This results in an IndexError:
```
==> Warning: couldn't load runtime environment due to IndexError: list index out of range
```
To avoid the IndexError, only prepend the `dd4hep` libs if there are actually libs found.
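A minimal sketch of the guarded prepend described above (illustrative, not the exact dd4hep recipe):

```python
from spack.package import *


class Dd4hep(CMakePackage):
    """Minimal sketch, not the actual dd4hep package."""

    def setup_run_environment(self, env):
        libs = self.libs
        # Only touch LD_LIBRARY_PATH when libraries were actually found; for a
        # concretized-but-uninstalled dd4hep, self.libs is empty and indexing
        # libs.directories[0] would raise the IndexError shown above.
        if libs:
            env.prepend_path("LD_LIBRARY_PATH", libs.directories[0])
```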
2023-11-14 09:25:56 +01:00
Raffaele Solcà
769474fcb0 DLA-future: add v0.3.0 (#41042) 2023-11-14 09:25:08 +01:00
Adam J. Stewart
ab60bfe36a py-numpy: add v1.26.2 (#41046) 2023-11-13 16:41:05 -07:00
John W. Parent
8bcc3e2820 CMake Package: support building ~ownlibs on Windows (#38758) 2023-11-13 14:26:33 -08:00
Hariharan Devarajan
388f141a92 Release Brahma v0.0.2 (#40994) 2023-11-13 14:25:12 -08:00
Todd Gamblin
f74b083a15 info: improve coverage (#41001)
Tests didn't cover the new `--variants-by-name` parameter in #40998.
Add some parameterization to hit that.

This changeset makes me think that the main section-printing loop in `spack info` isn't
factored so well. It makes it difficult to pass different arguments to different helper
functions.  I could break it out into if statements if folks think that would be cleaner.
2023-11-13 13:45:18 -08:00
heatherkellyucl
5b9d260054 gzip: deprecate <1.13 for vulnerability (#41044) 2023-11-13 13:38:16 -07:00
Greg Becker
4bd47d89db spack diff: allow hashes from mirrors (#41043) 2023-11-13 12:27:52 -08:00
Daniel Arndt
96f3c76052 dealii: add v9.5.0, v9.5.1 (#40747)
* dealii: 9.5.0

* kokkos+cuda_lambda

* dealii ^kokkos@3.7: require +cuda +cuda_lambda +wrapper

* Added 9.5.1, try ~cgal when +cuda

* Forward Cuda architecture request

* Remove workaround

* Try not enforcing the Kokkos compiler

* Enforce using nvcc_wrapper with Trilinos+Cuda

* Don't define CMAKE_*_COMPILER to point to MPI wrappers

* Use the same compiler as Trilinos/Kokkos

* Only check for Trilinos compiler

* Disable Trilinos+Cuda

* Disable Cuda support

* Try CUDA build without ninja

* Combined examples and examples_compile

* Use f-string for cuda_arch

* p -> _package

* Indentation

* Fix up f-string

---------

Co-authored-by: Luca Heltai <luca.heltai@sissa.it>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-11-13 12:29:55 -06:00
H. Joe Lee
9c74eda61f hdf5: add a new variant for enabling sub-filing VFD (#40804) 2023-11-13 11:18:02 -07:00
dependabot[bot]
d9de93a0fc build(deps): bump black from 23.10.1 to 23.11.0 in /lib/spack/docs (#40967)
Bumps [black](https://github.com/psf/black) from 23.10.1 to 23.11.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.10.1...23.11.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-13 08:18:06 -07:00
Wouter Deconinck
3892fadbf6 qwt: conflict with qt-base (Qt6) (#40883) 2023-11-13 12:42:37 +01:00
Glenn Horton-Smith
62b32080a8 epics-base: patch to avoid failure on "perl xsubpp" when "xsubpp" otherwise works fine. (#40849) 2023-11-13 12:29:51 +01:00
Wanlin Wang
09d66168c4 riscv-gnu-toolchain: add v2023.09.13 -> v2023.10.18 (#40854) 2023-11-13 12:19:09 +01:00
Tuomas Koskela
d7869da36b conquest: add build system changes and library paths (#40718) 2023-11-13 12:13:53 +01:00
Mikael Simberg
b6864fb1c3 Add license directives to various packages (#41039) 2023-11-13 04:03:48 -07:00
Harmen Stoppels
e6125061e1 Compiler.debug_flags: drop -gz (#40900)
That flag enables compression of the debug symbols; it doesn't toggle them on
or off.
2023-11-13 11:33:40 +01:00
dependabot[bot]
491bd48897 build(deps): bump black in /.github/workflows/style (#40968)
Bumps [black](https://github.com/psf/black) from 23.10.1 to 23.11.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.10.1...23.11.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-13 11:18:52 +01:00
Victoria Cherkas
ad4878f770 metkit: add v1.10.2 and v1.10.17 (#40668) 2023-11-13 10:56:52 +01:00
Satish Balay
420eff11b7 superlu-dist: add v8.2.0 (#41004) 2023-11-13 10:55:05 +01:00
Adam J. Stewart
15e7aaf94d py-mypy: add v1.4:v1.7 (#41015) 2023-11-13 10:33:33 +01:00
Adam J. Stewart
bd6c5ec82d py-pandas: add v2.1.3 (#41017) 2023-11-13 10:26:56 +01:00
dependabot[bot]
4e171453c0 build(deps): bump mypy from 1.6.1 to 1.7.0 in /lib/spack/docs (#41020)
Bumps [mypy](https://github.com/python/mypy) from 1.6.1 to 1.7.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.6.1...v1.7.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-13 10:13:26 +01:00
Adam J. Stewart
420bce5cd2 GEOS: add new versions (#41030) 2023-11-13 10:09:58 +01:00
Thomas Gruber
b4f6c49bc0 likwid: add 5.3.0 version (#41008) 2023-11-13 10:09:39 +01:00
David Gardner
da4f2776d2 sundials: add license directive (#41028) 2023-11-13 10:08:28 +01:00
Adam J. Stewart
e2f274a634 PyTorch: allow +openmp on macOS (#41025) 2023-11-13 10:07:18 +01:00
Christian Glusa
15dcd3c65c py-pynucleus: Add variant, modify dependencies (#41006) 2023-11-11 16:24:12 -06:00
Matthew Archer
49c2894def update to latest version (#40905) 2023-11-11 16:16:45 -06:00
Stephen Hudson
1ae37f6720 libEnsemble: add v1.1.0 (#40969) 2023-11-11 16:15:43 -06:00
Adrien Berchet
15f6368c7f Add geomdl package (#40933) 2023-11-11 15:55:08 -06:00
Terry Cojean
57b63228ce Ginkgo: 1.7.0, change compatibility, update option oneapi->sycl (#40874)
Signed-off-by: Terry Cojean <terry.cojean@kit.edu>
2023-11-11 09:00:52 -06:00
Greg Becker
13abfb7013 spack deconcretize command (#38803)
We have two ways to concretize now:
* `spack concretize` concretizes only the root specs that are not concrete in the environment.
* `spack concretize -f` eliminates all cached concretization data and reconcretizes the *entire* environment.

This PR adds `spack deconcretize`, which eliminates cached concretization data for a spec.  This allows
users greater control over what is preserved from their `spack.lock` file and what is reused when not
using `spack concretize -f`.  If you want to update a spec installed in your environment, you can call
`spack deconcretize` on it, and that spec and any relevant dependents will be removed from the lock file.

`spack deconcretize` has two options:
* `--root`: limits deconcretized specs to *specific* roots in the environment. You can use this to
  deconcretize exactly one root in a `unify: false` environment; i.e., if root `foo` is a dependent
  of root `bar`, `spack deconcretize bar` will *not* deconcretize `foo`.
* `--all`: deconcretize *all* specs that match the input spec. By default `spack deconcretize`
  will complain about multiple matches, like `spack uninstall`.
2023-11-10 14:55:35 -08:00
Nils Lehmann
b41fc1ec79 new release (#41010) 2023-11-10 11:52:59 -07:00
Henri Menke
124e41da23 libpspio 0.3.0 (#40953)
Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-10 09:48:50 -08:00
Massimiliano Culpo
f6039d1d45 builtin.repo: fix ^mkl pattern in minor packages (#41003)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-11-10 17:18:24 +01:00
Victoria Cherkas
8871bd5ba5 fdb: add dependency on eckit later release (#40737)
* depends_on("eckit@1.24.4:", when="@5.11.22:")

* Update var/spack/repos/builtin/packages/fdb/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* make latest tagged release the default install

* revert f258f46660

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-10 07:54:25 -08:00
Cody Balos
efe85755d8 alquimia: apply patch for iso_c_binding to latest version (#40989) 2023-11-10 08:31:38 -06:00
Cody Balos
7aaa17856d pflotran: tweak for building with xsdk rocm/hip (#40990) 2023-11-10 08:30:35 -06:00
Massimiliano Culpo
fbf02b561a gromacs et al: fix ^mkl pattern (#41002)
The ^mkl pattern was used to refer to three packages
even though none of the software using it was depending
on "mkl".

This pattern, which follows Hyrum's law, is now being
removed in favor of a more explicit one.

In this PR gromacs, abinit, lammps, and quantum-espresso
are modified.

Intel packages are also modified to provide "lapack"
and "blas" together.
2023-11-10 13:56:04 +00:00
Harmen Stoppels
4027a2139b env: compute env mods only for installed roots (#40997)
And improve the error message (load vs unload).

Of course you could have some uninstalled dependency too, but as long as
it doesn't implement `setup_run_environment` etc, I don't think it hurts
to attempt to load the root anyways, given that failure to do so is a
warning, not a fatal error.
2023-11-10 12:32:48 +01:00
Todd Gamblin
f0ced1af42 info: rework spack info command to display variants better (#40998)
This changes variant display to use a much more legible format, and to use screen space
much better (particularly on narrow terminals). It also adds color to the variant display
to match other parts of `spack info`.

Descriptions and variant value lists that were frequently squished into a tiny column
before now have closer to the full terminal width.

This change also preserves any whitespace formatting present in `package.py`, so package
maintainers can make easier-to-read descriptions of variant values if they want. For
example, `gasnet` has had a nice description of the `conduits` variant for a while, but
it was wrapped and made illegible by `spack info`. That is now fixed and the original
newlines are kept.

Conditional variants are grouped by their when clauses by default, but if you do not
like the grouping, you can display all the variants in order with `--variants-by-name`.
I'm not sure when people will prefer this, but it makes it easier to tell that a
particular variant is/isn't there. I do think grouping by `when` is the better default.
2023-11-10 12:31:28 +01:00
David Boehme
2e45edf4e3 Add adiak v0.4.0 (#40993)
* Add adiak v0.4.0
* Fix style checks
2023-11-09 21:23:00 -07:00
Adam J. Stewart
4bcfb01566 py-black: add v23.10: (#40959) 2023-11-10 00:10:28 +01:00
Adam J. Stewart
b8bb8a70ce PyTorch: specify CUDA root directory (#40855) 2023-11-09 14:25:54 -08:00
Hariharan Devarajan
dd2b436b5a new release cpp-logger v0.0.2 (#40972) 2023-11-09 13:08:04 -08:00
Hariharan Devarajan
da2cc2351c Release Gotcha v1.0.5 (#40973) 2023-11-09 13:06:56 -08:00
eugeneswalker
383ec19a0c Revert "Deactivate Cray sles, due to unavailable runner (#40291)" (#40910)
This reverts commit 4b06862a7f.
2023-11-09 12:24:18 -08:00
Todd Gamblin
45f8a0e42c docs: tweak formatting of +: and -: operators (#40988)
Just trying to make these stand out a bit more in the docs.
2023-11-09 19:55:29 +00:00
Dom Heinzeller
4636a7f14f Add symlinks for hdf5 library names when built in debug mode (#40965)
* Add symlinks for hdf5 library names when built in debug mode
* Only apply bug fix for debug libs when build type is Debug
2023-11-09 11:40:53 -08:00
Kelly (KT) Thompson
38f3f57a54 [lcov] Add build and runtime deps necessary for lcov@2.0.0: (#40974)
* [lcov] Add build and runtime deps necessary for lcov@2.0.0:
   + Many additional Perl package dependecies are required for the new version of lcov.
   + Some of the new dependencies were not known to spack until now.
* Style fix
2023-11-09 11:37:38 -08:00
Satish Balay
b17d7cd0e6 mfem: add hipblas dependency for superlu-dist (#40981) 2023-11-09 11:19:48 -08:00
Satish Balay
b5e2f23b6c hypre: add in hipblas dependency due to superlu-dist (#40980) 2023-11-09 11:03:03 -08:00
Brian Van Essen
7a4df732e1 DiHydrogen, Hydrogen, and Aluminum CachedCMakePackage (#39714) 2023-11-09 19:08:37 +01:00
George Young
7e6aaf9458 py-macs3: adding zlib dependency (#40979) 2023-11-09 16:44:24 +01:00
Cody Balos
2d35d29e0f sundials: add v6.6.2 (#40920) 2023-11-09 07:38:40 -06:00
Scott Wittenburg
1baed0d833 buildcache: skip unrecognized metadata files (#40941)
This commit improves forward compatibility of Spack with newer build cache metadata formats.

Before this commit, invalid or unrecognized metadata was a fatal error; now it just causes
the mirror to be skipped.
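Purely as an illustration of the behavior (this is not Spack's actual buildcache code; the helper and exception types are hypothetical stand-ins), the control flow is roughly:

```python
import llnl.util.tty as tty


def specs_from_mirrors(mirror_urls, fetch_metadata):
    """Collect specs from each mirror, skipping mirrors whose metadata
    cannot be parsed instead of aborting (hypothetical sketch)."""
    specs = []
    for url in mirror_urls:
        try:
            specs.extend(fetch_metadata(url))  # caller-supplied fetch/parse helper
        except (ValueError, KeyError) as err:  # stand-ins for metadata errors
            tty.warn("Skipping buildcache mirror {0}: {1}".format(url, err))
            continue
    return specs
```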

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-09 13:30:41 +00:00
Harmen Stoppels
cadc2a1aa5 Set version to 0.22.0.dev0 (#40975) 2023-11-09 10:02:29 +01:00
Satish Balay
78449ba92b intel-oneapi-mkl: do not set __INTEL_POST_CFLAGS env variable (#40947)
This triggers warnings from the icx compiler, which break the petsc configure

$ I_MPI_CC=icx /opt/intel/oneapi/mpi/2021.7.0/bin/mpiicc -E a.c > /dev/null
$ __INTEL_POST_CFLAGS=-Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.0/lib/intel64 I_MPI_CC=icx /opt/intel/oneapi/mpi/2021.7.0/bin/mpiicc -E a.c > /dev/null
icx: warning: -Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.0/lib/intel64: 'linker' input unused [-Wunused-command-line-argument]
2023-11-09 08:40:12 +01:00
Massimiliano Culpo
26d6bfbb7f modules: remove deprecated code and test data (#40966)
This removes a few deprecated attributes from the
schema of the "modules" section. Test data for
deprecated options is removed as well.
2023-11-09 08:15:46 +01:00
Sergio Sánchez Ramírez
3405fe60f1 libgit2: add python as test dependency (#40863)
Libgit2 requires python as a build dependency. I was getting an error because it was falling back to the system Python, which is compiled with Intel compilers; thus, `libgit2` was failing because it couldn't find `libimf.so` (which doesn't make sense).

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-11-08 23:20:55 +01:00
Harmen Stoppels
53c266b161 modules: restore exclude_implicits (#40958) 2023-11-08 22:56:55 +01:00
Thomas Madlener
ed8ecc469e podio: Add the latest tag (0.17.2) (#40956)
* podio: Add myself as maintainer
* podio: Add 0.17.2 tag
2023-11-08 14:53:23 -07:00
Harmen Stoppels
b2840acd52 Revert "defaults/modules.yaml: hide implicits (#40906)" (#40955)
This reverts commit a2f00886e9.
2023-11-08 14:33:50 -07:00
Tom Vander Aa
c35250b313 libevent: always autogen.sh (#40945)
The libevent release tarballs ship with a `configure` script generated by an old `libtool`. The `libtool` generated by `configure` is not compatible with `MACOSX_DEPLOYMENT_TARGET` > 10. Regenerating the `configure` scripts fixes the build on macOS.

Original configure contains:
```
    case $host_os in
    rhapsody* | darwin1.[012])
      _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
    darwin1.*)
      _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
    darwin*) # darwin 5.x on
      # if running on 10.5 or later, the deployment target defaults
      # to the OS version, if on x86, and 10.4, the deployment
      # target defaults to 10.4. Don't you love it?
      case ${MACOSX_DEPLOYMENT_TARGET-10.0},$host in
        10.0,*86*-darwin8*|10.0,*-darwin[91]*)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
        10.[012][,.]*)
          _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
        10.*)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
      esac
```

After re-running `autogen.sh`:
```
    case $host_os in
    rhapsody* | darwin1.[012])
      _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
    darwin1.*)
      _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
    darwin*)
      case $MACOSX_DEPLOYMENT_TARGET,$host in
        10.[012],*|,*powerpc*-darwin[5-8]*)
          _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
        *)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
      esac
```
2023-11-08 22:33:09 +01:00
Adam J. Stewart
e114853115 py-lightning: add v2.1.1 (#40957) 2023-11-08 13:15:23 -08:00
Cameron Smith
89fc9a9d47 lcov: add version2, embed perl path in binaries (#39342)
* lcov: add version2, perl dep at build and runtime
* lcov: add runtime deps
* namespace-autoclean: new perl package
* datetime: dep on autoclean
* formatting
2023-11-08 11:23:23 -08:00
Massimiliano Culpo
afc693645a tcl: filter compiler wrappers to avoid pointing to Spack (#40946) 2023-11-08 19:38:41 +01:00
downloadico
4ac0e511ad abinit: add v9.10.3 (#40919)
* abinit: add v9.10.3
Changed configure arguments for specifying how to use Wannier90 for versions
after 9.8.
When the mpi variant is requested, set the F90 environment variable to point
to the MPI Fortran wrapper when building versions after 9.8 instead of FC.
---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2023-11-08 10:15:49 -08:00
Henri Menke
b0355d6cc0 ScaFaCoS 1.0.4 (#40948) 2023-11-08 10:17:58 -07:00
Konstantinos Parasyris
300d53d6f8 Add new tag on AMS (#40949) 2023-11-08 08:52:53 -08:00
Greg Becker
0b344e0fd3 tutorial stack: update for changes to the basics section for SC23 (#40942) 2023-11-07 23:46:57 -08:00
Peter Scheibel
15adb308bf RAJA package: find libs (#40885) 2023-11-08 08:33:04 +01:00
Michael Kuhn
050d565375 julia: constrain patchelf version (#40938)
* julia: constrain patchelf version

patchelf@0.18 breaks (at least) `libjulia-internal.so`, leading to
errors like:
```
$ julia --version
ERROR: Unable to load dependent library $SPACK/opt/spack/linux-centos8-x86_64_v3/gcc-12.3.0/julia-1.9.2-6hf5qx2q27jth2fkm6kgqmfdlhzzw6pl/bin/../lib/julia/libjulia-internal.so.1
Message:$SPACK/opt/spack/linux-centos8-x86_64_v3/gcc-12.3.0/julia-1.9.2-6hf5qx2q27jth2fkm6kgqmfdlhzzw6pl/bin/../lib/julia/libjulia-internal.so.1: ELF load command address/offset not properly aligned
```

* patchelf: prefer v0.17.x since v0.18 breaks libraries
2023-11-08 08:13:54 +01:00
Matthew Thompson
f6ef2c254e mapl: add v2.41 and v2.42 (#40870)
* mapl: add 2.41 and 2.42

* Conflict MPICH 3
2023-11-07 17:36:11 -08:00
Freifrau von Bleifrei
62c27b1924 discotec: add compression variant (#40925) 2023-11-07 14:58:48 -08:00
SWAT Team (JSC)
2ff0766aa4 adds cubew 4.8.1, cubelib 4.8.1 and cubegui 4.8.1, 4.8.2 (#40612)
* exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 (#40676)
* exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 due to pybind error with py 3.11
* hiop@:1.0 +cuda: constrain to cuda@:11.9
* fixes syntax of maintainers

---------

Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
2023-11-07 14:40:36 -08:00
Mark W. Krentel
dc245e87f9 intel-xed: fix git hash for mbuild, add version 2023.10.11 (#40922)
* intel-xed: fix git hash for mbuild, add version 2023.10.11
   Fixes #40912
* Fix the git commit hash for mbuild 2022.04.17.  This was broken in
   commit eef9939c21 by mixing up the hashes for xed versus mbuild.
* Add versions 2023.08.21 and 2023.10.11.
* fix style
2023-11-07 14:36:42 -08:00
Harmen Stoppels
c1f134e2a0 tutorial: use lmod@8.7.18 because @8.7.19: has bugs (#40939) 2023-11-07 23:04:45 +01:00
Mosè Giordano
391940d2eb julia: Add v1.9.3 (#40911) 2023-11-07 22:06:12 +01:00
Adam J. Stewart
8c061e51e3 sleef: build shared libs (#40893) 2023-11-07 21:48:59 +01:00
Richarda Butler
5774df6b7a Propagate variant across nodes that don't have that variant (#38512)
Before this PR, variants were not propagated to leaf nodes that could accept
the propagated value if some intermediate node couldn't accept it.

This PR fixes that issue by marking nodes as "candidate" for propagation
and by setting the variant only if it can be accepted by the node.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-07 21:04:41 +01:00
Harmen Stoppels
3a5c1eb5f3 tutorial pipeline: force gcc@12.3.0 (#40937) 2023-11-07 20:53:44 +01:00
Harmen Stoppels
3a2ec729f7 Ensure global command line arguments end up in args like before (#40929) 2023-11-07 20:35:56 +01:00
Jacob King
a093f4a8ce superlu-dist: add +parmetis variant. (#40746)
* Expose ability to make parmetis an optional superlu-dist dependency to
spack package management.

* rename parmetis variant: Enable ParMETIS library

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-11-07 10:21:38 -08:00
Scott Wittenburg
b8302a8277 ci: do not retry timed out build jobs (#40936) 2023-11-07 17:44:28 +00:00
Massimiliano Culpo
32f319157d Update the branch for the tutorial command (#40934) 2023-11-07 16:59:48 +00:00
Harmen Stoppels
75dfad8788 catch exceptions in which_string (#40935) 2023-11-07 17:17:31 +01:00
Vanessasaurus
f3ba20db26 fix configure args for darshan-runtime (#40873)
Problem: the configure arguments were being added to the list as lists,
when they need to be added as individual strings.
Solution: ensure we add each item (string) separately.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2023-11-07 07:00:28 -08:00
Rob Falgout
6301edbd5d Update package.py for new release 2.30.0 (#40907) 2023-11-07 07:58:00 -07:00
Massimiliano Culpo
c232bf435a Change container labeling so that "latest" is the latest tag (#40593)
* Use `major.minor.patch`, `major.minor`, `major` in tags

* Ensure `latest` is the semver largest version, and not "latest in time"

* Remove Ubuntu 18.04 from the list of images
2023-11-07 11:53:36 +01:00
Massimiliano Culpo
f3537bc66b ASP: targets, compilers and providers soft-preferences are only global (#31261)
Modify the packages.yaml schema so that soft-preferences on targets,
compilers and providers can only be specified under the "all" attribute.
This makes them effectively global preferences.

Version preferences instead can only be specified under a package
specific section.

If a preference attribute is found in a section where it should
not be, it will be ignored and a warning is printed to screen.
2023-11-07 07:46:06 +01:00
Massimiliano Culpo
4004f27bc0 archspec: update to v0.2.2 (#40917)
Adds support for Neoverse V2
2023-11-07 07:44:52 +01:00
Todd Gamblin
910190f55b database: optimize query() by skipping unnecessary virtual checks (#40898)
Most queries will end up calling `spec.satisfies(query)` on everything in the DB, which
will cause Spack to ask whether the query spec is virtual if its name doesn't match the
target spec's. This can be expensive, because it can cause Spack to check if any new
virtuals showed up in *all* the packages it knows about. That can currently trigger
thousands of `stat()` calls.

We can avoid the virtual check for most successful queries if we consider that if there
*is* a match by name, the query spec *can't* be virtual. This PR adds an optimization to
the query loop to save any comparisons that would trigger a virtual check for last.

- [x] Add a `deferred` list to the `query()` loop.
- [x] First run through the `query()` loop *only* checks for name matches.
- [x] Query loop now returns early if there's a name match, skipping most `satisfies()` calls.
- [x] Second run through the `deferred()` list only runs if query spec is virtual.
- [x] Fix up handling of concrete specs.
- [x] Add test for querying virtuals in DB.
- [x] Avoid allocating deferred if not necessary.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-07 01:00:37 +00:00
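The deferral described above can be pictured with a short sketch (illustrative only, not Spack's actual `Database.query()` code; the `name`, `virtual`, and `satisfies()` attributes are assumed stand-ins for Spack's spec interface):

```python
def query(query_spec, all_specs):
    """Return specs matching query_spec, deferring expensive virtual checks.

    `query_spec` and the elements of `all_specs` are assumed to expose
    `.name`, `.virtual`, and `.satisfies()`; these names are illustrative.
    """
    results, deferred = [], []

    # First pass: only consider specs whose name matches the query.
    # When names match, the query spec cannot be a virtual of the target,
    # so no (expensive) virtual lookup is needed.
    for spec in all_specs:
        if spec.name == query_spec.name:
            if spec.satisfies(query_spec):
                results.append(spec)
        else:
            deferred.append(spec)

    # Second pass: only if the query spec is virtual do we pay the cost of
    # checking the remaining, differently named specs.
    if query_spec.virtual:
        results.extend(s for s in deferred if s.satisfies(query_spec))

    return results
```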
Harmen Stoppels
4ce80b95f3 spack compiler find --[no]-mixed-toolchain (#40902)
Currently there's some hacky logic in the AppleClang compiler that makes
it also accept `gfortran` as a fortran compiler if `flang` is not found.

This is guarded by `if sys.platform` checks s.t. it only applies to
Darwin.

But on Linux the ability to detect mixed toolchains is highly requested
too, because it's rather annoying to run into a failed build of `openblas`
after dozens of minutes of compiling its dependencies, just because clang
doesn't have a Fortran compiler.

In particular in CI where the system compilers may change during system
updates, it's typically impossible to fix compilers in a hand-written
compilers.yaml config file: the config will almost certainly be outdated
sooner or later, and maintaining one config file per target machine and
writing logic to select the correct config is rather undesirable too.

---

This PR introduces a flag `spack compiler find --mixed-toolchain` that
fills out missing `fc` and `f77` entries in `clang` / `apple-clang` by
picking the best matching `gcc`.

It is enabled by default on macOS, but not on Linux, matching current
behavior of `spack compiler find`.

The "best matching gcc" logic and compiler path updates are identical to
how compiler path dictionaries are currently flattened "horizontally"
(per compiler id). This just adds logic to do the same "vertically"
(across different compiler ids).

So, with this change on Ubuntu 22.04:

```
$ spack compiler find --mixed-toolchain
==> Added 6 new compilers to /home/harmen/.spack/linux/compilers.yaml
    gcc@13.1.0  gcc@12.3.0  gcc@11.4.0  gcc@10.5.0  clang@16.0.0  clang@15.0.7
==> Compilers are defined in the following files:
    /home/harmen/.spack/linux/compilers.yaml

```

you finally get:

```
compilers:
- compiler:
    spec: clang@=15.0.7
    paths:
      cc: /usr/bin/clang
      cxx: /usr/bin/clang++
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    flags: {}
    operating_system: ubuntu23.04
    target: x86_64
    modules: []
    environment: {}
    extra_rpaths: []
- compiler:
    spec: clang@=16.0.0
    paths:
      cc: /usr/bin/clang-16
      cxx: /usr/bin/clang++-16
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    flags: {}
    operating_system: ubuntu23.04
    target: x86_64
    modules: []
    environment: {}
    extra_rpaths: []
```

The "best gcc" is automatically default system gcc, since it has no
suffixes / prefixes.
2023-11-06 15:17:31 -08:00
Sinan
8f1f9048ec package/qgis: add latest ltr (#40752)
* package/qgis: add latest ltr

* fix bug

* [@spackbot] updating style on behalf of Sinan81

* make flake happy

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2023-11-06 15:55:20 -07:00
Harmen Stoppels
e7372a54a1 docs: expand section about relocation, suggest padding (#40909) 2023-11-06 14:49:54 -08:00
Michael Kuhn
5074b7e922 Add support for aliases (#17229)
Add a new config section: `config:aliases`, which is a dictionary mapping aliases
to commands.

For instance:


```yaml
config:
    aliases:
        sp: spec -I
```

will define a new command `sp` that will execute `spec` with the `-I`
argument. 

Aliases cannot override existing commands, and this is ensured with a test.

We cannot currently alias subcommands. Spack will warn about any aliases
containing a space, but will not error, which leaves room for subcommand
aliases in the future.

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-11-06 14:37:46 -08:00
Harmen Stoppels
461eb944bd Don't let runtime env variables of compiler like deps leak into the build environment (#40916)
* Test that setup_run_environment changes to CC/CXX/FC/F77 are dropped in build env

* compilers set in run env shouldn't impact build

Adds `drop` to EnvironmentModifications courtesy of @haampie, and uses
it to clear modifications of CC, CXX, F77 and FC made by
`setup_{,dependent_}run_environment` routines when producing an
environment in BUILD context.

* comment / style

* comment

---------

Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-06 14:30:27 -08:00
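A rough sketch of the idea, using simplified stand-ins rather than Spack's real `EnvironmentModifications` API: in BUILD context, any run-environment modification that targets a compiler variable is dropped before being applied.

```python
# Minimal sketch, not Spack's real API: modifications collected from
# setup_run_environment hooks, modeled as (variable, action, value) tuples.
COMPILER_VARS = {"CC", "CXX", "F77", "FC"}

def drop(modifications, names):
    """Return the modifications with any entry touching `names` removed."""
    return [m for m in modifications if m[0] not in names]

run_env_mods = [
    ("CC", "set", "/opt/vendor/bin/cc"),   # set by a compiler-like dependency
    ("PATH", "prepend", "/opt/vendor/bin"),
]

# In BUILD context, keep PATH changes but drop CC/CXX/F77/FC overrides,
# so a dependency's run-time compiler choice cannot leak into the build.
build_env_mods = drop(run_env_mods, COMPILER_VARS)
assert build_env_mods == [("PATH", "prepend", "/opt/vendor/bin")]
```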
Harmen Stoppels
4700108b5b fix prefix_inspections keys in example (#40904) 2023-11-06 13:22:13 -08:00
Harmen Stoppels
3384181868 docs: mention public build cache for GHA (#40908) 2023-11-06 13:21:16 -08:00
Vicente Bolea
f0f6e54b29 adios2: add v2.9.2 release (#40832) 2023-11-06 12:15:29 -08:00
Harmen Stoppels
a2f00886e9 defaults/modules.yaml: hide implicits (#40906) 2023-11-06 10:37:29 -08:00
Harmen Stoppels
1235084c20 Introduce default_args context manager (#39964)
This adds a rather trivial context manager that lets you deduplicate repeated
arguments in directives, e.g.

```python
depends_on("py-x@1", when="@1", type=("build", "run"))
depends_on("py-x@2", when="@2", type=("build", "run"))
depends_on("py-x@3", when="@3", type=("build", "run"))
depends_on("py-x@4", when="@4", type=("build", "run"))
```

can be condensed to

```python
with default_args(type=("build", "run")):
    depends_on("py-x@1", when="@1")
    depends_on("py-x@2", when="@2")
    depends_on("py-x@3", when="@3")
    depends_on("py-x@4", when="@4")
```

The advantage is that it's clearer for humans; the downside is that it's less clear for type checkers due to type erasure.
2023-11-06 10:22:29 -08:00
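For illustration, a plain-Python context manager with similar behavior could look like the sketch below. This is not Spack's implementation of `default_args`, which hooks into the directive machinery; the wrapper-returning design here is an assumption made to keep the example self-contained.

```python
import contextlib
import functools

@contextlib.contextmanager
def default_args(func, **defaults):
    """Yield a wrapper around `func` that fills in missing keyword arguments."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Explicitly passed kwargs win over the defaults.
        return func(*args, **{**defaults, **kwargs})
    yield wrapper

# Toy directive that just records what it was called with.
calls = []
def depends_on(spec, **kwargs):
    calls.append((spec, kwargs))

with default_args(depends_on, type=("build", "run")) as dep:
    dep("py-x@1", when="@1")
    dep("py-x@2", when="@2")

assert calls == [
    ("py-x@1", {"type": ("build", "run"), "when": "@1"}),
    ("py-x@2", {"type": ("build", "run"), "when": "@2"}),
]
```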
Greg Becker
b5538960c3 error messages: condition chaining (#40173)
Create chains of causation for error messages.

The current implementation is only completed for some of the many errors presented by the concretizer. The rest will need to be filled out over time, but this demonstrates the capability.

The basic idea is to associate conditions in the solver with one another in causal relationships, and to associate errors with the proximate causes of their facts in the condition graph. Then we can construct causal trees to explain errors, which will hopefully present users with useful information to avoid the error or report issues.

Technically, this is implemented as a secondary solve. The concretizer computes the optimal model, and if the optimal model contains an error, then a secondary solve computes causation information about the error(s) in the concretizer output.

Examples:

$ spack solve hdf5 ^cmake@3.0.1
==> Error: concretization failed for the following reasons:

   1. Cannot satisfy 'cmake@3.0.1'
   2. Cannot satisfy 'cmake@3.0.1'
        required because hdf5 ^cmake@3.0.1 requested from CLI 
   3. Cannot satisfy 'cmake@3.18:' and 'cmake@3.0.1
        required because hdf5 ^cmake@3.0.1 requested from CLI 
        required because hdf5 depends on cmake@3.18: when @1.13: 
          required because hdf5 ^cmake@3.0.1 requested from CLI 
   4. Cannot satisfy 'cmake@3.12:' and 'cmake@3.0.1
        required because hdf5 depends on cmake@3.12: 
          required because hdf5 ^cmake@3.0.1 requested from CLI 
        required because hdf5 ^cmake@3.0.1 requested from CLI

$ spack spec cmake ^curl~ldap   # <-- with curl configured non-buildable and an external with `+ldap`
==> Error: concretization failed for the following reasons:

   1. Attempted to use external for 'curl' which does not satisfy any configured external spec
   2. Attempted to build package curl which is not buildable and does not have a satisfying external
        attr('variant_value', 'curl', 'ldap', 'True') is an external constraint for curl which was not satisfied
   3. Attempted to build package curl which is not buildable and does not have a satisfying external
        attr('variant_value', 'curl', 'gssapi', 'True') is an external constraint for curl which was not satisfied
   4. Attempted to build package curl which is not buildable and does not have a satisfying external
        'curl+ldap' is an external constraint for curl which was not satisfied
        'curl~ldap' required
        required because cmake ^curl~ldap requested from CLI 

$ spack solve yambo+mpi ^hdf5~mpi
==> Error: concretization failed for the following reasons:

   1. 'hdf5' required multiple values for single-valued variant 'mpi'
   2. 'hdf5' required multiple values for single-valued variant 'mpi'
    Requested '~mpi' and '+mpi'
        required because yambo depends on hdf5+mpi when +mpi 
          required because yambo+mpi ^hdf5~mpi requested from CLI 
        required because yambo+mpi ^hdf5~mpi requested from CLI 
   3. 'hdf5' required multiple values for single-valued variant 'mpi'
    Requested '~mpi' and '+mpi'
        required because netcdf-c depends on hdf5+mpi when +mpi 
          required because netcdf-fortran depends on netcdf-c 
            required because yambo depends on netcdf-fortran 
              required because yambo+mpi ^hdf5~mpi requested from CLI 
          required because netcdf-fortran depends on netcdf-c@4.7.4: when @4.5.3: 
            required because yambo depends on netcdf-fortran 
              required because yambo+mpi ^hdf5~mpi requested from CLI 
          required because yambo depends on netcdf-c 
            required because yambo+mpi ^hdf5~mpi requested from CLI 
          required because yambo depends on netcdf-c+mpi when +mpi 
            required because yambo+mpi ^hdf5~mpi requested from CLI 
        required because yambo+mpi ^hdf5~mpi requested from CLI 

Future work:

In addition to fleshing out the causes of other errors, I would like to find a way to associate different components of the error messages with different causes. In this example it's pretty easy to infer which part is which, but I'm not confident that will always be the case. 

See the previous PR #34500 for discussion of how the condition chains are incomplete. In the future, we may need custom logic for individual attributes to associate some important choice rules with conditions such that clingo choices or other derivations can be part of the explanation.

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-06 09:55:21 -08:00
Michael Kuhn
d3d82e8d6b c-blosc2: add v2.11.1 (#40889) 2023-11-06 09:48:42 -08:00
Tamara Dahlgren
17a9198c78 Environments: remove environments created with SpackYAMLErrors (#40878) 2023-11-06 18:48:28 +01:00
Juan Miguel Carceller
c6c689be28 pythia8: fix configure args (#40644)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-06 09:33:23 -08:00
AMD Toolchain Support
ab563c09d2 enable threading in amdlibflame (#40852)
Co-authored-by: vkallesh <Vijay-teekinavar.Kallesh@amd.com>
2023-11-06 09:20:19 -08:00
Sergio Sánchez Ramírez
abdac36fd5 Add Python as build dependency of Julia (#40903) 2023-11-06 09:03:38 -07:00
Harmen Stoppels
b8a18f0a78 mpich: remove unnecessary tuples and upperbounds (#40899)
* mpich: remove unnecessary tuples

* remove redundant :3.3.99 upperbound
2023-11-06 07:58:50 -07:00
Wouter Deconinck
17656b2ea0 qt: new version 5.15.11 (#40884)
* qt: new version 5.15.11

* qt: open end patch for qtlocation when gcc-10:
2023-11-06 06:08:19 -07:00
Harmen Stoppels
3c641c8509 spack env activate: create & activate default environment without args (#40756)
This PR implements the concept of "default environment", which doesn't have to be
created explicitly. The aim is to lower the barrier for adopting environments.

To (create and) activate the default environment, run

```
$ spack env activate
```

This mimics the behavior of

```
$ cd
```

which brings you to your home directory.

This is not a breaking change, since `spack env activate` without arguments
currently errors. It is similar to the already existing `spack env activate --temp`
command which always creates an env in a temporary directory, the difference
is that the default environment is a managed / named environment named `default`.

The name `default` is not a reserved name, it's just that `spack env activate`
creates it for you if you don't have it already.

With this change, you can get started with environments faster:

```
$ spack env activate [--prompt]
$ spack install --add x y z
```

instead of

```
$ spack env create default
==> Created environment 'default' in /Users/harmenstoppels/spack/var/spack/environments/default
==> You can activate this environment with:
==>   spack env activate default
$ spack env activate [--prompt] default 
$ spack install --add x y z
```

Notice that Spack supports switching (but not stacking) environments, so the
parallel with `cd` is pretty clear:

```
$ spack env activate named_env
$ spack env status
==> In environment named_env
$ spack env activate
$ spack env status
==> In environment default
```
2023-11-05 22:53:26 -08:00
Michael Kuhn
141c7de5d8 Add command and package suggestions (#40895)
* Add command suggestions

This adds suggestions of similar commands in case users mistype a
command. Before:
```
$ spack spack
==> Error: spack is not a recognized Spack command or extension command; check with `spack commands`.
```
After:
```
$ spack spack
==> Error: spack is not a recognized Spack command or extension command; check with `spack commands`.

Did you mean one of the following commands?
  spec
  patch
```

* Add package name suggestions

* Remove suggestion to run spack clean -m
2023-11-05 14:32:09 -08:00
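Such suggestions can be computed with the standard library's `difflib`; the snippet below is a minimal sketch of the approach, not the exact code wired into Spack's error handling.

```python
import difflib

def suggest(unknown, known_commands, n=3, cutoff=0.6):
    """Return up to `n` known commands that look similar to `unknown`."""
    return difflib.get_close_matches(unknown, known_commands, n=n, cutoff=cutoff)

commands = ["spec", "patch", "install", "concretize", "env"]
print(suggest("spack", commands))  # e.g. ['spec', 'patch'], depending on cutoff
```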
Todd Gamblin
f6b23b4653 bugfix: compress aliases for first command in completion (#40890)
This completes to `spack concretize`:

```
spack conc<tab>
```

but this still gets hung up on the difference between `concretize` and `concretise`:

```
spack -e . conc<tab>
```

We were checking `"$COMP_CWORD" = 1`, which tracks the word on the command line
including any flags and their args, but we should track `"$COMP_CWORD_NO_FLAGS" = 1` to
figure out if the arg we're completing is the first real command.
2023-11-05 10:15:37 +00:00
Harmen Stoppels
4755b28398 Hidden modules: always append hash (#40868) 2023-11-05 08:56:11 +01:00
Tamara Dahlgren
c9dfb9b0fd Environments: Add support for including definitions files (#33960)
This PR adds support for including separate definitions from `spack.yaml`.

Supporting the inclusion of files with definitions enables users to make
curated/standardized collections of packages that can be re-used by others.
2023-11-05 00:47:06 -07:00
Veselin Dobrev
5a67c578b7 mfem: allow cuda/rocm builds with superlu-dist built without cuda/rocm (#40847) 2023-11-04 20:15:56 -05:00
Michael Kuhn
e47be18acb c-blosc: add v1.21.5 (#40888) 2023-11-04 16:51:37 -07:00
Harmen Stoppels
6593d22c4e spack.modules.common: pass spec to SetupContext (#40886)
Currently module globals aren't set before running
`setup_[dependent_]run_environment` to compute environment modifications
for module files. This commit fixes that.
2023-11-04 20:42:47 +00:00
Massimiliano Culpo
f51dad976e hdf5-vol-async: better specify dependency condition (#40882) 2023-11-04 20:31:52 +01:00
Cameron Rutherford
ff8cd597e0 hiop: fix cuda constraints (#40875) 2023-11-04 13:09:59 -05:00
eugeneswalker
fd22d109a6 sundials +sycl: add cxxflags=-fsycl via flag_handler (#40845) 2023-11-04 08:55:19 -05:00
zv-io
88ee3a0fba linux-headers: support multiple versions (#40877)
The download URL for linux-headers was hardcoded to 4.x;
we need to derive the correct URL from the version number.
2023-11-04 12:21:12 +01:00
Massimiliano Culpo
f50377de7f environment: solve one spec per child process (#40876)
Looking at the memory profiles of concurrent solves for environments with
unify:false, it seems memory only ramps up.

This exchange on the potassco mailing list:
 https://sourceforge.net/p/potassco/mailman/potassco-users/thread/b55b5b8c2e8945409abb3fa3c935c27e%40lohn.at/#msg36517698

seems to suggest that clingo doesn't release memory
until the end of the application.

Since with unify:false we distribute work across child processes, we use
maxtasksperchild=1 here, so memory is reclaimed after each solve.
2023-11-03 23:10:42 +00:00
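The relevant standard-library knob is `maxtasksperchild`. A self-contained sketch of the pattern, with a placeholder `solve` function standing in for the concretizer:

```python
import multiprocessing

def solve(spec_str):
    """Placeholder for one concretization; returns something picklable."""
    return f"concretized({spec_str})"

if __name__ == "__main__":
    specs = ["hdf5+mpi", "zlib", "openmpi"]
    # maxtasksperchild=1 makes each worker process exit after a single task,
    # so any memory clingo (or anything else) holds onto is released by the
    # OS when the process dies, rather than accumulating across solves.
    with multiprocessing.Pool(processes=2, maxtasksperchild=1) as pool:
        results = pool.map(solve, specs)
    print(results)
```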
Adam J. Stewart
8e96d3a051 GDAL: add v3.7.3 (#40865) 2023-11-03 22:59:52 +01:00
Richarda Butler
8fc1ba2d7a Bugfix: propagation of multivalued variants (#39833)
Don't encourage use of default value if propagating a multivalued variant.
2023-11-03 12:09:39 -07:00
Massimiliano Culpo
668a5b45e5 clingo-bootstrap: force setuptools through variant (#40866) 2023-11-03 16:53:45 +01:00
Andrew W Elble
70171d6caf squashfuse: remove url_for_version (#40862)
0.5.0 tarball now has the 'v' removed from the name
2023-11-03 10:34:25 -04:00
Thomas-Ulrich
0f1898c82a xdmf3: fix compilation with hdf5@1.10 and above (#37551) 2023-11-03 14:23:49 +01:00
Massimiliano Culpo
db16335aec ASP-based solver: fix for unsplittable providers (#40859)
Some providers must provide virtuals "together", i.e.
if they provide one virtual of a set, they must also be
the providers of the others.

There was a bug though, where we were not checking if
the other virtuals in the set were needed at all in
the DAG.

This commit fixes the bug.
2023-11-03 12:56:37 +01:00
Harmen Stoppels
3082ce6a22 oci parsing: make image name case insensitive (#40858) 2023-11-03 12:50:30 +01:00
George Young
fe0cf80e05 py-spython: updating to @0.3.1 (#40839)
* py-spython: updating to @0.3.1

* Adding `when=` for py-semver

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-11-03 05:07:58 -06:00
Thomas-Ulrich
a5e6097af7 fix typo in packaging guide (#40853) 2023-11-03 09:56:13 +01:00
eugeneswalker
d4a1618e07 tau: update 2.33 hash, add syscall variant (#40851)
Co-authored-by: wspear <wjspear@gmail.com>
2023-11-03 07:58:00 +01:00
Veselin Dobrev
48a21970d1 MFEM: add logic to find CUDA math-libs when using HPC SDK installation (#40815)
* mfem: add logic to find CUDA math-libs when using HPC SDK installation

* [@spackbot] updating style on behalf of v-dobrev
2023-11-02 20:19:11 -07:00
Martin Aumüller
864d47043c qt-svg: new package for Qt6 SVG module (#40834)
enables loading of SVG icons by providing plugin used by qt-base
2023-11-02 17:05:54 -07:00
Martin Aumüller
c2af2bcac3 qt-*: add v6.5.3 & v6.6.0 (#40833) 2023-11-02 19:52:15 -04:00
Martin Aumüller
7c79c744b6 libtheora: fix build on macos (#40840)
* libtheora: regenerate Makefile.in during autoreconf

The patch to inhibit running of configure would exit autogen.sh so early
that it did not yet run autoconf/automake/...
Instead of patching autogen.sh, just pass -V as argument, as this is
passed on to configure and lets it just print its version instead of
configuring the build tree.

Also drop arguments from autogen.sh, as they are unused when configure
does not run.

* libtheora: fix build on macos

Apply upstream patches in order to avoid unresolved symbols during building of libtheoraenc.
These patches require re-running automake/autoconf/...

Error messages:
libtool: link: /Users/ma/git/spack/lib/spack/env/clang/clang -dynamiclib  -o .libs/libtheoraenc.1.dylib  .libs/apiwrapper.o .libs/fragment.o .libs/idct.o .libs/internal.o .libs/state.o .libs/quant.o .libs/analyze.o .libs/fdct.o .libs/encfrag.o .libs/encapiwrapper.o .libs/encinfo.o .libs/encode.o .libs/enquant.o .libs/huffenc.o .libs/mathops.o .libs/mcenc.o .libs/rate.o .libs/tokenize.o   -L/opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib -logg -lm  -Wl,-exported_symbols_list -Wl,/var/folders/zv/qr55pmd9065glf0mcltpx5bm000102/T/ma/spack-stage/spack-stage-libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/spack-src/lib/theoraenc.exp   -install_name  /opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib/libtheoraenc.1.dylib -compatibility_version 3 -current_version 3.2
ld: warning: search path '/opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib' not found
ld: Undefined symbols:
  _th_comment_add, referenced from:
      _theora_comment_add in apiwrapper.o
  _th_comment_add_tag, referenced from:
      _theora_comment_add_tag in apiwrapper.o
  _th_comment_clear, referenced from:
      _theora_comment_clear in apiwrapper.o
  _th_comment_init, referenced from:
      _theora_comment_init in apiwrapper.o
  _th_comment_query, referenced from:
      _theora_comment_query in apiwrapper.o
  _th_comment_query_count, referenced from:
      _theora_comment_query_count in apiwrapper.o

* libtheora: add git versions

the version name "stable" was chosen for the theora-1.1 branch so that it sorts between 1.1.x and master

* libtheora: remove unused patch

thanks to @michaelkuhn for noticing
2023-11-03 00:08:22 +01:00
garylawson
94d143763e Update Anaconda3 -- add version 2023.09-0 for x86_64, aarch64, and ppc64le (#40622)
* Add 2023.09-0 for x86_64, aarch64, and ppc64le
   Extend the anaconda3 package.py to support aarch64 and ppc64le.
   Add the latest version of anaconda3 to each new platform, including the existing x86_64.
* formatting
2023-11-02 16:42:44 -06:00
Vanessasaurus
6f9425c593 Automated deployment to update package flux-sched 2023-10-18 (#40596)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-02 13:16:39 -07:00
Nicolas Cornu
05953e4491 highfive: 2.8.0 (#40837)
Co-authored-by: Nicolas Cornu <me@alkino.fr>
2023-11-02 14:03:44 -06:00
Sergey Kosukhin
6b236f130c eccodes: rename variant 'definitions' to 'extra_definitions' (#36186) 2023-11-02 13:28:31 -06:00
Greg Becker
fa08de669e bugfix: computing NodeID2 in requirement node_flag_source (#40846) 2023-11-02 20:17:54 +01:00
Seth R. Johnson
c2193b5470 py-pint: new versions 0.21, 0.22 (#40745)
* py-pint: new versions 0.21, 0.22

* Address feedback

* Fix dumb typo

* Add typing extension requirement
2023-11-02 14:13:19 -05:00
Chris Richardson
b5b94d89d3 Update to latest version (#40778) 2023-11-02 14:07:44 -05:00
vucoda
dd57b58c2f py-pyside2: fix to build with newer llvm and to use spack install headers (#40544)
* Fix py-pyside2 to build with newer llvm and to use spack libglx and libxcb headers where system headers are missing

pyside2 needs LLVM_INSTALL_DIR to be set when using llvm 11: and expects system headers for libglx and libxcb and won't build otherwise.

* Fix styling

* remove raw string type

* Update var/spack/repos/builtin/packages/py-pyside2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-02 14:03:18 -05:00
Chris Richardson
29a30963b3 Fixes to ffcx @0.6.0 (#40787) 2023-11-02 14:02:07 -05:00
Jordan Ogas
3447e425f0 add charliecloud 0.35 (#40842)
* add charliecloud 0.35
* fix linter rage
* fix linter rage?
2023-11-02 11:23:49 -07:00
Juan Miguel Carceller
518da16833 Gaudi: Add a few versions and a dependency on tbb after 37.1 (#40802)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-02 11:15:27 -07:00
Paul R. C. Kent
4633327e60 llvm: add 17.0.2-4 (#40820) 2023-11-02 17:00:35 +01:00
Harmen Stoppels
6930176ac6 clingo ^python@3.12: revisit distutils fix (#40844) 2023-11-02 16:48:21 +01:00
Adam J. Stewart
bb64b22066 PyTorch: build with external sleef (#40763)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-11-02 16:09:49 +01:00
Harmen Stoppels
8b0ab67de4 depfile: deal with empty / non-concrete env (#40816) 2023-11-02 16:04:35 +01:00
Satish Balay
dbf21bf843 exago: update petsc dependency (#40831) 2023-11-02 07:29:37 -07:00
Harmen Stoppels
af3a29596e go/rust bootstrap: no versions if unsupported arch (#40841)
A dictionary lookup caused a KeyError at package load time for
unsupported architectures such as i386 and big-endian ppc.
2023-11-02 08:13:13 -06:00
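The general shape of such a fix (a hedged sketch with hypothetical data, not the exact package code) is to replace direct dictionary indexing with a graceful fallback:

```python
# Hypothetical table of bootstrap binaries keyed by architecture.
BOOTSTRAP_SHA256 = {
    "x86_64": "abc123...",
    "aarch64": "def456...",
}

def versions_for(arch):
    # BOOTSTRAP_SHA256[arch] would raise KeyError on e.g. "i386" or "ppc"
    # at package-load time; .get() lets unsupported arches simply offer
    # no bootstrap versions instead of breaking `spack` commands entirely.
    sha = BOOTSTRAP_SHA256.get(arch)
    return [] if sha is None else [("1.0.0", sha)]

print(versions_for("i386"))    # []
print(versions_for("x86_64"))  # [('1.0.0', 'abc123...')]
```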
Harmen Stoppels
80944d22f7 spack external find: fix multi-arch troubles (#33973) 2023-11-02 09:45:31 +01:00
Tamara Dahlgren
f56efaff3e env remove: add a unit test removing two environments (#40814) 2023-11-02 08:51:08 +01:00
Martin Aumüller
83bb2002b4 openscenegraph: support more file formats (#39897) 2023-11-02 08:41:03 +01:00
Massimiliano Culpo
16fa3b9f07 Cherry-picking virtual dependencies (#35322)
This PR makes it possible to select only a subset of virtual dependencies from a spec that _may_ provide more. To select providers, a syntax to specify edge attributes is introduced:
```
hdf5 ^[virtuals=mpi] mpich
```
With that syntax we can concretize specs like:
```console
$ spack spec strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas
```

On `develop` this would currently fail with:
```console
$ spack spec strumpack ^intel-parallel-studio+mkl ^openblas
==> Error: Spec cannot include multiple providers for virtual 'blas'
    Requested 'intel-parallel-studio' and 'openblas'
```

In package recipes, virtual specs that are declared in the same `provides` directive need to be provided _together_. This means that e.g. `openblas`, which has:
```python
provides("blas", "lapack")
```
needs to provide both `lapack` and `blas` when requested to provide at least one of them.

## Additional notes

This capability is needed to model compilers. Assuming that languages are treated like virtual dependencies, we might want e.g. to use LLVM to compile C/C++ and Gnu GCC to compile Fortran. This can be accomplished by the following[^1]:
```
hdf5 ^[virtuals=c,cxx] llvm ^[virtuals=fortran] gcc
```

[^1]: We plan to add some syntactic sugar around this syntax, and reuse the `%` sigil to avoid having a lot of boilerplate around compilers.

Modifications:
- [x] Add syntax to interact with edge attributes from spec literals
- [x] Add concretization logic to be able to cherry-pick virtual dependencies
- [x] Extend semantic of the `provides` directive to express when virtuals need to be provided together
- [x] Add unit-tests and documentation
2023-11-01 23:35:23 -07:00
Thomas Madlener
6cd2241e49 edm4hep: Add 0.10.1 tag and update maintainers (#40829)
* edm4hep: add latest tag
* edm4hep: Add myself as maintainer
2023-11-01 23:04:00 -06:00
snehring
6af45230b4 ceres-solver: adding version 2.2.0 (#40824)
* ceres-solver: adding version 2.2.0
* ceres-solver: adding suite-sparse dep
2023-11-01 17:47:55 -07:00
snehring
a8285f0eec vcftools: add v0.1.16 (#40805)
* vcftools: adding new version 0.1.16

* Update var/spack/repos/builtin/packages/vcftools/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-01 16:33:12 -07:00
Adam J. Stewart
e7456e1aab py-matplotlib: add v3.8.1 (#40819) 2023-11-01 16:33:00 -07:00
Jeremy L Thompson
dd636dd3fb libCEED v0.12.0, Ratel v0.3.0 (#40822)
* ratel - add v0.3.0
* libceed - add version 0.12.0
2023-11-01 16:29:18 -07:00
Mikael Simberg
a73c95b734 pika: Add 0.20.0 (#40817) 2023-11-01 17:19:56 -06:00
Miroslav Stoyanov
33b355a085 heffte: add v2.4.0 (#40741)
* update the heffte versions

* remove obsolete patch files

* update testing

* style

* restore version (unknown reason)

* restore old patch

* change the syntax

* [@spackbot] updating style on behalf of mkstoyanov

* missed one

* style
2023-11-01 16:54:11 -06:00
Satish Balay
f7630f265b pflotran: add version 5.0.0 (#40828)
alquimia: add version 1.1.0
And fix alquimia@master
2023-11-01 15:16:04 -07:00
dependabot[bot]
9744e86d02 build(deps): bump black in /.github/workflows/style (#40681)
Bumps [black](https://github.com/psf/black) from 23.9.1 to 23.10.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.9.1...23.10.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-01 14:20:29 -07:00
Harmen Stoppels
ff6bbf03a1 changelog: add 0.20.2 and 0.20.3 changes (#40818) 2023-11-01 22:09:11 +01:00
Cameron Rutherford
0767c8673e hiop: fix cuda constraints and add tag to versions (#40721)
* hiop: fix cuda constraints and add tag to versions

* hiop: fix styling
2023-11-01 13:21:14 -07:00
Satish Balay
9aa75eaf65 superlu-dist: -std=c99 prevents usage of putenv() (#40729) 2023-11-01 12:44:13 -07:00
Weiqun Zhang
73f012b999 amrex: add v23.11 (#40821) 2023-11-01 12:38:02 -07:00
Satish Balay
c7a8a83cbf petsc, py-petsc4py: add v3.20.1 (#40794) 2023-11-01 12:37:53 -07:00
Satish Balay
5f87db98ea butterflypack: add version 2.4.0 (#40826) 2023-11-01 12:20:13 -07:00
Brian Van Essen
d05dc8a468 LBANN: add explicit variant for shared builds (#40808) 2023-11-01 13:18:57 -06:00
wspear
afa2a2566e Add 2.33 to tau (#40810) 2023-11-01 12:10:35 -07:00
Thomas Madlener
581f45b639 podio: Add latest tags and variants and update dependencies accordingly (#40182)
* Make sure sio is in dependent build env for podio

* podio: Fix likely(?) typo in root dependency

* podio: Add latest tag and new variants + dependencies

* podio: Add v00-16-07 tag

* podio: Fix dependencies flagged by package audit

* podio: Simplify root dependency

* podio: Add 0.17.1 tag
2023-11-01 13:44:11 -05:00
Bilal Mirza
92780a9af6 fix: sentence framing (#40809) 2023-11-01 11:41:37 +01:00
Harmen Stoppels
2ea8e6c820 Executable.add_default_arg: multiple (#40801) 2023-11-01 09:14:37 +01:00
Harmen Stoppels
ac976a4bf4 Parser: fix ambiguity with whitespace in version ranges (#40344)
Allowing white space around `:` in version ranges introduces an ambiguity:

```
a@1: b
```

parses as `a@1:b` but should really be parsed as two separate specs `a@1:` and `b`.

With white space disallowed around `:` in ranges, the ambiguity is resolved.
2023-11-01 09:08:57 +01:00
Harmen Stoppels
e5f3ffc04f SetupContext.get_env_modifications fixes and documentation (#40683)
Call setup_dependent_run_environment on both link and run edges,
instead of only run edges, which restores old behavior.

Move setup_build_environment into get_env_modifications

Also call setup_run_environment on direct build deps, since their run
environment has to be set up.
2023-11-01 08:47:15 +01:00
Harmen Stoppels
7aaed4d6f3 Revert python build isolation & setuptools source install (#40796)
* Revert "Improve build isolation in PythonPipBuilder (#40224)"

This reverts commit 0f43074f3e.

* Revert "py-setuptools: sdist + rpath patch backport (#40205)"

This reverts commit 512e41a84a.
2023-11-01 07:10:34 +01:00
Tamara Dahlgren
f5d717cd5a Fix env remove indentation (#40811) 2023-11-01 00:08:46 -06:00
Sreenivasa Murthy Kolam
cb018fd7eb Enable address sanitizer in rocm's llvm-amdgpu package. (#40570)
* enable address sanitizer in rocm's llvm-amdgpu package
* remove references to 5.7.0 for now
* fix style error
* address review comments
2023-10-31 19:09:40 -06:00
Luisa Burini
e5cebb6b6f fix create/remove env with invalid spack.yaml (#39898)
* fix create/remove env with invalid spack.yaml
* fix isort error
* fix env ident unittests
* Fix pull request points
2023-10-31 15:39:42 -07:00
Patrick Bridges
4738b45fb1 beatnik: small changes for v1.0 (#40726)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-31 22:28:48 +01:00
Harmen Stoppels
343ed8a3fa force color in subshell if not SPACK_COLOR (#40782) 2023-10-31 22:27:00 +01:00
Adam J. Stewart
58e5315089 PyTorch: build with external gloo (#40759)
* PyTorch: build with external gloo

* Fix gloo compilation with GCC 11

* undeprecate

* py-torch+cuda+gloo requires gloo+cuda
2023-10-31 16:25:24 -05:00
Samuel Li
26649e71f9 Update sperr (#40626)
* update SPERR package
* remove blank line
* update SPERR to be version 0.7.1
* a little clean up
* bound versions that require zstd
* add USE_ZSTD
* add libpressio-sperr version upbound
* update libpressio-sperr
* address review comments
* improve format

---------

Co-authored-by: Samuel Li <Sam@Navada>
Co-authored-by: Samuel Li <sam@cisl-m121a>
2023-10-31 13:53:09 -07:00
Peter Scheibel
2f2d9ae30d Fix cflags requirements (#40639) 2023-10-31 21:19:12 +01:00
jalcaraz
f9c0a15ba0 TAU: Added dyninst variant (#40790)
* Added dyninst variant

* Added dyninst variant and fixed some issues

* Update package.py

* Removed whitespace

* Update package.py

* Update package.py

* Fixed conflicting version

---------

Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
2023-10-31 13:28:16 -06:00
Sreenivasa Murthy Kolam
14cb923dd8 add new recipe for rocm packages- amdsmi (#39270)
* add new recipe for rocm packages- amdsmilib
* update tags,maintainers list
2023-10-31 10:18:32 -07:00
Massimiliano Culpo
544a121248 Fix interaction of spec literals that propagate variants with unify:false (#40789)
* Add tests to ensure variant propagation syntax can round-trip to/from string

* Add a regression test for the bug in 35298

* Reconstruct the spec constraints in the worker process

Specs do not preserve any information on propagation of variants
when round-tripping to/from JSON (which we use to pickle), but
preserve it when round-tripping to/from strings.

Therefore, we pass a spec literal to the worker and reconstruct
the Spec objects there.
2023-10-31 17:50:13 +01:00
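The string round-trip can be pictured with a toy model; `parse_spec` below is a hypothetical stand-in for Spack's parser, shown only to illustrate why the literal (which keeps the `++`/`~~` propagation sigils) is what gets sent to the worker:

```python
def parse_spec(literal):
    """Stand-in for Spack's spec parser (illustrative only).

    The string form keeps the '++'/'~~' propagation sigils, whereas the
    JSON form previously used for pickling did not record propagation.
    """
    return {"literal": literal, "propagates": "++" in literal or "~~" in literal}

def worker(spec_literal):
    # The parent sends the *literal*; the Spec-like object is rebuilt here,
    # inside the child process, so nothing is lost in transit.
    spec = parse_spec(spec_literal)
    return spec["propagates"]

assert worker("hdf5++mpi") is True
assert worker("zlib") is False
```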
Harmen Stoppels
cd6bb9e159 spack checksum: improve signature (#40800) 2023-10-31 16:52:53 +01:00
Greg Sjaardema
e420a685a9 Seacas: Update for latest seacas releaes version (#40698) 2023-10-31 09:38:20 -06:00
Harmen Stoppels
40a5c1ff2d spack checksum: fix error when initial filter yields empty list (#40799) 2023-10-31 15:08:41 +01:00
Harmen Stoppels
6933e1c3cb ci: bump tutorial image and toolchain (#40795) 2023-10-31 12:58:33 +01:00
Harmen Stoppels
160bfd881d tutorial: replace zlib -> gmake to avoid deprecated versions (#40769) 2023-10-31 10:04:53 +01:00
G-Ragghianti
81997ae6d6 Added NVML and cgroup support to the slurm package (#40638)
* Added NVML support to the slurm package
* dbus package is required for cgroup support
* Fixing formatting
* Style fix
* Added PAM support
* Added ROCm SMI support
2023-10-30 19:12:09 -07:00
Todd Gamblin
702a2250fa docs: update license() docs with examples and links (#40598)
- [x] Add links to information people are going to want to know when adding license
      information to their packages (namely OSI licenses and SPDX identifiers).
- [x] Update the packaging docs for `license()` with Spack as an example for `when=`.
      After all, it's a dual-licensed package that changed once in the past.
- [x] Add link to https://spdx.org/licenses/ in the `spack create` boilerplate as well.
2023-10-30 18:54:31 -07:00
Freifrau von Bleifrei
3a0f9ce226 selalib: add (sca)lapack dependency (#40667)
* selalib: add (sca)lapack dependency
* selalib: change when "-mpi" to "~mpi"
2023-10-30 18:28:52 -07:00
Thomas Madlener
a095c8113d dd4hep: Add tag for version 1.27 (#40776) 2023-10-30 17:55:33 -07:00
Larry Knox
4ef433b64d Add hdf5 version 1.14.3. (#40786)
Add hdf5 version 1.10.11.
Update version condition for adding h5pfc->h5fc symlink.  File h5pfc
exists in versions 1.10.10 and 1.10.22.
2023-10-30 17:22:55 -06:00
dependabot[bot]
f228c7cbcc build(deps): bump black from 23.9.1 to 23.10.1 in /lib/spack/docs (#40680)
Bumps [black](https://github.com/psf/black) from 23.9.1 to 23.10.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.9.1...23.10.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-31 00:11:53 +01:00
MatthewLieber
e9ca16ab07 adding sha for OMB 7.3 release (#40784)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2023-10-30 16:01:48 -07:00
Andrew W Elble
47ac2b8d09 squashfuse: add version 0.5.0 (#40775) 2023-10-30 11:33:22 -06:00
Harmen Stoppels
b1b8500eba ci: print colored specs in concretization progress (#40711) 2023-10-30 15:29:27 +01:00
Harmen Stoppels
060a1ff2f3 tty: flush immediately (#40774) 2023-10-30 15:07:30 +01:00
marcost2
9ed9a541c9 freesurfer: fix support for linux (#39864)
* Load the script file during environment setup so that all the environment variables are set properly
* Patch csh/tcsh so that spack's versions are used via env
* Update SHA for latest version
* Extend shebang to perl and fix up the regex
2023-10-30 14:19:42 +01:00
Alec Scott
1ebf1c8d1c must: remove release candidates (#40476) 2023-10-30 14:08:23 +01:00
SXS Bot
c2f3943e9e spectre: add v2023.10.11 (#40463)
Co-authored-by: nilsvu <nilsvu@users.noreply.github.com>
2023-10-30 13:56:05 +01:00
Brian Vanderwende
1ba530bff5 Get utilities necessary for successful PIO build (#40502) 2023-10-30 13:53:57 +01:00
RichardBuntLinaro
cc09e88a4a linaro-forge: add v23.0.4 (#40772) 2023-10-30 06:43:07 -06:00
Harmen Stoppels
2f3801196d binary_distribution.py: fix type annotation singleton (#40572)
Convince the language server it's really just a BinaryCacheIndex,
otherwise it defaults to thinking it's Singleton, and can't autocomplete
etc.
2023-10-30 12:52:47 +01:00
Juan Miguel Carceller
d03289c38b Fetch recola from gitlab and add a new version of collier (#40651)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-10-30 12:22:31 +01:00
kwryankrattiger
e720d8640a ISPC: Drop ncurses workaround in favor of patch (#39662)
ISPC had a bug in its lookup for NCurses; this was fixed upstream and
backported here.
2023-10-30 12:16:25 +01:00
Federico Ficarelli
00602cda4f pegtl: add v3.2.7 (#35687) 2023-10-30 12:12:20 +01:00
Alberto Sartori
35882130ce justbuild: add version 1.2.2 (#40701) 2023-10-30 12:09:42 +01:00
Brian Van Essen
1586c8c786 aluminum: make network variants "sticky" (#40715) 2023-10-30 11:26:24 +01:00
Wouter Deconinck
a9e78dc7d8 acts: new variant +binaries when +examples (#40738)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-10-30 10:40:31 +01:00
wspear
b53b235cff RAJA: add "plugins" variant (#40750) 2023-10-30 09:40:08 +01:00
Veselin Dobrev
33cb8c988f Fix an issue with using the environment variable MACHTYPE which is not always defined (#40733)
* Fix an issue reported here:
   https://github.com/spack/spack/pull/36154#issuecomment-1781854894

* [@spackbot] updating style on behalf of v-dobrev
2023-10-30 09:36:02 +01:00
Adam J. Stewart
6511d3dfff py-pandas: add v2.1.2 (#40734) 2023-10-30 03:32:48 -05:00
Adam J. Stewart
272ca0fc24 PyTorch: build with external fp16 (#40760) 2023-10-30 09:28:52 +01:00
Martin Aumüller
a8f42b865f pcl: checksum new versions (#39039) 2023-10-30 08:54:36 +01:00
Cameron Rutherford
7739c54eb5 exago: fix exago missing on PYTHONPATH when +python (#40748) 2023-10-30 08:35:36 +01:00
Veselin Dobrev
bd1bb7d1ba mfem: support petsc+rocm with spack-installed rocm (#40768) 2023-10-30 01:17:51 -06:00
Massimiliano Culpo
6983db1392 ASP-based solver: avoid cycles in clingo using hidden directive (#40720)
The code should be functionally equivalent to what it was before,
but now we avoid cycles by design by using a "hidden"
feature of clingo.
2023-10-30 07:38:53 +01:00
Wouter Deconinck
2a797f90b4 acts: add v28.1.0:30.3.2 (#40723)
* acts: new version from 28.1.0 to 30.3.1

* acts: new version 30.3.2

* acts: new variant +podio
2023-10-29 18:01:27 -07:00
Harmen Stoppels
2e097b4cbd py-numcodecs: fix broken sse / avx2 variables (#40754) 2023-10-29 13:45:23 -05:00
Aoba
a1282337c0 Add liggght patched for newer compiler (#38685)
* Add liggght patched for newer compiler

Add C++ 17 support
Add Clang and Oneapi support

* Add maintainers

* Fix format in liggghts

* Fix maintainers before versions

Co-authored-by: Alec Scott <alec@bcs.sh>

* Fix style and user to usr

* Update package.py

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-10-29 09:56:27 -07:00
Jerome Soumagne
361d973f97 mercury: add v2.3.1 (#40749) 2023-10-28 10:05:50 -07:00
Lydéric Debusschère
64ec6e7d8e py-moarchiving: new package (#40558)
* [add] py-moarchiving: new package

* py-moarchiving: update from review: description, variant default value is False, switch when and type

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-28 08:06:48 -05:00
Lydéric Debusschère
9f95945cb5 py-generateds: new package (#40555)
* [add] py-generateds: new package

* py-generateds: Update from review

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-generateds: add versions 2.41.5, 2.42.1, 2.42.2, 2.43.1 and 2.43.2

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-10-28 08:05:37 -05:00
Rémi Lacroix
21f3240e08 NCCL: Add version 2.19.3-1 (#40704) 2023-10-28 08:03:02 -05:00
Jen Herting
28d617c1c8 New version of py-langsmith (#40674)
Co-authored-by: Benjamin Meyers <bsmits@rit.edu>
2023-10-28 08:02:19 -05:00
Erik Heeren
7da4b3569f py-bluepyemodel: opensourcing with dependencies (#40592)
* py-bluepyemodel: new package with dependencies

* py-morphio: add MPI as dependency to avoid failing builds

* Formatting

* py-bluepyefe: no need to set NEURON_INIT_MPI

* py-morphio: unifurcation branch is ancient history

* py-bluepyopt: only set NEURON_INIT_MPI with +neuron

* py-efel: get rid of old version

* py-morph{-tool,io}: rename develop to master to match branch

* py-bluepyefe: unset PMI_RANK is also neuron-related

* py-bluepyopt: PMI_RANK is also neuron-related

* Implement review remarks

* py-morph-tool, py-neurom: small fixes

* py-morphio: reword dependencies
2023-10-28 07:55:49 -05:00
Manuela Kuhn
f8aa66b62e py-comm: add 0.1.4 (#40669) 2023-10-28 07:51:55 -05:00
Adam J. Stewart
a1d3e0002c py-numpy: add v1.26 (#40057) 2023-10-28 13:17:32 +02:00
John W. Parent
148dce96ed MSVC: detection from registry (#38500)
Typically MSVC is detected via the VSWhere program. However, this may
not be available, or may be installed in an unpredictable location.
This PR adds an additional approach via Windows Registry queries to
determine VS install location root.

Additionally:

* Construct vs_install_paths after class-definition time (move it to
  variable-access time).
* Skip over keys for which a user does not have read permissions
  when performing searches (previously the presence of these keys
  would have caused an error, regardless of whether they were
  needed).
* Extend helper functionality with option for regex matching on
  registry keys vs. exact string matching.
* Some internal refactoring: remove boolean parameters in some cases
  where the function was always called with the same value
  (e.g. `find_subkey`)
2023-10-27 16:58:50 -07:00
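On Windows, this kind of registry walk can be sketched with the standard `winreg` module; the key path and regex below are illustrative assumptions, not the exact ones used for Visual Studio detection:

```python
import re
import winreg  # Windows-only standard library module

def find_subkeys(root, path, pattern):
    """Yield subkey names under `path` whose name matches `pattern`.

    Keys we lack permission to open are skipped instead of raising.
    """
    regex = re.compile(pattern)
    try:
        key = winreg.OpenKey(root, path)
    except OSError:  # includes PermissionError and missing keys
        return
    with key:
        i = 0
        while True:
            try:
                name = winreg.EnumKey(key, i)
            except OSError:  # no more subkeys
                break
            i += 1
            if regex.match(name):
                yield name

# Illustrative path; the real detection walks Visual Studio setup keys.
for name in find_subkeys(
    winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\Microsoft\VisualStudio", r"\d+\.\d+"
):
    print(name)
```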
Mosè Giordano
9e01199e13 hipsycl: restrict compatibility with llvm for v0.8.0 (#40736) 2023-10-27 21:33:48 +02:00
eugeneswalker
ed7274a4d0 e4s ci stacks: add exago specs (#40712)
* e4s ci: add exago +cuda, +rocm builds

* exago: rename 5-18-2022-snapshot to snapshot.5-18-2022

* disable exago +rocm for non-external rocm ci install

* note that hiop +rocm fails to find hip libraries when they are spack-installed
2023-10-27 11:15:11 -07:00
eugeneswalker
f2963e41ba mgard@2020-10-01 %oneapi@2023: turn of c++11-narrowing via cxxflags (#40743) 2023-10-27 12:08:33 -06:00
John W. Parent
069762cd37 External finding: update default paths; treat .bat as executable on Windows (#39850)
.bat or .exe files can be considered executable on Windows. This PR
expands the regex for detectable packages to allow for the detection
of packages that vendor .bat wrappers (intel mpi for example).

Additional changes:

* Outside of Windows, when searching for executables `path_hints=None`
  was used to indicate that default path hints should be provided,
  and `[]` was taken to mean that no defaults should be chosen
  (in that case, nothing is searched); behavior on Windows has
  now been updated to match.
* Above logic for handling of `path_hints=[]`  has also been extended
  to library search (for both Linux and Windows).
* All exceptions for external packages were documented as timeout
  errors: this commit adds a distinction for other types of errors
  in warning messages to the user.
2023-10-27 10:40:44 -07:00
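The Windows side of the change amounts to widening what counts as an executable file name. A hedged sketch of such a check (the exact regex Spack uses may differ):

```python
import re
import sys

# On Windows, programs are frequently shipped as .exe or as thin .bat
# wrappers (Intel MPI's compiler wrappers, for example), so both should
# be treated as executable candidates during external detection.
WINDOWS_EXECUTABLE = re.compile(r".+\.(exe|bat)$", re.IGNORECASE)

def looks_executable(filename, platform=sys.platform):
    if platform == "win32":
        return bool(WINDOWS_EXECUTABLE.match(filename))
    # Elsewhere, executability is a permission bit, not a suffix; a real
    # implementation would check os.access(path, os.X_OK) instead.
    return True

print(looks_executable("mpicc.bat", platform="win32"))  # True
print(looks_executable("mpicc.txt", platform="win32"))  # False
```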
Harmen Stoppels
195f965076 OCI buildcache (#38358)
Credits to @ChristianKniep for advocating the idea of OCI image layers
being identical to spack buildcache tarballs.

With this you can configure an OCI registry as a buildcache:

```console 
$ spack mirror add my_registry oci://user/image # Dockerhub

$ spack mirror add my_registry oci://ghcr.io/haampie/spack-test # GHCR

$ spack mirror set --push --oci-username ... --oci-password ... my_registry  # set login credentials
```

which should result in this config:

```yaml
mirrors:
  my_registry:
    url: oci://ghcr.io/haampie/spack-test
    push:
      access_pair: [<username>, <password>]
```

It can be used like any other registry

```
spack buildcache push my_registry [specs...]
```

It will upload the Spack tarballs in parallel, as well as manifest + config
files s.t. the binaries are compatible with `docker pull` or `skopeo copy`.

In fact, a base image can be added to get a _runnable_ image:

```console
$ spack buildcache push --base-image ubuntu:23.04 my_registry python
Pushed ... as [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack

$ docker run --rm -it [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack
```

which should really be a game changer for sharing binaries.

Further, all content-addressable blobs that are downloaded and verified
will be cached in Spack's download cache. This should make repeated
`push` commands faster, as well as `push` followed by a separate
`update-index` command.

An end to end example of how to use this in Github Actions is here:

**https://github.com/haampie/spack-oci-buildcache-example**


TODO:

- [x] Generate environment modifications in config so PATH is set up
- [x] Enrich config with Spack's `spec` json (this is allowed in the OCI specification)
- [x] When ^ is done, add logic to create an index in say `<image>:index` by fetching all config files (using OCI distribution discovery API)
- [x] Add logic to use object storage in an OCI registry in `spack install`.
- [x] Make the user pick the base image for generated OCI images.
- [x] Update buildcache install logic to deal with absolute paths in tarballs
- [x] Merge with `spack buildcache` command
- [x] Merge #37441 (included here)
- [x] Merge #39077 (included here)
- [x] #39187 + #39285
- [x] #39341
- [x] Not a blocker: #35737 fixes correctness run env for the generated container images

NOTE:

1. `oci://` is unfortunately taken, so it's being abused in this PR to mean "oci type mirror". `skopeo` uses `docker://` which I'd like to avoid, given that classical docker v1 registries are not supported.
2. this is currently `https`-only, given that basic auth is used to login. I _could_ be convinced to allow http, but I'd prefer not to, given that for a `spack buildcache push` command multiple domains can be involved (auth server, source of base image, destination registry). Right now, no urllib http handler is added, so redirects to https and auth servers with http urls will simply result in a hard failure.

CAVEATS:

1. Signing is not implemented in this PR. `gpg --clearsign` is not the nicest solution, since (a) the spec.json is merged into the image config, which must be valid json, and (b) it would be better to sign the manifest (referencing both config/spec file and tarball) using more conventional image signing tools
2. `spack.binary_distribution.push` is not yet implemented for the OCI buildcache, only `spack buildcache push` is. This is because I'd like to always push images + deps to the registry, so that it's `docker pull`-able, whereas in `spack ci` we really wanna push an individual package without its deps to say `pr-xyz`, while its deps reside in some `develop` buildcache.
3. The `push -j ...` flag only works for OCI buildcache, not for others
2023-10-27 15:30:04 +02:00
Ashwin Kumar Karnad
3fff8be929 octopus: split netcdf-c and netcdf-fortran dependency (#40685) 2023-10-27 14:24:44 +02:00
Satish Balay
1bf758a784 strumpack: add version 7.2.0 (#40732) 2023-10-27 04:29:15 -07:00
Harmen Stoppels
9b8fb413c3 gromacs: default to external blas & lapack (#40490)
* gromacs: default to external blas & lapack

* drop vendored lapack/blas altogether
2023-10-27 09:51:12 +02:00
Harmen Stoppels
51275df0b1 ci: spack compiler find should list extra config scopes (#40727)
Otherwise it would detect pre-configured compilers in a potentially different way.
2023-10-27 09:43:01 +02:00
dmt4
af13d16c2c Fixes and options for package spglib (#40684)
* Fix cmake_args for spglib v2.1.0+

* Add option to build fortran interface in package spglib

* fix style as sugested by ci/prechecks/style

* Enable fortran variant from v1.16.4 as suggested

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2023-10-27 08:55:57 +02:00
Harmen Stoppels
37f48aff8b gromacs: fix version branch in intel fftw (#40489) 2023-10-27 08:29:02 +02:00
Alec Scott
feda52f800 akantu: use f-strings (#40466)
Co-authored-by: Nicolas Richart <nrichart@users.noreply.github.com>
2023-10-27 08:12:20 +02:00
Satish Balay
8959d65577 plasma: add version 23.8.2 (#40728) 2023-10-26 16:48:20 -06:00
Carlos Bederián
546695f193 itk: misc fixes (#39832)
* itk: patch missing include for newer compilers

* itk: The package doesn't use MPI

* itk: package requires the high-level hdf5 api

* itk: patch url with ?full_index=1

* itk: point to 4041 commit in master

* itk: don't constrain hdf5 with ~mpi
2023-10-26 15:13:27 -07:00
snehring
c3f5ee54d4 ldak: add v5.2 & add maintainer (#40710)
* ldak: update to 5.2, add maintainer

* ldak: use compiler.openmp_flag
2023-10-26 15:12:10 -07:00
Daniel Arndt
d64f312726 dataTransferKit: add v3.1.1, v3.1.0 (#40556)
* Update DataTransferKit for 3.1.1 release

* Require Trilinos-14 for 3.1.0 and higher
2023-10-26 15:10:16 -07:00
Adam J. Stewart
b4b25dec64 PythonPackage: allow archive_files to be overridden (#40694) 2023-10-26 15:25:56 -05:00
Torbjörn Lönnemark
81172f9251 curl: Fix librtmp variant (#40713)
* rtmpdump: New package

* curl: Fix librtmp variant

Add the previously missing dependency required for rtmp support.

The variant has been broken since its addition in PR #25166.

Fixes one of the two issues reported in #26887.
2023-10-26 21:11:43 +02:00
Alec Scott
cbf9dd0aee unmaintained a* packages: update to use f-strings (#40467) 2023-10-26 21:08:55 +02:00
Ryan Danehy
7ecb9243c1 Update spack package for exago@1.6.0 release (#40614)
* Update spack package for exago:1.6.0

* update style

* Weird spack style env bug fixed

* Update spack package for exago:1.6.0

* update style

* Weird spack style env bug fixed

* changes to allow release 1.6.0

* fix depends, and versioning

* rm cmake variable

* add s

* style fix

---------

Co-authored-by: Ryan Danehy <dane678@deception04.pnl.gov>
Co-authored-by: Ryan Danehy <dane678@deception03.pnl.gov>
Co-authored-by: ryan.danehy@pnnl.gov <dane678@we45149.home>
2023-10-26 11:18:31 -07:00
Harmen Stoppels
e96f31c29d spack checksum pkg@1.2, use as version filter (#39694)
* spack checksum pkg@1.2, use as version filter

Currently pkg@1.2 splits on @ and looks for 1.2 specifically; with this
PR, pkg@1.2 acts as a filter, so any matching version (1.2, 1.2.1, ..., 1.2.10)
is displayed.
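
A rough, standalone illustration of the new filtering semantics (plain strings instead of Spack's Version objects, so this is a sketch rather than the command's actual code):

```python
# Sketch only: "1.2" now matches itself and anything within 1.2.*.
known = ["1.1", "1.2", "1.2.1", "1.2.10", "1.3"]
requested = "1.2"
matches = [v for v in known if v == requested or v.startswith(requested + ".")]
print(matches)  # ['1.2', '1.2.1', '1.2.10']
```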

* fix tests

* fix style
2023-10-26 09:57:55 -07:00
Auriane R
53d5011192 Add conflict between cxxstd > 17 and cuda < 12 in pika (#40717)
* Add conflict with C++ standard > 17 and cuda < 12

* Removing map_cxxstd since boost supports C++20 flag
2023-10-26 16:08:21 +02:00
Xavier Delaruelle
751b64cbcd modules: no --delim option if separator is colon character (#39010)
Update Tcl modulefile template to simplify generated `append-path`,
`prepend-path` and `remove-path` commands and improve their readability.

If the path element delimiter is the colon character, do not set the `--delim`
option, as it is the default delimiter value.
2023-10-26 15:55:49 +02:00
Adam J. Stewart
f57c2501a3 PythonPackage: nested config_settings (#40693)
* PythonPackage: nested config_settings

* flake8
2023-10-26 08:18:02 -05:00
Harmen Stoppels
1c8073c21f spack checksum: show long flags in usage output (#40407) 2023-10-26 14:48:35 +02:00
Xavier Delaruelle
86520abb68 modules: hide implicit modulefiles (#36619)
Renames exclude_implicits to hide_implicits

When the hide_implicits option is enabled, generate modulefiles for
implicitly installed software and hide them. Even if implicit, those
modulefiles may be referred to as dependencies in other modulefiles, so they
must still be generated for dependent modules to load properly.

A new hidden property is added to BaseConfiguration class.

To hide modulefiles, modulerc files are generated alongside them. Such rc
files contain the specific module command that indicates a module should be
hidden (for instance when running "module avail").

A modulerc property is added to TclFileLayout and LmodFileLayout classes
to get fully qualified path name of the modulerc associated to a given
modulefile.

Modulerc files will be located in each module directory, next to the
version modulefiles. This scheme is supported by both module tool
implementations.

modulerc_header and hide_cmd_format attributes are added to
TclModulefileWriter and LmodModulefileWriter. They help to know how to
generate a modulerc file with hidden commands for each module tool.

Tcl modulerc files require a header. Since we use a command introduced in
Modules 4.7 (module-hide --hidden-loaded), a version requirement is added to
the header string.

For lmod, modules that open up a hierarchy are never hidden, even if
they are implicitly installed.

The modulerc is created, updated, or removed when the associated modulefile
is written or removed. If an implicit modulefile becomes explicit, the hide
command for that modulefile is removed from the modulerc. If the modulerc
becomes empty, the file is removed. The modulerc file is not rewritten when
no content change is detected.
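
A minimal sketch of that scheme (the file name, header, and command text below are inferred from this description and are illustrative only, not the real template code):

```python
# Sketch: write a .modulerc next to the version modulefiles that hides an
# implicit module. "module-hide --hidden-loaded" needs Modules >= 4.7, hence
# the version in the header line.
from pathlib import Path

def hide_module(module_dir: str, name: str, version: str) -> None:
    rc = Path(module_dir) / ".modulerc"
    lines = ["#%Module4.7", f"module-hide --hidden-loaded {name}/{version}"]
    rc.write_text("\n".join(lines) + "\n")
```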

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-10-26 11:49:13 +00:00
Alberto Invernizzi
bf88ed45da libluv: require CMake 3 and CMP0042 (#40716) 2023-10-26 03:33:27 -06:00
Harmen Stoppels
b4cf3d9f18 git versions: fix commit shas [automated] (#40703) 2023-10-26 11:26:47 +02:00
Ben Boeckel
8e19576ec5 Paraview 5.12 prep (#40527)
* paraview: rebase the adios2 patch for 5.12-to-be

* paraview: disable fastfloat and token for 5.12-to-be

* paraview: require older protobuf for 5.12 as well

* paraview: require C++11-supporting protobuf for `master` too
2023-10-25 16:26:49 -07:00
Victoria Cherkas
3c590ad071 fdb: add releases v5.11.23 and v5.11.17 (#40571) 2023-10-25 16:24:54 -07:00
afzpatel
3e47f3f05c initial commit to fix mivisionx build for 5.6 (#40579) 2023-10-25 16:24:31 -07:00
Dominic Hofer
d9edc92119 cuda: add NVHPC_CUDA_HOME. (#40507)
* [cuda] Add NVHPC_CUDA_HOME.

* Add CUDA_HOME and NVHPC_CUDA_HOME to cuda's dependent build env.

---------

Co-authored-by: Dominic Hofer <dominic.hofer@meteoswiss.ch>
2023-10-25 16:22:22 -07:00
Filippo Barbari
2a245fdd21 Added Highway versions up to 1.0.7 (#40691) 2023-10-25 15:49:46 -07:00
Adam J. Stewart
932d7a65e0 PyTorch: patch breakpad dependency (#40648) 2023-10-25 23:10:48 +02:00
dependabot[bot]
6bd2dd032b build(deps): bump pytest from 7.4.2 to 7.4.3 in /lib/spack/docs (#40697) 2023-10-25 20:58:53 +02:00
Harmen Stoppels
c0a4be156c ci: don't put compilers in config (#40700)
* ci: don't register detectable compilers

Because they go out of sync...

* remove intel compiler, it can be detected too

* Do not run spack compiler find since compilers are registered in concretize job already

* trilinos: work around +stokhos +cuda +superlu-dist bug due to EMPTY macro
2023-10-25 11:55:04 -07:00
Harmen Stoppels
0c30418732 ci: darwin aarch64 use apple-clang-15 tag (#40706) 2023-10-25 17:35:47 +02:00
Adam J. Stewart
3063093322 py-lightning: py-torch~distributed is broken again (#40696) 2023-10-25 13:06:35 +02:00
Rocco Meli
f4bbc0dbd2 Add dlaf variant to cp2k (#40702) 2023-10-25 04:13:32 -06:00
Taillefumier Mathieu
1ecb100e43 [cp2k] Use fftw3 MKL by default when cp2k is compiled with mkl (#40671) 2023-10-25 09:55:13 +02:00
John W. Parent
e1da9339d9 Windows: search PATH for patch utility (#40513)
Previously, we only searched for `patch` inside of whatever Git
installation was available because the most common installation of Git
available on Windows had `patch`. That's not true for all possible
installations of Git though, so this updates the search to also check
PATH.
2023-10-24 16:37:26 -07:00
Alex Richert
2d203df075 Add ufs-utils@1.11.0 (#40695)
* Add ufs-utils@1.11.0
* Update package.py
2023-10-24 15:46:23 -07:00
renjithravindrankannath
50f25964cf Updating rvs binary path. (#40604)
* Updating rvs binary path
* Updating spec check as per the recommendation
2023-10-24 15:30:02 -07:00
AMD Toolchain Support
95558d67ae openmpi: fix pmi@4.2.3: compat (#40686) 2023-10-24 20:06:32 +02:00
Filippo Barbari
83532b5469 Added new benchmark version up to 1.8.3 (#40689) 2023-10-24 10:26:26 -07:00
Alberto Invernizzi
444c27ca53 neovim: conflict for libluv problem on macOS + add newer versions of neovim and libluv (#40690)
* add conflict with libluv version >=1.44 just on macOS
* minor change
* add libluv versions
* neovim: add newer releases
2023-10-24 10:21:58 -07:00
eugeneswalker
d075732cc5 hiop +cuda: fix issue 40678 (#40688) 2023-10-24 10:28:23 -06:00
eugeneswalker
cf9a32e6db exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 (#40676)
* exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 due to pybind error with py 3.11

* hiop@:1.0 +cuda: constrain to cuda@:11.9
2023-10-24 01:08:05 -06:00
Annop Wongwathanarat
bc54aa1e82 armpl-gcc: add version 23.10 and macOS support (#40511) 2023-10-24 00:58:04 -06:00
Nakano Masaki
88622d5129 fix installation error of bear (#40637)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-10-23 13:02:15 -07:00
Vicente Bolea
d0982115b3 Adios2: add kokkos variant (#40623)
* adios2: update variants and dependencies

* adios2: add kokkos rocm|cuda|sycl variant

* e4s oneapi ci stack: add adios2 +sycl

* e4s ci stack: add adios2 +rocm

* [@spackbot] updating style on behalf of vicentebolea

* Apply suggestions from code review

* adios2: fixed cuda variant

* update ecp-data-vis-sdk

* Update share/spack/gitlab/cloud_pipelines/stacks/e4s-power/spack.yaml

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: vicentebolea <vicentebolea@users.noreply.github.com>
2023-10-23 13:01:57 -07:00
Taillefumier Mathieu
1e4a5791b2 Add rccl and nccl variants to cp2k and cosma (#40451) 2023-10-23 12:37:42 -07:00
Jim Galarowicz
8def7f5583 Update survey package file for survey version 9 changes. (#40619)
* Update survey package file for survey version 9 changes.
* Fix single quote - make double.
* Small change to trigger spack tests
2023-10-23 12:31:20 -07:00
Adam J. Stewart
66f07088cb py-scikit-learn: add v1.3.2 (#40672) 2023-10-23 13:56:27 -05:00
Michael Kuhn
bf6d5df0ec audit: add check for GitLab patches (#40656)
GitLab's .patch URLs only provide abbreviated hashes, while .diff URLs
provide full hashes. There does not seem to be a parameter to force
.patch URLs to also return full hashes, so we should make sure to use
the .diff ones.
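
A toy sketch of the kind of check such an audit can perform (the regex and URLs below are illustrative; the real audit inspects package patch directives):

```python
# Sketch: flag GitLab ".patch" URLs, which embed abbreviated commit hashes,
# and let ".diff" URLs (full hashes) pass.
import re

GITLAB_PATCH = re.compile(r"gitlab\.[^\s/]+/.*/-/.*\.patch(\?|$)")

urls = [
    "https://gitlab.example.com/group/proj/-/commit/abc123.patch",  # hypothetical
    "https://gitlab.example.com/group/proj/-/commit/abc123.diff",   # hypothetical
]
print([u for u in urls if GITLAB_PATCH.search(u)])  # only the .patch URL is flagged
```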
2023-10-23 20:22:39 +02:00
Olivier Cessenat
3eac79bba7 ngspice: new version 41 and option osdi (#40664) 2023-10-23 12:56:12 -04:00
Juan Miguel Carceller
47c9760492 geant4: add patch for when using the system expat library (#40650)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-10-23 16:11:51 +01:00
Harmen Stoppels
a452e8379e nghttp2: add v1.57.0 (#40652) 2023-10-23 16:22:41 +02:00
Aiden Grossman
a6466b9ddd 3proxy: respect compiler choice (#39240) 2023-10-23 03:43:54 -06:00
Harmen Stoppels
96548047f8 concretizer verbose: show progress in % too (#40654) 2023-10-23 10:26:20 +02:00
Harmen Stoppels
a675156c70 py-cython: new version, python 3.11 upperbound (#40343) 2023-10-23 09:37:20 +02:00
Tamara Dahlgren
cfc5363053 Docs: Update spec variant checks plus python quotes and string formatting (#40643) 2023-10-23 09:15:03 +02:00
Michael Kuhn
d9167834c4 libtheora: fix GitLab patch (#40657)
GitLab's .patch URLs do not provide stable/full hashes, while .diff URLs
do. See #40656 for more information.
2023-10-23 09:00:22 +02:00
Michael Kuhn
8a4860480a knem: fix GitLab patch (#40662)
GitLab's .patch URLs do not provide stable/full hashes, while .diff URLs
do. See #40656 for more information.
2023-10-23 08:59:58 +02:00
Michael Kuhn
f4c813f74a gobject-introspection: fix GitLab patch (#40661)
GitLab's .patch URLs do not provide stable/full hashes, while .diff URLs
do. See #40656 for more information.
2023-10-23 08:59:38 +02:00
Michael Kuhn
8b4e557fed garfieldpp: fix GitLab patch (#40660)
GitLab's .patch URLs do not provide stable/full hashes, while .diff URLs
do. See #40656 for more information.
2023-10-23 08:59:10 +02:00
Michael Kuhn
c5d0fd42e6 vtk: fix GitLab patch (#40659)
GitLab's .patch URLs do not provide stable/full hashes, while .diff URLs
do. See #40656 for more information.
2023-10-23 08:58:47 +02:00
Michael Kuhn
428202b246 libxml2: fix GitLab patch (#40658)
GitLab's .patch URLs do not provide stable/full hashes, while .diff URLs
do. See #40656 for more information.
2023-10-23 08:58:24 +02:00
Bill Williams
1c0d3bc071 Add Score-P 8.3 and dependencies (#40478)
Includes Score-P 8.3 and Cubew/cubelib 4.8.2.
2023-10-22 22:11:19 +02:00
Juan Miguel Carceller
eea3c07628 glib: add patch with a fix for PTRACE_0_EXITKILL (#40655)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-10-22 11:18:16 -06:00
Harmen Stoppels
7cd5fcb484 zlib-ng: add v2.1.4 (#40647) 2023-10-22 11:17:48 -06:00
Juan Miguel Carceller
bbb4c939da py-kiwisolver: add a new version (#40653)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-10-22 09:07:31 -05:00
Tamara Dahlgren
f915489c62 Docs: Add version range example to conditional dependencies (#40630)
* Docs: Add version range example to conditional dependencies

* Add when context manager example
2023-10-22 10:52:44 +02:00
Martin Aumüller
1527853efd intel-tbb: patch patch for Apple's patch (#40640)
While e.g. GNU patch 2.7.6 (as provided by homebrew) would apply the previous
version of this patch without problems, Apple's patch 2.0-12u11-Apple fails
to find out which file to patch.

Adding two lines to the patch fixes that. Renamed the patch in order to
not require a `spack clean -m`.
2023-10-21 09:26:36 -04:00
Harmen Stoppels
d820cf73e9 py-kombu: fix setuptools bound (#40646) 2023-10-21 13:38:30 +02:00
Scott Wittenburg
8714b24420 py-kombu: pick older version of py-setuptools (#40642) 2023-10-21 08:38:03 +02:00
Lydéric Debusschère
0c18f81b80 [add] py-dict2css: new package (#40552)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 18:09:13 -06:00
Andrey Alekseenko
d442fac69a gromacs: add 2022.6, 2023.2, 2023.3 versions (#38906)
* gromacs: add 2022.6, 2023.2 versions
* gromacs: add version 2023.3
2023-10-20 17:28:45 -06:00
Garth N. Wells
76c57af021 py-fenics-ffcx: update to v0.7 (#40569) 2023-10-20 12:04:02 -05:00
Harmen Stoppels
27a0425e5d concretize separately: show concretization time per spec as they concretize when verbose (#40634) 2023-10-20 17:09:19 +02:00
Harmen Stoppels
4bade7ef96 gromacs +cp2k: build in CI (#40494)
* gromacs +cp2k: build in CI

* libxsmm: x86 only

* attempt to fix dbcsr + new mpich

* use c11 standard

* gromacs: does not depend on dbcsr

* cp2k: build with cmake in CI, s.t. dbcsr is a separate package

* cp2k: cmake patches for config files and C/C++ std

* cp2k: remove unnecessary constraints due to patch
2023-10-20 16:20:20 +02:00
Lydéric Debusschère
a0e33bf7b0 py-corner: new package (#40546)
* [add] py-corner: new package

* py-corner: remove py-wheel dependency in response to review comments

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:37:15 -05:00
Massimiliano Culpo
cbc39977ca ASP-based solver: minimize weights over edges (#40632)
With the introduction of multiple build dependencies from the same package in the DAG, we need to minimize a few weights accounting for edges rather than nodes. If we don't do that we might have multiple "optimal" solutions that differ only in how the same nodes are connected together. This commit ensures optimal versions are picked per parent in case of multiple choices for a dependency.
2023-10-20 14:37:07 +02:00
Adam J. Stewart
06fc24df5e TensorFlow/Keras/TensorBoard: add v2.14.0 (#40297)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-10-20 14:23:54 +02:00
Claire Guilbaud
9543abd2d9 add recipes for sphinx-book-theme and its dependencies if unknown (#40312)
* add recipes for sphinx-book-theme and its dependencies if unknown

* fix version and missing https

* fix based on reviewers remarks
2023-10-20 07:18:41 -05:00
Lydéric Debusschère
004d3e4cca [add] py-fraction: new package (#40554)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:03:48 -05:00
Lydéric Debusschère
25aff66d34 [add] py-cssutils: new package (#40551)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:01:38 -05:00
Lydéric Debusschère
9bd77b2ed3 [add] py-css-parser: new package (#40550)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:00:46 -05:00
Manuela Kuhn
5de1c1c98f py-statsmodels: add 0.14.0 (#39156)
* py-statsmodels: add 0.14.0

* Fix style

* Update var/spack/repos/builtin/packages/py-statsmodels/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-statsmodels/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Remove python limits

* Remove comment

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-20 06:59:26 -05:00
Manuela Kuhn
5b9b5eaa28 py-dcm2bids: add v3.1.0 (#40447)
* py-dcm2bids: add 3.1.0

* Fix python restriction
2023-10-20 06:56:48 -05:00
Manuela Kuhn
00ee72396f py-bidscoin: add v4.1.1 and py-argparse-manpage: add new package (#40414)
* py-bidscoin: add 4.1.1

* Fix style

* Fix restrictions for dependencies
2023-10-20 06:55:41 -05:00
snehring
aa4d55004c Add package py-macs3 and dependencies (#40498)
* py-cykhash: adding new package py-cykhash

* py-hmmlearn: adding new package py-hmmlearn

* py-macs3: adding new package py-macs3

* py-macs3: adding python version restriction and other changes.
2023-10-20 06:53:41 -05:00
Harmen Stoppels
468f6c757e schema/compilers.py: fix validation of 2+ entries (#40627)
Fix the following syntax which validates only the first array entry:

```python
"compilers": {
    "type": "array",
    "items": [
        {
            "type": ...
        }
    ]
}
```

to

```python
"compilers": {
    "type": "array",
    "items": {
        "type": ...
    }
}
```

which validates the entire array.

Oops...
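
For illustration, a standalone example of the difference (assumes the third-party `jsonschema` package is available):

```python
# Tuple form ("items": [schema]) only constrains index 0; the single-schema
# form ("items": schema) constrains every element of the array.
import jsonschema

entry = {"type": "object", "required": ["compiler"]}
tuple_form = {"type": "array", "items": [entry]}   # list: positional, index 0 only
single_form = {"type": "array", "items": entry}    # dict: applies to every element

data = [{"compiler": "gcc@12"}, {"oops": True}]    # second entry is malformed

jsonschema.Draft7Validator(tuple_form).validate(data)       # passes: index 1 never checked
try:
    jsonschema.Draft7Validator(single_form).validate(data)  # index 1 fails
except jsonschema.ValidationError as err:
    print("caught:", err.message)
```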
2023-10-20 09:51:49 +02:00
Adam J. Stewart
0907d43783 Drop support for external PythonX.Y (#40628)
On some systems, multiple pythonx.y are placed in the same prefix as
pythonx (where only one of them is associated with that pythonx).
Spack external detection for Python was willing to register all of
these as external versions. Moreover, the `package.py` for Python
was able to distinguish these.

This can cause an issue for some build systems, which will just look
for python3 for example, so if that python3 is actually python3.6,
and the build system needs 3.7 (which spack may have found in the
same prefix, and offered as a suitable external), it will fail when
invoking python3.

To avoid that issue, we simply avoid treating pythonx.y as external
candidates. In the above case, Spack would only detect a Python 3.6
external, and the build would be forced to use a Spack-built Python
3.7 (which we consider a good thing).
2023-10-20 00:29:38 -06:00
wspear
c9e5173bbd TAU: Respect ~fortran for +mpi (#40617) 2023-10-19 23:24:17 -06:00
Vicente Bolea
0019faaa17 vtk-m: update to latest release (#40624) 2023-10-19 18:02:25 -06:00
Lydéric Debusschère
e30f53f206 perl: change permissions in order to apply patch on version 5.38.0 (#40609)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-10-20 00:00:24 +02:00
dependabot[bot]
f2ba25e09d build(deps): bump actions/checkout from 4.1.0 to 4.1.1 (#40584) 2023-10-19 23:04:40 +02:00
dependabot[bot]
405de56c71 build(deps): bump mypy from 1.6.0 to 1.6.1 in /lib/spack/docs (#40603) 2023-10-19 23:03:48 +02:00
dependabot[bot]
ba571f2404 build(deps): bump mypy from 1.6.0 to 1.6.1 in /.github/workflows/style (#40602) 2023-10-19 23:03:23 +02:00
dependabot[bot]
4c1785d5f6 build(deps): bump urllib3 from 2.0.6 to 2.0.7 in /lib/spack/docs (#40583) 2023-10-19 23:02:51 +02:00
Adam J. Stewart
fa4d5ee929 py-rasterio: add v1.3.9 (#40621) 2023-10-19 12:08:43 -07:00
Cody Balos
8720cec283 add nvechip to sundials components when mfem+rocm (#40512) 2023-10-19 12:08:24 -07:00
Harmen Stoppels
72b36ac144 Improve setup build / run / test environment (#35737)
This adds a `SetupContext` class which is responsible for setting
package.py module globals, and computing the changes to environment
variables for the build, test or run context.

The class uses `effective_deptypes` which takes a list of specs (e.g. single
item of a spec to build, or a list of environment roots) and a context
(build, run, test), and outputs a flat list of specs that affect the
environment together with a flag in what way they do so. This list is
topologically ordered from root to leaf, so that one can be assured that
dependents override variables set by dependencies, not the other way
around.
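
A small self-contained sketch of that ordering idea (the `deps`/`env` attributes are hypothetical stand-ins, not Spack's Spec API):

```python
# Sketch: topological order from root to leaf, then apply env changes leaf
# first so that dependents override variables set by their dependencies.
from types import SimpleNamespace as Node

def topo_root_to_leaf(root):
    order, seen = [], set()
    def visit(node):
        if id(node) in seen:
            return
        seen.add(id(node))
        for dep in node.deps:          # hypothetical "deps" attribute
            visit(dep)
        order.append(node)             # post-order: leaves first
    visit(root)
    return list(reversed(order))       # root first, leaves last

def build_env(root):
    env = {}
    for node in reversed(topo_root_to_leaf(root)):  # leaves applied first
        env.update(node.env)           # hypothetical "env" attribute
    return env                         # dependents win on conflicting keys

leaf = Node(deps=[], env={"PATH": "/leaf/bin"})
mid = Node(deps=[leaf], env={"PATH": "/mid/bin"})
print(build_env(Node(deps=[mid, leaf], env={})))    # {'PATH': '/mid/bin'}
```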

This is used to replace the logic in `modifications_from_dependencies`,
which has several issues: missing calls to `setup_run_environment`, and
the order in which operations are applied.

Further, it should improve performance a bit in certain cases, since
`effective_deptypes` run in O(v + e) time, whereas `spack env activate`
currently can take up to O(v^2 + e) time due to loops over roots. Each
edge in the DAG is visited once by calling `effective_deptypes` with
`env.concrete_roots()`.

By marking and propagating flags through the DAG, this commit also fixes
a bug where Spack wouldn't call `setup_run_environment` for runtime
dependencies of link dependencies. And this PR ensures that Spack
correctly sets up the runtime environment of direct build dependencies.

Regarding test dependencies: in a build context they are build-time
test deps, whereas in a test context they are install-time test deps.
Since there are no means to distinguish the build/install type test deps,
they're both.

Further changes:

- all `package.py` module globals are guaranteed to be set before any of the
  `setup_(dependent)_(run|build)_env` functions is called
- traversal order during setup: first the group of externals, then the group
  of non-externals, with specs in each group traversed topologically (dependencies
  are set up before dependents)
- modules: only ever call `setup_dependent_run_environment` of *direct* link/run
   type deps
- the marker in `set_module_variables_for_package` is dropped, since we should
  call the method once per spec. This allows us to set only a cheap subset of
  globals on the module: for example it's not necessary to compute the expensive
  `cmake_args` and the like if the spec under consideration is not the root node to be
  built.
- `spack load`'s `--only` is deprecated (it has no effect now), and `spack load x`
  now means: do everything that's required for `x` to work at runtime, which
  requires runtime deps to be set up -- just like `spack env activate`.
- `spack load` no longer loads build deps (of build deps) ...
- `spack env activate` on partially installed or broken environments: this is all
  or nothing now. If some spec errors during setup of its runtime env, you'll only
  get the unconditional variables + a warning that says the runtime changes for
  specs couldn't be applied.
- Remove traversal in upward direction from `setup_dependent_*` in packages.
  Upward traversal may iterate to specs that aren't children of the roots
  (e.g. zlib / python have hundreds of dependents, only a small fraction is
  reachable from the roots. Packages should only modify the direct dependent
  they receive as an argument)
2023-10-19 20:44:05 +02:00
Harmen Stoppels
79896ee85c spack checksum: restore ability to select top n (#40531)
The ability to select the top N versions got removed in the checksum overhaul,
because initially numbers were used for commands.

Now that we settled on characters for commands, let's make numbers pick the top
N again.
2023-10-19 11:33:01 -07:00
Vanessasaurus
408ee04014 Automated deployment to update package flux-core 2023-10-19 (#40605)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-10-19 10:16:42 -07:00
Harmen Stoppels
3f594e86a1 libvorbis: drop -force_cpusubtype_ALL flag (#40616)
This flag was only relevant when targeting powerpc from apple-clang,
which we don't do. The flag is removed from apple-clang@15. Let's drop
it unconditionally.
2023-10-19 10:15:18 -07:00
Scott Wittenburg
46c1a8e4c6 gitlab ci: Rework how mirrors are configured (#39939)
Improve how mirrors are used in gitlab ci, where we have until now thought
of them as only a string.

By configuring ci mirrors ahead of time using the proposed mirror templates,
and by taking advantage of the expressiveness that spack now has for mirrors,
this PR will allow us to easily switch the protocol/url we use for fetching
binary dependencies.

This change also deprecates some gitlab functionality and marks it for
removal in Spack 0.23:

    - arguments to "spack ci generate":
        * --buildcache-destination
        * --copy-to
    - gitlab configuration options:
        * enable-artifacts-buildcache
        * temporary-storage-url-prefix
2023-10-19 11:04:59 -05:00
Satish Balay
b2d3e01fe6 petsc: add variant +sycl (#40562)
* petsc: add variant +sycl

* petsc: add in gmake as dependency - so that consistent make gets used between petsc and slepc builds [that can have different env for each of the builds]
2023-10-19 07:31:02 -07:00
Harmen Stoppels
681639985a ci: remove incorrect compilers.yaml (#40610) 2023-10-19 16:11:42 +02:00
Massimiliano Culpo
a1ca1a944a ASP-based solver: single Spec instance per dag hash (#39590)
Reused specs used to be referenced directly in the built spec.

This might cause issues like in issue 39570 where two objects in
memory represent the same node, because two reused specs were
loaded from different sources but referred to the same spec
by DAG hash.

The issue is solved by copying concrete specs to a dictionary keyed
by dag hash.
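
A toy sketch of the deduplication idea (the stub class is hypothetical; Spack's real `Spec.dag_hash()` plays the same role):

```python
# Sketch: key concrete specs by their DAG hash so that two loads of the same
# spec from different sources resolve to a single object in memory.
class StubSpec:
    def __init__(self, dag_hash: str):
        self._hash = dag_hash
    def dag_hash(self) -> str:
        return self._hash

deduped: dict = {}

def unify(spec: StubSpec) -> StubSpec:
    return deduped.setdefault(spec.dag_hash(), spec)

a, b = StubSpec("abcdef"), StubSpec("abcdef")   # same node, loaded twice
assert unify(a) is unify(b)                     # one object now represents it
```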
2023-10-19 16:00:45 +02:00
Tamara Dahlgren
4f49f7b9df Stand-alone test feature deprecation postponed to v0.22 (#40600) 2023-10-19 06:03:54 -06:00
Aiden Grossman
fb584853dd byte-unixbench: respect compiler choice (#39242)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-10-19 02:20:34 -06:00
Aiden Grossman
cc47b06756 connect-proxy: respect compiler choice (#39243) 2023-10-19 09:58:58 +02:00
Aiden Grossman
b68a620fc2 bioawk: respect compiler choice (#39241) 2023-10-19 09:55:57 +02:00
Aiden Grossman
e417ca54a0 busybox: respect compiler choice (#39239) 2023-10-19 09:55:06 +02:00
Michael Kuhn
5bbf8454d0 julia: Fix build for @1.9 (#39045)
julia@1.9 tries to download ittapi, which requires cmake. Disable it
explicitly.
2023-10-19 09:09:45 +02:00
Annop Wongwathanarat
67b8dd0913 acfl: add version 23.10 (#40510) 2023-10-18 17:07:40 -07:00
Harmen Stoppels
a42eb0d2bd unparse: also support generic type aliases (#40328) 2023-10-18 23:16:05 +02:00
Harmen Stoppels
294e659ae8 AutotoolsPackage / MakefilePackage: add gmake build dependency (#40380) 2023-10-18 19:56:54 +02:00
Harmen Stoppels
55198c49e5 llvm: fix ncurses+termlib linking in lldb (#40594) 2023-10-18 19:04:49 +02:00
Harmen Stoppels
dc071a3995 Fix dev-build keep_stage behavior (#40576)
`spack dev-build` would incorrectly set `keep_stage=True` for the
entire DAG, including for non-dev specs, even though the dev specs
have a DIYStage which never deletes sources.
2023-10-18 11:44:26 +00:00
Lydéric Debusschère
db5d0ac6ac [fix] py-werkzeug: add constraint in python dependence (#40590)
py-werkzeug@:0.12 does not work with python@3.10:

Test with py-werkzeug 0.12.2 and python 3.10:
```
$ python3.10 -c 'import werkzeug'
py-werkzeug-0.12.2/lib/python3.11/site-packages/werkzeug/datastructures.py", line 16, in <module>
from collections import Container, Iterable, MutableSet
ImportError: cannot import name 'Container' from 'collections'
```

Test with py-werkzeug 0.12.2 and python 3.9:
```
python3.9 -c "from collections import Container"
<string>:1: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.10 it will stop working
```
2023-10-18 13:04:21 +02:00
Aiden Grossman
2802013dc6 Add license directive (#39346)
This patch adds a license directive to get the ball rolling on adding license
information about packages to spack. I'm primarily interested in just getting
license metadata into spack, but this would also help with other efforts that
people are interested in, such as adding license information to the ASP solve
for concretization to make sure licenses are compatible.

Usage:

Specifying the specific license that a package is released under in a project's
`package.py` is good practice. To specify a license, find the SPDX identifier for
a project and then add it using the license directive:

```python
   license("<SPDX Identifier HERE>")
```

For example, for Apache 2.0, you might write:

```python
   license("Apache-2.0")
```

Note that specifying a license without a when clause makes it apply to all
versions and variants of the package, which might not actually be the case.
For example, a project might have switched licenses at some point or have
certain build configurations that include files that are licensed differently.
To account for this, you can specify when licenses should be applied. For
example, to specify that a specific license identifier should only apply
to versions up to and including 1.5, you could write the following directive:

```python
   license("MIT", when="@:1.5")
```
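
As a further illustration of the switched-license case mentioned above, a hypothetical package.py fragment could scope each identifier to a version range:

```python
# Hypothetical fragment; assumes the usual `from spack.package import *` preamble.
class Example(Package):
    # The (made-up) project re-licensed at 2.0, so each identifier is scoped.
    license("GPL-2.0-only", when="@:1.9")
    license("Apache-2.0", when="@2.0:")
```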
2023-10-18 03:58:19 -07:00
Greg Becker
37bafce384 abi.py: fix typo, add type-hints (#38216)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-18 11:22:55 +02:00
jfavre
da0813b049 paraview: add variant for NVIDIA IndeX (#40577)
* add variant for NVIDIA IndeX
* remove whitespaces
2023-10-17 13:03:41 -06:00
Dennis Klein
e2bb2595b3 xmlto: add more dependencies (#40578)
`xmllint` is called by `xmlto` during generation of `libzmq`'s docs, so
adding `libxml2`.

The docbook deps and the patches are taken from
https://src.fedoraproject.org/rpms/xmlto/blob/rawhide/f/xmlto.spec

There are still many more dependencies missing, but this is out of scope
of this patch (which is only concerned about the use case of `libzmq`).
2023-10-17 11:58:46 -07:00
Mikael Simberg
b7cbcfdcab Add tracy 0.10 (#40573) 2023-10-17 11:35:55 -07:00
Peter Scheibel
9cde25b39e Allow / in GitVersion (#39398)
This commit allows version specifiers to refer to git branches that contain
forward slashes. For example, the following is valid syntax now:

    pkg@git.releases/1.0
   
It also adds a new method `Spec.format_path(fmt)` which is like `Spec.format`,
but also maps unsafe characters to `_` after interpolation. The difference is
as follows:

    >>> Spec("pkg@git.releases/1.0").format("{name}/{version}")
    'pkg/git.releases/1.0'

    >>> Spec("pkg@git.releases/1.0").format_path("{name}/{version}")
    'pkg/git.releases_1.0'

The `format_path` method is used in all projections. Notice that this method
also maps `=` to `_`

    >>> Spec("pkg@git.main=1.0").format_path("{name}/{version}")
    'pkg/git.main_1.0'
   
which should avoid syntax issues when `Spec.prefix` is literally copied into a
Makefile as sometimes happens in AutotoolsPackage or MakefilePackage
2023-10-17 20:33:59 +02:00
Rocco Meli
49ea0a8e2e Add mpi_f08 variant to CP2K (#40574)
* add mpi_f08 variant
* add conflict
* add conflict with released versions of cp2k and +mpi_f08
2023-10-17 11:33:13 -07:00
Adam J. Stewart
d317ddfebe py-rtree: add v1.1.0 (#40575) 2023-10-17 11:25:18 -07:00
Cameron Rutherford
b1eef4c82d hiop: 1.0.1 release (#40580) 2023-10-17 11:14:17 -07:00
Harmen Stoppels
a4ad365de0 patchelf: fix compilation with GCC 7 (#40581) 2023-10-17 11:12:09 -07:00
wspear
8c257d55b4 Support apple-clang in pdt (#40582) 2023-10-17 11:23:15 -06:00
Harmen Stoppels
bd165ebc4d Support spack env activate --with-view <name> <env> (#40549)
Currently `spack env activate --with-view` exists, but is a no-op.

So, it is not too much of a breaking change to make this redundant flag
accept a value `spack env activate --with-view <name>` which activates
a particular view by name.

The view name is stored in `SPACK_ENV_VIEW`.

This also fixes an issue where deactivating a view that was activated
with `--without-view` possibly removes entries from PATH, since now we
keep track of whether the default view was "enabled" or not.
2023-10-17 15:40:48 +02:00
Massimiliano Culpo
348e5cb522 packages: use "requires" to allow only selected compilers (#40567)
A few packages have encoded an idiom that pre-dates the introduction
of the 'requires' directive, and they cycle over all compilers
to conflict with the ones that are not supported.

Here instead we reverse the logic, and require the ones that
are supported.
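
As a sketch of the reversed idiom (not any particular package's code), a package.py can state the requirement directly:

```python
# Instead of one conflicts() per unsupported compiler, require a supported one.
requires(
    "%gcc", "%clang",
    policy="one_of",
    msg="this package builds only with GCC or Clang",
)
```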
2023-10-17 08:38:06 +02:00
Patrick Bridges
7cc17f208c Creation of Beatnik package and associated updates to silo and cabana spack package (#40382)
* Added initial package for building Beatnik with spack

* Fixed github ID for Jason as a maintainer.

* Major revision of beatnik spack package to properly support GPU spack builds with CUDA (and ROCm, though that is untested)

* Marked that beatnik 1.0 will require cabana 0.6.0. We will wait for the cabana 0.6.0 release before we release beatnik

* Update to beatnik package spec to compile with hipcc when +rocm

* Updated spack package for cabana for version 0.6.0 and appropriate heffte dependency

* Updated beatnik package to require cabana 0.6.0

* More updates to cabana and beatnik to build with cabana 0.6.0

* Finish removing BLT dependence from beatnik

* More updates to beatnik package for compiling on cray systems

* Updated beatnik package for new cabana package

* Changes to silo package for new silo version

* Fixed version specs for heffte to be able to concretize and build

* Fixed spack style issues for beatnik and silo packages

* More spack formatting fixes to beatnik and silo

* Patrick adopting silo package as maintainer for now

* Should address final style changes to beatnik package spec

* Yet more style fixes.

* Perhaps this is the final style fixes? :)

* Minor fix to cabana package on required versions
2023-10-16 21:13:31 -06:00
Eric Berquist
2913cd936a Add latest versions of rlwrap (#40563)
* Add latest versions of rlwrap
* rlwrap: fix URL for v0.46.1
2023-10-16 20:28:51 -06:00
Stephen Sachs
361a185ddb intel-oneapi-compilers: ifx is located in bin not bin/intel64 (#40561)
This is a fix on top of https://github.com/spack/spack/pull/40557 .
Tagging @rscohn2 for review.
2023-10-16 20:23:46 -06:00
Seth R. Johnson
9d5615620a py-furo: new version (#40559) 2023-10-16 17:10:31 -04:00
Adam J. Stewart
ae185087e7 py-grayskull: add new package (#40293)
* py-grayskull: add new package

* [@spackbot] updating style on behalf of adamjstewart

---------

Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-10-16 15:37:50 -05:00
Massimiliano Culpo
4a96d29e69 Use string representation of deptypes for concrete specs (#40566) 2023-10-16 22:36:22 +02:00
Adam J. Stewart
1e44f33163 py-fiona: add v1.9.5 (#40497) 2023-10-16 15:34:13 -05:00
Adam J. Stewart
348493abcd py-shapely: add v2.0.2 (#40523) 2023-10-16 15:33:56 -05:00
Adam J. Stewart
2bc4bfa877 py-grpcio: cython 3 still not supported (#40537) 2023-10-16 15:33:32 -05:00
Adam J. Stewart
3e3b287761 py-lightning: add v2.1.0 (#40496) 2023-10-16 14:14:46 -05:00
renjithravindrankannath
74bbb1ef1b Updating patch to enable flag mcode-object-version=none (#40367)
* Updating patch to add flag mcode-object-version=none when
   device libs is built as part of llvm-amdgpu
* Limiting patch to +rocm-device-libs variant and adding
   appropriate comment for the patch
* Updating llvmpatch as per the mainline code
   Updating hsa-rocr patch as per the latest code
   Updating the if elif condition for the hip test src path
* Updating flags for 5.5 releases and above
* Updating build flags and patches
2023-10-16 10:25:10 -07:00
Dom Heinzeller
22405fbb68 Fix version incompatibilities of py-pandas and py-openpyxl (#40472)
* Fix version incompatibilities of py-pandas and py-openpyxl

* Add variant excel for py-pandas

* Add package py-pyxlsb

* Add versions for py-xlsxwriter

* Define excel dependencies for py-pandas 1.4, 1.5, 2.0, 2.1

* Fix variant excel in py-pandas

* Add package py-odfpy, which is also a dependency for py-pandas@2.0:

* Rearrange excel dependencies for py-pandas

* Change url to pypi

* Add missing newline to fix style in py-odfpy
2023-10-16 10:28:38 -06:00
Diego Alvarez S
14d935bd6c Add nextflow 23.10.0 (#40547) 2023-10-16 09:06:36 -07:00
Garth N. Wells
363b9d3c7b fenics-basix: update for v0.7 (#40440)
* Update for Basix 0.7

* Version fix for nanobind dependency

* Simplification

* Version update and simplify dependencies

* Add comment on location of pyproject.toml

* Update var/spack/repos/builtin/packages/py-fenics-basix/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-16 08:53:02 -06:00
Stephen Sachs
8347ae3766 intel-oneapi-compilers: ifx uses --gcc-name & --gxx-name (#40557)
`ifx` uses the older syntax instead of `--gcc-toolchain`. Tested up to version
2023.2.0.
2023-10-16 10:24:21 -04:00
Garth N. Wells
1106f6b9f2 py-fenics-ufl: update version and add test (#40534)
* Update py-ufl version

* Syntax fix

* Syntax fix

* Add test

* Updates following comments
2023-10-16 06:52:59 -05:00
Lydéric Debusschère
e22117304e [add] py-cylc-uiserver: new recipe (#39983)
* [add] py-cylc-uiserver: new recipe

* py-cylc-uiserver: remove version constraint on the python dependency

* [fix] py-cylc-uiserver: add forgotten dependency py-graphql-core

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-16 06:51:11 -05:00
Vanessasaurus
10999c0283 fix: flux-core needs libarchive with +iconv (#40541)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2023-10-15 17:43:10 -06:00
Todd Gamblin
7adeee0980 README.md: tweak matrix description to indicate bridging (#40540)
This tweaks the matrix description to indicate that it's bridged with Slack. So people
don't think they're missing out (even though the icon says there are only 3 users on
Matrix).
2023-10-15 22:48:05 +00:00
Harmen Stoppels
a9cfa32c34 spack checksum: handle all versions dropped better (#40530)
* spack checksum: fix error when all versions are dropped

* add test
2023-10-15 15:08:11 -07:00
Garth N. Wells
718aa8b82f Version update and simplify dependencies (#40543) 2023-10-15 15:53:03 -06:00
Adam J. Stewart
dbf3bed380 py-torchdata: version rename (#40522) 2023-10-15 13:22:49 -07:00
Adam J. Stewart
ef55c7c916 Python: allow OneAPI 2024 when it's released (#40536) 2023-10-15 11:18:04 -06:00
Adam J. Stewart
2015d3d2bc py-click: fix Python 3.6 support (#40535) 2023-10-15 19:16:51 +02:00
Alec Scott
76bac6d4bf Add matrix space link and badge to README (#40532) 2023-10-15 09:28:08 +00:00
Miroslav Stoyanov
b960d476e3 tasmanian: patch for clang17 (#40515) 2023-10-15 00:07:15 -05:00
Veselin Dobrev
8a311d7746 mfem: add a patch for v4.6 for gcc 13, see mfem PR 3903 (#40495) 2023-10-15 00:01:53 -05:00
Miroslav Stoyanov
39d2baec8a heffte: fix rocm deps (#40514) 2023-10-14 23:55:01 -05:00
Dom Heinzeller
26e063177d Bug fixes in py-awscrt to fix build errors reported in #40386 (#40469)
* Bug fix in var/spack/repos/builtin/packages/py-awscrt/package.py: on Linux, tell aws-crt-python to use libcrypto from spack (openssl)

* Bug fix in var/spack/repos/builtin/packages/py-awscrt/package.py: add missing build dependencies cmake (for all), openssl (for linux)

* Update var/spack/repos/builtin/packages/py-awscrt/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-14 21:02:41 +00:00
Alex Richert
149d1946ee Add static support for proj (#40322)
* Add static-only option for proj

* Update proj

* update proj

* Update package.py

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update package.py

* proj: Add pic and static variant support for cmake
2023-10-14 13:09:24 -05:00
Manuela Kuhn
3604f6238d py-pydicom: add 2.4.3 (#40487) 2023-10-14 13:06:51 -05:00
Manuela Kuhn
2ad9470670 py-pybids: add 0.16.3 (#40486) 2023-10-14 13:05:36 -05:00
Manuela Kuhn
8dde74854a py-cfgv: add 3.4.0 (#40465) 2023-10-14 13:03:43 -05:00
Manuela Kuhn
fa5aadbbc0 py-charset-normalizer: add 3.3.0 (#40471) 2023-10-14 12:56:59 -05:00
Manuela Kuhn
0989cb8866 py-click: add 8.1.7 (#40473) 2023-10-14 12:56:14 -05:00
Manuela Kuhn
b536260eb5 py-chardet: add 5.2.0 (#40468) 2023-10-14 12:47:01 -05:00
Manuela Kuhn
8f2de4663e py-certifi: add 2023.7.22 (#40445) 2023-10-14 12:41:34 -05:00
Manuela Kuhn
0693892521 py-bidskit: add 2023.9.7 (#40444) 2023-10-14 12:40:40 -05:00
Manuela Kuhn
3be78717d2 py-pyqt6: add 6.5.2 (#40413) 2023-10-14 12:34:45 -05:00
Sam Gillingham
655d123785 py-python-fmask: update to latest versions (#40378)
* tidy and add new version

* add comment about dependencies

* whitespace
2023-10-14 12:28:51 -05:00
Lydéric Debusschère
b8cb36ce50 [add] py-graphene-tornado: new recipe, required by py-cylc-uiserver (#39985)
* [add] py-graphene-tornado: new recipe, required by py-cylc-uiserver

* py-graphene-tornado: take review feedback into account

* py-graphene-tornado: add run type to dependencies py-jinja, py-tornado and py-werkzeug

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-14 12:25:02 -05:00
Manuela Kuhn
bc3cd02776 py-urllib3: add 2.0.6 (#40207)
* py-urllib3: add 2.0.5

* Add py-brotli package

* Group brotli dependencies and make limits more specific

* Add minimum version limits to variants

* Remove python upper limit for py-brotli

* Fix restrictions for py-brotli dependency

* Fix py-brotli dependency

* py-urllib3: add 2.0.6
2023-10-14 12:23:54 -05:00
Seth R. Johnson
a027adcaa2 cpr: new package (#40509)
* New package: cpr

* Support libcpr version 1.9

* Fix build phase for git

* Update var/spack/repos/builtin/packages/cpr/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* [@spackbot] updating style on behalf of sethrj

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-10-14 08:31:53 -07:00
Harmen Stoppels
3783032d28 screen: add v4.9.1 (#40529) 2023-10-14 08:29:55 -07:00
Harmen Stoppels
c0ac5e3f6b git: add 2.42 (#40528) 2023-10-14 08:28:52 -07:00
Michael Kuhn
87371d58d5 libbson, mongo-c-driver: add 1.24.4 (#40518) 2023-10-14 11:38:00 +02:00
Michael Kuhn
ef11fd7f75 libfuse: add 3.16.2 (#40519) 2023-10-14 11:37:07 +02:00
Michael Kuhn
d0f046e788 mariadb-c-client: add 3.3.7 (#40521) 2023-10-14 11:35:46 +02:00
Michael Kuhn
794fb9b252 rocksdb: add 8.6.7 (#40525) 2023-10-14 11:34:32 +02:00
Michael Kuhn
86c7d646c3 Fix pkgconfig dependencies (#40524)
`pkgconfig` is the correct virtual dependency.
2023-10-14 11:33:36 +02:00
Michael Kuhn
7d96077667 glib: add 2.78.0, 2.76.6 (#40517) 2023-10-14 11:33:16 +02:00
Michael Kuhn
a6fbfedc08 sqlite: add 3.43.2 (#40520) 2023-10-14 11:32:25 +02:00
Alex Richert
a6cfeabc10 cairo: add shared and pic variants (#40302) 2023-10-13 14:21:43 -06:00
Martin Aumüller
a3a29006aa wayland: dot is a build dependency (#39854)
* wayland: dot is a build dependency

otherwise this build failure happens:
../spack-src/doc/meson.build:5:6: ERROR: Program 'dot' not found or not executable

* wayland: make building of documentation optional

renders several dependencies optional
2023-10-13 21:57:13 +02:00
Harmen Stoppels
a5cb7a9816 spack checksum: improve interactive filtering (#40403)
* spack checksum: improve interactive filtering

* fix signature of executable

* Fix restart when using editor

* Don't show [x version(s) are new] when no known versions (e.g. in spack create <url>)

* Test ^D in test_checksum_interactive_quit_from_ask_each

* formatting

* colorize / skip header on invalid command

* show original total, not modified total

* use colify for command list

* Warn about possible URL changes

* show possible URL change as comment

* make mypy happy

* drop numbers

* [o]pen editor -> [e]dit
2023-10-13 19:43:22 +00:00
Gabriel Cretin
edf4aa9f52 Fpocket: fix installation (#40499)
* Fpocket: fix edit() positional args + add install()

* Remove comments

* Fix line too long

* Fix line too long

* Remove extension specification in version

Co-authored-by: Alec Scott <alec@bcs.sh>

* Use f-strings

Co-authored-by: Alec Scott <alec@bcs.sh>

* Fix styling

* Use the default MakefilePackage install stage

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-10-13 11:30:20 -07:00
Dom Heinzeller
02c680ec3a texinfo package: fix external detection (#40470)
A complete texinfo install includes both `info` and `makeinfo`. Some
system installations of texinfo may exclude one or the other. This
updates the external finding logic to require both.
2023-10-13 10:39:08 -07:00
David Huber
8248e180ca Add gsi-ncdiag v1.1.2 (#40508) 2023-10-13 08:48:23 -07:00
Harmen Stoppels
c9677b2465 Expand multiple build systems section (#39589)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-13 14:59:44 +02:00
Massimiliano Culpo
3752fe9e42 Better error message when wrong platform is used (#40492)
fixes #40299
2023-10-13 11:18:55 +02:00
Matthew Chan
8a0de10f60 containerize: update docs to activate env before using container templates (#40493) 2023-10-13 06:59:44 +00:00
Adam J. Stewart
6aa8d76e32 py-cmocean: add v3.0.3 (#40482) 2023-10-12 20:32:48 -06:00
Nils Vu
fb1d0f60d9 catch2: add +pic and +shared options (#40337)
Also add latest version
2023-10-12 20:13:01 -06:00
Martin Aumüller
728eaa515f ospray: new versions 2.11.0 and 2.12.0 (#40394)
* openimagedenoise: checksum 2.0.1
* ospray: new versions 2.11.0 and 2.12.0
  - both depend on embree@4
  - also update dependency versions for rkcommon, openvkl, openimagedenoise and ispc
  - expose that dependency on openvkl is optional since @2.11 with variant "volumes"
* ospray: limit embree to @3 for ospray @:2.10
2023-10-12 14:13:15 -07:00
Julius Plehn
7c354095a9 Updates Variorum to 0.7.0 (#40488) 2023-10-12 11:27:06 -06:00
Harmen Stoppels
64ef33767f modules:prefix_inspections: allow empty dict (#40485)
Currently

```
modules:
  prefix_inspections:: {}
```

gives you the builtin defaults instead of no mapping.
2023-10-12 09:28:16 -07:00
Dennis Klein
265432f7b7 libzmq: Revert "libzmq: make location of libsodium explicit (#34553)" (#40477)
and make variants independent of upstream defaults
2023-10-12 09:15:00 -06:00
dependabot[bot]
aa7dfdb5c7 build(deps): bump python-levenshtein in /lib/spack/docs (#40461)
Bumps [python-levenshtein](https://github.com/maxbachmann/python-Levenshtein) from 0.22.0 to 0.23.0.
- [Release notes](https://github.com/maxbachmann/python-Levenshtein/releases)
- [Changelog](https://github.com/maxbachmann/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/maxbachmann/python-Levenshtein/compare/v0.22.0...v0.23.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-12 14:38:33 +00:00
Alec Scott
bfe37435a4 go: add v1.21.3 and deprecate previous versions due to CVE-2023-39533 (#40454) 2023-10-12 07:40:04 -06:00
Adam J. Stewart
285a50f862 PyTorch: fix build with Xcode 15 (#40460) 2023-10-12 07:27:03 -06:00
Harmen Stoppels
995e82e72b gettext: Add 0.22.3 and fix keyerror: "shared" (#39423)
After the merge of #37957 (Add static and pic variants), if a gettext install
from a build before that merge is present, building any package using gettext
fails with keyerror: "shared" because the use of self.spec.variants["shared"]
does not check for the presence of the new variant in the old installation
but expects that the new key variants["shared"] exists always.

Fix it with a fallback to the default of True and update gettext to v22.3
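
The fallback amounts to something like this standalone sketch (a plain dict stands in for the spec's variants):

```python
# Sketch: tolerate installs that predate the "shared" variant by falling back
# to its default value of True instead of raising KeyError.
def wants_shared(variants: dict) -> bool:
    return bool(variants["shared"]) if "shared" in variants else True

print(wants_shared({}))                 # old install, variant missing -> True
print(wants_shared({"shared": False}))  # new install built with ~shared -> False
```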

Co-authored-by: Bernharad Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2023-10-12 04:40:38 -06:00
Massimiliano Culpo
3935e047c6 Remove deprecated "extra_instructions" option for containers (#40365) 2023-10-12 12:12:15 +02:00
Harmen Stoppels
0fd2427d9b clingo: fix build with Python 3.12 (#40154) 2023-10-12 12:11:22 +02:00
Alec Scott
30d29d0201 acfl: use f-strings (#40433) 2023-10-12 11:30:22 +02:00
Tim Haines
3e1f2392d4 must: add versions 1.8.0 and 1.9.0 (#40141) 2023-10-11 22:34:22 -07:00
Lydéric Debusschère
6a12a40208 [add] py-cylc-rose: new recipe (#39980)
* [add] py-cylc-rose: new recipe

* py-cylc-rose: update recipe

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-11 22:31:46 -07:00
Leonhard Reichenbach
90e73391c2 opendatadetector: add version v3.0.0 (#39693) 2023-10-11 22:29:00 -07:00
Alec Scott
deec1b7c2e adios2: use f-strings (#40437) 2023-10-11 21:08:00 -06:00
Scott Wittenburg
d9cb1a1070 buildcache: Tell servers not to cache index or hash (#40339) 2023-10-11 18:13:57 -06:00
afzpatel
01747b50df fix ck build for 5.6.1 (#40304)
* initial commit to fix ck build for 5.6.1
* disable mlir for miopen-hip
* use satisfies for checking specs and add nlohmann-json dependency for 5.4 onwards
2023-10-11 14:43:21 -07:00
Alex Richert
df01a11e07 apr-util: Fix spack install apr-util +crypto ^openssl~shared (#40301) 2023-10-11 23:38:28 +02:00
Victor Brunini
7a4b479724 cmake: drop CMAKE_STATIC_LINKER_FLAGS (#40423)
Because those end up being passed to ar which does not understand linker
arguments. This was making ldflags largely unusable for statically
linked cmake packages.
2023-10-11 23:30:44 +02:00
Harmen Stoppels
89e34d56a1 curl: add v8.4.0, allow r@8.3: to use it (#40442)
* curl: 8.4.0

* fix r curl upperbound range
2023-10-11 21:04:09 +02:00
Miroslav Stoyanov
a5853ee51a update for the tasmanian versions (#40453) 2023-10-11 12:33:31 -06:00
Alec Scott
537ab48167 acts: use f-strings (#40434) 2023-10-11 11:03:30 -07:00
Alec Scott
e43a090877 abduco: use f-string (#40432) 2023-10-11 19:51:26 +02:00
Alec Scott
275a2f35b5 adiak: use f-strings (#40435) 2023-10-11 10:48:39 -07:00
Alec Scott
dae746bb96 abacus: use f-string (#40431) 2023-10-11 19:48:32 +02:00
Alec Scott
3923b81d87 adios: use f-strings (#40436) 2023-10-11 10:47:42 -07:00
Alec Scott
5d582a5e48 7zip: use f-strings (#40430) 2023-10-11 19:46:23 +02:00
Alec Scott
7dbc712fba 3proxy: use f-strings (#40429) 2023-10-11 19:44:57 +02:00
Alec Scott
639ef9e24a adol-c: use f-strings (#40438) 2023-10-11 10:43:31 -07:00
Alex Richert
86d2200523 krb5: Fix spack install krb5 ^openssl~shared (#40306) 2023-10-11 19:37:48 +02:00
Massimiliano Culpo
fe6860e0d7 cmake: add v3.27.7 (#40441) 2023-10-11 10:20:49 -07:00
Manuela Kuhn
8f2e68aeb8 c-blosc2: add 2.10.5 (#40428) 2023-10-11 19:19:58 +02:00
Laura Weber
bc4c887452 tecplot: Add version 2023r1 (#40425) 2023-10-11 10:05:37 -07:00
Alec Scott
b3534b4435 restic: add v0.16.0 (#40439) 2023-10-11 19:04:26 +02:00
Massimiliano Culpo
861bb4d35a Update bootstrap buildcache to support Python 3.12 (#40404)
* Add support for Python 3.12
* Use optimized build of clingo
2023-10-11 19:03:17 +02:00
Harmen Stoppels
65e7ec0509 spider: respect <base> tag (#40443) 2023-10-11 08:49:50 -07:00
Manuela Kuhn
1ab8886695 qt-base: fix-build without opengl (#40421) 2023-10-11 08:56:59 -04:00
dependabot[bot]
26136c337f build(deps): bump mypy from 1.5.1 to 1.6.0 in /.github/workflows/style (#40422)
Bumps [mypy](https://github.com/python/mypy) from 1.5.1 to 1.6.0.
- [Commits](https://github.com/python/mypy/compare/v1.5.1...v1.6.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-11 06:35:51 -06:00
dependabot[bot]
e3b71b32aa build(deps): bump mypy from 1.5.1 to 1.6.0 in /lib/spack/docs (#40424)
Bumps [mypy](https://github.com/python/mypy) from 1.5.1 to 1.6.0.
- [Commits](https://github.com/python/mypy/compare/v1.5.1...v1.6.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-11 12:35:41 +00:00
Alec Scott
6d1711f4c2 Update legacy .format() calls to fstrings in installer.py (#40426) 2023-10-11 12:35:37 +00:00
Massimiliano Culpo
26f291ef25 spack buildcache: fix a typo in a function call (#40446)
fixes #40415
2023-10-11 13:09:21 +02:00
Martin Aumüller
da030617a1 botan: checksum 3.2.0 (#40417) 2023-10-10 21:52:51 -06:00
Martin Aumüller
1ebfcd3b18 openvkl: add 1.3.2 (#40392)
* openvkl: add 1.3.2

works with (and requires) embree@4

* openvkl: simplify formatting with f-string

thank you for the suggestion in the review
2023-10-10 21:24:17 -06:00
Edward Hartnett
d385a57da3 w3emc: add v2.11.0 (#40376)
* added version 2.11.0

* more fixes
2023-10-10 15:11:14 -07:00
eugeneswalker
37df8bfc73 e4s arm stack: duplicate and target neoverse v1 (#40369)
* e4s arm stack: duplicate and target both neoverse n1, v1

* remove neoverse_n1 target until issue #40397 is resolved
2023-10-10 14:32:51 -07:00
Adam J. Stewart
b781a530a1 GCC: fix build with Apple Clang 15 (#40318) 2023-10-10 22:35:15 +02:00
Harmen Stoppels
390b0aa25c More helpful error when patch lookup fails (#40379) 2023-10-10 21:09:04 +02:00
Adam J. Stewart
620835e30c py-jupyter-packaging: remove duplicate packages (#38671)
* py-jupyter-packaging: remove duplicate packages

* Allow py-jupyter-packaging to be duplicated in DAG

* Deprecate version of py-jupyterlab that requires py-jupyter-packaging at run-time
2023-10-10 13:50:22 -05:00
Adam J. Stewart
da10487219 Update PyTorch ecosystem (#40321)
* Update PyTorch ecosystem

* py-pybind11: better documentation of supported compilers

* py-torchdata: add v0.7.0

* Black fixes

* py-torchtext: fix Python reqs

* py-horovod: py-torch 2.1.0 not yet supported
2023-10-10 13:45:32 -05:00
Miroslav Stoyanov
4d51810888 find rocm fix (#40388)
* find rocm fix
* format fix
* style fix
* formatting is broken
2023-10-10 10:42:26 -07:00
Harmen Stoppels
6c7b2e1056 git: optimize build by not setting CFLAGS (#40387) 2023-10-10 11:48:06 +02:00
Carlos Bederián
749e99bf11 python: add 3.11.6 (#40384) 2023-10-10 09:53:04 +02:00
Martin Aumüller
6db8e0a61e embree: checksum 4.3.0 (#40395) 2023-10-09 20:44:10 -06:00
Martin Aumüller
6fe914421a rkcommon: checksum 0.11.0 (#40391) 2023-10-09 20:43:56 -06:00
Andrew W Elble
9275f180bb openmm: new version 8.0.0 (#40396) 2023-10-09 20:33:40 -06:00
Tom Epperly
2541b42fc2 Add a new release sha256 hash (#37680) 2023-10-09 20:28:45 -06:00
Auriane R
fb340f130b Add pika 0.19.1 (#40385) 2023-10-09 20:23:54 -06:00
Dennis Klein
d2ddd99ef6 libzmq: add v4.3.5 (#40383) 2023-10-09 17:30:46 -06:00
kenche-linaro
492a8111b9 linaro-forge: added package file for rebranded product (#39587) 2023-10-09 12:14:14 -07:00
George Young
d846664165 paintor: new package @3.0 (#40359)
* paintor: new package @3.0
* Update var/spack/repos/builtin/packages/paintor/package.py
---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-09 09:54:44 -07:00
Vanessasaurus
31b3e4898b Add: flux-pmix 0.4.0 (#40323)
* Automated deployment to update package flux-pmix 2023-10-05

* Pin exactly to flux-core 0.49.0 when between 0.3 and 0.4

* Update var/spack/repos/builtin/packages/flux-pmix/package.py

Co-authored-by: Mark Grondona <mark.grondona@gmail.com>

* Update var/spack/repos/builtin/packages/flux-pmix/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: Mark Grondona <mark.grondona@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-09 09:29:18 -07:00
Harmen Stoppels
82f1267486 unparse: drop python 3.3 remnants (#40331) 2023-10-09 08:22:27 -07:00
Harmen Stoppels
19202b2528 docs: update Spack prerequisites (#40381) 2023-10-09 13:41:36 +00:00
Adam J. Stewart
831cbec71f py-pydevtool: add new package (#40377) 2023-10-09 15:09:59 +02:00
Patrick Broderick
bb2ff802e2 fftx: add v1.1.3 (#40283) 2023-10-09 15:06:59 +02:00
dependabot[bot]
83e9537f57 build(deps): bump python-levenshtein in /lib/spack/docs (#40220)
Bumps [python-levenshtein](https://github.com/maxbachmann/python-Levenshtein) from 0.21.1 to 0.22.0.
- [Release notes](https://github.com/maxbachmann/python-Levenshtein/releases)
- [Changelog](https://github.com/maxbachmann/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/maxbachmann/python-Levenshtein/compare/v0.21.1...v0.22.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-09 15:02:14 +02:00
Gavin John
3488e83deb py-s3cmd: Add new versions (#40212)
* Add new versions of py-s3cmd

* Use correct hashes
2023-10-09 07:46:38 -05:00
Jacob King
c116eee921 nimrod-aai: add v23.9. (#40303) 2023-10-09 14:39:40 +02:00
Adam J. Stewart
9cb291b41b py-jsonargparse: add v4.25.0 (#40185) 2023-10-09 14:33:45 +02:00
Thomas Dickerson
c0f1072dc7 racket packages: fix typo after multiple build systems support (#40088) 2023-10-09 14:21:13 +02:00
Jordan Galby
3108036533 elfutils: fix +debuginfod again with new libarchive versions (#40314) 2023-10-09 14:17:58 +02:00
Mike Renfro
215c699307 velvet: improved variants (#40225) 2023-10-09 12:47:54 +02:00
jmuddnv
f609093c6e Adding NVIDIA HPC SDK 23.9 (#40371) 2023-10-09 12:35:59 +02:00
Joe Schoonover
eb4fd98f09 feq-parse: add v2.0.3 (#40230) 2023-10-09 12:31:23 +02:00
Harmen Stoppels
08da9a854a parser: use non-capturing groups (#40373) 2023-10-09 07:18:27 +02:00
Mark (he/his) C. Miller
3a18fe04cc Update CITATION.cff with conf dates (#40375)
Add `start-date` and `end-date` to citation
2023-10-08 18:04:25 -07:00
Harmen Stoppels
512e41a84a py-setuptools: sdist + rpath patch backport (#40205) 2023-10-08 11:47:36 -06:00
Alex Richert
8089aedde1 gettext: Add static and pic options (#37957) 2023-10-08 17:42:21 +02:00
Manuela Kuhn
6b9e103305 py-tables: add 3.9.0 (#40340)
* py-tables: add 3.9.0

* Add conflict with apple-clang
2023-10-08 10:28:13 -05:00
Jen Herting
00396fbe6c [py-tokenizers] added version 0.13.3 (#40360) 2023-10-08 10:27:18 -05:00
Jen Herting
a3be9cb853 [py-tensorboardx] Added version 2.6.2.2 (#39731)
* [py-tensorboardx] Added version 2.6.2.2

* [py-tensorboardx] flake8

* [py-tensorboardx] requires py-setuptools-scm
2023-10-08 10:21:55 -05:00
Jen Herting
81f58229ab py-torch-sparse: add v0.6.17 (#39495)
* [py-torch-sparse] New version 0.6.17

* [py-torch-sparse] added dependency on parallel-hashmap

* [py-torch-sparse]

- spack only supports python@3.7:
- py-pytest-runner only needed with old versions
2023-10-08 10:20:46 -05:00
Jen Herting
2eb16a8ea2 [py-lvis] New package (#39080)
* [py-lvis] New package

* [py-lvis] flake8

* [py-lvis] os agnostic

* [py-lvis] added comment for imported dependency

* [py-lvis] style fix
2023-10-08 10:18:01 -05:00
Manuela Kuhn
9db782f8d9 py-bids-validator: add 1.13.1 (#40356)
* py-bids-validator: add 1.13.1

* Fix style
2023-10-08 10:16:08 -05:00
Lydéric Debusschère
633df54520 [add] py-cylc-flow: new recipe (#39986)
* [add] py-cylc-flow: new recipe

* py-cylc-flow: fix py-protobuf version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cylc-flow: fix py-colorama version

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylc-flow: Update dependence on py-aiofiles

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylc-flow: Update dependence on py-pyzmq

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylcflow: remove useless dependence

* py-cylc-flow: fix indent

* py-cylc-flow: fix argument in depends_on; move lines

* py-cylc-flow: fix the type of the dependence py-setuptools

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-08 10:06:13 -05:00
Mark (he/his) C. Miller
e2a7f2ee9a Update CITATION.cff (#40363)
You will note the `Cite this repository` link is not working.

This commit fixes the underlying file...

* `authors` was not indented
* `authors` required by `preferred-citation`
* `authors` list required at top level (I simply duplicated)
* `"USA"` not correct country code
* `month` requires an integer month number
* Added URL to the actual pdf of the cited paper
* Used `identifiers` for doi and LLNL doc number
* added `abstract` copied from paper

Various fixes were confirmed by `cffconvert` using `docker run -v `pwd`:/app citationcff/cffconvert --validate`
2023-10-07 17:44:31 -07:00
1248 changed files with 31656 additions and 9515 deletions

View File

@@ -22,8 +22,8 @@ jobs:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages

View File

@@ -24,7 +24,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -42,8 +42,8 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -62,7 +62,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -80,8 +80,8 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -99,7 +99,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -133,7 +133,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup repo
@@ -145,8 +145,8 @@ jobs:
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -158,13 +158,16 @@ jobs:
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: "3.12"
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
export PATH=/usr/local/opt/bison@2.7/bin:$PATH
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -179,7 +182,7 @@ jobs:
run: |
brew install tree
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- name: Bootstrap clingo
run: |
set -ex
@@ -204,7 +207,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup repo
@@ -247,7 +250,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -265,6 +268,7 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
@@ -283,7 +287,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -302,8 +306,8 @@ jobs:
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack -d gpg list
tree ~/.spack/bootstrap/store/
@@ -316,10 +320,11 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
@@ -333,13 +338,13 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack -d gpg list
tree ~/.spack/bootstrap/store/

View File

@@ -38,12 +38,11 @@ jobs:
# Meaning of the various items in the matrix list
# 0: Container name (e.g. ubuntu-bionic)
# 1: Platforms to build for
# 2: Base image (e.g. ubuntu:18.04)
# 2: Base image (e.g. ubuntu:22.04)
dockerfile: [[amazon-linux, 'linux/amd64,linux/arm64', 'amazonlinux:2'],
[centos7, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:7'],
[centos-stream, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream'],
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-bionic, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:18.04'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
@@ -56,20 +55,22 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
container="${{ matrix.dockerfile[0] }}:latest"
echo "container=${container}" >> $GITHUB_ENV
echo "versioned=${container}" >> $GITHUB_ENV
# On a new release create a container with the same tag as the release.
- name: Set Container Tag on Release
if: github.event_name == 'release'
run: |
versioned="${{matrix.dockerfile[0]}}:${GITHUB_REF##*/}"
echo "versioned=${versioned}" >> $GITHUB_ENV
- uses: docker/metadata-action@31cebacef4805868f9ce9a0cb03ee36c32df2ac4
id: docker_meta
with:
images: |
ghcr.io/${{ github.repository_owner }}/${{ matrix.dockerfile[0] }}
${{ github.repository_owner }}/${{ matrix.dockerfile[0] }}
tags: |
type=schedule,pattern=nightly
type=schedule,pattern=develop
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
- name: Generate the Dockerfile
env:
@@ -92,13 +93,13 @@ jobs:
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3 # @v1
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226 # @v1
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226
- name: Log in to GitHub Container Registry
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d # @v1
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -106,21 +107,18 @@ jobs:
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d # @v1
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@0565240e2d4ab88bba5387d719585280857ece09 # @v2
uses: docker/build-push-action@4a13e500e55cf31b7a5d59a38ab2040ab0f42f56
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
push: ${{ github.event_name != 'pull_request' }}
cache-from: type=gha
cache-to: type=gha,mode=max
tags: |
spack/${{ env.container }}
spack/${{ env.versioned }}
ghcr.io/spack/${{ env.container }}
ghcr.io/spack/${{ env.versioned }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}

View File

@@ -35,7 +35,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0

View File

@@ -14,10 +14,10 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,7 +1,7 @@
black==23.9.1
black==23.11.0
clingo==5.6.2
flake8==6.1.0
isort==5.12.0
mypy==1.5.1
mypy==1.7.1
types-six==1.16.21.9
vermin==1.5.2
vermin==1.6.0

View File

@@ -15,7 +15,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11']
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
concretizer: ['clingo']
on_develop:
- ${{ github.ref == 'refs/heads/develop' }}
@@ -45,12 +45,16 @@ jobs:
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.11'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -94,10 +98,10 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -133,7 +137,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -152,10 +156,10 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -185,12 +189,12 @@ jobs:
runs-on: macos-latest
strategy:
matrix:
python-version: ["3.10"]
python-version: ["3.11"]
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -211,7 +215,7 @@ jobs:
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
with:
flags: unittests,macos

View File

@@ -18,8 +18,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: '3.11'
cache: 'pip'
@@ -35,10 +35,10 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: '3.11'
cache: 'pip'
@@ -69,7 +69,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -15,10 +15,10 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages
@@ -39,10 +39,10 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages
@@ -63,10 +63,10 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,3 +1,320 @@
# v0.21.0 (2023-11-11)
`v0.21.0` is a major feature release.
## Features in this release
1. **Better error messages with condition chaining**
In v0.18, we added better error messages that could tell you what problem happened,
but they couldn't tell you *why* it happened. `0.21` adds *condition chaining* to the
solver, and Spack can now trace back through the conditions that led to an error and
build a tree of potential causes and where they came from. For example:
```console
$ spack solve hdf5 ^cmake@3.0.1
==> Error: concretization failed for the following reasons:
1. Cannot satisfy 'cmake@3.0.1'
2. Cannot satisfy 'cmake@3.0.1'
required because hdf5 ^cmake@3.0.1 requested from CLI
3. Cannot satisfy 'cmake@3.18:' and 'cmake@3.0.1
required because hdf5 ^cmake@3.0.1 requested from CLI
required because hdf5 depends on cmake@3.18: when @1.13:
required because hdf5 ^cmake@3.0.1 requested from CLI
4. Cannot satisfy 'cmake@3.12:' and 'cmake@3.0.1
required because hdf5 depends on cmake@3.12:
required because hdf5 ^cmake@3.0.1 requested from CLI
required because hdf5 ^cmake@3.0.1 requested from CLI
```
More details in #40173.
2. **OCI build caches**
You can now use an arbitrary [OCI](https://opencontainers.org) registry as a build
cache:
```console
$ spack mirror add my_registry oci://user/image # Dockerhub
$ spack mirror add my_registry oci://ghcr.io/haampie/spack-test # GHCR
$ spack mirror set --push --oci-username ... --oci-password ... my_registry # set login creds
$ spack buildcache push my_registry [specs...]
```
And you can optionally add a base image to get *runnable* images:
```console
$ spack buildcache push --base-image ubuntu:23.04 my_registry python
Pushed ... as [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack
$ docker run --rm -it [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack
```
This creates a container image from the Spack installations on the host system,
without the need to run `spack install` from a `Dockerfile` or `sif` file. It also
addresses the inconvenience of losing binaries of dependencies when `RUN spack
install` fails inside `docker build`.
Further, the container image layers and build cache tarballs are the same files. This
means that `spack install` and `docker pull` use the exact same underlying binaries.
If you previously used `spack install` inside of `docker build`, this feature helps
you save storage by a factor of two.
More details in #38358.
3. **Multiple versions of build dependencies**
Increasingly, complex package builds require multiple versions of some build
dependencies. For example, Python packages frequently require very specific versions
of `setuptools`, `cython`, and sometimes different physics packages require different
versions of Python to build. The concretizer enforced that every solve was *unified*,
i.e., that there only be one version of every package. The concretizer now supports
"duplicate" nodes for *build dependencies*, but enforces unification through
transitive link and run dependencies. This will allow it to better resolve complex
dependency graphs in ecosystems like Python, and it also gets us very close to
modeling compilers as proper dependencies.
This change required a major overhaul of the concretizer, as well as a number of
performance optimizations. See #38447, #39621.
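A hedged sketch of the related `concretizer.yaml` knob (grounded in the concretizer notes further below; `none` restores the fully unified pre-0.21 behavior):
```yaml
concretizer:
  duplicates:
    # "minimal" (the new default) permits duplicate *build* dependencies only;
    # "none" keeps the old behavior of a single node per package
    strategy: minimal
```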
4. **Cherry-picking virtual dependencies**
You can now select only a subset of virtual dependencies from a spec that may provide
more. For example, if you want `mpich` to be your `mpi` provider, you can be explicit
by writing:
```
hdf5 ^[virtuals=mpi] mpich
```
Or, if you want to use, e.g., `intel-parallel-studio` for `blas` along with an external
`lapack` like `openblas`, you could write:
```
strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas
```
The `virtuals=mpi` is an edge attribute, and dependency edges in Spack graphs now
track which virtuals they satisfied. More details in #17229 and #35322.
Note for packaging: in Spack 0.21 `spec.satisfies("^virtual")` is true if and only if
the package specifies `depends_on("virtual")`. This is different from Spack 0.20,
where depending on a provider implied depending on the virtual provided. See #41002
for an example where `^mkl` was being used to test for several `mkl` providers in a
package that did not depend on `mkl`.
5. **License directive**
Spack packages can now have license metadata, with the new `license()` directive:
```python
license("Apache-2.0")
```
Licenses use [SPDX identifiers](https://spdx.org/licenses), and you can use SPDX
expressions to combine them:
```python
license("Apache-2.0 OR MIT")
```
Like other directives in Spack, it's conditional, so you can handle complex cases like
Spack itself:
```python
license("LGPL-2.1", when="@:0.11")
license("Apache-2.0 OR MIT", when="@0.12:")
```
More details in #39346, #40598.
6. **`spack deconcretize` command**
We are getting close to having a `spack update` command for environments, but we're
not quite there yet. This is the next best thing. `spack deconcretize` gives you
control over what you want to update in an already concrete environment. If you have
an environment built with, say, `meson`, and you want to update your `meson` version,
you can run:
```console
spack deconcretize meson
```
and have everything that depends on `meson` rebuilt the next time you run `spack
concretize`. In a future Spack version, we'll handle all of this in a single command,
but for now you can use this to drop bits of your lockfile and resolve your
dependencies again. More in #38803.
7. **UI Improvements**
The venerable `spack info` command was looking shabby compared to the rest of Spack's
UI, so we reworked it to have a bit more flair. `spack info` now makes much better
use of terminal space and shows variants, their values, and their descriptions much
more clearly. Conditional variants are grouped separately so you can more easily
understand how packages are structured. More in #40998.
`spack checksum` now allows you to filter versions from your editor, or by version
range. It also notifies you about potential download URL changes. See #40403.
8. **Environments can include definitions**
Spack did not previously support using `include:` with the
[definitions](https://spack.readthedocs.io/en/latest/environments.html#spec-list-references)
section of an environment, but now it does. You can use this to curate lists of specs
and more easily reuse them across environments. See #33960.
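For example, a minimal sketch (file and list names are illustrative), assuming the included file carries a top-level `definitions:` section:
```yaml
# definitions.yaml
definitions:
  - compilers: ["%gcc", "%clang"]
```
which an environment's `spack.yaml` can then pull in and reference:
```yaml
spack:
  include:
    - definitions.yaml
  specs:
    - matrix:
        - [zlib]
        - [$compilers]
```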
9. **Aliases**
You can now add aliases to Spack commands in `config.yaml`, e.g. this might enshrine
your favorite args to `spack find` as `spack f`:
```yaml
config:
aliases:
f: find -lv
```
See #17229.
10. **Improved autoloading of modules**
Spack 0.20 was the first release to enable autoloading of direct dependencies in
module files.
The downside of this was that `module avail` and `module load` tab completion would
show users too many modules to choose from, and many users disabled generating
modules for dependencies through `exclude_implicits: true`. Further, it was
necessary to keep hashes in module names to avoid file name clashes.
In this release, you can start using `hide_implicits: true` instead, which exposes
only explicitly installed packages to the user, while still autoloading
dependencies. On top of that, you can safely use `hash_length: 0`, as this config
now only applies to the modules exposed to the user -- you don't have to worry about
file name clashes for hidden dependencies.
Note: for `tcl` this feature requires Modules 4.7 or higher
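A hedged sketch of the corresponding `modules.yaml` settings (key placement shown for a `tcl` module set; adjust for `lmod`):
```yaml
modules:
  default:
    tcl:
      hash_length: 0        # now only affects user-visible module names
      hide_implicits: true  # hide dependency modules, but keep autoloading them
      all:
        autoload: direct
```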
11. **Updated container labeling**
Nightly Docker images from the `develop` branch will now be tagged as `:develop` and
`:nightly`. The `:latest` tag is no longer associated with `:develop`, but with the
latest stable release. Releases will be tagged with `:{major}`, `:{major}.{minor}`
and `:{major}.{minor}.{patch}`. `ubuntu:18.04` has also been removed from the list of
generated Docker images, as it is no longer supported. See #40593.
## Other new commands and directives
* `spack env activate` without arguments now loads a `default` environment that you do
not have to create (#40756).
* `spack find -H` / `--hashes`: a new shortcut for piping `spack find` output to
other commands (#38663)
* Add `spack checksum --verify`, fix `--add` (#38458)
* New `default_args` context manager factors out common args for directives (#39964)
* `spack compiler find --[no]-mixed-toolchain` lets you easily mix `clang` and
`gfortran` on Linux (#40902)
## Performance improvements
* `spack external find` execution is now much faster (#39843)
* `spack location -i` now much faster on success (#40898)
* Drop redundant rpaths post install (#38976)
* ASP-based solver: avoid cycles in clingo using hidden directive (#40720)
* Fix multiple quadratic complexity issues in environments (#38771)
## Other new features of note
* archspec: update to v0.2.2, support for Sapphire Rapids, Power10, Neoverse V2 (#40917)
* Propagate variants across nodes that don't have that variant (#38512)
* Implement fish completion (#29549)
* Can now distinguish between source/binary mirror; don't ping mirror.spack.io as much (#34523)
* Improve status reporting on install (add [n/total] display) (#37903)
## Windows
This release has the best Windows support of any Spack release yet, with numerous
improvements and much larger swaths of tests passing:
* MSVC and SDK improvements (#37711, #37930, #38500, #39823, #39180)
* Windows external finding: update default paths; treat .bat as executable on Windows (#39850)
* Windows decompression: fix removal of intermediate file (#38958)
* Windows: executable/path handling (#37762)
* Windows build systems: use ninja and enable tests (#33589)
* Windows testing (#36970, #36972, #36973, #36840, #36977, #36792, #36834, #34696, #36971)
* Windows PowerShell support (#39118, #37951)
* Windows symlinking and libraries (#39933, #38599, #34701, #38578, #34701)
## Notable refactors
* User-specified flags take precedence over others in Spack compiler wrappers (#37376)
* Improve setup of build, run, and test environments (#35737, #40916)
* `make` is no longer a required system dependency of Spack (#40380)
* Support Python 3.12 (#40404, #40155, #40153)
* docs: Replace package list with packages.spack.io (#40251)
* Drop Python 2 constructs in Spack (#38720, #38718, #38703)
## Binary cache and stack updates
* e4s arm stack: duplicate and target neoverse v1 (#40369)
* Add macOS ML CI stacks (#36586)
* E4S Cray CI Stack (#37837)
* e4s cray: expand spec list (#38947)
* e4s cray sles ci: expand spec list (#39081)
## Removals, deprecations, and syntax changes
* ASP: targets, compilers and providers soft-preferences are only global (#31261)
* Parser: fix ambiguity with whitespace in version ranges (#40344)
* Module file generation is disabled by default; you'll need to enable it to use it (#37258)
* Remove deprecated "extra_instructions" option for containers (#40365)
* Stand-alone test feature deprecation postponed to v0.22 (#40600)
* buildcache push: make `--allow-root` the default and deprecate the option (#38878)
## Notable Bugfixes
* Bugfix: propagation of multivalued variants (#39833)
* Allow `/` in git versions (#39398)
* Fetch & patch: actually acquire stage lock, and many more issues (#38903)
* Environment/depfile: better escaping of targets with Git versions (#37560)
* Prevent "spack external find" to error out on wrong permissions (#38755)
* lmod: allow core compiler to be specified with a version range (#37789)
## Spack community stats
* 7,469 total packages, 303 new since `v0.20.0`
* 150 new Python packages
* 34 new R packages
* 353 people contributed to this release
* 336 committers to packages
* 65 committers to core
# v0.20.3 (2023-10-31)
## Bugfixes
- Fix a bug where `spack mirror set-url` would drop configured connection info (reverts #34210)
- Fix a minor issue with package hash computation for Python 3.12 (#40328)
# v0.20.2 (2023-10-03)
## Features in this release
Spack now supports Python 3.12 (#40155)
## Bugfixes
- Improve escaping in Tcl module files (#38375)
- Make repo cache work on repositories with zero mtime (#39214)
- Ignore errors for newer, incompatible buildcache version (#40279)
- Print an error when git is required, but missing (#40254)
- Ensure missing build dependencies get installed when using `spack install --overwrite` (#40252)
- Fix an issue where Spack freezes when the build process unexpectedly exits (#39015)
- Fix a bug where installation failures cause an unrelated `NameError` to be thrown (#39017)
- Fix an issue where Spack package versions would be incorrectly derived from git tags (#39414)
- Fix a bug triggered when file locking fails internally (#39188)
- Prevent "spack external find" to error out when a directory cannot be accessed (#38755)
- Fix multiple performance regressions in environments (#38771)
- Add more ignored modules to `pyproject.toml` for `mypy` (#38769)
# v0.20.1 (2023-07-10)
## Spack Bugfixes

View File

@@ -27,12 +27,53 @@
# And here's the CITATION.cff format:
#
cff-version: 1.2.0
type: software
message: "If you are referencing Spack in a publication, please cite the paper below."
title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
abstract: >-
Large HPC centers spend considerable time supporting software for thousands of users, but the complexity of HPC software is quickly outpacing the capabilities of existing software management tools.
Scientific applications require specific versions of compilers, MPI, and other dependency libraries, so using a single, standard software stack is infeasible.
However, managing many configurations is difficult because the configuration space is combinatorial in size.
We introduce Spack, a tool used at Lawrence Livermore National Laboratory to manage this complexity.
Spack provides a novel, recursive specification syntax to invoke parametric builds of packages and dependencies.
It allows any number of builds to coexist on the same system, and it ensures that installed packages can find their dependencies, regardless of the environment.
We show through real-world use cases that Spack supports diverse and demanding applications, bringing order to HPC software chaos.
preferred-citation:
title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
type: conference-paper
doi: "10.1145/2807591.2807623"
url: "https://github.com/spack/spack"
url: "https://tgamblin.github.io/pubs/spack-sc15.pdf"
authors:
- family-names: "Gamblin"
given-names: "Todd"
- family-names: "LeGendre"
given-names: "Matthew"
- family-names: "Collette"
given-names: "Michael R."
- family-names: "Lee"
given-names: "Gregory L."
- family-names: "Moody"
given-names: "Adam"
- family-names: "de Supinski"
given-names: "Bronis R."
- family-names: "Futral"
given-names: "Scott"
conference:
name: "Supercomputing 2015 (SC15)"
city: "Austin"
region: "Texas"
country: "US"
date-start: 2015-11-15
date-end: 2015-11-20
month: 11
year: 2015
identifiers:
- description: "The concept DOI of the work."
type: doi
value: 10.1145/2807591.2807623
- description: "The DOE Document Release Number of the work"
type: other
value: "LLNL-CONF-669890"
authors:
- family-names: "Gamblin"
given-names: "Todd"
- family-names: "LeGendre"
@@ -47,12 +88,3 @@ preferred-citation:
given-names: "Bronis R."
- family-names: "Futral"
given-names: "Scott"
title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
conference:
name: "Supercomputing 2015 (SC15)"
city: "Austin"
region: "Texas"
country: "USA"
month: November 15-20
year: 2015
notes: LLNL-CONF-669890

View File

@@ -7,6 +7,7 @@
[![Read the Docs](https://readthedocs.org/projects/spack/badge/?version=latest)](https://spack.readthedocs.io)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Slack](https://slack.spack.io/badge.svg)](https://slack.spack.io)
[![Matrix](https://img.shields.io/matrix/spack-space%3Amatrix.org?label=Matrix)](https://matrix.to/#/#spack-space:matrix.org)
Spack is a multi-platform package manager that builds and installs
multiple versions and configurations of software. It works on Linux,
@@ -62,10 +63,14 @@ Resources:
* **Slack workspace**: [spackpm.slack.com](https://spackpm.slack.com).
To get an invitation, visit [slack.spack.io](https://slack.spack.io).
* [**Github Discussions**](https://github.com/spack/spack/discussions): not just for discussions, also Q&A.
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
* **Matrix space**: [#spack-space:matrix.org](https://matrix.to/#/#spack-space:matrix.org):
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
for Q&A and discussions. Note the pinned discussions for announcements.
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
only for announcements. Please use other venues for discussions.
Contributing
------------------------

View File

@@ -9,15 +9,15 @@ bootstrap:
# may not be able to bootstrap all the software that Spack needs,
# depending on its type.
sources:
- name: 'github-actions-v0.5'
metadata: $spack/share/spack/bootstrap/github-actions-v0.5
- name: 'github-actions-v0.4'
metadata: $spack/share/spack/bootstrap/github-actions-v0.4
- name: 'github-actions-v0.3'
metadata: $spack/share/spack/bootstrap/github-actions-v0.3
- name: 'spack-install'
metadata: $spack/share/spack/bootstrap/spack-install
trusted:
# By default we trust bootstrapping from sources and from binaries
# produced on Github via the workflow
github-actions-v0.5: true
github-actions-v0.4: true
github-actions-v0.3: true
spack-install: true

View File

@@ -229,3 +229,11 @@ config:
flags:
# Whether to keep -Werror flags active in package builds.
keep_werror: 'none'
# A mapping of aliases that can be used to define new commands. For instance,
# `sp: spec -I` will define a new command `sp` that will execute `spec` with
# the `-I` argument. Aliases cannot override existing commands.
aliases:
concretise: concretize
containerise: containerize
rm: remove

View File

@@ -50,4 +50,4 @@ packages:
# Apple bundles libuuid in libsystem_c version 1353.100.2,
# although the version number used here isn't critical
- spec: apple-libuuid@1353.100.2
prefix: /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk
prefix: /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk

View File

@@ -1526,6 +1526,30 @@ any MPI implementation will do. If another package depends on
error. Likewise, if you try to plug in some package that doesn't
provide MPI, Spack will raise an error.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Explicit binding of virtual dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
There are packages that provide more than just one virtual dependency. When interacting with them, users
might want to utilize just a subset of what they could provide, and use other providers for virtuals they
need.
It is possible to be more explicit and tell Spack which dependency should provide which virtual, using a
special syntax:
.. code-block:: console
$ spack spec strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas
Concretizing the spec above produces the following DAG:
.. figure:: images/strumpack_virtuals.svg
:scale: 60 %
:align: center
where ``intel-parallel-studio`` *could* provide ``mpi``, ``lapack``, and ``blas`` but is used only for ``mpi``. The ``lapack``
and ``blas`` dependencies are satisfied by ``openblas``.
^^^^^^^^^^^^^^^^^^^^^^^^
Specifying Specs by Hash
^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -153,18 +153,147 @@ keyring, and trusting all downloaded keys.
List of popular build caches
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
* `Extreme-scale Scientific Software Stack (E4S) <https://e4s-project.github.io/>`_: `build cache <https://oaciss.uoregon.edu/e4s/inventory.html>`_
* `Extreme-scale Scientific Software Stack (E4S) <https://e4s-project.github.io/>`_: `build cache <https://oaciss.uoregon.edu/e4s/inventory.html>`_'
-------------------
Build cache signing
-------------------
By default, Spack will add a cryptographic signature to each package pushed to
a build cache, and verifies the signature when installing from a build cache.
Keys for signing can be managed with the :ref:`spack gpg <cmd-spack-gpg>` command,
as well as ``spack buildcache keys`` as mentioned above.
You can disable signing when pushing with ``spack buildcache push --unsigned``,
and disable verification when installing from any build cache with
``spack install --no-check-signature``.
Alternatively, signing and verification can be enabled or disabled on a per build cache
basis:
.. code-block:: console
$ spack mirror add --signed <name> <url> # enable signing and verification
$ spack mirror add --unsigned <name> <url> # disable signing and verification
$ spack mirror set --signed <name> # enable signing and verification for an existing mirror
$ spack mirror set --unsigned <name> # disable signing and verification for an existing mirror
Or you can directly edit the ``mirrors.yaml`` configuration file:
.. code-block:: yaml
mirrors:
<name>:
url: <url>
signed: false # disable signing and verification
See also :ref:`mirrors`.
----------
Relocation
----------
Initial build and later installation do not necessarily happen at the same
location. Spack provides a relocation capability and corrects for RPATHs and
non-relocatable scripts. However, many packages compile paths into binary
artifacts directly. In such cases, the build instructions of the package would
need to be adjusted for better relocatability.
When using buildcaches across different machines, it is likely that the install
root will be different from the one used to build the binaries.
To address this issue, Spack automatically relocates all paths encoded in binaries
and scripts to their new location upon install.
Note that there are some cases where this is not possible: if binaries are built in
a relatively short path, and then installed to a longer path, there may not be enough
space in the binary to encode the new path. In this case, Spack will fail to install
the package from the build cache, and a source build is required.
To reduce the likelihood of this happening, it is highly recommended to add padding to
the install root during the build, as specified in the :ref:`config <config-yaml>`
section of the configuration:
.. code-block:: yaml
config:
install_tree:
root: /opt/spack
padded_length: 128
.. _binary_caches_oci:
-----------------------------------------
OCI / Docker V2 registries as build cache
-----------------------------------------
Spack can also use OCI or Docker V2 registries such as Dockerhub, Quay.io,
Github Packages, GitLab Container Registry, JFrog Artifactory, and others
as build caches. This is a convenient way to share binaries using public
infrastructure, or to cache Spack built binaries in Github Actions and
GitLab CI.
To get started, configure an OCI mirror using ``oci://`` as the scheme,
and optionally specify a username and password (or personal access token):
.. code-block:: console
$ spack mirror add --oci-username username --oci-password password my_registry oci://example.com/my_image
Spack follows the naming conventions of Docker, with Dockerhub as the default
registry. To use Dockerhub, you can omit the registry domain:
.. code-block:: console
$ spack mirror add --oci-username username --oci-password password my_registry oci://username/my_image
From here, you can use the mirror as any other build cache:
.. code-block:: console
$ spack buildcache push my_registry <specs...> # push to the registry
$ spack install <specs...> # install from the registry
A unique feature of buildcaches on top of OCI registries is that it's incredibly
easy to get a runnable container image with the binaries installed. This
is a great way to make applications available to users without requiring them to
install Spack -- all you need is Docker, Podman or any other OCI-compatible container
runtime.
To produce container images, all you need to do is add the ``--base-image`` flag
when pushing to the build cache:
.. code-block:: console
$ spack buildcache push --base-image ubuntu:20.04 my_registry ninja
Pushed to example.com/my_image:ninja-1.11.1-yxferyhmrjkosgta5ei6b4lqf6bxbscz.spack
$ docker run -it example.com/my_image:ninja-1.11.1-yxferyhmrjkosgta5ei6b4lqf6bxbscz.spack
root@e4c2b6f6b3f4:/# ninja --version
1.11.1
If ``--base-image`` is not specified, distroless images are produced. In practice,
you won't be able to run these as containers, since they don't come with libc and
other system dependencies. However, they are still compatible with tools like
``skopeo``, ``podman``, and ``docker`` for pulling and pushing.
.. note::
The docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
``max depth exceeded`` error may be produced when pulling the image. There
are `alternative drivers <https://docs.docker.com/storage/storagedriver/>`_.
------------------------------------
Spack build cache for GitHub Actions
------------------------------------
To significantly speed up Spack in GitHub Actions, binaries can be cached in
GitHub Packages. This service is an OCI registry that can be linked to a GitHub
repository.
Spack offers a public build cache for GitHub Actions with a set of common packages,
which lets you get started quickly. See the following resources for more information:
* `spack/setup-spack <https://github.com/spack/setup-spack>`_ for setting up Spack in GitHub
Actions
* `spack/github-actions-buildcache <https://github.com/spack/github-actions-buildcache>`_ for
more details on the public build cache
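As a rough sketch only (the ``spack/setup-spack`` tag and inputs below are assumptions; consult the repositories above for current usage), a workflow job could populate such a cache like this:

.. code-block:: yaml

   name: build
   on: push
   jobs:
     build:
       runs-on: ubuntu-latest
       permissions:
         packages: write
       steps:
         - uses: actions/checkout@v4
         # installs Spack and puts it on PATH; the tag is illustrative
         - uses: spack/setup-spack@v2
         - name: Install and push binaries to GitHub Packages
           run: |
             spack mirror add --oci-username ${{ github.actor }} --oci-password ${{ secrets.GITHUB_TOKEN }} ci-cache oci://ghcr.io/${{ github.repository_owner }}/spack-cache
             spack install zlib
             spack buildcache push ci-cache zlib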
.. _cmd-spack-buildcache:

View File

@@ -37,7 +37,11 @@ to enable reuse for a single installation, and you can use:
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: true`` is the default.
``reuse: dependencies`` is the default.
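For reference, a minimal sketch of the corresponding ``concretizer.yaml`` entry (valid values include ``true``, ``false``, and ``dependencies``):

.. code-block:: yaml

   concretizer:
     reuse: dependencies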
.. seealso::
FAQ: :ref:`Why does Spack pick particular versions and variants? <faq-concretizer-precedence>`
------------------------------------------
Selection of the target microarchitectures
@@ -99,551 +103,3 @@ while `py-numpy` still needs an older version:
Up to Spack v0.20 ``duplicates:strategy:none`` was the default (and only) behavior. From Spack v0.21 the
default behavior is ``duplicates:strategy:minimal``.
.. _build-settings:
================================
Package Settings (packages.yaml)
================================
Spack allows you to customize how your software is built through the
``packages.yaml`` file. Using it, you can make Spack prefer particular
implementations of virtual dependencies (e.g., MPI or BLAS/LAPACK),
or you can make it prefer to build with particular compilers. You can
also tell Spack to use *external* software installations already
present on your system.
At a high level, the ``packages.yaml`` file is structured like this:
.. code-block:: yaml
packages:
package1:
# settings for package1
package2:
# settings for package2
# ...
all:
# settings that apply to all packages.
So you can either set build preferences specifically for *one* package,
or you can specify that certain settings should apply to *all* packages.
The types of settings you can customize are described in detail below.
Spack's build defaults are in the default
``etc/spack/defaults/packages.yaml`` file. You can override them in
``~/.spack/packages.yaml`` or ``etc/spack/packages.yaml``. For more
details on how this works, see :ref:`configuration-scopes`.
.. _sec-external-packages:
-----------------
External Packages
-----------------
Spack can be configured to use externally-installed
packages rather than building its own packages. This may be desirable
if machines ship with system packages, such as a customized MPI
that should be used instead of Spack building its own MPI.
External packages are configured through the ``packages.yaml`` file.
Here's an example of an external configuration:
.. code-block:: yaml
packages:
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
This example lists three installations of OpenMPI, one built with GCC,
one built with GCC and debug information, and another built with Intel.
If Spack is asked to build a package that uses one of these MPIs as a
dependency, it will use the pre-installed OpenMPI in
the given directory. Note that the specified path is the top-level
install prefix, not the ``bin`` subdirectory.
``packages.yaml`` can also be used to specify modules to load instead
of the installation prefixes. The following example says that module
``CMake/3.7.2`` provides cmake version 3.7.2.
.. code-block:: yaml
cmake:
externals:
- spec: cmake@3.7.2
modules:
- CMake/3.7.2
Each ``packages.yaml`` begins with a ``packages:`` attribute, followed
by a list of package names. To specify externals, add an ``externals:``
attribute under the package name, which lists externals.
Each external should specify a ``spec:`` string that should be as
well-defined as reasonably possible. If a
package lacks a spec component, such as missing a compiler or
package version, then Spack will guess the missing component based
on its most-favored packages, and it may guess incorrectly.
Each package version and compiler listed in an external should
have entries in Spack's packages and compiler configuration, even
though the package and compiler may not ever be built.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Prevent packages from being built from sources
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding an external spec in ``packages.yaml`` allows Spack to use an external location,
but it does not prevent Spack from building packages from sources. In the above example,
Spack might choose for many valid reasons to start building and linking with the
latest version of OpenMPI rather than continue using the pre-installed OpenMPI versions.
To prevent this, the ``packages.yaml`` configuration also allows packages
to be flagged as non-buildable. The previous example could be modified to
be:
.. code-block:: yaml
packages:
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
buildable: False
The addition of the ``buildable`` flag tells Spack that it should never build
its own version of OpenMPI from sources, and it will instead always rely on a pre-built
OpenMPI.
.. note::
If ``concretizer:reuse`` is on (see :ref:`concretizer-options` for more information on that flag)
pre-built specs include specs already available from a local store, an upstream store, a registered
buildcache or specs marked as externals in ``packages.yaml``. If ``concretizer:reuse`` is off, only
external specs in ``packages.yaml`` are included in the list of pre-built specs.
If an external module is specified as not buildable, then Spack will load the
external module into the build environment, where it can be used for linking.
The ``buildable`` flag does not need to be paired with external packages.
It could also be used alone to forbid packages that may be
buggy or otherwise undesirable.
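For instance, a minimal sketch (the package name is purely illustrative) that forbids building a package without declaring any externals:

.. code-block:: yaml

   packages:
     openssl:
       buildable: False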
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Non-buildable virtual packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Virtual packages in Spack can also be specified as not buildable, and
external implementations can be provided. In the example above,
OpenMPI is configured as not buildable, but Spack will often prefer
other MPI implementations over the externally available OpenMPI. Spack
can be configured with every MPI provider not buildable individually,
but more conveniently:
.. code-block:: yaml
packages:
mpi:
buildable: False
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
Spack can then use any of the listed external implementations of MPI
to satisfy a dependency, and will choose depending on the compiler and
architecture.
In cases where the concretizer is configured to reuse specs, and other ``mpi`` providers
(available via stores or buildcaches) are not wanted, Spack can be configured to require
specs matching only the available externals:
.. code-block:: yaml
packages:
mpi:
buildable: False
require:
- one_of: [
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64",
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug",
"openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
]
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
This configuration prevents any spec that uses MPI and originates from stores or buildcaches from being reused,
unless it matches the requirements under ``packages:mpi:require``. For more information on requirements see
:ref:`package-requirements`.
.. _cmd-spack-external-find:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Automatically Find External Packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can run the :ref:`spack external find <spack-external-find>` command
to search for system-provided packages and add them to ``packages.yaml``.
After running this command your ``packages.yaml`` may include new entries:
.. code-block:: yaml
packages:
cmake:
externals:
- spec: cmake@3.17.2
prefix: /usr
Generally this is useful for detecting a small set of commonly-used packages;
for now this is generally limited to finding build-only dependencies.
Specific limitations include:
* Packages are not discoverable by default: For a package to be
discoverable with ``spack external find``, it needs to add special
logic. See :ref:`here <make-package-findable>` for more details.
* The logic does not search through module files, it can only detect
packages with executables defined in ``PATH``; you can help Spack locate
externals which use module files by loading any associated modules for
packages that you want Spack to know about before running
``spack external find``.
* Spack does not overwrite existing entries in the package configuration:
If there is an external defined for a spec at any configuration scope,
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
.. _package-requirements:
--------------------
Package Requirements
--------------------
Spack can be configured to always use certain compilers, package
versions, and variants during concretization through package
requirements.
Package requirements are useful when you find yourself repeatedly
specifying the same constraints on the command line, and wish that
Spack respects these constraints whether you mention them explicitly
or not. Another use case is specifying constraints that should apply
to all root specs in an environment, without having to repeat the
constraint everywhere.
Apart from that, requirements config is more flexible than constraints
on the command line, because it can specify constraints on packages
*when they occur* as a dependency. In contrast, on the command line it
is not possible to specify constraints on dependencies while also keeping
those dependencies optional.
^^^^^^^^^^^^^^^^^^^
Requirements syntax
^^^^^^^^^^^^^^^^^^^
The package requirements configuration is specified in ``packages.yaml``,
keyed by package name and expressed using the Spec syntax. In the simplest
case you can specify attributes that you always want the package to have
by providing a single spec string to ``require``:
.. code-block:: yaml
packages:
libfabric:
require: "@1.13.2"
In the above example, ``libfabric`` will always build with version 1.13.2. If you
need to compose multiple configuration scopes, ``require`` accepts a list of
strings:
.. code-block:: yaml
packages:
libfabric:
require:
- "@1.13.2"
- "%gcc"
In this case ``libfabric`` will always build with version 1.13.2 **and** using GCC
as a compiler.
For more complex use cases, ``require`` also accepts a list of objects. These objects
must have either an ``any_of`` or a ``one_of`` field containing a list of spec strings,
and they can optionally have ``when`` and ``message`` attributes:
.. code-block:: yaml
packages:
openmpi:
require:
- any_of: ["@4.1.5", "%gcc"]
message: "in this example only 4.1.5 can build with other compilers"
``any_of`` is a list of specs. At least one of them must be satisfied,
and the concretized spec is allowed to match more than one.
In the above example, that means you could build ``openmpi@4.1.5%gcc``,
``openmpi@4.1.5%clang`` or ``openmpi@3.9%gcc``, but
not ``openmpi@3.9%clang``.
If a custom message is provided, and the requirement is not satisfiable,
Spack will print the custom error message:
.. code-block:: console
$ spack spec openmpi@3.9%clang
==> Error: in this example only 4.1.5 can build with other compilers
We could express a similar requirement using the ``when`` attribute:
.. code-block:: yaml

   packages:
     openmpi:
       require:
       - any_of: ["%gcc"]
         when: "@:4.1.4"
         message: "in this example only 4.1.5 can build with other compilers"

In the example above, if the version turns out to be 4.1.4 or less, we require the compiler to be GCC.
For readability, Spack also allows a ``spec`` key accepting a string when there is only a single
constraint:
.. code-block:: yaml

   packages:
     openmpi:
       require:
       - spec: "%gcc"
         when: "@:4.1.4"
         message: "in this example only 4.1.5 can build with other compilers"

This code snippet and the one before it are semantically equivalent.
Finally, instead of ``any_of`` you can use ``one_of``, which also takes a list of specs. The final
concretized spec must match exactly one of them:
.. code-block:: yaml

   packages:
     mpich:
       require:
       - one_of: ["+cuda", "+rocm"]

In the example above, that means you could build ``mpich+cuda`` or ``mpich+rocm`` but not ``mpich+cuda+rocm``.
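``one_of`` can be combined with ``when`` and ``message`` in the same way as ``any_of``;
a conditional sketch (the version range here is arbitrary) could look like:

.. code-block:: yaml

   packages:
     mpich:
       require:
       - one_of: ["+cuda", "+rocm"]
         when: "@4:"
         message: "newer mpich must choose exactly one GPU runtime"
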
.. note::

   For ``any_of`` and ``one_of``, the order of specs indicates a
   preference: items that appear earlier in the list are preferred
   (note that these preferences can be ignored in favor of others).

.. note::

   When using a conditional requirement, Spack is allowed to actively avoid the triggering
   condition (the ``when=...`` spec) if that leads to a concrete spec with better scores in
   the optimization criteria. To check the current optimization criteria and their
   priorities you can run ``spack solve zlib``.

^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting default requirements
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can also set default requirements for all packages under ``all``
like this:
.. code-block:: yaml

   packages:
     all:
       require: '%clang'

which means every spec will be required to use ``clang`` as a compiler.
Note that in this case ``all`` represents a *default set of requirements* -
if there are specific package requirements, then the default requirements
under ``all`` are disregarded. For example, with a configuration like this:
.. code-block:: yaml

   packages:
     all:
       require: '%clang'
     cmake:
       require: '%gcc'

Spack requires ``cmake`` to use ``gcc`` and all other nodes (including ``cmake``
dependencies) to use ``clang``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting requirements on virtual specs
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A requirement on a virtual spec applies whenever that virtual is present in the DAG.
This can be useful for fixing which virtual provider you want to use:
.. code-block:: yaml

   packages:
     mpi:
       require: 'mvapich2 %gcc'

With the configuration above the only allowed ``mpi`` provider is ``mvapich2 %gcc``.
Requirements on the virtual spec and on the specific provider are both applied, if
present. For instance with a configuration like:
.. code-block:: yaml

   packages:
     mpi:
       require: 'mvapich2 %gcc'
     mvapich2:
       require: '~cuda'

you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
.. _package-preferences:
-------------------
Package Preferences
-------------------
In some cases package requirements can be too strong, and package
preferences are the better option. Package preferences do not impose
constraints on packages for particular versions or variant values;
they only set defaults, and the concretizer is free to change
them if it must, due to other constraints. Also note that package
preferences have lower priority than reuse of already-installed
packages.
Here's an example ``packages.yaml`` file that sets preferred packages:
.. code-block:: yaml

   packages:
     opencv:
       compiler: [gcc@4.9]
       variants: +debug
     gperftools:
       version: [2.2, 2.4, 2.3]
     all:
       compiler: [gcc@4.4.7, 'gcc@4.6:', intel, clang, pgi]
       target: [sandybridge]
       providers:
         mpi: [mvapich2, mpich, openmpi]

At a high level, this example specifies how packages should preferably be
concretized. The ``opencv`` package should prefer GCC 4.9 and
be built with debug options. The ``gperftools`` package should prefer version
2.2 over 2.4. Every package on the system should prefer ``mvapich2`` for
its MPI and GCC 4.4.7 (except for ``opencv``, which overrides this by preferring GCC 4.9).
These options are used to fill in implicit defaults. Any of them can be overwritten
on the command line if explicitly requested.
Package preferences accept the following keys or components under
the specific package (or ``all``) section: ``compiler``, ``variants``,
``version``, ``providers``, and ``target``. Each component takes an
ordered list of spec constraints, with earlier entries in the
list preferred over later entries.
Sometimes a package installation may have constraints that forbid
the first concretization rule, in which case Spack will use the first
legal concretization rule. Going back to the example, if a user
requests gperftools 2.3 or later, then Spack will install version 2.4
as the 2.4 version of gperftools is preferred over 2.3.
An explicit concretization rule in the preferred section will always
take preference over unlisted concretizations. In the above example,
xlc isn't listed in the compiler list. Every listed compiler from
gcc to pgi will thus be preferred over the xlc compiler.
The syntax for the ``providers`` section differs slightly from the other
concretization rules. A provider names a virtual dependency that packages may
``depends_on`` (e.g., ``mpi``) and gives an ordered list of rules for fulfilling
that dependency.
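For example, restating the ``providers`` portion of the example above and adding a
hypothetical ``blas`` preference:

.. code-block:: yaml

   packages:
     all:
       providers:
         mpi: [mvapich2, mpich, openmpi]
         blas: [openblas]
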
.. _package_permissions:
-------------------
Package Permissions
-------------------
Spack can be configured to assign permissions to the files installed
by a package.
In the ``packages.yaml`` file under ``permissions``, the attributes
``read``, ``write``, and ``group`` control the package
permissions. These attributes can be set per-package, or for all
packages under ``all``. If permissions are set under ``all`` and for a
specific package, the package-specific settings take precedence.
The ``read`` and ``write`` attributes take one of ``user``, ``group``,
or ``world``.
.. code-block:: yaml

   packages:
     all:
       permissions:
         write: group
         group: spack
     my_app:
       permissions:
         read: group
         group: my_team

The permissions settings describe the broadest level of access to
installations of the specified packages. The execute permissions of
the file are set to the same level as read permissions for those files
that are executable. The default setting for ``read`` is ``world``,
and for ``write`` is ``user``. In the example above, ``my_app`` will be
installed with user and group permissions but no world permissions, and
it will be owned by the group ``my_team``. All other packages will be
installed with user and group write privileges and world read
privileges, and they will be owned by the group ``spack``.
The ``group`` attribute assigns a Unix-style group to a package. All
files installed by the package will be owned by the assigned group,
and the sticky group bit will be set on the install prefix and all
directories inside the install prefix. This will ensure that even
manually placed files within the install prefix are owned by the
assigned group. If no group is assigned, Spack falls back to the
operating system's default behavior.
----------------------------
Assigning Package Attributes
----------------------------
You can assign class-level attributes in the configuration:
.. code-block:: yaml

   packages:
     mpileaks:
       # Override existing attributes
       url: http://www.somewhereelse.com/mpileaks-1.0.tar.gz
       # ... or add new ones
       x: 1

Attributes set this way will be accessible to any method executed
in the ``package.py`` file (e.g., the ``install()`` method). Values for these
attributes may be any value parseable by YAML.
These attributes can only be applied to specific packages, not to ``all`` or
to virtual packages.


@@ -127,9 +127,9 @@ check out a commit from the ``master`` branch, you would want to add:
.. code-block:: python
depends_on('autoconf', type='build', when='@master')
depends_on('automake', type='build', when='@master')
depends_on('libtool', type='build', when='@master')
depends_on("autoconf", type="build", when="@master")
depends_on("automake", type="build", when="@master")
depends_on("libtool", type="build", when="@master")
It is typically redundant to list the ``m4`` macro processor package as a
dependency, since ``autoconf`` already depends on it.
@@ -145,7 +145,7 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
.. code-block:: python
def autoreconf(self, spec, prefix):
which('bash')('autogen.sh')
which("bash")("autogen.sh")
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
@@ -186,9 +186,9 @@ To opt out of this feature, use the following setting:
To enable it conditionally on different architectures, define a property and
make the package depend on ``gnuconfig`` as a build dependency:
.. code-block
.. code-block:: python
depends_on('gnuconfig', when='@1.0:')
depends_on("gnuconfig", when="@1.0:")
@property
def patch_config_files(self):
@@ -230,7 +230,7 @@ version, this can be done like so:
@property
def force_autoreconf(self):
return self.version == Version('1.2.3')
return self.version == Version("1.2.3")
^^^^^^^^^^^^^^^^^^^^^^^
Finding configure flags
@@ -278,13 +278,22 @@ function like so:
def configure_args(self):
args = []
if '+mpi' in self.spec:
args.append('--enable-mpi')
if self.spec.satisfies("+mpi"):
args.append("--enable-mpi")
else:
args.append('--disable-mpi')
args.append("--disable-mpi")
return args
Alternatively, you can use the :ref:`enable_or_disable <autotools_enable_or_disable>` helper:
.. code-block:: python
def configure_args(self):
return [self.enable_or_disable("mpi")]
Note that we are explicitly disabling MPI support if it is not
requested. This is important, as many Autotools packages will enable
options by default if the dependencies are found, and disable them
@@ -295,9 +304,11 @@ and `here <https://wiki.gentoo.org/wiki/Project:Quality_Assurance/Automagic_depe
for a rationale as to why these so-called "automagic" dependencies
are a problem.
By default, Autotools installs packages to ``/usr``. We don't want this,
so Spack automatically adds ``--prefix=/path/to/installation/prefix``
to your list of ``configure_args``. You don't need to add this yourself.
.. note::
By default, Autotools installs packages to ``/usr``. We don't want this,
so Spack automatically adds ``--prefix=/path/to/installation/prefix``
to your list of ``configure_args``. You don't need to add this yourself.
^^^^^^^^^^^^^^^^
Helper functions
@@ -308,6 +319,8 @@ You may have noticed that most of the Autotools flags are of the form
``--without-baz``. Since these flags are so common, Spack provides a
couple of helper functions to make your life easier.
.. _autotools_enable_or_disable:
"""""""""""""""""
enable_or_disable
"""""""""""""""""
@@ -319,11 +332,11 @@ typically used to enable or disable some feature within the package.
.. code-block:: python
variant(
'memchecker',
"memchecker",
default=False,
description='Memchecker support for debugging [degrades performance]'
description="Memchecker support for debugging [degrades performance]"
)
config_args.extend(self.enable_or_disable('memchecker'))
config_args.extend(self.enable_or_disable("memchecker"))
In this example, specifying the variant ``+memchecker`` will generate
the following configuration options:
@@ -343,15 +356,15 @@ the ``with_or_without`` method.
.. code-block:: python
variant(
'schedulers',
"schedulers",
values=disjoint_sets(
('auto',), ('alps', 'lsf', 'tm', 'slurm', 'sge', 'loadleveler')
).with_non_feature_values('auto', 'none'),
("auto",), ("alps", "lsf", "tm", "slurm", "sge", "loadleveler")
).with_non_feature_values("auto", "none"),
description="List of schedulers for which support is enabled; "
"'auto' lets openmpi determine",
)
if 'schedulers=auto' not in spec:
config_args.extend(self.with_or_without('schedulers'))
if not spec.satisfies("schedulers=auto"):
config_args.extend(self.with_or_without("schedulers"))
In this example, specifying the variant ``schedulers=slurm,sge`` will
generate the following configuration options:
@@ -376,16 +389,16 @@ generated, using the ``activation_value`` argument to
.. code-block:: python
variant(
'fabrics',
"fabrics",
values=disjoint_sets(
('auto',), ('psm', 'psm2', 'verbs', 'mxm', 'ucx', 'libfabric')
).with_non_feature_values('auto', 'none'),
("auto",), ("psm", "psm2", "verbs", "mxm", "ucx", "libfabric")
).with_non_feature_values("auto", "none"),
description="List of fabrics that are enabled; "
"'auto' lets openmpi determine",
)
if 'fabrics=auto' not in spec:
config_args.extend(self.with_or_without('fabrics',
activation_value='prefix'))
if not spec.satisfies("fabrics=auto"):
config_args.extend(self.with_or_without("fabrics",
activation_value="prefix"))
``activation_value`` accepts a callable that generates the configure
parameter value given the variant value; but the special value
@@ -409,16 +422,16 @@ When Spack variants and configure flags do not correspond one-to-one, the
.. code-block:: python
variant('debug_tools', default=False)
config_args += self.enable_or_disable('debug-tools', variant='debug_tools')
variant("debug_tools", default=False)
config_args += self.enable_or_disable("debug-tools", variant="debug_tools")
Or when one variant controls multiple flags:
.. code-block:: python
variant('debug_tools', default=False)
config_args += self.with_or_without('memchecker', variant='debug_tools')
config_args += self.with_or_without('profiler', variant='debug_tools')
variant("debug_tools", default=False)
config_args += self.with_or_without("memchecker", variant="debug_tools")
config_args += self.with_or_without("profiler", variant="debug_tools")
""""""""""""""""""""
@@ -432,8 +445,8 @@ For example:
.. code-block:: python
variant('profiler', when='@2.0:')
config_args += self.with_or_without('profiler')
variant("profiler", when="@2.0:")
config_args += self.with_or_without("profiler")
will neither add ``--with-profiler`` nor ``--without-profiler`` when the version is
below ``2.0``.
@@ -452,10 +465,10 @@ the variant values require atypical behavior.
def with_or_without_verbs(self, activated):
# Up through version 1.6, this option was named --with-openib.
# In version 1.7, it was renamed to be --with-verbs.
opt = 'verbs' if self.spec.satisfies('@1.7:') else 'openib'
opt = "verbs" if self.spec.satisfies("@1.7:") else "openib"
if not activated:
return '--without-{0}'.format(opt)
return '--with-{0}={1}'.format(opt, self.spec['rdma-core'].prefix)
return f"--without-{opt}"
return f"--with-{opt}={self.spec['rdma-core'].prefix}"
Defining ``with_or_without_verbs`` overrides the behavior of a
``fabrics=verbs`` variant, changing the configure-time option to
@@ -479,7 +492,7 @@ do this like so:
.. code-block:: python
configure_directory = 'src'
configure_directory = "src"
^^^^^^^^^^^^^^^^^^^^^^
Building out of source
@@ -491,7 +504,7 @@ This can be done using the ``build_directory`` variable:
.. code-block:: python
build_directory = 'spack-build'
build_directory = "spack-build"
By default, Spack will build the package in the same directory that
contains the ``configure`` script
@@ -514,8 +527,8 @@ library or build the documentation, you can add these like so:
.. code-block:: python
build_targets = ['all', 'docs']
install_targets = ['install', 'docs']
build_targets = ["all", "docs"]
install_targets = ["install", "docs"]
^^^^^^^
Testing


@@ -87,7 +87,7 @@ A typical usage of these methods may look something like this:
.. code-block:: python
def initconfig_mpi_entries(self)
def initconfig_mpi_entries(self):
# Get existing MPI configurations
entries = super(Foo, self).initconfig_mpi_entries()
@@ -95,25 +95,25 @@ A typical usage of these methods may look something like this:
# This spec has an MPI variant, and we need to enable MPI when it is on.
# This hypothetical package controls MPI with the ``FOO_MPI`` option to
# cmake.
if '+mpi' in self.spec:
entries.append(cmake_cache_option('FOO_MPI', True, "enable mpi"))
if self.spec.satisfies("+mpi"):
entries.append(cmake_cache_option("FOO_MPI", True, "enable mpi"))
else:
entries.append(cmake_cache_option('FOO_MPI', False, "disable mpi"))
entries.append(cmake_cache_option("FOO_MPI", False, "disable mpi"))
def initconfig_package_entries(self):
# Package specific options
entries = []
entries.append('#Entries for build options')
entries.append("#Entries for build options")
bar_on = '+bar' in self.spec
entries.append(cmake_cache_option('FOO_BAR', bar_on, 'toggle bar'))
bar_on = self.spec.satisfies("+bar")
entries.append(cmake_cache_option("FOO_BAR", bar_on, "toggle bar"))
entries.append('#Entries for dependencies')
entries.append("#Entries for dependencies")
if self.spec['blas'].name == 'baz': # baz is our blas provider
entries.append(cmake_cache_string('FOO_BLAS', 'baz', 'Use baz'))
entries.append(cmake_cache_path('BAZ_PREFIX', self.spec['baz'].prefix))
if self.spec["blas"].name == "baz": # baz is our blas provider
entries.append(cmake_cache_string("FOO_BLAS", "baz", "Use baz"))
entries.append(cmake_cache_path("BAZ_PREFIX", self.spec["baz"].prefix))
^^^^^^^^^^^^^^^^^^^^^^
External documentation

View File

@@ -82,7 +82,7 @@ class already contains:
.. code-block:: python
depends_on('cmake', type='build')
depends_on("cmake", type="build")
If you need to specify a particular version requirement, you can
@@ -90,7 +90,7 @@ override this in your package:
.. code-block:: python
depends_on('cmake@2.8.12:', type='build')
depends_on("cmake@2.8.12:", type="build")
^^^^^^^^^^^^^^^^^^^
@@ -137,10 +137,10 @@ and without the :meth:`~spack.build_systems.cmake.CMakeBuilder.define` and
def cmake_args(self):
args = [
'-DWHATEVER:STRING=somevalue',
self.define('ENABLE_BROKEN_FEATURE', False),
self.define_from_variant('DETECT_HDF5', 'hdf5'),
self.define_from_variant('THREADS'), # True if +threads
"-DWHATEVER:STRING=somevalue",
self.define("ENABLE_BROKEN_FEATURE", False),
self.define_from_variant("DETECT_HDF5", "hdf5"),
self.define_from_variant("THREADS"), # True if +threads
]
return args
@@ -151,10 +151,10 @@ and CMake simply ignores the empty command line argument. For example the follow
.. code-block:: python
variant('example', default=True, when='@2.0:')
variant("example", default=True, when="@2.0:")
def cmake_args(self):
return [self.define_from_variant('EXAMPLE', 'example')]
return [self.define_from_variant("EXAMPLE", "example")]
will generate ``'cmake' '-DEXAMPLE=ON' ...`` when `@2.0: +example` is met, but will
result in ``'cmake' '' ...`` when the spec version is below ``2.0``.
@@ -193,9 +193,9 @@ a variant to control this:
.. code-block:: python
variant('build_type', default='RelWithDebInfo',
description='CMake build type',
values=('Debug', 'Release', 'RelWithDebInfo', 'MinSizeRel'))
variant("build_type", default="RelWithDebInfo",
description="CMake build type",
values=("Debug", "Release", "RelWithDebInfo", "MinSizeRel"))
However, not every CMake package accepts all four of these options.
Grep the ``CMakeLists.txt`` file to see if the default values are
@@ -205,9 +205,9 @@ package overrides the default variant with:
.. code-block:: python
variant('build_type', default='DebugRelease',
description='The build type to build',
values=('Debug', 'Release', 'DebugRelease'))
variant("build_type", default="DebugRelease",
description="The build type to build",
values=("Debug", "Release", "DebugRelease"))
For more information on ``CMAKE_BUILD_TYPE``, see:
https://cmake.org/cmake/help/latest/variable/CMAKE_BUILD_TYPE.html
@@ -250,7 +250,7 @@ generator is Ninja. To switch to the Ninja generator, simply add:
.. code-block:: python
generator = 'Ninja'
generator = "Ninja"
``CMakePackage`` defaults to "Unix Makefiles". If you switch to the
@@ -258,7 +258,7 @@ Ninja generator, make sure to add:
.. code-block:: python
depends_on('ninja', type='build')
depends_on("ninja", type="build")
to the package as well. Aside from that, you shouldn't need to do
anything else. Spack will automatically detect that you are using
@@ -288,7 +288,7 @@ like so:
.. code-block:: python
root_cmakelists_dir = 'src'
root_cmakelists_dir = "src"
Note that this path is relative to the root of the extracted tarball,
@@ -304,7 +304,7 @@ different sub-directory, simply override ``build_directory`` like so:
.. code-block:: python
build_directory = 'my-build'
build_directory = "my-build"
^^^^^^^^^^^^^^^^^^^^^^^^^
Build and install targets
@@ -324,8 +324,8 @@ library or build the documentation, you can add these like so:
.. code-block:: python
build_targets = ['all', 'docs']
install_targets = ['install', 'docs']
build_targets = ["all", "docs"]
install_targets = ["install", "docs"]
^^^^^^^
Testing


@@ -54,8 +54,8 @@ to terminate such build attempts with a suitable message:
.. code-block:: python
conflicts('cuda_arch=none', when='+cuda',
msg='CUDA architecture is required')
conflicts("cuda_arch=none", when="+cuda",
msg="CUDA architecture is required")
Similarly, if your software does not support all versions of the property,
you could add ``conflicts`` to your package for those versions. For example,
@@ -66,13 +66,13 @@ custom message should a user attempt such a build:
.. code-block:: python
unsupported_cuda_archs = [
'10', '11', '12', '13',
'20', '21',
'30', '32', '35', '37'
"10", "11", "12", "13",
"20", "21",
"30", "32", "35", "37"
]
for value in unsupported_cuda_archs:
conflicts('cuda_arch={0}'.format(value), when='+cuda',
msg='CUDA architecture {0} is not supported'.format(value))
conflicts(f"cuda_arch={value}", when="+cuda",
msg=f"CUDA architecture {value} is not supported")
^^^^^^^
Methods
@@ -107,16 +107,16 @@ class of your package. For example, you can add it to your
spec = self.spec
args = []
...
if '+cuda' in spec:
if spec.satisfies("+cuda"):
# Set up the cuda macros needed by the build
args.append('-DWITH_CUDA=ON')
cuda_arch_list = spec.variants['cuda_arch'].value
args.append("-DWITH_CUDA=ON")
cuda_arch_list = spec.variants["cuda_arch"].value
cuda_arch = cuda_arch_list[0]
if cuda_arch != 'none':
args.append('-DCUDA_FLAGS=-arch=sm_{0}'.format(cuda_arch))
if cuda_arch != "none":
args.append(f"-DCUDA_FLAGS=-arch=sm_{cuda_arch}")
else:
# Ensure build with cuda is disabled
args.append('-DWITH_CUDA=OFF')
args.append("-DWITH_CUDA=OFF")
...
return args
@@ -125,7 +125,7 @@ You will need to customize options as needed for your build.
This example also illustrates how to check for the ``cuda`` variant using
``self.spec`` and how to retrieve the ``cuda_arch`` variant's value, which
is a list, using ``self.spec.variants['cuda_arch'].value``.
is a list, using ``self.spec.variants["cuda_arch"].value``.
With over 70 packages using ``CudaPackage`` as of January 2021 there are
lots of examples to choose from to get more ideas for using this package.


@@ -57,13 +57,13 @@ If you look at the ``perl`` package, you'll see:
.. code-block:: python
phases = ['configure', 'build', 'install']
phases = ["configure", "build", "install"]
Similarly, ``cmake`` defines:
.. code-block:: python
phases = ['bootstrap', 'build', 'install']
phases = ["bootstrap", "build", "install"]
If we look at the ``cmake`` example, this tells Spack's ``PackageBase``
class to run the ``bootstrap``, ``build``, and ``install`` functions
@@ -78,7 +78,7 @@ If we look at ``perl``, we see that it defines a ``configure`` method:
.. code-block:: python
def configure(self, spec, prefix):
configure = Executable('./Configure')
configure = Executable("./Configure")
configure(*self.configure_args())
There is also a corresponding ``configure_args`` function that handles
@@ -92,7 +92,7 @@ phases are pretty simple:
make()
def install(self, spec, prefix):
make('install')
make("install")
The ``cmake`` package looks very similar, but with a ``bootstrap``
function instead of ``configure``:
@@ -100,14 +100,14 @@ function instead of ``configure``:
.. code-block:: python
def bootstrap(self, spec, prefix):
bootstrap = Executable('./bootstrap')
bootstrap = Executable("./bootstrap")
bootstrap(*self.bootstrap_args())
def build(self, spec, prefix):
make()
def install(self, spec, prefix):
make('install')
make("install")
Again, there is a ``bootstrap_args`` function that determines the
correct bootstrap flags to use.
@@ -128,16 +128,16 @@ before or after a particular phase. For example, in ``perl``, we see:
.. code-block:: python
@run_after('install')
@run_after("install")
def install_cpanm(self):
spec = self.spec
if '+cpanm' in spec:
with working_dir(join_path('cpanm', 'cpanm')):
perl = spec['perl'].command
perl('Makefile.PL')
if spec.satisfies("+cpanm"):
with working_dir(join_path("cpanm", "cpanm")):
perl = spec["perl"].command
perl("Makefile.PL")
make()
make('install')
make("install")
This extra step automatically installs ``cpanm`` in addition to the
base Perl installation.
@@ -174,10 +174,10 @@ In the ``perl`` package, we can see:
.. code-block:: python
@run_after('build')
@run_after("build")
@on_package_attributes(run_tests=True)
def test(self):
make('test')
make("test")
As you can guess, this runs ``make test`` *after* building the package,
if and only if testing is requested. Again, this is not specific to
@@ -189,7 +189,7 @@ custom build systems, it can be added to existing build systems as well.
.. code-block:: python
@run_after('install')
@run_after("install")
@on_package_attributes(run_tests=True)
works as expected. However, if you reverse the ordering:
@@ -197,7 +197,7 @@ custom build systems, it can be added to existing build systems as well.
.. code-block:: python
@on_package_attributes(run_tests=True)
@run_after('install')
@run_after("install")
the tests will always be run regardless of whether or not
``--test=root`` is requested. See https://github.com/spack/spack/issues/3833


@@ -53,18 +53,24 @@ Install the oneAPI compilers::
Add the compilers to your ``compilers.yaml`` so spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
Verify that the compilers are available::
spack compiler list
Note that 2024 and later releases do not include ``icc``. Before 2024,
the package layout was different::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
The ``intel-oneapi-compilers`` package includes 2 families of
compilers:
* ``intel``: ``icc``, ``icpc``, ``ifort``. Intel's *classic*
compilers.
compilers. 2024 and later releases contain ``ifort``, but not
``icc`` and ``icpc``.
* ``oneapi``: ``icx``, ``icpx``, ``ifx``. Intel's new generation of
compilers based on LLVM.
@@ -89,8 +95,8 @@ Install the oneAPI compilers::
Add the compilers to your ``compilers.yaml`` so Spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
Verify that the compilers are available::
@@ -146,8 +152,7 @@ Compilers
To use the compilers, add some information about the installation to
``compilers.yaml``. For most users, it is sufficient to do::
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin/intel64
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin
spack compiler add /opt/intel/oneapi/compiler/latest/bin
Adapt the paths above if you did not install the tools in the default
location. After adding the compilers, using them is the same
@@ -156,6 +161,12 @@ Another option is to manually add the configuration to
``compilers.yaml`` as described in :ref:`Compiler configuration
<compiler-config>`.
Before 2024, the directory structure was different::
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin/intel64
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin
Libraries
---------


@@ -392,7 +392,7 @@ See section
:ref:`Configuration Scopes <configuration-scopes>`
for an explanation about the different files
and section
:ref:`Build customization <build-settings>`
:ref:`Build customization <packages-config>`
for specifics and examples for ``packages.yaml`` files.
.. If your system administrator did not provide modules for pre-installed Intel
@@ -934,9 +934,9 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
# Examples for absolute and conditional dependencies:
depends_on('mkl')
depends_on('mkl', when='+mkl')
depends_on('mkl', when='fftw=mkl')
depends_on("mkl")
depends_on("mkl", when="+mkl")
depends_on("mkl", when="fftw=mkl")
The ``MKLROOT`` environment variable (part of the documented API) will be set
during all stages of client package installation, and is available to both
@@ -972,8 +972,8 @@ a *virtual* ``mkl`` package is declared in Spack.
def configure_args(self):
args = []
...
args.append('--with-blas=%s' % self.spec['blas'].libs.ld_flags)
args.append('--with-lapack=%s' % self.spec['lapack'].libs.ld_flags)
args.append("--with-blas=%s" % self.spec["blas"].libs.ld_flags)
args.append("--with-lapack=%s" % self.spec["lapack"].libs.ld_flags)
...
.. tip::
@@ -989,13 +989,13 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
self.spec['blas'].headers.include_flags
self.spec["blas"].headers.include_flags
and to generate linker options (``-L<dir> -llibname ...``), use the same as above,
.. code-block:: python
self.spec['blas'].libs.ld_flags
self.spec["blas"].libs.ld_flags
See
:ref:`MakefilePackage <makefilepackage>`


@@ -88,7 +88,7 @@ override the ``luarocks_args`` method like so:
.. code-block:: python
def luarocks_args(self):
return ['flag1', 'flag2']
return ["flag1", "flag2"]
One common use of this is to override warnings or flags for newer compilers, as in:


@@ -59,7 +59,7 @@ using GNU Make, you should add a dependency on ``gmake``:
.. code-block:: python
depends_on('gmake', type='build')
depends_on("gmake", type="build")
^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -93,8 +93,8 @@ there are any other variables you need to set, you can do this in the
.. code-block:: python
def edit(self, spec, prefix):
env['PREFIX'] = prefix
env['BLASLIB'] = spec['blas'].libs.ld_flags
env["PREFIX"] = prefix
env["BLASLIB"] = spec["blas"].libs.ld_flags
`cbench <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/cbench/package.py>`_
@@ -113,7 +113,7 @@ you can do this like so:
.. code-block:: python
build_targets = ['CC=cc']
build_targets = ["CC=cc"]
If you do need access to the spec, you can create a property like so:
@@ -125,8 +125,8 @@ If you do need access to the spec, you can create a property like so:
spec = self.spec
return [
'CC=cc',
'BLASLIB={0}'.format(spec['blas'].libs.ld_flags),
"CC=cc",
f"BLASLIB={spec['blas'].libs.ld_flags}",
]
@@ -145,12 +145,12 @@ and a ``filter_file`` method to help with this. For example:
.. code-block:: python
def edit(self, spec, prefix):
makefile = FileFilter('Makefile')
makefile = FileFilter("Makefile")
makefile.filter(r'^\s*CC\s*=.*', 'CC = ' + spack_cc)
makefile.filter(r'^\s*CXX\s*=.*', 'CXX = ' + spack_cxx)
makefile.filter(r'^\s*F77\s*=.*', 'F77 = ' + spack_f77)
makefile.filter(r'^\s*FC\s*=.*', 'FC = ' + spack_fc)
makefile.filter(r"^\s*CC\s*=.*", f"CC = {spack_cc}")
makefile.filter(r"^\s*CXX\s*=.*", f"CXX = {spack_cxx}")
makefile.filter(r"^\s*F77\s*=.*", f"F77 = {spack_f77}")
makefile.filter(r"^\s*FC\s*=.*", f"FC = {spack_fc}")
`stream <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/stream/package.py>`_
@@ -181,16 +181,16 @@ well for storing variables:
def edit(self, spec, prefix):
config = {
'CC': 'cc',
'MAKE': 'make',
"CC": "cc",
"MAKE": "make",
}
if '+blas' in spec:
config['BLAS_LIBS'] = spec['blas'].libs.joined()
if spec.satisfies("+blas"):
config["BLAS_LIBS"] = spec["blas"].libs.joined()
with open('make.inc', 'w') as inc:
with open("make.inc", "w") as inc:
for key in config:
inc.write('{0} = {1}\n'.format(key, config[key]))
inc.write(f"{key} = {config[key]}\n")
`elk <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/elk/package.py>`_
@@ -204,14 +204,14 @@ them in a list:
def edit(self, spec, prefix):
config = [
'INSTALL_DIR = {0}'.format(prefix),
'INCLUDE_DIR = $(INSTALL_DIR)/include',
'LIBRARY_DIR = $(INSTALL_DIR)/lib',
f"INSTALL_DIR = {prefix}",
"INCLUDE_DIR = $(INSTALL_DIR)/include",
"LIBRARY_DIR = $(INSTALL_DIR)/lib",
]
with open('make.inc', 'w') as inc:
with open("make.inc", "w") as inc:
for var in config:
inc.write('{0}\n'.format(var))
inc.write(f"{var}\n")
`hpl <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/hpl/package.py>`_
@@ -284,7 +284,7 @@ can tell Spack where to locate it like so:
.. code-block:: python
build_directory = 'src'
build_directory = "src"
^^^^^^^^^^^^^^^^^^^
@@ -299,8 +299,8 @@ install the package:
def install(self, spec, prefix):
mkdir(prefix.bin)
install('foo', prefix.bin)
install_tree('lib', prefix.lib)
install("foo", prefix.bin)
install_tree("lib", prefix.lib)
^^^^^^^^^^^^^^^^^^^^^^


@@ -48,8 +48,8 @@ class automatically adds the following dependencies:
.. code-block:: python
depends_on('java', type=('build', 'run'))
depends_on('maven', type='build')
depends_on("java", type=("build", "run"))
depends_on("maven", type="build")
In the ``pom.xml`` file, you may see sections like:
@@ -72,8 +72,8 @@ should add:
.. code-block:: python
depends_on('java@7:', type='build')
depends_on('maven@3.5.4:', type='build')
depends_on("java@7:", type="build")
depends_on("maven@3.5.4:", type="build")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -88,9 +88,9 @@ the build phase. For example:
def build_args(self):
return [
'-Pdist,native',
'-Dtar',
'-Dmaven.javadoc.skip=true'
"-Pdist,native",
"-Dtar",
"-Dmaven.javadoc.skip=true"
]


@@ -86,8 +86,8 @@ the ``MesonPackage`` base class already contains:
.. code-block:: python
depends_on('meson', type='build')
depends_on('ninja', type='build')
depends_on("meson", type="build")
depends_on("ninja", type="build")
If you need to specify a particular version requirement, you can
@@ -95,8 +95,8 @@ override this in your package:
.. code-block:: python
depends_on('meson@0.43.0:', type='build')
depends_on('ninja', type='build')
depends_on("meson@0.43.0:", type="build")
depends_on("ninja", type="build")
^^^^^^^^^^^^^^^^^^^
@@ -121,7 +121,7 @@ override the ``meson_args`` method like so:
.. code-block:: python
def meson_args(self):
return ['--warnlevel=3']
return ["--warnlevel=3"]
This method can be used to pass flags as well as variables.


@@ -118,7 +118,7 @@ so ``PerlPackage`` contains:
.. code-block:: python
extends('perl')
extends("perl")
If your package requires a specific version of Perl, you should
@@ -132,14 +132,14 @@ properly. If your package uses ``Makefile.PL`` to build, add:
.. code-block:: python
depends_on('perl-extutils-makemaker', type='build')
depends_on("perl-extutils-makemaker", type="build")
If your package uses ``Build.PL`` to build, add:
.. code-block:: python
depends_on('perl-module-build', type='build')
depends_on("perl-module-build", type="build")
^^^^^^^^^^^^^^^^^
@@ -165,11 +165,11 @@ arguments to ``Makefile.PL`` or ``Build.PL`` by overriding
.. code-block:: python
def configure_args(self):
expat = self.spec['expat'].prefix
expat = self.spec["expat"].prefix
return [
'EXPATLIBPATH={0}'.format(expat.lib),
'EXPATINCPATH={0}'.format(expat.include),
"EXPATLIBPATH={0}".format(expat.lib),
"EXPATINCPATH={0}".format(expat.include),
]


@@ -152,16 +152,16 @@ set. Once set, ``pypi`` will be used to define the ``homepage``,
.. code-block:: python
homepage = 'https://pypi.org/project/setuptools/'
url = 'https://pypi.org/packages/source/s/setuptools/setuptools-49.2.0.zip'
list_url = 'https://pypi.org/simple/setuptools/'
homepage = "https://pypi.org/project/setuptools/"
url = "https://pypi.org/packages/source/s/setuptools/setuptools-49.2.0.zip"
list_url = "https://pypi.org/simple/setuptools/"
is equivalent to:
.. code-block:: python
pypi = 'setuptools/setuptools-49.2.0.zip'
pypi = "setuptools/setuptools-49.2.0.zip"
If a package has a different homepage listed on PyPI, you can
@@ -208,7 +208,7 @@ dependencies to your package:
.. code-block:: python
depends_on('py-setuptools@42:', type='build')
depends_on("py-setuptools@42:", type="build")
Note that ``py-wheel`` is already listed as a build dependency in the
@@ -232,7 +232,7 @@ Look for dependencies under the following keys:
* ``dependencies`` under ``[project]``
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
add them with ``type=("build", "run")``.
* ``[project.optional-dependencies]``
@@ -279,12 +279,12 @@ distutils library, and has almost the exact same API. In addition to
* ``setup_requires``
These packages are usually only needed at build-time, so you can
add them with ``type='build'``.
add them with ``type="build"``.
* ``install_requires``
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
add them with ``type=("build", "run")``.
* ``extras_require``
@@ -296,7 +296,7 @@ distutils library, and has almost the exact same API. In addition to
These are packages that are required to run the unit tests for the
package. These dependencies can be specified using the
``type='test'`` dependency type. However, the PyPI tarballs rarely
``type="test"`` dependency type. However, the PyPI tarballs rarely
contain unit tests, so there is usually no reason to add these.
See https://setuptools.pypa.io/en/latest/userguide/dependency_management.html
@@ -321,7 +321,7 @@ older versions of flit may use the following keys:
* ``requires`` under ``[tool.flit.metadata]``
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
add them with ``type=("build", "run")``.
* ``[tool.flit.metadata.requires-extra]``
@@ -434,12 +434,12 @@ the BLAS/LAPACK library you want pkg-config to search for:
.. code-block:: python
depends_on('py-pip@22.1:', type='build')
depends_on("py-pip@22.1:", type="build")
def config_settings(self, spec, prefix):
return {
'blas': spec['blas'].libs.names[0],
'lapack': spec['lapack'].libs.names[0],
"blas": spec["blas"].libs.names[0],
"lapack": spec["lapack"].libs.names[0],
}
@@ -463,10 +463,10 @@ has an optional dependency on ``libyaml`` that can be enabled like so:
def global_options(self, spec, prefix):
options = []
if '+libyaml' in spec:
options.append('--with-libyaml')
if spec.satisfies("+libyaml"):
options.append("--with-libyaml")
else:
options.append('--without-libyaml')
options.append("--without-libyaml")
return options
@@ -492,10 +492,10 @@ allows you to specify the directories to search for ``libyaml``:
def install_options(self, spec, prefix):
options = []
if '+libyaml' in spec:
if spec.satisfies("+libyaml"):
options.extend([
spec['libyaml'].libs.search_flags,
spec['libyaml'].headers.include_flags,
spec["libyaml"].libs.search_flags,
spec["libyaml"].headers.include_flags,
])
return options
@@ -556,7 +556,7 @@ detected are wrong, you can provide the names yourself by overriding
.. code-block:: python
import_modules = ['six']
import_modules = ["six"]
Sometimes the list of module names to import depends on how the
@@ -571,9 +571,9 @@ This can be expressed like so:
@property
def import_modules(self):
modules = ['yaml']
if '+libyaml' in self.spec:
modules.append('yaml.cyaml')
modules = ["yaml"]
if self.spec.satisfies("+libyaml"):
modules.append("yaml.cyaml")
return modules
@@ -586,14 +586,14 @@ Instead of defining the ``import_modules`` explicitly, only the subset
of module names to be skipped can be defined by using ``skip_modules``.
If a defined module has submodules, they are skipped as well, e.g.,
in case the ``plotting`` modules should be excluded from the
automatically detected ``import_modules`` ``['nilearn', 'nilearn.surface',
'nilearn.plotting', 'nilearn.plotting.data']`` set:
automatically detected ``import_modules`` ``["nilearn", "nilearn.surface",
"nilearn.plotting", "nilearn.plotting.data"]`` set:
.. code-block:: python
skip_modules = ['nilearn.plotting']
skip_modules = ["nilearn.plotting"]
This will set ``import_modules`` to ``['nilearn', 'nilearn.surface']``
This will set ``import_modules`` to ``["nilearn", "nilearn.surface"]``
Import tests can be run during the installation using ``spack install
--test=root`` or at any time after the installation using
@@ -612,11 +612,11 @@ after the ``install`` phase:
.. code-block:: python
@run_after('install')
@run_after("install")
@on_package_attributes(run_tests=True)
def install_test(self):
with working_dir('spack-test', create=True):
python('-c', 'import numpy; numpy.test("full", verbose=2)')
with working_dir("spack-test", create=True):
python("-c", "import numpy; numpy.test('full', verbose=2)")
when testing is enabled during the installation (i.e., ``spack install
@@ -638,7 +638,7 @@ provides Python bindings in a ``python`` directory, you can use:
.. code-block:: python
build_directory = 'python'
build_directory = "python"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


@@ -83,7 +83,7 @@ base class already contains:
.. code-block:: python
depends_on('qt', type='build')
depends_on("qt", type="build")
If you want to specify a particular version requirement, or need to
@@ -91,7 +91,7 @@ link to the ``qt`` libraries, you can override this in your package:
.. code-block:: python
depends_on('qt@5.6.0:')
depends_on("qt@5.6.0:")
^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to qmake
@@ -103,7 +103,7 @@ override the ``qmake_args`` method like so:
.. code-block:: python
def qmake_args(self):
return ['-recursive']
return ["-recursive"]
This method can be used to pass flags as well as variables.
@@ -118,7 +118,7 @@ sub-directory by adding the following to the package:
.. code-block:: python
build_directory = 'src'
build_directory = "src"
^^^^^^^^^^^^^^^^^^^^^^


@@ -81,28 +81,27 @@ class of your package. For example, you can add it to your
class MyRocmPackage(CMakePackage, ROCmPackage):
...
# Ensure +rocm and amdgpu_targets are passed to dependencies
depends_on('mydeppackage', when='+rocm')
depends_on("mydeppackage", when="+rocm")
for val in ROCmPackage.amdgpu_targets:
depends_on('mydeppackage amdgpu_target={0}'.format(val),
when='amdgpu_target={0}'.format(val))
depends_on(f"mydeppackage amdgpu_target={val}",
when=f"amdgpu_target={val}")
...
def cmake_args(self):
spec = self.spec
args = []
...
if '+rocm' in spec:
if spec.satisfies("+rocm"):
# Set up the hip macros needed by the build
args.extend([
'-DENABLE_HIP=ON',
'-DHIP_ROOT_DIR={0}'.format(spec['hip'].prefix)])
rocm_archs = spec.variants['amdgpu_target'].value
if 'none' not in rocm_archs:
args.append('-DHIP_HIPCC_FLAGS=--amdgpu-target={0}'
.format(",".join(rocm_archs)))
"-DENABLE_HIP=ON",
f"-DHIP_ROOT_DIR={spec['hip'].prefix}"])
rocm_archs = spec.variants["amdgpu_target"].value
if "none" not in rocm_archs:
args.append(f"-DHIP_HIPCC_FLAGS=--amdgpu-target={','.join(rocm_archs}")
else:
# Ensure build with hip is disabled
args.append('-DENABLE_HIP=OFF')
args.append("-DENABLE_HIP=OFF")
...
return args
...
@@ -114,7 +113,7 @@ build.
This example also illustrates how to check for the ``rocm`` variant using
``self.spec`` and how to retrieve the ``amdgpu_target`` variant's value
using ``self.spec.variants['amdgpu_target'].value``.
using ``self.spec.variants["amdgpu_target"].value``.
All five packages using ``ROCmPackage`` as of January 2021 also use the
:ref:`CudaPackage <cudapackage>`. So it is worth looking at those packages


@@ -163,28 +163,28 @@ attributes that can be used to set ``homepage``, ``url``, ``list_url``, and
.. code-block:: python
cran = 'caret'
cran = "caret"
is equivalent to:
.. code-block:: python
homepage = 'https://cloud.r-project.org/package=caret'
url = 'https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz'
list_url = 'https://cloud.r-project.org/src/contrib/Archive/caret'
homepage = "https://cloud.r-project.org/package=caret"
url = "https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz"
list_url = "https://cloud.r-project.org/src/contrib/Archive/caret"
Likewise, the following ``bioc`` attribute:
.. code-block:: python
bioc = 'BiocVersion'
bioc = "BiocVersion"
is equivalent to:
.. code-block:: python
homepage = 'https://bioconductor.org/packages/BiocVersion/'
git = 'https://git.bioconductor.org/packages/BiocVersion'
homepage = "https://bioconductor.org/packages/BiocVersion/"
git = "https://git.bioconductor.org/packages/BiocVersion"
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -200,7 +200,7 @@ base class contains:
.. code-block:: python
extends('r')
extends("r")
Take a close look at the homepage for ``caret``. If you look at the
@@ -209,7 +209,7 @@ You should add this to your package like so:
.. code-block:: python
depends_on('r@3.2.0:', type=('build', 'run'))
depends_on("r@3.2.0:", type=("build", "run"))
^^^^^^^^^^^^^^
@@ -227,7 +227,7 @@ and list all of their dependencies in the following sections:
* LinkingTo
As far as Spack is concerned, all 3 of these dependency types
correspond to ``type=('build', 'run')``, so you don't have to worry
correspond to ``type=("build", "run")``, so you don't have to worry
about the details. If you are curious what they mean,
https://github.com/spack/spack/issues/2951 has a pretty good summary:
@@ -330,7 +330,7 @@ the dependency:
.. code-block:: python
depends_on('r-lattice@0.20:', type=('build', 'run'))
depends_on("r-lattice@0.20:", type=("build", "run"))
^^^^^^^^^^^^^^^^^^
@@ -361,20 +361,20 @@ like so:
.. code-block:: python
def configure_args(self):
mpi_name = self.spec['mpi'].name
mpi_name = self.spec["mpi"].name
# The type of MPI. Supported values are:
# OPENMPI, LAM, MPICH, MPICH2, or CRAY
if mpi_name == 'openmpi':
Rmpi_type = 'OPENMPI'
elif mpi_name == 'mpich':
Rmpi_type = 'MPICH2'
if mpi_name == "openmpi":
Rmpi_type = "OPENMPI"
elif mpi_name == "mpich":
Rmpi_type = "MPICH2"
else:
raise InstallError('Unsupported MPI type')
raise InstallError("Unsupported MPI type")
return [
'--with-Rmpi-type={0}'.format(Rmpi_type),
'--with-mpi={0}'.format(spec['mpi'].prefix),
"--with-Rmpi-type={0}".format(Rmpi_type),
"--with-mpi={0}".format(spec["mpi"].prefix),
]


@@ -84,8 +84,8 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
summary = 'An implementation of the AsciiDoc text processor and publishing toolchain'
description = 'A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats.'
summary = "An implementation of the AsciiDoc text processor and publishing toolchain"
description = "A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats."
Either of these can be used for the description of the Spack package.
@@ -98,7 +98,7 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
homepage = 'https://asciidoctor.org'
homepage = "https://asciidoctor.org"
This should be used as the official homepage of the Spack package.
@@ -112,21 +112,21 @@ the base class contains:
.. code-block:: python
extends('ruby')
extends("ruby")
The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
required_ruby_version = '>= 2.3.0'
required_ruby_version = ">= 2.3.0"
This can be added to the Spack package using:
.. code-block:: python
depends_on('ruby@2.3.0:', type=('build', 'run'))
depends_on("ruby@2.3.0:", type=("build", "run"))
^^^^^^^^^^^^^^^^^


@@ -57,7 +57,7 @@ overridden like so:
.. code-block:: python
def test(self):
scons('check')
scons("check")
^^^^^^^^^^^^^^^
@@ -88,7 +88,7 @@ base class already contains:
.. code-block:: python
depends_on('scons', type='build')
depends_on("scons", type="build")
If you want to specify a particular version requirement, you can override
@@ -96,7 +96,7 @@ this in your package:
.. code-block:: python
depends_on('scons@2.3.0:', type='build')
depends_on("scons@2.3.0:", type="build")
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -238,14 +238,14 @@ the package build phase. This is done by overriding ``build_args`` like so:
def build_args(self, spec, prefix):
args = [
'PREFIX={0}'.format(prefix),
'ZLIB={0}'.format(spec['zlib'].prefix),
f"PREFIX={prefix}",
f"ZLIB={spec['zlib'].prefix}",
]
if '+debug' in spec:
args.append('DEBUG=yes')
if spec.satisfies("+debug"):
args.append("DEBUG=yes")
else:
args.append('DEBUG=no')
args.append("DEBUG=no")
return args
@@ -275,8 +275,8 @@ environment variables. For example, cantera has the following option:
* env_vars: [ string ]
Environment variables to propagate through to SCons. Either the
string "all" or a comma separated list of variable names, e.g.
'LD_LIBRARY_PATH,HOME'.
- default: 'LD_LIBRARY_PATH,PYTHONPATH'
"LD_LIBRARY_PATH,HOME".
- default: "LD_LIBRARY_PATH,PYTHONPATH"
In the case of cantera, using ``env_vars=all`` allows us to use


@@ -124,7 +124,7 @@ are wrong, you can provide the names yourself by overriding
.. code-block:: python
import_modules = ['PyQt5']
import_modules = ["PyQt5"]
These tests often catch missing dependencies and non-RPATHed


@@ -63,8 +63,8 @@ run package-specific unit tests.
.. code-block:: python
def installtest(self):
with working_dir('test'):
pytest = which('py.test')
with working_dir("test"):
pytest = which("py.test")
pytest()
@@ -93,7 +93,7 @@ the following dependency automatically:
.. code-block:: python
depends_on('python@2.5:', type='build')
depends_on("python@2.5:", type="build")
Waf only supports Python 2.5 and up.
@@ -113,7 +113,7 @@ phase, you can use:
args = []
if self.run_tests:
args.append('--test')
args.append("--test")
return args


@@ -204,6 +204,7 @@ def setup(sphinx):
("py:class", "clingo.Control"),
("py:class", "six.moves.urllib.parse.ParseResult"),
("py:class", "TextIO"),
("py:class", "hashlib._Hash"),
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),


@@ -304,3 +304,17 @@ To work properly, this requires your terminal to reset its title after
Spack has finished its work, otherwise Spack's status information will
remain in the terminal's title indefinitely. Most terminals should already
be set up this way and clear Spack's status information.
-----------
``aliases``
-----------
Aliases can be used to define new Spack commands. They can be either shortcuts
for longer commands or include specific arguments for convenience. For instance,
if users want to use ``spack install``'s ``-v`` argument all the time, they can
create a new alias called ``inst`` that will always call ``install -v``:
.. code-block:: yaml

   aliases:
     inst: install -v

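An alias can also simply shorten a longer command name; for example (the ``cl`` name is
just an illustration):

.. code-block:: yaml

   aliases:
     cl: clean --all
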


@@ -17,7 +17,7 @@ case you want to skip directly to specific docs:
* :ref:`config.yaml <config-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <build-settings>`
* :ref:`packages.yaml <packages-config>`
* :ref:`repos.yaml <repositories>`
You can also add any of these as inline configuration in the YAML
@@ -243,9 +243,11 @@ lower-precedence settings. Completely ignoring higher-level configuration
options is supported with the ``::`` notation for keys (see
:ref:`config-overrides` below).
There are also special notations for string concatenation and precendense override.
Using the ``+:`` notation can be used to force *prepending* strings or lists. For lists, this is identical
to the default behavior. Using the ``-:`` works similarly, but for *appending* values.
There are also special notations for string concatenation and precedence override:
* ``+:`` will force *prepending* strings or lists. For lists, this is the default behavior.
* ``-:`` works similarly, but for *appending* values.
:ref:`config-prepend-append`
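As a brief sketch (using the list-valued ``build_stage`` setting from ``config.yaml``; the
extra path is only an example), a higher-precedence scope could force-prepend an entry with:

.. code-block:: yaml

   config:
     build_stage+: [/tmp/$user/extra-stage]
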
^^^^^^^^^^^


@@ -24,6 +24,16 @@ image, or to set up a proper entrypoint to run the image. These tasks are
usually both necessary and repetitive, so Spack comes with a command
to generate recipes for container images starting from a ``spack.yaml``.
.. seealso::
This page is a reference for generating recipes to build container images.
This means that your environment is built from scratch inside the container
runtime.
Since v0.21, Spack can also create container images from existing package installations
on your host system. See :ref:`binary_caches_oci` for more information on
that topic.
--------------------
A Quick Introduction
--------------------
@@ -212,18 +222,12 @@ under the ``container`` attribute of environments:
final:
- libgomp
# Extra instructions
extra_instructions:
final: |
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ "' >> ~/.bashrc
# Labels for the image
labels:
app: "gromacs"
mpi: "mpich"
A detailed description of the options available can be found in the
:ref:`container_config_options` section.
A detailed description of the options available can be found in the :ref:`container_config_options` section.
-------------------
Setting Base Images
@@ -525,6 +529,13 @@ the example below:
COPY data /share/myapp/data
{% endblock %}
The Dockerfile is generated by running:
.. code-block:: console
$ spack -e /opt/environment containerize
Note that the environment must be active for Spack to read the template.
The recipe that gets generated contains the two extra instructions that we added in our template extension:
.. code-block:: Dockerfile


@@ -9,46 +9,42 @@
Custom Extensions
=================
*Spack extensions* permit you to extend Spack capabilities by deploying your
*Spack extensions* allow you to extend Spack capabilities by deploying your
own custom commands or logic in an arbitrary location on your filesystem.
This might be extremely useful, e.g., to develop and maintain a command whose purpose is
too specific to be considered for reintegration into the mainline, or to
evolve a command through its early stages before starting a discussion to merge
it upstream.
From Spack's point of view an extension is any path in your filesystem which
respects a prescribed naming and layout for files:
respects the following naming and layout for files:
.. code-block:: console
spack-scripting/ # The top level directory must match the format 'spack-{extension_name}'
├── pytest.ini # Optional file if the extension ships its own tests
├── scripting # Folder that may contain modules that are needed for the extension commands
│   └── cmd # Folder containing extension commands
│   └── filter.py # A new command that will be available
├── tests # Tests for this extension
│   ├── cmd # Folder containing extension commands
│   │   └── filter.py # A new command that will be available
│   └── functions.py # Module with internal details
└── tests # Tests for this extension
│ ├── conftest.py
│ └── test_filter.py
└── templates # Templates that may be needed by the extension
In the example above the extension named *scripting* adds an additional command (``filter``)
and unit tests to verify its behavior. The code for this example can be
obtained by cloning the corresponding git repository:
In the example above, the extension is named *scripting*. It adds an additional command
(``spack filter``) and unit tests to verify its behavior.
.. TODO: write an ad-hoc "hello world" extension and make it part of the spack organization
The extension can import any core Spack module in its implementation. When loaded by
the ``spack`` command, the extension itself is imported as a Python package in the
``spack.extensions`` namespace. In the example above, since the extension is named
"scripting", the corresponding Python module is ``spack.extensions.scripting``.
The code for this example extension can be obtained by cloning the corresponding git repository:
.. code-block:: console
$ cd ~/
$ mkdir tmp && cd tmp
$ git clone https://github.com/alalazo/spack-scripting.git
Cloning into 'spack-scripting'...
remote: Counting objects: 11, done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 11 (delta 0), reused 11 (delta 0), pack-reused 0
Receiving objects: 100% (11/11), done.
As you can see by inspecting the sources, Python modules that are part of the extension
can import any core Spack module.
$ git -C /tmp clone https://github.com/spack/spack-scripting.git
---------------------------------
Configure Spack to Use Extensions
@@ -61,7 +57,7 @@ paths to ``config.yaml``. In the case of our example this means ensuring that:
config:
extensions:
- ~/tmp/spack-scripting
- /tmp/spack-scripting
is part of your configuration file. Once this is set up, any command that the extension provides
will be available from the command line:
@@ -86,37 +82,32 @@ will be available from the command line:
--implicit select specs that are not installed or were installed implicitly
--output OUTPUT where to dump the result
The corresponding unit tests can be run by giving the appropriate options to ``spack unit-test``:
.. code-block:: console
$ spack unit-test --extension=scripting
========================================== test session starts ===========================================
platform linux -- Python 3.11.5, pytest-7.4.3, pluggy-1.3.0
rootdir: /home/culpo/github/spack-scripting
configfile: pytest.ini
testpaths: tests
plugins: xdist-3.5.0
collected 5 items

tests/test_filter.py ..... [100%]

========================================== slowest 30 durations ==========================================
2.31s setup tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.57s call tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.56s call tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.48s call tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================
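
For reference, such a parametrized test might be structured as in the following sketch. This is a
hypothetical outline: the parameters and the ``filter_specs`` helper referenced in the comment are
made up for illustration, and the real tests live in the extension's ``tests`` folder:
.. code-block:: python

   # tests/test_filter.py -- hypothetical sketch of a parametrized extension test
   import pytest

   import spack.spec


   @pytest.mark.parametrize(
       "kwargs,specs,expected",
       [
           # Hypothetical case: no filtering options, everything passes through.
           ({}, ["zlib", "ncurses"], ["zlib", "ncurses"]),
       ],
   )
   def test_filtering_specs(kwargs, specs, expected):
       parsed = [spack.spec.Spec(s) for s in specs]
       # The real test would call the extension's own filtering helper here,
       # e.g. something like functions.filter_specs(parsed, **kwargs).
       filtered = parsed
       assert [s.name for s in filtered] == expected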


@@ -0,0 +1,77 @@
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
==========================
Frequently Asked Questions
==========================
This page contains answers to frequently asked questions about Spack.
If you have questions that are not answered here, feel free to ask on
`Slack <https://slack.spack.io>`_ or `GitHub Discussions
<https://github.com/spack/spack/discussions>`_. If you've learned the
answer to a question that you think should be here, please consider
contributing to this page.
.. _faq-concretizer-precedence:
-----------------------------------------------------
Why does Spack pick particular versions and variants?
-----------------------------------------------------
This question comes up in a variety of forms:
1. Why does Spack seem to ignore my package preferences from ``packages.yaml`` config?
2. Why does Spack toggle a variant instead of using the default from the ``package.py`` file?
The short answer is that Spack always picks an optimal configuration
based on a complex set of criteria\ [#f1]_. These criteria are more nuanced
than always choosing the latest versions or default variants.
.. note::
As a rule of thumb: requirements + constraints > reuse > preferences > defaults.
The following criteria, from lowest to highest precedence, explain
common cases where concretization output may seem surprising at first.
1. :ref:`Package preferences <package-preferences>` configured in ``packages.yaml``
override variant defaults from ``package.py`` files, and influence the optimal
ordering of versions. Preferences are specified as follows:
.. code-block:: yaml
packages:
  foo:
    version: [1.0, 1.1]
    variants: ~mpi
2. :ref:`Reuse concretization <concretizer-options>` configured in ``concretizer.yaml``
overrides preferences, since it's typically faster to reuse an existing spec than to
build a preferred one from sources. When build caches are enabled, specs may be reused
from a remote location too. Reuse concretization is configured as follows:
.. code-block:: yaml
concretizer:
  reuse: dependencies  # other options are 'true' and 'false'
3. :ref:`Package requirements <package-requirements>` configured in ``packages.yaml``,
and constraints from the command line as well as ``package.py`` files override all
of the above. Requirements are specified as follows:
.. code-block:: yaml
packages:
  foo:
    require:
    - "@1.2: +mpi"
Requirements and constraints restrict the set of possible solutions, while reuse
behavior and preferences influence what an optimal solution looks like.
.. rubric:: Footnotes
.. [#f1] The exact list of criteria can be retrieved with the ``spack solve`` command.


@@ -111,3 +111,28 @@ CUDA is split into fewer components and is simpler to specify:
prefix: /opt/cuda/cuda-11.0.2/
where ``/opt/cuda/cuda-11.0.2/lib/`` contains ``libcudart.so``.
-----------------------------------
Using an External OpenGL API
-----------------------------------
Depending on whether we have a graphics card or not, we may choose to use OSMesa or GLX to implement the OpenGL API.
If a graphics card is unavailable, OSMesa is recommended and can typically be built with Spack.
However, if we prefer to utilize the system GLX tailored to our graphics card, we need to declare it as an external. Here's how to do it:
.. code-block:: yaml
packages:
  libglx:
    require: [opengl]
  opengl:
    buildable: false
    externals:
    - prefix: /usr/
      spec: opengl@4.6
Note that the prefix has to be the root of both the libraries and the headers (i.e. /usr, not the path to the lib directory).
To find out which ``opengl`` spec is available, use ``cd /usr/include/GL && grep -Ri gl_version``.


@@ -0,0 +1,534 @@
[Added file: Graphviz-generated SVG (534 lines) showing the dependency graph of strumpack@7.0.1%gcc@9.4.0/idvshq5, with dependencies such as netlib-scalapack, openblas, intel-parallel-studio (MPI), butterflypack, parmetis, metis, slate, zfp, blaspp, lapackpp, arpack-ng, cmake, perl, zlib, openssl, ncurses, pkgconf, and patchelf; edges are labeled with the virtuals they provide (blas, lapack, mpi, scalapack, pkgconfig, iconv).]
</g>
<!-- v32wejd4d5lc6uka4qlrogwh5xae2h3r -->
<g id="node26" class="node">
<title>v32wejd4d5lc6uka4qlrogwh5xae2h3r</title>
<path fill="#ff7f50" stroke="#000000" stroke-width="4" d="M1306.1776,-404.3002C1306.1776,-404.3002 929.9616,-404.3002 929.9616,-404.3002 923.9616,-404.3002 917.9616,-398.3002 917.9616,-392.3002 917.9616,-392.3002 917.9616,-329.6998 917.9616,-329.6998 917.9616,-323.6998 923.9616,-317.6998 929.9616,-317.6998 929.9616,-317.6998 1306.1776,-317.6998 1306.1776,-317.6998 1312.1776,-317.6998 1318.1776,-323.6998 1318.1776,-329.6998 1318.1776,-329.6998 1318.1776,-392.3002 1318.1776,-392.3002 1318.1776,-398.3002 1312.1776,-404.3002 1306.1776,-404.3002"/>
<text text-anchor="middle" x="1118.0696" y="-353.8" font-family="Monaco" font-size="24.00" fill="#000000">readline@8.2%gcc@9.4.0/v32wejd</text>
</g>
<!-- uabgssx6lsgrevwbttslldnr5nzguprj&#45;&gt;v32wejd4d5lc6uka4qlrogwh5xae2h3r -->
<g id="edge7" class="edge">
<title>uabgssx6lsgrevwbttslldnr5nzguprj-&gt;v32wejd4d5lc6uka4qlrogwh5xae2h3r</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1117.0696,-476.4072C1117.0696,-457.3263 1117.0696,-434.7257 1117.0696,-414.6046"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1119.0696,-476.4072C1119.0696,-457.3263 1119.0696,-434.7257 1119.0696,-414.6046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1121.5697,-414.3403 1118.0696,-404.3403 1114.5697,-414.3404 1121.5697,-414.3403"/>
</g>
<!-- lfh3aovn65e66cs24qiehq3nd2ddojef&#45;&gt;o524gebsxavobkte3k5fglgwnedfkadf -->
<g id="edge14" class="edge">
<title>lfh3aovn65e66cs24qiehq3nd2ddojef-&gt;o524gebsxavobkte3k5fglgwnedfkadf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1167.6711,-1112.5788C1078.9073,-1090.9596 971.5916,-1064.822 881.5513,-1042.892"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1168.1444,-1110.6356C1079.3806,-1089.0165 972.0649,-1062.8788 882.0246,-1040.9488"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="882.5603,-1038.5062 872.016,-1039.5403 880.9038,-1045.3074 882.5603,-1038.5062"/>
<text text-anchor="middle" x="963.904" y="-1079.817" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=blas,lapack</text>
</g>
<!-- lfh3aovn65e66cs24qiehq3nd2ddojef&#45;&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw -->
<g id="edge31" class="edge">
<title>lfh3aovn65e66cs24qiehq3nd2ddojef-&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1559.7922,-1112.1043C1562.8511,-1111.5975 1565.8904,-1111.1002 1568.9103,-1110.6128 1759.2182,-1079.8992 1973.2397,-1052.1328 2144.6143,-1031.5343"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1560.1191,-1114.0774C1563.1741,-1113.5712 1566.2134,-1113.0739 1569.2289,-1112.5872 1759.4755,-1081.8826 1973.497,-1054.1161 2144.8529,-1033.52"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2145.1529,-1036.002 2154.6648,-1031.3357 2144.3191,-1029.0518 2145.1529,-1036.002"/>
<text text-anchor="middle" x="1828.178" y="-1072.4692" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=mpi</text>
</g>
<!-- lfh3aovn65e66cs24qiehq3nd2ddojef&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge21" class="edge">
<title>lfh3aovn65e66cs24qiehq3nd2ddojef-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1346.0696,-1111.6072C1346.0696,-1092.5263 1346.0696,-1069.9257 1346.0696,-1049.8046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1349.5697,-1049.5403 1346.0696,-1039.5403 1342.5697,-1049.5404 1349.5697,-1049.5403"/>
</g>
<!-- 2w3nq3n3hcj2tqlvcpewsryamltlu5tw&#45;&gt;htzjns66gmq6pjofohp26djmjnpbegho -->
<g id="edge30" class="edge">
<title>2w3nq3n3hcj2tqlvcpewsryamltlu5tw-&gt;htzjns66gmq6pjofohp26djmjnpbegho</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2467.0696,-952.8072C2467.0696,-933.7263 2467.0696,-911.1257 2467.0696,-891.0046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2470.5697,-890.7403 2467.0696,-880.7403 2463.5697,-890.7404 2470.5697,-890.7403"/>
</g>
<!-- 7rzbmgoxhmm2jhellkgcjmn62uklf22x&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge2" class="edge">
<title>7rzbmgoxhmm2jhellkgcjmn62uklf22x-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1422.351,-1429.2133C1312.2528,-1388.8872 1171.1589,-1316.8265 1103.0696,-1198.4 1083.8409,-1164.956 1082.4563,-1144.2088 1103.0696,-1111.6 1121.4102,-1082.5864 1149.2483,-1060.7204 1179.6189,-1044.2895"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1181.4205,-1047.2977 1188.6801,-1039.5809 1178.1927,-1041.0863 1181.4205,-1047.2977"/>
</g>
<!-- v32wejd4d5lc6uka4qlrogwh5xae2h3r&#45;&gt;j5rupoqliu7kasm6xndl7ui32wgawkru -->
<g id="edge39" class="edge">
<title>v32wejd4d5lc6uka4qlrogwh5xae2h3r-&gt;j5rupoqliu7kasm6xndl7ui32wgawkru</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1179.8001,-316.7866C1209.2065,-296.3053 1244.4355,-271.7686 1274.8343,-250.5961"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1180.9431,-318.4278C1210.3495,-297.9465 1245.5785,-273.4098 1275.9774,-252.2373"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1277.6375,-254.1277 1283.8429,-245.5403 1273.6367,-248.3836 1277.6375,-254.1277"/>
</g>
<!-- gguve5icmo5e4cw5o3hvvfsxremc46if&#45;&gt;j5rupoqliu7kasm6xndl7ui32wgawkru -->
<g id="edge18" class="edge">
<title>gguve5icmo5e4cw5o3hvvfsxremc46if-&gt;j5rupoqliu7kasm6xndl7ui32wgawkru</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1345.0696,-952.7909C1345.0696,-891.6316 1345.0696,-776.6094 1345.0696,-678.6 1345.0696,-678.6 1345.0696,-678.6 1345.0696,-519.8 1345.0696,-426.9591 1345.0696,-318.8523 1345.0696,-255.7237"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1347.0696,-952.7909C1347.0696,-891.6316 1347.0696,-776.6094 1347.0696,-678.6 1347.0696,-678.6 1347.0696,-678.6 1347.0696,-519.8 1347.0696,-426.9591 1347.0696,-318.8523 1347.0696,-255.7237"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1349.5697,-255.6091 1346.0696,-245.6091 1342.5697,-255.6092 1349.5697,-255.6091"/>
</g>
<!-- gguve5icmo5e4cw5o3hvvfsxremc46if&#45;&gt;5xerf6imlgo4xlubacr4mljacc3edexo -->
<g id="edge40" class="edge">
<title>gguve5icmo5e4cw5o3hvvfsxremc46if-&gt;5xerf6imlgo4xlubacr4mljacc3edexo</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1423.1858,-951.9344C1460.2844,-931.1905 1504.8229,-906.2866 1543.0151,-884.9312"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1424.1619,-953.68C1461.2605,-932.9361 1505.799,-908.0322 1543.9912,-886.6769"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1545.5391,-888.6757 1552.5592,-880.7403 1542.1228,-882.5659 1545.5391,-888.6757"/>
</g>
</g>
</svg>

After

Width:  |  Height:  |  Size: 58 KiB

View File

@@ -55,6 +55,7 @@ or refer to the full manual below.
getting_started
basic_usage
replace_conda_homebrew
frequently_asked_questions
.. toctree::
:maxdepth: 2
@@ -70,7 +71,7 @@ or refer to the full manual below.
configuration
config_yaml
bootstrapping
packages_yaml
build_settings
environments
containers
@@ -78,6 +79,7 @@ or refer to the full manual below.
module_file_support
repositories
binary_caches
bootstrapping
command_index
chain
extensions

View File

@@ -519,11 +519,11 @@ inspections and customize them per-module-set.
modules:
prefix_inspections:
bin:
./bin:
- PATH
man:
./man:
- MANPATH
'':
./:
- CMAKE_PREFIX_PATH
Prefix inspections are only applied if the relative path inside the
@@ -579,7 +579,7 @@ the view.
view_relative_modules:
use_view: my_view
prefix_inspections:
bin:
./bin:
- PATH
view:
my_view:

View File

@@ -0,0 +1,591 @@
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _packages-config:
================================
Package Settings (packages.yaml)
================================
Spack allows you to customize how your software is built through the
``packages.yaml`` file. Using it, you can make Spack prefer particular
implementations of virtual dependencies (e.g., MPI or BLAS/LAPACK),
or you can make it prefer to build with particular compilers. You can
also tell Spack to use *external* software installations already
present on your system.
At a high level, the ``packages.yaml`` file is structured like this:
.. code-block:: yaml
packages:
package1:
# settings for package1
package2:
# settings for package2
# ...
all:
# settings that apply to all packages.
So you can either set build preferences specifically for *one* package,
or you can specify that certain settings should apply to *all* packages.
The types of settings you can customize are described in detail below.
Spack's build defaults are in the default
``etc/spack/defaults/packages.yaml`` file. You can override them in
``~/.spack/packages.yaml`` or ``etc/spack/packages.yaml``. For more
details on how this works, see :ref:`configuration-scopes`.
.. _sec-external-packages:
-----------------
External Packages
-----------------
Spack can be configured to use externally-installed
packages rather than building its own packages. This may be desirable
if machines ship with system packages, such as a customized MPI
that should be used instead of Spack building its own MPI.
External packages are configured through the ``packages.yaml`` file.
Here's an example of an external configuration:
.. code-block:: yaml
packages:
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
This example lists three installations of OpenMPI, one built with GCC,
one built with GCC and debug information, and another built with Intel.
If Spack is asked to build a package that uses one of these MPIs as a
dependency, it will use the pre-installed OpenMPI in
the given directory. Note that the specified path is the top-level
install prefix, not the ``bin`` subdirectory.
``packages.yaml`` can also be used to specify modules to load instead
of the installation prefixes. The following example says that module
``CMake/3.7.2`` provides cmake version 3.7.2.
.. code-block:: yaml
cmake:
externals:
- spec: cmake@3.7.2
modules:
- CMake/3.7.2
Each ``packages.yaml`` begins with a ``packages:`` attribute, followed
by a list of package names. To specify externals, add an ``externals:``
attribute under the package name, which lists externals.
Each external should specify a ``spec:`` string that is as
well-defined as reasonably possible. If an external spec lacks a
component, such as a compiler or a package version, then Spack will
guess the missing component based on its most-favored packages, and it
may guess incorrectly.
Each package version and compiler listed in an external should
have entries in Spack's packages and compiler configuration, even
though the package and compiler may not ever be built.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Prevent packages from being built from sources
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding an external spec in ``packages.yaml`` allows Spack to use an external location,
but it does not prevent Spack from building packages from sources. In the above example,
Spack might choose for many valid reasons to start building and linking with the
latest version of OpenMPI rather than continue using the pre-installed OpenMPI versions.
To prevent this, the ``packages.yaml`` configuration also allows packages
to be flagged as non-buildable. The previous example could be modified to
be:
.. code-block:: yaml
packages:
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
buildable: False
The addition of the ``buildable`` flag tells Spack that it should never build
its own version of OpenMPI from sources, and it will instead always rely on a pre-built
OpenMPI.
.. note::
If ``concretizer:reuse`` is on (see :ref:`concretizer-options` for more information on that flag)
pre-built specs include specs already available from a local store, an upstream store, a registered
buildcache or specs marked as externals in ``packages.yaml``. If ``concretizer:reuse`` is off, only
external specs in ``packages.yaml`` are included in the list of pre-built specs.
If an external module is specified as not buildable, then Spack will load the
external module into the build environment, where it can be used for linking.
The ``buildable`` flag does not need to be paired with external packages.
It could also be used alone to forbid packages that may be
buggy or otherwise undesirable.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Non-buildable virtual packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Virtual packages in Spack can also be specified as not buildable, and
external implementations can be provided. In the example above,
OpenMPI is configured as not buildable, but Spack will often prefer
other MPI implementations over the externally available OpenMPI. Spack
could be configured with every MPI provider marked as not buildable
individually, but it is more convenient to mark the ``mpi`` virtual
package as a whole:
.. code-block:: yaml
packages:
mpi:
buildable: False
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
Spack can then use any of the listed external implementations of MPI
to satisfy a dependency, and will choose depending on the compiler and
architecture.
In cases where the concretizer is configured to reuse specs, and other ``mpi`` providers
(available via stores or buildcaches) are not wanted, Spack can be configured to require
specs matching only the available externals:
.. code-block:: yaml
packages:
mpi:
buildable: False
require:
- one_of: [
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64",
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug",
"openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
]
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
This configuration prevents any spec that uses MPI and originates from stores or buildcaches from being
reused, unless it matches the requirements under ``packages:mpi:require``. For more information on requirements, see
:ref:`package-requirements`.
.. _cmd-spack-external-find:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Automatically Find External Packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can run the :ref:`spack external find <spack-external-find>` command
to search for system-provided packages and add them to ``packages.yaml``.
After running this command your ``packages.yaml`` may include new entries:
.. code-block:: yaml
packages:
cmake:
externals:
- spec: cmake@3.17.2
prefix: /usr
Generally this is useful for detecting a small set of commonly-used packages;
for now this is generally limited to finding build-only dependencies.
Specific limitations include:
* Packages are not discoverable by default: For a package to be
discoverable with ``spack external find``, it needs to add special
logic. See :ref:`here <make-package-findable>` for more details.
* The logic does not search through module files; it can only detect
packages with executables defined in ``PATH``. You can help Spack locate
externals that use module files by loading any associated modules for
packages that you want Spack to know about before running
``spack external find``.
* Spack does not overwrite existing entries in the package configuration:
If there is an external defined for a spec at any configuration scope,
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
.. _package-requirements:
--------------------
Package Requirements
--------------------
Spack can be configured to always use certain compilers, package
versions, and variants during concretization through package
requirements.
Package requirements are useful when you find yourself repeatedly
specifying the same constraints on the command line, and wish that
Spack would respect these constraints whether you mention them explicitly
or not. Another use case is specifying constraints that should apply
to all root specs in an environment, without having to repeat the
constraint everywhere.
Apart from that, requirements config is more flexible than constraints
on the command line, because it can specify constraints on packages
*when they occur* as a dependency. In contrast, on the command line it
is not possible to specify constraints on dependencies while also keeping
those dependencies optional.
.. seealso::
FAQ: :ref:`Why does Spack pick particular versions and variants? <faq-concretizer-precedence>`
^^^^^^^^^^^^^^^^^^^
Requirements syntax
^^^^^^^^^^^^^^^^^^^
The package requirements configuration is specified in ``packages.yaml``,
keyed by package name and expressed using the Spec syntax. In the simplest
case you can specify attributes that you always want the package to have
by providing a single spec string to ``require``:
.. code-block:: yaml
packages:
libfabric:
require: "@1.13.2"
In the above example, ``libfabric`` will always build with version 1.13.2. If you
need to compose multiple configuration scopes, ``require`` accepts a list of
strings:
.. code-block:: yaml
packages:
libfabric:
require:
- "@1.13.2"
- "%gcc"
In this case ``libfabric`` will always build with version 1.13.2 **and** using GCC
as a compiler.
For more complex use cases, ``require`` also accepts a list of objects. These objects
must have either an ``any_of`` or a ``one_of`` field containing a list of spec strings,
and they can optionally have ``when`` and ``message`` attributes:
.. code-block:: yaml
packages:
openmpi:
require:
- any_of: ["@4.1.5", "%gcc"]
message: "in this example only 4.1.5 can build with other compilers"
``any_of`` is a list of specs. One of those specs must be satisfied,
and the concretized spec is also allowed to match more than one of them.
In the above example, that means you could build ``openmpi@4.1.5%gcc``,
``openmpi@4.1.5%clang`` or ``openmpi@3.9%gcc``, but
not ``openmpi@3.9%clang``.
If a custom message is provided, and the requirement is not satisfiable,
Spack will print the custom error message:
.. code-block:: console
$ spack spec openmpi@3.9%clang
==> Error: in this example only 4.1.5 can build with other compilers
We could express a similar requirement using the ``when`` attribute:
.. code-block:: yaml
packages:
openmpi:
require:
- any_of: ["%gcc"]
when: "@:4.1.4"
message: "in this example only 4.1.5 can build with other compilers"
In the example above, if the version turns out to be 4.1.4 or less, we require the compiler to be GCC.
For readability, Spack also allows a ``spec`` key accepting a string when there is only a single
constraint:
.. code-block:: yaml
packages:
openmpi:
require:
- spec: "%gcc"
when: "@:4.1.4"
message: "in this example only 4.1.5 can build with other compilers"
This code snippet and the one before it are semantically equivalent.
Finally, instead of ``any_of`` you can use ``one_of``, which also takes a list of specs. The final
concretized spec must match exactly one of them:
.. code-block:: yaml
packages:
mpich:
require:
- one_of: ["+cuda", "+rocm"]
In the example above, that means you could build ``mpich+cuda`` or ``mpich+rocm`` but not ``mpich+cuda+rocm``.
.. note::
For ``any_of`` and ``one_of``, the order of specs indicates a
preference: items that appear earlier in the list are preferred
(note that these preferences can be ignored in favor of others).
.. note::
When using a conditional requirement, Spack is allowed to actively avoid the triggering
condition (the ``when=...`` spec) if that leads to a concrete spec with better scores in
the optimization criteria. To check the current optimization criteria and their
priorities you can run ``spack solve zlib``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting default requirements
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can also set default requirements for all packages under ``all``
like this:
.. code-block:: yaml
packages:
all:
require: '%clang'
which means every spec will be required to use ``clang`` as a compiler.
Requirements on variants for all packages are possible too, but note that they
are only enforced for packages that define these variants; otherwise they
are disregarded. For example:
.. code-block:: yaml
packages:
all:
require:
- "+shared"
- "+cuda"
will just enforce ``+shared`` on ``zlib``, which has a boolean ``shared`` variant but
no ``cuda`` variant.
Constraints in a single spec literal are always considered as a whole, so in a case like:
.. code-block:: yaml
packages:
all:
require: "+shared +cuda"
the default requirement will be enforced only if a package has both a ``cuda`` and
a ``shared`` variant, and will never be partially enforced.
Finally, ``all`` represents a *default set of requirements*:
if there are specific package requirements, then the default requirements
under ``all`` are disregarded. For example, with a configuration like this:
.. code-block:: yaml
packages:
all:
require:
- 'build_type=Debug'
- '%clang'
cmake:
require:
- 'build_type=Debug'
- '%gcc'
Spack requires ``cmake`` to use ``gcc`` and all other nodes (including ``cmake``
dependencies) to use ``clang``. If ``build_type=Debug`` also needs to be enforced
on ``cmake``, it must be repeated in the ``cmake``-specific requirements, as shown above.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting requirements on virtual specs
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A requirement on a virtual spec applies whenever that virtual is present in the DAG.
This can be useful for fixing which virtual provider you want to use:
.. code-block:: yaml
packages:
mpi:
require: 'mvapich2 %gcc'
With the configuration above the only allowed ``mpi`` provider is ``mvapich2 %gcc``.
Requirements on the virtual spec and on the specific provider are both applied, if
present. For instance with a configuration like:
.. code-block:: yaml
packages:
mpi:
require: 'mvapich2 %gcc'
mvapich2:
require: '~cuda'
you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
.. _package-preferences:
-------------------
Package Preferences
-------------------
In some cases package requirements can be too strong, and package
preferences are the better option. Package preferences do not impose
constraints on packages for particular versions or variant values;
they only set defaults. The concretizer is free to change
them if it must, due to other constraints, and it also prefers reusing
installed packages over building new ones that would be a better match
for the preferences.
.. seealso::
FAQ: :ref:`Why does Spack pick particular versions and variants? <faq-concretizer-precedence>`
Most package preferences (``compilers``, ``target`` and ``providers``)
can only be set globally under the ``all`` section of ``packages.yaml``:
.. code-block:: yaml
packages:
all:
compiler: [gcc@12.2.0, clang@12:, oneapi@2023:]
target: [x86_64_v3]
providers:
mpi: [mvapich2, mpich, openmpi]
These preferences override Spack's default and effectively reorder priorities
when looking for the best compiler, target or virtual package provider. Each
preference takes an ordered list of spec constraints, with earlier entries in
the list being preferred over later entries.
In the example above all packages prefer to be compiled with ``gcc@12.2.0``,
to target the ``x86_64_v3`` microarchitecture and to use ``mvapich2`` if they
depend on ``mpi``.
The ``variants`` and ``version`` preferences can be set under
package specific sections of the ``packages.yaml`` file:
.. code-block:: yaml
packages:
opencv:
variants: +debug
gperftools:
version: [2.2, 2.4, 2.3]
In this case, the preference for ``opencv`` is to build with debug options, while
``gperftools`` prefers version 2.2 over 2.4.
Any preference can be overridden on the command line if explicitly requested.
Preferences cannot overcome explicit constraints, as they only set a preferred
ordering among homogeneous attribute values. Going back to the example, if
``gperftools@2.3:`` were requested, then Spack would install version 2.4,
since the most-preferred version, 2.2, is prohibited by the version constraint.
.. _package_permissions:
-------------------
Package Permissions
-------------------
Spack can be configured to assign permissions to the files installed
by a package.
In the ``packages.yaml`` file under ``permissions``, the attributes
``read``, ``write``, and ``group`` control the package
permissions. These attributes can be set per-package, or for all
packages under ``all``. If permissions are set under ``all`` and for a
specific package, the package-specific settings take precedence.
The ``read`` and ``write`` attributes take one of ``user``, ``group``,
or ``world``.
.. code-block:: yaml
packages:
all:
permissions:
write: group
group: spack
my_app:
permissions:
read: group
group: my_team
The permissions settings describe the broadest level of access to
installations of the specified packages. The execute permissions of
the file are set to the same level as read permissions for those files
that are executable. The default setting for ``read`` is ``world``,
and for ``write`` is ``user``. In the example above, installations of
``my_app`` will be installed with user and group permissions but no
world permissions, and owned by the group ``my_team``. All other
packages will be installed with user and group write privileges, and
world read privileges. Those packages will be owned by the group
``spack``.
The ``group`` attribute assigns a Unix-style group to a package. All
files installed by the package will be owned by the assigned group,
and the sticky group bit will be set on the install prefix and all
directories inside the install prefix. This will ensure that even
manually placed files within the install prefix are owned by the
assigned group. If no group is assigned, Spack will defer to the
operating system's default behavior.
----------------------------
Assigning Package Attributes
----------------------------
You can assign class-level attributes in the configuration:
.. code-block:: yaml
packages:
mpileaks:
# Override existing attributes
url: http://www.somewhereelse.com/mpileaks-1.0.tar.gz
# ... or add new ones
x: 1
Attributes set this way will be accessible to any method executed
in the ``package.py`` file (e.g. the ``install()`` method). Values for these
attributes may be any value parseable by YAML.
These settings can only be applied to specific packages, not to ``all`` or to
virtual packages.
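As a hedged illustration (this sketch is not taken from the Spack repository, and the
class body is abbreviated), a method in ``mpileaks``'s ``package.py`` could then read
the values assigned above directly from ``self``:

.. code-block:: python

   from spack.package import *


   class Mpileaks(Package):
       """Hypothetical sketch; a real recipe would define versions, variants, etc."""

       def install(self, spec, prefix):
           # `url` was overridden and `x` was added via the packages.yaml entry above,
           # so both are visible here as ordinary class attributes.
           print(self.url)  # http://www.somewhereelse.com/mpileaks-1.0.tar.gz
           print(self.x)    # 1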

View File

@@ -1549,7 +1549,7 @@ its value:
def configure_args(self):
...
if "+shared" in self.spec:
if self.spec.satisfies("+shared"):
extra_args.append("--enable-shared")
else:
extra_args.append("--disable-shared")
@@ -1636,7 +1636,7 @@ Within a package recipe a multi-valued variant is tested using a ``key=value`` s
.. code-block:: python
if "languages=jit" in spec:
if spec.satisfies("languages=jit"):
options.append("--enable-host-shared")
"""""""""""""""""""""""""""""""""""""""""""
@@ -2337,7 +2337,7 @@ window while a batch job is running ``spack install`` on the same or
overlapping dependencies without any process trying to re-do the work of
another.
For example, if you are using SLURM, you could launch an installation
For example, if you are using Slurm, you could launch an installation
of ``mpich`` using the following command:
.. code-block:: console
@@ -2352,7 +2352,7 @@ the following at the command line of a bash shell:
.. code-block:: console
$ for i in {1..12}; do nohup spack install -j 4 mpich@3.3.2 >> mpich_install.txt 2>&1 &; done
$ for i in {1..12}; do nohup spack install -j 4 mpich@3.3.2 >> mpich_install.txt 2>&1 & done
.. note::
@@ -2557,9 +2557,10 @@ Conditional dependencies
^^^^^^^^^^^^^^^^^^^^^^^^
You may have a package that only requires a dependency under certain
conditions. For example, you may have a package that has optional MPI support,
- MPI is only a dependency when you want to enable MPI support for the
package. In that case, you could say something like:
conditions. For example, you may have a package with optional MPI support.
You would then provide a variant to reflect that the feature is optional
and specify the MPI dependency only applies when MPI support is enabled.
In that case, you could say something like:
.. code-block:: python
@@ -2567,13 +2568,39 @@ package. In that case, you could say something like:
depends_on("mpi", when="+mpi")
``when`` can include constraints on the variant, version, compiler, etc. and
the :mod:`syntax<spack.spec>` is the same as for Specs written on the command
line.
If a dependency/feature of a package isn't typically used, you can save time
by making it conditional (since Spack will not build the dependency unless it
is required for the Spec).
Suppose the above package also has, since version 3, optional `Trilinos`
support and you want them both to build either with or without MPI. Further
suppose you require a version of `Trilinos` no older than 12.6. In that case,
the `trilinos` variant and dependency directives would be:
.. code-block:: python
variant("trilinos", default=False, description="Enable Trilinos support")
depends_on("trilinos@12.6:", when="@3: +trilinos")
depends_on("trilinos@12.6: +mpi", when="@3: +trilinos +mpi")
Alternatively, you could use the `when` context manager to equivalently specify
the `trilinos` variant dependencies as follows:
.. code-block:: python
with when("@3: +trilinos"):
depends_on("trilinos@12.6:")
depends_on("trilinos +mpi", when="+mpi")
The argument to ``when`` in either case can include any Spec constraints that
are supported on the command line using the same :ref:`syntax <sec-specs>`.
.. note::
If a dependency isn't typically used, you can save time by making it
conditional since Spack will not build the dependency unless it is
required for the Spec.
.. _dependency_dependency_patching:
@@ -2661,60 +2688,6 @@ appear in the package file (or in this case, in the list).
right version. If two packages depend on ``binutils`` patched *the
same* way, they can both use a single installation of ``binutils``.
.. _setup-dependent-environment:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Influence how dependents are built or run
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack provides a mechanism for dependencies to influence the
environment of their dependents by overriding the
:meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
or the
:meth:`setup_dependent_build_environment <spack.builder.Builder.setup_dependent_build_environment>`
methods.
The Qt package, for instance, uses this call:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/qt/package.py
:pyobject: Qt.setup_dependent_build_environment
:linenos:
to set the ``QTDIR`` environment variable so that packages
that depend on a particular Qt installation will find it.
Another good example of how a dependency can influence
the build environment of dependents is the Python package:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
:pyobject: Python.setup_dependent_build_environment
:linenos:
In the method above it is ensured that any package that depends on Python
will have the ``PYTHONPATH``, ``PYTHONHOME`` and ``PATH`` environment
variables set appropriately before starting the installation. To make things
even simpler the ``python setup.py`` command is also inserted into the module
scope of dependents by overriding a third method called
:meth:`setup_dependent_package <spack.package_base.PackageBase.setup_dependent_package>`
:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
:pyobject: Python.setup_dependent_package
:linenos:
This allows most python packages to have a very simple install procedure,
like the following:
.. code-block:: python
def install(self, spec, prefix):
setup_py("install", "--prefix={0}".format(prefix))
Finally the Python package takes also care of the modifications to ``PYTHONPATH``
to allow dependencies to run correctly:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
:pyobject: Python.setup_dependent_run_environment
:linenos:
.. _packaging_conflicts:
@@ -2859,6 +2832,70 @@ variant(s) are selected. This may be accomplished with conditional
extends("python", when="+python")
...
.. _setup-environment:
--------------------------------------------
Runtime and build time environment variables
--------------------------------------------
Spack provides a few methods to help package authors set up the required environment variables for
their package. Environment variables typically depend on how the package is used: variables that
make sense during the build phase may not be needed at runtime, and vice versa. Further, sometimes
it makes sense to let a dependency set the environment variables for its dependents. To allow all
this, Spack provides four different methods that can be overridden in a package:
1. :meth:`setup_build_environment <spack.builder.Builder.setup_build_environment>`
2. :meth:`setup_run_environment <spack.package_base.PackageBase.setup_run_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.Builder.setup_dependent_build_environment>`
4. :meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
The Qt package, for instance, uses this call:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/qt/package.py
:pyobject: Qt.setup_dependent_build_environment
:linenos:
to set the ``QTDIR`` environment variable so that packages that depend on a particular Qt
installation will find it.
The following diagram will give you an idea when each of these methods is called in a build
context:
.. image:: images/setup_env.png
:align: center
Notice that ``setup_dependent_run_environment`` can be called multiple times, once for each
dependent package, whereas ``setup_run_environment`` is called only once for the package itself.
This means that the former should only be used if the environment variables depend on the dependent
package, whereas the latter should be used if the environment variables depend only on the package
itself.
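As a hedged sketch (not taken from an actual Spack package; the variable names
``FOO_ROOT`` and ``FOO_PLUGIN_PATH`` are made up for illustration), overriding the
first two methods in a ``package.py`` might look like this:

.. code-block:: python

   def setup_build_environment(self, env):
       # `env` collects modifications that Spack applies before the build starts
       env.set("FOO_ROOT", self.prefix)

   def setup_run_environment(self, env):
       # applied when the installed package is loaded or used at run time
       env.prepend_path("FOO_PLUGIN_PATH", self.prefix.lib)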
--------------------------------
Setting package module variables
--------------------------------
Apart from modifying environment variables of the dependent package, you can also define Python
variables to be used by the dependent. This is done by implementing
:meth:`setup_dependent_package <spack.package_base.PackageBase.setup_dependent_package>`. An
example of this can be found in the ``Python`` package:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
:pyobject: Python.setup_dependent_package
:linenos:
This allows Python packages to directly use these variables:
.. code-block:: python
def install(self, spec, prefix):
...
install("script.py", python_platlib)
.. note::
We recommend using ``setup_dependent_package`` sparingly, as it is not always clear where
global variables are coming from when editing a ``package.py`` file.
-----
Views
-----
@@ -2937,6 +2974,33 @@ The ``provides("mpi")`` call tells Spack that the ``mpich`` package
can be used to satisfy the dependency of any package that
``depends_on("mpi")``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Providing multiple virtuals simultaneously
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Packages can provide more than one virtual dependency. Sometimes, due to implementation details,
there are subsets of those virtuals that need to be provided together by the same package.
A well-known example is ``openblas``, which provides both the ``lapack`` and ``blas`` API in a single ``libopenblas``
library. A package that needs ``lapack`` and ``blas`` must either use ``openblas`` to provide both, or not use
``openblas`` at all. It cannot pick one or the other.
To express this constraint in a package, the two virtual dependencies must be listed in the same ``provides`` directive:
.. code-block:: python
provides('blas', 'lapack')
This makes it impossible to select ``openblas`` as a provider for one of the two
virtual dependencies and not for the other. If you try to, Spack will report an error:
.. code-block:: console
$ spack spec netlib-scalapack ^[virtuals=lapack] openblas ^[virtuals=blas] atlas
==> Error: concretization failed for the following reasons:
1. Package 'openblas' needs to provide both 'lapack' and 'blas' together, but provides only 'lapack'
^^^^^^^^^^^^^^^^^^^^
Versioned Interfaces
^^^^^^^^^^^^^^^^^^^^
@@ -3439,6 +3503,56 @@ is equivalent to:
Constraints from nested context managers are also combined together, but they are rarely
needed or recommended.
.. _default_args:
------------------------
Common default arguments
------------------------
Similarly, if directives have a common set of default arguments, you can
group them together in a ``with default_args()`` block:
.. code-block:: python
class PyExample(PythonPackage):
with default_args(type=("build", "run")):
depends_on("py-foo")
depends_on("py-foo@2:", when="@2:")
depends_on("py-bar")
depends_on("py-bz")
The above is short for:
.. code-block:: python
class PyExample(PythonPackage):
depends_on("py-foo", type=("build", "run"))
depends_on("py-foo@2:", when="@2:", type=("build", "run"))
depends_on("py-bar", type=("build", "run"))
depends_on("py-bz", type=("build", "run"))
.. note::
The ``with when()`` context manager is composable, while ``with default_args()``
merely overrides the default. For example:
.. code-block:: python
with default_args(when="+feature"):
depends_on("foo")
depends_on("bar")
depends_on("baz", when="+baz")
is equivalent to:
.. code-block:: python
depends_on("foo", when="+feature")
depends_on("bar", when="+feature")
depends_on("baz", when="+baz") # Note: not when="+feature+baz"
.. _install-method:
------------------
@@ -3501,7 +3615,7 @@ need to override methods like ``configure_args``:
def configure_args(self):
args = ["--enable-cxx"] + self.enable_or_disable("libs")
if "libs=static" in self.spec:
if self.spec.satisfies("libs=static"):
args.append("--with-pic")
return args
@@ -3635,7 +3749,8 @@ regardless of the build system. The arguments for the phase are:
The arguments ``spec`` and ``prefix`` are passed only for convenience, as they always
correspond to ``self.spec`` and ``self.spec.prefix`` respectively.
If the ``package.py`` encodes builders explicitly, the signature for a phase changes slightly:
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
@@ -3645,56 +3760,6 @@ If the ``package.py`` encodes builders explicitly, the signature for a phase cha
In this case the package is passed as the second argument, and ``self`` is the builder instance.
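A hedged sketch of such a phase, assuming an Autotools-based package (the class and
phase shown are illustrative, not copied from the Spack codebase):

.. code-block:: python

   class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
       def install(self, pkg, spec, prefix):
           # `self` is the builder instance; `pkg` is the package it builds
           ...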
.. _multiple_build_systems:
^^^^^^^^^^^^^^^^^^^^^^
Multiple build systems
^^^^^^^^^^^^^^^^^^^^^^
There are cases where a software actively supports two build systems, or changes build systems
as it evolves, or needs different build systems on different platforms. Spack allows dealing with
these cases natively, if a recipe is written using builders explicitly.
For instance, software that supports two build systems unconditionally should derive from
both ``*Package`` base classes, and declare the possible use of multiple build systems using
a directive:
.. code-block:: python
class ArpackNg(CMakePackage, AutotoolsPackage):
build_system("cmake", "autotools", default="cmake")
In this case the software can be built with both ``autotools`` and ``cmake``. Since the package
supports multiple build systems, it is necessary to declare which one is the default. The ``package.py``
will likely contain some overriding of default builder methods:
.. code-block:: python
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
pass
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
pass
In more complex cases it might happen that the build system changes according to certain conditions,
for instance across versions. That can be expressed with conditional variant values:
.. code-block:: python
class ArpackNg(CMakePackage, AutotoolsPackage):
build_system(
conditional("cmake", when="@0.64:"),
conditional("autotools", when="@:0.63"),
default="cmake",
)
In the example the directive impose a change from ``Autotools`` to ``CMake`` going
from ``v0.63`` to ``v0.64``.
^^^^^^^^^^^^^^^^^^
Mixin base classes
^^^^^^^^^^^^^^^^^^
@@ -3741,6 +3806,106 @@ for instance:
In the example above ``Cp2k`` inherits all the conflicts and variants that ``CudaPackage`` defines.
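As a hedged, simplified sketch (``MySolver`` is a made-up package, not the actual
``Cp2k`` recipe), mixing ``CudaPackage`` into a CMake-based package might look like this:

.. code-block:: python

   class MySolver(CMakePackage, CudaPackage):

       def cmake_args(self):
           # the `+cuda` and `cuda_arch=...` variants come from the CudaPackage mixin
           return [self.define_from_variant("USE_CUDA", "cuda")]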
.. _multiple_build_systems:
----------------------
Multiple build systems
----------------------
There are cases where a package actively supports two build systems, or changes build systems
as it evolves, or needs different build systems on different platforms. Spack allows dealing with
these cases by splitting the build instructions into separate builder classes.
For instance, software that supports two build systems unconditionally should derive from
both ``*Package`` base classes, and declare the possible use of multiple build systems using
a directive:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
variant("my_feature", default=True)
build_system("cmake", "autotools", default="cmake")
In this case the software can be built with both ``autotools`` and ``cmake``. Since the package
supports multiple build systems, it is necessary to declare which one is the default.
Additional build instructions are split into separate builder classes:
.. code-block:: python
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
return [
self.define_from_variant("MY_FEATURE", "my_feature")
]
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
return self.with_or_without("my-feature", variant="my_feature")
In this example, ``spack install example +my_feature build_system=cmake`` will
pick the ``CMakeBuilder`` and invoke ``cmake -DMY_FEATURE:BOOL=ON``.
Similarly, ``spack install example +my_feature build_system=autotools`` will pick
the ``AutotoolsBuilder`` and invoke ``./configure --with-my-feature``.
Dependencies are always specified in the package class. When some dependencies
depend on the choice of the build system, it is possible to use ``when`` conditions as
usual:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
build_system("cmake", "autotools", default="cmake")
# Runtime dependencies
depends_on("ncurses")
depends_on("libxml2")
# Lowerbounds for cmake only apply when using cmake as the build system
with when("build_system=cmake"):
depends_on("cmake@3.18:", when="@2.0:", type="build")
depends_on("cmake@3:", type="build")
# Specify extra build dependencies used only in the configure script
with when("build_system=autotools"):
depends_on("perl", type="build")
depends_on("pkgconfig", type="build")
Very often projects switch from one build system to another, or add support
for a new build system from a certain version, which means that the choice
of the build system typically depends on a version range. Those situations can
be handled by using conditional values in the ``build_system`` directive:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
build_system(
conditional("cmake", when="@0.64:"),
conditional("autotools", when="@:0.63"),
default="cmake",
)
In the example, the directive imposes a change from ``Autotools`` to ``CMake`` when going
from ``v0.63`` to ``v0.64``.
The ``build_system`` can be used as an ordinary variant, which also means that it can
be used in ``depends_on`` statements. This can be useful when a package *requires* that
its dependency has a CMake config file, meaning that the dependent can only build when the
dependency is built with CMake, and not Autotools. In that case, you can force the choice
of the build system in the dependent:
.. code-block:: python
class Dependent(CMakePackage):
depends_on("example build_system=cmake")
.. _install-environment:
-----------------------
@@ -4313,7 +4478,7 @@ for supported features, for instance:
.. code-block:: python
if "avx512" in spec.target:
if spec.satisfies("target=avx512"):
args.append("--with-avx512")
The snippet above will append the ``--with-avx512`` item to a list of arguments only if the corresponding
@@ -6748,3 +6913,63 @@ To achieve backward compatibility with the single-class format Spack creates in
Overall the role of the adapter is to route access to attributes of methods first through the ``*Package``
hierarchy, and then back to the base class builder. This is schematically shown in the diagram above, where
the adapter role is to "emulate" a method resolution order like the one represented by the red arrows.
------------------------------
Specifying License Information
------------------------------
Most of the software in Spack is open source, and most open source software is released
under one or more `common open source licenses <https://opensource.org/licenses/>`_.
Specifying the license that a package is released under in a project's
`package.py` is good practice. To specify a license, find the `SPDX identifier
<https://spdx.org/licenses/>`_ for a project and then add it using the license
directive:
.. code-block:: python
license("<SPDX Identifier HERE>")
For example, the SPDX ID for the Apache Software License, version 2.0 is ``Apache-2.0``,
so you'd write:
.. code-block:: python
license("Apache-2.0")
Or, for a dual-licensed package like Spack, you would use an `SPDX Expression
<https://spdx.github.io/spdx-spec/v2-draft/SPDX-license-expressions/>`_ with both of its
licenses:
.. code-block:: python
license("Apache-2.0 OR MIT")
Note that specifying a license without a when clause makes it apply to all
versions and variants of the package, which might not actually be the case.
For example, a project might have switched licenses at some point or have
certain build configurations that include files that are licensed differently.
Spack itself used to be under the ``LGPL-2.1`` license, until it was relicensed
in version ``0.12`` in 2018.
You can specify when a ``license()`` directive applies using a ``when=``
clause, just like other directives. For example, to specify that a specific
license identifier should only apply to versions up to ``0.11``, but another
license should apply for later versions, you could write:
.. code-block:: python
license("LGPL-2.1", when="@:0.11")
license("Apache-2.0 OR MIT", when="@0.12:")
Note that unlike for most other directives, the ``when=`` constraints in the
``license()`` directive can't intersect. Spack needs to be able to resolve
exactly one license identifier expression for any given version. To specify
*multiple* licenses, use SPDX expressions and operators as above. The operators
you probably care most about are:
* ``OR``: user chooses one license to adhere to; and
* ``AND``: user has to adhere to all the licenses.
You may also care about `license exceptions
<https://spdx.org/licenses/exceptions-index.html>`_ that use the ``WITH`` operator,
e.g. ``Apache-2.0 WITH LLVM-exception``.

View File

@@ -213,6 +213,16 @@ pipeline jobs.
``spack ci generate``
^^^^^^^^^^^^^^^^^^^^^
Throughout this documentation, references to the "mirror" mean the target
mirror which is checked for the presence of up-to-date specs, and where
any scheduled jobs should push built binary packages. In the past, this
defaulted to the mirror at index 0 in the mirror configs, and could be
overridden using the ``--buildcache-destination`` argument. Starting with
Spack 0.23, ``spack ci generate`` will require you to identify this mirror
by the name "buildcache-destination". While you can configure any number
of mirrors as sources for your pipelines, you will need to identify the
destination mirror by name.
Concretizes the specs in the active environment, stages them (as described in
:ref:`staging_algorithm`), and writes the resulting ``.gitlab-ci.yml`` to disk.
During concretization of the environment, ``spack ci generate`` also writes a

View File

@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==1.3.0
python-levenshtein==0.21.1
docutils==0.18.1
pygments==2.16.1
urllib3==2.0.6
pytest==7.4.2
sphinx-rtd-theme==2.0.0
python-levenshtein==0.23.0
docutils==0.20.1
pygments==2.17.2
urllib3==2.1.0
pytest==7.4.3
isort==5.12.0
black==23.9.1
black==23.11.0
flake8==6.1.0
mypy==1.5.1
mypy==1.7.1

View File

@@ -1,9 +1,7 @@
Name, Supported Versions, Notes, Requirement Reason
Python, 3.6--3.11, , Interpreter for Spack
Python, 3.6--3.12, , Interpreter for Spack
C/C++ Compilers, , , Building software
make, , , Build software
patch, , , Build software
bash, , , Compiler wrappers
tar, , , Extract/create archives
gzip, , , Compress/Decompress archives
unzip, , , Compress/Decompress archives

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.1 (commit df43a1834460bf94516136951c4729a3100603ec)
* Version: 0.2.2 (commit 1dc58a5776dd77e6fc6e4ba5626af5b1fb24996e)
astunparse
----------------

View File

@@ -1,2 +1,2 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.1"
__version__ = "0.2.2"

View File

@@ -2318,6 +2318,26 @@
]
}
},
"power10": {
"from": ["power9"],
"vendor": "IBM",
"generation": 10,
"features": [],
"compilers": {
"gcc": [
{
"versions": "11.1:",
"flags": "-mcpu={name} -mtune={name}"
}
],
"clang": [
{
"versions": "11.0:",
"flags": "-mcpu={name} -mtune={name}"
}
]
}
},
"ppc64le": {
"from": [],
"vendor": "generic",
@@ -2405,6 +2425,29 @@
]
}
},
"power10le": {
"from": ["power9le"],
"vendor": "IBM",
"generation": 10,
"features": [],
"compilers": {
"gcc": [
{
"name": "power10",
"versions": "11.1:",
"flags": "-mcpu={name} -mtune={name}"
}
],
"clang": [
{
"versions": "11.0:",
"family": "ppc64le",
"name": "power10",
"flags": "-mcpu={name} -mtune={name}"
}
]
}
},
"aarch64": {
"from": [],
"vendor": "generic",
@@ -2592,6 +2635,37 @@
]
}
},
"armv9.0a": {
"from": ["armv8.5a"],
"vendor": "generic",
"features": [],
"compilers": {
"gcc": [
{
"versions": "12:",
"flags": "-march=armv9-a -mtune=generic"
}
],
"clang": [
{
"versions": "14:",
"flags": "-march=armv9-a -mtune=generic"
}
],
"apple-clang": [
{
"versions": ":",
"flags": "-march=armv9-a -mtune=generic"
}
],
"arm": [
{
"versions": ":",
"flags": "-march=armv9-a -mtune=generic"
}
]
}
},
"thunderx2": {
"from": ["armv8.1a"],
"vendor": "Cavium",
@@ -2813,8 +2887,12 @@
],
"arm" : [
{
"versions": "20:",
"versions": "20:21.9",
"flags" : "-march=armv8.2-a+fp16+rcpc+dotprod+crypto"
},
{
"versions": "22:",
"flags" : "-mcpu=neoverse-n1"
}
],
"nvhpc" : [
@@ -2942,7 +3020,7 @@
},
{
"versions": "22:",
"flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
"flags" : "-mcpu=neoverse-v1"
}
],
"nvhpc" : [
@@ -2954,6 +3032,126 @@
]
}
},
"neoverse_v2": {
"from": ["neoverse_n1", "armv9.0a"],
"vendor": "ARM",
"features": [
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
"crc32",
"atomics",
"fphp",
"asimdhp",
"cpuid",
"asimdrdm",
"jscvt",
"fcma",
"lrcpc",
"dcpop",
"sha3",
"sm3",
"sm4",
"asimddp",
"sha512",
"sve",
"asimdfhm",
"dit",
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"sve2",
"sveaes",
"svepmull",
"svebitperm",
"svesha3",
"svesm4",
"flagm2",
"frint",
"svei8mm",
"svebf16",
"i8mm",
"bf16",
"dgh",
"bti"
],
"compilers" : {
"gcc": [
{
"versions": "4.8:5.99",
"flags": "-march=armv8-a"
},
{
"versions": "6:6.99",
"flags" : "-march=armv8.1-a"
},
{
"versions": "7.0:7.99",
"flags" : "-march=armv8.2-a -mtune=cortex-a72"
},
{
"versions": "8.0:8.99",
"flags" : "-march=armv8.4-a+sve -mtune=cortex-a72"
},
{
"versions": "9.0:9.99",
"flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
},
{
"versions": "10.0:11.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
},
{
"versions": "12.0:12.99",
"flags" : "-march=armv9-a+i8mm+bf16 -mtune=cortex-a710"
},
{
"versions": "13.0:",
"flags" : "-mcpu=neoverse-v2"
}
],
"clang" : [
{
"versions": "9.0:10.99",
"flags" : "-march=armv8.5-a+sve"
},
{
"versions": "11.0:13.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16"
},
{
"versions": "14.0:15.99",
"flags" : "-march=armv9-a+i8mm+bf16"
},
{
"versions": "16.0:",
"flags" : "-mcpu=neoverse-v2"
}
],
"arm" : [
{
"versions": "23.04.0:",
"flags" : "-mcpu=neoverse-v2"
}
],
"nvhpc" : [
{
"versions": "23.3:",
"name": "neoverse-v2",
"flags": "-tp {name}"
}
]
}
},
"m1": {
"from": ["armv8.4a"],
"vendor": "Apple",

View File

@@ -156,6 +156,37 @@ def lookup(name):
shutil.copystat = copystat
def polite_path(components: Iterable[str]):
"""
Given a list of strings which are intended to be path components,
generate a path, and format each component to avoid generating extra
path entries.
For example all "/", "\", and ":" characters will be replaced with
"_". Other characters like "=" will also be replaced.
"""
return os.path.join(*[polite_filename(x) for x in components])
@memoized
def _polite_antipattern():
# A regex of all the characters we don't want in a filename
return re.compile(r"[^A-Za-z0-9_.-]")
def polite_filename(filename: str) -> str:
"""
Replace generally problematic filename characters with underscores.
This differs from sanitize_filename in that it is more aggressive in
changing characters in the name. For example it removes "=" which can
confuse path parsing in external tools.
"""
# This character set applies for both Windows and Linux. It does not
# account for reserved filenames in Windows.
return _polite_antipattern().sub("_", filename)
def getuid():
if sys.platform == "win32":
import ctypes
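
The two helpers added above normalize arbitrary strings into safe path components. A quick usage sketch, assuming they live in `llnl.util.filesystem` next to the surrounding code:

    from llnl.util.filesystem import polite_filename, polite_path  # assumed location

    polite_filename("cflags=-O2 -g")           # -> "cflags_-O2_-g"
    polite_path(["gcc@12.3.0", "os=linux"])    # -> "gcc_12.3.0/os_linux" on POSIX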

View File

@@ -1047,9 +1047,9 @@ def __bool__(self):
"""Whether any exceptions were handled."""
return bool(self.exceptions)
def forward(self, context: str) -> "GroupedExceptionForwarder":
def forward(self, context: str, base: type = BaseException) -> "GroupedExceptionForwarder":
"""Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self)
return GroupedExceptionForwarder(context, self, base)
def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
self.exceptions.append((context, exc, tb))
@@ -1072,15 +1072,18 @@ class GroupedExceptionForwarder:
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
def __init__(self, context: str, handler: GroupedExceptionHandler):
def __init__(self, context: str, handler: GroupedExceptionHandler, base: type):
self._context = context
self._handler = handler
self._base = base
def __enter__(self):
return None
def __exit__(self, exc_type, exc_value, tb):
if exc_value is not None:
if not issubclass(exc_type, self._base):
return False
self._handler._receive_forwarded(self._context, exc_value, traceback.format_tb(tb))
# Suppress any exception from being re-raised:
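
With the new `base` parameter, the forwarder only collects exceptions derived from that class; anything else (for example `KeyboardInterrupt`) propagates normally. A minimal sketch, with the helper names hedged as noted in the comments:

    from llnl.util.lang import GroupedExceptionHandler  # assumed location

    handler = GroupedExceptionHandler()
    with handler.forward("bootstrapping clingo", Exception):
        risky_bootstrap_step()            # hypothetical callable that may raise
    if handler:                           # truthy if any exception was captured
        print(handler.grouped_message())  # assumed reporting helper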

View File

@@ -211,6 +211,7 @@ def info(message, *args, **kwargs):
stream.write(line + "\n")
else:
stream.write(indent + _output_filter(str(arg)) + "\n")
stream.flush()
def verbose(message, *args, **kwargs):

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.21.0.dev0"
__version__ = "0.22.0.dev0"
spack_version = __version__

View File

@@ -8,8 +8,8 @@
from llnl.util.lang import memoized
import spack.spec
import spack.version
from spack.compilers.clang import Clang
from spack.spec import CompilerSpec
from spack.util.executable import Executable, ProcessError
@@ -17,7 +17,9 @@ class ABI:
"""This class provides methods to test ABI compatibility between specs.
The current implementation is rather rough and could be improved."""
def architecture_compatible(self, target, constraint):
def architecture_compatible(
self, target: spack.spec.Spec, constraint: spack.spec.Spec
) -> bool:
"""Return true if architecture of target spec is ABI compatible
to the architecture of constraint spec. If either the target
or constraint specs have no architecture, target is also defined
@@ -34,7 +36,7 @@ def _gcc_get_libstdcxx_version(self, version):
a compiler's libstdc++ or libgcc_s"""
from spack.build_environment import dso_suffix
spec = CompilerSpec("gcc", version)
spec = spack.spec.CompilerSpec("gcc", version)
compilers = spack.compilers.compilers_for_spec(spec)
if not compilers:
return None
@@ -77,16 +79,20 @@ def _gcc_compiler_compare(self, pversion, cversion):
return False
return plib == clib
def _intel_compiler_compare(self, pversion, cversion):
def _intel_compiler_compare(
self, pversion: spack.version.ClosedOpenRange, cversion: spack.version.ClosedOpenRange
) -> bool:
"""Returns true iff the intel version pversion and cversion
are ABI compatible"""
# Test major and minor versions. Ignore build version.
if len(pversion.version) < 2 or len(cversion.version) < 2:
return False
return pversion.version[:2] == cversion.version[:2]
pv = pversion.lo
cv = cversion.lo
return pv.up_to(2) == cv.up_to(2)
def compiler_compatible(self, parent, child, **kwargs):
def compiler_compatible(
self, parent: spack.spec.Spec, child: spack.spec.Spec, loose: bool = False
) -> bool:
"""Return true if compilers for parent and child are ABI compatible."""
if not parent.compiler or not child.compiler:
return True
@@ -95,7 +101,7 @@ def compiler_compatible(self, parent, child, **kwargs):
# Different compiler families are assumed ABI incompatible
return False
if kwargs.get("loose", False):
if loose:
return True
# TODO: Can we move the specialized ABI matching stuff
@@ -116,9 +122,10 @@ def compiler_compatible(self, parent, child, **kwargs):
return True
return False
def compatible(self, target, constraint, **kwargs):
def compatible(
self, target: spack.spec.Spec, constraint: spack.spec.Spec, loose: bool = False
) -> bool:
"""Returns true if target spec is ABI compatible to constraint spec"""
loosematch = kwargs.get("loose", False)
return self.architecture_compatible(target, constraint) and self.compiler_compatible(
target, constraint, loose=loosematch
target, constraint, loose=loose
)
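
The old `**kwargs` interface is replaced by an explicit, typed `loose` keyword throughout the ABI checks. A hedged usage sketch (module path and specs are assumptions):

    import spack.spec
    from spack.abi import ABI  # assumed module path

    abi = ABI()
    a = spack.spec.Spec("zlib %gcc@12.3.0").concretized()  # hypothetical specs
    b = spack.spec.Spec("zlib %gcc@12.2.0").concretized()
    abi.compatible(a, b, loose=True)  # keyword argument replaces kwargs.get("loose")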

View File

@@ -40,6 +40,7 @@ def _search_duplicate_compilers(error_cls):
import collections.abc
import glob
import inspect
import io
import itertools
import pathlib
import pickle
@@ -54,6 +55,7 @@ def _search_duplicate_compilers(error_cls):
import spack.repo
import spack.spec
import spack.util.crypto
import spack.util.spack_yaml as syaml
import spack.variant
#: Map an audit tag to a list of callables implementing checks
@@ -250,6 +252,88 @@ def _search_duplicate_specs_in_externals(error_cls):
return errors
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.22)"""
# TODO (v0.22): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
def make_error(attribute_name, config_data, summary):
s = io.StringIO()
s.write("Occurring in the following file:\n")
dict_view = syaml.syaml_dict((k, v) for k, v in config_data.items() if k == attribute_name)
syaml.dump_config(dict_view, stream=s, blame=True)
return error_cls(summary=summary, details=[s.getvalue()])
if "all" in packages_yaml and "version" in packages_yaml["all"]:
summary = "Using the deprecated 'version' attribute under 'packages:all'"
errors.append(make_error("version", packages_yaml["all"], summary))
for package_name in packages_yaml:
if package_name == "all":
continue
package_conf = packages_yaml[package_name]
for attribute in ("compiler", "providers", "target"):
if attribute not in package_conf:
continue
summary = (
f"Using the deprecated '{attribute}' attribute " f"under 'packages:{package_name}'"
)
errors.append(make_error(attribute, package_conf, summary))
return errors
@config_packages
def _avoid_mismatched_variants(error_cls):
"""Warns if variant preferences have mismatched types or names."""
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
def make_error(config_data, summary):
s = io.StringIO()
s.write("Occurring in the following file:\n")
syaml.dump_config(config_data, stream=s, blame=True)
return error_cls(summary=summary, details=[s.getvalue()])
for pkg_name in packages_yaml:
# 'all:' must be more forgiving, since it is setting defaults for everything
if pkg_name == "all" or "variants" not in packages_yaml[pkg_name]:
continue
preferences = packages_yaml[pkg_name]["variants"]
if not isinstance(preferences, list):
preferences = [preferences]
for variants in preferences:
current_spec = spack.spec.Spec(variants)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for variant in current_spec.variants.values():
# Variant does not exist at all
if variant.name not in pkg_cls.variants:
summary = (
f"Setting a preference for the '{pkg_name}' package to the "
f"non-existing variant '{variant.name}'"
)
errors.append(make_error(preferences, summary))
continue
# Variant cannot accept this value
s = spack.spec.Spec(pkg_name)
try:
s.update_variant_validate(variant.name, variant.value)
except Exception:
summary = (
f"Setting the variant '{variant.name}' of the '{pkg_name}' package "
f"to the invalid value '{str(variant)}'"
)
errors.append(make_error(preferences, summary))
return errors
#: Sanity checks on package directives
package_directives = AuditClass(
group="packages",
@@ -307,10 +391,17 @@ def _check_build_test_callbacks(pkgs, error_cls):
@package_directives
def _check_patch_urls(pkgs, error_cls):
"""Ensure that patches fetched from GitHub have stable sha256 hashes."""
"""Ensure that patches fetched from GitHub and GitLab have stable sha256
hashes."""
github_patch_url_re = (
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
".+/.+/(?:commit|pull)/[a-fA-F0-9]*.(?:patch|diff)"
r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)"
)
# Only .diff URLs have stable/full hashes:
# https://forum.gitlab.com/t/patches-with-full-index/29313
gitlab_patch_url_re = (
r"^https?://(?:.+)?gitlab(?:.+)/"
r".+/.+/-/(?:commit|merge_requests)/[a-fA-F0-9]+\.(?:patch|diff)"
)
errors = []
@@ -321,19 +412,27 @@ def _check_patch_urls(pkgs, error_cls):
if not isinstance(patch, spack.patch.UrlPatch):
continue
if not re.match(github_patch_url_re, patch.url):
continue
full_index_arg = "?full_index=1"
if not patch.url.endswith(full_index_arg):
errors.append(
error_cls(
"patch URL in package {0} must end with {1}".format(
pkg_cls.name, full_index_arg
),
[patch.url],
if re.match(github_patch_url_re, patch.url):
full_index_arg = "?full_index=1"
if not patch.url.endswith(full_index_arg):
errors.append(
error_cls(
"patch URL in package {0} must end with {1}".format(
pkg_cls.name, full_index_arg
),
[patch.url],
)
)
elif re.match(gitlab_patch_url_re, patch.url):
if not patch.url.endswith(".diff"):
errors.append(
error_cls(
"patch URL in package {0} must end with .diff".format(
pkg_cls.name
),
[patch.url],
)
)
)
return errors
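
For illustration, made-up URLs that pass the two checks above: the GitHub pattern additionally requires the `?full_index=1` suffix, the GitLab pattern a `.diff` extension.

    import re

    # The patterns above, repeated so the example is self-contained; URLs are invented.
    github_patch_url_re = (
        r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
        r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)"
    )
    gitlab_patch_url_re = (
        r"^https?://(?:.+)?gitlab(?:.+)/"
        r".+/.+/-/(?:commit|merge_requests)/[a-fA-F0-9]+\.(?:patch|diff)"
    )

    gh = "https://github.com/org/repo/commit/0123456789abcdef0123456789abcdef01234567.patch?full_index=1"
    gl = "https://gitlab.com/group/project/-/commit/0123456789abcdef0123456789abcdef01234567.diff"
    assert re.match(github_patch_url_re, gh) and gh.endswith("?full_index=1")
    assert re.match(gitlab_patch_url_re, gl) and gl.endswith(".diff")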
@@ -627,13 +726,37 @@ def _unknown_variants_in_directives(pkgs, error_cls):
@package_directives
def _unknown_variants_in_dependencies(pkgs, error_cls):
"""Report unknown dependencies and wrong variants for dependencies"""
def _issues_in_depends_on_directive(pkgs, error_cls):
"""Reports issues with 'depends_on' directives.
Issues might be unknown dependencies, unknown variants or variant values, or declaration
of nested dependencies.
"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
for dependency_name, dependency_data in pkg_cls.dependencies.items():
# Check if there are nested dependencies declared. We don't want directives like:
#
# depends_on('foo+bar ^fee+baz')
#
# but we'd like to have two dependencies listed instead.
for when, dependency_edge in dependency_data.items():
dependency_spec = dependency_edge.spec
nested_dependencies = dependency_spec.dependencies()
if nested_dependencies:
summary = (
f"{pkg_name}: invalid nested dependency "
f"declaration '{str(dependency_spec)}'"
)
details = [
f"split depends_on('{str(dependency_spec)}', when='{str(when)}') "
f"into {len(nested_dependencies) + 1} directives",
f"in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
# No need to analyze virtual packages
if spack.repo.PATH.is_virtual(dependency_name):
continue
@@ -761,7 +884,7 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
)
except Exception:
summary = (
"{0}: dependency on {1} cannot be satisfied " "by known versions of {1.name}"
"{0}: dependency on {1} cannot be satisfied by known versions of {1.name}"
).format(pkg_name, s)
details = ["happening in " + filename]
if dependency_pkg_cls is not None:
@@ -803,6 +926,53 @@ def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
return errors
@package_directives
def _named_specs_in_when_arguments(pkgs, error_cls):
"""Reports named specs in the 'when=' attribute of a directive.
Note that 'conflicts' is the only directive allowing that.
"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
def _extracts_errors(triggers, summary):
_errors = []
for trigger in list(triggers):
when_spec = spack.spec.Spec(trigger)
if when_spec.name is not None and when_spec.name != pkg_name:
details = [f"using '{trigger}', should be '^{trigger}'"]
_errors.append(error_cls(summary=summary, details=details))
return _errors
for dname, triggers in pkg_cls.dependencies.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{dname}' dependency"
errors.extend(_extracts_errors(triggers, summary))
for vname, (variant, triggers) in pkg_cls.variants.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{vname}' variant"
errors.extend(_extracts_errors(triggers, summary))
for provided, triggers in pkg_cls.provided.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{provided}' virtual"
errors.extend(_extracts_errors(triggers, summary))
for _, triggers in pkg_cls.requirements.items():
triggers = [when_spec for when_spec, _, _ in triggers]
summary = f"{pkg_name}: wrong 'when=' condition in 'requires' directive"
errors.extend(_extracts_errors(triggers, summary))
triggers = list(pkg_cls.patches)
summary = f"{pkg_name}: wrong 'when=' condition in 'patch' directives"
errors.extend(_extracts_errors(triggers, summary))
triggers = list(pkg_cls.resources)
summary = f"{pkg_name}: wrong 'when=' condition in 'resource' directives"
errors.extend(_extracts_errors(triggers, summary))
return llnl.util.lang.dedupe(errors)
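
As an example of what this audit reports, a hypothetical `package.py` excerpt showing the flagged pattern and its fix; the error details suggest anchoring the spec with `^`:

    from spack.package import *  # typical package.py prelude

    class Example(Package):  # made-up package
        # flagged: 'python@3.10:' names a different package in 'when='
        depends_on("py-numpy", when="python@3.10:")
        # accepted: refer to the dependency explicitly
        depends_on("py-numpy", when="^python@3.10:")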
#: Sanity checks on package directives
external_detection = AuditClass(
group="externals",

File diff suppressed because it is too large.

View File

@@ -16,6 +16,7 @@
import llnl.util.filesystem as fs
from llnl.util import tty
import spack.platforms
import spack.store
import spack.util.environment
import spack.util.executable
@@ -206,16 +207,19 @@ def _root_spec(spec_str: str) -> str:
"""Add a proper compiler and target to a spec used during bootstrapping.
Args:
spec_str (str): spec to be bootstrapped. Must be without compiler and target.
spec_str: spec to be bootstrapped. Must be without compiler and target.
"""
# Add a proper compiler hint to the root spec. We use GCC for
# everything but MacOS and Windows.
if str(spack.platforms.host()) == "darwin":
# Add a compiler requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif str(spack.platforms.host()) == "windows":
spec_str += " %msvc"
else:
elif platform == "windows":
# TODO (johnwparent): Remove version constraint when clingo patch is up
spec_str += " %msvc@:19.37"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
target = archspec.cpu.host().family
spec_str += f" target={target}"

View File

@@ -143,7 +143,9 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
arch = spack.spec.ArchSpec.frontend_arch()
if not spack.compilers.compilers_for_arch(arch):
new_compilers = spack.compilers.find_new_compilers()
new_compilers = spack.compilers.find_new_compilers(
mixed_toolchain=sys.platform == "darwin"
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, init_config=False)

View File

@@ -214,7 +214,7 @@ def _install_and_test(
with spack.config.override(self.mirror_scope):
# This index is currently needed to get the compiler used to build some
# specs that we know by dag hash.
spack.binary_distribution.binary_index.regenerate_spec_cache()
spack.binary_distribution.BINARY_INDEX.regenerate_spec_cache()
index = spack.binary_distribution.update_cache_and_get_specs()
if not index:
@@ -228,7 +228,7 @@ def _install_and_test(
if not abstract_spec.intersects(candidate_spec):
continue
if python_spec is not None and python_spec not in abstract_spec:
if python_spec is not None and not abstract_spec.intersects(f"^{python_spec}"):
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:
@@ -291,6 +291,10 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
# This is needed to help the old concretizer take the `setuptools` dependency
# only when bootstrapping from sources on Python 3.12
if spec_for_current_python() == "python@3.12":
concrete_spec.constrain("+force_setuptools")
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme
@@ -382,7 +386,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
@@ -437,7 +441,7 @@ def ensure_executables_in_path_or_raise(
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):
@@ -446,16 +450,11 @@ def ensure_executables_in_path_or_raise(
current_bootstrapper.last_search["spec"],
current_bootstrapper.last_search["command"],
)
env_mods = spack.util.environment.EnvironmentModifications()
for dep in concrete_spec.traverse(
root=True, order="post", deptype=("link", "run")
):
env_mods.extend(
spack.user_environment.environment_modifications_for_spec(
dep, set_package_py_globals=False
)
cmd.add_default_envmod(
spack.user_environment.environment_modifications_for_specs(
concrete_spec, set_package_py_globals=False
)
cmd.add_default_envmod(env_mods)
)
return cmd
assert exception_handler, (

View File

@@ -19,7 +19,6 @@
import spack.tengine
import spack.util.cpus
import spack.util.executable
from spack.environment import depfile
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
@@ -86,12 +85,9 @@ def __init__(self) -> None:
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment.
The update is done using a depfile on Linux and macOS, and using the ``install_all``
method of environments on Windows.
"""
with tty.SuppressOutput(msg_enabled=False, warn_enabled=False):
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
specs = self.concretize()
if specs:
colorized_specs = [
@@ -100,11 +96,9 @@ def update_installations(self) -> None:
]
tty.msg(f"[BOOTSTRAPPING] Installing dependencies ({', '.join(colorized_specs)})")
self.write(regenerate=False)
if sys.platform == "win32":
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
self.install_all()
else:
self._install_with_depfile()
self.write(regenerate=True)
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
@@ -122,25 +116,6 @@ def update_syspath_and_environ(self) -> None:
+ [str(x) for x in self.pythonpaths()]
)
def _install_with_depfile(self) -> None:
model = depfile.MakefileModel.from_env(self)
template = spack.tengine.make_environment().get_template(
os.path.join("depfile", "Makefile")
)
makefile = self.environment_root() / "Makefile"
makefile.write_text(template.render(model.to_dict()))
make = spack.util.executable.which("make")
kwargs = {}
if not tty.is_debug():
kwargs = {"output": os.devnull, "error": os.devnull}
make(
"-C",
str(self.environment_root()),
"-j",
str(spack.util.cpus.determine_number_of_jobs(parallel=True)),
**kwargs,
)
def _write_spack_yaml_file(self) -> None:
tty.msg(
"[BOOTSTRAPPING] Spack has missing dependencies, creating a bootstrapping environment"
@@ -161,7 +136,7 @@ def _write_spack_yaml_file(self) -> None:
def isort_root_spec() -> str:
"""Return the root spec used to bootstrap isort"""
return _root_spec("py-isort@4.3.5:")
return _root_spec("py-isort@5")
def mypy_root_spec() -> str:

View File

@@ -66,7 +66,6 @@ def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"bash": _missing("bash", "required for Spack compiler wrapper"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),
"unzip": _missing("unzip", "required to compress/decompress code archives"),

View File

@@ -40,12 +40,15 @@
import sys
import traceback
import types
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Tuple
import llnl.util.tty as tty
from llnl.string import plural
from llnl.util.filesystem import join_path
from llnl.util.lang import dedupe
from llnl.util.lang import dedupe, stable_partition
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd
@@ -55,17 +58,21 @@
import spack.build_systems.python
import spack.builder
import spack.config
import spack.deptypes as dt
import spack.main
import spack.package_base
import spack.paths
import spack.platforms
import spack.repo
import spack.schema.environment
import spack.spec
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.path
import spack.util.pattern
from spack import traverse
from spack.context import Context
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import spack_install_test_log
from spack.installer import InstallError
@@ -76,7 +83,6 @@
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
validate,
)
@@ -109,7 +115,6 @@
SPACK_CCACHE_BINARY = "SPACK_CCACHE_BINARY"
SPACK_SYSTEM_DIRS = "SPACK_SYSTEM_DIRS"
# Platform-specific library suffix.
if sys.platform == "darwin":
dso_suffix = "dylib"
@@ -319,19 +324,29 @@ def set_compiler_environment_variables(pkg, env):
# ttyout, ttyerr, etc.
link_dir = spack.paths.build_env_path
# Set SPACK compiler variables so that our wrapper knows what to call
# Set SPACK compiler variables so that our wrapper knows what to
# call. If there is no compiler configured then use a default
# wrapper which will emit an error if it is used.
if compiler.cc:
env.set("SPACK_CC", compiler.cc)
env.set("CC", os.path.join(link_dir, compiler.link_paths["cc"]))
else:
env.set("CC", os.path.join(link_dir, "cc"))
if compiler.cxx:
env.set("SPACK_CXX", compiler.cxx)
env.set("CXX", os.path.join(link_dir, compiler.link_paths["cxx"]))
else:
env.set("CC", os.path.join(link_dir, "c++"))
if compiler.f77:
env.set("SPACK_F77", compiler.f77)
env.set("F77", os.path.join(link_dir, compiler.link_paths["f77"]))
else:
env.set("F77", os.path.join(link_dir, "f77"))
if compiler.fc:
env.set("SPACK_FC", compiler.fc)
env.set("FC", os.path.join(link_dir, compiler.link_paths["fc"]))
else:
env.set("FC", os.path.join(link_dir, "fc"))
# Set SPACK compiler rpath flags so that our wrapper knows what to use
env.set("SPACK_CC_RPATH_ARG", compiler.cc_rpath_arg)
@@ -406,19 +421,13 @@ def set_compiler_environment_variables(pkg, env):
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper
(which have the prefix `SPACK_`) and also add the compiler wrappers
to PATH.
"""Set environment variables used by the Spack compiler wrapper (which have the prefix
`SPACK_`) and also add the compiler wrappers to PATH.
This determines the injected -L/-I/-rpath options; each
of these specifies a search order and this function computes these
options in a manner that is intended to match the DAG traversal order
in `modifications_from_dependencies`: that method uses a post-order
traversal so that `PrependPath` actions from dependencies take lower
precedence; we use a post-order traversal here to match the visitation
order of `modifications_from_dependencies` (so we are visiting the
lowest priority packages first).
"""
This determines the injected -L/-I/-rpath options; each of these specifies a search order and
this function computes these options in a manner that is intended to match the DAG traversal
order in `SetupContext`. TODO: this is not the case yet, we're using post order, SetupContext
is using topo order."""
# Set environment variables if specified for
# the given compiler
compiler = pkg.compiler
@@ -537,45 +546,42 @@ def update_compiler_args_for_dep(dep):
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs))
def set_module_variables_for_package(pkg):
def set_package_py_globals(pkg, context: Context = Context.BUILD):
"""Populate the Python module of a package with some useful global names.
This makes things easier for package writers.
"""
# Put a marker on this module so that it won't execute the body of this
# function again, since it is not needed
marker = "_set_run_already_called"
if getattr(pkg.module, marker, False):
return
module = ModuleChangePropagator(pkg)
jobs = determine_number_of_jobs(parallel=pkg.parallel)
m = module
m.make_jobs = jobs
# TODO: make these build deps that can be installed if not found.
m.make = MakeExecutable("make", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own
if sys.platform == "win32":
m.nmake = Executable("nmake")
m.msbuild = Executable("msbuild")
# analog to configure for win32
m.cscript = Executable("cscript")
if context == Context.BUILD:
jobs = determine_number_of_jobs(parallel=pkg.parallel)
m.make_jobs = jobs
# Find the configure script in the archive path
# Don't use which for this; we want to find it in the current dir.
m.configure = Executable("./configure")
# TODO: make these build deps that can be installed if not found.
m.make = MakeExecutable("make", jobs)
m.gmake = MakeExecutable("gmake", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own
if sys.platform == "win32":
m.nmake = Executable("nmake")
m.msbuild = Executable("msbuild")
# analog to configure for win32
m.cscript = Executable("cscript")
# Standard CMake arguments
m.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
m.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
m.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
# Find the configure script in the archive path
# Don't use which for this; we want to find it in the current dir.
m.configure = Executable("./configure")
# Put spack compiler paths in module scope.
# Standard CMake arguments
m.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
m.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
m.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc., so don't restrict it to context == build.)
link_dir = spack.paths.build_env_path
m.spack_cc = os.path.join(link_dir, pkg.compiler.link_paths["cc"])
m.spack_cxx = os.path.join(link_dir, pkg.compiler.link_paths["cxx"])
@@ -599,9 +605,6 @@ def static_to_shared_library(static_lib, shared_lib=None, **kwargs):
m.static_to_shared_library = static_to_shared_library
# Put a marker on this module so that it won't execute the body of this
# function again, since it is not needed
setattr(m, marker, True)
module.propagate_changes_to_mro()
@@ -727,12 +730,15 @@ def load_external_modules(pkg):
load_module(external_module)
def setup_package(pkg, dirty, context="build"):
def setup_package(pkg, dirty, context: Context = Context.BUILD):
"""Execute all environment setup routines."""
if context not in ["build", "test"]:
raise ValueError("'context' must be one of ['build', 'test'] - got: {0}".format(context))
if context not in (Context.BUILD, Context.TEST):
raise ValueError(f"'context' must be Context.BUILD or Context.TEST - got {context}")
set_module_variables_for_package(pkg)
# First populate the package.py's module with the relevant globals that could be used in any
# of the setup_* functions.
setup_context = SetupContext(pkg.spec, context=context)
setup_context.set_all_package_py_globals()
# Keep track of env changes from packages separately, since we want to
# issue warnings when packages make "suspicious" modifications.
@@ -740,42 +746,30 @@ def setup_package(pkg, dirty, context="build"):
env_mods = EnvironmentModifications()
# setup compilers for build contexts
need_compiler = context == "build" or (context == "test" and pkg.test_requires_compiler)
need_compiler = context == Context.BUILD or (
context == Context.TEST and pkg.test_requires_compiler
)
if need_compiler:
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
tty.debug("setup_package: grabbing modifications from dependencies")
env_mods.extend(modifications_from_dependencies(pkg.spec, context, custom_mods_only=False))
tty.debug("setup_package: collected all modifications from dependencies")
# architecture specific setup
# Platform specific setup goes before package specific setup. This is for setting
# defaults like MACOSX_DEPLOYMENT_TARGET on macOS.
platform = spack.platforms.by_name(pkg.spec.architecture.platform)
target = platform.target(pkg.spec.architecture.target)
platform.setup_platform_environment(pkg, env_mods)
if context == "build":
tty.debug("setup_package: setup build environment for root")
builder = spack.builder.create(pkg)
builder.setup_build_environment(env_mods)
tty.debug("setup_package: grabbing modifications from dependencies")
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
if (not dirty) and (not env_mods.is_unset("CPATH")):
tty.debug(
"A dependency has updated CPATH, this may lead pkg-"
"config to assume that the package is part of the system"
" includes and omit it when invoked with '--cflags'."
)
elif context == "test":
tty.debug("setup_package: setup test environment for root")
env_mods.extend(
inspect_path(
pkg.spec.prefix,
spack.user_environment.prefix_inspections(pkg.spec.platform),
exclude=is_system_path,
)
)
pkg.setup_run_environment(env_mods)
if context == Context.TEST:
env_mods.prepend_path("PATH", ".")
elif context == Context.BUILD and not dirty and not env_mods.is_unset("CPATH"):
tty.debug(
"A dependency has updated CPATH, this may lead pkg-config to assume that the package "
"is part of the system includes and omit it when invoked with '--cflags'."
)
# First apply the clean environment changes
env_base.apply_modifications()
@@ -813,158 +807,257 @@ def setup_package(pkg, dirty, context="build"):
return env_base
def _make_runnable(pkg, env):
# Helper method which prepends a Package's bin/ prefix to the PATH
# environment variable
prefix = pkg.prefix
class EnvironmentVisitor:
def __init__(self, *roots: spack.spec.Spec, context: Context):
# For the roots (well, marked specs) we follow different edges
# than for their deps, depending on the context.
self.root_hashes = set(s.dag_hash() for s in roots)
for dirname in ["bin", "bin64"]:
bin_dir = os.path.join(prefix, dirname)
if os.path.isdir(bin_dir):
env.prepend_path("PATH", bin_dir)
if context == Context.BUILD:
# Drop direct run deps in build context
# We don't really distinguish between install and build time test deps,
# so we include them here as build-time test deps.
self.root_depflag = dt.BUILD | dt.TEST | dt.LINK
elif context == Context.TEST:
# This is more of an extended run environment
self.root_depflag = dt.TEST | dt.RUN | dt.LINK
elif context == Context.RUN:
self.root_depflag = dt.RUN | dt.LINK
def neighbors(self, item):
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
depflag = self.root_depflag
else:
depflag = dt.LINK | dt.RUN
return traverse.sort_edges(spec.edges_to_dependencies(depflag=depflag))
def modifications_from_dependencies(
spec, context, custom_mods_only=True, set_package_py_globals=True
):
"""Returns the environment modifications that are required by
the dependencies of a spec and also applies modifications
to this spec's package at module scope, if need be.
class UseMode(Flag):
#: Entrypoint spec (a spec to be built; an env root, etc)
ROOT = auto()
Environment modifications include:
#: A spec used at runtime, but no executables in PATH
RUNTIME = auto()
- Updating PATH so that executables can be found
- Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective
tools can find Spack-built dependencies
- Running custom package environment modifications
#: A spec used at runtime, with executables in PATH
RUNTIME_EXECUTABLE = auto()
Custom package modifications can conflict with the default PATH changes
we make (specifically for the PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH
environment variables), so this applies changes in a fixed order:
#: A spec that's a direct build or test dep
BUILDTIME_DIRECT = auto()
- All modifications (custom and default) from external deps first
- All modifications from non-external deps afterwards
#: A spec that should be visible in search paths in a build env.
BUILDTIME = auto()
With that order, `PrependPath` actions from non-external default
environment modifications will take precedence over custom modifications
from external packages.
#: Flag is set when the (node, mode) is finalized
ADDED = auto()
A secondary constraint is that custom and default modifications are
grouped on a per-package basis: combined with the post-order traversal this
means that default modifications of dependents can override custom
modifications of dependencies (again, this would only occur for PATH,
CMAKE_PREFIX_PATH, or PKG_CONFIG_PATH).
Args:
spec (spack.spec.Spec): spec for which we want the modifications
context (str): either 'build' for build-time modifications or 'run'
for run-time modifications
custom_mods_only (bool): if True returns only custom modifications, if False
returns custom and default modifications
set_package_py_globals (bool): whether or not to set the global variables in the
package.py files (this may be problematic when using buildcaches that have
been built on a different but compatible OS)
"""
if context not in ["build", "run", "test"]:
raise ValueError(
"Expecting context to be one of ['build', 'run', 'test'], " "got: {0}".format(context)
def effective_deptypes(
*specs: spack.spec.Spec, context: Context = Context.BUILD
) -> List[Tuple[spack.spec.Spec, UseMode]]:
"""Given a list of input specs and a context, return a list of tuples of
all specs that contribute to (environment) modifications, together with
a flag specifying in what way they do so. The list is ordered topologically
from root to leaf, meaning that environment modifications should be applied
in reverse so that dependents override dependencies, not the other way around."""
visitor = traverse.TopoVisitor(
EnvironmentVisitor(*specs, context=context),
key=lambda x: x.dag_hash(),
root=True,
all_edges=True,
)
traverse.traverse_depth_first_with_visitor(traverse.with_artificial_edges(specs), visitor)
# Dictionary with "no mode" as default value, so it's easy to write modes[x] |= flag.
use_modes = defaultdict(lambda: UseMode(0))
nodes_with_type = []
for edge in visitor.edges:
parent, child, depflag = edge.parent, edge.spec, edge.depflag
# Mark the starting point
if parent is None:
use_modes[child] = UseMode.ROOT
continue
parent_mode = use_modes[parent]
# Nothing to propagate.
if not parent_mode:
continue
# Depending on the context, include particular deps from the root.
if UseMode.ROOT & parent_mode:
if context == Context.BUILD:
if (dt.BUILD | dt.TEST) & depflag:
use_modes[child] |= UseMode.BUILDTIME_DIRECT
if dt.LINK & depflag:
use_modes[child] |= UseMode.BUILDTIME
elif context == Context.TEST:
if (dt.RUN | dt.TEST) & depflag:
use_modes[child] |= UseMode.RUNTIME_EXECUTABLE
elif dt.LINK & depflag:
use_modes[child] |= UseMode.RUNTIME
elif context == Context.RUN:
if dt.RUN & depflag:
use_modes[child] |= UseMode.RUNTIME_EXECUTABLE
elif dt.LINK & depflag:
use_modes[child] |= UseMode.RUNTIME
# Propagate RUNTIME and RUNTIME_EXECUTABLE through link and run deps.
if (UseMode.RUNTIME | UseMode.RUNTIME_EXECUTABLE | UseMode.BUILDTIME_DIRECT) & parent_mode:
if dt.LINK & depflag:
use_modes[child] |= UseMode.RUNTIME
if dt.RUN & depflag:
use_modes[child] |= UseMode.RUNTIME_EXECUTABLE
# Propagate BUILDTIME through link deps.
if UseMode.BUILDTIME & parent_mode:
if dt.LINK & depflag:
use_modes[child] |= UseMode.BUILDTIME
# Finalize the spec; the invariant is that all in-edges are processed
# before out-edges, meaning that parent is done.
if not (UseMode.ADDED & parent_mode):
use_modes[parent] |= UseMode.ADDED
nodes_with_type.append((parent, parent_mode))
# Attach the leaf nodes, since we only added nodes with out-edges.
for spec, parent_mode in use_modes.items():
if parent_mode and not (UseMode.ADDED & parent_mode):
nodes_with_type.append((spec, parent_mode))
return nodes_with_type
class SetupContext:
"""This class encapsulates the logic to determine environment modifications, and is used as
well to set globals in modules of package.py."""
def __init__(self, *specs: spack.spec.Spec, context: Context) -> None:
"""Construct a ModificationsFromDag object.
Args:
specs: single root spec for build/test context, possibly more for run context
context: build, run, or test"""
if (context == Context.BUILD or context == Context.TEST) and not len(specs) == 1:
raise ValueError("Cannot setup build environment for multiple specs")
specs_with_type = effective_deptypes(*specs, context=context)
self.specs = specs
self.context = context
self.external: List[Tuple[spack.spec.Spec, UseMode]]
self.nonexternal: List[Tuple[spack.spec.Spec, UseMode]]
# Reverse so we go from leaf to root
self.nodes_in_subdag = set(id(s) for s, _ in specs_with_type)
# Split into non-external and external, maintaining topo order per group.
self.external, self.nonexternal = stable_partition(
reversed(specs_with_type), lambda t: t[0].external
)
self.should_be_runnable = UseMode.BUILDTIME_DIRECT | UseMode.RUNTIME_EXECUTABLE
self.should_setup_run_env = (
UseMode.BUILDTIME_DIRECT | UseMode.RUNTIME | UseMode.RUNTIME_EXECUTABLE
)
self.should_setup_dependent_build_env = UseMode.BUILDTIME | UseMode.BUILDTIME_DIRECT
self.should_setup_build_env = UseMode.ROOT if context == Context.BUILD else UseMode(0)
env = EnvironmentModifications()
if context == Context.RUN or context == Context.TEST:
self.should_be_runnable |= UseMode.ROOT
self.should_setup_run_env |= UseMode.ROOT
# Note: see computation of 'custom_mod_deps' and 'exe_deps' later in this
# function; these sets form the building blocks of those collections.
build_deps = set(spec.dependencies(deptype=("build", "test")))
link_deps = set(spec.traverse(root=False, deptype="link"))
build_link_deps = build_deps | link_deps
build_and_supporting_deps = set()
for build_dep in build_deps:
build_and_supporting_deps.update(build_dep.traverse(deptype="run"))
run_and_supporting_deps = set(spec.traverse(root=False, deptype=("run", "link")))
test_and_supporting_deps = set()
for test_dep in set(spec.dependencies(deptype="test")):
test_and_supporting_deps.update(test_dep.traverse(deptype="run"))
# Everything that calls setup_run_environment and setup_dependent_* needs globals set.
self.should_set_package_py_globals = (
self.should_setup_dependent_build_env | self.should_setup_run_env | UseMode.ROOT
)
# In a build context, the root and direct build deps need build-specific globals set.
self.needs_build_context = UseMode.ROOT | UseMode.BUILDTIME_DIRECT
# All dependencies that might have environment modifications to apply
custom_mod_deps = set()
if context == "build":
custom_mod_deps.update(build_and_supporting_deps)
# Tests may be performed after build
custom_mod_deps.update(test_and_supporting_deps)
else:
# test/run context
custom_mod_deps.update(run_and_supporting_deps)
if context == "test":
custom_mod_deps.update(test_and_supporting_deps)
custom_mod_deps.update(link_deps)
def set_all_package_py_globals(self):
"""Set the globals in modules of package.py files."""
for dspec, flag in chain(self.external, self.nonexternal):
pkg = dspec.package
# Determine 'exe_deps': the set of packages with binaries we want to use
if context == "build":
exe_deps = build_and_supporting_deps | test_and_supporting_deps
elif context == "run":
exe_deps = set(spec.traverse(deptype="run"))
elif context == "test":
exe_deps = test_and_supporting_deps
if self.should_set_package_py_globals & flag:
if self.context == Context.BUILD and self.needs_build_context & flag:
set_package_py_globals(pkg, context=Context.BUILD)
else:
# This includes runtime dependencies, also runtime deps of direct build deps.
set_package_py_globals(pkg, context=Context.RUN)
def default_modifications_for_dep(dep):
if dep in build_link_deps and not is_system_path(dep.prefix) and context == "build":
prefix = dep.prefix
for spec in dspec.dependents():
# Note: some specs have dependents that are unreachable from the root, so avoid
# setting globals for those.
if id(spec) not in self.nodes_in_subdag:
continue
dependent_module = ModuleChangePropagator(spec.package)
pkg.setup_dependent_package(dependent_module, spec)
dependent_module.propagate_changes_to_mro()
env.prepend_path("CMAKE_PREFIX_PATH", prefix)
def get_env_modifications(self) -> EnvironmentModifications:
"""Returns the environment variable modifications for the given input specs and context.
Environment modifications include:
- Updating PATH for packages that are required at runtime
- Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective
tools can find Spack-built dependencies (when context=build)
- Running custom package environment modifications: setup_run_environment,
setup_dependent_run_environment, setup_build_environment,
setup_dependent_build_environment.
for directory in ("lib", "lib64", "share"):
pcdir = os.path.join(prefix, directory, "pkgconfig")
if os.path.isdir(pcdir):
env.prepend_path("PKG_CONFIG_PATH", pcdir)
The (partial) order imposed on the specs is externals first, then topological
from leaf to root. That way externals cannot contribute search paths that would shadow
Spack's prefixes, and dependents override variables set by dependencies."""
env = EnvironmentModifications()
for dspec, flag in chain(self.external, self.nonexternal):
tty.debug(f"Adding env modifications for {dspec.name}")
pkg = dspec.package
if dep in exe_deps and not is_system_path(dep.prefix):
_make_runnable(dep, env)
if self.should_setup_dependent_build_env & flag:
self._make_buildtime_detectable(dspec, env)
def add_modifications_for_dep(dep):
tty.debug("Adding env modifications for {0}".format(dep.name))
# Some callers of this function only want the custom modifications.
# For callers that want both custom and default modifications, we want
# to perform the default modifications here (this groups custom
# and default modifications together on a per-package basis).
if not custom_mods_only:
default_modifications_for_dep(dep)
for root in self.specs: # there is only one root in build context
spack.builder.create(pkg).setup_dependent_build_environment(env, root)
# Perform custom modifications here (PrependPath actions performed in
# the custom method override the default environment modifications
# we do to help the build, namely for PATH, CMAKE_PREFIX_PATH, and
# PKG_CONFIG_PATH)
if dep in custom_mod_deps:
dpkg = dep.package
if set_package_py_globals:
set_module_variables_for_package(dpkg)
if self.should_setup_build_env & flag:
spack.builder.create(pkg).setup_build_environment(env)
current_module = ModuleChangePropagator(spec.package)
dpkg.setup_dependent_package(current_module, spec)
current_module.propagate_changes_to_mro()
if self.should_be_runnable & flag:
self._make_runnable(dspec, env)
if context == "build":
builder = spack.builder.create(dpkg)
builder.setup_dependent_build_environment(env, spec)
else:
dpkg.setup_dependent_run_environment(env, spec)
tty.debug("Added env modifications for {0}".format(dep.name))
if self.should_setup_run_env & flag:
run_env_mods = EnvironmentModifications()
for spec in dspec.dependents(deptype=dt.LINK | dt.RUN):
if id(spec) in self.nodes_in_subdag:
pkg.setup_dependent_run_environment(run_env_mods, spec)
pkg.setup_run_environment(run_env_mods)
if self.context == Context.BUILD:
# Don't let the runtime environment of compiler-like dependencies leak into the
# build env
run_env_mods.drop("CC", "CXX", "F77", "FC")
env.extend(run_env_mods)
# Note that we want to perform environment modifications in a fixed order.
# The Spec.traverse method provides this: i.e. in addition to
# the post-order semantics, it also guarantees a fixed traversal order
# among dependencies which are not constrained by post-order semantics.
for dspec in spec.traverse(root=False, order="post"):
if dspec.external:
add_modifications_for_dep(dspec)
return env
for dspec in spec.traverse(root=False, order="post"):
# Default env modifications for non-external packages can override
# custom modifications of external packages (this can only occur
# for modifications to PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH)
if not dspec.external:
add_modifications_for_dep(dspec)
def _make_buildtime_detectable(self, dep: spack.spec.Spec, env: EnvironmentModifications):
if is_system_path(dep.prefix):
return
return env
env.prepend_path("CMAKE_PREFIX_PATH", dep.prefix)
for d in ("lib", "lib64", "share"):
pcdir = os.path.join(dep.prefix, d, "pkgconfig")
if os.path.isdir(pcdir):
env.prepend_path("PKG_CONFIG_PATH", pcdir)
def _make_runnable(self, dep: spack.spec.Spec, env: EnvironmentModifications):
if is_system_path(dep.prefix):
return
for d in ("bin", "bin64"):
bin_dir = os.path.join(dep.prefix, d)
if os.path.isdir(bin_dir):
env.prepend_path("PATH", bin_dir)
def get_cmake_prefix_path(pkg):
@@ -996,7 +1089,7 @@ def get_cmake_prefix_path(pkg):
def _setup_pkg_and_run(
serialized_pkg, function, kwargs, write_pipe, input_multiprocess_fd, jsfd1, jsfd2
):
context = kwargs.get("context", "build")
context: str = kwargs.get("context", "build")
try:
# We are in the child process. Python sets sys.stdin to
@@ -1012,7 +1105,7 @@ def _setup_pkg_and_run(
if not kwargs.get("fake", False):
kwargs["unmodified_env"] = os.environ.copy()
kwargs["env_modifications"] = setup_package(
pkg, dirty=kwargs.get("dirty", False), context=context
pkg, dirty=kwargs.get("dirty", False), context=Context.from_string(context)
)
return_value = function(pkg, kwargs)
write_pipe.send(return_value)

View File

@@ -46,6 +46,7 @@ class AutotoolsPackage(spack.package_base.PackageBase):
depends_on("gnuconfig", type="build", when="target=ppc64le:")
depends_on("gnuconfig", type="build", when="target=aarch64:")
depends_on("gnuconfig", type="build", when="target=riscv64:")
depends_on("gmake", type="build")
conflicts("platform=windows")
def flags_to_build_system_args(self, flags):

View File

@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.build_environment
import spack.builder
from .cmake import CMakeBuilder, CMakePackage
@@ -34,6 +35,11 @@ def cmake_cache_option(name, boolean_value, comment="", force=False):
return 'set({0} {1} CACHE BOOL "{2}"{3})\n'.format(name, value, comment, force_str)
def cmake_cache_filepath(name, value, comment=""):
"""Generate a string for a cmake cache variable of type FILEPATH"""
return 'set({0} "{1}" CACHE FILEPATH "{2}")\n'.format(name, value, comment)
class CachedCMakeBuilder(CMakeBuilder):
#: Phases of a Cached CMake package
#: Note: the initconfig phase is used for developer builds as a final phase to stop on
@@ -257,6 +263,15 @@ def initconfig_hardware_entries(self):
entries.append(
cmake_cache_path("HIP_CXX_COMPILER", "{0}".format(self.spec["hip"].hipcc))
)
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath("CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "clang++"))
)
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)
@@ -271,13 +286,29 @@ def initconfig_hardware_entries(self):
def std_initconfig_entries(self):
cmake_prefix_path_env = os.environ["CMAKE_PREFIX_PATH"]
cmake_prefix_path = cmake_prefix_path_env.replace(os.pathsep, ";")
cmake_rpaths_env = spack.build_environment.get_rpaths(self.pkg)
cmake_rpaths_path = ";".join(cmake_rpaths_env)
complete_rpath_list = cmake_rpaths_path
if "SPACK_COMPILER_EXTRA_RPATHS" in os.environ:
spack_extra_rpaths_env = os.environ["SPACK_COMPILER_EXTRA_RPATHS"]
spack_extra_rpaths_path = spack_extra_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_extra_rpaths_path)
if "SPACK_COMPILER_IMPLICIT_RPATHS" in os.environ:
spack_implicit_rpaths_env = os.environ["SPACK_COMPILER_IMPLICIT_RPATHS"]
spack_implicit_rpaths_path = spack_implicit_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_implicit_rpaths_path)
return [
"#------------------{0}".format("-" * 60),
"# !!!! This is a generated file, edit at own risk !!!!",
"#------------------{0}".format("-" * 60),
"# CMake executable path: {0}".format(self.pkg.spec["cmake"].command.path),
"#------------------{0}\n".format("-" * 60),
cmake_cache_path("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_INSTALL_RPATH_USE_LINK_PATH", "ON"),
cmake_cache_string("CMAKE_BUILD_RPATH", complete_rpath_list),
cmake_cache_string("CMAKE_INSTALL_RPATH", complete_rpath_list),
self.define_cmake_cache_from_variant("CMAKE_BUILD_TYPE", "build_type"),
]

View File

@@ -0,0 +1,89 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
class CargoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = "CargoPackage"
build_system("cargo")
with when("build_system=cargo"):
depends_on("rust", type="build")
@spack.builder.builder("cargo")
class CargoBuilder(BaseBuilder):
"""The Cargo builder encodes the most common way of building software with
a rust Cargo.toml file. It has two phases that can be overridden, if need be:
1. :py:meth:`~.CargoBuilder.build`
2. :py:meth:`~.CargoBuilder.install`
For a finer tuning you may override:
+-----------------------------------------------+----------------------+
| **Method** | **Purpose** |
+===============================================+======================+
| :py:meth:`~.CargoBuilder.build_args` | Specify arguments |
| | to ``cargo install`` |
+-----------------------------------------------+----------------------+
| :py:meth:`~.CargoBuilder.check_args` | Specify arguments |
| | to ``cargo test`` |
+-----------------------------------------------+----------------------+
"""
phases = ("build", "install")
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
@property
def build_directory(self):
"""Return the directory containing the main Cargo.toml."""
return self.pkg.stage.source_path
@property
def build_args(self):
"""Arguments for ``cargo build``."""
return []
@property
def check_args(self):
"""Argument for ``cargo test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).cargo(
"install", "--root", "out", "--path", ".", *self.build_args
)
def install(self, pkg, spec, prefix):
"""Copy build files into package prefix."""
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).cargo("test", *self.check_args)

View File

@@ -142,10 +142,10 @@ def flags_to_build_system_args(self, flags):
# We specify for each of them.
if flags["ldflags"]:
ldflags = " ".join(flags["ldflags"])
ld_string = "-DCMAKE_{0}_LINKER_FLAGS={1}"
# cmake has separate linker arguments for types of builds.
for type in ["EXE", "MODULE", "SHARED", "STATIC"]:
self.cmake_flag_args.append(ld_string.format(type, ldflags))
self.cmake_flag_args.append(f"-DCMAKE_EXE_LINKER_FLAGS={ldflags}")
self.cmake_flag_args.append(f"-DCMAKE_MODULE_LINKER_FLAGS={ldflags}")
self.cmake_flag_args.append(f"-DCMAKE_SHARED_LINKER_FLAGS={ldflags}")
# CMake has libs options separated by language. Apply ours to each.
if flags["ldlibs"]:

View File

@@ -136,12 +136,12 @@ def cuda_flags(arch_list):
conflicts("%gcc@11:", when="+cuda ^cuda@:11.4.0")
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.1")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@16:", when="+cuda ^cuda@:12.3")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -0,0 +1,98 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
class GoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using the Go toolchain."""
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = "GoPackage"
#: Legacy buildsystem attribute used to deserialize and install old specs
legacy_buildsystem = "go"
build_system("go")
with when("build_system=go"):
# TODO: this seems like it should be depends_on, see
# setup_dependent_build_environment in go for why I kept it like this
extends("go@1.14:", type="build")
@spack.builder.builder("go")
class GoBuilder(BaseBuilder):
"""The Go builder encodes the most common way of building software with
a golang go.mod file. It has two phases that can be overridden, if need be:
1. :py:meth:`~.GoBuilder.build`
2. :py:meth:`~.GoBuilder.install`
For a finer tuning you may override:
+-----------------------------------------------+--------------------+
| **Method** | **Purpose** |
+===============================================+====================+
| :py:meth:`~.GoBuilder.build_args` | Specify arguments |
| | to ``go build`` |
+-----------------------------------------------+--------------------+
| :py:meth:`~.GoBuilder.check_args` | Specify arguments |
| | to ``go test`` |
+-----------------------------------------------+--------------------+
"""
phases = ("build", "install")
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
def setup_build_environment(self, env):
env.set("GO111MODULE", "on")
env.set("GOTOOLCHAIN", "local")
@property
def build_directory(self):
"""Return the directory containing the main go.mod."""
return self.pkg.stage.source_path
@property
def build_args(self):
"""Arguments for ``go build``."""
# Pass -ldflags "-s -w" by default: -s omits the symbol table, -w omits DWARF debug info
return ["-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
@property
def check_args(self):
"""Argument for ``go test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
"""Runs ``go build`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).go("build", *self.build_args)
def install(self, pkg, spec, prefix):
"""Install built binaries into prefix bin."""
with fs.working_dir(self.build_directory):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).go("test", *self.check_args)
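
To show how a recipe might opt into this new build system, here is a minimal hypothetical package; the class name, URLs, and checksum are placeholders, not a real Spack package:

    # Hypothetical recipe; assumes GoPackage is exported through spack.package once this lands.
    from spack.package import *

    class HelloGo(GoPackage):
        """Toy Go module used to illustrate the GoPackage defaults above."""

        homepage = "https://example.com/hello-go"                   # placeholder
        url = "https://example.com/hello-go/archive/v1.0.0.tar.gz"  # placeholder

        version("1.0.0", sha256="0" * 64)  # placeholder checksum

        # With no overrides, GoBuilder runs `go build -ldflags "-s -w" -o hello-go`
        # in the source directory and installs the resulting binary into <prefix>/bin.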


@@ -9,7 +9,8 @@
import spack.builder
import spack.package_base
from spack.directives import build_system, conflicts
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from ._checks import (
BaseBuilder,
@@ -29,7 +30,10 @@ class MakefilePackage(spack.package_base.PackageBase):
legacy_buildsystem = "makefile"
build_system("makefile")
conflicts("platform=windows", when="build_system=makefile")
with when("build_system=makefile"):
conflicts("platform=windows")
depends_on("gmake", type="build")
@spack.builder.builder("makefile")


@@ -7,13 +7,12 @@
import os
import platform
import shutil
from os.path import basename, dirname, isdir
from os.path import basename, isdir
from llnl.util.filesystem import find_headers, find_libraries, join_path
from llnl.util.filesystem import HeaderList, find_libraries, join_path, mkdirp
from llnl.util.link_tree import LinkTree
from spack.directives import conflicts, variant
from spack.package import mkdirp
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -56,10 +55,21 @@ def component_dir(self):
"""Subdirectory for this component in the install prefix."""
raise NotImplementedError
@property
def v2_layout_versions(self):
"""Version that implements the v2 directory layout."""
raise NotImplementedError
@property
def v2_layout(self):
"""Returns true if this version implements the v2 directory layout."""
return self.spec.satisfies(self.v2_layout_versions)
@property
def component_prefix(self):
"""Path to component <prefix>/<component>/<version>."""
return self.prefix.join(join_path(self.component_dir, self.spec.version))
v = self.spec.version.up_to(2) if self.v2_layout else self.spec.version
return self.prefix.join(self.component_dir).join(str(v))
@property
def env_script_args(self):
@@ -113,8 +123,9 @@ def install_component(self, installer_path):
shutil.rmtree("/var/intel/installercache", ignore_errors=True)
# Some installers have a bug and do not return an error code when failing
if not isdir(join_path(self.prefix, self.component_dir)):
raise RuntimeError("install failed")
install_dir = self.component_prefix
if not isdir(install_dir):
raise RuntimeError("install failed to directory: {0}".format(install_dir))
def setup_run_environment(self, env):
"""Adds environment variables to the generated module file.
@@ -129,7 +140,7 @@ def setup_run_environment(self, env):
if "~envmods" not in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, "env", "vars.sh"), *self.env_script_args
self.component_prefix.env.join("vars.sh"), *self.env_script_args
)
)
@@ -168,16 +179,40 @@ class IntelOneApiLibraryPackage(IntelOneApiPackage):
"""
def header_directories(self, dirs):
h = HeaderList([])
h.directories = dirs
return h
@property
def headers(self):
include_path = join_path(self.component_prefix, "include")
return find_headers("*", include_path, recursive=True)
return self.header_directories(
[self.component_prefix.include, self.component_prefix.include.join(self.component_dir)]
)
@property
def libs(self):
lib_path = join_path(self.component_prefix, "lib", "intel64")
lib_path = lib_path if isdir(lib_path) else dirname(lib_path)
return find_libraries("*", root=lib_path, shared=True, recursive=True)
# for v2_layout all libraries are in the top level, v1 sometimes put them in intel64
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries
that expose functionality in sdk subdirectories.
Implement the method directly in the package if something
different is needed.
"""
@property
def headers(self):
return self.header_directories([self.component_prefix.sdk.include])
@property
def libs(self):
return find_libraries("*", self.component_prefix.sdk.lib64)
class IntelOneApiStaticLibraryList:
@@ -212,3 +247,7 @@ def link_flags(self):
@property
def ld_flags(self):
return "{0} {1}".format(self.search_flags, self.link_flags)
#: Tuple of Intel math libraries, exported to packages
INTEL_MATH_LIBRARIES = ("intel-mkl", "intel-oneapi-mkl", "intel-parallel-studio")
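
As a rough sketch, the layout switch above changes the computed component prefix like this (the install root and versions are illustrative only; ``spec.version.up_to(2)`` is approximated with plain string handling):

    # Illustrative only: v2 layouts truncate the version directory to two components.
    spec_version = "2024.0.0"
    component_dir = "mkl"
    v2_layout = True

    v = ".".join(spec_version.split(".")[:2]) if v2_layout else spec_version
    component_prefix = f"/opt/intel/oneapi/{component_dir}/{v}"
    # v1 layout -> /opt/intel/oneapi/mkl/2024.0.0
    # v2 layout -> /opt/intel/oneapi/mkl/2024.0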


@@ -6,14 +6,14 @@
import os
import re
import shutil
import stat
from typing import Optional
from typing import Iterable, List, Mapping, Optional
import archspec
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList
import spack.builder
import spack.config
@@ -24,19 +24,38 @@
import spack.spec
import spack.store
from spack.directives import build_system, depends_on, extends, maintainers
from spack.error import NoHeadersError, NoLibrariesError, SpecError
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.util.executable import Executable
from spack.version import Version
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BaseBuilder, execute_install_time_tests
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
"""Iterable that yields KEY=VALUE paths through a dictionary.
Args:
dictionary: Possibly nested dictionary of arbitrary keys and values.
Yields:
A single path through the dictionary.
"""
for key, item in dictionary.items():
if isinstance(item, dict):
# Recursive case
for value in _flatten_dict(item):
yield f"{key}={value}"
else:
# Base case
yield f"{key}={item}"
class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart")
@property
def import_modules(self):
def import_modules(self) -> Iterable[str]:
"""Names of modules that the Python package provides.
These are used to test whether or not the installation succeeded.
@@ -51,7 +70,7 @@ def import_modules(self):
detected, this property can be overridden by the package.
Returns:
list: list of strings of module names
List of strings of module names.
"""
modules = []
pkg = self.spec["python"].package
@@ -88,14 +107,14 @@ def import_modules(self):
return modules
@property
def skip_modules(self):
def skip_modules(self) -> Iterable[str]:
"""Names of modules that should be skipped when running tests.
These are a subset of import_modules. If a module has submodules,
they are skipped as well (meaning a.b is skipped if a is contained).
Returns:
list: list of strings of module names
List of strings of module names.
"""
return []
@@ -171,12 +190,12 @@ def remove_files_from_view(self, view, merge_map):
view.remove_files(to_remove)
def test_imports(self):
def test_imports(self) -> None:
"""Attempts to import modules of the installed package."""
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python
python = inspect.getmodule(self).python # type: ignore[union-attr]
for module in self.import_modules:
with test_part(
self,
@@ -301,24 +320,27 @@ class PythonPackage(PythonExtension):
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls):
def homepage(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/project/" + name + "/"
return f"https://pypi.org/project/{name}/"
return None
@lang.classproperty
def url(cls):
def url(cls) -> Optional[str]:
if cls.pypi:
return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
return f"https://files.pythonhosted.org/packages/source/{cls.pypi[0]}/{cls.pypi}"
return None
@lang.classproperty
def list_url(cls):
def list_url(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/simple/" + name + "/"
return f"https://pypi.org/simple/{name}/"
return None
@property
def headers(self):
def headers(self) -> HeaderList:
"""Discover header files in platlib."""
# Remove py- prefix in package name
@@ -336,7 +358,7 @@ def headers(self):
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
@property
def libs(self):
def libs(self) -> LibraryList:
"""Discover libraries in platlib."""
# Remove py- prefix in package name
@@ -353,51 +375,6 @@ def libs(self):
raise NoLibrariesError(msg.format(self.spec.name, root))
def fixup_shebangs(path: str, old_interpreter: bytes, new_interpreter: bytes):
# Recurse into the install prefix and fixup shebangs
exe = stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH
dirs = [path]
hardlinks = set()
while dirs:
with os.scandir(dirs.pop()) as entries:
for entry in entries:
if entry.is_dir(follow_symlinks=False):
dirs.append(entry.path)
continue
# Only consider files, not symlinks
if not entry.is_file(follow_symlinks=False):
continue
lstat = entry.stat(follow_symlinks=False)
# Skip over files that are not executable
if not (lstat.st_mode & exe):
continue
# Don't modify hardlinks more than once
if lstat.st_nlink > 1:
key = (lstat.st_ino, lstat.st_dev)
if key in hardlinks:
continue
hardlinks.add(key)
# Finally replace shebangs if any.
with open(entry.path, "rb+") as f:
contents = f.read(2)
if contents != b"#!":
continue
contents += f.read()
if old_interpreter not in contents:
continue
f.seek(0)
f.write(contents.replace(old_interpreter, new_interpreter))
f.truncate()
@spack.builder.builder("python_pip")
class PythonPipBuilder(BaseBuilder):
phases = ("install",)
@@ -409,13 +386,13 @@ class PythonPipBuilder(BaseBuilder):
legacy_long_methods = ("install_options", "global_options", "config_settings")
#: Names associated with package attributes in the old build-system format
legacy_attributes = ("build_directory", "install_time_test_callbacks")
legacy_attributes = ("archive_files", "build_directory", "install_time_test_callbacks")
#: Callback names for install-time test
install_time_test_callbacks = ["test"]
@staticmethod
def std_args(cls):
def std_args(cls) -> List[str]:
return [
# Verbose
"-vvv",
@@ -440,7 +417,7 @@ def std_args(cls):
]
@property
def build_directory(self):
def build_directory(self) -> str:
"""The root directory of the Python package.
This is usually the directory containing one of the following files:
@@ -451,127 +428,76 @@ def build_directory(self):
"""
return self.pkg.stage.source_path
def config_settings(self, spec, prefix):
def config_settings(self, spec: Spec, prefix: Prefix) -> Mapping[str, object]:
"""Configuration settings to be passed to the PEP 517 build backend.
Requires pip 22.1 or newer.
Requires pip 22.1 or newer for keys that appear only a single time,
or pip 23.1 or newer if the same key appears multiple times.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
dict: dictionary of KEY, VALUE settings
Possibly nested dictionary of KEY, VALUE settings.
"""
return {}
def install_options(self, spec, prefix):
def install_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
"""Extra arguments to be supplied to the setup.py install command.
Requires pip 23.0 or older.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
list: list of options
List of options.
"""
return []
def global_options(self, spec, prefix):
def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
"""Extra global options to be supplied to the setup.py call before the install
or bdist_wheel command.
Deprecated in pip 23.1.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
list: list of options
List of options.
"""
return []
@property
def _build_venv_path(self):
"""Return the path to the virtual environment used for building when
python is external."""
return os.path.join(self.spec.package.stage.path, "build_env")
@property
def _build_venv_python(self) -> Executable:
"""Return the Python executable in the build virtual environment when
python is external."""
return Executable(os.path.join(self._build_venv_path, "bin", "python"))
def install(self, pkg, spec, prefix):
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
"""Install everything from build directory."""
python: Executable = spec["python"].command
# Since we invoke pip with --no-build-isolation, we have to make sure that pip cannot
# execute hooks from user and system site-packages.
if spec["python"].external:
# There are no environment variables to disable the system site-packages, so we use a
# virtual environment instead. The downside of this approach is that pip produces
# incorrect shebangs that refer to the virtual environment, which we have to fix up.
python("-m", "venv", "--without-pip", self._build_venv_path)
pip = self._build_venv_python
else:
# For a Spack managed Python, system site-packages is empty/unused by design, so it
# suffices to disable user site-packages, for which there is an environment variable.
pip = python
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
args = PythonPipBuilder.std_args(pkg) + ["--prefix=" + prefix]
for key, value in self.config_settings(spec, prefix).items():
if spec["py-pip"].version < Version("22.1"):
raise SpecError(
"'{}' package uses 'config_settings' which is only supported by "
"pip 22.1+. Add the following line to the package to fix this:\n\n"
' depends_on("py-pip@22.1:", type="build")'.format(spec.name)
)
args.append("--config-settings={}={}".format(key, value))
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]
for setting in _flatten_dict(self.config_settings(spec, prefix)):
args.append(f"--config-settings={setting}")
for option in self.install_options(spec, prefix):
args.append("--install-option=" + option)
args.append(f"--install-option={option}")
for option in self.global_options(spec, prefix):
args.append("--global-option=" + option)
args.append(f"--global-option={option}")
if pkg.stage.archive_file and pkg.stage.archive_file.endswith(".whl"):
args.append(pkg.stage.archive_file)
else:
args.append(".")
pip = spec["python"].command
# Hide user packages, since we don't have build isolation. This is
# necessary because pip / setuptools may run hooks from arbitrary
# packages during the build. There is no equivalent variable to hide
# system packages, so this is not reliable for external Python.
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
with fs.working_dir(self.build_directory):
pip(*args)
@spack.builder.run_after("install")
def fixup_shebangs_pointing_to_build(self):
"""When installing a package using an external python, we use a temporary virtual
environment which improves build isolation. The downside is that pip produces shebangs
that point to the temporary virtual environment. This method fixes them up to point to the
underlying Python."""
# No need to fix up shebangs if no build venv was used. (This post-install function also
# runs when install was overridden in another package, so check that the venv path exists.)
if not os.path.exists(self._build_venv_path):
return
# Use sys.executable, since that's what pip uses.
interpreter = (
lambda python: python("-c", "import sys; print(sys.executable)", output=str)
.strip()
.encode("utf-8")
)
fixup_shebangs(
path=self.spec.prefix,
old_interpreter=interpreter(self._build_venv_python),
new_interpreter=interpreter(self.spec["python"].command),
)
spack.builder.run_after("install")(execute_install_time_tests)
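
Concretely, the shebang fixup shown above amounts to a byte-level replacement like this (all paths are made up):

    # Illustrative only: what the fixup does to an installed script's first line.
    old_interpreter = b"/tmp/spack-stage-py-foo/build_env/bin/python"  # temporary build venv
    new_interpreter = b"/usr/bin/python3.11"                           # external interpreter

    shebang = b"#!/tmp/spack-stage-py-foo/build_env/bin/python\n"
    fixed = shebang.replace(old_interpreter, new_interpreter)
    # fixed == b"#!/usr/bin/python3.11\n"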


@@ -64,7 +64,7 @@ class RacketBuilder(spack.builder.Builder):
@property
def subdirectory(self):
if self.racket_name:
if self.pkg.racket_name:
return "pkgs/{0}".format(self.pkg.racket_name)
return None


@@ -108,6 +108,8 @@ class ROCmPackage(PackageBase):
"gfx90a:xnack+",
"gfx90c",
"gfx940",
"gfx941",
"gfx942",
"gfx1010",
"gfx1011",
"gfx1012",
@@ -168,6 +170,8 @@ def hip_flags(amdgpu_target):
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack-")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack+")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx940")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx941")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx942")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1013")
depends_on("llvm-amdgpu@3.8.0:", when="amdgpu_target=gfx1030")
depends_on("llvm-amdgpu@3.9.0:", when="amdgpu_target=gfx1031")


@@ -25,6 +25,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import memoized
from llnl.util.tty.color import cescape, colorize
import spack
import spack.binary_distribution as bindist
@@ -45,10 +46,26 @@
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import build_stamp as cdash_build_stamp
JOB_RETRY_CONDITIONS = ["always"]
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
JOB_RETRY_CONDITIONS = [
# "always",
"unknown_failure",
"script_failure",
"api_failure",
"stuck_or_timeout_failure",
"runner_system_failure",
"runner_unsupported",
"stale_schedule",
# "job_execution_timeout",
"archived_failure",
"unmet_prerequisites",
"scheduler_failure",
"data_integrity_failure",
]
TEMP_STORAGE_MIRROR_NAME = "ci_temporary_mirror"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
# TODO: Remove this in Spack 0.23
SHARED_PR_MIRROR_URL = "s3://spack-binaries-prs/shared_pr_mirror"
JOB_NAME_FORMAT = (
"{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{arch=architecture}"
@@ -96,15 +113,6 @@ def _remove_reserved_tags(tags):
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
def _get_spec_string(spec):
format_elements = ["{name}{@version}", "{%compiler}"]
if spec.architecture:
format_elements.append(" {arch=architecture}")
return spec.format("".join(format_elements))
def _spec_deps_key(s):
return "{0}/{1}".format(s.name, s.dag_hash(7))
@@ -209,22 +217,22 @@ def _print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisi
tty.msg("Staging summary ([x] means a job needs rebuilding):")
for stage_index, stage in enumerate(stages):
tty.msg(" stage {0} ({1} jobs):".format(stage_index, len(stage)))
tty.msg(f" stage {stage_index} ({len(stage)} jobs):")
for job in sorted(stage):
for job in sorted(stage, key=lambda j: (not rebuild_decisions[j].rebuild, j)):
s = spec_labels[job]
rebuild = rebuild_decisions[job].rebuild
reason = rebuild_decisions[job].reason
reason_msg = " ({0})".format(reason) if reason else ""
tty.msg(
" [{1}] {0} -> {2}{3}".format(
job, "x" if rebuild else " ", _get_spec_string(s), reason_msg
)
)
if rebuild_decisions[job].mirrors:
tty.msg(" found on the following mirrors:")
for murl in rebuild_decisions[job].mirrors:
tty.msg(" {0}".format(murl))
reason_msg = f" ({reason})" if reason else ""
spec_fmt = "{name}{@version}{%compiler}{/hash:7}"
if rebuild_decisions[job].rebuild:
status = colorize("@*g{[x]} ")
msg = f" {status}{s.cformat(spec_fmt)}{reason_msg}"
else:
msg = f"{s.format(spec_fmt)}{reason_msg}"
if rebuild_decisions[job].mirrors:
msg += f" [{', '.join(rebuild_decisions[job].mirrors)}]"
msg = colorize(f" @K - {cescape(msg)}@.")
tty.msg(msg)
def _compute_spec_deps(spec_list):
@@ -678,7 +686,7 @@ def generate_gitlab_ci_yaml(
remote_mirror_override (str): Typically only needed when one spack.yaml
is used to populate several mirrors with binaries, based on some
criteria. Spack protected pipelines populate different mirrors based
on branch name, facilitated by this option.
on branch name, facilitated by this option. DEPRECATED
"""
with spack.concretize.disable_compiler_existence_check():
with env.write_transaction():
@@ -775,17 +783,39 @@ def generate_gitlab_ci_yaml(
"instead.",
)
if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
tty.die("spack ci generate requires an env containing a mirror")
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
deprecated_mirror_config = False
buildcache_destination = None
if "buildcache-destination" in pipeline_mirrors:
if remote_mirror_override:
tty.die(
"Using the deprecated --buildcache-destination cli option and "
"having a mirror named 'buildcache-destination' at the same time "
"is not allowed"
)
buildcache_destination = pipeline_mirrors["buildcache-destination"]
else:
deprecated_mirror_config = True
# TODO: This will be an error in Spack 0.23
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
# TODO: Remove this block in spack 0.23
remote_mirror_url = None
if deprecated_mirror_config:
if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
tty.die("spack ci generate requires an env containing a mirror")
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
spack_buildcache_copy = os.environ.get("SPACK_COPY_BUILDCACHE", None)
if spack_buildcache_copy:
buildcache_copies = {}
buildcache_copy_src_prefix = remote_mirror_override or remote_mirror_url
buildcache_copy_src_prefix = (
buildcache_destination.fetch_url
if buildcache_destination
else remote_mirror_override or remote_mirror_url
)
buildcache_copy_dest_prefix = spack_buildcache_copy
# Check for a list of "known broken" specs that we should not bother
@@ -797,6 +827,7 @@ def generate_gitlab_ci_yaml(
enable_artifacts_buildcache = False
if "enable-artifacts-buildcache" in ci_config:
tty.warn("Support for enable-artifacts-buildcache will be removed in Spack 0.23")
enable_artifacts_buildcache = ci_config["enable-artifacts-buildcache"]
rebuild_index_enabled = True
@@ -805,13 +836,15 @@ def generate_gitlab_ci_yaml(
temp_storage_url_prefix = None
if "temporary-storage-url-prefix" in ci_config:
tty.warn("Support for temporary-storage-url-prefix will be removed in Spack 0.23")
temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
# If a remote mirror override (alternate buildcache destination) was
# specified, add it here in case it already contains binaries for hashes
# we are about to generate.
# TODO: Remove this block in Spack 0.23
mirrors_to_check = None
if remote_mirror_override:
if deprecated_mirror_config and remote_mirror_override:
if spack_pipeline_type == "spack_protected_branch":
# Overriding the main mirror in this case might result
# in skipping jobs on a release pipeline because specs are
@@ -831,8 +864,9 @@ def generate_gitlab_ci_yaml(
cfg.default_modify_scope(),
)
# TODO: Remove this block in Spack 0.23
shared_pr_mirror = None
if spack_pipeline_type == "spack_pull_request":
if deprecated_mirror_config and spack_pipeline_type == "spack_pull_request":
stack_name = os.environ.get("SPACK_CI_STACK_NAME", "")
shared_pr_mirror = url_util.join(SHARED_PR_MIRROR_URL, stack_name)
spack.mirror.add(
@@ -884,6 +918,7 @@ def generate_gitlab_ci_yaml(
job_log_dir = os.path.join(pipeline_artifacts_dir, "logs")
job_repro_dir = os.path.join(pipeline_artifacts_dir, "reproduction")
job_test_dir = os.path.join(pipeline_artifacts_dir, "tests")
# TODO: Remove this line in Spack 0.23
local_mirror_dir = os.path.join(pipeline_artifacts_dir, "mirror")
user_artifacts_dir = os.path.join(pipeline_artifacts_dir, "user_data")
@@ -898,13 +933,13 @@ def generate_gitlab_ci_yaml(
rel_job_log_dir = os.path.relpath(job_log_dir, ci_project_dir)
rel_job_repro_dir = os.path.relpath(job_repro_dir, ci_project_dir)
rel_job_test_dir = os.path.relpath(job_test_dir, ci_project_dir)
# TODO: Remove this line in Spack 0.23
rel_local_mirror_dir = os.path.join(local_mirror_dir, ci_project_dir)
rel_user_artifacts_dir = os.path.relpath(user_artifacts_dir, ci_project_dir)
# Speed up staging by first fetching binary indices from all mirrors
# (including the override mirror we may have just added above).
try:
bindist.binary_index.update()
bindist.BINARY_INDEX.update()
except bindist.FetchCacheError as e:
tty.warn(e)
@@ -1113,6 +1148,7 @@ def main_script_replacements(cmd):
},
)
# TODO: Remove this block in Spack 0.23
if enable_artifacts_buildcache:
bc_root = os.path.join(local_mirror_dir, "build_cache")
job_object["artifacts"]["paths"].extend(
@@ -1142,10 +1178,12 @@ def main_script_replacements(cmd):
_print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisions)
# Clean up remote mirror override if enabled
if remote_mirror_override:
spack.mirror.remove("ci_pr_mirror", cfg.default_modify_scope())
if spack_pipeline_type == "spack_pull_request":
spack.mirror.remove("ci_shared_pr_mirror", cfg.default_modify_scope())
# TODO: Remove this block in Spack 0.23
if deprecated_mirror_config:
if remote_mirror_override:
spack.mirror.remove("ci_pr_mirror", cfg.default_modify_scope())
if spack_pipeline_type == "spack_pull_request":
spack.mirror.remove("ci_shared_pr_mirror", cfg.default_modify_scope())
tty.debug("{0} build jobs generated in {1} stages".format(job_id, stage_id))
@@ -1176,10 +1214,28 @@ def main_script_replacements(cmd):
sync_job["needs"] = [
{"job": generate_job_name, "pipeline": "{0}".format(parent_pipeline_id)}
]
if "variables" not in sync_job:
sync_job["variables"] = {}
sync_job["variables"]["SPACK_COPY_ONLY_DESTINATION"] = (
buildcache_destination.fetch_url
if buildcache_destination
else remote_mirror_override or remote_mirror_url
)
if "buildcache-source" in pipeline_mirrors:
buildcache_source = pipeline_mirrors["buildcache-source"].fetch_url
else:
# TODO: Remove this condition in Spack 0.23
buildcache_source = os.environ.get("SPACK_SOURCE_MIRROR", None)
sync_job["variables"]["SPACK_BUILDCACHE_SOURCE"] = buildcache_source
output_object["copy"] = sync_job
job_id += 1
if job_id > 0:
# TODO: Remove this block in Spack 0.23
if temp_storage_url_prefix:
# There were some rebuild jobs scheduled, so we will need to
# schedule a job to clean up the temporary storage location
@@ -1213,6 +1269,13 @@ def main_script_replacements(cmd):
signing_job["when"] = "always"
signing_job["retry"] = {"max": 2, "when": ["always"]}
signing_job["interruptible"] = True
if "variables" not in signing_job:
signing_job["variables"] = {}
signing_job["variables"]["SPACK_BUILDCACHE_DESTINATION"] = (
buildcache_destination.push_url # need the s3 url for aws s3 sync
if buildcache_destination
else remote_mirror_override or remote_mirror_url
)
output_object["sign-pkgs"] = signing_job
@@ -1221,13 +1284,13 @@ def main_script_replacements(cmd):
stage_names.append("stage-rebuild-index")
final_job = spack_ci_ir["jobs"]["reindex"]["attributes"]
index_target_mirror = mirror_urls[0]
if remote_mirror_override:
index_target_mirror = remote_mirror_override
final_job["stage"] = "stage-rebuild-index"
target_mirror = remote_mirror_override or remote_mirror_url
if buildcache_destination:
target_mirror = buildcache_destination.push_url
final_job["script"] = _unpack_script(
final_job["script"],
op=lambda cmd: cmd.replace("{index_target_mirror}", index_target_mirror),
op=lambda cmd: cmd.replace("{index_target_mirror}", target_mirror),
)
final_job["when"] = "always"
@@ -1249,20 +1312,24 @@ def main_script_replacements(cmd):
"SPACK_CONCRETE_ENV_DIR": rel_concrete_env_dir,
"SPACK_VERSION": spack_version,
"SPACK_CHECKOUT_VERSION": version_to_clone,
# TODO: Remove this line in Spack 0.23
"SPACK_REMOTE_MIRROR_URL": remote_mirror_url,
"SPACK_JOB_LOG_DIR": rel_job_log_dir,
"SPACK_JOB_REPRO_DIR": rel_job_repro_dir,
"SPACK_JOB_TEST_DIR": rel_job_test_dir,
# TODO: Remove this line in Spack 0.23
"SPACK_LOCAL_MIRROR_DIR": rel_local_mirror_dir,
"SPACK_PIPELINE_TYPE": str(spack_pipeline_type),
"SPACK_CI_STACK_NAME": os.environ.get("SPACK_CI_STACK_NAME", "None"),
# TODO: Remove this line in Spack 0.23
"SPACK_CI_SHARED_PR_MIRROR_URL": shared_pr_mirror or "None",
"SPACK_REBUILD_CHECK_UP_TO_DATE": str(prune_dag),
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
"SPACK_REQUIRE_SIGNING": os.environ.get("SPACK_REQUIRE_SIGNING", "False"),
}
if remote_mirror_override:
# TODO: Remove this block in Spack 0.23
if deprecated_mirror_config and remote_mirror_override:
(output_object["variables"]["SPACK_REMOTE_MIRROR_OVERRIDE"]) = remote_mirror_override
spack_stack_name = os.environ.get("SPACK_CI_STACK_NAME", None)
@@ -2002,43 +2069,23 @@ def process_command(name, commands, repro_dir, run=True, exit_on_failure=True):
def create_buildcache(
input_spec: spack.spec.Spec,
*,
pipeline_mirror_url: Optional[str] = None,
buildcache_mirror_url: Optional[str] = None,
sign_binaries: bool = False,
input_spec: spack.spec.Spec, *, destination_mirror_urls: List[str], sign_binaries: bool = False
) -> List[PushResult]:
"""Create the buildcache at the provided mirror(s).
Arguments:
input_spec: Installed spec to package and push
buildcache_mirror_url: URL for the buildcache mirror
pipeline_mirror_url: URL for the pipeline mirror
destination_mirror_urls: List of urls to push to
sign_binaries: Whether or not to sign buildcache entry
Returns: A list of PushResults, indicating success or failure.
"""
results = []
# Create buildcache in either the main remote mirror, or in the
# per-PR mirror, if this is a PR pipeline
if buildcache_mirror_url:
for mirror_url in destination_mirror_urls:
results.append(
PushResult(
success=push_mirror_contents(input_spec, buildcache_mirror_url, sign_binaries),
url=buildcache_mirror_url,
)
)
# Create another copy of that buildcache in the per-pipeline
# temporary storage mirror (this is only done if either
# artifacts buildcache is enabled or a temporary storage url
# prefix is set)
if pipeline_mirror_url:
results.append(
PushResult(
success=push_mirror_contents(input_spec, pipeline_mirror_url, sign_binaries),
url=pipeline_mirror_url,
success=push_mirror_contents(input_spec, mirror_url, sign_binaries), url=mirror_url
)
)
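
A minimal sketch of calling the refactored helper above (the spec and mirror URLs are placeholders):

    # Hypothetical: push one installed spec to two destination mirrors.
    results = create_buildcache(
        input_spec=installed_spec,  # an installed spack.spec.Spec
        destination_mirror_urls=["s3://example-mirror", "file:///tmp/mirror"],  # placeholders
        sign_binaries=False,
    )
    for result in results:
        print(result.url, "pushed" if result.success else "failed")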
@@ -2218,13 +2265,13 @@ def build_name(self):
spec.architecture,
self.build_group,
)
tty.verbose(
tty.debug(
"Generated CDash build name ({0}) from the {1}".format(build_name, spec.name)
)
return build_name
build_name = os.environ.get("SPACK_CDASH_BUILD_NAME")
tty.verbose("Using CDash build name ({0}) from the environment".format(build_name))
tty.debug("Using CDash build name ({0}) from the environment".format(build_name))
return build_name
@property # type: ignore
@@ -2238,11 +2285,11 @@ def build_stamp(self):
Returns: (str) current CDash build stamp"""
build_stamp = os.environ.get("SPACK_CDASH_BUILD_STAMP")
if build_stamp:
tty.verbose("Using build stamp ({0}) from the environment".format(build_stamp))
tty.debug("Using build stamp ({0}) from the environment".format(build_stamp))
return build_stamp
build_stamp = cdash_build_stamp(self.build_group, time.time())
tty.verbose("Generated new build stamp ({0})".format(build_stamp))
tty.debug("Generated new build stamp ({0})".format(build_stamp))
return build_stamp
@property # type: ignore


@@ -6,10 +6,8 @@
import argparse
import os
import re
import shlex
import sys
from textwrap import dedent
from typing import List, Match, Tuple
from typing import List, Union
import llnl.string
import llnl.util.tty as tty
@@ -147,89 +145,37 @@ def get_command(cmd_name):
return getattr(get_module(cmd_name), pname)
class _UnquotedFlags:
"""Use a heuristic in `.extract()` to detect whether the user is trying to set
multiple flags like the docker ENV attribute allows (e.g. 'cflags=-Os -pipe').
def quote_kvp(string: str) -> str:
"""For strings like ``name=value`` or ``name==value``, quote and escape the value if needed.
If the heuristic finds a match (which can be checked with `__bool__()`), a warning
message explaining how to quote multiple flags correctly can be generated with
`.report()`.
This is a compromise to respect quoting of key-value pairs on the CLI. The shell
strips quotes from quoted arguments, so we cannot know *exactly* how CLI arguments
were quoted. To compensate, we re-add quotes around anything starting with ``name=``
or ``name==``, and we assume the rest of the argument is the value. This covers the
common cases of passing flags, e.g., ``cflags="-O2 -g"`` on the command line.
"""
match = re.match(spack.parser.SPLIT_KVP, string)
if not match:
return string
flags_arg_pattern = re.compile(
r'^({0})=([^\'"].*)$'.format("|".join(spack.spec.FlagMap.valid_compiler_flags()))
)
def __init__(self, all_unquoted_flag_pairs: List[Tuple[Match[str], str]]):
self._flag_pairs = all_unquoted_flag_pairs
def __bool__(self) -> bool:
return bool(self._flag_pairs)
@classmethod
def extract(cls, sargs: str) -> "_UnquotedFlags":
all_unquoted_flag_pairs: List[Tuple[Match[str], str]] = []
prev_flags_arg = None
for arg in shlex.split(sargs):
if prev_flags_arg is not None:
all_unquoted_flag_pairs.append((prev_flags_arg, arg))
prev_flags_arg = cls.flags_arg_pattern.match(arg)
return cls(all_unquoted_flag_pairs)
def report(self) -> str:
single_errors = [
"({0}) {1} {2} => {3}".format(
i + 1,
match.group(0),
next_arg,
'{0}="{1} {2}"'.format(match.group(1), match.group(2), next_arg),
)
for i, (match, next_arg) in enumerate(self._flag_pairs)
]
return dedent(
"""\
Some compiler or linker flags were provided without quoting their arguments,
which now causes spack to try to parse the *next* argument as a spec component
such as a variant instead of an additional compiler or linker flag. If the
intent was to set multiple flags, try quoting them together as described below.
Possible flag quotation errors (with the correctly-quoted version after the =>):
{0}"""
).format("\n".join(single_errors))
key, delim, value = match.groups()
return f"{key}{delim}{spack.parser.quote_if_needed(value)}"
def parse_specs(args, **kwargs):
def parse_specs(
args: Union[str, List[str]], concretize: bool = False, tests: bool = False
) -> List[spack.spec.Spec]:
"""Convenience function for parsing arguments from specs. Handles common
exceptions and dies if there are errors.
"""
concretize = kwargs.get("concretize", False)
normalize = kwargs.get("normalize", False)
tests = kwargs.get("tests", False)
args = [args] if isinstance(args, str) else args
arg_string = " ".join([quote_kvp(arg) for arg in args])
sargs = args
if not isinstance(args, str):
sargs = " ".join(args)
unquoted_flags = _UnquotedFlags.extract(sargs)
try:
specs = spack.parser.parse(sargs)
for spec in specs:
if concretize:
spec.concretize(tests=tests) # implies normalize
elif normalize:
spec.normalize(tests=tests)
return specs
except spack.error.SpecError as e:
msg = e.message
if e.long_message:
msg += e.long_message
# Unquoted flags will be read as a variant or hash
if unquoted_flags and ("variant" in msg or "hash" in msg):
msg += "\n\n"
msg += unquoted_flags.report()
raise spack.error.SpackError(msg) from e
specs = spack.parser.parse(arg_string)
for spec in specs:
if concretize:
spec.concretize(tests=tests)
return specs
def matching_spec_from_env(spec):


@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "add a spec to an environment"
section = "environments"


@@ -2,6 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import warnings
import llnl.util.tty as tty
import llnl.util.tty.colify
import llnl.util.tty.color as cl
@@ -52,8 +54,10 @@ def setup_parser(subparser):
def configs(parser, args):
reports = spack.audit.run_group(args.subcommand)
_process_reports(reports)
with warnings.catch_warnings():
warnings.simplefilter("ignore")
reports = spack.audit.run_group(args.subcommand)
_process_reports(reports)
def packages(parser, args):


@@ -15,13 +15,13 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.cmd.common.arguments
import spack.config
import spack.main
import spack.mirror
import spack.spec
import spack.stage
import spack.util.path
from spack.cmd.common import arguments
description = "manage bootstrap configuration"
section = "system"
@@ -68,12 +68,8 @@
def _add_scope_option(parser):
scopes = spack.config.scopes()
parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)
@@ -106,7 +102,7 @@ def setup_parser(subparser):
disable.add_argument("name", help="name of the source to be disabled", nargs="?", default=None)
reset = sp.add_parser("reset", help="reset bootstrapping configuration to Spack defaults")
spack.cmd.common.arguments.add_common_arguments(reset, ["yes_to_all"])
arguments.add_common_arguments(reset, ["yes_to_all"])
root = sp.add_parser("root", help="get/set the root bootstrap directory")
_add_scope_option(root)


@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd.common.env_utility as env_utility
from spack.context import Context
description = (
"run a command in a spec's install environment, or dump its environment to screen or file"
@@ -14,4 +15,4 @@
def build_env(parser, args):
env_utility.emulate_env_utility("build-env", "build", args)
env_utility.emulate_env_utility("build-env", Context.BUILD, args)


@@ -3,36 +3,59 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import copy
import glob
import hashlib
import json
import multiprocessing.pool
import os
import shutil
import sys
import tempfile
from typing import List
import urllib.request
from typing import Dict, List, Optional, Tuple
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.string import plural
from llnl.util.lang import elide_list
import spack.binary_distribution as bindist
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.error
import spack.hash_types as ht
import spack.mirror
import spack.oci.oci
import spack.oci.opener
import spack.relocate
import spack.repo
import spack.spec
import spack.stage
import spack.store
import spack.user_environment
import spack.util.crypto
import spack.util.url as url_util
import spack.util.web as web_util
from spack.build_environment import determine_number_of_jobs
from spack.cmd import display_specs
from spack.cmd.common import arguments
from spack.oci.image import (
Digest,
ImageReference,
default_config,
default_index_tag,
default_manifest,
default_tag,
tag_is_spec,
)
from spack.oci.oci import (
copy_missing_layers_with_retry,
get_manifest_and_config_with_retry,
upload_blob_with_retry,
upload_manifest_with_retry,
)
from spack.spec import Spec, save_dependency_specfiles
from spack.stage import Stage
description = "create, download and install binary packages"
section = "packaging"
@@ -53,12 +76,26 @@ def setup_parser(subparser: argparse.ArgumentParser):
)
push_sign = push.add_mutually_exclusive_group(required=False)
push_sign.add_argument(
"--unsigned", "-u", action="store_true", help="push unsigned buildcache tarballs"
"--unsigned",
"-u",
action="store_false",
dest="signed",
default=None,
help="push unsigned buildcache tarballs",
)
push_sign.add_argument(
"--signed",
action="store_true",
dest="signed",
default=None,
help="push signed buildcache tarballs",
)
push_sign.add_argument(
"--key", "-k", metavar="key", type=str, default=None, help="key for signing"
)
push.add_argument("mirror", type=str, help="mirror name, path, or URL")
push.add_argument(
"mirror", type=arguments.mirror_name_or_url, help="mirror name, path, or URL"
)
push.add_argument(
"--update-index",
"--rebuild-index",
@@ -84,7 +121,10 @@ def setup_parser(subparser: argparse.ArgumentParser):
action="store_true",
help="stop pushing on first failure (default is best effort)",
)
arguments.add_common_arguments(push, ["specs"])
push.add_argument(
"--base-image", default=None, help="specify the base image for the buildcache. "
)
arguments.add_common_arguments(push, ["specs", "jobs"])
push.set_defaults(func=push_fn)
install = subparsers.add_parser("install", help=install_fn.__doc__)
@@ -154,23 +194,22 @@ def setup_parser(subparser: argparse.ArgumentParser):
)
# used to construct scope arguments below
scopes = spack.config.scopes()
check.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope containing mirrors to check",
)
check_spec_or_specfile = check.add_mutually_exclusive_group(required=True)
check_spec_or_specfile.add_argument(
# Unfortunately there are 3 ways to do the same thing here:
check_specs = check.add_mutually_exclusive_group()
check_specs.add_argument(
"-s", "--spec", help="check single spec instead of release specs file"
)
check_spec_or_specfile.add_argument(
check_specs.add_argument(
"--spec-file",
help="check single spec from json or yaml file instead of release specs file",
)
arguments.add_common_arguments(check, ["specs"])
check.set_defaults(func=check_fn)
@@ -268,6 +307,21 @@ def _matching_specs(specs: List[Spec]) -> List[Spec]:
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
def _format_spec(spec: Spec) -> str:
return spec.cformat("{name}{@version}{/hash:7}")
def _progress(i: int, total: int):
if total > 1:
digits = len(str(total))
return f"[{i+1:{digits}}/{total}] "
return ""
def _make_pool():
return multiprocessing.pool.Pool(determine_number_of_jobs(parallel=True))
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
@@ -281,15 +335,38 @@ def push_fn(args):
else:
specs = spack.cmd.require_active_env("buildcache push").all_specs()
mirror = arguments.mirror_name_or_url(args.mirror)
if args.allow_root:
tty.warn(
"The flag `--allow-root` is the default in Spack 0.21, will be removed in Spack 0.22"
)
url = mirror.push_url
mirror: spack.mirror.Mirror = args.mirror
# Check if this is an OCI image.
try:
image_ref = spack.oci.oci.image_from_mirror(mirror)
except ValueError:
image_ref = None
push_url = mirror.push_url
# When neither --signed, --unsigned nor --key are specified, use the mirror's default.
if args.signed is None and not args.key:
unsigned = not mirror.signed
else:
unsigned = not (args.key or args.signed)
# For OCI images, we require dependencies to be pushed for now.
if image_ref:
if "dependencies" not in args.things_to_install:
tty.die("Dependencies must be pushed for OCI images.")
if not unsigned:
tty.warn(
"Code signing is currently not supported for OCI images. "
"Use --unsigned to silence this warning."
)
# This is a list of installed, non-external specs.
specs = bindist.specs_to_be_packaged(
specs,
root="package" in args.things_to_install,
@@ -299,45 +376,47 @@ def push_fn(args):
# When pushing multiple specs, print the url once ahead of time, as well as how
# many specs are being pushed.
if len(specs) > 1:
tty.info(f"Selected {len(specs)} specs to push to {url}")
tty.info(f"Selected {len(specs)} specs to push to {push_url}")
skipped = []
failed = []
# tty printing
color = clr.get_color_when()
format_spec = lambda s: s.format("{name}{@version}{/hash:7}", color=color)
total_specs = len(specs)
digits = len(str(total_specs))
# TODO: unify this logic in the future.
if image_ref:
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
skipped = _push_oci(args, image_ref, specs, tmpdir, pool)
else:
skipped = []
for i, spec in enumerate(specs):
try:
bindist.push_or_raise(
spec,
url,
bindist.PushOptions(
force=args.force,
unsigned=args.unsigned,
key=args.key,
regenerate_index=args.update_index,
),
)
for i, spec in enumerate(specs):
try:
bindist.push_or_raise(
spec,
push_url,
bindist.PushOptions(
force=args.force,
unsigned=unsigned,
key=args.key,
regenerate_index=args.update_index,
),
)
if total_specs > 1:
msg = f"[{i+1:{digits}}/{total_specs}] Pushed {format_spec(spec)}"
else:
msg = f"Pushed {format_spec(spec)} to {url}"
msg = f"{_progress(i, len(specs))}Pushed {_format_spec(spec)}"
if len(specs) == 1:
msg += f" to {push_url}"
tty.info(msg)
tty.info(msg)
except bindist.NoOverwriteException:
skipped.append(_format_spec(spec))
except bindist.NoOverwriteException:
skipped.append(format_spec(spec))
# Catch any other exception unless the fail fast option is set
except Exception as e:
if args.fail_fast or isinstance(e, (bindist.PickKeyException, bindist.NoKeyException)):
raise
failed.append((format_spec(spec), e))
# Catch any other exception unless the fail fast option is set
except Exception as e:
if args.fail_fast or isinstance(
e, (bindist.PickKeyException, bindist.NoKeyException)
):
raise
failed.append((_format_spec(spec), e))
if skipped:
if len(specs) == 1:
@@ -364,6 +443,341 @@ def push_fn(args):
),
)
# Update the index if requested
# TODO: move the update-index logic out of bindist; it should run once after all specs
# are pushed, not once per spec.
if image_ref and len(skipped) < len(specs) and args.update_index:
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
_update_index_oci(image_ref, tmpdir, pool)
def _get_spack_binary_blob(image_ref: ImageReference) -> Optional[spack.oci.oci.Blob]:
"""Get the spack tarball layer digests and size if it exists"""
try:
manifest, config = get_manifest_and_config_with_retry(image_ref)
return spack.oci.oci.Blob(
compressed_digest=Digest.from_string(manifest["layers"][-1]["digest"]),
uncompressed_digest=Digest.from_string(config["rootfs"]["diff_ids"][-1]),
size=manifest["layers"][-1]["size"],
)
except Exception:
return None
def _push_single_spack_binary_blob(image_ref: ImageReference, spec: spack.spec.Spec, tmpdir: str):
filename = os.path.join(tmpdir, f"{spec.dag_hash()}.tar.gz")
# Create an oci.image.layer aka tarball of the package
compressed_tarfile_checksum, tarfile_checksum = spack.oci.oci.create_tarball(spec, filename)
blob = spack.oci.oci.Blob(
Digest.from_sha256(compressed_tarfile_checksum),
Digest.from_sha256(tarfile_checksum),
os.path.getsize(filename),
)
# Upload the blob
upload_blob_with_retry(image_ref, file=filename, digest=blob.compressed_digest)
# delete the file
os.unlink(filename)
return blob
def _retrieve_env_dict_from_config(config: dict) -> dict:
"""Retrieve the environment variables from the image config file.
Sets a default value for PATH if it is not present.
Args:
config (dict): The image config file.
Returns:
dict: The environment variables.
"""
env = {"PATH": "/bin:/usr/bin"}
if "Env" in config.get("config", {}):
for entry in config["config"]["Env"]:
key, value = entry.split("=", 1)
env[key] = value
return env
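
For example, with a made-up image config:

    config = {"config": {"Env": ["PATH=/usr/local/bin:/usr/bin", "LANG=C.UTF-8"]}}
    _retrieve_env_dict_from_config(config)
    # -> {'PATH': '/usr/local/bin:/usr/bin', 'LANG': 'C.UTF-8'}

    _retrieve_env_dict_from_config({})
    # -> {'PATH': '/bin:/usr/bin'}  (default PATH when the config sets no Env)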
def _archspec_to_gooarch(spec: spack.spec.Spec) -> str:
name = spec.target.family.name
name_map = {"aarch64": "arm64", "x86_64": "amd64"}
return name_map.get(name, name)
def _put_manifest(
base_images: Dict[str, Tuple[dict, dict]],
checksums: Dict[str, spack.oci.oci.Blob],
spec: spack.spec.Spec,
image_ref: ImageReference,
tmpdir: str,
):
architecture = _archspec_to_gooarch(spec)
dependencies = list(
reversed(
list(
s
for s in spec.traverse(order="topo", deptype=("link", "run"), root=True)
if not s.external
)
)
)
base_manifest, base_config = base_images[architecture]
env = _retrieve_env_dict_from_config(base_config)
spack.user_environment.environment_modifications_for_specs(spec).apply_modifications(env)
# Create an oci.image.config file
config = copy.deepcopy(base_config)
# Add the diff ids of the dependencies
for s in dependencies:
config["rootfs"]["diff_ids"].append(str(checksums[s.dag_hash()].uncompressed_digest))
# Set the environment variables
config["config"]["Env"] = [f"{k}={v}" for k, v in env.items()]
# From the OCI v1.0 spec:
# > Any extra fields in the Image JSON struct are considered implementation
# > specific and MUST be ignored by any implementations which are unable to
# > interpret them.
# We use this to store the Spack spec, so we can use it to create an index.
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_dict["buildcache_layout_version"] = 1
spec_dict["binary_cache_checksum"] = {
"hash_algorithm": "sha256",
"hash": checksums[spec.dag_hash()].compressed_digest.digest,
}
config.update(spec_dict)
config_file = os.path.join(tmpdir, f"{spec.dag_hash()}.config.json")
with open(config_file, "w") as f:
json.dump(config, f, separators=(",", ":"))
config_file_checksum = Digest.from_sha256(
spack.util.crypto.checksum(hashlib.sha256, config_file)
)
# Upload the config file
upload_blob_with_retry(image_ref, file=config_file, digest=config_file_checksum)
oci_manifest = {
"mediaType": "application/vnd.oci.image.manifest.v1+json",
"schemaVersion": 2,
"config": {
"mediaType": base_manifest["config"]["mediaType"],
"digest": str(config_file_checksum),
"size": os.path.getsize(config_file),
},
"layers": [
*(layer for layer in base_manifest["layers"]),
*(
{
"mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
"digest": str(checksums[s.dag_hash()].compressed_digest),
"size": checksums[s.dag_hash()].size,
}
for s in dependencies
),
],
"annotations": {"org.opencontainers.image.description": spec.format()},
}
image_ref_for_spec = image_ref.with_tag(default_tag(spec))
# Finally upload the manifest
upload_manifest_with_retry(image_ref_for_spec, oci_manifest=oci_manifest)
# delete the config file
os.unlink(config_file)
return image_ref_for_spec
def _push_oci(
args,
image_ref: ImageReference,
installed_specs_with_deps: List[Spec],
tmpdir: str,
pool: multiprocessing.pool.Pool,
) -> List[str]:
"""Push specs to an OCI registry
Args:
args: The command line arguments.
image_ref: The image reference.
installed_specs_with_deps: The installed specs to push, excluding externals,
including deps, ordered from roots to leaves.
Returns:
List[str]: The list of skipped specs (already in the buildcache).
"""
# Reverse the order
installed_specs_with_deps = list(reversed(installed_specs_with_deps))
# The base image to use for the package. When not set, we use
# the OCI registry only for storage, and do not use any base image.
base_image_ref: Optional[ImageReference] = (
ImageReference.from_string(args.base_image) if args.base_image else None
)
# Spec dag hash -> blob
checksums: Dict[str, spack.oci.oci.Blob] = {}
# arch -> (manifest, config)
base_images: Dict[str, Tuple[dict, dict]] = {}
# Specs not uploaded because they already exist
skipped = []
if not args.force:
tty.info("Checking for existing specs in the buildcache")
to_be_uploaded = []
tags_to_check = (image_ref.with_tag(default_tag(s)) for s in installed_specs_with_deps)
available_blobs = pool.map(_get_spack_binary_blob, tags_to_check)
for spec, maybe_blob in zip(installed_specs_with_deps, available_blobs):
if maybe_blob is not None:
checksums[spec.dag_hash()] = maybe_blob
skipped.append(_format_spec(spec))
else:
to_be_uploaded.append(spec)
else:
to_be_uploaded = installed_specs_with_deps
if not to_be_uploaded:
return skipped
tty.info(
f"{len(to_be_uploaded)} specs need to be pushed to {image_ref.domain}/{image_ref.name}"
)
# Upload blobs
new_blobs = pool.starmap(
_push_single_spack_binary_blob, ((image_ref, spec, tmpdir) for spec in to_be_uploaded)
)
# And update the spec to blob mapping
for spec, blob in zip(to_be_uploaded, new_blobs):
checksums[spec.dag_hash()] = blob
# Copy base image layers, probably fine to do sequentially.
for spec in to_be_uploaded:
architecture = _archspec_to_gooarch(spec)
# Get base image details, if we don't have them yet
if architecture in base_images:
continue
if base_image_ref is None:
base_images[architecture] = (default_manifest(), default_config(architecture, "linux"))
else:
base_images[architecture] = copy_missing_layers_with_retry(
base_image_ref, image_ref, architecture
)
# Upload manifests
tty.info("Uploading manifests")
pushed_image_ref = pool.starmap(
_put_manifest,
((base_images, checksums, spec, image_ref, tmpdir) for spec in to_be_uploaded),
)
# Print the image names of the top-level specs
for spec, ref in zip(to_be_uploaded, pushed_image_ref):
tty.info(f"Pushed {_format_spec(spec)} to {ref}")
return skipped
def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
# Don't allow recursion here, since Spack itself always uploads
# vnd.oci.image.manifest.v1+json, not vnd.oci.image.index.v1+json
_, config = get_manifest_and_config_with_retry(image_ref.with_tag(tag), tag, recurse=0)
# Do very basic validation: if "spec" is a key in the config, it
# must be a Spec object too.
return config if "spec" in config else None
def _update_index_oci(
image_ref: ImageReference, tmpdir: str, pool: multiprocessing.pool.Pool
) -> None:
response = spack.oci.opener.urlopen(urllib.request.Request(url=image_ref.tags_url()))
spack.oci.opener.ensure_status(response, 200)
tags = json.load(response)["tags"]
# Fetch all image config files in parallel
spec_dicts = pool.starmap(
_config_from_tag, ((image_ref, tag) for tag in tags if tag_is_spec(tag))
)
# Populate the database
db_root_dir = os.path.join(tmpdir, "db_root")
db = bindist.BuildCacheDatabase(db_root_dir)
for spec_dict in spec_dicts:
spec = Spec.from_dict(spec_dict)
db.add(spec, directory_layout=None)
db.mark(spec, "in_buildcache", True)
# Create the index.json file
index_json_path = os.path.join(tmpdir, "index.json")
with open(index_json_path, "w") as f:
db._write_to_file(f)
# Create an empty config.json file
empty_config_json_path = os.path.join(tmpdir, "config.json")
with open(empty_config_json_path, "wb") as f:
f.write(b"{}")
# Upload the index.json file
index_shasum = Digest.from_sha256(spack.util.crypto.checksum(hashlib.sha256, index_json_path))
upload_blob_with_retry(image_ref, file=index_json_path, digest=index_shasum)
# Upload the config.json file
empty_config_digest = Digest.from_sha256(
spack.util.crypto.checksum(hashlib.sha256, empty_config_json_path)
)
upload_blob_with_retry(image_ref, file=empty_config_json_path, digest=empty_config_digest)
# Push a manifest file that references the index.json file as a layer
# Notice that we push this as if it is an image, which it of course is not.
# When the ORAS spec becomes official, we can use that instead of a fake image.
# For now we just use the OCI image spec, so that we don't run into issues with
# automatic garbage collection of blobs that are not referenced by any image manifest.
oci_manifest = {
"mediaType": "application/vnd.oci.image.manifest.v1+json",
"schemaVersion": 2,
# Config is just an empty {} file for now, and irrelevant
"config": {
"mediaType": "application/vnd.oci.image.config.v1+json",
"digest": str(empty_config_digest),
"size": os.path.getsize(empty_config_json_path),
},
# The buildcache index is the only layer; it is not actually a tarball, so the media type is a lie.
"layers": [
{
"mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
"digest": str(index_shasum),
"size": os.path.getsize(index_json_path),
}
],
}
upload_manifest_with_retry(image_ref.with_tag(default_index_tag), oci_manifest)
def install_fn(args):
"""install from a binary package"""
@@ -414,22 +828,31 @@ def preview_fn(args):
)
def check_fn(args):
def check_fn(args: argparse.Namespace):
"""check specs against remote binary mirror(s) to see if any need to be rebuilt
this command uses the process exit code to indicate its result, specifically, if the
exit code is non-zero, then at least one of the indicated specs needs to be rebuilt
"""
if args.spec_file:
specs_arg = (
args.spec_file if os.path.sep in args.spec_file else os.path.join(".", args.spec_file)
)
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use --spec instead."
f"Use `spack buildcache check {specs_arg}` instead."
)
elif args.spec:
specs_arg = args.spec
tty.warn(
"The flag `--spec` is deprecated and will be removed in Spack 0.23. "
f"Use `spack buildcache check {specs_arg}` instead."
)
else:
specs_arg = args.specs
specs = spack.cmd.parse_specs(args.spec or args.spec_file)
if specs:
specs = _matching_specs(specs, specs)
if specs_arg:
specs = _matching_specs(spack.cmd.parse_specs(specs_arg))
else:
specs = spack.cmd.require_active_env("buildcache check").all_specs()
@@ -522,7 +945,7 @@ def copy_buildcache_file(src_url, dest_url, local_path=None):
local_path = os.path.join(tmpdir, os.path.basename(src_url))
try:
temp_stage = Stage(src_url, path=os.path.dirname(local_path))
temp_stage = spack.stage.Stage(src_url, path=os.path.dirname(local_path))
try:
temp_stage.create()
temp_stage.fetch()
@@ -616,6 +1039,20 @@ def manifest_copy(manifest_file_list):
def update_index(mirror: spack.mirror.Mirror, update_keys=False):
# Special case OCI images for now.
try:
image_ref = spack.oci.oci.image_from_mirror(mirror)
except ValueError:
image_ref = None
if image_ref:
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
_update_index_oci(image_ref, tmpdir, pool)
return
# Otherwise, assume a normal mirror.
url = mirror.push_url
bindist.generate_package_index(url_util.join(url, bindist.build_cache_relative_path()))
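A short usage sketch of the dispatch above (the mirror URLs are made up): if the mirror parses as an OCI image reference, the index is pushed to the registry as a tagged manifest; otherwise a conventional package index is generated under the mirror's push URL.

    import spack.mirror
    from spack.cmd.buildcache import update_index  # module shown in this diff

    # Conventional mirror: regenerates build_cache/index.json at the push URL.
    update_index(spack.mirror.Mirror("s3://example-bucket/mirror"), update_keys=True)

    # OCI mirror: image_from_mirror() succeeds, so _update_index_oci() pushes the
    # index as a manifest tagged with default_index_tag instead.
    update_index(spack.mirror.Mirror("oci://ghcr.io/example/buildcache"))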



@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "change an existing spec in an environment"
section = "environments"


@@ -3,10 +3,10 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import re
import sys
import llnl.string
import llnl.util.lang
from llnl.util import tty
@@ -15,11 +15,11 @@
import spack.spec
import spack.stage
import spack.util.crypto
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.package_base import PackageBase, deprecated_version, preferred_version
from spack.util.editor import editor
from spack.util.format import get_version_lines
from spack.util.naming import valid_fully_qualified_module_name
from spack.version import Version
description = "checksum available versions of a package"
@@ -35,30 +35,30 @@ def setup_parser(subparser):
help="don't clean up staging area when command completes",
)
subparser.add_argument(
"-b",
"--batch",
"-b",
action="store_true",
default=False,
help="don't ask which versions to checksum",
)
subparser.add_argument(
"-l",
"--latest",
"-l",
action="store_true",
default=False,
help="checksum the latest available version",
)
subparser.add_argument(
"-p",
"--preferred",
"-p",
action="store_true",
default=False,
help="checksum the known Spack preferred version",
)
modes_parser = subparser.add_mutually_exclusive_group()
modes_parser.add_argument(
"-a",
"--add-to-package",
"-a",
action="store_true",
default=False,
help="add new versions to package",
@@ -66,27 +66,26 @@ def setup_parser(subparser):
modes_parser.add_argument(
"--verify", action="store_true", default=False, help="verify known package checksums"
)
arguments.add_common_arguments(subparser, ["package", "jobs"])
subparser.add_argument("package", help="name or spec (e.g. `cmake` or `cmake@3.18`)")
subparser.add_argument(
"versions", nargs=argparse.REMAINDER, help="versions to generate checksums for"
"versions",
nargs="*",
help="checksum these specific versions (if omitted, Spack searches for remote versions)",
)
arguments.add_common_arguments(subparser, ["jobs"])
subparser.epilog = (
"examples:\n"
" `spack checksum zlib@1.2` autodetects versions 1.2.0 to 1.2.13 from the remote\n"
" `spack checksum zlib 1.2.13` checksums exact version 1.2.13 directly without search\n"
)
def checksum(parser, args):
# Did the user pass 'package@version' string?
if len(args.versions) == 0 and "@" in args.package:
args.versions = [args.package.split("@")[1]]
args.package = args.package.split("@")[0]
# Make sure the user provided a package and not a URL
if not valid_fully_qualified_module_name(args.package):
tty.die("`spack checksum` accepts package names, not URLs.")
spec = spack.spec.Spec(args.package)
# Get the package we're going to generate checksums for
pkg_cls = spack.repo.PATH.get_pkg_class(args.package)
pkg = pkg_cls(spack.spec.Spec(args.package))
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
# Build a list of versions to checksum
versions = [Version(v) for v in args.versions]
# Define placeholder for remote versions.
@@ -128,18 +127,41 @@ def checksum(parser, args):
remote_versions = pkg.fetch_remote_versions(args.jobs)
url_dict = remote_versions
# A spidered URL can differ from the package.py *computed* URL, pointing to different tarballs.
# For example, GitHub release pages sometimes have multiple tarballs with different shasums:
# - releases/download/1.0/<pkg>-1.0.tar.gz (uploaded tarball)
# - archive/refs/tags/1.0.tar.gz (generated tarball)
# We want to ensure that `spack checksum` and `spack install` ultimately use the same URL, so
# here we check whether the crawled and computed URLs disagree and, if they do, prefer the
# computed URL whenever it actually exists (verified with a quick HEAD request).
url_changed_for_version = set()
for version, url in url_dict.items():
possible_urls = pkg.all_urls_for_version(version)
if url not in possible_urls:
for possible_url in possible_urls:
if web_util.url_exists(possible_url):
url_dict[version] = possible_url
break
else:
url_changed_for_version.add(version)
if not url_dict:
tty.die(f"Could not find any remote versions for {pkg.name}")
# print an empty line to create a new output section block
print()
elif len(url_dict) > 1 and not args.batch and sys.stdin.isatty():
filtered_url_dict = spack.stage.interactive_version_filter(
url_dict,
pkg.versions,
url_changes=url_changed_for_version,
initial_verion_filter=spec.versions,
)
if not filtered_url_dict:
exit(0)
url_dict = filtered_url_dict
else:
tty.info(f"Found {llnl.string.plural(len(url_dict), 'version')} of {pkg.name}")
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
pkg.name,
keep_stage=args.keep_stage,
batch=(args.batch or len(versions) > 0 or len(url_dict) == 1),
fetch_options=pkg.fetch_options,
url_dict, pkg.name, keep_stage=args.keep_stage, fetch_options=pkg.fetch_options
)
if args.verify:


@@ -16,6 +16,7 @@
import spack.cmd.buildcache as buildcache
import spack.config as cfg
import spack.environment as ev
import spack.environment.depfile
import spack.hash_types as ht
import spack.mirror
import spack.util.gpg as gpg_util
@@ -191,6 +192,14 @@ def ci_generate(args):
"""
env = spack.cmd.require_active_env(cmd_name="ci generate")
if args.copy_to:
tty.warn("The flag --copy-to is deprecated and will be removed in Spack 0.23")
if args.buildcache_destination:
tty.warn(
"The flag --buildcache-destination is deprecated and will be removed in Spack 0.23"
)
output_file = args.output_file
copy_yaml_to = args.copy_to
run_optimizer = args.optimize
@@ -264,12 +273,6 @@ def ci_rebuild(args):
if not ci_config:
tty.die("spack ci rebuild requires an env containing ci cfg")
tty.msg(
"SPACK_BUILDCACHE_DESTINATION={0}".format(
os.environ.get("SPACK_BUILDCACHE_DESTINATION", None)
)
)
# Grab the environment variables we need. These either come from the
# pipeline generation step ("spack ci generate"), where they were written
# out as variables, or else provided by GitLab itself.
@@ -277,6 +280,7 @@ def ci_rebuild(args):
job_log_dir = os.environ.get("SPACK_JOB_LOG_DIR")
job_test_dir = os.environ.get("SPACK_JOB_TEST_DIR")
repro_dir = os.environ.get("SPACK_JOB_REPRO_DIR")
# TODO: Remove this in Spack 0.23
local_mirror_dir = os.environ.get("SPACK_LOCAL_MIRROR_DIR")
concrete_env_dir = os.environ.get("SPACK_CONCRETE_ENV_DIR")
ci_pipeline_id = os.environ.get("CI_PIPELINE_ID")
@@ -285,9 +289,12 @@ def ci_rebuild(args):
job_spec_pkg_name = os.environ.get("SPACK_JOB_SPEC_PKG_NAME")
job_spec_dag_hash = os.environ.get("SPACK_JOB_SPEC_DAG_HASH")
spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE")
# TODO: Remove this in Spack 0.23
remote_mirror_override = os.environ.get("SPACK_REMOTE_MIRROR_OVERRIDE")
# TODO: Remove this in Spack 0.23
remote_mirror_url = os.environ.get("SPACK_REMOTE_MIRROR_URL")
spack_ci_stack_name = os.environ.get("SPACK_CI_STACK_NAME")
# TODO: Remove this in Spack 0.23
shared_pr_mirror_url = os.environ.get("SPACK_CI_SHARED_PR_MIRROR_URL")
rebuild_everything = os.environ.get("SPACK_REBUILD_EVERYTHING")
require_signing = os.environ.get("SPACK_REQUIRE_SIGNING")
@@ -344,21 +351,36 @@ def ci_rebuild(args):
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
deprecated_mirror_config = False
buildcache_destination = None
if "buildcache-destination" in pipeline_mirrors:
buildcache_destination = pipeline_mirrors["buildcache-destination"]
else:
deprecated_mirror_config = True
# TODO: This will be an error in Spack 0.23
# If no override url exists, then just push binary package to the
# normal remote mirror url.
# TODO: Remove in Spack 0.23
buildcache_mirror_url = remote_mirror_override or remote_mirror_url
if buildcache_destination:
buildcache_mirror_url = buildcache_destination.push_url
# Figure out what is our temporary storage mirror: Is it artifacts
# buildcache? Or temporary-storage-url-prefix? In some cases we need to
# force something or pipelines might not have a way to propagate build
# artifacts from upstream to downstream jobs.
# TODO: Remove this in Spack 0.23
pipeline_mirror_url = None
# TODO: Remove this in Spack 0.23
temp_storage_url_prefix = None
if "temporary-storage-url-prefix" in ci_config:
temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
pipeline_mirror_url = url_util.join(temp_storage_url_prefix, ci_pipeline_id)
# TODO: Remove this in Spack 0.23
enable_artifacts_mirror = False
if "enable-artifacts-buildcache" in ci_config:
enable_artifacts_mirror = ci_config["enable-artifacts-buildcache"]
@@ -454,12 +476,14 @@ def ci_rebuild(args):
# If we decided there should be a temporary storage mechanism, add that
# mirror now so it's used when we check for a hash match already
# built for this spec.
# TODO: Remove this block in Spack 0.23
if pipeline_mirror_url:
mirror = spack.mirror.Mirror(pipeline_mirror_url, name=spack_ci.TEMP_STORAGE_MIRROR_NAME)
spack.mirror.add(mirror, cfg.default_modify_scope())
pipeline_mirrors.append(pipeline_mirror_url)
# Check configured mirrors for a built spec with a matching hash
# TODO: Remove this block in Spack 0.23
mirrors_to_check = None
if remote_mirror_override:
if spack_pipeline_type == "spack_protected_branch":
@@ -477,7 +501,8 @@ def ci_rebuild(args):
)
pipeline_mirrors.append(remote_mirror_override)
if spack_pipeline_type == "spack_pull_request":
# TODO: Remove this in Spack 0.23
if deprecated_mirror_config and spack_pipeline_type == "spack_pull_request":
if shared_pr_mirror_url != "None":
pipeline_mirrors.append(shared_pr_mirror_url)
@@ -499,6 +524,7 @@ def ci_rebuild(args):
tty.msg("No need to rebuild {0}, found hash match at: ".format(job_spec_pkg_name))
for match in matches:
tty.msg(" {0}".format(match["mirror_url"]))
# TODO: Remove this block in Spack 0.23
if enable_artifacts_mirror:
matching_mirror = matches[0]["mirror_url"]
build_cache_dir = os.path.join(local_mirror_dir, "build_cache")
@@ -513,7 +539,8 @@ def ci_rebuild(args):
# only want to keep the mirror being used by the current pipeline as it's binary
# package destination. This ensures that the when we rebuild everything, we only
# consume binary dependencies built in this pipeline.
if full_rebuild:
# TODO: Remove this in Spack 0.23
if deprecated_mirror_config and full_rebuild:
spack_ci.remove_other_mirrors(pipeline_mirrors, cfg.default_modify_scope())
# No hash match anywhere means we need to rebuild spec
@@ -579,7 +606,11 @@ def ci_rebuild(args):
"SPACK_COLOR=always",
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(job_spec.format("{name}-{version}-{hash}")),
"install-deps/{}".format(
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
),
],
spack_cmd + ["install"] + root_install_args,
]
@@ -676,21 +707,25 @@ def ci_rebuild(args):
# print out some instructions on how to reproduce this build failure
# outside of the pipeline environment.
if install_exit_code == 0:
if buildcache_mirror_url or pipeline_mirror_url:
for result in spack_ci.create_buildcache(
input_spec=job_spec,
buildcache_mirror_url=buildcache_mirror_url,
pipeline_mirror_url=pipeline_mirror_url,
sign_binaries=spack_ci.can_sign_binaries(),
):
msg = tty.msg if result.success else tty.warn
msg(
"{} {} to {}".format(
"Pushed" if result.success else "Failed to push",
job_spec.format("{name}{@version}{/hash:7}", color=clr.get_color_when()),
result.url,
)
mirror_urls = [buildcache_mirror_url]
# TODO: Remove this block in Spack 0.23
if pipeline_mirror_url:
mirror_urls.append(pipeline_mirror_url)
for result in spack_ci.create_buildcache(
input_spec=job_spec,
destination_mirror_urls=mirror_urls,
sign_binaries=spack_ci.can_sign_binaries(),
):
msg = tty.msg if result.success else tty.warn
msg(
"{} {} to {}".format(
"Pushed" if result.success else "Failed to push",
job_spec.format("{name}{@version}{/hash:7}", color=clr.get_color_when()),
result.url,
)
)
# If this is a develop pipeline, check if the spec that we just built is
# on the broken-specs list. If so, remove it.


@@ -12,13 +12,13 @@
import spack.bootstrap
import spack.caches
import spack.cmd.common.arguments as arguments
import spack.cmd.test
import spack.config
import spack.repo
import spack.stage
import spack.store
import spack.util.path
from spack.cmd.common import arguments
from spack.paths import lib_path, var_path
description = "remove temporary build files and/or downloaded archives"


@@ -796,7 +796,9 @@ def names(args: Namespace, out: IO) -> None:
commands = copy.copy(spack.cmd.all_commands())
if args.aliases:
commands.extend(spack.main.aliases.keys())
aliases = spack.config.get("config:aliases")
if aliases:
commands.extend(aliases.keys())
colify(commands, output=out)
@@ -812,8 +814,10 @@ def bash(args: Namespace, out: IO) -> None:
parser = spack.main.make_argument_parser()
spack.main.add_all_commands(parser)
aliases = ";".join(f"{key}:{val}" for key, val in spack.main.aliases.items())
out.write(f'SPACK_ALIASES="{aliases}"\n\n')
aliases_config = spack.config.get("config:aliases")
if aliases_config:
aliases = ";".join(f"{key}:{val}" for key, val in aliases_config.items())
out.write(f'SPACK_ALIASES="{aliases}"\n\n')
writer = BashCompletionWriter(parser.prog, out, args.aliases)
writer.write(parser)
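To illustrate the new source of aliases (the values below are hypothetical): the completion script now serializes whatever `config:aliases` contains, rather than the hard-coded `spack.main.aliases` table.

    # e.g. in config.yaml:
    #   config:
    #     aliases:
    #       rm: remove
    #       inst: install
    aliases_config = {"rm": "remove", "inst": "install"}  # what config:aliases might return
    aliases = ";".join(f"{key}:{val}" for key, val in aliases_config.items())
    print(f'SPACK_ALIASES="{aliases}"')  # -> SPACK_ALIASES="rm:remove;inst:install"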


@@ -124,6 +124,33 @@ def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, deptype)
class ConfigScope(argparse.Action):
"""Pick the currently configured config scopes."""
def __init__(self, *args, **kwargs) -> None:
kwargs.setdefault("metavar", spack.config.SCOPES_METAVAR)
super().__init__(*args, **kwargs)
@property
def default(self):
return self._default() if callable(self._default) else self._default
@default.setter
def default(self, value):
self._default = value
@property
def choices(self):
return spack.config.scopes().keys()
@choices.setter
def choices(self, value):
pass
def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, values)
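A standalone sketch of the argparse pattern the `ConfigScope` action uses (toy choices and names, not Spack code): because argparse reads `.default` and `.choices` as plain attributes, exposing them as properties defers their evaluation until they are actually needed, which is why the parsers below can pass `default=lambda: ...`.

    import argparse

    class LazyScope(argparse.Action):
        """Toy version of the ConfigScope pattern: compute default/choices lazily."""

        @property
        def default(self):
            # A callable default is resolved only when argparse asks for it.
            return self._default() if callable(self._default) else self._default

        @default.setter
        def default(self, value):
            self._default = value

        @property
        def choices(self):
            return ("user", "site", "system")  # stand-in for spack.config.scopes().keys()

        @choices.setter
        def choices(self, value):
            pass  # ignore whatever argparse tries to assign

        def __call__(self, parser, namespace, values, option_string=None):
            setattr(namespace, self.dest, values)

    parser = argparse.ArgumentParser()
    parser.add_argument("--scope", action=LazyScope, default=lambda: "user")
    print(parser.parse_args([]).scope)                   # -> user (lambda resolved lazily)
    print(parser.parse_args(["--scope", "site"]).scope)  # -> site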
def _cdash_reporter(namespace):
"""Helper function to create a CDash reporter. This function gets an early reference to the
argparse namespace under construction, so it can later use it to create the object.
@@ -357,10 +384,11 @@ def install_status():
"--install-status",
action="store_true",
default=True,
help="show install status of packages\n\npackages can be: "
"installed [+], missing and needed by an installed package [-], "
"installed in an upstream instance [^], "
"or not installed (no annotation)",
help=(
"show install status of packages\n"
"[+] installed [^] installed in an upstream\n"
" - not installed [-] missing dep of installed package\n"
),
)
@@ -543,7 +571,7 @@ def add_concretizer_args(subparser):
)
def add_s3_connection_args(subparser, add_help):
def add_connection_args(subparser, add_help):
subparser.add_argument(
"--s3-access-key-id", help="ID string to use to connect to this S3 mirror"
)
@@ -559,6 +587,8 @@ def add_s3_connection_args(subparser, add_help):
subparser.add_argument(
"--s3-endpoint-url", help="endpoint URL to use to connect to this S3 mirror"
)
subparser.add_argument("--oci-username", help="username to use to connect to this OCI mirror")
subparser.add_argument("--oci-password", help="password to use to connect to this OCI mirror")
def use_buildcache(cli_arg_value):


@@ -0,0 +1,30 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
from typing import List
import llnl.util.tty as tty
import spack.cmd
import spack.spec
display_args = {"long": True, "show_flags": False, "variants": False, "indent": 4}
def confirm_action(specs: List[spack.spec.Spec], participle: str, noun: str):
"""Display the list of specs to be acted on and ask for confirmation.
Args:
specs: specs to be acted on
participle: action expressed as a participle, e.g. "uninstalled"
noun: action expressed as a noun, e.g. "uninstallation"
"""
tty.msg(f"The following {len(specs)} packages will be {participle}:\n")
spack.cmd.display_specs(specs, **display_args)
print("")
answer = tty.get_yes_or_no("Do you want to proceed?", default=False)
if not answer:
tty.msg(f"Aborting {noun}")
sys.exit(0)
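A usage sketch (the surrounding command is hypothetical; only `confirm_action` and its arguments come from the new file above): a destructive command displays the resolved specs and bails out cleanly unless the user confirms.

    import spack.cmd
    from spack.cmd.common.confirmation import confirm_action  # module path assumed

    def uninstall(parser, args):
        specs = spack.cmd.parse_specs(args.specs)
        if not args.yes_to_all:
            # Prints the spec list and calls sys.exit(0) if the user declines.
            confirm_action(specs, "uninstalled", "uninstallation")
        # ... proceed with the actual uninstall here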


@@ -7,15 +7,15 @@
import llnl.util.tty as tty
import spack.build_environment as build_environment
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.error
import spack.paths
import spack.spec
import spack.store
from spack import traverse
from spack import build_environment, traverse
from spack.cmd.common import arguments
from spack.context import Context
from spack.util.environment import dump_environment, pickle_environment
@@ -42,14 +42,14 @@ def setup_parser(subparser):
class AreDepsInstalledVisitor:
def __init__(self, context="build"):
if context not in ("build", "test"):
raise ValueError("context can only be build or test")
if context == "build":
def __init__(self, context: Context = Context.BUILD):
if context == Context.BUILD:
# TODO: run deps shouldn't be required for build env.
self.direct_deps = dt.BUILD | dt.LINK | dt.RUN
else:
elif context == Context.TEST:
self.direct_deps = dt.BUILD | dt.TEST | dt.LINK | dt.RUN
else:
raise ValueError("context can only be Context.BUILD or Context.TEST")
self.has_uninstalled_deps = False
@@ -76,7 +76,7 @@ def neighbors(self, item):
return item.edge.spec.edges_to_dependencies(depflag=depflag)
def emulate_env_utility(cmd_name, context, args):
def emulate_env_utility(cmd_name, context: Context, args):
if not args.spec:
tty.die("spack %s requires a spec." % cmd_name)
@@ -120,7 +120,7 @@ def emulate_env_utility(cmd_name, context, args):
hashes=True,
# This shows more than necessary, but we cannot dynamically change deptypes
# in Spec.tree(...).
deptypes="all" if context == "build" else ("build", "test", "link", "run"),
deptypes="all" if context == Context.BUILD else ("build", "test", "link", "run"),
),
)


@@ -14,6 +14,7 @@
import spack.compilers
import spack.config
import spack.spec
from spack.cmd.common import arguments
description = "manage compilers"
section = "system"
@@ -23,20 +24,30 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="compiler_command")
scopes = spack.config.scopes()
# Find
find_parser = sp.add_parser(
"find",
aliases=["add"],
help="search the system for compilers to add to Spack configuration",
)
mixed_toolchain_group = find_parser.add_mutually_exclusive_group()
mixed_toolchain_group.add_argument(
"--mixed-toolchain",
action="store_true",
default=sys.platform == "darwin",
help="Allow mixed toolchains (for example: clang, clang++, gfortran)",
)
mixed_toolchain_group.add_argument(
"--no-mixed-toolchain",
action="store_false",
dest="mixed_toolchain",
help="Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
)
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("compilers"),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("compilers"),
help="configuration scope to modify",
)
@@ -47,20 +58,15 @@ def setup_parser(subparser):
)
remove_parser.add_argument("compiler_spec")
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=None,
help="configuration scope to modify",
"--scope", action=arguments.ConfigScope, default=None, help="configuration scope to modify"
)
# List
list_parser = sp.add_parser("list", help="list available compilers")
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -69,9 +75,8 @@ def setup_parser(subparser):
info_parser.add_argument("compiler_spec")
info_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -86,7 +91,9 @@ def compiler_find(args):
# Below scope=None because we want new compilers that don't appear
# in any other configuration.
new_compilers = spack.compilers.find_new_compilers(paths, scope=None)
new_compilers = spack.compilers.find_new_compilers(
paths, scope=None, mixed_toolchain=args.mixed_toolchain
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, scope=args.scope, init_config=False)
n = len(new_compilers)


@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.config
from spack.cmd.common import arguments
from spack.cmd.compiler import compiler_list
description = "list available compilers"
@@ -12,13 +12,8 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
subparser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)


@@ -10,7 +10,6 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.cmd.common.arguments
import spack.config
import spack.environment as ev
import spack.repo
@@ -18,6 +17,7 @@
import spack.schema.packages
import spack.store
import spack.util.spack_yaml as syaml
from spack.cmd.common import arguments
from spack.util.editor import editor
description = "get and set configuration options"
@@ -26,14 +26,9 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
# User can only choose one
subparser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="config_command")
@@ -101,13 +96,13 @@ def setup_parser(subparser):
setup_parser.add_parser = add_parser
update = sp.add_parser("update", help="update configuration files to the latest format")
spack.cmd.common.arguments.add_common_arguments(update, ["yes_to_all"])
arguments.add_common_arguments(update, ["yes_to_all"])
update.add_argument("section", help="section to update")
revert = sp.add_parser(
"revert", help="revert configuration files to their state before update"
)
spack.cmd.common.arguments.add_common_arguments(revert, ["yes_to_all"])
arguments.add_common_arguments(revert, ["yes_to_all"])
revert.add_argument("section", help="section to update")
@@ -407,7 +402,9 @@ def config_prefer_upstream(args):
pkgs = {}
for spec in pref_specs:
# Collect all the upstream compilers and versions for this package.
pkg = pkgs.get(spec.name, {"version": [], "compiler": []})
pkg = pkgs.get(spec.name, {"version": []})
all = pkgs.get("all", {"compiler": []})
pkgs["all"] = all
pkgs[spec.name] = pkg
# We have no existing variant if this is our first added version.
@@ -418,8 +415,8 @@ def config_prefer_upstream(args):
pkg["version"].append(version)
compiler = str(spec.compiler)
if compiler not in pkg["compiler"]:
pkg["compiler"].append(compiler)
if compiler not in all["compiler"]:
all["compiler"].append(compiler)
# Get and list all the variants that differ from the default.
variants = []


@@ -5,6 +5,7 @@
import os
import re
import sys
import urllib.parse
import llnl.util.tty as tty
@@ -62,6 +63,10 @@ class {class_name}({base_class_name}):
# notify when the package is updated.
# maintainers("github_user1", "github_user2")
# FIXME: Add the SPDX identifier of the project's license below.
# See https://spdx.org/licenses/ for a list.
license("UNKNOWN")
{versions}
{dependencies}
@@ -167,6 +172,14 @@ def configure_args(self):
return args"""
class CargoPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for cargo-based packages"""
base_class_name = "CargoPackage"
body_def = ""
class CMakePackageTemplate(PackageTemplate):
"""Provides appropriate overrides for CMake-based packages"""
@@ -181,6 +194,14 @@ def cmake_args(self):
return args"""
class GoPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for Go-module-based packages"""
base_class_name = "GoPackage"
body_def = ""
class LuaPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for LuaRocks-based packages"""
@@ -570,28 +591,30 @@ def __init__(self, name, *args, **kwargs):
templates = {
"autotools": AutotoolsPackageTemplate,
"autoreconf": AutoreconfPackageTemplate,
"cmake": CMakePackageTemplate,
"bundle": BundlePackageTemplate,
"qmake": QMakePackageTemplate,
"maven": MavenPackageTemplate,
"scons": SconsPackageTemplate,
"waf": WafPackageTemplate,
"autotools": AutotoolsPackageTemplate,
"bazel": BazelPackageTemplate,
"bundle": BundlePackageTemplate,
"cargo": CargoPackageTemplate,
"cmake": CMakePackageTemplate,
"generic": PackageTemplate,
"go": GoPackageTemplate,
"intel": IntelPackageTemplate,
"lua": LuaPackageTemplate,
"makefile": MakefilePackageTemplate,
"maven": MavenPackageTemplate,
"meson": MesonPackageTemplate,
"octave": OctavePackageTemplate,
"perlbuild": PerlbuildPackageTemplate,
"perlmake": PerlmakePackageTemplate,
"python": PythonPackageTemplate,
"qmake": QMakePackageTemplate,
"r": RPackageTemplate,
"racket": RacketPackageTemplate,
"perlmake": PerlmakePackageTemplate,
"perlbuild": PerlbuildPackageTemplate,
"octave": OctavePackageTemplate,
"ruby": RubyPackageTemplate,
"makefile": MakefilePackageTemplate,
"intel": IntelPackageTemplate,
"meson": MesonPackageTemplate,
"lua": LuaPackageTemplate,
"scons": SconsPackageTemplate,
"sip": SIPPackageTemplate,
"generic": PackageTemplate,
"waf": WafPackageTemplate,
}
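For reference, a simplified sketch of how a clue table like the one in the next hunk is typically applied (the helper name and paths are made up): the first pattern that matches a file in the expanded archive picks the template key, with "generic" as the fallback.

    import re
    from typing import List

    CLUES = [
        (r"/CMakeLists\.txt$", "cmake"),
        (r"/Cargo\.toml$", "cargo"),
        (r"/go\.mod$", "go"),
        (r"/configure$", "autotools"),
    ]

    def guess_template(paths: List[str]) -> str:
        """Illustrative only: choose a key into `templates` from archive contents."""
        for pattern, template in CLUES:
            if any(re.search(pattern, path) for path in paths):
                return template
        return "generic"

    print(guess_template(["/stage/pkg-1.0/go.mod"]))    # -> go
    print(guess_template(["/stage/pkg-1.0/setup.py"]))  # -> generic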
@@ -674,6 +697,8 @@ def __call__(self, stage, url):
clues = [
(r"/CMakeLists\.txt$", "cmake"),
(r"/NAMESPACE$", "r"),
(r"/Cargo\.toml$", "cargo"),
(r"/go\.mod$", "go"),
(r"/configure$", "autotools"),
(r"/configure\.(in|ac)$", "autoreconf"),
(r"/Makefile\.am$", "autoreconf"),
@@ -823,6 +848,11 @@ def get_versions(args, name):
# Find available versions
try:
url_dict = spack.url.find_versions_of_archive(args.url)
if len(url_dict) > 1 and not args.batch and sys.stdin.isatty():
url_dict_filtered = spack.stage.interactive_version_filter(url_dict)
if url_dict_filtered is None:
exit(0)
url_dict = url_dict_filtered
except UndetectableVersionError:
# Use fake versions
tty.warn("Couldn't detect version in: {0}".format(args.url))
@@ -834,11 +864,7 @@ def get_versions(args, name):
url_dict = {version: args.url}
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
name,
first_stage_function=guesser,
keep_stage=args.keep_stage,
batch=(args.batch or len(url_dict) == 1),
url_dict, name, first_stage_function=guesser, keep_stage=args.keep_stage
)
versions = get_version_lines(version_hashes, url_dict)

Some files were not shown because too many files have changed in this diff.