Compare commits

...

558 Commits

Author SHA1 Message Date
jordialcaraz
f00c80eae8 patch tau@2.34 +rocm rocprofiler support 2025-05-19 16:54:33 -07:00
Barry
82978251fa variorum: fix ppc and arm builds 2025-05-13 08:58:54 -07:00
Todd Gamblin
a9c879d53e gdk-pixbuf: Use the official GNOME mirror. (#49690)
The `umea.se` mirror seems to have gone down (or at least is forbidden for now).

Revert the checksum changes in #47825; this points at the official GNOME mirror
instead of the prior two places we were getting `gdk-pixbuf` from.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 18:57:53 -06:00
Todd Gamblin
f42f59c84b concretizer: don't use clingo.Symbol for setup (#49650)
Since we moved from creating clingo symbols directly to constructing a pure string
representation of the program, we don't need to make `AspFunctions` into symbols before
turning them into strings. We can just write strings like clingo would.

This cuts about 25% off the setup time by avoiding an unnecessary round trip.

- [x] create strings directly from `AspFunctions`
- [x] remove unused `symbol()` method on `AspFunction`
- [x] setup no longer tries to call `symbol()`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Greg Becker <becker33@llnl.gov>

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2025-03-25 17:55:27 -07:00
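
A minimal sketch of the idea, using a toy `AspFunction` rather than Spack's real one: render each fact directly, the way clingo would print the symbol, instead of constructing a `clingo.Symbol` first.

```python
# Toy stand-in for Spack's AspFunction: __str__ writes the fact exactly
# as clingo would render the equivalent Symbol, so no Symbol is needed.
def asp_quote(arg):
    # clingo prints integers bare and strings in double quotes
    if isinstance(arg, int):
        return str(arg)
    return '"{}"'.format(str(arg).replace('"', '\\"'))

class AspFunction:
    def __init__(self, name, args):
        self.name = name
        self.args = args

    def __str__(self):
        return "{}({})".format(self.name, ",".join(asp_quote(a) for a in self.args))

print(AspFunction("version", ["zlib", "1.3.1"]))  # version("zlib","1.3.1")
print(AspFunction("depth", ["zlib", 2]))          # depth("zlib",2)
```
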
Jon Rood
313b7d4cdb nalu-wind: add version 2.2.2. (#49685) 2025-03-25 15:45:23 -07:00
Melven Roehrig-Zoellner
bd41863797 scorep: ensure gcc-plugin is built, patch gcc@14 (#49257)
* scorep: ensure gcc-plugin is built, patch gcc@14
* scorep: patch only to non-deprecated versions
2025-03-25 14:45:53 -07:00
Robert Maaskant
b0dba4ff5a yarn: add v4.6.0, v4.7.0 (#49177)
* yarn: v4.6.0
* py-ipympl: pin yarn to v1
* rstudio: pin yarn to v1
* yarn: add v4.7.0
2025-03-25 14:37:44 -07:00
Alec Scott
4ff43d7fa9 ci: future-proof for enabling GitHub merge queues later (#49665) 2025-03-25 10:07:37 -07:00
Jon Rood
c1df1c7ee5 trilinos: fix kokkos constraints for version 16 (#49643)
* trilinos: add equals sign to kokkos dependencies.

* Fix some license headers to pass style check.

* Generalize a bit.

* Generalize a bit more.

* datatransferkit: constrain to a maximum of trilinos@16.0.
2025-03-25 10:43:13 -06:00
arezaii
9ac6ecd5ba Chapel 2.4 (#49662)
* limit some patches by chapel version

* fix short output version if building main

* update patches, remove unneeded 'self' refs

* fix spack style

* update patches with changes from PR

* change py-protobuf to just protobuf dep

* add PR numbers for patches

* fix spack style

* update 2.4 sha256
2025-03-25 09:01:58 -07:00
Todd Gamblin
20ddb85020 setup-env.csh: Harden for people who like aliases (#49670)
A user had `grep` aliased to `grep -n`, which was causing `csh` setup to
fail due to number prefixes in `SPACK_ROOT`.

- [x] Prefix invocations of `grep` and `sed` (which are not builtin) with `\`
      to avoid any aliases.
- [x] Avoid using `dirname` altogether -- use csh's `:h` modifier (which does
      the same thing) instead.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 09:01:28 -07:00
Nicholas Sly
2ced87297d Add dbus patch for container builds. (#49402) 2025-03-24 19:00:39 -06:00
Piotr Sacharuk
aa00c3fe1f trilinos: Apply workaround for oneAPI compiler for problems with build (#49636)
* Fix problem at least with datatransferkit

* Include patch 11676 from trilinos

* Add patches for trilinos 13.4.1

* style check failed

* Update links for patches

* additional style check failed
2025-03-24 17:05:43 -07:00
psakievich
0158fc46aa Add recursive argument to spack develop (#46885)
* Add recursive argument to spack develop

This effort allows for a recursive develop call
which will traverse from the develop spec given back to the root(s)
and mark all packages along the path as develop.

If people are doing development across the graph, then paying
fetch and full rebuild costs every time spack develop is called
is unnecessary and expensive.

Also remove the constraint for concrete specs and simply take the
max(version) if a version is not given. This should default to the
highest infinity version which is also the logical best guess for
doing development.
2025-03-24 16:50:16 -07:00
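
A rough sketch of the traversal described above, assuming a plain dependent-map instead of Spack's spec graph (all names here are made up for illustration):

```python
from collections import deque

def packages_to_mark_develop(dependents, start):
    """Walk from the develop spec back to the root(s), collecting every
    package on the way; dependents maps a package to its direct dependents."""
    marked, queue = {start}, deque([start])
    while queue:
        pkg = queue.popleft()
        for parent in dependents.get(pkg, ()):
            if parent not in marked:
                marked.add(parent)
                queue.append(parent)
    return marked

# toy graph: app depends on libA, libA depends on libB
dependents = {"libB": ["libA"], "libA": ["app"], "app": []}
print(sorted(packages_to_mark_develop(dependents, "libB")))  # ['app', 'libA', 'libB']
```
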
Richard Berger
8ac826cca8 hip: add missing HIPCC_LINK_FLAGS_APPEND (#49436)
* hip: add missing HIPCC_LINK_FLAGS_APPEND

---------

Co-authored-by: rbberger <rbberger@users.noreply.github.com>
2025-03-24 13:58:47 -07:00
Teague Sterling
1b829a4a28 kentutils: add v478 (#49521) 2025-03-24 13:33:41 -07:00
Robert Maaskant
e2ed1c2308 py-pymoo: add v0.6.1.3 (#49603)
* py-pymoo: add v0.6.1.3
* py-pymoo: use a when context
* py-pymoo: group build only dependencies
2025-03-24 13:29:05 -07:00
Robert Maaskant
94b828add1 prometheus: improve dependency specs (#49175)
* prometheus: improve dependency specs
* fixup! prometheus: improve dependency specs
* prometheus: fix typo in nodejs dep
* prometheus: fix checksums
2025-03-24 13:27:10 -07:00
Eric Berquist
fd7dcf3a3f sst-core: fix linkage against ncurses, zlib, and HDF5 (#49152)
* sst-core: fix for > 14.0.0 requiring ncurses

* sst-core: backport fix for curses detection

* sst-core: ensure HDF5 is ignored if not specified

* sst-core: HDF5 integration is via C++

* sst-core: switch to with_or_without for configure

* sst-core: switch to enable_or_disable for configure

* sst-core: control memory pools and debug output with variants
2025-03-24 12:45:12 -07:00
Alec Scott
e3bb0d77bc hugo: add v0.145.0 (#49576) 2025-03-24 13:23:21 -06:00
Jon Rood
25761b13e5 kokkos-kernels: rewrite package to fix errors (#49598)
* kokkos-kernels: fix eti dependency statements.

* kokkos-kernels: rewrite package.

* Fix errors.

* Style.

* Style.

* Cleanup.
2025-03-24 12:25:20 -06:00
Stephen Nicholas Swatman
ae48faa83a detray: add v0.90.0-v0.93.0 (#49658)
This commit adds detray versions 0.90.0, 0.91.0, 0.92.0, and 0.93.0.
2025-03-24 10:13:03 -07:00
Afzal Patel
e15a3b0717 hip: fix hip-tests error (#49563) 2025-03-24 10:04:19 -07:00
Andrey Perestoronin
2c8afc5443 Add new 2025.1.0 release for intel-oneapi products (#49642)
* Add new versions of intel-oneapi products

* restore advisor 2025.0.0 release

* fix styling
2025-03-24 11:02:55 -06:00
Sreenivasa Murthy Kolam
99479b7e77 rocprofiler-sdk: new package (#49406)
* rocprofiler-sdk new package
* add license, rocm tag
2025-03-24 09:57:39 -07:00
psakievich
5d0b5ed73c EnvironmentModifications: fix reverse prepend/append (#49645)
pop a single item from the front / back respectively, instead of removing all instances
2025-03-24 17:29:27 +01:00
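
A list-based sketch of the corrected reversal semantics (stand-in functions, not Spack's `EnvironmentModifications` API): undoing a prepend pops one entry from the front, undoing an append pops one from the back.

```python
def reverse_prepend(path_list, value):
    # undo a prepend: pop a single matching item from the front only
    if path_list and path_list[0] == value:
        path_list.pop(0)
    return path_list

def reverse_append(path_list, value):
    # undo an append: pop a single matching item from the back only
    if path_list and path_list[-1] == value:
        path_list.pop()
    return path_list

paths = ["/opt/bin", "/usr/bin", "/opt/bin"]
print(reverse_prepend(paths, "/opt/bin"))  # ['/usr/bin', '/opt/bin']: second copy survives
```
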
Ryan Krattiger
151af13be2 Unit tests: error message when running parallel without xdist (#49632) 2025-03-24 09:25:45 -07:00
Alec Scott
93ea3f51e7 zig: add v0.14.0 (#49629)
* zig: add v0.14.0

* Fix commit hash

* Fix tag for v0.14.0
2025-03-24 08:07:34 -07:00
Alec Scott
a3abc1c492 Fix ci failures after merge of mock tests created before license transition (#49638) 2025-03-21 21:17:56 -06:00
Simon Pintarelli
401484ddf4 remove version prior 7.3 from SIRIUS (#49584) 2025-03-21 20:46:39 +01:00
Robert Maaskant
fc4e76e6fe py-setuptools-scm: fix deps (#49609) 2025-03-21 11:18:11 -07:00
Alec Scott
0853f42723 smee-client: add v3.1.1 (#49578) 2025-03-21 11:56:37 -06:00
Alec Scott
19ca69d0d8 typos: add v1.30.2 (#49577)
* typos: add v1.30.2

* Add rust dependency constraint
2025-03-21 11:56:00 -06:00
Alec Scott
036794725f bfs: add v4.0.6 (#49575) 2025-03-21 11:55:16 -06:00
Alec Scott
e5a2c9aee3 emacs: add v30.1 (#49574) 2025-03-21 11:54:31 -06:00
Alec Scott
5364b88777 fzf: add v0.60.3 (#49573) 2025-03-21 11:43:26 -06:00
Alec Scott
7d1b6324e1 npm: add v11.2.0 (#49572) 2025-03-21 11:42:45 -06:00
Alexandre DENIS
3d0263755e mpibenchmark: add v0.6 (#49612)
* mpibenchmark: add version 0.6
* mpibenchmark: fix syntax
* mpibenchmark: improve package description
2025-03-21 09:54:56 -07:00
Jon Rood
54ad5dca45 exawind: add versions and commits to tags (#49615)
* exawind: add versions and commits to tags.
* Add new version of TIOGA.
* openfast: add commits to tags.
* amr-wind: add dependencies.
* amr-wind: add more settings.

---------

Co-authored-by: jrood-nrel <jrood-nrel@users.noreply.github.com>
2025-03-21 09:49:37 -07:00
Lehman Garrison
ee206952c9 py-uv: add v0.6.8 (#49616) 2025-03-21 09:38:17 -07:00
Ryan Krattiger
4ccef372e8 E4S: Allow building newer ParaView for Linux CI (#47823)
5.11 was locked at a time when master was building by default. Allow
building newer ParaView in CI.
2025-03-21 09:07:37 -07:00
Jon Rood
ac6e534806 openfast: patch versions to fix openmp bug. (#49631) 2025-03-21 09:06:56 -07:00
Greg Becker
5983f72439 fix extendee_spec for transitive dependencies on potential extendees (#48025)
* fix extendee_spec for transitive dependencies on potential extendees

* regression test

* resolve conditional extensions on direct deps

* remove outdated comment

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-03-21 08:27:51 -07:00
Stephen Sachs
6e10fac7ae openfoam: restrict the CGAL version compatible with C++14 (#47689)
* openfoam: restrict the CGAL version compatible with C++14

CGAL throws an
[error](50219fc33b/Installation/include/CGAL/config.h (L147))
if a C++ standard lower than 17 is used, while OpenFOAM [forces
C++14](https://develop.openfoam.com/Development/openfoam/-/blob/develop/wmake/rules/General/Gcc/c++?ref_type=heads#L9).
This hard C++17 dependency was
[introduced](e54408370b)
in CGAL version 6.

* Add upper bound since openfoam now uses c++17

44f7a7268a
2025-03-21 08:23:33 -07:00
Derek Ryan Strong
ee6ea5155c Add libjpeg-turbo v3.0.4 (#48030) 2025-03-21 08:22:01 -07:00
Cyrus Harrison
48258e8ddc conduit: add v0.9.3 (#48736)
* add 0.9.3 release, fix license listed
* fix sha
2025-03-21 08:20:50 -07:00
Robert Maaskant
429b0375ed yarn: v1.22.22 (#49171) 2025-03-21 08:12:13 -07:00
Robert Maaskant
c6925ab83f new package: py-loky (#49602) 2025-03-21 08:06:36 -07:00
Wouter Deconinck
00d78dfa0c pythia8: add v8.313 (#49045)
* pythia8: add v8.313

* pythia8: conflicts ~yoda +rivet only when @8.313:
2025-03-21 07:59:55 -07:00
Wouter Deconinck
e072a91572 libx11: add v1.8.11 (#48863) 2025-03-21 07:58:12 -07:00
Wouter Deconinck
b7eb0308d4 node-js: run tests with target test-only (#49516) 2025-03-21 07:52:26 -07:00
Wouter Deconinck
c98ee6d8ac eigen: build test executables when self.run_tests (#49540) 2025-03-21 07:50:54 -07:00
Wouter Deconinck
b343ebb64e qt-base: pass SBOM PATH from cmake_args (#49596)
* qt-base: pass SBOM PATH from cmake_args

* qt-base: self.define from list

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2025-03-21 07:50:09 -07:00
Adam J. Stewart
e178d2c75d py-torchmetrics: add v1.7.0 (#49633) 2025-03-21 07:44:47 -07:00
Matt Thompson
9b64560ae6 mapl: add v2.53.3, v2.54.2 (#49610) 2025-03-21 07:18:22 -07:00
David Ozog
ca226f3506 sos: (and tests-sos:) update to v1.5.3, add main branch (#49613)
* sos/tests-sos: update to v1.5.3 & add main branch

* [@spackbot] updating style on behalf of davidozog

* sos: cleanup try/except around cloning tests

---------

Co-authored-by: davidozog <davidozog@users.noreply.github.com>
2025-03-21 09:28:16 -04:00
Caetano Melone
8569e04fea py-ruff: add v0.11.1 (#49617)
* py-ruff: add v0.11.1

Add latest version and update minimum supported rust version for 0.9.8
and up.

[before](https://github.com/astral-sh/ruff/blob/0.9.7/Cargo.toml#L7) and
[after](https://github.com/astral-sh/ruff/blob/0.9.8/Cargo.toml#L7)

* minimum rust version

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-03-21 09:14:13 +01:00
Robert Maaskant
32213d5e6b fix: spack audit issues (#49557) 2025-03-20 22:41:15 -07:00
Paul R. C. Kent
4891f3dbc9 rmgdft: add develop version (#49558) 2025-03-20 22:39:26 -07:00
Anderson Chauphan
2b5959c3dd trilinos: add v16.1.0 (#49628)
Signed-off-by: Anderson Chauphan <achauph@sandia.gov>
2025-03-20 22:37:43 -07:00
Suzanne Prentice
353db6752a ruby: add v3.2.5 (#49537) 2025-03-20 22:34:45 -07:00
Adam J. Stewart
bf24b8e82c py-lightning: add v2.5.1 (#49600) 2025-03-21 01:31:17 -04:00
psakievich
f2d830cd4c Get env_var mods from config (#49626) 2025-03-20 21:48:50 -05:00
brian-kelley
070bfa1ed7 KokkosKernels: apply PR 2296 as patch (#49627)
Applies this fix to all affected versions (4.0.00:4.4.00).
Fixes issue #49622.

Signed-off-by: Brian Kelley <bmkelle@sandia.gov>
2025-03-20 20:13:00 -06:00
Alec Scott
c79b6207e8 ci: add automatic checksum verification check (#45063)
Add a CI check to automatically verify the checksums of newly added
package versions:
    - [x] a new command, `spack ci verify-versions`
    - [x] a GitHub actions check to run the command
    - [x] tests for the new command

This also eliminates the suggestion for maintainers to manually verify added
checksums in the case of accidental version <--> checksum mismatches.

----

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-20 22:58:14 +01:00
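
For illustration, the kind of check such verification boils down to; this is a hedged sketch, not the actual `spack ci verify-versions` code:

```python
import hashlib

def sha256_of(path, chunk=1 << 20):
    # hash the archive in chunks so large tarballs don't sit in memory
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def checksum_matches(archive_path, declared_sha256):
    # an accidental version <-> checksum mismatch shows up as False here
    return sha256_of(archive_path) == declared_sha256
```
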
Wouter Deconinck
38d77570b4 qt-tools: conflicts +assistant when 6.8.2 (#49605) 2025-03-20 16:27:44 -04:00
Piotr Sacharuk
d8885b28fa paraview: Apply workarounds for oneAPI compiler for paraview problem with build (#48892)
* Apply workarounds for oneAPI compiler for paraview problem with build

* add source of provided patches
2025-03-20 12:44:56 -05:00
eugeneswalker
abd3487570 dyninst: %gcc only required for versions <13 (#49599) 2025-03-20 09:27:11 -07:00
Richard Berger
0d760a5fd8 libfuse: fix aarch64 compile for 2.x (#47846) 2025-03-20 13:42:51 +01:00
Felix Thaler
dde91ae181 Added btop 1.4.0 (#49586) 2025-03-19 19:04:07 -06:00
Krishna Chilleri
590dbf67f3 py-cwl-utils: add v0.37 and py-schema-salad: add v8.8.20250205075315 (#49566)
* add new version

* add v8.8.20250205075315 to py-schema-salad

* Modify range to open ended

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Add open ended dependency version range

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* fix flake8 error

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-19 17:46:00 -05:00
Christoph Junghans
d199738f31 lfortran: add v0.49.0 (#49565)
* lfortran: add v0.49.0
* add v0.19.0 url as version directive argument
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-03-19 14:39:08 -06:00
snehring
f55f829437 topaz: add v0.3.7 (#49178)
* topaz: add v0.3.7
   Signed-off-by: Shane Nehring <snehring@iastate.edu>
* topaz: add older version url
   Signed-off-by: Shane Nehring <snehring@iastate.edu>

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-03-19 12:22:49 -07:00
snehring
295f3ff915 sentieon-genomics: updating checksums for patch (#48635)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-03-19 11:56:38 -07:00
Harmen Stoppels
a0ad02c247 solver: improve error message when single-valued variant cannot be satisfied (#49583) 2025-03-19 19:44:45 +01:00
Krishna Chilleri
a21d314ba7 py-cachecontrol: add v0.14.0 (#49564)
* add new version

* Update var/spack/repos/builtin/packages/py-cachecontrol/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-19 12:30:55 -05:00
Teague Sterling
a4ad8c8174 plink2: add v2.0.0-a.6.9 (#49520)
* Adding additional versions to plink2 and switching to tarballs to allow for better version detection in the future
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* plink2: add v2.0.0-a.6.9
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* Prepend "v" to version in url_for_version()
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-03-19 10:59:26 -06:00
Lehman Garrison
aa3ee3fa2a py-asdf: add v4.1.0 and related (#49454)
* py-asdf-transform-schemas: add v0.5.0
* py-asdf-coordinates-schemas: add new package at v0.3.0
* py-asdf-astropy: add new package at v0.7.1
* py-asdf: add v4.1.0
2025-03-19 08:09:05 -07:00
germanne
a8584d5eb4 asp.py: abs_control_files shouldn't ask for write rights (#49591) 2025-03-19 15:19:40 +01:00
Massimiliano Culpo
26f7b2c066 builtin: replace self.spec[self.name] with self (take 2) (#49579)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-19 12:28:57 +01:00
Massimiliano Culpo
3a715c3e07 python: remove self.spec["python"] from recipe (#49581)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-19 12:23:53 +01:00
Harmen Stoppels
963519d2b2 builtin: self.spec[self.name].command -> self.command (#49582)
* builtin: self.spec[self.name].command -> self.command

* python-venv: ensure return type is Executable instead of Executable | None
2025-03-19 11:37:01 +01:00
Krishna Chilleri
34efcb686c add new version (#49562) 2025-03-19 11:06:22 +01:00
Harmen Stoppels
5016084213 Move default implementation of pkg.command to PackageBase (#49580) 2025-03-19 09:28:29 +00:00
Massimiliano Culpo
5a04e84097 solver: allow prefer and conflict on virtuals in packages config (#45017) 2025-03-19 09:53:05 +01:00
Massimiliano Culpo
ec34e88d79 builtin: replace self.spec[self.name] by self (#49567)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-19 08:56:55 +01:00
Massimiliano Culpo
31fa12ebd3 perl: remove self references (#49560)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-19 08:54:41 +01:00
Harmen Stoppels
ecf414ed07 docs: add strong preferences to faq (#49556) 2025-03-19 08:53:14 +01:00
Rocco Meli
119bec391e nvpl-scalapack: new package (#49234)
* nvpl-scalapack

* rm variant

* nvpl-scalapack

* Apply suggestions from code review

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>

* mpi

---------

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
2025-03-19 08:47:26 +01:00
Juan Miguel Carceller
d5c0ace993 simsipm: add a new version and a variant for setting the C++ standard (#49554)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2025-03-19 01:03:29 -06:00
Teague Sterling
d6bbd8f758 vep-cache: update for vep@113.3 (#49517)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-19 01:03:10 -06:00
Mateusz Jakub Fila
f74d51bf6e fastjet: add v3.4.3 (#49526) 2025-03-19 00:48:24 -06:00
Davis Herring
821ebee53c flecsi: remove (Par)METIS dependency in 2.3.2 (#49480)
* Remove (Par)METIS dependency for new version
* Fix version constraints
2025-03-19 00:48:09 -06:00
Adrien Bernede
9dada76d34 Update hip support in radiuss packages leveraging blt@0.7.0 (#49488)
Co-authored-by: Chris White <white238@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-19 07:36:52 +01:00
Robert Maaskant
e9cc1b36bc kubernetes: add v1.30.0 -> v1.32.3 (#49211)
* kubernetes: add new versions

* kubernetes: add v1.30.11, v1.31.7, v1.32.3

* kubernetes: remove new deprecated versions and refactor build deps
2025-03-18 18:54:12 -06:00
David--Cléris Timothée
fd2c040981 hipsycl: rework llvm compatibility matrix (#49507)
* [hipsycl] add llvm 20 conflict
* add llvm matrix support & add 24.10 release

---------

Co-authored-by: tdavidcl <tdavidcl@users.noreply.github.com>
2025-03-18 15:54:03 -07:00
Robert Maaskant
33cd7d6033 kubectl: add v1.30.0 -> v1.32.3 (#49082)
* kubectl: add all versions currently supported upstream

* kubectl: build same way as kubernetes

* kubectl: revert back to GoPackage

* kubectl: fix version command

* kubectl: add v1.30.11, v1.31.7, v1.32.3

* kubectl: remove new deprecated versions

* kubectl: refactor build deps
2025-03-18 10:16:57 -07:00
Alex Richert
9c255381b1 parallelio: set WITH_PNETCDF from +/~pnetcdf (#49548) 2025-03-18 05:03:55 -06:00
SXS Bot
fd6c419682 spectre: add v2025.03.17 (#49533)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2025-03-18 05:03:37 -06:00
Robert Maaskant
9d1d808f94 py-tqdm: add v4.66.4 -> v4.67.1 (#49525) 2025-03-18 00:06:13 -06:00
Axel Huebl
7a0ef93332 WarpX: Remove Deprecated Versions (#46765)
* WarpX: Remove Deprecated Versions

* Conflict: WarpX SYCL RZ FFT Issue

Conflict out on WarpX issue until fixed
https://github.com/BLAST-WarpX/warpx/issues/5774

* Fix #49546
2025-03-17 21:40:38 -06:00
Teague Sterling
bf48b7662e wasi-sdk-prebuilt: add v25.0,v24.0,v23.0 (#49523)
* wasi-sdk-prebuilt: add v25.0,24.0,23.0

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-17 20:13:34 -06:00
Teague Sterling
d14333cc79 libgtop: add v2.41.1-2.41.3 (#49524)
* libgtop: new package
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* Adding pkgconfig dep
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* Adding note about https://github.com/spack/spack/pull/44323
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* libgtop: add v2.41.3
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-17 19:41:42 -06:00
Teague Sterling
084361124e vep: add v113.3 (#49518) 2025-03-17 19:33:42 -06:00
Lehman Garrison
a1f4cc8b73 py-corrfunc: add new package at v2.5.3 (#49502) 2025-03-17 16:43:50 -07:00
Teague Sterling
b20800e765 awscli-v2: add v2.24.24 (#49519)
* awscli-v2: add v2.24.24

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-17 16:24:21 -07:00
Fabio Durastante
01b1e24074 psblas: new package (#49423)
* Package for installing the PSBLAS library
* Put some version as deprecated
* Removed FIXME comments
* Added missing :
* Fixed style to comply with flake8
* Other round of style fixes to comply with flake8
* Used black to reformat the file
* Fixed typo
   Co-authored-by: Luca Heltai <luca.heltai@unipi.it>
* Added explicit .git extension
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Fixed typo on METIS string
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Added url before removing urls from the version directives.
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Corrected typo in url.
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Reordered variant and depend_on, removed deprecated and old software versions

---------

Co-authored-by: Luca Heltai <luca.heltai@unipi.it>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-03-17 10:47:59 -07:00
Harmen Stoppels
8029279dad gcc: drop redundant --with-ld and --with-as configure flags (#49538)
it's redundant because our spec file already adds -B, and it breaks -fuse-ld=
2025-03-17 18:35:23 +01:00
Greg Sjaardema
5f4e12d8f2 seacas: add 2025-03-13 (bug fix, new functionality, portability) (#49474)
* seacas: bug fix, new functionality, portability
* Get new checksum due to moving tag forward...
2025-03-17 15:44:17 +00:00
Satish Balay
a8728e700b petsc4py, slepc4py: update homepage, add maintainers (#49383) 2025-03-17 16:19:34 +01:00
Wouter Deconinck
f8adf2b70f libunwind: variant component value setjump -> setjmp (#49508) 2025-03-17 16:12:59 +01:00
Andy Porter
d0ef2d9e00 py-fparser: add v0.2.0 (#47807)
* Update to release 0.2.0

* #47087 updates for review
2025-03-17 11:19:55 +01:00
Melven Roehrig-Zoellner
d4bd3e298a cgns: patch for include path for 4.5 (#49161)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-17 11:19:13 +01:00
sbstndb/sbstndbs
40268634b6 xsimd: add v9.0.1 -> 13.1.0 (#49156) 2025-03-17 11:18:24 +01:00
sbstndb/sbstndbs
b0e8451d83 xtl: add v0.7.7 (#49157) 2025-03-17 11:16:40 +01:00
Massimiliano Culpo
868a52387b Revert "py-flowcept: add py-flowcept package (#47745)" (#49528)
This reverts commit 3fe89115c2.
2025-03-17 11:10:19 +01:00
Matthieu Dorier
3fe89115c2 py-flowcept: add py-flowcept package (#47745)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 11:05:51 +01:00
Wouter Deconinck
412024cf21 git: add v2.48.1 and friends (#49061)
* git: add v2.47.1, v2.48.1

* git: deprecate older versions

* fixed incorrect sha256 for git-manpages-2.47.2.tar.gz listed at https://mirrors.edge.kernel.org/pub/software/scm/git/sha256sums.asc (#49095)

---------

Co-authored-by: Jennifer Green <jkgreen@sandia.gov>
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2025-03-17 11:02:40 +01:00
Hariharan Devarajan
91b20ed7d0 pydftracer, brahma: add new releases (#49245) 2025-03-17 11:00:43 +01:00
Robert Maaskant
0caacc6e21 py-wheel: add v0.41.3 -> v0.45.1 (#49238) 2025-03-17 10:59:23 +01:00
Robert Maaskant
651126e64c openssl: add v3.4.1 and backports (#49250)
Release notes:
- https://github.com/openssl/openssl/blob/openssl-3.4/CHANGES.md#changes-between-340-and-341-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.3/CHANGES.md#changes-between-332-and-333-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.2/CHANGES.md#changes-between-323-and-324-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.1/CHANGES.md#changes-between-317-and-318-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.0/CHANGES.md#changes-between-3015-and-3016-11-feb-2025
2025-03-17 10:57:54 +01:00
Wouter Deconinck
e15a530f32 py-onnxruntime: use CudaPackage (#47684) 2025-03-17 10:35:20 +01:00
Martin Lang
0f84623914 elpa: add 2024.05.001, 2025.01.001 (#49335) 2025-03-17 10:33:23 +01:00
Kin Fai Tse
90afa5c5ef openfoam: add v2406, v2412, fix minor link deps (#49254) 2025-03-17 10:32:15 +01:00
Alberto Sartori
024620bd7b justbuild: add v1.5.0 (#49343) 2025-03-17 09:59:49 +01:00
Wouter Deconinck
9bec8e2f4b py-setuptools-scm-git-archive: add v1.4.1 (#49347) 2025-03-17 09:50:20 +01:00
Dave Keeshan
18dd465532 verible: Add v0.0.3946 (#49362) 2025-03-17 09:47:00 +01:00
Satish Balay
a2431ec00c mpich: add v4.3.0 (#49375) 2025-03-17 09:39:18 +01:00
MatthewLieber
78abe968a0 mvapich: add v4.0 and update default pmi version (#49399)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2025-03-17 01:38:19 -07:00
Wouter Deconinck
38e9043b9e yoda: add v2.1.0; rivet: add v4.1.0 (#49382) 2025-03-17 09:37:42 +01:00
Fernando Ayats
a0599e5e27 py-chex: add 0.1.89, py-optax: add 0.2.4(#49388) 2025-03-17 09:34:42 +01:00
George Young
1cd6f4e28f py-macs3: add @3.0.3 (#49365)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2025-03-17 09:30:08 +01:00
Eric Berquist
d2298e8e99 SST: update package maintainers (#49392) 2025-03-17 09:28:39 +01:00
Robert Maaskant
e3806aeac5 py-setuptools: add v75.8.1 -> v76.0.0 (#49251)
* py-setuptools: add v75.8.1, v75.8.2

Release notes:
- https://setuptools.pypa.io/en/stable/history.html#v75-8-1
- https://setuptools.pypa.io/en/stable/history.html#v75-8-2

* py-setuptools: add v75.9.1, v76.0.0
2025-03-17 09:26:41 +01:00
Seth R. Johnson
38309ced33 CLI11: new versions, PIC option (#49397) 2025-03-17 09:25:10 +01:00
Robert Maaskant
2f21201bf8 util-linux-uuid: add v2.40.3, v2.40.4 (#49441) 2025-03-17 09:16:19 +01:00
Matt Thompson
95a0f1924d openmpi: fix internal-libevent variant (#49463) 2025-03-17 09:06:43 +01:00
Olivier Cessenat
52969dfa78 gsl: add external find (#48665) 2025-03-17 09:05:08 +01:00
Massimiliano Culpo
ee588e4bbe chameleon: update to use oneapi packages (#49498)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:50:05 +01:00
Massimiliano Culpo
461f1d186b timemory: update to use oneapi packages (#49305)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:48:57 +01:00
Massimiliano Culpo
03b864f986 ghost: remove outdated comments (#49501)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:47:50 +01:00
Harmen Stoppels
bff4fa2761 spec.py: include test deps in dag hash, remove process_hash (take two) (#49505)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-17 08:47:09 +01:00
Massimiliano Culpo
ad3fd4e7e9 fleur: update to use oneapi packages (#49500)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:46:50 +01:00
Massimiliano Culpo
a574a995f8 converge: remove package (#49499)
The package was added in 2017 and never updated
substantially. It requires users to log in to
a platform to download the code.

Thus, instead of updating it to new versions and adding
support for oneAPI, remove the package.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:46:32 +01:00
Massimiliano Culpo
0002861daf camx: update to use oneapi packages (#49497)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:44:45 +01:00
Massimiliano Culpo
a65216f0a0 dftfe: update to use oneapi packages (#49430)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:44:30 +01:00
Sebastian Pipping
7604869198 expat: add v2.7.0 with security fixes + deprecate vulnerable 2.6.4 (#49481) 2025-03-17 08:31:56 +01:00
afzpatel
d409126c27 hip: apply LLVM_ROOT and Clang_ROOT args only when installing hip+rocm (#49368) 2025-03-17 07:37:31 +01:00
Lehman Garrison
2b0d985714 py-numexpr: add v2.10.2 (#49490) 2025-03-17 07:07:48 +01:00
Wouter Deconinck
eedec51566 dcap: add test dependency on cunit (#49510) 2025-03-17 06:56:01 +01:00
Seth R. Johnson
016954fcff vecgeom: new dev tag (#49511)
* vecgeom: add surface-dev 2

* Update hash and mark as deprecated
2025-03-17 06:54:43 +01:00
Adam J. Stewart
0f17672ddb py-numpy: add v2.2.4 (#49512) 2025-03-17 06:47:18 +01:00
Harmen Stoppels
f82de718cd Revert "spec.py: include test deps in dag hash, remove process_hash (#48936)" (#49503)
This reverts commit 2806ed2751.
2025-03-15 22:56:03 +01:00
Todd Gamblin
4f6836c878 bugfix: Scopes shouldn't dynamically maintain include lists (#49494)
Fixes #49403.

When one scope included another, we were appending to a list stored on the scope to
track what was included, and we would clear the list when the scope was removed.

This assumes that the scopes are always strictly pushed then popped, but the order can
be violated when serializing config scopes across processes (and then activating
environments in subprocesses), or if, e.g., instead of removing each scope we simply
cleared the list of config scopes. Removal can be skipped, which can cause the list of
includes on a cached scope (like the one we use for environments) to grow every time it
is pushed, and this triggers an assertion error.

There isn't actually a need to construct and destroy the include list. We can just
compute it once and cache it -- it's the same every time.

- [x] Cache included scope list on scope objects
- [x] Do not dynamically append/clear the included scope list

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-15 13:21:36 -07:00
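
A sketch of the caching strategy, using `functools.cached_property` on an illustrative class (Spack's real scope objects may cache differently):

```python
from functools import cached_property

class Scope:
    def __init__(self, name, include_paths):
        self.name = name
        self._include_paths = tuple(include_paths)

    @cached_property
    def includes(self):
        # computed once from static data; repeated push/pop of the scope
        # can no longer grow or clear this list
        return list(self._include_paths)

s = Scope("env", ["a.yaml", "b.yaml"])
assert s.includes is s.includes  # same cached object every time
```
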
Harmen Stoppels
2806ed2751 spec.py: include test deps in dag hash, remove process_hash (#48936) 2025-03-15 13:12:51 -07:00
Richard Berger
92b0cb5e22 llvm: add v20.1.0 (#49456) 2025-03-15 07:05:20 -06:00
Ryan Krattiger
f32b5e572a ci: remove --keep-stage flag (#49467)
logs are now copied from the install dir
2025-03-15 09:41:25 +01:00
Asa
e35c5ec104 module generation: make package tags accessible in template (#48213) 2025-03-14 15:08:14 -07:00
Harmen Stoppels
60be77f761 spack style --spec-strings: fix non-str constant issue (#49492) 2025-03-14 21:54:02 +01:00
John W. Parent
69b7c32b5d MSVC: Restore amalgamated compiler functionality (#46678)
Right now the Spack %msvc compiler is inherently a hybrid compiler
that uses Intel's oneAPI fortran compiler.

This was addressed in Spacks MSVC compiler class, but detection has
since stopped using the compiler class, so this PR moves the logic
into the `msvc` compiler package (does not delete the original code
because that is handled in #45189).

This includes a change to the general detection logic to deprioritize
paths that include a symlink anywhere in the path, in order to prefer
"2025.0/bin" over "latest/bin" for the oneAPI compiler.
2025-03-14 13:36:41 -07:00
Rocco Meli
e2c6914dfe cp2k: add dependencies (#49489)
* update

* make comment a message

* [@spackbot] updating style on behalf of RMeli

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2025-03-14 13:18:33 -06:00
Harmen Stoppels
87926e40a9 style.py: add spack style --spec-strings for compat with v1.0 (#49485)
* style.py: add spack style --spec-strings for compat with v1.0

* add --fix also, and avoid infinite recursion and too large files

* tests: check identify and check edit files
2025-03-14 19:10:39 +00:00
Diego Alvarez S.
324d733bf9 Add nextflow 24.10.5 (#49390) 2025-03-14 10:55:38 -05:00
sbstndb/sbstndbs
07bf35d54b samurai: new package (#49144)
* samurai: new package

	-	Add samurai: an HPC library of mesh and physics

* Update var/spack/repos/builtin/packages/samurai/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Update var/spack/repos/builtin/packages/samurai/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Update var/spack/repos/builtin/packages/samurai/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Update var/spack/repos/builtin/packages/samurai/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Update var/spack/repos/builtin/packages/samurai/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Remove Whitespace

	-	Remove whitespace for spack style check

* Update var/spack/repos/builtin/packages/samurai/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Add tags

	-	Add tags for the last versions of samurai
	-	All tags are tested and work properly
	-	Add maintainers ("gouarin" - the samurai project lead and "sbstndb" - me, working on samurai)
	-	Add licence

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-03-14 09:56:01 -04:00
Vanessasaurus
72196ee4a1 Automated deployment to update package flux-sched 2025-03-13 (#49451)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-03-14 09:54:59 -04:00
Rocco Meli
738e41d8d2 mc main (#49476) 2025-03-14 09:52:05 -04:00
Daryl W. Grunau
f3321bdbcf draco: unify variant nomenclature with other spackages (#49479)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2025-03-14 09:51:15 -04:00
snehring
9c6f0392d5 Revert "Add package libglvnd (#49214)" (#49478)
This reverts commit 682e4bf4d4.
2025-03-14 09:50:29 -04:00
Robert Maaskant
297848c207 libxcrypt: add v4.4.36, v4.4.38 (#49405)
* libxcrypt: add v4.4.36, v4.4.38

* libxcrypt: explain why 4.4.37 is absent

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-14 07:56:30 -05:00
Adam J. Stewart
e9c2a53d83 py-torchmetrics: add v1.6.3 (#49483) 2025-03-14 07:54:23 -05:00
Sébastien Valat
77b6923906 malt: Add version 1.2.5 (#49484) 2025-03-14 07:50:29 -05:00
psakievich
8235aa1804 Trilinos launch blocking + maintainers (#49468)
* Trilinos launch blocking + maintainers

CUDA launch blocking is not needed and is slowing modern apps down.

More maintainers to spot issues like this.

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>
2025-03-14 02:08:30 -06:00
eugeneswalker
d09c5a4bd4 e4s cray rhel: petsc: require +batch (#49472) 2025-03-14 05:39:02 +00:00
Alex Richert
916755e22a crtm: disable testing if not self.run_tests (#49469)
* crtm: disable testing if not self.run_tests
* Update package.py
2025-03-13 23:23:51 -06:00
Olivier Cessenat
3676381357 dmtcp: add 3.2.0 (#49465) 2025-03-13 22:59:21 -06:00
Martin Lang
de9f92c588 Patch bug in elpa's cpp (#49462)
ELPA's custom preprocessor creates temporary files for which it
assembles long filenames and then uses the last 250 characters. This
results in compilation errors when the first character happens to be a
dash.
2025-03-13 21:36:06 -06:00
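
A tiny demonstration of the failure mode, with synthetic names rather than ELPA's actual generated filenames:

```python
# a long assembled name whose last 250 characters begin with a dash
long_name = "p" * 300 + "-" + "s" * 249     # 550 characters total
truncated = long_name[-250:]                # ELPA-style truncation
assert truncated.startswith("-")            # now looks like a compiler flag
```
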
Martin Lang
6ba7aa325b Slurm: extend spack external find support (#47740)
* Slurm: extend spack external find support

On Debian srun/salloc --version returns 'slurm-wlm VERSION'. Check for both strings and return the first match.

* non-capturing group for slurm determine_version

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* slurm: add detection test

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-13 21:35:49 -06:00
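
A sketch of detection logic along these lines (the real code lives in the slurm package recipe; this regex is illustrative):

```python
import re

# matches both "slurm 23.11.4" and Debian's "slurm-wlm 22.05.8",
# using a non-capturing group so group(1) is always the version
_VERSION_RE = re.compile(r"slurm(?:-wlm)?\s+(\d[\d.]*)")

def determine_version(output):
    match = _VERSION_RE.search(output)
    return match.group(1) if match else None

print(determine_version("slurm-wlm 22.05.8"))  # 22.05.8
print(determine_version("slurm 23.11.4"))      # 23.11.4
```
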
Mikael Simberg
c0cbbcfa0a mold: Add 2.37.1 (#49458) 2025-03-13 21:21:00 -06:00
Robert Maaskant
f2dc4ed6d3 npm: update v9 add v10 add v11 (#49181) 2025-03-13 14:24:14 -07:00
ddement
38bf1772a0 Added dependency on netcdf-c and removed need for basker variant on Trilinos (#49442) 2025-03-13 14:13:00 -07:00
Vanessasaurus
3460602fb9 flux-sched: add v0.43.0 (#49299)
* Automated deployment to update package flux-sched 2025-03-05

* add back patch from today

---------

Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-03-13 16:44:16 -04:00
Teague Sterling
a6ce7735e6 duckdb: add v1.2.1 remove 0.9.0-0.10.3 (deprecated) (#49356)
* duckdb: add v1.2.1 remove 0.9.0-0.10.3 (deprecated)

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Add 0.10.3 and 0.10.2 back in

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-13 16:43:37 -04:00
snehring
4b11266e03 molden: add v7.3 (#49205)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-03-13 13:38:22 -07:00
Doug Jacobsen
436ff3c818 wrf: Remove fortran variant from hdf5 (#49286)
This commit removes the +fortran variant when building HDF5 for WRF.
This seems unnecessary, and prevents building WRF with some versions of
Intel MPI, as HDF5 doesn't appear to build with Fortran support and
Intel MPI.
2025-03-13 13:47:37 -04:00
Adam J. Stewart
fa35d8f8ec fish: add v4.0.1 (#49459) 2025-03-13 13:47:30 -04:00
Dave Keeshan
6f8a3674af jimtcl: add v0.83 (#49360) 2025-03-13 13:39:48 -04:00
Dave Keeshan
39b7276a33 verilator: add v5.034 (#49363) 2025-03-13 13:39:14 -04:00
Christophe Prud'homme
d67afc7191 parmmg: add new versions up to 1.5 and new variants (#47387)
* add new versions up to 1.5 and new variants

variant vtk: make vtk optional
variant shared: build shared libs

added patch to fix parmmg cmake so that it can be used by other software with find_package

* use +private for mmg@5.8: and parmmg@1.5:

* fix style and constraint mmg version

* add a condition on patch, use private_headers from mmg PR feelpp/spack#14
2025-03-13 12:50:34 -04:00
Dom Heinzeller
8823c57b72 Add met@11.1.1, met@12.0.0, met@12.0.1, metplus@6.0.0 (#49120)
* add MET v12.0.0 and METplus v6.0.0

* Set correct dependencies for metplus@6 in var/spack/repos/builtin/packages/metplus/package.py

* Add missing dependency on proj for met@12

* Add met@12.0.1

* Change @6.0.0 to @6: for requirements in var/spack/repos/builtin/packages/metplus/package.py

* Address reviewer comments for met and metplus

---------

Co-authored-by: Rick Grubin <Richard.Grubin@noaa.gov>
2025-03-13 08:01:26 -06:00
Harmen Stoppels
c8466c4cd4 hip: sha256 change after github repo was renamed (#49460) 2025-03-13 13:16:36 +01:00
Zack Galbreath
f5ff63e68d ci: use stack-specific local mirrors (#49449)
This should help resolve the "No binary found when cache-only was specified"
errors we've recently seen in our GitLab CI pipelines.

example failing job here:
https://gitlab.spack.io/spack/spack/-/jobs/15570931#L370

This error is caused when a generate job finds a spec in the local root
binary mirror, and that spec does not yet exist in the stack-specific mirror.

The fix here is to instead locally cache the stack-specific mirrors and only
use the root-level mirror for public use.
2025-03-13 12:04:46 +01:00
Harmen Stoppels
11f52ce2f6 Warn when %compiler precedes +variant (#49410) 2025-03-13 10:45:09 +01:00
Axel Huebl
63895b39f0 WarpX: GitHub Org Moved (#49427)
The WarpX repo moved. ECP succeeded. Long live ECP.
2025-03-13 09:23:37 +01:00
Massimiliano Culpo
64220779d4 abyss: update to use oneapi packages, add v2.3.10 (#49429)
Add missing btllib dependency, needed from v2.3.6

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-13 08:56:39 +01:00
Massimiliano Culpo
774346038e plumed: update to use oneapi packages (#49432)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-13 08:32:09 +01:00
Massimiliano Culpo
03dbc3035c octave: add v9.4.0, remove mentions of old intel packages (#49431)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-13 08:31:12 +01:00
Davis Herring
ad78ed741c flecsi: add v2.3.2 (#49448)
Deprecate 2.3.0 and 2.3.1, which contain the significant bug corrected in 2.3.2
2025-03-13 00:05:41 -06:00
Matt Thompson
599d32d1c2 py-questionary: add 2.1.0 (#49414)
* py-questionary: add 2.1.0
* Reorder
* Add myself as maintainer
2025-03-12 19:31:42 -07:00
Massimiliano Culpo
e5c7fe87aa spla: update to use oneapi packages (#49435)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-12 19:26:47 -07:00
Massimiliano Culpo
cc6ab75063 speexdsp: update to use oneapi packages (#49434)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-12 19:26:32 -07:00
Massimiliano Culpo
fe00c13afa plasma: update to use oneapi packages (#49433)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-12 19:26:11 -07:00
Wouter Deconinck
d610ff6cb1 libseccomp: add v2.5.5, v2.5.6, v2.6.0 (#49243) 2025-03-12 19:14:26 -07:00
Wouter Deconinck
54f947fc2a armadillo: add v14.4.0 (#49242) 2025-03-12 19:12:44 -07:00
afzpatel
a5aa784d69 add 6.3.2 (#49266) 2025-03-12 19:10:10 -07:00
Robert Maaskant
3bd58f3b49 py-setuptools-scm: add v8.1.0, v8.2.0 (#49271)
* py-setuptools-scm: add v8.1.0, v8.2.0 and refactor deps
* fixup! py-setuptools-scm: add v8.1.0, v8.2.0 and refactor deps
* fixup! fixup! py-setuptools-scm: add v8.1.0, v8.2.0 and refactor deps
2025-03-12 19:04:51 -07:00
Wouter Deconinck
cac0beaecf (py-)onnx: add v1.17.0 (#49287)
* onnx/py-onnx: update to 1.17.0
* (py-)onnx: depends_on cmake@3.14: when=@1.17:

---------

Co-authored-by: Joseph C Wang <joequant@gmail.com>
2025-03-12 19:02:09 -07:00
kenche-linaro
406ccc2fe3 linaro-forge: add v24.1.2 (#49223) 2025-03-12 18:59:50 -07:00
snehring
40cd8e6ad8 virtualgl: add v3.1.2 (#49215)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-03-12 18:58:14 -07:00
snehring
682e4bf4d4 Add package libglvnd (#49214)
* Add package libglvnd
  Signed-off-by: Shane Nehring <snehring@iastate.edu>
* libglvnd: add virtual defaults
  Signed-off-by: Shane Nehring <snehring@iastate.edu>

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-03-12 18:53:43 -07:00
John W. Parent
56b2979966 adios2 package: turn off new options by default on windows (#47070)
... for now. Will turn them back on for Windows when necessary
adjustments are made to the package to support them.
2025-03-12 16:48:11 -07:00
Tamara Dahlgren
d518aaa4c9 path and remote_file_cache: support windows paths (#49437)
Windows paths with drives were being interpreted as network protocols
in canonicalize_path (which was expanded to handle more general URLs
in #48784).

This fixes that and adds some tests for it.
2025-03-12 22:28:37 +00:00
Thomas Dickerson
8486a80651 Fix: tensorflow empty config vars (#49424)
* Create allow-empty-config-environment-variables.patch

* Apply patch from last commit

* [@spackbot] updating style on behalf of elfprince13

---------

Co-authored-by: elfprince13 <elfprince13@users.noreply.github.com>
2025-03-12 14:25:32 -06:00
Thomas Dickerson
28341ef0a9 Fix assumption of linux platform in py-tensorflow (#49425)
post_configure_fixes assumed py-tensorflow depends on patchelf, but that dependency is platform dependent.
2025-03-12 14:05:17 -06:00
Harmen Stoppels
f89a2ada4c Move %compiler last in static spec strings (#49438) 2025-03-12 19:41:43 +01:00
Harmen Stoppels
cf804c4ea8 cppcheck: add latest, deprecate older versions (#49445) 2025-03-12 18:16:13 +01:00
Harmen Stoppels
a45d09abcd Spec to string: show %compiler at the end (#49439)
In Spack v1.0 we plan to parse caret ^ and percent % the same way. Their meanings are transitive and direct dependency, respectively. It means that variants, versions, arch, platform, os, target and dag hash should go before the %, so that they apply to the dependent, not the %dependency.
2025-03-12 18:15:34 +01:00
Harmen Stoppels
cd3068dc0b warpx: update checksum after repo name changed (#49443) 2025-03-12 15:28:07 +01:00
Althea Denlinger
de9aa3bcc6 nco: Add many versions and OpenMP support (#49014)
* Include OpenMP support
* Add many new versions of NCO
* Add maintainers

---------

Co-authored-by: Xylar Asay-Davis <xylarstorm@gmail.com>
2025-03-12 08:23:06 -06:00
Harmen Stoppels
db7ab9826d spec_parser: check next_token if not expecting next token (#49408) 2025-03-12 08:39:23 +01:00
Harmen Stoppels
9f69d9b286 get_mark_from_yaml_data: move to spack.util.spack_yaml (#49409) 2025-03-12 08:36:14 +01:00
Massimiliano Culpo
d352b71df0 Error when an anonymous spec is required for a virtual package (#49385)
When requiring a constraint on a virtual package, it makes little
sense to use anonymous specs, and our documentation shows no example
of requirements on virtual packages starting with `^`.

Right now, due to how `^` is implemented in the solver, writing:
```yaml
mpi:
  require: "^openmpi"
```
is equivalent to the more correct form:
```yaml
mpi:
  require: "openmpi"
```
but the situation will change when `%` will shift its meaning to be a
direct dependency.

To avoid later errors that are both unclear, and quite slow to get to the user,
this commit makes anonymous specs under virtual requirements an error,
and shows a clear error message pointing to the file and line where the
spec needs to be changed.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-12 08:33:42 +01:00
Robert Maaskant
4cb4634c74 oniguruma: add v6.9.10 (#49412)
* oniguruma: add v6.9.10

* oniguruma: update url to trigger ci
2025-03-11 20:02:15 -05:00
Robert Maaskant
594554935d nghttp2: add v1.65.0 (#49411) 2025-03-11 19:58:59 -05:00
Mikael Simberg
8b56470650 mold: Add 2.37.0 (#49416) 2025-03-11 19:55:09 -05:00
Michael Kuhn
ba4fd64caa postgresql: add missing perl build dependency (#49417)
Without it, the build fails with errors like this:
```
Can't locate File/Compare.pm in @INC (you may need to install the File::Compare module) (@INC contains: ...) at ../../../src/backend/catalog/Catalog.pm line 19.
```
2025-03-11 15:51:22 -05:00
Loïc Pottier
07ec8a9ba3 Added standalone package to install flux python api (#49197)
Signed-off-by: Loic Pottier <pottier1@llnl.gov>
2025-03-11 10:24:05 -07:00
Vicente Bolea
64ba324b4a adios2: fix smoke test (#49199) 2025-03-11 10:15:22 -07:00
Andy Porter
2aab567782 py-psyclone: add v3.1.0 (#49190)
* Update py-psyclone package.py to refer to 3.1.0 release
* Fix hash for 3.1.0 tarball
2025-03-11 10:13:50 -07:00
John W. Parent
d4e29c32f0 CMake: verions 3.30.8, 3.31.6 (#49192) 2025-03-11 10:09:16 -07:00
Adam J. Stewart
30e5639995 fish: add v4.0.0 (#49283) 2025-03-11 17:16:42 +01:00
Adam J. Stewart
fa4c09d04e GEOS: add v3.9.6 -> v3.13.1 (#49279) 2025-03-11 17:13:51 +01:00
Adam J. Stewart
f0a458862f py-keras: add v3.9.0 (#49300) 2025-03-11 17:10:28 +01:00
Adam J. Stewart
2938680878 py-rtree: add v1.4.0 (#49336) 2025-03-11 17:09:52 +01:00
Adam J. Stewart
a8132e5c94 libspatialindex: add v2.1.0 (#49337) 2025-03-11 17:09:01 +01:00
Massimiliano Culpo
9875a0e807 cairo: fix a few "wrong" defaults (#49415)
Having all variants conditional leaves a lot more degrees of freedom to clingo,
and slows down the search.

If variants have inconsistent defaults, we might end up with multiple, equally 
sub-optimal solutions. Sometimes this creates a "plateau" in the search space.

Remove conditional boolean variants that can't be activated, since this just increases
the complexity of the model.

If 4 variants have to be all active / inactive together, it's better to use a single
`requires` than to explode it into multiple statements dealing with a single variant
at a time.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-11 17:05:23 +01:00
Michael Kuhn
cb4d3a9fc2 vim: add 9.1.1194 (#49418) 2025-03-11 09:34:03 -06:00
psakievich
7d79648cb5 build_environment.py: fix external module loading (#49401)
* load external modules in topo order from leaf to root
* only load external modules of transitive link/run deps
2025-03-11 14:05:26 +01:00
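
A sketch of the leaf-to-root ordering on a toy dependency map (not Spack's spec graph): a postorder DFS yields dependencies before their dependents.

```python
def leaf_to_root(deps, root):
    order, seen = [], set()

    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for dep in deps.get(node, ()):
            visit(dep)
        order.append(node)  # appended only after all of its dependencies

    visit(root)
    return order

deps = {"app": ["mpi", "blas"], "mpi": ["hwloc"], "blas": [], "hwloc": []}
print(leaf_to_root(deps, "app"))  # ['hwloc', 'mpi', 'blas', 'app']
```
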
Robert Maaskant
e84e5fa9bf node-js: add versions up to 22.14.0 (#49131) 2025-03-11 06:38:41 -06:00
John W. Parent
f25cbb0fe4 icu4c package: fix windows build quoting issue (#49196)
ICU4C's NMAKE seems to over-quote to the degree
that it passes paths like ""<path>"", which
confuses the Python command line in the subprocesses
the build starts.
2025-03-10 18:19:28 -07:00
John W. Parent
f3257cea90 Windows Ci: Ensure consistent EOL (#49377) 2025-03-10 18:06:02 -07:00
Stephen Nicholas Swatman
d037e658a4 geomodel: add v6.10.0 (#49386)
This commit adds version 6.10.0 of the geomodel package.
2025-03-10 10:10:57 -05:00
Wouter Deconinck
a14acd97bd fjcontrib: add v1.101 (#49182) 2025-03-10 08:09:46 -07:00
Robert Maaskant
199cce879f ca-certificates-mozilla: add v2025-02-25 (#49184)
* ca-certificates-mozilla: v2025-02-25

* ca-certificates-mozilla: undo refactor
2025-03-10 09:27:57 -04:00
Robert Maaskant
7d66063bd9 go: v1.22.12, v1.23.7, v1.24.1 (#49389) 2025-03-10 09:24:52 -04:00
Massimiliano Culpo
47c6fb750a spfft: update to use oneapi packages (#49311)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-10 08:32:49 +01:00
Massimiliano Culpo
8c3ac352b7 suite-sparse: update to use oneapi packages (#49310)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-10 08:32:01 +01:00
Massimiliano Culpo
d6ac16ca16 dihydrogen: update to use oneapi packages (#49303)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-10 08:29:47 +01:00
Peter Scheibel
75e37c6db5 use default modify scope if no scope contains key (#48777)
If you use `spack config change` to modify a `require:` section that did
not exist before, Spack was inserting the merged configuration into the
highest modification scope (which for example would clutter the
environment's `spack.yaml` with a bunch of configuration details 
from the defaults).
2025-03-09 21:31:56 -07:00
Tamara Dahlgren
3f8dcfc6ed Support independent includes with conditional, optional, and remote entries (#48784)
Supersedes #46792.
Closes #40018.
Closes #31026.
Closes #2700.

There were a number of feature requests for os-specific config. This enables os-specific
config without adding a lot of special sub-scopes.

Support `include:` as an independent configuration schema, allowing users to include
configuration scopes from files or directories. Includes can be:
* conditional (similar to definitions in environments), and/or
* optional (i.e., the include will be skipped if it does not exist).

Includes can be paths or URLs (`ftp`, `https`, `http` or `file`). Paths can be absolute or
relative. Environments can include configuration files using the same schema. Remote includes
must be checked by `sha256`.

Includes can also be recursive, and this modifies the config system accordingly so that
we push included configuration scopes on the stack *before* their including scopes, and
we remove configuration scopes from the stack when their including scopes are removed.

For example, you could have an `include.yaml` file (e.g., under `$HOME/.spack`) to specify
global includes:

```
include:
- ./enable_debug.yaml
- path: https://github.com/spack/spack-configs/blob/main/NREL/configs/mac/config.yaml
  sha256: 37f982915b03de18cc4e722c42c5267bf04e46b6a6d6e0ef3a67871fcb1d258b
```

Or an environment `spack.yaml`:

```
spack:
  include:
  - path: "/path/to/a/config-dir-or-file"
    when: os == "ventura"
  - ./path/relative/to/containing/file/that/is/required
  - path: "/path/with/spack/variables/$os/$target"
    optional: true
  - path: https://raw.githubusercontent.com/spack/spack-configs/refs/heads/main/path/to/required/raw/config.yaml
    sha256: 26e871804a92cd07bb3d611b31b4156ae93d35b6a6d6e0ef3a67871fcb1d258b
```

Updated TODO:
- [x] Get existing unit tests to pass with Todd's changes
- [x] Resolve new (or old) circular imports
- [x] Ensure remote includes (global) work
- [x] Ensure remote includes for environments work (note: caches remote
      files under user cache root)
- [x] add sha256 field to include paths, validate, and require for remote includes
- [x] add sha256 remote file unit tests
- [x] revisit how diamond includes should work
- [x] support recursive includes
- [x] add recursive include unit tests
- [x] update docs and unit test to indicate ordering of recursive includes with
      conflicting options is deferred to follow-on work

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Peter Scheibel <scheibel1@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-09 19:33:44 -07:00
Satish Balay
07d4915e82 petsc, py-petsc4py: add v3.22.4 (#49374) 2025-03-08 12:23:31 -06:00
Ryan Krattiger
77ff574d94 Revert "CI: Set the cache path for all platforms (#49373)" (#49381)
This reverts commit 50b56ee1ce.
2025-03-08 08:29:05 +01:00
Rémi Lacroix
5783f950cf google-cloud-cli: Install missing "platform" directory (#49367)
Ignore the bundled Python since it's provided by Spack.

This fixes the "gsutil" command.
2025-03-07 23:16:57 -06:00
Edoardo Zoni
1c76c88f2c WarpX & pyAMReX 25.03 (#49328)
* pyAMReX 25.03

* Adding EZoni to pyAMReX Spack package maintainers

* WarpX 25.03

* Fix SHA-256 checksums

---------

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2025-03-07 21:03:28 -08:00
Ryan Krattiger
50b56ee1ce CI: Set the cache path for all platforms (#49373)
The SPACK_USER_CACHE_PATH was being overwritten in the windows CI
before_script. This should set the path for all systems unless
explicitly overridden.
2025-03-07 17:07:56 -06:00
wspear
be521c441e Conflict bugged +comm variant (#49371)
This will be fixed in the next tau release. Conflicted up to current. @kwryankrattiger
2025-03-07 16:29:46 -06:00
Wouter Deconinck
61ffb87757 actsvg: add v0.4.51 (#49352) 2025-03-07 16:53:02 +01:00
Chris Marsh
950b4c5847 py-rpy2: add missing libiconv (#49355)
* add missing libiconv

* use the virtual provider
2025-03-07 08:16:39 -06:00
Piotr Sacharuk
ac078f262d raja: Apply workarounds for oneAPI compiler for problem with build (#49290) 2025-03-07 06:44:11 -05:00
Harmen Stoppels
fd62f0f3a8 repo create: set api: vX.Y (#49344) 2025-03-07 08:34:55 +01:00
Chris White
ca977ea9e1 Fix missing hipBlas symbol (#49298)
Co-authored-by: Eric B. Chin <chin23@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2025-03-06 13:43:18 -08:00
Robert Maaskant
0d2c624bcb glib: add v2.82.5 (#49281) 2025-03-06 17:49:14 +01:00
Alec Scott
765b6b7150 py-aiojobs: new-package (#49329)
* py-aiojobs: new-package

* Update var/spack/repos/builtin/packages/py-aiojobs/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix minimum required python dependency based on feedback

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-06 07:11:06 -06:00
Seth R. Johnson
a91f96292c vecgeom: add development version of surface branch (#49313)
* vecgeom: add development version of surface branch

* Use tag on main branch

* Get full repo for versioning on master branch
2025-03-06 05:32:33 -05:00
Wouter Deconinck
18487a45ed xz: add v5.4.7, v5.6.2, v5.6.3 (#49330) 2025-03-06 09:47:25 +01:00
Wouter Deconinck
29485e2125 meson: add v1.5.2, v1.6.1, v1.7.0 (#49244) 2025-03-05 22:36:06 -06:00
dependabot[bot]
7674ea0b7d build(deps): bump types-six in /.github/workflows/requirements/style (#49295)
Bumps [types-six](https://github.com/python/typeshed) from 1.17.0.20241205 to 1.17.0.20250304.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-05 22:34:49 -06:00
Wouter Deconinck
693376ea97 qt-*: add v6.8.2 (#49320) 2025-03-05 20:03:34 -07:00
Massimiliano Culpo
88bf2a8bcf globalarrays: add unconditional dep on C++ (#49317)
See https://gitlab.spack.io/spack/spack/-/jobs/15482194

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 20:03:09 -07:00
Wouter Deconinck
03e9ca0a76 QtPackage: set QT_ADDITIONAL_SBOM_DOCUMENT_PATHS (#49319)
* QtPackage: set QT_ADDITIONAL_SBOM_DOCUMENT_PATHS

* QtPackage: self.spec.satisfies("@6.9:")

* QtPackage: if self.spec.satisfies("@6.9:")
2025-03-05 19:53:35 -07:00
Massimiliano Culpo
18399d0bd1 qt-svg: add dependency on C (#49316)
https://gitlab.spack.io/spack/spack/-/jobs/15482214

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 19:53:10 -07:00
Dan Bonachea
3aabff77d7 GASNet 2025.2 update (#49327)
* gasnet: deprecate old versions
  GASNet versions more than 2 years old are not supported.
  Update description text.
* gasnet: add 2025.2.0-snapshot version
2025-03-05 19:48:31 -07:00
Chris Marsh
aa86342814 Ensure that if TCL is already sourced on the system, its lib paths don't interfere with spack's install step (#49325) 2025-03-05 19:48:04 -07:00
Weiqun Zhang
170a276f18 amrex: add v25.03 (#49252)
Starting from amrex-25.03, FFT is enabled by default in spack build.
2025-03-05 15:53:25 -08:00
Massimiliano Culpo
313524dc6d qrupdate: update to use oneapi packages (#49304)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 13:44:37 -05:00
Massimiliano Culpo
5aae6e25a5 arpack-ng: update to use oneapi packages (#49302)
Also, remove deprecated versions

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 13:44:13 -05:00
Massimiliano Culpo
b58a52b6ce abinit: update to use oneapi packages (#49301)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 13:44:01 -05:00
Chris White
32760e2885 sundials: expand patch when rule (#49296) 2025-03-05 16:13:19 +01:00
Harmen Stoppels
125feb125c Define Package API version (#49274)
Defines `spack.package_api_version` and `spack.min_package_api_version` 
as tuples (major, minor). 

This defines resp. the current Package API version implemented by this version 
of Spack and the minimal Package API version it is backwards compatible with.

Repositories can optionally define:
```yaml
repo:
    namespace: my_repo
    api: v1.2
```
which indicates they are compatible with versions of Spack that implement 
Package API `>= 1.2` and `< 2.0`. When the `api` key is omitted, the default 
`v1.0` is assumed.
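
A minimal sketch of the compatibility rule described above, assuming (major, minor) tuples compared element-wise; the variable names mirror the ones this PR defines, but the check itself is a paraphrase, not the actual implementation:

```python
# Sketch only: a repo declaring `api: vX.Y` works with Spack implementing
# Package API >= X.Y and < (X+1).0, within Spack's supported range.
package_api_version = (1, 5)      # assumed example value
min_package_api_version = (1, 0)  # assumed example value

def repo_is_compatible(repo_api):
    same_major = repo_api[0] == package_api_version[0]
    return same_major and min_package_api_version <= repo_api <= package_api_version

assert repo_is_compatible((1, 2))      # repo needs >= 1.2, Spack implements 1.5
assert not repo_is_compatible((2, 0))  # different major version: incompatible
```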
2025-03-05 15:42:48 +01:00
Wouter Deconinck
8677063142 QtPackage: modify QT_ADDITIONAL_PACKAGES_PREFIX_PATH handling (#49297)
* QtPackage: mv QT_ADDITIONAL_PACKAGES_PREFIX_PATH handling

* geomodel: support Qt6

* qt-base: rm import re
2025-03-05 09:09:32 -05:00
Massimiliano Culpo
f015b18230 hydrogen: update to use oneapi packages (#49293)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 09:06:32 +01:00
Massimiliano Culpo
aa9e610fa6 elemental: remove deprecated package (#49291)
This package has not been maintained since 2016.

We maintain an active fork in the hydrogen
package, so remove this one.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 08:36:05 +01:00
Wouter Deconinck
7d62045c30 py-networkx: add up to v3.4.2 (#49289)
* py-networkx: add new versions up to 3.4.2
* py-networkx: add more requirements
* py-networkx: fix typo
* py-networkx: fix python and py-setuptools dependencies

---------

Co-authored-by: Joseph C Wang <joequant@gmail.com>
2025-03-04 17:02:54 -08:00
Chris Marsh
5b03173b99 r-packages: add missing gettext dependencies (#48910)
* add gettext dependency

* typo

* style
2025-03-04 17:07:01 -06:00
mvlopri
36fcdb8cfa Update the incorrect sha for the SEACAS package.py (#49292)
The sha256sum for the 2025-02-27 version of SEACAS is incorrect
due to the movement of the tagged version.
2025-03-04 16:03:28 -07:00
Chris Marsh
7d5b17fbf2 py-rpy2: Add 3.5.17 (#48911)
* Update rpy2 to newest version and clean up package

* Add me as maintainer

* Update depends section as per review. Add ipython variant. Fix some ranges and add support for python 3.9. Deprecated outdated versions

* refine depends_on and remove redundant version info

* style
2025-03-04 15:58:12 -07:00
Piotr Sacharuk
d6e3292955 flux-sched: Apply workarounds for oneAPI compiler for problem with build (#49282) 2025-03-04 15:28:33 -07:00
Chris Marsh
60f54df964 Explicitly depend on gettext for libintl (#48908) 2025-03-04 16:25:31 -06:00
Wouter Deconinck
487df807cc veccore: add typo fix for clang (#49288)
* veccore: add typo fix for clang

* veccore: apply ScalarWrapper.h patch for all compilers

---------

Co-authored-by: Joseph C Wang <joequant@gmail.com>
2025-03-04 14:35:47 -07:00
Zack Galbreath
cacdf84964 ci: add support for high priority local mirror (#49264) 2025-03-04 14:47:37 -06:00
fbrechin
e2293c758f Adding ability for repo paths from a manifest file to be expanded when creating an environment. (#49084)
* Adding ability for repo paths from a manifest file to be expanded when creating an environment.

A unit test was added to check that an environment variable will be expanded.
Also, a bug was fixed in the expansion of develop paths: if an environment variable
in the path expanded to an absolute path, the path would not be extended.

* Fixing new unit test for env repo var substitution

* Messed up resolving last rebase
2025-03-04 09:52:28 -08:00
Harmen Stoppels
f5a275adf5 gitignore: remove *_archive (#49278) 2025-03-04 18:37:18 +01:00
Paul
615ced32cd protobuf: add v3.29.3 (#49246) 2025-03-04 11:29:53 -06:00
Massimiliano Culpo
bc04d963e5 Remove debug print statements in unit-tests (#49280)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-04 18:29:45 +01:00
Taillefumier Mathieu
11051ce5c7 CP2K: Add GRPP support (#49232) 2025-03-04 06:54:27 -07:00
Adam J. Stewart
631bddc52e py-pyarrow: add v19.0.1 (#49149)
* py-pyarrow: add v19.0.1

* Environment variables no longer needed either

* Remove py-pyarrow variants
2025-03-04 13:20:52 +01:00
Adam J. Stewart
b5f40aa7fb OpenCV: fix +cuda build (#49146) 2025-03-04 13:19:57 +01:00
Adam J. Stewart
57e0798af2 py-pip: mark Python 3.12+ support (#49148) 2025-03-04 13:18:38 +01:00
Chris White
0161b662f7 conduit: do not pass link flags to ar (#49263) 2025-03-03 19:53:11 -07:00
afzpatel
aa55b19680 fix +asan in ROCm packages (#48745)
* fix asan for hsa-rocr-dev
* add libclang_rt.asan-x86_64.so to LD_LIBRARY_PATH
* fix +asan for hipsparselt
* fix rocm-openmp-extras asan and add rccl +asan support
* add missing comgr build env variables
* add missing rocm-smi-lib build env variables
* minor dependency change
* fix style
2025-03-03 17:57:34 -08:00
dependabot[bot]
8cfffd88fa build(deps): bump pytest from 8.3.4 to 8.3.5 in /lib/spack/docs (#49268)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.4 to 8.3.5.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.4...8.3.5)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 19:18:42 -06:00
dependabot[bot]
2f8dcb8097 build(deps): bump python-levenshtein in /lib/spack/docs (#49269)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.26.1 to 0.27.1.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.26.1...v0.27.1)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 19:17:48 -06:00
dependabot[bot]
5b70fa8cc8 build(deps): bump sphinx from 8.2.1 to 8.2.3 in /lib/spack/docs (#49270)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 8.2.1 to 8.2.3.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v8.2.1...v8.2.3)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 19:17:08 -06:00
Adam J. Stewart
b4025e89ed py-torchmetrics: add v1.6.2 (#49262) 2025-03-03 19:15:49 -06:00
Eric Berquist
8db74e1b2f tmux: add 3.5a, 3.5, and 3.3 (#49259)
* tmux: add 3.5a, 3.5, and 3.3

* tmux: patch is in releases from 3.5 onward

* tmux: versions 3.5 and newer can use jemalloc
2025-03-03 19:12:45 -06:00
Wouter Deconinck
1fcfbadba7 qwt: add v6.2.0, v6.3.0, support Qt6 (#45604)
* qwt: support building against Qt6

* qwt: fix style

* qwt: depends_on qt-base+opengl+widgets when +opengl

* visit: patch for missing cmath include

---------

Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2025-03-03 16:25:48 -08:00
Chris White
13ec35873f Axom: Changes from Axom repository (#49183)
* pull in new changes from axom project

* add new versions

* convert more conditionals to spec.satisfies

-------------
Co-authored-by: white238 <white238@users.noreply.github.com>
2025-03-03 15:47:45 -08:00
Philip Fackler
f96b6eac2b xolotl: new package (#48876)
* Adding xolotl package

* [@spackbot] updating style on behalf of PhilipFackler

* Removing redundant text

* Add blank line

* Update var/spack/repos/builtin/packages/xolotl/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/xolotl/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Switch to CudaPackage and remove source dir from runtime env

* [@spackbot] updating style on behalf of PhilipFackler

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-03 15:18:28 -06:00
Rocco Meli
933a1a5cd9 update (#49261) 2025-03-03 10:38:10 -07:00
Stephen Nicholas Swatman
b2b9914efc acts dependencies: new versions as of 2025/03/03 (#49253)
This commit adds ACTS version 39.2.0 and detray version 0.89.0.
2025-03-03 09:32:59 -07:00
Rocco Meli
9ce9596981 multicharge: add v0.3.1 (#49255)
* multicharge: add v0.3.1

* fix url
2025-03-03 15:32:29 +01:00
Wouter Deconinck
fc30fe1f6b librsvg: add v2.56.4, v2.57.3, v2.58.2 (#45734)
* librsvg: add v2.56.4, v2.57.3, v2.58.2

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-03-02 14:08:43 -08:00
Paul
25a4b98359 jacamar-ci: add v0.25.0 (#49248) 2025-03-02 14:50:43 -06:00
Adam J. Stewart
05c34b7312 py-pymc3: not compatible with numpy 2 (#49225) 2025-03-01 13:43:05 -06:00
Tahmid Khan
b22842af56 globalarrays: Add variant cxx which adds the --enable-cxx flag (#49241) 2025-03-01 13:16:04 -06:00
Vanessasaurus
0bef028692 Automated deployment to update package flux-sched 2025-02-28 (#49229)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-03-01 08:48:41 -07:00
Vanessasaurus
935facd069 Automated deployment to update package flux-security 2025-02-28 (#49230)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-03-01 08:47:19 -07:00
Adam J. Stewart
87e5255bbc py-matplotlib: add v3.10.1 (#49233) 2025-03-01 16:22:49 +01:00
dependabot[bot]
b42f0d793d build(deps): bump isort in /.github/workflows/requirements/style (#49212)
Bumps [isort](https://github.com/PyCQA/isort) from 6.0.0 to 6.0.1.
- [Release notes](https://github.com/PyCQA/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/PyCQA/isort/compare/6.0.0...6.0.1)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-01 08:18:06 -07:00
dependabot[bot]
ccca0d3354 build(deps): bump isort from 6.0.0 to 6.0.1 in /lib/spack/docs (#49213)
Bumps [isort](https://github.com/PyCQA/isort) from 6.0.0 to 6.0.1.
- [Release notes](https://github.com/PyCQA/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/PyCQA/isort/compare/6.0.0...6.0.1)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-01 08:17:39 -07:00
HELICS-bot
9699bbc7b9 helics: Add version 3.6.1 (#49231)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-03-01 08:16:21 -07:00
Raffaele Solcà
c7e251de9f Add dla-future v0.8.0 (#49235) 2025-03-01 08:14:52 -07:00
Robert Maaskant
d788b15529 libmd: add version 1.1.0 (#49239)
Release notes can be read at https://archive.hadrons.org/software/libmd/libmd-1.1.0.announce
2025-03-01 08:11:12 -07:00
Harmen Stoppels
8e7489bc17 Revert "Honor cmake_prefix_paths property if available (#42569)" (#49237)
This reverts commit fe171a560b.
2025-02-28 23:33:02 +01:00
John W. Parent
d234df62d7 Solver: Cache Concretization Results (#48198)
Concretizer caching for reusing solver results
2025-02-28 12:42:00 -06:00
Mikhail Titov
4a5922a0ec py-radical-*: new version 1.90 (#48586)
* rct: update packages (RE, RG, RP, RS, RU) with new version 1.90

* radical: added `url_for_version` for older versions

* radical: set latest versions for `radical.pilot` and `radical.utils`

* radical: fixed `url_for_version` setup

* radical: set the latest version for `radical.entk`

* radical: fixed style for `url_for_version`

* Apply suggestions from code review (python version dependency)

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-28 07:38:45 -07:00
John W. Parent
5bd184aaaf Windows Rpath: Allow package test rpaths (#47072)
On Windows, libraries search their directory for dependencies, and
we help libraries in Spack-built packages locate their dependencies
by symlinking them into the dependent's directory (we refer to this
as simulated RPATHing).

We extend the convenience functionality here to support base library
directories outside of the package prefix: this is primarily for
running tests in the build directory (which is not located inside
of the final install prefix chosen by spack).
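
A minimal sketch of the "simulated RPATH" idea described above (function and parameter names are illustrative, not Spack's actual API):

```python
import os

def simulate_rpath(dependent_dir, dependency_lib_dirs):
    """Symlink dependency DLLs next to the dependent binary so Windows'
    directory-based library search can resolve them (sketch, not Spack's code)."""
    for libdir in dependency_lib_dirs:
        for name in os.listdir(libdir):
            if name.lower().endswith(".dll"):
                link = os.path.join(dependent_dir, name)
                if not os.path.lexists(link):
                    os.symlink(os.path.join(libdir, name), link)
```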
2025-02-27 19:16:00 -08:00
Mikael Simberg
464c3b96fa fmt: Add 11.1.4 (#49218) 2025-02-27 19:12:26 -06:00
Scott Wittenburg
60544a4e84 ci: avoid py-mpi4py tests on darwin (#49227) 2025-02-27 18:07:59 -07:00
Greg Sjaardema
a664d98f37 seacas: new version with change set support (#49224)
This release contains modifications to most of the SEACAS applications to support ChangeSets to some degree.
See https://github.com/SandiaLabs/seacas/wiki/Dynamic_Topology for information about Change Sets and
see https://github.com/SandiaLabs/seacas/wiki/Supporting-Change-Sets for information about how the various seacas applications support the use or creation of change sets.

The release also includes various other small changes including formatting, portability, installation, TPL version updates, and spelling.
2025-02-27 18:02:51 -07:00
Sinan
0e3d7efb0f alps: add conflict (#48751)
Co-authored-by: Sinan81 <Sinan@world>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2025-02-27 17:57:55 -07:00
Chris Green
a8cd0b99f3 New recipes for PlantUML and py-sphinxcontrib-plantuml (#49204)
* new-recipe: plantuml
* new-recipe: py-sphinxcontrib-plantuml
2025-02-27 16:57:23 -08:00
Alec Scott
a43df598a1 rust: add v1.85.0 (#49158) 2025-02-27 13:18:23 -06:00
Alec Scott
a7163cd0fa gnutls: add master, improve styling (#49080) 2025-02-27 13:13:23 -06:00
Kyle Knoepfel
fe171a560b Honor cmake_prefix_paths property if available (#42569)
* Honor package-specified cmake_prefix_paths at runtime

* Add paths in the correct order and prune duplicates

* Normalize paths for windows' sake
2025-02-27 11:11:22 -07:00
Ritwik Patil
24abc3294a sendme: new package (#49133)
* add sendme package

* style fix

* add docstring for test function

* changed maintainer string, run test after install

* removed redundant test

* Follow the common package license header format

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-27 09:59:09 -07:00
MatthewLieber
2dea0073b2 mvapich-plus: new package (#48507)
* add mvapich-plus 4.0

* run spack style fix

* fix license issue

* change styling of mvapich-plus package based on review

* using spack style --fix

* fix more typos

* Apply suggestions from code review

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Adding CudaPackage and RocmPackage Mixins

---------

Co-authored-by: Matt Lieber <lieber.31@osu.edu>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-27 09:36:26 -07:00
Massimiliano Culpo
31ecefbfd2 heppdt: add dependency on C (#49219)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-27 08:10:19 -06:00
Harmen Stoppels
7363047b82 schema: additionalKeysAreSpecs (#49221)
Currently we validate all keys as specs, but the intent is to validate only additional keys in all cases.
2025-02-27 12:17:25 +01:00
Massimiliano Culpo
12fe7aef65 pipelines: extract changes from compiler as nodes (#49222)
* Split requirements to get better error messages in case of unsat solves.
* use list requirements instead of string
* activate static_analysis in a few pipelines

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-27 12:13:34 +01:00
Harmen Stoppels
5da4f18188 schema/modules.py: remove lmod props from tcl schema (#49220) 2025-02-27 10:48:22 +01:00
Marc T. Henry de Frahan
61c54ed28b Remove FSI variant from Nalu-Wind (#49209)
* Remove fsi variant from Nalu-Wind

* fix exawind
2025-02-26 16:57:56 -07:00
eugeneswalker
677caec3c6 Ci reactivate darwin pipelines (#48453)
* ci: darwin stacks: update tags following system updates

* disable SPACK_CI_DISABLE_STACKS; only enable *darwin* stacks for testing

* manually chmod u+w tmp/ before cleanup due to issue#49147

* comment out failing specs for now

* re-enable logic for disabling stacks

* add explanatory comment for darwin after_script additions

* remove more darwin-only targetting

* restore build_stage to default location

* move build-job-remove out of individual darwin stacks into darwin top level config

* keep build_stage in $spack/tmp for now
2025-02-26 17:34:22 -06:00
eugeneswalker
b914bd6638 e4s oneapi ci stack: mpi: require intel-oneapi-mpi (#49043)
* e4s oneapi ci stack: mpi: require intel-oneapi-mpi

* nrm ^py-scipy cflags="-Wno-error=incompatible-function-pointer-types"

* add explanatory comment
2025-02-26 23:07:57 +00:00
Massimiliano Culpo
3caa3132aa python: allow it as a build-tool again (#49201)
Python was removed from being a build tool in #46980, due to issues
when reusing specs. This PR adds a new rule to match the interpreter
among different Python packages, in clingo.

It also adds a bunch of new "build-tools", so that specs like:
```
py-matplotlib backend=tkagg
```
can be concretized in one go.

Modifications:
- [x] Make `py-matplotlib backend=tkagg` concretizable
- [x] Add unit-tests to ensure situations like in #46980 do not happen

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-26 15:04:31 -08:00
Massimiliano Culpo
dbd531112c Assign priorities to configuration scopes (take 2) (#49187)
Currently, the custom config scopes are pushed at the top when constructing
configuration, and are demoted whenever a context manager activating an
environment is used - see #48414 for details. Workflows that rely on the order
in the [docs](https://spack.readthedocs.io/en/latest/configuration.html#custom-scopes)
are thus fragile, and may break.

This PR allows assigning priorities to scopes, and ensures that scopes of lower priorities
are always "below" scopes of higher priorities. When scopes have the same priority,
what matters is the insertion order.
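
A rough sketch of such a priority-ordered mapping, with semantics inferred from the description above (not the actual implementation):

```python
from collections import defaultdict

class ScopesByPriority:
    """Iterate scopes from lowest to highest priority; within one
    priority, insertion order is preserved (sketch of the behavior)."""

    def __init__(self):
        self._scopes = defaultdict(dict)  # priority -> {name: scope}

    def add(self, name, scope, priority=0):
        self._scopes[priority][name] = scope

    def __iter__(self):
        for priority in sorted(self._scopes):
            yield from self._scopes[priority].items()
```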

Modifications:
- [x] Add a mapping that iterates over keys according to priorities set when
      adding the key/value pair
- [x] Use that mapping to allow assigning priorities to configuration scopes
- [x] Assign different priorities for different kind of scopes, to fix a bug, and
      add a regression test
- [x] Simplify `Configuration` constructor
- [x] Remove `Configuration.pop_scope`

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-26 10:52:19 -08:00
psakievich
ae5e121502 Preserve --lines (#49194)
This does not propagate in parsing. Open to other ideas.
2025-02-26 17:48:01 +00:00
snehring
929cfc8e5a relion: add v5.0.0 (#49174)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-02-26 09:17:54 -08:00
Chris Marsh
bad28e7f9f py-natsort: add new variant +icu and dependent package (#48907)
* Add new package py-pyicu to support new py-natsort variant +icu

* note version req location

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* bound icu variant

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-26 10:42:13 -06:00
Patrick Lavin
3d63fe91b0 sst-elements: add support for --enable-ariel-mpi flag (#49135) 2025-02-26 07:30:51 -07:00
Mikhail Titov
95af020310 py-psij-python: new version 0.9.9 (#48610)
* py-psij-python: new version 0.9.9

* Update var/spack/repos/builtin/packages/py-psij-python/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* fixed py3.8 dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-26 08:13:48 -06:00
Seth R. Johnson
2147b9d95e g4vg: new version 1.0.3 (#49195) 2025-02-25 16:27:41 -06:00
psakievich
68636e7c19 lua-lpeg inheritance fix (#49065)
The parent class function doesn't return the path to the config file. This is one potential fix; alternatively, we can add the return back to the base builder.
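
Schematically, the fix amounts to returning the computed path so subclasses can reuse it (class and method names here are illustrative only):

```python
class BaseBuilder:
    def write_config_file(self):
        path = "/tmp/config.lua"  # hypothetical location
        # ... write the file ...
        return path  # the missing return: subclasses rely on this value
```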
2025-02-25 15:14:54 -07:00
Elsa Gonsiorowski, PhD
f56675648a mpifileutils: add v0.12 (#49132)
* mpifileutils: update for v0.12 release

* removed @adammoody from maintainers
2025-02-25 13:39:25 -07:00
Tara Drwenski
3a219d114d Petsc: add in hipblas dependency on hipblas-common (#49017) 2025-02-25 10:35:36 -06:00
Wouter Deconinck
3cefa7047c davix: add v0.8.8, v0.8.9, v0.8.10 (#49057)
* davix: add v0.8.8, v0.8.9, v0.8.10

* davix: url_for_version

* davix: depends on googletest when @0.8.8: (type test, maybe build)

* davix: define DAVIX_TESTS
2025-02-25 10:05:31 -06:00
Cédric Chevalier
35013773ba Fix setup.fish syntax (#49176)
* Fix setup.fish syntax

* Simplify conditional in share/spack/setup-env.fish

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-25 07:11:55 -06:00
AMD Toolchain Support
e28379e98b python: limit parallellism in compileall (#48441) 2025-02-25 13:54:59 +01:00
afzpatel
93329d7f99 add ck variant to miopen-hip (#49143) 2025-02-25 05:48:49 -07:00
Massimiliano Culpo
9e508b0321 Revert "Assign priorities to configuration scopes (#48420)" (#49185)
All the build jobs in pipelines are apparently relying on the bug that was fixed.

The issue was not caught in the PR because generation jobs were fine, and
there was nothing to rebuild.

Reverting to fix pipelines in a new PR.

This reverts commit 3ad99d75f9.
2025-02-25 02:33:41 -08:00
Adam J. Stewart
2c26c429a7 py-sphinx: add v8.2.0 (#49107) 2025-02-25 10:44:58 +01:00
dependabot[bot]
1cc63e2b7c build(deps): bump sphinx from 8.2.0 to 8.2.1 in /lib/spack/docs (#49180)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 8.2.0 to 8.2.1.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/v8.2.1/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v8.2.0...v8.2.1)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-25 02:33:03 -07:00
Harmen Stoppels
4e311a22d0 spec.py: remove VariantMap.concrete (#49170)
VariantMap.concrete is unused, and would be incorrect if it were used
due to conditional variants.

Just let the Spec dictate what is concrete and what is not.
2025-02-25 10:18:06 +01:00
Massimiliano Culpo
3ad99d75f9 Assign priorities to configuration scopes (#48420)
Currently, environments can end up with higher priority than `-C` custom
config scopes and `-c` command line arguments sometimes. This shouldn't
happen -- those explicit CLI scopes should override active environments.

Up to now, configuration behaved like a stack, where scopes could only be
pushed at the top. This PR allows assigning priorities to scopes, and ensures
that scopes of lower priorities are always "below" scopes of higher priorities.

When scopes have the same priority, what matters is the insertion order.

Modifications:
- [x] Add a mapping that iterates over keys according to priorities set when
      adding the key/value pair
- [x] Use that mapping to allow assigning priorities to configuration scopes
- [x] Assign different priorities for different kind of scopes, to fix a bug, and
      add a regression test
- [x] Simplify `Configuration` constructor
- [x] Remove `Configuration.pop_scope`
- [x] Remove `unify:false` from custom `-C` scope in pipelines

On the last modification: on `develop`, pipelines are relying on the environment
being able to override `-C` scopes, which is a bug. After this fix, we need to be
explicit about the unification strategy in each stack, and remove the blanket
`unify:false` from the highest priority scope.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-25 00:58:16 -08:00
Adam J. Stewart
b79c01077d py-sympy: add v1.13.1 (#48951)
* py-sympy: add v1.13.1
2025-02-24 15:09:00 -08:00
Jim Galarowicz
4385f36b8d survey: add latest releases and python path settings for building with autoload none. Ref issue: 42535 (#48050)
* Update survey package file with latest releases and python path settings for building with autoload none.
* Submitting reformatted file.
* update survey package file with libmonitor dependency changes, take out py-gpustat, and minor comment change.
* Trigger build.
2025-02-24 15:04:40 -08:00
Axel Huebl
a85f1cfa4b WarpX 25.02 (#48917)
* pyAMReX: 25.02
* PICMI: 0.33.0
* WarpX: 25.02
* `amrex +fft` depends on `pkgconfig`
* Updated CMake logic uses `pkgconfig`
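
For reference, a conditional build dependency like the one noted in the list above is conventionally declared in a package file along these lines (constraint taken from the note; exact placement in the file assumed):

```python
# In amrex's package.py (sketch):
depends_on("pkgconfig", type="build", when="+fft")
```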
2025-02-24 14:48:27 -08:00
Melven Roehrig-Zoellner
13524fa8ed gcc: fix package.py for gcc@:9 (#49173) 2025-02-24 15:44:04 -07:00
Mikael Simberg
738c73975e mimalloc: Add new versions (#49168) 2025-02-24 13:04:22 -07:00
Mikael Simberg
bf9d72f87b ut: Add 2.3.0 (#49169) 2025-02-24 12:59:31 -07:00
Mikael Simberg
674cca3c4a asio: add 1.32.0 (#49167) 2025-02-24 12:39:18 -07:00
Cory Quammen
7a95e2beb5 paraview: add patch for Intel Classic compilers (#49116)
ParaView 5.12.0 through 5.13.2 do not compile with Intel Classic compilers. See
https://gitlab.kitware.com/vtk/vtk/-/issues/19620.
2025-02-24 11:27:03 -06:00
Adam J. Stewart
5ab71814a9 py-torchgeo: correct pyvista dep (#49140) 2025-02-24 09:06:33 -08:00
Harmen Stoppels
e783a2851d Revert "Repo.packages_with_tags: do not construct a set of all packages (#49141)" (#49172)
This reverts commit 0da5bafaf2.
2025-02-24 16:46:41 +01:00
Stephen Nicholas Swatman
29e3a28071 vecmem: add v1.14.0 (#49166)
This commit adds version 1.14.0 of the vecmem package.
2025-02-24 08:08:52 -06:00
Harmen Stoppels
4e7a5e9362 spack verify libraries: verify dependencies of installed packages can be resolved (#49124)
Currently, we have `config:shared_linking:missing_library_policy` to error
or warn when shared libraries cannot be resolved upon install.

The new `spack verify libraries` command allows users to run this post
install hook at any point in time to check whether their current
installations can resolve shared libs in rpaths.
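
Conceptually, the check resolves each needed shared library against the binary's rpaths, roughly like this sketch (a simplification, not the actual implementation):

```python
import os

def unresolved_libraries(needed, rpaths):
    """Return sonames that cannot be found in any rpath directory."""
    return [soname for soname in needed
            if not any(os.path.exists(os.path.join(rp, soname)) for rp in rpaths)]

print(unresolved_libraries(["libz.so.1"], ["/usr/lib", "/usr/lib64"]))
```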
2025-02-24 11:28:06 +01:00
Harmen Stoppels
89d1dfa340 python: deprecate old patch versions, remove patches that do not apply (#48958) 2025-02-24 03:23:05 -07:00
Harmen Stoppels
974abc8067 Add typehints for directory_layout / Spec.prefix (#48652) 2025-02-24 09:47:07 +00:00
Massimiliano Culpo
2f9ad5f34d spec.py: fix virtual reconstruction for old specs (#49103)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-02-24 01:23:37 -07:00
Harmen Stoppels
9555ceeb8a glib: various fixes (#48840)
* remove preferred to allow seamless python@3.12 usage

* glib: remove deprecated versions

* glib: use extends because python-venv is pulled in from build deps and put into path

* dont patch patch versions, use new patch releases containing the fix instead

* restrict patch of shebangs, group relevant bits together

* simplify lowerbound

* fix pinned glib version

---------

Co-authored-by: Chris Marsh <chrismarsh.c2@gmail.com>
2025-02-24 09:17:45 +01:00
Harmen Stoppels
6cd74efa90 Spec.ensure_external_path_if_external, Spec.inject_patches_variant -> spack.solver.asp (#48988)
2025-02-24 08:36:23 +01:00
Wouter Deconinck
3b3735a2cc root: add v6.34.04 (#49163)
* root: add v6.34.04

* root: add conflict for gcc-15 with earlier versions

---------

Co-authored-by: Patrick Gartung <gartung@fnal.gov>
2025-02-23 22:17:47 -07:00
dependabot[bot]
2ffbc0d053 build(deps): bump mypy in /.github/workflows/requirements/style (#49165)
Bumps [mypy](https://github.com/python/mypy) from 1.11.2 to 1.15.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.11.2...v1.15.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-23 22:29:24 -06:00
Dom Heinzeller
a92419ffe4 Partial bug fix + conflict for compiling node-js@21: with gcc@11.2 (#48494)
* Bug fix for compiling node-js@21: with gcc@11.2 (var/spack/repos/builtin/packages/node-js/package.py var/spack/repos/builtin/packages/node-js/wasm-compiler-gcc11p2.patch)

Since this bug fix is not sufficient, add a conflict for node-js@21: with gcc@11.2

* In var/spack/repos/builtin/packages/node-js/package.py, restrict patch wasm-compiler-gcc11p2.patch to versions 21:22 for gcc@11.2
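
In package-file terms, the two changes described above roughly correspond to the following declarations (a sketch; version ranges taken from the message):

```python
# In node-js's package.py (sketch):
patch("wasm-compiler-gcc11p2.patch", when="@21:22 %gcc@11.2")
conflicts("%gcc@11.2", when="@21:")
```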
2025-02-23 19:13:56 -06:00
Juan Miguel Carceller
92c16d085f gtkplus: add conflict with GCC 14 (#48661)
* gtkplus: add conflict with GCC 14

* gtkplus: conflict gcc@14: when @:3.24.35

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-23 15:28:55 -07:00
Adam J. Stewart
c94024d51d py-timm: add v1.0.15 (#49159) 2025-02-23 14:43:33 -07:00
Harmen Stoppels
11915ca568 apr-util: add missing libxcrypt (#49160) 2025-02-23 13:30:09 -07:00
Buldram
4729b6e837 chafa: new package (#49162)
* chafa: new package

* Require at least one of +shared/+static
2025-02-23 13:29:50 -07:00
Seth R. Johnson
2f1978cf2f celeritas: add 'develop' branch (#49004)
* Revert "REVERTME: move celeritas changes to another branch"

This reverts commit a063e43aaf.

* Use predicted g4vg version

* Use

* fixup! Use predicted g4vg version

* Use spec for versions and improve dependency specification
2025-02-23 13:29:31 -07:00
Chase Phelps
d4045c1ef3 py-perfdump: new package (#49035)
* py-perfdump: new package

* Update package.py

style

* Update package.py ("package.py aktualisieren": a series of applied review suggestions)

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

remove unneeded rpath and pythonroot

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-23 13:29:13 -07:00
Pranav Sivaraman
a0f8aaf4e7 setup-env.fish: fix version checking for completions (#48806) 2025-02-23 20:28:51 +00:00
Wouter Deconinck
b7a5e9ca03 root: add v6.34.00, v6.34.02 (#48129)
* root: add v6.34.00, v6.34.02

* root: prefer 6.32.08

* root: updated variants

* root: treat v6.34 as stable, no preference for v6.32

* root: add variants geom, geombuilder

* delphes: depends on root +geom +opengl

* dd4hep: depends on root +geom

* pandoramonitoring: depends on root +geom

* root: actually pass geom, geombuilder to cmake

---------

Co-authored-by: Patrick Gartung <gartung@fnal.gov>
2025-02-23 12:18:10 -06:00
Adam J. Stewart
7e4b8aa020 py-pyproj: add v3.7.1 (#49066) 2025-02-22 20:46:00 +01:00
Derek Ryan Strong
f5aa15034e Add fpart v1.7.0 (#49119) 2025-02-22 10:54:52 -06:00
Phil Tooley
f210be30d8 LLVM,GCC: Keep stable-series releases a bit longer (#49113) 2025-02-22 09:54:36 -06:00
Richard Berger
c63741a089 py-sphinx-rtd-dark-mode: add version 1.3.0 (#49136) 2025-02-22 08:31:13 -06:00
Andrey Perestoronin
4c99ffd81f new impi intel package 2021.14.2 release (#49114) 2025-02-22 08:02:53 -05:00
Nils Vu
1331332dcf libxsmm: update URL (#49155) 2025-02-22 02:04:49 -07:00
dmagdavector
910a4e6d22 slirp4netns: add v1.2.3, v1.3.1 (#48569) 2025-02-21 16:00:28 -08:00
Piotr Sacharuk
93f1ec20aa Update openturns versions (#48872) 2025-02-21 15:59:23 -08:00
Harmen Stoppels
9edbe5aed1 liburing: requires(...) (#49041)
* liburing: requires

* Update var/spack/repos/builtin/packages/liburing/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-21 15:45:54 -08:00
Krishna Chilleri
a574c7610b py-schema-salad: add v8.7.20241021092521 and py-mypy: add v1.12.1 (#49127)
* add new versions of dependencies

* modify pypi url for newest version

* add option for url depending on version number

* add version ranges of dependencies

* [@spackbot] updating style on behalf of kchilleri

* remove unnecessary py-cache-control version number
2025-02-21 16:07:32 -07:00
Alec Scott
4742f053af emacs: improve gui variant to cover both linux and macos (#49054)
* emacs: improve gui variant to cover both linux and macos

* emacs: fix optional deps type
2025-02-21 16:16:44 -06:00
Alec Scott
b06c5c7e81 fzf: fix go cache protection to allow delete (#49151) 2025-02-21 13:52:33 -07:00
Tobias Ribizel
03fa150185 typst: add v0.13.0 (#49134)
* typst: add version 0.13

* typst: fix version order

Co-authored-by: Alec Scott <hi@alecbcs.com>

* typst: more precise version requirements

* typst: use build_directory

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-21 11:25:50 -08:00
ByteHamster
b304a2d854 Fix installing rust@nightly (#49098)
Installing `rust@nightly` fails because the package file declares a conflict of rust versions older than `:1.64` with `gcc>=13`. However, because `nightly` is alphanumerically smaller than any actual version number, `nightly` is incorrectly detected to have a conflict with `gcc>=13` as well. Marking `nightly` as an infinity version instead solves this.
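
The ordering problem can be illustrated with a toy comparison (an illustration only, not Spack's actual version logic):

```python
import math

# A named version compared as a plain string sorts below numeric versions,
# so a conflict on ":1.64" (everything up to 1.64) wrongly catches "nightly".
def toy_key(v):
    if v == "nightly":            # treat as an "infinity" version
        return (math.inf,)
    return tuple(int(p) for p in v.split("."))

print(sorted(["nightly", "1.64", "1.85"], key=toy_key))
# ['1.64', '1.85', 'nightly']: nightly now sorts above every release
```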
2025-02-21 09:53:46 -08:00
Ryan Krattiger
1fa1864b37 Reproducer should deduce artifact root from concrete environment (#45281)
* Reproducer should deduce artifact root from concrete environment

* Add documentation on the layout of the artifacts directory

* Use dag hash in the container name

* Add reproducer options to improve local testing

* --use-local-head allows running reproducer with
  the current Spack HEAD commit rather than computing
  a commit for the reproducer

* Add test to verify commits and recreating reproduction environment

* Add test for non-merge commit case

* ci reproduce-build: Drop overwrite option
in favor of throwing an error if the working dir is non-empty
2025-02-21 10:46:43 -06:00
Harmen Stoppels
0da5bafaf2 Repo.packages_with_tags: do not construct a set of all packages (#49141) 2025-02-21 16:23:42 +01:00
Massimiliano Culpo
f4614a4931 Extract some package changes from compiler as deps (#49138) 2025-02-21 12:52:34 +01:00
Harmen Stoppels
b8ec69112f Extracted changes from 45189 (#49137)
Co-Authored-By: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-21 10:58:00 +01:00
Massimiliano Culpo
a3645fd372 Make BaseConfiguration pickleable (#47545)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-21 09:54:22 +01:00
dependabot[bot]
9bcd86071f build(deps): bump sphinx from 8.1.3 to 8.2.0 in /lib/spack/docs (#49118)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 8.1.3 to 8.2.0.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v8.1.3...v8.2.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 16:42:00 -08:00
Robert Maaskant
b126335800 go: add v1.24.0 (#49104)
* go-bootstrap: add v1.22.12

* go: add v1.24.0

* Reformat package, don't deprecate versions without an active CVE

---------

Co-authored-by: Alec Scott <scott112@llnl.gov>
2025-02-20 16:07:52 -08:00
Derek Ryan Strong
d32b6099b3 sw4: fix build options (#48774)
* Fix sw4 build options

* Update constraint to hdf5@1.14

* Change edit to setup_build_environment

* Use append_flags

* Fix style
2025-02-20 15:02:52 -07:00
Harmen Stoppels
3e8cb852b0 spec.py: use json.dumps directly to avoid hash breakage (#48884) 2025-02-20 17:39:07 +01:00
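The hash-stability concern behind the entry above can be illustrated generically (a sketch, not Spack's serialization code): a hash is only reproducible if the serialized bytes are.

```python
import hashlib, json

node = {"name": "zlib", "version": "1.3"}
# Fixed separators and key order make the byte stream, and therefore the
# digest, deterministic across runs and Python versions.
payload = json.dumps(node, separators=(",", ":"), sort_keys=True)
print(hashlib.sha256(payload.encode("utf-8")).hexdigest())
```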
Richard Berger
c8d7aa1772 lammps: fix pace link dep 2025-02-20 17:37:19 +01:00
Buldram
ec836d740f py-tensorflow: patch for v2.15 build errors (#49001)
* py-tensorflow: patch for v2.15 build errors with new compilers

* py-tensorflow: fix clang build and add clang version constraints

* py-tensorflow: use compiler wrapper

* py-tensorflow: relax clang conflict
2025-02-20 13:13:04 +01:00
Teague Sterling
cacdaaf3a9 bcftools: add v1.21, v1.20 (#49070)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-20 01:58:21 -07:00
Dave Keeshan
165e6b1d5e xnedit: new package (#41255) 2025-02-20 08:03:11 +01:00
Paul
70f5300cf2 Add new package for Jacamar CI (#48424) 2025-02-19 22:18:15 -06:00
Buldram
81e08167e2 nim: fix Musl build with new compilers (#48487)
* nim: fix build with new compilers

* narrow condition for disabling warnings

* move flags into offending module

disables warnings also for compiling projects other than the Nim compiler when necessary

* specify different versions pthread modules

* instead patch SysThread type

* adapt patch for old Nim versions

* Specify hypothetical `:@0.19.6` for patch version constraint
2025-02-19 22:16:12 -06:00
Alex Richert
e9d8c5767b crtm-fix: 3.1.1.2 (#48755)
* crtm-fix: 3.1.1.2

* correct checksum

* exclude test files

* Update package.py
2025-02-19 22:13:08 -06:00
Teague Sterling
4cefa973cd htslib: add v1.21 (#49056)
* Adding variants based on configure flags and an option to compile with PIC

* Adding GCS to htslib

* Revisions from review

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updating descriptions, fixing flags, fixing version and variant conditions

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* htslib: add v1.21

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-19 21:43:10 -06:00
Pruvost Florent
a2bd221ee4 chameleon: update to 1.3.0 (#49112) 2025-02-19 20:03:23 -07:00
Dom Heinzeller
adbb41c7df Bug fixes for fckit (disable finalization of DDTs) and ectrans (disable use of Fortran contiguous keyword) (#49111)
* In var/spack/repos/builtin/packages/ectrans/package.py, always set cmake argument ECTRANS_HAVE_CONTIGUOUS_ISSUE to turn off problematic use of Fortran 'contiguous' keyword
* In var/spack/repos/builtin/packages/ectrans/package.py, always set cmake argument ENABLE_FINAL=OFF to turn off problematic finalization of derived data types
* Update links to issues in fckit and ectrans
* Fix wrong cmake argument for ECTRANS_HAVE_CONTIGUOUS_ISSUE in var/spack/repos/builtin/packages/ectrans/package.py
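
In a CMake-based Spack package, forcing options like the ones listed above is conventionally done via `cmake_args`, using the stock `self.define` helper (a sketch; the actual package file may structure this differently):

```python
# Sketch for ectrans's package.py:
def cmake_args(self):
    return [
        self.define("ECTRANS_HAVE_CONTIGUOUS_ISSUE", True),
        self.define("ENABLE_FINAL", False),
    ]
```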
2025-02-19 20:03:05 -07:00
Joseph Wang
2554c7bd21 py-onnxruntime: add v1.18.0 -> v1.19.2 (#46329)
* py-onnxruntime: add new versions

* py-onnxruntime: add constraints

* py-onnxruntime: fix typo

* py-onnxruntime: fix style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 12:02:27 -08:00
Garth N. Wells
e274e855f1 py-nanobind: add v2.5.0 (#48953)
Also removes inactive maintainer.
2025-02-19 11:47:25 -08:00
Matthew L. Curry
e76ebf2cf7 xz: Work around ASM declaration issue with NVHPC (#49006)
This commit works around an issue described below that xz encounters
during compilation with nvhpc.

https://forums.developer.nvidia.com/t/problem-in-inline-assembly-when-using-multiple-asm-declarations/210952
2025-02-19 11:38:29 -08:00
Pranav Sivaraman
11ba5ebbcd jsoncons: new package (#49105) 2025-02-19 11:13:08 -08:00
Adam J. Stewart
53262b968b py-scikit-image: add v0.25.2 (#49101) 2025-02-19 11:12:07 -08:00
etiennemlb
39620085d4 Add new packages: PDI (and dependencies/plugins) (#48710)
* Add new packages: PDI
* Fix style and  typos
* License and pdi python version/shebang issue
* Version update
* 1.8.0 cutoff and dependency simplifications
* Remove unused guard
2025-02-19 09:56:03 -08:00
Krishna Chilleri
78c985fce4 hpc-beeflow: New package (#49036)
* hpc-beeflow: New package

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* add new version of py-fastapi

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 11:18:01 -06:00
Wouter Deconinck
f9e4d3898a cargo: avoid need to use super().build_args with std_build_args (#49071)
* cargo: avoid need to use super().build_args with std_build_args

* cargo: fix style

* jujutsu: avoid need for super().build_args
2025-02-19 09:01:56 -08:00
Lehman Garrison
75c3d0a053 py-yt: add 4.4.0 and dependencies (#47571)
* py-ewah-bool-utils: add new package

* py-extension-helpers: add 1.2.0

* py-regions: add new package

* py-erfa: add 2.0.1.5

* py-yt: add 4.4.0

* py-yt: respect build_jobs
2025-02-19 08:14:35 -07:00
Stephen Nicholas Swatman
6afe002c94 vecmem: fix SYCL compiler specification (#49108)
This commit adds an additional requirement to the vecmem package,
requiring the OneAPI compiler iff the `sycl` variant is turned on. This
allows us to correctly set the non-standard `SYCLCXX` environment
variable.
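
The hook for this in a package file is conventionally `setup_build_environment`; a hedged sketch follows (the exact value assigned is an assumption):

```python
# Sketch for vecmem's package.py:
def setup_build_environment(self, env):
    if self.spec.satisfies("+sycl"):
        # assumed: point SYCLCXX at the oneAPI C++ compiler
        env.set("SYCLCXX", self.compiler.cxx)
```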
2025-02-19 08:54:47 -06:00
Alexandre DENIS
f76e01707a mpi-sync-clocks: new package (#47834)
* mpi_sync_clocks: new package

* mpi_sync_clocks: move package to the right location

* mpi_sync_clocks: add copyright header

* mpi-sync-clocks: rename package mpi_sync_clocks -> mpi-sync-clocks to comply with naming convention

* mpi-sync-clocks: update copyright

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpi-sync-clocks: streamline autogen

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 08:32:04 -06:00
Alexandre DENIS
738ca8e2c2 mpibenchmark: new package (#47835)
* mpibenchmark: new package

* mpibenchmark: add copyright header

* mpibenchmark: move the package to the right location

* mpibenchmark: explicitly disable CUDA & ROCm

* mpibenchmark: update copyright

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: streamline management of --enable/--disable

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: streamline autogen

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: fix coding style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 08:31:15 -06:00
Seth R. Johnson
817df233fb g4vg/vecgeom: add version1.0.2 and patch cuda build failures (#49110)
* vecgeom: remove old patches and add patch for CUDA 11

* g4vg: add 1.0.2
2025-02-19 06:29:01 -07:00
Adam J. Stewart
5ea4d04450 py-scipy: add v1.15.2 (#49074) 2025-02-19 13:08:20 +01:00
Taillefumier Mathieu
49bf5a349e cp2k: fine graining control of the GPU modules (#48925)
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2025-02-19 09:29:31 +01:00
Robert Maaskant
2427b9649d ca-certificates-mozilla: add 2024-12-31 and deprecate older (#49096) 2025-02-19 00:34:18 -07:00
Robert Maaskant
0cec2c9fc6 glab: v1.52.0 and v1.53.0 (#49094) 2025-02-19 00:34:05 -07:00
Harmen Stoppels
c97be2a9d7 checksum.py tests: extract add_versions_to_pkg fixture (#49100) 2025-02-19 07:33:50 +00:00
Joe Schoonover
3fbdfc464b fluidnumerics-self: new package (#48636)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: fluidnumerics-joe <fluidnumerics-joe@users.noreply.github.com>
2025-02-19 00:23:38 -07:00
Buldram
c4449cb201 chez-scheme: new package (#49067)
* chez-scheme: new package

* Add separate zuo package, correct dep flags, disable threads and libffi by default

* zuo: add +big, parallelize build
2025-02-18 16:05:37 -06:00
Christian Heusel
1601193e12 likwid: Fix the perms script (#48666)
* likwid: Fix the perms script

The script loops over the path (which includes the prefix), but
additionally adds the prefix up front which results in duplicate paths
and a double "/" in the command like in the following one:

    chown root:root /opt/csg/spack/opt/spack/linux-debian12-zen2/gcc-12.2.0/likwid-5.4.1-xfc6quebnf2kosydl3ospaeoskxnxwhn//opt/csg/spack/opt/spack/linux-debian12-zen2/gcc-12.2.0/likwid-5.4.1-xfc6quebnf2kosydl3ospaeoskxnxwhn/sbin/likwid-accessD

Additionally the path is currently not quoted which can potentially
result in word splitting for weird paths.

Signed-off-by: Christian Heusel <christian@heusel.eu>

* likwid: Make the perm scripts' name unique

Also move it into the proper binary folder as per the Filesystem
Hierarchy Standard.

Signed-off-by: Christian Heusel <christian@heusel.eu>

---------

Signed-off-by: Christian Heusel <christian@heusel.eu>
2025-02-18 13:18:03 -07:00
Alex Richert
9c5b3ccb4e wgrib2: add cmake builder (#48447)
* wgrib2: add cmake builder

* wgrib2 add maintainer

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* style fixes; add tar.gz for old vers

* license update

* wgrib2: don't restrict openjpeg variant by version

* Update var/spack/repos/builtin/packages/wgrib2/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/wgrib2/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 13:12:42 -07:00
Robert Maaskant
68517389a0 rclone: add v1.68.2 v1.69.0 v1.69.1 (#49083) 2025-02-18 10:42:53 -08:00
Carson Woods
df5ad63331 py-loguru: add v0.7.0 -> v0.7.3 (#48268)
* Added new 4 new versions of py-loguru

* Added new build dependency

* Removed py-setuptools on versions beyond 0.7.2

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fixed erroneous dependency range on py-aiocontextvars

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 10:34:13 -07:00
Dave Keeshan
e79dc4422e libgpiod: new package (#47724)
* Initial check-in of the package file for libgpiod; this covers versions 1.6.3 through 2.2

* Update var/spack/repos/builtin/packages/libgpiod/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/libgpiod/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update how version to url is manage in the x.y.0 cases based on fix by @wdconinc

* Remove redundant url at the top since it is now inside url_for_version function

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 11:13:51 -06:00
Gregor Olenik
f0e5568a54 neofoam: new package (#47214)
* add neofoam package.py

* add Henning as second maintainer

* format file

* fixup! SPDX License entry

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/neofoam/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 11:13:43 -06:00
Laurent Chardon
1a1f0aa07b xios: update to v2.6 (#48680)
* xios: update to v2.6

- xios: update to v2.6
- xios: patch remap earcut.hpp for missing include file
- xios: address C++11 requirement
- xios: instruct to use external boost and blitz libraries

* xios: bump revision to 2714

- Bump revision to changeset 2714
- Deprecate older versions that I can't manage to compile
- Confirm clang workaround is still needed
2025-02-18 08:04:06 -07:00
chaoos
c4ea924977 quda: new package (#48939)
* add quda package

* [@spackbot] updating style on behalf of chaoos

* adjusted header comment

* addressing reviewers comments

* formatting, adjusted build types, added tags hep and lattice

* [@spackbot] updating style on behalf of chaoos

* adjusted preferred version

* [@spackbot] updating style on behalf of chaoos
2025-02-18 07:59:01 -07:00
Wouter Deconinck
57df23a51f libx*: add new versions of X packages (#49060) 2025-02-18 15:46:50 +01:00
Harmen Stoppels
97d66b637f Fix tests modifying package.py files (#49093) 2025-02-18 13:28:07 +01:00
Harmen Stoppels
92e1807672 petsc: fix can provide vs provides issue (#49077) 2025-02-18 13:07:42 +01:00
Harmen Stoppels
e4a8d45d86 views: resolve symlinked dir - dir conflict when same file (#49039)
A directory and a symlink to it under the same relative path in a
different prefix

```
/prefix1/dir/
/prefix1/dir/file
/prefix2/dir -> /prefix1/dir/
```

are not a blocker to create a view. The view structure simply looks like
this:

```
/view/dir/
/view/dir/file
```

This should be the case independently of the order in which we visit
prefixes, so we could in principle create views order independently.
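
The key test is whether both sources resolve to the same real directory, e.g. (sketch):

```python
import os

def same_underlying_dir(a, b):
    """True if both paths resolve to one real directory, in which case
    the view can materialize a single directory without conflict."""
    return os.path.realpath(a) == os.path.realpath(b)
```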
2025-02-18 06:53:17 +01:00
Wouter Deconinck
d6669845ed fjcontrib: add v1.055, v1.056, and v1.100 with patch (#49048) 2025-02-18 06:38:51 +01:00
Wouter Deconinck
60efada6a2 catch2: add v3.8.0 (#49052) 2025-02-18 06:37:23 +01:00
Wouter Deconinck
a093a65a25 armadillo: add v14.2.3 (#49051) 2025-02-18 06:33:09 +01:00
Wouter Deconinck
1524aceb9a gocryptfs: add v2.5.1 (#49063) 2025-02-18 06:32:25 +01:00
Wouter Deconinck
9d0766be48 hep: static_analysis: true (#49069) 2025-02-18 06:27:54 +01:00
Wouter Deconinck
7e89b3521a openloops: add v2.1.3, v2.1.4 (#49064) 2025-02-18 06:25:51 +01:00
Harmen Stoppels
2e372c53ab spec.py: remove Spec.virtual_dependencies (#49079) 2025-02-18 06:17:55 +01:00
Alec Scott
8639779002 pass: add master, improve styling (#49081) 2025-02-18 06:13:34 +01:00
Wouter Deconinck
a0e09139fc bioconductor-*: rm in favor of r-* copies (#49089) 2025-02-18 06:06:26 +01:00
Wouter Deconinck
b02ac87c55 apptainer/singularity/singularityCE: variant suid default False (#49088) 2025-02-18 05:26:00 +01:00
dependabot[bot]
a9da160160 build(deps): bump flake8 from 7.1.1 to 7.1.2 in /lib/spack/docs (#49087)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.1 to 7.1.2.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.1...7.1.2)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 20:08:05 -07:00
Alec Scott
f1678f4c7b bfs: add v4.0.5 (#49049)
* bfs: add v4.0.5, liburing: v2.4, v2.9

* Re-enable bfs on developer tools pipelines
2025-02-17 19:05:35 -06:00
dependabot[bot]
dae3b69f2c build(deps): bump flake8 in /.github/workflows/requirements/style (#49086)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.1 to 7.1.2.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.1...7.1.2)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 18:13:42 -06:00
Brian Vanderwende
cec7e6c4b5 GCC 14 needs C-standards workaround flags (#48019) 2025-02-17 16:33:16 -07:00
dmagdavector
5931236f55 py-nbconvert: add v7.14.2 to 7.16 (#47944) 2025-02-17 15:43:21 -07:00
Miranda Mundt
e695185770 py-pyomo: bump to 6.8.2; add new subdep package py-linear-tree (#48164)
* damaris: add v1.12.0, update maintainers (#48674)

Co-authored-by: Etienne Ndamlabin <jean-etienne.ndamlabin-mboula@inria.fr>

* Bump up the version for rocm-6.3.1 release (#48440)

This PR updates the versions for the rocm recipes for rocm-6.3.1 release.

* py-flash-attn: add missing triton dependency (#48645)

* hep stack: additional event generator packages (#48565)

* hep stack: additional event generator packages

* hep: adidtional packages

* hep: collier doesn't have +pic +shared

* py-awkward-cpp: fix scikit-build-core range of applicability

* hep: disable agile

* hep: disable garfieldpp and genie

* py-wxpython: depends_on pkgconfig even if using external wxwidgets

* hep: disable professor

* papi: fix error finding gmake during post-install testing (#48592)

* JAX: add v0.4.32+ (#46346)

* JAX: add v0.4.34

* Disable search for clang

* Update CUDA flags

* Add py-jax 0.4.33, comment out until py-jaxlib 0.4.33 is also released

* Fix GCC build

* Try TF_NVCC_CLANG

* py-jax: add v0.4.34

* jax no longer has separate tags for jaxlib

* Install compiled wheel

* Join path before glob

* Wheel is in spack stage, not tmp path

* Add 0.4.35

* Add newer versions

* Build system has been refactored yet again

* Drop clang

* Fix build with source tarball, rocm support

* Support GCC

* Remove clang-specific compiler flags

* enable_cuda flag was removed

* Fix logic

* py-jax: add v0.4.38

* Add patch to fix GCC support

* Patch no longer needed

* Skip patching, directly pass flags

* New flags

* Remove unused import

* Patch changed

* Use older version of patch

* Newer patch

* Add CUDA symlink

* Symlink more directories

* Recursive symlink

* Import function

* Recursive search

* Undo cuda changes

* Add v0.5.0

* I quit

* py-geemap: add new package (#48602)

* easi: add v1.5.1; relax yaml-cpp and lua requirements (#48675)

* thepeg: extend the rivet@:3 dependency up to version 2.3 (#48691)

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>

* py-ipyrad: adding version 0.9.102 (#48686)

Signed-off-by: Shane Nehring <snehring@iastate.edu>

* autodock-vina: adding version 1.2.6 (#48684)

Signed-off-by: Shane Nehring <snehring@iastate.edu>

* toybox: add v0.8.12 (#48657)

* Changes for NVIDIA HPC SDK 25.1 (#48696)

* update hypre version and add new memalign for petsc (#47831)

* petsc+rocm: add dependency on hipblas-common (#48644)

* spec.py: fix ArchSpec.intersects (#48741)

fixes a bug where `x86_64:` and `ppc64le:` intersected, and x86_64: and :haswell did not.

* ucx: adding 1.18.0 (#48742)

* Adding UCX 1.18.0
* Verified and correct hash.

* MAGMA: add v2.9.0 (#48750)

* Deprecate frontend/backend os/target (#47756)

* package api: drop wildcard re-export (#48760)

* package api: drop wildcard re-export

To ensure package repos stay forward/backward compatible with Spack,
we should explicitly export all symbols we want to expose in the public
package API, and drop `from spack.something import *` because
removal/addition to the public API will go unnoticed.

Also `llnl.util.filesystem` has some methods that shouldn't be exposed
in the package API, so better to enumerate a subset explicitly.

* remove flatten_dependencies / install_dependency_symlinks
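
For illustration, a minimal sketch of the explicit re-export pattern (a hypothetical selection of names, not the actual `spack.package` source):

```python
# enumerate the public API explicitly instead of relying on
# `from llnl.util.filesystem import *`, so that additions to or removals
# from the package API are always deliberate
from llnl.util.filesystem import mkdirp, touch, working_dir

__all__ = ["mkdirp", "touch", "working_dir"]
```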

* py-cmake: remove; remove deprecated cmake versions (#48763)

* Remove pipelines and images based on ppc64le (#48767)

* petsc: only conflict with kokkos@4.5: if it is enabled (#48698)

* hpctoolkit: Add `+docs` variant and manpages (#48566)

* py-mdit-py-plugins: Add new versions 0.3.5, 0.4.2
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
* py-myst-parser: Add new versions 0.19.0 to 4.0.0
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
* hpctoolkit: Add +docs variant and manpages
   This commit unconditionally enables manpages for the HPCToolkit tools.
   The new `+docs` variant enables additional documentation, specifically
   the user's manual. Both require new build-time dependencies.
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>

---------

Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>

* fmt: simplify +pic (#48766)

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

* Docs/bugfix: correct return for Adding flags to configure (#48434)

* binutils: conflict on configuration with build issues (#42949)

* Create SALT package.py (#48758)

* Create SALT package.py

Added a package for the SALT Source AnaLysis Toolkit
@zbeekman

* [@spackbot] updating style on behalf of wspear

* Update package.py

Line wrap

---------

Co-authored-by: wspear <wspear@users.noreply.github.com>

* smee-client: add v2.0.4 (#48384)

* CMake: add v3.31.5, v3.30.7 (#48759)

* builtin: remove redundant imports (#48765)

* builtin: remove redundant llnl.util.filesystem import
* remove redundant import spack.version
* unsorted fixes
* more spack.version

* abinit: pass flag correctly (#48788)

* Add py-zarr 3, which includes a new required package py-donfig, and a bug fix to the patch range with numcodecs (#48786)

* libxc: add CMake builder (#48772)

* libsmeagol

* libxc cmake

* cmake support

* revert changes

* make spackbot happy

* fix

* Update package.py

* hipblaslt: update cmake dependency (#48637)

* hipblaslt: update cmake dependency

1 error found in build log:
  >> 3    CMake Error at CMakeLists.txt:24 (cmake_minimum_required):
     4      CMake 3.25.2 or higher is required.  You are running version 3.22.1
     5
     6
     7    -- Configuring incomplete, errors occurred!

See build log for details:
  /scratch/svcpetsc/spack-rocm/spack-stage/spack-stage-hipblaslt-6.3.0-pabb7t4rheqkz74lfzbsnqi6vnpiqwlq/spack-build-out.txt

* Update var/spack/repos/builtin/packages/hipblaslt/package.py

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>

---------

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>

* Move from python2 compliant IOError and EnvironmentError to python3-only  OSError (#48764)

* IOError -> OSError

* also do EnvironmentError
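
Since Python 3.3, `IOError` and `EnvironmentError` are plain aliases of `OSError`, so this rename is behavior-preserving; a quick illustration:

```python
# both legacy names point at the same class in Python 3
assert IOError is OSError
assert EnvironmentError is OSError

try:
    open("/nonexistent/path")
except OSError:  # catches everything IOError/EnvironmentError used to
    pass
```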

* py-sphinx: mark Python compatibility (#48796)

* spack.package: wrap llnl.util.tty (#48793)

avoid import of llnl.util.tty in packages

* spack.package: re-export EnvironmentModifications / Prefix (#48792)

* Remove unused values (#48795)

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>

* petsc, py-petsc4py: add v3.22.3 (#48785)

* libsmeagol (#48776)

* libsmeagol

* add support for intel and add conflicts

* cp2k

* Bug Fix: Better incremental check for CMake (#48775)

* Bug Fix: Better incremental check for CMake

* Fix syntax error

* Ensure match of config artifact with generator

* add cdo@2.5.0 (#48801)

* Added salt variant to tau (#48782)

* Added salt variant to tau

* Update package.py

* [@spackbot] updating style on behalf of wspear

---------

Co-authored-by: wspear <wspear@users.noreply.github.com>

* Remove patch on main (#48798)

Patch got merged: https://github.com/natefoo/slurm-drmaa/pull/62

* kokkos-nvcc-wrapper: add version 4.5.00 and 4.5.01 (#48802)

* env create: create copies of relative include files in envs created from manifest (#48689)

Currently, environments created from manifest files with relative includes result in broken
references to config files.

This PR modifies `spack env create` to copy into the new environment any local
config files referenced by relative paths in the environment manifest passed as an init file.

This PR does not change the behavior if the include is an absolute path or if the include is from
a relative path outside the environment directory, but it does warn about missing relative includes if
they are inside the environment directory.

Includes regression test and short blurb in docs.
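
A toy sketch of the copy-on-create idea (a hypothetical helper; it ignores the outside-the-environment case described above):

```python
import os
import shutil
import warnings


def copy_relative_includes(manifest_dir, new_env_dir, include_paths):
    """Copy config files referenced by relative includes into the new env."""
    for inc in include_paths:
        if os.path.isabs(inc):
            continue  # absolute includes keep pointing at the original file
        src = os.path.join(manifest_dir, inc)
        dst = os.path.join(new_env_dir, inc)
        if not os.path.exists(src):
            warnings.warn(f"missing relative include: {inc}")
            continue
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.copy2(src, dst)
```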

* py-sphinx-rtd-theme: add v2.0.0, v3.0.0 (#48756)

* Add versions 2 and 3 of py-sphinx-rtd-theme.
   Allow for versions of py-sphinx greater than 6.
   Fix the Python version for older versions that depend on distutils.
   Get the py-docutils dependency from the py-sphinx recipe.
* Depend purely on the py-docutils dependency in py-sphinx.
* More refined dependency versioning.
* Fixed versioning for py-sphinx and py-docutils.

* CP2K: add 2025.1 version and DFTD4 support (#48489)

* cp2k: add dftd4 variant

* better conflict and make support

* typo

* Update var/spack/repos/builtin/packages/cp2k/package.py

* Update var/spack/repos/builtin/packages/cp2k/package.py

* oci/opener.py: respect system proxy settings (#48783)

* vtk-m: CMAKE_CXX_COMPILER is not a BOOL (#48813)

* gcc: remove --with-ld=ld-classic (#48826)

* nanotron: add new package (#48582)

* nanotron: add new package

Also, update some dependencies and add missing ones.

* Add variant +examples needed to execute example scripts

* fix: add missing branch attribute

* Remove master version

* fix: use Github hash

* embree: fix tests by building tutorial's embree_viewer for tests (#48392)

* import-check: improve how problematic imports are displayed (#48825)

The import-check action now presents problematic import statements
introduced by the PR better.

The idea is roughly:

* Let (V₁, E₁) be the graph of modules as vertices and import statements
  as edges before the change
* Let (V₂, E₂) be the graph after the code change, which is typically a small
  perturbation of (V₁, E₁).
* X₁ = FAS(V₁, E₁) is the feedback arc set before (a minimal set of edges to
  delete to make it acyclic)
* X₂ = FAS(V₂, E₂ ∖ X₁) is the feedback arc set after deletion of the minimal
  set of edges that made the old graph acyclic.
* X₃ = FAS(V₂, E₂) is the feedback arc set after

Previously I displayed X₁ and X₃ and users had to diff themselves.

Now, I'm showing X₂, which is a small set, typically directly related to
code changes.

However, it can happen that a small code change adding, say, 2 problematic imports
creates a completely different solution X₃ that requires deleting just 1
different import. In that case the user is informed that they can potentially do
less work.

So for PR #48784 the output is now:

> The overall number of problematic import statements increased by 1 from 31 to 32.
> This is likely a direct consequence of the following import statements:
> 
> ```
> spack/config imports: spack.spec, spack.util.path, spack.util.remote_file_cache
> ```
> 
> However, instead of removing 3 import statements, it is sufficient to remove only 1
> import statement from the following list:
> 
> ```
> spack/concretize imports: spack.bootstrap, spack.solver.asp
> spack/environment imports: spack.bootstrap, spack.environment
> spack/fetch_strategy imports: spack.version.git_ref_lookup
> spack/install_test imports: spack.build_environment, spack.package_base
> spack/modules imports: spack.modules
> spack/platforms imports: spack.config
> spack/relocate imports: spack.bootstrap
> spack/repo imports: spack.package_base, spack.patch, spack.tag
> spack/spec imports: spack.binary_distribution, spack.compiler, spack.compilers, spack.concretize, spack.environment, spack.hash_types, spack.provider_index, spack.repo, spack.spec_parser, spack.store, spack.traverse, spack.variant, spack.version.git_ref_lookup
> spack/subprocess_context imports: spack.environment
> spack/util/gpg imports: spack.bootstrap
> spack/util/package_hash imports: spack.package_base
> spack/util/path imports: spack.config, spack.environment
> spack/util/remote_file_cache imports: spack.util.web
> ```

from which the user can figure out that
`spack/util/remote_file_cache imports: spack.util.web` is the "bottleneck" now.
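
A greedy toy version of the FAS computation (using `networkx`; unlike the real import-check tool, this sketch makes no attempt to find a *minimal* set):

```python
import networkx as nx


def feedback_arc_set(graph):
    """Delete one edge per remaining cycle until the graph is acyclic."""
    g = graph.copy()
    removed = set()
    while True:
        try:
            cycle = nx.find_cycle(g)
        except nx.NetworkXNoCycle:
            return removed
        u, v = cycle[0]  # drop an arbitrary edge of the cycle
        g.remove_edge(u, v)
        removed.add((u, v))


before = nx.DiGraph([("a", "b"), ("b", "a")])
after = nx.DiGraph([("a", "b"), ("b", "a"), ("c", "d"), ("d", "c")])

x1 = feedback_arc_set(before)  # X₁ = FAS(V₁, E₁)
pruned = after.copy()
pruned.remove_edges_from(x1)   # E₂ ∖ X₁
x2 = feedback_arc_set(pruned)  # X₂: the small set tied to the code change
```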

* spack_yaml: use unambiguous variable name (#48832)

* style: fix `not in` and `is not` (#48831)

These are some changes that `ruff check --fix` would make that the current
`spack style` also agrees with.  Make the changes now so that the `ruff`
change is less disruptive.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
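
The two fixes in question, for reference:

```python
# before: legal, but reads as negating the result of the test
if not name in aliases:
    ...
if not value is None:
    ...

# after: the operator forms that ruff (rules E713/E714) and `spack style` prefer
if name not in aliases:
    ...
if value is not None:
    ...
```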

* Set version to v1.0.0.dev0 (#48791)

* import-check: enable color output (#48842)

* berkeleygw: add -o flag to tar extraction (#48816)

when extracting as the root user, prevent tar from attempting to change file ownership

* llvm: deprecate old patch releases (#48762)

* cuda: add v12.8 (#48708)

* sirius: patch pugixml (#48841)

* ziatest: add new package (#48809)

* gitlab: remove isc stacks (#48811)

* gcc: deprecate old patch releases (#48761)

* CP2K: use libxc@7 for master/next release (#48808)

* package_base.py: remove _patches_by_hash (#48768)

* cdash: avoid build_opener (#48846)

* g4vg: new package (#48844)

* g4vg: new package

* [@spackbot] updating style on behalf of sethrj

* r-dmrcate: add v2.16.0, v3.0.0, v3.2.0 (#48158)

* r-dmrcate: add new versions
* r-dmrcate: require `r@4.3.0` for v2.99.0+
* r-dmrcate: update dependencies

* py-tensorflow: add 2.18.0-rocm-enhanced (#48711)

* py-tensorflow: add 2.18.0-rocm-enhanced

* fix style

* fix style

* fix style

* review changes

* review changes

* remove hipblaslt dependency

* remove ci changes and force ROCm 6.3.1 for newest TF

* remove rocm 6.3.1 dependency

* simplify configure fix

* rocm-examples and rocjpeg: new packages (#47695)

* new package: rocm-examples
* add new package rocjpeg and update rocm-examples for 6.3.0
* fix licenses
* add versions 6.3.1
* change homepage and git
* add f-string

* Tesseract v5.5.0 (#48866)

* leptonica: adding v1.85.0
  Signed-off-by: Shane Nehring <snehring@iastate.edu>
* tesseract: adding v5.5.0
  Signed-off-by: Shane Nehring <snehring@iastate.edu>

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>

* py-geojson: Add new package  (#48847)

* Add new package py-geojson
* fix when

* isa-l: add v2.31.1 (#48859)

* Add new package func (#48849)

* icu4c: add v75.1, v76.1 (#48858)

* Add new package py-metis (#48848)

* cppgsl: add v4.1.0 (#48864)

* libdrm: add v2.4.124 (#48860)

* amrex: add v25.02 (#48853)

* sherpa: support cxxstd=20 when=@3: (#48829)

* sherpa: support cxxstd=20 when=@3:

* hep: sherpa cxxstd=20

* netlib-scalapack: Update version (#48667)

* Update scalapack version

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* use url_for_version

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* use spec.satisfies instead of version()

---------

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* fcgi: add v2.4.3, v2.4.4 (#48856)

* update pyproject.toml for `ruff format` (#48823)

Add ruff configuration to `pyproject.toml`.

This allows `ruff format` in the Spack repository to format all the files we care about, 
with our line length of 99, the exceptions we already put in place, and excluding things
we don't auto-format, like vendored dependencies.

Right now it'll reformat 175 or so files, but only slightly, in places where `ruff` differs from
`black`. For the most part I like the ruff format decisions better than `black`, but none of
the changes seem too severe.

This does not change `spack style` -- I figure that can come later but this at least will
let people start playing with `ruff`.

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* dd4hep: add v1.31 (#48850)

* libsm: add v1.2.5 (#48862)

* Apply workaround for oneAPI compiler for upcxx problem with a template argument list (#48843)

* Fix upcxx problem with  a template argument list is expected after a name prefixed by the template keyword

* Revert "Fix upcxx problem with  a template argument list is expected after a name prefixed by the template keyword"

This reverts commit faf9b8ce85.

* Apply workaround for oneAPI compiler

* style problem resolved

* use spec.satisfies syntax

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>

* Remove ISC stacks environment files (#48851)

Follow-up to #48811

* Remove variable from cmake.py (#48824)

* Remove variable from cmake.py

#48775 left a dangling variable that was not caught in CI but by the eyes of @haampie. Restructure variable to local method.

* [@spackbot] updating style on behalf of psakievich

* Update cmake.py

* Update lib/spack/spack/build_systems/cmake.py

* Update lib/spack/spack/build_systems/cmake.py

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>

* sirius: add v7.6.2 (#48797)

Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

* gha: standalone import-check (#48873)

* ci: add codecov token secret to coverage upload job (#48880)

Codecov needs to see the token secret when uploading, so we have to
add this line to the workflow YAML:

```yaml
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
```

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* ci: bump import-check (#48883)

* pfind: new package (#48685)

* py-elevation: new package (#48836)

* spec.py: fix hash change due to None vs {} (#48854)

* Fix hash change due to None vs {}

* Enforce null for empty list of external_modules

* nnn: new package (#46174)

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>

* Fix esmf usage, add new version (#48835)

* bash: add autotools dependencies (#48874)

* Scotch: add v7.0.6, add testing option (#48781)

* code-server: update to v4.96.4 (#48828)

* acts: add v39.0.0 (#48839)

This commit adds version 39.0.0 of the ACTS package which, as far as I
can tell, doesn't require any dependency updates.

* spec.py: ensure spec.extra_attributes is {} if is null in json (#48896)

* harfbuzz: add v10.2.0 (#48857)

* libice: add v1.1.2 (#48861)

* relocate.py: don't warn about symlinks (#48904)

`relocate_links` warns when the target is absolute and not matched by
any prefix from the prefix-to-prefix map.

This can lead to false positives, because the prefix-to-prefix map does
not contain trivial/identity entries whenever a package is installed to
its original location.

Since relocate_links is the odd one out there (we don't warn about
similar issues with rpaths, etc), just remove the warning.
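
A small illustration of the false positive (hypothetical paths):

```python
# the map only contains prefixes that actually moved; a package installed
# in place has no entry, so its perfectly valid absolute symlink targets
# matched nothing and triggered the old warning
prefix_to_prefix = {"/old/store/gcc-12": "/new/store/gcc-12"}
link_target = "/new/store/zlib-1.3/lib/libz.so"  # installed in place

if not any(link_target.startswith(old) for old in prefix_to_prefix):
    print("relocate_links used to warn here, although nothing needs relocation")
```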

* hip-tests: new package (#47273)

* hip-tests: add new package
* remove hip-tests from hip recipe
* remove old versions
* fix style
* add missing import
* bump hip-tests to 6.3.1
* fix style

* flecsi: new version 2.3.1 (#48867)

* flecsi: add new version 2.3.1, remove develop
* flecsi: remove kokkos and openmp variants moving forward
* flecsi: propagate cuda and rocm settings from kokkos
* Update var/spack/repos/builtin/packages/flecsi/package.py
   Co-authored-by: Davis Herring <herring@lanl.gov>
* flecsi: remove redundant depends_on lines
* flecsi: correct legion dependency
* flecsi: deprecate v2.0.0 and v2.1.0
* flecsi: force +openmp if ^kokkos+openmp

---------

Co-authored-by: Davis Herring <herring@lanl.gov>

* Update py-arch, py-statsmodels (add 0.14.1), py-patsy (add 0.5.4) to be able to use py-cython@3 (#48769)

* Add py-patsy@0.5.4
* Correct py-numpy dependency in py-arch
* Add py-statsmodels@0.14.1 and update dependencies
* Add climbfuji as maintainer for py-patsy
* Add climbfuji as maintainer for py-statsmodels
* Update var/spack/repos/builtin/packages/py-statsmodels/package.py

* enzyme: add v0.0.172 (#48881)

* spec.py: ensure == is false if true modulo precomputed dag hash (#48889)

* llvm: fix @15 %apple-clang@16 (#48887)

* spiner: add v1.6.3 (#48871)

* spiner: update package logic

* singularity-eos: remove spiner cuda_arch propagation

* spiner: add version 1.6.3

* sherpa: +hepmc3root only when +root (#48827)

* sherpa: +hepmc3root only when +root

* sherpa: fix style

* salt: add v0.3.0 (#48877)

* salt: Add v0.3.0 of SALT

This version contains important bug fixes for building and parsing
projects containing Fortran

* salt: Be more explicit about dependency types

 - llvm+clang+flang is needed at build, link and runtime for the
   correct operation of SALT
 - Testing with llvm@master ( llvm > 19.x ) shows that SALT is
   currently incompatible with the latest llvm API so an updated salt
   will be required when LLVM 20 is released

* openturbine: add new package (#48683)

* PyTorch: add v2.6.0 (#48794)

* mummer4: patching to allow building with %gcc@13: (#38292)

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* fms: add 2025.01, 2024.03 (#48812)

* py-shapely: add v2.0.7 (#48810)

* spectre: add v2025.01.30 (#48803)

Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>

* py-numba: Add version 0.61  (#48837)

* amdfftw: fix broken build, adjust flags for performance tuning (#48754)

With CFLAGS, the code path in the amdfftw build system will bypass the logic around AMD_ARCH.

---------

Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>

* justbuild: add v1.4.3 (#48898)

* nvpl-blas, nvpl-lapack: add v0.4.0.1, v0.3.0 (#48901)

* hep stack: build also with cuda and rocm where possible (#48528)

* lcio: add v2.22.4 (#48895)

* Add a message for CMake incremental build (#48905)

* Add a message for CMake incremental build

Requested message explaining that the CMake phase is being skipped.

* [@spackbot] updating style on behalf of psakievich

* Update import

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>

* dcap: depends_on libxcrypt (#48903)

* Add new version of r-curl (#48912)

* pika: Add 0.32.0 (#48897)

* icu4c: no cxxstd flag option on Windows (#48510)

* ICU4C: Don't reference a spec variant on a platform on which it's not defined

* icu4c: no cxx flag on Windows

* dla-future-fortran: add v0.3.0 (#48900)

* simgrid: add v3.36 (#48909)

* dyninst: cleanup package (#47637)

* Use more idiomatic construct, shorten recipe
* Remove deprecated versions, and associated patches
* Remove v10.0.0

* Windows: Update default config for stage location (#48511)

Current location is within the Spack prefix, which causes builds to
pollute VCS with stage artifacts and generally inflates the Spack
install prefix.

This PR moves it to the user cache location now that we can
consistently support paths with spaces on Windows.

* py-maturin: add v1.8.2 and refined dependencies (#48915)

* clingo-bootstrap: fix +optimized build (#48931)

* fix regression `apple-clang` vs `%apple-clang`
* use f-strings
* remove --verbose flag from LDFLAGS

* Fix regression due to dyninst update (#48935)

* trexio: fix issues with autotools build system (#48923)

* package_base.py: remove use_cray_compiler_names (#48932)

* Apply workarounds for oneAPI compiler for ascent problem with build (#48918)

* Apply workarounds for oneAPI compiler for ascent problem with build

* Use the way with use patch through the PR address

* stylecheck - missing comma

* libgcrypt: fix enforced -O0 (#48940)

Signed-off-by: Shane Nehring <snehring@iastate.edu>
Co-authored-by: Shane Nehring <snehring@iastate.edu>

* serialbox: add version 2.6.2 (#48937)

* nwchem: add master (#48919)

* Add possibility to build nwchem from master branch

* add oneapi@2025: patch for @7.2.3

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>

* Python: add new versions (#48950)

* Python: add new versions

* black

* reframe: add v4.6.4 -> v4.7.2 (#48242)

* go: add v1.23.6 (#48955)

* qmcpack: add v4.0.0 (#48921)

* py-einops: add v0.8.1 (#48954)

* flux-sched: add v0.42.1 (#48952)

Co-authored-by: github-actions <github-actions@users.noreply.github.com>

* Quantum ESPRESSO: add v7.4.1 (#48949)

* duckdb: add v1.2.0 (#48902)

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* mapl: add v2.53.1, v2.54.1 (#48944)

* log.py: remove setenv calls (#48933)

* nim: add v2.2.2 (#48929)

* Update GFE packages (#48899)

* OpenMPI: add version 4.1.8 (#48922)

Signed-off-by: Howard Pritchard <howardp@lanl.gov>

* extrae: tighten dependencies on boost for +dyninst (#48938)

* py-iterative-stats: add 0.1.1 (#48959)

* import-check: bump (#48968)

* ports-of-call: add v1.6.0, v1.7.0, v1.7.1 (#48870)

* acts dependencies: new versions as of 2025/02/10 (#48969)

This commit adds detray v0.88.0 and GeoModel v6.9.0.

* cbtf-krell: Update Boost dependency (#47133)

* Update Boost
* Add gotcha
* Add patch for build errors
* Allow building with latest Dyninst
* Fix patch url

* lua-sol2: Apply workaround for oneAPI compiler for problem with build (#48920)

* Bump up the version for rocm-6.3.2 release (#48787)

* Bump up the version for rocm-6.3.2 release

* rocm-openmp-extras update and style correction

* Updating mivisionx, omniperf, rccl & rocprofiler-systems

* Updating hipsparselt & rocm-opencl

* rocprofiler-systems on gcc-13 and rvs commit instead of patch

* Updated rocjpeg & rocm-examples for 6.3.2

* ROCPROFSYS_BUILD_DYNINST & DYNINST_BUILD_TBB are required only with gcc-13

---------

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>

* spack compiler find: detect `flang-new` and `flang` in newer LLVM versions (#48914)

* rivet: patch missing header in 3.1.10 (#48977)

* concretizer: reduce search space with static analysis (#48729)

Currently, when we setup the ASP problem for `clingo`, we don't take into account the configuration. This results in setting up ASP problems that are larger than necessary, with possibly redundant information, and higher concretization times. 

This PR tries to improve things by adding an opt-in feature that computes the _possible dependencies_ of a solve taking also into account the current configuration, and avoids adding possible dependencies that we are certain can't be in the final solution.

The feature can be activated with:
```yaml
concretizer:
  static_analysis: true
```

Examples of simple rules to discard dependencies are:
- Dependencies that are not buildable, and for which no binary is present (e.g. `cray-mpich` etc. on non Cray systems)
- Dependencies that are not for the current platform (e.g. `msmpi` on non Windows platforms)
- Conditional dependencies that cannot be activated, because of some user requirement (e.g. `cuda` etc. if the user requires `~cuda` in configuration)
- Virtual providers that cannot be used, because of a requirement on a virtual

The speed-up these rules seem to give depends on the use case at hand, but if the configuration is updated properly, it is noticeable.

In cases where there is no rule to exclude packages upfront, reuse is active, and this option is activated, a minor slowdown is possible; the feature has therefore been added as opt-in, and is turned off by default.

* spack.util.elf: catch seek errors (#48972)

* hep: rivet: require hepmc=3 (#48976)

* Fix performance issue on macOS (#48997)

archspec.cpu.host() is not memoized, so compute
it as little as possible.

---------

Co-authored-by: alalazo <alalazo@users.noreply.github.com>
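
A sketch of the standard mitigation on the caller side, memoizing the probe once per process (the `lru_cache` wrapper is an assumption for illustration, not what the commit does; the commit simply calls the function less often):

```python
import functools

import archspec.cpu


@functools.lru_cache(maxsize=1)
def host_microarchitecture():
    # archspec.cpu.host() re-probes the CPU on every call; caching it turns
    # repeated lookups in hot code paths into a dictionary hit
    return archspec.cpu.host()
```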

* style.py: fix false negative in redundant import statements (#48980)

* PyTorch: build flash attention by default, except in CI (#48521)

* PyTorch: build flash attention by default, except in CI

* Variant is boolean, only available when +cuda/+rocm

* desc -> _desc

* kokkos et al. : don't monkeypatch spec in callbacks (#48916)

Currently, a few packages using kokkos rely on
kokkos itself monkeypatching its own spec to
provide some attribute.

In this commit we change this attribute to be
defined on the package, and never be monkeypatched.

* gmake: add empty libs property, remove link deptypes from dependents (#48995)

* package_hash.py: move metadata_attrs inline out of package_base (#48981)

* gmake: fix def libs/headers (#49009)

* Spec.package_class -> spack.repo.PATH.get_pkg_class (#48985)

* libfabric: use the class variable to get the list of fabrics (#49007)

Suggested by: alalazo <alalazo@users.noreply.github.com>

Signed-off-by: Justin Cook <jscook@lbl.gov>

* py-transformers: add new versions (#49000)

* py-transformers: add new versions

* py-tokenizers: add new versions

* Apply suggestions from code review

* lcio: Add latest 2.22.5 tag (#48991)

* concretize.lp: don't warn about deprecation when external (#49008)

* Spec.is_virtual -> spack.repo.PATH.is_virtual (#48986)

* g4vg: add 'develop' branch (#49003)

* g4vg: add develop version

* celeritas: add develop version

* Fix style

* REVERTME: move celeritas changes to another branch

* Get full repo

* remove unneeded variable

* Remove spack.repo.PATH.is_virtual call from SpecBuildInterface (#48984)

This PR is effectively a breaking change extracted from #45189: it removes
support for spec["mpi"] when spec itself is openmpi / mpich (a package that
could provide mpi). From the Spec instance alone we don't know any parent
it provides mpi to, hence the lookup is a KeyError.

* Spec.validate_detection -> spack.detection.path.validate_detection (#48987)

* hep: add missing language dependencies (#48963)

* highfive: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* lhapdf: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* vc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989140

* davix: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14989131

* pandorasdk: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989130

* veccore: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989118

* pythia6: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989116

* jwt-cpp: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* collier: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* hepmc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989112

* clhep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989075

* fastjet: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14981340

* gosam-contrib: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14978873

* thepeg: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997553

* cepgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* podio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* pandoramonitoring: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* lcio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997513

* geant4: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997202

* evtgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996817

* apfel: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996779

* collier: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14996770

* vecgeom: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003840

* dd4hep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003839

* opendatadetector: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007666

* acts: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007827

* hepmc: remove dependency on fortran

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* thepeg: remove fortran dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* acts: add a conditional build dependency

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* opendatadetector: add comment to explain C dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
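
These additions follow Spack's language-dependency idiom, roughly (a simplified sketch, not the full hepmc recipe):

```python
from spack.package import *


class Hepmc(CMakePackage):
    # languages are modeled as build dependencies on virtual packages
    depends_on("c", type="build")
    depends_on("cxx", type="build")
    # the fortran dependency was dropped in this same series
```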

* apptainer: get_full_repo for branch main (#49002)

* unifyfs: Apply workaround for oneAPI compiler for problem with build (#48962)

* solver: add type-hints to OutputConfiguration (#48979)

* CI: ensure file path comparison uses posix paths (#47033)

Git always produces posix paths; ensure we are comparing apples to apples by normalizing paths to posix before comparing them with git output.

* postgresql: add v17.2 (#47811)

* postgresql: add version 17.2

* postgresql: install flex, bison and perl when building versions 17 and up

* postgresql: do not install perl by default when building versions 17 and up

* cray-mpich: adding partial GTL support (#45830)

cray-mpich now has a rocm variant. You can use gtl_lib in the
flag_handler like so:

```python
    def flag_handler(self, name, flags):
        wrapper_flags = []
        environment_flags = []
        build_system_flags = []

        if self.spec.satisfies("+rocm"):
            if self.spec.satisfies("^cray-mpich"):
                gtl_lib = self.spec["cray-mpich"].package.gtl_lib
                build_system_flags.extend(gtl_lib.get(name) or [])
            # hipcc is not wrapped, we need to pass the flags via the
            # build system.
            build_system_flags.extend(flags)

        return (wrapper_flags, environment_flags, build_system_flags)
```

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>

* lammps: use the Cray GTL (#46090)

* lammps: add 20241119 and 20250204 releases (#48978)

* plog: add new package (#48975)

* binutils: add debuginfod variant + update deps (#49011)

* h5hut: Remove H5_USE_110_API for newer versions (#48885)

* h5hut: Remove H5_USE_110_API for newer versions

* h5hut: style reformat and add maintainer

* h5hut: correct version syntax for <v2.x.x

* h5hut: Bump default version to 2.0.0rc7 and remove older rc candidates

* Update var/spack/repos/builtin/packages/h5hut/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

---------

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* spack debug create-db-tarball: remove after test failures (#49025)

* bubblewrap: add versions up to v0.11.0 (#49023)

* gftl: add v1.15.2 (#48992)

* py-torchgeo: pyvista dep has been removed (#48990)

* py-memray: add v1.15.0 (#48989)

* yosys: add v0.50 (#48983)

* laszip: Add version 3.4.4. (#48982)

* py-numpy: add v2.2.3 (#49029)

* pandora{pfa,sdk,monitoring}: add new versions and allow setting the C++ standard (#48300)

* pandoramonitoring: add v3.6.0; pandorapfa: add v4.11.2

Remove variables that are not being used in pandorasdk. Use the C++ standard
from ROOT when possible and pass -Wno-error to override the -Werror that will
typically fail with a new standard. Add a cxxstd variant for pandorasdk

* Fix style

* Update var/spack/repos/builtin/packages/pandorasdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix style

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* easi@1.5.2 (#49013)

* Put in maintainers for Kokkos Tools Spack package  (#49018)

* Kokkos Tools package.py: fix maintainers
* Kokkos Tools package.py: remove white space between first and second maintainer in comma-separated list
* [@spackbot] updating style on behalf of vlkale
* Correct maintainers syntax
   Co-authored-by: Richard Berger <richard.berger@outlook.com>

---------

Co-authored-by: vlkale <vlkale@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>

* zlib package: Ensure correct lib search on Windows (#48512)

* Name of zlib's library differs on Windows; also account for name
  differing when building +shared
* `zlib`'s `.libs` implementation was searching for the runtime
  libraries (the .dlls) and should be searching for link-time libs

* Update openfast, amr-wind, and nalu-wind packages (#48994)

* acts: conflicts ^geant4@11.3: when @:35 (#49028)

* yq: add versions 4.44.5 and 4.44.6 and 4.45.1 (#49027)

* openblas: .libs() uses self.libraries attribute (#48942)

Currently this is hardcoded to the same value as listed in the class
definition. If one ever overrides this attribute, such as:

```
packages:
  openblas:
    package_attributes:
      libraries: [ 'libopenblaso64' ]
```

this patch makes sure that override is also honored in the
`spec['openblas'].libs()` call (which happens in `hypre`, and likely
others).

( see
https://spack.readthedocs.io/en/latest/packages_yaml.html#assigning-package-attributes
)

Thanks to becker33 for debugging help in Slack
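
A minimal sketch of a `libs` property that honors the class attribute, and therefore any `package_attributes` override (not the full openblas recipe):

```python
from spack.package import *


class Openblas(MakefilePackage):
    libraries = ["libopenblas"]  # overridable via packages.yaml

    @property
    def libs(self):
        # search using self.libraries rather than a hardcoded name, so a
        # packages.yaml override of the attribute is respected
        return find_libraries(self.libraries, root=self.prefix, recursive=True)
```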

* RepoSplit/tests: update repo tests relying on builtin package repo to only use mock repos (#48926)

* RepoSplit/tests: update repo tests relying on builtin

* test_repo_last_mtime: skip on windows due to mtime issues in CI

* MesonPackage: depends_on pkgconfig (#46955)

meson's `dependency` function often uses pkg-config to locate a dependency, and may even fall back to cmake. The former case is very common, and since packagers often forget to add the tiny pkgconfig package as a build dep, we do it for them.
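
Conceptually the change amounts to one declaration in the base class (a sketch, not the real class hierarchy):

```python
from spack.package import *


class MesonPackage(Package):
    depends_on("meson", type="build")
    # meson's dependency() usually resolves deps via pkg-config (sometimes
    # falling back to cmake), so declare the tiny pkgconfig build dep once
    # here rather than in every recipe
    depends_on("pkgconfig", type="build")
```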

* Spec.__getitem__: restrict to direct deps + transitive runtime deps (#49016)

With this change spec["pkg"] searches only direct dependencies and transitive link/run
dependencies, ordered by depth. This avoids situations where we pick up unwanted 
deps of build/test deps. 

To reach those, you need to do spec["build_dep"]["pkg"] explicitly.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
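
For example, assuming a hypothetical DAG where `cmake` is a build-only dependency that itself pulls in `curl`:

```python
spec["cmake"]          # ok: cmake is a direct dependency of spec
spec["cmake"]["curl"]  # explicit: reach a dependency of a build dep
spec["curl"]           # KeyError now, unless curl is also a link/run dep
```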

* views: normalize paths on case insensitive file systems (#47370)

On macOS, prefix_a/file and prefix_b/FILE map to the same file view/file or view/FILE.

This commit ensures that we test whether a view is created on a case insensitive filesystem and handle projection conflicts accordingly.
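
One way to probe for this at runtime (a sketch, assuming write access to the view root):

```python
import os
import tempfile


def view_is_case_insensitive(view_root):
    # create a lowercase-named file and check whether the uppercased name
    # resolves to the same entry; True on default macOS (APFS) volumes
    with tempfile.NamedTemporaryFile(prefix="case_probe_", dir=view_root) as f:
        upper = os.path.join(view_root, os.path.basename(f.name).upper())
        return os.path.exists(upper)
```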

* Allow tuning max_dupes for build dependencies (#48948)

Up to now, Spack was allowing all build-tools that
may appear in the DAG to have 2 max_dupes.

This is not needed in practice for most of them,
and adding them out of caution just increases
grounding and concretization time.

This PR makes the value of max_dupes configurable
per package, and sets only a few known packages to
2 max_dupes by default.

In case user needs different values, they can
tune the configuration for their use case.

* vtk: fix 9.4.1 concretization (#48946)

* seacas: conflict 2024-06-27 with windows

* vtk: fix 9.4.1 seacas dependency

* cairo: add new version and update build system (#48822)

* update cairo for new meson build system

* update patch range. remove old conflict

* style

* update pango to reflect the changes in cairo

* refine depends

* style

* add lzo depends

* add +shared

* non self-referential variant requirement

* style

* Move +shared variant back to just autotools as meson automatically handles it

* clarify patch when=

* update based on reviews. switch from conflicts to requires to enforce variant synchronization

* refine conflicts and requires

* better group build deps together

* comment for meson build lower version bound

* clarifying comments

* clarify version ranges, enforce build_system with version ranges

* style

* cairo:  no need to require for build_systems

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* fzf: add v0.60.0, improve styling (#49059)

* fd: improve documentation and styling to help newer maintainers (#49058)

* direnv: add master, fix up package for better documentation (#49053)

* GDAL: add v3.10.2 (#49042)

* pbwt: new package (#49055)

* pbwt: add v2.1, v2.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* py-xarray-regrid: Add new package (#48834)

* Add py-xarray-regrid and required dep flox

* remove boiler plate

* Add missing py310 dep

* py-flox, py-xarray-regrid: add type=("build", "run") to python dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* build(deps): bump isort in /.github/workflows/requirements/style (#48746)

Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* build(deps): bump black from 24.10.0 to 25.1.0 in /lib/spack/docs (#48780)

Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* surfer: new package (#48432)

* Add waveform viewer, surfer for RTL simulations

* Ran black over the code following style check failure

* build(deps): bump black in /.github/workflows/requirements/style (#48779)

Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* new package: jujutsu (#48231)

* lis: add v2.0.28 -> v2.1.7 (#48308)

* Added LIS 2.1.7

* Added LIS versions from 2.0.28 to 2.1.7

* apply black v25.1.0 (#49076)

* build(deps): bump isort from 5.13.2 to 6.0.0 in /lib/spack/docs (#48747)

Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Spec.__contains__: restrict to direct build and transitive runtime deps (#49072)

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* mochi-margo/mochi-thallium: new versions (#49037)

* mochi-margo/mochi-thallium: new versions

* mochi-thallium: fixing style

* mochi-thallium: fixing required dependency on mochi-margo versions

* Add new recipe aotriton for rocm. (#49038)

* add new recipe aotriton for rocm, used for py-torch

* update the git info

* fix style error

* fix style error

* fix style error

* address review comments

* fix style error

* update maintainers (#48295)

* acts dependencies: new versions as of 2025/02/17 (#49073)

This commit adds detray v0.88.1, covfie v0.12.0 and v0.12.1, as well as
ACTS v37.1.0.

* magma: remove cuda_arch constraint on 2.9.0+ (#49019)

* plsm: new package (#48875)

* Adding plsm package

* [@spackbot] updating style on behalf of PhilipFackler

* Removing redundant text

* Add description

* Add blank line

* Add cuda_arch and update int64 handling

* gnutls: add v3.8.9 (#49062)

* gnutls: add v3.8.9

* gnutls: address super small nitpick

* gnutls: fix git url

* qemacs: add v6.4.1, fix +doc (#48722)

* py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies (#47926)

* py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies

* py-importlib-resources: add v5.13 to 6.4

* Revert "py-importlib-resources: add v5.13 to 6.4"

This reverts commit 1df208874c799b99dcfc43f13ae85f9324c59b52.

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Signed-off-by: Justin Cook <jscook@lbl.gov>
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Etienne Ndamlabin <88906611+endamlabin@users.noreply.github.com>
Co-authored-by: Etienne Ndamlabin <jean-etienne.ndamlabin-mboula@inria.fr>
Co-authored-by: Sreenivasa Murthy Kolam <sreenivasamurthy.kolam@amd.com>
Co-authored-by: Thomas Bouvier <contact@thomas-bouvier.io>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: G-Ragghianti <33492707+G-Ragghianti@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: David Schneller <12698011+davschneller@users.noreply.github.com>
Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: snehring <7978778+snehring@users.noreply.github.com>
Co-authored-by: Buldram <buldram@proton.me>
Co-authored-by: jmuddnv <143751186+jmuddnv@users.noreply.github.com>
Co-authored-by: Thomas-Ulrich <ulrich@geophysik.uni-muenchen.de>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Filippo Spiga <spiga.filippo@gmail.com>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Richard Berger <rberger@lanl.gov>
Co-authored-by: Jonathon Anderson <17242663+blue42u@users.noreply.github.com>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Eric Berquist <727571+berquist@users.noreply.github.com>
Co-authored-by: wspear <wspear@cs.uoregon.edu>
Co-authored-by: wspear <wspear@users.noreply.github.com>
Co-authored-by: Alec Scott <hi@alecbcs.com>
Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Chris Marsh <chrismarsh.c2@gmail.com>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>
Co-authored-by: psakievich <psakiev@sandia.gov>
Co-authored-by: Brian Spilner <Try2Code@users.noreply.github.com>
Co-authored-by: Dominic Hofer <6570912+dominichofer@users.noreply.github.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: danielsjensen1 <dsjense@sandia.gov>
Co-authored-by: Till Ehrengruber <till.ehrengruber@cscs.ch>
Co-authored-by: Henri Menke <henri@henrimenke.de>
Co-authored-by: pauleonix <paul.grosse-bley@ziti.uni-heidelberg.de>
Co-authored-by: Mosè Giordano <765740+giordano@users.noreply.github.com>
Co-authored-by: Zack Galbreath <zack.galbreath@kitware.com>
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Weiqun Zhang <WeiqunZhang@lbl.gov>
Co-authored-by: Taillefumier Mathieu <29380261+mtaillefumier@users.noreply.github.com>
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Piotr Sacharuk <107190444+PiotrSacharuk@users.noreply.github.com>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: psakievich <psakievich@users.noreply.github.com>
Co-authored-by: rfbgo <109985755+rfbgo@users.noreply.github.com>
Co-authored-by: Felix Thaler <thaler@cscs.ch>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: japlews <22622327+japlews@users.noreply.github.com>
Co-authored-by: George Young <A-N-Other@users.noreply.github.com>
Co-authored-by: Stephen Nicholas Swatman <stephen@v25.nl>
Co-authored-by: Davis Herring <herring@lanl.gov>
Co-authored-by: Dom Heinzeller <dom.heinzeller@icloud.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Izaak "Zaak" Beekman <contact@izaakbeekman.com>
Co-authored-by: ddement <ddement@gatech.edu>
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Matt Thompson <matthew.thompson@nasa.gov>
Co-authored-by: SXS Bot <31972027+sxs-bot@users.noreply.github.com>
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
Co-authored-by: AMD Toolchain Support <73240730+amd-toolchain-support@users.noreply.github.com>
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
Co-authored-by: Alberto Sartori <alberto.sartori@huawei.com>
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
Co-authored-by: Thomas Madlener <thomas.madlener@desy.de>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Vinícius <viniciusvgp@gmail.com>
Co-authored-by: Teague Sterling <teaguesterling@users.noreply.github.com>
Co-authored-by: Shane Nehring <snehring@iastate.edu>
Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>
Co-authored-by: Robert Mijakovic <robert.mijakovic@gmail.com>
Co-authored-by: Paul <bryantpj@ornl.gov>
Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
Co-authored-by: Vanessasaurus <814322+vsoch@users.noreply.github.com>
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: Howard Pritchard <howardp@lanl.gov>
Co-authored-by: jgraciahlrs <gracia@hlrs.de>
Co-authored-by: Fernando Ayats <ayatsfer@gmail.com>
Co-authored-by: Tim Haines <thaines.astro@gmail.com>
Co-authored-by: renjithravindrankannath <94420380+renjithravindrankannath@users.noreply.github.com>
Co-authored-by: Peter Scheibel <scheibel1@llnl.gov>
Co-authored-by: alalazo <alalazo@users.noreply.github.com>
Co-authored-by: Tara Drwenski <tdrwenski@users.noreply.github.com>
Co-authored-by: Justin Cook <jscook@lbl.gov>
Co-authored-by: jean-francois-sa <jleblancrichard@simplyanalytics.com>
Co-authored-by: etiennemlb <eti.malaboeuf@gmail.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>
Co-authored-by: Dmitri Smirnov <dmixsmi@gmail.com>
Co-authored-by: John Biddiscombe <biddisco@cscs.ch>
Co-authored-by: Dave Keeshan <96727608+davekeeshan@users.noreply.github.com>
Co-authored-by: Rémi Lacroix <remi.lacroix@idris.fr>
Co-authored-by: Vivek Kale <11766050+vlkale@users.noreply.github.com>
Co-authored-by: vlkale <vlkale@users.noreply.github.com>
Co-authored-by: Marc T. Henry de Frahan <marc.henrydefrahan@nrel.gov>
Co-authored-by: Robert Maaskant <RobertMaaskant@users.noreply.github.com>
Co-authored-by: Matthew Lesko <matthew.w.lesko@nasa.gov>
Co-authored-by: Paul Gessinger <hello@paulgessinger.com>
Co-authored-by: Vicente Bolea <vicente.bolea@kitware.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pranav Sivaraman <pranavsivaraman@gmail.com>
Co-authored-by: Filippo Barbari <filippo.barbari@gmail.com>
Co-authored-by: Matthieu Dorier <mdorier@anl.gov>
Co-authored-by: Nai-Yuan Chiang <sorakid507@gmail.com>
Co-authored-by: Cameron Rutherford <rcamruzz@amazon.com>
Co-authored-by: Philip Fackler <49726797+PhilipFackler@users.noreply.github.com>
Co-authored-by: dmagdavector <david.magda@vectorinstitute.ai>
2025-02-17 15:17:31 -07:00
dmagdavector
5356469ba5 py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies (#47926)
* py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies

* py-importlib-resources: add v5.13 to 6.4

* Revert "py-importlib-resources: add v5.13 to 6.4"

This reverts commit 1df208874c799b99dcfc43f13ae85f9324c59b52.
2025-02-17 13:17:31 -07:00
Buldram
605c3de633 qemacs: add v6.4.1, fix +doc (#48722) 2025-02-17 12:39:43 -06:00
Wouter Deconinck
45c4446b90 gnutls: add v3.8.9 (#49062)
* gnutls: add v3.8.9

* gnutls: address super small nitpick

* gnutls: fix git url
2025-02-17 09:44:00 -08:00
Philip Fackler
4ba6407cb8 plsm: new package (#48875)
* Adding plsm package

* [@spackbot] updating style on behalf of PhilipFackler

* Removing redundant text

* Add description

* Add blank line

* Add cuda_arch and update int64 handling
2025-02-17 12:20:29 -05:00
Cameron Rutherford
c221635c79 magma: remove cuda_arch constraint on 2.9.0+ (#49019) 2025-02-17 17:49:49 +01:00
Stephen Nicholas Swatman
46ff553ec2 acts dependencies: new versions as of 2025/02/17 (#49073)
This commit adds detray v0.88.1, covfie v0.12.0 and v0.12.1, as well as
ACTS v37.1.0.
2025-02-17 08:58:09 -06:00
Nai-Yuan Chiang
fcc85adc7f update maintainers (#48295) 2025-02-17 08:21:52 -06:00
Sreenivasa Murthy Kolam
da1ac0fdd4 Add new recipe aotriton for rocm. (#49038)
* add new recipe aotriton for rocm, used for py-torch

* update the git info

* fix style error

* fix style error

* fix style error

* address review comments

* fix style error
2025-02-17 08:20:36 -06:00
Matthieu Dorier
0accf26472 mochi-margo/mochi-thallium: new versions (#49037)
* mochi-margo/mochi-thallium: new versions

* mochi-thallium: fixing style

* mochi-thallium: fixing required dependency on mochi-margo versions
2025-02-17 08:18:22 -06:00
Harmen Stoppels
545750873e Spec.__contains__: restrict to direct build and transitive runtime deps (#49072)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-17 13:57:16 +01:00
dependabot[bot]
7d4523a9fc build(deps): bump isort from 5.13.2 to 6.0.0 in /lib/spack/docs (#48747)
Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 12:05:41 +01:00
Harmen Stoppels
754a64d1fe apply black v25.1.0 (#49076) 2025-02-17 11:42:12 +01:00
Filippo Barbari
b11578ed7c lis: add v2.0.28 -> v2.1.7 (#48308)
* Added LIS 2.1.7

* Added LIS versions from 2.0.28 to 2.1.7
2025-02-16 19:07:39 -07:00
Pranav Sivaraman
c80dcd8f84 new package: jujutsu (#48231) 2025-02-16 19:41:23 -06:00
dependabot[bot]
aaaf4477c9 build(deps): bump black in /.github/workflows/requirements/style (#48779)
Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-16 17:42:46 -07:00
Dave Keeshan
27f123efad surfer: new package (#48432)
* Add waveform viewer, surfer for RTL simulations

* Ran black over the code following style check failure
2025-02-16 17:54:18 -06:00
dependabot[bot]
2b52639032 build(deps): bump black from 24.10.0 to 25.1.0 in /lib/spack/docs (#48780)
Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-16 16:37:36 -07:00
dependabot[bot]
a472adf2cb build(deps): bump isort in /.github/workflows/requirements/style (#48746)
Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-16 16:27:35 -07:00
Chris Marsh
79972d7b57 py-xarray-regrid: Add new package (#48834)
* Add py-xarray-regrid and required dep flox

* remove boiler plate

* Add missing py310 dep

* py-flox, py-xarray-regrid: add type=("build", "run") to python dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-16 12:22:39 -07:00
Teague Sterling
0ffb61e215 pbwt: new package (#49055)
* pbwt: add v2.1, v2.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-16 12:08:50 -06:00
Adam J. Stewart
cdd261b63f GDAL: add v3.10.2 (#49042) 2025-02-16 11:48:30 -06:00
Alec Scott
900574ddb3 direnv: add master, fix up package for better documentation (#49053) 2025-02-16 11:35:47 -06:00
Alec Scott
6bc4af11f4 fd: improve documentation and styling to help newer maintainers (#49058) 2025-02-16 11:16:00 -06:00
Alec Scott
6d35a75c4f fzf: add v0.60.0, improve styling (#49059) 2025-02-16 11:13:53 -06:00
Chris Marsh
7e65c57861 cairo: add new version and update build system (#48822)
* update cairo for new meson build system

* update patch range. remove old conflict

* style

* update pango to reflect the changes in cairo

* refine depends

* style

* add lzo depends

* add +shared

* non self-referential variant requirement

* style

* Move +shared variant back to just autotools as meson automatically handles it

* clarify patch when=

* update based on reviews. switch from conflicts to requires to enforce variant synchronization

* refine conflicts and requires

* better group build deps together

* comment for meson build lower version bound

* clarifying comments

* clarify version ranges, enforce build_system with version ranges

* style

* cairo:  no need to require for build_systems

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-15 18:37:43 -07:00
Vicente Bolea
9213bf5919 vtk: fix 9.4.1 concretization (#48946)
* seacas: conflict 2024-06-27 with windows

* vtk: fix 9.4.1 seacas dependency
2025-02-14 16:14:04 -06:00
Massimiliano Culpo
ccd205bfeb Allow tuning max_dupes for build dependencies (#48948)
Up to now, Spack was allowing all build-tools that
may appear in the DAG to have 2 max_dupes.

This is not needed in practice for most of them,
and adding them out of caution just increases
grounding and concretization time.

This PR makes the value of max_dupes configurable
per package, and sets only a few known packages to
2 max_dupes by default.

In case user needs different values, they can
tune the configuration for their use case.
2025-02-14 14:25:12 +01:00
Paul Gessinger
114bd5744f views: normalize paths on case insensitive file systems (#47370)
On macOS, prefix_a/file and prefix_b/FILE map to the same file view/file or view/FILE.

This commit ensures that we test whether a view is created on a case insensitive filesystem and handle projection conflicts accordingly.
2025-02-14 09:35:40 +01:00
Harmen Stoppels
8ef5f1027a Spec.__getitem__: restrict to direct deps + transitive runtime deps (#49016)
With this change spec["pkg"] searches only direct dependencies and transitive link/run
dependencies, ordered by depth. This avoids situations where we pick up unwanted 
deps of build/test deps. 

To reach those, you need to do spec["build_dep"]["pkg"] explicitly.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-14 08:45:50 +01:00
Wouter Deconinck
36bc53ee07 MesonPackage: depends_on pkgconfig (#46955)
meson's `dependency` function often uses pkg-config to locate a dependency, and may even fall back to cmake. The former case is very common, and since packagers often forget to add the tiny pkgconfig package as a build dep, we do it for them.
2025-02-14 08:36:16 +01:00
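The net effect is roughly as if every Meson-based package carried the following declaration (a sketch, not the literal base-class code):

```python
class SomeMesonPackage(MesonPackage):
    # meson's dependency() resolves many deps through pkg-config,
    # so the build-system class now effectively adds this for you:
    depends_on("pkgconfig", type="build")
```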
Tamara Dahlgren
236b8fc009 RepoSplit/tests: update repo tests relying on builtin package repo to only use mock repos (#48926)
* RepoSplit/tests: update repo tests relying on builtin

* test_repo_last_mtime: skip on windows due to mtime issues in CI
2025-02-13 21:49:50 -08:00
Matthew Lesko
1a42bf043f openblas: .libs() uses self.libraries attribute (#48942)
Currently this is hardcoded to the same value as listed in the class
definition. If one ever overrides this attribute, such as:

```
packages:
  openblas:
    package_attributes:
      libraries: ["libopenblaso64"]
```

this patch makes sure that override is also honored in the
`spec['openblas'].libs()` call (which happens in `hypre`, and likely
others).

( see
https://spack.readthedocs.io/en/latest/packages_yaml.html#assigning-package-attributes
)

Thanks to becker33 for debugging help in Slack
2025-02-13 21:44:12 -08:00
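A sketch of a `libs` implementation that honors the override rather than a hardcoded name (simplified; the real package also handles shared/static and symbol-suffix details):

```python
from llnl.util.filesystem import find_libraries

class Openblas(Package):
    libraries = ["libopenblas"]  # packages.yaml may override this attribute

    @property
    def libs(self):
        # Search for self.libraries, not a literal string, so the
        # package_attributes override above takes effect.
        return find_libraries(self.libraries, root=self.prefix, recursive=True)
```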
Robert Maaskant
87cc3280b6 yq: add versions 4.44.5 and 4.44.6 and 4.45.1 (#49027) 2025-02-13 19:48:42 -07:00
Wouter Deconinck
7dc75d5f8c acts: conflicts ^geant4@11.3: when @:35 (#49028) 2025-02-13 19:33:33 -07:00
Marc T. Henry de Frahan
a1bff46435 Update openfast, amr-wind, and nalu-wind packages (#48994) 2025-02-13 19:28:11 -07:00
John W. Parent
12e0eb6178 zlib package: Ensure correct lib search on Windows (#48512)
* Name of zlib's library differs on Windows; also account for name
  differing when building +shared
* `zlib`'s `.libs` implementation was searching for the runtime
  libraries (the .dlls) and should be searching for link-time libs
2025-02-13 17:15:03 -07:00
Vivek Kale
0eb55a0b8f Put in maintainers for Kokkos Tools Spack package (#49018)
* Kokkos Tools package.py: fix maintainers
* Kokkos Tools package.py: remove white space between first and second maintainer in comma-separated list
* [@spackbot] updating style on behalf of vlkale
* Correct maintainers syntax
   Co-authored-by: Richard Berger <richard.berger@outlook.com>

---------

Co-authored-by: vlkale <vlkale@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>
2025-02-13 16:35:22 -07:00
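The corrected directive presumably boils down to a call along these lines (handles illustrative only):

```python
class KokkosTools(CMakePackage):
    # GitHub handles as separate string arguments, no embedded whitespace
    maintainers("vlkale", "rberger")
```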
Thomas-Ulrich
6925a53937 easi@1.5.2 (#49013) 2025-02-13 16:29:03 -07:00
Juan Miguel Carceller
e34f04df5e pandora{pfa,sdk,monitoring}: add new versions and allow setting the C++ standard (#48300)
* pandoramonitoring: add v3.6.0; pandorapfa: add v4.11.2

Remove variables that are not being used in pandorasdk. Use the C++ standard
from ROOT when possible and pass -Wno-error to override the -Werror that will
typically fail with a new standard. Add a cxxstd variant for pandorasdk

* Fix style

* Update var/spack/repos/builtin/packages/pandorasdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix style

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-13 16:09:40 -07:00
Adam J. Stewart
3fe13f0891 py-numpy: add v2.2.3 (#49029) 2025-02-13 15:37:08 -07:00
Rémi Lacroix
c8d244b621 laszip: Add version 3.4.4. (#48982) 2025-02-13 12:47:30 -08:00
Dave Keeshan
bc3132f2a9 yosys: add v0.50 (#48983) 2025-02-13 12:46:21 -08:00
Adam J. Stewart
b79d0bfc80 py-memray: add v1.15.0 (#48989) 2025-02-13 12:45:03 -08:00
Adam J. Stewart
f678e8af4d py-torchgeo: pyvista dep has been removed (#48990) 2025-02-13 12:43:25 -08:00
Matt Thompson
9985ecf6a7 gftl: add v1.15.2 (#48992) 2025-02-13 12:40:24 -08:00
Buldram
5e981797f5 bubblewrap: add versions up to v0.11.0 (#49023) 2025-02-13 06:20:29 -07:00
Harmen Stoppels
a7b542dd37 spack debug create-db-tarball: remove after test failures (#49025) 2025-02-13 13:10:28 +01:00
John Biddiscombe
4dc1a900e2 h5hut: Remove H5_USE_110_API for newer versions (#48885)
* h5hut: Remove H5_USE_110_API for newer versions

* h5hut: style reformat and add maintainer

* h5hut: correct version syntax for <v2.x.x

* h5hut: Bump default version to 2.0.0rc7 and remove older rc candidates

* Update var/spack/repos/builtin/packages/h5hut/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

---------

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2025-02-13 11:01:14 +01:00
Harmen Stoppels
8697371d82 binutils: add debuginfod variant + update deps (#49011) 2025-02-13 08:00:06 +01:00
Dmitri Smirnov
61899fcfc1 plog: add new package (#48975) 2025-02-13 07:54:00 +01:00
Richard Berger
9697c1934c lammps: add 20241119 and 20250204 releases (#48978) 2025-02-12 15:57:36 -08:00
etiennemlb
a011b49e1e lammps: use the Cray GTL (#46090) 2025-02-12 15:07:51 -07:00
etiennemlb
ae50757f3c cray-mpich: adding partial GTL support (#45830)
cray-mpich now has a rocm variant. You can use gtl_lib in the
flag_handler like so:

```python
    def flag_handler(self, name, flags):
        wrapper_flags = []
        environment_flags = []
        build_system_flags = []

        if self.spec.satisfies("+rocm"):
            if self.spec.satisfies("^cray-mpich"):
                gtl_lib = self.spec["cray-mpich"].package.gtl_lib
                build_system_flags.extend(gtl_lib.get(name) or [])
            # hipcc is not wrapped, we need to pass the flags via the
            # build system.
            build_system_flags.extend(flags)

        return (wrapper_flags, environment_flags, build_system_flags)
```

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>
2025-02-12 12:01:40 -08:00
jean-francois-sa
6f1dce95f9 postgresql: add v17.2 (#47811)
* postgresql: add version 17.2

* postgresql: install flex, bison and perl when building versions 17 and up

* postgresql: do not install perl by default when building versions 17 and up
2025-02-12 13:28:55 -06:00
John W. Parent
fd59d3e589 CI: ensure file path comparison uses posix paths (#47033)
Git always produces posix paths; ensure we're always comparing apples to apples by normalizing the paths we compare against git output to posix.
2025-02-12 12:14:16 -07:00
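A minimal sketch of the normalization, assuming `pathlib` is used for the conversion:

```python
import pathlib

def to_posix(path: str) -> str:
    # git prints forward-slash paths even on Windows, so normalize
    # native paths before comparing them with git's output
    return pathlib.PureWindowsPath(path).as_posix()

assert to_posix(r"var\spack\repos\builtin") == "var/spack/repos/builtin"
```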
Massimiliano Culpo
0172208c52 solver: add type-hints to OutputConfiguration (#48979) 2025-02-12 20:12:12 +01:00
Piotr Sacharuk
02c2516e88 unifyfs: Apply workaround for oneAPI compiler for problem with build (#48962) 2025-02-12 10:25:34 -08:00
Wouter Deconinck
a8e37ccbbb apptainer: get_full_repo for branch main (#49002) 2025-02-12 09:06:59 -08:00
Massimiliano Culpo
f0f463c8dc hep: add missing language dependencies (#48963)
* highfive: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* lhapdf: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* vc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989140

* davix: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14989131

* pandorasdk: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989130

* veccore: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989118

* pythia6: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989116

* jwt-cpp: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* collier: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* hepmc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989112

* clhep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989075

* fastjet: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14981340

* gosam-contrib: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14978873

* thepeg: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997553

* cepgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* podio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* pandoramonitoring: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* lcio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997513

* geant4: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997202

* evtgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996817

* apfel: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996779

* collier: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14996770

* vecgeom: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003840

* dd4hep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003839

* opendatadetector: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007666

* acts: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007827

* hepmc: remove dependency on fortran

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* thepeg: remove fortran dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* acts: add a conditional build dependency

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* opendatadetector: add comment to explain C dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-12 10:51:23 -06:00
Harmen Stoppels
a137da1cd5 Spec.validate_detection -> spack.detection.path.validate_detection (#48987) 2025-02-12 09:30:14 -07:00
Harmen Stoppels
03e972314f Remove spack.repo.PATH.is_virtual call from SpecBuildInterface (#48984)
This PR is effectively a breaking change extracted from #45189: it removes
support for spec["mpi"] when spec itself is openmpi / mpich, i.e. a package that
could provide mpi; from the Spec instance alone we don't know of any parent that
it provides mpi to, hence the lookup is a KeyError.
2025-02-12 17:17:45 +01:00
Seth R. Johnson
9a7a3d2743 g4vg: add 'develop' branch (#49003)
* g4vg: add develop version

* celeritas: add develop version

* Fix style

* REVERTME: move celeritas changes to another branch

* Get full repo

* remove unneeded variable
2025-02-12 09:04:42 -07:00
Harmen Stoppels
76f00a3659 Spec.is_virtual -> spack.repo.PATH.is_virtual (#48986) 2025-02-12 16:57:27 +01:00
Harmen Stoppels
cd98781fb4 concretize.lp: don't warn about deprecation when external (#49008) 2025-02-12 16:27:37 +01:00
Thomas Madlener
dd16f451fc lcio: Add latest 2.22.5 tag (#48991) 2025-02-12 08:26:12 -07:00
Buldram
4f6cd5abde py-transformers: add new versions (#49000)
* py-transformers: add new versions

* py-tokenizers: add new versions

* Apply suggestions from code review
2025-02-12 07:58:32 -07:00
Justin Cook
d7f05e08be libfabric: use the class variable to get the list of fabrics (#49007)
Suggested by: alalazo <alalazo@users.noreply.github.com>

Signed-off-by: Justin Cook <jscook@lbl.gov>
2025-02-12 14:07:06 +01:00
Harmen Stoppels
9747978c7f Spec.package_class -> spack.repo.PATH.get_pkg_class (#48985) 2025-02-12 11:52:04 +01:00
Harmen Stoppels
f043455ccc gmake: fix def libs/headers (#49009) 2025-02-12 11:07:55 +01:00
Harmen Stoppels
fb9d6427e6 package_hash.py: move metadata_attrs inline out of package_base (#48981) 2025-02-12 10:38:35 +01:00
Tara Drwenski
76e83e10c1 gmake: add empty libs property, remove link deptypes from dependents (#48995) 2025-02-12 10:16:30 +01:00
Massimiliano Culpo
af89bdf632 kokkos et al. : don't monkeypatch spec in callbacks (#48916)
Currently, a few packages using kokkos rely on
kokkos itself monkeypatching its own spec to
provide some attribute.

In this commit we change this attribute to be
defined on the package, so it is never monkeypatched.
2025-02-12 07:27:49 +01:00
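A sketch of the pattern change, with a hypothetical attribute name:

```python
class Kokkos(CMakePackage):
    @property
    def preferred_cxx(self):  # hypothetical name
        # Defined on the package class itself, so dependents can rely on it
        # without kokkos mutating (monkeypatching) its own Spec at runtime.
        return self.compiler.cxx

# In a dependent package, access goes through .package:
# cxx = self.spec["kokkos"].package.preferred_cxx
```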
Adam J. Stewart
46f5b192ef PyTorch: build flash attention by default, except in CI (#48521)
* PyTorch: build flash attention by default, except in CI

* Variant is boolean, only available when +cuda/+rocm

* desc -> _desc
2025-02-11 13:20:10 -08:00
Harmen Stoppels
18cd922aab style.py: fix false negative in redundant import statements (#48980) 2025-02-11 19:30:50 +01:00
Massimiliano Culpo
5518ad9611 Fix performance issue on macOS (#48997)
archspec.cpu.host() is not memoized, so compute
it as little as possible.

---------

Co-authored-by: alalazo <alalazo@users.noreply.github.com>
2025-02-11 18:54:32 +01:00
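One way to avoid the repeated detection cost, as a sketch (the PR itself simply reduces how often the function is called):

```python
import functools

import archspec.cpu

@functools.lru_cache(maxsize=None)
def cached_host():
    # archspec.cpu.host() re-detects the CPU on every call, which is
    # comparatively slow on macOS; the host won't change mid-process.
    return archspec.cpu.host()
```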
Wouter Deconinck
57a1807443 hep: rivet: require hepmc=3 (#48976) 2025-02-11 01:33:45 -07:00
Harmen Stoppels
3909308d5c spack.util.elf: catch seek errors (#48972) 2025-02-11 08:52:52 +01:00
Massimiliano Culpo
54210270c8 concretizer: reduce search space with static analysis (#48729)
Currently, when we set up the ASP problem for `clingo`, we don't take the configuration into account. This results in ASP problems that are larger than necessary, with possibly redundant information, and higher concretization times.

This PR tries to improve things by adding an opt-in feature that computes the _possible dependencies_ of a solve taking the current configuration into account as well, and avoids adding possible dependencies that we are certain can't be in the final solution.

The feature can be activated with:
```yaml
concretizer:
  static_analysis: true
```

Examples of simple rules to discard dependencies are:
- Dependencies that are not buildable, and for which no binary is present (e.g. `cray-mpich` etc. on non Cray systems)
- Dependencies that are not for the current platform (e.g. `msmpi` on non Windows platforms)
- Conditional dependencies that cannot be activated, because of some user requirement (e.g. `cuda` etc. if the user requires `~cuda` in configuration)
- Virtual providers that cannot be used, because of a requirement on a virtual

The speed-up these rules give depends on the use case at hand, but if the configuration is updated properly, it is noticeable.

In cases where there is no rule to exclude packages upfront, reuse is active, and this option is activated, a minor slowdown is possible; for this reason the feature is opt-in and turned off by default.
2025-02-11 08:44:20 +01:00
Wouter Deconinck
1a71bb046e rivet: patch missing header in 3.1.10 (#48977) 2025-02-11 07:40:37 +01:00
Peter Scheibel
dbd6857d32 spack compiler find: detect flang-new and flang in newer LLVM versions (#48914) 2025-02-11 07:34:28 +01:00
renjithravindrankannath
025bc24996 Bump up the version for rocm-6.3.2 release (#48787)
* Bump up the version for rocm-6.3.2 release

* rocm-openmp-extras update and style correction

* Updating mivisionx, omniperf, rccl & rocprofiler-systems

* Updating hipsparselt & rocm-opencl

* rocprofiler-systems on gcc-13 and rvs commit instead of patch

* Updated rocjpeg & rocm-examples for 6.3.2

* ROCPROFSYS_BUILD_DYNINST & DYNINST_BUILD_TBB are required only with gcc-13

---------

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>
2025-02-10 21:03:23 -08:00
Piotr Sacharuk
01e16b58a3 lua-sol2: Apply workaround for oneAPI compiler for problem with build (#48920) 2025-02-10 21:51:57 -07:00
Tim Haines
f71e202f24 cbtf-krell: Update Boost dependency (#47133)
* Update Boost
* Add gotcha
* Add patch for build errors
* Allow building with latest Dyninst
* Fix patch url
2025-02-10 13:05:05 -08:00
Stephen Nicholas Swatman
f7edd10c17 acts dependencies: new versions as of 2025/02/10 (#48969)
This commit adds detray v0.88.0 and GeoModel v6.9.0.
2025-02-10 13:27:54 -07:00
Richard Berger
153c0805dd ports-of-call: add v1.6.0, v1.7.0, v1.7.1 (#48870) 2025-02-10 11:55:59 -08:00
Harmen Stoppels
5d8517ef69 import-check: bump (#48968) 2025-02-10 20:48:59 +01:00
884 changed files with 17032 additions and 9393 deletions

View File

@@ -9,6 +9,7 @@ on:
branches:
- develop
- releases/**
merge_group:
concurrency:
group: ci-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -25,13 +26,17 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
if: ${{ github.event_name == 'push' }}
if: ${{ github.event_name == 'push' || github.event_name == 'merge_group' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# For merge group events, compare against the target branch (main)
base: ${{ github.event_name == 'merge_group' && github.event.merge_group.base_ref || '' }}
# For merge group events, use the merge group head ref
ref: ${{ github.event_name == 'merge_group' && github.event.merge_group.head_sha || github.ref }}
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
# Don't run if we only modified packages in the
# built-in repository or documentation
@@ -76,10 +81,11 @@ jobs:
prechecks:
needs: [ changes ]
uses: ./.github/workflows/valid-style.yml
uses: ./.github/workflows/prechecks.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
with_packages: ${{ needs.changes.outputs.packages }}
import-check:
needs: [ changes ]
@@ -93,7 +99,7 @@ jobs:
- name: Success
run: |
if [ "${{ needs.prechecks.result }}" == "failure" ] || [ "${{ needs.prechecks.result }}" == "canceled" ]; then
echo "Unit tests failed."
echo "Unit tests failed."
exit 1
else
exit 0
@@ -101,6 +107,7 @@ jobs:
coverage:
needs: [ unit-tests, prechecks ]
if: ${{ needs.changes.outputs.core }}
uses: ./.github/workflows/coverage.yml
secrets: inherit
@@ -113,10 +120,10 @@ jobs:
- name: Status summary
run: |
if [ "${{ needs.unit-tests.result }}" == "failure" ] || [ "${{ needs.unit-tests.result }}" == "canceled" ]; then
echo "Unit tests failed."
echo "Unit tests failed."
exit 1
elif [ "${{ needs.bootstrap.result }}" == "failure" ] || [ "${{ needs.bootstrap.result }}" == "canceled" ]; then
echo "Bootstrap tests failed."
echo "Bootstrap tests failed."
exit 1
else
exit 0

View File

@@ -39,7 +39,7 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
repository: haampie/circular-import-fighter
ref: e38bcd0aa46368e30648b61b7f0d8c1ca68aadff
ref: 4cdb0bf15f04ab6b49041d5ef1bfd9644cce7f33
path: circular-import-fighter
- name: Install dependencies
working-directory: circular-import-fighter

View File

@@ -1,4 +1,4 @@
name: style
name: prechecks
on:
workflow_call:
@@ -6,6 +6,9 @@ on:
with_coverage:
required: true
type: string
with_packages:
required: true
type: string
concurrency:
group: style-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -30,6 +33,7 @@ jobs:
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
@@ -53,12 +57,25 @@ jobs:
- name: Run style tests
run: |
share/spack/qa/run-style-tests
audit:
uses: ./.github/workflows/audit.yaml
secrets: inherit
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.13'
verify-checksums:
if: ${{ inputs.with_packages == 'true' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 2
- name: Verify Added Checksums
run: |
bin/spack ci verify-versions HEAD^1 HEAD
# Check that spack can bootstrap the development environment on Python 3.6 - RHEL8
bootstrap-dev-rhel8:
runs-on: ubuntu-latest

View File

@@ -1,7 +1,7 @@
black==24.10.0
black==25.1.0
clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.11.2
types-six==1.17.0.20241205
flake8==7.1.2
isort==6.0.1
mypy==1.15.0
types-six==1.17.0.20250304
vermin==1.6.0

.gitignore
View File

@@ -201,7 +201,6 @@ tramp
# Org-mode
.org-id-locations
*_archive
# flymake-mode
*_flymake.*

View File

@@ -43,6 +43,28 @@ concretizer:
# (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: minimal
# Maximum number of duplicates in a DAG, when using a strategy that allows duplicates. "default" is the
# number used if there isn't a more specific alternative
max_dupes:
default: 1
# Virtuals
c: 2
cxx: 2
fortran: 1
# Regular packages
cmake: 2
gmake: 2
python: 2
python-venv: 2
py-cython: 2
py-flit-core: 2
py-pip: 2
py-setuptools: 2
py-wheel: 2
xcb-proto: 2
# Compilers
gcc: 2
llvm: 2
# Option to specify compatibility between operating systems for reuse of compilers and packages
# Specified as a key: [list] where the key is the os that is being targeted, and the list contains the OS's
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
@@ -63,3 +85,7 @@ concretizer:
# Setting this to false yields unreproducible results, so we advise to use that value only
# for debugging purposes (e.g. check which constraints can help Spack concretize faster).
error_on_timeout: true
# Static analysis may reduce the concretization time by generating smaller ASP problems, in
# cases where there are requirements that prevent part of the search space from being explored.
static_analysis: false

View File

@@ -1761,19 +1761,24 @@ Verifying installations
The ``spack verify`` command can be used to verify the validity of
Spack-installed packages any time after installation.
^^^^^^^^^^^^^^^^^^^^^^^^^
``spack verify manifest``
^^^^^^^^^^^^^^^^^^^^^^^^^
At installation time, Spack creates a manifest of every file in the
installation prefix. For links, Spack tracks the mode, ownership, and
destination. For directories, Spack tracks the mode, and
ownership. For files, Spack tracks the mode, ownership, modification
time, hash, and size. The ``spack verify manifest`` command will check,
for every file in each package, whether any of those attributes have
changed. It will also check for newly added files or deleted files from
the installation prefix. Spack can either check all installed packages
using the ``-a,--all`` option, or accept specs listed on the command line to
verify.
The ``spack verify manifest`` command can also verify for individual files
that they haven't been altered since installation time. If the given file
is not in a Spack installation prefix, Spack will report that it is
not owned by any package. To check individual files instead of specs,
use the ``-f,--files`` option.
@@ -1788,6 +1793,22 @@ check only local packages (as opposed to those used transparently from
``upstream`` spack instances) and the ``-j,--json`` option to output
machine-readable json data for any errors.
^^^^^^^^^^^^^^^^^^^^^^^^^^
``spack verify libraries``
^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``spack verify libraries`` command can be used to verify that packages
do not have accidental system dependencies. This command scans the install
prefixes of packages for executables and shared libraries, and resolves
their needed libraries in their RPATHs. When needed libraries cannot be
located, an error is reported. This typically indicates that a package
was linked against a system library, instead of a library provided by
a Spack package.
This verification can also be enabled as a post-install hook by setting
``config:shared_linking:missing_library_policy`` to ``error`` or ``warn``
in :ref:`config.yaml <config-yaml>`.
-----------------------
Filesystem requirements
-----------------------

View File

@@ -223,6 +223,10 @@ def setup(sphinx):
("py:class", "spack.compiler.CompilerCache"),
# TypeVar that is not handled correctly
("py:class", "llnl.util.lang.T"),
("py:class", "llnl.util.lang.KT"),
("py:class", "llnl.util.lang.VT"),
("py:obj", "llnl.util.lang.KT"),
("py:obj", "llnl.util.lang.VT"),
]
# The reST default role (used for this markup: `text`) to use for all documents.

View File

@@ -125,6 +125,8 @@ are stored in ``$spack/var/spack/cache``. These are stored indefinitely
by default. Can be purged with :ref:`spack clean --downloads
<cmd-spack-clean>`.
.. _Misc Cache:
--------------------
``misc_cache``
--------------------
@@ -334,3 +336,52 @@ create a new alias called ``inst`` that will always call ``install -v``:
aliases:
inst: install -v
-------------------------------
``concretization_cache:enable``
-------------------------------
When set to ``true``, Spack will utilize a cache of solver outputs from
successful concretization runs. When enabled, Spack will check the concretization
cache prior to running the solver. If a previous request to solve a given
problem is present in the cache, Spack will load the concrete specs and other
solver data from the cache rather than running the solver. Specs not previously
concretized will be added to the cache on a successful solve. The cache additionally
holds solver statistics, so commands like ``spack solve`` will still return information
about the run that produced a given solver result.
This cache is a subcache of the :ref:`Misc Cache` and as such will be cleaned when the Misc
Cache is cleaned.
When ``false`` or omitted, all concretization requests will be performed from scratch.
----------------------------
``concretization_cache:url``
----------------------------
Path to the location where Spack will root the concretization cache. Currently this only supports
paths on the local filesystem.
Default location is under the :ref:`Misc Cache` at: ``$misc_cache/concretization``
------------------------------------
``concretization_cache:entry_limit``
------------------------------------
Sets a limit on the number of concretization results that Spack will cache. The limit is evaluated
after each concretization run; if Spack has stored more results than the limit allows, the
oldest concretization results are pruned until 10% of the limit has been removed.
Setting this value to 0 disables the automatic pruning. It is expected that users
will be responsible for maintaining this cache.
-----------------------------------
``concretization_cache:size_limit``
-----------------------------------
Sets a limit on the size of the concretization cache in bytes. The limit is evaluated
after each concretization run; if Spack has stored more data than the limit allows, the
oldest concretization results are pruned until 10% of the limit has been removed.
Setting this value to 0 disables the automatic pruning. It is expected that users
will be responsible for maintaining this cache.

View File

@@ -14,6 +14,7 @@ case you want to skip directly to specific docs:
* :ref:`compilers.yaml <compiler-config>`
* :ref:`concretizer.yaml <concretizer-options>`
* :ref:`config.yaml <config-yaml>`
* :ref:`include.yaml <include-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <packages-config>`

View File

@@ -457,6 +457,13 @@ developed package in the environment are concretized to match the
version (and other constraints) passed as the spec argument to the
``spack develop`` command.
When working deep in the graph it is often desirable to have multiple specs marked
as ``develop`` so you don't have to restage and/or do full rebuilds each time you
call ``spack install``. The ``--recursive`` flag can be used in these scenarios
to ensure that all the dependents of the initial spec you provide are also marked
as develop specs. The ``--recursive`` flag requires a pre-concretized environment
so the graph can be traversed from the supplied spec all the way to the root specs.
For packages with ``git`` attributes, git branches, tags, and commits can
also be used as valid concrete versions (see :ref:`version-specifier`).
This means that for a package ``foo``, ``spack develop foo@git.main`` will clone
@@ -670,24 +677,45 @@ This configuration sets the default compiler for all packages to
Included configurations
^^^^^^^^^^^^^^^^^^^^^^^
Spack environments allow an ``include`` heading in their yaml schema.
This heading pulls in external configuration files and applies them to
the environment.
.. code-block:: yaml
spack:
include:
- relative/path/to/config.yaml
- environment/relative/path/to/config.yaml
- https://github.com/path/to/raw/config/compilers.yaml
- /absolute/path/to/packages.yaml
- path: /path/to/$os/$target/environment
optional: true
- path: /path/to/os-specific/config-dir
when: os == "ventura"
Included configuration files are required *unless* they are explicitly optional
or the entry's condition evaluates to ``false``. Optional includes are specified
with the ``optional`` clause and conditional with the ``when`` clause. (See
:ref:`include-yaml` for more information on optional and conditional entries.)
Files are listed using paths to individual files or directories containing them.
Path entries may be absolute or relative to the environment or specified as
URLs. URLs to individual files need to link to the **raw** form of the file's
contents (e.g., `GitHub
<https://docs.github.com/en/repositories/working-with-files/using-files/viewing-and-understanding-files#viewing-or-copying-the-raw-file-content>`_
or `GitLab
<https://docs.gitlab.com/ee/api/repository_files.html#get-raw-file-from-repository>`_).
Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or schemes) are
supported. Spack-specific, environment and user path variables can be used.
(See :ref:`config-file-variables` for more information.)
.. warning::
Recursive includes are not currently processed in a breadth-first manner
so the value of a configuration option that is altered by multiple included
files may not be what you expect. This will be addressed in a future
update.
^^^^^^^^^^^^^^^^^^^^^^^^
Configuration precedence

View File

@@ -30,7 +30,7 @@ than always choosing the latest versions or default variants.
.. note::
As a rule of thumb: requirements + constraints > strong preferences > reuse > preferences > defaults.
The following set of criteria (from lowest to highest precedence) explain
common cases where concretization output may seem surprising at first.
@@ -56,7 +56,19 @@ common cases where concretization output may seem surprising at first.
concretizer:
reuse: dependencies # other options are 'true' and 'false'
3. :ref:`Strong preferences <package-strong-preferences>` configured in ``packages.yaml``
are higher priority than reuse, and can be used to strongly prefer a specific version
or variant, without erroring out if it's not possible. Strong preferences are specified
as follows:
.. code-block:: yaml
packages:
foo:
prefer:
- "@1.1: ~mpi"
4. :ref:`Package requirements <package-requirements>` configured in ``packages.yaml``,
and constraints from the command line as well as ``package.py`` files override all
of the above. Requirements are specified as follows:
@@ -66,6 +78,8 @@ common cases where concretization output may seem surprising at first.
foo:
require:
- "@1.2: +mpi"
conflicts:
- "@1.4"
Requirements and constraints restrict the set of possible solutions, while reuse
behavior and preferences influence what an optimal solution looks like.

View File

@@ -0,0 +1,51 @@
.. Copyright Spack Project Developers. See COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _include-yaml:
===============================
Include Settings (include.yaml)
===============================
Spack allows you to include configuration files through ``include.yaml``.
Using the ``include:`` heading results in pulling in external configuration
information to be used by any Spack command.
Included configuration files are required *unless* they are explicitly optional
or the entry's condition evaluates to ``false``. Optional includes are specified
with the ``optional`` clause and conditional with the ``when`` clause. For
example,
.. code-block:: yaml
include:
- /path/to/a/required/config.yaml
- path: /path/to/$os/$target/config
optional: true
- path: /path/to/os-specific/config-dir
when: os == "ventura"
shows all three. The first entry, ``/path/to/a/required/config.yaml``,
indicates that the included ``config.yaml`` file is required (so must exist).
Use of ``optional: true`` for ``/path/to/$os/$target/config`` means
the path is only included if it exists. The condition ``os == "ventura"``
in the ``when`` clause for ``/path/to/os-specific/config-dir`` means the
path is only included when the operating system (``os``) is ``ventura``.
The same conditions and variables in `Spec List References
<https://spack.readthedocs.io/en/latest/environments.html#spec-list-references>`_
can be used for conditional activation in the ``when`` clauses.
Included files can be specified by path or by their parent directory.
Paths may be absolute, relative (to the configuration file including the path),
or specified as URLs. Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or
schemes) are supported. Spack-specific, environment and user path variables
can be used. (See :ref:`config-file-variables` for more information.)
.. warning::
Recursive includes are not currently processed in a breadth-first manner
so the value of a configuration option that is altered by multiple included
files may not be what you expect. This will be addressed in a future
update.

View File

@@ -71,6 +71,7 @@ or refer to the full manual below.
configuration
config_yaml
include_yaml
packages_yaml
build_settings
environments

View File

@@ -486,6 +486,8 @@ present. For instance with a configuration like:
you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
.. _package-strong-preferences:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Conflicts and strong preferences
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -820,6 +820,69 @@ presence of a ``SPACK_CDASH_AUTH_TOKEN`` environment variable during the
build group on CDash called "Release Testing" (that group will be created if
it didn't already exist).
.. _ci_artifacts:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
CI Artifacts Directory Layout
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
When running the CI build using the command ``spack ci rebuild``, a number of directories are created for
storing data generated during the CI job. The default root directory for artifacts is ``job_scratch_root``.
This can be overridden by passing the argument ``--artifacts-root`` to the ``spack ci generate`` command
or by setting the ``SPACK_ARTIFACTS_ROOT`` environment variable in the build job scripts.
The top level directories under the artifact root are ``concrete_environment``, ``logs``, ``reproduction``,
``tests``, and ``user_data``. Spack does not restrict what is written to any of these directories, nor does
it require user-specified files to be written to any specific directory.
------------------------
``concrete_environment``
------------------------
The directory ``concrete_environment`` is used to communicate the ``spack ci generate``-processed ``spack.yaml`` and
the concrete ``spack.lock`` for the CI environment.
--------
``logs``
--------
The directory ``logs`` contains the spack build log, ``spack-build-out.txt``, and the spack build environment
modification file, ``spack-build-mod-env.txt``. Additionally, all files specified by the package's ``Builder``
property ``archive_files`` are also copied here (e.g. ``CMakeCache.txt`` in ``CMakeBuilder``).
----------------
``reproduction``
----------------
The directory ``reproduction`` is used to store the files needed by the ``spack reproduce-build`` command.
This includes ``repro.json``, copies of all of the files in ``concrete_environment``, the concrete spec
JSON file for the current spec being built, and all of the files written in the artifacts root directory.
The ``repro.json`` file is not versioned and is only designed to work with the version of spack CI was run with.
An example of what a ``repro.json`` may look like follows.
.. code:: json
{
"job_name": "adios2@2.9.2 /feaevuj %gcc@11.4.0 arch=linux-ubuntu20.04-x86_64_v3 E4S ROCm External",
"job_spec_json": "adios2.json",
"ci_project_dir": "/builds/spack/spack"
}
---------
``tests``
---------
The directory ``tests`` is used to store output from running ``spack test <job spec>``. This may or may not have
data in it depending on the package that was built and the availability of tests.
-------------
``user_data``
-------------
The directory ``user_data`` is used to store everything else that shouldn't be copied to the ``reproduction`` directory.
Users may use this to store additional logs or metrics or other types of files generated by the build job.
-------------------------------------
Using a custom spack in your pipeline
-------------------------------------

View File

@@ -1,13 +1,13 @@
sphinx==8.1.3
sphinx==8.2.3
sphinxcontrib-programoutput==0.18
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.2
python-levenshtein==0.26.1
python-levenshtein==0.27.1
docutils==0.21.2
pygments==2.19.1
urllib3==2.3.0
pytest==8.3.4
isort==5.13.2
black==24.10.0
flake8==7.1.1
pytest==8.3.5
isort==6.0.1
black==25.1.0
flake8==7.1.2
mypy==1.11.1

View File

@@ -7,6 +7,7 @@
import fnmatch
import glob
import hashlib
import io
import itertools
import numbers
import os
@@ -20,6 +21,7 @@
from contextlib import contextmanager
from itertools import accumulate
from typing import (
IO,
Callable,
Deque,
Dict,
@@ -2454,26 +2456,69 @@ class WindowsSimulatedRPath:
and vice versa.
"""
def __init__(self, package, link_install_prefix=True):
def __init__(
self,
package,
base_modification_prefix: Optional[Union[str, pathlib.Path]] = None,
link_install_prefix: bool = True,
):
"""
Args:
package (spack.package_base.PackageBase): Package requiring links
base_modification_prefix (str|pathlib.Path): Path representation indicating
the root directory in which to establish the simulated rpath, ie where the
symlinks that comprise the "rpath" behavior will be installed.
Note: This is a mutually exclusive option with `link_install_prefix` using
both is an error.
Default: None
link_install_prefix (bool): Link against package's own install or stage root.
Packages that run their own executables during build and require rpaths to
the build directory during build time require this option. Default: install
the build directory during build time require this option.
Default: install
root
Note: This is a mutually exclusive option with `base_modification_prefix`, using
both is an error.
"""
self.pkg = package
self._addl_rpaths = set()
self._addl_rpaths: set[str] = set()
if link_install_prefix and base_modification_prefix:
raise RuntimeError(
"Invalid combination of arguments given to WindowsSimulated RPath.\n"
"Select either `link_install_prefix` to create an install prefix rpath"
" or specify a `base_modification_prefix` for any other link type. "
"Specifying both arguments is invalid."
)
if not (link_install_prefix or base_modification_prefix):
raise RuntimeError(
"Insufficient arguments given to WindowsSimulatedRpath.\n"
"WindowsSimulatedRPath requires one of link_install_prefix"
" or base_modification_prefix to be specified."
" Neither was provided."
)
self.link_install_prefix = link_install_prefix
self._additional_library_dependents = set()
if base_modification_prefix:
self.base_modification_prefix = pathlib.Path(base_modification_prefix)
else:
self.base_modification_prefix = pathlib.Path(self.pkg.prefix)
self._additional_library_dependents: set[pathlib.Path] = set()
if not self.link_install_prefix:
tty.debug(f"Generating rpath for non install context: {base_modification_prefix}")
@property
def library_dependents(self):
"""
Set of directories where package binaries/libraries are located.
"""
return set([pathlib.Path(self.pkg.prefix.bin)]) | self._additional_library_dependents
base_pths = set()
if self.link_install_prefix:
base_pths.add(pathlib.Path(self.pkg.prefix.bin))
base_pths |= self._additional_library_dependents
return base_pths
def add_library_dependent(self, *dest):
"""
@@ -2489,6 +2534,12 @@ def add_library_dependent(self, *dest):
new_pth = pathlib.Path(pth).parent
else:
new_pth = pathlib.Path(pth)
path_is_in_prefix = new_pth.is_relative_to(self.base_modification_prefix)
if not path_is_in_prefix:
raise RuntimeError(
f"Attempting to generate rpath symlink out of rpath context:\
{str(self.base_modification_prefix)}"
)
self._additional_library_dependents.add(new_pth)
@property
@@ -2577,6 +2628,33 @@ def establish_link(self):
self._link(library, lib_dir)
def make_package_test_rpath(pkg, test_dir: Union[str, pathlib.Path]):
"""Establishes a temp Windows simulated rpath for the pkg in the testing directory
so an executable can test the libraries/executables with proper access
to dependent dlls
Note: this is a no-op on all other platforms besides Windows
Args:
pkg (spack.package_base.PackageBase): the package for which the rpath should be computed
test_dir: the testing directory in which we should construct an rpath
"""
# link_install_prefix as false ensures we're not linking into the install prefix
mini_rpath = WindowsSimulatedRPath(pkg, link_install_prefix=False)
# add the testing directory as a location to install rpath symlinks
mini_rpath.add_library_dependent(test_dir)
# check for whether build_directory is available, if not
# assume the stage root is the build dir
build_dir_attr = getattr(pkg, "build_directory", None)
build_directory = build_dir_attr if build_dir_attr else pkg.stage.path
# add the build dir & build dir bin
mini_rpath.add_rpath(os.path.join(build_directory, "bin"))
mini_rpath.add_rpath(os.path.join(build_directory))
# construct rpath
mini_rpath.establish_link()
@system_path_filter
@memoized
def can_access_dir(path):
@@ -2805,6 +2883,20 @@ def keep_modification_time(*filenames):
os.utime(f, (os.path.getatime(f), mtime))
@contextmanager
def temporary_file_position(stream):
orig_pos = stream.tell()
yield
stream.seek(orig_pos)
@contextmanager
def current_file_position(stream: IO[str], loc: int, relative_to=io.SEEK_CUR):
with temporary_file_position(stream):
stream.seek(loc, relative_to)
yield
@contextmanager
def temporary_dir(
suffix: Optional[str] = None, prefix: Optional[str] = None, dir: Optional[str] = None
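As a quick sanity check of the new ``temporary_file_position`` / ``current_file_position`` helpers introduced above (assuming they behave as sketched in this hunk):

```python
import io

stream = io.StringIO("0123456789")
stream.seek(7)
with current_file_position(stream, 2, io.SEEK_SET):
    assert stream.read(3) == "234"  # read at the requested offset
assert stream.tell() == 7           # original position is restored
```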

View File

@@ -11,10 +11,11 @@
import re
import sys
import traceback
import types
import typing
import warnings
from datetime import datetime, timedelta
from typing import Callable, Dict, Iterable, List, Tuple, TypeVar
from typing import Callable, Dict, Iterable, List, Mapping, Optional, Tuple, TypeVar
# Ignore emacs backups when listing modules
ignore_modules = r"^\.#|~$"
@@ -707,14 +708,24 @@ def __init__(self, wrapped_object):
class Singleton:
"""Simple wrapper for lazily initialized singleton objects."""
"""Wrapper for lazily initialized singleton objects."""
def __init__(self, factory):
def __init__(self, factory: Callable[[], object]):
"""Create a new singleton to be inited with the factory function.
Most factories will simply create the object to be initialized and
return it.
In some cases, e.g. when bootstrapping some global state, the singleton
may need to be initialized incrementally. If the factory returns a generator
instead of a regular object, the singleton will assign each result yielded by
the generator to the singleton instance. This allows methods called by
the factory in later stages to refer back to the singleton.
Args:
factory (function): function taking no arguments that
creates the singleton instance.
factory (function): function taking no arguments that creates the
singleton instance.
"""
self.factory = factory
self._instance = None
@@ -722,7 +733,16 @@ def __init__(self, factory):
@property
def instance(self):
if self._instance is None:
self._instance = self.factory()
instance = self.factory()
if isinstance(instance, types.GeneratorType):
# if it's a generator, assign every value
for value in instance:
self._instance = value
else:
# if not, just assign the result like a normal singleton
self._instance = instance
return self._instance
def __getattr__(self, name):
@@ -1080,3 +1100,88 @@ def __set__(self, instance, value):
def factory(self, instance, owner):
raise NotImplementedError("must be implemented by derived classes")
KT = TypeVar("KT")
VT = TypeVar("VT")
class PriorityOrderedMapping(Mapping[KT, VT]):
"""Mapping that iterates over key according to an integer priority. If the priority is
the same for two keys, insertion order is what matters.
The priority is set when the key/value pair is added. If not set, the highest current priority
is used.
"""
_data: Dict[KT, VT]
_priorities: List[Tuple[int, KT]]
def __init__(self) -> None:
self._data = {}
# Tuple of (priority, key)
self._priorities = []
def __getitem__(self, key: KT) -> VT:
return self._data[key]
def __len__(self) -> int:
return len(self._data)
def __iter__(self):
yield from (key for _, key in self._priorities)
def __reversed__(self):
yield from (key for _, key in reversed(self._priorities))
def reversed_keys(self):
"""Iterates over keys from the highest priority, to the lowest."""
return reversed(self)
def reversed_values(self):
"""Iterates over values from the highest priority, to the lowest."""
yield from (self._data[key] for _, key in reversed(self._priorities))
def _highest_priority(self) -> int:
if not self._priorities:
return 0
result, _ = self._priorities[-1]
return result
def add(self, key: KT, *, value: VT, priority: Optional[int] = None) -> None:
"""Adds a key/value pair to the mapping, with a specific priority.
If the priority is None, then it is assumed to be the highest priority value currently
in the container.
Raises:
ValueError: when the same priority is already in the mapping
"""
if priority is None:
priority = self._highest_priority()
if key in self._data:
self.remove(key)
self._priorities.append((priority, key))
# We rely on sort being stable
self._priorities.sort(key=lambda x: x[0])
self._data[key] = value
assert len(self._data) == len(self._priorities)
def remove(self, key: KT) -> VT:
"""Removes a key from the mapping.
Returns:
The value associated with the key being removed
Raises:
KeyError: if the key is not in the mapping
"""
if key not in self._data:
raise KeyError(f"cannot find {key}")
popped_item = self._data.pop(key)
self._priorities = [(p, k) for p, k in self._priorities if k != key]
assert len(self._data) == len(self._priorities)
return popped_item
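A small usage sketch of ``PriorityOrderedMapping``, based on the semantics above (lower priority values iterate first; omitting the priority reuses the current highest):

```python
m = PriorityOrderedMapping()
m.add("mid", value=1, priority=10)
m.add("late", value=2, priority=20)
m.add("early", value=3, priority=0)
m.add("latest", value=4)  # defaults to the highest priority seen (20)

assert list(m) == ["early", "mid", "late", "latest"]
assert list(m.reversed_values()) == [4, 2, 1, 3]
```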

View File

@@ -41,6 +41,16 @@ def __init__(self, dst, src_a=None, src_b=None):
self.src_a = src_a
self.src_b = src_b
def __repr__(self) -> str:
return f"MergeConflict(dst={self.dst!r}, src_a={self.src_a!r}, src_b={self.src_b!r})"
def _samefile(a: str, b: str):
try:
return os.path.samefile(a, b)
except OSError:
return False
class SourceMergeVisitor(BaseDirectoryVisitor):
"""
@@ -50,9 +60,14 @@ class SourceMergeVisitor(BaseDirectoryVisitor):
- A list of merge conflicts in dst/
"""
def __init__(self, ignore: Optional[Callable[[str], bool]] = None):
def __init__(
self, ignore: Optional[Callable[[str], bool]] = None, normalize_paths: bool = False
):
self.ignore = ignore if ignore is not None else lambda f: False
# On case-insensitive filesystems, normalize paths to detect duplications
self.normalize_paths = normalize_paths
# When mapping <src root> to <dst root>/<projection>, we need to prepend the <projection>
# bit to the relative path in the destination dir.
self.projection: str = ""
@@ -71,10 +86,88 @@ def __init__(self, ignore: Optional[Callable[[str], bool]] = None):
# and can run mkdir in order.
self.directories: Dict[str, Tuple[str, str]] = {}
# If the visitor is configured to normalize paths, keep a map of
# normalized path to: original path, root directory + relative path
self._directories_normalized: Dict[str, Tuple[str, str, str]] = {}
# Files to link. Maps dst_rel to (src_root, src_rel). This is an ordered dict, where files
# are guaranteed to be grouped by src_root in the order they were visited.
self.files: Dict[str, Tuple[str, str]] = {}
# If the visitor is configured to normalize paths, keep a map of
# normalized path to: original path, root directory + relative path
self._files_normalized: Dict[str, Tuple[str, str, str]] = {}
def _in_directories(self, proj_rel_path: str) -> bool:
"""
Check if a path is already in the directory list
"""
if self.normalize_paths:
return proj_rel_path.lower() in self._directories_normalized
else:
return proj_rel_path in self.directories
def _directory(self, proj_rel_path: str) -> Tuple[str, str, str]:
"""
Get the directory that is mapped to a path
"""
if self.normalize_paths:
return self._directories_normalized[proj_rel_path.lower()]
else:
return (proj_rel_path, *self.directories[proj_rel_path])
def _del_directory(self, proj_rel_path: str):
"""
Remove a directory from the list of directories
"""
del self.directories[proj_rel_path]
if self.normalize_paths:
del self._directories_normalized[proj_rel_path.lower()]
def _add_directory(self, proj_rel_path: str, root: str, rel_path: str):
"""
Add a directory to the list of directories.
Also stores the normalized version for later lookups
"""
self.directories[proj_rel_path] = (root, rel_path)
if self.normalize_paths:
self._directories_normalized[proj_rel_path.lower()] = (proj_rel_path, root, rel_path)
def _in_files(self, proj_rel_path: str) -> bool:
"""
Check if a path is already in the files list
"""
if self.normalize_paths:
return proj_rel_path.lower() in self._files_normalized
else:
return proj_rel_path in self.files
def _file(self, proj_rel_path: str) -> Tuple[str, str, str]:
"""
Get the file that is mapped to a path
"""
if self.normalize_paths:
return self._files_normalized[proj_rel_path.lower()]
else:
return (proj_rel_path, *self.files[proj_rel_path])
def _del_file(self, proj_rel_path: str):
"""
Remove a file from the list of files
"""
del self.files[proj_rel_path]
if self.normalize_paths:
del self._files_normalized[proj_rel_path.lower()]
def _add_file(self, proj_rel_path: str, root: str, rel_path: str):
"""
Add a file to the list of files
Also stores the normalized version for later lookups
"""
self.files[proj_rel_path] = (root, rel_path)
if self.normalize_paths:
self._files_normalized[proj_rel_path.lower()] = (proj_rel_path, root, rel_path)
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""
Register a directory if dst / rel_path is not blocked by a file or ignored.
@@ -84,23 +177,28 @@ def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
if self.ignore(rel_path):
# Don't recurse when dir is ignored.
return False
elif proj_rel_path in self.files:
# Can't create a dir where a file is.
src_a_root, src_a_relpath = self.files[proj_rel_path]
self.fatal_conflicts.append(
MergeConflict(
dst=proj_rel_path,
src_a=os.path.join(src_a_root, src_a_relpath),
src_b=os.path.join(root, rel_path),
elif self._in_files(proj_rel_path):
# A file-dir conflict is fatal except if they're the same file (symlinked dir).
src_a = os.path.join(*self._file(proj_rel_path))
src_b = os.path.join(root, rel_path)
if not _samefile(src_a, src_b):
self.fatal_conflicts.append(
MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
)
)
return False
elif proj_rel_path in self.directories:
return False
# Remove the link in favor of the dir.
existing_proj_rel_path, _, _ = self._file(proj_rel_path)
self._del_file(existing_proj_rel_path)
self._add_directory(proj_rel_path, root, rel_path)
return True
elif self._in_directories(proj_rel_path):
# No new directory, carry on.
return True
else:
# Register new directory.
self.directories[proj_rel_path] = (root, rel_path)
self._add_directory(proj_rel_path, root, rel_path)
return True
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
@@ -132,7 +230,7 @@ def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bo
if handle_as_dir:
return self.before_visit_dir(root, rel_path, depth)
self.visit_file(root, rel_path, depth)
self.visit_file(root, rel_path, depth, symlink=True)
return False
def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = False) -> None:
@@ -140,30 +238,23 @@ def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = Fa
if self.ignore(rel_path):
pass
elif proj_rel_path in self.directories:
# Can't create a file where a dir is; fatal error
self.fatal_conflicts.append(
MergeConflict(
dst=proj_rel_path,
src_a=os.path.join(*self.directories[proj_rel_path]),
src_b=os.path.join(root, rel_path),
elif self._in_directories(proj_rel_path):
# Can't create a file where a dir is, unless they are the same file (symlinked dir),
# in which case we simply drop the symlink in favor of the actual dir.
src_a = os.path.join(*self._directory(proj_rel_path))
src_b = os.path.join(root, rel_path)
if not symlink or not _samefile(src_a, src_b):
self.fatal_conflicts.append(
MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
)
)
elif proj_rel_path in self.files:
elif self._in_files(proj_rel_path):
# When two files project to the same path, they conflict iff they are distinct.
# If they are the same (i.e. one links to the other), register regular files rather
# than symlinks. The reason is that in copy-type views, we need a copy of the actual
# file, not the symlink.
src_a = os.path.join(*self.files[proj_rel_path])
src_a = os.path.join(*self._file(proj_rel_path))
src_b = os.path.join(root, rel_path)
try:
samefile = os.path.samefile(src_a, src_b)
except OSError:
samefile = False
if not samefile:
if not _samefile(src_a, src_b):
# Distinct files produce a conflict.
self.file_conflicts.append(
MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
@@ -173,12 +264,12 @@ def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = Fa
if not symlink:
# Remove the link in favor of the actual file. The del is necessary to maintain the
# order of the files dict, which is grouped by root.
del self.files[proj_rel_path]
self.files[proj_rel_path] = (root, rel_path)
existing_proj_rel_path, _, _ = self._file(proj_rel_path)
self._del_file(existing_proj_rel_path)
self._add_file(proj_rel_path, root, rel_path)
else:
# Otherwise register this file to be linked.
self.files[proj_rel_path] = (root, rel_path)
self._add_file(proj_rel_path, root, rel_path)
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
# Treat symlinked files as ordinary files (without "dereferencing")
@@ -197,11 +288,11 @@ def set_projection(self, projection: str) -> None:
path = ""
for part in self.projection.split(os.sep):
path = os.path.join(path, part)
if path not in self.files:
self.directories[path] = ("<projection>", path)
if not self._in_files(path):
self._add_directory(path, "<projection>", path)
else:
# Can't create a dir where a file is.
src_a_root, src_a_relpath = self.files[path]
_, src_a_root, src_a_relpath = self._file(path)
self.fatal_conflicts.append(
MergeConflict(
dst=path,
@@ -227,8 +318,8 @@ def __init__(self, source_merge_visitor: SourceMergeVisitor):
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# If destination dir is a file in a src dir, add a conflict,
# and don't traverse deeper
if rel_path in self.src.files:
src_a_root, src_a_relpath = self.src.files[rel_path]
if self.src._in_files(rel_path):
_, src_a_root, src_a_relpath = self.src._file(rel_path)
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
@@ -238,8 +329,9 @@ def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# If destination dir was also a src dir, remove the mkdir
# action, and traverse deeper.
if rel_path in self.src.directories:
del self.src.directories[rel_path]
if self.src._in_directories(rel_path):
existing_proj_rel_path, _, _ = self.src._directory(rel_path)
self.src._del_directory(existing_proj_rel_path)
return True
# If the destination dir does not appear in the src dir,
@@ -252,38 +344,24 @@ def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bo
be seen as files; we should not accidentally merge
source dir with a symlinked dest dir.
"""
# Always conflict
if rel_path in self.src.directories:
src_a_root, src_a_relpath = self.src.directories[rel_path]
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
)
)
if rel_path in self.src.files:
src_a_root, src_a_relpath = self.src.files[rel_path]
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
)
)
self.visit_file(root, rel_path, depth)
# Never descend into symlinked target dirs.
return False
def visit_file(self, root: str, rel_path: str, depth: int) -> None:
# Can't merge a file if target already exists
if rel_path in self.src.directories:
src_a_root, src_a_relpath = self.src.directories[rel_path]
if self.src._in_directories(rel_path):
_, src_a_root, src_a_relpath = self.src._directory(rel_path)
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
)
)
elif rel_path in self.src.files:
src_a_root, src_a_relpath = self.src.files[rel_path]
elif self.src._in_files(rel_path):
_, src_a_root, src_a_relpath = self.src._file(rel_path)
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
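These hunks lean on a new `_samefile` helper that replaces the inline try/except visible in the old `visit_file` body above. A minimal sketch consistent with that old code (an illustration, not necessarily the exact implementation):

import os

def _samefile(a: str, b: str) -> bool:
    # As in the old inline logic: missing or unreadable paths are
    # treated as distinct files rather than raising OSError.
    try:
        return os.path.samefile(a, b)
    except OSError:
        return False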


@@ -269,7 +269,7 @@ def __init__(
@staticmethod
def _poll_interval_generator(
_wait_times: Optional[Tuple[float, float, float]] = None
_wait_times: Optional[Tuple[float, float, float]] = None,
) -> Generator[float, None, None]:
"""This implements a backoff scheme for polling a contended resource
by suggesting a succession of wait times between polls.
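A minimal sketch of the backoff idea this docstring describes, with hypothetical wait times; the real generator draws its phases from the `_wait_times` tuple:

import itertools
from typing import Generator, Optional, Tuple

def poll_intervals(
    wait_times: Optional[Tuple[float, float, float]] = None,
) -> Generator[float, None, None]:
    short, medium, long_wait = wait_times or (0.01, 0.1, 0.5)  # assumed defaults
    yield from itertools.repeat(short, 10)   # poll quickly at first
    yield from itertools.repeat(medium, 20)  # then back off
    yield from itertools.repeat(long_wait)   # settle on long waits under contention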


@@ -2,8 +2,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Utility classes for logging the output of blocks of code.
"""
"""Utility classes for logging the output of blocks of code."""
import atexit
import ctypes
import errno


@@ -13,6 +13,18 @@
__version__ = "1.0.0.dev0"
spack_version = __version__
#: The current Package API version implemented by this version of Spack. The Package API defines
#: the Python interface for packages as well as the layout of package repositories. The minor
#: version is incremented when the package API is extended in a backwards-compatible way. The major
#: version is incremented upon breaking changes. This version is changed independently from the
#: Spack version.
package_api_version = (1, 0)
#: The minimum Package API version that this version of Spack is compatible with. This should
#: always be a tuple of the form ``(major, 0)``, since compatibility with vX.Y implies
#: compatibility with vX.0.
min_package_api_version = (1, 0)
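To illustrate how these two tuples bound compatibility, a hedged sketch of the kind of check a repository loader could perform (this helper is hypothetical, not code from this diff):

import spack

def repo_api_is_supported(repo_api: tuple) -> bool:
    # Tuples compare lexicographically: the repo must not predate the
    # oldest supported API, nor require more than Spack implements.
    return spack.min_package_api_version <= repo_api <= spack.package_api_version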
def __try_int(v):
try:
@@ -79,4 +91,6 @@ def get_short_version() -> str:
"get_version",
"get_spack_commit",
"get_short_version",
"package_api_version",
"min_package_api_version",
]


@@ -1010,7 +1010,7 @@ def _issues_in_depends_on_directive(pkgs, error_cls):
for dep_name, dep in deps_by_name.items():
def check_virtual_with_variants(spec, msg):
if not spec.virtual or not spec.variants:
if not spack.repo.PATH.is_virtual(spec.name) or not spec.variants:
return
error = error_cls(
f"{pkg_name}: {msg}",


@@ -923,7 +923,7 @@ class FileTypes:
UNKNOWN = 2
NOT_ISO8859_1_TEXT = re.compile(b"[\x00\x7F-\x9F]")
NOT_ISO8859_1_TEXT = re.compile(b"[\x00\x7f-\x9f]")
def file_type(f: IO[bytes]) -> int:
@@ -2529,10 +2529,10 @@ def install_root_node(
allow_missing: when true, allows installing a node with missing dependencies
"""
# Early termination
if spec.external or spec.virtual:
warnings.warn("Skipping external or virtual package {0}".format(spec.format()))
if spec.external or not spec.concrete:
warnings.warn("Skipping external or abstract spec {0}".format(spec.format()))
return
elif spec.concrete and spec.installed and not force:
elif spec.installed and not force:
warnings.warn("Package for spec {0} already installed.".format(spec.format()))
return


@@ -292,7 +292,12 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
# Install the spec that should make the module importable
with spack.config.override(self.mirror_scope):
PackageInstaller([concrete_spec.package], fail_fast=True).install()
PackageInstaller(
[concrete_spec.package],
fail_fast=True,
package_use_cache=False,
dependencies_use_cache=False,
).install()
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
@@ -362,6 +367,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
if create_bootstrapper(current_config).try_import(module, abstract_spec):
return


@@ -881,21 +881,6 @@ def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]
return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)
def load_external_modules(pkg):
"""Traverse a package's spec DAG and load any external modules.
Traverse a package's dependencies and load any external modules
associated with them.
Args:
pkg (spack.package_base.PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
external_modules = dep.external_modules or []
for external_module in external_modules:
load_module(external_module)
def setup_package(pkg, dirty, context: Context = Context.BUILD):
"""Execute all environment setup routines."""
if context not in (Context.BUILD, Context.TEST):
@@ -946,7 +931,7 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
for mod in pkg.compiler.modules:
load_module(mod)
load_external_modules(pkg)
load_external_modules(setup_context)
# Make sure nothing's strange about the Spack environment.
validate(env_mods, tty.warn)
@@ -1235,6 +1220,21 @@ def _make_runnable(self, dep: spack.spec.Spec, env: EnvironmentModifications):
env.prepend_path("PATH", bin_dir)
def load_external_modules(context: SetupContext) -> None:
"""Traverse a package's spec DAG and load any external modules.
Traverse a package's dependencies and load any external modules
associated with them.
Args:
context: A populated SetupContext object
"""
for spec, _ in context.external:
external_modules = spec.external_modules or []
for external_module in external_modules:
load_module(external_module)
def _setup_pkg_and_run(
serialized_pkg: "spack.subprocess_context.PackageInstallContext",
function: Callable,


@@ -12,6 +12,7 @@
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import depends_on
from .cmake import CMakeBuilder, CMakePackage
@@ -277,17 +278,24 @@ def initconfig_hardware_entries(self):
entries.append("# ROCm")
entries.append("#------------------{0}\n".format("-" * 30))
# Explicitly setting HIP_ROOT_DIR may be a patch that is no longer necessary
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath("CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "clang++"))
)
if spec.satisfies("^blt@0.7:"):
rocm_root = os.path.dirname(spec["llvm-amdgpu"].prefix)
entries.append(cmake_cache_path("ROCM_PATH", rocm_root))
else:
# Explicitly setting HIP_ROOT_DIR may be a patch that is no longer necessary
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath(
"CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "amdclang++")
)
)
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)
@@ -371,6 +379,10 @@ class CachedCMakePackage(CMakePackage):
CMakeBuilder = CachedCMakeBuilder
# These dependencies are assumed in the builder
depends_on("c", type="build")
depends_on("cxx", type="build")
def flag_handler(self, name, flags):
if name in ("cflags", "cxxflags", "cppflags", "fflags"):
return None, None, None # handled in the cmake cache


@@ -70,10 +70,16 @@ def build_directory(self):
"""Return the directory containing the main Cargo.toml."""
return self.pkg.stage.source_path
@property
def std_build_args(self):
"""Standard arguments for ``cargo build`` provided as a property for
convenience of package writers."""
return ["-j", str(self.pkg.module.make_jobs)]
@property
def build_args(self):
"""Arguments for ``cargo build``."""
return ["-j", str(self.pkg.module.make_jobs)]
return []
@property
def check_args(self):
@@ -88,7 +94,9 @@ def build(
) -> None:
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
pkg.module.cargo(
"install", "--root", "out", "--path", ".", *self.std_build_args, *self.build_args
)
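With this split, parallelism lives in `std_build_args` while `build_args` stays free for package-specific flags; with `make_jobs == 8` and no overrides, the builder effectively runs:

$ cargo install --root out --path . -j 8

A package that overrides `build_args` now appends its flags after `-j` instead of replacing it.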
def install(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix


@@ -48,6 +48,9 @@ class MesonPackage(spack.package_base.PackageBase):
variant("strip", default=False, description="Strip targets on install")
depends_on("meson", type="build")
depends_on("ninja", type="build")
# Meson uses pkg-config for dependency detection, and this dependency is
# often overlooked by packages that use meson as a build system.
depends_on("pkgconfig", type="build")
# Python detection in meson requires distutils to be importable, but distutils no longer
# exists in Python 3.12. In Spack, we can't use setuptools as distutils replacement,
# because the distutils-precedence.pth startup file that setuptools ships with is not run


@@ -142,7 +142,7 @@ def setup_run_environment(self, env):
$ source {prefix}/{component}/{version}/env/vars.sh
"""
# Only if environment modifications are desired (default is +envmods)
if "~envmods" not in self.spec:
if "+envmods" in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
self.component_prefix.env.join("vars.sh"), *self.env_script_args


@@ -6,6 +6,7 @@
import codecs
import json
import os
import pathlib
import re
import shutil
import stat
@@ -13,16 +14,16 @@
import tempfile
import zipfile
from collections import namedtuple
from typing import Callable, Dict, List, Set
from typing import Callable, Dict, List, Set, Union
from urllib.request import Request
import llnl.path
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.color import cescape, colorize
import spack
import spack.binary_distribution as bindist
import spack.builder
import spack.concretize
import spack.config as cfg
import spack.environment as ev
@@ -32,6 +33,7 @@
import spack.paths
import spack.repo
import spack.spec
import spack.store
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
@@ -40,6 +42,7 @@
from spack import traverse
from spack.error import SpackError
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.version import GitVersion, StandardVersion
from .common import (
IS_WINDOWS,
@@ -78,11 +81,53 @@ def get_change_revisions():
return None, None
def get_added_versions(
checksums_version_dict: Dict[str, Union[StandardVersion, GitVersion]],
path: str,
from_ref: str = "HEAD~1",
to_ref: str = "HEAD",
) -> List[Union[StandardVersion, GitVersion]]:
"""Get a list of the versions added between `from_ref` and `to_ref`.
Args:
checksums_version_dict (Dict): all package versions keyed by known checksums.
path (str): path to the package.py
from_ref (str): oldest git ref, defaults to `HEAD~1`
to_ref (str): newer git ref, defaults to `HEAD`
Returns: list of versions added between refs
"""
git_exe = spack.util.git.git(required=True)
# Gather git diff
diff_lines = git_exe("diff", from_ref, to_ref, "--", path, output=str).split("\n")
# Store added and removed versions
# Removed versions are tracked here to determine when versions are moved in a file
# and show up as both added and removed in a git diff.
added_checksums = set()
removed_checksums = set()
# Scrape diff for modified versions and prune added versions if they show up
# as also removed (which means they've actually just moved in the file and
# we shouldn't need to rechecksum them)
for checksum in checksums_version_dict.keys():
for line in diff_lines:
if checksum in line:
if line.startswith("+"):
added_checksums.add(checksum)
if line.startswith("-"):
removed_checksums.add(checksum)
return [checksums_version_dict[c] for c in added_checksums - removed_checksums]
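A hedged usage sketch (the package path and checksum entry are placeholders): only versions whose checksums show up as additions, and not also as removals, survive the pruning:

from spack.version import Version

checksums = {"<sha256 placeholder>": Version("1.2.3")}  # hypothetical entry
new_versions = get_added_versions(
    checksums,
    "var/spack/repos/builtin/packages/example/package.py",  # hypothetical path
    from_ref="HEAD~1",
    to_ref="HEAD",
)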
def get_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
"""Given an environment manifest path and two revisions to compare, return
whether or not the stack was changed. Returns True if the environment
manifest changed between the provided revisions (or additionally if the
`.gitlab-ci.yml` file itself changed). Returns False otherwise."""
# git always returns posix paths; normalize input to be compatible
# with that
env_path = llnl.path.convert_to_posix_path(env_path)
git = spack.util.git.git()
if git:
with fs.working_dir(spack.paths.prefix):
@@ -220,7 +265,7 @@ def rebuild_filter(s: spack.spec.Spec) -> RebuildDecision:
def _format_pruning_message(spec: spack.spec.Spec, prune: bool, reasons: List[str]) -> str:
reason_msg = ", ".join(reasons)
spec_fmt = "{name}{@version}{%compiler}{/hash:7}"
spec_fmt = "{name}{@version}{/hash:7}{%compiler}"
if not prune:
status = colorize("@*g{[x]} ")
@@ -577,22 +622,25 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
tty.debug(f"job spec: {job_spec}")
try:
pkg_cls = spack.repo.PATH.get_pkg_class(job_spec.name)
job_pkg = pkg_cls(job_spec)
tty.debug(f"job package: {job_pkg}")
except AssertionError:
msg = f"Cannot copy stage logs: job spec ({job_spec}) must be concrete"
tty.error(msg)
package_metadata_root = pathlib.Path(spack.store.STORE.layout.metadata_path(job_spec))
except spack.error.SpackError as e:
tty.error(f"Cannot copy logs: {str(e)}")
return
stage_dir = job_pkg.stage.path
tty.debug(f"stage dir: {stage_dir}")
for file in [
job_pkg.log_path,
job_pkg.env_mods_path,
*spack.builder.create(job_pkg).archive_files,
]:
copy_files_to_artifacts(file, job_log_dir)
# Get the package's archived files
archive_files = []
archive_root = package_metadata_root / "archived-files"
if archive_root.is_dir():
archive_files = [f for f in archive_root.rglob("*") if f.is_file()]
else:
msg = "Cannot copy package archived files: archived-files must be a directory"
tty.warn(msg)
build_log_zipped = package_metadata_root / "spack-build-out.txt.gz"
build_env_mods = package_metadata_root / "spack-build-env.txt"
for f in [build_log_zipped, build_env_mods, *archive_files]:
copy_files_to_artifacts(str(f), job_log_dir)
def copy_test_logs_to_artifacts(test_stage, job_test_dir):
@@ -612,7 +660,7 @@ def copy_test_logs_to_artifacts(test_stage, job_test_dir):
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
def download_and_extract_artifacts(url, work_dir):
def download_and_extract_artifacts(url, work_dir) -> str:
"""Look for gitlab artifacts.zip at the given url, and attempt to download
and extract the contents into the given work_dir
@@ -620,6 +668,10 @@ def download_and_extract_artifacts(url, work_dir):
url (str): Complete url to artifacts.zip file
work_dir (str): Path to destination where artifacts should be extracted
Output:
Artifacts root path relative to the archive root
"""
tty.msg(f"Fetching artifacts from: {url}")
@@ -637,13 +689,25 @@ def download_and_extract_artifacts(url, work_dir):
response = urlopen(request, timeout=SPACK_CDASH_TIMEOUT)
with open(artifacts_zip_path, "wb") as out_file:
shutil.copyfileobj(response, out_file)
with zipfile.ZipFile(artifacts_zip_path) as zip_file:
zip_file.extractall(work_dir)
# Get the artifact root
artifact_root = ""
for f in zip_file.filelist:
if "spack.lock" in f.filename:
artifact_root = os.path.dirname(os.path.dirname(f.filename))
break
except OSError as e:
raise SpackError(f"Error fetching artifacts: {e}")
finally:
try:
os.remove(artifacts_zip_path)
except FileNotFoundError:
# If the file doesn't exist we are already raising
pass
with zipfile.ZipFile(artifacts_zip_path) as zip_file:
zip_file.extractall(work_dir)
os.remove(artifacts_zip_path)
return artifact_root
def get_spack_info():
@@ -757,7 +821,7 @@ def setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None):
return True
def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime, use_local_head):
"""Given a url to gitlab artifacts.zip from a failed 'spack ci rebuild' job,
attempt to setup an environment in which the failure can be reproduced
locally. This entails the following:
@@ -771,8 +835,11 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
commands to run to reproduce the build once inside the container.
"""
work_dir = os.path.realpath(work_dir)
if os.path.exists(work_dir) and os.listdir(work_dir):
raise SpackError(f"Cannot run reproducer in non-emptry working dir:\n {work_dir}")
platform_script_ext = "ps1" if IS_WINDOWS else "sh"
download_and_extract_artifacts(url, work_dir)
artifact_root = download_and_extract_artifacts(url, work_dir)
gpg_path = None
if gpg_url:
@@ -834,6 +901,9 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
with open(repro_file, encoding="utf-8") as fd:
repro_details = json.load(fd)
spec_file = fs.find(work_dir, repro_details["job_spec_json"])[0]
reproducer_spec = spack.spec.Spec.from_specfile(spec_file)
repro_dir = os.path.dirname(repro_file)
rel_repro_dir = repro_dir.replace(work_dir, "").lstrip(os.path.sep)
@@ -894,17 +964,20 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
commit_regex = re.compile(r"commit\s+([^\s]+)")
merge_commit_regex = re.compile(r"Merge\s+([^\s]+)\s+into\s+([^\s]+)")
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
if use_local_head:
commit_1 = "HEAD"
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
if m:
commit_1 = m.group(1)
setup_result = False
if commit_1:
@@ -979,6 +1052,8 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
"entrypoint", entrypoint_script, work_dir, run=False, exit_on_failure=False
)
# Attempt to create a unique name for the reproducer container
container_suffix = "_" + reproducer_spec.dag_hash() if reproducer_spec else ""
docker_command = [
runtime,
"run",
@@ -986,14 +1061,14 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
"-t",
"--rm",
"--name",
"spack_reproducer",
f"spack_reproducer{container_suffix}",
"-v",
":".join([work_dir, mounted_workdir, "Z"]),
"-v",
":".join(
[
os.path.join(work_dir, "jobs_scratch_dir"),
os.path.join(mount_as_dir, "jobs_scratch_dir"),
os.path.join(work_dir, artifact_root),
os.path.join(mount_as_dir, artifact_root),
"Z",
]
),


@@ -330,7 +330,7 @@ def ensure_single_spec_or_die(spec, matching_specs):
if len(matching_specs) <= 1:
return
format_string = "{name}{@version}{%compiler.name}{@compiler.version}{ arch=architecture}"
format_string = "{name}{@version}{ arch=architecture} {%compiler.name}{@compiler.version}"
args = ["%s matches multiple packages." % spec, "Matching packages:"]
args += [
colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(format_string) for s in matching_specs
@@ -471,12 +471,11 @@ def get_arg(name, default=None):
nfmt = "{fullname}" if namespaces else "{name}"
ffmt = ""
if full_compiler or flags:
ffmt += "{%compiler.name}"
ffmt += "{compiler_flags} {%compiler.name}"
if full_compiler:
ffmt += "{@compiler.version}"
ffmt += " {compiler_flags}"
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + ffmt + vfmt
format_string = nfmt + "{@version}" + vfmt + ffmt
def fmt(s, depth=0):
"""Formatter function for all output specs"""


@@ -4,7 +4,7 @@
import re
import sys
from typing import Dict, Optional
from typing import Dict, Optional, Tuple
import llnl.string
import llnl.util.lang
@@ -181,7 +181,11 @@ def checksum(parser, args):
print()
if args.add_to_package:
add_versions_to_package(pkg, version_lines, args.batch)
path = spack.repo.PATH.filename_for_package_name(pkg.name)
num_versions_added = add_versions_to_pkg(path, version_lines)
tty.msg(f"Added {num_versions_added} new versions to {pkg.name} in {path}")
if not args.batch and sys.stdin.isatty():
editor(path)
def print_checksum_status(pkg: PackageBase, version_hashes: dict):
@@ -227,20 +231,9 @@ def print_checksum_status(pkg: PackageBase, version_hashes: dict):
tty.die("Invalid checksums found.")
def add_versions_to_package(pkg: PackageBase, version_lines: str, is_batch: bool):
"""
Add checksumed versions to a package's instructions and open a user's
editor so they may double check the work of the function.
Args:
pkg (spack.package_base.PackageBase): A package class for a given package in Spack.
version_lines (str): A string of rendered version lines.
"""
# Get filename and path for package
filename = spack.repo.PATH.filename_for_package_name(pkg.name)
def _update_version_statements(package_src: str, version_lines: str) -> Tuple[int, str]:
"""Returns a tuple of number of versions added and the package's modified contents."""
num_versions_added = 0
version_statement_re = re.compile(r"([\t ]+version\([^\)]*\))")
version_re = re.compile(r'[\t ]+version\(\s*"([^"]+)"[^\)]*\)')
@@ -252,33 +245,34 @@ def add_versions_to_package(pkg: PackageBase, version_lines: str, is_batch: bool
if match:
new_versions.append((Version(match.group(1)), ver_line))
with open(filename, "r+", encoding="utf-8") as f:
contents = f.read()
split_contents = version_statement_re.split(contents)
split_contents = version_statement_re.split(package_src)
for i, subsection in enumerate(split_contents):
# If there are no more versions to add we should exit
if len(new_versions) <= 0:
break
for i, subsection in enumerate(split_contents):
# If there are no more versions to add we should exit
if len(new_versions) <= 0:
break
# Check if the section contains a version
contents_version = version_re.match(subsection)
if contents_version is not None:
parsed_version = Version(contents_version.group(1))
# Check if the section contains a version
contents_version = version_re.match(subsection)
if contents_version is not None:
parsed_version = Version(contents_version.group(1))
if parsed_version < new_versions[0][0]:
split_contents[i:i] = [new_versions.pop(0)[1], " # FIXME", "\n"]
num_versions_added += 1
if parsed_version < new_versions[0][0]:
split_contents[i:i] = [new_versions.pop(0)[1], " # FIXME", "\n"]
num_versions_added += 1
elif parsed_version == new_versions[0][0]:
new_versions.pop(0)
elif parsed_version == new_versions[0][0]:
new_versions.pop(0)
# Seek back to the start of the file so we can rewrite the file contents.
f.seek(0)
f.writelines("".join(split_contents))
return num_versions_added, "".join(split_contents)
tty.msg(f"Added {num_versions_added} new versions to {pkg.name}")
tty.msg(f"Open {filename} to review the additions.")
if sys.stdout.isatty() and not is_batch:
editor(filename)
def add_versions_to_pkg(path: str, version_lines: str) -> int:
"""Add new versions to a package.py file. Returns the number of versions added."""
with open(path, "r", encoding="utf-8") as f:
package_src = f.read()
num_versions_added, package_src = _update_version_statements(package_src, version_lines)
if num_versions_added > 0:
with open(path, "w", encoding="utf-8") as f:
f.write(package_src)
return num_versions_added
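Splitting out a pure `_update_version_statements` makes the insertion logic testable on plain strings; a hedged sketch of the file-level wrapper in use (path and version line are hypothetical):

line = '    version("9.9.9", sha256="<placeholder>")'  # as rendered by spack checksum
added = add_versions_to_pkg("var/spack/repos/builtin/packages/example/package.py", line)
print(f"{added} version(s) inserted")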


@@ -4,12 +4,15 @@
import json
import os
import re
import shutil
import sys
from typing import Dict
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util import tty
import spack.binary_distribution as bindist
import spack.ci as spack_ci
@@ -18,12 +21,22 @@
import spack.cmd.common.arguments
import spack.config as cfg
import spack.environment as ev
import spack.error
import spack.fetch_strategy
import spack.hash_types as ht
import spack.mirrors.mirror
import spack.package_base
import spack.paths
import spack.repo
import spack.spec
import spack.stage
import spack.util.executable
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
import spack.util.web as web_util
import spack.version
description = "manage continuous integration pipelines"
section = "build"
@@ -32,6 +45,7 @@
SPACK_COMMAND = "spack"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
BUILTIN = re.compile(r"var\/spack\/repos\/builtin\/packages\/([^\/]+)\/package\.py")
def deindent(desc):
@@ -176,6 +190,11 @@ def setup_parser(subparser):
reproduce.add_argument(
"-s", "--autostart", help="Run docker reproducer automatically", action="store_true"
)
reproduce.add_argument(
"--use-local-head",
help="Use the HEAD of the local Spack instead of reproducing a commit",
action="store_true",
)
gpg_group = reproduce.add_mutually_exclusive_group(required=False)
gpg_group.add_argument(
"--gpg-file", help="Path to public GPG key for validating binary cache installs"
@@ -186,6 +205,16 @@ def setup_parser(subparser):
reproduce.set_defaults(func=ci_reproduce)
# Verify checksums inside of ci workflows
verify_versions = subparsers.add_parser(
"verify-versions",
description=deindent(ci_verify_versions.__doc__),
help=spack.cmd.first_line(ci_verify_versions.__doc__),
)
verify_versions.add_argument("from_ref", help="git ref from which start looking at changes")
verify_versions.add_argument("to_ref", help="git ref to end looking at changes")
verify_versions.set_defaults(func=ci_verify_versions)
def ci_generate(args):
"""generate jobs file from a CI-aware spack file
@@ -422,7 +451,7 @@ def ci_rebuild(args):
# Arguments when installing the root from sources
deps_install_args = install_args + ["--only=dependencies"]
root_install_args = install_args + ["--keep-stage", "--only=package"]
root_install_args = install_args + ["--only=package"]
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
@@ -459,8 +488,7 @@ def ci_rebuild(args):
job_spec.to_dict(hash=ht.dag_hash),
)
# We generated the "spack install ..." command to "--keep-stage", copy
# any logs from the staging directory to artifacts now
# Copy logs and archived files from the install metadata (.spack) directory to artifacts now
spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)
# If the installation succeeded and we're running stand-alone tests for
@@ -608,7 +636,12 @@ def ci_reproduce(args):
gpg_key_url = None
return spack_ci.reproduce_ci_job(
args.job_url, args.working_dir, args.autostart, gpg_key_url, args.runtime
args.job_url,
args.working_dir,
args.autostart,
gpg_key_url,
args.runtime,
args.use_local_head,
)
@@ -650,6 +683,159 @@ def _gitlab_artifacts_url(url: str) -> str:
return urlunparse(parsed._replace(path="/".join(parts), fragment="", query=""))
def validate_standard_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the checksum of a package version based on a tarball.
Args:
pkg (spack.package_base.PackageBase): Spack package for which to validate version checksums
versions (spack.version.VersionList): list of package versions to validate
Returns: bool: True if all checksums are valid, False if any fail.
"""
url_dict: Dict[spack.version.StandardVersion, str] = {}
for version in versions:
url = pkg.find_valid_url_for_version(version)
url_dict[version] = url
version_hashes = spack.stage.get_checksums_for_versions(
url_dict, pkg.name, fetch_options=pkg.fetch_options
)
valid_checksums = True
for version, sha in version_hashes.items():
if sha != pkg.versions[version]["sha256"]:
tty.error(
f"Invalid checksum found {pkg.name}@{version}\n"
f" [package.py] {pkg.versions[version]['sha256']}\n"
f" [Downloaded] {sha}"
)
valid_checksums = False
continue
tty.info(f"Validated {pkg.name}@{version} --> {sha}")
return valid_checksums
def validate_git_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the commit and tag of a package version based on a git repository.
Args:
pkg (spack.package_base.PackageBase): Spack package for which to validate a version
versions (spack.version.VersionList): list of package versions to validate
Returns: bool: True if all commits and tags are valid, False if any fail.
"""
valid_commit = True
for version in versions:
fetcher = spack.fetch_strategy.for_package_version(pkg, version)
with spack.stage.Stage(fetcher) as stage:
known_commit = pkg.versions[version]["commit"]
try:
stage.fetch()
except spack.error.FetchError:
tty.error(
f"Invalid commit for {pkg.name}@{version}\n"
f" {known_commit} could not be checked out in the git repository."
)
valid_commit = False
continue
# Test if the specified tag matches the commit in the package.py
# We retrieve the commit associated with a tag and compare it to the
# commit that is located in the package.py file.
if "tag" in pkg.versions[version]:
tag = pkg.versions[version]["tag"]
try:
with fs.working_dir(stage.source_path):
found_commit = fetcher.git(
"rev-list", "-n", "1", tag, output=str, error=str
).strip()
except spack.util.executable.ProcessError:
tty.error(
f"Invalid tag for {pkg.name}@{version}\n"
f" {tag} could not be found in the git repository."
)
valid_commit = False
continue
if found_commit != known_commit:
tty.error(
f"Mismatched tag <-> commit found for {pkg.name}@{version}\n"
f" [package.py] {known_commit}\n"
f" [Downloaded] {found_commit}"
)
valid_commit = False
continue
# If we have downloaded the repository, found the commit, and compared
# the tag (if specified) we can conclude that the version is pointing
# at what we would expect.
tty.info(f"Validated {pkg.name}@{version} --> {known_commit}")
return valid_commit
def ci_verify_versions(args):
"""validate version checksum & commits between git refs
This command takes a from_ref and to_ref arguments and
then parses the git diff between the two to determine which packages
have been modified verifies the new checksums inside of them.
"""
with fs.working_dir(spack.paths.prefix):
# We use HEAD^1 explicitly on the merge commit created by
# GitHub Actions. However HEAD~1 is a safer default for the helper function.
files = spack.util.git.get_modified_files(from_ref=args.from_ref, to_ref=args.to_ref)
# Get a list of package names from the modified files.
pkgs = [(m.group(1), p) for p in files for m in [BUILTIN.search(p)] if m]
failed_version = False
for pkg_name, path in pkgs:
spec = spack.spec.Spec(pkg_name)
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
# Skip checking manual download packages and trust the maintainers
if pkg.manual_download:
tty.warn(f"Skipping manual download package: {pkg_name}")
continue
# Store versions checksums / commits for future loop
checksums_version_dict = {}
commits_version_dict = {}
for version in pkg.versions:
# If the package version defines a sha256 we'll use that as the high entropy
# string to detect which versions have been added between from_ref and to_ref
if "sha256" in pkg.versions[version]:
checksums_version_dict[pkg.versions[version]["sha256"]] = version
# If a package version instead defines a commit we'll use that as a
# high entropy string to detect new versions.
elif "commit" in pkg.versions[version]:
commits_version_dict[pkg.versions[version]["commit"]] = version
# TODO: enforce that every version has a commit or a sha256 defined if it is not
# an infinite version (there are a lot of packages where this doesn't work yet)
with fs.working_dir(spack.paths.prefix):
added_checksums = spack_ci.get_added_versions(
checksums_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
added_commits = spack_ci.get_added_versions(
commits_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
if added_checksums:
failed_version = not validate_standard_versions(pkg, added_checksums) or failed_version
if added_commits:
failed_version = not validate_git_versions(pkg, added_commits) or failed_version
if failed_version:
sys.exit(1)
def ci(parser, args):
if args.func:
return args.func(args)
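Putting the new subcommand together, a typical CI invocation compares a merge commit against its parent (the refs here are illustrative):

$ spack ci verify-versions HEAD~1 HEAD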


@@ -528,7 +528,6 @@ def __call__(self, parser, namespace, values, option_string):
# the const from the constructor or a value from the CLI.
# Note that this is only called if the argument is actually
# specified on the command line.
spack.config.CONFIG.ensure_scope_ordering()
spack.config.set(self.config_path, self.const, scope="command_line")


@@ -350,9 +350,12 @@ def _config_change(config_path, match_spec_str=None):
if spack.config.get(key_path, scope=scope):
ideal_scope_to_modify = scope
break
# If we find our key in a specific scope, that's the one we want
# to modify. Otherwise we use the default write scope.
write_scope = ideal_scope_to_modify or spack.config.default_modify_scope()
update_path = f"{key_path}:[{str(spec)}]"
spack.config.add(update_path, scope=ideal_scope_to_modify)
spack.config.add(update_path, scope=write_scope)
else:
raise ValueError("'config change' can currently only change 'require' sections")


@@ -2,23 +2,11 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import platform
import re
import sys
from datetime import datetime
from glob import glob
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
import spack
import spack.paths
import spack.platforms
import spack.spec
import spack.store
import spack.util.git
from spack.util.executable import which
description = "debugging commands for troubleshooting Spack"
section = "developer"
@@ -27,63 +15,9 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="debug_command")
sp.add_parser("create-db-tarball", help="create a tarball of Spack's installation metadata")
sp.add_parser("report", help="print information useful for bug reports")
def _debug_tarball_suffix():
now = datetime.now()
suffix = now.strftime("%Y-%m-%d-%H%M%S")
git = spack.util.git.git()
if not git:
return "nobranch-nogit-%s" % suffix
with working_dir(spack.paths.prefix):
if not os.path.isdir(".git"):
return "nobranch.nogit.%s" % suffix
# Get symbolic branch name and strip any special chars (mainly '/')
symbolic = git("rev-parse", "--abbrev-ref", "--short", "HEAD", output=str).strip()
symbolic = re.sub(r"[^\w.-]", "-", symbolic)
# Get the commit hash too.
commit = git("rev-parse", "--short", "HEAD", output=str).strip()
if symbolic == commit:
return "nobranch.%s.%s" % (commit, suffix)
else:
return "%s.%s.%s" % (symbolic, commit, suffix)
def create_db_tarball(args):
tar = which("tar")
tarball_name = "spack-db.%s.tar.gz" % _debug_tarball_suffix()
tarball_path = os.path.abspath(tarball_name)
base = os.path.basename(str(spack.store.STORE.root))
transform_args = []
# Currently --transform and -s are not supported by Windows native tar
if "GNU" in tar("--version", output=str):
transform_args = ["--transform", "s/^%s/%s/" % (base, tarball_name)]
elif sys.platform != "win32":
transform_args = ["-s", "/^%s/%s/" % (base, tarball_name)]
wd = os.path.dirname(str(spack.store.STORE.root))
with working_dir(wd):
files = [spack.store.STORE.db._index_path]
files += glob("%s/*/*/*/.spack/spec.json" % base)
files += glob("%s/*/*/*/.spack/spec.yaml" % base)
files = [os.path.relpath(f) for f in files]
args = ["-czf", tarball_path]
args += transform_args
args += files
tar(*args)
tty.msg("Created %s" % tarball_name)
def report(args):
host_platform = spack.platforms.host()
host_os = host_platform.default_operating_system()
@@ -95,5 +29,5 @@ def report(args):
def debug(parser, args):
action = {"create-db-tarball": create_db_tarball, "report": report}
action[args.debug_command](args)
if args.debug_command == "report":
report(args)


@@ -9,9 +9,9 @@
import spack.cmd
import spack.environment as ev
import spack.package_base
import spack.store
from spack.cmd.common import arguments
from spack.solver.input_analysis import create_graph_analyzer
description = "show dependencies of a package"
section = "basic"
@@ -55,7 +55,7 @@ def dependencies(parser, args):
env = ev.active_environment()
spec = spack.cmd.disambiguate_spec(specs[0], env)
format_string = "{name}{@version}{%compiler}{/hash:7}"
format_string = "{name}{@version}{/hash:7}{%compiler}"
if sys.stdout.isatty():
tty.msg("Dependencies of %s" % spec.format(format_string, color=True))
deps = spack.store.STORE.db.installed_relatives(
@@ -68,15 +68,17 @@ def dependencies(parser, args):
else:
spec = specs[0]
dependencies = spack.package_base.possible_dependencies(
dependencies, virtuals, _ = create_graph_analyzer().possible_dependencies(
spec,
transitive=args.transitive,
expand_virtuals=args.expand_virtuals,
depflag=args.deptype,
allowed_deps=args.deptype,
)
if not args.expand_virtuals:
dependencies.update(virtuals)
if spec.name in dependencies:
del dependencies[spec.name]
dependencies.remove(spec.name)
if dependencies:
colify(sorted(dependencies))


@@ -93,7 +93,7 @@ def dependents(parser, args):
env = ev.active_environment()
spec = spack.cmd.disambiguate_spec(specs[0], env)
format_string = "{name}{@version}{%compiler}{/hash:7}"
format_string = "{name}{@version}{/hash:7}{%compiler}"
if sys.stdout.isatty():
tty.msg("Dependents of %s" % spec.cformat(format_string))
deps = spack.store.STORE.db.installed_relatives(spec, "parents", args.transitive)


@@ -3,11 +3,13 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
from typing import Optional
import llnl.util.tty as tty
import spack.cmd
import spack.config
import spack.environment
import spack.fetch_strategy
import spack.repo
import spack.spec
@@ -31,37 +33,33 @@ def setup_parser(subparser):
"--no-clone",
action="store_false",
dest="clone",
default=None,
help="do not clone, the package already exists at the source path",
)
clone_group.add_argument(
"--clone",
action="store_true",
dest="clone",
default=None,
help="clone the package even if the path already exists",
default=True,
help=(
"(default) clone the package unless the path already exists, "
"use --force to overwrite"
),
)
subparser.add_argument(
"-f", "--force", help="remove any files or directories that block cloning source code"
)
subparser.add_argument(
"-r",
"--recursive",
action="store_true",
help="traverse nodes of the graph to mark everything up to the root as a develop spec",
)
arguments.add_common_arguments(subparser, ["spec"])
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
entry = {"spec": str(spec)}
if path != spec.name:
entry["path"] = path
def change_fn(section):
section[spec.name] = entry
spack.config.change_or_add("develop", find_fn, change_fn)
def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
# "steal" the source code via staging API. We ask for a stage
# to be created, then copy it afterwards somewhere else. It would be
@@ -83,86 +81,159 @@ def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
package.stage.steal_source(abspath)
def develop(parser, args):
# Note: we could put develop specs in any scope, but I assume
# users would only ever want to do this for either (a) an active
# env or (b) a specified config file (e.g. that is included by
# an environment)
# TODO: when https://github.com/spack/spack/pull/35307 is merged,
# an active env is not required if a scope is specified
env = spack.cmd.require_active_env(cmd_name="develop")
if not args.spec:
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
def assure_concrete_spec(env: spack.environment.Environment, spec: spack.spec.Spec):
version = spec.versions.concrete_range_as_version
if not version:
# first check environment for a matching concrete spec
matching_specs = env.all_matching_specs(spec)
if matching_specs:
version = matching_specs[0].version
test_spec = spack.spec.Spec(f"{spec}@{version}")
for m_spec in matching_specs:
if not m_spec.satisfies(test_spec):
raise SpackError(
f"{spec.name}: has multiple concrete instances in the graph that can't be"
" satisified by a single develop spec. To use `spack develop` ensure one"
" of the following:"
f"\n a) {spec.name} nodes can satisfy the same develop spec (minimally "
"this means they all share the same version)"
f"\n b) Provide a concrete develop spec ({spec.name}@[version]) to clearly"
" indicate what should be developed"
)
else:
# look up the maximum version so infinity versions are preferred for develop
version = max(spec.package_class.versions.keys())
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
# download all dev specs
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
if os.path.exists(abspath):
msg = "Skipping developer download of %s" % entry["spec"]
msg += " because its path already exists."
tty.msg(msg)
continue
def setup_src_code(spec: spack.spec.Spec, src_path: str, clone: bool = True, force: bool = False):
"""
Handle checking, cloning or overwriting source code
"""
assert spec.versions
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
_retrieve_develop_source(spec, abspath)
if clone:
_clone(spec, src_path, force)
if not env.dev_specs:
tty.warn("No develop specs to download")
return
specs = spack.cmd.parse_specs(args.spec)
if len(specs) > 1:
raise SpackError("spack develop requires at most one named spec")
spec = specs[0]
if not clone and not os.path.exists(src_path):
raise SpackError(f"Provided path {src_path} does not exist")
version = spec.versions.concrete_range_as_version
if not version:
# look up the maximum version so infinity versions are preferred for develop
version = max(spec.package_class.versions.keys())
version = max(spack.repo.PATH.get_pkg_class(spec.fullname).versions.keys())
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
# If user does not specify --path, we choose to create a directory in the
# active environment's directory, named after the spec
path = args.path or spec.name
if not os.path.isabs(path):
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
else:
abspath = path
# clone default: only if the path doesn't exist
clone = args.clone
if clone is None:
clone = not os.path.exists(abspath)
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
if not clone and not os.path.exists(abspath):
raise SpackError("Provided path %s does not exist" % abspath)
entry = {"spec": str(spec)}
if path and path != spec.name:
entry["path"] = path
if clone:
if os.path.exists(abspath):
if args.force:
shutil.rmtree(abspath)
else:
msg = "Path %s already exists and cannot be cloned to." % abspath
msg += " Use `spack develop -f` to overwrite."
raise SpackError(msg)
def change_fn(section):
section[spec.name] = entry
_retrieve_develop_source(spec, abspath)
spack.config.change_or_add("develop", find_fn, change_fn)
def update_env(
env: spack.environment.Environment,
spec: spack.spec.Spec,
specified_path: Optional[str] = None,
build_dir: Optional[str] = None,
):
"""
Update the spack.yaml file with additions or changes from a develop call
"""
tty.debug(f"Updating develop config for {env.name} transactionally")
if not specified_path:
dev_entry = env.dev_specs.get(spec.name)
if dev_entry:
specified_path = dev_entry.get("path", None)
tty.debug("Updating develop config for {0} transactionally".format(env.name))
with env.write_transaction():
if args.build_directory is not None:
if build_dir is not None:
spack.config.add(
"packages:{}:package_attributes:build_directory:{}".format(
spec.name, args.build_directory
),
f"packages:{spec.name}:package_attributes:build_directory:{build_dir}",
env.scope_name,
)
_update_config(spec, path)
# add develop spec and update path
_update_config(spec, specified_path)
def _clone(spec: spack.spec.Spec, abspath: str, force: bool = False):
if os.path.exists(abspath):
if force:
shutil.rmtree(abspath)
else:
msg = f"Skipping developer download of {spec.name}"
msg += f" because its path {abspath} already exists."
tty.msg(msg)
return
# cloning can take a while and it's nice to get a message for the longer clones
tty.msg(f"Cloning source code for {spec}")
_retrieve_develop_source(spec, abspath)
def _abs_code_path(
env: spack.environment.Environment, spec: spack.spec.Spec, path: Optional[str] = None
):
src_path = path if path else spec.name
return spack.util.path.canonicalize_path(src_path, default_wd=env.path)
def _dev_spec_generator(args, env):
"""
Generator function to loop over all the develop specs based on how the command is called
If no specs are supplied then loop over the develop specs listed in the environment.
"""
if not args.spec:
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
yield spec, abspath
else:
specs = spack.cmd.parse_specs(args.spec)
if (args.path or args.build_directory) and len(specs) > 1:
raise SpackError(
"spack develop requires at most one named spec when using the --path or"
" --build-directory arguments"
)
for spec in specs:
if args.recursive:
concrete_specs = env.all_matching_specs(spec)
if not concrete_specs:
tty.warn(
f"{spec.name} has no matching concrete specs in the environment and "
"will be skipped. `spack develop --recursive` requires a concretized"
" environment"
)
else:
for s in concrete_specs:
for node_spec in s.traverse(direction="parents", root=True):
tty.debug(f"Recursive develop for {node_spec.name}")
yield node_spec, _abs_code_path(env, node_spec, args.path)
else:
yield spec, _abs_code_path(env, spec, args.path)
def develop(parser, args):
env = spack.cmd.require_active_env(cmd_name="develop")
for spec, abspath in _dev_spec_generator(args, env):
assure_concrete_spec(env, spec)
setup_src_code(spec, abspath, clone=args.clone, force=args.force)
update_env(env, spec, args.path, args.build_directory)
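With the generator-based refactor, a single invocation can fan out over several develop specs; hypothetical examples of both entry points:

$ spack develop mypkg@=1.2.3        # single named spec, as before
$ spack develop --recursive mypkg   # mark everything up to the root (requires a concretized env)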


@@ -73,7 +73,7 @@
boxlib @B{dim=2} boxlib built for 2 dimensions
libdwarf @g{%intel} ^libelf@g{%gcc}
libdwarf, built with intel compiler, linked to libelf built with gcc
mvapich2 @g{%gcc} @B{fabrics=psm,mrail,sock}
mvapich2 @B{fabrics=psm,mrail,sock} @g{%gcc}
mvapich2, built with gcc compiler, with support for multiple fabrics
"""


@@ -545,7 +545,7 @@ def _not_license_excluded(self, x):
package does not explicitly forbid redistributing source."""
if self.private:
return True
elif x.package_class.redistribute_source(x):
elif spack.repo.PATH.get_pkg_class(x.fullname).redistribute_source(x):
return True
else:
tty.debug(


@@ -383,8 +383,10 @@ def modules_cmd(parser, args, module_type, callbacks=callbacks):
query = " ".join(str(s) for s in args.constraint_specs)
msg = f"the constraint '{query}' matches multiple packages:\n"
for s in specs:
spec_fmt = "{hash:7} {name}{@version}{%compiler}"
spec_fmt += "{compiler_flags}{variants}{arch=architecture}"
spec_fmt = (
"{hash:7} {name}{@version}{compiler_flags}{variants}"
"{arch=architecture} {%compiler}"
)
msg += "\t" + s.cformat(spec_fmt) + "\n"
tty.die(msg, "In this context exactly *one* match is needed.")


@@ -41,7 +41,11 @@ def providers(parser, args):
specs = spack.cmd.parse_specs(args.virtual_package)
# Check prerequisites
non_virtual = [str(s) for s in specs if not s.virtual or s.name not in valid_virtuals]
non_virtual = [
str(s)
for s in specs
if not spack.repo.PATH.is_virtual(s.name) or s.name not in valid_virtuals
]
if non_virtual:
msg = "non-virtual specs cannot be part of the query "
msg += "[{0}]\n".format(", ".join(non_virtual))


@@ -6,8 +6,9 @@
import os
import re
import sys
from itertools import zip_longest
from typing import Dict, List, Optional
import warnings
from itertools import islice, zip_longest
from typing import Callable, Dict, List, Optional
import llnl.util.tty as tty
import llnl.util.tty.color as color
@@ -16,6 +17,9 @@
import spack.paths
import spack.repo
import spack.util.git
import spack.util.spack_yaml
from spack.spec_parser import SPEC_TOKENIZER, SpecTokens
from spack.tokenize import Token
from spack.util.executable import Executable, which
description = "runs source code style checks on spack"
@@ -198,6 +202,13 @@ def setup_parser(subparser):
action="append",
help="specify tools to skip (choose from %s)" % ", ".join(tool_names),
)
subparser.add_argument(
"--spec-strings",
action="store_true",
help="upgrade spec strings in Python, JSON and YAML files for compatibility with Spack "
"v1.0 and v0.x. Example: spack style --spec-strings $(git ls-files). Note: this flag "
"will be removed in Spack v1.0.",
)
subparser.add_argument("files", nargs=argparse.REMAINDER, help="specific files to check")
@@ -423,7 +434,8 @@ def _run_import_check(
continue
for m in is_abs_import.finditer(contents):
if contents.count(m.group(1)) == 1:
# Find at most two occurrences: the first is the import itself, the second is its usage.
if len(list(islice(re.finditer(rf"{re.escape(m.group(1))}(?!\w)", contents), 2))) == 1:
to_remove.append(m.group(0))
exit_code = 1
print(f"{pretty_path}: redundant import: {m.group(1)}", file=out)
@@ -438,7 +450,7 @@ def _run_import_check(
module = _module_part(root, m.group(0))
if not module or module in to_add:
continue
if re.search(rf"import {re.escape(module)}\b(?!\.)", contents):
if re.search(rf"import {re.escape(module)}(?!\w|\.)", contents):
continue
to_add.add(module)
exit_code = 1
@@ -506,7 +518,196 @@ def _bootstrap_dev_dependencies():
spack.bootstrap.ensure_environment_dependencies()
IS_PROBABLY_COMPILER = re.compile(r"%[a-zA-Z_][a-zA-Z0-9\-]")
def _spec_str_reorder_compiler(idx: int, blocks: List[List[Token]]) -> None:
# only move the compiler to the back if it exists and is not already at the end
if not 0 <= idx < len(blocks) - 1:
return
# if there's only whitespace after the compiler, don't move it
if all(token.kind == SpecTokens.WS for block in blocks[idx + 1 :] for token in block):
return
# rotate left and always add at least one WS token between compiler and previous token
compiler_block = blocks.pop(idx)
if compiler_block[0].kind != SpecTokens.WS:
compiler_block.insert(0, Token(SpecTokens.WS, " "))
# delete the WS tokens from the new first block if it was at the very start, to prevent leading
# WS tokens.
while idx == 0 and blocks[0][0].kind == SpecTokens.WS:
blocks[0].pop(0)
blocks.append(compiler_block)
def _spec_str_format(spec_str: str) -> Optional[str]:
"""Given any string, try to parse as spec string, and rotate the compiler token to the end
of each spec instance. Returns the formatted string if it was changed, otherwise None."""
# We parse blocks of tokens that include leading whitespace, and move the compiler block to
# the end when we hit a dependency ^... or the end of a string.
# [@3.1][ +foo][ +bar][ %gcc@3.1][ +baz]
# [@3.1][ +foo][ +bar][ +baz][ %gcc@3.1]
current_block: List[Token] = []
blocks: List[List[Token]] = []
compiler_block_idx = -1
in_edge_attr = False
for token in SPEC_TOKENIZER.tokenize(spec_str):
if token.kind == SpecTokens.UNEXPECTED:
# parsing error, we cannot fix this string.
return None
elif token.kind in (SpecTokens.COMPILER, SpecTokens.COMPILER_AND_VERSION):
# multiple compilers are not supported in Spack v0.x, so early return
if compiler_block_idx != -1:
return None
current_block.append(token)
blocks.append(current_block)
current_block = []
compiler_block_idx = len(blocks) - 1
elif token.kind in (
SpecTokens.START_EDGE_PROPERTIES,
SpecTokens.DEPENDENCY,
SpecTokens.UNQUALIFIED_PACKAGE_NAME,
SpecTokens.FULLY_QUALIFIED_PACKAGE_NAME,
):
_spec_str_reorder_compiler(compiler_block_idx, blocks)
compiler_block_idx = -1
if token.kind == SpecTokens.START_EDGE_PROPERTIES:
in_edge_attr = True
current_block.append(token)
blocks.append(current_block)
current_block = []
elif token.kind == SpecTokens.END_EDGE_PROPERTIES:
in_edge_attr = False
current_block.append(token)
blocks.append(current_block)
current_block = []
elif in_edge_attr:
current_block.append(token)
elif token.kind in (
SpecTokens.VERSION_HASH_PAIR,
SpecTokens.GIT_VERSION,
SpecTokens.VERSION,
SpecTokens.PROPAGATED_BOOL_VARIANT,
SpecTokens.BOOL_VARIANT,
SpecTokens.PROPAGATED_KEY_VALUE_PAIR,
SpecTokens.KEY_VALUE_PAIR,
SpecTokens.DAG_HASH,
):
current_block.append(token)
blocks.append(current_block)
current_block = []
elif token.kind == SpecTokens.WS:
current_block.append(token)
else:
raise ValueError(f"unexpected token {token}")
if current_block:
blocks.append(current_block)
_spec_str_reorder_compiler(compiler_block_idx, blocks)
new_spec_str = "".join(token.value for block in blocks for token in block)
return new_spec_str if spec_str != new_spec_str else None
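Concretely, with one compiler token per node (package names here are hypothetical), the rotation described above yields:

_spec_str_format("pkg@3.1 +foo %gcc@3.1 +bar ^dep+baz %clang")
# -> 'pkg@3.1 +foo +bar %gcc@3.1 ^dep+baz %clang'
# the dependency's compiler is already last, so only the root part changes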
SpecStrHandler = Callable[[str, int, int, str, str], None]
def _spec_str_default_handler(path: str, line: int, col: int, old: str, new: str):
"""A SpecStrHandler that prints formatted spec strings and their locations."""
print(f"{path}:{line}:{col}: `{old}` -> `{new}`")
def _spec_str_fix_handler(path: str, line: int, col: int, old: str, new: str):
"""A SpecStrHandler that updates formatted spec strings in files."""
with open(path, "r", encoding="utf-8") as f:
lines = f.readlines()
new_line = lines[line - 1].replace(old, new)
if new_line == lines[line - 1]:
tty.warn(f"{path}:{line}:{col}: could not apply fix: `{old}` -> `{new}`")
return
lines[line - 1] = new_line
print(f"{path}:{line}:{col}: fixed `{old}` -> `{new}`")
with open(path, "w", encoding="utf-8") as f:
f.writelines(lines)
def _spec_str_ast(path: str, tree: ast.AST, handler: SpecStrHandler) -> None:
"""Walk the AST of a Python file and apply handler to formatted spec strings."""
has_constant = sys.version_info >= (3, 8)
for node in ast.walk(tree):
if has_constant and isinstance(node, ast.Constant) and isinstance(node.value, str):
current_str = node.value
elif not has_constant and isinstance(node, ast.Str):
current_str = node.s
else:
continue
if not IS_PROBABLY_COMPILER.search(current_str):
continue
new = _spec_str_format(current_str)
if new is not None:
handler(path, node.lineno, node.col_offset, current_str, new)
def _spec_str_json_and_yaml(path: str, data: dict, handler: SpecStrHandler) -> None:
"""Walk a YAML or JSON data structure and apply handler to formatted spec strings."""
queue = [data]
seen = set()
while queue:
current = queue.pop(0)
if id(current) in seen:
continue
seen.add(id(current))
if isinstance(current, dict):
queue.extend(current.values())
queue.extend(current.keys())
elif isinstance(current, list):
queue.extend(current)
elif isinstance(current, str) and IS_PROBABLY_COMPILER.search(current):
new = _spec_str_format(current)
if new is not None:
mark = getattr(current, "_start_mark", None)
if mark:
line, col = mark.line + 1, mark.column + 1
else:
line, col = 0, 0
handler(path, line, col, current, new)
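A small illustration with a plain dict (no YAML marks attached, so the location falls back to 0:0); the walk over keys and values finds the movable compiler inside the nested list:

data = {"packages": {"all": {"require": ["mpich %gcc@12 +debug"]}}}  # hypothetical config
_spec_str_json_and_yaml("packages.yaml", data, _spec_str_default_handler)
# prints: packages.yaml:0:0: `mpich %gcc@12 +debug` -> `mpich +debug %gcc@12`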
def _check_spec_strings(
paths: List[str], handler: SpecStrHandler = _spec_str_default_handler
) -> None:
"""Open Python, JSON and YAML files, and format their string literals that look like spec
strings. A handler is called for each formatting, which can be used to print or apply fixes."""
for path in paths:
is_json_or_yaml = path.endswith(".json") or path.endswith(".yaml") or path.endswith(".yml")
is_python = path.endswith(".py")
if not is_json_or_yaml and not is_python:
continue
try:
with open(path, "r", encoding="utf-8") as f:
# skip files that are likely too large to be user code or config
if os.fstat(f.fileno()).st_size > 1024 * 1024:
warnings.warn(f"skipping {path}: too large.")
continue
if is_json_or_yaml:
_spec_str_json_and_yaml(path, spack.util.spack_yaml.load_config(f), handler)
elif is_python:
_spec_str_ast(path, ast.parse(f.read()), handler)
except (OSError, spack.util.spack_yaml.SpackYAMLError, SyntaxError, ValueError):
warnings.warn(f"skipping {path}")
continue
def style(parser, args):
if args.spec_strings:
if not args.files:
tty.die("No files provided to check spec strings.")
handler = _spec_str_fix_handler if args.fix else _spec_str_default_handler
return _check_spec_strings(args.files, handler)
# save initial working directory for relativizing paths later
args.initial_working_dir = os.getcwd()
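As a brief usage sketch of the checker above (the file paths here are illustrative, not from the diff), the two handlers plug into the same walk:

# Dry run: print `path:line:col: old -> new` for each suspicious literal.
_check_spec_strings(["packages.yaml", "package.py"], handler=_spec_str_default_handler)
# Rewrite the literals in place instead of just reporting them.
_check_spec_strings(["packages.yaml", "package.py"], handler=_spec_str_fix_handler)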

View File

@@ -252,7 +252,9 @@ def has_test_and_tags(pkg_class):
hashes = env.all_hashes() if env else None
specs = spack.store.STORE.db.query(hashes=hashes)
specs = list(filter(lambda s: has_test_and_tags(s.package_class), specs))
specs = list(
filter(lambda s: has_test_and_tags(spack.repo.PATH.get_pkg_class(s.fullname)), specs)
)
spack.cmd.display_specs(specs, long=True)

View File

@@ -17,6 +17,7 @@
pytest = None # type: ignore
import llnl.util.filesystem
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
@@ -216,7 +217,7 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
if pytest is None:
spack.bootstrap.ensure_environment_dependencies()
import pytest
@@ -236,6 +237,12 @@ def unit_test(parser, args, unknown_args):
pytest_root = spack.extensions.load_extension(args.extension)
if args.numprocesses is not None and args.numprocesses > 1:
try:
import xdist # noqa: F401
except ImportError:
tty.error("parallel unit-test requires pytest-xdist module")
return 1
pytest_args.extend(
[
"--dist",

View File

@@ -2,35 +2,48 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import io
from typing import List, Optional
import llnl.util.tty as tty
from llnl.string import plural
from llnl.util.filesystem import visit_directory_tree
import spack.cmd
import spack.environment as ev
import spack.spec
import spack.store
import spack.verify
import spack.verify_libraries
from spack.cmd.common import arguments
description = "check that all spack packages are on disk as installed"
description = "verify spack installations on disk"
section = "admin"
level = "long"
MANIFEST_SUBPARSER: Optional[argparse.ArgumentParser] = None
def setup_parser(subparser):
setup_parser.parser = subparser
subparser.add_argument(
def setup_parser(subparser: argparse.ArgumentParser):
global MANIFEST_SUBPARSER
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="verify_command")
MANIFEST_SUBPARSER = sp.add_parser(
"manifest", help=verify_manifest.__doc__, description=verify_manifest.__doc__
)
MANIFEST_SUBPARSER.add_argument(
"-l", "--local", action="store_true", help="verify only locally installed packages"
)
subparser.add_argument(
MANIFEST_SUBPARSER.add_argument(
"-j", "--json", action="store_true", help="ouptut json-formatted errors"
)
subparser.add_argument("-a", "--all", action="store_true", help="verify all packages")
subparser.add_argument(
MANIFEST_SUBPARSER.add_argument("-a", "--all", action="store_true", help="verify all packages")
MANIFEST_SUBPARSER.add_argument(
"specs_or_files", nargs=argparse.REMAINDER, help="specs or files to verify"
)
type = subparser.add_mutually_exclusive_group()
type.add_argument(
manifest_sp_type = MANIFEST_SUBPARSER.add_mutually_exclusive_group()
manifest_sp_type.add_argument(
"-s",
"--specs",
action="store_const",
@@ -39,7 +52,7 @@ def setup_parser(subparser):
default="specs",
help="treat entries as specs (default)",
)
type.add_argument(
manifest_sp_type.add_argument(
"-f",
"--files",
action="store_const",
@@ -49,14 +62,67 @@ def setup_parser(subparser):
help="treat entries as absolute filenames\n\ncannot be used with '-a'",
)
libraries_subparser = sp.add_parser(
"libraries", help=verify_libraries.__doc__, description=verify_libraries.__doc__
)
arguments.add_common_arguments(libraries_subparser, ["constraint"])
def verify(parser, args):
cmd = args.verify_command
if cmd == "libraries":
return verify_libraries(args)
elif cmd == "manifest":
return verify_manifest(args)
parser.error("invalid verify subcommand")
def verify_libraries(args):
"""verify that shared libraries of install packages can be located in rpaths (Linux only)"""
specs_from_db = [s for s in args.specs(installed=True) if not s.external]
tty.info(f"Checking {len(specs_from_db)} packages for shared library resolution")
errors = 0
for spec in specs_from_db:
try:
pkg = spec.package
except Exception:
tty.warn(f"Skipping {spec.cformat('{name}{@version}{/hash}')} due to missing package")
continue
error_msg = _verify_libraries(spec, pkg.unresolved_libraries)
if error_msg is not None:
errors += 1
tty.error(error_msg)
if errors:
tty.error(f"Cannot resolve shared libraries in {plural(errors, 'package')}")
return 1
def _verify_libraries(spec: spack.spec.Spec, unresolved_libraries: List[str]) -> Optional[str]:
"""Go over the prefix of the installed spec and verify its shared libraries can be resolved."""
visitor = spack.verify_libraries.ResolveSharedElfLibDepsVisitor(
[*spack.verify_libraries.ALLOW_UNRESOLVED, *unresolved_libraries]
)
visit_directory_tree(spec.prefix, visitor)
if not visitor.problems:
return None
output = io.StringIO()
visitor.write(output, indent=4, brief=True)
message = output.getvalue().rstrip()
return f"{spec.cformat('{name}{@version}{/hash}')}: {spec.prefix}:\n{message}"
def verify_manifest(args):
"""verify that install directories have not been modified since installation"""
local = args.local
if args.type == "files":
if args.all:
setup_parser.parser.print_help()
return 1
MANIFEST_SUBPARSER.error("cannot use --all with --files")
for file in args.specs_or_files:
results = spack.verify.check_file_manifest(file)
@@ -87,8 +153,7 @@ def verify(parser, args):
env = ev.active_environment()
specs = list(map(lambda x: spack.cmd.disambiguate_spec(x, env, local=local), spec_args))
else:
setup_parser.parser.print_help()
return 1
MANIFEST_SUBPARSER.error("use --all or specify specs to verify")
for spec in specs:
tty.debug("Verifying package %s")

View File

@@ -220,7 +220,7 @@ def concretize_one(spec: Union[str, Spec], tests: TestsType = False) -> Spec:
opt, i, answer = min(result.answers)
name = spec.name
# TODO: Consolidate this code with similar code in solve.py
if spec.virtual:
if spack.repo.PATH.is_virtual(spec.name):
providers = [s.name for s in answer.values() if s.package.provides(name)]
name = providers[0]

View File

@@ -32,9 +32,10 @@
import copy
import functools
import os
import os.path
import re
import sys
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
from typing import Any, Callable, Dict, Generator, List, NamedTuple, Optional, Tuple, Union
import jsonschema
@@ -42,7 +43,6 @@
import spack.error
import spack.paths
import spack.platforms
import spack.schema
import spack.schema.bootstrap
import spack.schema.cdash
@@ -54,17 +54,20 @@
import spack.schema.develop
import spack.schema.env
import spack.schema.env_vars
import spack.schema.include
import spack.schema.merged
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
import spack.schema.repos
import spack.schema.upstreams
import spack.schema.view
# Hacked yaml for configuration files preserves line numbers.
import spack.util.remote_file_cache as rfc_util
import spack.util.spack_yaml as syaml
import spack.util.web as web_util
from spack.util.cpus import cpus_available
from spack.util.spack_yaml import get_mark_from_yaml_data
from .enums import ConfigScopePriority
#: Dict from section names -> schema for that section
SECTION_SCHEMAS: Dict[str, Any] = {
@@ -72,6 +75,7 @@
"concretizer": spack.schema.concretizer.schema,
"definitions": spack.schema.definitions.schema,
"env_vars": spack.schema.env_vars.schema,
"include": spack.schema.include.schema,
"view": spack.schema.view.schema,
"develop": spack.schema.develop.schema,
"mirrors": spack.schema.mirrors.schema,
@@ -119,6 +123,17 @@
#: Type used for raw YAML configuration
YamlConfigDict = Dict[str, Any]
#: prefix for name of included configuration scopes
INCLUDE_SCOPE_PREFIX = "include"
#: safeguard for recursive includes -- maximum include depth
MAX_RECURSIVE_INCLUDES = 100
def _include_cache_location():
"""Location to cache included configuration files."""
return os.path.join(spack.paths.user_cache_path, "includes")
class ConfigScope:
def __init__(self, name: str) -> None:
@@ -126,6 +141,25 @@ def __init__(self, name: str) -> None:
self.writable = False
self.sections = syaml.syaml_dict()
#: names of any included scopes
self._included_scopes: Optional[List["ConfigScope"]] = None
@property
def included_scopes(self) -> List["ConfigScope"]:
"""Memoized list of included scopes, in the order they appear in this scope."""
if self._included_scopes is None:
self._included_scopes = []
includes = self.get_section("include")
if includes:
include_paths = [included_path(data) for data in includes["include"]]
for path in include_paths:
included_scope = include_path_scope(path)
if included_scope:
self._included_scopes.append(included_scope)
return self._included_scopes
def get_section_filename(self, section: str) -> str:
raise NotImplementedError
@@ -408,26 +442,18 @@ def _method(self, *args, **kwargs):
return _method
ScopeWithOptionalPriority = Union[ConfigScope, Tuple[int, ConfigScope]]
ScopeWithPriority = Tuple[int, ConfigScope]
class Configuration:
"""A full Spack configuration, from a hierarchy of config files.
This class makes it easy to add a new scope on top of an existing one.
"""
class Configuration:
"""A hierarchical configuration, merging a number of scopes at different priorities."""
# convert to typing.OrderedDict when we drop 3.6, or OrderedDict when we reach 3.9
scopes: Dict[str, ConfigScope]
scopes: lang.PriorityOrderedMapping[str, ConfigScope]
def __init__(self, *scopes: ConfigScope) -> None:
"""Initialize a configuration with an initial list of scopes.
Args:
scopes: list of scopes to add to this
Configuration, ordered from lowest to highest precedence
"""
self.scopes = collections.OrderedDict()
for scope in scopes:
self.push_scope(scope)
def __init__(self) -> None:
self.scopes = lang.PriorityOrderedMapping()
self.format_updates: Dict[str, List[ConfigScope]] = collections.defaultdict(list)
def ensure_unwrapped(self) -> "Configuration":
@@ -435,36 +461,59 @@ def ensure_unwrapped(self) -> "Configuration":
return self
def highest(self) -> ConfigScope:
"""Scope with highest precedence"""
return next(reversed(self.scopes.values())) # type: ignore
"""Scope with the highest precedence"""
return next(self.scopes.reversed_values()) # type: ignore
@_config_mutator
def ensure_scope_ordering(self):
"""Ensure that scope order matches documented precedent"""
# FIXME: We also need to consider that custom configurations and other orderings
# may not be preserved correctly
if "command_line" in self.scopes:
# TODO (when dropping python 3.6): self.scopes.move_to_end
self.scopes["command_line"] = self.remove_scope("command_line")
@_config_mutator
def push_scope(self, scope: ConfigScope) -> None:
"""Add a higher precedence scope to the Configuration."""
tty.debug(f"[CONFIGURATION: PUSH SCOPE]: {str(scope)}", level=2)
self.scopes[scope.name] = scope
@_config_mutator
def pop_scope(self) -> ConfigScope:
"""Remove the highest precedence scope and return it."""
name, scope = self.scopes.popitem(last=True) # type: ignore[call-arg]
tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
return scope
@_config_mutator
def push_scope(
self, scope: ConfigScope, priority: Optional[int] = None, _depth: int = 0
) -> None:
"""Adds a scope to the Configuration, at a given priority.
If a priority is not given, it is assumed to be the current highest priority.
Args:
scope: scope to be added
priority: priority of the scope
"""
# TODO: As a follow on to #48784, change this to create a graph of the
# TODO: includes AND ensure properly sorted such that the order included
# TODO: at the highest level is reflected in the value of an option that
# TODO: is set in multiple included files.
# before pushing the scope itself, push any included scopes recursively, at same priority
for included_scope in reversed(scope.included_scopes):
if _depth + 1 > MAX_RECURSIVE_INCLUDES: # make sure we're not recursing endlessly
mark = ""
if hasattr(included_scope, "path") and syaml.marked(included_scope.path):
mark = included_scope.path._start_mark # type: ignore
raise RecursiveIncludeError(
f"Maximum include recursion exceeded in {included_scope.name}", str(mark)
)
# record this inclusion so that remove_scope() can use it
self.push_scope(included_scope, priority=priority, _depth=_depth + 1)
tty.debug(f"[CONFIGURATION: PUSH SCOPE]: {str(scope)}, priority={priority}", level=2)
self.scopes.add(scope.name, value=scope, priority=priority)
@_config_mutator
def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
"""Remove scope by name; has no effect when ``scope_name`` does not exist"""
scope = self.scopes.pop(scope_name, None)
tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
"""Removes a scope by name, and returns it. If the scope does not exist, returns None."""
try:
scope = self.scopes.remove(scope_name)
tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {str(scope)}", level=2)
except KeyError as e:
tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {e}", level=2)
return None
# transitively remove included scopes
for included_scope in scope.included_scopes:
assert (
included_scope.name in self.scopes
), f"Included scope '{included_scope.name}' was never added to configuration!"
self.remove_scope(included_scope.name)
return scope
@property
@@ -473,15 +522,13 @@ def writable_scopes(self) -> Generator[ConfigScope, None, None]:
return (s for s in self.scopes.values() if s.writable)
def highest_precedence_scope(self) -> ConfigScope:
"""Writable scope with highest precedence."""
return next(s for s in reversed(self.scopes.values()) if s.writable) # type: ignore
"""Writable scope with the highest precedence."""
return next(s for s in self.scopes.reversed_values() if s.writable)
def highest_precedence_non_platform_scope(self) -> ConfigScope:
"""Writable non-platform scope with highest precedence"""
"""Writable non-platform scope with the highest precedence"""
return next(
s
for s in reversed(self.scopes.values()) # type: ignore
if s.writable and not s.is_platform_dependent
s for s in self.scopes.reversed_values() if s.writable and not s.is_platform_dependent
)
def matching_scopes(self, reg_expr) -> List[ConfigScope]:
@@ -748,7 +795,7 @@ def override(
"""
if isinstance(path_or_scope, ConfigScope):
overrides = path_or_scope
CONFIG.push_scope(path_or_scope)
CONFIG.push_scope(path_or_scope, priority=None)
else:
base_name = _OVERRIDES_BASE_NAME
# Ensure the new override gets a unique scope name
@@ -762,7 +809,7 @@ def override(
break
overrides = InternalConfigScope(scope_name)
CONFIG.push_scope(overrides)
CONFIG.push_scope(overrides, priority=None)
CONFIG.set(path_or_scope, value, scope=scope_name)
try:
@@ -772,13 +819,86 @@ def override(
assert scope is overrides
def _add_platform_scope(cfg: Configuration, name: str, path: str, writable: bool = True) -> None:
def _add_platform_scope(
cfg: Configuration, name: str, path: str, priority: ConfigScopePriority, writable: bool = True
) -> None:
"""Add a platform-specific subdirectory for the current platform."""
import spack.platforms # circular dependency
platform = spack.platforms.host().name
scope = DirectoryConfigScope(
f"{name}/{platform}", os.path.join(path, platform), writable=writable
)
cfg.push_scope(scope)
cfg.push_scope(scope, priority=priority)
#: An include path, optionally conditioned on a limited python expression
#: that evaluates to a boolean, and/or explicitly marked as optional.
class IncludePath(NamedTuple):
path: str
when: str
sha256: str
optional: bool
def included_path(entry: Union[str, dict]) -> IncludePath:
"""Convert the included path entry into an IncludePath.
Args:
entry: include configuration entry
Returns: converted entry, where an empty ``when`` means the path is
not conditionally included
"""
if isinstance(entry, str):
return IncludePath(path=entry, sha256="", when="", optional=False)
path = entry["path"]
sha256 = entry.get("sha256", "")
when = entry.get("when", "")
optional = entry.get("optional", False)
return IncludePath(path=path, sha256=sha256, when=when, optional=optional)
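For illustration, the two entry shapes ``included_path`` accepts (values here are hypothetical):

# Plain string: unconditional, required include.
p1 = included_path("/etc/spack/site")
assert p1.when == "" and not p1.optional
# Dict form: conditional and optional, with an (empty here) checksum.
p2 = included_path(
    {"path": "https://example.com/cfg.yaml", "when": 'platform == "linux"', "optional": True}
)
assert p2.optional and p2.when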
def include_path_scope(include: IncludePath) -> Optional[ConfigScope]:
"""Instantiate an appropriate configuration scope for the given path.
Args:
include: optional include path
Returns: configuration scope
Raises:
ValueError: the included path has an unsupported URL scheme, or is required
but does not exist, or the configuration stage directory argument is missing
ConfigFileError: unable to access remote configuration file(s)
"""
# circular dependencies
import spack.spec
if (not include.when) or spack.spec.eval_conditional(include.when):
config_path = rfc_util.local_path(include.path, include.sha256, _include_cache_location)
if not config_path:
raise ConfigFileError(f"Unable to fetch remote configuration from {include.path}")
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = f"{INCLUDE_SCOPE_PREFIX}:{os.path.basename(config_path)}"
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
return DirectoryConfigScope(config_name, config_path)
if os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = f"{INCLUDE_SCOPE_PREFIX}:{config_path}"
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
return SingleFileScope(config_name, config_path, spack.schema.merged.schema)
if not include.optional:
path = f" at ({config_path})" if config_path != include.path else ""
raise ValueError(f"Required path ({include.path}) does not exist{path}")
return None
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
@@ -806,18 +926,17 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
return config_paths
def create() -> Configuration:
def create_incremental() -> Generator[Configuration, None, None]:
"""Singleton Configuration instance.
This constructs one instance associated with this module and returns
it. It is bundled inside a function so that configuration can be
initialized lazily.
"""
cfg = Configuration()
# first do the builtin, hardcoded defaults
builtin = InternalConfigScope("_builtin", CONFIG_DEFAULTS)
cfg.push_scope(builtin)
cfg = create_from(
(ConfigScopePriority.BUILTIN, InternalConfigScope("_builtin", CONFIG_DEFAULTS))
)
# Builtin paths to configuration files in Spack
configuration_paths = [
@@ -847,16 +966,29 @@ def create() -> Configuration:
# add each scope and its platform-specific directory
for name, path in configuration_paths:
cfg.push_scope(DirectoryConfigScope(name, path))
cfg.push_scope(DirectoryConfigScope(name, path), priority=ConfigScopePriority.CONFIG_FILES)
# Each scope can have per-platform overrides in subdirectories
_add_platform_scope(cfg, name, path, priority=ConfigScopePriority.CONFIG_FILES)
# Each scope can have per-platfom overrides in subdirectories
_add_platform_scope(cfg, name, path)
# yield the config incrementally so that each config level's init code can get
# data from the one below. This can be tricky, but it enables us to have a
# single unified config system.
#
# TODO: think about whether we want to restrict what types of config can be used
# at each level. e.g., we may want to just more forcibly disallow remote
# config (which uses ssl and other config options) for some of the scopes,
# to make the bootstrap issues more explicit, even if allowing config scope
# init to reference lower scopes is more flexible.
yield cfg
return cfg
def create() -> Configuration:
"""Create a configuration using create_incremental(), return the last yielded result."""
return list(create_incremental())[-1]
#: This is the singleton configuration instance for Spack.
CONFIG: Configuration = lang.Singleton(create) # type: ignore
CONFIG: Configuration = lang.Singleton(create_incremental) # type: ignore
def add_from_file(filename: str, scope: Optional[str] = None) -> None:
@@ -952,10 +1084,11 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:
Accepts the path syntax described in ``get()``.
"""
return CONFIG.set(path, value, scope)
result = CONFIG.set(path, value, scope)
return result
def scopes() -> Dict[str, ConfigScope]:
def scopes() -> lang.PriorityOrderedMapping[str, ConfigScope]:
"""Convenience function to get list of configuration scopes."""
return CONFIG.scopes
@@ -1409,7 +1542,7 @@ def ensure_latest_format_fn(section: str) -> Callable[[YamlConfigDict], bool]:
@contextlib.contextmanager
def use_configuration(
*scopes_or_paths: Union[ConfigScope, str]
*scopes_or_paths: Union[ScopeWithOptionalPriority, str]
) -> Generator[Configuration, None, None]:
"""Use the configuration scopes passed as arguments within the context manager.
@@ -1424,7 +1557,7 @@ def use_configuration(
global CONFIG
# Normalize input and construct a Configuration object
configuration = _config_from(scopes_or_paths)
configuration = create_from(*scopes_or_paths)
CONFIG.clear_caches(), configuration.clear_caches()
saved_config, CONFIG = CONFIG, configuration
@@ -1435,137 +1568,44 @@ def use_configuration(
CONFIG = saved_config
def _normalize_input(entry: Union[ScopeWithOptionalPriority, str]) -> ScopeWithPriority:
if isinstance(entry, tuple):
return entry
default_priority = ConfigScopePriority.CONFIG_FILES
if isinstance(entry, ConfigScope):
return default_priority, entry
# Otherwise we need to construct it
path = os.path.normpath(entry)
assert os.path.isdir(path), f'"{path}" must be a directory'
name = os.path.basename(path)
return default_priority, DirectoryConfigScope(name, path)
@lang.memoized
def _config_from(scopes_or_paths: List[Union[ConfigScope, str]]) -> Configuration:
scopes = []
for scope_or_path in scopes_or_paths:
# If we have a config scope we are already done
if isinstance(scope_or_path, ConfigScope):
scopes.append(scope_or_path)
continue
# Otherwise we need to construct it
path = os.path.normpath(scope_or_path)
assert os.path.isdir(path), f'"{path}" must be a directory'
name = os.path.basename(path)
scopes.append(DirectoryConfigScope(name, path))
configuration = Configuration(*scopes)
return configuration
def raw_github_gitlab_url(url: str) -> str:
"""Transform a github URL to the raw form to avoid undesirable html.
Args:
url: url to be converted to raw form
Returns:
Raw github/gitlab url or the original url
"""
# Note we rely on GitHub to redirect the 'raw' URL returned here to the
# actual URL under https://raw.githubusercontent.com/ with '/blob'
# removed and, if needed, '/blame' as well.
if "github" in url or "gitlab" in url:
return url.replace("/blob/", "/raw/")
return url
def create_from(*scopes_or_paths: Union[ScopeWithOptionalPriority, str]) -> Configuration:
"""Creates a configuration object from the scopes passed in input.
Args:
*scopes_or_paths: either a tuple of (priority, ConfigScope), or a ConfigScope, or a string
If priority is not given, it is assumed to be ConfigScopePriority.CONFIG_FILES. If a
string is given, a DirectoryConfigScope is created from it.
Examples:
>>> builtin_scope = InternalConfigScope("_builtin", {"config": {"build_jobs": 1}})
>>> cl_scope = InternalConfigScope("command_line", {"config": {"build_jobs": 10}})
>>> cfg = create_from(
...     (ConfigScopePriority.COMMAND_LINE, cl_scope),
...     (ConfigScopePriority.BUILTIN, builtin_scope)
... )
"""
def collect_urls(base_url: str) -> list:
"""Return a list of configuration URLs.
Arguments:
base_url: URL for a configuration (yaml) file or a directory
containing yaml file(s)
Returns:
List of configuration file(s) or empty list if none
"""
if not base_url:
return []
extension = ".yaml"
if base_url.endswith(extension):
return [base_url]
# Collect configuration URLs if the base_url is a "directory".
_, links = web_util.spider(base_url, 0)
return [link for link in links if link.endswith(extension)]
def fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) -> str:
"""Retrieve configuration file(s) at the specified URL.
Arguments:
url: URL for a configuration (yaml) file or a directory containing
yaml file(s)
dest_dir: destination directory
skip_existing: Skip files that already exist in dest_dir if
``True``; otherwise, replace those files
Returns:
Path to the corresponding file if URL is or contains a
single file and it is the only file in the destination directory or
the root (dest_dir) directory if multiple configuration files exist
or are retrieved.
"""
def _fetch_file(url):
raw = raw_github_gitlab_url(url)
tty.debug(f"Reading config from url {raw}")
return web_util.fetch_url_text(raw, dest_dir=dest_dir)
if not url:
raise ConfigFileError("Cannot retrieve configuration without a URL")
# Return the local path to the cached configuration file OR to the
# directory containing the cached configuration files.
config_links = collect_urls(url)
existing_files = os.listdir(dest_dir) if os.path.isdir(dest_dir) else []
paths = []
for config_url in config_links:
basename = os.path.basename(config_url)
if skip_existing and basename in existing_files:
tty.warn(
f"Will not fetch configuration from {config_url} since a "
f"version already exists in {dest_dir}"
)
path = os.path.join(dest_dir, basename)
else:
path = _fetch_file(config_url)
if path:
paths.append(path)
if paths:
return dest_dir if len(paths) > 1 else paths[0]
raise ConfigFileError(f"Cannot retrieve configuration (yaml) from {url}")
def get_mark_from_yaml_data(obj):
"""Try to get ``spack.util.spack_yaml`` mark from YAML data.
We try the object, and if that fails we try its first member (if it's a container).
Returns:
mark if one is found, otherwise None.
"""
# mark of object itself
mark = getattr(obj, "_start_mark", None)
if mark:
return mark
# mark of first member if it is a container
if isinstance(obj, (list, dict)):
first_member = next(iter(obj), None)
if first_member:
mark = getattr(first_member, "_start_mark", None)
return mark
scopes_with_priority = [_normalize_input(x) for x in scopes_or_paths]
result = Configuration()
for priority, scope in scopes_with_priority:
result.push_scope(scope, priority=priority)
return result
def determine_number_of_jobs(
@@ -1672,3 +1712,7 @@ def get_path(path, data):
# give up and return None if nothing worked
return None
class RecursiveIncludeError(spack.error.SpackError):
"""Too many levels of recursive includes."""

View File

@@ -41,6 +41,8 @@
Union,
)
import spack.repo
try:
import uuid
@@ -1124,7 +1126,7 @@ def _add(
installation_time:
Date and time of installation
allow_missing: if True, don't warn when installation is not found on disk
This is useful when installing specs without build deps.
This is useful when installing specs without build/test deps.
"""
if not spec.concrete:
raise NonConcreteSpecAddError("Specs added to DB must be concrete.")
@@ -1144,10 +1146,8 @@ def _add(
edge.spec,
explicit=False,
installation_time=installation_time,
# allow missing build-only deps. This prevents excessive warnings when a spec is
# installed, and its build dep is missing a build dep; there's no need to install
# the build dep's build dep first, and there's no need to warn about it missing.
allow_missing=allow_missing or edge.depflag == dt.BUILD,
# allow missing build / test only deps
allow_missing=allow_missing or edge.depflag & (dt.BUILD | dt.TEST) == edge.depflag,
)
# Make sure the directory layout agrees whether the spec is installed
@@ -1556,7 +1556,12 @@ def _query(
# If we did find something, the query spec can't be virtual b/c we matched an actual
# package installation, so skip the virtual check entirely. If we *didn't* find anything,
# check all the deferred specs *if* the query is virtual.
if not results and query_spec is not None and deferred and query_spec.virtual:
if (
not results
and query_spec is not None
and deferred
and spack.repo.PATH.is_virtual(query_spec.name)
):
results = [spec for spec in deferred if spec.satisfies(query_spec)]
return results
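The `edge.depflag & (dt.BUILD | dt.TEST) == edge.depflag` test above is true exactly when an edge carries only build and/or test flags. A self-contained sketch with illustrative flag values (stand-ins for spack.deptypes' bit flags):

import enum

class Flags(enum.IntFlag):  # illustrative stand-in for spack.deptypes
    BUILD = 1
    LINK = 2
    RUN = 4
    TEST = 8

def only_build_or_test(depflag: Flags) -> bool:
    # Masking with BUILD|TEST leaves depflag unchanged iff no other bit is set.
    return depflag & (Flags.BUILD | Flags.TEST) == depflag

assert only_build_or_test(Flags.BUILD)
assert only_build_or_test(Flags.BUILD | Flags.TEST)
assert not only_build_or_test(Flags.BUILD | Flags.LINK)  # link deps must not go missing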

View File

@@ -310,7 +310,7 @@ def find_windows_kit_roots() -> List[str]:
@staticmethod
def find_windows_kit_bin_paths(
kit_base: Union[Optional[str], Optional[list]] = None
kit_base: Union[Optional[str], Optional[list]] = None,
) -> List[str]:
"""Returns Windows kit bin directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
@@ -325,7 +325,7 @@ def find_windows_kit_bin_paths(
@staticmethod
def find_windows_kit_lib_paths(
kit_base: Union[Optional[str], Optional[list]] = None
kit_base: Union[Optional[str], Optional[list]] = None,
) -> List[str]:
"""Returns Windows kit lib directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base

View File

@@ -7,6 +7,7 @@
import collections
import concurrent.futures
import os
import pathlib
import re
import sys
import traceback
@@ -15,6 +16,7 @@
import llnl.util.filesystem
import llnl.util.lang
import llnl.util.symlink
import llnl.util.tty
import spack.error
@@ -70,13 +72,21 @@ def dedupe_paths(paths: List[str]) -> List[str]:
"""Deduplicate paths based on inode and device number. In case the list contains first a
symlink and then the directory it points to, the symlink is replaced with the directory path.
This ensures that we pick for example ``/usr/bin`` over ``/bin`` if the latter is a symlink to
the former`."""
the former."""
seen: Dict[Tuple[int, int], str] = {}
linked_parent_check = lambda x: any(
[llnl.util.symlink.islink(str(y)) for y in pathlib.Path(x).parents]
)
for path in paths:
identifier = file_identifier(path)
if identifier not in seen:
seen[identifier] = path
elif not os.path.islink(path):
# we also want to deprioritize paths if they contain a symlink in any parent
# (not just the basedir): e.g. oneapi has "latest/bin",
# where "latest" is a symlink to 2025.0"
elif not (llnl.util.symlink.islink(path) or linked_parent_check(path)):
seen[identifier] = path
return list(seen.values())
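A behavior sketch for ``dedupe_paths`` (paths are illustrative of a merged-/usr system where ``/bin`` is a symlink to ``/usr/bin``):

# Both entries share an (inode, device) identifier; the symlink seen first is
# replaced by the real directory, and a path under a symlinked parent loses
# to its fully resolved twin.
print(dedupe_paths(["/bin", "/usr/bin"]))  # expected: ["/usr/bin"]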
@@ -243,7 +253,7 @@ def prefix_from_path(self, *, path: str) -> str:
raise NotImplementedError("must be implemented by derived classes")
def detect_specs(
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
self, *, pkg: Type["spack.package_base.PackageBase"], paths: Iterable[str]
) -> List["spack.spec.Spec"]:
"""Given a list of files matching the search patterns, returns a list of detected specs.
@@ -259,6 +269,8 @@ def detect_specs(
)
return []
from spack.repo import PATH as repo_path
result = []
for candidate_path, items_in_prefix in _group_by_prefix(
llnl.util.lang.dedupe(paths)
@@ -305,7 +317,10 @@ def detect_specs(
resolved_specs[spec] = candidate_path
try:
spec.validate_detection()
# Validate the spec calling a package specific method
pkg_cls = repo_path.get_pkg_class(spec.name)
validate_fn = getattr(pkg_cls, "validate_detected_spec", lambda x, y: None)
validate_fn(spec, spec.extra_attributes)
except Exception as e:
msg = (
f'"{spec}" has been detected on the system but will '

View File

@@ -462,8 +462,7 @@ def _execute_extends(pkg):
if dep_spec.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, spack.spec.Spec("python-venv"), when=when, type=("build", "run"))
# TODO: the values of the extendees dictionary are not used. Remove in next refactor.
pkg.extendees[dep_spec.name] = (dep_spec, None)
pkg.extendees[dep_spec.name] = (dep_spec, when_spec)
return _execute_extends
@@ -568,7 +567,7 @@ def patch(
"""
def _execute_patch(
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency]
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency],
) -> None:
pkg = pkg_or_dep.pkg if isinstance(pkg_or_dep, Dependency) else pkg_or_dep

View File

@@ -25,7 +25,7 @@
}
def _check_concrete(spec):
def _check_concrete(spec: "spack.spec.Spec") -> None:
"""If the spec is not concrete, raise a ValueError"""
if not spec.concrete:
raise ValueError("Specs passed to a DirectoryLayout must be concrete!")
@@ -51,7 +51,7 @@ def specs_from_metadata_dirs(root: str) -> List["spack.spec.Spec"]:
spec = _get_spec(prefix)
if spec:
spec.prefix = prefix
spec.set_prefix(prefix)
specs.append(spec)
continue
@@ -84,7 +84,7 @@ class DirectoryLayout:
def __init__(
self,
root,
root: str,
*,
projections: Optional[Dict[str, str]] = None,
hash_length: Optional[int] = None,
@@ -120,17 +120,17 @@ def __init__(
self.manifest_file_name = "install_manifest.json"
@property
def hidden_file_regexes(self):
def hidden_file_regexes(self) -> Tuple[str]:
return ("^{0}$".format(re.escape(self.metadata_dir)),)
def relative_path_for_spec(self, spec):
def relative_path_for_spec(self, spec: "spack.spec.Spec") -> str:
_check_concrete(spec)
projection = spack.projections.get_projection(self.projections, spec)
path = spec.format_path(projection)
return str(Path(path))
def write_spec(self, spec, path):
def write_spec(self, spec: "spack.spec.Spec", path: str) -> None:
"""Write a spec out to a file."""
_check_concrete(spec)
with open(path, "w", encoding="utf-8") as f:
@@ -138,7 +138,7 @@ def write_spec(self, spec, path):
# the full provenance, so it's available if we want it later
spec.to_json(f, hash=ht.dag_hash)
def write_host_environment(self, spec):
def write_host_environment(self, spec: "spack.spec.Spec") -> None:
"""The host environment is a json file with os, kernel, and spack
versioning. We use it in the case that an analysis later needs to
easily access this information.
@@ -148,7 +148,7 @@ def write_host_environment(self, spec):
with open(env_file, "w", encoding="utf-8") as fd:
sjson.dump(environ, fd)
def read_spec(self, path):
def read_spec(self, path: str) -> "spack.spec.Spec":
"""Read the contents of a file and parse them as a spec"""
try:
with open(path, encoding="utf-8") as f:
@@ -159,26 +159,28 @@ def read_spec(self, path):
# Too late for conversion; spec_file_path() already called.
spec = spack.spec.Spec.from_yaml(f)
else:
raise SpecReadError(
"Did not recognize spec file extension:" " {0}".format(extension)
)
raise SpecReadError(f"Did not recognize spec file extension: {extension}")
except Exception as e:
if spack.config.get("config:debug"):
raise
raise SpecReadError("Unable to read file: %s" % path, "Cause: " + str(e))
raise SpecReadError(f"Unable to read file: {path}", f"Cause: {e}")
# Specs read from actual installations are always concrete
spec._mark_concrete()
return spec
def spec_file_path(self, spec):
def spec_file_path(self, spec: "spack.spec.Spec") -> str:
"""Gets full path to spec file"""
_check_concrete(spec)
yaml_path = os.path.join(self.metadata_path(spec), self._spec_file_name_yaml)
json_path = os.path.join(self.metadata_path(spec), self.spec_file_name)
return yaml_path if os.path.exists(yaml_path) else json_path
def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
def deprecated_file_path(
self,
deprecated_spec: "spack.spec.Spec",
deprecator_spec: Optional["spack.spec.Spec"] = None,
) -> str:
"""Gets full path to spec file for deprecated spec
If the deprecator_spec is provided, use that. Otherwise, assume
@@ -212,16 +214,16 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
return yaml_path if os.path.exists(yaml_path) else json_path
def metadata_path(self, spec):
def metadata_path(self, spec: "spack.spec.Spec") -> str:
return os.path.join(spec.prefix, self.metadata_dir)
def env_metadata_path(self, spec):
def env_metadata_path(self, spec: "spack.spec.Spec") -> str:
return os.path.join(self.metadata_path(spec), "install_environment.json")
def build_packages_path(self, spec):
def build_packages_path(self, spec: "spack.spec.Spec") -> str:
return os.path.join(self.metadata_path(spec), self.packages_dir)
def create_install_directory(self, spec):
def create_install_directory(self, spec: "spack.spec.Spec") -> None:
_check_concrete(spec)
# Create install directory with properly configured permissions
@@ -239,7 +241,7 @@ def create_install_directory(self, spec):
self.write_spec(spec, self.spec_file_path(spec))
def ensure_installed(self, spec):
def ensure_installed(self, spec: "spack.spec.Spec") -> None:
"""
Throws InconsistentInstallDirectoryError if:
1. spec prefix does not exist
@@ -266,7 +268,7 @@ def ensure_installed(self, spec):
"Spec file in %s does not match hash!" % spec_file_path
)
def path_for_spec(self, spec):
def path_for_spec(self, spec: "spack.spec.Spec") -> str:
"""Return absolute path from the root to a directory for the spec."""
_check_concrete(spec)
@@ -277,23 +279,13 @@ def path_for_spec(self, spec):
assert not path.startswith(self.root)
return os.path.join(self.root, path)
def remove_install_directory(self, spec, deprecated=False):
def remove_install_directory(self, spec: "spack.spec.Spec", deprecated: bool = False) -> None:
"""Removes a prefix and any empty parent directories from the root.
Raises RemoveFailedError if something goes wrong.
"""
path = self.path_for_spec(spec)
assert path.startswith(self.root)
# Windows readonly files cannot be removed by Python
# directly, change permissions before attempting to remove
if sys.platform == "win32":
kwargs = {
"ignore_errors": False,
"onerror": fs.readonly_file_handler(ignore_errors=False),
}
else:
kwargs = {} # the default value for ignore_errors is false
if deprecated:
if os.path.exists(path):
try:
@@ -304,7 +296,16 @@ def remove_install_directory(self, spec, deprecated=False):
raise RemoveFailedError(spec, path, e) from e
elif os.path.exists(path):
try:
shutil.rmtree(path, **kwargs)
if sys.platform == "win32":
# Windows readonly files cannot be removed by Python
# directly, change permissions before attempting to remove
shutil.rmtree(
path,
ignore_errors=False,
onerror=fs.readonly_file_handler(ignore_errors=False),
)
else:
shutil.rmtree(path)
except OSError as e:
raise RemoveFailedError(spec, path, e) from e

View File

@@ -12,3 +12,13 @@ class InstallRecordStatus(enum.Flag):
DEPRECATED = enum.auto()
MISSING = enum.auto()
ANY = INSTALLED | DEPRECATED | MISSING
class ConfigScopePriority(enum.IntEnum):
"""Priorities of the different kind of config scopes used by Spack"""
BUILTIN = 0
CONFIG_FILES = 1
CUSTOM = 2
ENVIRONMENT = 3
COMMAND_LINE = 4
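Since ``ConfigScopePriority`` is an ``IntEnum``, priorities compare and sort numerically, which is what lets ``push_scope`` order scopes; for instance:

assert ConfigScopePriority.COMMAND_LINE > ConfigScopePriority.ENVIRONMENT
assert sorted(ConfigScopePriority)[0] is ConfigScopePriority.BUILTIN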

View File

@@ -166,7 +166,7 @@ def __init__(
" ".join(self._install_target(s.safe_name()) for s in item.prereqs),
item.target.spec_hash(),
item.target.unsafe_format(
"{name}{@version}{%compiler}{variants}{arch=architecture}"
"{name}{@version}{variants}{ arch=architecture} {%compiler}"
),
item.buildcache_flag,
)

View File

@@ -10,8 +10,6 @@
import re
import shutil
import stat
import urllib.parse
import urllib.request
import warnings
from typing import Any, Dict, Iterable, List, Optional, Sequence, Tuple, Union
@@ -32,7 +30,6 @@
import spack.paths
import spack.repo
import spack.schema.env
import spack.schema.merged
import spack.spec
import spack.spec_list
import spack.store
@@ -43,7 +40,6 @@
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url
from spack import traverse
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
@@ -51,6 +47,8 @@
from spack.spec_list import SpecList
from spack.util.path import substitute_path_variables
from ..enums import ConfigScopePriority
SpecPair = spack.concretize.SpecPair
#: environment variable used to indicate the active environment
@@ -387,6 +385,7 @@ def create_in_dir(
# dev paths in this environment to refer to their original
# locations.
_rewrite_relative_dev_paths_on_relocation(env, init_file_dir)
_rewrite_relative_repos_paths_on_relocation(env, init_file_dir)
return env
@@ -403,8 +402,8 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
dev_path = substitute_path_variables(entry["path"])
expanded_path = spack.util.path.canonicalize_path(dev_path, default_wd=init_file_dir)
# Skip if the expanded path is the same (e.g. when absolute)
if dev_path == expanded_path:
# Skip if the substituted and expanded path is the same (e.g. when absolute)
if entry["path"] == expanded_path:
continue
tty.debug("Expanding develop path for {0} to {1}".format(name, expanded_path))
@@ -419,6 +418,34 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
env._re_read()
def _rewrite_relative_repos_paths_on_relocation(env, init_file_dir):
"""When initializing the environment from a manifest file and we plan
to store the environment in a different directory, we have to rewrite
relative repo paths to absolute ones and expand environment variables."""
with env:
repos_specs = spack.config.get("repos", default={}, scope=env.scope_name)
if not repos_specs:
return
for i, entry in enumerate(repos_specs):
repo_path = substitute_path_variables(entry)
expanded_path = spack.util.path.canonicalize_path(repo_path, default_wd=init_file_dir)
# Skip if the substituted and expanded path is the same (e.g. when absolute)
if entry == expanded_path:
continue
tty.debug("Expanding repo path for {0} to {1}".format(entry, expanded_path))
repos_specs[i] = expanded_path
spack.config.set("repos", repos_specs, scope=env.scope_name)
env.repos_specs = None
# If we changed the environment's spack.yaml scope, that will not be reflected
# in the manifest that we read
env._re_read()
def environment_dir_from_name(name: str, exists_ok: bool = True) -> str:
"""Returns the directory associated with a named environment.
@@ -546,13 +573,6 @@ def _write_yaml(data, str_or_file):
syaml.dump_config(data, str_or_file, default_flow_style=False)
def _eval_conditional(string):
"""Evaluate conditional definitions using restricted variable scope."""
valid_variables = spack.spec.get_host_environment()
valid_variables.update({"re": re, "env": os.environ})
return eval(string, valid_variables)
def _is_dev_spec_and_has_changed(spec):
"""Check if the passed spec is a dev build and whether it has changed since the
last installation"""
@@ -985,7 +1005,7 @@ def _process_definition(self, entry):
"""Process a single spec definition item."""
when_string = entry.get("when")
if when_string is not None:
when = _eval_conditional(when_string)
when = spack.spec.eval_conditional(when_string)
assert len([x for x in entry if x != "when"]) == 1
else:
when = True
@@ -1108,11 +1128,6 @@ def user_specs(self):
@property
def dev_specs(self):
if not self._dev_specs:
self._dev_specs = self._read_dev_specs()
return self._dev_specs
def _read_dev_specs(self):
dev_specs = {}
dev_config = spack.config.get("develop", {})
for name, entry in dev_config.items():
@@ -1530,9 +1545,6 @@ def _get_specs_to_concretize(
return new_user_specs, kept_user_specs, specs_to_concretize
def _concretize_together_where_possible(self, tests: bool = False) -> Sequence[SpecPair]:
# Avoid cyclic dependency
import spack.solver.asp
# Exit early if the set of concretized specs is the set of user specs
new_user_specs, _, specs_to_concretize = self._get_specs_to_concretize()
if not new_user_specs:
@@ -2392,6 +2404,8 @@ def invalidate_repository_cache(self):
def __enter__(self):
self._previous_active = _active_environment
if self._previous_active:
deactivate()
activate(self)
return self
@@ -2641,20 +2655,23 @@ def _ensure_env_dir():
# error handling for bad manifests is handled on other code paths
return
# TODO: make this recursive
includes = manifest[TOP_LEVEL_KEY].get("include", [])
for include in includes:
if os.path.isabs(include):
included_path = spack.config.included_path(include)
path = included_path.path
if os.path.isabs(path):
continue
abspath = pathlib.Path(os.path.normpath(environment_dir / include))
abspath = pathlib.Path(os.path.normpath(environment_dir / path))
common_path = pathlib.Path(os.path.commonpath([environment_dir, abspath]))
if common_path != environment_dir:
tty.debug(f"Will not copy relative include from outside environment: {include}")
tty.debug(f"Will not copy relative include file from outside environment: {path}")
continue
orig_abspath = os.path.normpath(envfile.parent / include)
orig_abspath = os.path.normpath(envfile.parent / path)
if not os.path.exists(orig_abspath):
tty.warn(f"Included file does not exist; will not copy: '{include}'")
tty.warn(f"Included file does not exist; will not copy: '{path}'")
continue
fs.touchp(abspath)
@@ -2877,7 +2894,7 @@ def extract_name(_item):
continue
condition_str = item.get("when", "True")
if not _eval_conditional(condition_str):
if not spack.spec.eval_conditional(condition_str):
continue
yield idx, item
@@ -2938,127 +2955,20 @@ def __iter__(self):
def __str__(self):
return str(self.manifest_file)
@property
def included_config_scopes(self) -> List[spack.config.ConfigScope]:
"""List of included configuration scopes from the manifest.
Scopes are listed in the YAML file in order from highest to
lowest precedence, so configuration from earlier scopes will take
precedence over later ones.
This routine returns them in the order they should be pushed onto
the internal scope stack (so, in reverse, from lowest to highest).
Returns: Configuration scopes associated with the environment manifest
Raises:
SpackEnvironmentError: if the manifest includes a remote file but
no configuration stage directory has been identified
"""
scopes: List[spack.config.ConfigScope] = []
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
includes = self[TOP_LEVEL_KEY].get("include", [])
missing = []
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# If scheme is not valid, config_path is not a url
# of a type Spack is generally aware
if spack.util.url.validate_scheme(include_url.scheme):
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, str(self.config_stage_dir), skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):
config_path = os.path.join(self.manifest_dir, config_path)
config_path = os.path.normpath(os.path.realpath(config_path))
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = f"env:{self.name}:{os.path.basename(config_path)}"
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
scopes.append(spack.config.DirectoryConfigScope(config_name, config_path))
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = f"env:{self.name}:{config_path}"
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
scopes.append(
spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
)
)
else:
missing.append(config_path)
continue
if missing:
msg = "Detected {0} missing include path(s):".format(len(missing))
msg += "\n {0}".format("\n ".join(missing))
raise spack.config.ConfigFileError(msg)
return scopes
@property
def env_config_scopes(self) -> List[spack.config.ConfigScope]:
"""A list of all configuration scopes for the environment manifest. On the first call this
instantiates all the scopes, on subsequent calls it returns the cached list."""
if self._config_scopes is not None:
return self._config_scopes
scopes: List[spack.config.ConfigScope] = [
*self.included_config_scopes,
spack.config.SingleFileScope(
self.scope_name,
str(self.manifest_file),
spack.schema.env.schema,
yaml_path=[TOP_LEVEL_KEY],
),
)
]
ensure_no_disallowed_env_config_mods(scopes)
self._config_scopes = scopes
@@ -3067,14 +2977,12 @@ def env_config_scopes(self) -> List[spack.config.ConfigScope]:
def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.push_scope(scope)
spack.config.CONFIG.ensure_scope_ordering()
spack.config.CONFIG.push_scope(scope, priority=ConfigScopePriority.ENVIRONMENT)
def deactivate_config_scope(self) -> None:
"""Remove any of the manifest's scopes from the global config path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.remove_scope(scope.name)
spack.config.CONFIG.ensure_scope_ordering()
@contextlib.contextmanager
def use_config(self):

View File

@@ -8,6 +8,7 @@
import llnl.util.tty as tty
from llnl.util.tty.color import colorize
import spack.config
import spack.environment as ev
import spack.repo
import spack.schema.environment
@@ -158,7 +159,8 @@ def activate(
# become PATH variables.
#
env_vars_yaml = env.manifest.configuration.get("env_vars", None)
with env.manifest.use_config():
env_vars_yaml = spack.config.get("env_vars", None)
if env_vars_yaml:
env_mods.extend(spack.schema.environment.parse(env_vars_yaml))
@@ -195,7 +197,8 @@ def deactivate() -> EnvironmentModifications:
if active is None:
return env_mods
env_vars_yaml = active.manifest.configuration.get("env_vars", None)
with active.manifest.use_config():
env_vars_yaml = spack.config.get("env_vars", None)
if env_vars_yaml:
env_mods.extend(spack.schema.environment.parse(env_vars_yaml).reversed())

View File

@@ -9,7 +9,8 @@
import shutil
import stat
import sys
from typing import Callable, Dict, Optional
import tempfile
from typing import Callable, Dict, List, Optional
from typing_extensions import Literal
@@ -77,7 +78,7 @@ def view_copy(
# Order of this dict is somewhat irrelevant
prefix_to_projection = {
s.prefix: view.get_projection_for_spec(s)
str(s.prefix): view.get_projection_for_spec(s)
for s in spec.traverse(root=True, order="breadth")
if not s.external
}
@@ -184,7 +185,7 @@ def __init__(
def link(self, src: str, dst: str, spec: Optional[spack.spec.Spec] = None) -> None:
self._link(src, dst, self, spec)
def add_specs(self, *specs, **kwargs):
def add_specs(self, *specs: spack.spec.Spec, **kwargs) -> None:
"""
Add given specs to view.
@@ -199,19 +200,19 @@ def add_specs(self, *specs, **kwargs):
"""
raise NotImplementedError
def add_standalone(self, spec):
def add_standalone(self, spec: spack.spec.Spec) -> bool:
"""
Add (link) a standalone package into this view.
"""
raise NotImplementedError
def check_added(self, spec):
def check_added(self, spec: spack.spec.Spec) -> bool:
"""
Check if the given concrete spec is active in this view.
"""
raise NotImplementedError
def remove_specs(self, *specs, **kwargs):
def remove_specs(self, *specs: spack.spec.Spec, **kwargs) -> None:
"""
Removes given specs from view.
@@ -230,25 +231,25 @@ def remove_specs(self, *specs, **kwargs):
"""
raise NotImplementedError
def remove_standalone(self, spec):
def remove_standalone(self, spec: spack.spec.Spec) -> None:
"""
Remove (unlink) a standalone package from this view.
"""
raise NotImplementedError
def get_projection_for_spec(self, spec):
def get_projection_for_spec(self, spec: spack.spec.Spec) -> str:
"""
Get the projection in this view for a spec.
"""
raise NotImplementedError
def get_all_specs(self):
def get_all_specs(self) -> List[spack.spec.Spec]:
"""
Get all specs currently active in this view.
"""
raise NotImplementedError
def get_spec(self, spec):
def get_spec(self, spec: spack.spec.Spec) -> Optional[spack.spec.Spec]:
"""
Return the actual spec linked in this view (i.e. do not look it up
in the database by name).
@@ -262,7 +263,7 @@ def get_spec(self, spec):
"""
raise NotImplementedError
def print_status(self, *specs, **kwargs):
def print_status(self, *specs: spack.spec.Spec, **kwargs) -> None:
"""
Print a short summary about the given specs, detailing whether..
* ..they are active in the view.
@@ -642,7 +643,7 @@ def print_status(self, *specs, **kwargs):
specs.sort()
abbreviated = [
s.cformat("{name}{@version}{%compiler}{compiler_flags}{variants}")
s.cformat("{name}{@version}{compiler_flags}{variants}{%compiler}")
for s in specs
]
@@ -693,7 +694,7 @@ def _sanity_check_view_projection(self, specs):
raise ConflictingSpecsError(current_spec, conflicting_spec)
seen[metadata_dir] = current_spec
def add_specs(self, *specs: spack.spec.Spec) -> None:
def add_specs(self, *specs, **kwargs) -> None:
"""Link a root-to-leaf topologically ordered list of specs into the view."""
assert all((s.concrete for s in specs))
if len(specs) == 0:
@@ -708,7 +709,10 @@ def add_specs(self, *specs: spack.spec.Spec) -> None:
def skip_list(file):
return os.path.basename(file) == spack.store.STORE.layout.metadata_dir
visitor = SourceMergeVisitor(ignore=skip_list)
# Determine if the root is on a case-insensitive filesystem
normalize_paths = is_folder_on_case_insensitive_filesystem(self._root)
visitor = SourceMergeVisitor(ignore=skip_list, normalize_paths=normalize_paths)
# Gather all the directories to be made and files to be linked
for spec in specs:
@@ -827,7 +831,7 @@ def get_projection_for_spec(self, spec):
#####################
# utility functions #
#####################
def get_spec_from_file(filename):
def get_spec_from_file(filename) -> Optional[spack.spec.Spec]:
try:
with open(filename, "r", encoding="utf-8") as f:
return spack.spec.Spec.from_yaml(f)
@@ -884,3 +888,8 @@ def get_dependencies(specs):
class ConflictingProjectionsError(SpackError):
"""Raised when a view has a projections file and is given one manually."""
def is_folder_on_case_insensitive_filesystem(path: str) -> bool:
with tempfile.NamedTemporaryFile(dir=path, prefix=".sentinel") as sentinel:
return os.path.exists(os.path.join(path, os.path.basename(sentinel.name).upper()))
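The sentinel trick above creates a throwaway ``.sentinel*`` file and probes for its upper-cased name; only a case-insensitive filesystem (the default on macOS and Windows) reports it as existing. A hedged usage sketch:

import tempfile
with tempfile.TemporaryDirectory() as tmp:  # illustrative target directory
    if is_folder_on_case_insensitive_filesystem(tmp):
        print("case-insensitive: merged view paths must be normalized")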

View File

@@ -42,10 +42,10 @@
import llnl.util.tty.color
import spack.deptypes as dt
import spack.repo
import spack.spec
import spack.tengine
import spack.traverse
from spack.solver.input_analysis import create_graph_analyzer
def find(seq, predicate):
@@ -482,7 +482,7 @@ class SimpleDAG(DotGraphBuilder):
"""Simple DOT graph, with nodes colored uniformly and edges without properties"""
def node_entry(self, node):
format_option = "{name}{@version}{%compiler}{/hash:7}"
format_option = "{name}{@version}{/hash:7}{%compiler}"
return node.dag_hash(), f'[label="{node.format(format_option)}"]'
def edge_entry(self, edge):
@@ -515,7 +515,7 @@ def visit(self, edge):
super().visit(edge)
def node_entry(self, node):
node_str = node.format("{name}{@version}{%compiler}{/hash:7}")
node_str = node.format("{name}{@version}{/hash:7}{%compiler}")
options = f'[label="{node_str}", group="build_dependencies", fillcolor="coral"]'
if node.dag_hash() in self.main_unified_space:
options = f'[label="{node_str}", group="main_psid"]'
@@ -537,10 +537,11 @@ def edge_entry(self, edge):
def _static_edges(specs, depflag):
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
possible = pkg_cls.possible_dependencies(expand_virtuals=True, depflag=depflag)
*_, edges = create_graph_analyzer().possible_dependencies(
spec.name, expand_virtuals=True, allowed_deps=depflag
)
for parent_name, dependencies in possible.items():
for parent_name, dependencies in edges.items():
for dependency_name in dependencies:
yield spack.spec.DependencySpec(
spack.spec.Spec(parent_name),

View File

@@ -6,7 +6,7 @@
import spack.deptypes as dt
import spack.repo
hashes = []
HASHES = []
class SpecHashDescriptor:
@@ -23,7 +23,7 @@ def __init__(self, depflag: dt.DepFlag, package_hash, name, override=None):
self.depflag = depflag
self.package_hash = package_hash
self.name = name
hashes.append(self)
HASHES.append(self)
# Allow spec hashes to have an alternate computation method
self.override = override
@@ -43,13 +43,9 @@ def __repr__(self):
)
#: Spack's deployment hash. Includes all inputs that can affect how a package is built.
dag_hash = SpecHashDescriptor(depflag=dt.BUILD | dt.LINK | dt.RUN, package_hash=True, name="hash")
#: Hash descriptor used only to transfer a DAG, as is, across processes
process_hash = SpecHashDescriptor(
depflag=dt.BUILD | dt.LINK | dt.RUN | dt.TEST, package_hash=True, name="process_hash"
#: The DAG hash includes all inputs that can affect how a package is built.
dag_hash = SpecHashDescriptor(
depflag=dt.BUILD | dt.LINK | dt.RUN | dt.TEST, package_hash=True, name="hash"
)
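The rename from `hashes` to `HASHES` signals that the list is a module-level registry: every `SpecHashDescriptor` appends itself on construction, so iterating the constant yields all known hash types. A minimal sketch of the pattern, with hypothetical names:

from typing import List

REGISTRY: List["Descriptor"] = []

class Descriptor:
    def __init__(self, name: str) -> None:
        self.name = name
        REGISTRY.append(self)  # each instance registers itself at definition time

dag = Descriptor("hash")
assert REGISTRY[-1] is dag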


@@ -2,198 +2,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import fnmatch
import io
import os
import re
from typing import Dict, List, Union
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, visit_directory_tree
from llnl.util.lang import stable_partition
from llnl.util.filesystem import visit_directory_tree
import spack.config
import spack.error
import spack.util.elf as elf
#: Patterns for names of libraries that are allowed to be unresolved when *just* looking at RPATHs
#: added by Spack. These are libraries outside of Spack's control, and assumed to be located in
#: default search paths of the dynamic linker.
ALLOW_UNRESOLVED = [
# kernel
"linux-vdso.so.*",
"libselinux.so.*",
# musl libc
"ld-musl-*.so.*",
# glibc
"ld-linux*.so.*",
"ld64.so.*",
"libanl.so.*",
"libc.so.*",
"libdl.so.*",
"libm.so.*",
"libmemusage.so.*",
"libmvec.so.*",
"libnsl.so.*",
"libnss_compat.so.*",
"libnss_db.so.*",
"libnss_dns.so.*",
"libnss_files.so.*",
"libnss_hesiod.so.*",
"libpcprofile.so.*",
"libpthread.so.*",
"libresolv.so.*",
"librt.so.*",
"libSegFault.so.*",
"libthread_db.so.*",
"libutil.so.*",
# gcc -- this is required even with gcc-runtime, because e.g. libstdc++ depends on libgcc_s,
# but the binaries we copy from the compiler don't have an $ORIGIN rpath.
"libasan.so.*",
"libatomic.so.*",
"libcc1.so.*",
"libgcc_s.so.*",
"libgfortran.so.*",
"libgomp.so.*",
"libitm.so.*",
"liblsan.so.*",
"libquadmath.so.*",
"libssp.so.*",
"libstdc++.so.*",
"libtsan.so.*",
"libubsan.so.*",
# systemd
"libudev.so.*",
# cuda driver
"libcuda.so.*",
]
def is_compatible(parent: elf.ElfFile, child: elf.ElfFile) -> bool:
return (
child.elf_hdr.e_type == elf.ELF_CONSTANTS.ET_DYN
and parent.is_little_endian == child.is_little_endian
and parent.is_64_bit == child.is_64_bit
and parent.elf_hdr.e_machine == child.elf_hdr.e_machine
)
def candidate_matches(current_elf: elf.ElfFile, candidate_path: bytes) -> bool:
try:
with open(candidate_path, "rb") as g:
return is_compatible(current_elf, elf.parse_elf(g))
except (OSError, elf.ElfParsingError):
return False
class Problem:
def __init__(
self, resolved: Dict[bytes, bytes], unresolved: List[bytes], relative_rpaths: List[bytes]
) -> None:
self.resolved = resolved
self.unresolved = unresolved
self.relative_rpaths = relative_rpaths
class ResolveSharedElfLibDepsVisitor(BaseDirectoryVisitor):
def __init__(self, allow_unresolved_patterns: List[str]) -> None:
self.problems: Dict[str, Problem] = {}
self._allow_unresolved_regex = re.compile(
"|".join(fnmatch.translate(x) for x in allow_unresolved_patterns)
)
def allow_unresolved(self, needed: bytes) -> bool:
try:
name = needed.decode("utf-8")
except UnicodeDecodeError:
return False
return bool(self._allow_unresolved_regex.match(name))
def visit_file(self, root: str, rel_path: str, depth: int) -> None:
# We work with byte strings for paths.
path = os.path.join(root, rel_path).encode("utf-8")
# For $ORIGIN interpolation: should not have trailing dir separator.
origin = os.path.dirname(path)
# Retrieve the needed libs + rpaths.
try:
with open(path, "rb") as f:
parsed_elf = elf.parse_elf(f, interpreter=False, dynamic_section=True)
except (OSError, elf.ElfParsingError):
# Not dealing with an invalid ELF file.
return
# If there's no needed libs all is good
if not parsed_elf.has_needed:
return
# Get the needed libs and rpaths (notice: byte strings)
# Don't force an encoding, because paths are just a bag of bytes.
needed_libs = parsed_elf.dt_needed_strs
rpaths = parsed_elf.dt_rpath_str.split(b":") if parsed_elf.has_rpath else []
# We only interpolate $ORIGIN, not $LIB and $PLATFORM, they're not really
# supported in general. Also remove empty paths.
rpaths = [x.replace(b"$ORIGIN", origin) for x in rpaths if x]
# Do not allow relative rpaths (they are relative to the current working directory)
rpaths, relative_rpaths = stable_partition(rpaths, os.path.isabs)
# If there's a / in the needed lib, it's opened directly, otherwise it needs
# a search.
direct_libs, search_libs = stable_partition(needed_libs, lambda x: b"/" in x)
# Do not allow relative paths in direct libs (they are relative to the current working
# directory)
direct_libs, unresolved = stable_partition(direct_libs, os.path.isabs)
resolved: Dict[bytes, bytes] = {}
for lib in search_libs:
if self.allow_unresolved(lib):
continue
for rpath in rpaths:
candidate = os.path.join(rpath, lib)
if candidate_matches(parsed_elf, candidate):
resolved[lib] = candidate
break
else:
unresolved.append(lib)
# Check if directly opened libs are compatible
for lib in direct_libs:
if candidate_matches(parsed_elf, lib):
resolved[lib] = lib
else:
unresolved.append(lib)
if unresolved or relative_rpaths:
self.problems[rel_path] = Problem(resolved, unresolved, relative_rpaths)
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
pass
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# There can be binaries in .spack/test which shouldn't be checked.
if rel_path == ".spack":
return False
return True
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
return False
class CannotLocateSharedLibraries(spack.error.SpackError):
pass
def maybe_decode(byte_str: bytes) -> Union[str, bytes]:
try:
return byte_str.decode("utf-8")
except UnicodeDecodeError:
return byte_str
import spack.verify_libraries
def post_install(spec, explicit):
@@ -204,36 +20,23 @@ def post_install(spec, explicit):
if policy == "ignore" or spec.external or spec.platform not in ("linux", "freebsd"):
return
visitor = ResolveSharedElfLibDepsVisitor(
[*ALLOW_UNRESOLVED, *spec.package.unresolved_libraries]
visitor = spack.verify_libraries.ResolveSharedElfLibDepsVisitor(
[*spack.verify_libraries.ALLOW_UNRESOLVED, *spec.package.unresolved_libraries]
)
visit_directory_tree(spec.prefix, visitor)
# All good?
if not visitor.problems:
return
# For now just list the issues (print it in ldd style, except we don't recurse)
output = io.StringIO()
output.write("not all executables and libraries can resolve their dependencies:\n")
for path, problem in visitor.problems.items():
output.write(path)
output.write("\n")
for needed, full_path in problem.resolved.items():
output.write(" ")
if needed == full_path:
output.write(maybe_decode(needed))
else:
output.write(f"{maybe_decode(needed)} => {maybe_decode(full_path)}")
output.write("\n")
for not_found in problem.unresolved:
output.write(f" {maybe_decode(not_found)} => not found\n")
for relative_rpath in problem.relative_rpaths:
output.write(f" {maybe_decode(relative_rpath)} => relative rpath\n")
output = io.StringIO("not all executables and libraries can resolve their dependencies:\n")
visitor.write(output)
message = output.getvalue().strip()
if policy == "error":
raise CannotLocateSharedLibraries(message)
tty.warn(message)
class CannotLocateSharedLibraries(spack.error.SpackError):
pass
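With the ELF-resolution machinery relocated to `spack.verify_libraries`, the hook shrinks to policy handling and the visitor can be driven directly. A minimal sketch, assuming the module layout above and the `write` method the hook now calls:

import io

from llnl.util.filesystem import visit_directory_tree

import spack.verify_libraries

visitor = spack.verify_libraries.ResolveSharedElfLibDepsVisitor(
    spack.verify_libraries.ALLOW_UNRESOLVED
)
visit_directory_tree("/path/to/prefix", visitor)  # hypothetical install prefix
if visitor.problems:
    buf = io.StringIO()
    buf.write("not all executables and libraries can resolve their dependencies:\n")
    visitor.write(buf)  # ldd-style report, one block per problematic file
    print(buf.getvalue().strip())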


@@ -21,7 +21,6 @@
from llnl.util.lang import nullcontext
from llnl.util.tty.color import colorize
import spack.build_environment
import spack.config
import spack.error
import spack.package_base
@@ -398,7 +397,7 @@ def stand_alone_tests(self, kwargs):
Args:
kwargs (dict): arguments to be used by the test process
"""
import spack.build_environment
import spack.build_environment # avoid circular dependency
spack.build_environment.start_build_process(self.pkg, test_process, kwargs)
@@ -463,6 +462,8 @@ def write_tested_status(self):
@contextlib.contextmanager
def test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = ".", verbose: bool = False):
import spack.build_environment # avoid circular dependency
wdir = "." if work_dir is None else work_dir
tester = pkg.tester
assert test_name and test_name.startswith(
@@ -566,7 +567,7 @@ def copy_test_files(pkg: Pb, test_spec: spack.spec.Spec):
# copy test data into test stage data dir
try:
pkg_cls = test_spec.package_class
pkg_cls = spack.repo.PATH.get_pkg_class(test_spec.fullname)
except spack.repo.UnknownPackageError:
tty.debug(f"{test_spec.name}: skipping test data copy since no package class found")
return
@@ -623,7 +624,7 @@ def test_functions(
vpkgs = virtuals(pkg)
for vname in vpkgs:
try:
classes.append((Spec(vname)).package_class)
classes.append(spack.repo.PATH.get_pkg_class(vname))
except spack.repo.UnknownPackageError:
tty.debug(f"{vname}: virtual does not appear to have a package file")
@@ -668,7 +669,7 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
# grab test functions associated with the spec, which may be virtual
try:
tests = test_functions(spec.package_class)
tests = test_functions(spack.repo.PATH.get_pkg_class(spec.fullname))
except spack.repo.UnknownPackageError:
# Some virtuals don't have a package so we don't want to report
# them as not having tests when that isn't appropriate.


@@ -47,6 +47,8 @@
import spack.util.environment
import spack.util.lock
from .enums import ConfigScopePriority
#: names of profile statistics
stat_names = pstats.Stats.sort_arg_dict_default
@@ -872,14 +874,19 @@ def add_command_line_scopes(
scopes = ev.environment_path_scopes(name, path)
if scopes is None:
if os.path.isdir(path): # directory with config files
cfg.push_scope(spack.config.DirectoryConfigScope(name, path, writable=False))
spack.config._add_platform_scope(cfg, name, path, writable=False)
cfg.push_scope(
spack.config.DirectoryConfigScope(name, path, writable=False),
priority=ConfigScopePriority.CUSTOM,
)
spack.config._add_platform_scope(
cfg, name, path, priority=ConfigScopePriority.CUSTOM, writable=False
)
continue
else:
raise spack.error.ConfigError(f"Invalid configuration scope: {path}")
for scope in scopes:
cfg.push_scope(scope)
cfg.push_scope(scope, priority=ConfigScopePriority.CUSTOM)
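The `priority=` keywords rely on an ordering defined by the new `.enums.ConfigScopePriority`. A plausible sketch of that enum, as an illustration rather than the verbatim definition:

import enum

class ConfigScopePriority(enum.IntEnum):
    # Lower values are overridden by higher ones; command-line scopes win.
    BUILTIN = 0
    CONFIG_FILES = 1
    CUSTOM = 2
    ENVIRONMENT = 3
    COMMAND_LINE = 4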
def _main(argv=None):
@@ -952,7 +959,9 @@ def _main(argv=None):
# Push scopes from the command line last
if args.config_scopes:
add_command_line_scopes(spack.config.CONFIG, args.config_scopes)
spack.config.CONFIG.push_scope(spack.config.InternalConfigScope("command_line"))
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope("command_line"), priority=ConfigScopePriority.COMMAND_LINE
)
setup_main_options(args)
# ------------------------------------------------------------------------
@@ -998,6 +1007,7 @@ def finish_parse_and_run(parser, cmd_name, main_args, env_format_error):
args, unknown = parser.parse_known_args(main_args.command)
# we need to inherit verbose since the install command checks for it
args.verbose = main_args.verbose
args.lines = main_args.lines
# Now that we know what command this is and what its args are, determine
# whether we can continue with a bad environment and raise if not.


@@ -330,18 +330,17 @@ class BaseConfiguration:
default_projections = {"all": "{name}/{version}-{compiler.name}-{compiler.version}"}
def __init__(self, spec: spack.spec.Spec, module_set_name: str, explicit: bool) -> None:
# Module where type(self) is defined
m = inspect.getmodule(self)
assert m is not None # make mypy happy
self.module = m
# Spec for which we want to generate a module file
self.spec = spec
self.name = module_set_name
self.explicit = explicit
# Dictionary of configuration options that should be applied
# to the spec
# Dictionary of configuration options that should be applied to the spec
self.conf = merge_config_rules(self.module.configuration(self.name), self.spec)
@property
def module(self):
return inspect.getmodule(self)
@property
def projections(self):
"""Projection from specs to module names"""
@@ -565,6 +564,12 @@ def __init__(self, configuration):
def spec(self):
return self.conf.spec
@tengine.context_property
def tags(self):
if not hasattr(self.spec.package, "tags"):
return []
return self.spec.package.tags
@tengine.context_property
def timestamp(self):
return datetime.datetime.now()
@@ -775,10 +780,6 @@ def __init__(
) -> None:
self.spec = spec
# This class is meant to be derived. Get the module of the
# actual writer.
self.module = inspect.getmodule(self)
assert self.module is not None # make mypy happy
m = self.module
# Create the triplet of configuration/layout/context
@@ -816,6 +817,10 @@ def __init__(
name = type(self).__name__
raise ModulercHeaderNotDefined(msg.format(name))
@property
def module(self):
return inspect.getmodule(self)
def _get_template(self):
"""Gets the template that will be rendered for this spec."""
# Get templates and put them in the order of importance:


@@ -209,7 +209,7 @@ def provides(self):
# All the other tokens in the hierarchy must be virtual dependencies
for x in self.hierarchy_tokens:
if self.spec.package.provides(x):
provides[x] = self.spec[x]
provides[x] = self.spec
return provides
@property


@@ -22,7 +22,6 @@
import textwrap
import time
import traceback
import typing
from typing import Any, Callable, Dict, Iterable, List, Optional, Set, Tuple, Type, TypeVar, Union
from typing_extensions import Literal
@@ -49,6 +48,7 @@
import spack.store
import spack.url
import spack.util.environment
import spack.util.executable
import spack.util.path
import spack.util.web
import spack.variant
@@ -126,9 +126,10 @@ def windows_establish_runtime_linkage(self):
# Spack should in general not modify things it has not installed
# we can reasonably expect externals to have their link interface properly established
if sys.platform == "win32" and not self.spec.external:
self.win_rpath.add_library_dependent(*self.win_add_library_dependent())
self.win_rpath.add_rpath(*self.win_add_rpath())
self.win_rpath.establish_link()
win_rpath = fsys.WindowsSimulatedRPath(self)
win_rpath.add_library_dependent(*self.win_add_library_dependent())
win_rpath.add_rpath(*self.win_add_rpath())
win_rpath.establish_link()
#: Registers which are the detectable packages, by repo and package name
@@ -710,19 +711,6 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
#: Do not include @ here in order not to unnecessarily ping the users.
maintainers: List[str] = []
#: List of attributes to be excluded from a package's hash.
metadata_attrs = [
"homepage",
"url",
"urls",
"list_url",
"extendable",
"parallel",
"make_jobs",
"maintainers",
"tags",
]
#: Set to ``True`` to indicate the stand-alone test requires a compiler.
#: It is used to ensure a compiler and build dependencies like 'cmake'
#: are available to build a custom test code.
@@ -756,7 +744,6 @@ def __init__(self, spec):
# Set up timing variables
self._fetch_time = 0.0
self.win_rpath = fsys.WindowsSimulatedRPath(self)
super().__init__()
def __getitem__(self, key: str) -> "PackageBase":
@@ -822,104 +809,6 @@ def get_variant(self, name: str) -> spack.variant.Variant:
except StopIteration:
raise ValueError(f"No variant '{name}' on spec: {self.spec}")
@classmethod
def possible_dependencies(
cls,
transitive: bool = True,
expand_virtuals: bool = True,
depflag: dt.DepFlag = dt.ALL,
visited: Optional[dict] = None,
missing: Optional[dict] = None,
virtuals: Optional[set] = None,
) -> Dict[str, Set[str]]:
"""Return dict of possible dependencies of this package.
Args:
transitive (bool or None): return all transitive dependencies if
True, only direct dependencies if False (default True)..
expand_virtuals (bool or None): expand virtual dependencies into
all possible implementations (default True)
depflag: dependency types to consider
visited (dict or None): dict of names of dependencies visited so
far, mapped to their immediate dependencies' names.
missing (dict or None): dict to populate with packages and their
*missing* dependencies.
virtuals (set): if provided, populate with virtuals seen so far.
Returns:
(dict): dictionary mapping dependency names to *their*
immediate dependencies
Each item in the returned dictionary maps a (potentially
transitive) dependency of this package to its possible
*immediate* dependencies. If ``expand_virtuals`` is ``False``,
virtual package names wil be inserted as keys mapped to empty
sets of dependencies. Virtuals, if not expanded, are treated as
though they have no immediate dependencies.
Missing dependencies by default are ignored, but if a
missing dict is provided, it will be populated with package names
mapped to any dependencies they have that are in no
repositories. This is only populated if transitive is True.
Note: the returned dict *includes* the package itself.
"""
visited = {} if visited is None else visited
missing = {} if missing is None else missing
visited.setdefault(cls.name, set())
for name, conditions in cls.dependencies_by_name(when=True).items():
# check whether this dependency could be of the type asked for
depflag_union = 0
for deplist in conditions.values():
for dep in deplist:
depflag_union |= dep.depflag
if not (depflag & depflag_union):
continue
# expand virtuals if enabled, otherwise just stop at virtuals
if spack.repo.PATH.is_virtual(name):
if virtuals is not None:
virtuals.add(name)
if expand_virtuals:
providers = spack.repo.PATH.providers_for(name)
dep_names = [spec.name for spec in providers]
else:
visited.setdefault(cls.name, set()).add(name)
visited.setdefault(name, set())
continue
else:
dep_names = [name]
# add the dependency names to the visited dict
visited.setdefault(cls.name, set()).update(set(dep_names))
# recursively traverse dependencies
for dep_name in dep_names:
if dep_name in visited:
continue
visited.setdefault(dep_name, set())
# skip the rest if not transitive
if not transitive:
continue
try:
dep_cls = spack.repo.PATH.get_pkg_class(dep_name)
except spack.repo.UnknownPackageError:
# log unknown packages
missing.setdefault(cls.name, set()).add(dep_name)
continue
dep_cls.possible_dependencies(
transitive, expand_virtuals, depflag, visited, missing, virtuals
)
return visited
@classproperty
def package_dir(cls):
"""Directory where the package.py file lives."""
@@ -1399,12 +1288,13 @@ def extendee_spec(self):
if not self.extendees:
return None
deps = []
# If the extendee is in the spec's deps already, return that.
for dep in self.spec.traverse(deptype=("link", "run")):
if dep.name in self.extendees:
deps.append(dep)
deps = [
dep
for dep in self.spec.dependencies(deptype=("link", "run"))
for d, when in self.extendees.values()
if dep.satisfies(d) and self.spec.satisfies(when)
]
if deps:
assert len(deps) == 1
@@ -1481,6 +1371,14 @@ def prefix(self):
def home(self):
return self.prefix
@property
def command(self) -> spack.util.executable.Executable:
"""Returns the main executable for this package."""
path = os.path.join(self.home.bin, self.spec.name)
if fsys.is_exe(path):
return spack.util.executable.Executable(path)
raise RuntimeError(f"Unable to locate {self.spec.name} command in {self.home.bin}")
@property # type: ignore[misc]
@memoized
def compiler(self):
@@ -2284,47 +2182,6 @@ def rpath_args(self):
build_system_flags = PackageBase.build_system_flags
def possible_dependencies(
*pkg_or_spec: Union[str, spack.spec.Spec, typing.Type[PackageBase]],
transitive: bool = True,
expand_virtuals: bool = True,
depflag: dt.DepFlag = dt.ALL,
missing: Optional[dict] = None,
virtuals: Optional[set] = None,
) -> Dict[str, Set[str]]:
"""Get the possible dependencies of a number of packages.
See ``PackageBase.possible_dependencies`` for details.
"""
packages = []
for pos in pkg_or_spec:
if isinstance(pos, PackageMeta) and issubclass(pos, PackageBase):
packages.append(pos)
continue
if not isinstance(pos, spack.spec.Spec):
pos = spack.spec.Spec(pos)
if spack.repo.PATH.is_virtual(pos.name):
packages.extend(p.package_class for p in spack.repo.PATH.providers_for(pos.name))
continue
else:
packages.append(pos.package_class)
visited: Dict[str, Set[str]] = {}
for pkg in packages:
pkg.possible_dependencies(
visited=visited,
transitive=transitive,
expand_virtuals=expand_virtuals,
depflag=depflag,
missing=missing,
virtuals=virtuals,
)
return visited
def deprecated_version(pkg: PackageBase, version: Union[str, StandardVersion]) -> bool:
"""Return True iff the version is deprecated.


@@ -83,6 +83,7 @@ def __init__(
level: int,
working_dir: str,
reverse: bool = False,
ordering_key: Optional[Tuple[str, int]] = None,
) -> None:
"""Initialize a new Patch instance.
@@ -92,6 +93,7 @@ def __init__(
level: patch level
working_dir: relative path *within* the stage to change to
reverse: reverse the patch
ordering_key: key used to ensure patches are applied in a consistent order
"""
# validate level (must be an integer >= 0)
if not isinstance(level, int) or not level >= 0:
@@ -105,6 +107,13 @@ def __init__(
self.working_dir = working_dir
self.reverse = reverse
# The ordering key is passed when executing package.py directives, and is only relevant
# after a solve to build concrete specs with consistently ordered patches. For concrete
# specs read from a file, we add patches in the order of its patches variants and the
# ordering_key is irrelevant. In that case, use a default value so we don't need to branch
# on whether ordering_key is None where it's used, just to make static analysis happy.
self.ordering_key: Tuple[str, int] = ordering_key or ("", 0)
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
@@ -202,9 +211,8 @@ def __init__(
msg += "package %s.%s does not exist." % (pkg.namespace, pkg.name)
raise ValueError(msg)
super().__init__(pkg, abs_path, level, working_dir, reverse)
super().__init__(pkg, abs_path, level, working_dir, reverse, ordering_key)
self.path = abs_path
self.ordering_key = ordering_key
@property
def sha256(self) -> str:
@@ -266,13 +274,11 @@ def __init__(
archive_sha256: sha256 sum of the *archive*, if the patch is compressed
(only required for compressed URL patches)
"""
super().__init__(pkg, url, level, working_dir, reverse)
super().__init__(pkg, url, level, working_dir, reverse, ordering_key)
self.url = url
self._stage: Optional["spack.stage.Stage"] = None
self.ordering_key = ordering_key
if allowed_archive(self.url) and not archive_sha256:
raise spack.error.PatchDirectiveError(
"Compressed patches require 'archive_sha256' "


@@ -108,6 +108,8 @@ def _get_user_cache_path():
#: transient caches for Spack data (virtual cache, patch sha256 lookup, etc.)
default_misc_cache_path = os.path.join(user_cache_path, "cache")
#: concretization cache for Spack concretizations
default_conc_cache_path = os.path.join(default_misc_cache_path, "concretization")
# Below paths pull configuration from the host environment.
#


@@ -283,21 +283,21 @@ def relocate_text_bin(binaries: Iterable[str], prefix_to_prefix: PrefixToPrefix)
def is_macho_magic(magic: bytes) -> bool:
return (
# In order of popularity: 64-bit mach-o le/be, 32-bit mach-o le/be.
magic.startswith(b"\xCF\xFA\xED\xFE")
or magic.startswith(b"\xFE\xED\xFA\xCF")
or magic.startswith(b"\xCE\xFA\xED\xFE")
or magic.startswith(b"\xFE\xED\xFA\xCE")
magic.startswith(b"\xcf\xfa\xed\xfe")
or magic.startswith(b"\xfe\xed\xfa\xcf")
or magic.startswith(b"\xce\xfa\xed\xfe")
or magic.startswith(b"\xfe\xed\xfa\xce")
# universal binaries: 0xcafebabe be (most common?) or 0xbebafeca le (not sure if exists).
# Here we need to disambiguate mach-o and JVM class files. In mach-o the next 4 bytes are
# the number of binaries; in JVM class files it's the java version number. We assume there
# are less than 10 binaries in a universal binary.
or (magic.startswith(b"\xCA\xFE\xBA\xBE") and int.from_bytes(magic[4:8], "big") < 10)
or (magic.startswith(b"\xBE\xBA\xFE\xCA") and int.from_bytes(magic[4:8], "little") < 10)
or (magic.startswith(b"\xca\xfe\xba\xbe") and int.from_bytes(magic[4:8], "big") < 10)
or (magic.startswith(b"\xbe\xba\xfe\xca") and int.from_bytes(magic[4:8], "little") < 10)
)
def is_elf_magic(magic: bytes) -> bool:
return magic.startswith(b"\x7FELF")
return magic.startswith(b"\x7fELF")
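Beyond the cosmetic switch to lowercase hex escapes, these predicates make it easy to classify a file from its leading bytes. A small hypothetical helper built on them:

def classify(filename: str) -> str:
    # Eight bytes suffice: four for the magic, four to disambiguate
    # universal binaries from JVM class files.
    with open(filename, "rb") as f:
        magic = f.read(8)
    if is_elf_magic(magic):
        return "elf"
    if is_macho_magic(magic):
        return "mach-o"
    return "other"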
def is_binary(filename: str) -> bool:
@@ -406,8 +406,8 @@ def fixup_macos_rpaths(spec):
entries which makes it harder to adjust with ``install_name_tool
-delete_rpath``.
"""
if spec.external or spec.virtual:
tty.warn("external or virtual package cannot be fixed up: {0!s}".format(spec))
if spec.external or not spec.concrete:
tty.warn("external/abstract spec cannot be fixed up: {0!s}".format(spec))
return False
if "platform=darwin" not in spec:


@@ -32,6 +32,7 @@
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
import spack
import spack.caches
import spack.config
import spack.error
@@ -49,6 +50,8 @@
#: Package modules are imported as spack.pkg.<repo-namespace>.<pkg-name>
ROOT_PYTHON_NAMESPACE = "spack.pkg"
_API_REGEX = re.compile(r"^v(\d+)\.(\d+)$")
def python_package_for_repo(namespace):
"""Returns the full namespace of a repository, given its relative one
@@ -909,19 +912,52 @@ def __reduce__(self):
return RepoPath.unmarshal, self.marshal()
def _parse_package_api_version(
config: Dict[str, Any],
min_api: Tuple[int, int] = spack.min_package_api_version,
max_api: Tuple[int, int] = spack.package_api_version,
) -> Tuple[int, int]:
api = config.get("api")
if api is None:
package_api = (1, 0)
else:
if not isinstance(api, str):
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
api_match = _API_REGEX.match(api)
if api_match is None:
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
package_api = (int(api_match.group(1)), int(api_match.group(2)))
if min_api <= package_api <= max_api:
return package_api
min_str = ".".join(str(i) for i in min_api)
max_str = ".".join(str(i) for i in max_api)
curr_str = ".".join(str(i) for i in package_api)
raise BadRepoError(
f"Package API v{curr_str} is not supported by this version of Spack ("
f"must be between v{min_str} and v{max_str})"
)
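Concretely, the parser accepts only `v<major>.<minor>` strings within the supported window. Hypothetical round-trips, assuming the running Spack supports at least v1.0:

_parse_package_api_version({})                # -> (1, 0), the default
_parse_package_api_version({"api": "v1.0"})   # -> (1, 0)
_parse_package_api_version({"api": "1.0"})    # BadRepoError: missing the leading 'v'
_parse_package_api_version({"api": "v99.0"})  # BadRepoError: outside the supported range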
class Repo:
"""Class representing a package repository in the filesystem.
Each package repository must have a top-level configuration file
called `repo.yaml`.
Each package repository must have a top-level configuration file called `repo.yaml`.
Currently, `repo.yaml` must define:
It contains the following keys:
`namespace`:
A Python namespace where the repository's packages should live.
`subdirectory`:
An optional subdirectory name where packages are placed
`api`:
A string of the form vX.Y that indicates the Package API version. The default is "v1.0".
For the repo to be compatible with the current version of Spack, the version must be
greater than or equal to :py:data:`spack.min_package_api_version` and less than or equal to
:py:data:`spack.package_api_version`.
"""
def __init__(
@@ -958,7 +994,7 @@ def check(condition, msg):
f"{os.path.join(root, repo_config_name)} must define a namespace.",
)
self.namespace = config["namespace"]
self.namespace: str = config["namespace"]
check(
re.match(r"[a-zA-Z][a-zA-Z0-9_.]+", self.namespace),
f"Invalid namespace '{self.namespace}' in repo '{self.root}'. "
@@ -971,12 +1007,14 @@ def check(condition, msg):
# Keep name components around for checking prefixes.
self._names = self.full_namespace.split(".")
packages_dir = config.get("subdirectory", packages_dir_name)
packages_dir: str = config.get("subdirectory", packages_dir_name)
self.packages_path = os.path.join(self.root, packages_dir)
check(
os.path.isdir(self.packages_path), f"No directory '{packages_dir}' found in '{root}'"
)
self.package_api = _parse_package_api_version(config)
# Class attribute overrides by package name
self.overrides = overrides or {}
@@ -1026,7 +1064,7 @@ def is_prefix(self, fullname: str) -> bool:
parts = fullname.split(".")
return self._names[: len(parts)] == parts
def _read_config(self) -> Dict[str, str]:
def _read_config(self) -> Dict[str, Any]:
"""Check for a YAML config file in this db's root directory."""
try:
with open(self.config_file, encoding="utf-8") as reponame_file:
@@ -1368,6 +1406,8 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
config.write(f" namespace: '{namespace}'\n")
if subdir != packages_dir_name:
config.write(f" subdirectory: '{subdir}'\n")
x, y = spack.package_api_version
config.write(f" api: v{x}.{y}\n")
except OSError as e:
# try to clean up.


@@ -177,7 +177,7 @@ def build_report_for_package(self, report_dir, package, duration):
# something went wrong pre-cdash "configure" phase b/c we have an exception and only
# "update" was encounterd.
# dump the report in the configure line so teams can see what the issue is
if len(phases_encountered) == 1 and package["exception"]:
if len(phases_encountered) == 1 and package.get("exception"):
# TODO this mapping is not ideal since these are pre-configure errors
# we need to determine if a more appropriate cdash phase can be utilized
# for now we will add a message to the log explaining this


@@ -7,8 +7,7 @@
import warnings
import jsonschema
import llnl.util.lang
import jsonschema.validators
from spack.error import SpecSyntaxError
@@ -18,59 +17,59 @@ class DeprecationMessage(typing.NamedTuple):
error: bool
# jsonschema is imported lazily as it is heavy to import
# and increases the start-up time
def _make_validator():
def _validate_spec(validator, is_spec, instance, schema):
"""Check if the attributes on instance are valid specs."""
import spack.spec_parser
def _validate_spec(validator, is_spec, instance, schema):
"""Check if all additional keys are valid specs."""
import spack.spec_parser
if not validator.is_type(instance, "object"):
return
if not validator.is_type(instance, "object"):
return
for spec_str in instance:
try:
spack.spec_parser.parse(spec_str)
except SpecSyntaxError:
yield jsonschema.ValidationError(f"the key '{spec_str}' is not a valid spec")
properties = schema.get("properties") or {}
def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
return
if not deprecated:
return
deprecations = {
name: DeprecationMessage(message=x["message"], error=x["error"])
for x in deprecated
for name in x["names"]
}
# Get a list of the deprecated properties, return if there is none
issues = [entry for entry in instance if entry in deprecations]
if not issues:
return
# Process issues
errors = []
for name in issues:
msg = deprecations[name].message.format(name=name)
if deprecations[name].error:
errors.append(msg)
else:
warnings.warn(msg)
if errors:
yield jsonschema.ValidationError("\n".join(errors))
return jsonschema.validators.extend(
jsonschema.Draft7Validator,
{"validate_spec": _validate_spec, "deprecatedProperties": _deprecated_properties},
)
for spec_str in instance:
if spec_str in properties:
continue
try:
spack.spec_parser.parse(spec_str)
except SpecSyntaxError:
yield jsonschema.ValidationError(f"the key '{spec_str}' is not a valid spec")
Validator = llnl.util.lang.Singleton(_make_validator)
def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
return
if not deprecated:
return
deprecations = {
name: DeprecationMessage(message=x["message"], error=x["error"])
for x in deprecated
for name in x["names"]
}
# Get a list of the deprecated properties, return if there is none
issues = [entry for entry in instance if entry in deprecations]
if not issues:
return
# Process issues
errors = []
for name in issues:
msg = deprecations[name].message.format(name=name)
if deprecations[name].error:
errors.append(msg)
else:
warnings.warn(msg)
if errors:
yield jsonschema.ValidationError("\n".join(errors))
Validator = jsonschema.validators.extend(
jsonschema.Draft7Validator,
{"additionalKeysAreSpecs": _validate_spec, "deprecatedProperties": _deprecated_properties},
)
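After this refactor the extended validator is built once at import time instead of lazily through a `Singleton`, and the spec check is keyed by the more descriptive `additionalKeysAreSpecs`. A minimal sketch of exercising it, with a hypothetical schema and data:

import jsonschema

schema = {
    "type": "object",
    "additionalKeysAreSpecs": True,
    "properties": {"all": {"type": "object"}},
}
data = {"all": {}, "gcc@13": {}, "foo@": {}}  # "foo@" has a dangling version sigil
try:
    Validator(schema).validate(data)
except jsonschema.ValidationError as e:
    print(e.message)  # the key 'foo@' is not a valid spec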
def _append(string: str) -> bool:


@@ -84,9 +84,14 @@
"duplicates": {
"type": "object",
"properties": {
"strategy": {"type": "string", "enum": ["none", "minimal", "full"]}
"strategy": {"type": "string", "enum": ["none", "minimal", "full"]},
"max_dupes": {
"type": "object",
"additional_properties": {"type": "integer", "minimum": 1},
},
},
},
"static_analysis": {"type": "boolean"},
"timeout": {"type": "integer", "minimum": 0},
"error_on_timeout": {"type": "boolean"},
"os_compatible": {"type": "object", "additionalProperties": {"type": "array"}},


@@ -58,6 +58,15 @@
{"type": "string"}, # deprecated
]
},
"concretization_cache": {
"type": "object",
"properties": {
"enable": {"type": "boolean"},
"url": {"type": "string"},
"entry_limit": {"type": "integer", "minimum": 0},
"size_limit": {"type": "integer", "minimum": 0},
},
},
"install_hash_length": {"type": "integer", "minimum": 1},
"install_path_scheme": {"type": "string"}, # deprecated
"build_stage": {


@@ -3,12 +3,12 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for Cray descriptive manifest: this describes a set of
installed packages on the system and also specifies dependency
relationships between them (so this provides more information than
external entries in packages configuration).
installed packages on the system and also specifies dependency
relationships between them (so this provides more information than
external entries in packages configuration).
This does not specify a configuration - it is an input format
that is consumed and transformed into Spack DB records.
This does not specify a configuration - it is an input format
that is consumed and transformed into Spack DB records.
"""
from typing import Any, Dict


@@ -29,11 +29,7 @@
# merged configuration scope schemas
spack.schema.merged.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spec_list_schema,
"include_concrete": include_concrete,
},
{"specs": spec_list_schema, "include_concrete": include_concrete},
),
}
}


@@ -0,0 +1,41 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for include.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/include.py
:lines: 12-
"""
from typing import Any, Dict
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
"include": {
"type": "array",
"default": [],
"additionalProperties": False,
"items": {
"anyOf": [
{
"type": "object",
"properties": {
"when": {"type": "string"},
"path": {"type": "string"},
"sha256": {"type": "string"},
"optional": {"type": "boolean"},
},
"required": ["path"],
"additionalProperties": False,
},
{"type": "string"},
]
},
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack include configuration file schema",
"properties": properties,
}
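Data satisfying this schema mixes bare path strings with conditional entries. A hypothetical example, written as the Python equivalent of the YAML:

include_section = {
    "include": [
        "/etc/spack/extra",  # bare string form
        {
            "path": "site/linux.yaml",
            "when": "platform=linux",  # only included when the condition holds
            "optional": True,          # a missing file is not an error
        },
    ]
}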


@@ -21,6 +21,7 @@
import spack.schema.definitions
import spack.schema.develop
import spack.schema.env_vars
import spack.schema.include
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
@@ -40,6 +41,7 @@
spack.schema.definitions.properties,
spack.schema.develop.properties,
spack.schema.env_vars.properties,
spack.schema.include.properties,
spack.schema.mirrors.properties,
spack.schema.modules.properties,
spack.schema.packages.properties,
@@ -48,7 +50,6 @@
spack.schema.view.properties,
)
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",


@@ -39,7 +39,7 @@
"load": array_of_strings,
"suffixes": {
"type": "object",
"validate_spec": True,
"additionalKeysAreSpecs": True,
"additionalProperties": {"type": "string"}, # key
},
"environment": spack.schema.environment.definition,
@@ -48,40 +48,44 @@
projections_scheme = spack.schema.projections.properties["projections"]
module_type_configuration: Dict = {
common_props = {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"},
"projections": projections_scheme,
"all": module_file_configuration,
}
tcl_configuration = {
"type": "object",
"default": {},
"validate_spec": True,
"properties": {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"},
"projections": projections_scheme,
"all": module_file_configuration,
},
"additionalKeysAreSpecs": True,
"properties": {**common_props},
"additionalProperties": module_file_configuration,
}
tcl_configuration = module_type_configuration.copy()
lmod_configuration = module_type_configuration.copy()
lmod_configuration["properties"].update(
{
lmod_configuration = {
"type": "object",
"default": {},
"additionalKeysAreSpecs": True,
"properties": {
**common_props,
"core_compilers": array_of_strings,
"hierarchy": array_of_strings,
"core_specs": array_of_strings,
"filter_hierarchy_specs": {
"type": "object",
"validate_spec": True,
"additionalKeysAreSpecs": True,
"additionalProperties": array_of_strings,
},
}
)
},
"additionalProperties": module_file_configuration,
}
module_config_properties = {
"use_view": {"anyOf": [{"type": "string"}, {"type": "boolean"}]},

File diff suppressed because it is too large.


@@ -265,6 +265,7 @@ error(100, "Cannot select a single version for virtual '{0}'", Virtual)
% If we select a deprecated version, mark the package as deprecated
attr("deprecated", node(ID, Package), Version) :-
attr("version", node(ID, Package), Version),
not external(node(ID, Package)),
pkg_fact(Package, deprecated_version(Version)).
error(100, "Package '{0}' needs the deprecated version '{1}', and this is not allowed", Package, Version)
@@ -523,6 +524,16 @@ error(10, "'{0}' is not a valid dependency for any package in the DAG", Package)
:- attr("node", node(ID, Package)),
not needed(node(ID, Package)).
% Extensions depending on each other must all extend the same node (e.g. all Python packages
% depending on each other must depend on the same Python interpreter)
error(100, "{0} and {1} must depend on the same {2}", ExtensionParent, ExtensionChild, ExtendeePackage)
:- depends_on(ExtensionParent, ExtensionChild),
attr("extends", ExtensionParent, ExtendeePackage),
depends_on(ExtensionParent, node(X, ExtendeePackage)),
depends_on(ExtensionChild, node(Y, ExtendeePackage)),
X != Y.
#defined dependency_type/2.
%-----------------------------------------------------------------------------
@@ -586,6 +597,13 @@ attr("virtual_on_edge", PackageNode, ProviderNode, Virtual)
attr("virtual_on_incoming_edges", ProviderNode, Virtual)
:- attr("virtual_on_edge", _, ProviderNode, Virtual).
% This is needed to allow requirement on virtuals,
% when a virtual root is requested
attr("virtual_on_incoming_edges", ProviderNode, Virtual)
:- attr("virtual_root", node(min_dupe_id, Virtual)),
attr("root", ProviderNode),
provider(ProviderNode, node(min_dupe_id, Virtual)).
% dependencies on virtuals also imply that the virtual is a virtual node
1 { attr("virtual_node", node(0..X-1, Virtual)) : max_dupes(Virtual, X) }
:- node_depends_on_virtual(PackageNode, Virtual).
@@ -942,12 +960,14 @@ error(100, "Cannot set variant '{0}' for package '{1}' because the variant condi
build(node(ID, Package)).
% at most one variant value for single-valued variants.
error(100, "'{0}' required multiple values for single-valued variant '{1}'", Package, Variant)
error(100, "'{0}' requires conflicting variant values 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2)
:- attr("node", node(ID, Package)),
node_has_variant(node(ID, Package), Variant, _),
variant_single_value(node(ID, Package), Variant),
build(node(ID, Package)),
2 { attr("variant_value", node(ID, Package), Variant, Value) }.
attr("variant_value", node(ID, Package), Variant, Value1),
attr("variant_value", node(ID, Package), Variant, Value2),
Value1 < Value2,
build(node(ID, Package)).
error(100, "No valid value for variant '{1}' of package '{0}'", Package, Variant)
:- attr("node", node(ID, Package)),


@@ -31,16 +31,19 @@ class AspObject:
"""Object representing a piece of ASP code."""
def _id(thing: Any) -> Union[str, AspObject]:
def _id(thing: Any) -> Union[str, int, AspObject]:
"""Quote string if needed for it to be a valid identifier."""
if isinstance(thing, AspObject):
if isinstance(thing, bool):
return f'"{thing}"'
elif isinstance(thing, (AspObject, int)):
return thing
elif isinstance(thing, bool):
return f'"{str(thing)}"'
elif isinstance(thing, int):
return str(thing)
else:
return f'"{str(thing)}"'
if isinstance(thing, str):
# escape characters that cannot be in clingo strings
thing = thing.replace("\\", r"\\")
thing = thing.replace("\n", r"\n")
thing = thing.replace('"', r"\"")
return f'"{thing}"'
class AspVar(AspObject):
@@ -90,26 +93,9 @@ def __call__(self, *args: Any) -> "AspFunction":
"""
return AspFunction(self.name, self.args + args)
def _argify(self, arg: Any) -> Any:
"""Turn the argument into an appropriate clingo symbol"""
if isinstance(arg, bool):
return clingo().String(str(arg))
elif isinstance(arg, int):
return clingo().Number(arg)
elif isinstance(arg, AspFunction):
return clingo().Function(arg.name, [self._argify(x) for x in arg.args], positive=True)
elif isinstance(arg, AspVar):
return clingo().Variable(arg.name)
return clingo().String(str(arg))
def symbol(self):
"""Return a clingo symbol for this function"""
return clingo().Function(
self.name, [self._argify(arg) for arg in self.args], positive=True
)
def __str__(self) -> str:
return f"{self.name}({', '.join(str(_id(arg)) for arg in self.args)})"
args = f"({','.join(str(_id(arg)) for arg in self.args)})"
return f"{self.name}{args}"
def __repr__(self) -> str:
return str(self)


@@ -1,179 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
from typing import List, Set
from llnl.util import lang
import spack.deptypes as dt
import spack.package_base
import spack.repo
import spack.spec
PossibleDependencies = Set[str]
class Counter:
"""Computes the possible packages and the maximum number of duplicates
allowed for each of them.
Args:
specs: abstract specs to concretize
tests: if True, add test dependencies to the list of possible packages
"""
def __init__(self, specs: List["spack.spec.Spec"], tests: bool) -> None:
runtime_pkgs = spack.repo.PATH.packages_with_tags("runtime")
runtime_virtuals = set()
for x in runtime_pkgs:
pkg_class = spack.repo.PATH.get_pkg_class(x)
runtime_virtuals.update(pkg_class.provided_virtual_names())
self.specs = specs + [spack.spec.Spec(x) for x in runtime_pkgs]
self.link_run_types: dt.DepFlag = dt.LINK | dt.RUN | dt.TEST
self.all_types: dt.DepFlag = dt.ALL
if not tests:
self.link_run_types = dt.LINK | dt.RUN
self.all_types = dt.LINK | dt.RUN | dt.BUILD
self._possible_dependencies: PossibleDependencies = set()
self._possible_virtuals: Set[str] = (
set(x.name for x in specs if x.virtual) | runtime_virtuals
)
def possible_dependencies(self) -> PossibleDependencies:
"""Returns the list of possible dependencies"""
self.ensure_cache_values()
return self._possible_dependencies
def possible_virtuals(self) -> Set[str]:
"""Returns the list of possible virtuals"""
self.ensure_cache_values()
return self._possible_virtuals
def ensure_cache_values(self) -> None:
"""Ensure the cache values have been computed"""
if self._possible_dependencies:
return
self._compute_cache_values()
def possible_packages_facts(self, gen: "spack.solver.asp.PyclingoDriver", fn) -> None:
"""Emit facts associated with the possible packages"""
raise NotImplementedError("must be implemented by derived classes")
def _compute_cache_values(self):
raise NotImplementedError("must be implemented by derived classes")
class NoDuplicatesCounter(Counter):
def _compute_cache_values(self):
result = spack.package_base.possible_dependencies(
*self.specs, virtuals=self._possible_virtuals, depflag=self.all_types
)
self._possible_dependencies = set(result)
def possible_packages_facts(self, gen, fn):
gen.h2("Maximum number of nodes (packages)")
for package_name in sorted(self.possible_dependencies()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Maximum number of nodes (virtual packages)")
for package_name in sorted(self.possible_virtuals()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self.possible_dependencies()):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
class MinimalDuplicatesCounter(NoDuplicatesCounter):
def __init__(self, specs, tests):
super().__init__(specs, tests)
self._link_run: PossibleDependencies = set()
self._direct_build: PossibleDependencies = set()
self._total_build: PossibleDependencies = set()
self._link_run_virtuals: Set[str] = set()
def _compute_cache_values(self):
self._link_run = set(
spack.package_base.possible_dependencies(
*self.specs, virtuals=self._possible_virtuals, depflag=self.link_run_types
)
)
self._link_run_virtuals.update(self._possible_virtuals)
for x in self._link_run:
build_dependencies = spack.repo.PATH.get_pkg_class(x).dependencies_of_type(dt.BUILD)
virtuals, reals = lang.stable_partition(
build_dependencies, spack.repo.PATH.is_virtual_safe
)
self._possible_virtuals.update(virtuals)
for virtual_dep in virtuals:
providers = spack.repo.PATH.providers_for(virtual_dep)
self._direct_build.update(str(x) for x in providers)
self._direct_build.update(reals)
self._total_build = set(
spack.package_base.possible_dependencies(
*self._direct_build, virtuals=self._possible_virtuals, depflag=self.all_types
)
)
self._possible_dependencies = set(self._link_run) | set(self._total_build)
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
gen.h2("Packages with at most a single node")
for package_name in sorted(self.possible_dependencies() - build_tools):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Packages with at multiple possible nodes (build-tools)")
for package_name in sorted(self.possible_dependencies() & build_tools):
gen.fact(fn.max_dupes(package_name, 2))
gen.fact(fn.multiple_unification_sets(package_name))
gen.newline()
gen.h2("Maximum number of nodes (virtual packages)")
for package_name in sorted(self.possible_virtuals()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
class FullDuplicatesCounter(MinimalDuplicatesCounter):
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
counter = collections.Counter(
list(self._link_run) + list(self._total_build) + list(self._direct_build)
)
gen.h2("Maximum number of nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
count = min(count, 2)
gen.fact(fn.max_dupes(pkg, count))
gen.newline()
gen.h2("Build unification sets ")
for name in sorted(self.possible_dependencies() & build_tools):
gen.fact(fn.multiple_unification_sets(name))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
counter = collections.Counter(
list(self._link_run_virtuals) + list(self._possible_virtuals)
)
gen.h2("Maximum number of virtual nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
gen.fact(fn.max_dupes(pkg, count))
gen.newline()


@@ -117,7 +117,7 @@ error(0, "Cannot find a valid provider for virtual {0}", Virtual, startcauses, C
condition_holds(Cause, node(CID, TriggerPkg)).
% At most one variant value for single-valued variants
error(0, "'{0}' required multiple values for single-valued variant '{1}'\n Requested 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2, startcauses, Cause1, X, Cause2, X)
error(0, "'{0}' requires conflicting variant values 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2, startcauses, Cause1, X, Cause2, X)
:- attr("node", node(X, Package)),
node_has_variant(node(X, Package), Variant, VariantID),
variant_single_value(node(X, Package), Variant),


@@ -0,0 +1,539 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Classes to analyze the input of a solve, and provide information to set up the ASP problem"""
import collections
from typing import Dict, List, NamedTuple, Set, Tuple, Union
import archspec.cpu
from llnl.util import lang, tty
import spack.binary_distribution
import spack.config
import spack.deptypes as dt
import spack.platforms
import spack.repo
import spack.spec
import spack.store
from spack.error import SpackError
RUNTIME_TAG = "runtime"
class PossibleGraph(NamedTuple):
real_pkgs: Set[str]
virtuals: Set[str]
edges: Dict[str, Set[str]]
class PossibleDependencyGraph:
"""Returns information needed to set up an ASP problem"""
def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""Returns true if the context can determine that the condition cannot ever
be met on pkg_name.
"""
raise NotImplementedError
def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
"""Returns a list of targets that are candidate for concretization"""
raise NotImplementedError
def possible_dependencies(
self,
*specs: Union[spack.spec.Spec, str],
allowed_deps: dt.DepFlag,
transitive: bool = True,
strict_depflag: bool = False,
expand_virtuals: bool = True,
) -> PossibleGraph:
"""Returns the set of possible dependencies, and the set of possible virtuals.
Both sets always include runtime packages, which may be injected by compilers.
Args:
transitive: return transitive dependencies if True, only direct dependencies if False
allowed_deps: dependency types to consider
strict_depflag: if True, only the specific dep type is considered, if False any
deptype that intersects with allowed deptype is considered
expand_virtuals: expand virtual dependencies into all possible implementations
"""
raise NotImplementedError
class NoStaticAnalysis(PossibleDependencyGraph):
"""Implementation that tries to minimize the setup time (i.e. defaults to give fast
answers), rather than trying to reduce the ASP problem size with more complex analysis.
"""
def __init__(self, *, configuration: spack.config.Configuration, repo: spack.repo.RepoPath):
self.configuration = configuration
self.repo = repo
self.runtime_pkgs = set(self.repo.packages_with_tags(RUNTIME_TAG))
self.runtime_virtuals = set()
self._platform_condition = spack.spec.Spec(
f"platform={spack.platforms.host()} target={archspec.cpu.host().family}:"
)
for x in self.runtime_pkgs:
pkg_class = self.repo.get_pkg_class(x)
self.runtime_virtuals.update(pkg_class.provided_virtual_names())
try:
self.libc_pkgs = [x.name for x in self.providers_for("libc")]
except spack.repo.UnknownPackageError:
self.libc_pkgs = []
def is_virtual(self, name: str) -> bool:
return self.repo.is_virtual(name)
@lang.memoized
def is_allowed_on_this_platform(self, *, pkg_name: str) -> bool:
"""Returns true if a package is allowed on the current host"""
pkg_cls = self.repo.get_pkg_class(pkg_name)
for when_spec, conditions in pkg_cls.requirements.items():
if not when_spec.intersects(self._platform_condition):
continue
for requirements, _, _ in conditions:
if not any(x.intersects(self._platform_condition) for x in requirements):
tty.debug(f"[{__name__}] {pkg_name} is not for this platform")
return False
return True
def providers_for(self, virtual_str: str) -> List[spack.spec.Spec]:
"""Returns a list of possible providers for the virtual string in input."""
return self.repo.providers_for(virtual_str)
def can_be_installed(self, *, pkg_name) -> bool:
"""Returns True if a package can be installed, False otherwise."""
return True
def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""Returns true if the context can determine that the condition cannot ever
be met on pkg_name.
"""
return False
def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
"""Returns a list of targets that are candidate for concretization"""
platform = spack.platforms.host()
default_target = archspec.cpu.TARGETS[platform.default]
# Construct the list of targets which are compatible with the host
candidate_targets = [default_target] + default_target.ancestors
granularity = self.configuration.get("concretizer:targets:granularity")
host_compatible = self.configuration.get("concretizer:targets:host_compatible")
# Add targets which are not compatible with the current host
if not host_compatible:
additional_targets_in_family = sorted(
[
t
for t in archspec.cpu.TARGETS.values()
if (t.family.name == default_target.family.name and t not in candidate_targets)
],
key=lambda x: len(x.ancestors),
reverse=True,
)
candidate_targets += additional_targets_in_family
# Check if we want only generic architecture
if granularity == "generic":
candidate_targets = [t for t in candidate_targets if t.vendor == "generic"]
return candidate_targets
def possible_dependencies(
self,
*specs: Union[spack.spec.Spec, str],
allowed_deps: dt.DepFlag,
transitive: bool = True,
strict_depflag: bool = False,
expand_virtuals: bool = True,
) -> PossibleGraph:
stack = [x for x in self._package_list(specs)]
virtuals: Set[str] = set()
edges: Dict[str, Set[str]] = {}
while stack:
pkg_name = stack.pop()
if pkg_name in edges:
continue
edges[pkg_name] = set()
# Since libc is not buildable, there is no need to extend the
# search space with libc dependencies.
if pkg_name in self.libc_pkgs:
continue
pkg_cls = self.repo.get_pkg_class(pkg_name=pkg_name)
for name, conditions in pkg_cls.dependencies_by_name(when=True).items():
if all(self.unreachable(pkg_name=pkg_name, when_spec=x) for x in conditions):
tty.debug(
f"[{__name__}] Not adding {name} as a dep of {pkg_name}, because "
f"conditions cannot be met"
)
continue
if not self._has_deptypes(
conditions, allowed_deps=allowed_deps, strict=strict_depflag
):
continue
if name in virtuals:
continue
dep_names = set()
if self.is_virtual(name):
virtuals.add(name)
if expand_virtuals:
providers = self.providers_for(name)
dep_names = {spec.name for spec in providers}
else:
dep_names = {name}
edges[pkg_name].update(dep_names)
if not transitive:
continue
for dep_name in dep_names:
if dep_name in edges:
continue
if not self._is_possible(pkg_name=dep_name):
continue
stack.append(dep_name)
real_packages = set(edges)
if not transitive:
# We exit early, so add children from the edges information
for root, children in edges.items():
real_packages.update(x for x in children if self._is_possible(pkg_name=x))
virtuals.update(self.runtime_virtuals)
real_packages = real_packages | self.runtime_pkgs
return PossibleGraph(real_pkgs=real_packages, virtuals=virtuals, edges=edges)
def _package_list(self, specs: Tuple[Union[spack.spec.Spec, str], ...]) -> List[str]:
stack = []
for current_spec in specs:
if isinstance(current_spec, str):
current_spec = spack.spec.Spec(current_spec)
if self.repo.is_virtual(current_spec.name):
stack.extend([p.name for p in self.providers_for(current_spec.name)])
continue
stack.append(current_spec.name)
return sorted(set(stack))
def _has_deptypes(self, dependencies, *, allowed_deps: dt.DepFlag, strict: bool) -> bool:
if strict is True:
return any(
dep.depflag == allowed_deps for deplist in dependencies.values() for dep in deplist
)
return any(
dep.depflag & allowed_deps for deplist in dependencies.values() for dep in deplist
)
def _is_possible(self, *, pkg_name):
try:
return self.is_allowed_on_this_platform(pkg_name=pkg_name) and self.can_be_installed(
pkg_name=pkg_name
)
except spack.repo.UnknownPackageError:
return False
class StaticAnalysis(NoStaticAnalysis):
"""Performs some static analysis of the configuration, store, etc. to provide more precise
answers on whether some packages can be installed, or used as a provider.
It increases the setup time, but might decrease the grounding and solve time considerably,
especially when requirements restrict the possible choices for providers.
"""
def __init__(
self,
*,
configuration: spack.config.Configuration,
repo: spack.repo.RepoPath,
store: spack.store.Store,
binary_index: spack.binary_distribution.BinaryCacheIndex,
):
super().__init__(configuration=configuration, repo=repo)
self.store = store
self.binary_index = binary_index
@lang.memoized
def providers_for(self, virtual_str: str) -> List[spack.spec.Spec]:
candidates = super().providers_for(virtual_str)
result = []
for spec in candidates:
if not self._is_provider_candidate(pkg_name=spec.name, virtual=virtual_str):
continue
result.append(spec)
return result
@lang.memoized
def buildcache_specs(self) -> List[spack.spec.Spec]:
self.binary_index.update()
return self.binary_index.get_all_built_specs()
@lang.memoized
def can_be_installed(self, *, pkg_name) -> bool:
if self.configuration.get(f"packages:{pkg_name}:buildable", True):
return True
if self.configuration.get(f"packages:{pkg_name}:externals", []):
return True
reuse = self.configuration.get("concretizer:reuse")
if reuse is not False and self.store.db.query(pkg_name):
return True
if reuse is not False and any(x.name == pkg_name for x in self.buildcache_specs()):
return True
tty.debug(f"[{__name__}] {pkg_name} cannot be installed")
return False
@lang.memoized
def _is_provider_candidate(self, *, pkg_name: str, virtual: str) -> bool:
if not self.is_allowed_on_this_platform(pkg_name=pkg_name):
return False
if not self.can_be_installed(pkg_name=pkg_name):
return False
virtual_spec = spack.spec.Spec(virtual)
if self.unreachable(pkg_name=virtual_spec.name, when_spec=pkg_name):
tty.debug(f"[{__name__}] {pkg_name} cannot be a provider for {virtual}")
return False
return True
@lang.memoized
def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""Returns true if the context can determine that the condition cannot ever
be met on pkg_name.
"""
candidates = self.configuration.get(f"packages:{pkg_name}:require", [])
if not candidates and pkg_name != "all":
return self.unreachable(pkg_name="all", when_spec=when_spec)
if not candidates:
return False
if isinstance(candidates, str):
candidates = [candidates]
union_requirement = spack.spec.Spec()
for c in candidates:
if not isinstance(c, str):
continue
try:
union_requirement.constrain(c)
except SpackError:
# Less optimized, but shouldn't fail
pass
if not union_requirement.intersects(when_spec):
return True
return False
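# Pruning sketch (hypothetical packages.yaml):
#
#   packages:
#     mpi:
#       require: "openmpi"
#
# Here the union requirement is Spec("openmpi"), which cannot intersect
# Spec("mpich"), so unreachable(pkg_name="mpi", when_spec=Spec("mpich")) is True
# and mpich is dropped from the provider candidates in providers_for() above.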
def create_graph_analyzer() -> PossibleDependencyGraph:
static_analysis = spack.config.CONFIG.get("concretizer:static_analysis", False)
if static_analysis:
return StaticAnalysis(
configuration=spack.config.CONFIG,
repo=spack.repo.PATH,
store=spack.store.STORE,
binary_index=spack.binary_distribution.BINARY_INDEX,
)
return NoStaticAnalysis(configuration=spack.config.CONFIG, repo=spack.repo.PATH)
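# Minimal usage sketch. The choice hinges on the "concretizer:static_analysis"
# config key read above (defaults to False); a hypothetical concretizer.yaml opt-in:
#
#   concretizer:
#     static_analysis: true
#
analyzer = create_graph_analyzer()
graph = analyzer.possible_dependencies("zlib", allowed_deps=dt.ALL)  # "zlib" is a placeholder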
class Counter:
"""Computes the possible packages and the maximum number of duplicates
allowed for each of them.
Args:
specs: abstract specs to concretize
tests: if True, add test dependencies to the list of possible packages
"""
def __init__(
self, specs: List["spack.spec.Spec"], tests: bool, possible_graph: PossibleDependencyGraph
) -> None:
self.possible_graph = possible_graph
self.specs = specs
self.link_run_types: dt.DepFlag = dt.LINK | dt.RUN | dt.TEST
self.all_types: dt.DepFlag = dt.ALL
if not tests:
self.link_run_types = dt.LINK | dt.RUN
self.all_types = dt.LINK | dt.RUN | dt.BUILD
self._possible_dependencies: Set[str] = set()
self._possible_virtuals: Set[str] = {
x.name for x in specs if spack.repo.PATH.is_virtual(x.name)
}
def possible_dependencies(self) -> Set[str]:
"""Returns the list of possible dependencies"""
self.ensure_cache_values()
return self._possible_dependencies
def possible_virtuals(self) -> Set[str]:
"""Returns the list of possible virtuals"""
self.ensure_cache_values()
return self._possible_virtuals
def ensure_cache_values(self) -> None:
"""Ensure the cache values have been computed"""
if self._possible_dependencies:
return
self._compute_cache_values()
def possible_packages_facts(self, gen: "spack.solver.asp.ProblemInstanceBuilder", fn) -> None:
"""Emit facts associated with the possible packages"""
raise NotImplementedError("must be implemented by derived classes")
def _compute_cache_values(self) -> None:
raise NotImplementedError("must be implemented by derived classes")
class NoDuplicatesCounter(Counter):
def _compute_cache_values(self) -> None:
self._possible_dependencies, virtuals, _ = self.possible_graph.possible_dependencies(
*self.specs, allowed_deps=self.all_types
)
self._possible_virtuals.update(virtuals)
def possible_packages_facts(self, gen: "spack.solver.asp.ProblemInstanceBuilder", fn) -> None:
gen.h2("Maximum number of nodes (packages)")
for package_name in sorted(self.possible_dependencies()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Maximum number of nodes (virtual packages)")
for package_name in sorted(self.possible_virtuals()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self.possible_dependencies()):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
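# For specs = [Spec("zlib")] (placeholder), the generated ASP program would
# contain facts shaped roughly like:
#
#   max_dupes("zlib", 1).
#   possible_in_link_run("zlib").
#
# i.e. this counter unifies every package and virtual to a single node.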
class MinimalDuplicatesCounter(NoDuplicatesCounter):
def __init__(
self, specs: List["spack.spec.Spec"], tests: bool, possible_graph: PossibleDependencyGraph
) -> None:
super().__init__(specs, tests, possible_graph)
self._link_run: Set[str] = set()
self._direct_build: Set[str] = set()
self._total_build: Set[str] = set()
self._link_run_virtuals: Set[str] = set()
def _compute_cache_values(self) -> None:
self._link_run, virtuals, _ = self.possible_graph.possible_dependencies(
*self.specs, allowed_deps=self.link_run_types
)
self._possible_virtuals.update(virtuals)
self._link_run_virtuals.update(virtuals)
for x in self._link_run:
reals, virtuals, _ = self.possible_graph.possible_dependencies(
x, allowed_deps=dt.BUILD, transitive=False, strict_depflag=True
)
self._possible_virtuals.update(virtuals)
self._direct_build.update(reals)
self._total_build, virtuals, _ = self.possible_graph.possible_dependencies(
*self._direct_build, allowed_deps=self.all_types
)
self._possible_virtuals.update(virtuals)
self._possible_dependencies = set(self._link_run) | set(self._total_build)
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
gen.h2("Packages with at most a single node")
for package_name in sorted(self.possible_dependencies() - build_tools):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Packages with at multiple possible nodes (build-tools)")
default = spack.config.CONFIG.get("concretizer:duplicates:max_dupes:default", 2)
for package_name in sorted(self.possible_dependencies() & build_tools):
max_dupes = spack.config.CONFIG.get(
f"concretizer:duplicates:max_dupes:{package_name}", default
)
gen.fact(fn.max_dupes(package_name, max_dupes))
if max_dupes > 1:
gen.fact(fn.multiple_unification_sets(package_name))
gen.newline()
gen.h2("Maximum number of nodes (link-run virtuals)")
for package_name in sorted(self._link_run_virtuals):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Maximum number of nodes (other virtuals)")
for package_name in sorted(self.possible_virtuals() - self._link_run_virtuals):
max_dupes = spack.config.CONFIG.get(
f"concretizer:duplicates:max_dupes:{package_name}", default
)
gen.fact(fn.max_dupes(package_name, max_dupes))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
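# The limits above come from config; with a hypothetical concretizer.yaml of
#
#   concretizer:
#     duplicates:
#       max_dupes:
#         default: 2
#         cmake: 3
#
# a "build-tools"-tagged cmake gets max_dupes("cmake", 3) and
# multiple_unification_sets("cmake"), while other packages stay at one node.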
class FullDuplicatesCounter(MinimalDuplicatesCounter):
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
counter = collections.Counter(
list(self._link_run) + list(self._total_build) + list(self._direct_build)
)
gen.h2("Maximum number of nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
count = min(count, 2)
gen.fact(fn.max_dupes(pkg, count))
gen.newline()
gen.h2("Build unification sets ")
for name in sorted(self.possible_dependencies() & build_tools):
gen.fact(fn.multiple_unification_sets(name))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
counter = collections.Counter(
list(self._link_run_virtuals) + list(self._possible_virtuals)
)
gen.h2("Maximum number of virtual nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
gen.fact(fn.max_dupes(pkg, count))
gen.newline()
def create_counter(
specs: List[spack.spec.Spec], tests: bool, possible_graph: PossibleDependencyGraph
) -> Counter:
strategy = spack.config.CONFIG.get("concretizer:duplicates:strategy", "none")
if strategy == "full":
return FullDuplicatesCounter(specs, tests=tests, possible_graph=possible_graph)
if strategy == "minimal":
return MinimalDuplicatesCounter(specs, tests=tests, possible_graph=possible_graph)
return NoDuplicatesCounter(specs, tests=tests, possible_graph=possible_graph)
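# Usage sketch (hypothetical spec list; ties the three counters above together):
import spack.spec

_counter = create_counter(
    [spack.spec.Spec("hdf5+mpi")], tests=False, possible_graph=create_graph_analyzer()
)
_n_possible = len(_counter.possible_dependencies())  # size of the solver's package universe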


@@ -10,7 +10,7 @@
import spack.error
import spack.package_base
import spack.spec
from spack.config import get_mark_from_yaml_data
from spack.util.spack_yaml import get_mark_from_yaml_data
class RequirementKind(enum.Enum):
@@ -66,18 +66,29 @@ def rules_from_package_py(self, pkg: spack.package_base.PackageBase) -> List[Req
return rules
def rules_from_virtual(self, virtual_str: str) -> List[RequirementRule]:
requirements = self.config.get("packages", {}).get(virtual_str, {}).get("require", [])
return self._rules_from_requirements(
virtual_str, requirements, kind=RequirementKind.VIRTUAL
)
kind, requests = self._raw_yaml_data(virtual_str, section="require", virtual=True)
result = self._rules_from_requirements(virtual_str, requests, kind=kind)
kind, requests = self._raw_yaml_data(virtual_str, section="prefer", virtual=True)
result.extend(self._rules_from_preferences(virtual_str, preferences=requests, kind=kind))
kind, requests = self._raw_yaml_data(virtual_str, section="conflict", virtual=True)
result.extend(self._rules_from_conflicts(virtual_str, conflicts=requests, kind=kind))
return result
def rules_from_require(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, requirements = self._raw_yaml_data(pkg, section="require")
kind, requirements = self._raw_yaml_data(pkg.name, section="require")
return self._rules_from_requirements(pkg.name, requirements, kind=kind)
def rules_from_prefer(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, preferences = self._raw_yaml_data(pkg.name, section="prefer")
return self._rules_from_preferences(pkg.name, preferences=preferences, kind=kind)
def _rules_from_preferences(
self, pkg_name: str, *, preferences, kind: RequirementKind
) -> List[RequirementRule]:
result = []
kind, preferences = self._raw_yaml_data(pkg, section="prefer")
for item in preferences:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
@@ -86,7 +97,7 @@ def rules_from_prefer(self, pkg: spack.package_base.PackageBase) -> List[Require
# require:
# - any_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
pkg_name=pkg_name,
policy="any_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
@@ -97,8 +108,13 @@ def rules_from_prefer(self, pkg: spack.package_base.PackageBase) -> List[Require
return result
def rules_from_conflict(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, conflicts = self._raw_yaml_data(pkg.name, section="conflict")
return self._rules_from_conflicts(pkg.name, conflicts=conflicts, kind=kind)
def _rules_from_conflicts(
self, pkg_name: str, *, conflicts, kind: RequirementKind
) -> List[RequirementRule]:
result = []
kind, conflicts = self._raw_yaml_data(pkg, section="conflict")
for item in conflicts:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
@@ -107,7 +123,7 @@ def rules_from_conflict(self, pkg: spack.package_base.PackageBase) -> List[Requi
# require:
# - one_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
pkg_name=pkg_name,
policy="one_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
@@ -129,10 +145,14 @@ def _parse_prefer_conflict_item(self, item):
message = item.get("message")
return spec, condition, message
def _raw_yaml_data(self, pkg: spack.package_base.PackageBase, *, section: str):
def _raw_yaml_data(self, pkg_name: str, *, section: str, virtual: bool = False):
config = self.config.get("packages")
data = config.get(pkg.name, {}).get(section, [])
data = config.get(pkg_name, {}).get(section, [])
kind = RequirementKind.PACKAGE
if virtual:
return RequirementKind.VIRTUAL, data
if not data:
data = config.get("all", {}).get(section, [])
kind = RequirementKind.DEFAULT
@@ -165,7 +185,8 @@ def _rules_from_requirements(
# validate specs from YAML first, and fail with line numbers if parsing fails.
constraints = [
parse_spec_from_yaml_string(constraint) for constraint in constraints
parse_spec_from_yaml_string(constraint, named=kind == RequirementKind.VIRTUAL)
for constraint in constraints
]
when_str = requirement.get("when")
when = parse_spec_from_yaml_string(when_str) if when_str else spack.spec.Spec()
@@ -203,7 +224,7 @@ def reject_requirement_constraint(
s.validate_or_raise()
except spack.error.SpackError as e:
tty.debug(
f"[SETUP] Rejecting the default '{constraint}' requirement "
f"[{__name__}] Rejecting the default '{constraint}' requirement "
f"on '{pkg_name}': {str(e)}",
level=2,
)
@@ -211,21 +232,37 @@ def reject_requirement_constraint(
return False
def parse_spec_from_yaml_string(string: str) -> spack.spec.Spec:
def parse_spec_from_yaml_string(string: str, *, named: bool = False) -> spack.spec.Spec:
"""Parse a spec from YAML and add file/line info to errors, if it's available.
Parse a ``Spec`` from the supplied string, but also intercept any syntax errors and
add file/line information for debugging using file/line annotations from the string.
Arguments:
Args:
string: a string representing a ``Spec`` from config YAML.
named: if True, the spec must have a name
"""
try:
return spack.spec.Spec(string)
result = spack.spec.Spec(string)
except spack.error.SpecSyntaxError as e:
mark = get_mark_from_yaml_data(string)
if mark:
msg = f"{mark.name}:{mark.line + 1}: {str(e)}"
raise spack.error.SpecSyntaxError(msg) from e
raise e
if named is True and not result.name:
msg = f"expected a named spec, but got '{string}' instead"
mark = get_mark_from_yaml_data(string)
# Add a hint in case the spec consists only of dependencies
deps = result.dependencies()
if len(deps) == 1:
msg = f"{msg}. Did you mean '{deps[0]}'?"
if mark:
msg = f"{mark.name}:{mark.line + 1}: {msg}"
raise spack.error.SpackError(msg)
return result
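# Behavior sketch for the new named=True path (hypothetical inputs):
#
#   parse_spec_from_yaml_string("mpich@3.4:", named=True)  # fine: the spec has a name
#   parse_spec_from_yaml_string("^mpich", named=True)
#   # => SpackError: expected a named spec, but got '^mpich' instead. Did you mean 'mpich'?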


@@ -52,6 +52,7 @@
import enum
import io
import itertools
import json
import os
import pathlib
import platform
@@ -96,9 +97,7 @@
import spack.spec_parser
import spack.store
import spack.traverse
import spack.util.executable
import spack.util.hash
import spack.util.module_cmd as md
import spack.util.prefix
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
@@ -175,15 +174,17 @@
#: Spec(Spec("string").format()) == Spec("string)"
DEFAULT_FORMAT = (
    "{name}{@versions}"
    "{%compiler.name}{@compiler.versions}{compiler_flags}"
    "{variants}{ namespace=namespace_if_anonymous}{ arch=architecture}{/abstract_hash}"
)
DEFAULT_FORMAT = (
    "{name}{@versions}"
    "{compiler_flags}"
    "{variants}{ namespace=namespace_if_anonymous}{ arch=architecture}{/abstract_hash}"
    " {%compiler.name}{@compiler.versions}"
)
#: Display format, which eliminates extra `@=` in the output, for readability.
DISPLAY_FORMAT = (
    "{name}{@version}"
    "{%compiler.name}{@compiler.version}{compiler_flags}"
    "{variants}{ namespace=namespace_if_anonymous}{ arch=architecture}{/abstract_hash}"
)
DISPLAY_FORMAT = (
    "{name}{@version}"
    "{compiler_flags}"
    "{variants}{ namespace=namespace_if_anonymous}{ arch=architecture}{/abstract_hash}"
    " {%compiler.name}{@compiler.version}"
)
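# Net effect of the reordering (illustrative strings, not captured output):
#
#   before: zlib@1.3.1%gcc@12.3.0+shared arch=linux-ubuntu22.04-zen2
#   after:  zlib@1.3.1+shared arch=linux-ubuntu22.04-zen2 %gcc@12.3.0
#
# i.e. the compiler annotation now renders at the end of the spec string.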
#: Regular expression to pull spec contents out of clearsigned signature
@@ -798,7 +799,7 @@ def update_deptypes(self, depflag: dt.DepFlag) -> bool:
self.depflag = new
return True
def update_virtuals(self, virtuals: Tuple[str, ...]) -> bool:
def update_virtuals(self, virtuals: Iterable[str]) -> bool:
"""Update the list of provided virtuals"""
old = self.virtuals
self.virtuals = tuple(sorted(set(virtuals).union(self.virtuals)))
@@ -1108,28 +1109,6 @@ def clear(self):
self.edges.clear()
def _command_default_handler(spec: "Spec"):
"""Default handler when looking for the 'command' attribute.
Tries to search for ``spec.name`` in the ``spec.home.bin`` directory.
Parameters:
spec: spec that is being queried
Returns:
Executable: An executable of the command
Raises:
RuntimeError: If the command is not found
"""
home = getattr(spec.package, "home")
path = os.path.join(home.bin, spec.name)
if fs.is_exe(path):
return spack.util.executable.Executable(path)
raise RuntimeError(f"Unable to locate {spec.name} command in {home.bin}")
def _headers_default_handler(spec: "Spec"):
"""Default handler when looking for the 'headers' attribute.
@@ -1333,18 +1312,22 @@ class SpecBuildInterface(lang.ObjectWrapper):
home = ForwardQueryToPackage("home", default_handler=None)
headers = ForwardQueryToPackage("headers", default_handler=_headers_default_handler)
libs = ForwardQueryToPackage("libs", default_handler=_libs_default_handler)
command = ForwardQueryToPackage(
"command", default_handler=_command_default_handler, _indirect=True
)
command = ForwardQueryToPackage("command", default_handler=None, _indirect=True)
def __init__(self, spec: "Spec", name: str, query_parameters: List[str], _parent: "Spec"):
def __init__(
self,
spec: "Spec",
name: str,
query_parameters: List[str],
_parent: "Spec",
is_virtual: bool,
):
super().__init__(spec)
# Adding new attributes goes after super() call since the ObjectWrapper
# resets __dict__ to behave like the passed object
original_spec = getattr(spec, "wrapped_obj", spec)
self.wrapped_obj = original_spec
self.token = original_spec, name, query_parameters, _parent
is_virtual = spack.repo.PATH.is_virtual(name)
self.token = original_spec, name, query_parameters, _parent, is_virtual
self.last_query = QueryState(
name=name, extra_parameters=query_parameters, isvirtual=is_virtual
)
@@ -1507,7 +1490,7 @@ def __init__(self, spec_like=None, *, external_path=None, external_modules=None)
self.abstract_hash = None
# initial values for all spec hash types
for h in ht.hashes:
for h in ht.HASHES:
setattr(self, h.attr, None)
# cache for spec's prefix, computed lazily by prefix property
@@ -1905,10 +1888,22 @@ def package_class(self):
"""Internal package call gets only the class object for a package.
Use this to just get package metadata.
"""
warnings.warn(
"`Spec.package_class` is deprecated and will be removed in version 1.0.0. Use "
"`spack.repo.PATH.get_pkg_class(spec.fullname) instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
return spack.repo.PATH.get_pkg_class(self.fullname)
@property
def virtual(self):
warnings.warn(
"`Spec.virtual` is deprecated and will be removed in version 1.0.0. Use "
"`spack.repo.PATH.is_virtual(spec.name)` instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
return spack.repo.PATH.is_virtual(self.name)
@property
@@ -2088,32 +2083,34 @@ def long_spec(self):
def short_spec(self):
"""Returns a version of the spec with the dependencies hashed
instead of completely enumerated."""
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{ arch=architecture}{/hash:7}"
return self.format(spec_format)
return self.format(
"{name}{@version}{variants}{ arch=architecture}"
"{/hash:7}{%compiler.name}{@compiler.version}"
)
@property
def cshort_spec(self):
"""Returns an auto-colorized version of ``self.short_spec``."""
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{ arch=architecture}{/hash:7}"
return self.cformat(spec_format)
return self.cformat(
"{name}{@version}{variants}{ arch=architecture}"
"{/hash:7}{%compiler.name}{@compiler.version}"
)
@property
def prefix(self):
def prefix(self) -> spack.util.prefix.Prefix:
if not self._concrete:
raise spack.error.SpecError("Spec is not concrete: " + str(self))
raise spack.error.SpecError(f"Spec is not concrete: {self}")
if self._prefix is None:
upstream, record = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
_, record = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
if record and record.path:
self.prefix = record.path
self.set_prefix(record.path)
else:
self.prefix = spack.store.STORE.layout.path_for_spec(self)
self.set_prefix(spack.store.STORE.layout.path_for_spec(self))
assert self._prefix is not None
return self._prefix
@prefix.setter
def prefix(self, value):
def set_prefix(self, value: str) -> None:
self._prefix = spack.util.prefix.Prefix(llnl.path.convert_to_platform_path(value))
def spec_hash(self, hash):
@@ -2127,7 +2124,9 @@ def spec_hash(self, hash):
if hash.override is not None:
return hash.override(self)
node_dict = self.to_node_dict(hash=hash)
json_text = sjson.dump(node_dict)
json_text = json.dumps(
node_dict, ensure_ascii=True, indent=None, separators=(",", ":"), sort_keys=False
)
# This implements "frankenhashes", preserving the last 7 characters of the
# original hash when splicing so that we can avoid relocation issues
out = spack.util.hash.b32_hash(json_text)
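# Aside (not part of the diff): the explicit separators pin the most compact,
# deterministic encoding for the hashed bytes; with the standard library,
# json.dumps({"a": [1, 2]}, separators=(",", ":")) == '{"a":[1,2]}'.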
@@ -2174,30 +2173,16 @@ def package_hash(self):
def dag_hash(self, length=None):
"""This is Spack's default hash, used to identify installations.
Same as the full hash (includes package hash and build/link/run deps).
Tells us when package files and any dependencies have changed.
NOTE: Versions of Spack prior to 0.18 only included link and run deps.
NOTE: Versions of Spack prior to 1.0 did not include test deps.
"""
return self._cached_hash(ht.dag_hash, length)
def process_hash(self, length=None):
"""Hash used to transfer specs among processes.
This hash includes build and test dependencies and is only used to
serialize a spec and pass it around among processes.
"""
return self._cached_hash(ht.process_hash, length)
def dag_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.dag_hash(), bits)
def process_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.process_hash(), bits)
def _lookup_hash(self):
"""Lookup just one spec with an abstract hash, returning a spec from the the environment,
store, or finally, binary caches."""
@@ -2688,7 +2673,7 @@ def name_and_dependency_types(s: str) -> Tuple[str, dt.DepFlag]:
return name, depflag
def spec_and_dependency_types(
s: Union[Spec, Tuple[Spec, str]]
s: Union[Spec, Tuple[Spec, str]],
) -> Tuple[Spec, dt.DepFlag]:
"""Given a non-string key in the literal, extracts the spec
and its dependency types.
@@ -2717,7 +2702,7 @@ def spec_and_dependency_types(
return spec_builder(spec_dict)
@staticmethod
def from_dict(data):
def from_dict(data) -> "Spec":
"""Construct a spec from JSON/YAML.
Args:
@@ -2740,7 +2725,7 @@ def from_dict(data):
return spec
@staticmethod
def from_yaml(stream):
def from_yaml(stream) -> "Spec":
"""Construct a spec from YAML.
Args:
@@ -2750,7 +2735,7 @@ def from_yaml(stream):
return Spec.from_dict(data)
@staticmethod
def from_json(stream):
def from_json(stream) -> "Spec":
"""Construct a spec from JSON.
Args:
@@ -2760,7 +2745,7 @@ def from_json(stream):
data = sjson.load(stream)
return Spec.from_dict(data)
except Exception as e:
raise sjson.SpackJSONError("error parsing JSON spec:", str(e)) from e
raise sjson.SpackJSONError("error parsing JSON spec:", e) from e
@staticmethod
def extract_json_from_clearsig(data):
@@ -2808,24 +2793,6 @@ def from_detection(
s.extra_attributes = extra_attributes
return s
def validate_detection(self):
"""Validate the detection of an external spec.
This method is used as part of Spack's detection protocol, and is
not meant for client code use.
"""
# Assert that _extra_attributes is a Mapping and not None,
# which likely means the spec was created with Spec.from_detection
msg = 'cannot validate "{0}" since it was not created ' "using Spec.from_detection".format(
self
)
assert isinstance(self.extra_attributes, collections.abc.Mapping), msg
# Validate the spec calling a package specific method
pkg_cls = spack.repo.PATH.get_pkg_class(self.name)
validate_fn = getattr(pkg_cls, "validate_detected_spec", lambda x, y: None)
validate_fn(self, self.extra_attributes)
def _patches_assigned(self):
"""Whether patches have been assigned to this spec by the concretizer."""
# FIXME: _patches_in_order_of_appearance is attached after concretization
@@ -2842,94 +2809,6 @@ def _patches_assigned(self):
return True
@staticmethod
def inject_patches_variant(root):
# This dictionary will store object IDs rather than Specs as keys
# since the Spec __hash__ will change as patches are added to them
spec_to_patches = {}
for s in root.traverse():
# After concretizing, assign namespaces to anything left.
# Note that this doesn't count as a "change". The repository
# configuration is constant throughout a spack run, and
# normalize and concretize evaluate Packages using Repo.get(),
# which respects precedence. So, a namespace assignment isn't
# changing how a package name would have been interpreted and
# we can do it as late as possible to allow as much
# compatibility across repositories as possible.
if s.namespace is None:
s.namespace = spack.repo.PATH.repo_for_pkg(s.name).namespace
if s.concrete:
continue
# Add any patches from the package to the spec.
patches = set()
for cond, patch_list in s.package_class.patches.items():
if s.satisfies(cond):
for patch in patch_list:
patches.add(patch)
if patches:
spec_to_patches[id(s)] = patches
# Also record all patches required on dependencies by
# depends_on(..., patch=...)
for dspec in root.traverse_edges(deptype=all, cover="edges", root=False):
if dspec.spec.concrete:
continue
pkg_deps = dspec.parent.package_class.dependencies
patches = []
for cond, deps_by_name in pkg_deps.items():
if not dspec.parent.satisfies(cond):
continue
dependency = deps_by_name.get(dspec.spec.name)
if not dependency:
continue
for pcond, patch_list in dependency.patches.items():
if dspec.spec.satisfies(pcond):
patches.extend(patch_list)
if patches:
all_patches = spec_to_patches.setdefault(id(dspec.spec), set())
for patch in patches:
all_patches.add(patch)
for spec in root.traverse():
if id(spec) not in spec_to_patches:
continue
patches = list(lang.dedupe(spec_to_patches[id(spec)]))
mvar = spec.variants.setdefault("patches", vt.MultiValuedVariant("patches", ()))
mvar.value = tuple(p.sha256 for p in patches)
# FIXME: Monkey patches mvar to store patches order
full_order_keys = list(tuple(p.ordering_key) + (p.sha256,) for p in patches)
ordered_hashes = sorted(full_order_keys)
tty.debug(
"Ordered hashes [{0}]: ".format(spec.name)
+ ", ".join("/".join(str(e) for e in t) for t in ordered_hashes)
)
mvar._patches_in_order_of_appearance = list(t[-1] for t in ordered_hashes)
@staticmethod
def ensure_external_path_if_external(external_spec):
if external_spec.external_modules and not external_spec.external_path:
compiler = spack.compilers.compiler_for_spec(
external_spec.compiler, external_spec.architecture
)
for mod in compiler.modules:
md.load_module(mod)
# Get the path from the module the package can override the default
# (this is mostly needed for Cray)
pkg_cls = spack.repo.PATH.get_pkg_class(external_spec.name)
package = pkg_cls(external_spec)
external_spec.external_path = getattr(
package, "external_prefix", md.path_from_modules(external_spec.external_modules)
)
@staticmethod
def ensure_no_deprecated(root):
"""Raise if a deprecated spec is in the dag.
@@ -3084,7 +2963,7 @@ def validate_or_raise(self):
# FIXME: raise just the first one encountered
for spec in self.traverse():
# raise an UnknownPackageError if the spec's package isn't real.
if (not spec.virtual) and spec.name:
if spec.name and not spack.repo.PATH.is_virtual(spec.name):
spack.repo.PATH.get_pkg_class(spec.fullname)
# validate compiler in addition to the package name.
@@ -3093,7 +2972,7 @@ def validate_or_raise(self):
raise UnsupportedCompilerError(spec.compiler.name)
# Ensure correctness of variants (if the spec is not virtual)
if not spec.virtual:
if not spack.repo.PATH.is_virtual(spec.name):
Spec.ensure_valid_variants(spec)
substitute_abstract_variants(spec)
@@ -3111,7 +2990,7 @@ def ensure_valid_variants(spec):
if spec.concrete:
return
pkg_cls = spec.package_class
pkg_cls = spack.repo.PATH.get_pkg_class(spec.fullname)
pkg_variants = pkg_cls.variant_names()
# reserved names are variants that may be set on any package
# but are not necessarily recorded by the package's class
@@ -3328,7 +3207,9 @@ def intersects(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
if self.virtual and other.virtual:
self_virtual = spack.repo.PATH.is_virtual(self.name)
other_virtual = spack.repo.PATH.is_virtual(other.name)
if self_virtual and other_virtual:
# Two virtual specs intersect only if there are providers for both
lhs = spack.repo.PATH.providers_for(str(self))
rhs = spack.repo.PATH.providers_for(str(other))
@@ -3336,8 +3217,8 @@ def intersects(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
return bool(intersection)
# A provider can satisfy a virtual dependency.
elif self.virtual or other.virtual:
virtual_spec, non_virtual_spec = (self, other) if self.virtual else (other, self)
elif self_virtual or other_virtual:
virtual_spec, non_virtual_spec = (self, other) if self_virtual else (other, self)
try:
# Here we might get an abstract spec
pkg_cls = spack.repo.PATH.get_pkg_class(non_virtual_spec.fullname)
@@ -3407,12 +3288,20 @@ def _intersects_dependencies(self, other):
# These two loops handle cases where there is an overly restrictive
# vpkg in one spec for a provider in the other (e.g., mpi@3: is not
# compatible with mpich2)
for spec in self.virtual_dependencies():
if spec.name in other_index and not other_index.providers_for(spec):
for spec in self.traverse():
if (
spack.repo.PATH.is_virtual(spec.name)
and spec.name in other_index
and not other_index.providers_for(spec)
):
return False
for spec in other.virtual_dependencies():
if spec.name in self_index and not self_index.providers_for(spec):
for spec in other.traverse():
if (
spack.repo.PATH.is_virtual(spec.name)
and spec.name in self_index
and not self_index.providers_for(spec)
):
return False
return True
@@ -3442,7 +3331,9 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
# A concrete provider can satisfy a virtual dependency.
if not self.virtual and other.virtual:
if not spack.repo.PATH.is_virtual(self.name) and spack.repo.PATH.is_virtual(
other.name
):
try:
# Here we might get an abstract spec
pkg_cls = spack.repo.PATH.get_pkg_class(self.fullname)
@@ -3510,7 +3401,7 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
lhs_edges: Dict[str, Set[DependencySpec]] = collections.defaultdict(set)
for rhs_edge in other.traverse_edges(root=False, cover="edges"):
# If we are checking for ^mpi we need to verify if there is any edge
if rhs_edge.spec.virtual:
if spack.repo.PATH.is_virtual(rhs_edge.spec.name):
rhs_edge.update_virtuals(virtuals=(rhs_edge.spec.name,))
if not rhs_edge.virtuals:
@@ -3554,10 +3445,6 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
for rhs in other.traverse(root=False)
)
def virtual_dependencies(self):
"""Return list of any virtual deps in this spec."""
return [spec for spec in self.traverse() if spec.virtual]
@property # type: ignore[misc] # decorated prop not supported in mypy
def patches(self):
"""Return patch objects for any patch sha256 sums on this Spec.
@@ -3662,11 +3549,11 @@ def _dup(self, other: "Spec", deps: Union[bool, dt.DepTypes, dt.DepFlag] = True)
if self._concrete:
self._dunder_hash = other._dunder_hash
for h in ht.hashes:
for h in ht.HASHES:
setattr(self, h.attr, getattr(other, h.attr, None))
else:
self._dunder_hash = None
for h in ht.hashes:
for h in ht.HASHES:
setattr(self, h.attr, None)
return changed
@@ -3747,30 +3634,23 @@ def __getitem__(self, name: str):
csv = query_parameters.pop().strip()
query_parameters = re.split(r"\s*,\s*", csv)
        # Consider runtime dependencies and direct build/test deps before transitive dependencies,
        # and prefer matches closest to the root.
        order = lambda: itertools.chain(
            self.traverse_edges(deptype=dt.LINK, order="breadth", cover="edges"),
            self.edges_to_dependencies(depflag=dt.BUILD | dt.RUN | dt.TEST),
            self.traverse_edges(deptype=dt.ALL, order="breadth", cover="edges"),
        )
        try:
            child: Spec = next(
                e.spec
                for e in itertools.chain(
                    (e for e in order() if e.spec.name == name or name in e.virtuals),
                    # for historical reasons
                    (e for e in order() if e.spec.concrete and e.spec.package.provides(name)),
                )
            )
        except StopIteration:
            raise KeyError(f"No spec with name {name} in {self}")
        if self._concrete:
            return SpecBuildInterface(child, name, query_parameters, _parent=self)
        return child

        # Consider all direct dependencies and transitive runtime dependencies
        order = itertools.chain(
            self.edges_to_dependencies(depflag=dt.ALL),
            self.traverse_edges(deptype=dt.LINK | dt.RUN, order="breadth", cover="edges"),
        )
        try:
            edge = next((e for e in order if e.spec.name == name or name in e.virtuals))
        except StopIteration as e:
            raise KeyError(f"No spec with name {name} in {self}") from e
        if self._concrete:
            return SpecBuildInterface(
                edge.spec, name, query_parameters, _parent=self, is_virtual=name in edge.virtuals
            )
        return edge.spec
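# Lookup sketch (hypothetical DAG): for a concrete spec s with an MPI provider,
#
#   s["mpi"].name    # matched via `name in e.virtuals`, e.g. "mpich"
#   s["nonexistent"] # raises KeyError: No spec with name nonexistent in ...
#
# Direct dependencies of any type are considered first, then transitive
# link/run dependencies in breadth-first order.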
def __contains__(self, spec):
"""True if this spec or some dependency satisfies the spec.
@@ -3786,8 +3666,11 @@ def __contains__(self, spec):
# if anonymous or same name, we only have to look at the root
if not spec.name or spec.name == self.name:
return self.satisfies(spec)
else:
return any(s.satisfies(spec) for s in self.traverse(root=False))
try:
dep = self[spec.name]
except KeyError:
return False
return dep.satisfies(spec)
def eq_dag(self, other, deptypes=True, vs=None, vo=None):
"""True if the full dependency DAGs of specs are equal."""
@@ -3862,16 +3745,6 @@ def _cmp_iter(self):
# serialized before the hash change and one after, are considered different.
yield self.dag_hash() if self.concrete else None
# This needs to be in _cmp_iter so that no specs with different process hashes
# are considered the same by `__hash__` or `__eq__`.
#
# TODO: We should eventually unify the `_cmp_*` methods with `to_node_dict` so
# TODO: there aren't two sources of truth, but this needs some thought, since
# TODO: they exist for speed. We should benchmark whether it's really worth
# TODO: having two types of hashing now that we use `json` instead of `yaml` for
# TODO: spec hashing.
yield self.process_hash() if self.concrete else None
def deps():
for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
yield dep.spec.name
@@ -4525,7 +4398,7 @@ def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
"""
Clears all cached hashes in a Spec, while preserving other properties.
"""
for h in ht.hashes:
for h in ht.HASHES:
if h.attr not in ignore:
if hasattr(self, h.attr):
setattr(self, h.attr, None)
@@ -4534,18 +4407,12 @@ def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
setattr(self, attr, None)
def __hash__(self):
# If the spec is concrete, we leverage the process hash and just use
# a 64-bit prefix of it. The process hash has the advantage that it's
# computed once per concrete spec, and it's saved -- so if we read
# concrete specs we don't need to recompute the whole hash. This is
# good for large, unchanging specs.
#
# We use the process hash instead of the DAG hash here because the DAG
# hash includes the package hash, which can cause infinite recursion,
# and which isn't defined unless the spec has a known package.
# If the spec is concrete, we leverage the dag hash and just use a 64-bit prefix of it.
# The dag hash has the advantage that it's computed once per concrete spec, and it's saved
# -- so if we read concrete specs we don't need to recompute the whole hash.
if self.concrete:
if not self._dunder_hash:
self._dunder_hash = self.process_hash_bit_prefix(64)
self._dunder_hash = self.dag_hash_bit_prefix(64)
return self._dunder_hash
# This is the normal hash for lazy_lexicographic_ordering. It's
@@ -4554,7 +4421,7 @@ def __hash__(self):
return hash(lang.tuplify(self._cmp_iter))
def __reduce__(self):
return Spec.from_dict, (self.to_dict(hash=ht.process_hash),)
return Spec.from_dict, (self.to_dict(hash=ht.dag_hash),)
def attach_git_version_lookup(self):
# Add a git lookup method for GitVersions
@@ -4697,17 +4564,6 @@ def constrain(self, other: "VariantMap") -> bool:
return changed
@property
def concrete(self):
"""Returns True if the spec is concrete in terms of variants.
Returns:
bool: True or False
"""
return self.spec._concrete or all(
v in self for v in self.spec.package_class.variant_names()
)
def copy(self) -> "VariantMap":
clone = VariantMap(self.spec)
for name, variant in self.items():
@@ -4765,14 +4621,14 @@ def substitute_abstract_variants(spec: Spec):
elif name in vt.reserved_names:
continue
variant_defs = spec.package_class.variant_definitions(name)
variant_defs = spack.repo.PATH.get_pkg_class(spec.fullname).variant_definitions(name)
valid_defs = []
for when, vdef in variant_defs:
if when.intersects(spec):
valid_defs.append(vdef)
if not valid_defs:
if name not in spec.package_class.variant_names():
if name not in spack.repo.PATH.get_pkg_class(spec.fullname).variant_names():
unknown.append(name)
else:
whens = [str(when) for when, _ in variant_defs]
@@ -4834,31 +4690,51 @@ def merge_abstract_anonymous_specs(*abstract_specs: Spec):
return merged_spec
def reconstruct_virtuals_on_edges(spec):
    """Reconstruct virtuals on edges. Used to read from old DB and reindex.

    Args:
        spec: spec on which we want to reconstruct virtuals
    """
    # Collect all possible virtuals
    possible_virtuals = set()
    for node in spec.traverse():
        try:
            possible_virtuals.update({x for x in node.package.dependencies if Spec(x).virtual})
        except Exception as e:
            warnings.warn(f"cannot reconstruct virtual dependencies on package {node.name}: {e}")
            continue

    # Assume all incoming edges to provider are marked with virtuals=
    for vspec in possible_virtuals:
        try:
            provider = spec[vspec]
        except KeyError:
            # Virtual not in the DAG
            continue

        for edge in provider.edges_from_dependents():
            edge.update_virtuals([vspec])

def reconstruct_virtuals_on_edges(spec: Spec) -> None:
    """Reconstruct virtuals on edges. Used to read from old DB and reindex."""
    virtuals_needed: Dict[str, Set[str]] = {}
    virtuals_provided: Dict[str, Set[str]] = {}
    for edge in spec.traverse_edges(cover="edges", root=False):
        parent_key = edge.parent.dag_hash()
        if parent_key not in virtuals_needed:
            # Construct which virtuals are needed by parent
            virtuals_needed[parent_key] = set()
            try:
                parent_pkg = edge.parent.package
            except Exception as e:
                warnings.warn(
                    f"cannot reconstruct virtual dependencies on {edge.parent.name}: {e}"
                )
                continue

            virtuals_needed[parent_key].update(
                name
                for name, when_deps in parent_pkg.dependencies_by_name(when=True).items()
                if spack.repo.PATH.is_virtual(name)
                and any(edge.parent.satisfies(x) for x in when_deps)
            )

        if not virtuals_needed[parent_key]:
            continue

        child_key = edge.spec.dag_hash()
        if child_key not in virtuals_provided:
            virtuals_provided[child_key] = set()
            try:
                child_pkg = edge.spec.package
            except Exception as e:
                warnings.warn(
                    f"cannot reconstruct virtual dependencies on {edge.parent.name}: {e}"
                )
                continue

            virtuals_provided[child_key].update(x.name for x in child_pkg.virtuals_provided)

        if not virtuals_provided[child_key]:
            continue

        virtuals_to_add = virtuals_needed[parent_key] & virtuals_provided[child_key]
        if virtuals_to_add:
            edge.update_virtuals(virtuals_to_add)
class SpecfileReaderBase:
@@ -4867,7 +4743,7 @@ def from_node_dict(cls, node):
spec = Spec()
name, node = cls.name_and_data(node)
for h in ht.hashes:
for h in ht.HASHES:
setattr(spec, h.attr, node.get(h.name, None))
spec.name = name
@@ -5050,7 +4926,7 @@ def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
"""
for dep_name, elt in deps.items():
if isinstance(elt, dict):
for h in ht.hashes:
for h in ht.HASHES:
if h.name in elt:
dep_hash, deptypes = elt[h.name], elt["type"]
hash_type = h.name
@@ -5093,7 +4969,7 @@ def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
dep_name = dep["name"]
if isinstance(elt, dict):
# new format: elements of dependency spec are keyed.
for h in ht.hashes:
for h in ht.HASHES:
if h.name in elt:
dep_hash, deptypes, hash_type, virtuals = cls.extract_info_from_dep(elt, h)
break
@@ -5203,6 +5079,13 @@ def get_host_environment() -> Dict[str, Any]:
}
def eval_conditional(string):
"""Evaluate conditional definitions using restricted variable scope."""
valid_variables = get_host_environment()
valid_variables.update({"re": re, "env": os.environ})
return eval(string, valid_variables)
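# Example sketch, assuming get_host_environment() exposes string-valued keys such
# as "platform" and "target" (plus the "re" and "env" names added above):
#
#   eval_conditional("platform == 'linux' and re.match(r'zen.*', target)")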
class SpecParseError(spack.error.SpecError):
"""Wrapper for ParseError for when we're parsing specs."""
@@ -5361,8 +5244,10 @@ def __init__(self, spec):
class AmbiguousHashError(spack.error.SpecError):
def __init__(self, msg, *specs):
spec_fmt = "{namespace}.{name}{@version}{%compiler}{compiler_flags}"
spec_fmt += "{variants}{ arch=architecture}{/hash:7}"
spec_fmt = (
"{namespace}.{name}{@version}{compiler_flags}{variants}"
"{ arch=architecture}{/hash:7}{%compiler}"
)
specs_str = "\n " + "\n ".join(spec.format(spec_fmt) for spec in specs)
super().__init__(msg + specs_str)


@@ -60,13 +60,17 @@
import pathlib
import re
import sys
from typing import Iterator, List, Optional
import traceback
import warnings
from typing import Iterator, List, Optional, Tuple
from llnl.util.tty import color
import spack.deptypes
import spack.error
import spack.paths
import spack.spec
import spack.util.spack_yaml
import spack.version
from spack.tokenize import Token, TokenBase, Tokenizer
@@ -204,6 +208,32 @@ def __init__(self, tokens: List[Token], text: str):
super().__init__(message)
def _warn_about_variant_after_compiler(literal_str: str, issues: List[str]):
"""Issue a warning if variant or other token is preceded by a compiler token. The warning is
only issued if it's actionable: either we know the config file it originates from, or we have
call site that's not internal to Spack."""
ignore = [spack.paths.lib_path, spack.paths.bin_path]
mark = spack.util.spack_yaml.get_mark_from_yaml_data(literal_str)
issue_str = ", ".join(issues)
error = f"{issue_str} in `{literal_str}`"
# warning from config file
if mark:
warnings.warn(f"{mark.name}:{mark.line + 1}: {error}")
return
# warning from hopefully package.py
for frame in reversed(traceback.extract_stack()):
if frame.lineno and not any(frame.filename.startswith(path) for path in ignore):
warnings.warn_explicit(
error,
category=spack.error.SpackAPIWarning,
filename=frame.filename,
lineno=frame.lineno,
)
return
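# Resulting warning, sketched for a config-file origin (file and line hypothetical):
#
#   packages.yaml:12: `+debug` should go before `%gcc` in `foo %gcc +debug`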
class SpecParser:
"""Parse text into specs"""
@@ -242,26 +272,31 @@ def add_dependency(dep, **edge_properties):
raise SpecParsingError(str(e), self.ctx.current_token, self.literal_str) from e
initial_spec = initial_spec or spack.spec.Spec()
root_spec = SpecNodeParser(self.ctx, self.literal_str).parse(initial_spec)
root_spec, parser_warnings = SpecNodeParser(self.ctx, self.literal_str).parse(initial_spec)
while True:
if self.ctx.accept(SpecTokens.START_EDGE_PROPERTIES):
edge_properties = EdgeAttributeParser(self.ctx, self.literal_str).parse()
edge_properties.setdefault("depflag", 0)
edge_properties.setdefault("virtuals", ())
dependency = self._parse_node(root_spec)
dependency, warnings = self._parse_node(root_spec)
parser_warnings.extend(warnings)
add_dependency(dependency, **edge_properties)
elif self.ctx.accept(SpecTokens.DEPENDENCY):
dependency = self._parse_node(root_spec)
dependency, warnings = self._parse_node(root_spec)
parser_warnings.extend(warnings)
add_dependency(dependency, depflag=0, virtuals=())
else:
break
if parser_warnings:
_warn_about_variant_after_compiler(self.literal_str, parser_warnings)
return root_spec
def _parse_node(self, root_spec):
dependency = SpecNodeParser(self.ctx, self.literal_str).parse()
dependency, parser_warnings = SpecNodeParser(self.ctx, self.literal_str).parse()
if dependency is None:
msg = (
"the dependency sigil and any optional edge attributes must be followed by a "
@@ -270,7 +305,7 @@ def _parse_node(self, root_spec):
raise SpecParsingError(msg, self.ctx.current_token, self.literal_str)
if root_spec.concrete:
raise spack.spec.RedundantSpecError(root_spec, "^" + str(dependency))
return dependency
return dependency, parser_warnings
def all_specs(self) -> List["spack.spec.Spec"]:
"""Return all the specs that remain to be parsed"""
@@ -290,7 +325,7 @@ def __init__(self, ctx, literal_str):
def parse(
self, initial_spec: Optional["spack.spec.Spec"] = None
) -> Optional["spack.spec.Spec"]:
) -> Tuple["spack.spec.Spec", List[str]]:
"""Parse a single spec node from a stream of tokens
Args:
@@ -299,12 +334,15 @@ def parse(
Return
The object passed as argument, together with a list of parser warnings
"""
if not self.ctx.next_token or self.ctx.expect(SpecTokens.DEPENDENCY):
return initial_spec
parser_warnings: List[str] = []
last_compiler = None
if initial_spec is None:
initial_spec = spack.spec.Spec()
if not self.ctx.next_token or self.ctx.expect(SpecTokens.DEPENDENCY):
return initial_spec, parser_warnings
# If we start with a package name we have a named spec, we cannot
# accept another package name afterwards in a node
if self.ctx.accept(SpecTokens.UNQUALIFIED_PACKAGE_NAME):
@@ -318,7 +356,7 @@ def parse(
initial_spec.namespace = namespace
elif self.ctx.accept(SpecTokens.FILENAME):
return FileParser(self.ctx).parse(initial_spec)
return FileParser(self.ctx).parse(initial_spec), parser_warnings
def raise_parsing_error(string: str, cause: Optional[Exception] = None):
"""Raise a spec parsing error with token context."""
@@ -331,6 +369,12 @@ def add_flag(name: str, value: str, propagate: bool):
except Exception as e:
raise_parsing_error(str(e), e)
def warn_if_after_compiler(token: str):
"""Register a warning for %compiler followed by +variant that will in the future apply
to the compiler instead of the current root."""
if last_compiler:
parser_warnings.append(f"`{token}` should go before `{last_compiler}`")
while True:
if self.ctx.accept(SpecTokens.COMPILER):
if self.has_compiler:
@@ -339,6 +383,7 @@ def add_flag(name: str, value: str, propagate: bool):
compiler_name = self.ctx.current_token.value[1:]
initial_spec.compiler = spack.spec.CompilerSpec(compiler_name.strip(), ":")
self.has_compiler = True
last_compiler = self.ctx.current_token.value
elif self.ctx.accept(SpecTokens.COMPILER_AND_VERSION):
if self.has_compiler:
@@ -349,6 +394,7 @@ def add_flag(name: str, value: str, propagate: bool):
compiler_name.strip(), compiler_version
)
self.has_compiler = True
last_compiler = self.ctx.current_token.value
elif (
self.ctx.accept(SpecTokens.VERSION_HASH_PAIR)
@@ -363,14 +409,17 @@ def add_flag(name: str, value: str, propagate: bool):
)
initial_spec.attach_git_version_lookup()
self.has_version = True
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.BOOL_VARIANT):
variant_value = self.ctx.current_token.value[0] == "+"
add_flag(self.ctx.current_token.value[1:].strip(), variant_value, propagate=False)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.PROPAGATED_BOOL_VARIANT):
variant_value = self.ctx.current_token.value[0:2] == "++"
add_flag(self.ctx.current_token.value[2:].strip(), variant_value, propagate=True)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
@@ -378,6 +427,7 @@ def add_flag(name: str, value: str, propagate: bool):
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=False)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.PROPAGATED_KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
@@ -385,17 +435,19 @@ def add_flag(name: str, value: str, propagate: bool):
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=True)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.expect(SpecTokens.DAG_HASH):
if initial_spec.abstract_hash:
break
self.ctx.accept(SpecTokens.DAG_HASH)
initial_spec.abstract_hash = self.ctx.current_token.value[1:]
warn_if_after_compiler(self.ctx.current_token.value)
else:
break
return initial_spec
return initial_spec, parser_warnings
class FileParser:
@@ -485,23 +537,18 @@ def parse_one_or_raise(
text (str): text to be parsed
initial_spec: buffer where to parse the spec. If None a new one will be created.
"""
stripped_text = text.strip()
parser = SpecParser(stripped_text)
parser = SpecParser(text)
result = parser.next_spec(initial_spec)
last_token = parser.ctx.current_token
next_token = parser.ctx.next_token
if last_token is not None and last_token.end != len(stripped_text):
message = "a single spec was requested, but parsed more than one:"
message += f"\n{text}"
if last_token is not None:
underline = f"\n{' ' * last_token.end}{'^' * (len(text) - last_token.end)}"
message += color.colorize(f"@*r{{{underline}}}")
if next_token:
message = f"expected a single spec, but got more:\n{text}"
underline = f"\n{' ' * next_token.start}{'^' * len(next_token.value)}"
message += color.colorize(f"@*r{{{underline}}}")
raise ValueError(message)
if result is None:
message = "a single spec was requested, but none was parsed:"
message += f"\n{text}"
raise ValueError(message)
raise ValueError("expected a single spec, but got none")
return result
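# Error-reporting sketch (hypothetical input): for text = "zlib ^bzip2 gcc" the
# parser stops after the first spec, next_token points at "gcc", and the caret
# underline marks exactly that token:
#
#   expected a single spec, but got more:
#   zlib ^bzip2 gcc
#               ^^^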


@@ -1,7 +1,7 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
""" Test ABI-based splicing of dependencies """
"""Test ABI-based splicing of dependencies"""
from typing import List


@@ -129,7 +129,7 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
spec = Spec(
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
f"pkg-a foobar=bar target={root_target_range} %gcc@10 ^pkg-b target={dep_target_range}"
)
with spack.concretize.disable_compiler_existence_check():
spec = spack.concretize.concretize_one(spec)


@@ -200,7 +200,11 @@ def dummy_prefix(tmpdir):
@pytest.mark.requires_executables(*required_executables)
@pytest.mark.maybeslow
@pytest.mark.usefixtures(
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
"default_config",
"cache_directory",
"install_dir_default_layout",
"temporary_mirror",
"mutable_mock_env_path",
)
def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
"""
@@ -272,7 +276,11 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
@pytest.mark.maybeslow
@pytest.mark.nomockstage
@pytest.mark.usefixtures(
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
"default_config",
"cache_directory",
"install_dir_default_layout",
"temporary_mirror",
"mutable_mock_env_path",
)
def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
"""
@@ -569,7 +577,6 @@ def test_FetchCacheError_only_accepts_lists_of_errors():
def test_FetchCacheError_pretty_printing_multiple():
e = bindist.FetchCacheError([RuntimeError("Oops!"), TypeError("Trouble!")])
str_e = str(e)
print("'" + str_e + "'")
assert "Multiple errors" in str_e
assert "Error 1: RuntimeError: Oops!" in str_e
assert "Error 2: TypeError: Trouble!" in str_e
@@ -1133,7 +1140,7 @@ def test_get_valid_spec_file_no_json(tmp_path, filename):
def test_download_tarball_with_unsupported_layout_fails(tmp_path, mutable_config, capsys):
layout_version = bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION + 1
spec = Spec("gmake@4.4.1%gcc@13.1.0 arch=linux-ubuntu23.04-zen2")
spec = Spec("gmake@4.4.1 arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
spec._mark_concrete()
spec_dict = spec.to_dict()
spec_dict["buildcache_layout_version"] = layout_version


@@ -388,7 +388,7 @@ def test_wrapper_variables(
root = spack.concretize.concretize_one("dt-diamond")
for s in root.traverse():
s.prefix = "/{0}-prefix/".format(s.name)
s.set_prefix(f"/{s.name}-prefix/")
dep_pkg = root["dt-diamond-left"].package
dep_lib_paths = ["/test/path/to/ex1.so", "/test/path/to/subdir/ex2.so"]
@@ -396,7 +396,7 @@ def test_wrapper_variables(
dep_libs = LibraryList(dep_lib_paths)
dep2_pkg = root["dt-diamond-right"].package
dep2_pkg.spec.prefix = str(installation_dir_with_headers)
dep2_pkg.spec.set_prefix(str(installation_dir_with_headers))
setattr(dep_pkg, "libs", dep_libs)
try:
@@ -542,7 +542,7 @@ def test_build_jobs_sequential_is_sequential():
spack.config.determine_number_of_jobs(
parallel=False,
max_cpus=8,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("command_line", {"config": {"build_jobs": 8}}),
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 8}}),
),
@@ -556,7 +556,7 @@ def test_build_jobs_command_line_overrides():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=1,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("command_line", {"config": {"build_jobs": 10}}),
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 1}}),
),
@@ -567,7 +567,7 @@ def test_build_jobs_command_line_overrides():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=100,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("command_line", {"config": {"build_jobs": 10}}),
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 100}}),
),
@@ -581,7 +581,7 @@ def test_build_jobs_defaults():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=10,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 1}})
),
)
@@ -591,7 +591,7 @@ def test_build_jobs_defaults():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=10,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 100}})
),
)


@@ -403,8 +403,8 @@ def test_autoreconf_search_path_args_multiple(default_mock_concretization, tmpdi
aclocal_fst = str(tmpdir.mkdir("fst").mkdir("share").mkdir("aclocal"))
aclocal_snd = str(tmpdir.mkdir("snd").mkdir("share").mkdir("aclocal"))
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.prefix = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_one.set_prefix(str(tmpdir.join("fst")))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == [
"-I",
aclocal_fst,
@@ -422,8 +422,8 @@ def test_autoreconf_search_path_args_skip_automake(default_mock_concretization,
aclocal_snd = str(tmpdir.mkdir("snd").mkdir("share").mkdir("aclocal"))
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.name = "automake"
build_dep_one.prefix = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_one.set_prefix(str(tmpdir.join("fst")))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == ["-I", aclocal_snd]
@@ -434,7 +434,7 @@ def test_autoreconf_search_path_args_external_order(default_mock_concretization,
aclocal_snd = str(tmpdir.mkdir("snd").mkdir("share").mkdir("aclocal"))
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.external_path = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == [
"-I",
aclocal_snd,
@@ -447,8 +447,8 @@ def test_autoreconf_search_path_skip_nonexisting(default_mock_concretization, tm
"""Skip -I flags for non-existing directories"""
spec = default_mock_concretization("dttop")
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.prefix = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_one.set_prefix(str(tmpdir.join("fst")))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == []


@@ -210,7 +210,6 @@ def check_args_contents(cc, args, must_contain, must_not_contain):
"""
with set_env(SPACK_TEST_COMMAND="dump-args"):
cc_modified_args = cc(*args, output=str).strip().split("\n")
print(cc_modified_args)
for a in must_contain:
assert a in cc_modified_args
for a in must_not_contain:


@@ -18,6 +18,7 @@
import spack.repo as repo
import spack.util.git
from spack.test.conftest import MockHTTPResponse
from spack.version import Version
pytestmark = [pytest.mark.usefixtures("mock_packages")]
@@ -30,6 +31,43 @@ def repro_dir(tmp_path):
     yield result


+def test_get_added_versions_new_checksum(mock_git_package_changes):
+    repo_path, filename, commits = mock_git_package_changes
+
+    checksum_versions = {
+        "3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04": Version("2.1.5"),
+        "a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a": Version("2.1.4"),
+        "6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200": Version("2.0.7"),
+        "86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8": Version("2.0.0"),
+    }
+
+    with fs.working_dir(str(repo_path)):
+        added_versions = ci.get_added_versions(
+            checksum_versions, filename, from_ref=commits[-1], to_ref=commits[-2]
+        )
+        assert len(added_versions) == 1
+        assert added_versions[0] == Version("2.1.5")
+
+
+def test_get_added_versions_new_commit(mock_git_package_changes):
+    repo_path, filename, commits = mock_git_package_changes
+
+    checksum_versions = {
+        "74253725f884e2424a0dd8ae3f69896d5377f325": Version("2.1.6"),
+        "3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04": Version("2.1.5"),
+        "a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a": Version("2.1.4"),
+        "6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200": Version("2.0.7"),
+        "86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8": Version("2.0.0"),
+    }
+
+    with fs.working_dir(str(repo_path)):
+        added_versions = ci.get_added_versions(
+            checksum_versions, filename, from_ref=commits[2], to_ref=commits[1]
+        )
+        assert len(added_versions) == 1
+        assert added_versions[0] == Version("2.1.6")
+
+
 def test_pipeline_dag(config, tmpdir):
     r"""Test creation, pruning, and traversal of PipelineDAG using the
     following package dependency graph:
@@ -347,7 +385,6 @@ def test_get_spec_filter_list(mutable_mock_env_path, mutable_mock_repo):
     for key, val in expectations.items():
         affected_specs = ci.get_spec_filter_list(e1, touched, dependent_traverse_depth=key)
         affected_pkg_names = set([s.name for s in affected_specs])
-        print(f"{key}: {affected_pkg_names}")
         assert affected_pkg_names == val

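The two new tests cover `ci.get_added_versions()`, which takes a mapping from checksum (a sha256 for tarball versions, a commit sha for git-based versions) to `Version`, plus a package file and a pair of git refs, and returns only the versions introduced between those refs. A hedged usage sketch reusing the fixture names from the tests above (`repo_path`, `filename`, and `commits` come from the `mock_git_package_changes` fixture):

    # Which of the known versions does the newest commit add to the package file?
    with fs.working_dir(str(repo_path)):
        added = ci.get_added_versions(
            checksum_versions, filename, from_ref=commits[-1], to_ref=commits[-2]
        )
    # Only versions whose checksum first appears in to_ref are reported.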

@@ -12,7 +12,7 @@
 build_env = SpackCommand("build-env")


-@pytest.mark.parametrize("pkg", [("zlib",), ("zlib", "--")])
+@pytest.mark.parametrize("pkg", [("pkg-c",), ("pkg-c", "--")])
 @pytest.mark.usefixtures("config", "mock_packages", "working_env")
 def test_it_just_runs(pkg):
     build_env(*pkg)
@@ -38,7 +38,7 @@ def test_build_env_requires_a_spec(args):
 @pytest.mark.usefixtures("config", "mock_packages", "working_env")
 def test_dump(shell_as, shell, tmpdir):
     with tmpdir.as_cwd():
-        build_env("--dump", _out_file, "zlib")
+        build_env("--dump", _out_file, "pkg-c")
         with open(_out_file, encoding="utf-8") as f:
             if shell == "pwsh":
                 assert any(line.startswith("$Env:PATH") for line in f.readlines())
@@ -51,7 +51,7 @@ def test_dump(shell_as, shell, tmpdir):
 @pytest.mark.usefixtures("config", "mock_packages", "working_env")
 def test_pickle(tmpdir):
     with tmpdir.as_cwd():
-        build_env("--pickle", _out_file, "zlib")
+        build_env("--pickle", _out_file, "pkg-c")
         environment = pickle.load(open(_out_file, "rb"))
         assert isinstance(environment, dict)
         assert "PATH" in environment

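All three hunks swap the real `zlib` package for `pkg-c`, one of Spack's mock packages, so these command tests no longer depend on a real package definition. For context, `SpackCommand` wraps a CLI command as a callable; a small sketch of driving it directly (the file name is hypothetical, and `pkg-c` exists only in the mock repository):

    from spack.main import SpackCommand

    build_env = SpackCommand("build-env")
    # Print pkg-c's build environment, then dump it to a file.
    output = build_env("pkg-c")
    build_env("--dump", "env.txt", "pkg-c")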

@@ -148,7 +148,7 @@ def test_update_key_index(
     s = spack.concretize.concretize_one("libdwarf")

     # Install a package
-    install(s.name)
+    install("--fake", s.name)

     # Put installed package in the buildcache, which, because we're signing
     # it, should result in the public key getting pushed to the buildcache
@@ -178,7 +178,7 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
     s = spack.concretize.concretize_one("libdwarf")

     # Install and generate build cache index
-    PackageInstaller([s.package], explicit=True).install()
+    PackageInstaller([s.package], fake=True, explicit=True).install()

     metadata_file = spack.binary_distribution.tarball_name(s, ".spec.json")
@@ -214,13 +214,11 @@ def verify_mirror_contents():
             if in_env_pkg in p:
                 found_pkg = True

-    if not found_pkg:
-        print("Expected to find {0} in {1}".format(in_env_pkg, dest_mirror_dir))
-        assert False
+    assert found_pkg, f"Expected to find {in_env_pkg} in {dest_mirror_dir}"

     # Install a package and put it in the buildcache
     s = spack.concretize.concretize_one(out_env_pkg)
-    install(s.name)
+    install("--fake", s.name)
     buildcache("push", "-u", "-f", src_mirror_url, s.name)

     env("create", "test")

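The common thread in this file is replacing real installs with fake ones, which lay down a minimal placeholder installation instead of actually building anything; these buildcache tests only need an installed record to push. The two equivalent forms used above, side by side (names as in the diff):

    # CLI form, via the `install` SpackCommand used in these tests:
    install("--fake", s.name)

    # API form, as in test_buildcache_autopush:
    PackageInstaller([s.package], fake=True, explicit=True).install()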
Some files were not shown because too many files have changed in this diff.