Compare commits

309 Commits

Author SHA1 Message Date
Nils Vu
1331332dcf libxsmm: update URL (#49155) 2025-02-22 02:04:49 -07:00
dmagdavector
910a4e6d22 slirp4netns: add v1.2.3, v1.3.1 (#48569) 2025-02-21 16:00:28 -08:00
Piotr Sacharuk
93f1ec20aa Update openturns versions (#48872) 2025-02-21 15:59:23 -08:00
Harmen Stoppels
9edbe5aed1 liburing: requires(...) (#49041)
* liburing: requires

* Update var/spack/repos/builtin/packages/liburing/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-21 15:45:54 -08:00
Krishna Chilleri
a574c7610b py-schema-salad: add v8.7.20241021092521 and py-mypy: add v1.12.1 (#49127)
* add new versions of dependencies

* modify pypi url for newest version

* add option for url depending on version number

* add version ranges of dependencies

* [@spackbot] updating style on behalf of kchilleri

* remove unnecessary py-cache-control version number
2025-02-21 16:07:32 -07:00
Alec Scott
4742f053af emacs: improve gui variant to cover both linux and macos (#49054)
* emacs: improve gui variant to cover both linux and macos

* emacs: fix optional deps type
2025-02-21 16:16:44 -06:00
Alec Scott
b06c5c7e81 fzf: fix go cache protection to allow delete (#49151) 2025-02-21 13:52:33 -07:00
Tobias Ribizel
03fa150185 typst: add v0.13.0 (#49134)
* typst: add version 0.13

* typst: fix version order

Co-authored-by: Alec Scott <hi@alecbcs.com>

* typst: more precise version requirements

* typst: use build_directory

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-21 11:25:50 -08:00
ByteHamster
b304a2d854 Fix installing rust@nightly (#49098)
Installing `rust@nightly` fails because the package file declares a conflict between rust versions `:1.64` (i.e., 1.64 and older) and `gcc>=13`. However, because `nightly` is alphanumerically smaller than any actual version number, `nightly` is incorrectly detected to have a conflict with `gcc>=13` as well. Marking `nightly` as an infinity version instead solves this.
2025-02-21 09:53:46 -08:00
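For illustration, a minimal self-contained sketch of the ordering problem and the fix; this is a toy model, not Spack's actual version implementation, and the set of infinity names is an assumption:

```python
# Toy model only: infinity versions sort above every numeric release.
INFINITY_VERSIONS = {"develop", "main", "master", "nightly"}  # assumed list

def version_key(v: str):
    if v in INFINITY_VERSIONS:
        return (1, v)
    return (0, tuple(int(part) for part in v.split(".")))

# A naive comparison puts "nightly" below "1.64", wrongly triggering the
# rust@:1.64 conflict with gcc>=13; the infinity rule fixes the ordering.
assert version_key("nightly") > version_key("1.64")
assert version_key("1.63") < version_key("1.64")
```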
Ryan Krattiger
1fa1864b37 Reproducer should deduce artifact root from concrete environment (#45281)
* Reproducer should deduce artifact root from concrete environment

* Add documentation on the layout of the artifacts directory

* Use dag hash in the container name

* Add reproducer options to improve local testing

* --use-local-head allows running reproducer with
  the current Spack HEAD commit rather than computing
  a commit for the reproducer

* Add test to verify commits and recreating reproduction environment

* Add test for non-merge commit case

* ci reproduce-build: Drop overwrite option
in favor of throwing an error if the working dir is non-empty
2025-02-21 10:46:43 -06:00
Harmen Stoppels
0da5bafaf2 Repo.packages_with_tags: do not construct a set of all packages (#49141) 2025-02-21 16:23:42 +01:00
Massimiliano Culpo
f4614a4931 Extract some package changes from compiler as deps (#49138) 2025-02-21 12:52:34 +01:00
Harmen Stoppels
b8ec69112f Extracted changes from 45189 (#49137)
Co-Authored-By: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-21 10:58:00 +01:00
Massimiliano Culpo
a3645fd372 Make BaseConfiguration pickleable (#47545)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-21 09:54:22 +01:00
dependabot[bot]
9bcd86071f build(deps): bump sphinx from 8.1.3 to 8.2.0 in /lib/spack/docs (#49118)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 8.1.3 to 8.2.0.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v8.1.3...v8.2.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 16:42:00 -08:00
Robert Maaskant
b126335800 go: add v1.24.0 (#49104)
* go-bootstrap: add v1.22.12

* go: add v1.24.0

* Reformat package, don't deprecate versions without an active CVE

---------

Co-authored-by: Alec Scott <scott112@llnl.gov>
2025-02-20 16:07:52 -08:00
Derek Ryan Strong
d32b6099b3 sw4: fix build options (#48774)
* Fix sw4 build options

* Update constraint to hdf5@1.14

* Change edit to setup_build_environment

* Use append_flags

* Fix style
2025-02-20 15:02:52 -07:00
Harmen Stoppels
3e8cb852b0 spec.py: use json.dumps directly to avoid hash breakage (#48884) 2025-02-20 17:39:07 +01:00
Richard Berger
c8d7aa1772 lammps: fix pace link dep 2025-02-20 17:37:19 +01:00
Buldram
ec836d740f py-tensorflow: patch for v2.15 build errors (#49001)
* py-tensorflow: patch for v2.15 build errors with new compilers

* py-tensorflow: patch for v2.15 build errors with new compilers

* py-tensorflow: fix clang build and add clang version constraints

* py-tensorflow: use compiler wrapper

* py-tensorflow: relax clang conflict
2025-02-20 13:13:04 +01:00
Teague Sterling
cacdaaf3a9 bcftools: add v1.21, v1.20 (#49070)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-20 01:58:21 -07:00
Dave Keeshan
165e6b1d5e xnedit: new package (#41255) 2025-02-20 08:03:11 +01:00
Paul
70f5300cf2 Add new package for Jacamar CI (#48424) 2025-02-19 22:18:15 -06:00
Buldram
81e08167e2 nim: fix Musl build with new compilers (#48487)
* nim: fix build with new compilers

* narrow condition for disabling warnings

* move flags into offending module

disables warnings also for compiling projects other than the Nim compiler when necessary

* specify pthread modules for different versions

* instead patch SysThread type

* adapt patch for old Nim versions

* Specify hypothetical `:@0.19.6` for patch version constraint
2025-02-19 22:16:12 -06:00
Alex Richert
e9d8c5767b crtm-fix: 3.1.1.2 (#48755)
* crtm-fix: 3.1.1.2

* correct checksum

* exclude test files

* Update package.py
2025-02-19 22:13:08 -06:00
Teague Sterling
4cefa973cd htslib: add v1.21 (#49056)
* Adding variants based on configure flags and an option to compile with PIC

* Adding GCS to htslib

* Revisions from review

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updating descriptions, fixing flags, fixing version and variant conditions

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* htslib: add v1.21

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-19 21:43:10 -06:00
Pruvost Florent
a2bd221ee4 chameleon: update to 1.3.0 (#49112) 2025-02-19 20:03:23 -07:00
Dom Heinzeller
adbb41c7df Bug fixes for fckit (disable finalization of DDTs) and ectrans (disable use of Fortran contiguous keyword) (#49111)
* In var/spack/repos/builtin/packages/ectrans/package.py, always set cmake argument ECTRANS_HAVE_CONTIGUOUS_ISSUE to turn off problematic use of Fortran 'contiguous' keyword
* In var/spack/repos/builtin/packages/ectrans/package.py, always set cmake argument ENABLE_FINAL=OFF to turn off problematic finalization of derived data types
* Update links to issues in fckit and ectrans
* Fix wrong cmake argument for ECTRANS_HAVE_CONTIGUOUS_ISSUE in var/spack/repos/builtin/packages/ectrans/package.py
2025-02-19 20:03:05 -07:00
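As a hedged sketch of the described change (illustrative only, not the exact diff in `var/spack/repos/builtin/packages/ectrans/package.py`), the two CMake arguments could be set unconditionally like this:

```python
# Sketch, assuming a standard Spack CMakePackage recipe:
def cmake_args(self):
    return [
        # Avoid the problematic use of the Fortran 'contiguous' keyword.
        self.define("ECTRANS_HAVE_CONTIGUOUS_ISSUE", True),
        # Turn off problematic finalization of derived data types.
        self.define("ENABLE_FINAL", False),
    ]
```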
Joseph Wang
2554c7bd21 py-onnxruntime: add v1.18.0 -> v1.19.2 (#46329)
* py-onnxruntime: add new versions

* py-onnxruntime: add constraints

* py-onnxruntime: fix typo

* py-onnxruntime: fix style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 12:02:27 -08:00
Garth N. Wells
e274e855f1 py-nanobind: add v2.5.0 (#48953)
Also removes inactive maintainer.
2025-02-19 11:47:25 -08:00
Matthew L. Curry
e76ebf2cf7 xz: Work around ASM declaration issue with NVHPC (#49006)
This commit works around an issue described below that xz encounters
during compilation with nvhpc.

https://forums.developer.nvidia.com/t/problem-in-inline-assembly-when-using-multiple-asm-declarations/210952
2025-02-19 11:38:29 -08:00
Pranav Sivaraman
11ba5ebbcd jsoncons: new package (#49105) 2025-02-19 11:13:08 -08:00
Adam J. Stewart
53262b968b py-scikit-image: add v0.25.2 (#49101) 2025-02-19 11:12:07 -08:00
etiennemlb
39620085d4 Add new packages: PDI (and dependencies/plugins) (#48710)
* Add new packages: PDI
* Fix style and  typos
* License and pdi python version/shebang issue
* Version update
* 1.8.0 cutoff and dependency simplifications
* Remove unused guard
2025-02-19 09:56:03 -08:00
Krishna Chilleri
78c985fce4 hpc-beeflow: New package (#49036)
* hpc-beeflow: New package

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* add new version of py-fastapi

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 11:18:01 -06:00
Wouter Deconinck
f9e4d3898a cargo: avoid need to use super().build_args with std_build_args (#49071)
* cargo: avoid need to use super().build_args with std_build_args

* cargo: fix style

* jujutsu: avoid need for super().build_args
2025-02-19 09:01:56 -08:00
Lehman Garrison
75c3d0a053 py-yt: add 4.4.0 and dependencies (#47571)
* py-ewah-bool-utils: add new package

* py-extension-helpers: add 1.2.0

* py-regions: add new package

* py-erfa: add 2.0.1.5

* py-yt: add 4.4.0

* py-yt: respect build_jobs
2025-02-19 08:14:35 -07:00
Stephen Nicholas Swatman
6afe002c94 vecmem: fix SYCL compiler specification (#49108)
This commit adds an additional requirement to the vecmem package,
requiring the OneAPI compiler iff the `sycl` variant is turned on. This
allows us to correctly set the non-standard `SYCLCXX` environment
variable.
2025-02-19 08:54:47 -06:00
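A sketch of how such a constraint and environment variable can be expressed in a Spack recipe; the exact directives used by the PR are assumed, not copied:

```python
# Assumed shape of the change, for illustration only:
requires("%oneapi", when="+sycl", msg="vecmem +sycl needs the oneAPI compiler")

def setup_build_environment(self, env):
    if self.spec.satisfies("+sycl"):
        # SYCLCXX is the non-standard variable consumed by vecmem's build.
        env.set("SYCLCXX", self.compiler.cxx)
```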
Alexandre DENIS
f76e01707a mpi-sync-clocks: new package (#47834)
* mpi_sync_clocks: new package

* mpi_sync_clocks: move package to the right location

* mpi_sync_clocks: add copyright header

* mpi-sync-clocks: rename package mpi_sync_clocks -> mpi-sync-clocks to comply with naming convention

* mpi-sync-clocks: update copyright

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpi-sync-clocks: streamline autogen

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 08:32:04 -06:00
Alexandre DENIS
738ca8e2c2 mpibenchmark: new package (#47835)
* mpibenchmark: new package

* mpibenchmark: add copyright header

* mpibenchmark: move the package to the right location

* mpibenchmark: explicitly disable CUDA & ROCm

* mpibenchmark: update copyright

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: streamline management of --enable/--disable

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: streamline autogen

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: fix coding style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 08:31:15 -06:00
Seth R. Johnson
817df233fb g4vg/vecgeom: add version 1.0.2 and patch cuda build failures (#49110)
* vecgeom: remove old patches and add patch for CUDA 11

* g4vg: add 1.0.2
2025-02-19 06:29:01 -07:00
Adam J. Stewart
5ea4d04450 py-scipy: add v1.15.2 (#49074) 2025-02-19 13:08:20 +01:00
Taillefumier Mathieu
49bf5a349e cp2k: fine graining control of the GPU modules (#48925)
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2025-02-19 09:29:31 +01:00
Robert Maaskant
2427b9649d ca-certificates-mozilla: add 2024-12-31 and deprecate older (#49096) 2025-02-19 00:34:18 -07:00
Robert Maaskant
0cec2c9fc6 glab: v1.52.0 and v1.53.0 (#49094) 2025-02-19 00:34:05 -07:00
Harmen Stoppels
c97be2a9d7 checksum.py tests: extract add_versions_to_pkg fixture (#49100) 2025-02-19 07:33:50 +00:00
Joe Schoonover
3fbdfc464b fluidnumerics-self: new package (#48636)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: fluidnumerics-joe <fluidnumerics-joe@users.noreply.github.com>
2025-02-19 00:23:38 -07:00
Buldram
c4449cb201 chez-scheme: new package (#49067)
* chez-scheme: new package

* Add separate zuo package, correct dep flags, disable threads and libffi by default

* zuo: add +big, parallelize build
2025-02-18 16:05:37 -06:00
Christian Heusel
1601193e12 likwid: Fix the perms script (#48666)
* likwid: Fix the perms script

The script loops over the path (which includes the prefix), but
additionally adds the prefix up front which results in duplicate paths
and a double "/" in the command like in the following one:

    chown root:root /opt/csg/spack/opt/spack/linux-debian12-zen2/gcc-12.2.0/likwid-5.4.1-xfc6quebnf2kosydl3ospaeoskxnxwhn//opt/csg/spack/opt/spack/linux-debian12-zen2/gcc-12.2.0/likwid-5.4.1-xfc6quebnf2kosydl3ospaeoskxnxwhn/sbin/likwid-accessD

Additionally the path is currently not quoted which can potentially
result in word splitting for weird paths.

Signed-off-by: Christian Heusel <christian@heusel.eu>

* likwid: Make the perm script's name unique

Also move it into the proper binary folder as per the Filesystem
Hierarchy Standard.

Signed-off-by: Christian Heusel <christian@heusel.eu>

---------

Signed-off-by: Christian Heusel <christian@heusel.eu>
2025-02-18 13:18:03 -07:00
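For illustration, a small Python sketch of the corrected command generation: apply the prefix exactly once and quote the final path so unusual paths cannot word-split (the prefix and file list here are hypothetical):

```python
import shlex

prefix = "/opt/spack/likwid-5.4.1"  # hypothetical install prefix
for rel_path in ("sbin/likwid-accessD",):
    full = f"{prefix}/{rel_path}"   # prefix applied once, no duplication
    print(f"chown root:root {shlex.quote(full)}")
```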
Alex Richert
9c5b3ccb4e wgrib2: add cmake builder (#48447)
* wgrib2: add cmake builder

* wgrib2 add maintainer

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* style fixes; add tar.gz for old vers

* license update

* wgrib2: don't restrict openjpeg variant by version

* Update var/spack/repos/builtin/packages/wgrib2/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/wgrib2/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 13:12:42 -07:00
Robert Maaskant
68517389a0 rclone: add v1.68.2 v1.69.0 v1.69.1 (#49083) 2025-02-18 10:42:53 -08:00
Carson Woods
df5ad63331 py-loguru: add v0.7.0 -> v0.7.3 (#48268)
* Added 4 new versions of py-loguru

* Added new build dependency

* Removed py-setuptools on versions beyond 0.7.2

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fixed erroneous dependency range on py-aiocontextvars

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 10:34:13 -07:00
Dave Keeshan
e79dc4422e libgpiod: new package (#47724)
* Initial check-in of the package file for libgpiod; this covers versions 1.6.3 through 2.2

* Update var/spack/repos/builtin/packages/libgpiod/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/libgpiod/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update how the version-to-url mapping is managed in the x.y.0 cases, based on a fix by @wdconinc

* Remove redundant url at the top since it is now inside url_for_version function

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 11:13:51 -06:00
Gregor Olenik
f0e5568a54 neofoam: new package (#47214)
* add neofoam package.py

* add Henning as second maintainer

* format file

* fixup! SPDX License entry

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/neofoam/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 11:13:43 -06:00
Laurent Chardon
1a1f0aa07b xios: update to v2.6 (#48680)
* xios: update to v2.6

- xios: update to v2.6
- xios: patch remap earcut.hpp for missing include file
- xios: address C++11 requirement
- xios: instruct to use external boost and blitz libraries

* xios: bump revision to 2714

- Bump revision to changeset 2714
- Deprecate older versions that I can't manage to compile
- Confirm clang workaround is still needed
2025-02-18 08:04:06 -07:00
chaoos
c4ea924977 quda: new package (#48939)
* add quda package

* [@spackbot] updating style on behalf of chaoos

* adjusted header comment

* add quda package

* [@spackbot] updating style on behalf of chaoos

* adjusted header comment

* addressing reviewers comments

* formatting, adjusted build types, added tags hep and lattice

* [@spackbot] updating style on behalf of chaoos

* adjusted preferred version

* [@spackbot] updating style on behalf of chaoos
2025-02-18 07:59:01 -07:00
Wouter Deconinck
57df23a51f libx*: add new versions of X packages (#49060) 2025-02-18 15:46:50 +01:00
Harmen Stoppels
97d66b637f Fix tests modifying package.py files (#49093) 2025-02-18 13:28:07 +01:00
Harmen Stoppels
92e1807672 petsc: fix can provide vs provides issue (#49077) 2025-02-18 13:07:42 +01:00
Harmen Stoppels
e4a8d45d86 views: resolve symlinked dir - dir conflict when same file (#49039)
A directory and a symlink to it under the same relative path in a
different prefix

```
/prefix1/dir/
/prefix1/dir/file
/prefix2/dir -> /prefix1/dir/
```

are not a blocker to create a view. The view structure simply looks like
this:

```
/view/dir/
/view/dir/file
```

This should be the case independently of the order in which we visit
prefixes, so we could in principle create views order independently.
2025-02-18 06:53:17 +01:00
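A simplified sketch of the idea, assuming nothing about Spack's actual view-merging code: treat a symlink to a directory like the directory itself, so both prefixes produce the same `/view/dir/` no matter the visit order:

```python
import os

def merge(view_root: str, prefix: str, rel: str) -> None:
    """Toy merge of one entry from an install prefix into a view."""
    src = os.path.join(prefix, rel)
    dst = os.path.join(view_root, rel)
    if os.path.isdir(src):               # True for dirs *and* symlinks to dirs
        os.makedirs(dst, exist_ok=True)  # idempotent, so prefix order is irrelevant
    else:
        os.symlink(os.path.realpath(src), dst)
```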
Wouter Deconinck
d6669845ed fjcontrib: add v1.055, v1.056, and v1.100 with patch (#49048) 2025-02-18 06:38:51 +01:00
Wouter Deconinck
60efada6a2 catch2: add v3.8.0 (#49052) 2025-02-18 06:37:23 +01:00
Wouter Deconinck
a093a65a25 armadillo: add v14.2.3 (#49051) 2025-02-18 06:33:09 +01:00
Wouter Deconinck
1524aceb9a gocryptfs: add v2.5.1 (#49063) 2025-02-18 06:32:25 +01:00
Wouter Deconinck
9d0766be48 hep: static_analysis: true (#49069) 2025-02-18 06:27:54 +01:00
Wouter Deconinck
7e89b3521a openloops: add v2.1.3, v2.1.4 (#49064) 2025-02-18 06:25:51 +01:00
Harmen Stoppels
2e372c53ab spec.py: remove Spec.virtual_dependencies (#49079) 2025-02-18 06:17:55 +01:00
Alec Scott
8639779002 pass: add master, improve styling (#49081) 2025-02-18 06:13:34 +01:00
Wouter Deconinck
a0e09139fc bioconductor-*: rm in favor of r-* copies (#49089) 2025-02-18 06:06:26 +01:00
Wouter Deconinck
b02ac87c55 apptainer/singularity/singularityCE: variant suid default False (#49088) 2025-02-18 05:26:00 +01:00
dependabot[bot]
a9da160160 build(deps): bump flake8 from 7.1.1 to 7.1.2 in /lib/spack/docs (#49087)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.1 to 7.1.2.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.1...7.1.2)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 20:08:05 -07:00
Alec Scott
f1678f4c7b bfs: add v4.0.5 (#49049)
* bfs: add v4.0.5, liburing: v2.4, v2.9

* Re-enable bfs on developer tools pipelines
2025-02-17 19:05:35 -06:00
dependabot[bot]
dae3b69f2c build(deps): bump flake8 in /.github/workflows/requirements/style (#49086)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.1 to 7.1.2.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.1...7.1.2)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 18:13:42 -06:00
Brian Vanderwende
cec7e6c4b5 GCC 14 needs C-standards workaround flags (#48019) 2025-02-17 16:33:16 -07:00
dmagdavector
5931236f55 py-nbconvert: add v7.14.2 to 7.16 (#47944) 2025-02-17 15:43:21 -07:00
Miranda Mundt
e695185770 py-pyomo: bump to 6.8.2; add new subdep package py-linear-tree (#48164)
* damaris: add v1.12.0, update maintainers (#48674)

Co-authored-by: Etienne Ndamlabin <jean-etienne.ndamlabin-mboula@inria.fr>

* Bump up the version for rocm-6.3.1 release (#48440)

This PR updates the versions for the rocm recipes for rocm-6.3.1 release.

* py-flash-attn: add missing triton dependency (#48645)

* hep stack: additional event generator packages (#48565)

* hep stack: additional event generator packages

* hep: additional packages

* hep: collier doesn't have +pic +shared

* py-awkward-cpp: fix scikit-build-core range of applicability

* hep: disable agile

* hep: disable garfieldpp and genie

* py-wxpython: depends_on pkgconfig even if using external wxwidgets

* hep: disable professor

* papi: fix error finding gmake during post-install testing (#48592)

* JAX: add v0.4.32+ (#46346)

* JAX: add v0.4.34

* Disable search for clang

* Update CUDA flags

* Add py-jax 0.4.33, comment out until py-jaxlib 0.4.33 is also released

* Fix GCC build

* Try TF_NVCC_CLANG

* py-jax: add v0.4.34

* jax no longer has separate tags for jaxlib

* Install compiled wheel

* Join path before glob

* Wheel is in spack stage, not tmp path

* Add 0.4.35

* Add newer versions

* Build system has been refactored yet again

* Drop clang

* Fix build with source tarball, rocm support

* Support GCC

* Remove clang-specific compiler flags

* enable_cuda flag was removed

* Fix logic

* py-jax: add v0.4.38

* Add patch to fix GCC support

* Patch no longer needed

* Skip patching, directly pass flags

* New flags

* Remove unused import

* Patch changed

* Use older version of patch

* Newer patch

* Add CUDA symlink

* Symlink more directories

* Recursive symlink

* Import function

* Recursive search

* Undo cuda changes

* Add v0.5.0

* I quit

* py-geemap: add new package (#48602)

* easi: add v1.5.1; relax yaml-cpp and lua requirements (#48675)

* thepeg: extend the rivet@:3 dependency up to version 2.3 (#48691)

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>

* py-ipyrad: adding version 0.9.102 (#48686)

Signed-off-by: Shane Nehring <snehring@iastate.edu>

* autodock-vina: adding version 1.2.6 (#48684)

Signed-off-by: Shane Nehring <snehring@iastate.edu>

* toybox: add v0.8.12 (#48657)

* Changes for NVIDIA HPC SDK 25.1 (#48696)

* update hypre version and add new memalign for petsc (#47831)

* petsc+rocm: add dependency on hipblas-common (#48644)

* spec.py: fix ArchSpec.intersects (#48741)

fixes a bug where `x86_64:` and `ppc64le:` intersected, and x86_64: and :haswell did not.

* ucx: adding 1.18.0 (#48742)

* Adding UCX 1.18.0
* Verified and correct hash.

* MAGMA: add v2.9.0 (#48750)

* Deprecate frontend/backend os/target (#47756)

* package api: drop wildcard re-export (#48760)

* package api: drop wildcard re-export

To ensure package repos are forward/backward compatibility with Spack,
we should explicitly export all symbols we want to expose in the public
package API, and drop `from spack.something import *` because
removal/addition to the public API will go unnoticed.

Also `llnl.util.filesystem` has some methods that shouldn't be exposed
in the package API, so better to enumerate a subset explicitly.

* remove flatten_dependencies / install_dependency_symlinks

* py-cmake: remove. remove deprecated cmake versions (#48763)

* Remove pipelines and images based on ppc64le (#48767)

* petsc: only conflict with kokkos@4.5: if it is enabled (#48698)

* hpctoolkit: Add `+docs` variant and manpages (#48566)

* py-mdit-py-plugins: Add new versions 0.3.5, 0.4.2
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
* py-myst-parser: Add new versions 0.19.0 to 4.0.0
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
* hpctoolkit: Add +docs variant and manpages
   This commit unconditionally enables manpages for the HPCToolkit tools.
   The new `+docs` variant enables additional documentation, specifically
   the user's manual. Both require new build-time dependencies.
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>

---------

Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>

* fmt: simplify +pic (#48766)

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

* Docs/bugfix: correct return for Adding flags to configure (#48434)

* binutils: conflict on configuration with build issues (#42949)

* Create SALT package.py (#48758)

* Create SALT package.py

Added a package for the SALT Source AnaLysis Toolkit
@zbeekman

* [@spackbot] updating style on behalf of wspear

* Update package.py

Line wrap

---------

Co-authored-by: wspear <wspear@users.noreply.github.com>

* smee-client: add v2.0.4 (#48384)

* CMake: add v3.31.5, v3.30.7 (#48759)

* builtin: remove redundant imports (#48765)

* builtin: remove redundant llnl.util.filesystem import
* remove redundant import spack.version
* unsorted fixes
* more spack.version

* abinit: pass flag correctly (#48788)

* Add py-zarr 3, which includes a new required package py-donfig, and a bug fix to the patch range with numcodecs (#48786)

* libxc: add CMake builder (#48772)

* libsmeagol

* libxc cmake

* cmake support

* revert changes

* make spackbot happy

* fix

* Update package.py

* hipblaslt: update cmake dependency (#48637)

* hipblaslt: update cmake dependency

1 error found in build log:
  >> 3    CMake Error at CMakeLists.txt:24 (cmake_minimum_required):
     4      CMake 3.25.2 or higher is required.  You are running version 3.22.1
     5
     6
     7    -- Configuring incomplete, errors occurred!

See build log for details:
  /scratch/svcpetsc/spack-rocm/spack-stage/spack-stage-hipblaslt-6.3.0-pabb7t4rheqkz74lfzbsnqi6vnpiqwlq/spack-build-out.txt

* Update var/spack/repos/builtin/packages/hipblaslt/package.py

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>

---------

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>

* Move from python2 compliant IOError and EnvironmentError to python3-only  OSError (#48764)

* IOError -> OSError

* also do EnvironmentError

* py-sphinx: mark Python compatibility (#48796)

* spack.package: wrap llnl.util.tty (#48793)

avoid import of llnl.util.tty in packages

* spack.package: re-export EnvironmentModifications / Prefix (#48792)

* Remove unused values (#48795)

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>

* petsc, py-petsc4py: add v3.22.3 (#48785)

* libsmeagol (#48776)

* libsmeagol

* add support for intel and add conflicts

* cp2k

* Bug Fix: Better incremental check for CMake (#48775)

* Bug Fix: Better incremental check for CMake

* Fix syntax error

* Ensure match of config artifact with generator

* add cdo@2.5.0 (#48801)

* Added salt variant to tau (#48782)

* Added salt variant to tau

* Update package.py

* [@spackbot] updating style on behalf of wspear

---------

Co-authored-by: wspear <wspear@users.noreply.github.com>

* Remove patch on main (#48798)

Patch got merged: https://github.com/natefoo/slurm-drmaa/pull/62

* kokkos-nvcc-wrapper: add version 4.5.00 and 4.5.01 (#48802)

* env create: create copies of relative include files in envs created from manifest (#48689)

Currently, environments created from manifest files with relative includes result in broken
references to config files.

This PR modifies `spack env create` to create local copies in the new environment of any local
config files from relative paths in the environment manifest passed as an init file.

This PR does not change the behavior if the include is an absolute path or if the include is from
a relative path outside the environment directory, but it does warn about missing relative includes if
they are inside the environment directory.

Includes regression test and short blurb in docs.

* py-sphinx-rtd-theme: add v2.0.0, v3.0.0 (#48756)

* Add versions 2 and 3 of py-sphinx-rtd-theme.
   Allow for versions of py-sphinx greater than 6.
   Fix the Python version for older versions that depend on distutils.
   Get the py-docutils dependency from the py-sphinx recipe.
* Depend purely on the py-docutils dependency in py-sphinx.
* More refined dependency versioning.
* Fixed versioning for py-sphinx and py-docutils.

* CP2K: add 2025.1 version and DFTD4 support (#48489)

* cp2k: add dftd4 variant

* better conflict and make support

* typo

* Update var/spack/repos/builtin/packages/cp2k/package.py

* Update var/spack/repos/builtin/packages/cp2k/package.py

* oci/opener.py: respect system proxy settings (#48783)

* vtk-m: CMAKE_CXX_COMPILER is not a BOOL (#48813)

* gcc: remove --with-ld=ld-classic (#48826)

* nanotron: add new package (#48582)

* nanotron: add new package

Also, update some dependencies and add missing ones.

* Add variant +examples needed to execute example scripts

* fix: add missing branch attribute

* Remove master version

* fix: use Github hash

* embree: fix tests by building tutorial's embree_viewer for tests (#48392)

* import-check: improve how problematic imports are displayed (#48825)

The import-check action now presents problematic import statements
introduced by the PR better.

The idea is roughly:

* Let (V₁, E₁) be the graph of modules as vertices and import statements
  as edges before the change
* Let (V₂, E₂) be the graph after the code change, which is typically a small
  perturbation of (V₁, E₁).
* X₁ = FAS(V₁, E₁) is the feedback arc set before (a minimal set of edges to
  delete to make it acyclic)
* X₂ = FAS(V₂, E₂ ∖ X₁) is the feedback arc set after deletion of the minimal
  set of edges that made the old graph acyclic.
* X₃ = FAS(V₂, E₂) is the feedback arc set after

Previously I displayed X₁ and X₃ and users had to diff themselves.

Now, I'm showing X₂, which is a small set, typically directly related to
code changes.

However, it can be that a small code change adding say 2 problematic imports
creates a completely different solution X₃ that only requires deletion of just 1
different import. In that case the user is informed that they can potentially do
less work.

So for PR #48784 the output is now:

> The overall number of problematic import statements increased by 1 from 31 to 32.
> This is likely a direct consequence of the following import statements:
> 
> ```
> spack/config imports: spack.spec, spack.util.path, spack.util.remote_file_cache
> ```
> 
> However, instead of removing 3 import statements, it is sufficient to remove only 1
> import statement from the following list:
> 
> ```
> spack/concretize imports: spack.bootstrap, spack.solver.asp
> spack/environment imports: spack.bootstrap, spack.environment
> spack/fetch_strategy imports: spack.version.git_ref_lookup
> spack/install_test imports: spack.build_environment, spack.package_base
> spack/modules imports: spack.modules
> spack/platforms imports: spack.config
> spack/relocate imports: spack.bootstrap
> spack/repo imports: spack.package_base, spack.patch, spack.tag
> spack/spec imports: spack.binary_distribution, spack.compiler, spack.compilers, spack.concretize, spack.environment, spack.hash_types, spack.provider_index, spack.repo, spack.spec_parser, spack.store, spack.traverse, spack.variant, spack.version.git_ref_lookup
> spack/subprocess_context imports: spack.environment
> spack/util/gpg imports: spack.bootstrap
> spack/util/package_hash imports: spack.package_base
> spack/util/path imports: spack.config, spack.environment
> spack/util/remote_file_cache imports: spack.util.web
> ```

from which the user can figure out that
`spack/util/remote_file_cache imports: spack.util.web` is the "bottleneck" now.
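To make X₁, X₂, X₃ concrete, here is a tiny self-contained brute-force feedback arc set on a toy module graph (illustration only; the actual import-check tooling is not shown here):

```python
import itertools

def is_acyclic(nodes, edges):
    """Kahn's algorithm: a digraph is acyclic iff every node can be peeled off."""
    indegree = {n: 0 for n in nodes}
    for _, dst in edges:
        indegree[dst] += 1
    ready = [n for n, d in indegree.items() if d == 0]
    peeled = 0
    while ready:
        n = ready.pop()
        peeled += 1
        for src, dst in edges:
            if src == n:
                indegree[dst] -= 1
                if indegree[dst] == 0:
                    ready.append(dst)
    return peeled == len(nodes)

def fas(nodes, edges):
    """Minimum feedback arc set by brute force (toy graphs only)."""
    for k in range(len(edges) + 1):
        for removed in itertools.combinations(edges, k):
            if is_acyclic(nodes, [e for e in edges if e not in removed]):
                return set(removed)

V = {"config", "spec", "web"}
E1 = [("config", "spec"), ("spec", "config")]   # one cycle before the change
X1 = fas(V, E1)                                 # minimal deletions before
E2 = E1 + [("spec", "web"), ("web", "spec")]    # the PR introduces a new cycle
X2 = fas(V, [e for e in E2 if e not in X1])     # the new problems only
X3 = fas(V, E2)                                 # a from-scratch solve after
print(len(X1), len(X2), len(X3))                # 1 1 2
```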

* spack_yaml: use unambiguous variable name (#48832)

* style: fix `not in` and `is not` (#48831)

These are some changes that `ruff check --fix` would make that the current
`spack style` also agrees with.  Make the changes now so that the `ruff`
change is less disruptive.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* Set version to v1.0.0.dev0 (#48791)

* import-check: enable color output (#48842)

* berkeleygw: add -o flag to tar extraction (#48816)

when extracting as the root user, prevent tar from attempting to change file ownership

* llvm: deprecate old patch releases (#48762)

* cuda: add v12.8 (#48708)

* sirius: patch pugixml (#48841)

* ziatest: add new package (#48809)

* gitlab: remove isc stacks (#48811)

* gcc: deprecate old patch releases (#48761)

* CP2K: use libxc@7 for master/next release (#48808)

* package_base.py: remove _patches_by_hash (#48768)

* cdash: avoid build_opener (#48846)

* g4vg: new package (#48844)

* g4vg: new package

* [@spackbot] updating style on behalf of sethrj

* r-dmrcate: add v2.16.0, v3.0.0, v3.2.0 (#48158)

* r-dmrcate: add new versions
* r-dmrcate: require `r@4.3.0` for v2.99.0+
* r-dmrcate: update dependencies

* py-tensorflow: add 2.18.0-rocm-enhanced (#48711)

* py-tensorflow: add 2.18.0-rocm-enhanced

* fix style

* fix style

* fix style

* review changes

* review changes

* remove hipblaslt dependency

* remove ci changes and force ROCm 6.3.1 for newest TF

* remove rocm 6.3.1 dependency

* simplify configure fix

* rocm-examples and rocjpeg: new packages (#47695)

* new package: rocm-examples
* add new package rocjpeg and update rocm-examples for 6.3.0
* fix licenses
* add versions 6.3.1
* change homepage and git
* add f-string

* Tesseract v5.5.0 (#48866)

* leptonica: adding v1.85.0
  Signed-off-by: Shane Nehring <snehring@iastate.edu>
* tesseract: adding v5.5.0
  Signed-off-by: Shane Nehring <snehring@iastate.edu>

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>

* py-geojson: Add new package  (#48847)

* Add new package py-geojson
* fix when

* isa-l: add v2.31.1 (#48859)

* Add new package func (#48849)

* icu4c: add v75.1, v76.1 (#48858)

* Add new package py-metis (#48848)

* cppgsl: add v4.1.0 (#48864)

* libdrm: add v2.4.124 (#48860)

* amrex: add v25.02 (#48853)

* sherpa: support cxxstd=20 when=@3: (#48829)

* sherpa: support cxxstd=20 when=@3:

* hep: sherpa cxxstd=20

* netlib-scalapack: Update version (#48667)

* Update scalapack version

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* use url_for_version

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* use spec.satisfies instead of version()

---------

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* fcgi: add v2.4.3, v2.4.4 (#48856)

* update pyproject.toml for `ruff format` (#48823)

Add ruff configuration to `pyproject.toml`.

This allows `ruff format` in the Spack repository to format all the files we care about, 
with our line length of 99, the exceptions we already put in place, and excluding things
we don't auto-format, like vendored dependencies.

Right now it'll reformat 175 or so files, but only slightly, in places where `ruff` differs from
`black`. For the most part I like the ruff format decisions better than `black`, but none of
the changes seem too severe.

This does not change `spack style` -- I figure that can come later but this at least will
let people start playing with `ruff`.

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* dd4hep: add v1.31 (#48850)

* libsm: add v1.2.5 (#48862)

* Apply workaround for oneAPI compiler for upcxx problem with a template argument list (#48843)

* Fix upcxx problem with 'a template argument list is expected after a name prefixed by the template keyword'

* Revert "Fix upcxx problem with 'a template argument list is expected after a name prefixed by the template keyword'"

This reverts commit faf9b8ce85.

* Apply workaround for oneAPI compiler

* style problem resolved

* use spec.satisfies syntax

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>

* Remove ISC stacks environment files (#48851)

Follow-up to #48811

* Remove variable from cmake.py (#48824)

* Remove variable from cmake.py

#48775 left a dangling variable that was not caught in CI but by the eyes of @haampie. Restructure variable to local method.

* [@spackbot] updating style on behalf of psakievich

* Update cmake.py

* Update lib/spack/spack/build_systems/cmake.py

* Update lib/spack/spack/build_systems/cmake.py

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>

* sirius: add v7.6.2 (#48797)

Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

* gha: standalone import-check (#48873)

* ci: add codecov token secret to coverage upload job (#48880)

Codecov needs to see the token secret when uploading, so we have to
add this line to the workflow YAML:

```yaml
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
```

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* ci: bump import-check (#48883)

* pfind: new package (#48685)

* py-elevation: new package (#48836)

* spec.py: fix hash change due to None vs {} (#48854)

* Fix hash change due to None vs {}

* Enforce null for empty list of external_modules

* nnn: new package (#46174)

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>

* Fix esmf usage, add new version (#48835)

* bash: add autotools dependencies (#48874)

* Scotch: add v7.0.6, add testing option (#48781)

* code-server: update to v4.96.4 (#48828)

* acts: add v39.0.0 (#48839)

This commit adds version 39.0.0 of the ACTS package which, as far as I
can tell, doesn't require any dependency updates.

* spec.py: ensure spec.extra_attributes is {} if is null in json (#48896)

* harfbuzz: add v10.2.0 (#48857)

* libice: add v1.1.2 (#48861)

* relocate.py: don't warn about symlinks (#48904)

`relocate_links` warns when the target is absolute and not matched by
any prefix from the prefix to prefix map.

This can lead to false positives, because the prefix-to-prefix map does
not contain trivial/identity entries whenever a package is installed to
its original location.

Since relocate_links is the odd one out there (we don't warn about
similar issues with rpaths, etc), just remove the warning.

* hip-tests: new package (#47273)

* hip-tests: add new package
* remove hip-tests from hip recipe
* remove old versions
* fix style
* add missing import
* bump hip-tests to 6.3.1
* fix style

* flecsi: new version 2.3.1 (#48867)

* flecsi: add new version 2.3.1, remove develop
* flecsi: remove kokkos and openmp variants moving forward
* flecsi: propagate cuda and rocm settings from kokkos
* Update var/spack/repos/builtin/packages/flecsi/package.py
   Co-authored-by: Davis Herring <herring@lanl.gov>
* flecsi: remove redundant depends_on lines
* flecsi: correct legion dependency
* flecsi: deprecate v2.0.0 and v2.1.0
* flecsi: force +openmp if ^kokkos+openmp

---------

Co-authored-by: Davis Herring <herring@lanl.gov>

* Update py-arch, py-statsmodels (add 0.14.1), py-patsy (add 0.5.4) to be able to use py-cython@3 (#48769)

* Add py-patsy@0.5.4
* Correct py-numpy dependency in py-arch
* Add py-statsmodels@0.14.1 and update dependencies
* Add climbfuji as maintainer for py-patsy
* Add climbfuji as maintainer for py-statsmodels
* Update var/spack/repos/builtin/packages/py-statsmodels/package.py

* enzyme: add v0.0.172 (#48881)

* spec.py: ensure == is false if true modulo precomputed dag hash (#48889)

* llvm: fix @15 %apple-clang@16 (#48887)

* spiner: add v1.6.3 (#48871)

* spiner: update package logic

* singularity-eos: remove spiner cuda_arch propagation

* spiner: add version 1.6.3

* sherpa: +hepmc3root only when +root (#48827)

* sherpa: +hepmc3root only when +root

* sherpa: fix style

* salt: add v0.3.0 (#48877)

* salt: Add v0.3.0 of SALT

This version contains important bug fixes for building and parsing
projects containing Fortran

* salt: Be more explicit about dependency types

 - llvm+clang+flang is needed at build, link and runtime for the
   correct operation of SALT
 - Testing with llvm@master ( llvm > 19.x ) shows that SALT is
   currently incompatible with the latest llvm API so an updated salt
   will be required when LLVM 20 is released

* openturbine: add new package (#48683)

* PyTorch: add v2.6.0 (#48794)

* mummer4: patching to allow building with %gcc@13: (#38292)

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* fms: add 2025.01, 2024.03 (#48812)

* py-shapely: add v2.0.7 (#48810)

* spectre: add v2025.01.30 (#48803)

Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>

* py-numba: Add version 0.61  (#48837)

* amdfftw: fix broken build, adjust flags for performance tuning (#48754)

With CFLAGS, the code path in the amdfftw build system will bypass the logic around AMD_ARCH.

---------

Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>

* justbuild: add v1.4.3 (#48898)

* nvpl-blas, nvpl-lapack: add v0.4.0.1, v0.3.0 (#48901)

* hep stack: build also with cuda and rocm where possible (#48528)

* lcio: add v2.22.4 (#48895)

* Add a message for CMake incremental build (#48905)

* Add a message for CMake incremental build

Requested message to explain CMake phase is getting skipped.

* [@spackbot] updating style on behalf of psakievich

* Update import

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>

* dcap: depends_on libxcrypt (#48903)

* Add new version of r-curl (#48912)

* pika: Add 0.32.0 (#48897)

* icu4c: no cxxstd flag option on Windows (#48510)

* ICU4C: Don't reference a spec variant on a platform on which it's not defined

* icu4c: no cxx flag on Windows

* dla-future-fortran: add v0.3.0 (#48900)

* simgrid: add v3.36 (#48909)

* dyninst: cleanup package (#47637)

* Use more idiomatic construct, shorten recipe
* Remove deprecated versions, and associated patches
* Remove v10.0.0

* Windows: Update default config for stage location (#48511)

Current location is within the Spack prefix, which causes builds to
pollute VCS with stage artifacts and generally inflates the Spack
install prefix.

This PR moves it to the user cache location now that we can
consistently support paths with spaces on Windows.

* py-maturin: add v1.8.2 and refined dependencies (#48915)

* clingo-bootstrap: fix +optimized build (#48931)

* fix regression `apple-clang` vs `%apple-clang`
* use f-strings
* remove --verbose flag from LDFLAGS

* Fix regression due to dyninst update (#48935)

* trexio: fix issues with autotools build system (#48923)

* package_base.py: remove use_cray_compiler_names (#48932)

* Apply workarounds for oneAPI compiler for ascent problem with build (#48918)

* Apply workarounds for oneAPI compiler for ascent problem with build

* Use the way with use patch through the PR address

* stylecheck - missing comma

* libgcrypt: fix enforced -O0 (#48940)

Signed-off-by: Shane Nehring <snehring@iastate.edu>
Co-authored-by: Shane Nehring <snehring@iastate.edu>

* serialbox: add version 2.6.2 (#48937)

* nwchem: add master (#48919)

* Add possibility to build nwchem from master branch

* add oneapi@2025: patch for @7.2.3

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>

* Python: add new versions (#48950)

* Python: add new versions

* black

* reframe: add v4.6.4 -> v4.7.2 (#48242)

* go: add v1.23.6 (#48955)

* qmcpack: add v4.0.0 (#48921)

* py-einops: add v0.8.1 (#48954)

* flux-sched: add v0.42.1 (#48952)

Co-authored-by: github-actions <github-actions@users.noreply.github.com>

* Quantum ESPRESSO: add v7.4.1 (#48949)

* duckdb: add v1.2.0 (#48902)

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* mapl: add v2.53.1, v2.54.1 (#48944)

* log.py: remove setenv calls (#48933)

* nim: add v2.2.2 (#48929)

* Update GFE packages (#48899)

* OpenMPI: add version 4.1.8 (#48922)

Signed-off-by: Howard Pritchard <howardp@lanl.gov>

* extrae: tighten dependencies on boost for +dyninst (#48938)

* py-iterative-stats: add 0.1.1 (#48959)

* import-check: bump (#48968)

* ports-of-call: add v1.6.0, v1.7.0, v1.7.1 (#48870)

* acts dependencies: new versions as of 2025/02/10 (#48969)

This commit adds detray v0.88.0 and GeoModel v6.9.0.

* cbtf-krell: Update Boost dependency (#47133)

* Update Boost
* Add gotcha
* Add patch for build errors
* Allow building with latest Dyninst
* Fix patch url

* lua-sol2: Apply workaround for oneAPI compiler for problem with build (#48920)

* Bump up the version for rocm-6.3.2 release (#48787)

* Bump up the version for rocm-6.3.2 release

* rocm-openmp-extras update and style correction

* Updating mivisionx, omniperf, rccl & rocprofiler-systems

* Updating hipsparselt & rocm-opencl

* rocprofiler-systems on gcc-13 and rvs commit instead of patch

* Updated rocjpeg & rocm-examples for 6.3.2

* ROCPROFSYS_BUILD_DYNINST & DYNINST_BUILD_TBB are required only with gcc-13

---------

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>

* spack compiler find: detect `flang-new` and `flang` in newer LLVM versions (#48914)

* rivet: patch missing header in 3.1.10 (#48977)

* concretizer: reduce search space with static analysis (#48729)

Currently, when we set up the ASP problem for `clingo`, we don't take the configuration into account. This results in ASP problems that are larger than necessary, with possibly redundant information and higher concretization times.

This PR tries to improve things by adding an opt-in feature that computes the _possible dependencies_ of a solve while also taking the current configuration into account, and avoids adding possible dependencies that we are certain can't be in the final solution.

The feature can be activated with:
```yaml
concretizer:
  static_analysis: true
```

Examples of simple rules to discard dependencies are:
- Dependencies that are not buildable, and for which no binary is present (e.g. `cray-mpich` etc. on non Cray systems)
- Dependencies that are not for the current platform (e.g. `msmpi` on non Windows platforms)
- Conditional dependencies that cannot be activated, because of some user requirement (e.g. `cuda` etc. if the user requires `~cuda` in configuration)
- Virtual providers that cannot be used, because of a requirement on a virtual

The speed-up these rules give depends on the use case at hand, but if the configuration is updated properly, it is noticeable.

In cases where no rule excludes packages upfront, reuse is active, and this option is enabled, a minor slowdown is possible; the feature has therefore been added as opt-in, so it's turned off by default.

* spack.util.elf: catch seek errors (#48972)

* hep: rivet: require hepmc=3 (#48976)

* Fix performance issue on macOS (#48997)

archspec.cpu.host() is not memoized, so compute
it as less as possible.

---------

Co-authored-by: alalazo <alalazo@users.noreply.github.com>

* style.py: fix false negative in redundant import statements (#48980)

* PyTorch: build flash attention by default, except in CI (#48521)

* PyTorch: build flash attention by default, except in CI

* Variant is boolean, only available when +cuda/+rocm

* desc -> _desc

* kokkos et al. : don't monkeypatch spec in callbacks (#48916)

Currently, a few packages using kokkos rely on
kokkos itself monkeypatching its own spec to
provide some attribute.

In this commit we change this attribute to be
defined on the package, and never be monkeypatched.

* gmake: add empty libs property, remove link deptypes from dependents (#48995)

* package_hash.py: move metadata_attrs inline out of package_base (#48981)

* gmake: fix def libs/headers (#49009)

* Spec.package_class -> spack.repo.PATH.get_pkg_class (#48985)

* libfabric: use the class variable to get the list of fabrics (#49007)

Suggested by: alalazo <alalazo@users.noreply.github.com>

Signed-off-by: Justin Cook <jscook@lbl.gov>

* py-transformers: add new versions (#49000)

* py-transformers: add new versions

* py-tokenizers: add new versions

* Apply suggestions from code review

* lcio: Add latest 2.22.5 tag (#48991)

* concretize.lp: don't warn about deprecation when external (#49008)

* Spec.is_virtual -> spack.repo.PATH.is_virtual (#48986)

* g4vg: add 'develop' branch (#49003)

* g4vg: add develop version

* celeritas: add develop version

* Fix style

* REVERTME: move celeritas changes to another branch

* Get full repo

* remove unneeded variable

* Remove spack.repo.PATH.is_virtual call from SpecBuildInterface.(#48984)

This PR is effectively a breaking change extracted from #45189. It removes
support for spec["mpi"] when spec itself is openmpi / mpich (i.e., a package that
could itself provide mpi); from the Spec instance alone we don't know any parent
it provides mpi to, hence it is now a KeyError.

* Spec.validate_detection -> spack.detection.path.validate_detection (#48987)

* hep: add missing language dependencies (#48963)

* highfive: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* lhapdf: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* vc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989140

* davix: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14989131

* pandorasdk: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989130

* veccore: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989118

* pythia6: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989116

* jwt-cpp: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* collier: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* hepmc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989112

* clhep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989075

* fastjet: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14981340

* gosam-contrib: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14978873

* thepeg: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997553

* cepgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* podio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* pandoramonitoring: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* lcio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997513

* geant4: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997202

* evtgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996817

* apfel: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996779

* collier: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14996770

* vecgeom: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003840

* dd4hep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003839

* opendatadetector: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007666

* acts: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007827

* hepmc: remove dependency on fortran

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* thepeg: remove fortran dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* acts: add a conditional build dependency

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* opendatadetector: add comment to explain C dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* apptainer: get_full_repo for branch main (#49002)

* unifyfs: Apply workaround for oneAPI compiler for problem with build (#48962)

* solver: add type-hints to OutputConfiguration (#48979)

* CI: ensure file path comparison uses posix paths (#47033)

Git always produces posix paths; ensure we're comparing apples to apples by normalizing paths to posix before comparing them with git output.

* postgresql: add v17.2 (#47811)

* postgresql: add version 17.2

* postgresql: install flex, bison and perl when building versions 17 and up

* postgresql: do not install perl by default when building versions 17 and up

* cray-mpich: adding partial GTL support (#45830)

cray-mpich now has a rocm variant. You can use gtl_lib in the
flag_handler like so:

```python
    def flag_handler(self, name, flags):
        wrapper_flags = []
        environment_flags = []
        build_system_flags = []

        if self.spec.satisfies("+rocm"):
            if self.spec.satisfies("^cray-mpich"):
                gtl_lib = self.spec["cray-mpich"].package.gtl_lib
                build_system_flags.extend(gtl_lib.get(name) or [])
            # hipcc is not wrapped, we need to pass the flags via the
            # build system.
            build_system_flags.extend(flags)

        return (wrapper_flags, environment_flags, build_system_flags)
```

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>

* lammps: use the Cray GTL (#46090)

* lammps: add 20241119 and 20250204 releases (#48978)

* plog: add new package (#48975)

* binutils: add debuginfod variant + update deps (#49011)

* h5hut: Remove H5_USE_110_API for newer versions (#48885)

* h5hut: Remove H5_USE_110_API for newer versions

* h5hut: style reformat and add maintainer

* h5hut: correct version syntax for <v2.x.x

* h5hut: Bump default version to 2.0.0rc7 and remove older rc candidates

* Update var/spack/repos/builtin/packages/h5hut/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

---------

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* spack debug create-db-tarball: remove after test failures (#49025)

* bubblewrap: add versions up to v0.11.0 (#49023)

* gftl: add v1.15.2 (#48992)

* py-torchgeo: pyvista dep has been removed (#48990)

* py-memray: add v1.15.0 (#48989)

* yosys: add v0.50 (#48983)

* laszip: Add version 3.4.4. (#48982)

* py-numpy: add v2.2.3 (#49029)

* pandora{pfa,sdk,monitoring}: add new versions and allow setting the C++ standard (#48300)

* pandoramonitoring: add v3.6.0; pandorapfa: add v4.11.2

Remove variables that are not being used in pandorasdk. Use the C++ standard
from ROOT when possible and pass -Wno-error to override the -Werror that will
typically fail with a new standard. Add a cxxstd variant for pandorasdk

* Fix style

* Update var/spack/repos/builtin/packages/pandorasdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix style

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* easi@1.5.2 (#49013)

* Put in maintainers for Kokkos Tools Spack package  (#49018)

* Kokkos Tools package.py: fix maintainers
* Kokkos Tools package.py: remove white space between first and second maintainer in comma-separated list
* [@spackbot] updating style on behalf of vlkale
* Correct maintainers syntax
   Co-authored-by: Richard Berger <richard.berger@outlook.com>

---------

Co-authored-by: vlkale <vlkale@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>

* zlib package: Ensure correct lib search on Windows (#48512)

* Name of zlib's library differs on Windows; also account for name
  differing when building +shared
* `zlib`'s `.libs` implementation was searching for the runtime
  libraries (the .dlls) and should be searching for link-time libs

* Update openfast, amr-wind, and nalu-wind packages (#48994)

* acts: conflicts ^geant4@11.3: when @:35 (#49028)

* yq: add versions 4.44.5 and 4.44.6 and 4.45.1 (#49027)

* openblas: .libs() uses self.libraries attribute (#48942)

Currently this is hardcoded to the same value as listed in the class
definition. If one ever overrides this attribute, such as:

```
packages:
  openblas:
    package_attributes:
      libraries: ['libopenblaso64']
```

this patch makes sure that the override is also honored in the
`spec['openblas'].libs()` call (which happens in `hypre`, and likely
others).

( see
https://spack.readthedocs.io/en/latest/packages_yaml.html#assigning-package-attributes
)

Thanks to becker33 for debugging help in Slack

* RepoSplit/tests: update repo tests relying on builtin package repo to only use mock repos (#48926)

* RepoSplit/tests: update repo tests relying on builtin

* test_repo_last_mtime: skip on windows due to mtime issues in CI

* MesonPackage: depends_on pkgconfig (#46955)

meson's `dependency` function often uses pkg-config to locate a dependency, and may even fall back to cmake. The former case is very common, and since packagers often forget to add the tiny pkgconfig package as a build dep, we do it for them.
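Concretely, packagers previously had to remember a line like the following in every Meson-based recipe; after this change the base class supplies it (hypothetical example package, not an actual recipe):

```python
from spack.package import *

class Example(MesonPackage):
    """Hypothetical Meson-based package."""
    # Now implied by MesonPackage itself, no longer needed per package:
    depends_on("pkgconfig", type="build")
```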

* Spec.__getitem__: restrict to direct deps + transitive runtime deps (#49016)

With this change spec["pkg"] searches only direct dependencies and transitive link/run
dependencies, ordered by depth. This avoids situations where we pick up unwanted 
deps of build/test deps. 

To reach those, you need to do spec["build_dep"]["pkg"] explicitly.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* views: normalize paths on case insensitive file systems (#47370)

On macOS, prefix_a/file and prefix_b/FILE map to the same path in the view (view/file or view/FILE).

This commit ensures that we test whether a view is created on a case insensitive filesystem and handle projection conflicts accordingly.

* Allow tuning max_dupes for build dependencies (#48948)

Up to now, Spack was allowing all build-tools that
may appear in the DAG to have 2 max_dupes.

This is not needed in practice for most of them,
and adding them out of caution just increases
grounding and concretization time.

This PR makes the value of max_dupes configurable
per package, and sets only a few known packages to
2 max_dupes by default.

In case a user needs different values, they can
tune the configuration for their use case.

* vtk: fix 9.4.1 concretization (#48946)

* seacas: conflict 2024-06-27 with windows

* vtk: fix 9.4.1 seacas dependency

* cairo: add new version and update build system (#48822)

* update cairo for new meson build system

* update patch range. remove old conflict

* style

* update pango to reflect the changes in cairo

* refine depends

* style

* add lzo depends

* add +shared

* non self-referential variant requirement

* style

* Move +shared variant back to just autotools as meson automatically handles it

* clarify patch when=

* update based on reviews. switch from conflicts to requires to enforce variant synchronization

* refine conflicts and requires

* better group build deps together

* comment for meson build lower version bound

* clarifying comments

* clarify version ranges, enforce build_system with version ranges

* style

* cairo:  no need to require for build_systems

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* fzf: add v0.60.0, improve styling (#49059)

* fd: improve documentation and styling to help newer maintainers (#49058)

* direnv: add master, fix up package for better documentation (#49053)

* GDAL: add v3.10.2 (#49042)

* pbwt: new package (#49055)

* pbwt: add v2.1, v2.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* py-xarray-regrid: Add new package (#48834)

* Add py-xarray-regrid and required dep flox

* remove boiler plate

* Add missing py310 dep

* py-flox, py-xarray-regrid: add type=("build", "run") to python dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* build(deps): bump isort in /.github/workflows/requirements/style (#48746)

Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* build(deps): bump black from 24.10.0 to 25.1.0 in /lib/spack/docs (#48780)

Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* surfer: new package (#48432)

* Add waveform viewer, surfer for RTL simulations

* Ran black over the code following style check failure

* build(deps): bump black in /.github/workflows/requirements/style (#48779)

Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* new package: jujutsu (#48231)

* lis: add v2.0.28 -> v2.1.7 (#48308)

* Added LIS 2.1.7

* Added LIS versions from 2.0.28 to 2.1.7

* apply black v25.1.0 (#49076)

* build(deps): bump isort from 5.13.2 to 6.0.0 in /lib/spack/docs (#48747)

Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Spec.__contains__: restrict to direct build and transitive runtime deps (#49072)

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* mochi-margo/mochi-thallium: new versions (#49037)

* mochi-margo/mochi-thallium: new versions

* mochi-thallium: fixing style

* mochi-thallium: fixing required dependency on mochi-margo versions

* Add new recipe aotriton for rocm. (#49038)

* add new recipe aotriton for rocm, used for py-torch

* update the git info

* fix style error

* fix style error

* fix style error

* address review comments

* fix style error

* update maintainers (#48295)

* acts dependencies: new versions as of 2025/02/17 (#49073)

This commit adds detray v0.88.1, covfie v0.12.0 and v0.12.1, as well as
ACTS v37.1.0.

* magma: remove cuda_arch constraint on 2.9.0+ (#49019)

* plsm: new package (#48875)

* Adding plsm package

* [@spackbot] updating style on behalf of PhilipFackler

* Removing redundant text

* Add description

* Add blank line

* Add cuda_arch and update int64 handling

* gnutls: add v3.8.9 (#49062)

* gnutls: add v3.8.9

* gnutls: address super small nitpick

* gnutls: fix git url

* qemacs: add v6.4.1, fix +doc (#48722)

* py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies (#47926)

* py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies

* py-importlib-resources: add v5.13 to 6.4

* Revert "py-importlib-resources: add v5.13 to 6.4"

This reverts commit 1df208874c799b99dcfc43f13ae85f9324c59b52.

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Signed-off-by: Justin Cook <jscook@lbl.gov>
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Etienne Ndamlabin <88906611+endamlabin@users.noreply.github.com>
Co-authored-by: Etienne Ndamlabin <jean-etienne.ndamlabin-mboula@inria.fr>
Co-authored-by: Sreenivasa Murthy Kolam <sreenivasamurthy.kolam@amd.com>
Co-authored-by: Thomas Bouvier <contact@thomas-bouvier.io>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: G-Ragghianti <33492707+G-Ragghianti@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: David Schneller <12698011+davschneller@users.noreply.github.com>
Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: snehring <7978778+snehring@users.noreply.github.com>
Co-authored-by: Buldram <buldram@proton.me>
Co-authored-by: jmuddnv <143751186+jmuddnv@users.noreply.github.com>
Co-authored-by: Thomas-Ulrich <ulrich@geophysik.uni-muenchen.de>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Filippo Spiga <spiga.filippo@gmail.com>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Richard Berger <rberger@lanl.gov>
Co-authored-by: Jonathon Anderson <17242663+blue42u@users.noreply.github.com>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Eric Berquist <727571+berquist@users.noreply.github.com>
Co-authored-by: wspear <wspear@cs.uoregon.edu>
Co-authored-by: wspear <wspear@users.noreply.github.com>
Co-authored-by: Alec Scott <hi@alecbcs.com>
Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Chris Marsh <chrismarsh.c2@gmail.com>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>
Co-authored-by: psakievich <psakiev@sandia.gov>
Co-authored-by: Brian Spilner <Try2Code@users.noreply.github.com>
Co-authored-by: Dominic Hofer <6570912+dominichofer@users.noreply.github.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: danielsjensen1 <dsjense@sandia.gov>
Co-authored-by: Till Ehrengruber <till.ehrengruber@cscs.ch>
Co-authored-by: Henri Menke <henri@henrimenke.de>
Co-authored-by: pauleonix <paul.grosse-bley@ziti.uni-heidelberg.de>
Co-authored-by: Mosè Giordano <765740+giordano@users.noreply.github.com>
Co-authored-by: Zack Galbreath <zack.galbreath@kitware.com>
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Weiqun Zhang <WeiqunZhang@lbl.gov>
Co-authored-by: Taillefumier Mathieu <29380261+mtaillefumier@users.noreply.github.com>
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Piotr Sacharuk <107190444+PiotrSacharuk@users.noreply.github.com>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: psakievich <psakievich@users.noreply.github.com>
Co-authored-by: rfbgo <109985755+rfbgo@users.noreply.github.com>
Co-authored-by: Felix Thaler <thaler@cscs.ch>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: japlews <22622327+japlews@users.noreply.github.com>
Co-authored-by: George Young <A-N-Other@users.noreply.github.com>
Co-authored-by: Stephen Nicholas Swatman <stephen@v25.nl>
Co-authored-by: Davis Herring <herring@lanl.gov>
Co-authored-by: Dom Heinzeller <dom.heinzeller@icloud.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Izaak "Zaak" Beekman <contact@izaakbeekman.com>
Co-authored-by: ddement <ddement@gatech.edu>
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Matt Thompson <matthew.thompson@nasa.gov>
Co-authored-by: SXS Bot <31972027+sxs-bot@users.noreply.github.com>
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
Co-authored-by: AMD Toolchain Support <73240730+amd-toolchain-support@users.noreply.github.com>
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
Co-authored-by: Alberto Sartori <alberto.sartori@huawei.com>
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
Co-authored-by: Thomas Madlener <thomas.madlener@desy.de>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Vinícius <viniciusvgp@gmail.com>
Co-authored-by: Teague Sterling <teaguesterling@users.noreply.github.com>
Co-authored-by: Shane Nehring <snehring@iastate.edu>
Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>
Co-authored-by: Robert Mijakovic <robert.mijakovic@gmail.com>
Co-authored-by: Paul <bryantpj@ornl.gov>
Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
Co-authored-by: Vanessasaurus <814322+vsoch@users.noreply.github.com>
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: Howard Pritchard <howardp@lanl.gov>
Co-authored-by: jgraciahlrs <gracia@hlrs.de>
Co-authored-by: Fernando Ayats <ayatsfer@gmail.com>
Co-authored-by: Tim Haines <thaines.astro@gmail.com>
Co-authored-by: renjithravindrankannath <94420380+renjithravindrankannath@users.noreply.github.com>
Co-authored-by: Peter Scheibel <scheibel1@llnl.gov>
Co-authored-by: alalazo <alalazo@users.noreply.github.com>
Co-authored-by: Tara Drwenski <tdrwenski@users.noreply.github.com>
Co-authored-by: Justin Cook <jscook@lbl.gov>
Co-authored-by: jean-francois-sa <jleblancrichard@simplyanalytics.com>
Co-authored-by: etiennemlb <eti.malaboeuf@gmail.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>
Co-authored-by: Dmitri Smirnov <dmixsmi@gmail.com>
Co-authored-by: John Biddiscombe <biddisco@cscs.ch>
Co-authored-by: Dave Keeshan <96727608+davekeeshan@users.noreply.github.com>
Co-authored-by: Rémi Lacroix <remi.lacroix@idris.fr>
Co-authored-by: Vivek Kale <11766050+vlkale@users.noreply.github.com>
Co-authored-by: vlkale <vlkale@users.noreply.github.com>
Co-authored-by: Marc T. Henry de Frahan <marc.henrydefrahan@nrel.gov>
Co-authored-by: Robert Maaskant <RobertMaaskant@users.noreply.github.com>
Co-authored-by: Matthew Lesko <matthew.w.lesko@nasa.gov>
Co-authored-by: Paul Gessinger <hello@paulgessinger.com>
Co-authored-by: Vicente Bolea <vicente.bolea@kitware.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pranav Sivaraman <pranavsivaraman@gmail.com>
Co-authored-by: Filippo Barbari <filippo.barbari@gmail.com>
Co-authored-by: Matthieu Dorier <mdorier@anl.gov>
Co-authored-by: Nai-Yuan Chiang <sorakid507@gmail.com>
Co-authored-by: Cameron Rutherford <rcamruzz@amazon.com>
Co-authored-by: Philip Fackler <49726797+PhilipFackler@users.noreply.github.com>
Co-authored-by: dmagdavector <david.magda@vectorinstitute.ai>
2025-02-17 15:17:31 -07:00
dmagdavector
5356469ba5 py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies (#47926)
* py-fastjsonschema: add 2.17 to 2.21.1; note python dependencies

* py-importlib-resources: add v5.13 to 6.4

* Revert "py-importlib-resources: add v5.13 to 6.4"

This reverts commit 1df208874c799b99dcfc43f13ae85f9324c59b52.
2025-02-17 13:17:31 -07:00
Buldram
605c3de633 qemacs: add v6.4.1, fix +doc (#48722) 2025-02-17 12:39:43 -06:00
Wouter Deconinck
45c4446b90 gnutls: add v3.8.9 (#49062)
* gnutls: add v3.8.9

* gnutls: address super small nitpick

* gnutls: fix git url
2025-02-17 09:44:00 -08:00
Philip Fackler
4ba6407cb8 plsm: new package (#48875)
* Adding plsm package

* [@spackbot] updating style on behalf of PhilipFackler

* Removing redundant text

* Add description

* Add blank line

* Add cuda_arch and update int64 handling
2025-02-17 12:20:29 -05:00
Cameron Rutherford
c221635c79 magma: remove cuda_arch constraint on 2.9.0+ (#49019) 2025-02-17 17:49:49 +01:00
Stephen Nicholas Swatman
46ff553ec2 acts dependencies: new versions as of 2025/02/17 (#49073)
This commit adds detray v0.88.1, covfie v0.12.0 and v0.12.1, as well as
ACTS v37.1.0.
2025-02-17 08:58:09 -06:00
Nai-Yuan Chiang
fcc85adc7f update maintainers (#48295) 2025-02-17 08:21:52 -06:00
Sreenivasa Murthy Kolam
da1ac0fdd4 Add new recipe aotriton for rocm. (#49038)
* add new recipe aotriton for rocm, used for py-torch

* update the git info

* fix style error

* fix style error

* fix style error

* address review comments

* fix style error
2025-02-17 08:20:36 -06:00
Matthieu Dorier
0accf26472 mochi-margo/mochi-thallium: new versions (#49037)
* mochi-margo/mochi-thallium: new versions

* mochi-thallium: fixing style

* mochi-thallium: fixing required dependency on mochi-margo versions
2025-02-17 08:18:22 -06:00
Harmen Stoppels
545750873e Spec.__contains__: restrict to direct build and transitive runtime deps (#49072)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-17 13:57:16 +01:00
dependabot[bot]
7d4523a9fc build(deps): bump isort from 5.13.2 to 6.0.0 in /lib/spack/docs (#48747)
Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 12:05:41 +01:00
Harmen Stoppels
754a64d1fe apply black v25.1.0 (#49076) 2025-02-17 11:42:12 +01:00
Filippo Barbari
b11578ed7c lis: add v2.0.28 -> v2.1.7 (#48308)
* Added LIS 2.1.7

* Added LIS versions from 2.0.28 to 2.1.7
2025-02-16 19:07:39 -07:00
Pranav Sivaraman
c80dcd8f84 new package: jujutsu (#48231) 2025-02-16 19:41:23 -06:00
dependabot[bot]
aaaf4477c9 build(deps): bump black in /.github/workflows/requirements/style (#48779)
Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-16 17:42:46 -07:00
Dave Keeshan
27f123efad surfer: new package (#48432)
* Add waveform viewer, surfer for RTL simulations

* Ran black over the code following style check failure
2025-02-16 17:54:18 -06:00
dependabot[bot]
2b52639032 build(deps): bump black from 24.10.0 to 25.1.0 in /lib/spack/docs (#48780)
Bumps [black](https://github.com/psf/black) from 24.10.0 to 25.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.10.0...25.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-16 16:37:36 -07:00
dependabot[bot]
a472adf2cb build(deps): bump isort in /.github/workflows/requirements/style (#48746)
Bumps [isort](https://github.com/pycqa/isort) from 5.13.2 to 6.0.0.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.13.2...6.0.0)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-16 16:27:35 -07:00
Chris Marsh
79972d7b57 py-xarray-regrid: Add new package (#48834)
* Add py-xarray-regrid and required dep flox

* remove boiler plate

* Add missing py310 dep

* py-flox, py-xarray-regrid: add type=("build", "run") to python dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-16 12:22:39 -07:00
Teague Sterling
0ffb61e215 pbwt: new package (#49055)
* pbwt: add v2.1, v2.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-16 12:08:50 -06:00
Adam J. Stewart
cdd261b63f GDAL: add v3.10.2 (#49042) 2025-02-16 11:48:30 -06:00
Alec Scott
900574ddb3 direnv: add master, fix up package for better documentation (#49053) 2025-02-16 11:35:47 -06:00
Alec Scott
6bc4af11f4 fd: improve documentation and styling to help newer maintainers (#49058) 2025-02-16 11:16:00 -06:00
Alec Scott
6d35a75c4f fzf: add v0.60.0, improve styling (#49059) 2025-02-16 11:13:53 -06:00
Chris Marsh
7e65c57861 cairo: add new version and update build system (#48822)
* update cairo for new meson build system

* update patch range. remove old conflict

* style

* update pango to reflect the changes in cairo

* refine depends

* style

* add lzo depends

* add +shared

* non self-referential variant requirement

* style

* Move +shared variant back to just autotools as meson automatically handles it

* clarify patch when=

* update based on reviews. switch from conflicts to requires to enforce variant synchronization

* refine conflicts and requires

* better group build deps together

* comment for meson build lower version bound

* clarifying comments

* clarify version ranges, enforce build_system with version ranges

* style

* cairo:  no need to require for build_systems

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-15 18:37:43 -07:00
Vicente Bolea
9213bf5919 vtk: fix 9.4.1 concretization (#48946)
* seacas: conflict 2024-06-27 with windows

* vtk: fix 9.4.1 seacas dependency
2025-02-14 16:14:04 -06:00
Massimiliano Culpo
ccd205bfeb Allow tuning max_dupes for build dependencies (#48948)
Up to now, Spack was allowing all build-tools that
may appear in the DAG to have 2 max_dupes.

This is not needed in practice for most of them,
and adding them out of caution just increases
grounding and concretization time.

This PR makes the value of max_dupes configurable
per package, and sets only a few known packages to
2 max_dupes by default.

In case a user needs different values, they can
tune the configuration for their use case.
2025-02-14 14:25:12 +01:00
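An illustrative configuration for this knob (a sketch; the `concretizer:duplicates:max_dupes` key layout is inferred from this PR's description, so treat it as an assumption):

```yaml
# Hypothetical concretizer config: allow up to two copies of cmake in a
# single DAG, leaving all other packages at the default.
concretizer:
  duplicates:
    max_dupes:
      cmake: 2
```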
Paul Gessinger
114bd5744f views: normalize paths on case insensitive file systems (#47370)
On macOS, prefix_a/file and prefix_b/FILE map to the same path in the view (view/file or view/FILE).

This commit ensures that we test whether a view is created on a case insensitive filesystem and handle projection conflicts accordingly.
2025-02-14 09:35:40 +01:00
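A minimal sketch of how such a check can work (a hypothetical helper, not Spack's actual implementation):

```python
# Probe whether the filesystem at `path` is case insensitive: create a
# lowercase-prefixed temporary file and check whether its uppercased
# name resolves to an existing file.
import os
import tempfile


def is_case_insensitive(path: str) -> bool:
    with tempfile.NamedTemporaryFile(prefix="case_probe_", dir=path) as f:
        upper = os.path.join(path, os.path.basename(f.name).upper())
        return os.path.exists(upper)
```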
Harmen Stoppels
8ef5f1027a Spec.__getitem__: restrict to direct deps + transitive runtime deps (#49016)
With this change spec["pkg"] searches only direct dependencies and transitive link/run
dependencies, ordered by depth. This avoids situations where we pick up unwanted 
deps of build/test deps. 

To reach those, you need to do spec["build_dep"]["pkg"] explicitly.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-14 08:45:50 +01:00
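An illustrative pseudo-session of the new lookup rules, assuming a hypothetical package `foo` with a link dependency `zlib` and a build-only dependency `cmake` that in turn depends on `curl`:

```python
spec = ...  # a concrete Spec for foo, e.g. taken from an environment

spec["zlib"]           # found: direct link dependency
spec["cmake"]          # found: direct (build) dependency
spec["curl"]           # KeyError: only reachable through a build dep
spec["cmake"]["curl"]  # found: the explicit two-step lookup described above
```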
Wouter Deconinck
36bc53ee07 MesonPackage: depends_on pkgconfig (#46955)
meson's `dependency` function often uses pkg-config to locate a dependency, and may even fall back to cmake. The former case is very common, and since packagers often forget to add the tiny pkgconfig package as a build dep, we do it for them.
2025-02-14 08:36:16 +01:00
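In recipe terms, the effect is roughly as if every Meson-based package carried the build dependency below (a sketch with a hypothetical package; the real declaration lives on the `MesonPackage` base class, so individual recipes no longer need it):

```python
from spack.package import *


class Mypkg(MesonPackage):
    """Hypothetical package built with meson."""

    homepage = "https://example.com"  # placeholder
    url = "https://example.com/mypkg-1.0.tar.gz"  # placeholder

    # now implied by MesonPackage itself, shown here for illustration
    depends_on("pkgconfig", type="build")
```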
Tamara Dahlgren
236b8fc009 RepoSplit/tests: update repo tests relying on builtin package repo to only use mock repos (#48926)
* RepoSplit/tests: update repo tests relying on builtin

* test_repo_last_mtime: skip on windows due to mtime issues in CI
2025-02-13 21:49:50 -08:00
Matthew Lesko
1a42bf043f openblas: .libs() uses self.libraries attribute (#48942)
Currently this is hardcoded to the same value as listed in the class
definition. If one ever overrides this attribute, such as:

```
packages:
  openblas:
    package_attributes:
      libraries: ['libopenblaso64']
```

this patch makes sure that the override is also honored in the
`spec['openblas'].libs()` call (which happens in `hypre`, and likely
others).

(See
https://spack.readthedocs.io/en/latest/packages_yaml.html#assigning-package-attributes)

Thanks to becker33 for debugging help in Slack
2025-02-13 21:44:12 -08:00
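A minimal sketch (not the actual openblas recipe) of a `libs` property that honors `self.libraries`, so a `package_attributes` override from packages.yaml is reflected in `spec["openblas"].libs`:

```python
from spack.package import *


class Openblas(MakefilePackage):
    # class-level default; may be overridden via package_attributes
    libraries = ["libopenblas"]

    @property
    def libs(self):
        # search for whatever self.libraries currently holds, rather
        # than a hardcoded library name
        return find_libraries(self.libraries, root=self.prefix, recursive=True)
```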
Robert Maaskant
87cc3280b6 yq: add versions 4.44.5 and 4.44.6 and 4.45.1 (#49027) 2025-02-13 19:48:42 -07:00
Wouter Deconinck
7dc75d5f8c acts: conflicts ^geant4@11.3: when @:35 (#49028) 2025-02-13 19:33:33 -07:00
Marc T. Henry de Frahan
a1bff46435 Update openfast, amr-wind, and nalu-wind packages (#48994) 2025-02-13 19:28:11 -07:00
John W. Parent
12e0eb6178 zlib package: Ensure correct lib search on Windows (#48512)
* Name of zlib's library differs on Windows; also account for name
  differing when building +shared
* `zlib`'s `.libs` implementation was searching for the runtime
  libraries (the .dlls) and should be searching for link-time libs
2025-02-13 17:15:03 -07:00
Vivek Kale
0eb55a0b8f Put in maintainers for Kokkos Tools Spack package (#49018)
* Kokkos Tools package.py: fix maintainers
* Kokkos Tools package.py: remove white space between first and second maintainer in comma-separated list
* [@spackbot] updating style on behalf of vlkale
* Correct maintainers syntax
   Co-authored-by: Richard Berger <richard.berger@outlook.com>

---------

Co-authored-by: vlkale <vlkale@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>
2025-02-13 16:35:22 -07:00
Thomas-Ulrich
6925a53937 easi@1.5.2 (#49013) 2025-02-13 16:29:03 -07:00
Juan Miguel Carceller
e34f04df5e pandora{pfa,sdk,monitoring}: add new versions and allow setting the C++ standard (#48300)
* pandoramonitoring: add v3.6.0; pandorapfa: add v4.11.2

Remove variables that are not being used in pandorasdk. Use the C++ standard
from ROOT when possible and pass -Wno-error to override the -Werror that will
typically fail with a new standard. Add a cxxstd variant for pandorasdk

* Fix style

* Update var/spack/repos/builtin/packages/pandorasdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix style

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-13 16:09:40 -07:00
Adam J. Stewart
3fe13f0891 py-numpy: add v2.2.3 (#49029) 2025-02-13 15:37:08 -07:00
Rémi Lacroix
c8d244b621 laszip: Add version 3.4.4. (#48982) 2025-02-13 12:47:30 -08:00
Dave Keeshan
bc3132f2a9 yosys: add v0.50 (#48983) 2025-02-13 12:46:21 -08:00
Adam J. Stewart
b79d0bfc80 py-memray: add v1.15.0 (#48989) 2025-02-13 12:45:03 -08:00
Adam J. Stewart
f678e8af4d py-torchgeo: pyvista dep has been removed (#48990) 2025-02-13 12:43:25 -08:00
Matt Thompson
9985ecf6a7 gftl: add v1.15.2 (#48992) 2025-02-13 12:40:24 -08:00
Buldram
5e981797f5 bubblewrap: add versions up to v0.11.0 (#49023) 2025-02-13 06:20:29 -07:00
Harmen Stoppels
a7b542dd37 spack debug create-db-tarball: remove after test failures (#49025) 2025-02-13 13:10:28 +01:00
John Biddiscombe
4dc1a900e2 h5hut: Remove H5_USE_110_API for newer versions (#48885)
* h5hut: Remove H5_USE_110_API for newer versions

* h5hut: style reformat and add maintainer

* h5hut: correct version syntax for <v2.x.x

* h5hut: Bump default version to 2.0.0rc7 and remove older rc candidates

* Update var/spack/repos/builtin/packages/h5hut/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

---------

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2025-02-13 11:01:14 +01:00
Harmen Stoppels
8697371d82 binutils: add debuginfod variant + update deps (#49011) 2025-02-13 08:00:06 +01:00
Dmitri Smirnov
61899fcfc1 plog: add new package (#48975) 2025-02-13 07:54:00 +01:00
Richard Berger
9697c1934c lammps: add 20241119 and 20250204 releases (#48978) 2025-02-12 15:57:36 -08:00
etiennemlb
a011b49e1e lammps: use the Cray GTL (#46090) 2025-02-12 15:07:51 -07:00
etiennemlb
ae50757f3c cray-mpich: adding partial GTL support (#45830)
cray-mpich now has a rocm variant. You can use gtl_lib in the
flag_handler like so:

```python
    def flag_handler(self, name, flags):
        wrapper_flags = []
        environment_flags = []
        build_system_flags = []

        if self.spec.satisfies("+rocm"):
            if self.spec.satisfies("^cray-mpich"):
                gtl_lib = self.spec["cray-mpich"].package.gtl_lib
                build_system_flags.extend(gtl_lib.get(name) or [])
            # hipcc is not wrapped, we need to pass the flags via the
            # build system.
            build_system_flags.extend(flags)

        return (wrapper_flags, environment_flags, build_system_flags)
```

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Richard Berger <richard.berger@outlook.com>
2025-02-12 12:01:40 -08:00
jean-francois-sa
6f1dce95f9 postgresql: add v17.2 (#47811)
* postgresql: add version 17.2

* postgresql: install flex, bison and perl when building versions 17 and up

* postgresql: do not install perl by default when building versions 17 and up
2025-02-12 13:28:55 -06:00
John W. Parent
fd59d3e589 Ci: ensure file path comparison uses posix paths (#47033)
Git always produces posix paths; ensure we're always comparing apples to apples by normalizing paths to posix before comparing them to git output.
2025-02-12 12:14:16 -07:00
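An illustrative normalization, assuming `pathlib` is acceptable for the job: `PureWindowsPath` treats backslashes as separators on any host, so the conversion is portable.

```python
from pathlib import PureWindowsPath


def to_posix(path: str) -> str:
    # convert a native Windows path to posix form before comparing it
    # with paths printed by git
    return PureWindowsPath(path).as_posix()


assert to_posix(r"lib\spack\spack\spec.py") == "lib/spack/spack/spec.py"
```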
Massimiliano Culpo
0172208c52 solver: add type-hints to OutputConfiguration (#48979) 2025-02-12 20:12:12 +01:00
Piotr Sacharuk
02c2516e88 unifyfs: Apply workaround for oneAPI compiler for problem with build (#48962) 2025-02-12 10:25:34 -08:00
Wouter Deconinck
a8e37ccbbb apptainer: get_full_repo for branch main (#49002) 2025-02-12 09:06:59 -08:00
Massimiliano Culpo
f0f463c8dc hep: add missing language dependencies (#48963)
* highfive: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* lhapdf: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989283

* vc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989140

* davix: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14989131

* pandorasdk: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989130

* veccore: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989118

* pythia6: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989116

* jwt-cpp: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* collier: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989115

* hepmc: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989112

* clhep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14989075

* fastjet: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14981340

* gosam-contrib: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14978873

* thepeg: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997553

* cepgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* podio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* pandoramonitoring: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997552

* lcio: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997513

* geant4: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14997202

* evtgen: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996817

* apfel: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/14996779

* collier: add dependency on C, C++

https://gitlab.spack.io/spack/spack/-/jobs/14996770

* vecgeom: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003840

* dd4hep: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15003839

* opendatadetector: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007666

* acts: add dependency on C

https://gitlab.spack.io/spack/spack/-/jobs/15007827

* hepmc: remove dependency on fortran

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* thepeg: remove fortran dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* acts: add a conditional build dependency

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* opendatadetector: add comment to explain C dep

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-12 10:51:23 -06:00
Harmen Stoppels
a137da1cd5 Spec.validate_detection -> spack.detection.path.validate_detection (#48987) 2025-02-12 09:30:14 -07:00
Harmen Stoppels
03e972314f Remove spack.repo.PATH.is_virtual call from SpecBuildInterface (#48984)
This PR is effectively a breaking change extracted from #45189: it removes
support for spec["mpi"] when spec itself is openmpi / mpich, i.e. a package
that could provide mpi. From the Spec instance alone we don't know of any
parent that it provides mpi to, hence the lookup is a KeyError.
2025-02-12 17:17:45 +01:00
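An illustrative pseudo-session of the breaking change:

```python
spec = ...  # a concrete Spec for openmpi

# Previously this returned the spec itself, since openmpi can provide
# mpi; now it raises KeyError, because the Spec alone records no parent
# to which it provides mpi.
spec["mpi"]
```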
Seth R. Johnson
9a7a3d2743 g4vg: add 'develop' branch (#49003)
* g4vg: add develop version

* celeritas: add develop version

* Fix style

* REVERTME: move celeritas changes to another branch

* Get full repo

* remove unneeded variable
2025-02-12 09:04:42 -07:00
Harmen Stoppels
76f00a3659 Spec.is_virtual -> spack.repo.PATH.is_virtual (#48986) 2025-02-12 16:57:27 +01:00
Harmen Stoppels
cd98781fb4 concretize.lp: don't warn about deprecation when external (#49008) 2025-02-12 16:27:37 +01:00
Thomas Madlener
dd16f451fc lcio: Add latest 2.22.5 tag (#48991) 2025-02-12 08:26:12 -07:00
Buldram
4f6cd5abde py-transformers: add new versions (#49000)
* py-transformers: add new versions

* py-tokenizers: add new versions

* Apply suggestions from code review
2025-02-12 07:58:32 -07:00
Justin Cook
d7f05e08be libfabric: use the class variable to get the list of fabrics (#49007)
Suggested by: alalazo <alalazo@users.noreply.github.com>

Signed-off-by: Justin Cook <jscook@lbl.gov>
2025-02-12 14:07:06 +01:00
Harmen Stoppels
9747978c7f Spec.package_class -> spack.repo.PATH.get_pkg_class (#48985) 2025-02-12 11:52:04 +01:00
Harmen Stoppels
f043455ccc gmake: fix def libs/headers (#49009) 2025-02-12 11:07:55 +01:00
Harmen Stoppels
fb9d6427e6 package_hash.py: move metadata_attrs inline out of package_base (#48981) 2025-02-12 10:38:35 +01:00
Tara Drwenski
76e83e10c1 gmake: add empty libs property, remove link deptypes from dependents (#48995) 2025-02-12 10:16:30 +01:00
Massimiliano Culpo
af89bdf632 kokkos et al. : don't monkeypatch spec in callbacks (#48916)
Currently, a few packages using kokkos rely on
kokkos itself monkeypatching its own spec to
provide some attribute.

In this commit we change this attribute to be
defined on the package, and never be monkeypatched.
2025-02-12 07:27:49 +01:00
Adam J. Stewart
46f5b192ef PyTorch: build flash attention by default, except in CI (#48521)
* PyTorch: build flash attention by default, except in CI

* Variant is boolean, only available when +cuda/+rocm

* desc -> _desc
2025-02-11 13:20:10 -08:00
Harmen Stoppels
18cd922aab style.py: fix false negative in redundant import statements (#48980) 2025-02-11 19:30:50 +01:00
Massimiliano Culpo
5518ad9611 Fix performance issue on macOS (#48997)
archspec.cpu.host() is not memoized, so compute
it as little as possible.

---------

Co-authored-by: alalazo <alalazo@users.noreply.github.com>
2025-02-11 18:54:32 +01:00
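For context, a generic sketch of the memoization idea (the actual fix reduces the number of call sites rather than wrapping archspec):

```python
import functools

import archspec.cpu


@functools.lru_cache(maxsize=1)
def host_microarchitecture():
    # archspec.cpu.host() re-detects the host CPU on every call, so cache
    # the result of this effectively constant computation
    return archspec.cpu.host()
```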
Wouter Deconinck
57a1807443 hep: rivet: require hepmc=3 (#48976) 2025-02-11 01:33:45 -07:00
Harmen Stoppels
3909308d5c spack.util.elf: catch seek errors (#48972) 2025-02-11 08:52:52 +01:00
Massimiliano Culpo
54210270c8 concretizer: reduce search space with static analysis (#48729)
Currently, when we set up the ASP problem for `clingo`, we don't take the configuration into account. This results in ASP problems that are larger than necessary, with possibly redundant information, and higher concretization times.

This PR tries to improve things by adding an opt-in feature that computes the _possible dependencies_ of a solve while also taking the current configuration into account, and avoids adding possible dependencies that we are certain can't be in the final solution.

The feature can be activated with:
```yaml
concretizer:
  static_analysis: true
```

Examples of simple rules to discard dependencies are:
- Dependencies that are not buildable, and for which no binary is present (e.g. `cray-mpich` etc. on non Cray systems)
- Dependencies that are not for the current platform (e.g. `msmpi` on non Windows platforms)
- Conditional dependencies that cannot be activated, because of some user requirement (e.g. `cuda` etc. if the user requires `~cuda` in configuration)
- Virtual providers that cannot be used, because of a requirement on a virtual

The speed-up these rules seem to give depends on the use case at hand, but if the configuration is updated properly, they are noticeable. 

In cases where there is no rule to exclude packages upfront, reuse is active, and this option is activated, it's possible to see some minor slowdown; the feature has therefore been added as opt-in, so it's turned off by default.
2025-02-11 08:44:20 +01:00
Wouter Deconinck
1a71bb046e rivet: patch missing header in 3.1.10 (#48977) 2025-02-11 07:40:37 +01:00
Peter Scheibel
dbd6857d32 spack compiler find: detect flang-new and flang in newer LLVM versions (#48914) 2025-02-11 07:34:28 +01:00
renjithravindrankannath
025bc24996 Bump up the version for rocm-6.3.2 release (#48787)
* Bump up the version for rocm-6.3.2 release

* rocm-openmp-extras update and style correction

* Updating mivisionx, omniperf, rccl & rocprofiler-systems

* Updating hipsparselt & rocm-opencl

* rocprofiler-systems on gcc-13 and rvs commit instead of patch

* Updated rocjpeg & rocm-examples for 6.3.2

* ROCPROFSYS_BUILD_DYNINST & DYNINST_BUILD_TBB are required only with gcc-13

---------

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>
2025-02-10 21:03:23 -08:00
Piotr Sacharuk
01e16b58a3 lua-sol2: Apply workaround for oneAPI compiler for problem with build (#48920) 2025-02-10 21:51:57 -07:00
Tim Haines
f71e202f24 cbtf-krell: Update Boost dependency (#47133)
* Update Boost
* Add gotcha
* Add patch for build errors
* Allow building with latest Dyninst
* Fix patch url
2025-02-10 13:05:05 -08:00
Stephen Nicholas Swatman
f7edd10c17 acts dependencies: new versions as of 2025/02/10 (#48969)
This commit adds detray v0.88.0 and GeoModel v6.9.0.
2025-02-10 13:27:54 -07:00
Richard Berger
153c0805dd ports-of-call: add v1.6.0, v1.7.0, v1.7.1 (#48870) 2025-02-10 11:55:59 -08:00
Harmen Stoppels
5d8517ef69 import-check: bump (#48968) 2025-02-10 20:48:59 +01:00
Fernando Ayats
f23cae6a86 py-iterative-stats: add 0.1.1 (#48959) 2025-02-10 07:42:20 -08:00
jgraciahlrs
e6e67f8e0a extrae: tighten dependencies on boost for +dyninst (#48938) 2025-02-10 10:43:31 +01:00
Howard Pritchard
e6bef4ca9b OpenMPI: add version 4.1.8 (#48922)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2025-02-10 10:00:14 +01:00
Matt Thompson
e3e0bef0de Update GFE packages (#48899) 2025-02-10 09:59:15 +01:00
Buldram
42486d93ec nim: add v2.2.2 (#48929) 2025-02-10 09:58:15 +01:00
Harmen Stoppels
6d608a9664 log.py: remove setenv calls (#48933) 2025-02-10 09:53:36 +01:00
Matt Thompson
04313afc63 mapl: add v2.53.1, v2.54.1 (#48944) 2025-02-10 09:42:35 +01:00
Teague Sterling
f839d2ba56 duckdb: add v1.2.0 (#48902)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-10 09:37:46 +01:00
Paul R. C. Kent
2b1a8b1913 Quantum ESPRESSO: add v7.4.1 (#48949) 2025-02-10 09:20:21 +01:00
Vanessasaurus
8907003648 flux-sched: add v0.42.1 (#48952)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-02-10 09:17:56 +01:00
Adam J. Stewart
8afdba4bf7 py-einops: add v0.8.1 (#48954) 2025-02-10 09:13:54 +01:00
Paul R. C. Kent
57cabbfb10 qmcpack: add v4.0.0 (#48921) 2025-02-10 09:02:38 +01:00
Paul
c71efb9040 go: add v1.23.6 (#48955) 2025-02-10 08:37:34 +01:00
Robert Mijakovic
c5dd2d43d2 reframe: add v4.6.4 -> v4.7.2 (#48242) 2025-02-10 08:25:12 +01:00
Adam J. Stewart
34338ef757 Python: add new versions (#48950)
* Python: add new versions

* black
2025-02-09 17:20:19 +01:00
Piotr Sacharuk
c0bdc37226 nwchem: add master (#48919)
* Add possibility to build nwchem from master branch

* add oneapi@2025: patch for @7.2.3

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2025-02-07 10:56:04 -08:00
Sergey Kosukhin
8bad9fb804 serialbox: add version 2.6.2 (#48937) 2025-02-07 10:30:31 -08:00
Harmen Stoppels
2df7cc0087 libgcrypt: fix enforced -O0 (#48940)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
Co-authored-by: Shane Nehring <snehring@iastate.edu>
2025-02-07 19:08:51 +01:00
Piotr Sacharuk
40d40ccc52 Apply workarounds for oneAPI compiler for ascent problem with build (#48918)
* Apply workarounds for oneAPI compiler for ascent problem with build

* Use the patch fetched through the PR address instead

* stylecheck - missing comma
2025-02-07 06:20:19 -07:00
Harmen Stoppels
afe7d6c39e package_base.py: remove use_cray_compiler_names (#48932) 2025-02-07 12:47:17 +01:00
Rocco Meli
113733d9fb trexio: fix issues with autotools build system (#48923) 2025-02-07 03:42:46 -07:00
Massimiliano Culpo
a8e2da5bb8 Fix regression due to dyninst update (#48935) 2025-02-07 10:46:42 +01:00
Harmen Stoppels
97750189b6 clingo-bootstrap: fix +optimized build (#48931)
* fix regression `apple-clang` vs `%apple-clang`
* use f-strings
* remove --verbose flag from LDFLAGS
2025-02-07 09:46:05 +01:00
Teague Sterling
bcd40835a0 py-maturin: add v1.8.2 and refined dependencies (#48915) 2025-02-07 01:23:33 -07:00
John W. Parent
2c3f2c5733 Windows: Update default config for stage location (#48511)
Current location is within the Spack prefix, which causes builds to
pollute VCS with stage artifacts and generally inflates the Spack
install prefix.

This PR moves it to the user cache location now that we can
consistently support paths with spaces on Windows.
2025-02-06 23:25:11 -08:00
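For reference, the stage root is user-configurable in config.yaml; a sketch of pointing it elsewhere (`$user_cache_path` and `$tempdir` are Spack path substitutions; the exact default chosen by this PR is not reproduced here):

```yaml
# config.yaml sketch: keep build stages out of the Spack prefix
config:
  build_stage:
    - $user_cache_path/stage
    - $tempdir/$user/spack-stage
```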
Massimiliano Culpo
302d74394b dyninst: cleanup package (#47637)
* Use more idiomatic construct, shorten recipe
* Remove deprecated versions, and associated patches
* Remove v10.0.0
2025-02-07 08:20:05 +01:00
Vinícius
cf94dc7823 simgrid: add v3.36 (#48909) 2025-02-06 21:03:00 -07:00
Rocco Meli
4411ee3382 dla-future-fortran: add v0.3.0 (#48900) 2025-02-06 18:56:10 -07:00
John W. Parent
f790ce0f72 icu4c: no cxxstd flag option on Windows (#48510)
* ICU4C: Don't reference a spec variant on a platform on which it's not defined

* icu4c: no cxx flag on Windows
2025-02-06 18:55:57 -07:00
Mikael Simberg
64d53037db pika: Add 0.32.0 (#48897) 2025-02-06 18:49:55 -07:00
Chris Marsh
4aef50739b Add new version of r-curl (#48912) 2025-02-06 18:38:09 -07:00
Wouter Deconinck
a6e966f6f2 dcap: depends_on libxcrypt (#48903) 2025-02-06 18:37:51 -07:00
psakievich
1f428c4188 Add a message for CMake incremental build (#48905)
* Add a message for CMake incremental build

Requested message to explain CMake phase is getting skipped.

* [@spackbot] updating style on behalf of psakievich

* Update import

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>
2025-02-07 01:32:06 +00:00
Thomas Madlener
731e48b1bd lcio: add v2.22.4 (#48895) 2025-02-06 18:20:34 -07:00
Wouter Deconinck
74ff9ad821 hep stack: build also with cuda and rocm where possible (#48528) 2025-02-06 17:09:15 -07:00
Alberto Invernizzi
16a4eff689 nvpl-blas, nvpl-lapack: add v0.4.0.1, v0.3.0 (#48901) 2025-02-06 15:27:28 -07:00
Alberto Sartori
d0b0d8db50 justbuild: add v1.4.3 (#48898) 2025-02-06 15:21:16 -07:00
AMD Toolchain Support
54f591cce5 amdfftw: fix broken build, adjust flags for performance tuning (#48754)
With CFLAGS, the code path in the amdfftw build system will bypass the logic around AMD_ARCH.

---------

Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2025-02-06 15:14:44 -07:00
Chris Marsh
8677bb4d43 py-numba: Add version 0.61 (#48837) 2025-02-06 07:54:53 -08:00
SXS Bot
b66b80a96a spectre: add v2025.01.30 (#48803)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2025-02-06 13:49:46 +01:00
Adam J. Stewart
10e21f399c py-shapely: add v2.0.7 (#48810) 2025-02-06 13:45:08 +01:00
Matt Thompson
56892f6140 fms: add 2025.01, 2024.03 (#48812) 2025-02-06 13:44:46 +01:00
George Young
7eddc4b1f8 mummer4: patching to allow building with %gcc@13: (#38292)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-06 13:37:42 +01:00
Adam J. Stewart
3c7392bbcc PyTorch: add v2.6.0 (#48794) 2025-02-06 13:32:53 +01:00
ddement
bb0517f4d9 openturbine: add new package (#48683) 2025-02-06 13:32:14 +01:00
Izaak "Zaak" Beekman
c8994ee50f salt: add v0.3.0 (#48877)
* salt: Add v0.3.0 of SALT

This version contains important bug fixes for building and parsing
projects containing Fortran

* salt: Be more explicit about dependency types

 - llvm+clang+flang is needed at build, link and runtime for the
   correct operation of SALT
 - Testing with llvm@master ( llvm > 19.x ) shows that SALT is
   currently incompatible with the latest llvm API so an updated salt
   will be required when LLVM 20 is released
2025-02-06 13:02:08 +01:00
Wouter Deconinck
4b2f5638f2 sherpa: +hepmc3root only when +root (#48827)
* sherpa: +hepmc3root only when +root

* sherpa: fix style
2025-02-06 12:33:27 +01:00
Richard Berger
31312a379f spiner: add v1.6.3 (#48871)
* spiner: update package logic

* singularity-eos: remove spiner cuda_arch propagation

* spiner: add version 1.6.3
2025-02-06 11:47:44 +01:00
Harmen Stoppels
b0d5f272b0 llvm: fix @15 %apple-clang@16 (#48887) 2025-02-06 10:52:36 +01:00
Harmen Stoppels
1c93fef160 spec.py: ensure == is false if true modulo precomputed dag hash (#48889) 2025-02-06 09:55:27 +01:00
Axel Huebl
8bb5f4faf4 enzyme: add v0.0.172 (#48881) 2025-02-06 09:40:14 +01:00
Dom Heinzeller
f76ab5f72f Update py-arch, py-statsmodels (add 0.14.1), py-patsy (add 0.5.4) to be able to use py-cython@3 (#48769)
* Add py-patsy@0.5.4
* Correct py-numpy dependency in py-arch
* Add py-statsmodels@0.14.1 and update dependencies
* Add climbfuji as maintainer for py-patsy
* Add climbfuji as maintainer for py-statsmodels
* Update var/spack/repos/builtin/packages/py-statsmodels/package.py
2025-02-05 16:23:33 -08:00
Richard Berger
49c831edc3 flecsi: new version 2.3.1 (#48867)
* flecsi: add new version 2.3.1, remove develop
* flecsi: remove kokkos and openmp variants moving forward
* flecsi: propagate cuda and rocm settings from kokkos
* Update var/spack/repos/builtin/packages/flecsi/package.py
   Co-authored-by: Davis Herring <herring@lanl.gov>
* flecsi: remove redundant depends_on lines
* flecsi: correct legion dependency
* flecsi: deprecate v2.0.0 and v2.1.0
* flecsi: force +openmp if ^kokkos+openmp

---------

Co-authored-by: Davis Herring <herring@lanl.gov>
2025-02-05 16:18:58 -08:00
afzpatel
c943c8c1d2 hip-tests: new package (#47273)
* hip-tests: add new package
* remove hip-tests from hip recipe
* remove old versions
* fix style
* add missing import
* bump hip-tests to 6.3.1
* fix style
2025-02-05 15:35:04 -08:00
Harmen Stoppels
e0e6f29584 relocate.py: don't warn about symlinks (#48904)
`relocate_links` warns when the target is absolute and not matched by
any prefix from the prefix to prefix map.

This can lead to false positives, because the prefix-to-prefix map does
not contain trivial/identity entries whenever a package is installed to
its original location.

Since relocate_links is the odd one out there (we don't warn about
similar issues with rpaths, etc), just remove the warning.
2025-02-05 12:33:44 -07:00
Wouter Deconinck
72bc3bb803 libice: add v1.1.2 (#48861) 2025-02-05 10:35:34 -08:00
Wouter Deconinck
dba8fe2b96 harfbuzz: add v10.2.0 (#48857) 2025-02-05 10:35:18 -08:00
Harmen Stoppels
4487598d60 spec.py: ensure spec.extra_attributes is {} if is null in json (#48896) 2025-02-05 17:55:53 +01:00
Stephen Nicholas Swatman
495537cf56 acts: add v39.0.0 (#48839)
This commit adds version 39.0.0 of the ACTS package which, as far as I
can tell, doesn't require any dependency updates.
2025-02-05 09:39:21 -06:00
George Young
22c3b4099f code-server: update to v4.96.4 (#48828) 2025-02-05 04:38:11 -07:00
japlews
13978d11a0 Scotch: add v7.0.6, add testing option (#48781) 2025-02-05 12:04:00 +01:00
Alec Scott
a22114b20b bash: add autotools dependencies (#48874) 2025-02-05 11:26:43 +01:00
Chris Marsh
c10624390f Fix esmf usage, add new version (#48835) 2025-02-05 11:24:48 +01:00
Felix Thaler
fb3d9de80b nnn: new package (#46174)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2025-02-05 11:23:28 +01:00
Harmen Stoppels
fbb688af07 spec.py: fix hash change due to None vs {} (#48854)
* Fix hash change due to None vs {}

* Enforce null for empty list of external_modules
2025-02-05 09:48:00 +01:00
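Why `None` vs an empty container changes a content hash (an illustrative snippet, not Spack's actual hashing code): the JSON serializations differ, so the digests differ too.

```python
import hashlib
import json

a = json.dumps({"external_modules": None}, sort_keys=True)
b = json.dumps({"external_modules": []}, sort_keys=True)

# different serializations -> different digests
print(hashlib.sha256(a.encode()).hexdigest() == hashlib.sha256(b.encode()).hexdigest())  # False
```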
Chris Marsh
d34b709425 py-elevation: new package (#48836) 2025-02-05 09:46:49 +01:00
rfbgo
cb0b188cf6 pfind: new package (#48685) 2025-02-05 09:31:10 +01:00
Harmen Stoppels
9a2b0aca66 ci: bump import-check (#48883) 2025-02-05 09:17:01 +01:00
Todd Gamblin
89a8ab3233 ci: add codecov token secret to coverage upload job (#48880)
Codecov needs to see the token secret when uploading, so we have to
add this line to the workflow YAML:

```yaml
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
```

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-02-05 01:18:02 +00:00
Harmen Stoppels
5d87166c07 gha: standalone import-check (#48873) 2025-02-04 22:18:30 +01:00
Taillefumier Mathieu
15c989b3fe sirius: add v7.6.2 (#48797)
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2025-02-04 13:08:02 -07:00
psakievich
b7f556e4b4 Remove variable from cmake.py (#48824)
* Remove variable from cmake.py

#48775 left a dangling variable that was not caught in CI, but was spotted by @haampie. Restructure the variable into a local method.

* [@spackbot] updating style on behalf of psakievich

* Update cmake.py

* Update lib/spack/spack/build_systems/cmake.py

* Update lib/spack/spack/build_systems/cmake.py

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>
2025-02-04 15:20:15 +01:00
Zack Galbreath
36f32ceda3 Remove ISC stacks environment files (#48851)
Follow-up to #48811
2025-02-04 10:31:40 +01:00
Piotr Sacharuk
01d77ed915 Apply workaround for oneAPI compiler for upcxx problem with a template argument list (#48843)
* Fix upcxx problem where 'a template argument list is expected after a name prefixed by the template keyword'

* Revert "Fix upcxx problem with  a template argument list is expected after a name prefixed by the template keyword"

This reverts commit faf9b8ce85.

* Apply workaround for oneAPI compiler

* style problem resolved

* use spec.satisfies syntax

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2025-02-04 09:26:44 +01:00
Wouter Deconinck
0049f8332d libsm: add v1.2.5 (#48862) 2025-02-03 23:03:10 -07:00
Wouter Deconinck
39c10c3116 dd4hep: add v1.31 (#48850) 2025-02-03 22:48:17 -07:00
Todd Gamblin
71d1901831 update pyproject.toml for ruff format (#48823)
Add ruff configuration to `pyproject.toml`.

This allows `ruff format` in the Spack repository to format all the files we care about, 
with our line length of 99, the exceptions we already put in place, and excluding things
we don't auto-format, like vendored dependencies.

Right now it'll reformat 175 or so files, but only slightly, in places where `ruff` differs from
`black`. For the most part I like the ruff format decisions better than `black`, but none of
the changes seem too severe.

This does not change `spack style` -- I figure that can come later but this at least will
let people start playing with `ruff`.

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-02-03 20:56:05 -08:00
Wouter Deconinck
41e0863b86 fcgi: add v2.4.3, v2.4.4 (#48856) 2025-02-03 21:32:07 -07:00
Taillefumier Mathieu
a75d83f65c netlib-scalapack: Update version (#48667)
* Update scalapack version

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* use url_for_version

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>

* use spec.satisfies instead of version()

---------

Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
2025-02-03 21:31:40 -07:00
Wouter Deconinck
f2f13964fb sherpa: support cxxstd=20 when=@3: (#48829)
* sherpa: support cxxstd=20 when=@3:

* hep: sherpa cxxstd=20
2025-02-03 22:24:35 -06:00
Weiqun Zhang
9b032018d6 amrex: add v25.02 (#48853) 2025-02-03 21:23:08 -07:00
Wouter Deconinck
7d470c05be libdrm: add v2.4.124 (#48860) 2025-02-03 21:03:18 -07:00
Wouter Deconinck
664fe9e9e6 cppgsl: add v4.1.0 (#48864) 2025-02-03 20:53:01 -07:00
Chris Marsh
2745a519e2 Add new package py-metis (#48848) 2025-02-03 20:43:12 -07:00
Wouter Deconinck
4348ee1c75 icu4c: add v75.1, v76.1 (#48858) 2025-02-03 20:42:48 -07:00
Chris Marsh
8e39fb1e54 Add new package func (#48849) 2025-02-03 20:38:12 -07:00
Wouter Deconinck
09458312a3 isa-l: add v2.31.1 (#48859) 2025-02-03 20:37:51 -07:00
Chris Marsh
5fd0693df4 py-geojson: Add new package (#48847)
* Add new package py-geojson
* fix when
2025-02-03 20:37:35 -07:00
snehring
f58684429d Tesseract v5.5.0 (#48866)
* leptonica: adding v1.85.0
  Signed-off-by: Shane Nehring <snehring@iastate.edu>
* tesseract: adding v5.5.0
  Signed-off-by: Shane Nehring <snehring@iastate.edu>

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-02-03 16:07:54 -08:00
afzpatel
409611a479 rocm-examples and rocjpeg: new packages (#47695)
* new package: rocm-examples
* add new package rocjpeg and update rocm-examples for 6.3.0
* fix licenses
* add versions 6.3.1
* change homepage and git
* add f-string
2025-02-03 13:11:42 -08:00
afzpatel
dd98cfb839 py-tensorflow: add 2.18.0-rocm-enhanced (#48711)
* py-tensorflow: add 2.18.0-rocm-enhanced

* fix style

* fix style

* fix style

* review changes

* review changes

* remove hipblaslt dependency

* remove ci changes and force ROCm 6.3.1 for newest TF

* remove rocm 6.3.1 dependency

* simplify configure fix
2025-02-03 13:38:25 -07:00
Mosè Giordano
5c91667dab r-dmrcate: add v2.16.0, v3.0.0, v3.2.0 (#48158)
* r-dmrcate: add new versions
* r-dmrcate: require `r@4.3.0` for v2.99.0+
* r-dmrcate: update dependencies
2025-02-03 12:17:47 -08:00
Seth R. Johnson
9efd6f3f11 g4vg: new package (#48844)
* g4vg: new package

* [@spackbot] updating style on behalf of sethrj
2025-02-03 11:01:10 -07:00
Harmen Stoppels
a8f5289801 cdash: avoid build_opener (#48846) 2025-02-03 18:41:40 +01:00
Harmen Stoppels
ac635aa777 packge_base.py: remove _patches_by_hash (#48768) 2025-02-03 16:17:42 +01:00
Rocco Meli
45dcddf9c3 CP2K: use libxc@7 for master/next release (#48808) 2025-02-03 15:59:44 +01:00
Harmen Stoppels
f1660722e7 gcc: deprecate old patch releases (#48761) 2025-02-03 15:27:14 +01:00
Zack Galbreath
04b44d841c gitlab: remove isc stacks (#48811) 2025-02-03 15:26:59 +01:00
Mosè Giordano
7f30502297 ziatest: add new package (#48809) 2025-02-03 14:41:38 +01:00
Rocco Meli
61b1586c51 sirius: patch pugixml (#48841) 2025-02-03 06:33:00 -07:00
pauleonix
8579efcadf cuda: add v12.8 (#48708) 2025-02-03 13:49:30 +01:00
Harmen Stoppels
1c3e2b5425 llvm: deprecate old patch releases (#48762) 2025-02-03 11:17:53 +01:00
Henri Menke
011ef0aaaf berkeleygw: add -o flag to tar extraction (#48816)
when extracting as the root user, prevent tar from attempting to change file ownership
2025-02-03 11:08:56 +01:00
Harmen Stoppels
9642f3f49a import-check: enable color output (#48842) 2025-02-03 11:02:21 +01:00
Harmen Stoppels
a6c9b55fad Set version to v1.0.0.dev0 (#48791) 2025-02-03 01:42:32 -08:00
Todd Gamblin
608ed967e1 style: fix not in and is not (#48831)
These are some changes that `ruff check --fix` would make that the current
`spack style` also agrees with.  Make the changes now so that the `ruff`
change is less disruptive.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-02-03 00:49:38 -08:00
Todd Gamblin
742eaa32b7 spack_yaml: use unambiguous variable name (#48832) 2025-02-03 09:07:39 +01:00
Harmen Stoppels
763b35a2e0 import-check: improve how problematic imports are displayed (#48825)
The import-check action now presents problematic import statements
introduced by the PR better.

The idea is roughly:

* Let (V₁, E₁) be the graph of modules as vertices and import statements
  as edges before the change
* Let (V₂, E₂) be the graph after the code change, which is typically a small
  perturbation of (V₁, E₁).
* X₁ = FAS(V₁, E₁) is the feedback arc set before (a minimal set of edges to
  delete to make it acyclic)
* X₂ = FAS(V₂, E₂ ∖ X₁) is the feedback arc set after deletion of the minimal
  set of edges that made the old graph acyclic.
* X₃ = FAS(V₂, E₂) is the feedback arc set after the change, computed from scratch

Previously I displayed X₁ and X₃ and users had to diff them themselves.

Now, I'm showing X₂, which is a small set, typically directly related to
code changes.

However, a small code change that adds, say, 2 problematic imports can produce a
completely different solution X₃ that requires deleting only 1 different import.
In that case the user is informed that they can potentially do less work.

So for PR #48784 the output is now:

> The overall number of problematic import statements increased by 1 from 31 to 32.
> This is likely a direct consequence of the following import statements:
> 
> ```
> spack/config imports: spack.spec, spack.util.path, spack.util.remote_file_cache
> ```
> 
> However, instead of removing 3 import statements, it is sufficient to remove only 1
> import statement from the following list:
> 
> ```
> spack/concretize imports: spack.bootstrap, spack.solver.asp
> spack/environment imports: spack.bootstrap, spack.environment
> spack/fetch_strategy imports: spack.version.git_ref_lookup
> spack/install_test imports: spack.build_environment, spack.package_base
> spack/modules imports: spack.modules
> spack/platforms imports: spack.config
> spack/relocate imports: spack.bootstrap
> spack/repo imports: spack.package_base, spack.patch, spack.tag
> spack/spec imports: spack.binary_distribution, spack.compiler, spack.compilers, spack.concretize, spack.environment, spack.hash_types, spack.provider_index, spack.repo, spack.spec_parser, spack.store, spack.traverse, spack.variant, spack.version.git_ref_lookup
> spack/subprocess_context imports: spack.environment
> spack/util/gpg imports: spack.bootstrap
> spack/util/package_hash imports: spack.package_base
> spack/util/path imports: spack.config, spack.environment
> spack/util/remote_file_cache imports: spack.util.web
> ```

from which the user can figure out that
`spack/util/remote_file_cache imports: spack.util.web` is the "bottleneck" now.
2025-02-02 20:56:38 -08:00
Wouter Deconinck
12280f864c embree: fix tests by building tutorial's embree_viewer for tests (#48392) 2025-02-02 20:57:39 -06:00
Thomas Bouvier
253ba05732 nanotron: add new package (#48582)
* nanotron: add new package

Also, update some dependencies and add missing ones.

* Add variant +examples needed to execute example scripts

* fix: add missing branch attribute

* Remove master version

* fix: use Github hash
2025-02-02 14:17:26 -07:00
Harmen Stoppels
195b869e1c gcc: remove --with-ld=ld-classic (#48826) 2025-02-01 22:22:53 +01:00
Wouter Deconinck
393961ffd6 vtk-m: CMAKE_CXX_COMPILER is not a BOOL (#48813) 2025-02-01 20:10:10 +01:00
Till Ehrengruber
392a58e9be oci/opener.py: respect system proxy settings (#48783) 2025-02-01 09:26:55 +01:00
Rocco Meli
0e8e97a811 CP2K: add 2025.1 version and DFTD4 support (#48489)
* cp2k: add dftd4 variant

* better conflict and make support

* typo

* Update var/spack/repos/builtin/packages/cp2k/package.py

* Update var/spack/repos/builtin/packages/cp2k/package.py
2025-02-01 00:58:46 -07:00
danielsjensen1
43a0cbe7a2 py-sphinx-rtd-theme: add v2.0.0, v3.0.0 (#48756)
* Add versions 2 and 3 of py-sphinx-rtd-theme.
   Allow for versions of py-sphinx greater than 6.
   Fix the Python version for older versions that depend on distutils.
   Get the py-docutils dependency from the py-sphinx recipe.
* Depend purely on the py-docutils dependency in py-sphinx.
* More refined dependency versioning.
* Fixed versioning for py-sphinx and py-docutils.
2025-01-31 22:46:44 -07:00
Greg Becker
bb35a98079 env create: create copies of relative include files in envs created from manifest (#48689)
Currently, environments created from manifest files with relative includes result in broken
references to config files.

This PR modifies `spack env create` to copy into the new environment any local config
files referenced by relative paths in the environment manifest passed as an init file.

This PR does not change the behavior if the include is an absolute path or if the include is from
a relative path outside the environment directory, but it does warn about missing relative includes if
they are inside the environment directory.

Includes regression test and short blurb in docs.
2025-02-01 01:41:18 +00:00
Richard Berger
fa7e0e8230 kokkos-nvcc-wrapper: add version 4.5.00 and 4.5.01 (#48802) 2025-01-31 16:47:39 -07:00
Dominic Hofer
2c128751f5 Remove patch on main (#48798)
The patch was merged upstream: https://github.com/natefoo/slurm-drmaa/pull/62
2025-01-31 12:51:34 -07:00
wspear
fb0493a366 Added salt variant to tau (#48782)
* Added salt variant to tau

* Update package.py

* [@spackbot] updating style on behalf of wspear

---------

Co-authored-by: wspear <wspear@users.noreply.github.com>
2025-01-31 13:44:10 -06:00
Brian Spilner
6d1b6e7087 add cdo@2.5.0 (#48801) 2025-01-31 12:05:42 -07:00
psakievich
759518182c Bug Fix: Better incremental check for CMake (#48775)
* Bug Fix: Better incremental check for CMake

* Fix syntax error

* Ensure match of config artifact with generator
2025-01-31 08:48:41 -06:00
Rocco Meli
7ebabfcf0e libsmeagol (#48776)
* libsmeagol

* add support for intel and add conflicts

* cp2k
2025-01-31 12:11:13 +01:00
Satish Balay
6203ae31d2 petsc, py-petsc4py: add v3.22.3 (#48785) 2025-01-31 02:11:16 -07:00
Harmen Stoppels
6b13017ded Remove unused values (#48795)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2025-01-31 08:21:44 +01:00
Harmen Stoppels
2c51b5853f spack.package: re-export EnvironmentModifications / Prefix (#48792) 2025-01-31 08:20:15 +01:00
Harmen Stoppels
d0cbd056a8 spack.package: wrap llnl.util.tty (#48793)
avoid import of llnl.util.tty in packages
2025-01-31 08:17:29 +01:00
Adam J. Stewart
e1b579a8b4 py-sphinx: mark Python compatibility (#48796) 2025-01-30 11:27:57 -08:00
Harmen Stoppels
b02dcf697d Move from python2 compliant IOError and EnvironmentError to python3-only OSError (#48764)
* IOError -> OSError

* also do EnvironmentError
2025-01-30 09:32:57 -08:00
Satish Balay
6e046b04c7 hipblaslt: update cmake dependency (#48637)
* hipblaslt: update cmake dependency

1 error found in build log:
  >> 3    CMake Error at CMakeLists.txt:24 (cmake_minimum_required):
     4      CMake 3.25.2 or higher is required.  You are running version 3.22.1
     5
     6
     7    -- Configuring incomplete, errors occurred!

See build log for details:
  /scratch/svcpetsc/spack-rocm/spack-stage/spack-stage-hipblaslt-6.3.0-pabb7t4rheqkz74lfzbsnqi6vnpiqwlq/spack-build-out.txt

* Update var/spack/repos/builtin/packages/hipblaslt/package.py

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>

---------

Co-authored-by: afzpatel <122491982+afzpatel@users.noreply.github.com>
2025-01-30 09:20:34 -08:00
Rocco Meli
d196795437 libxc: add CMake builder (#48772)
* libsmeagol

* libxc cmake

* cmake support

* revert changes

* make spackbot happy

* fix

* Update package.py
2025-01-30 18:13:07 +01:00
Chris Marsh
0d444fb4e7 Add py-zarr 3, which includes a new required package py-donfig, and a bug fix to the patch range with numcodecs (#48786) 2025-01-30 06:55:15 -08:00
Todd Gamblin
467e631260 abinit: pass flag correctly (#48788) 2025-01-30 11:44:00 +01:00
Harmen Stoppels
f21de698f7 builtin: remove redundant imports (#48765)
* builtin: remove redundant llnl.util.filesystem import
* remove redundant import spack.version
* unsorted fixes
* more spack.version
2025-01-30 09:18:47 +01:00
John W. Parent
59532986be CMake: add v3.31.5, v3.30.7 (#48759) 2025-01-29 19:03:50 -07:00
Alec Scott
36fd547b40 smee-client: add v2.0.4 (#48384) 2025-01-29 16:22:04 -08:00
wspear
b5f9dea6d0 Create SALT package.py (#48758)
* Create SALT package.py

Added a package for the SALT Source AnaLysis Toolkit
@zbeekman

* [@spackbot] updating style on behalf of wspear

* Update package.py

Line wrap

---------

Co-authored-by: wspear <wspear@users.noreply.github.com>
2025-01-29 15:47:48 -06:00
Eric Berquist
5904834295 binutils: conflict on configuration with build issues (#42949) 2025-01-29 22:38:44 +01:00
Tamara Dahlgren
2da8a1d1e3 Docs/bugfix: correct return for Adding flags to configure (#48434) 2025-01-29 13:04:22 -08:00
Juan Miguel Carceller
d50eba40d9 fmt: simplify +pic (#48766)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2025-01-29 14:02:40 -07:00
Jonathon Anderson
8d3a733b77 hpctoolkit: Add +docs variant and manpages (#48566)
* py-mdit-py-plugins: Add new versions 0.3.5, 0.4.2
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
* py-myst-parser: Add new versions 0.19.0 to 4.0.0
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
* hpctoolkit: Add +docs variant and manpages
   This commit unconditionally enables manpages for the HPCToolkit tools.
   The new `+docs` variant enables additional documentation, specifically
   the user's manual. Both require new build-time dependencies.
  Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>

---------

Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
2025-01-29 09:26:54 -08:00
Richard Berger
dfa86dce08 petsc: only conflict with kokkos@4.5: if it is enabled (#48698) 2025-01-29 10:07:58 -07:00
Massimiliano Culpo
3d82e5c573 Remove pipelines and images based on ppc64le (#48767) 2025-01-29 16:36:25 +01:00
Harmen Stoppels
a77f903f4d py-cmake: remove. remove deprecated cmake versions (#48763) 2025-01-29 15:06:32 +01:00
Harmen Stoppels
92260b179d package api: drop wildcard re-export (#48760)
* package api: drop wildcard re-export

To ensure package repos remain forward/backward compatible with Spack,
we should explicitly export all symbols we want to expose in the public
package API, and drop `from spack.something import *` because
removal/addition to the public API will go unnoticed.

Also `llnl.util.filesystem` has some methods that shouldn't be exposed
in the package API, so it is better to enumerate a subset explicitly.

* remove flatten_dependencies / install_dependency_symlinks
2025-01-29 15:00:39 +01:00
Massimiliano Culpo
196c912b8a Deprecate frontend/backend os/target (#47756) 2025-01-29 13:22:51 +01:00
G-Ragghianti
0f54995e53 MAGMA: add v2.9.0 (#48750) 2025-01-29 00:17:37 -07:00
Filippo Spiga
9d1332f1a1 ucx: adding 1.18.0 (#48742)
* Adding UCX 1.18.0
* Verified and corrected hash.
2025-01-28 09:26:19 -07:00
652 changed files with 8608 additions and 5216 deletions

View File

@@ -40,17 +40,17 @@ jobs:
# 1: Platforms to build for
# 2: Base image (e.g. ubuntu:22.04)
dockerfile: [[amazon-linux, 'linux/amd64,linux/arm64', 'amazonlinux:2'],
[centos-stream9, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream9'],
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:9'],
[centos-stream9, 'linux/amd64,linux/arm64', 'centos:stream9'],
[leap15, 'linux/amd64,linux/arm64', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64', 'almalinux:9'],
[rockylinux8, 'linux/amd64,linux/arm64', 'rockylinux:8'],
[rockylinux9, 'linux/amd64,linux/arm64', 'rockylinux:9'],
[fedora39, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:40']]
[fedora39, 'linux/amd64,linux/arm64', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64', 'fedora:40']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:

View File

@@ -81,6 +81,10 @@ jobs:
with:
with_coverage: ${{ needs.changes.outputs.core }}
import-check:
needs: [ changes ]
uses: ./.github/workflows/import-check.yaml
all-prechecks:
needs: [ prechecks ]
if: ${{ always() }}

View File

@@ -33,3 +33,4 @@ jobs:
with:
verbose: true
fail_ci_if_error: false
token: ${{ secrets.CODECOV_TOKEN }}

.github/workflows/import-check.yaml vendored Normal file
View File

@@ -0,0 +1,49 @@
name: import-check
on:
workflow_call:
jobs:
# Check we don't make the situation with circular imports worse
import-check:
runs-on: ubuntu-latest
steps:
- uses: julia-actions/setup-julia@v2
with:
version: '1.10'
- uses: julia-actions/cache@v2
# PR: use the base of the PR as the old commit
- name: Checkout PR base commit
if: github.event_name == 'pull_request'
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
ref: ${{ github.event.pull_request.base.sha }}
path: old
# not a PR: use the previous commit as the old commit
- name: Checkout previous commit
if: github.event_name != 'pull_request'
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 2
path: old
- name: Checkout previous commit
if: github.event_name != 'pull_request'
run: git -C old reset --hard HEAD^
- name: Checkout new commit
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
path: new
- name: Install circular import checker
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
repository: haampie/circular-import-fighter
ref: 4cdb0bf15f04ab6b49041d5ef1bfd9644cce7f33
path: circular-import-fighter
- name: Install dependencies
working-directory: circular-import-fighter
run: make -j dependencies
- name: Circular import check
working-directory: circular-import-fighter
run: make -j compare "SPACK_ROOT=../old ../new"

View File

@@ -1,7 +1,7 @@
black==24.10.0
black==25.1.0
clingo==5.7.1
flake8==7.1.1
isort==5.13.2
flake8==7.1.2
isort==6.0.0
mypy==1.11.2
types-six==1.17.0.20241205
vermin==1.6.0

View File

@@ -86,66 +86,6 @@ jobs:
spack -d bootstrap now --dev
spack -d style -t black
spack unit-test -V
# Check we don't make the situation with circular imports worse
import-check:
runs-on: ubuntu-latest
steps:
- uses: julia-actions/setup-julia@v2
with:
version: '1.10'
- uses: julia-actions/cache@v2
# PR: use the base of the PR as the old commit
- name: Checkout PR base commit
if: github.event_name == 'pull_request'
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
ref: ${{ github.event.pull_request.base.sha }}
path: old
# not a PR: use the previous commit as the old commit
- name: Checkout previous commit
if: github.event_name != 'pull_request'
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 2
path: old
- name: Checkout previous commit
if: github.event_name != 'pull_request'
run: git -C old reset --hard HEAD^
- name: Checkout new commit
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
path: new
- name: Install circular import checker
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
repository: haampie/circular-import-fighter
ref: b5d6ce9be35f602cca7d5a6aa0259fca10639cca
path: circular-import-fighter
- name: Install dependencies
working-directory: circular-import-fighter
run: make -j dependencies
- name: Problematic imports before
working-directory: circular-import-fighter
run: make SPACK_ROOT=../old SUFFIX=.old
- name: Problematic imports after
working-directory: circular-import-fighter
run: make SPACK_ROOT=../new SUFFIX=.new
- name: Compare import cycles
working-directory: circular-import-fighter
run: |
edges_before="$(head -n1 solution.old)"
edges_after="$(head -n1 solution.new)"
if [ "$edges_after" -gt "$edges_before" ]; then
printf '\033[1;31mImport check failed: %s imports need to be deleted, ' "$edges_after"
printf 'previously this was %s\033[0m\n' "$edges_before"
printf 'Compare \033[1;97m"Problematic imports before"\033[0m and '
printf '\033[1;97m"Problematic imports after"\033[0m.\n'
exit 1
else
printf '\033[1;32mImport check passed: %s <= %s\033[0m\n' "$edges_after" "$edges_before"
fi
# Further style checks from pylint
pylint:

View File

@@ -43,6 +43,22 @@ concretizer:
# (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: minimal
# Maximum number of duplicates in a DAG, when using a strategy that allows duplicates. "default" is the
# number used if there isn't a more specific alternative
max_dupes:
default: 1
# Virtuals
c: 2
cxx: 2
fortran: 1
# Regular packages
cmake: 2
gmake: 2
py-cython: 2
py-flit-core: 2
py-setuptools: 2
gcc: 2
llvm: 2
# Option to specify compatibility between operating systems for reuse of compilers and packages
# Specified as a key: [list] where the key is the os that is being targeted, and the list contains the OS's
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
@@ -63,3 +79,7 @@ concretizer:
# Setting this to false yields unreproducible results, so we advise using that value only
# for debugging purposes (e.g. check which constraints can help Spack concretize faster).
error_on_timeout: true
# Static analysis may reduce the concretization time by generating smaller ASP problems, in
# cases where there are requirements that prevent part of the search space from being explored.
static_analysis: false

View File

@@ -1,5 +1,5 @@
config:
locks: false
build_stage::
- '$spack/.staging'
- '$user_cache_path/stage'
stage_name: '{name}-{version}-{hash:7}'

View File

@@ -272,9 +272,9 @@ often lists dependencies and the flags needed to locate them. The
"environment variables" section lists environment variables that the
build system uses to pass flags to the compiler and linker.
^^^^^^^^^^^^^^^^^^^^^^^^^^
Addings flags to configure
^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^
Adding flags to configure
^^^^^^^^^^^^^^^^^^^^^^^^^
For most of the flags you encounter, you will want a variant to
optionally enable/disable them. You can then optionally pass these
@@ -285,7 +285,7 @@ function like so:
def configure_args(self):
args = []
...
if self.spec.satisfies("+mpi"):
args.append("--enable-mpi")
else:
@@ -299,7 +299,10 @@ Alternatively, you can use the :ref:`enable_or_disable <autotools_enable_or_dis
.. code-block:: python
def configure_args(self):
return [self.enable_or_disable("mpi")]
args = []
...
args.extend(self.enable_or_disable("mpi"))
return args
Note that we are explicitly disabling MPI support if it is not
@@ -344,7 +347,14 @@ typically used to enable or disable some feature within the package.
default=False,
description="Memchecker support for debugging [degrades performance]"
)
config_args.extend(self.enable_or_disable("memchecker"))
...
def configure_args(self):
args = []
...
args.extend(self.enable_or_disable("memchecker"))
return args
In this example, specifying the variant ``+memchecker`` will generate
the following configuration options:
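The generated ``--enable-memchecker``/``--disable-memchecker`` lines are cut off in this hunk. As a self-contained sketch of the whole variant-to-flag pattern (the package and variant names here are hypothetical):

```python
# Hypothetical minimal package showing the documented pattern: declare a
# variant, then let enable_or_disable() turn it into configure flags.
from spack.package import AutotoolsPackage, variant

class Example(AutotoolsPackage):
    """Toy package used only to illustrate configure_args."""

    variant("mpi", default=True, description="Enable MPI support")

    def configure_args(self):
        args = []
        # Yields ["--enable-mpi"] when +mpi, ["--disable-mpi"] when ~mpi.
        args.extend(self.enable_or_disable("mpi"))
        return args
```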

View File

@@ -361,7 +361,6 @@ and the tags associated with the class of runners to build on.
* ``.linux_neoverse_n1``
* ``.linux_neoverse_v1``
* ``.linux_neoverse_v2``
* ``.linux_power``
* ``.linux_skylake``
* ``.linux_x86_64``
* ``.linux_x86_64_v4``

View File

@@ -112,6 +112,19 @@ the original but may concretize differently in the presence of different
explicit or default configuration settings (e.g., a different version of
Spack or for a different user account).
Environments created from a manifest will copy any included configs
from relative paths inside the environment. Relative paths from
outside the environment will cause errors, and absolute paths will be
kept absolute. For example, if ``spack.yaml`` includes:
.. code-block:: yaml
spack:
include: [./config.yaml]
then the created environment will have its own copy of the file
``config.yaml`` copied from the location in the original environment.
Create an environment from a ``spack.lock`` file using:
.. code-block:: console
@@ -160,7 +173,7 @@ accepts. If an environment already exists then spack will simply activate it
and ignore the create-specific flags.
.. code-block:: console
$ spack env activate --create -p myenv
# ...
# [creates if myenv does not exist yet]
@@ -424,8 +437,8 @@ Developing Packages in a Spack Environment
The ``spack develop`` command allows one to develop Spack packages in
an environment. It requires a spec containing a concrete version, and
will configure Spack to install the package from local source.
If a version is not provided from the command line interface then spack
will automatically pick the highest version the package has defined.
This means any infinity versions (``develop``, ``main``, ``stable``) will be
preferred in this selection process.
@@ -435,9 +448,9 @@ set, and Spack will ensure the package and its dependents are rebuilt
any time the environment is installed if the package's local source
code has been modified. Spack's native implementation to check for modifications
is to check if ``mtime`` is newer than the installation.
A custom check can be created by overriding the ``detect_dev_src_change`` method
in your package class. This is particularly useful for projects that use custom Spack repos
to drive development and want to optimize performance.
Spack ensures that all instances of a
developed package in the environment are concretized to match the
@@ -453,7 +466,7 @@ Further development on ``foo`` can be tested by re-installing the environment,
and eventually committed and pushed to the upstream git repo.
If the package being developed supports out-of-source builds then users can use the
``--build_directory`` flag to control the location and name of the build directory.
This is a shortcut to set the ``package_attributes:build_directory`` in the
``packages`` configuration (see :ref:`assigning-package-attributes`).
The supplied location will become the build-directory for that package in all future builds.

View File

@@ -820,6 +820,69 @@ presence of a ``SPACK_CDASH_AUTH_TOKEN`` environment variable during the
build group on CDash called "Release Testing" (that group will be created if
it didn't already exist).
.. _ci_artifacts:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
CI Artifacts Directory Layout
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
When running the CI build using the command ``spack ci rebuild``, a number of directories are created for
storing data generated during the CI job. The default root directory for artifacts is ``job_scratch_root``.
This can be overridden by passing the argument ``--artifacts-root`` to the ``spack ci generate`` command
or by setting the ``SPACK_ARTIFACTS_ROOT`` environment variable in the build job scripts.
The top-level directories under the artifact root are ``concrete_environment``, ``logs``, ``reproduction``,
``tests``, and ``user_data``. Spack does not restrict what is written to any of these directories, nor does
it require user-specified files to be written to any specific directory.
------------------------
``concrete_environment``
------------------------
The directory ``concrete_environment`` is used to communicate the ``spack.yaml`` processed by
``spack ci generate`` and the concrete ``spack.lock`` for the CI environment.
--------
``logs``
--------
The directory ``logs`` contains the spack build log, ``spack-build-out.txt``, and the spack build environment
modification file, ``spack-build-mod-env.txt``. Additionally, all files specified by the package's ``Builder``
property ``archive_files`` are copied here (e.g. ``CMakeCache.txt`` in ``CMakeBuilder``).
----------------
``reproduction``
----------------
The directory ``reproduction`` is used to store the files needed by the ``spack reproduce-build`` command.
This includes ``repro.json``, copies of all of the files in ``concrete_environment``, the concrete spec
JSON file for the current spec being built, and all of the files written in the artifacts root directory.
The ``repro.json`` file is not versioned and is only designed to work with the version of Spack that CI
was run with. An example of what a ``repro.json`` file may look like:
.. code:: json
{
"job_name": "adios2@2.9.2 /feaevuj %gcc@11.4.0 arch=linux-ubuntu20.04-x86_64_v3 E4S ROCm External",
"job_spec_json": "adios2.json",
"ci_project_dir": "/builds/spack/spack"
}
---------
``tests``
---------
The directory ``tests`` is used to store output from running ``spack test <job spec>``. This may or may not have
data in it depending on the package that was built and the availability of tests.
-------------
``user_data``
-------------
The directory ``user_data`` is used to store everything else that shouldn't be copied to the ``reproduction`` directory.
Users may use this to store additional logs, metrics, or other types of files generated by the build job.
-------------------------------------
Using a custom spack in your pipeline
-------------------------------------

View File

@@ -1,4 +1,4 @@
sphinx==8.1.3
sphinx==8.2.0
sphinxcontrib-programoutput==0.18
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.2
@@ -7,7 +7,7 @@ docutils==0.21.2
pygments==2.19.1
urllib3==2.3.0
pytest==8.3.4
isort==5.13.2
black==24.10.0
flake8==7.1.1
isort==6.0.0
black==25.1.0
flake8==7.1.2
mypy==1.11.1

View File

@@ -668,7 +668,7 @@ def copy(src, dest, _permissions=False):
_permissions (bool): for internal use only
Raises:
IOError: if *src* does not match any files or directories
OSError: if *src* does not match any files or directories
ValueError: if *src* matches multiple files but *dest* is
not a directory
"""
@@ -679,7 +679,7 @@ def copy(src, dest, _permissions=False):
files = glob.glob(src)
if not files:
raise IOError("No such file or directory: '{0}'".format(src))
raise OSError("No such file or directory: '{0}'".format(src))
if len(files) > 1 and not os.path.isdir(dest):
raise ValueError(
"'{0}' matches multiple files but '{1}' is not a directory".format(src, dest)
@@ -710,7 +710,7 @@ def install(src, dest):
dest (str): the destination file or directory
Raises:
IOError: if *src* does not match any files or directories
OSError: if *src* does not match any files or directories
ValueError: if *src* matches multiple files but *dest* is
not a directory
"""
@@ -748,7 +748,7 @@ def copy_tree(
_permissions (bool): for internal use only
Raises:
IOError: if *src* does not match any files or directories
OSError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
if _permissions:
@@ -762,7 +762,7 @@ def copy_tree(
files = glob.glob(src)
if not files:
raise IOError("No such file or directory: '{0}'".format(src))
raise OSError("No such file or directory: '{0}'".format(src))
# For Windows hard-links and junctions, the source path must exist to make a symlink. Add
# all symlinks to this list while traversing the tree, then when finished, make all
@@ -843,7 +843,7 @@ def install_tree(src, dest, symlinks=True, ignore=None):
ignore (typing.Callable): function indicating which files to ignore
Raises:
IOError: if *src* does not match any files or directories
OSError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
copy_tree(src, dest, symlinks=symlinks, ignore=ignore, _permissions=True)

View File

@@ -41,6 +41,16 @@ def __init__(self, dst, src_a=None, src_b=None):
self.src_a = src_a
self.src_b = src_b
def __repr__(self) -> str:
return f"MergeConflict(dst={self.dst!r}, src_a={self.src_a!r}, src_b={self.src_b!r})"
def _samefile(a: str, b: str):
try:
return os.path.samefile(a, b)
except OSError:
return False
class SourceMergeVisitor(BaseDirectoryVisitor):
"""
@@ -50,9 +60,14 @@ class SourceMergeVisitor(BaseDirectoryVisitor):
- A list of merge conflicts in dst/
"""
def __init__(self, ignore: Optional[Callable[[str], bool]] = None):
def __init__(
self, ignore: Optional[Callable[[str], bool]] = None, normalize_paths: bool = False
):
self.ignore = ignore if ignore is not None else lambda f: False
# On case-insensitive filesystems, normalize paths to detect duplications
self.normalize_paths = normalize_paths
# When mapping <src root> to <dst root>/<projection>, we need to prepend the <projection>
# bit to the relative path in the destination dir.
self.projection: str = ""
@@ -71,10 +86,88 @@ def __init__(self, ignore: Optional[Callable[[str], bool]] = None):
# and can run mkdir in order.
self.directories: Dict[str, Tuple[str, str]] = {}
# If the visitor is configured to normalize paths, keep a map of
# normalized path to: original path, root directory + relative path
self._directories_normalized: Dict[str, Tuple[str, str, str]] = {}
# Files to link. Maps dst_rel to (src_root, src_rel). This is an ordered dict, where files
# are guaranteed to be grouped by src_root in the order they were visited.
self.files: Dict[str, Tuple[str, str]] = {}
# If the visitor is configured to normalize paths, keep a map of
# normalized path to: original path, root directory + relative path
self._files_normalized: Dict[str, Tuple[str, str, str]] = {}
def _in_directories(self, proj_rel_path: str) -> bool:
"""
Check if a path is already in the directory list
"""
if self.normalize_paths:
return proj_rel_path.lower() in self._directories_normalized
else:
return proj_rel_path in self.directories
def _directory(self, proj_rel_path: str) -> Tuple[str, str, str]:
"""
Get the directory that is mapped to a path
"""
if self.normalize_paths:
return self._directories_normalized[proj_rel_path.lower()]
else:
return (proj_rel_path, *self.directories[proj_rel_path])
def _del_directory(self, proj_rel_path: str):
"""
Remove a directory from the list of directories
"""
del self.directories[proj_rel_path]
if self.normalize_paths:
del self._directories_normalized[proj_rel_path.lower()]
def _add_directory(self, proj_rel_path: str, root: str, rel_path: str):
"""
Add a directory to the list of directories.
Also stores the normalized version for later lookups
"""
self.directories[proj_rel_path] = (root, rel_path)
if self.normalize_paths:
self._directories_normalized[proj_rel_path.lower()] = (proj_rel_path, root, rel_path)
def _in_files(self, proj_rel_path: str) -> bool:
"""
Check if a path is already in the files list
"""
if self.normalize_paths:
return proj_rel_path.lower() in self._files_normalized
else:
return proj_rel_path in self.files
def _file(self, proj_rel_path: str) -> Tuple[str, str, str]:
"""
Get the file that is mapped to a path
"""
if self.normalize_paths:
return self._files_normalized[proj_rel_path.lower()]
else:
return (proj_rel_path, *self.files[proj_rel_path])
def _del_file(self, proj_rel_path: str):
"""
Remove a file from the list of files
"""
del self.files[proj_rel_path]
if self.normalize_paths:
del self._files_normalized[proj_rel_path.lower()]
def _add_file(self, proj_rel_path: str, root: str, rel_path: str):
"""
Add a file to the list of files
Also stores the normalized version for later lookups
"""
self.files[proj_rel_path] = (root, rel_path)
if self.normalize_paths:
self._files_normalized[proj_rel_path.lower()] = (proj_rel_path, root, rel_path)
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""
Register a directory if dst / rel_path is not blocked by a file or ignored.
@@ -84,23 +177,28 @@ def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
if self.ignore(rel_path):
# Don't recurse when dir is ignored.
return False
elif proj_rel_path in self.files:
# Can't create a dir where a file is.
src_a_root, src_a_relpath = self.files[proj_rel_path]
self.fatal_conflicts.append(
MergeConflict(
dst=proj_rel_path,
src_a=os.path.join(src_a_root, src_a_relpath),
src_b=os.path.join(root, rel_path),
elif self._in_files(proj_rel_path):
# A file-dir conflict is fatal except if they're the same file (symlinked dir).
src_a = os.path.join(*self._file(proj_rel_path))
src_b = os.path.join(root, rel_path)
if not _samefile(src_a, src_b):
self.fatal_conflicts.append(
MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
)
)
return False
elif proj_rel_path in self.directories:
return False
# Remove the link in favor of the dir.
existing_proj_rel_path, _, _ = self._file(proj_rel_path)
self._del_file(existing_proj_rel_path)
self._add_directory(proj_rel_path, root, rel_path)
return True
elif self._in_directories(proj_rel_path):
# No new directory, carry on.
return True
else:
# Register new directory.
self.directories[proj_rel_path] = (root, rel_path)
self._add_directory(proj_rel_path, root, rel_path)
return True
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
@@ -132,7 +230,7 @@ def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bo
if handle_as_dir:
return self.before_visit_dir(root, rel_path, depth)
self.visit_file(root, rel_path, depth)
self.visit_file(root, rel_path, depth, symlink=True)
return False
def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = False) -> None:
@@ -140,30 +238,23 @@ def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = Fa
if self.ignore(rel_path):
pass
elif proj_rel_path in self.directories:
# Can't create a file where a dir is; fatal error
self.fatal_conflicts.append(
MergeConflict(
dst=proj_rel_path,
src_a=os.path.join(*self.directories[proj_rel_path]),
src_b=os.path.join(root, rel_path),
elif self._in_directories(proj_rel_path):
# Can't create a file where a dir is, unless they are the same file (symlinked dir),
# in which case we simply drop the symlink in favor of the actual dir.
src_a = os.path.join(*self._directory(proj_rel_path))
src_b = os.path.join(root, rel_path)
if not symlink or not _samefile(src_a, src_b):
self.fatal_conflicts.append(
MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
)
)
elif proj_rel_path in self.files:
elif self._in_files(proj_rel_path):
# When two files project to the same path, they conflict iff they are distinct.
# If they are the same (i.e. one links to the other), register regular files rather
# than symlinks. The reason is that in copy-type views, we need a copy of the actual
# file, not the symlink.
src_a = os.path.join(*self.files[proj_rel_path])
src_a = os.path.join(*self._file(proj_rel_path))
src_b = os.path.join(root, rel_path)
try:
samefile = os.path.samefile(src_a, src_b)
except OSError:
samefile = False
if not samefile:
if not _samefile(src_a, src_b):
# Distinct files produce a conflict.
self.file_conflicts.append(
MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
@@ -173,12 +264,12 @@ def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = Fa
if not symlink:
# Remove the link in favor of the actual file. The del is necessary to maintain the
# order of the files dict, which is grouped by root.
del self.files[proj_rel_path]
self.files[proj_rel_path] = (root, rel_path)
existing_proj_rel_path, _, _ = self._file(proj_rel_path)
self._del_file(existing_proj_rel_path)
self._add_file(proj_rel_path, root, rel_path)
else:
# Otherwise register this file to be linked.
self.files[proj_rel_path] = (root, rel_path)
self._add_file(proj_rel_path, root, rel_path)
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
# Treat symlinked files as ordinary files (without "dereferencing")
@@ -197,11 +288,11 @@ def set_projection(self, projection: str) -> None:
path = ""
for part in self.projection.split(os.sep):
path = os.path.join(path, part)
if path not in self.files:
self.directories[path] = ("<projection>", path)
if not self._in_files(path):
self._add_directory(path, "<projection>", path)
else:
# Can't create a dir where a file is.
src_a_root, src_a_relpath = self.files[path]
_, src_a_root, src_a_relpath = self._file(path)
self.fatal_conflicts.append(
MergeConflict(
dst=path,
@@ -227,8 +318,8 @@ def __init__(self, source_merge_visitor: SourceMergeVisitor):
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# If destination dir is a file in a src dir, add a conflict,
# and don't traverse deeper
if rel_path in self.src.files:
src_a_root, src_a_relpath = self.src.files[rel_path]
if self.src._in_files(rel_path):
_, src_a_root, src_a_relpath = self.src._file(rel_path)
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
@@ -238,8 +329,9 @@ def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# If destination dir was also a src dir, remove the mkdir
# action, and traverse deeper.
if rel_path in self.src.directories:
del self.src.directories[rel_path]
if self.src._in_directories(rel_path):
existing_proj_rel_path, _, _ = self.src._directory(rel_path)
self.src._del_directory(existing_proj_rel_path)
return True
# If the destination dir does not appear in the src dir,
@@ -252,38 +344,24 @@ def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bo
be seen as files; we should not accidentally merge
source dir with a symlinked dest dir.
"""
# Always conflict
if rel_path in self.src.directories:
src_a_root, src_a_relpath = self.src.directories[rel_path]
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
)
)
if rel_path in self.src.files:
src_a_root, src_a_relpath = self.src.files[rel_path]
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
)
)
self.visit_file(root, rel_path, depth)
# Never descend into symlinked target dirs.
return False
def visit_file(self, root: str, rel_path: str, depth: int) -> None:
# Can't merge a file if target already exists
if rel_path in self.src.directories:
src_a_root, src_a_relpath = self.src.directories[rel_path]
if self.src._in_directories(rel_path):
_, src_a_root, src_a_relpath = self.src._directory(rel_path)
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
)
)
elif rel_path in self.src.files:
src_a_root, src_a_relpath = self.src.files[rel_path]
elif self.src._in_files(rel_path):
_, src_a_root, src_a_relpath = self.src._file(rel_path)
self.src.fatal_conflicts.append(
MergeConflict(
rel_path, os.path.join(src_a_root, src_a_relpath), os.path.join(root, rel_path)
@@ -308,7 +386,7 @@ class LinkTree:
def __init__(self, source_root):
if not os.path.exists(source_root):
raise IOError("No such file or directory: '%s'", source_root)
raise OSError("No such file or directory: '%s'", source_root)
self._root = source_root
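The bookkeeping above can be distilled into a small standalone sketch: keep the original destination spelling, but key lookups on a normalized (lowercased) form when the filesystem is case-insensitive. Everything below is illustrative; ``PathIndex`` is not a Spack class.

```python
# Toy sketch of the normalize_paths idea: two source paths that differ only
# in case collide on case-insensitive filesystems, so conflict detection
# must key on a normalized form while preserving the original spelling.
from typing import Dict, Tuple

class PathIndex:
    def __init__(self, normalize: bool = False):
        self.normalize = normalize
        self.entries: Dict[str, Tuple[str, str]] = {}  # dst -> (root, rel)
        self._normalized: Dict[str, str] = {}          # lower(dst) -> dst

    def _key(self, dst: str) -> str:
        return dst.lower() if self.normalize else dst

    def add(self, dst: str, root: str, rel: str) -> None:
        key = self._key(dst)
        existing = self._normalized.get(key) if self.normalize else (
            dst if dst in self.entries else None
        )
        if existing is not None:
            raise ValueError(f"merge conflict: {dst!r} collides with {existing!r}")
        self.entries[dst] = (root, rel)
        if self.normalize:
            self._normalized[key] = dst

idx = PathIndex(normalize=True)
idx.add("bin/Tool", "/pkg-a", "bin/Tool")
try:
    idx.add("bin/tool", "/pkg-b", "bin/tool")
except ValueError as exc:
    print(exc)  # merge conflict: 'bin/tool' collides with 'bin/Tool'
```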

View File

@@ -269,7 +269,7 @@ def __init__(
@staticmethod
def _poll_interval_generator(
_wait_times: Optional[Tuple[float, float, float]] = None
_wait_times: Optional[Tuple[float, float, float]] = None,
) -> Generator[float, None, None]:
"""This implements a backoff scheme for polling a contended resource
by suggesting a succession of wait times between polls.
@@ -391,7 +391,7 @@ def _poll_lock(self, op: int) -> bool:
return True
except IOError as e:
except OSError as e:
# EAGAIN and EACCES == locked by another process (so try again)
if e.errno not in (errno.EAGAIN, errno.EACCES):
raise
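The polling generator being reformatted here implements a staged backoff. A hedged sketch of the idea (the wait times and phase lengths below are invented, not Spack's actual defaults):

```python
# Staged backoff for polling a contended lock: poll fast at first, then back
# off for long-held locks. Phase lengths and times here are illustrative.
import itertools
from typing import Generator, Tuple

def poll_intervals(
    wait_times: Tuple[float, float, float] = (0.1, 1.0, 5.0),
) -> Generator[float, None, None]:
    short, medium, long_ = wait_times
    yield from itertools.repeat(short, 10)   # quick initial polls
    yield from itertools.repeat(medium, 30)  # relaxed middle phase
    yield from itertools.repeat(long_)       # slow polling, indefinitely

gen = poll_intervals()
print([next(gen) for _ in range(12)])  # ten 0.1s waits, then 1.0s waits
```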

View File

@@ -2,8 +2,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Utility classes for logging the output of blocks of code.
"""
"""Utility classes for logging the output of blocks of code."""
import atexit
import ctypes
import errno
@@ -344,26 +343,6 @@ def close(self):
self.file.close()
@contextmanager
def replace_environment(env):
"""Replace the current environment (`os.environ`) with `env`.
If `env` is empty (or None), this unsets all current environment
variables.
"""
env = env or {}
old_env = os.environ.copy()
try:
os.environ.clear()
for name, val in env.items():
os.environ[name] = val
yield
finally:
os.environ.clear()
for name, val in old_env.items():
os.environ[name] = val
def log_output(*args, **kwargs):
"""Context manager that logs its output to a file.
@@ -447,7 +426,6 @@ def __init__(
self.echo = echo
self.debug = debug
self.buffer = buffer
self.env = env # the environment to use for _writer_daemon
self.filter_fn = filter_fn
self._active = False # used to prevent re-entry
@@ -519,21 +497,20 @@ def __enter__(self):
# just don't forward input if this fails
pass
with replace_environment(self.env):
self.process = multiprocessing.Process(
target=_writer_daemon,
args=(
input_fd,
read_fd,
self.write_fd,
self.echo,
self.log_file,
child_pipe,
self.filter_fn,
),
)
self.process.daemon = True # must set before start()
self.process.start()
self.process = multiprocessing.Process(
target=_writer_daemon,
args=(
input_fd,
read_fd,
self.write_fd,
self.echo,
self.log_file,
child_pipe,
self.filter_fn,
),
)
self.process.daemon = True # must set before start()
self.process.start()
finally:
if input_fd:
@@ -729,10 +706,7 @@ class winlog:
Does not support the use of 'v' toggling as nixlog does.
"""
def __init__(
self, file_like=None, echo=False, debug=0, buffer=False, env=None, filter_fn=None
):
self.env = env
def __init__(self, file_like=None, echo=False, debug=0, buffer=False, filter_fn=None):
self.debug = debug
self.echo = echo
self.logfile = file_like
@@ -789,11 +763,10 @@ def background_reader(reader, echo_writer, _kill):
reader.close()
self._active = True
with replace_environment(self.env):
self._thread = Thread(
target=background_reader, args=(self.reader, self.echo_writer, self._kill)
)
self._thread.start()
self._thread = Thread(
target=background_reader, args=(self.reader, self.echo_writer, self._kill)
)
self._thread.start()
return self
def __exit__(self, exc_type, exc_val, exc_tb):
@@ -918,7 +891,7 @@ def _writer_daemon(
try:
if stdin_file.read(1) == "v":
echo = not echo
except IOError as e:
except OSError as e:
# If SIGTTIN is ignored, the system gives EIO
# to let the caller know the read failed b/c it
# was in the bg. Ignore that too.
@@ -1013,7 +986,7 @@ def wrapped(*args, **kwargs):
while True:
try:
return function(*args, **kwargs)
except IOError as e:
except OSError as e:
if e.errno == errno.EINTR:
continue
raise

View File

@@ -10,7 +10,7 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.24.0.dev0"
__version__ = "1.0.0.dev0"
spack_version = __version__

View File

@@ -1010,7 +1010,7 @@ def _issues_in_depends_on_directive(pkgs, error_cls):
for dep_name, dep in deps_by_name.items():
def check_virtual_with_variants(spec, msg):
if not spec.virtual or not spec.variants:
if not spack.repo.PATH.is_virtual(spec.name) or not spec.variants:
return
error = error_cls(
f"{pkg_name}: {msg}",

View File

@@ -923,7 +923,7 @@ class FileTypes:
UNKNOWN = 2
NOT_ISO8859_1_TEXT = re.compile(b"[\x00\x7F-\x9F]")
NOT_ISO8859_1_TEXT = re.compile(b"[\x00\x7f-\x9f]")
def file_type(f: IO[bytes]) -> int:
@@ -2529,10 +2529,10 @@ def install_root_node(
allow_missing: when true, allows installing a node with missing dependencies
"""
# Early termination
if spec.external or spec.virtual:
warnings.warn("Skipping external or virtual package {0}".format(spec.format()))
if spec.external or not spec.concrete:
warnings.warn("Skipping external or abstract spec {0}".format(spec.format()))
return
elif spec.concrete and spec.installed and not force:
elif spec.installed and not force:
warnings.warn("Package for spec {0} already installed.".format(spec.format()))
return
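The regex at the top of this hunk (rewritten only to use lowercase hex escapes) drives a binary-vs-text heuristic: NUL and the 0x7F-0x9F control range never appear in ISO-8859-1 text. A minimal sketch of how such a check can be used (the helper name is made up):

```python
# Bytes NUL or in the 0x7F-0x9F range never occur in ISO-8859-1 text, so a
# single match is enough to classify data as binary; looks_binary() is a
# hypothetical helper, not Spack's file_type() implementation.
import re

NOT_ISO8859_1_TEXT = re.compile(b"[\x00\x7f-\x9f]")

def looks_binary(data: bytes) -> bool:
    return NOT_ISO8859_1_TEXT.search(data) is not None

print(looks_binary(b"plain ISO-8859-1 text\n"))  # False
print(looks_binary(b"\x7fELF\x02\x01\x01"))      # True: ELF magic has 0x7F
```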

View File

@@ -27,9 +27,9 @@
class ClingoBootstrapConcretizer:
def __init__(self, configuration):
self.host_platform = spack.platforms.host()
self.host_os = self.host_platform.operating_system("frontend")
self.host_os = self.host_platform.default_operating_system()
self.host_target = archspec.cpu.host().family
self.host_architecture = spack.spec.ArchSpec.frontend_arch()
self.host_architecture = spack.spec.ArchSpec.default_arch()
self.host_architecture.target = str(self.host_target)
self.host_compiler = self._valid_compiler_or_raise()
self.host_python = self.python_external_spec()

View File

@@ -141,7 +141,7 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
arch = spack.spec.ArchSpec.frontend_arch()
arch = spack.spec.ArchSpec.default_arch()
if not spack.compilers.compilers_for_arch(arch):
spack.compilers.find_compilers()

View File

@@ -292,7 +292,12 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
# Install the spec that should make the module importable
with spack.config.override(self.mirror_scope):
PackageInstaller([concrete_spec.package], fail_fast=True).install()
PackageInstaller(
[concrete_spec.package],
fail_fast=True,
package_use_cache=False,
dependencies_use_cache=False,
).install()
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
@@ -362,6 +367,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
if create_bootstrapper(current_config).try_import(module, abstract_spec):
return

View File

@@ -12,6 +12,7 @@
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import depends_on
from .cmake import CMakeBuilder, CMakePackage
@@ -371,6 +372,10 @@ class CachedCMakePackage(CMakePackage):
CMakeBuilder = CachedCMakeBuilder
# These dependencies are assumed in the builder
depends_on("c", type="build")
depends_on("cxx", type="build")
def flag_handler(self, name, flags):
if name in ("cflags", "cxxflags", "cppflags", "fflags"):
return None, None, None # handled in the cmake cache

View File

@@ -70,10 +70,16 @@ def build_directory(self):
"""Return the directory containing the main Cargo.toml."""
return self.pkg.stage.source_path
@property
def std_build_args(self):
"""Standard arguments for ``cargo build`` provided as a property for
convenience of package writers."""
return ["-j", str(self.pkg.module.make_jobs)]
@property
def build_args(self):
"""Arguments for ``cargo build``."""
return ["-j", str(self.pkg.module.make_jobs)]
return []
@property
def check_args(self):
@@ -88,7 +94,9 @@ def build(
) -> None:
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
pkg.module.cargo(
"install", "--root", "out", "--path", ".", *self.std_build_args, *self.build_args
)
def install(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
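With ``std_build_args`` now carrying the default parallelism flag, packages can contribute extra cargo flags without re-stating ``-j``. A hedged sketch (the crate and flags are hypothetical, and whether the override lives on the package or a custom builder may differ):

```python
# Hypothetical CargoPackage: build_args adds package-specific cargo flags,
# while the builder's std_build_args keeps supplying "-j <jobs>" by default.
from spack.package import CargoPackage

class RustTool(CargoPackage):
    """Toy crate used only to illustrate the std_build_args/build_args split."""

    @property
    def build_args(self):
        return ["--locked"]  # extra flags; parallelism comes from std_build_args
```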

View File

@@ -11,6 +11,7 @@
from typing import Any, List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util import tty
from llnl.util.lang import stable_partition
import spack.builder
@@ -458,11 +459,23 @@ def cmake(
) -> None:
"""Runs ``cmake`` in the build directory"""
# skip cmake phase if it is an incremental develop build
if spec.is_develop and os.path.isfile(
os.path.join(self.build_directory, "CMakeCache.txt")
):
return
if spec.is_develop:
# skip cmake phase if it is an incremental develop build
# Determine the files that will re-run CMake that are generated from a successful
# configure step based on state
primary_generator = _extract_primary_generator(self.generator)
configure_artifact = "Makefile"
if primary_generator == "Ninja":
configure_artifact = "ninja.build"
if os.path.isfile(os.path.join(self.build_directory, configure_artifact)):
tty.msg(
"Incremental build criteria satisfied."
"Skipping CMake configure step. To force configuration run"
f" `spack clean {pkg.name}`"
)
return
options = self.std_cmake_args
options += self.cmake_args()
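A distilled sketch of the new incremental check (the artifact filenames mirror the diff above and are assumptions about what a successful configure leaves behind):

```python
# Skip re-running cmake in develop builds when the chosen generator's
# configure artifact already exists; names follow the diff shown above.
import os

def should_skip_configure(build_directory: str, primary_generator: str) -> bool:
    configure_artifact = "ninja.build" if primary_generator == "Ninja" else "Makefile"
    return os.path.isfile(os.path.join(build_directory, configure_artifact))

print(should_skip_configure("/tmp/example-build", "Unix Makefiles"))
```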

View File

@@ -15,7 +15,7 @@ class CudaPackage(PackageBase):
"""Auxiliary class which contains CUDA variant, dependencies and conflicts
and is meant to unify and facilitate its usage.
Maintainers: ax3l, Rombur, davidbeckingsale
Maintainers: ax3l, Rombur, davidbeckingsale, pauleonix
"""
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#gpu-feature-list
@@ -47,6 +47,12 @@ class CudaPackage(PackageBase):
"89",
"90",
"90a",
"100",
"100a",
"101",
"101a",
"120",
"120a",
)
# FIXME: keep cuda and cuda_arch separate to make usage easier until
@@ -99,39 +105,56 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
# CUDA version vs Architecture
# https://en.wikipedia.org/wiki/CUDA#GPUs_supported
# https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#deprecated-features
# Tesla support:
depends_on("cuda@:6.0", when="cuda_arch=10")
depends_on("cuda@:6.5", when="cuda_arch=11")
depends_on("cuda@2.1:6.5", when="cuda_arch=12")
depends_on("cuda@2.1:6.5", when="cuda_arch=13")
# Fermi support:
depends_on("cuda@3.0:8.0", when="cuda_arch=20")
depends_on("cuda@3.2:8.0", when="cuda_arch=21")
# Kepler support:
depends_on("cuda@5.0:10.2", when="cuda_arch=30")
depends_on("cuda@5.0:10.2", when="cuda_arch=32")
depends_on("cuda@5.0:11.8", when="cuda_arch=35")
depends_on("cuda@6.5:11.8", when="cuda_arch=37")
# Maxwell support:
depends_on("cuda@6.0:", when="cuda_arch=50")
depends_on("cuda@6.5:", when="cuda_arch=52")
depends_on("cuda@6.5:", when="cuda_arch=53")
# Pascal support:
depends_on("cuda@8.0:", when="cuda_arch=60")
depends_on("cuda@8.0:", when="cuda_arch=61")
depends_on("cuda@8.0:", when="cuda_arch=62")
# Volta support:
depends_on("cuda@9.0:", when="cuda_arch=70")
# Turing support:
depends_on("cuda@9.0:", when="cuda_arch=72")
depends_on("cuda@10.0:", when="cuda_arch=75")
# Ampere support:
depends_on("cuda@11.0:", when="cuda_arch=80")
depends_on("cuda@11.1:", when="cuda_arch=86")
depends_on("cuda@11.4:", when="cuda_arch=87")
# Ada support:
depends_on("cuda@11.8:", when="cuda_arch=89")
# Hopper support:
depends_on("cuda@12.0:", when="cuda_arch=90")
depends_on("cuda@12.0:", when="cuda_arch=90a")
# Blackwell support:
depends_on("cuda@12.8:", when="cuda_arch=100")
depends_on("cuda@12.8:", when="cuda_arch=100a")
depends_on("cuda@12.8:", when="cuda_arch=101")
depends_on("cuda@12.8:", when="cuda_arch=101a")
depends_on("cuda@12.8:", when="cuda_arch=120")
depends_on("cuda@12.8:", when="cuda_arch=120a")
# From the NVIDIA install guide we know of conflicts for particular
# platforms (linux, darwin), architectures (x86, powerpc) and compilers
# (gcc, clang). We don't restrict %gcc and %clang conflicts to
@@ -163,6 +186,7 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.6")
conflicts("%gcc@15:", when="+cuda ^cuda@:12.8")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
@@ -171,6 +195,7 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.5")
conflicts("%clang@19:", when="+cuda ^cuda@:12.6")
conflicts("%clang@20:", when="+cuda ^cuda@:12.8")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -48,6 +48,9 @@ class MesonPackage(spack.package_base.PackageBase):
variant("strip", default=False, description="Strip targets on install")
depends_on("meson", type="build")
depends_on("ninja", type="build")
# Meson uses pkg-config for dependency detection, and this dependency is
# often overlooked by packages that use meson as a build system.
depends_on("pkgconfig", type="build")
# Python detection in meson requires distutils to be importable, but distutils no longer
# exists in Python 3.12. In Spack, we can't use setuptools as distutils replacement,
# because the distutils-precedence.pth startup file that setuptools ships with is not run

View File

@@ -142,7 +142,7 @@ def setup_run_environment(self, env):
$ source {prefix}/{component}/{version}/env/vars.sh
"""
# Only if environment modifications are desired (default is +envmods)
if "~envmods" not in self.spec:
if "+envmods" in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
self.component_prefix.env.join("vars.sh"), *self.env_script_args

View File

@@ -264,16 +264,17 @@ def update_external_dependencies(self, extendee_spec=None):
# Ensure architecture information is present
if not python.architecture:
host_platform = spack.platforms.host()
host_os = host_platform.operating_system("default_os")
host_target = host_platform.target("default_target")
host_os = host_platform.default_operating_system()
host_target = host_platform.default_target()
python.architecture = spack.spec.ArchSpec(
(str(host_platform), str(host_os), str(host_target))
)
else:
if not python.architecture.platform:
python.architecture.platform = spack.platforms.host()
platform = spack.platforms.by_name(python.architecture.platform)
if not python.architecture.os:
python.architecture.os = "default_os"
python.architecture.os = platform.default_operating_system()
if not python.architecture.target:
python.architecture.target = archspec.cpu.host().family.name

View File

@@ -14,8 +14,9 @@
import zipfile
from collections import namedtuple
from typing import Callable, Dict, List, Set
from urllib.request import HTTPHandler, Request, build_opener
from urllib.request import Request
import llnl.path
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.color import cescape, colorize
@@ -62,6 +63,8 @@
PushResult = namedtuple("PushResult", "success url")
urlopen = web_util.urlopen # alias for mocking in tests
def get_change_revisions():
"""If this is a git repo get the revisions to use when checking
@@ -81,6 +84,9 @@ def get_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
whether or not the stack was changed. Returns True if the environment
manifest changed between the provided revisions (or additionally if the
`.gitlab-ci.yml` file itself changed). Returns False otherwise."""
# git always returns posix paths, normalize input to be compatible
# with that
env_path = llnl.path.convert_to_posix_path(env_path)
git = spack.util.git.git()
if git:
with fs.working_dir(spack.paths.prefix):
@@ -610,7 +616,7 @@ def copy_test_logs_to_artifacts(test_stage, job_test_dir):
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
def download_and_extract_artifacts(url, work_dir):
def download_and_extract_artifacts(url, work_dir) -> str:
"""Look for gitlab artifacts.zip at the given url, and attempt to download
and extract the contents into the given work_dir
@@ -618,6 +624,10 @@ def download_and_extract_artifacts(url, work_dir):
url (str): Complete url to artifacts.zip file
work_dir (str): Path to destination where artifacts should be extracted
Output:
Artifacts root path relative to the archive root
"""
tty.msg(f"Fetching artifacts from: {url}")
@@ -627,31 +637,33 @@ def download_and_extract_artifacts(url, work_dir):
if token:
headers["PRIVATE-TOKEN"] = token
opener = build_opener(HTTPHandler)
request = Request(url, headers=headers)
request.get_method = lambda: "GET"
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:
msg = f"Error response code ({response_code}) in reproduce_ci_job"
raise SpackError(msg)
request = Request(url, headers=headers, method="GET")
artifacts_zip_path = os.path.join(work_dir, "artifacts.zip")
os.makedirs(work_dir, exist_ok=True)
if not os.path.exists(work_dir):
os.makedirs(work_dir)
try:
response = urlopen(request, timeout=SPACK_CDASH_TIMEOUT)
with open(artifacts_zip_path, "wb") as out_file:
shutil.copyfileobj(response, out_file)
with open(artifacts_zip_path, "wb") as out_file:
shutil.copyfileobj(response, out_file)
with zipfile.ZipFile(artifacts_zip_path) as zip_file:
zip_file.extractall(work_dir)
# Get the artifact root
artifact_root = ""
for f in zip_file.filelist:
if "spack.lock" in f.filename:
artifact_root = os.path.dirname(os.path.dirname(f.filename))
break
except OSError as e:
raise SpackError(f"Error fetching artifacts: {e}")
finally:
try:
os.remove(artifacts_zip_path)
except FileNotFoundError:
# If the file doesn't exist we are already raising
pass
zip_file = zipfile.ZipFile(artifacts_zip_path)
zip_file.extractall(work_dir)
zip_file.close()
os.remove(artifacts_zip_path)
return artifact_root
def get_spack_info():
@@ -765,7 +777,7 @@ def setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None):
return True
def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime, use_local_head):
"""Given a url to gitlab artifacts.zip from a failed 'spack ci rebuild' job,
attempt to setup an environment in which the failure can be reproduced
locally. This entails the following:
@@ -779,8 +791,11 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
commands to run to reproduce the build once inside the container.
"""
work_dir = os.path.realpath(work_dir)
if os.path.exists(work_dir) and os.listdir(work_dir):
raise SpackError(f"Cannot run reproducer in non-emptry working dir:\n {work_dir}")
platform_script_ext = "ps1" if IS_WINDOWS else "sh"
download_and_extract_artifacts(url, work_dir)
artifact_root = download_and_extract_artifacts(url, work_dir)
gpg_path = None
if gpg_url:
@@ -842,6 +857,9 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
with open(repro_file, encoding="utf-8") as fd:
repro_details = json.load(fd)
spec_file = fs.find(work_dir, repro_details["job_spec_json"])[0]
reproducer_spec = spack.spec.Spec.from_specfile(spec_file)
repro_dir = os.path.dirname(repro_file)
rel_repro_dir = repro_dir.replace(work_dir, "").lstrip(os.path.sep)
@@ -902,17 +920,20 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
commit_regex = re.compile(r"commit\s+([^\s]+)")
merge_commit_regex = re.compile(r"Merge\s+([^\s]+)\s+into\s+([^\s]+)")
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
if use_local_head:
commit_1 = "HEAD"
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
if m:
commit_1 = m.group(1)
setup_result = False
if commit_1:
@@ -987,6 +1008,8 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
"entrypoint", entrypoint_script, work_dir, run=False, exit_on_failure=False
)
# Attempt to create a unique name for the reproducer container
container_suffix = "_" + reproducer_spec.dag_hash() if reproducer_spec else ""
docker_command = [
runtime,
"run",
@@ -994,14 +1017,14 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
"-t",
"--rm",
"--name",
"spack_reproducer",
f"spack_reproducer{container_suffix}",
"-v",
":".join([work_dir, mounted_workdir, "Z"]),
"-v",
":".join(
[
os.path.join(work_dir, "jobs_scratch_dir"),
os.path.join(mount_as_dir, "jobs_scratch_dir"),
os.path.join(work_dir, artifact_root),
os.path.join(mount_as_dir, artifact_root),
"Z",
]
),
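
For reference, the urllib pattern this file moves to can be sketched in isolation: `Request` has accepted an explicit `method=` argument since Python 3.3, which replaces the old trick of overriding `get_method`, and the response can be streamed straight to disk with `shutil.copyfileobj`. The URL and token below are hypothetical placeholders.

```python
import shutil
from urllib.request import Request, urlopen

# Hypothetical artifacts URL and token, for illustration only.
url = "https://gitlab.example.com/api/v4/projects/1/jobs/42/artifacts"
headers = {"PRIVATE-TOKEN": "<token>"}

# method="GET" replaces the old `request.get_method = lambda: "GET"` override.
request = Request(url, headers=headers, method="GET")
with urlopen(request, timeout=60) as response, open("artifacts.zip", "wb") as out_file:
    shutil.copyfileobj(response, out_file)
```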

View File

@@ -3,6 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import warnings
import archspec.cpu
@@ -51,10 +52,10 @@ def setup_parser(subparser):
"-t", "--target", action="store_true", default=False, help="print only the target"
)
parts2.add_argument(
"-f", "--frontend", action="store_true", default=False, help="print frontend"
"-f", "--frontend", action="store_true", default=False, help="print frontend (DEPRECATED)"
)
parts2.add_argument(
"-b", "--backend", action="store_true", default=False, help="print backend"
"-b", "--backend", action="store_true", default=False, help="print backend (DEPRECATED)"
)
@@ -98,15 +99,14 @@ def arch(parser, args):
display_targets(archspec.cpu.TARGETS)
return
os_args, target_args = "default_os", "default_target"
if args.frontend:
os_args, target_args = "frontend", "frontend"
warnings.warn("the argument --frontend is deprecated, and will be removed in Spack v1.0")
elif args.backend:
os_args, target_args = "backend", "backend"
warnings.warn("the argument --backend is deprecated, and will be removed in Spack v1.0")
host_platform = spack.platforms.host()
host_os = host_platform.operating_system(os_args)
host_target = host_platform.target(target_args)
host_os = host_platform.default_operating_system()
host_target = host_platform.default_target()
if args.family:
host_target = host_target.family
elif args.generic:

View File

@@ -4,7 +4,7 @@
import re
import sys
from typing import Dict, Optional
from typing import Dict, Optional, Tuple
import llnl.string
import llnl.util.lang
@@ -181,7 +181,11 @@ def checksum(parser, args):
print()
if args.add_to_package:
add_versions_to_package(pkg, version_lines, args.batch)
path = spack.repo.PATH.filename_for_package_name(pkg.name)
num_versions_added = add_versions_to_pkg(path, version_lines)
tty.msg(f"Added {num_versions_added} new versions to {pkg.name} in {path}")
if not args.batch and sys.stdin.isatty():
editor(path)
def print_checksum_status(pkg: PackageBase, version_hashes: dict):
@@ -227,20 +231,9 @@ def print_checksum_status(pkg: PackageBase, version_hashes: dict):
tty.die("Invalid checksums found.")
def add_versions_to_package(pkg: PackageBase, version_lines: str, is_batch: bool):
"""
Add checksummed versions to a package's instructions and open a user's
editor so they may double-check the work of the function.
Args:
pkg (spack.package_base.PackageBase): A package class for a given package in Spack.
version_lines (str): A string of rendered version lines.
"""
# Get filename and path for package
filename = spack.repo.PATH.filename_for_package_name(pkg.name)
def _update_version_statements(package_src: str, version_lines: str) -> Tuple[int, str]:
"""Returns a tuple of number of versions added and the package's modified contents."""
num_versions_added = 0
version_statement_re = re.compile(r"([\t ]+version\([^\)]*\))")
version_re = re.compile(r'[\t ]+version\(\s*"([^"]+)"[^\)]*\)')
@@ -252,33 +245,34 @@ def add_versions_to_package(pkg: PackageBase, version_lines: str, is_batch: bool
if match:
new_versions.append((Version(match.group(1)), ver_line))
with open(filename, "r+", encoding="utf-8") as f:
contents = f.read()
split_contents = version_statement_re.split(contents)
split_contents = version_statement_re.split(package_src)
for i, subsection in enumerate(split_contents):
# If there are no more versions to add we should exit
if len(new_versions) <= 0:
break
for i, subsection in enumerate(split_contents):
# If there are no more versions to add we should exit
if len(new_versions) <= 0:
break
# Check if the section contains a version
contents_version = version_re.match(subsection)
if contents_version is not None:
parsed_version = Version(contents_version.group(1))
# Check if the section contains a version
contents_version = version_re.match(subsection)
if contents_version is not None:
parsed_version = Version(contents_version.group(1))
if parsed_version < new_versions[0][0]:
split_contents[i:i] = [new_versions.pop(0)[1], " # FIXME", "\n"]
num_versions_added += 1
if parsed_version < new_versions[0][0]:
split_contents[i:i] = [new_versions.pop(0)[1], " # FIXME", "\n"]
num_versions_added += 1
elif parsed_version == new_versions[0][0]:
new_versions.pop(0)
elif parsed_version == new_versions[0][0]:
new_versions.pop(0)
# Seek back to the start of the file so we can rewrite the file contents.
f.seek(0)
f.writelines("".join(split_contents))
return num_versions_added, "".join(split_contents)
tty.msg(f"Added {num_versions_added} new versions to {pkg.name}")
tty.msg(f"Open {filename} to review the additions.")
if sys.stdout.isatty() and not is_batch:
editor(filename)
def add_versions_to_pkg(path: str, version_lines: str) -> int:
"""Add new versions to a package.py file. Returns the number of versions added."""
with open(path, "r", encoding="utf-8") as f:
package_src = f.read()
num_versions_added, package_src = _update_version_statements(package_src, version_lines)
if num_versions_added > 0:
with open(path, "w", encoding="utf-8") as f:
f.write(package_src)
return num_versions_added
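
The splitting trick used by `_update_version_statements` can be shown on its own: `re.split` with a capturing group keeps each matched `version(...)` statement in the result list, so new lines can be spliced in just before the first older version. A minimal sketch with made-up package source:

```python
import re

package_src = '''class Demo:
    version("1.2", sha256="aaa")
    version("1.0", sha256="bbb")
'''

version_statement_re = re.compile(r"([\t ]+version\([^\)]*\))")
parts = version_statement_re.split(package_src)
# The capturing group keeps the version statements as list items, so an
# insertion at the right index lands just before an existing version line.
parts[1:1] = ['    version("1.3", sha256="ccc")', "  # FIXME", "\n"]
print("".join(parts))
```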

View File

@@ -176,6 +176,11 @@ def setup_parser(subparser):
reproduce.add_argument(
"-s", "--autostart", help="Run docker reproducer automatically", action="store_true"
)
reproduce.add_argument(
"--use-local-head",
help="Use the HEAD of the local Spack instead of reproducing a commit",
action="store_true",
)
gpg_group = reproduce.add_mutually_exclusive_group(required=False)
gpg_group.add_argument(
"--gpg-file", help="Path to public GPG key for validating binary cache installs"
@@ -608,7 +613,12 @@ def ci_reproduce(args):
gpg_key_url = None
return spack_ci.reproduce_ci_job(
args.job_url, args.working_dir, args.autostart, gpg_key_url, args.runtime
args.job_url,
args.working_dir,
args.autostart,
gpg_key_url,
args.runtime,
args.use_local_head,
)

View File

@@ -2,23 +2,11 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import platform
import re
import sys
from datetime import datetime
from glob import glob
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
import spack
import spack.paths
import spack.platforms
import spack.spec
import spack.store
import spack.util.git
from spack.util.executable import which
description = "debugging commands for troubleshooting Spack"
section = "developer"
@@ -27,67 +15,13 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="debug_command")
sp.add_parser("create-db-tarball", help="create a tarball of Spack's installation metadata")
sp.add_parser("report", help="print information useful for bug reports")
def _debug_tarball_suffix():
now = datetime.now()
suffix = now.strftime("%Y-%m-%d-%H%M%S")
git = spack.util.git.git()
if not git:
return "nobranch-nogit-%s" % suffix
with working_dir(spack.paths.prefix):
if not os.path.isdir(".git"):
return "nobranch.nogit.%s" % suffix
# Get symbolic branch name and strip any special chars (mainly '/')
symbolic = git("rev-parse", "--abbrev-ref", "--short", "HEAD", output=str).strip()
symbolic = re.sub(r"[^\w.-]", "-", symbolic)
# Get the commit hash too.
commit = git("rev-parse", "--short", "HEAD", output=str).strip()
if symbolic == commit:
return "nobranch.%s.%s" % (commit, suffix)
else:
return "%s.%s.%s" % (symbolic, commit, suffix)
def create_db_tarball(args):
tar = which("tar")
tarball_name = "spack-db.%s.tar.gz" % _debug_tarball_suffix()
tarball_path = os.path.abspath(tarball_name)
base = os.path.basename(str(spack.store.STORE.root))
transform_args = []
# Currently --transform and -s are not supported by Windows native tar
if "GNU" in tar("--version", output=str):
transform_args = ["--transform", "s/^%s/%s/" % (base, tarball_name)]
elif sys.platform != "win32":
transform_args = ["-s", "/^%s/%s/" % (base, tarball_name)]
wd = os.path.dirname(str(spack.store.STORE.root))
with working_dir(wd):
files = [spack.store.STORE.db._index_path]
files += glob("%s/*/*/*/.spack/spec.json" % base)
files += glob("%s/*/*/*/.spack/spec.yaml" % base)
files = [os.path.relpath(f) for f in files]
args = ["-czf", tarball_path]
args += transform_args
args += files
tar(*args)
tty.msg("Created %s" % tarball_name)
def report(args):
host_platform = spack.platforms.host()
host_os = host_platform.operating_system("frontend")
host_target = host_platform.target("frontend")
host_os = host_platform.default_operating_system()
host_target = host_platform.default_target()
architecture = spack.spec.ArchSpec((str(host_platform), str(host_os), str(host_target)))
print("* **Spack:**", spack.get_version())
print("* **Python:**", platform.python_version())
@@ -95,5 +29,5 @@ def report(args):
def debug(parser, args):
action = {"create-db-tarball": create_db_tarball, "report": report}
action[args.debug_command](args)
if args.debug_command == "report":
report(args)

View File

@@ -9,9 +9,9 @@
import spack.cmd
import spack.environment as ev
import spack.package_base
import spack.store
from spack.cmd.common import arguments
from spack.solver.input_analysis import create_graph_analyzer
description = "show dependencies of a package"
section = "basic"
@@ -68,15 +68,17 @@ def dependencies(parser, args):
else:
spec = specs[0]
dependencies = spack.package_base.possible_dependencies(
dependencies, virtuals, _ = create_graph_analyzer().possible_dependencies(
spec,
transitive=args.transitive,
expand_virtuals=args.expand_virtuals,
depflag=args.deptype,
allowed_deps=args.deptype,
)
if not args.expand_virtuals:
dependencies.update(virtuals)
if spec.name in dependencies:
del dependencies[spec.name]
dependencies.remove(spec.name)
if dependencies:
colify(sorted(dependencies))

View File

@@ -125,7 +125,7 @@ def develop(parser, args):
version = spec.versions.concrete_range_as_version
if not version:
# look up the maximum version so infinity versions are preferred for develop
version = max(spec.package_class.versions.keys())
version = max(spack.repo.PATH.get_pkg_class(spec.fullname).versions.keys())
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])

View File

@@ -110,10 +110,7 @@ def external_find(args):
# Note that KeyboardInterrupt does not subclass Exception
# (so CTRL-C will terminate the program as expected).
skip_msg = "Skipping manifest and continuing with other external checks"
if (isinstance(e, IOError) or isinstance(e, OSError)) and e.errno in [
errno.EPERM,
errno.EACCES,
]:
if isinstance(e, OSError) and e.errno in (errno.EPERM, errno.EACCES):
# The manifest file does not have sufficient permissions enabled:
# print a warning and keep going
tty.warn("Unable to read manifest due to insufficient permissions.", skip_msg)

View File

@@ -54,10 +54,6 @@
@m{target=target} specific <target> processor
@m{arch=platform-os-target} shortcut for all three above
cross-compiling:
@m{os=backend} or @m{os=be} build for compute node (backend)
@m{os=frontend} or @m{os=fe} build for login node (frontend)
dependencies:
^dependency [constraints] specify constraints on dependencies
^@K{/hash} build with a specific installed

View File

@@ -545,7 +545,7 @@ def _not_license_excluded(self, x):
package does not explicitly forbid redistributing source."""
if self.private:
return True
elif x.package_class.redistribute_source(x):
elif spack.repo.PATH.get_pkg_class(x.fullname).redistribute_source(x):
return True
else:
tty.debug(

View File

@@ -41,7 +41,11 @@ def providers(parser, args):
specs = spack.cmd.parse_specs(args.virtual_package)
# Check prerequisites
non_virtual = [str(s) for s in specs if not s.virtual or s.name not in valid_virtuals]
non_virtual = [
str(s)
for s in specs
if not spack.repo.PATH.is_virtual(s.name) or s.name not in valid_virtuals
]
if non_virtual:
msg = "non-virtual specs cannot be part of the query "
msg += "[{0}]\n".format(", ".join(non_virtual))

View File

@@ -6,7 +6,7 @@
import os
import re
import sys
from itertools import zip_longest
from itertools import islice, zip_longest
from typing import Dict, List, Optional
import llnl.util.tty as tty
@@ -423,7 +423,8 @@ def _run_import_check(
continue
for m in is_abs_import.finditer(contents):
if contents.count(m.group(1)) == 1:
# Find at most two occurrences: the first is the import itself, the second is its usage.
if len(list(islice(re.finditer(rf"{re.escape(m.group(1))}(?!\w)", contents), 2))) == 1:
to_remove.append(m.group(0))
exit_code = 1
print(f"{pretty_path}: redundant import: {m.group(1)}", file=out)
@@ -438,7 +439,7 @@ def _run_import_check(
module = _module_part(root, m.group(0))
if not module or module in to_add:
continue
if re.search(rf"import {re.escape(module)}\b(?!\.)", contents):
if re.search(rf"import {re.escape(module)}(?!\w|\.)", contents):
continue
to_add.add(module)
exit_code = 1
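
The intent of the tightened check above can be isolated: `islice` caps the scan at two matches, and the `(?!\w)` lookahead stops a module name from matching inside a longer identifier while still matching attribute access. A small sketch:

```python
import re
from itertools import islice

def is_redundant_import(name: str, contents: str) -> bool:
    # At most two occurrences are needed: the first is the import statement
    # itself; a second one means the module is actually used in the file.
    matches = islice(re.finditer(rf"{re.escape(name)}(?!\w)", contents), 2)
    return len(list(matches)) == 1

print(is_redundant_import("spack.repo", "import spack.repo\n"))               # True
print(is_redundant_import("spack.repo", "import spack.repo\nspack.repo\n"))   # False
```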

View File

@@ -177,16 +177,15 @@ def test_run(args):
matching = spack.store.STORE.db.query_local(spec, hashes=hashes, explicit=explicit)
if spec and not matching:
tty.warn("No {0}installed packages match spec {1}".format(explicit_str, spec))
"""
TODO: Need to write out a log message and/or CDASH Testing
output that package not installed IF continue to process
these issues here.
if args.log_format:
# Proceed with the spec assuming the test process
# to ensure report package as skipped (e.g., for CI)
specs_to_test.append(spec)
"""
# TODO: Need to write out a log message and/or CDASH Testing
# output that package not installed IF continue to process
# these issues here.
# if args.log_format:
# # Proceed with the spec assuming the test process
# # to ensure report package as skipped (e.g., for CI)
# specs_to_test.append(spec)
specs_to_test.extend(matching)
@@ -253,7 +252,9 @@ def has_test_and_tags(pkg_class):
hashes = env.all_hashes() if env else None
specs = spack.store.STORE.db.query(hashes=hashes)
specs = list(filter(lambda s: has_test_and_tags(s.package_class), specs))
specs = list(
filter(lambda s: has_test_and_tags(spack.repo.PATH.get_pkg_class(s.fullname)), specs)
)
spack.cmd.display_specs(specs, long=True)

View File

@@ -216,7 +216,7 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
if pytest is None:
spack.bootstrap.ensure_environment_dependencies()
import pytest

View File

@@ -801,17 +801,17 @@ def _extract_compiler_paths(spec: "spack.spec.Spec") -> Optional[Dict[str, str]]
def _extract_os_and_target(spec: "spack.spec.Spec"):
if not spec.architecture:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target")
operating_system = host_platform.default_operating_system()
target = host_platform.default_target()
else:
target = spec.architecture.target
if not target:
target = spack.platforms.host().target("default_target")
target = spack.platforms.host().default_target()
operating_system = spec.os
if not operating_system:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
operating_system = host_platform.default_operating_system()
return operating_system, target

View File

@@ -220,7 +220,7 @@ def concretize_one(spec: Union[str, Spec], tests: TestsType = False) -> Spec:
opt, i, answer = min(result.answers)
name = spec.name
# TODO: Consolidate this code with similar code in solve.py
if spec.virtual:
if spack.repo.PATH.is_virtual(spec.name):
providers = [s.name for s in answer.values() if s.package.provides(name)]
name = providers[0]

View File

@@ -57,7 +57,7 @@ def validate(configuration_file):
# Set the default value of the concretization strategy to unify and
# warn if the user explicitly set another value
env_dict.setdefault("concretizer", {"unify": True})
if not env_dict["concretizer"]["unify"] is True:
if env_dict["concretizer"]["unify"] is not True:
warnings.warn(
'"concretizer:unify" is not set to "true", which means the '
"generated image may contain different variants of the same "

View File

@@ -41,6 +41,8 @@
Union,
)
import spack.repo
try:
import uuid
@@ -1556,7 +1558,12 @@ def _query(
# If we did find something, the query spec can't be virtual b/c we matched an actual
# package installation, so skip the virtual check entirely. If we *didn't* find anything,
# check all the deferred specs *if* the query is virtual.
if not results and query_spec is not None and deferred and query_spec.virtual:
if (
not results
and query_spec is not None
and deferred
and spack.repo.PATH.is_virtual(query_spec.name)
):
results = [spec for spec in deferred if spec.satisfies(query_spec)]
return results

View File

@@ -310,7 +310,7 @@ def find_windows_kit_roots() -> List[str]:
@staticmethod
def find_windows_kit_bin_paths(
kit_base: Union[Optional[str], Optional[list]] = None
kit_base: Union[Optional[str], Optional[list]] = None,
) -> List[str]:
"""Returns Windows kit bin directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
@@ -325,7 +325,7 @@ def find_windows_kit_bin_paths(
@staticmethod
def find_windows_kit_lib_paths(
kit_base: Union[Optional[str], Optional[list]] = None
kit_base: Union[Optional[str], Optional[list]] = None,
) -> List[str]:
"""Returns Windows kit lib directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base

View File

@@ -243,7 +243,7 @@ def prefix_from_path(self, *, path: str) -> str:
raise NotImplementedError("must be implemented by derived classes")
def detect_specs(
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
self, *, pkg: Type["spack.package_base.PackageBase"], paths: Iterable[str]
) -> List["spack.spec.Spec"]:
"""Given a list of files matching the search patterns, returns a list of detected specs.
@@ -259,6 +259,8 @@ def detect_specs(
)
return []
from spack.repo import PATH as repo_path
result = []
for candidate_path, items_in_prefix in _group_by_prefix(
llnl.util.lang.dedupe(paths)
@@ -305,7 +307,10 @@ def detect_specs(
resolved_specs[spec] = candidate_path
try:
spec.validate_detection()
# Validate the spec by calling a package-specific method
pkg_cls = repo_path.get_pkg_class(spec.name)
validate_fn = getattr(pkg_cls, "validate_detected_spec", lambda x, y: None)
validate_fn(spec, spec.extra_attributes)
except Exception as e:
msg = (
f'"{spec}" has been detected on the system but will '

View File

@@ -568,7 +568,7 @@ def patch(
"""
def _execute_patch(
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency]
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency],
) -> None:
pkg = pkg_or_dep.pkg if isinstance(pkg_or_dep, Dependency) else pkg_or_dep

View File

@@ -581,7 +581,7 @@ def _error_on_nonempty_view_dir(new_root):
# Check if the target path lexists
try:
st = os.lstat(new_root)
except (IOError, OSError):
except OSError:
return
# Empty directories are fine
@@ -861,7 +861,7 @@ def regenerate(self, concrete_roots: List[Spec]) -> None:
):
try:
shutil.rmtree(old_root)
except (IOError, OSError) as e:
except OSError as e:
msg = "Failed to remove old view at %s\n" % old_root
msg += str(e)
tty.warn(msg)
@@ -2554,7 +2554,7 @@ def is_latest_format(manifest):
try:
with open(manifest, encoding="utf-8") as f:
data = syaml.load(f)
except (OSError, IOError):
except OSError:
return True
top_level_key = _top_level_key(data)
changed = spack.schema.env.update(data[top_level_key])
@@ -2634,6 +2634,32 @@ def _ensure_env_dir():
shutil.copy(envfile, target_manifest)
# Copy relative path includes that live inside the environment dir
try:
manifest = EnvironmentManifestFile(environment_dir)
except Exception:
# error handling for bad manifests is handled on other code paths
return
includes = manifest[TOP_LEVEL_KEY].get("include", [])
for include in includes:
if os.path.isabs(include):
continue
abspath = pathlib.Path(os.path.normpath(environment_dir / include))
common_path = pathlib.Path(os.path.commonpath([environment_dir, abspath]))
if common_path != environment_dir:
tty.debug(f"Will not copy relative include from outside environment: {include}")
continue
orig_abspath = os.path.normpath(envfile.parent / include)
if not os.path.exists(orig_abspath):
tty.warn(f"Included file does not exist; will not copy: '{include}'")
continue
fs.touchp(abspath)
shutil.copy(orig_abspath, abspath)
class EnvironmentManifestFile(collections.abc.Mapping):
"""Manages the in-memory representation of a manifest file, and its synchronization

View File

@@ -187,7 +187,7 @@ def path_for_extension(target_name: str, *, paths: List[str]) -> str:
if name == target_name:
return path
else:
raise IOError('extension "{0}" not found'.format(target_name))
raise OSError('extension "{0}" not found'.format(target_name))
def get_module(cmd_name):

View File

@@ -9,6 +9,7 @@
import shutil
import stat
import sys
import tempfile
from typing import Callable, Dict, Optional
from typing_extensions import Literal
@@ -427,7 +428,7 @@ def needs_file(spec, file):
try:
with open(manifest_file, "r", encoding="utf-8") as f:
manifest = s_json.load(f)
except (OSError, IOError):
except OSError:
# if we can't load it, assume it doesn't know about the file.
manifest = {}
return test_path in manifest
@@ -708,7 +709,10 @@ def add_specs(self, *specs: spack.spec.Spec) -> None:
def skip_list(file):
return os.path.basename(file) == spack.store.STORE.layout.metadata_dir
visitor = SourceMergeVisitor(ignore=skip_list)
# Determine if the root is on a case-insensitive filesystem
normalize_paths = is_folder_on_case_insensitive_filesystem(self._root)
visitor = SourceMergeVisitor(ignore=skip_list, normalize_paths=normalize_paths)
# Gather all the directories to be made and files to be linked
for spec in specs:
@@ -831,7 +835,7 @@ def get_spec_from_file(filename):
try:
with open(filename, "r", encoding="utf-8") as f:
return spack.spec.Spec.from_yaml(f)
except IOError:
except OSError:
return None
@@ -884,3 +888,8 @@ def get_dependencies(specs):
class ConflictingProjectionsError(SpackError):
"""Raised when a view has a projections file and is given one manually."""
def is_folder_on_case_insensitive_filesystem(path: str) -> bool:
with tempfile.NamedTemporaryFile(dir=path, prefix=".sentinel") as sentinel:
return os.path.exists(os.path.join(path, os.path.basename(sentinel.name).upper()))
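
The sentinel trick added above is self-contained enough to try directly: create a throwaway file, then probe for its upper-cased name; on a case-insensitive filesystem (e.g. default APFS or NTFS) both names resolve to the same file. For example:

```python
import os
import tempfile

def is_folder_on_case_insensitive_filesystem(path: str) -> bool:
    with tempfile.NamedTemporaryFile(dir=path, prefix=".sentinel") as sentinel:
        return os.path.exists(os.path.join(path, os.path.basename(sentinel.name).upper()))

# True on macOS/Windows defaults, False on typical Linux ext4.
print(is_folder_on_case_insensitive_filesystem(tempfile.gettempdir()))
```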

View File

@@ -42,10 +42,10 @@
import llnl.util.tty.color
import spack.deptypes as dt
import spack.repo
import spack.spec
import spack.tengine
import spack.traverse
from spack.solver.input_analysis import create_graph_analyzer
def find(seq, predicate):
@@ -537,10 +537,11 @@ def edge_entry(self, edge):
def _static_edges(specs, depflag):
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
possible = pkg_cls.possible_dependencies(expand_virtuals=True, depflag=depflag)
*_, edges = create_graph_analyzer().possible_dependencies(
spec.name, expand_virtuals=True, allowed_deps=depflag
)
for parent_name, dependencies in possible.items():
for parent_name, dependencies in edges.items():
for dependency_name in dependencies:
yield spack.spec.DependencySpec(
spack.spec.Spec(parent_name),

View File

@@ -26,7 +26,7 @@ def is_shared_library_elf(filepath):
with open(filepath, "rb") as f:
elf = parse_elf(f, interpreter=True, dynamic_section=True)
return elf.has_pt_dynamic and (elf.has_soname or not elf.has_pt_interp)
except (IOError, OSError, ElfParsingError):
except (OSError, ElfParsingError):
return False

View File

@@ -66,6 +66,8 @@
"libudev.so.*",
# cuda driver
"libcuda.so.*",
# intel-oneapi-runtime
"libur_loader.so.*",
]

View File

@@ -166,7 +166,7 @@ def filter_shebangs_in_directory(directory, filenames=None):
# Only look at executable, non-symlink files.
try:
st = os.lstat(path)
except (IOError, OSError):
except OSError:
continue
if stat.S_ISLNK(st.st_mode) or stat.S_ISDIR(st.st_mode) or not st.st_mode & is_exe:

View File

@@ -566,7 +566,7 @@ def copy_test_files(pkg: Pb, test_spec: spack.spec.Spec):
# copy test data into test stage data dir
try:
pkg_cls = test_spec.package_class
pkg_cls = spack.repo.PATH.get_pkg_class(test_spec.fullname)
except spack.repo.UnknownPackageError:
tty.debug(f"{test_spec.name}: skipping test data copy since no package class found")
return
@@ -623,7 +623,7 @@ def test_functions(
vpkgs = virtuals(pkg)
for vname in vpkgs:
try:
classes.append((Spec(vname)).package_class)
classes.append(spack.repo.PATH.get_pkg_class(vname))
except spack.repo.UnknownPackageError:
tty.debug(f"{vname}: virtual does not appear to have a package file")
@@ -668,7 +668,7 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
# grab test functions associated with the spec, which may be virtual
try:
tests = test_functions(spec.package_class)
tests = test_functions(spack.repo.PATH.get_pkg_class(spec.fullname))
except spack.repo.UnknownPackageError:
# Some virtuals don't have a package so we don't want to report
# them as not having tests when that isn't appropriate.

View File

@@ -814,7 +814,7 @@ def get_depflags(self, pkg: "spack.package_base.PackageBase") -> int:
# Include build dependencies if pkg is going to be built from sources, or
# if build deps are explicitly requested.
if include_build_deps or not (
cache_only or pkg.spec.installed and not pkg.spec.dag_hash() in self.overwrite
cache_only or pkg.spec.installed and pkg.spec.dag_hash() not in self.overwrite
):
depflag |= dt.BUILD
if self.run_tests(pkg):
@@ -2436,11 +2436,7 @@ def _real_install(self) -> None:
# DEBUGGING TIP - to debug this section, insert an IPython
# embed here, and run the sections below without log capture
log_contextmanager = log_output(
log_file,
self.echo,
True,
env=self.unmodified_env,
filter_fn=self.filter_fn,
log_file, self.echo, True, filter_fn=self.filter_fn
)
with log_contextmanager as logger:

View File

@@ -163,7 +163,7 @@ def format_help_sections(self, level):
# lazily add all commands to the parser when needed.
add_all_commands(self)
"""Print help on subcommands in neatly formatted sections."""
# Print help on subcommands in neatly formatted sections.
formatter = self._get_formatter()
# Create a list of subcommand actions. Argparse internals are nasty!
@@ -728,7 +728,7 @@ def _compatible_sys_types():
with the current host.
"""
host_platform = spack.platforms.host()
host_os = str(host_platform.operating_system("default_os"))
host_os = str(host_platform.default_operating_system())
host_target = archspec.cpu.host()
compatible_targets = [host_target] + host_target.ancestors

View File

@@ -64,7 +64,7 @@ def from_local_path(path: str):
@staticmethod
def from_url(url: str):
"""Create an anonymous mirror by URL. This method validates the URL."""
if not urllib.parse.urlparse(url).scheme in supported_url_schemes:
if urllib.parse.urlparse(url).scheme not in supported_url_schemes:
raise ValueError(
f'"{url}" is not a valid mirror URL. '
f"Scheme must be one of {supported_url_schemes}."

View File

@@ -330,18 +330,17 @@ class BaseConfiguration:
default_projections = {"all": "{name}/{version}-{compiler.name}-{compiler.version}"}
def __init__(self, spec: spack.spec.Spec, module_set_name: str, explicit: bool) -> None:
# Module where type(self) is defined
m = inspect.getmodule(self)
assert m is not None # make mypy happy
self.module = m
# Spec for which we want to generate a module file
self.spec = spec
self.name = module_set_name
self.explicit = explicit
# Dictionary of configuration options that should be applied
# to the spec
# Dictionary of configuration options that should be applied to the spec
self.conf = merge_config_rules(self.module.configuration(self.name), self.spec)
@property
def module(self):
return inspect.getmodule(self)
@property
def projections(self):
"""Projection from specs to module names"""
@@ -775,10 +774,6 @@ def __init__(
) -> None:
self.spec = spec
# This class is meant to be derived. Get the module of the
# actual writer.
self.module = inspect.getmodule(self)
assert self.module is not None # make mypy happy
m = self.module
# Create the triplet of configuration/layout/context
@@ -816,6 +811,10 @@ def __init__(
name = type(self).__name__
raise ModulercHeaderNotDefined(msg.format(name))
@property
def module(self):
return inspect.getmodule(self)
def _get_template(self):
"""Gets the template that will be rendered for this spec."""
# Get templates and put them in the order of importance:
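
The motivation for moving `module` into a property (the `BaseConfiguration` pickling change above): module objects cannot be pickled, so caching one on the instance makes the whole object unpicklable, while recomputing it on access keeps instance state clean. A minimal sketch:

```python
import inspect
import pickle

class Configuration:
    def __init__(self, name: str) -> None:
        self.name = name  # only picklable state lives on the instance

    @property
    def module(self):
        # Recomputed on demand instead of stored in __dict__, so pickling works.
        return inspect.getmodule(self)

restored = pickle.loads(pickle.dumps(Configuration("demo")))
print(restored.name, restored.module)
```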

View File

@@ -209,7 +209,7 @@ def provides(self):
# All the other tokens in the hierarchy must be virtual dependencies
for x in self.hierarchy_tokens:
if self.spec.package.provides(x):
provides[x] = self.spec[x]
provides[x] = self.spec
return provides
@property

View File

@@ -383,6 +383,7 @@ def create_opener():
"""Create an opener that can handle OCI authentication."""
opener = urllib.request.OpenerDirector()
for handler in [
urllib.request.ProxyHandler(),
urllib.request.UnknownHandler(),
urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()),
spack.util.web.SpackHTTPDefaultErrorHandler(),

View File

@@ -2,31 +2,64 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# flake8: noqa: F401
"""spack.util.package is a set of useful build tools and directives for packages.
# flake8: noqa: F401, E402
"""spack.package defines the public API for Spack packages, by re-exporting useful symbols from
other modules. Packages should import this module, instead of importing from spack.* directly
to ensure forward compatibility with future versions of Spack."""
Everything in this module is automatically imported into Spack package files.
"""
from os import chdir, environ, getcwd, makedirs, mkdir, remove, removedirs
from shutil import move, rmtree
from spack.error import InstallError, NoHeadersError, NoLibrariesError
# Emulate some shell commands for convenience
env = environ
cd = chdir
pwd = getcwd
# import most common types used in packages
from typing import Dict, List, Optional
import llnl.util.filesystem
from llnl.util.filesystem import *
class tty:
import llnl.util.tty as _tty
debug = _tty.debug
error = _tty.error
info = _tty.info
msg = _tty.msg
warn = _tty.warn
from llnl.util.filesystem import (
FileFilter,
FileList,
HeaderList,
LibraryList,
ancestor,
can_access,
change_sed_delimiter,
copy,
copy_tree,
filter_file,
find,
find_all_headers,
find_first,
find_headers,
find_libraries,
find_system_libraries,
force_remove,
force_symlink,
install,
install_tree,
is_exe,
join_path,
keep_modification_time,
library_extensions,
mkdirp,
remove_directory_contents,
remove_linked_tree,
rename,
set_executable,
set_install_permissions,
touch,
working_dir,
)
from llnl.util.symlink import symlink
import spack.util.executable
# These props will be overridden when the build env is set up.
from spack.build_environment import MakeExecutable
from spack.build_systems.aspell_dict import AspellDictPackage
from spack.build_systems.autotools import AutotoolsPackage
@@ -76,7 +109,24 @@
from spack.builder import BaseBuilder
from spack.config import determine_number_of_jobs
from spack.deptypes import ALL_TYPES as all_deptypes
from spack.directives import *
from spack.directives import (
build_system,
can_splice,
conditional,
conflicts,
depends_on,
extends,
license,
maintainers,
patch,
provides,
redistribute,
requires,
resource,
variant,
version,
)
from spack.error import InstallError, NoHeadersError, NoLibrariesError
from spack.install_test import (
SkipTest,
cache_extra_test_sources,
@@ -86,26 +136,28 @@
install_test_root,
test_part,
)
from spack.installer import ExternalPackageError, InstallLockError, UpstreamPackageError
from spack.mixins import filter_compiler_wrappers
from spack.multimethod import default_args, when
from spack.package_base import (
DependencyConflictError,
build_system_flags,
env_flags,
flatten_dependencies,
inject_flags,
install_dependency_symlinks,
on_package_attributes,
from spack.package_base import build_system_flags, env_flags, inject_flags, on_package_attributes
from spack.package_completions import (
bash_completion_path,
fish_completion_path,
zsh_completion_path,
)
from spack.package_completions import *
from spack.phase_callbacks import run_after, run_before
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.spec import Spec
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable, ProcessError, which, which_string
from spack.util.filesystem import fix_darwin_install_name
from spack.util.prefix import Prefix
from spack.variant import any_combination_of, auto_or_any_combination_of, disjoint_sets
from spack.version import Version, ver
# Emulate some shell commands for convenience
env = environ
cd = chdir
pwd = getcwd
# These are just here for editor support; they may be set when the build env is set up.
configure: Executable
make_jobs: int
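
The pattern this file settles on, explicit imports instead of `import *`, can be sketched with stdlib names only: a façade module re-exports a curated surface so consumers stay insulated from internal reorganization. A toy version:

```python
# facade.py -- toy sketch of an explicit re-export module (flake8: noqa: F401).
# Consumers write `from facade import cd, pwd, move` and never touch the
# implementation modules, so those can move without breaking callers.
from os import chdir as cd, getcwd as pwd
from shutil import move, rmtree

__all__ = ["cd", "pwd", "move", "rmtree"]
```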

View File

@@ -22,7 +22,6 @@
import textwrap
import time
import traceback
import typing
from typing import Any, Callable, Dict, Iterable, List, Optional, Set, Tuple, Type, TypeVar, Union
from typing_extensions import Literal
@@ -30,7 +29,6 @@
import llnl.util.filesystem as fsys
import llnl.util.tty as tty
from llnl.util.lang import classproperty, memoized
from llnl.util.link_tree import LinkTree
import spack.compilers
import spack.config
@@ -67,10 +65,6 @@
]
FLAG_HANDLER_TYPE = Callable[[str, Iterable[str]], FLAG_HANDLER_RETURN_TYPE]
"""Allowed URL schemes for spack packages."""
_ALLOWED_URL_SCHEMES = ["http", "https", "ftp", "file", "git"]
#: Filename for the Spack build/install log.
_spack_build_logfile = "spack-build-out.txt"
@@ -702,9 +696,6 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
#: Verbosity level, preserved across installs.
_verbose = None
#: index of patches by sha256 sum, built lazily
_patches_by_hash = None
#: Package homepage where users can find more information about the package
homepage: Optional[str] = None
@@ -718,19 +709,6 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
#: Do not include @ here in order not to unnecessarily ping the users.
maintainers: List[str] = []
#: List of attributes to be excluded from a package's hash.
metadata_attrs = [
"homepage",
"url",
"urls",
"list_url",
"extendable",
"parallel",
"make_jobs",
"maintainers",
"tags",
]
#: Set to ``True`` to indicate the stand-alone test requires a compiler.
#: It is used to ensure a compiler and build dependencies like 'cmake'
#: are available to build a custom test code.
@@ -830,104 +808,6 @@ def get_variant(self, name: str) -> spack.variant.Variant:
except StopIteration:
raise ValueError(f"No variant '{name}' on spec: {self.spec}")
@classmethod
def possible_dependencies(
cls,
transitive: bool = True,
expand_virtuals: bool = True,
depflag: dt.DepFlag = dt.ALL,
visited: Optional[dict] = None,
missing: Optional[dict] = None,
virtuals: Optional[set] = None,
) -> Dict[str, Set[str]]:
"""Return dict of possible dependencies of this package.
Args:
transitive (bool or None): return all transitive dependencies if
True, only direct dependencies if False (default True).
expand_virtuals (bool or None): expand virtual dependencies into
all possible implementations (default True)
depflag: dependency types to consider
visited (dict or None): dict of names of dependencies visited so
far, mapped to their immediate dependencies' names.
missing (dict or None): dict to populate with packages and their
*missing* dependencies.
virtuals (set): if provided, populate with virtuals seen so far.
Returns:
(dict): dictionary mapping dependency names to *their*
immediate dependencies
Each item in the returned dictionary maps a (potentially
transitive) dependency of this package to its possible
*immediate* dependencies. If ``expand_virtuals`` is ``False``,
virtual package names will be inserted as keys mapped to empty
sets of dependencies. Virtuals, if not expanded, are treated as
though they have no immediate dependencies.
Missing dependencies are ignored by default, but if a
missing dict is provided, it will be populated with package names
mapped to any dependencies they have that are not in any
repository. This is only populated if transitive is True.
Note: the returned dict *includes* the package itself.
"""
visited = {} if visited is None else visited
missing = {} if missing is None else missing
visited.setdefault(cls.name, set())
for name, conditions in cls.dependencies_by_name(when=True).items():
# check whether this dependency could be of the type asked for
depflag_union = 0
for deplist in conditions.values():
for dep in deplist:
depflag_union |= dep.depflag
if not (depflag & depflag_union):
continue
# expand virtuals if enabled, otherwise just stop at virtuals
if spack.repo.PATH.is_virtual(name):
if virtuals is not None:
virtuals.add(name)
if expand_virtuals:
providers = spack.repo.PATH.providers_for(name)
dep_names = [spec.name for spec in providers]
else:
visited.setdefault(cls.name, set()).add(name)
visited.setdefault(name, set())
continue
else:
dep_names = [name]
# add the dependency names to the visited dict
visited.setdefault(cls.name, set()).update(set(dep_names))
# recursively traverse dependencies
for dep_name in dep_names:
if dep_name in visited:
continue
visited.setdefault(dep_name, set())
# skip the rest if not transitive
if not transitive:
continue
try:
dep_cls = spack.repo.PATH.get_pkg_class(dep_name)
except spack.repo.UnknownPackageError:
# log unknown packages
missing.setdefault(cls.name, set()).add(dep_name)
continue
dep_cls.possible_dependencies(
transitive, expand_virtuals, depflag, visited, missing, virtuals
)
return visited
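
Stripped of repo lookups and virtuals, the visited-dict traversal that this removed classmethod implemented reduces to a few lines over a toy dependency table:

```python
# Toy dependency table standing in for package metadata.
DEPS = {"a": {"b", "c"}, "b": {"c"}, "c": set()}

def possible_dependencies(name, visited=None):
    visited = {} if visited is None else visited
    visited.setdefault(name, set()).update(DEPS[name])
    for dep in DEPS[name]:
        if dep not in visited:
            possible_dependencies(dep, visited)
    return visited

# Includes the root itself, mapping each package to its immediate dependencies.
print(possible_dependencies("a"))  # {'a': {'b', 'c'}, 'b': {'c'}, 'c': set()}
```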
@classproperty
def package_dir(cls):
"""Directory where the package.py file lives."""
@@ -2292,85 +2172,6 @@ def rpath_args(self):
build_system_flags = PackageBase.build_system_flags
def install_dependency_symlinks(pkg, spec, prefix):
"""
Execute a dummy install and flatten dependencies.
This routine can be used in a ``package.py`` definition by setting
``install = install_dependency_symlinks``.
This feature comes in handy for creating a common location for
the installation of third-party libraries.
"""
flatten_dependencies(spec, prefix)
def use_cray_compiler_names():
"""Compiler names for builds that rely on cray compiler names."""
os.environ["CC"] = "cc"
os.environ["CXX"] = "CC"
os.environ["FC"] = "ftn"
os.environ["F77"] = "ftn"
def flatten_dependencies(spec, flat_dir):
"""Make each dependency of spec present in dir via symlink."""
for dep in spec.traverse(root=False):
name = dep.name
dep_path = spack.store.STORE.layout.path_for_spec(dep)
dep_files = LinkTree(dep_path)
os.mkdir(flat_dir + "/" + name)
conflict = dep_files.find_conflict(flat_dir + "/" + name)
if conflict:
raise DependencyConflictError(conflict)
dep_files.merge(flat_dir + "/" + name)
def possible_dependencies(
*pkg_or_spec: Union[str, spack.spec.Spec, typing.Type[PackageBase]],
transitive: bool = True,
expand_virtuals: bool = True,
depflag: dt.DepFlag = dt.ALL,
missing: Optional[dict] = None,
virtuals: Optional[set] = None,
) -> Dict[str, Set[str]]:
"""Get the possible dependencies of a number of packages.
See ``PackageBase.possible_dependencies`` for details.
"""
packages = []
for pos in pkg_or_spec:
if isinstance(pos, PackageMeta) and issubclass(pos, PackageBase):
packages.append(pos)
continue
if not isinstance(pos, spack.spec.Spec):
pos = spack.spec.Spec(pos)
if spack.repo.PATH.is_virtual(pos.name):
packages.extend(p.package_class for p in spack.repo.PATH.providers_for(pos.name))
continue
else:
packages.append(pos.package_class)
visited: Dict[str, Set[str]] = {}
for pkg in packages:
pkg.possible_dependencies(
visited=visited,
transitive=transitive,
expand_virtuals=expand_virtuals,
depflag=depflag,
missing=missing,
virtuals=virtuals,
)
return visited
def deprecated_version(pkg: PackageBase, version: Union[str, StandardVersion]) -> bool:
"""Return True iff the version is deprecated.

View File

@@ -52,8 +52,7 @@ def use_platform(new_platform):
import spack.config
msg = '"{0}" must be an instance of Platform'
assert isinstance(new_platform, Platform), msg.format(new_platform)
assert isinstance(new_platform, Platform), f'"{new_platform}" must be an instance of Platform'
original_host_fn = host

View File

@@ -1,42 +1,22 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import warnings
from typing import Optional
import archspec.cpu
import llnl.util.lang
import spack.error
class NoPlatformError(spack.error.SpackError):
def __init__(self):
msg = "Could not determine a platform for this machine"
super().__init__(msg)
@llnl.util.lang.lazy_lexicographic_ordering
class Platform:
"""Platform is an abstract class extended by subclasses.
To add a new type of platform (such as cray_xe), create a subclass and set all the
class attributes such as priority, front_target, back_target, front_os, back_os.
Platform also contains a priority class attribute. A lower number signifies higher
priority. These numbers are arbitrarily set and can be changed, though often there
isn't much need unless a new platform is added and the user wants that to be
detected first.
Targets are created inside the platform subclasses. Most architectures (like linux
and darwin) will have only one target family (x86_64), but in the case of Cray
machines, there is both a frontend and backend processor. The user can specify
which targets are present on the front-end and back-end architecture.
Depending on the platform, operating systems are either autodetected or are
set. The user can set the frontend and backend operating systems via the class
attributes front_os and back_os. The operating system will be responsible for
compiler detection.
"""
# Subclass sets number. Controls detection order
@@ -45,82 +25,72 @@ class attributes such as priority, front_target, back_target, front_os, back_os.
#: binary formats used on this platform; used by relocation logic
binary_formats = ["elf"]
front_end: Optional[str] = None
back_end: Optional[str] = None
default: Optional[str] = None # The default back end target.
front_os: Optional[str] = None
back_os: Optional[str] = None
default_os: Optional[str] = None
default: str
default_os: str
reserved_targets = ["default_target", "frontend", "fe", "backend", "be"]
reserved_oss = ["default_os", "frontend", "fe", "backend", "be"]
deprecated_names = ["frontend", "fe", "backend", "be"]
def __init__(self, name):
self.targets = {}
self.operating_sys = {}
self.name = name
self._init_targets()
def add_target(self, name: str, target: archspec.cpu.Microarchitecture) -> None:
"""Used by the platform specific subclass to list available targets.
Raises an error if the platform specifies a name
that is reserved by spack as an alias.
"""
if name in Platform.reserved_targets:
msg = "{0} is a spack reserved alias and cannot be the name of a target"
raise ValueError(msg.format(name))
msg = f"{name} is a spack reserved alias and cannot be the name of a target"
raise ValueError(msg)
self.targets[name] = target
def _add_archspec_targets(self):
def _init_targets(self):
self.default = archspec.cpu.host().name
for name, microarchitecture in archspec.cpu.TARGETS.items():
self.add_target(name, microarchitecture)
def target(self, name):
"""This is a getter method for the target dictionary
that handles defaulting based on the values provided by default,
front-end, and back-end. This can be overridden
by a subclass for which we want to provide further aliasing options.
"""
# TODO: Check if we can avoid using strings here
name = str(name)
if name == "default_target":
if name in Platform.deprecated_names:
warnings.warn(f"target={name} is deprecated, use target={self.default} instead")
if name in Platform.reserved_targets:
name = self.default
elif name == "frontend" or name == "fe":
name = self.front_end
elif name == "backend" or name == "be":
name = self.back_end
return self.targets.get(name, None)
def add_operating_system(self, name, os_class):
"""Add the operating_system class object into the
platform.operating_sys dictionary.
"""
if name in Platform.reserved_oss:
msg = "{0} is a spack reserved alias and cannot be the name of an OS"
raise ValueError(msg.format(name))
if name in Platform.reserved_oss + Platform.deprecated_names:
msg = f"{name} is a spack reserved alias and cannot be the name of an OS"
raise ValueError(msg)
self.operating_sys[name] = os_class
def default_target(self):
return self.target(self.default)
def default_operating_system(self):
return self.operating_system(self.default_os)
def operating_system(self, name):
if name == "default_os":
if name in Platform.deprecated_names:
warnings.warn(f"os={name} is deprecated, use os={self.default_os} instead")
if name in Platform.reserved_oss:
name = self.default_os
if name == "frontend" or name == "fe":
name = self.front_os
if name == "backend" or name == "be":
name = self.back_os
return self.operating_sys.get(name, None)
def setup_platform_environment(self, pkg, env):
"""Subclass can override this method if it requires any
platform-specific build environment modifications.
"""Platform-specific build environment modifications.
This method is meant to be overridden by subclasses, when needed.
"""
pass
@classmethod
def detect(cls):
"""Return True if the the host platform is detected to be the current
Platform class, False otherwise.
"""Returns True if the host platform is detected to be the current Platform class,
False otherwise.
Derived classes are responsible for implementing this method.
"""
@@ -135,11 +105,7 @@ def __str__(self):
def _cmp_iter(self):
yield self.name
yield self.default
yield self.front_end
yield self.back_end
yield self.default_os
yield self.front_os
yield self.back_os
def targets():
for t in sorted(self.targets.values()):
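
The alias handling above (warn on deprecated names, then fall back to the default) is easy to exercise in isolation; a cut-down sketch with made-up target names:

```python
import warnings

class ToyPlatform:
    reserved_targets = ["default_target", "frontend", "fe", "backend", "be"]
    deprecated_names = ["frontend", "fe", "backend", "be"]

    def __init__(self):
        self.default = "x86_64"
        self.targets = {"x86_64": "<x86_64 microarchitecture>"}

    def target(self, name):
        name = str(name)
        if name in self.deprecated_names:
            warnings.warn(f"target={name} is deprecated, use target={self.default} instead")
        if name in self.reserved_targets:
            name = self.default
        return self.targets.get(name)

p = ToyPlatform()
print(p.target("frontend"))  # warns, then resolves to the default target
```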

View File

@@ -4,8 +4,6 @@
import platform as py_platform
import archspec.cpu
from spack.operating_systems.mac_os import MacOs
from spack.version import Version
@@ -19,18 +17,8 @@ class Darwin(Platform):
def __init__(self):
super().__init__("darwin")
self._add_archspec_targets()
self.default = archspec.cpu.host().name
self.front_end = self.default
self.back_end = self.default
mac_os = MacOs()
self.default_os = str(mac_os)
self.front_os = str(mac_os)
self.back_os = str(mac_os)
self.add_operating_system(str(mac_os), mac_os)
@classmethod

View File

@@ -3,8 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform
import archspec.cpu
from spack.operating_systems.freebsd import FreeBSDOs
from ._platform import Platform
@@ -15,18 +13,8 @@ class FreeBSD(Platform):
def __init__(self):
super().__init__("freebsd")
self._add_archspec_targets()
# Get specific default
self.default = archspec.cpu.host().name
self.front_end = self.default
self.back_end = self.default
os = FreeBSDOs()
self.default_os = str(os)
self.front_os = self.default_os
self.back_os = self.default_os
self.add_operating_system(str(os), os)
@classmethod

View File

@@ -3,8 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform
import archspec.cpu
from spack.operating_systems.linux_distro import LinuxDistro
from ._platform import Platform
@@ -15,18 +13,8 @@ class Linux(Platform):
def __init__(self):
super().__init__("linux")
self._add_archspec_targets()
# Get specific default
self.default = archspec.cpu.host().name
self.front_end = self.default
self.back_end = self.default
linux_dist = LinuxDistro()
self.default_os = str(linux_dist)
self.front_os = self.default_os
self.back_os = self.default_os
self.add_operating_system(str(linux_dist), linux_dist)
@classmethod

View File

@@ -16,31 +16,19 @@ class Test(Platform):
if platform.system().lower() == "darwin":
binary_formats = ["macho"]
if platform.machine() == "arm64":
front_end = "aarch64"
back_end = "m1"
default = "m1"
else:
front_end = "x86_64"
back_end = "core2"
default = "core2"
front_os = "redhat6"
back_os = "debian6"
default_os = "debian6"
default = "m1" if platform.machine() == "arm64" else "core2"
def __init__(self, name=None):
name = name or "test"
super().__init__(name)
self.add_target(self.default, archspec.cpu.TARGETS[self.default])
self.add_target(self.front_end, archspec.cpu.TARGETS[self.front_end])
self.add_operating_system("debian6", spack.operating_systems.OperatingSystem("debian", 6))
self.add_operating_system("redhat6", spack.operating_systems.OperatingSystem("redhat", 6))
self.add_operating_system(
self.default_os, spack.operating_systems.OperatingSystem("debian", 6)
)
self.add_operating_system(
self.front_os, spack.operating_systems.OperatingSystem("redhat", 6)
)
def _init_targets(self):
targets = ("aarch64", "m1") if platform.machine() == "arm64" else ("x86_64", "core2")
for t in targets:
self.add_target(t, archspec.cpu.TARGETS[t])
@classmethod
def detect(cls):

View File

@@ -4,8 +4,6 @@
import platform
import archspec.cpu
from spack.operating_systems.windows_os import WindowsOs
from ._platform import Platform
@@ -16,18 +14,8 @@ class Windows(Platform):
def __init__(self):
super().__init__("windows")
self._add_archspec_targets()
self.default = archspec.cpu.host().name
self.front_end = self.default
self.back_end = self.default
windows_os = WindowsOs()
self.default_os = str(windows_os)
self.front_os = str(windows_os)
self.back_os = str(windows_os)
self.add_operating_system(str(windows_os), windows_os)
@classmethod

View File

@@ -236,22 +236,15 @@ def relocate_elf_binaries(binaries: Iterable[str], prefix_to_prefix: Dict[str, s
_set_elf_rpaths_and_interpreter(path, rpaths=rpaths, interpreter=interpreter)
def _warn_if_link_cant_be_relocated(link: str, target: str):
if not os.path.isabs(target):
return
tty.warn(f'Symbolic link at "{link}" to "{target}" cannot be relocated')
def relocate_links(links: Iterable[str], prefix_to_prefix: Dict[str, str]) -> None:
"""Relocate links to a new install prefix."""
regex = re.compile("|".join(re.escape(p) for p in prefix_to_prefix.keys()))
for link in links:
old_target = readlink(link)
if not os.path.isabs(old_target):
continue
match = regex.match(old_target)
# No match.
if match is None:
_warn_if_link_cant_be_relocated(link, old_target)
continue
new_target = prefix_to_prefix[match.group()] + old_target[match.end() :]
@@ -290,21 +283,21 @@ def relocate_text_bin(binaries: Iterable[str], prefix_to_prefix: PrefixToPrefix)
def is_macho_magic(magic: bytes) -> bool:
return (
# In order of popularity: 64-bit mach-o le/be, 32-bit mach-o le/be.
magic.startswith(b"\xCF\xFA\xED\xFE")
or magic.startswith(b"\xFE\xED\xFA\xCF")
or magic.startswith(b"\xCE\xFA\xED\xFE")
or magic.startswith(b"\xFE\xED\xFA\xCE")
magic.startswith(b"\xcf\xfa\xed\xfe")
or magic.startswith(b"\xfe\xed\xfa\xcf")
or magic.startswith(b"\xce\xfa\xed\xfe")
or magic.startswith(b"\xfe\xed\xfa\xce")
# universal binaries: 0xcafebabe be (most common?) or 0xbebafeca le (not sure if exists).
# Here we need to disambiguate mach-o and JVM class files. In mach-o the next 4 bytes are
# the number of binaries; in JVM class files it's the java version number. We assume there
# are less than 10 binaries in a universal binary.
or (magic.startswith(b"\xCA\xFE\xBA\xBE") and int.from_bytes(magic[4:8], "big") < 10)
or (magic.startswith(b"\xBE\xBA\xFE\xCA") and int.from_bytes(magic[4:8], "little") < 10)
or (magic.startswith(b"\xca\xfe\xba\xbe") and int.from_bytes(magic[4:8], "big") < 10)
or (magic.startswith(b"\xbe\xba\xfe\xca") and int.from_bytes(magic[4:8], "little") < 10)
)
def is_elf_magic(magic: bytes) -> bool:
return magic.startswith(b"\x7FELF")
return magic.startswith(b"\x7fELF")
def is_binary(filename: str) -> bool:
@@ -413,8 +406,8 @@ def fixup_macos_rpaths(spec):
entries which makes it harder to adjust with ``install_name_tool
-delete_rpath``.
"""
if spec.external or spec.virtual:
tty.warn("external or virtual package cannot be fixed up: {0!s}".format(spec))
if spec.external or not spec.concrete:
tty.warn("external/abstract spec cannot be fixed up: {0!s}".format(spec))
return False
if "platform=darwin" not in spec:

View File

@@ -1041,7 +1041,7 @@ def _read_config(self) -> Dict[str, str]:
return yaml_data["repo"]
except IOError:
except OSError:
tty.die(f"Error reading {self.config_file} when opening {self.root}")
def get(self, spec: "spack.spec.Spec") -> "spack.package_base.PackageBase":
@@ -1179,8 +1179,9 @@ def all_package_paths(self) -> Generator[str, None, None]:
yield self.package_path(name)
def packages_with_tags(self, *tags: str) -> Set[str]:
v = set(self.all_package_names())
v.intersection_update(*(self.tag_index[tag.lower()] for tag in tags))
v = set(self.tag_index[tags[0].lower()])
for tag in tags[1:]:
v.intersection_update(self.tag_index[tag.lower()])
return v
def all_package_classes(self) -> Generator[Type["spack.package_base.PackageBase"], None, None]:
@@ -1369,7 +1370,7 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
if subdir != packages_dir_name:
config.write(f" subdirectory: '{subdir}'\n")
except (IOError, OSError) as e:
except OSError as e:
# try to clean up.
if existed:
shutil.rmtree(config_path, ignore_errors=True)

View File

@@ -1,6 +1,7 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import codecs
import collections
import hashlib
import os
@@ -13,7 +14,7 @@
import xml.sax.saxutils
from typing import Dict, Optional
from urllib.parse import urlencode
from urllib.request import HTTPSHandler, Request, build_opener
from urllib.request import Request
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
@@ -24,10 +25,10 @@
import spack.spec
import spack.tengine
import spack.util.git
import spack.util.web as web_util
from spack.error import SpackError
from spack.util.crypto import checksum
from spack.util.log_parse import parse_log_events
from spack.util.web import ssl_create_default_context
from .base import Reporter
from .extract import extract_test_parts
@@ -106,7 +107,7 @@ def __init__(self, configuration: CDashConfiguration):
self.site = configuration.site or socket.gethostname()
self.osname = platform.system()
self.osrelease = platform.release()
self.target = spack.platforms.host().target("default_target")
self.target = spack.platforms.host().default_target()
self.starttime = int(time.time())
self.endtime = self.starttime
self.buildstamp = (
@@ -176,7 +177,7 @@ def build_report_for_package(self, report_dir, package, duration):
# something went wrong pre-CDash "configure" phase because we have an exception and only
# "update" was encountered.
# dump the report in the configure line so teams can see what the issue is
if len(phases_encountered) == 1 and package["exception"]:
if len(phases_encountered) == 1 and package.get("exception"):
# TODO this mapping is not ideal since these are pre-configure errors
# we need to determine if a more appropriate cdash phase can be utilized
# for now we will add a message to the log explaining this
@@ -433,7 +434,6 @@ def upload(self, filename):
# Compute md5 checksum for the contents of this file.
md5sum = checksum(hashlib.md5, filename, block_size=8192)
opener = build_opener(HTTPSHandler(context=ssl_create_default_context()))
with open(filename, "rb") as f:
params_dict = {
"build": self.buildname,
@@ -443,26 +443,21 @@ def upload(self, filename):
}
encoded_params = urlencode(params_dict)
url = "{0}&{1}".format(self.cdash_upload_url, encoded_params)
request = Request(url, data=f)
request = Request(url, data=f, method="PUT")
request.add_header("Content-Type", "text/xml")
request.add_header("Content-Length", os.path.getsize(filename))
if self.authtoken:
request.add_header("Authorization", "Bearer {0}".format(self.authtoken))
try:
# By default, urllib2 only support GET and POST.
# CDash expects this file to be uploaded via PUT.
request.get_method = lambda: "PUT"
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response = web_util.urlopen(request, timeout=SPACK_CDASH_TIMEOUT)
if self.current_package_name not in self.buildIds:
resp_value = response.read()
if isinstance(resp_value, bytes):
resp_value = resp_value.decode("utf-8")
resp_value = codecs.getreader("utf-8")(response).read()
match = self.buildid_regexp.search(resp_value)
if match:
buildid = match.group(1)
self.buildIds[self.current_package_name] = buildid
except Exception as e:
print("Upload to CDash failed: {0}".format(e))
print(f"Upload to CDash failed: {e}")
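For reference, the PUT pattern the new code relies on also works with plain urllib, since Request has accepted an explicit method keyword since Python 3.3; the old get_method monkey-patch was only ever needed on Python 2's urllib2. A standalone sketch (function name and timeout are illustrative):

import os
from urllib.request import Request, urlopen

def put_file(url: str, filename: str, token: str = "") -> bytes:
    with open(filename, "rb") as f:
        request = Request(url, data=f, method="PUT")  # no get_method hack needed
        request.add_header("Content-Type", "text/xml")
        request.add_header("Content-Length", str(os.path.getsize(filename)))
        if token:
            request.add_header("Authorization", f"Bearer {token}")
        with urlopen(request, timeout=30) as response:
            return response.read()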
def finalize_report(self):
if self.buildIds:

View File

@@ -84,9 +84,14 @@
"duplicates": {
"type": "object",
"properties": {
"strategy": {"type": "string", "enum": ["none", "minimal", "full"]}
"strategy": {"type": "string", "enum": ["none", "minimal", "full"]},
"max_dupes": {
"type": "object",
"additionalProperties": {"type": "integer", "minimum": 1},
},
},
},
"static_analysis": {"type": "boolean"},
"timeout": {"type": "integer", "minimum": 0},
"error_on_timeout": {"type": "boolean"},
"os_compatible": {"type": "object", "additionalProperties": {"type": "array"}},

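A quick way to see what the new max_dupes key accepts is to validate a candidate configuration fragment against this schema; a sketch assuming the jsonschema package is available (the package name "cmake" is just an example):

import jsonschema

duplicates_schema = {
    "type": "object",
    "properties": {
        "strategy": {"type": "string", "enum": ["none", "minimal", "full"]},
        "max_dupes": {
            "type": "object",
            "additionalProperties": {"type": "integer", "minimum": 1},
        },
    },
}

# Allow up to three nodes of a given build tool under the minimal strategy.
jsonschema.validate({"strategy": "minimal", "max_dupes": {"cmake": 3}}, duplicates_schema)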
View File

@@ -3,12 +3,12 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for Cray descriptive manifest: this describes a set of
installed packages on the system and also specifies dependency
relationships between them (so this provides more information than
external entries in packages configuration).
installed packages on the system and also specifies dependency
relationships between them (so this provides more information than
external entries in packages configuration).
This does not specify a configuration - it is an input format
that is consumed and transformed into Spack DB records.
This does not specify a configuration - it is an input format
that is consumed and transformed into Spack DB records.
"""
from typing import Any, Dict

View File

@@ -6,6 +6,7 @@
import copy
import enum
import functools
import io
import itertools
import os
import pathlib
@@ -62,7 +63,7 @@
parse_files,
parse_term,
)
from .counter import FullDuplicatesCounter, MinimalDuplicatesCounter, NoDuplicatesCounter
from .input_analysis import create_counter, create_graph_analyzer
from .requirements import RequirementKind, RequirementParser, RequirementRule
from .version_order import concretization_version_order
@@ -73,17 +74,19 @@
#: Enable the addition of a runtime node
WITH_RUNTIME = sys.platform != "win32"
#: Data class that contain configuration on what a
#: clingo solve should output.
#:
#: Args:
#: timers (bool): Print out coarse timers for different solve phases.
#: stats (bool): Whether to output Clingo's internal solver statistics.
#: out: Optional output stream for the generated ASP program.
#: setup_only (bool): if True, stop after setup and don't solve (default False).
OutputConfiguration = collections.namedtuple(
"OutputConfiguration", ["timers", "stats", "out", "setup_only"]
)
class OutputConfiguration(NamedTuple):
"""Data class that contains configuration on what a clingo solve should output."""
#: Print out coarse timers for different solve phases
timers: bool
#: Whether to output Clingo's internal solver statistics
stats: bool
#: Optional output stream for the generated ASP program
out: Optional[io.IOBase]
#: If True, stop after setup and don't solve
setup_only: bool
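A short usage sketch: the NamedTuple keeps positional construction compatible with the old namedtuple while adding field names and types that mypy can check.

cfg = OutputConfiguration(timers=True, stats=False, out=None, setup_only=False)
assert cfg.timers and cfg.out is None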
#: Default output configuration for a solve
DEFAULT_OUTPUT_CONFIGURATION = OutputConfiguration(
@@ -271,15 +274,6 @@ def remove_node(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunc
return list(filter(lambda x: x.args[0] not in ("node", "virtual_node"), facts))
def _create_counter(specs: List[spack.spec.Spec], tests: bool):
strategy = spack.config.CONFIG.get("concretizer:duplicates:strategy", "none")
if strategy == "full":
return FullDuplicatesCounter(specs, tests=tests)
if strategy == "minimal":
return MinimalDuplicatesCounter(specs, tests=tests)
return NoDuplicatesCounter(specs, tests=tests)
def all_libcs() -> Set[spack.spec.Spec]:
"""Return a set of all libc specs targeted by any configured compiler. If none, fall back to
libc determined from the current Python process if dynamically linked."""
@@ -507,7 +501,7 @@ def _compute_specs_from_answer_set(self):
# The specs must be unified to get here, so it is safe to associate any satisfying spec
# with the input. Multiple inputs may be matched to the same concrete spec
node = SpecBuilder.make_node(pkg=input_spec.name)
if input_spec.virtual:
if spack.repo.PATH.is_virtual(input_spec.name):
providers = [
spec.name for spec in answer.values() if spec.package.provides(input_spec.name)
]
@@ -1121,6 +1115,8 @@ class SpackSolverSetup:
"""Class to set up and run a Spack concretization solve."""
def __init__(self, tests: bool = False):
self.possible_graph = create_graph_analyzer()
# these are all initialized in setup()
self.gen: "ProblemInstanceBuilder" = ProblemInstanceBuilder()
self.requirement_parser = RequirementParser(spack.config.CONFIG)
@@ -2087,7 +2083,11 @@ def _spec_clauses(
f: Union[Type[_Head], Type[_Body]] = _Body if body else _Head
if spec.name:
clauses.append(f.node(spec.name) if not spec.virtual else f.virtual_node(spec.name))
clauses.append(
f.node(spec.name)
if not spack.repo.PATH.is_virtual(spec.name)
else f.virtual_node(spec.name)
)
if spec.namespace:
clauses.append(f.namespace(spec.name, spec.namespace))
@@ -2114,7 +2114,7 @@ def _spec_clauses(
for value in variant.value_as_tuple:
# ensure that the value *can* be valid for the spec
if spec.name and not spec.concrete and not spec.virtual:
if spec.name and not spec.concrete and not spack.repo.PATH.is_virtual(spec.name):
variant_defs = vt.prevalidate_variant_value(
self.pkg_class(spec.name), variant, spec
)
@@ -2397,38 +2397,20 @@ def keyfun(os):
def target_defaults(self, specs):
"""Add facts about targets and target compatibility."""
self.gen.h2("Default target")
platform = spack.platforms.host()
uarch = archspec.cpu.TARGETS.get(platform.default)
self.gen.h2("Target compatibility")
# Construct the list of targets which are compatible with the host
candidate_targets = [uarch] + uarch.ancestors
# Get configuration options
granularity = spack.config.get("concretizer:targets:granularity")
host_compatible = spack.config.get("concretizer:targets:host_compatible")
# Add targets which are not compatible with the current host
if not host_compatible:
additional_targets_in_family = sorted(
[
t
for t in archspec.cpu.TARGETS.values()
if (t.family.name == uarch.family.name and t not in candidate_targets)
],
key=lambda x: len(x.ancestors),
reverse=True,
)
candidate_targets += additional_targets_in_family
# Check if we want only generic architecture
if granularity == "generic":
candidate_targets = [t for t in candidate_targets if t.vendor == "generic"]
# Add targets explicitly requested from specs
candidate_targets = []
for x in self.possible_graph.candidate_targets():
if all(
self.possible_graph.unreachable(pkg_name=pkg_name, when_spec=f"target={x}")
for pkg_name in self.pkgs
):
tty.debug(f"[{__name__}] excluding target={x}, because no package can use it")
continue
candidate_targets.append(x)
host_compatible = spack.config.CONFIG.get("concretizer:targets:host_compatible")
for spec in specs:
if not spec.architecture or not spec.architecture.target:
continue
@@ -2444,6 +2426,8 @@ def target_defaults(self, specs):
if ancestor not in candidate_targets:
candidate_targets.append(ancestor)
platform = spack.platforms.host()
uarch = archspec.cpu.TARGETS.get(platform.default)
best_targets = {uarch.family.name}
for compiler_id, known_compiler in enumerate(self.possible_compilers):
if not known_compiler.available:
@@ -2501,7 +2485,6 @@ def target_defaults(self, specs):
self.gen.newline()
self.default_targets = list(sorted(set(self.default_targets)))
self.target_preferences()
def virtual_providers(self):
@@ -2605,7 +2588,14 @@ def define_variant_values(self):
# Tell the concretizer about possible values from specs seen in spec_clauses().
# We might want to order these facts by pkg and name if we are debugging.
for pkg_name, variant_def_id, value in self.variant_values_from_specs:
vid = self.variant_ids_by_def_id[variant_def_id]
try:
vid = self.variant_ids_by_def_id[variant_def_id]
except KeyError:
tty.debug(
f"[{__name__}] cannot retrieve id of the {value} variant from {pkg_name}"
)
continue
self.gen.fact(fn.pkg_fact(pkg_name, fn.variant_possible_value(vid, value)))
def register_concrete_spec(self, spec, possible):
@@ -2676,7 +2666,7 @@ def setup(
"""
check_packages_exist(specs)
node_counter = _create_counter(specs, tests=self.tests)
node_counter = create_counter(specs, tests=self.tests, possible_graph=self.possible_graph)
self.possible_virtuals = node_counter.possible_virtuals()
self.pkgs = node_counter.possible_dependencies()
self.libcs = sorted(all_libcs()) # type: ignore[type-var]
@@ -2684,7 +2674,9 @@ def setup(
# Fail if we already know an unreachable node is requested
for spec in specs:
missing_deps = [
str(d) for d in spec.traverse() if d.name not in self.pkgs and not d.virtual
str(d)
for d in spec.traverse()
if d.name not in self.pkgs and not spack.repo.PATH.is_virtual(d.name)
]
if missing_deps:
raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
@@ -2901,7 +2893,11 @@ def literal_specs(self, specs):
pkg_name = clause.args[1]
self.gen.fact(fn.mentioned_in_literal(trigger_id, root_name, pkg_name))
requirements.append(fn.attr("virtual_root" if spec.virtual else "root", spec.name))
requirements.append(
fn.attr(
"virtual_root" if spack.repo.PATH.is_virtual(spec.name) else "root", spec.name
)
)
cache[imposed_spec_key] = (effect_id, requirements)
self.gen.fact(fn.pkg_fact(spec.name, fn.condition_effect(condition_id, effect_id)))
@@ -3489,7 +3485,7 @@ def external_spec_selected(self, node, idx):
self._specs[node].extra_attributes = spec_info.get("extra_attributes", {})
# If this is an extension, update the dependencies to include the extendee
package = self._specs[node].package_class(self._specs[node])
package = spack.repo.PATH.get_pkg_class(self._specs[node].fullname)(self._specs[node])
extendee_spec = package.extendee_spec
if extendee_spec:
@@ -4102,10 +4098,10 @@ def _check_input_and_extract_concrete_specs(specs):
reusable = []
for root in specs:
for s in root.traverse():
if s.virtual:
continue
if s.concrete:
reusable.append(s)
elif spack.repo.PATH.is_virtual(s.name):
continue
spack.spec.Spec.ensure_valid_variants(s)
return reusable

View File

@@ -265,6 +265,7 @@ error(100, "Cannot select a single version for virtual '{0}'", Virtual)
% If we select a deprecated version, mark the package as deprecated
attr("deprecated", node(ID, Package), Version) :-
attr("version", node(ID, Package), Version),
not external(node(ID, Package)),
pkg_fact(Package, deprecated_version(Version)).
error(100, "Package '{0}' needs the deprecated version '{1}', and this is not allowed", Package, Version)

View File

@@ -1,179 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
from typing import List, Set
from llnl.util import lang
import spack.deptypes as dt
import spack.package_base
import spack.repo
import spack.spec
PossibleDependencies = Set[str]
class Counter:
"""Computes the possible packages and the maximum number of duplicates
allowed for each of them.
Args:
specs: abstract specs to concretize
tests: if True, add test dependencies to the list of possible packages
"""
def __init__(self, specs: List["spack.spec.Spec"], tests: bool) -> None:
runtime_pkgs = spack.repo.PATH.packages_with_tags("runtime")
runtime_virtuals = set()
for x in runtime_pkgs:
pkg_class = spack.repo.PATH.get_pkg_class(x)
runtime_virtuals.update(pkg_class.provided_virtual_names())
self.specs = specs + [spack.spec.Spec(x) for x in runtime_pkgs]
self.link_run_types: dt.DepFlag = dt.LINK | dt.RUN | dt.TEST
self.all_types: dt.DepFlag = dt.ALL
if not tests:
self.link_run_types = dt.LINK | dt.RUN
self.all_types = dt.LINK | dt.RUN | dt.BUILD
self._possible_dependencies: PossibleDependencies = set()
self._possible_virtuals: Set[str] = (
set(x.name for x in specs if x.virtual) | runtime_virtuals
)
def possible_dependencies(self) -> PossibleDependencies:
"""Returns the list of possible dependencies"""
self.ensure_cache_values()
return self._possible_dependencies
def possible_virtuals(self) -> Set[str]:
"""Returns the list of possible virtuals"""
self.ensure_cache_values()
return self._possible_virtuals
def ensure_cache_values(self) -> None:
"""Ensure the cache values have been computed"""
if self._possible_dependencies:
return
self._compute_cache_values()
def possible_packages_facts(self, gen: "spack.solver.asp.PyclingoDriver", fn) -> None:
"""Emit facts associated with the possible packages"""
raise NotImplementedError("must be implemented by derived classes")
def _compute_cache_values(self):
raise NotImplementedError("must be implemented by derived classes")
class NoDuplicatesCounter(Counter):
def _compute_cache_values(self):
result = spack.package_base.possible_dependencies(
*self.specs, virtuals=self._possible_virtuals, depflag=self.all_types
)
self._possible_dependencies = set(result)
def possible_packages_facts(self, gen, fn):
gen.h2("Maximum number of nodes (packages)")
for package_name in sorted(self.possible_dependencies()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Maximum number of nodes (virtual packages)")
for package_name in sorted(self.possible_virtuals()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self.possible_dependencies()):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
class MinimalDuplicatesCounter(NoDuplicatesCounter):
def __init__(self, specs, tests):
super().__init__(specs, tests)
self._link_run: PossibleDependencies = set()
self._direct_build: PossibleDependencies = set()
self._total_build: PossibleDependencies = set()
self._link_run_virtuals: Set[str] = set()
def _compute_cache_values(self):
self._link_run = set(
spack.package_base.possible_dependencies(
*self.specs, virtuals=self._possible_virtuals, depflag=self.link_run_types
)
)
self._link_run_virtuals.update(self._possible_virtuals)
for x in self._link_run:
build_dependencies = spack.repo.PATH.get_pkg_class(x).dependencies_of_type(dt.BUILD)
virtuals, reals = lang.stable_partition(
build_dependencies, spack.repo.PATH.is_virtual_safe
)
self._possible_virtuals.update(virtuals)
for virtual_dep in virtuals:
providers = spack.repo.PATH.providers_for(virtual_dep)
self._direct_build.update(str(x) for x in providers)
self._direct_build.update(reals)
self._total_build = set(
spack.package_base.possible_dependencies(
*self._direct_build, virtuals=self._possible_virtuals, depflag=self.all_types
)
)
self._possible_dependencies = set(self._link_run) | set(self._total_build)
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
gen.h2("Packages with at most a single node")
for package_name in sorted(self.possible_dependencies() - build_tools):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Packages with multiple possible nodes (build-tools)")
for package_name in sorted(self.possible_dependencies() & build_tools):
gen.fact(fn.max_dupes(package_name, 2))
gen.fact(fn.multiple_unification_sets(package_name))
gen.newline()
gen.h2("Maximum number of nodes (virtual packages)")
for package_name in sorted(self.possible_virtuals()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
class FullDuplicatesCounter(MinimalDuplicatesCounter):
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
counter = collections.Counter(
list(self._link_run) + list(self._total_build) + list(self._direct_build)
)
gen.h2("Maximum number of nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
count = min(count, 2)
gen.fact(fn.max_dupes(pkg, count))
gen.newline()
gen.h2("Build unification sets ")
for name in sorted(self.possible_dependencies() & build_tools):
gen.fact(fn.multiple_unification_sets(name))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
counter = collections.Counter(
list(self._link_run_virtuals) + list(self._possible_virtuals)
)
gen.h2("Maximum number of virtual nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
gen.fact(fn.max_dupes(pkg, count))
gen.newline()

View File

@@ -0,0 +1,539 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Classes to analyze the input of a solve, and provide information to set up the ASP problem"""
import collections
from typing import Dict, List, NamedTuple, Set, Tuple, Union
import archspec.cpu
from llnl.util import lang, tty
import spack.binary_distribution
import spack.config
import spack.deptypes as dt
import spack.platforms
import spack.repo
import spack.spec
import spack.store
from spack.error import SpackError
RUNTIME_TAG = "runtime"
class PossibleGraph(NamedTuple):
real_pkgs: Set[str]
virtuals: Set[str]
edges: Dict[str, Set[str]]
class PossibleDependencyGraph:
"""Returns information needed to set up an ASP problem"""
def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""Returns true if the context can determine that the condition cannot ever
be met on pkg_name.
"""
raise NotImplementedError
def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
"""Returns a list of targets that are candidate for concretization"""
raise NotImplementedError
def possible_dependencies(
self,
*specs: Union[spack.spec.Spec, str],
allowed_deps: dt.DepFlag,
transitive: bool = True,
strict_depflag: bool = False,
expand_virtuals: bool = True,
) -> PossibleGraph:
"""Returns the set of possible dependencies, and the set of possible virtuals.
Both sets always include runtime packages, which may be injected by compilers.
Args:
transitive: return transitive dependencies if True, only direct dependencies if False
allowed_deps: dependency types to consider
strict_depflag: if True, only the specific dep type is considered, if False any
deptype that intersects with allowed deptype is considered
expand_virtuals: expand virtual dependencies into all possible implementations
"""
raise NotImplementedError
class NoStaticAnalysis(PossibleDependencyGraph):
"""Implementation that tries to minimize the setup time (i.e. defaults to give fast
answers), rather than trying to reduce the ASP problem size with more complex analysis.
"""
def __init__(self, *, configuration: spack.config.Configuration, repo: spack.repo.RepoPath):
self.configuration = configuration
self.repo = repo
self.runtime_pkgs = set(self.repo.packages_with_tags(RUNTIME_TAG))
self.runtime_virtuals = set()
self._platform_condition = spack.spec.Spec(
f"platform={spack.platforms.host()} target={archspec.cpu.host().family}:"
)
for x in self.runtime_pkgs:
pkg_class = self.repo.get_pkg_class(x)
self.runtime_virtuals.update(pkg_class.provided_virtual_names())
try:
self.libc_pkgs = [x.name for x in self.providers_for("libc")]
except spack.repo.UnknownPackageError:
self.libc_pkgs = []
def is_virtual(self, name: str) -> bool:
return self.repo.is_virtual(name)
@lang.memoized
def is_allowed_on_this_platform(self, *, pkg_name: str) -> bool:
"""Returns true if a package is allowed on the current host"""
pkg_cls = self.repo.get_pkg_class(pkg_name)
for when_spec, conditions in pkg_cls.requirements.items():
if not when_spec.intersects(self._platform_condition):
continue
for requirements, _, _ in conditions:
if not any(x.intersects(self._platform_condition) for x in requirements):
tty.debug(f"[{__name__}] {pkg_name} is not for this platform")
return False
return True
def providers_for(self, virtual_str: str) -> List[spack.spec.Spec]:
"""Returns a list of possible providers for the virtual string in input."""
return self.repo.providers_for(virtual_str)
def can_be_installed(self, *, pkg_name) -> bool:
"""Returns True if a package can be installed, False otherwise."""
return True
def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""Returns true if the context can determine that the condition cannot ever
be met on pkg_name.
"""
return False
def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
"""Returns a list of targets that are candidate for concretization"""
platform = spack.platforms.host()
default_target = archspec.cpu.TARGETS[platform.default]
# Construct the list of targets which are compatible with the host
candidate_targets = [default_target] + default_target.ancestors
granularity = self.configuration.get("concretizer:targets:granularity")
host_compatible = self.configuration.get("concretizer:targets:host_compatible")
# Add targets which are not compatible with the current host
if not host_compatible:
additional_targets_in_family = sorted(
[
t
for t in archspec.cpu.TARGETS.values()
if (t.family.name == default_target.family.name and t not in candidate_targets)
],
key=lambda x: len(x.ancestors),
reverse=True,
)
candidate_targets += additional_targets_in_family
# Check if we want only generic architecture
if granularity == "generic":
candidate_targets = [t for t in candidate_targets if t.vendor == "generic"]
return candidate_targets
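The same pruning can be reproduced with archspec alone; a sketch assuming an x86_64 host:

import archspec.cpu

host = archspec.cpu.host()
candidates = [host] + host.ancestors
# granularity == "generic" keeps only vendor-neutral targets,
# e.g. x86_64_v3 but not haswell.
generic_only = [t for t in candidates if t.vendor == "generic"]
print([str(t) for t in generic_only])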
def possible_dependencies(
self,
*specs: Union[spack.spec.Spec, str],
allowed_deps: dt.DepFlag,
transitive: bool = True,
strict_depflag: bool = False,
expand_virtuals: bool = True,
) -> PossibleGraph:
stack = [x for x in self._package_list(specs)]
virtuals: Set[str] = set()
edges: Dict[str, Set[str]] = {}
while stack:
pkg_name = stack.pop()
if pkg_name in edges:
continue
edges[pkg_name] = set()
# Since libc is not buildable, there is no need to extend the
# search space with libc dependencies.
if pkg_name in self.libc_pkgs:
continue
pkg_cls = self.repo.get_pkg_class(pkg_name=pkg_name)
for name, conditions in pkg_cls.dependencies_by_name(when=True).items():
if all(self.unreachable(pkg_name=pkg_name, when_spec=x) for x in conditions):
tty.debug(
f"[{__name__}] Not adding {name} as a dep of {pkg_name}, because "
f"conditions cannot be met"
)
continue
if not self._has_deptypes(
conditions, allowed_deps=allowed_deps, strict=strict_depflag
):
continue
if name in virtuals:
continue
dep_names = set()
if self.is_virtual(name):
virtuals.add(name)
if expand_virtuals:
providers = self.providers_for(name)
dep_names = {spec.name for spec in providers}
else:
dep_names = {name}
edges[pkg_name].update(dep_names)
if not transitive:
continue
for dep_name in dep_names:
if dep_name in edges:
continue
if not self._is_possible(pkg_name=dep_name):
continue
stack.append(dep_name)
real_packages = set(edges)
if not transitive:
# We exit early, so add children from the edges information
for root, children in edges.items():
real_packages.update(x for x in children if self._is_possible(pkg_name=x))
virtuals.update(self.runtime_virtuals)
real_packages = real_packages | self.runtime_pkgs
return PossibleGraph(real_pkgs=real_packages, virtuals=virtuals, edges=edges)
def _package_list(self, specs: Tuple[Union[spack.spec.Spec, str], ...]) -> List[str]:
stack = []
for current_spec in specs:
if isinstance(current_spec, str):
current_spec = spack.spec.Spec(current_spec)
if self.repo.is_virtual(current_spec.name):
stack.extend([p.name for p in self.providers_for(current_spec.name)])
continue
stack.append(current_spec.name)
return sorted(set(stack))
def _has_deptypes(self, dependencies, *, allowed_deps: dt.DepFlag, strict: bool) -> bool:
if strict is True:
return any(
dep.depflag == allowed_deps for deplist in dependencies.values() for dep in deplist
)
return any(
dep.depflag & allowed_deps for deplist in dependencies.values() for dep in deplist
)
def _is_possible(self, *, pkg_name):
try:
return self.is_allowed_on_this_platform(pkg_name=pkg_name) and self.can_be_installed(
pkg_name=pkg_name
)
except spack.repo.UnknownPackageError:
return False
class StaticAnalysis(NoStaticAnalysis):
"""Performs some static analysis of the configuration, store, etc. to provide more precise
answers on whether some packages can be installed, or used as a provider.
It increases the setup time, but might decrease the grounding and solve time considerably,
especially when requirements restrict the possible choices for providers.
"""
def __init__(
self,
*,
configuration: spack.config.Configuration,
repo: spack.repo.RepoPath,
store: spack.store.Store,
binary_index: spack.binary_distribution.BinaryCacheIndex,
):
super().__init__(configuration=configuration, repo=repo)
self.store = store
self.binary_index = binary_index
@lang.memoized
def providers_for(self, virtual_str: str) -> List[spack.spec.Spec]:
candidates = super().providers_for(virtual_str)
result = []
for spec in candidates:
if not self._is_provider_candidate(pkg_name=spec.name, virtual=virtual_str):
continue
result.append(spec)
return result
@lang.memoized
def buildcache_specs(self) -> List[spack.spec.Spec]:
self.binary_index.update()
return self.binary_index.get_all_built_specs()
@lang.memoized
def can_be_installed(self, *, pkg_name) -> bool:
if self.configuration.get(f"packages:{pkg_name}:buildable", True):
return True
if self.configuration.get(f"packages:{pkg_name}:externals", []):
return True
reuse = self.configuration.get("concretizer:reuse")
if reuse is not False and self.store.db.query(pkg_name):
return True
if reuse is not False and any(x.name == pkg_name for x in self.buildcache_specs()):
return True
tty.debug(f"[{__name__}] {pkg_name} cannot be installed")
return False
@lang.memoized
def _is_provider_candidate(self, *, pkg_name: str, virtual: str) -> bool:
if not self.is_allowed_on_this_platform(pkg_name=pkg_name):
return False
if not self.can_be_installed(pkg_name=pkg_name):
return False
virtual_spec = spack.spec.Spec(virtual)
if self.unreachable(pkg_name=virtual_spec.name, when_spec=pkg_name):
tty.debug(f"[{__name__}] {pkg_name} cannot be a provider for {virtual}")
return False
return True
@lang.memoized
def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""Returns true if the context can determine that the condition cannot ever
be met on pkg_name.
"""
candidates = self.configuration.get(f"packages:{pkg_name}:require", [])
if not candidates and pkg_name != "all":
return self.unreachable(pkg_name="all", when_spec=when_spec)
if not candidates:
return False
if isinstance(candidates, str):
candidates = [candidates]
union_requirement = spack.spec.Spec()
for c in candidates:
if not isinstance(c, str):
continue
try:
union_requirement.constrain(c)
except SpackError:
# Less optimized, but shouldn't fail
pass
if not union_requirement.intersects(when_spec):
return True
return False
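The union-requirement trick above can be illustrated with plain Spec objects; the requirement strings below are hypothetical package requirements, not real configuration:

import spack.spec

union = spack.spec.Spec()
for requirement in ["target=x86_64:", "os=ubuntu22.04"]:
    union.constrain(requirement)

# A condition on an incompatible architecture can never be met:
assert not union.intersects(spack.spec.Spec("target=aarch64"))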
def create_graph_analyzer() -> PossibleDependencyGraph:
static_analysis = spack.config.CONFIG.get("concretizer:static_analysis", False)
if static_analysis:
return StaticAnalysis(
configuration=spack.config.CONFIG,
repo=spack.repo.PATH,
store=spack.store.STORE,
binary_index=spack.binary_distribution.BINARY_INDEX,
)
return NoStaticAnalysis(configuration=spack.config.CONFIG, repo=spack.repo.PATH)
class Counter:
"""Computes the possible packages and the maximum number of duplicates
allowed for each of them.
Args:
specs: abstract specs to concretize
tests: if True, add test dependencies to the list of possible packages
"""
def __init__(
self, specs: List["spack.spec.Spec"], tests: bool, possible_graph: PossibleDependencyGraph
) -> None:
self.possible_graph = possible_graph
self.specs = specs
self.link_run_types: dt.DepFlag = dt.LINK | dt.RUN | dt.TEST
self.all_types: dt.DepFlag = dt.ALL
if not tests:
self.link_run_types = dt.LINK | dt.RUN
self.all_types = dt.LINK | dt.RUN | dt.BUILD
self._possible_dependencies: Set[str] = set()
self._possible_virtuals: Set[str] = {
x.name for x in specs if spack.repo.PATH.is_virtual(x.name)
}
def possible_dependencies(self) -> Set[str]:
"""Returns the set of possible dependencies"""
self.ensure_cache_values()
return self._possible_dependencies
def possible_virtuals(self) -> Set[str]:
"""Returns the set of possible virtuals"""
self.ensure_cache_values()
return self._possible_virtuals
def ensure_cache_values(self) -> None:
"""Ensure the cache values have been computed"""
if self._possible_dependencies:
return
self._compute_cache_values()
def possible_packages_facts(self, gen: "spack.solver.asp.ProblemInstanceBuilder", fn) -> None:
"""Emit facts associated with the possible packages"""
raise NotImplementedError("must be implemented by derived classes")
def _compute_cache_values(self) -> None:
raise NotImplementedError("must be implemented by derived classes")
class NoDuplicatesCounter(Counter):
def _compute_cache_values(self) -> None:
self._possible_dependencies, virtuals, _ = self.possible_graph.possible_dependencies(
*self.specs, allowed_deps=self.all_types
)
self._possible_virtuals.update(virtuals)
def possible_packages_facts(self, gen: "spack.solver.asp.ProblemInstanceBuilder", fn) -> None:
gen.h2("Maximum number of nodes (packages)")
for package_name in sorted(self.possible_dependencies()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Maximum number of nodes (virtual packages)")
for package_name in sorted(self.possible_virtuals()):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self.possible_dependencies()):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
class MinimalDuplicatesCounter(NoDuplicatesCounter):
def __init__(
self, specs: List["spack.spec.Spec"], tests: bool, possible_graph: PossibleDependencyGraph
) -> None:
super().__init__(specs, tests, possible_graph)
self._link_run: Set[str] = set()
self._direct_build: Set[str] = set()
self._total_build: Set[str] = set()
self._link_run_virtuals: Set[str] = set()
def _compute_cache_values(self) -> None:
self._link_run, virtuals, _ = self.possible_graph.possible_dependencies(
*self.specs, allowed_deps=self.link_run_types
)
self._possible_virtuals.update(virtuals)
self._link_run_virtuals.update(virtuals)
for x in self._link_run:
reals, virtuals, _ = self.possible_graph.possible_dependencies(
x, allowed_deps=dt.BUILD, transitive=False, strict_depflag=True
)
self._possible_virtuals.update(virtuals)
self._direct_build.update(reals)
self._total_build, virtuals, _ = self.possible_graph.possible_dependencies(
*self._direct_build, allowed_deps=self.all_types
)
self._possible_virtuals.update(virtuals)
self._possible_dependencies = set(self._link_run) | set(self._total_build)
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
gen.h2("Packages with at most a single node")
for package_name in sorted(self.possible_dependencies() - build_tools):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Packages with multiple possible nodes (build-tools)")
default = spack.config.CONFIG.get("concretizer:duplicates:max_dupes:default", 2)
for package_name in sorted(self.possible_dependencies() & build_tools):
max_dupes = spack.config.CONFIG.get(
f"concretizer:duplicates:max_dupes:{package_name}", default
)
gen.fact(fn.max_dupes(package_name, max_dupes))
if max_dupes > 1:
gen.fact(fn.multiple_unification_sets(package_name))
gen.newline()
gen.h2("Maximum number of nodes (link-run virtuals)")
for package_name in sorted(self._link_run_virtuals):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Maximum number of nodes (other virtuals)")
for package_name in sorted(self.possible_virtuals() - self._link_run_virtuals):
max_dupes = spack.config.CONFIG.get(
f"concretizer:duplicates:max_dupes:{package_name}", default
)
gen.fact(fn.max_dupes(package_name, max_dupes))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
class FullDuplicatesCounter(MinimalDuplicatesCounter):
def possible_packages_facts(self, gen, fn):
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
counter = collections.Counter(
list(self._link_run) + list(self._total_build) + list(self._direct_build)
)
gen.h2("Maximum number of nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
count = min(count, 2)
gen.fact(fn.max_dupes(pkg, count))
gen.newline()
gen.h2("Build unification sets ")
for name in sorted(self.possible_dependencies() & build_tools):
gen.fact(fn.multiple_unification_sets(name))
gen.newline()
gen.h2("Possible package in link-run subDAG")
for name in sorted(self._link_run):
gen.fact(fn.possible_in_link_run(name))
gen.newline()
counter = collections.Counter(
list(self._link_run_virtuals) + list(self._possible_virtuals)
)
gen.h2("Maximum number of virtual nodes")
for pkg, count in sorted(counter.items(), key=lambda x: (x[1], x[0])):
gen.fact(fn.max_dupes(pkg, count))
gen.newline()
def create_counter(
specs: List[spack.spec.Spec], tests: bool, possible_graph: PossibleDependencyGraph
) -> Counter:
strategy = spack.config.CONFIG.get("concretizer:duplicates:strategy", "none")
if strategy == "full":
return FullDuplicatesCounter(specs, tests=tests, possible_graph=possible_graph)
if strategy == "minimal":
return MinimalDuplicatesCounter(specs, tests=tests, possible_graph=possible_graph)
return NoDuplicatesCounter(specs, tests=tests, possible_graph=possible_graph)
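Putting the new module together, a hypothetical session (module path inferred from the relative import in asp.py; assumes an initialized Spack configuration and repo):

import spack.deptypes as dt
import spack.spec
from spack.solver.input_analysis import create_counter, create_graph_analyzer

graph = create_graph_analyzer()  # StaticAnalysis only if concretizer:static_analysis is set
pkgs, virtuals, edges = graph.possible_dependencies(
    spack.spec.Spec("hdf5"), allowed_deps=dt.LINK | dt.RUN
)

counter = create_counter([spack.spec.Spec("hdf5")], tests=False, possible_graph=graph)
print(sorted(counter.possible_virtuals()))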

View File

@@ -52,6 +52,7 @@
import enum
import io
import itertools
import json
import os
import pathlib
import platform
@@ -237,23 +238,14 @@ def _make_microarchitecture(name: str) -> archspec.cpu.Microarchitecture:
class ArchSpec:
"""Aggregate the target platform, the operating system and the target microarchitecture."""
@staticmethod
def _return_arch(os_tag, target_tag):
platform = spack.platforms.host()
default_os = platform.operating_system(os_tag)
default_target = platform.target(target_tag)
arch_tuple = str(platform), str(default_os), str(default_target)
return ArchSpec(arch_tuple)
@staticmethod
def default_arch():
"""Return the default architecture"""
return ArchSpec._return_arch("default_os", "default_target")
@staticmethod
def frontend_arch():
"""Return the frontend architecture"""
return ArchSpec._return_arch("frontend", "frontend")
platform = spack.platforms.host()
default_os = platform.default_operating_system()
default_target = platform.default_target()
arch_tuple = str(platform), str(default_os), str(default_target)
return ArchSpec(arch_tuple)
__slots__ = "_platform", "_os", "_target"
@@ -1346,14 +1338,20 @@ class SpecBuildInterface(lang.ObjectWrapper):
"command", default_handler=_command_default_handler, _indirect=True
)
def __init__(self, spec: "Spec", name: str, query_parameters: List[str], _parent: "Spec"):
def __init__(
self,
spec: "Spec",
name: str,
query_parameters: List[str],
_parent: "Spec",
is_virtual: bool,
):
super().__init__(spec)
# Adding new attributes goes after super() call since the ObjectWrapper
# resets __dict__ to behave like the passed object
original_spec = getattr(spec, "wrapped_obj", spec)
self.wrapped_obj = original_spec
self.token = original_spec, name, query_parameters, _parent
is_virtual = spack.repo.PATH.is_virtual(name)
self.token = original_spec, name, query_parameters, _parent, is_virtual
self.last_query = QueryState(
name=name, extra_parameters=query_parameters, isvirtual=is_virtual
)
@@ -1536,9 +1534,8 @@ def __init__(self, spec_like=None, *, external_path=None, external_modules=None)
self._external_path = external_path
self.external_modules = Spec._format_module_list(external_modules)
# This attribute is used to store custom information for
# external specs. None signal that it was not set yet.
self.extra_attributes = None
# This attribute is used to store custom information for external specs.
self.extra_attributes: dict = {}
# This attribute holds the original build copy of the spec if it is
# deployed differently than it was built. None signals that the spec
@@ -1915,10 +1912,22 @@ def package_class(self):
"""Internal package call gets only the class object for a package.
Use this to just get package metadata.
"""
warnings.warn(
"`Spec.package_class` is deprecated and will be removed in version 1.0.0. Use "
"`spack.repo.PATH.get_pkg_class(spec.fullname)` instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
return spack.repo.PATH.get_pkg_class(self.fullname)
@property
def virtual(self):
warnings.warn(
"`Spec.virtual` is deprecated and will be removed in version 1.0.0. Use "
"`spack.repo.PATH.is_virtual(spec.name)` instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
return spack.repo.PATH.is_virtual(self.name)
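The replacements suggested by both deprecation warnings, side by side (sketch):

import spack.repo
import spack.spec

s = spack.spec.Spec("mpich")
pkg_cls = spack.repo.PATH.get_pkg_class(s.fullname)  # instead of s.package_class
is_virt = spack.repo.PATH.is_virtual(s.name)         # instead of s.virtual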
@property
@@ -2137,7 +2146,9 @@ def spec_hash(self, hash):
if hash.override is not None:
return hash.override(self)
node_dict = self.to_node_dict(hash=hash)
json_text = sjson.dump(node_dict)
json_text = json.dumps(
node_dict, ensure_ascii=True, indent=None, separators=(",", ":"), sort_keys=False
)
# This implements "frankenhashes", preserving the last 7 characters of the
# original hash when splicing so that we can avoid relocation issues
out = spack.util.hash.b32_hash(json_text)
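The keyword arguments matter here: the hash input must stay byte-stable, so the dump is pinned to a compact, ASCII-only form (presumably matching what sjson.dump emitted). A quick check:

import json

node = {"name": "zlib", "version": "1.3.1"}
print(json.dumps(node, ensure_ascii=True, indent=None, separators=(",", ":"), sort_keys=False))
# -> {"name":"zlib","version":"1.3.1"}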
@@ -2360,15 +2371,10 @@ def to_node_dict(self, hash=ht.dag_hash):
)
if self.external:
if self.extra_attributes:
extra_attributes = syaml.sorted_dict(self.extra_attributes)
else:
extra_attributes = None
d["external"] = {
"path": self.external_path,
"module": self.external_modules,
"extra_attributes": extra_attributes,
"module": self.external_modules or None,
"extra_attributes": syaml.sorted_dict(self.extra_attributes),
}
if not self._concrete:
@@ -2703,7 +2709,7 @@ def name_and_dependency_types(s: str) -> Tuple[str, dt.DepFlag]:
return name, depflag
def spec_and_dependency_types(
s: Union[Spec, Tuple[Spec, str]]
s: Union[Spec, Tuple[Spec, str]],
) -> Tuple[Spec, dt.DepFlag]:
"""Given a non-string key in the literal, extracts the spec
and its dependency types.
@@ -2823,24 +2829,6 @@ def from_detection(
s.extra_attributes = extra_attributes
return s
def validate_detection(self):
"""Validate the detection of an external spec.
This method is used as part of Spack's detection protocol, and is
not meant for client code use.
"""
# Assert that _extra_attributes is a Mapping and not None,
# which likely means the spec was created with Spec.from_detection
msg = 'cannot validate "{0}" since it was not created ' "using Spec.from_detection".format(
self
)
assert isinstance(self.extra_attributes, collections.abc.Mapping), msg
# Validate the spec calling a package specific method
pkg_cls = spack.repo.PATH.get_pkg_class(self.name)
validate_fn = getattr(pkg_cls, "validate_detected_spec", lambda x, y: None)
validate_fn(self, self.extra_attributes)
def _patches_assigned(self):
"""Whether patches have been assigned to this spec by the concretizer."""
# FIXME: _patches_in_order_of_appearance is attached after concretization
@@ -2879,7 +2867,7 @@ def inject_patches_variant(root):
# Add any patches from the package to the spec.
patches = set()
for cond, patch_list in s.package_class.patches.items():
for cond, patch_list in spack.repo.PATH.get_pkg_class(s.fullname).patches.items():
if s.satisfies(cond):
for patch in patch_list:
patches.add(patch)
@@ -2892,7 +2880,7 @@ def inject_patches_variant(root):
if dspec.spec.concrete:
continue
pkg_deps = dspec.parent.package_class.dependencies
pkg_deps = spack.repo.PATH.get_pkg_class(dspec.parent.fullname).dependencies
patches = []
for cond, deps_by_name in pkg_deps.items():
@@ -3099,7 +3087,7 @@ def validate_or_raise(self):
# FIXME: raise just the first one encountered
for spec in self.traverse():
# raise an UnknownPackageError if the spec's package isn't real.
if (not spec.virtual) and spec.name:
if spec.name and not spack.repo.PATH.is_virtual(spec.name):
spack.repo.PATH.get_pkg_class(spec.fullname)
# validate compiler in addition to the package name.
@@ -3108,7 +3096,7 @@ def validate_or_raise(self):
raise UnsupportedCompilerError(spec.compiler.name)
# Ensure correctness of variants (if the spec is not virtual)
if not spec.virtual:
if not spack.repo.PATH.is_virtual(spec.name):
Spec.ensure_valid_variants(spec)
substitute_abstract_variants(spec)
@@ -3126,7 +3114,7 @@ def ensure_valid_variants(spec):
if spec.concrete:
return
pkg_cls = spec.package_class
pkg_cls = spack.repo.PATH.get_pkg_class(spec.fullname)
pkg_variants = pkg_cls.variant_names()
# reserved names are variants that may be set on any package
# but are not necessarily recorded by the package's class
@@ -3343,7 +3331,9 @@ def intersects(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
if self.virtual and other.virtual:
self_virtual = spack.repo.PATH.is_virtual(self.name)
other_virtual = spack.repo.PATH.is_virtual(other.name)
if self_virtual and other_virtual:
# Two virtual specs intersect only if there are providers for both
lhs = spack.repo.PATH.providers_for(str(self))
rhs = spack.repo.PATH.providers_for(str(other))
@@ -3351,8 +3341,8 @@ def intersects(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
return bool(intersection)
# A provider can satisfy a virtual dependency.
elif self.virtual or other.virtual:
virtual_spec, non_virtual_spec = (self, other) if self.virtual else (other, self)
elif self_virtual or other_virtual:
virtual_spec, non_virtual_spec = (self, other) if self_virtual else (other, self)
try:
# Here we might get an abstract spec
pkg_cls = spack.repo.PATH.get_pkg_class(non_virtual_spec.fullname)
@@ -3422,12 +3412,20 @@ def _intersects_dependencies(self, other):
# These two loops handle cases where there is an overly restrictive
# vpkg in one spec for a provider in the other (e.g., mpi@3: is not
# compatible with mpich2)
for spec in self.virtual_dependencies():
if spec.name in other_index and not other_index.providers_for(spec):
for spec in self.traverse():
if (
spack.repo.PATH.is_virtual(spec.name)
and spec.name in other_index
and not other_index.providers_for(spec)
):
return False
for spec in other.virtual_dependencies():
if spec.name in self_index and not self_index.providers_for(spec):
for spec in other.traverse():
if (
spack.repo.PATH.is_virtual(spec.name)
and spec.name in self_index
and not self_index.providers_for(spec)
):
return False
return True
@@ -3457,7 +3455,9 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
# A concrete provider can satisfy a virtual dependency.
if not self.virtual and other.virtual:
if not spack.repo.PATH.is_virtual(self.name) and spack.repo.PATH.is_virtual(
other.name
):
try:
# Here we might get an abstract spec
pkg_cls = spack.repo.PATH.get_pkg_class(self.fullname)
@@ -3525,7 +3525,7 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
lhs_edges: Dict[str, Set[DependencySpec]] = collections.defaultdict(set)
for rhs_edge in other.traverse_edges(root=False, cover="edges"):
# If we are checking for ^mpi we need to verify if there is any edge
if rhs_edge.spec.virtual:
if spack.repo.PATH.is_virtual(rhs_edge.spec.name):
rhs_edge.update_virtuals(virtuals=(rhs_edge.spec.name,))
if not rhs_edge.virtuals:
@@ -3569,10 +3569,6 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
for rhs in other.traverse(root=False)
)
def virtual_dependencies(self):
"""Return list of any virtual deps in this spec."""
return [spec for spec in self.traverse() if spec.virtual]
@property # type: ignore[misc] # decorated prop not supported in mypy
def patches(self):
"""Return patch objects for any patch sha256 sums on this Spec.
@@ -3762,30 +3758,23 @@ def __getitem__(self, name: str):
csv = query_parameters.pop().strip()
query_parameters = re.split(r"\s*,\s*", csv)
order = lambda: itertools.chain(
self.traverse_edges(deptype=dt.LINK, order="breadth", cover="edges"),
self.edges_to_dependencies(depflag=dt.BUILD | dt.RUN | dt.TEST),
self.traverse_edges(deptype=dt.ALL, order="breadth", cover="edges"),
# Consider all direct dependencies and transitive runtime dependencies
order = itertools.chain(
self.edges_to_dependencies(depflag=dt.ALL),
self.traverse_edges(deptype=dt.LINK | dt.RUN, order="breadth", cover="edges"),
)
# Consider runtime dependencies and direct build/test deps before transitive dependencies,
# and prefer matches closest to the root.
try:
child: Spec = next(
e.spec
for e in itertools.chain(
(e for e in order() if e.spec.name == name or name in e.virtuals),
# for historical reasons
(e for e in order() if e.spec.concrete and e.spec.package.provides(name)),
)
)
except StopIteration:
raise KeyError(f"No spec with name {name} in {self}")
edge = next((e for e in order if e.spec.name == name or name in e.virtuals))
except StopIteration as e:
raise KeyError(f"No spec with name {name} in {self}") from e
if self._concrete:
return SpecBuildInterface(child, name, query_parameters, _parent=self)
return SpecBuildInterface(
edge.spec, name, query_parameters, _parent=self, is_virtual=name in edge.virtuals
)
return child
return edge.spec
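In practice the new lookup means a direct dependency of any type wins over a transitive one, and virtuals resolve through the metadata recorded on edges rather than package.provides; a hypothetical session against a concretized DAG:

import spack.concretize

s = spack.concretize.concretize_one("mpileaks")
mpi = s["mpi"]  # found via the virtuals stored on the edge, closest match to the root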
def __contains__(self, spec):
"""True if this spec or some dependency satisfies the spec.
@@ -3801,8 +3790,11 @@ def __contains__(self, spec):
# if anonymous or same name, we only have to look at the root
if not spec.name or spec.name == self.name:
return self.satisfies(spec)
else:
return any(s.satisfies(spec) for s in self.traverse(root=False))
try:
dep = self[spec.name]
except KeyError:
return False
return dep.satisfies(spec)
def eq_dag(self, other, deptypes=True, vs=None, vo=None):
"""True if the full dependency DAGs of specs are equal."""
@@ -3870,6 +3862,13 @@ def _cmp_iter(self):
for item in self._cmp_node():
yield item
# If there is ever a breaking change to hash computation, whether accidental or purposeful,
# two specs can be identical modulo DAG hash, depending on when they were concretized.
# From the perspective of many operations in Spack (database, build cache, etc.) a different
# DAG hash means a different spec. Here we ensure that two otherwise identical specs, one
# serialized before the hash change and one after, are considered different.
yield self.dag_hash() if self.concrete else None
# This needs to be in _cmp_iter so that no specs with different process hashes
# are considered the same by `__hash__` or `__eq__`.
#
@@ -4713,7 +4712,7 @@ def concrete(self):
bool: True or False
"""
return self.spec._concrete or all(
v in self for v in self.spec.package_class.variant_names()
v in self for v in spack.repo.PATH.get_pkg_class(self.spec.fullname).variant_names()
)
def copy(self) -> "VariantMap":
@@ -4735,7 +4734,10 @@ def __str__(self):
bool_keys = []
kv_keys = []
for key in sorted_keys:
bool_keys.append(key) if isinstance(self[key].value, bool) else kv_keys.append(key)
if isinstance(self[key].value, bool):
bool_keys.append(key)
else:
kv_keys.append(key)
# add spaces before and after key/value variants.
string = io.StringIO()
@@ -4770,14 +4772,14 @@ def substitute_abstract_variants(spec: Spec):
elif name in vt.reserved_names:
continue
variant_defs = spec.package_class.variant_definitions(name)
variant_defs = spack.repo.PATH.get_pkg_class(spec.fullname).variant_definitions(name)
valid_defs = []
for when, vdef in variant_defs:
if when.intersects(spec):
valid_defs.append(vdef)
if not valid_defs:
if name not in spec.package_class.variant_names():
if name not in spack.repo.PATH.get_pkg_class(spec.fullname).variant_names():
unknown.append(name)
else:
whens = [str(when) for when, _ in variant_defs]
@@ -4849,7 +4851,9 @@ def reconstruct_virtuals_on_edges(spec):
possible_virtuals = set()
for node in spec.traverse():
try:
possible_virtuals.update({x for x in node.package.dependencies if Spec(x).virtual})
possible_virtuals.update(
{x for x in node.package.dependencies if spack.repo.PATH.is_virtual(x)}
)
except Exception as e:
warnings.warn(f"cannot reconstruct virtual dependencies on package {node.name}: {e}")
continue
@@ -4914,7 +4918,7 @@ def from_node_dict(cls, node):
spec.external_modules = node["external"]["module"]
if spec.external_modules is False:
spec.external_modules = None
spec.extra_attributes = node["external"].get("extra_attributes", {})
spec.extra_attributes = node["external"].get("extra_attributes") or {}
# specs read in are concrete unless marked abstract
if node.get("concrete", True):
@@ -5191,12 +5195,10 @@ def get_host_environment_metadata() -> Dict[str, str]:
def get_host_environment() -> Dict[str, Any]:
"""Return a dictionary (lookup) with host information (not including the
os.environ).
"""
"""Returns a dictionary with host information (not including the os.environ)."""
host_platform = spack.platforms.host()
host_target = host_platform.target("default_target")
host_os = host_platform.operating_system("default_os")
host_target = host_platform.default_target()
host_os = host_platform.default_operating_system()
arch_fmt = "platform={0} os={1} target={2}"
arch_spec = Spec(arch_fmt.format(host_platform, host_os, host_target))
return {

View File

@@ -1,7 +1,7 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
""" Test ABI-based splicing of dependencies """
"""Test ABI-based splicing of dependencies"""
from typing import List

View File

@@ -60,8 +60,7 @@ def test_user_input_combination(config, target_str, os_str):
"""Test for all the valid user input combinations that both the target and
the operating system match.
"""
spec_str = "libelf os={} target={}".format(os_str, target_str)
spec = Spec(spec_str)
spec = Spec(f"libelf os={os_str} target={target_str}")
assert spec.architecture.os == str(TEST_PLATFORM.operating_system(os_str))
assert spec.architecture.target == TEST_PLATFORM.target(target_str)
@@ -71,8 +70,8 @@ def test_default_os_and_target(default_mock_concretization):
after concretization.
"""
spec = default_mock_concretization("libelf")
assert spec.architecture.os == str(TEST_PLATFORM.operating_system("default_os"))
assert spec.architecture.target == TEST_PLATFORM.target("default_target")
assert spec.architecture.os == str(TEST_PLATFORM.default_operating_system())
assert spec.architecture.target == TEST_PLATFORM.default_target()
def test_operating_system_conversion_to_dict():

View File

@@ -1,8 +1,10 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import io
import os
import subprocess
from urllib.error import HTTPError
import pytest
@@ -15,6 +17,7 @@
import spack.paths as spack_paths
import spack.repo as repo
import spack.util.git
from spack.test.conftest import MockHTTPResponse
pytestmark = [pytest.mark.usefixtures("mock_packages")]
@@ -162,38 +165,8 @@ def test_import_signing_key(mock_gnupghome):
ci.import_signing_key(signing_key)
class FakeWebResponder:
def __init__(self, response_code=200, content_to_read=[]):
self._resp_code = response_code
self._content = content_to_read
self._read = [False for c in content_to_read]
def open(self, request, data=None, timeout=object()):
return self
def getcode(self):
return self._resp_code
def read(self, length=None):
if len(self._content) <= 0:
return None
if not self._read[-1]:
return_content = self._content[-1]
if length:
self._read[-1] = True
else:
self._read.pop()
self._content.pop()
return return_content
self._read.pop()
self._content.pop()
return None
def test_download_and_extract_artifacts(tmpdir, monkeypatch, working_env):
os.environ.update({"GITLAB_PRIVATE_TOKEN": "faketoken"})
def test_download_and_extract_artifacts(tmpdir, monkeypatch):
monkeypatch.setenv("GITLAB_PRIVATE_TOKEN", "faketoken")
url = "https://www.nosuchurlexists.itsfake/artifacts.zip"
working_dir = os.path.join(tmpdir.strpath, "repro")
@@ -201,10 +174,13 @@ def test_download_and_extract_artifacts(tmpdir, monkeypatch, working_env):
spack_paths.test_path, "data", "ci", "gitlab", "artifacts.zip"
)
with open(test_artifacts_path, "rb") as fd:
fake_responder = FakeWebResponder(content_to_read=[fd.read()])
def _urlopen_OK(*args, **kwargs):
with open(test_artifacts_path, "rb") as f:
return MockHTTPResponse(
"200", "OK", {"Content-Type": "application/zip"}, io.BytesIO(f.read())
)
monkeypatch.setattr(ci, "build_opener", lambda handler: fake_responder)
monkeypatch.setattr(ci, "urlopen", _urlopen_OK)
ci.download_and_extract_artifacts(url, working_dir)
@@ -214,7 +190,11 @@ def test_download_and_extract_artifacts(tmpdir, monkeypatch, working_env):
found_install = fs.find(working_dir, "install.sh")
assert len(found_install) == 1
fake_responder._resp_code = 400
def _urlopen_500(*args, **kwargs):
raise HTTPError(url, 500, "Internal Server Error", {}, None)
monkeypatch.setattr(ci, "urlopen", _urlopen_500)
with pytest.raises(spack.error.SpackError):
ci.download_and_extract_artifacts(url, working_dir)
@@ -328,16 +308,14 @@ def test_get_spec_filter_list(mutable_mock_env_path, mutable_mock_repo):
e1.add("hypre")
e1.concretize()
"""
Concretizing the above environment results in the following graphs:
# Concretizing the above environment results in the following graphs:
mpileaks -> mpich (provides mpi virtual dep of mpileaks)
-> callpath -> dyninst -> libelf
-> libdwarf -> libelf
-> mpich (provides mpi dep of callpath)
# mpileaks -> mpich (provides mpi virtual dep of mpileaks)
# -> callpath -> dyninst -> libelf
# -> libdwarf -> libelf
# -> mpich (provides mpi dep of callpath)
hypre -> openblas-with-lapack (provides lapack and blas virtual deps of hypre)
"""
# hypre -> openblas-with-lapack (provides lapack and blas virtual deps of hypre)
touched = ["libdwarf"]

View File

@@ -12,7 +12,7 @@
build_env = SpackCommand("build-env")
@pytest.mark.parametrize("pkg", [("zlib",), ("zlib", "--")])
@pytest.mark.parametrize("pkg", [("pkg-c",), ("pkg-c", "--")])
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_it_just_runs(pkg):
build_env(*pkg)
@@ -38,7 +38,7 @@ def test_build_env_requires_a_spec(args):
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_dump(shell_as, shell, tmpdir):
with tmpdir.as_cwd():
build_env("--dump", _out_file, "zlib")
build_env("--dump", _out_file, "pkg-c")
with open(_out_file, encoding="utf-8") as f:
if shell == "pwsh":
assert any(line.startswith("$Env:PATH") for line in f.readlines())
@@ -51,7 +51,7 @@ def test_dump(shell_as, shell, tmpdir):
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_pickle(tmpdir):
with tmpdir.as_cwd():
build_env("--pickle", _out_file, "zlib")
build_env("--pickle", _out_file, "pkg-c")
environment = pickle.load(open(_out_file, "rb"))
assert isinstance(environment, dict)
assert "PATH" in environment

View File

@@ -148,7 +148,7 @@ def test_update_key_index(
s = spack.concretize.concretize_one("libdwarf")
# Install a package
install(s.name)
install("--fake", s.name)
# Put installed package in the buildcache, which, because we're signing
# it, should result in the public key getting pushed to the buildcache
@@ -178,7 +178,7 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
s = spack.concretize.concretize_one("libdwarf")
# Install and generate build cache index
PackageInstaller([s.package], explicit=True).install()
PackageInstaller([s.package], fake=True, explicit=True).install()
metadata_file = spack.binary_distribution.tarball_name(s, ".spec.json")
@@ -220,7 +220,7 @@ def verify_mirror_contents():
# Install a package and put it in the buildcache
s = spack.concretize.concretize_one(out_env_pkg)
install(s.name)
install("--fake", s.name)
buildcache("push", "-u", "-f", src_mirror_url, s.name)
env("create", "test")


@@ -3,6 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import pathlib
import pytest
@@ -22,7 +23,15 @@
@pytest.fixture
def can_fetch_versions(monkeypatch):
def no_add(monkeypatch):
def add_versions_to_pkg(pkg, version_lines, open_in_editor):
raise AssertionError("Should not be called")
monkeypatch.setattr(spack.cmd.checksum, "add_versions_to_pkg", add_versions_to_pkg)
@pytest.fixture
def can_fetch_versions(monkeypatch, no_add):
"""Fake successful version detection."""
def fetch_remote_versions(pkg, concurrency):
@@ -45,7 +54,7 @@ def url_exists(url, curl=None):
@pytest.fixture
def cannot_fetch_versions(monkeypatch):
def cannot_fetch_versions(monkeypatch, no_add):
"""Fake unsuccessful version detection."""
def fetch_remote_versions(pkg, concurrency):
@@ -88,7 +97,6 @@ def test_checksum_args(arguments, expected):
(["--batch", "preferred-test"], "version of preferred-test"),
(["--latest", "preferred-test"], "Found 1 version"),
(["--preferred", "preferred-test"], "Found 1 version"),
(["--add-to-package", "preferred-test"], "Added 0 new versions to"),
(["--verify", "preferred-test"], "Verified 1 of 1"),
(["--verify", "zlib", "1.2.13"], "1.2.13 [-] No previous checksum"),
],
@@ -271,15 +279,12 @@ def test_checksum_interactive_unrecognized_command():
assert interactive_version_filter(v.copy(), input=input) == v
def test_checksum_versions(mock_packages, can_fetch_versions):
def test_checksum_versions(mock_packages, can_fetch_versions, monkeypatch):
pkg_cls = spack.repo.PATH.get_pkg_class("zlib")
versions = [str(v) for v in pkg_cls.versions]
output = spack_checksum("zlib", *versions)
assert "Found 3 versions" in output
assert "version(" in output
output = spack_checksum("--add-to-package", "zlib", *versions)
assert "Found 3 versions" in output
assert "Added 0 new versions to" in output
def test_checksum_missing_version(mock_packages, cannot_fetch_versions):
@@ -287,7 +292,6 @@ def test_checksum_missing_version(mock_packages, cannot_fetch_versions):
assert "Could not find any remote versions" in output
output = spack_checksum("--add-to-package", "preferred-test", "99.99.99", fail_on_error=False)
assert "Could not find any remote versions" in output
assert "Added 1 new versions to" not in output
def test_checksum_deprecated_version(mock_packages, can_fetch_versions):
@@ -297,8 +301,6 @@ def test_checksum_deprecated_version(mock_packages, can_fetch_versions):
"--add-to-package", "deprecated-versions", "1.1.0", fail_on_error=False
)
assert "Version 1.1.0 is deprecated" in output
# TODO alecbcs: broken assertion.
# assert "Added 0 new versions to" not in output
def test_checksum_url(mock_packages, config):
@@ -337,3 +339,52 @@ def test_checksum_manual_download_fails(mock_packages, monkeypatch):
monkeypatch.setattr(spack.package_base.PackageBase, "download_instr", error)
with pytest.raises(ManualDownloadRequiredError, match=error):
spack_checksum(name, *versions)
def test_update_package_contents(tmp_path: pathlib.Path):
"""Test that the package.py file is updated with the new versions."""
pkg_path = tmp_path / "package.py"
pkg_path.write_text(
"""\
from spack.package import *
class Zlib(Package):
homepage = "http://zlib.net"
url = "http://zlib.net/fossils/zlib-1.2.11.tar.gz"
version("1.2.11", sha256="c3e5e9fdd5004dcb542feda5ee4f0ff0744628baf8ed2dd5d66f8ca1197cb1a1")
version("1.2.8", sha256="36658cb768a54c1d4dec43c3116c27ed893e88b02ecfcb44f2166f9c0b7f2a0d")
version("1.2.3", sha256="1795c7d067a43174113fdf03447532f373e1c6c57c08d61d9e4e9be5e244b05e")
variant("pic", default=True, description="test")
def install(self, spec, prefix):
make("install")
"""
)
version_lines = """\
version("1.2.13", sha256="abcdef1234567890abcdef1234567890abcdef1234567890abcdef1234567890")
version("1.2.5", sha256="abcdef1234567890abcdef1234567890abcdef1234567890abcdef1234567890")
version("1.2.3", sha256="1795c7d067a43174113fdf03447532f373e1c6c57c08d61d9e4e9be5e244b05e")
"""
# two new versions are added
assert spack.cmd.checksum.add_versions_to_pkg(str(pkg_path), version_lines) == 2
assert (
pkg_path.read_text()
== """\
from spack.package import *
class Zlib(Package):
homepage = "http://zlib.net"
url = "http://zlib.net/fossils/zlib-1.2.11.tar.gz"
version("1.2.13", sha256="abcdef1234567890abcdef1234567890abcdef1234567890abcdef1234567890") # FIXME
version("1.2.11", sha256="c3e5e9fdd5004dcb542feda5ee4f0ff0744628baf8ed2dd5d66f8ca1197cb1a1")
version("1.2.8", sha256="36658cb768a54c1d4dec43c3116c27ed893e88b02ecfcb44f2166f9c0b7f2a0d")
version("1.2.5", sha256="abcdef1234567890abcdef1234567890abcdef1234567890abcdef1234567890") # FIXME
version("1.2.3", sha256="1795c7d067a43174113fdf03447532f373e1c6c57c08d61d9e4e9be5e244b05e")
variant("pic", default=True, description="test")
def install(self, spec, prefix):
make("install")
"""
)
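The expected return value of 2 in the test above comes from simple bookkeeping: of the three proposed `version()` lines, only 1.2.13 and 1.2.5 are absent from the package. A toy sketch of that counting step (the real `add_versions_to_pkg` in `spack.cmd.checksum` also splices the new lines into the file):

import re


def count_new_versions(pkg_text: str, version_lines: str) -> int:
    # Count proposed version() entries not already present in package.py.
    existing = set(re.findall(r'version\("([^"]+)"', pkg_text))
    proposed = re.findall(r'version\("([^"]+)"', version_lines)
    return sum(v not in existing for v in proposed)

With the package.py text and version_lines from the test, this returns 2: 1.2.13 and 1.2.5 are new, while 1.2.3 already exists.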


@@ -28,6 +28,7 @@
from spack.ci.generator_registry import generator
from spack.cmd.ci import FAILED_CREATE_BUILDCACHE_CODE
from spack.database import INDEX_JSON_FILE
from spack.error import SpackError
from spack.schema.buildcache_spec import schema as specfile_schema
from spack.schema.database_index import schema as db_idx_schema
from spack.spec import Spec
@@ -170,7 +171,9 @@ def test_ci_generate_with_env(ci_generate_test, tmp_path, mock_binary_index):
url: https://my.fake.cdash
project: Not used
site: Nothing
"""
""",
"--artifacts-root",
str(tmp_path / "my_artifacts_root"),
)
yaml_contents = syaml.load(outputfile.read_text())
@@ -192,7 +195,7 @@ def test_ci_generate_with_env(ci_generate_test, tmp_path, mock_binary_index):
assert "variables" in yaml_contents
assert "SPACK_ARTIFACTS_ROOT" in yaml_contents["variables"]
assert yaml_contents["variables"]["SPACK_ARTIFACTS_ROOT"] == "jobs_scratch_dir"
assert yaml_contents["variables"]["SPACK_ARTIFACTS_ROOT"] == "my_artifacts_root"
def test_ci_generate_with_env_missing_section(ci_generate_test, tmp_path, mock_binary_index):
@@ -329,14 +332,14 @@ def test_ci_generate_pkg_with_deps(ci_generate_test, tmp_path, ci_base_environme
f"""\
spack:
specs:
- flatten-deps
- dependent-install
mirrors:
buildcache-destination: {tmp_path / 'ci-mirror'}
ci:
pipeline-gen:
- submapping:
- match:
- flatten-deps
- dependent-install
build-job:
tags:
- donotcare
@@ -355,12 +358,12 @@ def test_ci_generate_pkg_with_deps(ci_generate_test, tmp_path, ci_base_environme
assert "stage" in ci_obj
assert ci_obj["stage"] == "stage-0"
found.append("dependency-install")
if "flatten-deps" in ci_key:
if "dependent-install" in ci_key:
assert "stage" in ci_obj
assert ci_obj["stage"] == "stage-1"
found.append("flatten-deps")
found.append("dependent-install")
assert "flatten-deps" in found
assert "dependent-install" in found
assert "dependency-install" in found
@@ -372,14 +375,14 @@ def test_ci_generate_for_pr_pipeline(ci_generate_test, tmp_path, monkeypatch):
f"""\
spack:
specs:
- flatten-deps
- dependent-install
mirrors:
buildcache-destination: {tmp_path / 'ci-mirror'}
ci:
pipeline-gen:
- submapping:
- match:
- flatten-deps
- dependent-install
build-job:
tags:
- donotcare
@@ -899,7 +902,7 @@ def test_ci_generate_override_runner_attrs(
f"""\
spack:
specs:
- flatten-deps
- dependent-install
- pkg-a
mirrors:
buildcache-destination: {tmp_path / "ci-mirror"}
@@ -908,7 +911,7 @@ def test_ci_generate_override_runner_attrs(
- match_behavior: {match_behavior}
submapping:
- match:
- flatten-deps
- dependent-install
build-job:
tags:
- specific-one
@@ -1006,8 +1009,8 @@ def test_ci_generate_override_runner_attrs(
assert the_elt["script"][0] == "main step"
assert len(the_elt["after_script"]) == 1
assert the_elt["after_script"][0] == "post step one"
if "flatten-deps" in ci_key:
# The flatten-deps match specifies that we keep the two
if "dependent-install" in ci_key:
# The dependent-install match specifies that we keep the two
# top level variables, but add a third specific one. It
# also adds a custom tag which should be combined with
# the top-level tag.
@@ -1062,7 +1065,7 @@ def test_ci_rebuild_index(
with open(tmp_path / "spec.json", "w", encoding="utf-8") as f:
f.write(concrete_spec.to_json(hash=ht.dag_hash))
install_cmd("--add", "-f", str(tmp_path / "spec.json"))
install_cmd("--fake", "--add", "-f", str(tmp_path / "spec.json"))
buildcache_cmd("push", "-u", "-f", mirror_url, "callpath")
ci_cmd("rebuild-index")
@@ -1182,12 +1185,12 @@ def test_ci_generate_read_broken_specs_url(
spec_a = spack.concretize.concretize_one("pkg-a")
a_dag_hash = spec_a.dag_hash()
spec_flattendeps = spack.concretize.concretize_one("flatten-deps")
spec_flattendeps = spack.concretize.concretize_one("dependent-install")
flattendeps_dag_hash = spec_flattendeps.dag_hash()
broken_specs_url = tmp_path.as_uri()
# Mark 'a' as broken (but not 'flatten-deps')
# Mark 'a' as broken (but not 'dependent-install')
broken_spec_a_url = "{0}/{1}".format(broken_specs_url, a_dag_hash)
job_stack = "job_stack"
a_job_url = "a_job_url"
@@ -1201,7 +1204,7 @@ def test_ci_generate_read_broken_specs_url(
f"""\
spack:
specs:
- flatten-deps
- dependent-install
- pkg-a
mirrors:
buildcache-destination: {(tmp_path / "ci-mirror").as_uri()}
@@ -1211,7 +1214,7 @@ def test_ci_generate_read_broken_specs_url(
- submapping:
- match:
- pkg-a
- flatten-deps
- dependent-install
- pkg-b
- dependency-install
build-job:
@@ -1234,7 +1237,7 @@ def test_ci_generate_read_broken_specs_url(
)
assert expected in output
not_expected = f"flatten-deps/{flattendeps_dag_hash[:7]} (in stack"
not_expected = f"dependent-install/{flattendeps_dag_hash[:7]} (in stack"
assert not_expected not in output
@@ -1322,44 +1325,50 @@ def test_ci_reproduce(
env.concretize()
env.write()
repro_dir.mkdir()
    job_spec = env.concrete_roots()[0]
    with open(repro_dir / "archivefiles.json", "w", encoding="utf-8") as f:
        f.write(job_spec.to_json(hash=ht.dag_hash))

    artifacts_root = repro_dir / "jobs_scratch_dir"
    pipeline_path = artifacts_root / "pipeline.yml"

    ci_cmd(
        "generate",
        "--output-file",
        str(pipeline_path),
        "--artifacts-root",
        str(artifacts_root),
    )

    job_name = gitlab_generator.get_job_name(job_spec)

    with open(repro_dir / "repro.json", "w", encoding="utf-8") as f:
        f.write(
            json.dumps(
                {
                    "job_name": job_name,
                    "job_spec_json": "archivefiles.json",
                    "ci_project_dir": str(repro_dir),
                }
            )
        )

    with open(repro_dir / "install.sh", "w", encoding="utf-8") as f:
        f.write("#!/bin/sh\n\n#fake install\nspack install blah\n")

    with open(repro_dir / "spack_info.txt", "w", encoding="utf-8") as f:
        f.write(f"\nMerge {last_two_git_commits[1]} into {last_two_git_commits[0]}\n\n")

    def fake_download_and_extract_artifacts(url, work_dir):
        pass

    def fake_download_and_extract_artifacts(url, work_dir, merge_commit_test=True):
        with working_dir(tmp_path), ev.Environment(".") as env:
            if not os.path.exists(repro_dir):
                repro_dir.mkdir()

            job_spec = env.concrete_roots()[0]
            with open(repro_dir / "archivefiles.json", "w", encoding="utf-8") as f:
                f.write(job_spec.to_json(hash=ht.dag_hash))

            artifacts_root = repro_dir / "scratch_dir"
            pipeline_path = artifacts_root / "pipeline.yml"

            ci_cmd(
                "generate",
                "--output-file",
                str(pipeline_path),
                "--artifacts-root",
                str(artifacts_root),
            )

            job_name = gitlab_generator.get_job_name(job_spec)

            with open(repro_dir / "repro.json", "w", encoding="utf-8") as f:
                f.write(
                    json.dumps(
                        {
                            "job_name": job_name,
                            "job_spec_json": "archivefiles.json",
                            "ci_project_dir": str(repro_dir),
                        }
                    )
                )

            with open(repro_dir / "install.sh", "w", encoding="utf-8") as f:
                f.write("#!/bin/sh\n\n#fake install\nspack install blah\n")

            with open(repro_dir / "spack_info.txt", "w", encoding="utf-8") as f:
                if merge_commit_test:
                    f.write(
                        f"\nMerge {last_two_git_commits[1]} into {last_two_git_commits[0]}\n\n"
                    )
                else:
                    f.write(f"\ncommit {last_two_git_commits[1]}\n\n")

            return "jobs_scratch_dir"
monkeypatch.setattr(ci, "download_and_extract_artifacts", fake_download_and_extract_artifacts)
rep_out = ci_cmd(
@@ -1375,6 +1384,64 @@ def fake_download_and_extract_artifacts(url, work_dir):
# Make sure we tell the user where it is when not in interactive mode
assert f"$ {repro_dir}/start.sh" in rep_out
# Ensure the correct commits are used
assert f"checkout_commit: {last_two_git_commits[0]}" in rep_out
assert f"merge_commit: {last_two_git_commits[1]}" in rep_out
# Test re-running in dirty working dir
with pytest.raises(SpackError, match=f"{repro_dir}"):
rep_out = ci_cmd(
"reproduce-build",
"https://example.com/api/v1/projects/1/jobs/2/artifacts",
"--working-dir",
str(repro_dir),
output=str,
)
# Cleanup between tests
shutil.rmtree(repro_dir)
# Test --use-local-head
rep_out = ci_cmd(
"reproduce-build",
"https://example.com/api/v1/projects/1/jobs/2/artifacts",
"--use-local-head",
"--working-dir",
str(repro_dir),
output=str,
)
# Make sure we are checking out the HEAD commit without a merge commit
assert "checkout_commit: HEAD" in rep_out
assert "merge_commit: None" in rep_out
# Test the case where the spack_info.txt is not a merge commit
monkeypatch.setattr(
ci,
"download_and_extract_artifacts",
lambda url, wd: fake_download_and_extract_artifacts(url, wd, False),
)
# Cleanup between tests
shutil.rmtree(repro_dir)
rep_out = ci_cmd(
"reproduce-build",
"https://example.com/api/v1/projects/1/jobs/2/artifacts",
"--working-dir",
str(repro_dir),
output=str,
)
# Make sure the script was generated
assert (repro_dir / "start.sh").exists()
# Make sure we tell the user where it is when not in interactive mode
assert f"$ {repro_dir}/start.sh" in rep_out
# Ensure the correct commit is used (different than HEAD)
assert f"checkout_commit: {last_two_git_commits[1]}" in rep_out
assert "merge_commit: None" in rep_out
@pytest.mark.parametrize(
"url_in,url_out",
@@ -1447,7 +1514,7 @@ def test_gitlab_config_scopes(ci_generate_test, tmp_path):
include: [{configs_path}]
view: false
specs:
- flatten-deps
- dependent-install
mirrors:
buildcache-destination: {tmp_path / "ci-mirror"}
ci:


@@ -335,7 +335,7 @@ def test_config_add_override_leaf_from_file(mutable_empty_config, tmpdir):
def test_config_add_update_dict_from_file(mutable_empty_config, tmpdir):
config("add", "packages:all:compiler:[gcc]")
config("add", "packages:all:require:['%gcc']")
# contents to add to file
contents = """spack:
@@ -357,7 +357,7 @@ def test_config_add_update_dict_from_file(mutable_empty_config, tmpdir):
expected = """packages:
all:
target: [x86_64]
compiler: [gcc]
require: ['%gcc']
"""
assert expected == output
@@ -606,7 +606,6 @@ def test_config_prefer_upstream(
packages = syaml.load(open(cfg_file, encoding="utf-8"))["packages"]
# Make sure only the non-default variants are set.
assert packages["all"] == {"compiler": ["gcc@=10.2.1"]}
assert packages["boost"] == {"variants": "+debug +graph", "version": ["1.63.0"]}
assert packages["dependency-install"] == {"version": ["2.0"]}
# Ensure that neither variant gets listed for hdf5, since they conflict


@@ -2,57 +2,21 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import platform
import pytest
import spack
import spack.platforms
import spack.spec
from spack.database import INDEX_JSON_FILE
from spack.main import SpackCommand
from spack.util.executable import which
debug = SpackCommand("debug")
@pytest.mark.db
def test_create_db_tarball(tmpdir, database):
with tmpdir.as_cwd():
debug("create-db-tarball")
# get the first non-dotfile to avoid coverage files in the directory
files = os.listdir(os.getcwd())
tarball_name = next(
f for f in files if not f.startswith(".") and not f.startswith("tests")
)
# debug command made an archive
assert os.path.exists(tarball_name)
# print contents of archive
tar = which("tar")
contents = tar("tzf", tarball_name, output=str)
# DB file is included
assert INDEX_JSON_FILE in contents
# specfiles from all installs are included
for spec in database.query():
# externals won't have a specfile
if spec.external:
continue
spec_suffix = "%s/.spack/spec.json" % spec.dag_hash()
assert spec_suffix in contents
def test_report():
out = debug("report")
host_platform = spack.platforms.host()
host_os = host_platform.operating_system("frontend")
host_target = host_platform.target("frontend")
host_os = host_platform.default_operating_system()
host_target = host_platform.default_target()
architecture = spack.spec.ArchSpec((str(host_platform), str(host_os), str(host_target)))
assert spack.get_version() in out


@@ -24,32 +24,24 @@
mpi_deps = ["fake"]
def test_direct_dependencies(mock_packages):
out = dependencies("mpileaks")
actual = set(re.split(r"\s+", out.strip()))
expected = set(["callpath"] + mpis)
assert expected == actual
def test_transitive_dependencies(mock_packages):
out = dependencies("--transitive", "mpileaks")
actual = set(re.split(r"\s+", out.strip()))
expected = set(["callpath", "dyninst", "libdwarf", "libelf"] + mpis + mpi_deps)
assert expected == actual
def test_transitive_dependencies_with_deptypes(mock_packages):
out = dependencies("--transitive", "--deptype=link,run", "dtbuild1")
deps = set(re.split(r"\s+", out.strip()))
assert set(["dtlink2", "dtrun2"]) == deps
out = dependencies("--transitive", "--deptype=build", "dtbuild1")
deps = set(re.split(r"\s+", out.strip()))
assert set(["dtbuild2", "dtlink2"]) == deps
out = dependencies("--transitive", "--deptype=link", "dtbuild1")
deps = set(re.split(r"\s+", out.strip()))
assert set(["dtlink2"]) == deps
@pytest.mark.parametrize(
"cli_args,expected",
[
(["mpileaks"], set(["callpath"] + mpis)),
(
["--transitive", "mpileaks"],
set(["callpath", "dyninst", "libdwarf", "libelf"] + mpis + mpi_deps),
),
(["--transitive", "--deptype=link,run", "dtbuild1"], {"dtlink2", "dtrun2"}),
(["--transitive", "--deptype=build", "dtbuild1"], {"dtbuild2", "dtlink2"}),
(["--transitive", "--deptype=link", "dtbuild1"], {"dtlink2"}),
],
)
def test_direct_dependencies(cli_args, expected, mock_runtimes):
out = dependencies(*cli_args)
result = set(re.split(r"\s+", out.strip()))
expected.update(mock_runtimes)
assert expected == result
@pytest.mark.db
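The consolidation above is the standard pytest idiom for collapsing near-identical tests into one parametrized test. A minimal self-contained analogue (toy graph, not the mock packages used here):

import pytest

# Toy dependency table standing in for the mock repo.
GRAPH = {"a": {"b"}, "b": {"c"}, "c": set()}


def deps_of(pkg, transitive=False):
    # Direct deps, optionally expanded by depth-first traversal.
    out = set(GRAPH[pkg])
    if transitive:
        stack = list(out)
        while stack:
            for dep in GRAPH[stack.pop()]:
                if dep not in out:
                    out.add(dep)
                    stack.append(dep)
    return out


@pytest.mark.parametrize("args,expected", [(("a",), {"b"}), (("a", True), {"b", "c"})])
def test_deps_of(args, expected):
    assert deps_of(*args) == expected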


@@ -17,16 +17,16 @@
def test_deprecate(mock_packages, mock_archive, mock_fetch, install_mockery):
install("libelf@0.8.13")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.10")
all_installed = spack.store.STORE.db.query()
all_installed = spack.store.STORE.db.query("libelf")
assert len(all_installed) == 2
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
non_deprecated = spack.store.STORE.db.query("libelf")
all_available = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.ANY)
assert all_available == all_installed
assert non_deprecated == spack.store.STORE.db.query("libelf@0.8.13")
@@ -39,24 +39,24 @@ def test_deprecate_fails_no_such_package(mock_packages, mock_archive, mock_fetch
output = deprecate("-y", "libelf@0.8.10", "libelf@0.8.13", fail_on_error=False)
assert "Spec 'libelf@0.8.10' matches no installed packages" in output
install("libelf@0.8.10")
install("--fake", "libelf@0.8.10")
output = deprecate("-y", "libelf@0.8.10", "libelf@0.8.13", fail_on_error=False)
assert "Spec 'libelf@0.8.13' matches no installed packages" in output
def test_deprecate_install(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that the ```-i`` option allows us to deprecate in favor of a spec
that is not yet installed."""
install("libelf@0.8.10")
to_deprecate = spack.store.STORE.db.query()
def test_deprecate_install(mock_packages, mock_archive, mock_fetch, install_mockery, monkeypatch):
"""Tests that the -i option allows us to deprecate in favor of a spec
that is not yet installed.
"""
install("--fake", "libelf@0.8.10")
to_deprecate = spack.store.STORE.db.query("libelf")
assert len(to_deprecate) == 1
deprecate("-y", "-i", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
deprecated = spack.store.STORE.db.query(installed=InstallRecordStatus.DEPRECATED)
non_deprecated = spack.store.STORE.db.query("libelf")
deprecated = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.DEPRECATED)
assert deprecated == to_deprecate
assert len(non_deprecated) == 1
assert non_deprecated[0].satisfies("libelf@0.8.13")
@@ -64,8 +64,8 @@ def test_deprecate_install(mock_packages, mock_archive, mock_fetch, install_mock
def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Test that the deprecate command deprecates all dependencies properly."""
install("libdwarf@20130729 ^libelf@0.8.13")
install("libdwarf@20130207 ^libelf@0.8.10")
install("--fake", "libdwarf@20130729 ^libelf@0.8.13")
install("--fake", "libdwarf@20130207 ^libelf@0.8.10")
new_spec = spack.concretize.concretize_one("libdwarf@20130729^libelf@0.8.13")
old_spec = spack.concretize.concretize_one("libdwarf@20130207^libelf@0.8.10")
@@ -81,14 +81,14 @@ def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery
assert all_available == all_installed
assert sorted(all_available) == sorted(deprecated + non_deprecated)
assert sorted(non_deprecated) == sorted(list(new_spec.traverse()))
assert sorted(deprecated) == sorted(list(old_spec.traverse()))
assert sorted(non_deprecated) == sorted(new_spec.traverse())
assert sorted(deprecated) == sorted([old_spec, old_spec["libelf"]])
def test_uninstall_deprecated(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that we can still uninstall deprecated packages."""
install("libelf@0.8.13")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.10")
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
@@ -104,9 +104,9 @@ def test_uninstall_deprecated(mock_packages, mock_archive, mock_fetch, install_m
def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that we can re-deprecate a spec to change its deprecator."""
install("libelf@0.8.13")
install("libelf@0.8.12")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.12")
install("--fake", "libelf@0.8.10")
deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
@@ -117,8 +117,8 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
non_deprecated = spack.store.STORE.db.query("libelf")
all_available = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.ANY)
assert len(non_deprecated) == 2
assert len(all_available) == 3
@@ -129,9 +129,9 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that when a deprecator spec is deprecated, its deprecatee specs
are updated to point to the new deprecator."""
install("libelf@0.8.13")
install("libelf@0.8.12")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.12")
install("--fake", "libelf@0.8.10")
first_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
second_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.12")
@@ -144,8 +144,8 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
deprecate("-y", "libelf@0.8.12", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
non_deprecated = spack.store.STORE.db.query("libelf")
all_available = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.ANY)
assert len(non_deprecated) == 1
assert len(all_available) == 3
@@ -158,8 +158,8 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
def test_concretize_deprecated(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that the concretizer throws an error if we concretize to a
deprecated spec"""
install("libelf@0.8.13")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.10")
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")


@@ -127,16 +127,15 @@ def test_dev_build_before_until(tmpdir, install_mockery):
assert not_installed in out
def print_spack_cc(*args):
# Eat arguments and print environment variable to test
print(os.environ.get("CC", ""))
def _print_spack_short_spec(*args):
print(f"SPACK_SHORT_SPEC={os.environ['SPACK_SHORT_SPEC']}")
def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery, working_env):
monkeypatch.setattr(os, "execvp", print_spack_cc)
monkeypatch.setattr(os, "execvp", _print_spack_short_spec)
with tmpdir.as_cwd():
output = dev_build("-b", "edit", "--drop-in", "sh", "dev-build-test-install@0.0.0")
assert os.path.join("lib", "spack", "env") in output
assert "SPACK_SHORT_SPEC=dev-build-test-install@0.0.0" in output
def test_dev_build_fails_already_installed(tmpdir, install_mockery):


@@ -194,7 +194,7 @@ def test_diff_cmd(install_mockery, mock_fetch, mock_archive, mock_packages):
def test_load_first(install_mockery, mock_fetch, mock_archive, mock_packages):
"""Test with and without the --first option"""
install_cmd("mpileaks")
install_cmd("--fake", "mpileaks")
# Only one version of mpileaks will work
diff_cmd("mpileaks", "mpileaks")


@@ -1038,6 +1038,58 @@ def test_init_from_yaml(environment_from_manifest):
assert not e2.specs_by_hash
def test_init_from_yaml_relative_includes(tmp_path):
files = [
"relative_copied/packages.yaml",
"./relative_copied/compilers.yaml",
"repos.yaml",
"./config.yaml",
]
manifest = f"""
spack:
specs: []
include: {files}
"""
e1_path = tmp_path / "e1"
e1_manifest = e1_path / "spack.yaml"
fs.mkdirp(e1_path)
with open(e1_manifest, "w", encoding="utf-8") as f:
f.write(manifest)
for f in files:
fs.touchp(e1_path / f)
e2 = _env_create("test2", init_file=e1_manifest)
for f in files:
assert os.path.exists(os.path.join(e2.path, f))
def test_init_from_yaml_relative_includes_outside_env(tmp_path):
files = ["../outside_env_not_copied/repos.yaml"]
manifest = f"""
spack:
specs: []
include: {files}
"""
# subdir to ensure parent of environment dir is not shared
e1_path = tmp_path / "e1_subdir" / "e1"
e1_manifest = e1_path / "spack.yaml"
fs.mkdirp(e1_path)
with open(e1_manifest, "w", encoding="utf-8") as f:
f.write(manifest)
for f in files:
fs.touchp(e1_path / f)
with pytest.raises(spack.config.ConfigFileError, match="Detected 1 missing include"):
_ = _env_create("test2", init_file=e1_manifest)
def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
fake_prefix = tmp_path / "a-prefix"
fake_bin = fake_prefix / "bin"
@@ -1698,7 +1750,10 @@ def check_stage(spec):
spec = spack.concretize.concretize_one(spec)
for dep in spec.traverse():
stage_name = f"{stage_prefix}{dep.name}-{dep.version}-{dep.dag_hash()}"
assert os.path.isdir(os.path.join(root, stage_name))
if dep.external:
assert not os.path.exists(os.path.join(root, stage_name))
else:
assert os.path.isdir(os.path.join(root, stage_name))
check_stage("mpileaks")
check_stage("zmpi")
@@ -2614,7 +2669,7 @@ def test_stack_yaml_remove_from_matrix_no_effect(tmpdir):
- packages:
- matrix:
- [mpileaks, callpath]
- [target=be]
- [target=default_target]
specs:
- $packages
"""
@@ -2639,7 +2694,7 @@ def test_stack_yaml_force_remove_from_matrix(tmpdir):
- packages:
- matrix:
- [mpileaks, callpath]
- [target=be]
- [target=default_target]
specs:
- $packages
"""
@@ -2659,7 +2714,7 @@ def test_stack_yaml_force_remove_from_matrix(tmpdir):
assert before_user == after_user
mpileaks_spec = Spec("mpileaks target=be")
mpileaks_spec = Spec("mpileaks target=default_target")
assert mpileaks_spec in before_conc
assert mpileaks_spec not in after_conc
@@ -3023,11 +3078,10 @@ def test_stack_view_activate_from_default(
assert "FOOBAR=mpileaks" in shell
def test_envvar_set_in_activate(tmpdir, mock_fetch, mock_packages, mock_archive, install_mockery):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w", encoding="utf-8") as f:
f.write(
"""\
def test_envvar_set_in_activate(tmp_path, mock_packages, install_mockery):
spack_yaml = tmp_path / "spack.yaml"
spack_yaml.write_text(
"""
spack:
specs:
- cmake%gcc
@@ -3035,21 +3089,21 @@ def test_envvar_set_in_activate(tmpdir, mock_fetch, mock_packages, mock_archive,
set:
ENVAR_SET_IN_ENV_LOAD: "True"
"""
        )
    with tmpdir.as_cwd():
        env("create", "test", "./spack.yaml")
        with ev.read("test"):
            install()

    test_env = ev.read("test")
    output = env("activate", "--sh", "test")
    assert "ENVAR_SET_IN_ENV_LOAD=True" in output

    with test_env:
        with spack.util.environment.set_env(ENVAR_SET_IN_ENV_LOAD="True"):
            output = env("deactivate", "--sh")
            assert "unset ENVAR_SET_IN_ENV_LOAD" in output

    )
    env("create", "test", str(spack_yaml))
    with ev.read("test"):
        install("--fake")

    test_env = ev.read("test")
    output = env("activate", "--sh", "test")
    assert "ENVAR_SET_IN_ENV_LOAD=True" in output

    with test_env:
        with spack.util.environment.set_env(ENVAR_SET_IN_ENV_LOAD="True"):
            output = env("deactivate", "--sh")
            assert "unset ENVAR_SET_IN_ENV_LOAD" in output
def test_stack_view_no_activate_without_default(


@@ -233,21 +233,27 @@ def test_display_json_deps(database, capsys):
@pytest.mark.db
def test_find_format(database, config):
output = find("--format", "{name}-{^mpi.name}", "mpileaks")
assert set(output.strip().split("\n")) == set(
["mpileaks-zmpi", "mpileaks-mpich", "mpileaks-mpich2"]
)
assert set(output.strip().split("\n")) == {
"mpileaks-zmpi",
"mpileaks-mpich",
"mpileaks-mpich2",
}
output = find("--format", "{name}-{version}-{compiler.name}-{^mpi.name}", "mpileaks")
assert "installed package" not in output
assert set(output.strip().split("\n")) == set(
["mpileaks-2.3-gcc-zmpi", "mpileaks-2.3-gcc-mpich", "mpileaks-2.3-gcc-mpich2"]
)
assert set(output.strip().split("\n")) == {
"mpileaks-2.3-gcc-zmpi",
"mpileaks-2.3-gcc-mpich",
"mpileaks-2.3-gcc-mpich2",
}
output = find("--format", "{name}-{^mpi.name}-{hash:7}", "mpileaks")
elements = output.strip().split("\n")
assert set(e[:-7] for e in elements) == set(
["mpileaks-zmpi-", "mpileaks-mpich-", "mpileaks-mpich2-"]
)
assert set(e[:-7] for e in elements) == {
"mpileaks-zmpi-",
"mpileaks-mpich-",
"mpileaks-mpich2-",
}
# hashes are in base32
for e in elements:
@@ -348,7 +354,7 @@ def test_find_prefix_in_env(
"""Test `find` formats requiring concrete specs work in environments."""
env("create", "test")
with ev.read("test"):
install("--add", "mpileaks")
install("--fake", "--add", "mpileaks")
find("-p")
find("-l")
find("-L")


@@ -139,7 +139,7 @@ def test_gc_except_specific_environments(mutable_database, mutable_mock_env_path
def test_gc_except_nonexisting_dir_env(mutable_database, mutable_mock_env_path, tmpdir):
output = gc("-ye", tmpdir.strpath, fail_on_error=False)
assert "No such environment" in output
gc.returncode == 1
assert gc.returncode == 1
@pytest.mark.db

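The one-character family of bugs fixed in the gc test above (a bare comparison instead of an assert) is worth a standalone illustration:

def test_bare_comparison_is_a_noop():
    returncode = 1
    # This line evaluates to False and is silently discarded; the test
    # would still pass. That is the bug the diff above fixes.
    returncode == 0
    # Only an assert turns the comparison into a real check.
    assert returncode == 1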
Some files were not shown because too many files have changed in this diff.