Compare commits

...

378 Commits

Author SHA1 Message Date
Paul R. C. Kent
66c1c213b1 gcc: add v15.1.0 (#50212) 2025-04-25 22:08:01 -06:00
Mathew Cleveland
f46528ec6b draco: add v7.20.0 (#49996)
Co-authored-by: Cleveland <cleveland@lanl.gov>
2025-04-25 17:37:26 +02:00
Gregor Daiß
41489efa4c sgpp: update dependencies and variants (#49384)
* sgpp: add new variants and constraints

* sgpp: fix format

* sgpp: add missing patch

* sgpp: fix style

* sgpp: Stop applying aarch patch for newer versions

* sgpp: Better Eigen variant description

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* sgpp: fix format

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2025-04-25 17:32:30 +02:00
Massimiliano Culpo
3df5a85237 input_analysis.py: fix conditional requirements (#50194)
Fixes a logic bug where a -> b was assumed to imply not a -> not b
in conditional requirements.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-25 09:44:05 +02:00
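For illustration, a minimal hedged sketch of the kind of directive involved (package and variant names are hypothetical): a conditional requirement a -> b should constrain only specs that actually match its condition.

```
from spack.package import *


class Foo(Package):
    """Hypothetical package with a conditional requirement."""

    variant("a", default=False, description="Enable feature a")
    variant("b", default=False, description="Enable feature b")

    # "+a implies +b" (a -> b). After this fix, a spec with ~a is no
    # longer forced to ~b -- inferring the inverse (~a -> ~b) was the bug.
    requires("+b", when="+a")
```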
Mikael Simberg
8921612f6a boost: add 1.88.0 (#50158)
* boost: add 1.88.0

* pika: add conflict with boost 1.88.0
2025-04-25 08:54:07 +02:00
Matt Thompson
e6a0a6c145 mapl: add v2.55.1 (#50201) 2025-04-24 20:40:48 -07:00
Matt Thompson
104d6b4484 mepo: add v2.3.2 (#50202) 2025-04-24 20:39:20 -07:00
Alec Scott
cba9436cf4 py-repligit: add v0.1.1 (#50204)
* py-repligit: add v0.1.1

* Add conflicts for older versions of python when at v0.1.0
2025-04-24 16:45:02 -07:00
Taillefumier Mathieu
9dc3ad4db7 [package updates] Bump version of cp2k and sirius (#50141) 2025-04-24 17:59:28 +02:00
Satish Balay
4bfd7aeb25 petsc4py: update ldshared.patch for v3.20.1, and skip for v3.23.1+ (#50170) 2025-04-24 08:50:23 -07:00
Mike Nolta
fcf615b53e namd: add v3.0.1 (#50192) 2025-04-24 07:56:35 -07:00
Wouter Deconinck
1155318534 geomodel: depend on c (#49781)
* geomodel: depend on c
* hep: add geomodel
* hep: geomodel +fullsimlight
* geomodel: depends on virtual gl, not opengl
* soqt: depends on gl and glu instead of opengl
* geomodel: rm generated comments on language dependencies
2025-04-24 10:58:49 +00:00
Peter Scheibel
a3c430e810 CompilerAdaptor: add support for opt_flags/debug_flags (#50126) 2025-04-24 07:08:54 +02:00
Alec Scott
41ff0500f9 Add ls alias to spack {compiler, external} (#50189) 2025-04-24 05:08:20 +00:00
Stephen Nicholas Swatman
059a4a58e2 covfie: depend on c (#50190)
This commit makes the covfie package explicitly depend on the C
language, as CMake enables C by default if it is not explicitly turned
off.
2025-04-23 23:03:17 -06:00
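The same one-line fix recurs in several commits in this range (geomodel, soqt, easi, fenics-dolfinx, ...). As a minimal hedged sketch with a hypothetical package, the idiom is:

```
from spack.package import *


class MyCmakePkg(CMakePackage):
    """Hypothetical CMake-based package."""

    # CMake enables the C language unless project() explicitly turns it
    # off, so declare the C compiler as an explicit build dependency.
    depends_on("c", type="build")
    depends_on("cxx", type="build")
```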
Peter Brady
14513ba76f lua-sol2: add v3.5.0 (#49970)
* sol2: update version

* fix lua version in libpressio

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-04-23 13:25:00 -06:00
Alex Richert
21da90e062 crtm-fix: fix directory logic (#50172) 2025-04-23 10:55:06 -07:00
Robert Maaskant
a3c7e97463 py-build: add v1.2.2 (#50148)
* py-build: add v1.2.2
   Release notes: https://github.com/pypa/build/releases/tag/1.2.2
   Diff: https://github.com/pypa/build/compare/1.2.1...1.2.2
* py-build: fix deps
* fixup! py-build: fix deps
2025-04-23 10:16:56 -07:00
Harmen Stoppels
f7ed3ce4ae py-pillow: fix build (#50177) 2025-04-23 10:07:19 -07:00
Robert Maaskant
36caa6158a py-flit and py-flit-core: add v3.10.0 -> v3.12.0 (#50139)
* py-flit: add v3.10.0 and v3.10.1
* fixup! py-flit: add v3.10.0 and v3.10.1
* py-flit and py-flit-core: add v3.11.0
* py-flit and py-flit-core: add v3.12.0
* py-flit: some deps are runtime only
* py-flit-core: fix python dep for v3.12.0
* py-flit-core: correct versions for python dep
2025-04-23 10:04:46 -07:00
Robert Maaskant
1904d99fd0 py-trove-classifiers: add v2025.4.11.15 (#50143) 2025-04-23 08:03:20 -07:00
Robert Maaskant
de0b17c07f py-id: new package (#50145) 2025-04-23 08:01:02 -07:00
Matt Thompson
5d695623db esmf: add v8.8.1 (#50178) 2025-04-23 15:53:00 +02:00
Adam J. Stewart
3f063ace1d Add type hints to all setup_*environment functions (#49985) 2025-04-23 15:41:22 +02:00
Jonas Eschle
47b71ba8ca py-zfit-physics: new package (#43696)
* Added package
* Fix base class
   Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix: copyright

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2025-04-22 18:14:10 -07:00
Robert Maaskant
67eb9cfccb py-setuptools: add v78.1.1 (#50133) 2025-04-22 17:07:10 -07:00
Robert Maaskant
dddbd944a4 py-tzdata: add v2025.2 (#50138) 2025-04-22 17:04:25 -07:00
Robert Maaskant
b7d85e7694 py-packaging: add v25.0 (#50142) 2025-04-22 16:58:31 -07:00
Robert Maaskant
f4c4b06a46 py-hatch: add v1.13.0 (#50147) 2025-04-22 16:06:40 -07:00
Robert Maaskant
6995010bab py-pyproject-metadata: add v0.9.1 (#50150)
* py-pyproject-metadata: add v0.9.1
   Changelog: https://github.com/pypa/pyproject-metadata/blob/0.9.1/docs/changelog.md#091-10-03-2024
   Diff: https://github.com/pypa/pyproject-metadata/compare/0.7.1...0.9.1
* fixup! py-pyproject-metadata: add v0.9.1
2025-04-22 15:55:27 -07:00
Chris White
2d212561fb libcerf: Add new versions (#50089)
* update libcerf to use new URL and CMake for new versions but keep old URL and autoconf for 1.3
* add maintainer
* fix comment

---------

Co-authored-by: white238 <white238@users.noreply.github.com>
2025-04-22 15:47:51 -07:00
Alec Scott
7cab3e2383 g2: update for best practices (#50155) 2025-04-22 15:37:01 -07:00
SXS Bot
48ca9a5f3c spectre: add v2025.04.21 (#50153)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2025-04-22 15:36:15 -07:00
Alec Scott
1934c8cf73 verible: update for best practices (#50154) 2025-04-22 15:34:50 -07:00
Buldram
42cd7c4f89 nim: add 2.2.4, 2.0.16 (#50166)
https://nim-lang.org//blog/2025/04/22/nim-224-2016.html
2025-04-22 15:29:56 -07:00
Matt Thompson
ce654b6882 mapl: add cxx dependence (#50168) 2025-04-22 15:27:43 -07:00
Vicente Bolea
94719a55b4 viskores: new package (#50078)
* viskores: add package

* Update var/spack/repos/builtin/packages/viskores/package.py

Co-authored-by: Kenneth Moreland <morelandkd@ornl.gov>

* Apply suggestions from code review

Co-authored-by: Kenneth Moreland <morelandkd@ornl.gov>

* Update var/spack/repos/builtin/packages/viskores/package.py

---------

Co-authored-by: Kenneth Moreland <morelandkd@ornl.gov>
2025-04-22 16:38:26 -05:00
etiennemlb
76168292c3 pdi: refactor version handling and update to 1.8.3 (#49276)
* Refactor version handling
* Pdi fixes
  # Conflicts:
  #	var/spack/repos/builtin/packages/pdi/package.py
* Details
2025-04-22 14:34:23 -07:00
Ryan Krattiger
3fd6066e54 ci: copy logs from failed job stage dir (#49884) 2025-04-22 23:23:49 +02:00
Derek Ryan Strong
c62cc6a45d rsem: add zlib dependency (#50102) 2025-04-22 20:52:31 +02:00
Kyle Brindley
423548fc90 py-salib: add v1.4.6 -> v1.5.1 (#49941)
* MAINT: py-salib up to v1.5.1

* MAINT: black style requires trailing commas

* WIP: make pathos an optional dependency at the same version where salib upstream made it optional

* MAINT: fix run time requirements for older versions. Add build/run requirements for newer versions

* MAINT: spack style specs

* MAINT: spack package naming convention

* MAINT: match dependency order to version order
2025-04-22 10:28:35 -07:00
Chris Richardson
9010e6f556 Make PETSc an optional dependency of fenics-dolfinx (#49837)
* Make petsc optional

* Add C dependency

* Add to cmake args

* Make petsc optional

* Add C dependency

* Add to cmake args

* Update var/spack/repos/builtin/packages/fenics-dolfinx/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix duplicate line

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-04-22 10:08:27 -07:00
Stephen Nicholas Swatman
6085586407 oneapi: Move temporary home directory to stage (#50160)
When trying to use Spack to build Intel TBB on the EOS network file
system we use at CERN, I am facing the following error:

```
Launching the installer...
Installation directory is not empty.
The product cannot be installed into nonempty directory
'/eos/home-s/sswatman/spack/opt/spack/linux-x86_64_v2/...
    intel-oneapi-tbb-2021.12.0-3jlx6hlr3z6di42f3qy35eizcse7u2tk'.
```

This error appears to happen because Spack tries to set `$HOME` to the
prefix of the package, into which the Intel installer is also trying to
install its software. I've found that an easy fix is to set `$HOME` to
a directory in the build stage instead, as this can be guaranteed to
remain empty.
2025-04-22 15:47:30 +00:00
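A hedged sketch of that fix inside the package class (the hook name is Spack's standard `setup_build_environment`; the exact subdirectory under the stage is an illustrative assumption):

```
def setup_build_environment(self, env):
    # Point $HOME at an empty directory inside the build stage rather
    # than the install prefix, so the Intel installer never sees a
    # nonempty target directory. (The "home" subdir is an assumption.)
    env.set("HOME", join_path(self.stage.path, "home"))
```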
Jonathon Anderson
dbd745bdab hpctoolkit: fix smoke test to use compiler node (#50152)
Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
2025-04-22 17:18:15 +02:00
Harmen Stoppels
31c5c0b423 directives_meta.py: remove global decl (#50162)
silence flake8
2025-04-22 17:08:14 +02:00
Andrey Perestoronin
41f99f8131 add new intel-oneapi-vtune package (#50159) 2025-04-22 08:27:59 -04:00
Alex Richert
441ade5809 wrf-io: remove check() (#50151) 2025-04-22 12:38:29 +02:00
Harmen Stoppels
60f6f8d836 python.py/r.py: fix type issues with classproperty/constant (#50059) 2025-04-22 09:53:34 +02:00
Massimiliano Culpo
5e7925c502 packages: minor improvements for compiler packages (#50111)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-22 09:53:12 +02:00
Joe
d39382bec8 Env vars docs (#49346) 2025-04-21 18:59:01 -06:00
Wouter Deconinck
be270f2311 py-zfit: add through v0.25.0 (#49349)
* py-zfit: add through v0.24.3

* hep: add py-zfit to cloud pipeline

* py-zfit: add v0.25.0

Co-authored-by: Jonas Eschle <mayou36@jonas.eschle.com>

* py-zfit: depends_on py-tensorflow(-probability) when @0.25.0:

Co-authored-by: Jonas Eschle <mayou36@jonas.eschle.com>

* py-zfit: fix style (tab to spaces)

* py-zfit: fix style

* py-zfit: depends_on py-tensorflow-probability without +py-tensorflow

* py-zfit: remove redundant type="run"

---------

Co-authored-by: Jonas Eschle <mayou36@jonas.eschle.com>
2025-04-21 10:07:43 -05:00
Dave Keeshan
c500200952 verible: add v0.0.3967 (#50122) 2025-04-21 10:08:36 +02:00
Alex Richert
71b110e6c7 g2: update recipe (#49889) 2025-04-21 01:05:14 -07:00
Andrey Prokopenko
7b877ec9e2 arborx: add version 2.0 (#50112) 2025-04-21 10:02:14 +02:00
Garth N. Wells
a74ac87d34 py-nanobind: add v2.6.1 (#50087) 2025-04-21 10:01:07 +02:00
Adam J. Stewart
796adb6b9b py-pandas: arrow+parquet when +parquet (#50119) 2025-04-21 09:54:31 +02:00
Alex Richert
2967bb5540 g2c: +utils requires +build_v2_api (#50114) 2025-04-21 09:42:22 +02:00
Robert Maaskant
9f4c677e46 trivy: v0.61.1 (#50131) 2025-04-21 09:37:38 +02:00
Robert Maaskant
1d369ba02d gh: v2.70.0 (#50132) 2025-04-21 09:37:05 +02:00
Adam J. Stewart
dcde4f9d5a fish: add v4.0.2 (#50134) 2025-04-21 09:36:30 +02:00
Robert Maaskant
3c576ca8c2 yarn: add v4.9.0 and v4.9.1 (#50135) 2025-04-21 09:36:07 +02:00
Adam J. Stewart
a89c89a23e py-smp: add v0.5.0 (#50120) 2025-04-21 09:35:26 +02:00
Adam J. Stewart
aee7455568 PyTorch: fix build with Apple Clang 17 (#50105) 2025-04-21 09:34:16 +02:00
Adam J. Stewart
69edcc6d2f py-numpy: add v2.2.5 (#50129) 2025-04-21 09:17:17 +02:00
Jiakun Yan
46263a493e lci: add v1.7.8, v1.7.9 (#50136) 2025-04-21 09:13:55 +02:00
Robert Maaskant
b24f2875e6 py-setuptools: deprecate old versions (#49595)
* setuptools: deprecated old versions

* py-zope-interface: deprecate versions requiring old setuptools versions

* py-botorch: deprecate versions requiring old setuptools versions

* py-deepsig: deprecate versions requiring old setuptools versions

* py-scipy: deprecate versions requiring old setuptools versions

* py-openslide-python: deprecate versions requiring old setuptools versions

* py-setuptools: fixup python 3.8 comment
2025-04-20 15:49:56 +02:00
Wouter Deconinck
18eebce04d external: list licensing in init summary (#46042) 2025-04-18 21:28:23 -06:00
Krishna Chilleri
f5c6e10e08 hpc-beeflow: add v0.1.10 and py-requests-unixsocket: add v0.4.1 (#49709)
* add version 0.1.10

* add hpc-beeflow v0.1.10

* force typer version to 0.5.0

* add neo4j and redis dependencies

* add method that sets the path of neo4j and redis installations

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2025-04-18 16:09:16 -07:00
jdomke
e7c17f7ed8 hpcg: compiler flag not supported by fujitsu either (#43110)
Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2025-04-18 10:32:25 +02:00
snehring
a284cbf256 sentieon-genomics: adding v202503 (#50043)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-04-18 08:56:55 +02:00
Edoardo Zoni
8cbf067455 py-amrex: update maintainers (#50044) 2025-04-18 08:56:26 +02:00
Rao Garimella
875397cf16 r3d: add the shared variant (#49953) 2025-04-18 08:54:11 +02:00
Alec Scott
a38045f77e libffi: update for best practices (#50050) 2025-04-18 08:52:36 +02:00
Cameron Rutherford
31ce23f3fc libceed: add BLAS_DIR and link time blas dependency (#50033) 2025-04-18 08:52:04 +02:00
Alec Scott
9e65bd5837 zoltan: update for best practices (#50062) 2025-04-18 08:45:15 +02:00
Teague Sterling
2c1a3eff74 libdisplay-info: new package (#49653)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-04-18 08:43:47 +02:00
Alec Scott
1d81ceb101 mergiraf: new package (#50070) 2025-04-18 08:42:47 +02:00
Adam J. Stewart
044c37372a py-pillow: add v11.2.1 (#50057) 2025-04-18 08:32:34 +02:00
Matt Thompson
8f40889a46 mapl: add v2.55.0 (#50068) 2025-04-18 08:29:31 +02:00
Chris Marsh
0a0282163b libogg: fix depends_on to include cxx (#50115) 2025-04-18 00:23:09 -06:00
Adam J. Stewart
54f4530df4 py-numpy: fix support for newer macOS ld linker (#50065)
Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-04-18 08:15:57 +02:00
Mikael Simberg
193f3b3c5a mimalloc: add 3.0.3 (#50109) 2025-04-18 08:14:08 +02:00
Mikael Simberg
34b0e8ebce asio: add 1.34.2 (#50110) 2025-04-18 08:12:43 +02:00
Thomas Applencourt
10109bf128 valgrind: add v3.24.0 (#50116) 2025-04-18 07:53:54 +02:00
Alec Scott
f0a7388496 py-repligit: new package (#50098) 2025-04-17 12:43:25 -04:00
Matt Thompson
45bc8fd2a3 mepo: add v2.3.1 (#50085) 2025-04-17 08:23:20 -07:00
Alec Scott
ca82085c82 covfie: update for best practices (#50064) 2025-04-17 08:21:46 -07:00
Greg Sjaardema
b97fbcb970 seacas: new version (#50049)
* Kluge to support file-per-rank and multiple-rank-single-file read/write in CGNS, other CGNS-related changes.
* Catalyst changes
* Update to latest TriBITs
* Improved static library build
* EPU: Handle case where no elements, but add_processor_id specified
* CPUP: Handle decomp-created zgc between zones better
* IOSS: Add filesystem type function (lustre, gpfs, nfs, ...)
* CPUP: Fix handling of assemblies
* IOSS: fix io_shell compare of db with no changesets
* IOSS: Cgns - handle missing assemblies correctly
* IOSS: Clean up owning_processor calculation
* EXO2MAT: Add -i option to not transfer info records to mat file
2025-04-17 10:58:06 -04:00
Miranda Mundt
cf812dd3a9 py-pyomo: add v6.9.0, v6.9.1, v6.9.2 (#50096) 2025-04-17 10:48:35 -04:00
Peter Scheibel
e78d9d93dd c23 standard typo (#50101) 2025-04-17 10:20:40 -04:00
Massimiliano Culpo
be492e1ed7 solver: encode % as "build requirement", not as "dependency" (take 2) (#50104)
reland 8fc1ccc686

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-17 15:25:02 +02:00
Rocco Meli
fcfbc28e10 dla-future-fortran: add v0.4.0 (#50095) 2025-04-17 13:02:36 +02:00
Till Ehrengruber
8fb5898d10 cudnn: aarch64 hashes for several versions (#46272)
* cudnn: aarch64 hashes for several versions

* Remove spurious newline

* Fix format

* [@spackbot] updating style on behalf of tehrengruber

* Fix cudnn 8.8 link derivation for aarch64

* Use sbsa for cudnn >= 8.3.1

* Fix typo

* Temporarily remove hashes

* Undo

* Use sbsa for cudnn >= 8.3.1

* Update hashes

---------

Co-authored-by: tehrengruber <tehrengruber@users.noreply.github.com>
Co-authored-by: Till Ehrengruber <tille@santis-ln001.cscs.ch>
2025-04-17 11:50:12 +02:00
Garth N. Wells
b75e35289c py-scikit-build-core: add v.0.11.1 (#50088) 2025-04-17 02:13:22 -06:00
AMD Toolchain Support
4024200d61 aocc: add missing attributes (#50082)
Co-authored-by: viveshar <vivek.sharma2@amd.com>
2025-04-17 02:04:12 -06:00
Vanessasaurus
eab1d6df80 flux-core: add v0.68.0 -> v0.73.0 (#49893)
* Automated deployment to update package flux-core 2025-04-04
* Add py-packaging
* Do not pin py-packaging
* flux-sched: build older flux-core

flux sched 0.38 was the first that required gcc
version 12 or higher, and flux-core continued to
build for some time, but eventually added
features that we are now seeing break with
sched 0.37 and the latest flux. This conflicts
should ensure that older flux-sched, which
is being built by having an older compiler,
only builds with flux-core up to 0.68.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

---------

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2025-04-17 17:00:27 +09:00
Robert Maaskant
0d7c0c8362 py-deephyper: add v0.9.3 (#49604)
* py-deephyper: deprecate versions requiring old setuptools versions

* py-deephyper: add v0.9.3

* py-deephyper: wip

* py-deephyper: use nested context managers

* py-deephyper: comment out py-pymoo dep

* sync with deephyper developer spack repo

* py-deephyper: disable variants jax-cpu and redis

Both variants require dependencies missing in Spack.

* py-deephyper: add back dependencies for deprecated versions

* fixup! py-deephyper: add back dependencies for deprecated versions

* py-deephyper: fix copyright notice

* py-deephyper: add back license

---------

Co-authored-by: Brett Eiffert <eiffertbc@ornl.gov>
2025-04-17 16:55:57 +09:00
Robert Maaskant
a573f2248d new package: theia-ide (#49539) 2025-04-17 16:54:43 +09:00
Alec Scott
986102ab7a Revert "solver: encode % as "build requirement", not as "dependency" (#50011)" (#50092)
This reverts commit 8fc1ccc686.

conflicts("%gcc@:11", when="@2.17:")
conflicts("%gcc", when="+ompt")

regressed.
2025-04-17 08:45:29 +02:00
Massimiliano Culpo
04f6881b76 Environment: remove leftover code to attach test deps (#50097)
This should not be needed anymore after #49405

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-17 07:55:32 +02:00
Caetano Melone
d4582945ba py-gidgetlab: add v2.0.0, v2.0.1, v2.1.0 (#50051)
* py-gidgetlab: add v2.0.0-v2.1.0

1.1.0 doesn't work with Python 3.13.

See this commit (and the tags that contain it) for history on build
deps:
310bc109ba.

* group dependencies

Co-authored-by: Alec Scott <hi@alecbcs.com>

* set python version constraint

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-04-16 14:34:30 -04:00
Alberto Invernizzi
a0940510df libluv, tree-sitter, utf8-proc and unibilium: bump update of (some of) neovim deps (#50054)
* simple update of some of neovim deps

* switch to neovim fork for unibilium and add newer versions

* fix tree-sitter _DEFAULT_SOURCE vs _BSD_SOURCE

* oneliner for filter_file
2025-04-16 11:29:17 -06:00
Adam J. Stewart
c2327a2adf py-pykwalify: add v1.8.0 (#50063) 2025-04-16 13:10:30 -04:00
Kyle Brindley
4c075801db py-pathos: add v0.2.9 -> v0.3.3 (#49943)
* MAINT: add py-pathos versions

* MAINT: match version directives order in dependencies
2025-04-16 12:57:46 -04:00
Thomas Padioleau
1d27add307 Update mdspan recipe (#50046)
* Remove hard-coded compiler

* Remove compiler flags

* Use spack functions

* Add a cxxstd variant

* Replace main branch of googletest with some random not too old version
2025-04-16 12:05:10 -04:00
Robert Maaskant
3256ad8e5c py-pip: add v25.0, v25.0.1 (#49249)
* py-pip: add v25.0 and v25.0.1

Release notes:
- https://pip.pypa.io/en/stable/news/#v25-0
- https://pip.pypa.io/en/stable/news/#v25-0-1

* py-pip: add known conflict

* fixup! py-pip: add known conflict

* fixup! fixup! py-pip: add known conflict

* py-setuptools: fix typo in conflict

* fixup! py-setuptools: fix typo in conflict
2025-04-16 15:36:25 +02:00
Robert Maaskant
dc8678136c py-setuptools: add v76.1.0 -> v78.1.0 (#49680)
* py-setuptools: add v76.1.0

* py-setuptools: do not deprecate older minor and patch versions

* py-setuptools: add v77.0.1 and v77.0.3

* py-flatbuffers: constrain setuptools

* py-torchvision: constrain setuptools

* py-torch-cluster: constrain setuptools

* py-torch-scatter: constrain setuptools

* py-torch-spline-conv: constrain setuptools

* py-setuptools: add v78.0.1, v78.0.2, v78.1.0
2025-04-16 15:35:10 +02:00
Christian Lamparter
62ec0f6d33 fenics-dolfinx: add missing "c" build dependency (#50071)
building fenics-dolfinx resulted in the following error:

```
==> No patches needed for fenics-dolfinx
==> fenics-dolfinx: Executing phase: 'cmake'
==> Error: ProcessError: Command exited with status 1:

3 errors found in build log:
     3     -- The C compiler identification is unknown
     4     -- The CXX compiler identification is GNU 12.2.0
     5     -- Detecting C compiler ABI info
     6     -- Detecting C compiler ABI info - failed
     7     -- Check for working C compiler: /opt/spack/[...]libexec/spack/cc
     8     -- Check for working C compiler: /opt/spack/[...]libexec/spack/cc - broken

  >> 9     CMake Error at /opt/spack/opt/spack/[...]/CMakeTestCCompiler.cmake:67 (message):
     10      The C compiler
     11
     12        "/opt/spack/opt/spack/[...]/libexec/spack/cc"
     13
     14      is not able to compile a simple test program.
     15

     ...
```

Thanks to @amd-toolchain-support in issue #50021, this is easily fixed
by adding the one-line dependency on the C compiler.

Signed-off-by: Christian Lamparter <chunkeey@gmail.com>
2025-04-16 07:16:11 -04:00
Wouter Deconinck
25d8e95ad4 go: ignore unresolved library (#50072) 2025-04-16 07:09:37 -04:00
Harmen Stoppels
883bbf3826 variants: fix narrowing multi -> single -> bool (#49880)
* `x=*` constrained by `+x` now produces a boolean valued variant instead of a multi-valued variant.
   
* Values are now always stored as a tuple internally, whether bool, single or multi-valued. 

* Value assignment has a stricter api to prevent ambiguity / type issues related to 
   `variant.value = "x"` / `variant.value = ["x"]` / `variant.value = ("x",)`. It's now `variant.set("x", ...)` for 
   single and multi-valued variants.

* The `_original_value` prop is dropped, since it was unused.

* The wildcard `*` is no longer a possible variant value in any type of variant, since the *parser*
   deals with it and creates a variant with no values.
2025-04-16 09:44:38 +02:00
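A small sketch of the narrowing rule in the first bullet, using the public Spec API (package and variant names hypothetical):

```
from spack.spec import Spec

s = Spec("foo x=*")          # wildcard: variant exists, no values fixed
s.constrain(Spec("foo +x"))  # narrow with a boolean-valued constraint
# After this change the result is a boolean variant (+x), not a
# multi-valued variant that happens to hold the string "true".
```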
Harmen Stoppels
1dc9bac745 detection/common.py: catch is_file() inside loop (#50042) 2025-04-16 09:40:14 +02:00
Peter Scheibel
8ac5398576 ci: add gawk (#50074) 2025-04-16 09:29:20 +02:00
Harmen Stoppels
dbd3895cbf version_types.py: Version -> Union[StandardVersion, GitVersion] (#50061) 2025-04-16 09:15:13 +02:00
Todd Gamblin
1d70dc8292 Remove Prolog so that GitHub detects Answer Set Programming (#50077)
This reverts a change made in #20639 to have GitHub recognize our
ASP files as Prolog, the closest language supported by
[Linguist](https://github.com/github-linguist/linguist) at the time.

Linguist has since
[added support for ASP](https://github.com/github-linguist/linguist/pull/7184),
so we no longer need to force Prolog detection -- our `.lp` files should
be auto-detected as ASP.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-04-15 22:41:53 -07:00
Alec Scott
4b2a96fe06 Improve our README to make it easier for new users (#49711) 2025-04-15 17:08:39 -04:00
John W. Parent
4a7508c9df Update make/nmake invocations (mostly Windows) (#49022)
The second change technically affects non-Windows, but the
behavior should be exactly the same:

* Packages no longer have access to `.msbuild` and `.nmake`
  automatically; they now get them via a dependency on `msvc`.
* Update two CMake-based packages that call `make test` to
  instead call `ctest` (`netcdf-cxx4` and `pegtl`).
  CMake-based packages should do this because on Windows
  `make test` will not generally work, but `ctest` does.
* Fix `openssl` "make test" on Windows (WRT prior point: not
  a CMake-based package).
2025-04-15 14:44:25 -06:00
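A hedged sketch of the `ctest` pattern for a CMake-based package (a method inside the package class, assuming `self.build_directory` is available; `which` and `working_dir` are Spack utilities):

```
def check(self):
    # Run ctest instead of `make test`: the latter generally does not
    # work on Windows, while ctest does.
    with working_dir(self.build_directory):
        ctest = which("ctest", required=True)
        ctest()
```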
Alec Scott
4f27ef8157 conmon: update for best practices (#50058) 2025-04-15 14:00:39 -05:00
Vicente Bolea
069010fe13 vtk-m: update to latest release (#49867)
* vtk-m: add v2.3.0 version

* Update package.py

* Update var/spack/repos/builtin/packages/vtk-m/package.py

Co-authored-by: Kenneth Moreland <morelandkd@ornl.gov>

---------

Co-authored-by: Kenneth Moreland <morelandkd@ornl.gov>
2025-04-15 12:40:52 -05:00
Teague Sterling
0fa64f9791 hwdata: add v0.392 and update configure args to fix data path (#49654)
* hwdata: add v0.392 and update configure args to fix data path

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-04-15 09:37:53 -07:00
Adam J. Stewart
8d23edd1a9 libpng: add v1.6.47 (#49979)
* libpng: add v1.6.47

* Add conflict
2025-04-15 12:04:38 -04:00
Alec Scott
336d33ecfa iwyu: update for best practices (#50060) 2025-04-15 09:28:27 -06:00
Rocco Meli
caaf0c50f6 dla-future: add v0.9.0 (#50055) 2025-04-15 16:00:08 +02:00
Alberto Invernizzi
6b2cd0ca45 bump iwyu (#50053) 2025-04-15 03:33:18 -06:00
Sichao25
0bec90ecd7 zoltan: add +scotch variant (#49845)
* add scotch variant to zoltan

* style fix

* apply satisfies func and F-strings
2025-04-14 15:03:39 -06:00
Stephen Nicholas Swatman
1f77b33255 covfie: add v0.13.0 (#50039)
Does what it says on the tin.
2025-04-14 13:32:59 -07:00
Wouter Deconinck
6dab20e8f8 py-particle: add v0.25.3 (#49907) 2025-04-14 11:13:21 -04:00
Wouter Deconinck
f926512cd4 conmon: add v2.1.13 (#49914) 2025-04-14 11:12:23 -04:00
Wouter Deconinck
4a08f5b6e4 sherpa: add variant internal_pdfs to avoid fortran (#49918) 2025-04-14 11:11:09 -04:00
Adam J. Stewart
2cbc21d584 Remove runtime errors for Fortran compilers (#49981) 2025-04-14 16:35:03 +02:00
Seth R. Johnson
b4646c340c dd4hep: fix inconsistent cxxstd (#50027)
* dd4hep: fix doc dependencies and made doc optional

* edm4hep: fix downstream build error in dd4hep

* dd4hep: propagate cxxstd to fix build error with +ddg4
2025-04-14 10:17:18 -04:00
Paul Gessinger
24efb56528 fastjet: ProtoJet output compilation error (#50004)
* fastjet: ProtoJet output

* switch to upstream fix

* spack wants .diff URLs apparently?

* Delete var/spack/repos/builtin/packages/fastjet/protojet.patch

* increase patch level, upper bound for patch

* Update package.py
2025-04-14 08:03:15 -06:00
Robert Maaskant
19f7a1bfbd yarn: add v4.8.1 (#49898) 2025-04-14 09:56:31 -04:00
Wouter Deconinck
bc5b57dca9 fmt: conflicts llvm@21: when @:11.0 (#49965)
* fmt: conflicts llvm@21: when @:11.0

* Update var/spack/repos/builtin/packages/fmt/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2025-04-14 07:23:31 -06:00
Harmen Stoppels
ebef5f75fb util/environment.py: allow PurePath for path modifiers and improve file/lineno of warnings (#50038) 2025-04-14 14:52:12 +02:00
Wouter Deconinck
e45ee9cb92 gaudi: add v39.3, v39.4 (#50017) 2025-04-14 12:27:37 +02:00
Robert Maaskant
900fff77cd Use gnu_mirror_path for GNU packages (#50034) 2025-04-14 12:26:30 +02:00
Robert Maaskant
e97a78ebcc Mark glibc and musl as buildable false (#50035) 2025-04-14 12:15:32 +02:00
Massimiliano Culpo
25beeef865 Environment: separate parsing concerns from SpecList (#49973)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-14 11:22:13 +02:00
Matt Thompson
b3ded1332e openblas: fixes for Xcode 16.3 darwin aarch64 (#49976) 2025-04-14 11:18:07 +02:00
Matt Thompson
b66694d1ca nag: nag provides fortran (#49895) 2025-04-14 10:41:54 +02:00
Marc T. Henry de Frahan
ebb2bb206e openfast: add v4.0.3 (#49974) 2025-04-14 10:40:44 +02:00
Adam J. Stewart
7a489e1e4e py-grpcio: add v1.71.0 (#49980) 2025-04-14 17:40:24 +09:00
Thomas Madlener
940f47a47c Fix dependency on C compiler for some packages to pass cmake or configure stage (#49993)
* genfit: depend on c compiler to fix installation issues

genfit does not specify `LANGUAGES` in their `project` declaration yet, so C is also required to pass the cmake stage.

* vbfnlo: Depend on C compiler to pass configure stage

* cppunit: Depend on C compiler to pass configure stage

* Make py-pyqt5 depend on C to fix build error

* bdsim: Depend on C to pass configure stage
2025-04-14 10:14:35 +02:00
Nathan Ellingwood
ccea1c6c9b trilinos: update for kokkos and kokkos-kernels 4.6.00 (#49977)
Corresponds to the kokkos 4.6.00 release, see PRs:
* https://github.com/spack/spack/pull/49810
* https://github.com/trilinos/Trilinos/pull/13925

Signed-off-by: Nathan Ellingwood <ndellin@sandia.gov>
2025-04-14 10:13:32 +02:00
Stephen Hudson
476863c4e8 libEnsemble: add v1.5.0 (#50012) 2025-04-14 10:10:20 +02:00
dependabot[bot]
7794d51adb build(deps): bump urllib3 from 2.3.0 to 2.4.0 in /lib/spack/docs (#50014)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.3.0 to 2.4.0.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.3.0...2.4.0)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-version: 2.4.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-14 10:07:24 +02:00
Thomas Padioleau
0a6767e602 benchmark: set default build type to Release (#50013) 2025-04-14 10:06:06 +02:00
Kyle Brindley
b3585ff1b8 py-multiprocess: add v0.70.13 -> v0.70.17 (#49947) 2025-04-14 10:02:33 +02:00
Kyle Brindley
7e9c24a789 py-ppft: add v1.7.6.5 -> v1.7.6.9 (#49946) 2025-04-14 10:01:58 +02:00
Massimiliano Culpo
c5b227d14c iconv: add a strong preference on libiconv by default (#50020)
This strong preference fixes a sporadic issue when
concretizing environments with `unify:when_possible`.

In the first round of concretization, it is almost
certain that glibc is installed, and that spec might
provide iconv.

In later rounds using that as a provider might be
preferred, as it leads to fewer nodes to be "built".

To avoid duplication by default, prefer libiconv
in a stronger way than default preferences.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-14 09:39:54 +02:00
Joshua Finkelstein
620d5c7ef8 bml: add magma and cusolver build options (#49652) 2025-04-14 09:38:37 +02:00
AMD Toolchain Support
74f78bd24f aocl-da & aocl-utils: fix missing compiler dependencies (#50029) 2025-04-14 09:18:29 +02:00
Gerhard Theurich
fa9dcb43bd esmf: remove @:7.0 conditionals with associated patches (#50030) 2025-04-14 09:15:43 +02:00
Richard Berger
9a37a6fcb1 lammps: add versions 20250402 and 20240829.2 (#50031) 2025-04-14 09:14:22 +02:00
Edoardo Zoni
4ae4739537 warpx, py-amrex: add v25.04 (#49891) 2025-04-14 09:13:16 +02:00
Richard Berger
493a307e4f gcc: bump aarch64-darwin patches (#50032)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2025-04-12 12:37:39 +02:00
Cameron Rutherford
6fb5a1492a gslib: add shared library support (#50016) 2025-04-12 09:22:44 +02:00
Harmen Stoppels
cc3d40d9d3 fetch_strategy.py: show progress (#50003)
Show progress meter for fetches when `stdout` is a `tty`.

* fetch_strategy.py: show progress
* "Fetched: x MB at y MB/s"
* add tests, show % if content-length
2025-04-11 12:39:42 -07:00
Massimiliano Culpo
8fc1ccc686 solver: encode % as "build requirement", not as "dependency" (#50011)
This PR fixes the issues with `%` and reused specs, due to https://github.com/spack/spack/issues/49847#issuecomment-2774640234 

It does so by adding another layer of indirection, so that whenever a spec
`foo %bar` is encountered, the `%bar` part is encoded as an
`attr("build_requirement", ...)`.

Then:

1. If `foo` is a node to be built, then the build requirement implies a dependency
2. Otherwise it implies looking e.g. reused specs metadata, and ensure it matches

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-11 11:09:21 -07:00
Pranav Sivaraman
8a8d88aab9 libcxi: add dependencies for autoreconf phase (#50025) 2025-04-11 11:43:04 -06:00
Seth R. Johnson
246ac7ced9 root: remove unnecessary patches (#50022)
* root: 6.34 included this patch

* ROOT: no need to patch unless +root7 +geom +webgui
2025-04-11 13:20:04 -04:00
Harmen Stoppels
fd1b982073 solver: dump output specs to file if not satisfied (#50019)
When the solver produces specs that do not satisfy the input
constraints, dump both input and output specs as JSON in a temporary
dir and ask the user to upload these files in a bug report.
2025-04-11 19:01:27 +02:00
Harmen Stoppels
fd31f7e014 spec.py: remove exceptions of old concretizer (#50018) 2025-04-11 15:52:11 +02:00
Kyle Brindley
e70d7d4eb7 py-dill: add v0.3.7, v0.3.8, v0.3.9 (#49944) 2025-04-11 07:52:36 +02:00
Thomas Madlener
b4b35f9efd Make podio depend on fmt for newer versions (#49986) 2025-04-10 18:24:08 -06:00
Cody Balos
aa05af81d0 sundials: add v7.3.0 (#49972) 2025-04-10 12:26:05 -07:00
Scott Wittenburg
3da44cff0b ci: Use explicit version of notary image (#50010) 2025-04-10 13:27:36 -05:00
Kyle Shores
95de0c021b New package: musica (#49988)
* adding the musica package with git and version sha256

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-04-10 12:00:40 -05:00
Dave Keeshan
9347769d4b yosys: add v0.52 (#49998) 2025-04-10 09:00:29 -07:00
Harmen Stoppels
8885f6b861 defaults/config.yaml: timeout at 30s by default (#50002)
Fetching generated tarballs from github.com sometimes pauses for more
than 10 seconds while the server puts together the next bits of the
tarball. Default to 30s to avoid that issue.
2025-04-10 13:06:22 +02:00
Harmen Stoppels
c5e5ed3a3b rocm ecosystem: deprecate master type versions (#49982) 2025-04-10 10:43:33 +02:00
Scott Wittenburg
23d7305efd buildcache: Remove deprecated commands and arguments (#49999) 2025-04-10 09:42:01 +02:00
psakievich
252ceeedbe spack develop: avoid deprecated package_class (#49997) 2025-04-10 09:40:16 +02:00
Alex Tyler Chapman
6df832d979 Add +axom variant to the hiop spack package (#49817)
* Add missing AXOM_DIR to hiop package
* remove auto-generated compiler comments
2025-04-10 00:07:47 -06:00
Rob Latham
4a88884a8e root's new RNTuple format can use daos backends (#49849)
* root's new RNTuple format can use daos backends

* Update var/spack/repos/builtin/packages/root/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* root: fix style

* hep: root +daos

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-04-09 13:14:16 -06:00
Harmen Stoppels
84dcc654ec util/environment.py: require string values in env mods (#49987) 2025-04-09 19:29:47 +02:00
Olivier Cessenat
b6722ce5c9 scotch: take care that IDXSIZE differs from INTSIZE even with cmake (#49842) 2025-04-09 11:46:48 -04:00
Robert Maaskant
1cbee69bec gh: add v2.69.0 (#49963) 2025-04-08 22:44:18 -06:00
Juan Miguel Carceller
1cf311f217 whizard: add v3.1.5 (#49785)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-04-08 21:37:21 -07:00
Fabien Bruneval
c960fa0996 molgw: add v3.4 (#49701)
* update molgw package v3.4

* Set OpenMP variant default to False

---------

Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2025-04-08 21:35:40 -07:00
Chris Marsh
69a95bf1f8 libpng: add v1.6.47 (#49872)
* add 1.6.47, add cmake reqs

* use open ended cmake versions

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-04-08 21:34:44 -07:00
Adam J. Stewart
513142f154 py-litdata: add new package (#49961) 2025-04-08 21:42:24 -06:00
Robert Maaskant
d6b6910654 go: add v1.23.8 (#49962) 2025-04-08 21:36:44 -06:00
Thomas Padioleau
ae78c7698a Add languages to PDI ecosystem (#49957) 2025-04-08 21:30:23 -06:00
Jon Rood
f4f1606298 hypre: remove shared variant default for darwin (#49749)
* hypre: remove shared variant default for darwin.

* [@spackbot] updating style on behalf of jrood-nrel

* Update var/spack/repos/builtin/packages/hypre/package.py

Co-authored-by: Victor A. P. Magri <50467563+victorapm@users.noreply.github.com>

* hypre: remove sys import.

---------

Co-authored-by: jrood-nrel <jrood-nrel@users.noreply.github.com>
Co-authored-by: Victor A. P. Magri <50467563+victorapm@users.noreply.github.com>
2025-04-08 14:26:07 -07:00
John W. Parent
d2dd4e96d9 Windows: add env name to prompts (#48819) 2025-04-08 14:16:39 -07:00
Jon Rood
4cb64e150f trilinos: patch stk to include cstddef for size_t error (#49952)
* trilinos: patch stk to include cstddef for size_t error

* Update comment.
2025-04-08 14:02:41 -06:00
Kyle Brindley
b74e23a637 pox: add v0.3.5 (#49945) 2025-04-08 12:41:46 -06:00
Harmen Stoppels
8ffd6c29bf gcc/oneapi: inject runtime iff language virtual (#49956)
Currently we inject runtimes when a package has a direct build dep on a
compiler, but what matters is whether the package depends on a language.

That way we can avoid recursion of injecting runtimes to runtimes
without a rule in the solver: runtimes don't depend on languages, they
just have a build dep on the same compiler.
2025-04-08 19:51:16 +02:00
Jon Rood
4372907fc1 h5z-zfp: update to use CMake (#49735)
* h5z-zfp: update to use CMake.

* Add dependency requirement.

* Remove shared variant.
2025-04-08 10:43:40 -07:00
Cameron Smith
63a506ed17 pumi: remove redundant compiler deps (#49936) 2025-04-08 11:39:08 -06:00
Jon Rood
382647c8af trilinos: depends on kokkos~cuda when ~cuda and kokkos~rocm when ~rocm (#49951) 2025-04-08 10:49:10 -06:00
Kyle Brindley
4b73da5bb2 scons: add v4.9.1(#49942) 2025-04-08 10:43:51 -06:00
Adam J. Stewart
17b47c9dbe py-torchmetrics: add v1.7.1 (#49955) 2025-04-08 09:41:35 -07:00
Thomas Padioleau
5075275873 Set tpadioleau and nmm0 as mdspan maintainers (#49938) 2025-04-08 10:39:10 -06:00
arezaii
2acdacb129 py-pyarrow: add depends_on c (#49940) 2025-04-08 10:38:44 -06:00
Nicholson Koukpaizan
bedc7bd518 coinhsl: fix typo when setting liblapack_path (#49937) 2025-04-08 10:38:28 -06:00
Adrien Bernede
dd8d2a2515 Radiuss-Spack-Configs update 2025-03-0 (#49637)
* Clearly split old and new hip settings requirements

* Fix style

* TO REVERT: Trigger CI

* Apply generic rocm handling to every project

* TO REVERT: Trigger CI

* Revert "TO REVERT: Trigger CI"

This reverts commit dcedb2ead5.

* Revert "TO REVERT: Trigger CI"

This reverts commit 02f76a8ca6.

* Update RADIUSS packages with latest release and sync with RSC implementation

* Update CARE package

* make default logic for hip support more robust

* TO REVERT: Trigger CI

* Fix style

* Fix style (bis)

* Shorten message

* GPU_TARGET is only necessary under certain project specific conditions, it should not be necessary in general

* Update logic to find amdclang++

* Fix syntax

* Let's postpone update of hip handling in camp

* Revert "TO REVERT: Trigger CI"

This reverts commit 620fbc1b01.

* Remove unnecessary logic from CARE

* Update CARE: add 0.15.1
2025-04-08 09:29:16 -07:00
Massimiliano Culpo
129338c4c9 Group together all concretization unit tests (#49960)
In this way, to run them, we just need to run:

spack unit-test lib/spack/spack/test/concretization

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-08 17:27:17 +02:00
Harmen Stoppels
e7b009e350 asp.py: reduce verbosity of "preferences" comment (#49954) 2025-04-08 15:17:21 +02:00
Massimiliano Culpo
9e6e478ccf llvm: don't detect +flang multiple times (#49876)
fixes #49831

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-08 09:12:34 +02:00
Adam J. Stewart
357089f347 py-nbstripout: add v0.8.1 (#49896) 2025-04-08 08:54:39 +02:00
Wouter Deconinck
6228247eda hep: build Geant4 with Qt5 and Qt6 (#49777)
* hep: build geant4 with qt5 and qt6

* hep: keep geant4 ~vtk for now
2025-04-08 08:51:22 +02:00
Wouter Deconinck
a919b67cb4 hep: root +arrow +emaca (#49931) 2025-04-08 08:50:52 +02:00
Wouter Deconinck
58ac6f7cba py-kiwisolver: add v1.4.6, v1.4.7, v1.4.8 (#49911) 2025-04-08 08:49:07 +02:00
Wouter Deconinck
86b57c233d apptainer: add v1.4.0 (#49912) 2025-04-08 08:48:32 +02:00
Wouter Deconinck
b6dec56f4f lbzip2: change URL and deprecate (#49948) 2025-04-08 08:26:26 +02:00
Jon Rood
d403060cf2 trilinos: patch version 16.1.0 to build on macOS. (#49811) 2025-04-07 11:05:41 -06:00
Robert Maaskant
7d0dd27363 py-setuptools-scm: add v8.2.1 (#49901) 2025-04-07 09:56:45 -07:00
dependabot[bot]
cfbc92c2f0 build(deps): bump flake8 from 7.1.2 to 7.2.0 in /lib/spack/docs (#49816)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.2 to 7.2.0.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.2...7.2.0)

---
updated-dependencies:
- dependency-name: flake8
  dependency-version: 7.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-07 09:49:58 -07:00
Wouter Deconinck
e7bca5b8f6 cppgsl: add v4.2.0 (#49915) 2025-04-07 12:46:37 -04:00
Wouter Deconinck
cf5ba8aee3 dbus: add v1.16.2 (#49916) 2025-04-07 12:46:06 -04:00
Wouter Deconinck
32a4eb4ebb jwt-cpp: add v0.7.1 (#49917)
* jwt-cpp: add v0.7.1

* jwt-cpp: depends on c only thru 0.7.0

* jwt-cpp: fix checksum
2025-04-07 09:45:24 -07:00
Adam J. Stewart
a3f4fd68d6 Packages: add missing compiler dependencies (#49920)
* Packages: add missing compiler dependencies

* Undo changes to cray-mpich
2025-04-07 09:42:54 -07:00
Wouter Deconinck
95f8c7e073 cmark: add v0.31.1 (#49913) 2025-04-07 09:41:42 -07:00
Wouter Deconinck
20f31ce39d voms: add v2.1.2 (#49921)
* voms: add v2.1.2

* voms: patch only when @:2.1.0
2025-04-07 12:39:52 -04:00
Wouter Deconinck
91ef8c056b vdt: add v0.4.5, v0.4.6 (#49922)
* vdt: add thru v0.4.6

* vdt: add maintainer
2025-04-07 12:39:01 -04:00
Wouter Deconinck
c6ce7637fc spdlog: add v1.15.2 (#49923) 2025-04-07 09:37:51 -07:00
Adam J. Stewart
770c6cc612 py-torchgeo: add v0.7.0 (#49903) 2025-04-07 09:36:52 -07:00
Wouter Deconinck
97bad2f5a7 soqt: depends on c (#49924) 2025-04-07 09:35:55 -07:00
Wouter Deconinck
8ca82fb2b6 r: add v4.4.3 (#49926) 2025-04-07 09:34:50 -07:00
Wouter Deconinck
36540708f1 py-zope-interface: add thru v7.2 (#49927) 2025-04-07 09:33:53 -07:00
Wouter Deconinck
7e027cae3e gdbm: add v1.25 (#49928) 2025-04-07 09:32:38 -07:00
Wouter Deconinck
a311d0a8c0 armadillo: add v14.4.1 (#49929) 2025-04-07 09:30:40 -07:00
Wouter Deconinck
cd8ebdcfbd embree: add v4.4.0 (#49930) 2025-04-07 09:28:43 -07:00
Wouter Deconinck
8b3bfbd95e libsm: add v1.2.6 (#49933) 2025-04-07 09:26:21 -07:00
Tahmid Khan
10afe49877 simdjson: new package (#49453)
* simdjson: new package
* simdjson: update description
* simdjson: make line lengths < 100 chars to pass style checks
* simdjson: add variants for enabling sanitizers
* simdjson: improve description
* simdjson: fixes and improvements
* simdjson: fix conditions and requirements for variants

---------

Co-authored-by: Tahmid A. Khan <60913202+tahmid-khan@users.noreply.github.com>
2025-04-07 09:21:01 -07:00
Robert Maaskant
2afbeded25 py-python-dateutil: add v2.9.0.post0 (#49527) 2025-04-07 08:04:03 -07:00
Wouter Deconinck
415055d303 scitokens-cpp: add v1.1.3 (#49925) 2025-04-07 09:45:41 -05:00
Massimiliano Culpo
081e4c463b compilers: add .stdcxx_libs to the compiler adaptor (#49873)
This property is used by a few recipes, including `cp2k` under
certain configurations

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-07 15:39:13 +02:00
Harmen Stoppels
e5ec08771b docs: editor support config files (#49935) 2025-04-07 15:35:21 +02:00
Rémi Lacroix
98605621e7 cuDNN: Add versions 9.5.1, 9.6.0, 9.7.1 and 9.8.0 (#49789) 2025-04-07 05:53:01 -06:00
Rocco Meli
625a4b854c greenx: new package (#49646)
* greenx

* [@spackbot] updating style on behalf of RMeli

* split desc

* license

* Update var/spack/repos/builtin/packages/greenx/package.py

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
2025-04-07 03:42:33 -06:00
Wouter Deconinck
dcf2c8744a zstd: add v1.5.7 (#49904) 2025-04-07 10:39:50 +02:00
Wouter Deconinck
d1b7cc9b5e zlib-ng: add v2.2.4 (#49905) 2025-04-07 10:39:12 +02:00
Wouter Deconinck
8fbe1ad941 libarchive: add thru v3.7.9 (#49932) 2025-04-07 10:38:54 +02:00
Vanessasaurus
440ae973d1 flux-sched: add v0.44.0 (#49894)
* Automated deployment to update package flux-sched 2025-04-04

* depends on c/cxx... order matters!

* Empty space

---------

Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-04-06 00:22:53 -06:00
Wouter Deconinck
2cb140f9a8 py-maturin: add v1.8.3 (#49910) 2025-04-05 22:31:54 -07:00
dependabot[bot]
fb2cca4e1e build(deps): bump clingo in /.github/workflows/requirements/style (#49868)
Bumps [clingo](https://github.com/potassco/clingo) from 5.7.1 to 5.8.0.
- [Release notes](https://github.com/potassco/clingo/releases)
- [Changelog](https://github.com/potassco/clingo/blob/master/CHANGES.md)
- [Commits](https://github.com/potassco/clingo/compare/v5.7.1...v5.8.0)

---
updated-dependencies:
- dependency-name: clingo
  dependency-version: 5.8.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-05 22:31:31 -07:00
dependabot[bot]
03b0d299f9 build(deps): bump types-six in /.github/workflows/requirements/style (#49869)
Bumps [types-six](https://github.com/typeshed-internal/stub_uploader) from 1.17.0.20250304 to 1.17.0.20250403.
- [Commits](https://github.com/typeshed-internal/stub_uploader/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-version: 1.17.0.20250403
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-05 22:30:35 -07:00
Robert Maaskant
8f93ea80fd util-linux-uuid: add v2.41 (#49900) 2025-04-06 01:26:11 -04:00
Wouter Deconinck
5cd5fcdd7f xtrans: add v1.6.0 (#49906) 2025-04-06 01:18:12 -04:00
Wouter Deconinck
da760a898e py-paramiko: add v3.5.1 (#49908) 2025-04-06 01:17:05 -04:00
Wouter Deconinck
dd55635fae py-mplhep: add thru v0.3.59 (#49909) 2025-04-06 01:14:31 -04:00
Richard Berger
320c758fea legion: extend slingshot11 support (#49713)
- allow conduit=ofi-slingshot11 to work with regular OpenMPI and MPICH when
  they are built with ^libfabric fabrics=cxi.
- add missing libfabric dependency for conduit=ofi-slingshot11. Embedded GASNet
  build uses PATH to detect libfabric installation.
2025-04-05 09:42:19 -07:00
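A hedged sketch of the added dependency (not the actual recipe diff; the exact constraint spelling is an assumption based on the message above):

```
from spack.package import *


class Legion(CMakePackage):
    # Embedded GASNet detects libfabric via PATH, so make the
    # dependency explicit for the slingshot11 conduit.
    depends_on("libfabric", when="conduit=ofi-slingshot11")
```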
Richard Berger
ff82ba24e9 kokkos: allow using new gfx942_apu arch (#48609)
Add an apu variant that promotes GPU architectures to their APU
equivalent. Right now this is just gfx942 -> gfx942_apu.
2025-04-04 22:29:29 -07:00
Alec Scott
d00b05b71e ci: decrease style checks setup time (#49888)
* ci: test speeding up style

* Combine pylint check with existing style checks to reduce overhead
2025-04-05 01:08:40 +00:00
John W. Parent
f8524f9d5e libpng package: find correct zlib library on Windows (#49034)
This explicitly specifies the correct library for zlib to CMake:

CMake's find-zlib logic looks for zlib before zdll. On Windows, zlib
is the static lib and zdll the import library. LibPNG only links to
shared zlib, but was getting the static zlib from CMake on Windows,
which resulted in a linker failure.
2025-04-04 17:58:05 -07:00
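A hedged illustration of pinning zlib explicitly in a CMake-based recipe (a method inside the package class; attribute access follows Spack's spec API, and taking the first library file is an illustrative assumption):

```
def cmake_args(self):
    # Hand CMake the exact zlib library so FindZLIB cannot pick the
    # static zlib over the zdll import library on Windows.
    zlib = self.spec["zlib"]
    return [
        self.define("ZLIB_LIBRARY", zlib.libs.files[0]),
        self.define("ZLIB_INCLUDE_DIR", zlib.prefix.include),
    ]
```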
John W. Parent
924204828e libxml2 package: add CMake builder; add version 2.13.5 (#47878) 2025-04-04 17:43:01 -07:00
Peter Brady
2f4c5f2aa2 libffi: add v3.4.7 (#49887)
* update libffi to 3.4.7
* Add conflict for libffi/apple-clang

---------

Co-authored-by: pbrady <pbrady@users.noreply.github.com>
2025-04-04 16:58:00 -06:00
Massimiliano Culpo
7e6a216d33 spack test run: add a --timeout argument (#49839)
* TestSuite: add type hints
* spack test run: add a --timeout argument
* pipelines: allow 2 minutes to run tests
* Fix docstrings, increase maximum pipelines time for tests to 5 mins.
* Use SIGTERM first, SIGKILL shortly after
* Add unit-tests for "start_build_process"

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-04 15:48:48 -07:00
Francesco Rizzi
87bbcefba9 pressio: new package (#49592)
* pressio: add packages for pressio, pressio-ops, and pressio-log

* pressio: use symlinks for pressio-ops/log; specify compatible versions

* pressio: refactor supported versions, update pressio-ops to use main branch

* pressio: update package after renaming repository to pressio-rom

* pressio: remove unneeded if statement

---------

Co-authored-by: Caleb Schilly <cwschilly@gmail.com>
2025-04-04 15:56:47 -05:00
Thomas Padioleau
8ab1011192 kokkos-tools: add cxx language dependency (#49885) 2025-04-04 12:42:56 -06:00
Massimiliano Culpo
b2f8cd22c3 Update the "missing attribute in recipe" message (#49874)
Multiple build systems have been part of Spack for a long
time now, and they are rarely the cause of a missing attribute.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-04 10:38:38 -07:00
Thomas-Ulrich
e6e58423aa seissol: add v1.3.1 (#49878)
* seissol: new version
* split the long line
* also add netcdf max version
2025-04-04 10:36:49 -07:00
Alec Scott
5255af3981 go: add v1.24.2 (#49886) 2025-04-04 10:33:09 -07:00
Thomas-Ulrich
5d913d0708 easi: add c compiler dependence (#49877) 2025-04-04 09:41:04 -07:00
Chris Marsh
5fda19194b vtk: fix PYTHONPATH (#49773)
* Ensure PYTHONPATH is set for vtk+python

* style

* switch to extends
2025-04-04 10:04:47 -05:00
Harmen Stoppels
3ea92b1983 conditional: fix value type (#49882) 2025-04-04 15:31:44 +02:00
Harmen Stoppels
522fa9dc62 test/variant.py: fix broken test (#49879) 2025-04-04 11:13:50 +02:00
Caetano Melone
65ec330af5 fix: intel compiler alias intel-typo (#49870)
This typo was causing issues when concretizing specs with "%intel"
2025-04-04 08:51:48 +02:00
Dom Heinzeller
72c1d0033f Add p4est@2.8.7 (#49859) 2025-04-03 23:32:45 -06:00
Harmen Stoppels
6bfe83106d Concrete multi-valued variants (#49756)
Similar to the range-or-specific-version ambiguity of `@1.2` in the past,
which was solved with `@1.2` vs `@=1.2` we still have the ambiguity of
`name=a,b,c` in multi-valued variants. Do they mean "at least a,b,c" or
"exactly a,b,c"?

This issue comes up in for example `gcc languages=c,cxx`; there's no
way to exclude `fortran`.

The ambiguity is resolved with syntax `:=` to distinguish concrete from
abstract.

The following strings parse as **concrete** variants:

* `name:=a,b,c` => values exactly {a, b, c}
* `name:=a` => values exactly {a}
* `+name` => values exactly {True}
* `~name` => values exactly {False}

The following strings parse as **abstract** variants:

* `name=a,b,c` values at least {a, b, c}
* `name=*` special case for testing existence of a variant; values are at
  least the empty set {}

As a reminder

* `satisfies(lhs, rhs)` means `concretizations(lhs)` ⊆ `concretizations(rhs)`
* `intersects(lhs, rhs)` means `concretizations(lhs)` ∩ `concretizations(rhs)` ≠ ∅

where `concretizations(...)` is the set of sets of variant values in this case.

The satisfies semantics are:

* rhs abstract: rhs values is a subset of lhs values (whether lhs is abstract or concrete)
* lhs concrete, rhs concrete: set equality
* lhs abstract, rhs concrete: false

and intersects should mean

* lhs and rhs abstract: true (the union is a valid concretization under both)
* lhs or rhs abstract: true iff the abstract variant's values are a subset of the concrete one
* lhs concrete, rhs concrete: set equality

Concrete specs with single-valued variants are printed `+foo`, `~foo` and `foo=bar`;
only multi-valued variants are printed with `foo:=bar,baz` to reduce the visual noise.
2025-04-04 04:47:43 +00:00
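A small sketch of those semantics through the Spec API (hypothetical package `pkg` with a multi-valued variant `foo`):

```
from spack.spec import Spec

# rhs abstract: rhs values must be a subset of lhs values
assert Spec("pkg foo:=a,b,c").satisfies(Spec("pkg foo=a,b"))

# lhs abstract, rhs concrete: never satisfies
assert not Spec("pkg foo=a,b").satisfies(Spec("pkg foo:=a,b"))

# both abstract: always intersect (the union concretizes both)
assert Spec("pkg foo=a").intersects(Spec("pkg foo=b"))
```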
suzannepaterno
d37e2c600c Add TotalView 2024.4 & 2025.1 (#49858)
Update to include the latest versions of TotalView and remove retired and interim maintainers.
2025-04-03 22:37:43 -06:00
Vicente Bolea
db9630e9e0 ascent: add rocm capabilities (#48517) 2025-04-03 14:30:45 -05:00
Zack Galbreath
136a658746 ci: replace 'graviton3' with 'neoverse_v1' (#49860)
neoverse_v1 matches the name of the stack and more accurately captures
the requirement for these jobs. The relevant runners in GitLab already
bear both tags, so this shouldn't affect how jobs get assigned to runners.
2025-04-03 14:24:39 -05:00
Adam J. Stewart
f0bfc7d898 py-keras: add v3.9.2 (#49856) 2025-04-03 20:48:20 +02:00
Alex Richert
03ebb82752 bufr: add v12.2.0 plus several updates (#49850)
* bufr: add test file resource and set LD_LIBRARY_PATH in setup_build_environment (python tests)
2025-04-03 10:47:41 -07:00
Alex Richert
82d808d58d g2: disable one problematic unit test for intel@:2022 (#49851) 2025-04-03 10:44:17 -07:00
Alex Richert
43fa93c8e1 wrf-io: add make test (#49852) 2025-04-03 10:41:39 -07:00
Adam J. Stewart
aa70cb34e1 py-shapely: add v2.1.0 (#49857) 2025-04-03 10:39:01 -07:00
Richard Berger
9acb70f204 kokkos: add v4.6.00 (#49810)
* kokkos: add version 4.6.00

* kokkos-kernels: add version 4.6.00

* kokkos-nvcc-wrapper: add version 4.6.00 and update url to match kokkos releases

* kokkos: add zen4 support
2025-04-03 10:26:25 -07:00
Daniele Colombo
8296788730 deconwolf: new package (#49801)
* deconwolf: add new package

* fix deconwolf patch: add full_index=1

* deconwolf: fix spack license string

* deconwolf: add commit to version

Co-authored-by: Alec Scott <hi@alecbcs.com>

* deconwolf: use url before git

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-04-03 10:15:10 -07:00
Frédéric Simonis
31a8dc6f6c precice: Add version 3.2.0 (#49833) 2025-04-03 10:12:06 -07:00
Massimiliano Culpo
b1ac661ba8 Simplify a few methods in environments (#49682)
Return a single scope from environment.env_config_scope
Return a single scope from environment_path_scope

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-03 18:19:04 +02:00
Greg Becker
63b437ddf9 py-black: add v24.4.0 -> 25.1.0 (#49814)
Signed-off-by: Gregory Becker <becker33@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-04-03 16:48:18 +02:00
Rob Falgout
578675cec8 Update package.py (#49843)
Adding hypre release 2.33.0
2025-04-03 05:44:08 -06:00
Harmen Stoppels
751c79872f variant.py: make Variant.default str | bool | tuple[str] (#49836)
Warn if variant default is not among those types
2025-04-03 11:07:08 +00:00
Samuel Browne
22f26eec68 binder: add dependency on C (#49838) 2025-04-03 12:04:57 +02:00
Daryl W. Grunau
4ec2016f56 glproto, inputproto, kbproto, libpthread-stubs, randrproto, renderproto, xextproto, xproto, xsimd, xtensor, xtensor-blas, xtl: add c dependency (#49823) 2025-04-03 12:03:59 +02:00
Massimiliano Culpo
5c71d36330 compiler-wrapper: set SPACK_COMPILER_EXTRA_RPATHS (#49828)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-03 11:23:08 +02:00
Kyle Brindley
035096006e py-waves: new package (#49805) 2025-04-02 15:46:01 -07:00
Weiqun Zhang
1f797208bc amrex: add v25.04 (#49848) 2025-04-02 14:55:45 -07:00
Wouter Deconinck
aa41fe05ff qt-*: add v6.8.3, v6.9.0 (#49840)
* qt-base: add v6.8.3, v6.9.0
* qt-declarative: add v6.8.3, v6.9.0
* qt-quick3d: add v6.8.3, v6.9.0
* qt-quicktimeline: add v6.8.3, v6.9.0
* qt-tools: add v6.8.3, v6.9.0
* qt-svg: add v6.8.3, v6.9.0
* qt-5compat: add v6.8.3, v6.9.0
* qt-shadertools: add v6.8.3, v6.9.0
2025-04-02 14:39:59 -07:00
Richard Berger
f4792c834e portage, tangram, wonton: update packages (#49829) 2025-04-02 19:53:14 +02:00
Satish Balay
98ca90aebc slepc, py-slepc4py, petsc, py-petsc4py add v3.23.0 (#49813) 2025-04-02 10:08:56 -07:00
Mikael Simberg
991f26d1ae pika: Add 0.33.0 (#49834) 2025-04-02 10:58:24 -06:00
eugeneswalker
973a7e6de8 e4s oneapi: upgrade to latest compilers oneapi@2025.1 (#47317)
* e4s oneapi: upgrade to latest compilers oneapi@2025.1

* update specs and package preferences

* enable some more dav packages

* enable additional specs

* e4s oneapi: packages: elfutils does not have bzip2 variant

* e4s oneapi: packages: elfutils does not have xz variant

* e4s oneapi: comment out heffte+sycl

* comment out e4s oneapi failures

* comment out more failures

* comment out failing spec
2025-04-02 09:21:49 -07:00
Cory Quammen
528ba74965 paraview: add v5.13.3 (#49818)
* Revert "paraview: add patch for Intel Classic compilers (#49116)"

This reverts commit 7a95e2beb5.

We'll mark Intel Classic compilers as conflicting with ParaView
versions 5.13.0-5.13.2 instead since 5.13.3 is available and can be
built with those compilers.

* Add conflict for Intel Classic compilers and ParaView 5.13.0-5.13.2.

* paraview: add new v5.13.3 release
2025-04-02 10:53:19 -05:00
eugeneswalker
73034c163b rocm 6.3.3 updates (#49684) 2025-04-02 08:44:06 -07:00
Massimiliano Culpo
62ee56e8a3 docs: remove leftover references to compiler: entries (#49824)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-02 09:33:23 +02:00
Massimiliano Culpo
01471aee6b solver: don't use tags to compute injected deps (#49723)
This commit reorders ASP setup, so that rules from
possible compilers are collected first.

This allows us to know the dependencies that may be
injected before counting the possible dependencies,
so we can account for them too.

Proceeding this way makes it easier to inject
complex runtimes, like hip.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-02 09:32:25 +02:00
Afzal Patel
c004c8b616 rocm-opencl: change homepage and git (#49832) 2025-04-02 09:19:40 +02:00
Harmen Stoppels
0facab231f spec.py: more virtuals=... type hints (#49753)
Deal with the "issue" that passing a str instance does not cause a
type check failure, because str is a subset of Sequence[str] and
Iterable[str]. Instead fix it by special casing the str instance.
2025-04-02 00:05:00 -07:00
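The pitfall here is that a bare `str` satisfies both `Sequence[str]` and `Iterable[str]`, so type checkers stay silent; only a runtime `isinstance` check catches the mistake. A self-contained illustration (the function and message are placeholders, not Spack's API):

```python
# Illustration of the str-vs-Sequence[str] pitfall; names are placeholders.
from typing import Iterable, Set

def virtuals_set(virtuals: Iterable[str]) -> Set[str]:
    if isinstance(virtuals, str):  # "cxx" would otherwise iterate as c, x, x
        raise TypeError("pass a collection such as ('cxx',), not a bare str")
    return set(virtuals)

print(virtuals_set(("cxx", "fortran")))  # {'cxx', 'fortran'}
virtuals_set("cxx")                      # TypeError instead of {'c', 'x'}
```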
Greg Becker
ca64050f6a config:url_fetch_method: allow curl args (#49712)
Signed-off-by: Gregory Becker <becker33@llnl.gov>
2025-04-01 15:23:28 -05:00
Jonas Eschle
91b3afac88 Add py-tf-keras package, upgrade TFP (#43688)
* enh: add tf-keras package, upgrade TFP

* chore:  remove legacy deps

* chore:  fix style

* chore:  fix style

* fix: url

* fix: use jax, tensorflow instead of py-jax, py-tensorflow

* fix: remove typo

* Update var/spack/repos/builtin/packages/py-tensorflow-probability/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix: typos

* fix: swap version

* fix: typos

* fix: typos

* fix: typos

* chore: use f strings

* enh: move tf-keras to pypi

* [@spackbot] updating style on behalf of jonas-eschle

* fix: t

* enh: move back to releases to make it work, actually

* fix:change back to tar...

* Fix concretisation: py-tf-keras only has 2.17, not 2.16, fix checksum

* enh: add TFP 0.25

* enh: add tf-keras 2.18

* chore: fix style

* fix: remove patch

* maybe fix license

* Update var/spack/repos/builtin/packages/py-tf-keras/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix:  pipargs global?

* Update var/spack/repos/builtin/packages/py-tf-keras/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* chore: fix formatting

* chore: fix formatting again

* fix:  pathes in spack

* fix:  typo

* fix:  typo

* use github package

* use pip install

* fix typo

* fix typo

* comment 2.19 out

* fix typo

* fix typo

* fix typo

* chore: remove unused patch file

* chore: cleanup

* chore: add comment about TF version

* chore: remove unused Bazel, cleanup imports

* [@spackbot] updating style on behalf of jonas-eschle

* chore: add star import, degrading readability

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-04-01 20:12:57 +02:00
Alec Scott
0327ba1dfe py-python-lsp-ruff: new package (#49764)
* py-python-lsp-ruff: new package
* Add git and main versions to both for development
2025-04-01 10:32:49 -07:00
Vanessasaurus
67f091f0d9 flux-core: cffi needed for linking (#49656)
* Automated deployment to update package flux-core 2025-03-23

* cffi also needed for runtime

---------

Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-04-01 10:24:57 -07:00
Robert Maaskant
c09759353f trivy: new package (#49786) 2025-04-01 09:46:04 -07:00
Wouter Deconinck
14d72d2703 treewide style: move depends_on(c,cxx,fortran) with other dependencies, after variants (#49769)
* fix: move depends_on(c,cxx,fortran) with other dependencies, after variants

* treewide style: move depends_on(c,cxx,fortran) with other dependencies, after variants

* treewide style: move depends_on(c,cxx,fortran) with other dependencies, after variant

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-04-01 09:25:46 -07:00
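For context, the convention being applied treewide looks roughly like the sketch below; the package and dependency names are placeholders, not a real recipe:

```python
# Schematic Spack recipe illustrating directive ordering; placeholder names.
from spack.package import *

class Example(CMakePackage):
    """Placeholder package illustrating the style convention."""

    version("1.0.0", sha256="0" * 64)

    variant("shared", default=True, description="Build shared libraries")

    # language dependencies grouped with the other dependencies, after variants
    depends_on("c", type="build")
    depends_on("cxx", type="build")
    depends_on("zlib-ng")
```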
Massimiliano Culpo
288298bd2c ci: don't run unit-test on ubuntu 20.04 (#49826)
Compatibility with Python 3.6 is still tested by the
rhel8-platform-python job, and Ubuntu 20.04 will be
removed soon from the list of runners:

https://github.com/actions/runner-images/issues/11101

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-01 17:33:24 +02:00
Olivier Cessenat
964f81d3c2 scotch: remains buildable with make (#49812) 2025-04-01 07:56:48 -07:00
John W. Parent
93220f706e concretizer cache: disable due to broken cleanup (#49470)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-04-01 15:38:56 +02:00
Victor A. P. Magri
fe9275a5d4 New hypre variants + refactoring (#49217) 2025-04-01 08:36:17 -05:00
Massimiliano Culpo
0fa829ae77 cmake: set CMAKE_POLICY_VERSION_MINIMUM (#49819)
CMake 4.0.0 breaks compatibility with CMake projects
requiring a CMake < 3.5. However, many projects that
specify a minimum requirement for versions older
than 3.5 are actually compatible with newer CMake
and do not use CMake 3.4 or older features. This
allows those projects to use a newer CMake.

Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-01 14:55:51 +02:00
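As a sketch of the effect: `CMAKE_POLICY_VERSION_MINIMUM` is a real CMake 4.x variable, and a recipe could also pass it explicitly for a project whose `cmake_minimum_required()` predates 3.5. The package below is a placeholder, not the mechanism this commit adds:

```python
# Hedged sketch; LegacyProject is a placeholder, not a real package.
from spack.package import *

class LegacyProject(CMakePackage):
    """Placeholder for a project declaring cmake_minimum_required(VERSION 3.2)."""

    depends_on("cmake@4:", type="build")

    def cmake_args(self):
        # Ask CMake 4.x to behave as if at least 3.5 were required.
        return [self.define("CMAKE_POLICY_VERSION_MINIMUM", "3.5")]
```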
Paul Gessinger
451db85657 apple-clang: fix spec syntax error (#49802) 2025-04-01 03:53:56 -06:00
Massimiliano Culpo
ce1c2b0f05 chapel: remove requirements on compilers (#49783)
The requirements being removed are redundant, or 
even outdated (%cray-prgenv-* is not a compiler in v0.23).

When compilers turned into nodes, these constraints,
with the "one_of" policy, started being unsat under
certain conditions e.g. we can't compile anymore
with GCC and depend on LLVM as a library.

Remove the requirements to make the recipe solvable
again.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-04-01 09:30:43 +02:00
Adam J. Stewart
88c1eae5d4 py-webdataset: add v0.2.111 (#49780) 2025-04-01 09:15:18 +02:00
Adam J. Stewart
4af3bc47a2 py-wids: add new package (#49779) 2025-04-01 09:14:58 +02:00
Harmen Stoppels
a257747cba test/spec_semantics.py: more fixes (#49821) 2025-04-01 09:14:26 +02:00
Robert Maaskant
ed2ddec715 py-setuptools-scm: deprecate old versions (#49657) 2025-04-01 09:08:10 +02:00
Tamara Dahlgren
748c7e5420 Documentation: remote URL inclusion updates (#49669) 2025-04-01 09:02:50 +02:00
dependabot[bot]
cf70d71ba8 build(deps): bump flake8 in /.github/workflows/requirements/style (#49815)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.2 to 7.2.0.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.2...7.2.0)

---
updated-dependencies:
- dependency-name: flake8
  dependency-version: 7.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-01 08:34:26 +02:00
Stephen Nicholas Swatman
3913c24c19 acts dependencies: new versions as of 2025/03/31 (#49803)
This commit adds a new version of ACTS algebra plugins (v0.27.0) as
well as detray (v0.94.0).
2025-04-01 00:28:54 -06:00
Stephen Nicholas Swatman
032a0dba90 benchmark: add v1.9.2 (#49804)
This commit adds v1.9.2 of Google Benchmark.
2025-04-01 00:28:39 -06:00
Chris Marsh
d4a8602577 spdlog: Add v1.15.1, improve version ranges (#49745)
* Add v1.15.1, fix external tweakme.h, better constraint version ranges of external fmt

* style
2025-03-31 19:09:28 -06:00
Elliott Slaughter
1ce5ecfbd7 legion: add 25.03.0, deprecate versions 23.03.0 and prior. (#49770)
* legion: Add 25.03.0. Deprecate 23.03.0 and prior.

* legion: Add ucc dependency when using ucx network.

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-03-31 18:54:00 -06:00
Satish Balay
bf6ea7b047 cuda@12.8 requires glibc@2.27 (#49561) 2025-03-31 13:02:09 -07:00
Chris Marsh
49a17de751 python 3.12 package fixes (#48830)
* update packages to work with python 3.12+

* partd updates

* py-nc-time-axis updates for 3.12

* Add new py-cftime as required for py-nc-time-axis

* fix dropped python 3.9

* switch from when blocks to flat

* remove redundant requires

* protect version range for python@:3.11

* add new c-blosc to support newly added python 3.12 py-blosc version

* add scikit-build 0.18.1 for python 3.12 required for this set of commits

* add complete optional variant for py-partd to match pyproject.toml. Deprecate super old versions

* only set system blosc for the required case

* style

* Remove incorrect python bound

* improve python version reqs and move to more canonical depends_on

* move to depends from req

* add new python range limit, update comment

* remove @coreyjadams as maintainer as per their request https://github.com/spack/spack/pull/48830#issuecomment-2705062587

* Fix python bounds

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-31 12:09:56 -05:00
Harmen Stoppels
95927df455 test/spec_semantics.py: split a test into multiple ones, test only necessary bits (#49806) 2025-03-31 17:08:36 +00:00
Tara Drwenski
0f64f1baec CachedCMakePackage: Improve finding mpiexec for non-slurm machines (#49033)
* Check for LSF, Flux, and Slurm when determining MPI exec

* Make scheduler/MPI exec helper functions methods of CachedCMakeBuilder

* Remove axom workaround for running mpi on machines with flux
2025-03-31 09:11:09 -07:00
Harmen Stoppels
46f7737626 xfail -> skipif platform specific tests on other platforms (#49800) 2025-03-31 16:32:26 +02:00
Alec Scott
7256508983 fzf: add v0.61.0 (#49787) 2025-03-31 12:16:36 +02:00
John W. Parent
75a3d179b1 Windows: MSVC provides fortran, fix msmpi handling (#49734)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-31 10:53:07 +02:00
Xylar Asay-Davis
72b14de89e Add nco v5.3.3 (#49776) 2025-03-30 10:12:12 -07:00
Christoph Junghans
de6eaa1b4e votca: add v2025 (#49768) 2025-03-30 10:10:23 -07:00
Paul Gessinger
0f4bfda2a1 apple-gl: remove compiler requirement after change in compiler model (#49760) 2025-03-30 11:12:42 +02:00
Ritwik Patil
6c78d9cab2 sendme: add v0.25.0, v0.24.0 (#49775)
* sendme: add v0.25.0, v0.24.0

* fix version orders

* add minimum rust version
2025-03-29 19:12:34 -06:00
Matthieu Dorier
7970a04025 unqlite: fix dependency on C++ (#49774) 2025-03-29 10:10:59 -05:00
Adrien Bernede
22c38e5975 Update hip support for Cached CMake Packages (#49635)
* Clearly split old and new hip settings requirements
* Apply generic rocm handling to every project
* make default logic for hip support more robust
* GPU_TARGET is only necessary under certain project-specific conditions; it should not be necessary in general
* Update logic to find amdclang++
2025-03-28 18:38:45 -07:00
Wouter Deconinck
3c4cb0d4f3 gsoap: depends on c, cxx (#49728) 2025-03-28 13:58:31 -06:00
Wouter Deconinck
61b9e8779b mlpack: depends on c (#49767)
* mlpack: depends on c

* mlpack: mv depends_on after variants
2025-03-28 13:53:56 -06:00
Wouter Deconinck
2a25e2b572 qt-*: depends on c (#49766)
* qt-5compat: depends on c

* qt-*: ensure all depend on C

* qt-base: mv depends_on after variants
2025-03-28 13:53:28 -06:00
Sebastian Pipping
7db5b1d5d6 expat: Add 2.7.1 (#49741) 2025-03-28 13:32:55 -06:00
Massimiliano Culpo
10f309273a Do not error when trying to convert corrupted compiler entries (#49759)
fixes #49717

If no compiler is listed in the 'packages' section of
the configuration, Spack will currently try to:
1. Look for a legacy compilers.yaml to convert
2. Look for compilers in PATH

in that order. If an entry in compilers.yaml is
corrupted, that should not result in an obscure
error.

Instead, it should just be skipped.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-28 13:14:12 -06:00
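The "skip instead of crash" behavior reads roughly like this hypothetical sketch; the entry layout and warning text are illustrative, not Spack's implementation:

```python
# Hypothetical sketch of skipping corrupted legacy entries; not Spack's code.
import warnings
from typing import List

def convert_legacy_compilers(entries: List[dict]) -> List[dict]:
    converted = []
    for entry in entries:
        try:
            converted.append({"spec": entry["compiler"]["spec"]})
        except (KeyError, TypeError) as exc:
            warnings.warn(f"skipping corrupted compilers.yaml entry: {exc}")
    return converted

entries = [{"compiler": {"spec": "gcc@10.5.0"}}, {"compiler": None}]
print(convert_legacy_compilers(entries))  # [{'spec': 'gcc@10.5.0'}]
```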
Matthias Krack
3e99a12ea2 libsmeagol: add dependency fortran (#49754) 2025-03-28 13:09:31 -06:00
Greg Becker
d5f5d48bb3 oneapi-compilers: allow detection of executables symlinked to prefix.bin (#49742)
* intel-oneapi-compilers: detect compilers when symlinked to prefix.bin

* update detection tests

--

Signed-off-by: Gregory Becker <becker33@llnl.gov>
2025-03-28 12:28:47 -06:00
Wouter Deconinck
6f2019ece9 root: add v6.34.06 (#49748) 2025-03-28 11:09:47 -05:00
Harmen Stoppels
35a84f02fa variant.py: reserved names is a set (#49762) 2025-03-28 08:17:04 -07:00
Wouter Deconinck
f6123ee160 onnx: depends on c (#49739) 2025-03-28 08:31:21 -06:00
Harmen Stoppels
4b02ecddf4 variant.py: use spack.error the normal way (#49763)
This module references `spack.error.Something` in the same file, which happens to
work but is incorrect. Use `spack.error` to prevent that in the future.
2025-03-28 14:12:45 +00:00
Richard Berger
bd39598e61 libcxi: disable ze, cuda, rocm autodetection (#49736)
Only enable these if explicitly requested.
2025-03-28 08:06:29 -06:00
Wouter Deconinck
df9cac172e cryptopp: depends on cxx (#49730) 2025-03-28 08:06:00 -06:00
Wouter Deconinck
511c2750c7 cppzmq: depends on c (#49729) 2025-03-28 08:00:23 -06:00
Jim Edwards
b7a81426b0 parallelio: add v2.6.4, v2.6.5 (#49607)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-28 04:31:50 -06:00
Jim Edwards
654b294641 mpi-serial: add v2.5.2, v2.5.3 (#49606)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-28 04:31:25 -06:00
Cameron Rutherford
c19a90b74a py-notebook: Add 6.5.6 and 6.5.7, fix py-traitlets incompatibility for 6.5.5 (#49666) 2025-03-28 10:30:52 +01:00
Fernando Ayats
03d70feb18 Add py-equinox (#49342)
Adds Equinox, a JAX library. I've added the latest version 0.11.2, and also 0.11.0,
which is compatible with older JAX versions.

I've built both versions and loading an example from their page seems to work OK.

* Add py-wadler-lindig
* Add py-equinox
2025-03-28 10:27:04 +01:00
Vinícius
bb1216432a simgrid: add v4.0 (#49675)
Co-authored-by: viniciusvgp <viniciusvgp@users.noreply.github.com>
2025-03-28 10:24:55 +01:00
Wouter Deconinck
d27aab721a erlang: add v26.2.5.2, v27.0.1 (#45725) 2025-03-28 10:23:09 +01:00
Todd Gamblin
3444d40ae2 bugfix: pure cxx and fortran deps should display with a compiler. (#49752)
I noticed that `abseil-cpp` was showing in `spack find` with "no compiler", and the only
difference between it and other nodes was that it *only* depends on `cxx` -- others
depend on `c` as well.

It turns out that the `select()` method on `EdgeMap` only takes `Sequence[str]` and doesn't
check whether they're actually just one `str`.  So asking for, e.g., `cxx` is like asking for
`c` or `x` or `x`, as the `str` is treated like a sequence. This causes Spack to miss `cxx`
and `fortran` language virtuals in `DeprecatedCompilerSpec`.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-28 10:18:18 +01:00
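The mechanics described above can be reproduced in a few lines; this is a toy model of the bug, not `EdgeMap.select` itself:

```python
# Toy reproduction of the str-as-Sequence[str] bug; not EdgeMap.select.
from typing import Dict, Sequence, Set

def select(edges: Dict[str, str], virtuals: Sequence[str]) -> Set[str]:
    wanted = set(virtuals)  # set("cxx") == {'c', 'x'} -- the bug
    return {pkg for pkg, virtual in edges.items() if virtual in wanted}

edges = {"abseil-cpp": "cxx", "zlib-ng": "c"}
print(select(edges, ("cxx",)))  # {'abseil-cpp'} -- intended call
print(select(edges, "cxx"))     # {'zlib-ng'}    -- 'cxx' missed, 'c' matched
```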
Carlos Bederián
37f2683d17 hcoll: add new versions (#47923) 2025-03-28 10:17:46 +01:00
Harmen Stoppels
3c4f23f64a Fix import urllib.request (#49699) 2025-03-28 10:15:33 +01:00
jmuddnv
db8e56b0a5 NVIDIA HPC SDK: add v25.3 (#49694) 2025-03-28 10:14:21 +01:00
Alec Scott
ff6dfea9b9 difftastic: new package (#49708) 2025-03-28 10:06:27 +01:00
Adam J. Stewart
2f3ef790e2 py-keras: add v3.9.1 (#49726) 2025-03-28 01:48:06 -07:00
Richard Berger
01db307f41 libfabric: add missing xpmem dependency (#49569) 2025-03-28 09:41:36 +01:00
dithwick
d715b725fa openmpi: add ipv6 variant (#49727) 2025-03-28 09:26:56 +01:00
Todd Gamblin
52a995a95c UI: Color external packages differently from installed packages (#49751)
Currently, externals show up in `spack find` and `spack spec` install status as a green
`[e]`, which is hard to distinguish from the green [+] used for installed packages.

- [x] Make externals magenta instead, so they stand out.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-28 00:37:56 -07:00
John W. Parent
b87c025cd3 Remove spack sh setup (#49747) 2025-03-27 16:40:39 -07:00
v
360eb4278c set compiler version for xsd package (#49702)
* set compiler version for xsd package

---------

Co-authored-by: novasoft <novasoft@fnal.gov>
2025-03-27 14:29:05 -07:00
Vicente Bolea
7b3fc7dee3 adios2: fix smoke package (#49647) 2025-03-27 16:10:42 -05:00
Dave Keeshan
f0acbe4310 yosys: add v0.51 (#49639)
* add: yosys v0.51

* Move license to within 6 lines of start, but truncated description to do that

* Moving license back, and reinstating description
2025-03-27 12:28:44 -06:00
Adam J. Stewart
f0c676d14a py-contrib: add new package (#49634) 2025-03-27 19:07:33 +01:00
Howard Pritchard
13dd198a09 Open MPI: add 5.0.7 (#49661)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2025-03-27 10:33:56 -07:00
Teague Sterling
3f6c66d701 libwnck: add v43.2 (#49655)
* libwnck: new package

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

add gettext and xres (testing)

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Adding Cairo variant

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* libwnck: add v43.2

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-27 10:32:32 -07:00
gregorweiss
dcd6e61f34 bigwhoop: new package (#49677)
* add package for BigWhoop lossy data compression

Co-authored-by: Patrick Vogler <patrick.vogler@hlrs.de>

* add version 0.2.0

* update url

* corrected checksum

---------

Co-authored-by: Patrick Vogler <patrick.vogler@hlrs.de>
2025-03-27 10:30:20 -07:00
Richard Berger
ac7b467897 libfabric: add v2.0.0, v2.1.0, and lnx fabric (#49549) 2025-03-27 10:22:15 -07:00
Harmen Stoppels
c5adb934c2 rocminfo: extends python (#49532) 2025-03-27 18:05:56 +01:00
eugeneswalker
1e70a8d6ad amr-wind+rocm depends_on rocrand and rocprim (#49703)
* amr-wind+rocm depends_on rocrand

* amr-wind +rocm: depend_on rocprim
2025-03-27 09:30:29 -07:00
Buldram
755a4054b2 zuo: add v1.12 (#49678) 2025-03-27 08:45:35 -07:00
Seth R. Johnson
040d747a86 g4tendl: avoid closing bracket that breaks lmod (#49721) 2025-03-27 09:44:16 -06:00
Jon Rood
9440894173 hdf5: add version 1.14.6. (#49704) 2025-03-27 08:40:10 -07:00
Alex Tyler Chapman
4e42e3c2ec update gslib package to include version 1.0.9 (#49710) 2025-03-27 08:38:44 -07:00
Wouter Deconinck
662bf113e2 ensmallen: depends on c (#49719) 2025-03-27 08:36:24 -07:00
Wouter Deconinck
7e11fd62e2 geant4: add v11.3.1 (#49700) 2025-03-27 16:08:57 +01:00
Massimiliano Culpo
a6c22f2690 Update documentation after compiler as deps (#49715) 2025-03-27 11:19:35 +01:00
Massimiliano Culpo
4894668ece Remove "classic" Intel packages (#45188) 2025-03-27 10:13:33 +01:00
2122 changed files with 13153 additions and 11483 deletions

.gitattributes vendored
View File

@@ -1,4 +1,3 @@
*.py diff=python
*.lp linguist-language=Prolog
lib/spack/external/* linguist-vendored
*.bat text eol=crlf
*.bat text eol=crlf

View File

@@ -59,7 +59,6 @@ jobs:
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit configs

View File

@@ -26,7 +26,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
cmake bison bison-devel libstdc++-static gawk
- name: Setup OpenSUSE
if: ${{ matrix.image == 'opensuse/leap:latest' }}
run: |

View File

@@ -25,14 +25,16 @@ jobs:
with:
python-version: '3.13'
cache: 'pip'
cache-dependency-path: '.github/workflows/requirements/style/requirements.txt'
- name: Install Python Packages
run: |
pip install --upgrade pip setuptools
pip install -r .github/workflows/requirements/style/requirements.txt
- name: vermin (Spack's Core)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
run: |
vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos
run: |
vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
@@ -40,23 +42,20 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
fetch-depth: 2
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
cache: 'pip'
cache-dependency-path: '.github/workflows/requirements/style/requirements.txt'
- name: Install Python packages
run: |
pip install --upgrade pip setuptools
pip install -r .github/workflows/requirements/style/requirements.txt
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/bin/setup_git.sh
- name: Run style tests
run: |
share/spack/qa/run-style-tests
bin/spack style --base HEAD^1
bin/spack license verify
pylint -j $(nproc) --disable=all --enable=unspecified-encoding --ignore-paths=lib/spack/external lib
audit:
uses: ./.github/workflows/audit.yaml
@@ -103,21 +102,3 @@ jobs:
spack -d bootstrap now --dev
spack -d style -t black
spack unit-test -V
# Further style checks from pylint
pylint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
cache: 'pip'
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pylint
- name: Pylint (Spack Core)
run: |
pylint -j 4 --disable=all --enable=unspecified-encoding --ignore-paths=lib/spack/external lib

View File

@@ -1,7 +1,8 @@
black==25.1.0
clingo==5.7.1
flake8==7.1.2
clingo==5.8.0
flake8==7.2.0
isort==6.0.1
mypy==1.15.0
types-six==1.17.0.20250304
types-six==1.17.0.20250403
vermin==1.6.0
pylint==3.3.6

View File

@@ -19,9 +19,6 @@ jobs:
on_develop:
- ${{ github.ref == 'refs/heads/develop' }}
include:
- python-version: '3.6'
os: ubuntu-20.04
on_develop: ${{ github.ref == 'refs/heads/develop' }}
- python-version: '3.7'
os: ubuntu-22.04
on_develop: ${{ github.ref == 'refs/heads/develop' }}

View File

@@ -46,18 +46,42 @@ See the
[Feature Overview](https://spack.readthedocs.io/en/latest/features.html)
for examples and highlights.
To install spack and your first package, make sure you have Python & Git.
Installation
----------------
To install spack, first make sure you have Python & Git.
Then:
$ git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git
$ cd spack/bin
$ ./spack install zlib
```bash
git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git
```
<details>
<summary>What are <code>manyFiles=true</code> and <code>--depth=2</code>?</summary>
<br>
> [!TIP]
> `-c feature.manyFiles=true` improves git's performance on repositories with 1,000+ files.
>
> `--depth=2` prunes the git history to reduce the size of the Spack installation.
</details>
```bash
# For bash/zsh/sh
. spack/share/spack/setup-env.sh
# For tcsh/csh
source spack/share/spack/setup-env.csh
# For fish
. spack/share/spack/setup-env.fish
```
```bash
# Now you're ready to install a package!
spack install zlib-ng
```
Documentation
----------------

View File

@@ -90,10 +90,9 @@ config:
misc_cache: $user_cache_path/cache
# Timeout in seconds used for downloading sources etc. This only applies
# to the connection phase and can be increased for slow connections or
# servers. 0 means no timeout.
connect_timeout: 10
# Abort downloads after this many seconds if no data is received.
# Setting this to 0 will disable the timeout.
connect_timeout: 30
# If this is false, tools like curl that use SSL will not verify

View File

@@ -25,6 +25,8 @@ packages:
glu: [apple-glu]
unwind: [apple-libunwind]
uuid: [apple-libuuid]
apple-clang:
buildable: false
apple-gl:
buildable: false
externals:

View File

@@ -72,6 +72,8 @@ packages:
permissions:
read: world
write: user
cce:
buildable: false
cray-fftw:
buildable: false
cray-libsci:
@@ -86,13 +88,23 @@ packages:
buildable: false
essl:
buildable: false
fj:
buildable: false
fujitsu-mpi:
buildable: false
fujitsu-ssl2:
buildable: false
glibc:
buildable: false
hpcx-mpi:
buildable: false
iconv:
prefer: [libiconv]
mpt:
buildable: false
musl:
buildable: false
spectrum-mpi:
buildable: false
xl:
buildable: false

View File

@@ -20,3 +20,8 @@ packages:
cxx: [msvc]
mpi: [msmpi]
gl: [wgl]
mpi:
require:
- one_of: [msmpi]
msvc:
buildable: false

View File

@@ -1291,55 +1291,61 @@ based on site policies.
Variants
^^^^^^^^
Variants are named options associated with a particular package. They are
optional, as each package must provide default values for each variant it
makes available. Variants can be specified using
a flexible parameter syntax ``name=<value>``. For example,
``spack install mercury debug=True`` will install mercury built with debug
flags. The names of particular variants available for a package depend on
Variants are named options associated with a particular package and are
typically used to enable or disable certain features at build time. They
are optional, as each package must provide default values for each variant
it makes available.
The names of variants available for a particular package depend on
what was provided by the package author. ``spack info <package>`` will
provide information on what build variants are available.
For compatibility with earlier versions, variants which happen to be
boolean in nature can be specified by a syntax that represents turning
options on and off. For example, in the previous spec we could have
supplied ``mercury +debug`` with the same effect of enabling the debug
compile time option for the libelf package.
There are different types of variants:
Depending on the package a variant may have any default value. For
``mercury`` here, ``debug`` is ``False`` by default, and we turned it on
with ``debug=True`` or ``+debug``. If a variant is ``True`` by default
you can turn it off by either adding ``-name`` or ``~name`` to the spec.
1. Boolean variants. Typically used to enable or disable a feature at
compile time. For example, a package might have a ``debug`` variant that
can be explicitly enabled with ``+debug`` and disabled with ``~debug``.
2. Single-valued variants. Often used to set defaults. For example, a package
might have a ``compression`` variant that determines the default
compression algorithm, which users could set to ``compression=gzip`` or
``compression=zstd``.
3. Multi-valued variants. A package might have a ``fabrics`` variant that
determines which network fabrics to support. Users could set this to
``fabrics=verbs,ofi`` to enable both InfiniBand verbs and OpenFabrics
interfaces. The values are separated by commas.
There are two syntaxes here because, depending on context, ``~`` and
``-`` may mean different things. In most shells, the following will
result in the shell performing home directory substitution:
The meaning of ``fabrics=verbs,ofi`` is to enable *at least* the specified
fabrics, but other fabrics may be enabled as well. If the intent is to
enable *only* the specified fabrics, then the ``fabrics:=verbs,ofi``
syntax should be used with the ``:=`` operator.
.. code-block:: sh
.. note::
mpileaks ~debug # shell may try to substitute this!
mpileaks~debug # use this instead
In certain shells, the ``~`` character is expanded to the home
directory. To avoid these issues, avoid whitespace between the package
name and the variant:
If there is a user called ``debug``, the ``~`` will be incorrectly
expanded. In this situation, you would want to write ``libelf
-debug``. However, ``-`` can be ambiguous when included after a
package name without spaces:
.. code-block:: sh
.. code-block:: sh
mpileaks ~debug # shell may try to substitute this!
mpileaks~debug # use this instead
mpileaks-debug # wrong!
mpileaks -debug # right
Alternatively, you can use the ``-`` character to disable a variant,
but be aware that this requires a space between the package name and
the variant:
Spack allows the ``-`` character to be part of package names, so the
above will be interpreted as a request for the ``mpileaks-debug``
package, not a request for ``mpileaks`` built without ``debug``
options. In this scenario, you should write ``mpileaks~debug`` to
avoid ambiguity.
.. code-block:: sh
When spack normalizes specs, it prints them out with no spaces boolean
variants using the backwards compatibility syntax and uses only ``~``
for disabled boolean variants. The ``-`` and spaces on the command
line are provided for convenience and legibility.
mpileaks-debug # wrong: refers to a package named "mpileaks-debug"
mpileaks -debug # right: refers to a package named mpileaks with debug disabled
As a last resort, ``debug=False`` can also be used to disable a boolean variant.
"""""""""""""""""""""""""""""""""""
Variant propagation to dependencies
"""""""""""""""""""""""""""""""""""
Spack allows variants to propagate their value to the package's
dependency by using ``++``, ``--``, and ``~~`` for boolean variants.
@@ -1409,27 +1415,29 @@ that executables will run without the need to set ``LD_LIBRARY_PATH``.
.. code-block:: yaml
compilers:
- compiler:
spec: gcc@4.9.3
paths:
cc: /opt/gcc/bin/gcc
c++: /opt/gcc/bin/g++
f77: /opt/gcc/bin/gfortran
fc: /opt/gcc/bin/gfortran
environment:
unset:
- BAD_VARIABLE
set:
GOOD_VARIABLE_NUM: 1
GOOD_VARIABLE_STR: good
prepend_path:
PATH: /path/to/binutils
append_path:
LD_LIBRARY_PATH: /opt/gcc/lib
extra_rpaths:
- /path/to/some/compiler/runtime/directory
- /path/to/some/other/compiler/runtime/directory
packages:
gcc:
externals:
- spec: gcc@4.9.3
prefix: /opt/gcc
extra_attributes:
compilers:
c: /opt/gcc/bin/gcc
cxx: /opt/gcc/bin/g++
fortran: /opt/gcc/bin/gfortran
environment:
unset:
- BAD_VARIABLE
set:
GOOD_VARIABLE_NUM: 1
GOOD_VARIABLE_STR: good
prepend_path:
PATH: /path/to/binutils
append_path:
LD_LIBRARY_PATH: /opt/gcc/lib
extra_rpaths:
- /path/to/some/compiler/runtime/directory
- /path/to/some/other/compiler/runtime/directory
^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -63,7 +63,6 @@ on these ideas for each distinct build system that Spack supports:
build_systems/cudapackage
build_systems/custompackage
build_systems/inteloneapipackage
build_systems/intelpackage
build_systems/rocmpackage
build_systems/sourceforgepackage

View File

@@ -33,9 +33,6 @@ For more information on a specific package, do::
spack info --all <package-name>
Intel no longer releases new versions of Parallel Studio, which can be
used in Spack via the :ref:`intelpackage`. All of its components can
now be found in oneAPI.
Examples
========
@@ -50,34 +47,8 @@ Install the oneAPI compilers::
spack install intel-oneapi-compilers
Add the compilers to your ``compilers.yaml`` so spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
Verify that the compilers are available::
spack compiler list
Note that 2024 and later releases do not include ``icc``. Before 2024,
the package layout was different::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
The ``intel-oneapi-compilers`` package includes 2 families of
compilers:
* ``intel``: ``icc``, ``icpc``, ``ifort``. Intel's *classic*
compilers. 2024 and later releases contain ``ifort``, but not
``icc`` and ``icpc``.
* ``oneapi``: ``icx``, ``icpx``, ``ifx``. Intel's new generation of
compilers based on LLVM.
To build the ``patchelf`` Spack package with ``icc``, do::
spack install patchelf%intel
To build with with ``icx``, do ::
To build the ``patchelf`` Spack package with ``icx``, do::
spack install patchelf%oneapi
@@ -92,15 +63,6 @@ Install the oneAPI compilers::
spack install intel-oneapi-compilers
Add the compilers to your ``compilers.yaml`` so Spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
Verify that the compilers are available::
spack compiler list
Clone `spack-configs <https://github.com/spack/spack-configs>`_ repo and activate Intel oneAPI CPU environment::
git clone https://github.com/spack/spack-configs
@@ -149,7 +111,7 @@ Compilers
---------
To use the compilers, add some information about the installation to
``compilers.yaml``. For most users, it is sufficient to do::
``packages.yaml``. For most users, it is sufficient to do::
spack compiler add /opt/intel/oneapi/compiler/latest/bin
@@ -157,7 +119,7 @@ Adapt the paths above if you did not install the tools in the default
location. After adding the compilers, using them is the same
as if you had installed the ``intel-oneapi-compilers`` package.
Another option is to manually add the configuration to
``compilers.yaml`` as described in :ref:`Compiler configuration
``packages.yaml`` as described in :ref:`Compiler configuration
<compiler-config>`.
Before 2024, the directory structure was different::
@@ -200,15 +162,5 @@ You can also use Spack-installed libraries. For example::
Will update your environment CPATH, LIBRARY_PATH, and other
environment variables for building an application with oneMKL.
More information
================
This section describes basic use of oneAPI, especially if it has
changed compared to Parallel Studio. See :ref:`intelpackage` for more
information on :ref:`intel-virtual-packages`,
:ref:`intel-unrelated-packages`,
:ref:`intel-integrating-external-libraries`, and
:ref:`using-mkl-tips`.
.. _`Intel installers`: https://software.intel.com/content/www/us/en/develop/documentation/installation-guide-for-intel-oneapi-toolkits-linux/top.html

File diff suppressed because it is too large

View File

@@ -91,7 +91,7 @@ there are any other variables you need to set, you can do this in the
.. code-block:: python
def setup_build_environment(self, env):
def setup_build_environment(self, env: EnvironmentModifications) -> None:
env.set("PREFIX", prefix)
env.set("BLASLIB", spec["blas"].libs.ld_flags)

View File

@@ -12,8 +12,7 @@ The ``ROCmPackage`` is not a build system but a helper package. Like ``CudaPacka
it provides standard variants, dependencies, and conflicts to facilitate building
packages using GPUs though for AMD in this case.
You can find the source for this package (and suggestions for setting up your
``compilers.yaml`` and ``packages.yaml`` files) at
You can find the source for this package (and suggestions for setting up your ``packages.yaml`` file) at
`<https://github.com/spack/spack/blob/develop/lib/spack/spack/build_systems/rocm.py>`__.
^^^^^^^^

View File

@@ -225,8 +225,10 @@ def setup(sphinx):
("py:class", "llnl.util.lang.T"),
("py:class", "llnl.util.lang.KT"),
("py:class", "llnl.util.lang.VT"),
("py:class", "llnl.util.lang.ClassPropertyType"),
("py:obj", "llnl.util.lang.KT"),
("py:obj", "llnl.util.lang.VT"),
("py:obj", "llnl.util.lang.ClassPropertyType"),
]
# The reST default role (used for this markup: `text`) to use for all documents.

View File

@@ -148,15 +148,16 @@ this can expose you to attacks. Use at your own risk.
``ssl_certs``
--------------------
Path to custom certificates for SSL verification. The value can be a
filesystem path, or an environment variable that expands to an absolute file path.
The default value is set to the environment variable ``SSL_CERT_FILE``
to use the same syntax used by many other applications that automatically
detect custom certificates.
When ``url_fetch_method:curl`` the ``config:ssl_certs`` should resolve to
a single file. Spack will then set the environment variable ``CURL_CA_BUNDLE``
in the subprocess calling ``curl``.
If ``url_fetch_method:urllib`` then files and directories are supported i.e.
in the subprocess calling ``curl``. If additional ``curl`` arguments are required,
they can be set in the config, e.g. ``url_fetch_method:'curl -k -q'``.
If ``url_fetch_method:urllib`` then files and directories are supported i.e.
``config:ssl_certs:$SSL_CERT_FILE`` or ``config:ssl_certs:$SSL_CERT_DIR``
will work.
In all cases the expanded path must be absolute for Spack to use the certificates.

View File

@@ -11,7 +11,7 @@ Configuration Files
Spack has many configuration files. Here is a quick list of them, in
case you want to skip directly to specific docs:
* :ref:`compilers.yaml <compiler-config>`
* :ref:`packages.yaml <compiler-config>`
* :ref:`concretizer.yaml <concretizer-options>`
* :ref:`config.yaml <config-yaml>`
* :ref:`include.yaml <include-yaml>`
@@ -46,6 +46,12 @@ Each Spack configuration file is nested under a top-level section
corresponding to its name. So, ``config.yaml`` starts with ``config:``,
``mirrors.yaml`` starts with ``mirrors:``, etc.
.. tip::
Validation and autocompletion of Spack config files can be enabled in
your editor with the YAML language server. See `spack/schemas
<https://github.com/spack/schemas>`_ for more information.
.. _configuration-scopes:
--------------------
@@ -95,7 +101,7 @@ are six configuration scopes. From lowest to highest:
precedence over all other scopes.
Each configuration directory may contain several configuration files,
such as ``config.yaml``, ``compilers.yaml``, or ``mirrors.yaml``. When
such as ``config.yaml``, ``packages.yaml``, or ``mirrors.yaml``. When
configurations conflict, settings from higher-precedence scopes override
lower-precedence settings.

View File

@@ -0,0 +1,34 @@
.. Copyright Spack Project Developers. See COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _env-vars-yaml:
=============================================
Environment Variable Settings (env_vars.yaml)
=============================================
Spack allows you to include shell environment variable modifications
for a spack environment by including an ``env_vars.yaml``. Environment
variables can be modified by setting, unsetting, appending, and prepending
variables in the shell environment.
The changes to the shell environment will take effect when the spack
environment is activated.
For example:
.. code-block:: yaml
env_vars:
set:
ENVAR_TO_SET_IN_ENV_LOAD: "FOO"
unset:
ENVAR_TO_UNSET_IN_ENV_LOAD:
prepend_path:
PATH_LIST: "path/to/prepend"
append_path:
PATH_LIST: "path/to/append"
remove_path:
PATH_LIST: "path/to/remove"

View File

@@ -667,11 +667,11 @@ a ``packages.yaml`` file) could contain:
# ...
packages:
all:
compiler: [intel]
providers:
mpi: [openmpi]
# ...
This configuration sets the default compiler for all packages to
``intel``.
This configuration sets the default mpi provider to be openmpi.
^^^^^^^^^^^^^^^^^^^^^^^
Included configurations
@@ -686,7 +686,8 @@ the environment.
spack:
include:
- environment/relative/path/to/config.yaml
- https://github.com/path/to/raw/config/compilers.yaml
- path: https://github.com/path/to/raw/config/compilers.yaml
sha256: 26e871804a92cd07bb3d611b31b4156ae93d35b6a6d6e0ef3a67871fcb1d258b
- /absolute/path/to/packages.yaml
- path: /path/to/$os/$target/environment
optional: true
@@ -700,11 +701,11 @@ with the ``optional`` clause and conditional with the ``when`` clause. (See
Files are listed using paths to individual files or directories containing them.
Path entries may be absolute or relative to the environment or specified as
URLs. URLs to individual files need link to the **raw** form of the file's
URLs. URLs to individual files must link to the **raw** form of the file's
contents (e.g., `GitHub
<https://docs.github.com/en/repositories/working-with-files/using-files/viewing-and-understanding-files#viewing-or-copying-the-raw-file-content>`_
or `GitLab
<https://docs.gitlab.com/ee/api/repository_files.html#get-raw-file-from-repository>`_).
<https://docs.gitlab.com/ee/api/repository_files.html#get-raw-file-from-repository>`_) **and** include a valid sha256 for the file.
Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or schemes) are
supported. Spack-specific, environment and user path variables can be used.
(See :ref:`config-file-variables` for more information.)
@@ -999,6 +1000,28 @@ For example, the following environment has three root packages:
This allows for a much-needed reduction in redundancy between packages
and constraints.
-------------------------------
Modifying Environment Variables
-------------------------------
Spack Environments can modify the active shell's environment variables when activated. The environment can be
configured to set, unset, prepend, or append variables using the ``env_vars`` configuration in ``spack.yaml``
or through config scope files:
.. code-block:: yaml
spack:
env_vars:
set:
ENVAR_TO_SET_IN_ENV_LOAD: "FOO"
unset:
ENVAR_TO_UNSET_IN_ENV_LOAD:
prepend_path:
PATH_LIST: "path/to/prepend"
append_path:
PATH_LIST: "path/to/append"
remove_path:
PATH_LIST: "path/to/remove"
-----------------
Environment Views

View File

@@ -1,161 +0,0 @@
spack:
definitions:
- compiler-pkgs:
- 'llvm+clang@6.0.1 os=centos7'
- 'gcc@6.5.0 os=centos7'
- 'llvm+clang@6.0.1 os=ubuntu18.04'
- 'gcc@6.5.0 os=ubuntu18.04'
- pkgs:
- readline@7.0
# - xsdk@0.4.0
- compilers:
- '%gcc@5.5.0'
- '%gcc@6.5.0'
- '%gcc@7.3.0'
- '%clang@6.0.0'
- '%clang@6.0.1'
- oses:
- os=ubuntu18.04
- os=centos7
specs:
- matrix:
- [$pkgs]
- [$compilers]
- [$oses]
exclude:
- '%gcc@7.3.0 os=centos7'
- '%gcc@5.5.0 os=ubuntu18.04'
mirrors:
cloud_gitlab: https://mirror.spack.io
compilers:
# The .gitlab-ci.yml for this project picks a Docker container which does
# not have any compilers pre-built and ready to use, so we need to fake the
# existence of those here.
- compiler:
operating_system: centos7
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: gcc@5.5.0
target: x86_64
- compiler:
operating_system: centos7
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: gcc@6.5.0
target: x86_64
- compiler:
operating_system: centos7
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: clang@6.0.0
target: x86_64
- compiler:
operating_system: centos7
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: clang@6.0.1
target: x86_64
- compiler:
operating_system: ubuntu18.04
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: clang@6.0.0
target: x86_64
- compiler:
operating_system: ubuntu18.04
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: clang@6.0.1
target: x86_64
- compiler:
operating_system: ubuntu18.04
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: gcc@6.5.0
target: x86_64
- compiler:
operating_system: ubuntu18.04
modules: []
paths:
cc: /not/used
cxx: /not/used
f77: /not/used
fc: /not/used
spec: gcc@7.3.0
target: x86_64
gitlab-ci:
bootstrap:
- name: compiler-pkgs
compiler-agnostic: true
mappings:
- # spack-cloud-ubuntu
match:
# these are specs, if *any* match the spec under consideration, this
# 'mapping' will be used to generate the CI job
- os=ubuntu18.04
runner-attributes:
# 'tags' and 'image' go directly onto the job, 'variables' will
# be added to what we already necessarily create for the job as
# a part of the CI workflow
tags:
- spack-k8s
image:
name: scottwittenburg/spack_builder_ubuntu_18.04
entrypoint: [""]
- # spack-cloud-centos
match:
# these are specs, if *any* match the spec under consideration, this
# 'mapping' will be used to generate the CI job
- 'os=centos7'
runner-attributes:
tags:
- spack-k8s
image:
name: scottwittenburg/spack_builder_centos_7
entrypoint: [""]
cdash:
build-group: Release Testing
url: http://cdash
project: Spack Testing
site: Spack Docker-Compose Workflow
repos: []
upstreams: {}
modules:
enable: []
packages: {}
config: {}

View File

@@ -254,12 +254,11 @@ directory.
Compiler configuration
----------------------
Spack has the ability to build packages with multiple compilers and
compiler versions. Compilers can be made available to Spack by
specifying them manually in ``compilers.yaml`` or ``packages.yaml``,
or automatically by running ``spack compiler find``, but for
convenience Spack will automatically detect compilers the first time
it needs them.
Spack has the ability to build packages with multiple compilers and compiler versions.
Compilers can be made available to Spack by specifying them manually in ``packages.yaml``,
or automatically by running ``spack compiler find``.
For convenience, Spack will automatically detect compilers the first time it needs them,
if none is available.
.. _cmd-spack-compilers:
@@ -274,16 +273,11 @@ compilers`` or ``spack compiler list``:
$ spack compilers
==> Available compilers
-- gcc ---------------------------------------------------------
gcc@4.9.0 gcc@4.8.0 gcc@4.7.0 gcc@4.6.2 gcc@4.4.7
gcc@4.8.2 gcc@4.7.1 gcc@4.6.3 gcc@4.6.1 gcc@4.1.2
-- intel -------------------------------------------------------
intel@15.0.0 intel@14.0.0 intel@13.0.0 intel@12.1.0 intel@10.0
intel@14.0.3 intel@13.1.1 intel@12.1.5 intel@12.0.4 intel@9.1
intel@14.0.2 intel@13.1.0 intel@12.1.3 intel@11.1
intel@14.0.1 intel@13.0.1 intel@12.1.2 intel@10.1
-- clang -------------------------------------------------------
clang@3.4 clang@3.3 clang@3.2 clang@3.1
-- gcc ubuntu20.04-x86_64 ---------------------------------------
gcc@9.4.0 gcc@8.4.0 gcc@10.5.0
-- llvm ubuntu20.04-x86_64 --------------------------------------
llvm@12.0.0 llvm@11.0.0 llvm@10.0.0
Any of these compilers can be used to build Spack packages. More on
how this is done is in :ref:`sec-specs`.
@@ -302,16 +296,22 @@ An alias for ``spack compiler find``.
``spack compiler find``
^^^^^^^^^^^^^^^^^^^^^^^
Lists the compilers currently available to Spack. If you do not see
a compiler in this list, but you want to use it with Spack, you can
simply run ``spack compiler find`` with the path to where the
compiler is installed. For example:
If you do not see a compiler in the list shown by:
.. code-block:: console
$ spack compiler find /usr/local/tools/ic-13.0.079
==> Added 1 new compiler to ~/.spack/linux/compilers.yaml
intel@13.0.079
$ spack compiler list
but you want to use it with Spack, you can simply run ``spack compiler find`` with the
path to where the compiler is installed. For example:
.. code-block:: console
$ spack compiler find /opt/intel/oneapi/compiler/2025.1/bin/
==> Added 1 new compiler to /home/user/.spack/packages.yaml
intel-oneapi-compilers@2025.1.0
==> Compilers are defined in the following files:
/home/user/.spack/packages.yaml
Or you can run ``spack compiler find`` with no arguments to force
auto-detection. This is useful if you do not know where compilers are
@@ -322,7 +322,7 @@ installed, but you know that new compilers have been added to your
$ module load gcc/4.9.0
$ spack compiler find
==> Added 1 new compiler to ~/.spack/linux/compilers.yaml
==> Added 1 new compiler to /home/user/.spack/packages.yaml
gcc@4.9.0
This loads the environment module for gcc-4.9.0 to add it to
@@ -331,7 +331,7 @@ This loads the environment module for gcc-4.9.0 to add it to
.. note::
By default, spack does not fill in the ``modules:`` field in the
``compilers.yaml`` file. If you are using a compiler from a
``packages.yaml`` file. If you are using a compiler from a
module, then you should add this field manually.
See the section on :ref:`compilers-requiring-modules`.
@@ -341,91 +341,82 @@ This loads the environment module for gcc-4.9.0 to add it to
``spack compiler info``
^^^^^^^^^^^^^^^^^^^^^^^
If you want to see specifics on a particular compiler, you can run
``spack compiler info`` on it:
If you want to see additional information on some specific compilers, you can run ``spack compiler info`` on them:
.. code-block:: console
$ spack compiler info intel@15
intel@15.0.0:
paths:
cc = /usr/local/bin/icc-15.0.090
cxx = /usr/local/bin/icpc-15.0.090
f77 = /usr/local/bin/ifort-15.0.090
fc = /usr/local/bin/ifort-15.0.090
modules = []
operating_system = centos6
...
$ spack compiler info gcc
gcc@=8.4.0 languages='c,c++,fortran' arch=linux-ubuntu20.04-x86_64:
prefix: /usr
compilers:
c: /usr/bin/gcc-8
cxx: /usr/bin/g++-8
fortran: /usr/bin/gfortran-8
This shows which C, C++, and Fortran compilers were detected by Spack.
Notice also that we didn't have to be too specific about the
version. We just said ``intel@15``, and information about the only
matching Intel compiler was displayed.
gcc@=9.4.0 languages='c,c++,fortran' arch=linux-ubuntu20.04-x86_64:
prefix: /usr
compilers:
c: /usr/bin/gcc
cxx: /usr/bin/g++
fortran: /usr/bin/gfortran
gcc@=10.5.0 languages='c,c++,fortran' arch=linux-ubuntu20.04-x86_64:
prefix: /usr
compilers:
c: /usr/bin/gcc-10
cxx: /usr/bin/g++-10
fortran: /usr/bin/gfortran-10
This shows the details of the compilers that were detected by Spack.
Notice also that we didn't have to be too specific about the version. We just said ``gcc``, and we got information
about all the matching compilers.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Manual compiler configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If auto-detection fails, you can manually configure a compiler by
editing your ``~/.spack/<platform>/compilers.yaml`` file. You can do this by running
``spack config edit compilers``, which will open the file in
If auto-detection fails, you can manually configure a compiler by editing your ``~/.spack/packages.yaml`` file.
You can do this by running ``spack config edit packages``, which will open the file in
:ref:`your favorite editor <controlling-the-editor>`.
Each compiler configuration in the file looks like this:
Each compiler has an "external" entry in the file with some ``extra_attributes``:
.. code-block:: yaml
compilers:
- compiler:
modules: []
operating_system: centos6
paths:
cc: /usr/local/bin/icc-15.0.024-beta
cxx: /usr/local/bin/icpc-15.0.024-beta
f77: /usr/local/bin/ifort-15.0.024-beta
fc: /usr/local/bin/ifort-15.0.024-beta
spec: intel@15.0.0
packages:
gcc:
externals:
- spec: gcc@10.5.0 languages='c,c++,fortran'
prefix: /usr
extra_attributes:
compilers:
c: /usr/bin/gcc-10
cxx: /usr/bin/g++-10
fortran: /usr/bin/gfortran-10
For compilers that do not support Fortran (like ``clang``), put
``None`` for ``f77`` and ``fc``:
The compiler executables are listed under ``extra_attributes:compilers``, and are keyed by language.
Once you save the file, the configured compilers will show up in the list displayed by ``spack compilers``.
.. code-block:: yaml
compilers:
- compiler:
modules: []
operating_system: centos6
paths:
cc: /usr/bin/clang
cxx: /usr/bin/clang++
f77: None
fc: None
spec: clang@3.3svn
Once you save the file, the configured compilers will show up in the
list displayed by ``spack compilers``.
You can also add compiler flags to manually configured compilers. These
flags should be specified in the ``flags`` section of the compiler
specification. The valid flags are ``cflags``, ``cxxflags``, ``fflags``,
You can also add compiler flags to manually configured compilers. These flags should be specified in the
``flags`` section of the compiler specification. The valid flags are ``cflags``, ``cxxflags``, ``fflags``,
``cppflags``, ``ldflags``, and ``ldlibs``. For example:
.. code-block:: yaml
compilers:
- compiler:
modules: []
operating_system: centos6
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags:
cflags: -O3 -fPIC
cxxflags: -O3 -fPIC
cppflags: -O3 -fPIC
spec: gcc@4.7.2
packages:
gcc:
externals:
- spec: gcc@10.5.0 languages='c,c++,fortran'
prefix: /usr
extra_attributes:
compilers:
c: /usr/bin/gcc-10
cxx: /usr/bin/g++-10
fortran: /usr/bin/gfortran-10
flags:
cflags: -O3 -fPIC
cxxflags: -O3 -fPIC
cppflags: -O3 -fPIC
These flags will be treated by spack as if they were entered from
the command line each time this compiler is used. The compiler wrappers
@@ -440,95 +431,44 @@ These variables should be specified in the ``environment`` section of the compil
specification. The operations available to modify the environment are ``set``, ``unset``,
``prepend_path``, ``append_path``, and ``remove_path``. For example:
.. code-block:: yaml
compilers:
- compiler:
modules: []
operating_system: centos6
paths:
cc: /opt/intel/oneapi/compiler/latest/linux/bin/icx
cxx: /opt/intel/oneapi/compiler/latest/linux/bin/icpx
f77: /opt/intel/oneapi/compiler/latest/linux/bin/ifx
fc: /opt/intel/oneapi/compiler/latest/linux/bin/ifx
spec: oneapi@latest
environment:
set:
MKL_ROOT: "/path/to/mkl/root"
unset: # A list of environment variables to unset
- CC
prepend_path: # Similar for append|remove_path
LD_LIBRARY_PATH: /ld/paths/added/by/setvars/sh
.. note::
Spack is in the process of moving compilers from a separate
attribute to be handled like all other packages. As part of this
process, the ``compilers.yaml`` section will eventually be replaced
by configuration in the ``packages.yaml`` section. This new
configuration is now available, although it is not yet the default
behavior.
Compilers can also be configured as external packages in the
``packages.yaml`` config file. Any external package for a compiler
(e.g. ``gcc`` or ``llvm``) will be treated as a configured compiler
assuming the paths to the compiler executables are determinable from
the prefix.
If the paths to the compiler executable are not determinable from the
prefix, you can add them to the ``extra_attributes`` field. Similarly,
all other fields from the compilers config can be added to the
``extra_attributes`` field for an external representing a compiler.
Note that the format for the ``paths`` field in the
``extra_attributes`` section is different than in the ``compilers``
config. For compilers configured as external packages, the section is
named ``compilers`` and the dictionary maps language names (``c``,
``cxx``, ``fortran``) to paths, rather than using the names ``cc``,
``fc``, and ``f77``.
.. code-block:: yaml
packages:
gcc:
external:
- spec: gcc@12.2.0 arch=linux-rhel8-skylake
prefix: /usr
extra_attributes:
environment:
set:
GCC_ROOT: /usr
external:
- spec: llvm+clang@15.0.0 arch=linux-rhel8-skylake
prefix: /usr
intel-oneapi-compilers:
externals:
- spec: intel-oneapi-compilers@2025.1.0
prefix: /opt/intel/oneapi
extra_attributes:
compilers:
c: /usr/bin/clang-with-suffix
cxx: /usr/bin/clang++-with-extra-info
fortran: /usr/bin/gfortran
extra_rpaths:
- /usr/lib/llvm/
c: /opt/intel/oneapi/compiler/2025.1/bin/icx
cxx: /opt/intel/oneapi/compiler/2025.1/bin/icpx
fortran: /opt/intel/oneapi/compiler/2025.1/bin/ifx
environment:
set:
MKL_ROOT: "/path/to/mkl/root"
unset: # A list of environment variables to unset
- CC
prepend_path: # Similar for append|remove_path
LD_LIBRARY_PATH: /ld/paths/added/by/setvars/sh
^^^^^^^^^^^^^^^^^^^^^^^
Build Your Own Compiler
^^^^^^^^^^^^^^^^^^^^^^^
If you are particular about which compiler/version you use, you might
wish to have Spack build it for you. For example:
If you are particular about which compiler/version you use, you might wish to have Spack build it for you.
For example:
.. code-block:: console
$ spack install gcc@4.9.3
$ spack install gcc@14+binutils
Once the compiler is installed, you can start using it without additional configuration:
.. code-block:: console
$ spack install hdf5~mpi %gcc@14
The same holds true for compilers that are made available from buildcaches, when reusing them is allowed.
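If you would rather make this compiler the default for every build, one possible approach is a requirement in ``packages.yaml``.
This is a minimal sketch, assuming the ``require`` syntax; adjust the version to the one you actually installed:
.. code-block:: yaml
packages:
  all:
    require:
    - "%gcc@14"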
.. _compilers-requiring-modules:
@@ -536,30 +476,26 @@ by adding the following to your ``packages.yaml`` file:
Compilers Requiring Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Many installed compilers will work regardless of the environment they are called with.
However, some installed compilers require environment variables (e.g. ``LD_LIBRARY_PATH``) to be set in order to run;
this is typical for Intel and other proprietary compilers.
On typical HPC clusters, these environment modifications are usually delegated to some "module" system.
In such a case, you should tell Spack which module(s) to load in order to run the chosen compiler
(if the compiler does not come with a module file, you might consider making one by hand).
Spack will load this module ONLY when the compiler is run, and NOT in general for a package's ``install()`` method:
.. code-block:: yaml
packages:
gcc:
externals:
- spec: gcc@10.5.0 languages='c,c++,fortran'
prefix: /opt/compilers
extra_attributes:
compilers:
c: /opt/compilers/bin/gcc-10
cxx: /opt/compilers/bin/g++-10
fortran: /opt/compilers/bin/gfortran-10
modules: [gcc/10.5.0]
Some compilers require special environment settings to be loaded not just
to run, but also to execute the code they build, breaking packages that
@@ -580,7 +516,7 @@ Licensed Compilers
^^^^^^^^^^^^^^^^^^
Some proprietary compilers require licensing to use. If you need to
use a licensed compiler, the process is similar to a mix of
build your own, plus modules:
#. Create a Spack package (if it doesn't exist already) to install
@@ -590,24 +526,21 @@ build your own, plus modules:
using Spack to load the module it just created, and running simple
builds (e.g., ``cc helloWorld.c && ./a.out``)
#. Add the newly-installed compiler to ``packages.yaml`` as shown above; a sketch follows this list.
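As a sketch only, such an entry can combine the compiler paths, a module, and the licensing environment;
the package name, paths, module name, and license variable below are all hypothetical:
.. code-block:: yaml
packages:
  mycompiler: # hypothetical licensed compiler
    externals:
    - spec: mycompiler@1.0
      prefix: /opt/vendor/compiler
      extra_attributes:
        compilers:
          c: /opt/vendor/compiler/bin/vcc
          cxx: /opt/vendor/compiler/bin/vc++
          fortran: /opt/vendor/compiler/bin/vfort
        modules: [vendor-compiler/1.0]
        environment:
          set:
            LM_LICENSE_FILE: /path/to/license.dat # hypothetical license file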
.. _mixed-toolchains:
^^^^^^^^^^^^^^^^^^^^^^^^^^
Fortran compilers on macOS
^^^^^^^^^^^^^^^^^^^^^^^^^^
Modern compilers typically come with related compilers for C, C++ and
Fortran bundled together. When possible, results are best if the same
compiler is used for all languages.
In some cases, this is not possible. For example, Xcode on macOS provides no Fortran compilers.
The user is therefore forced to use a mixed toolchain: Xcode-provided Clang for C/C++ and e.g.
GNU ``gfortran`` for Fortran.
#. You need to make sure that Xcode is installed. Run the following command:
@@ -660,45 +593,25 @@ Fortran.
Note: the flag is ``-license``, not ``--license``.
#. There are different ways to get ``gfortran`` on macOS. For example, you can
install GCC with Spack (``spack install gcc``), with Homebrew (``brew install
gcc``), or from a `DMG installer
<https://github.com/fxcoudert/gfortran-for-macOS/releases>`_.
#. Run ``spack compiler find`` to locate both Apple-Clang and GCC.
Since languages in Spack are modeled as virtual packages, ``apple-clang`` will be used to provide
C and C++, while GCC will be used for Fortran.
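If concretization ever picks an unwanted provider, a hedged sketch of pinning the providers explicitly is to place
requirements on the language virtuals in ``packages.yaml`` (assuming requirements are allowed on virtuals in your Spack version):
.. code-block:: yaml
packages:
  c:
    require: [apple-clang]
  cxx:
    require: [apple-clang]
  fortran:
    require: [gcc]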
^^^^^^^^^^^^^^^^^^^^^
Compiler Verification
^^^^^^^^^^^^^^^^^^^^^
You can verify that your compilers are configured properly by installing a simple package. For example:
.. code-block:: console
$ spack install zlib%gcc@5.3.0
$ spack install zlib-ng%gcc@5.3.0
.. _vendor-specific-compiler-configuration:
@@ -707,9 +620,7 @@ simple package. For example:
Vendor-Specific Compiler Configuration
--------------------------------------
With Spack, things usually "just work" with GCC. Not so for other compilers.
This section provides details on how to get vendor-specific compilers working.
^^^^^^^^^^^^^^^
Intel Compilers
@@ -731,8 +642,8 @@ compilers:
you have installed from the ``PATH`` environment variable.
If you want to use a version of ``gcc`` or ``g++`` other than the default
version on your system, you need to use either the ``--gcc-install-dir``
or ``--gcc-toolchain`` compiler option to specify the path to the version of
``gcc`` or ``g++`` that you want to use."
-- `Intel Reference Guide <https://software.intel.com/en-us/node/522750>`_
@@ -740,76 +651,12 @@ compilers:
Intel compilers may therefore be configured in one of two ways with
Spack: using modules, or using compiler flags.
""""""""""""""""""""""""""
Configuration with Modules
""""""""""""""""""""""""""
One can control which GCC is seen by the Intel compiler with modules.
A module must be loaded both for the Intel Compiler (so it will run)
and GCC (so the compiler can find the intended GCC). The following
configuration in ``compilers.yaml`` illustrates this technique:
.. code-block:: yaml
compilers:
- compiler:
modules: [gcc-4.9.3, intel-15.0.24]
operating_system: centos7
paths:
cc: /opt/intel-15.0.24/bin/icc-15.0.24-beta
cxx: /opt/intel-15.0.24/bin/icpc-15.0.24-beta
f77: /opt/intel-15.0.24/bin/ifort-15.0.24-beta
fc: /opt/intel-15.0.24/bin/ifort-15.0.24-beta
spec: intel@15.0.24.4.9.3
.. note::
The version number on the Intel compiler is a combination of
the "native" Intel version number and the GNU compiler it is
targeting.
""""""""""""""""""""""""""
Command Line Configuration
""""""""""""""""""""""""""
One can also control which GCC is seen by the Intel compiler by adding
flags to the ``icc`` command:
#. Identify the location of the compiler you just installed:
.. code-block:: console
$ spack location --install-dir gcc
~/spack/opt/spack/linux-centos7-x86_64/gcc-4.9.3-iy4rw...
#. Set up ``compilers.yaml``, for example:
.. code-block:: yaml
compilers:
- compiler:
modules: [intel-15.0.24]
operating_system: centos7
paths:
cc: /opt/intel-15.0.24/bin/icc-15.0.24-beta
cxx: /opt/intel-15.0.24/bin/icpc-15.0.24-beta
f77: /opt/intel-15.0.24/bin/ifort-15.0.24-beta
fc: /opt/intel-15.0.24/bin/ifort-15.0.24-beta
flags:
cflags: -gcc-name ~/spack/opt/spack/linux-centos7-x86_64/gcc-4.9.3-iy4rw.../bin/gcc
cxxflags: -gxx-name ~/spack/opt/spack/linux-centos7-x86_64/gcc-4.9.3-iy4rw.../bin/g++
fflags: -gcc-name ~/spack/opt/spack/linux-centos7-x86_64/gcc-4.9.3-iy4rw.../bin/gcc
spec: intel@15.0.24.4.9.3
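In the newer ``packages.yaml`` format, the same idea can be sketched by attaching flags to the external compiler entry,
mirroring the NAG example below; the flag placement and all paths here are illustrative assumptions, not verified syntax:
.. code-block:: yaml
packages:
  intel-oneapi-compilers:
    externals:
    - spec: intel-oneapi-compilers@2025.1.0
      prefix: /opt/intel/oneapi
      extra_attributes:
        flags:
          cflags: --gcc-install-dir=/path/to/gcc/prefix # illustrative path
          cxxflags: --gcc-install-dir=/path/to/gcc/prefix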
^^^
NAG
^^^
The Numerical Algorithms Group provides a licensed Fortran compiler.
It is recommended to use GCC for your C/C++ compilers.
The NAG Fortran compilers are a bit stricter than other compilers, and many
packages will fail to install with error messages like:
@@ -826,44 +673,40 @@ the command line:
$ spack install openmpi fflags="-mismatch"
Or it can be set permanently in your ``packages.yaml``:
.. code-block:: yaml
packages:
nag:
externals:
- spec: nag@6.1
prefix: /opt/nag/bin
extra_attributes:
compilers:
fortran: /opt/nag/bin/nagfor
flags:
fflags: -mismatch
---------------
System Packages
---------------
Once compilers are configured, one needs to determine which pre-installed system packages,
if any, to use in builds. These are also configured in the ``~/.spack/packages.yaml`` file.
For example, to use an OpenMPI installed in ``/opt/local``, one would use:
.. code-block:: yaml
packages:
  openmpi:
    buildable: False
    externals:
    - spec: openmpi@1.10.1
      prefix: /opt/local
In general, *Spack is easier to use and more reliable if it builds all of its own dependencies*.
However, there are several packages for which one commonly needs to use system versions:
^^^
MPI
@@ -876,8 +719,7 @@ you are unlikely to get a working MPI from Spack. Instead, use an
appropriate pre-installed MPI.
If you choose a pre-installed MPI, you should consider using the pre-installed compiler used to build that MPI.
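One way to record that choice (a sketch; the version, compiler, and prefix below are illustrative) is to annotate
the external spec with the compiler it was built with:
.. code-block:: yaml
packages:
  openmpi:
    buildable: False
    externals:
    - spec: "openmpi@4.1.5 %gcc@11.2.0"
      prefix: /opt/local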
^^^^^^^
OpenSSL
@@ -1441,9 +1283,9 @@ To configure Spack, first run the following command inside the Spack console:
spack compiler find
This creates a ``.staging`` directory in our Spack prefix, along with a ``windows`` subdirectory
containing a ``packages.yaml`` file. On a fresh Windows install with the above packages installed,
this command should only detect Microsoft Visual Studio, and the Intel Fortran compiler will be
integrated within the first version of MSVC present in the ``packages.yaml`` output.
Spack provides a default ``config.yaml`` file for Windows that it will use unless overridden.
View File
@@ -23,7 +23,6 @@ components for use by dependent packages:
packages:
all:
compiler: [rocmcc@=5.3.0]
variants: amdgpu_target=gfx90a
hip:
buildable: false
@@ -70,16 +69,15 @@ This is in combination with the following compiler definition:
.. code-block:: yaml
packages:
llvm-amdgpu:
externals:
- spec: llvm-amdgpu@=5.3.0
prefix: /opt/rocm-5.3.0
compilers:
c: /opt/rocm-5.3.0/bin/amdclang
cxx: /opt/rocm-5.3.0/bin/amdclang++
fortran: null
This includes the following considerations:
View File
@@ -43,6 +43,20 @@ or specified as URLs. Only the ``file``, ``ftp``, ``http`` and ``https`` protoco
schemes) are supported. Spack-specific, environment and user path variables
can be used. (See :ref:`config-file-variables` for more information.)
A ``sha256`` is required for remote file URLs and must be specified as follows:
.. code-block:: yaml
include:
- path: https://github.com/path/to/raw/config/compilers.yaml
sha256: 26e871804a92cd07bb3d611b31b4156ae93d35b6a6d6e0ef3a67871fcb1d258b
Additionally, remote file URLs must link to the **raw** form of the file's
contents (e.g., `GitHub
<https://docs.github.com/en/repositories/working-with-files/using-files/viewing-and-understanding-files#viewing-or-copying-the-raw-file-content>`_
or `GitLab
<https://docs.gitlab.com/ee/api/repository_files.html#get-raw-file-from-repository>`_).
.. warning::
Recursive includes are not currently processed in a breadth-first manner
View File
@@ -75,6 +75,7 @@ or refer to the full manual below.
packages_yaml
build_settings
environments
env_vars_yaml
containers
mirrors
module_file_support
View File
@@ -128,7 +128,7 @@ depend on the spec:
.. code-block:: python
def setup_run_environment(self, env):
def setup_run_environment(self, env: EnvironmentModifications) -> None:
if self.spec.satisfies("+foo"):
env.set("FOO", "bar")
@@ -142,7 +142,7 @@ For example, a simplified version of the ``python`` package could look like this
.. code-block:: python
def setup_dependent_run_environment(self, env, dependent_spec):
def setup_dependent_run_environment(self, env: EnvironmentModifications, dependent_spec: Spec) -> None:
if dependent_spec.package.extends(self.spec):
env.prepend_path("PYTHONPATH", dependent_spec.prefix.lib.python)
View File
@@ -557,14 +557,13 @@ preferences.
FAQ: :ref:`Why does Spack pick particular versions and variants? <faq-concretizer-precedence>`
The ``target`` and ``providers`` preferences
can only be set globally under the ``all`` section of ``packages.yaml``:
.. code-block:: yaml
packages:
all:
target: [x86_64_v3]
providers:
mpi: [mvapich2, mpich, openmpi]
View File
@@ -5,9 +5,9 @@ sphinx-rtd-theme==3.0.2
python-levenshtein==0.27.1
docutils==0.21.2
pygments==2.19.1
urllib3==2.3.0
urllib3==2.4.0
pytest==8.3.5
isort==6.0.1
black==25.1.0
flake8==7.1.2
flake8==7.2.0
mypy==1.11.1
View File
@@ -11,6 +11,7 @@
* Homepage: https://altgraph.readthedocs.io/en/latest/index.html
* Usage: dependency of macholib
* Version: 0.17.3
* License: MIT
archspec
--------
@@ -18,6 +19,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.5 (commit 38ce485258ffc4fc6dd6688f8dc90cb269478c47)
* License: Apache-2.0 or MIT
astunparse
----------------
@@ -25,6 +27,7 @@
* Homepage: https://github.com/simonpercivall/astunparse
* Usage: Unparsing Python ASTs for package hashes in Spack
* Version: 1.6.3 (plus modifications)
* License: PSF-2.0
* Note: This is in ``spack.util.unparse`` because it's very heavily
modified, and we want to track coverage for it.
Specifically, we have modified this library to generate consistent unparsed ASTs
@@ -41,6 +44,7 @@
* Homepage: https://github.com/python-attrs/attrs
* Usage: Needed by jsonschema.
* Version: 22.1.0
* License: MIT
ctest_log_parser
----------------
@@ -48,6 +52,7 @@
* Homepage: https://github.com/Kitware/CMake/blob/master/Source/CTest/cmCTestBuildHandler.cxx
* Usage: Functions to parse build logs and extract error messages.
* Version: Unversioned
* License: BSD-3-Clause
* Note: This is a homemade port of Kitware's CTest build handler.
distro
@@ -56,6 +61,7 @@
* Homepage: https://pypi.python.org/pypi/distro
* Usage: Provides a more stable linux distribution detection.
* Version: 1.8.0
* License: Apache-2.0
jinja2
------
@@ -63,6 +69,7 @@
* Homepage: https://pypi.python.org/pypi/Jinja2
* Usage: A modern and designer-friendly templating language for Python.
* Version: 3.0.3 (last version supporting Python 3.6)
* License: BSD-3-Clause
jsonschema
----------
@@ -70,6 +77,7 @@
* Homepage: https://pypi.python.org/pypi/jsonschema
* Usage: An implementation of JSON Schema for Python.
* Version: 3.2.0 (last version before 2.7 and 3.6 support was dropped)
* License: MIT
* Note: We don't include tests or benchmarks; just what Spack needs.
macholib
@@ -78,6 +86,7 @@
* Homepage: https://macholib.readthedocs.io/en/latest/index.html#
* Usage: Manipulation of Mach-o binaries for relocating macOS buildcaches on Linux
* Version: 1.16.2
* License: MIT
markupsafe
----------
@@ -85,6 +94,7 @@
* Homepage: https://pypi.python.org/pypi/MarkupSafe
* Usage: Implements a XML/HTML/XHTML Markup safe string for Python.
* Version: 2.0.1 (last version supporting Python 3.6)
* License: BSD-3-Clause
pyrsistent
----------
@@ -92,6 +102,7 @@
* Homepage: http://github.com/tobgu/pyrsistent/
* Usage: Needed by `jsonschema`
* Version: 0.18.0
* License: MIT
ruamel.yaml
-----------
@@ -101,6 +112,7 @@
actively maintained and has more features, including round-tripping
comments read from config files.
* Version: 0.17.21
* License: MIT
six
---
@@ -108,5 +120,6 @@
* Homepage: https://pypi.python.org/pypi/six
* Usage: Python 2 and 3 compatibility utilities.
* Version: 1.16.0
* License: MIT
"""
View File
@@ -764,7 +764,7 @@ def copy_tree(
files = glob.glob(src)
if not files:
raise OSError("No such file or directory: '{0}'".format(src))
raise OSError("No such file or directory: '{0}'".format(src), errno.ENOENT)
# For Windows hard-links and junctions, the source path must exist to make a symlink. Add
# all symlinks to this list while traversing the tree, then when finished, make all
View File
@@ -15,7 +15,19 @@
import typing
import warnings
from datetime import datetime, timedelta
from typing import Callable, Dict, Iterable, List, Mapping, Optional, Tuple, TypeVar
from typing import (
Any,
Callable,
Dict,
Generic,
Iterable,
List,
Mapping,
Optional,
Tuple,
TypeVar,
Union,
)
# Ignore emacs backups when listing modules
ignore_modules = r"^\.#|~$"
@@ -1047,19 +1059,28 @@ def __exit__(self, exc_type, exc_value, tb):
return True
class classproperty:
ClassPropertyType = TypeVar("ClassPropertyType")
class classproperty(Generic[ClassPropertyType]):
"""Non-data descriptor to evaluate a class-level property. The function that performs
the evaluation is injected at creation time and take an instance (could be None) and
an owner (i.e. the class that originated the instance)
the evaluation is injected at creation time and takes an owner (i.e., the class that
originated the instance).
"""
def __init__(self, callback):
def __init__(self, callback: Callable[[Any], ClassPropertyType]) -> None:
self.callback = callback
def __get__(self, instance, owner):
def __get__(self, instance, owner) -> ClassPropertyType:
return self.callback(owner)
#: A type alias that represents either a classproperty descriptor or a constant value of the same
#: type. This allows derived classes to override a computed class-level property with a constant
#: value while retaining type compatibility.
ClassProperty = Union[ClassPropertyType, classproperty[ClassPropertyType]]
class DeprecatedProperty:
"""Data descriptor to error or warn when a deprecated property is accessed.
View File
@@ -7,7 +7,7 @@
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compiler-classic": "intel",
"intel-oneapi-compilers-classic": "intel",
"acfl": "arm",
}
@@ -15,6 +15,6 @@
"clang": "llvm",
"oneapi": "intel-oneapi-compilers",
"rocmcc": "llvm-amdgpu",
"intel": "intel-oneapi-compiler-classic",
"intel": "intel-oneapi-compilers-classic",
"arm": "acfl",
}
View File
@@ -133,7 +133,7 @@ def mypy_root_spec() -> str:
def black_root_spec() -> str:
"""Return the root spec used to bootstrap black"""
return _root_spec("py-black@:24.1.0")
return _root_spec("py-black@:25.1.0")
def flake8_root_spec() -> str:
View File
@@ -36,9 +36,11 @@
import multiprocessing
import os
import re
import signal
import sys
import traceback
import types
import warnings
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
@@ -572,12 +574,10 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
module.make = DeprecatedExecutable(pkg.name, "make", "gmake")
module.gmake = DeprecatedExecutable(pkg.name, "gmake", "gmake")
module.ninja = DeprecatedExecutable(pkg.name, "ninja", "ninja")
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own
if sys.platform == "win32":
module.nmake = Executable("nmake")
module.msbuild = Executable("msbuild")
module.nmake = DeprecatedExecutable(pkg.name, "nmake", "msvc")
module.msbuild = DeprecatedExecutable(pkg.name, "msbuild", "msvc")
# analog to configure for win32
module.cscript = Executable("cscript")
@@ -1189,11 +1189,9 @@ def _setup_pkg_and_run(
if isinstance(e, (spack.multimethod.NoSuchMethodError, AttributeError)):
process = "test the installation" if context == "test" else "build from sources"
error_msg = (
"The '{}' package cannot find an attribute while trying to {}. "
"This might be due to a change in Spack's package format "
"to support multiple build-systems for a single package. You can fix this "
"by updating the {} recipe, and you can also report the issue as a bug. "
"More information at https://spack.readthedocs.io/en/latest/packaging_guide.html#installation-procedure"
"The '{}' package cannot find an attribute while trying to {}. You can fix this "
"by updating the {} recipe, and you can also report the issue as a build-error or "
"a bug at https://github.com/spack/spack/issues"
).format(pkg.name, process, context)
error_msg = colorize("@*R{{{}}}".format(error_msg))
error_msg = "{}\n\n{}".format(str(e), error_msg)
@@ -1218,15 +1216,45 @@ def _setup_pkg_and_run(
input_pipe.close()
def start_build_process(pkg, function, kwargs):
class BuildProcess:
def __init__(self, *, target, args) -> None:
self.p = multiprocessing.Process(target=target, args=args)
def start(self) -> None:
self.p.start()
def is_alive(self) -> bool:
return self.p.is_alive()
def join(self, *, timeout: Optional[int] = None):
self.p.join(timeout=timeout)
def terminate(self):
# Opportunity for graceful termination
self.p.terminate()
self.p.join(timeout=1)
# If the process didn't gracefully terminate, forcefully kill
if self.p.is_alive():
# TODO (python 3.6 removal): use self.p.kill() instead, consider removing this class
assert isinstance(self.p.pid, int), f"unexpected value for PID: {self.p.pid}"
os.kill(self.p.pid, signal.SIGKILL)
self.p.join()
@property
def exitcode(self):
return self.p.exitcode
def start_build_process(pkg, function, kwargs, *, timeout: Optional[int] = None):
"""Create a child process to do part of a spack build.
Args:
pkg (spack.package_base.PackageBase): package whose environment we should set up the
child process for.
function (typing.Callable): argless function to run in the child
process.
function (typing.Callable): argless function to run in the child process.
timeout: maximum time allowed to finish the execution of function
Usage::
@@ -1254,14 +1282,14 @@ def child_fun():
# Forward sys.stdin when appropriate, to allow toggling verbosity
if sys.platform != "win32" and sys.stdin.isatty() and hasattr(sys.stdin, "fileno"):
input_fd = Connection(os.dup(sys.stdin.fileno()))
mflags = os.environ.get("MAKEFLAGS", False)
if mflags:
mflags = os.environ.get("MAKEFLAGS")
if mflags is not None:
m = re.search(r"--jobserver-[^=]*=(\d),(\d)", mflags)
if m:
jobserver_fd1 = Connection(int(m.group(1)))
jobserver_fd2 = Connection(int(m.group(2)))
p = multiprocessing.Process(
p = BuildProcess(
target=_setup_pkg_and_run,
args=(
serialized_pkg,
@@ -1295,14 +1323,17 @@ def exitcode_msg(p):
typ = "exit" if p.exitcode >= 0 else "signal"
return f"{typ} {abs(p.exitcode)}"
p.join(timeout=timeout)
if p.is_alive():
warnings.warn(f"Terminating process, since the timeout of {timeout}s was exceeded")
p.terminate()
p.join()
try:
child_result = read_pipe.recv()
except EOFError:
p.join()
raise InstallError(f"The process has stopped unexpectedly ({exitcode_msg(p)})")
p.join()
# If returns a StopPhase, raise it
if isinstance(child_result, spack.error.StopPhase):
# do not print
View File
@@ -16,6 +16,7 @@
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
@@ -846,7 +847,9 @@ def _remove_libtool_archives(self) -> None:
with open(self._removed_la_files_log, mode="w", encoding="utf-8") as f:
f.write("\n".join(libtool_files))
def setup_build_environment(self, env):
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
if self.spec.platform == "darwin" and macos_version() >= Version("11"):
# Many configure files rely on matching '10.*' for macOS version
# detection and fail to add flags if it shows as version 11.
View File
@@ -2,9 +2,10 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
import enum
import os
import re
from typing import Tuple
from typing import Optional, Tuple
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -13,6 +14,7 @@
import spack.spec
import spack.util.prefix
from spack.directives import depends_on
from spack.util.executable import which_string
from .cmake import CMakeBuilder, CMakePackage
@@ -178,6 +180,64 @@ def initconfig_compiler_entries(self):
return entries
class Scheduler(enum.Enum):
LSF = enum.auto()
SLURM = enum.auto()
FLUX = enum.auto()
def get_scheduler(self) -> Optional[Scheduler]:
spec = self.pkg.spec
# Check for Spectrum-mpi, which always uses LSF or LSF MPI variant
if spec.satisfies("^spectrum-mpi") or spec["mpi"].satisfies("schedulers=lsf"):
return self.Scheduler.LSF
# Check for Slurm MPI variants
slurm_checks = ["+slurm", "schedulers=slurm", "process_managers=slurm"]
if any(spec["mpi"].satisfies(variant) for variant in slurm_checks):
return self.Scheduler.SLURM
# TODO improve this when MPI implementations support flux
# Do this check last to avoid using a flux wrapper present next to Slurm/LSF schedulers
if which_string("flux") is not None:
return self.Scheduler.FLUX
return None
def get_mpi_exec(self) -> Optional[str]:
spec = self.pkg.spec
scheduler = self.get_scheduler()
if scheduler == self.Scheduler.LSF:
return which_string("lrun")
elif scheduler == self.Scheduler.SLURM:
if spec["mpi"].external:
return which_string("srun")
else:
return os.path.join(spec["slurm"].prefix.bin, "srun")
elif scheduler == self.Scheduler.FLUX:
flux = which_string("flux")
return f"{flux};run" if flux else None
elif hasattr(spec["mpi"].package, "mpiexec"):
return spec["mpi"].package.mpiexec
else:
mpiexec = os.path.join(spec["mpi"].prefix.bin, "mpirun")
if not os.path.exists(mpiexec):
mpiexec = os.path.join(spec["mpi"].prefix.bin, "mpiexec")
return mpiexec
def get_mpi_exec_num_proc(self) -> str:
scheduler = self.get_scheduler()
if scheduler in [self.Scheduler.FLUX, self.Scheduler.LSF, self.Scheduler.SLURM]:
return "-n"
else:
return "-np"
def initconfig_mpi_entries(self):
spec = self.pkg.spec
@@ -197,27 +257,10 @@ def initconfig_mpi_entries(self):
if hasattr(spec["mpi"], "mpifc"):
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# Check for slurm
using_slurm = False
slurm_checks = ["+slurm", "schedulers=slurm", "process_managers=slurm"]
if any(spec["mpi"].satisfies(variant) for variant in slurm_checks):
using_slurm = True
# Determine MPIEXEC
if using_slurm:
if spec["mpi"].external:
# Heuristic until we have dependents on externals
mpiexec = "/usr/bin/srun"
else:
mpiexec = os.path.join(spec["slurm"].prefix.bin, "srun")
elif hasattr(spec["mpi"].package, "mpiexec"):
mpiexec = spec["mpi"].package.mpiexec
else:
mpiexec = os.path.join(spec["mpi"].prefix.bin, "mpirun")
if not os.path.exists(mpiexec):
mpiexec = os.path.join(spec["mpi"].prefix.bin, "mpiexec")
mpiexec = self.get_mpi_exec()
if not os.path.exists(mpiexec):
if mpiexec is None or not os.path.exists(mpiexec.split(";")[0]):
msg = "Unable to determine MPIEXEC, %s tests may fail" % self.pkg.name
entries.append("# {0}\n".format(msg))
tty.warn(msg)
@@ -230,10 +273,7 @@ def initconfig_mpi_entries(self):
entries.append(cmake_cache_path("MPIEXEC", mpiexec))
# Determine MPIEXEC_NUMPROC_FLAG
if using_slurm:
entries.append(cmake_cache_string("MPIEXEC_NUMPROC_FLAG", "-n"))
else:
entries.append(cmake_cache_string("MPIEXEC_NUMPROC_FLAG", "-np"))
entries.append(cmake_cache_string("MPIEXEC_NUMPROC_FLAG", self.get_mpi_exec_num_proc()))
return entries
@@ -276,30 +316,18 @@ def initconfig_hardware_entries(self):
entries.append("# ROCm")
entries.append("#------------------{0}\n".format("-" * 30))
if spec.satisfies("^blt@0.7:"):
rocm_root = os.path.dirname(spec["llvm-amdgpu"].prefix)
entries.append(cmake_cache_path("ROCM_PATH", rocm_root))
else:
# Explicitly setting HIP_ROOT_DIR may be a patch that is no longer necessary
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath(
"CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "amdclang++")
)
)
rocm_root = os.path.dirname(spec["llvm-amdgpu"].prefix)
entries.append(cmake_cache_path("ROCM_PATH", rocm_root))
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)
entries.append(cmake_cache_string("CMAKE_HIP_ARCHITECTURES", arch_str))
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
llvm_bin = spec["llvm-amdgpu"].prefix.bin
entries.append(
cmake_cache_filepath("CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "amdclang++"))
)
if spec.satisfies("%gcc"):
entries.append(
@@ -308,6 +336,15 @@ def initconfig_hardware_entries(self):
)
)
# Extra definitions that might be required in other cases
if not spec.satisfies("^blt"):
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
if archs[0] != "none":
arch_str = ";".join(archs)
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
return entries
def std_initconfig_entries(self):
View File
@@ -8,6 +8,7 @@
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
@@ -86,7 +87,9 @@ def check_args(self):
"""Argument for ``cargo test`` during check phase"""
return []
def setup_build_environment(self, env):
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
env.set("CARGO_HOME", self.stage.path)
def build(
View File
@@ -47,6 +47,11 @@ class CompilerPackage(spack.package_base.PackageBase):
#: Relative path to compiler wrappers
compiler_wrapper_link_paths: Dict[str, str] = {}
#: Optimization flags
opt_flags: Sequence[str] = []
#: Flags for generating debug information
debug_flags: Sequence[str] = []
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
View File
@@ -8,6 +8,7 @@
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
@@ -68,7 +69,9 @@ class GoBuilder(BuilderWithDefaults):
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
def setup_build_environment(self, env):
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
env.set("GO111MODULE", "on")
env.set("GOTOOLCHAIN", "local")
env.set("GOPATH", fs.join_path(self.pkg.stage.path, "go"))
View File
@@ -23,6 +23,7 @@
import spack.error
import spack.phase_callbacks
import spack.spec
from spack.build_environment import dso_suffix
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
@@ -1016,7 +1017,7 @@ def libs(self):
debug_print(result)
return result
def setup_run_environment(self, env):
def setup_run_environment(self, env: EnvironmentModifications) -> None:
"""Adds environment variables to the generated module file.
These environment variables come from running:
@@ -1049,7 +1050,9 @@ def setup_run_environment(self, env):
env.set("F77", self.prefix.bin.ifort)
env.set("F90", self.prefix.bin.ifort)
def setup_dependent_build_environment(self, env, dependent_spec):
def setup_dependent_build_environment(
self, env: EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
# NB: This function is overwritten by 'mpi' provider packages:
#
# var/spack/repos/builtin/packages/intel-mpi/package.py
@@ -1061,7 +1064,12 @@ def setup_dependent_build_environment(self, env, dependent_spec):
# Handle everything in a callback version.
self._setup_dependent_env_callback(env, dependent_spec)
def _setup_dependent_env_callback(self, env, dependent_spec, compilers_of_client={}):
def _setup_dependent_env_callback(
self,
env: EnvironmentModifications,
dependent_spec: spack.spec.Spec,
compilers_of_client={},
) -> None:
# Expected to be called from a client's
# setup_dependent_build_environment(),
# with args extended to convey the client's compilers as needed.
View File
@@ -8,6 +8,7 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.environment
import spack.util.executable
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
@@ -114,5 +115,7 @@ def install(
def _luarocks_config_path(self):
return os.path.join(self.pkg.stage.source_path, "spack_luarocks.lua")
def setup_build_environment(self, env):
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
env.set("LUAROCKS_CONFIG", self._luarocks_config_path())
View File
@@ -4,6 +4,7 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, extends
from spack.multimethod import when
@@ -57,7 +58,9 @@ def install(
"pkg prefix %s; pkg install %s" % (prefix, self.pkg.stage.archive_file),
)
def setup_build_environment(self, env):
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
# octave does not like those environment variables to be set:
env.unset("CC")
env.unset("CXX")
View File
@@ -106,8 +106,8 @@ def install_component(self, installer_path):
bash = Executable("bash")
# Installer writes files in ~/intel set HOME so it goes to prefix
bash.add_default_env("HOME", self.prefix)
# Installer writes files in ~/intel set HOME so it goes to staging directory
bash.add_default_env("HOME", join_path(self.stage.path, "home"))
# Installer checks $XDG_RUNTIME_DIR/.bootstrapper_lock_file as well
bash.add_default_env("XDG_RUNTIME_DIR", join_path(self.stage.path, "runtime"))
@@ -132,7 +132,7 @@ def install_component(self, installer_path):
if not isdir(install_dir):
raise RuntimeError("install failed to directory: {0}".format(install_dir))
def setup_run_environment(self, env):
def setup_run_environment(self, env: EnvironmentModifications) -> None:
"""Adds environment variables to the generated module file.
These environment variables come from running:
@@ -311,4 +311,4 @@ def ld_flags(self):
#: Tuple of Intel math libraries, exported to packages
INTEL_MATH_LIBRARIES = ("intel-mkl", "intel-oneapi-mkl", "intel-parallel-studio")
INTEL_MATH_LIBRARIES = ("intel-oneapi-mkl",)
View File
@@ -13,9 +13,9 @@
import archspec
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList, join_path
from llnl.util.lang import ClassProperty, classproperty, match_predicate
import spack.builder
import spack.config
@@ -139,7 +139,7 @@ def view_file_conflicts(self, view, merge_map):
ext_map = view.extensions_layout.extension_map(self.extendee_spec)
namespaces = set(x.package.py_namespace for x in ext_map.values())
namespace_re = r"site-packages/{0}/__init__.py".format(self.py_namespace)
find_namespace = lang.match_predicate(namespace_re)
find_namespace = match_predicate(namespace_re)
if self.py_namespace in namespaces:
conflicts = list(x for x in conflicts if not find_namespace(x))
@@ -206,7 +206,7 @@ def remove_files_from_view(self, view, merge_map):
spec.package.py_namespace for name, spec in ext_map.items() if name != self.name
)
if self.py_namespace in remaining_namespaces:
namespace_init = lang.match_predicate(
namespace_init = match_predicate(
r"site-packages/{0}/__init__.py".format(self.py_namespace)
)
ignore_namespace = True
@@ -324,6 +324,27 @@ def get_external_python_for_prefix(self):
raise StopIteration("No external python could be detected for %s to depend on" % self.spec)
def _homepage(cls: "PythonPackage") -> Optional[str]:
"""Get the homepage from PyPI if available."""
if cls.pypi:
name = cls.pypi.split("/")[0]
return f"https://pypi.org/project/{name}/"
return None
def _url(cls: "PythonPackage") -> Optional[str]:
if cls.pypi:
return f"https://files.pythonhosted.org/packages/source/{cls.pypi[0]}/{cls.pypi}"
return None
def _list_url(cls: "PythonPackage") -> Optional[str]:
if cls.pypi:
name = cls.pypi.split("/")[0]
return f"https://pypi.org/simple/{name}/"
return None
class PythonPackage(PythonExtension):
"""Specialized class for packages that are built using pip."""
@@ -351,25 +372,9 @@ class PythonPackage(PythonExtension):
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return f"https://pypi.org/project/{name}/"
return None
@lang.classproperty
def url(cls) -> Optional[str]:
if cls.pypi:
return f"https://files.pythonhosted.org/packages/source/{cls.pypi[0]}/{cls.pypi}"
return None
@lang.classproperty
def list_url(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return f"https://pypi.org/simple/{name}/"
return None
homepage: ClassProperty[Optional[str]] = classproperty(_homepage)
url: ClassProperty[Optional[str]] = classproperty(_url)
list_url: ClassProperty[Optional[str]] = classproperty(_list_url)
@property
def python_spec(self) -> Spec:
View File
@@ -3,8 +3,8 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional, Tuple
import llnl.util.lang as lang
from llnl.util.filesystem import mkdirp
from llnl.util.lang import ClassProperty, classproperty
from spack.directives import extends
@@ -54,6 +54,32 @@ def install(self, pkg, spec, prefix):
pkg.module.R(*args)
def _homepage(cls: "RPackage") -> Optional[str]:
if cls.cran:
return f"https://cloud.r-project.org/package={cls.cran}"
elif cls.bioc:
return f"https://bioconductor.org/packages/{cls.bioc}"
return None
def _url(cls: "RPackage") -> Optional[str]:
if cls.cran:
return f"https://cloud.r-project.org/src/contrib/{cls.cran}_{str(list(cls.versions)[0])}.tar.gz"
return None
def _list_url(cls: "RPackage") -> Optional[str]:
if cls.cran:
return f"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
return None
def _git(cls: "RPackage") -> Optional[str]:
if cls.bioc:
return f"https://git.bioconductor.org/packages/{cls.bioc}"
return None
class RPackage(Package):
"""Specialized class for packages that are built using R.
@@ -77,24 +103,7 @@ class RPackage(Package):
extends("r")
@lang.classproperty
def homepage(cls):
if cls.cran:
return f"https://cloud.r-project.org/package={cls.cran}"
elif cls.bioc:
return f"https://bioconductor.org/packages/{cls.bioc}"
@lang.classproperty
def url(cls):
if cls.cran:
return f"https://cloud.r-project.org/src/contrib/{cls.cran}_{str(list(cls.versions)[0])}.tar.gz"
@lang.classproperty
def list_url(cls):
if cls.cran:
return f"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
@lang.classproperty
def git(cls):
if cls.bioc:
return f"https://git.bioconductor.org/packages/{cls.bioc}"
homepage: ClassProperty[Optional[str]] = classproperty(_homepage)
url: ClassProperty[Optional[str]] = classproperty(_url)
list_url: ClassProperty[Optional[str]] = classproperty(_list_url)
git: ClassProperty[Optional[str]] = classproperty(_git)
View File
@@ -5,8 +5,8 @@
from typing import Optional, Tuple
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.lang import ClassProperty, classproperty
import spack.builder
import spack.spec
@@ -19,6 +19,12 @@
from spack.util.executable import Executable, ProcessError
def _homepage(cls: "RacketPackage") -> Optional[str]:
if cls.racket_name:
return f"https://pkgs.racket-lang.org/package/{cls.racket_name}"
return None
class RacketPackage(PackageBase):
"""Specialized class for packages that are built using Racket's
`raco pkg install` and `raco setup` commands.
@@ -37,13 +43,7 @@ class RacketPackage(PackageBase):
extends("racket", when="build_system=racket")
racket_name: Optional[str] = None
parallel = True
@lang.classproperty
def homepage(cls):
if cls.racket_name:
return "https://pkgs.racket-lang.org/package/{0}".format(cls.racket_name)
return None
homepage: ClassProperty[Optional[str]] = classproperty(_homepage)
@spack.builder.builder("racket")
View File
@@ -185,10 +185,16 @@ def __init__(self, pkg):
# These two methods don't follow the (self, spec, prefix) signature of phases nor
# the (self) signature of methods, so they are added explicitly to avoid using a
# catch-all (*args, **kwargs)
def setup_build_environment(self, env):
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
return self.pkg_with_dispatcher.setup_build_environment(env)
def setup_dependent_build_environment(self, env, dependent_spec):
def setup_dependent_build_environment(
self,
env: spack.util.environment.EnvironmentModifications,
dependent_spec: spack.spec.Spec,
) -> None:
return self.pkg_with_dispatcher.setup_dependent_build_environment(env, dependent_spec)
return Adapter(pkg)
@@ -402,7 +408,7 @@ def fixup_install(self):
# do something after the package is installed
pass
def setup_build_environment(self, env):
def setup_build_environment(self, env: EnvironmentModifications) -> None:
env.set("MY_ENV_VAR", "my_value")
class CMakeBuilder(cmake.CMakeBuilder, AnyBuilder):
View File
@@ -14,7 +14,7 @@
import tempfile
import zipfile
from collections import namedtuple
from typing import Callable, Dict, List, Set, Union
from typing import Callable, Dict, List, Optional, Set, Union
from urllib.request import Request
import llnl.path
@@ -24,6 +24,7 @@
import spack
import spack.binary_distribution as bindist
import spack.builder
import spack.config as cfg
import spack.environment as ev
import spack.error
@@ -613,32 +614,40 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
job_spec, and attempts to copy the files into the directory given
by job_log_dir.
Args:
Parameters:
job_spec: spec associated with spack install log
job_log_dir: path into which build log should be copied
"""
tty.debug(f"job spec: {job_spec}")
try:
package_metadata_root = pathlib.Path(spack.store.STORE.layout.metadata_path(job_spec))
except spack.error.SpackError as e:
tty.error(f"Cannot copy logs: {str(e)}")
if not job_spec.concrete:
tty.warn("Cannot copy artifacts for non-concrete specs")
return
# Get the package's archived files
archive_files = []
archive_root = package_metadata_root / "archived-files"
if archive_root.is_dir():
archive_files = [f for f in archive_root.rglob("*") if f.is_file()]
else:
msg = "Cannot copy package archived files: archived-files must be a directory"
tty.warn(msg)
package_metadata_root = pathlib.Path(spack.store.STORE.layout.metadata_path(job_spec))
if not os.path.isdir(package_metadata_root):
# Fallback to using the stage directory
job_pkg = job_spec.package
package_metadata_root = pathlib.Path(job_pkg.stage.path)
archive_files = spack.builder.create(job_pkg).archive_files
tty.warn("Package not installed, falling back to use stage dir")
tty.debug(f"stage dir: {package_metadata_root}")
else:
# Get the package's archived files
archive_files = []
archive_root = package_metadata_root / "archived-files"
if os.path.isdir(archive_root):
archive_files = [str(f) for f in archive_root.rglob("*") if os.path.isfile(f)]
else:
tty.debug(f"No archived files detected at {archive_root}")
# Try zipped and unzipped versions of the build log
build_log_zipped = package_metadata_root / "spack-build-out.txt.gz"
build_log = package_metadata_root / "spack-build-out.txt"
build_env_mods = package_metadata_root / "spack-build-env.txt"
for f in [build_log_zipped, build_env_mods, *archive_files]:
copy_files_to_artifacts(str(f), job_log_dir)
for f in [build_log_zipped, build_log, build_env_mods, *archive_files]:
copy_files_to_artifacts(str(f), job_log_dir, compress_artifacts=True)
def copy_test_logs_to_artifacts(test_stage, job_test_dir):
@@ -651,11 +660,12 @@ def copy_test_logs_to_artifacts(test_stage, job_test_dir):
"""
tty.debug(f"test stage: {test_stage}")
if not os.path.exists(test_stage):
msg = f"Cannot copy test logs: job test stage ({test_stage}) does not exist"
tty.error(msg)
tty.error(f"Cannot copy test logs: job test stage ({test_stage}) does not exist")
return
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
copy_files_to_artifacts(
os.path.join(test_stage, "*", "*.txt"), job_test_dir, compress_artifacts=True
)
def download_and_extract_artifacts(url, work_dir) -> str:
@@ -1294,35 +1304,34 @@ def display_broken_spec_messages(base_url, hashes):
tty.msg(msg)
def run_standalone_tests(**kwargs):
def run_standalone_tests(
*,
cdash: Optional[CDashHandler] = None,
fail_fast: bool = False,
log_file: Optional[str] = None,
job_spec: Optional[spack.spec.Spec] = None,
repro_dir: Optional[str] = None,
timeout: Optional[int] = None,
):
"""Run stand-alone tests on the current spec.
Arguments:
kwargs (dict): dictionary of arguments used to run the tests
List of recognized keys:
* "cdash" (CDashHandler): (optional) cdash handler instance
* "fail_fast" (bool): (optional) terminate tests after the first failure
* "log_file" (str): (optional) test log file name if NOT CDash reporting
* "job_spec" (Spec): spec that was built
* "repro_dir" (str): reproduction directory
Args:
cdash: cdash handler instance
fail_fast: terminate tests after the first failure
log_file: test log file name if NOT CDash reporting
job_spec: spec that was built
repro_dir: reproduction directory
timeout: maximum time (in seconds) that tests are allowed to run
"""
cdash = kwargs.get("cdash")
fail_fast = kwargs.get("fail_fast")
log_file = kwargs.get("log_file")
if cdash and log_file:
tty.msg(f"The test log file {log_file} option is ignored with CDash reporting")
log_file = None
# Error out but do NOT terminate if there are missing required arguments.
job_spec = kwargs.get("job_spec")
if not job_spec:
tty.error("Job spec is required to run stand-alone tests")
return
repro_dir = kwargs.get("repro_dir")
if not repro_dir:
tty.error("Reproduction directory is required for stand-alone tests")
return
@@ -1331,6 +1340,9 @@ def run_standalone_tests(**kwargs):
if fail_fast:
test_args.append("--fail-fast")
if timeout is not None:
test_args.extend(["--timeout", str(timeout)])
if cdash:
test_args.extend(cdash.args())
else:
View File
@@ -2,9 +2,13 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import copy
import errno
import glob
import gzip
import json
import os
import re
import shutil
import sys
import time
from collections import deque
@@ -25,6 +29,7 @@
import spack.mirrors.mirror
import spack.schema
import spack.spec
import spack.util.compression as compression
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
@@ -40,22 +45,67 @@
_urlopen = web_util.urlopen
def copy_files_to_artifacts(src, artifacts_dir):
def copy_gzipped(glob_or_path: str, dest: str) -> None:
"""Copy all of the files in the source glob/path to the destination.
Args:
glob_or_path: path to file to test
dest: destination path to copy to
"""
files = glob.glob(glob_or_path)
if not files:
raise OSError("No such file or directory: '{0}'".format(glob_or_path), errno.ENOENT)
if len(files) > 1 and not os.path.isdir(dest):
raise ValueError(
"'{0}' matches multiple files but '{1}' is not a directory".format(glob_or_path, dest)
)
def is_gzipped(path):
with open(path, "rb") as fd:
return compression.GZipFileType().matches_magic(fd)
for src in files:
if is_gzipped(src):
fs.copy(src, dest)
else:
# Compress and copy in one step
src_name = os.path.basename(src)
if os.path.isdir(dest):
zipped = os.path.join(dest, f"{src_name}.gz")
elif not dest.endswith(".gz"):
zipped = f"{dest}.gz"
else:
zipped = dest
with open(src, "rb") as fin, gzip.open(zipped, "wb") as fout:
shutil.copyfileobj(fin, fout)
def copy_files_to_artifacts(
src: str, artifacts_dir: str, *, compress_artifacts: bool = False
) -> None:
"""
Copy file(s) to the given artifacts directory
Parameters:
Args:
src (str): the glob-friendly path expression for the file(s) to copy
artifacts_dir (str): the destination directory
compress_artifacts (bool): option to compress copied artifacts using Gzip
"""
try:
fs.copy(src, artifacts_dir)
if compress_artifacts:
copy_gzipped(src, artifacts_dir)
else:
fs.copy(src, artifacts_dir)
except Exception as err:
msg = (
f"Unable to copy files ({src}) to artifacts {artifacts_dir} due to "
f"exception: {str(err)}"
tty.warn(
(
f"Unable to copy files ({src}) to artifacts {artifacts_dir} due to "
f"exception: {str(err)}"
)
)
tty.warn(msg)
def win_quote(quote_str: str) -> str:
View File
@@ -76,9 +76,6 @@ def setup_parser(subparser: argparse.ArgumentParser):
default=False,
help="regenerate buildcache index after building package(s)",
)
push.add_argument(
"--spec-file", default=None, help="create buildcache entry for spec from json or yaml file"
)
push.add_argument(
"--only",
default="package,dependencies",
@@ -192,28 +189,14 @@ def setup_parser(subparser: argparse.ArgumentParser):
default=lambda: spack.config.default_modify_scope(),
help="configuration scope containing mirrors to check",
)
# Unfortunately there are 3 ways to do the same thing here:
check_specs = check.add_mutually_exclusive_group()
check_specs.add_argument(
"-s", "--spec", help="check single spec instead of release specs file"
)
check_specs.add_argument(
"--spec-file",
help="check single spec from json or yaml file instead of release specs file",
)
arguments.add_common_arguments(check, ["specs"])
check.set_defaults(func=check_fn)
# Download tarball and specfile
download = subparsers.add_parser("download", help=download_fn.__doc__)
download_spec_or_specfile = download.add_mutually_exclusive_group(required=True)
download_spec_or_specfile.add_argument(
"-s", "--spec", help="download built tarball for spec from mirror"
)
download_spec_or_specfile.add_argument(
"--spec-file", help="download built tarball for spec (from json or yaml file) from mirror"
)
download.add_argument("-s", "--spec", help="download built tarball for spec from mirror")
download.add_argument(
"-p",
"--path",
@@ -223,28 +206,10 @@ def setup_parser(subparser: argparse.ArgumentParser):
)
download.set_defaults(func=download_fn)
# Get buildcache name
getbuildcachename = subparsers.add_parser(
"get-buildcache-name", help=get_buildcache_name_fn.__doc__
)
getbuildcachename_spec_or_specfile = getbuildcachename.add_mutually_exclusive_group(
required=True
)
getbuildcachename_spec_or_specfile.add_argument(
"-s", "--spec", help="spec string for which buildcache name is desired"
)
getbuildcachename_spec_or_specfile.add_argument(
"--spec-file", help="path to spec json or yaml file for which buildcache name is desired"
)
getbuildcachename.set_defaults(func=get_buildcache_name_fn)
# Given the root spec, save the yaml of the dependent spec to a file
savespecfile = subparsers.add_parser("save-specfile", help=save_specfile_fn.__doc__)
savespecfile_spec_or_specfile = savespecfile.add_mutually_exclusive_group(required=True)
savespecfile_spec_or_specfile.add_argument("--root-spec", help="root spec of dependent spec")
savespecfile_spec_or_specfile.add_argument(
"--root-specfile", help="path to json or yaml file containing root spec of dependent spec"
)
savespecfile.add_argument(
"-s",
"--specs",
@@ -380,14 +345,8 @@ def _specs_to_be_packaged(
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use positional arguments instead."
)
if args.specs or args.spec_file:
roots = _matching_specs(spack.cmd.parse_specs(args.specs or args.spec_file))
if args.specs:
roots = _matching_specs(spack.cmd.parse_specs(args.specs))
else:
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
@@ -529,22 +488,7 @@ def check_fn(args: argparse.Namespace):
this command uses the process exit code to indicate its result, specifically, if the
exit code is non-zero, then at least one of the indicated specs needs to be rebuilt
"""
if args.spec_file:
specs_arg = (
args.spec_file if os.path.sep in args.spec_file else os.path.join(".", args.spec_file)
)
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
f"Use `spack buildcache check {specs_arg}` instead."
)
elif args.spec:
specs_arg = args.spec
tty.warn(
"The flag `--spec` is deprecated and will be removed in Spack 0.23. "
f"Use `spack buildcache check {specs_arg}` instead."
)
else:
specs_arg = args.specs
specs_arg = args.specs
if specs_arg:
specs = _matching_specs(spack.cmd.parse_specs(specs_arg))
@@ -578,13 +522,7 @@ def download_fn(args):
code indicates that the command failed to download at least one of the required buildcache
components
"""
if args.spec_file:
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use --spec instead."
)
specs = _matching_specs(spack.cmd.parse_specs(args.spec or args.spec_file))
specs = _matching_specs(spack.cmd.parse_specs(args.spec))
if len(specs) != 1:
tty.die("a single spec argument is required to download from a buildcache")
@@ -593,15 +531,6 @@ def download_fn(args):
sys.exit(1)
def get_buildcache_name_fn(args):
"""get name (prefix) of buildcache entries for this spec"""
tty.warn("This command is deprecated and will be removed in Spack 0.22.")
specs = _matching_specs(spack.cmd.parse_specs(args.spec or args.spec_file))
if len(specs) != 1:
tty.die("a single spec argument is required to get buildcache name")
print(bindist.tarball_name(specs[0], ""))
def save_specfile_fn(args):
"""get full spec for dependencies and write them to files in the specified output directory
@@ -609,13 +538,7 @@ def save_specfile_fn(args):
successful. if any errors or exceptions are encountered, or if expected command-line arguments
are not provided, then the exit code will be non-zero
"""
if args.root_specfile:
tty.warn(
"The flag `--root-specfile` is deprecated and will be removed in Spack 0.22. "
"Use --root-spec instead."
)
specs = spack.cmd.parse_specs(args.root_spec or args.root_specfile)
specs = spack.cmd.parse_specs(args.root_spec)
if len(specs) != 1:
tty.die("a single spec argument is required to save specfile")
View File
@@ -160,6 +160,12 @@ def setup_parser(subparser):
default=False,
help="stop stand-alone tests after the first failure",
)
rebuild.add_argument(
"--timeout",
type=int,
default=None,
help="maximum time (in seconds) that tests are allowed to run",
)
rebuild.set_defaults(func=ci_rebuild)
spack.cmd.common.arguments.add_common_arguments(rebuild, ["jobs"])
@@ -447,7 +453,7 @@ def ci_rebuild(args):
# Arguments when installing the root from sources
deps_install_args = install_args + ["--only=dependencies"]
root_install_args = install_args + ["--only=package"]
root_install_args = install_args + ["--keep-stage", "--only=package"]
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
@@ -487,6 +493,9 @@ def ci_rebuild(args):
# Copy logs and archived files from the install metadata (.spack) directory to artifacts now
spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)
# Clear the stage directory
spack.stage.purge()
# If the installation succeeded and we're running stand-alone tests for
# the package, run them and copy the output. Failures of any kind should
# *not* terminate the build process or preclude creating the build cache.
@@ -521,6 +530,7 @@ def ci_rebuild(args):
fail_fast=args.fail_fast,
log_file=log_file,
repro_dir=repro_dir,
timeout=args.timeout,
)
except Exception as err:

View File

@@ -63,7 +63,7 @@ def setup_parser(subparser):
)
# List
list_parser = sp.add_parser("list", help="list available compilers")
list_parser = sp.add_parser("list", aliases=["ls"], help="list available compilers")
list_parser.add_argument(
"--scope", action=arguments.ConfigScope, help="configuration scope to read from"
)
@@ -216,5 +216,6 @@ def compiler(parser, args):
"rm": compiler_remove,
"info": compiler_info,
"list": compiler_list,
"ls": compiler_list,
}
action[args.compiler_command](args)
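Note: argparse records the alias the user actually typed, so the dispatch table needs an entry for both spellings. A minimal self-contained sketch of the pattern:

import argparse

parser = argparse.ArgumentParser(prog="spack compiler")
sp = parser.add_subparsers(dest="compiler_command")
sp.add_parser("list", aliases=["ls"], help="list available compilers")

args = parser.parse_args(["ls"])
assert args.compiler_command == "ls"  # the alias, not "list"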

View File

@@ -572,7 +572,7 @@ def edit(self, spec, prefix):
class IntelPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for licensed Intel software"""
base_class_name = "IntelPackage"
base_class_name = "IntelOneApiPackage"
body_def = """\
# FIXME: Override `setup_environment` if necessary."""

View File

@@ -102,7 +102,7 @@ def assure_concrete_spec(env: spack.environment.Environment, spec: spack.spec.Sp
)
else:
# look up the maximum version so infinity versions are preferred for develop
version = max(spec.package_class.versions.keys())
version = max(spack.repo.PATH.get_pkg_class(spec.fullname).versions.keys())
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
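Note: this relies on named "infinity" versions (develop, main, ...) ordering above any numeric release; a hedged sketch of the assumption:

import spack.version as vn

# assumed ordering: named infinity versions compare greater than numeric ones
assert max([vn.Version("1.2.3"), vn.Version("develop")]) == vn.Version("develop")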

View File

@@ -62,7 +62,7 @@ def setup_parser(subparser):
"package Spack knows how to find."
)
sp.add_parser("list", help="list detectable packages, by repository and name")
sp.add_parser("list", aliases=["ls"], help="list detectable packages, by repository and name")
read_cray_manifest = sp.add_parser(
"read-cray-manifest",
@@ -259,6 +259,7 @@ def external(parser, args):
action = {
"find": external_find,
"list": external_list,
"ls": external_list,
"read-cray-manifest": external_read_cray_manifest,
}
action[args.external_command](args)

View File

@@ -65,6 +65,12 @@ def setup_parser(subparser):
run_parser.add_argument(
"--help-cdash", action="store_true", help="show usage instructions for CDash reporting"
)
run_parser.add_argument(
"--timeout",
type=int,
default=None,
help="maximum time (in seconds) that tests are allowed to run",
)
cd_group = run_parser.add_mutually_exclusive_group()
arguments.add_common_arguments(cd_group, ["clean", "dirty"])
@@ -176,7 +182,7 @@ def test_run(args):
for spec in specs:
matching = spack.store.STORE.db.query_local(spec, hashes=hashes, explicit=explicit)
if spec and not matching:
tty.warn("No {0}installed packages match spec {1}".format(explicit_str, spec))
tty.warn(f"No {explicit_str}installed packages match spec {spec}")
# TODO: Need to write out a log message and/or CDASH Testing
# output that package not installed IF continue to process
@@ -192,7 +198,7 @@ def test_run(args):
# test_stage_dir
test_suite = spack.install_test.TestSuite(specs_to_test, args.alias)
test_suite.ensure_stage()
tty.msg("Spack test %s" % test_suite.name)
tty.msg(f"Spack test {test_suite.name}")
# Set up reporter
setattr(args, "package", [s.format() for s in test_suite.specs])
@@ -204,6 +210,7 @@ def test_run(args):
dirty=args.dirty,
fail_first=args.fail_first,
externals=args.externals,
timeout=args.timeout,
)
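Note: the new flag is plain argparse, so omitting it leaves args.timeout as None (no limit). A runnable sketch:

import argparse

p = argparse.ArgumentParser()
p.add_argument("--timeout", type=int, default=None,
               help="maximum time (in seconds) that tests are allowed to run")
assert p.parse_args([]).timeout is None
assert p.parse_args(["--timeout", "600"]).timeout == 600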

View File

@@ -18,6 +18,10 @@ class Languages(enum.Enum):
class CompilerAdaptor:
"""Provides access to compiler attributes via `Package.compiler`. Useful for
packages which do not yet access compiler properties via `self.spec[language]`.
"""
def __init__(
self, compiled_spec: spack.spec.Spec, compilers: Dict[Languages, spack.spec.Spec]
) -> None:
@@ -79,6 +83,14 @@ def implicit_rpaths(self) -> List[str]:
result.extend(CompilerPropertyDetector(compiler).implicit_rpaths())
return result
@property
def opt_flags(self) -> List[str]:
return next(iter(self.compilers.values())).package.opt_flags
@property
def debug_flags(self) -> List[str]:
return next(iter(self.compilers.values())).package.debug_flags
@property
def openmp_flag(self) -> str:
return next(iter(self.compilers.values())).package.openmp_flag
@@ -140,7 +152,7 @@ def c17_flag(self) -> str:
@property
def c23_flag(self) -> str:
return self.compilers[Languages.C].package.standard_flag(
language=Languages.C.value, standard="17"
language=Languages.C.value, standard="23"
)
@property
@@ -190,6 +202,10 @@ def f77(self):
self._lang_exists_or_raise("f77", lang=Languages.FORTRAN)
return self.compilers[Languages.FORTRAN].package.fortran
@property
def stdcxx_libs(self):
return self._maybe_return_attribute("stdcxx_libs", lang=Languages.CXX)
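Note: the two new properties mirror the legacy Compiler attributes by forwarding to the first compiler's package. A hedged sketch of a package consulting them (the flag_handler body is an illustrative assumption, not Spack code):

# inside a package class; `self.compiler` resolves to a CompilerAdaptor
def flag_handler(self, name, flags):
    # prefer -Og when the wrapped compiler advertises it as a debug flag
    if name == "cflags" and "-Og" in self.compiler.debug_flags:
        flags.append("-Og")
    return (flags, None, None)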
class DeprecatedCompiler(lang.DeprecatedProperty):
def __init__(self) -> None:

View File

@@ -7,6 +7,7 @@
import os
import re
import sys
import warnings
from typing import Any, Dict, List, Optional, Tuple
import archspec.cpu
@@ -337,7 +338,15 @@ def from_legacy_yaml(compiler_dict: Dict[str, Any]) -> List[spack.spec.Spec]:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
pattern = re.compile(r"|".join(finder.search_patterns(pkg=pkg_cls)))
filtered_paths = [x for x in candidate_paths if pattern.search(os.path.basename(x))]
detected = finder.detect_specs(pkg=pkg_cls, paths=filtered_paths)
try:
detected = finder.detect_specs(pkg=pkg_cls, paths=filtered_paths)
except Exception:
warnings.warn(
f"[{__name__}] cannot detect {pkg_name} from the "
f"following paths: {', '.join(filtered_paths)}"
)
continue
for s in detected:
for key in ("flags", "environment", "extra_rpaths"):
if key in compiler_dict:

View File

@@ -149,12 +149,12 @@ def _getfqdn():
return socket.getfqdn()
def reader(version: vn.ConcreteVersion) -> Type["spack.spec.SpecfileReaderBase"]:
def reader(version: vn.StandardVersion) -> Type["spack.spec.SpecfileReaderBase"]:
reader_cls = {
vn.Version("5"): spack.spec.SpecfileV1,
vn.Version("6"): spack.spec.SpecfileV3,
vn.Version("7"): spack.spec.SpecfileV4,
vn.Version("8"): spack.spec.SpecfileV5,
vn.StandardVersion.from_string("5"): spack.spec.SpecfileV1,
vn.StandardVersion.from_string("6"): spack.spec.SpecfileV3,
vn.StandardVersion.from_string("7"): spack.spec.SpecfileV4,
vn.StandardVersion.from_string("8"): spack.spec.SpecfileV5,
}
return reader_cls[version]
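Note: the reader lookup is keyed by exact StandardVersion instances, so callers must construct the key the same way. A short sketch (module context assumed to be spack.database):

import spack.version as vn

reader_cls = reader(vn.StandardVersion.from_string("8"))  # -> spack.spec.SpecfileV5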
@@ -824,7 +824,7 @@ def check(cond, msg):
db = fdata["database"]
check("version" in db, "no 'version' in JSON DB.")
self.db_version = vn.Version(db["version"])
self.db_version = vn.StandardVersion.from_string(db["version"])
if self.db_version > _DB_VERSION:
raise InvalidDatabaseVersionError(self, _DB_VERSION, self.db_version)
elif self.db_version < _DB_VERSION:

View File

@@ -20,7 +20,7 @@
import sys
from typing import Dict, List, Optional, Set, Tuple, Union
import llnl.util.tty
from llnl.util import tty
import spack.config
import spack.error
@@ -93,14 +93,13 @@ def _spec_is_valid(spec: spack.spec.Spec) -> bool:
except spack.error.SpackError:
# It is assumed here that we can at least extract the package name from the spec so we
# can look up the implementation of determine_spec_details
msg = f"Constructed spec for {spec.name} does not have a string representation"
llnl.util.tty.warn(msg)
tty.warn(f"Constructed spec for {spec.name} does not have a string representation")
return False
try:
spack.spec.Spec(str(spec))
except spack.error.SpackError:
llnl.util.tty.warn(
tty.warn(
"Constructed spec has a string representation but the string"
" representation does not evaluate to a valid spec: {0}".format(str(spec))
)
@@ -109,20 +108,24 @@ def _spec_is_valid(spec: spack.spec.Spec) -> bool:
return True
def path_to_dict(search_paths: List[str]):
def path_to_dict(search_paths: List[str]) -> Dict[str, str]:
"""Return dictionary[fullpath]: basename from list of paths"""
path_to_lib = {}
path_to_lib: Dict[str, str] = {}
# Reverse order of search directories so that a lib in the first
# entry overrides later entries
for search_path in reversed(search_paths):
try:
with os.scandir(search_path) as entries:
path_to_lib.update(
{entry.path: entry.name for entry in entries if entry.is_file()}
)
dir_iter = os.scandir(search_path)
except OSError as e:
msg = f"cannot scan '{search_path}' for external software: {str(e)}"
llnl.util.tty.debug(msg)
tty.debug(f"cannot scan '{search_path}' for external software: {e}")
continue
with dir_iter as entries:
for entry in entries:
try:
if entry.is_file():
path_to_lib[entry.path] = entry.name
except OSError as e:
tty.debug(f"cannot scan '{search_path}' for external software: {e}")
return path_to_lib
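Note: the rewrite keeps scanning the remaining entries even when a single entry raises OSError mid-iteration. A hedged usage sketch (paths are assumptions):

libs = path_to_dict(["/usr/lib64"])
# e.g. {"/usr/lib64/libz.so": "libz.so", "/usr/lib64/libm.so": "libm.so"}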

View File

@@ -34,11 +34,13 @@ class OpenMpi(Package):
import collections.abc
import os
import re
import warnings
from typing import Any, Callable, List, Optional, Tuple, Type, Union
import llnl.util.tty.color
import spack.deptypes as dt
import spack.error
import spack.fetch_strategy
import spack.package_base
import spack.patch
@@ -608,7 +610,7 @@ def _execute_patch(
return _execute_patch
def conditional(*values: List[Any], when: Optional[WhenType] = None):
def conditional(*values: Union[str, bool], when: Optional[WhenType] = None):
"""Conditional values that can be used in variant declarations."""
# _make_when_spec returns None when the condition is statically false.
when = _make_when_spec(when)
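Note: with the tightened signature, conditional() takes individual string or bool values. A sketch of the intended use (variant name and values are illustrative):

from spack.directives import conditional, variant

# inside a package class body
variant(
    "precision",
    default="double",
    values=("double", conditional("quad", when="@2.0:")),
    description="floating-point precision",
)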
@@ -620,7 +622,7 @@ def conditional(*values: List[Any], when: Optional[WhenType] = None):
@directive("variants")
def variant(
name: str,
default: Optional[Any] = None,
default: Optional[Union[bool, str, Tuple[str, ...]]] = None,
description: str = "",
values: Optional[Union[collections.abc.Sequence, Callable[[Any], bool]]] = None,
multi: Optional[bool] = None,
@@ -650,11 +652,29 @@ def variant(
DirectiveError: If arguments passed to the directive are invalid
"""
# This validation can be removed at runtime and enforced with an audit in Spack v1.0.
# For now it's a warning to let people migrate faster.
if not (
default is None
or type(default) in (bool, str)
or (type(default) is tuple and all(type(x) is str for x in default))
):
if isinstance(default, (list, tuple)):
did_you_mean = f"default={','.join(str(x) for x in default)!r}"
else:
did_you_mean = f"default={str(default)!r}"
warnings.warn(
f"default value for variant '{name}' is not a boolean or string: default={default!r}. "
f"Did you mean {did_you_mean}?",
stacklevel=3,
category=spack.error.SpackAPIWarning,
)
def format_error(msg, pkg):
msg += " @*r{{[{0}, variant '{1}']}}"
return llnl.util.tty.color.colorize(msg.format(pkg.name, name))
if name in spack.variant.reserved_names:
if name in spack.variant.RESERVED_NAMES:
def _raise_reserved_name(pkg):
msg = "The name '%s' is reserved by Spack" % name
@@ -665,7 +685,11 @@ def _raise_reserved_name(pkg):
# Ensure we have a sequence of allowed variant values, or a
# predicate for it.
if values is None:
if str(default).upper() in ("TRUE", "FALSE"):
if (
default in (True, False)
or type(default) is str
and default.upper() in ("TRUE", "FALSE")
):
values = (True, False)
else:
values = lambda x: True
@@ -698,12 +722,15 @@ def _raise_argument_error(pkg):
# or the empty string, as the former indicates that a default
# was not set while the latter will make the variant unparsable
# from the command line
if isinstance(default, tuple):
default = ",".join(default)
if default is None or default == "":
def _raise_default_not_set(pkg):
if default is None:
msg = "either a default was not explicitly set, " "or 'None' was used"
elif default == "":
msg = "either a default was not explicitly set, or 'None' was used"
else:
msg = "the default cannot be an empty string"
raise DirectiveError(format_error(msg, pkg))
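Note: list and tuple defaults are now coerced to a comma-joined string, with a warning nudging authors toward the canonical spelling. A sketch of what triggers it (the variant is illustrative):

# inside a package class body
variant("libs", default=("shared", "static"), multi=True,
        values=("shared", "static"), description="library types")
# emits: default value for variant 'libs' is not a boolean or string:
#   default=('shared', 'static'). Did you mean default='shared,static'?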

View File

@@ -144,7 +144,6 @@ class Foo(Package):
Package class, and it's how Spack gets information from the
packages to the core.
"""
global directive_names
if isinstance(dicts, str):
dicts = (dicts,)

View File

@@ -566,7 +566,7 @@
display_specs,
environment_dir_from_name,
environment_from_name_or_dir,
environment_path_scopes,
environment_path_scope,
exists,
initialize_environment_dir,
installed_specs,
@@ -603,7 +603,7 @@
"display_specs",
"environment_dir_from_name",
"environment_from_name_or_dir",
"environment_path_scopes",
"environment_path_scope",
"exists",
"initialize_environment_dir",
"installed_specs",

View File

@@ -31,7 +31,6 @@
import spack.repo
import spack.schema.env
import spack.spec
import spack.spec_list
import spack.store
import spack.user_environment as uenv
import spack.util.environment
@@ -44,10 +43,10 @@
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
from spack.spec import Spec
from spack.spec_list import SpecList
from spack.util.path import substitute_path_variables
from ..enums import ConfigScopePriority
from .list import SpecList, SpecListError, SpecListParser
SpecPair = spack.concretize.SpecPair
@@ -97,16 +96,15 @@ def environment_name(path: Union[str, pathlib.Path]) -> str:
return path_str
def ensure_no_disallowed_env_config_mods(scopes: List[spack.config.ConfigScope]) -> None:
for scope in scopes:
config = scope.get_section("config")
if config and "environments_root" in config["config"]:
raise SpackEnvironmentError(
"Spack environments are prohibited from modifying 'config:environments_root' "
"because it can make the definition of the environment ill-posed. Please "
"remove from your environment and place it in a permanent scope such as "
"defaults, system, site, etc."
)
def ensure_no_disallowed_env_config_mods(scope: spack.config.ConfigScope) -> None:
config = scope.get_section("config")
if config and "environments_root" in config["config"]:
raise SpackEnvironmentError(
"Spack environments are prohibited from modifying 'config:environments_root' "
"because it can make the definition of the environment ill-posed. Please "
"remove from your environment and place it in a permanent scope such as "
"defaults, system, site, etc."
)
def default_manifest_yaml():
@@ -933,8 +931,10 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
self.new_specs: List[Spec] = []
self.views: Dict[str, ViewDescriptor] = {}
#: Parser for spec lists
self._spec_lists_parser = SpecListParser()
#: Specs from "spack.yaml"
self.spec_lists: Dict[str, SpecList] = {user_speclist_name: SpecList()}
self.spec_lists: Dict[str, SpecList] = {}
#: User specs from the last concretization
self.concretized_user_specs: List[Spec] = []
#: Roots associated with the last concretization, in order
@@ -1002,26 +1002,6 @@ def write_transaction(self):
"""Get a write lock context manager for use in a `with` block."""
return lk.WriteTransaction(self.txlock, acquire=self._re_read)
def _process_definition(self, entry):
"""Process a single spec definition item."""
when_string = entry.get("when")
if when_string is not None:
when = spack.spec.eval_conditional(when_string)
assert len([x for x in entry if x != "when"]) == 1
else:
when = True
assert len(entry) == 1
if when:
for name, spec_list in entry.items():
if name == "when":
continue
user_specs = SpecList(name, spec_list, self.spec_lists.copy())
if name in self.spec_lists:
self.spec_lists[name].extend(user_specs)
else:
self.spec_lists[name] = user_specs
def _process_view(self, env_view: Optional[Union[bool, str, Dict]]):
"""Process view option(s), which can be boolean, string, or None.
@@ -1083,21 +1063,24 @@ def _process_concrete_includes(self):
def _construct_state_from_manifest(self):
"""Set up user specs and views from the manifest file."""
self.spec_lists = collections.OrderedDict()
self.views = {}
self._sync_speclists()
self._process_view(spack.config.get("view", True))
self._process_concrete_includes()
for item in spack.config.get("definitions", []):
self._process_definition(item)
def _sync_speclists(self):
self.spec_lists = {}
self.spec_lists.update(
self._spec_lists_parser.parse_definitions(
data=spack.config.CONFIG.get("definitions", [])
)
)
env_configuration = self.manifest[TOP_LEVEL_KEY]
spec_list = env_configuration.get(user_speclist_name, [])
user_specs = SpecList(
user_speclist_name, [s for s in spec_list if s], self.spec_lists.copy()
self.spec_lists[user_speclist_name] = self._spec_lists_parser.parse_user_specs(
name=user_speclist_name, yaml_list=spec_list
)
self.spec_lists[user_speclist_name] = user_specs
self._process_view(spack.config.get("view", True))
self._process_concrete_includes()
def all_concretized_user_specs(self) -> List[Spec]:
"""Returns all of the concretized user specs of the environment and
@@ -1168,9 +1151,7 @@ def clear(self, re_read=False):
re_read: If ``True``, do not clear ``new_specs``. This value cannot be read from yaml,
and needs to be maintained when re-reading an existing environment.
"""
self.spec_lists = collections.OrderedDict()
self.spec_lists[user_speclist_name] = SpecList()
self.spec_lists = {}
self._dev_specs = {}
self.concretized_order = [] # roots of last concretize, in order
self.concretized_user_specs = [] # user specs from last concretize
@@ -1277,22 +1258,6 @@ def destroy(self):
"""Remove this environment from Spack entirely."""
shutil.rmtree(self.path)
def update_stale_references(self, from_list=None):
"""Iterate over spec lists updating references."""
if not from_list:
from_list = next(iter(self.spec_lists.keys()))
index = list(self.spec_lists.keys()).index(from_list)
# spec_lists is an OrderedDict to ensure lists read from the manifest
# are maintained in order, hence, all list entries after the modified
# list may refer to the modified list requiring stale references to be
# updated.
for i, (name, speclist) in enumerate(
list(self.spec_lists.items())[index + 1 :], index + 1
):
new_reference = dict((n, self.spec_lists[n]) for n in list(self.spec_lists.keys())[:i])
speclist.update_reference(new_reference)
def add(self, user_spec, list_name=user_speclist_name):
"""Add a single user_spec (non-concretized) to the Environment
@@ -1312,18 +1277,17 @@ def add(self, user_spec, list_name=user_speclist_name):
elif not spack.repo.PATH.exists(spec.name) and not spec.abstract_hash:
virtuals = spack.repo.PATH.provider_index.providers.keys()
if spec.name not in virtuals:
msg = "no such package: %s" % spec.name
raise SpackEnvironmentError(msg)
raise SpackEnvironmentError(f"no such package: {spec.name}")
list_to_change = self.spec_lists[list_name]
existing = str(spec) in list_to_change.yaml_list
if not existing:
list_to_change.add(str(spec))
self.update_stale_references(list_name)
if list_name == user_speclist_name:
self.manifest.add_user_spec(str(user_spec))
else:
self.manifest.add_definition(str(user_spec), list_name=list_name)
self._sync_speclists()
return bool(not existing)
@@ -1367,18 +1331,17 @@ def change_existing_spec(
"There are no specs named {0} in {1}".format(match_spec.name, list_name)
)
elif len(matches) > 1 and not allow_changing_multiple_specs:
raise ValueError("{0} matches multiple specs".format(str(match_spec)))
raise ValueError(f"{str(match_spec)} matches multiple specs")
for idx, spec in matches:
override_spec = Spec.override(spec, change_spec)
self.spec_lists[list_name].replace(idx, str(override_spec))
if list_name == user_speclist_name:
self.manifest.override_user_spec(str(override_spec), idx=idx)
else:
self.manifest.override_definition(
str(spec), override=str(override_spec), list_name=list_name
)
self.update_stale_references(from_list=list_name)
self._sync_speclists()
def remove(self, query_spec, list_name=user_speclist_name, force=False):
"""Remove specs from an environment that match a query_spec"""
@@ -1406,22 +1369,17 @@ def remove(self, query_spec, list_name=user_speclist_name, force=False):
raise SpackEnvironmentError(f"{err_msg_header}, no spec matches")
old_specs = set(self.user_specs)
new_specs = set()
# Remove specs from the appropriate spec list
for spec in matches:
if spec not in list_to_change:
continue
try:
list_to_change.remove(spec)
self.update_stale_references(list_name)
new_specs = set(self.user_specs)
except spack.spec_list.SpecListError as e:
# define new specs list
new_specs = set(self.user_specs)
except SpecListError as e:
msg = str(e)
if force:
msg += " It will be removed from the concrete specs."
# Mock new specs, so we can remove this spec from concrete spec lists
new_specs.remove(spec)
tty.warn(msg)
else:
if list_name == user_speclist_name:
@@ -1429,7 +1387,11 @@ def remove(self, query_spec, list_name=user_speclist_name, force=False):
else:
self.manifest.remove_definition(str(spec), list_name=list_name)
# If force, update stale concretized specs
# Recompute "definitions" and user specs
self._sync_speclists()
new_specs = set(self.user_specs)
# If 'force', update stale concretized specs
for spec in old_specs - new_specs:
if force and spec in self.concretized_user_specs:
i = self.concretized_user_specs.index(spec)
@@ -1643,23 +1605,6 @@ def _concretize_separately(self, tests=False):
# Unify the specs objects, so we get correct references to all parents
self._read_lockfile_dict(self._to_lockfile_dict())
# Re-attach information on test dependencies
if tests:
# This is slow, but the information on test dependency is lost
# after unification or when reading from a lockfile.
for h in self.specs_by_hash:
current_spec, computed_spec = self.specs_by_hash[h], by_hash[h]
for node in computed_spec.traverse():
test_edges = node.edges_to_dependencies(depflag=dt.TEST)
for current_edge in test_edges:
test_dependency = current_edge.spec
if test_dependency in current_spec[node.name]:
continue
current_spec[node.name].add_dependency_edge(
test_dependency.copy(), depflag=dt.TEST, virtuals=current_edge.virtuals
)
return concretized_specs
@property
@@ -2717,9 +2662,9 @@ def __init__(self, manifest_dir: Union[pathlib.Path, str], name: Optional[str] =
self.scope_name = f"env:{self.name}"
self.config_stage_dir = os.path.join(env_subdir_path(manifest_dir), "config")
#: Configuration scopes associated with this environment. Note that these are not
#: Configuration scope associated with this environment. Note that this is not
#: invalidated by a re-read of the manifest file.
self._config_scopes: Optional[List[spack.config.ConfigScope]] = None
self._env_config_scope: Optional[spack.config.ConfigScope] = None
if not self.manifest_file.exists():
msg = f"cannot find '{manifest_name}' in {self.manifest_dir}"
@@ -2828,6 +2773,8 @@ def add_definition(self, user_spec: str, list_name: str) -> None:
item[list_name].append(user_spec)
break
# "definitions" can be remote, so we need to update the global config too
spack.config.CONFIG.set("definitions", defs, scope=self.scope_name)
self.changed = True
def remove_definition(self, user_spec: str, list_name: str) -> None:
@@ -2854,6 +2801,8 @@ def remove_definition(self, user_spec: str, list_name: str) -> None:
except ValueError:
pass
# "definitions" can be remote, so we need to update the global config too
spack.config.CONFIG.set("definitions", defs, scope=self.scope_name)
self.changed = True
def override_definition(self, user_spec: str, *, override: str, list_name: str) -> None:
@@ -2879,6 +2828,8 @@ def override_definition(self, user_spec: str, *, override: str, list_name: str)
except ValueError:
pass
# "definitions" can be remote, so we need to update the global config too
spack.config.CONFIG.set("definitions", defs, scope=self.scope_name)
self.changed = True
def _iterate_on_definitions(self, definitions, *, list_name, err_msg):
@@ -2957,33 +2908,27 @@ def __str__(self):
return str(self.manifest_file)
@property
def env_config_scopes(self) -> List[spack.config.ConfigScope]:
"""A list of all configuration scopes for the environment manifest. On the first call this
instantiates all the scopes, on subsequent calls it returns the cached list."""
if self._config_scopes is not None:
return self._config_scopes
scopes: List[spack.config.ConfigScope] = [
spack.config.SingleFileScope(
def env_config_scope(self) -> spack.config.ConfigScope:
"""The configuration scope for the environment manifest"""
if self._env_config_scope is None:
self._env_config_scope = spack.config.SingleFileScope(
self.scope_name,
str(self.manifest_file),
spack.schema.env.schema,
yaml_path=[TOP_LEVEL_KEY],
)
]
ensure_no_disallowed_env_config_mods(scopes)
self._config_scopes = scopes
return scopes
ensure_no_disallowed_env_config_mods(self._env_config_scope)
return self._env_config_scope
def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.push_scope(scope, priority=ConfigScopePriority.ENVIRONMENT)
"""Add the manifest's scope to the global configuration search path."""
spack.config.CONFIG.push_scope(
self.env_config_scope, priority=ConfigScopePriority.ENVIRONMENT
)
def deactivate_config_scope(self) -> None:
"""Remove any of the manifest's scopes from the global config path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.remove_scope(scope.name)
"""Remove the manifest's scope from the global config path."""
spack.config.CONFIG.remove_scope(self.env_config_scope.name)
@contextlib.contextmanager
def use_config(self):
@@ -2994,8 +2939,8 @@ def use_config(self):
self.deactivate_config_scope()
def environment_path_scopes(name: str, path: str) -> Optional[List[spack.config.ConfigScope]]:
"""Retrieve the suitably named environment path scopes
def environment_path_scope(name: str, path: str) -> Optional[spack.config.ConfigScope]:
"""Retrieve the suitably named environment path scope
Arguments:
name: configuration scope name
@@ -3010,11 +2955,9 @@ def environment_path_scopes(name: str, path: str) -> Optional[List[spack.config.
else:
return None
for scope in manifest.env_config_scopes:
scope.name = f"{name}:{scope.name}"
scope.writable = False
return manifest.env_config_scopes
manifest.env_config_scope.name = f"{name}:{manifest.env_config_scope.name}"
manifest.env_config_scope.writable = False
return manifest.env_config_scope
class SpackEnvironmentError(spack.error.SpackError):

View File

@@ -2,36 +2,24 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
from typing import List
from typing import Any, Dict, List, NamedTuple, Optional, Union
import spack.spec
import spack.util.spack_yaml
import spack.variant
from spack.error import SpackError
from spack.spec import Spec
class SpecList:
def __init__(self, name="specs", yaml_list=None, reference=None):
# Normalize input arguments
yaml_list = yaml_list or []
reference = reference or {}
def __init__(self, *, name: str = "specs", yaml_list=None, expanded_list=None):
self.name = name
self._reference = reference # TODO: Do we need defensive copy here?
# Validate yaml_list before assigning
if not all(isinstance(s, str) or isinstance(s, (list, dict)) for s in yaml_list):
raise ValueError(
"yaml_list can contain only valid YAML types! Found:\n %s"
% [type(s) for s in yaml_list]
)
self.yaml_list = yaml_list[:]
self.yaml_list = yaml_list[:] if yaml_list is not None else []
# Expansions can be expensive to compute and difficult to keep updated
# We cache results and invalidate when self.yaml_list changes
self._expanded_list = None
self.specs_as_yaml_list = expanded_list or []
self._constraints = None
self._specs = None
self._specs: Optional[List[Spec]] = None
@property
def is_matrix(self):
@@ -40,12 +28,6 @@ def is_matrix(self):
return True
return False
@property
def specs_as_yaml_list(self):
if self._expanded_list is None:
self._expanded_list = self._expand_references(self.yaml_list)
return self._expanded_list
@property
def specs_as_constraints(self):
if self._constraints is None:
@@ -62,7 +44,7 @@ def specs_as_constraints(self):
@property
def specs(self) -> List[Spec]:
if self._specs is None:
specs = []
specs: List[Spec] = []
# This could be slightly faster done directly from yaml_list,
# but this way is easier to maintain.
for constraint_list in self.specs_as_constraints:
@@ -74,12 +56,13 @@ def specs(self) -> List[Spec]:
return self._specs
def add(self, spec):
self.yaml_list.append(str(spec))
def add(self, spec: Spec):
spec_str = str(spec)
self.yaml_list.append(spec_str)
# expanded list can be updated without invalidation
if self._expanded_list is not None:
self._expanded_list.append(str(spec))
if self.specs_as_yaml_list is not None:
self.specs_as_yaml_list.append(spec_str)
# Invalidate cache variables when we change the list
self._constraints = None
@@ -101,83 +84,18 @@ def remove(self, spec):
# Remove may contain more than one string representation of the same spec
for item in remove:
self.yaml_list.remove(item)
self.specs_as_yaml_list.remove(item)
# invalidate cache variables when we change the list
self._expanded_list = None
self._constraints = None
self._specs = None
def replace(self, idx: int, spec: str):
"""Replace the existing spec at the index with the new one.
Args:
idx: index of the spec to replace in the speclist
spec: new spec
"""
self.yaml_list[idx] = spec
# invalidate cache variables when we change the list
self._expanded_list = None
self._constraints = None
self._specs = None
def extend(self, other, copy_reference=True):
def extend(self, other: "SpecList", copy_reference=True) -> None:
self.yaml_list.extend(other.yaml_list)
self._expanded_list = None
self.specs_as_yaml_list.extend(other.specs_as_yaml_list)
self._constraints = None
self._specs = None
if copy_reference:
self._reference = other._reference
def update_reference(self, reference):
self._reference = reference
self._expanded_list = None
self._constraints = None
self._specs = None
def _parse_reference(self, name):
sigil = ""
name = name[1:]
# Parse specs as constraints
if name.startswith("^") or name.startswith("%"):
sigil = name[0]
name = name[1:]
# Make sure the reference is valid
if name not in self._reference:
msg = f"SpecList '{self.name}' refers to named list '{name}'"
msg += " which does not appear in its reference dict."
raise UndefinedReferenceError(msg)
return (name, sigil)
def _expand_references(self, yaml):
if isinstance(yaml, list):
ret = []
for item in yaml:
# if it's a reference, expand it
if isinstance(item, str) and item.startswith("$"):
# replace the reference and apply the sigil if needed
name, sigil = self._parse_reference(item)
referent = [
_sigilify(item, sigil) for item in self._reference[name].specs_as_yaml_list
]
ret.extend(referent)
else:
# else just recurse
ret.append(self._expand_references(item))
return ret
elif isinstance(yaml, dict):
# There can't be expansions in dicts
return dict((name, self._expand_references(val)) for (name, val) in yaml.items())
else:
# Strings are just returned
return yaml
def __len__(self):
return len(self.specs)
@@ -251,6 +169,111 @@ def _sigilify(item, sigil):
return sigil + item
class Definition(NamedTuple):
name: str
yaml_list: List[Union[str, Dict]]
when: Optional[str]
class SpecListParser:
"""Parse definitions and user specs from data in environments"""
def __init__(self):
self.definitions: Dict[str, SpecList] = {}
def parse_definitions(self, *, data: List[Dict[str, Any]]) -> Dict[str, SpecList]:
definitions_from_yaml: Dict[str, List[Definition]] = {}
for item in data:
value = self._parse_yaml_definition(item)
definitions_from_yaml.setdefault(value.name, []).append(value)
self.definitions = {}
self._build_definitions(definitions_from_yaml)
return self.definitions
def parse_user_specs(self, *, name, yaml_list) -> SpecList:
definition = Definition(name=name, yaml_list=yaml_list, when=None)
return self._speclist_from_definitions(name, [definition])
def _parse_yaml_definition(self, yaml_entry) -> Definition:
when_string = yaml_entry.get("when")
if (when_string and len(yaml_entry) > 2) or (not when_string and len(yaml_entry) > 1):
mark = spack.util.spack_yaml.get_mark_from_yaml_data(yaml_entry)
attributes = ", ".join(x for x in yaml_entry if x != "when")
error_msg = f"definition must have a single attribute, got many: {attributes}"
raise SpecListError(f"{mark.name}:{mark.line + 1}: {error_msg}")
for name, yaml_list in yaml_entry.items():
if name == "when":
continue
return Definition(name=name, yaml_list=yaml_list, when=when_string)
# If we are here, it means only "when" is in the entry
mark = spack.util.spack_yaml.get_mark_from_yaml_data(yaml_entry)
error_msg = "definition must have a single attribute, got none"
raise SpecListError(f"{mark.name}:{mark.line + 1}: {error_msg}")
def _build_definitions(self, definitions_from_yaml: Dict[str, List[Definition]]):
for name, definitions in definitions_from_yaml.items():
self.definitions[name] = self._speclist_from_definitions(name, definitions)
def _speclist_from_definitions(self, name, definitions) -> SpecList:
combined_yaml_list = []
for def_part in definitions:
if def_part.when is not None and not spack.spec.eval_conditional(def_part.when):
continue
combined_yaml_list.extend(def_part.yaml_list)
expanded_list = self._expand_yaml_list(combined_yaml_list)
return SpecList(name=name, yaml_list=combined_yaml_list, expanded_list=expanded_list)
def _expand_yaml_list(self, raw_yaml_list):
result = []
for item in raw_yaml_list:
if isinstance(item, str) and item.startswith("$"):
result.extend(self._expand_reference(item))
continue
value = item
if isinstance(item, dict):
value = self._expand_yaml_matrix(item)
result.append(value)
return result
def _expand_reference(self, item: str):
sigil, name = "", item[1:]
if name.startswith("^") or name.startswith("%"):
sigil, name = name[0], name[1:]
if name not in self.definitions:
mark = spack.util.spack_yaml.get_mark_from_yaml_data(item)
error_msg = f"trying to expand the name '{name}', which is not defined yet"
raise UndefinedReferenceError(f"{mark.name}:{mark.line + 1}: {error_msg}")
value = self.definitions[name].specs_as_yaml_list
if not sigil:
return value
return [_sigilify(x, sigil) for x in value]
def _expand_yaml_matrix(self, matrix_yaml):
extra_attributes = set(matrix_yaml) - {"matrix", "exclude"}
if extra_attributes:
mark = spack.util.spack_yaml.get_mark_from_yaml_data(matrix_yaml)
error_msg = f"extra attributes in spec matrix: {','.join(sorted(extra_attributes))}"
raise SpecListError(f"{mark.name}:{mark.line + 1}: {error_msg}")
if "matrix" not in matrix_yaml:
mark = spack.util.spack_yaml.get_mark_from_yaml_data(matrix_yaml)
error_msg = "matrix is missing the 'matrix' attribute"
raise SpecListError(f"{mark.name}:{mark.line + 1}: {error_msg}")
# Assume data has been validated against the YAML schema
result = {"matrix": [self._expand_yaml_list(row) for row in matrix_yaml["matrix"]]}
if "exclude" in matrix_yaml:
result["exclude"] = matrix_yaml["exclude"]
return result
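Note: SpecListParser replaces the old per-SpecList reference dicts with a single two-phase parse: definitions first, then user specs that may reference them. A hedged usage sketch (data values are assumptions):

parser = SpecListParser()
definitions = parser.parse_definitions(
    data=[
        {"packages": ["zlib", "mpich"]},
        {"packages": ["hdf5"], "when": 'platform == "linux"'},
    ]
)
specs = parser.parse_user_specs(name="specs", yaml_list=["$packages", "cmake"])
# "$packages" expands to the merged, condition-filtered definition above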
class SpecListError(SpackError):
"""Error class for all errors related to SpecList objects."""

View File

@@ -49,10 +49,23 @@ def activate_header(env, shell, prompt=None, view: Optional[str] = None):
cmds += 'set "SPACK_ENV=%s"\n' % env.path
if view:
cmds += 'set "SPACK_ENV_VIEW=%s"\n' % view
if prompt:
old_prompt = os.environ.get("SPACK_OLD_PROMPT")
if not old_prompt:
old_prompt = os.environ.get("PROMPT")
cmds += f'set "SPACK_OLD_PROMPT={old_prompt}"\n'
cmds += f'set "PROMPT={prompt} $P$G"\n'
elif shell == "pwsh":
cmds += "$Env:SPACK_ENV='%s'\n" % env.path
if view:
cmds += "$Env:SPACK_ENV_VIEW='%s'\n" % view
if prompt:
cmds += (
"function global:prompt { $pth = $(Convert-Path $(Get-Location))"
' | Split-Path -leaf; if(!"$Env:SPACK_OLD_PROMPT") '
'{$Env:SPACK_OLD_PROMPT="[spack] PS $pth>"}; '
'"%s PS $pth>"}\n' % prompt
)
else:
bash_color_prompt = colorize(f"@G{{{prompt}}}", color=True, enclose=True)
zsh_color_prompt = colorize(f"@G{{{prompt}}}", color=True, enclose=False, zsh=True)
@@ -107,10 +120,19 @@ def deactivate_header(shell):
cmds += 'set "SPACK_ENV="\n'
cmds += 'set "SPACK_ENV_VIEW="\n'
# TODO: despacktivate
# TODO: prompt
old_prompt = os.environ.get("SPACK_OLD_PROMPT")
if old_prompt:
cmds += f'set "PROMPT={old_prompt}"\n'
cmds += 'set "SPACK_OLD_PROMPT="\n'
elif shell == "pwsh":
cmds += "Set-Item -Path Env:SPACK_ENV\n"
cmds += "Set-Item -Path Env:SPACK_ENV_VIEW\n"
cmds += (
"function global:prompt { $pth = $(Convert-Path $(Get-Location))"
' | Split-Path -leaf; $spack_prompt = "[spack] $pth >"; '
'if("$Env:SPACK_OLD_PROMPT") {$spack_prompt=$Env:SPACK_OLD_PROMPT};'
" $spack_prompt}\n"
)
else:
cmds += "if [ ! -z ${SPACK_ENV+x} ]; then\n"
cmds += "unset SPACK_ENV; export SPACK_ENV;\n"

View File

@@ -27,11 +27,14 @@
import os
import re
import shutil
import sys
import time
import urllib.error
import urllib.parse
import urllib.request
import urllib.response
from pathlib import PurePath
from typing import List, Optional
from typing import Callable, List, Mapping, Optional
import llnl.url
import llnl.util
@@ -219,6 +222,114 @@ def mirror_id(self):
"""BundlePackages don't have a mirror id."""
def _format_speed(total_bytes: int, elapsed: float) -> str:
"""Return a human-readable average download speed string."""
elapsed = 1 if elapsed <= 0 else elapsed # avoid divide by zero
speed = total_bytes / elapsed
if speed >= 1e9:
return f"{speed / 1e9:6.1f} GB/s"
elif speed >= 1e6:
return f"{speed / 1e6:6.1f} MB/s"
elif speed >= 1e3:
return f"{speed / 1e3:6.1f} KB/s"
return f"{speed:6.1f} B/s"
def _format_bytes(total_bytes: int) -> str:
"""Return a human-readable total bytes string."""
if total_bytes >= 1e9:
return f"{total_bytes / 1e9:7.2f} GB"
elif total_bytes >= 1e6:
return f"{total_bytes / 1e6:7.2f} MB"
elif total_bytes >= 1e3:
return f"{total_bytes / 1e3:7.2f} KB"
return f"{total_bytes:7.2f} B"
class FetchProgress:
#: Characters to rotate in the spinner.
spinner = ["|", "/", "-", "\\"]
def __init__(
self,
total_bytes: Optional[int] = None,
enabled: bool = True,
get_time: Callable[[], float] = time.time,
) -> None:
"""Initialize a FetchProgress instance.
Args:
total_bytes: Total number of bytes to download, if known.
enabled: Whether to print progress information.
get_time: Function to get the current time."""
#: Number of bytes downloaded so far.
self.current_bytes = 0
#: Delta time between progress prints
self.delta = 0.1
#: Whether to print progress information.
self.enabled = enabled
#: Function to get the current time.
self.get_time = get_time
#: Time of last progress print to limit output
self.last_printed = 0.0
#: Time of start of download
self.start_time = get_time() if enabled else 0.0
#: Total number of bytes to download, if known.
self.total_bytes = total_bytes if total_bytes and total_bytes > 0 else 0
#: Index of spinner character to print (used if total bytes is unknown)
self.index = 0
@classmethod
def from_headers(
cls,
headers: Mapping[str, str],
enabled: bool = True,
get_time: Callable[[], float] = time.time,
) -> "FetchProgress":
"""Create a FetchProgress instance from HTTP headers."""
# headers.get is case-insensitive if it's from an HTTPResponse object.
content_length = headers.get("Content-Length")
try:
total_bytes = int(content_length) if content_length else None
except ValueError:
total_bytes = None
return cls(total_bytes=total_bytes, enabled=enabled, get_time=get_time)
def advance(self, num_bytes: int, out=sys.stdout) -> None:
if not self.enabled:
return
self.current_bytes += num_bytes
self.print(out=out)
def print(self, final: bool = False, out=sys.stdout) -> None:
if not self.enabled:
return
current_time = self.get_time()
if self.last_printed + self.delta < current_time or final:
self.last_printed = current_time
# print a newline if this is the final update
maybe_newline = "\n" if final else ""
# if we know the total bytes, show a percentage, otherwise a spinner
if self.total_bytes > 0:
percentage = min(100 * self.current_bytes / self.total_bytes, 100.0)
percent_or_spinner = f"[{percentage:3.0f}%] "
else:
# only show the spinner if we are not at 100%
if final:
percent_or_spinner = "[100%] "
else:
percent_or_spinner = f"[ {self.spinner[self.index]} ] "
self.index = (self.index + 1) % len(self.spinner)
print(
f"\r {percent_or_spinner}{_format_bytes(self.current_bytes)} "
f"@ {_format_speed(self.current_bytes, current_time - self.start_time)}"
f"{maybe_newline}",
end="",
flush=True,
file=out,
)
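Note: progress printing is rate-limited by `delta`, so a fake clock makes the behavior deterministic. A runnable sketch (byte counts are assumptions):

clock = iter([0.0, 0.5, 1.0, 1.5])
progress = FetchProgress(total_bytes=1_000_000, get_time=lambda: next(clock))
progress.advance(250_000)   # prints "[ 25%] ..." with bytes and average speed
progress.advance(750_000)   # prints "[100%] ..."
progress.print(final=True)  # re-prints the final line and appends a newline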
@fetcher
class URLFetchStrategy(FetchStrategy):
"""URLFetchStrategy pulls source code from a URL for an archive, check the
@@ -295,8 +406,9 @@ def fetch(self):
)
def _fetch_from_url(self, url):
if spack.config.get("config:url_fetch_method") == "curl":
return self._fetch_curl(url)
fetch_method = spack.config.get("config:url_fetch_method", "urllib")
if fetch_method.startswith("curl"):
return self._fetch_curl(url, config_args=fetch_method.split()[1:])
else:
return self._fetch_urllib(url)
@@ -315,7 +427,7 @@ def _check_headers(self, headers):
tty.warn(msg)
@_needs_stage
def _fetch_urllib(self, url):
def _fetch_urllib(self, url, chunk_size=65536):
save_file = self.stage.save_filename
request = urllib.request.Request(url, headers={"User-Agent": web_util.SPACK_USER_AGENT})
@@ -326,8 +438,15 @@ def _fetch_urllib(self, url):
try:
response = web_util.urlopen(request)
tty.msg(f"Fetching {url}")
progress = FetchProgress.from_headers(response.headers, enabled=sys.stdout.isatty())
with open(save_file, "wb") as f:
shutil.copyfileobj(response, f)
while True:
chunk = response.read(chunk_size)
if not chunk:
break
f.write(chunk)
progress.advance(len(chunk))
progress.print(final=True)
except OSError as e:
# clean up archive on failure.
if self.archive_file:
@@ -345,7 +464,7 @@ def _fetch_urllib(self, url):
self._check_headers(str(response.headers))
@_needs_stage
def _fetch_curl(self, url):
def _fetch_curl(self, url, config_args=[]):
save_file = None
partial_file = None
if self.stage.save_filename:
@@ -374,7 +493,7 @@ def _fetch_curl(self, url):
timeout = self.extra_options.get("timeout")
base_args = web_util.base_curl_fetch_args(url, timeout)
curl_args = save_args + base_args + cookie_args
curl_args = config_args + save_args + base_args + cookie_args
# Run curl but grab the mime type from the http headers
curl = self.curl
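Note: anything after the literal "curl" in config:url_fetch_method is now passed through to the curl command line (matching the relaxed schema pattern later in this diff). A sketch of the split (the config value is an assumption):

fetch_method = "curl -k --retry 3"      # e.g. config:url_fetch_method
config_args = fetch_method.split()[1:]  # ["-k", "--retry", "3"]
# _fetch_curl(url, config_args=config_args) prepends these to the curl args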

View File

@@ -12,7 +12,7 @@
import shutil
import sys
from collections import Counter, OrderedDict
from typing import Callable, List, Optional, Tuple, Type, TypeVar, Union
from typing import Callable, Iterable, List, Optional, Tuple, Type, TypeVar, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -391,7 +391,7 @@ def phase_tests(self, builder, phase_name: str, method_names: List[str]):
if self.test_failures:
raise TestFailure(self.test_failures)
def stand_alone_tests(self, kwargs):
def stand_alone_tests(self, kwargs, timeout: Optional[int] = None) -> None:
"""Run the package's stand-alone tests.
Args:
@@ -399,7 +399,9 @@ def stand_alone_tests(self, kwargs):
"""
import spack.build_environment # avoid circular dependency
spack.build_environment.start_build_process(self.pkg, test_process, kwargs)
spack.build_environment.start_build_process(
self.pkg, test_process, kwargs, timeout=timeout
)
def parts(self) -> int:
"""The total number of (checked) test parts."""
@@ -847,7 +849,7 @@ def write_test_summary(counts: "Counter"):
class TestSuite:
"""The class that manages specs for ``spack test run`` execution."""
def __init__(self, specs, alias=None):
def __init__(self, specs: Iterable[Spec], alias: Optional[str] = None) -> None:
# copy so that different test suites have different package objects
# even if they contain the same spec
self.specs = [spec.copy() for spec in specs]
@@ -855,42 +857,43 @@ def __init__(self, specs, alias=None):
self.current_base_spec = None # spec currently running do_test
self.alias = alias
self._hash = None
self._stage = None
self._hash: Optional[str] = None
self._stage: Optional[Prefix] = None
self.counts: "Counter" = Counter()
@property
def name(self):
def name(self) -> str:
"""The name (alias or, if none, hash) of the test suite."""
return self.alias if self.alias else self.content_hash
@property
def content_hash(self):
def content_hash(self) -> str:
"""The hash used to uniquely identify the test suite."""
if not self._hash:
json_text = sjson.dump(self.to_dict())
assert json_text is not None, f"{__name__} unexpected value for 'json_text'"
sha = hashlib.sha1(json_text.encode("utf-8"))
b32_hash = base64.b32encode(sha.digest()).lower()
b32_hash = b32_hash.decode("utf-8")
self._hash = b32_hash
return self._hash
def __call__(self, *args, **kwargs):
def __call__(
self,
*,
remove_directory: bool = True,
dirty: bool = False,
fail_first: bool = False,
externals: bool = False,
timeout: Optional[int] = None,
):
self.write_reproducibility_data()
remove_directory = kwargs.get("remove_directory", True)
dirty = kwargs.get("dirty", False)
fail_first = kwargs.get("fail_first", False)
externals = kwargs.get("externals", False)
for spec in self.specs:
try:
if spec.package.test_suite:
raise TestSuiteSpecError(
"Package {} cannot be run in two test suites at once".format(
spec.package.name
)
f"Package {spec.package.name} cannot be run in two test suites at once"
)
# Set up the test suite to know which test is running
@@ -905,7 +908,7 @@ def __call__(self, *args, **kwargs):
fs.mkdirp(test_dir)
# run the package tests
spec.package.do_test(dirty=dirty, externals=externals)
spec.package.do_test(dirty=dirty, externals=externals, timeout=timeout)
# Clean up on success
if remove_directory:
@@ -956,15 +959,12 @@ def __call__(self, *args, **kwargs):
if failures:
raise TestSuiteFailure(failures)
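Note: the options are now keyword-only, so positional calls fail fast. A hedged sketch (suite construction and specs are assumptions):

suite = TestSuite(specs, alias="smoke")  # specs: list of concrete Specs
suite(remove_directory=False, fail_first=True, timeout=300)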
def test_status(self, spec: spack.spec.Spec, externals: bool) -> Optional[TestStatus]:
"""Determine the overall test results status for the spec.
def test_status(self, spec: spack.spec.Spec, externals: bool) -> TestStatus:
"""Returns the overall test results status for the spec.
Args:
spec: instance of the spec under test
externals: ``True`` if externals are to be tested, else ``False``
Returns:
the spec's test status if available or ``None``
"""
tests_status_file = self.tested_file_for_spec(spec)
if not os.path.exists(tests_status_file):
@@ -981,109 +981,84 @@ def test_status(self, spec: spack.spec.Spec, externals: bool) -> Optional[TestSt
value = (f.read()).strip("\n")
return TestStatus(int(value)) if value else TestStatus.NO_TESTS
def ensure_stage(self):
def ensure_stage(self) -> None:
"""Ensure the test suite stage directory exists."""
if not os.path.exists(self.stage):
fs.mkdirp(self.stage)
@property
def stage(self):
"""The root test suite stage directory.
Returns:
str: the spec's test stage directory path
"""
def stage(self) -> Prefix:
"""The root test suite stage directory"""
if not self._stage:
self._stage = Prefix(fs.join_path(get_test_stage_dir(), self.content_hash))
return self._stage
@stage.setter
def stage(self, value):
def stage(self, value: Union[Prefix, str]) -> None:
"""Set the value of a non-default stage directory."""
self._stage = value if isinstance(value, Prefix) else Prefix(value)
@property
def results_file(self):
def results_file(self) -> Prefix:
"""The path to the results summary file."""
return self.stage.join(results_filename)
@classmethod
def test_pkg_id(cls, spec):
def test_pkg_id(cls, spec: Spec) -> str:
"""The standard install test package identifier.
Args:
spec: instance of the spec under test
Returns:
str: the install test package identifier
"""
return spec.format_path("{name}-{version}-{hash:7}")
@classmethod
def test_log_name(cls, spec):
def test_log_name(cls, spec: Spec) -> str:
"""The standard log filename for a spec.
Args:
spec (spack.spec.Spec): instance of the spec under test
Returns:
str: the spec's log filename
spec: instance of the spec under test
"""
return "%s-test-out.txt" % cls.test_pkg_id(spec)
return f"{cls.test_pkg_id(spec)}-test-out.txt"
def log_file_for_spec(self, spec):
def log_file_for_spec(self, spec: Spec) -> Prefix:
"""The test log file path for the provided spec.
Args:
spec (spack.spec.Spec): instance of the spec under test
Returns:
str: the path to the spec's log file
spec: instance of the spec under test
"""
return self.stage.join(self.test_log_name(spec))
def test_dir_for_spec(self, spec):
def test_dir_for_spec(self, spec: Spec) -> Prefix:
"""The path to the test stage directory for the provided spec.
Args:
spec (spack.spec.Spec): instance of the spec under test
Returns:
str: the spec's test stage directory path
spec: instance of the spec under test
"""
return Prefix(self.stage.join(self.test_pkg_id(spec)))
@classmethod
def tested_file_name(cls, spec):
def tested_file_name(cls, spec: Spec) -> str:
"""The standard test status filename for the spec.
Args:
spec (spack.spec.Spec): instance of the spec under test
Returns:
str: the spec's test status filename
spec: instance of the spec under test
"""
return "%s-tested.txt" % cls.test_pkg_id(spec)
def tested_file_for_spec(self, spec):
def tested_file_for_spec(self, spec: Spec) -> str:
"""The test status file path for the spec.
Args:
spec (spack.spec.Spec): instance of the spec under test
Returns:
str: the spec's test status file path
spec: instance of the spec under test
"""
return fs.join_path(self.stage, self.tested_file_name(spec))
@property
def current_test_cache_dir(self):
def current_test_cache_dir(self) -> str:
"""Path to the test stage directory where the current spec's cached
build-time files were automatically copied.
Returns:
str: path to the current spec's staged, cached build-time files.
Raises:
TestSuiteSpecError: If there is no spec being tested
"""
@@ -1095,13 +1070,10 @@ def current_test_cache_dir(self):
return self.test_dir_for_spec(base_spec).cache.join(test_spec.name)
@property
def current_test_data_dir(self):
def current_test_data_dir(self) -> str:
"""Path to the test stage directory where the current spec's custom
package (data) files were automatically copied.
Returns:
str: path to the current spec's staged, custom package (data) files
Raises:
TestSuiteSpecError: If there is no spec being tested
"""
@@ -1112,17 +1084,17 @@ def current_test_data_dir(self):
base_spec = self.current_base_spec
return self.test_dir_for_spec(base_spec).data.join(test_spec.name)
def write_test_result(self, spec, result):
def write_test_result(self, spec: Spec, result: TestStatus) -> None:
"""Write the spec's test result to the test suite results file.
Args:
spec (spack.spec.Spec): instance of the spec under test
result (str): result from the spec's test execution (e.g., PASSED)
spec: instance of the spec under test
result: result from the spec's test execution (e.g., PASSED)
"""
msg = f"{self.test_pkg_id(spec)} {result}"
_add_msg_to_file(self.results_file, msg)
def write_reproducibility_data(self):
def write_reproducibility_data(self) -> None:
for spec in self.specs:
repo_cache_path = self.stage.repo.join(spec.name)
spack.repo.PATH.dump_provenance(spec, repo_cache_path)
@@ -1167,12 +1139,12 @@ def from_dict(d):
return TestSuite(specs, alias)
@staticmethod
def from_file(filename):
def from_file(filename: str) -> "TestSuite":
"""Instantiate a TestSuite using the specs and optional alias
provided in the given file.
Args:
filename (str): The path to the JSON file containing the test
filename: The path to the JSON file containing the test
suite specs and optional alias.
Raises:

View File

@@ -20,6 +20,7 @@
import signal
import subprocess as sp
import sys
import tempfile
import traceback
import warnings
from typing import List, Tuple
@@ -41,6 +42,7 @@
import spack.paths
import spack.platforms
import spack.repo
import spack.solver.asp
import spack.spec
import spack.store
import spack.util.debug
@@ -871,8 +873,8 @@ def add_command_line_scopes(
"""
for i, path in enumerate(command_line_scopes):
name = f"cmd_scope_{i}"
scopes = ev.environment_path_scopes(name, path)
if scopes is None:
scope = ev.environment_path_scope(name, path)
if scope is None:
if os.path.isdir(path): # directory with config files
cfg.push_scope(
spack.config.DirectoryConfigScope(name, path, writable=False),
@@ -885,8 +887,7 @@ def add_command_line_scopes(
else:
raise spack.error.ConfigError(f"Invalid configuration scope: {path}")
for scope in scopes:
cfg.push_scope(scope, priority=ConfigScopePriority.CUSTOM)
cfg.push_scope(scope, priority=ConfigScopePriority.CUSTOM)
def _main(argv=None):
@@ -1047,6 +1048,10 @@ def main(argv=None):
try:
return _main(argv)
except spack.solver.asp.OutputDoesNotSatisfyInputError as e:
_handle_solver_bug(e)
return 1
except spack.error.SpackError as e:
tty.debug(e)
e.die() # gracefully die on any SpackErrors
@@ -1070,5 +1075,45 @@ def main(argv=None):
return 3
def _handle_solver_bug(
e: spack.solver.asp.OutputDoesNotSatisfyInputError, out=sys.stderr, root=None
) -> None:
# when the solver outputs specs that do not satisfy the input and spack is used as a command
# line tool, we dump the incorrect output specs to json so users can upload them in bug reports
wrong_output = [(input, output) for input, output in e.input_to_output if output is not None]
no_output = [input for input, output in e.input_to_output if output is None]
if no_output:
tty.error(
"internal solver error: the following specs were not solved:\n - "
+ "\n - ".join(str(s) for s in no_output),
stream=out,
)
if wrong_output:
msg = (
"internal solver error: the following specs were concretized, but do not satisfy the "
"input:\n - "
+ "\n - ".join(str(s) for s, _ in wrong_output)
+ "\n Please report a bug at https://github.com/spack/spack/issues"
)
# try to write the input/output specs to a temporary directory for bug reports
try:
tmpdir = tempfile.mkdtemp(prefix="spack-asp-", dir=root)
files = []
for i, (input, output) in enumerate(wrong_output, start=1):
in_file = os.path.join(tmpdir, f"input-{i}.json")
out_file = os.path.join(tmpdir, f"output-{i}.json")
files.append(in_file)
files.append(out_file)
with open(in_file, "w", encoding="utf-8") as f:
input.to_json(f)
with open(out_file, "w", encoding="utf-8") as f:
output.to_json(f)
msg += " and attach the following files:\n - " + "\n - ".join(files)
except Exception:
msg += "."
tty.error(msg, stream=out)
class SpackCommandError(Exception):
"""Raised when SpackCommand execution fails."""

View File

@@ -162,6 +162,7 @@ class tty:
configure: Executable
make_jobs: int
make: MakeExecutable
nmake: Executable
ninja: MakeExecutable
python_include: str
python_platlib: str

View File

@@ -28,7 +28,7 @@
import llnl.util.filesystem as fsys
import llnl.util.tty as tty
from llnl.util.lang import classproperty, memoized
from llnl.util.lang import ClassProperty, classproperty, memoized
import spack.config
import spack.dependency
@@ -701,10 +701,10 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
_verbose = None
#: Package homepage where users can find more information about the package
homepage: Optional[str] = None
homepage: ClassProperty[Optional[str]] = None
#: Default list URL (place to find available versions)
list_url: Optional[str] = None
list_url: ClassProperty[Optional[str]] = None
#: Link depth to which list_url should be searched for new versions
list_depth = 0
@@ -1821,7 +1821,7 @@ def _resource_stage(self, resource):
resource_stage_folder = "-".join(pieces)
return resource_stage_folder
def do_test(self, dirty=False, externals=False):
def do_test(self, *, dirty=False, externals=False, timeout: Optional[int] = None):
if self.test_requires_compiler and not any(
lang in self.spec for lang in ("c", "cxx", "fortran")
):
@@ -1839,7 +1839,7 @@ def do_test(self, dirty=False, externals=False):
"verbose": tty.is_verbose(),
}
self.tester.stand_alone_tests(kwargs)
self.tester.stand_alone_tests(kwargs, timeout=timeout)
def unit_test_check(self):
"""Hook for unit tests to assert things about package internals.

View File

@@ -100,7 +100,7 @@
"allow_sgid": {"type": "boolean"},
"install_status": {"type": "boolean"},
"binary_index_root": {"type": "string"},
"url_fetch_method": {"type": "string", "enum": ["urllib", "curl"]},
"url_fetch_method": {"type": "string", "pattern": r"^urllib$|^curl( .*)*"},
"additional_external_search_paths": {"type": "array", "items": {"type": "string"}},
"binary_index_ttl": {"type": "integer", "minimum": 0},
"aliases": {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}},

View File

@@ -287,9 +287,33 @@ def specify(spec):
return spack.spec.Spec(spec)
def remove_node(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunction]:
"""Transformation that removes all "node" and "virtual_node" from the input list of facts."""
return list(filter(lambda x: x.args[0] not in ("node", "virtual_node"), facts))
def remove_facts(
*to_be_removed: str,
) -> Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]:
"""Returns a transformation function that removes facts from the input list of facts."""
def _remove(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunction]:
return list(filter(lambda x: x.args[0] not in to_be_removed, facts))
return _remove
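Note: remove_facts() generalizes the old remove_node helper into a configurable filter factory. A sketch (the fact constructor `fn` and `input_spec` are assumed in scope):

drop = remove_facts("node", "virtual_node")
facts = [fn.attr("node", "zlib"), fn.attr("version", "zlib", "1.3.1")]
facts = drop(input_spec, facts)  # the spec argument is unused by the filter
# -> only the "version" fact survives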
def remove_build_deps(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunction]:
build_deps = {x.args[2]: x.args[1] for x in facts if x.args[0] == "depends_on"}
result = []
for x in facts:
current_name = x.args[1]
if current_name in build_deps:
x.name = "build_requirement"
result.append(fn.attr("build_requirement", build_deps[current_name], x))
continue
if x.args[0] == "depends_on":
continue
result.append(x)
return result
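To make the rewriting concrete, here is a stand-in sketch of the input and output fact shapes (Fn fakes AspFunction; the expected result, shown in comments, matches the build_requirement(...) terms used in the ASP program further below):

class Fn:
    """Minimal stand-in for AspFunction: a name plus an argument list."""

    def __init__(self, name, *args):
        self.name, self.args = name, list(args)

    def __repr__(self):
        return f"{self.name}({', '.join(map(repr, self.args))})"


def attr(*args):
    return Fn("attr", *args)


facts = [
    attr("depends_on", "pkg-a", "gcc"),           # build edge pkg-a -> gcc
    attr("node_version_satisfies", "gcc", "@9"),  # constraint on the build dep
    attr("node", "pkg-a"),
]
# remove_build_deps(spec, facts) keeps attr("node", "pkg-a"), drops the raw
# depends_on fact, and re-expresses the constraint on the build dependency as
#   attr("build_requirement", "pkg-a",
#        build_requirement("node_version_satisfies", "gcc", "@9"))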
def all_libcs() -> Set[spack.spec.Spec]:
@@ -1190,7 +1214,7 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
problem_repr += "\n" + f.read()
result = None
conc_cache_enabled = spack.config.get("config:concretization_cache:enable", True)
conc_cache_enabled = spack.config.get("config:concretization_cache:enable", False)
if conc_cache_enabled:
result, concretization_stats = CONC_CACHE.fetch(problem_repr)
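Since the default flips to off, the cache presumably has to be enabled explicitly; a sketch using the usual config API (the exact scope handling is an assumption):

import spack.config

# Re-enable the concretization cache that this change turns off by default.
spack.config.set("config:concretization_cache:enable", True)
assert spack.config.get("config:concretization_cache:enable", False)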
@@ -1287,12 +1311,8 @@ def on_model(model):
result.raise_if_unsat()
if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: the solver completed but produced specs"
" that do not satisfy the request. Please report a bug at "
f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
)
raise OutputDoesNotSatisfyInputError(result.unsolved_specs)
if conc_cache_enabled:
CONC_CACHE.store(problem_repr, result, self.control.statistics, test=setup.tests)
concretization_stats = self.control.statistics
@@ -1735,15 +1755,17 @@ def define_variant(
pkg_fact(fn.variant_condition(name, vid, cond_id))
# record type so we can construct the variant when we read it back in
self.gen.fact(fn.variant_type(vid, variant_def.variant_type.value))
self.gen.fact(fn.variant_type(vid, variant_def.variant_type.string))
if variant_def.sticky:
pkg_fact(fn.variant_sticky(vid))
# define defaults for this variant definition
defaults = variant_def.make_default().value if variant_def.multi else [variant_def.default]
for val in sorted(defaults):
pkg_fact(fn.variant_default_value_from_package_py(vid, val))
if variant_def.multi:
for val in sorted(variant_def.make_default().values):
pkg_fact(fn.variant_default_value_from_package_py(vid, val))
else:
pkg_fact(fn.variant_default_value_from_package_py(vid, variant_def.default))
# define possible values for this variant definition
values = variant_def.values
@@ -1771,7 +1793,9 @@ def define_variant(
# make a spec indicating whether the variant has this conditional value
variant_has_value = spack.spec.Spec()
variant_has_value.variants[name] = spack.variant.AbstractVariant(name, value.value)
variant_has_value.variants[name] = vt.VariantValue(
vt.VariantType.MULTI, name, (value.value,)
)
if value.when:
# the conditional value is always "possible", but it imposes its when condition as
@@ -1884,7 +1908,7 @@ def condition(
if not context:
context = ConditionContext()
context.transform_imposed = remove_node
context.transform_imposed = remove_facts("node", "virtual_node")
if imposed_spec:
imposed_name = imposed_spec.name or imposed_name
@@ -1984,7 +2008,7 @@ def track_dependencies(input_spec, requirements):
return requirements + [fn.attr("track_dependencies", input_spec.name)]
def dependency_holds(input_spec, requirements):
result = remove_node(input_spec, requirements) + [
result = remove_facts("node", "virtual_node")(input_spec, requirements) + [
fn.attr(
"dependency_holds", pkg.name, input_spec.name, dt.flag_to_string(t)
)
@@ -2174,7 +2198,10 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
pkg_name, ConstraintOrigin.REQUIRE
)
if not virtual:
context.transform_imposed = remove_node
context.transform_required = remove_build_deps
context.transform_imposed = remove_facts(
"node", "virtual_node", "depends_on"
)
# else: for virtuals we want to emit "node" and
# "virtual_node" in imposed specs
@@ -2236,16 +2263,18 @@ def external_packages(self):
if pkg_name not in self.pkgs:
continue
self.gen.h2(f"External package: {pkg_name}")
# Check if the external package is buildable. If it is
# not then "external(<pkg>)" is a fact, unless we can
# reuse an already installed spec.
external_buildable = data.get("buildable", True)
externals = data.get("externals", [])
if not external_buildable or externals:
self.gen.h2(f"External package: {pkg_name}")
if not external_buildable:
self.gen.fact(fn.buildable_false(pkg_name))
# Read a list of all the specs for this package
externals = data.get("externals", [])
candidate_specs = [
spack.spec.parse_with_version_concrete(x["spec"]) for x in externals
]
@@ -2334,6 +2363,8 @@ def preferred_variants(self, pkg_name):
if not preferred_variants:
return
self.gen.h2(f"Package preferences: {pkg_name}")
for variant_name in sorted(preferred_variants):
variant = preferred_variants[variant_name]
@@ -2346,7 +2377,7 @@ def preferred_variants(self, pkg_name):
)
continue
for value in variant.value_as_tuple:
for value in variant.values:
for variant_def in variant_defs:
self.variant_values_from_specs.add((pkg_name, id(variant_def), value))
self.gen.fact(
@@ -2464,7 +2495,7 @@ def _spec_clauses(
if variant.value == ("*",):
continue
for value in variant.value_as_tuple:
for value in variant.values:
# ensure that the value *can* be valid for the spec
if spec.name and not spec.concrete and not spack.repo.PATH.is_virtual(spec.name):
variant_defs = vt.prevalidate_variant_value(
@@ -2574,6 +2605,16 @@ def _spec_clauses(
# already-installed concrete specs.
if concrete_build_deps or dspec.depflag != dt.BUILD:
clauses.append(fn.attr("hash", dep.name, dep.dag_hash()))
elif not concrete_build_deps and dspec.depflag:
clauses.append(
fn.attr(
"concrete_build_dependency", spec.name, dep.name, dep.dag_hash()
)
)
for virtual_name in dspec.virtuals:
clauses.append(
fn.attr("virtual_on_build_edge", spec.name, dep.name, virtual_name)
)
# if the spec is abstract, descend into dependencies.
# if it's concrete, then the hashes above take care of dependency
@@ -2997,8 +3038,36 @@ def setup(
"""
reuse = reuse or []
check_packages_exist(specs)
self.gen = ProblemInstanceBuilder()
node_counter = create_counter(specs, tests=self.tests, possible_graph=self.possible_graph)
# Compute possible compilers first, so we can record which dependencies they might inject
_ = spack.compilers.config.all_compilers(init_config=True)
# Get compilers from buildcache only if injected through "reuse" specs
supported_compilers = spack.compilers.config.supported_compilers()
compilers_from_reuse = {
x for x in reuse if x.name in supported_compilers and not x.external
}
candidate_compilers, self.rejected_compilers = possible_compilers(
configuration=spack.config.CONFIG
)
for x in candidate_compilers:
if x.external or x in reuse:
continue
reuse.append(x)
for dep in x.traverse(root=False, deptype="run"):
reuse.extend(dep.traverse(deptype=("link", "run")))
candidate_compilers.update(compilers_from_reuse)
self.possible_compilers = list(candidate_compilers)
self.possible_compilers.sort() # type: ignore[call-overload]
self.gen.h1("Runtimes")
injected_dependencies = self.define_runtime_constraints()
node_counter = create_counter(
specs + injected_dependencies, tests=self.tests, possible_graph=self.possible_graph
)
self.possible_virtuals = node_counter.possible_virtuals()
self.pkgs = node_counter.possible_dependencies()
self.libcs = sorted(all_libcs()) # type: ignore[type-var]
@@ -3021,7 +3090,6 @@ def setup(
if node.namespace is not None:
self.explicitly_required_namespaces[node.name] = node.namespace
self.gen = ProblemInstanceBuilder()
self.gen.h1("Generic information")
if using_libc_compatibility():
for libc in self.libcs:
@@ -3050,27 +3118,6 @@ def setup(
specs = tuple(specs) # ensure compatible types to add
_ = spack.compilers.config.all_compilers(init_config=True)
# Get compilers from buildcache only if injected through "reuse" specs
supported_compilers = spack.compilers.config.supported_compilers()
compilers_from_reuse = {
x for x in reuse if x.name in supported_compilers and not x.external
}
candidate_compilers, self.rejected_compilers = possible_compilers(
configuration=spack.config.CONFIG
)
for x in candidate_compilers:
if x.external or x in reuse:
continue
reuse.append(x)
for dep in x.traverse(root=False, deptype="run"):
reuse.extend(dep.traverse(deptype=("link", "run")))
candidate_compilers.update(compilers_from_reuse)
self.possible_compilers = list(candidate_compilers)
self.possible_compilers.sort() # type: ignore[call-overload]
self.gen.h1("Reusable concrete specs")
self.define_concrete_input_specs(specs, self.pkgs)
if reuse:
@@ -3122,7 +3169,6 @@ def setup(
for pkg in sorted(self.pkgs):
self.gen.h2("Package rules: %s" % pkg)
self.pkg_rules(pkg, tests=self.tests)
self.gen.h2("Package preferences: %s" % pkg)
self.preferred_variants(pkg)
self.gen.h1("Special variants")
@@ -3142,9 +3188,6 @@ def setup(
self.gen.h1("Variant Values defined in specs")
self.define_variant_values()
self.gen.h1("Runtimes")
self.define_runtime_constraints()
self.gen.h1("Version Constraints")
self.collect_virtual_constraints()
self.define_version_constraints()
@@ -3178,8 +3221,10 @@ def visit(node):
path = os.path.join(parent_dir, "concretize.lp")
parse_files([path], visit)
def define_runtime_constraints(self):
"""Define the constraints to be imposed on the runtimes"""
def define_runtime_constraints(self) -> List[spack.spec.Spec]:
"""Define the constraints to be imposed on the runtimes, and returns a list of
injected packages.
"""
recorder = RuntimePropertyRecorder(self)
for compiler in self.possible_compilers:
@@ -3195,12 +3240,13 @@ def define_runtime_constraints(self):
# FIXME (compiler as nodes): think of using isinstance(compiler_cls, WrappedCompiler)
# Add a dependency on the compiler wrapper
recorder("*").depends_on(
"compiler-wrapper",
when=f"%{compiler.name}@{compiler.versions}",
type="build",
description=f"Add the compiler wrapper when using {compiler}",
)
for language in ("c", "cxx", "fortran"):
recorder("*").depends_on(
"compiler-wrapper",
when=f"%[virtuals={language}] {compiler.name}@{compiler.versions}",
type="build",
description=f"Add the compiler wrapper when using {compiler} for {language}",
)
if not using_libc_compatibility():
continue
@@ -3229,6 +3275,7 @@ def define_runtime_constraints(self):
)
recorder.consume_facts()
return sorted(recorder.injected_dependencies)
def literal_specs(self, specs):
for spec in sorted(specs):
@@ -3261,15 +3308,13 @@ def literal_specs(self, specs):
# These facts are needed to compute the "condition_set" of the root
pkg_name = clause.args[1]
self.gen.fact(fn.mentioned_in_literal(trigger_id, root_name, pkg_name))
elif clause_name == "depends_on":
pkg_name = clause.args[2]
self.gen.fact(fn.mentioned_in_literal(trigger_id, root_name, pkg_name))
requirements.append(
fn.attr(
"virtual_root" if spack.repo.PATH.is_virtual(spec.name) else "root", spec.name
)
)
requirements = [x for x in requirements if x.args[0] != "depends_on"]
cache[imposed_spec_key] = (effect_id, requirements)
self.gen.fact(fn.pkg_fact(spec.name, fn.condition_effect(condition_id, effect_id)))
@@ -3493,6 +3538,7 @@ def __init__(self, setup):
self._setup = setup
self.rules = []
self.runtime_conditions = set()
self.injected_dependencies = set()
# State of this object set in the __call__ method, and reset after
# each directive-like method
self.current_package = None
@@ -3531,6 +3577,7 @@ def depends_on(self, dependency_str: str, *, when: str, type: str, description:
if dependency_spec.versions != vn.any_version:
self._setup.version_constraints.add((dependency_spec.name, dependency_spec.versions))
self.injected_dependencies.add(dependency_spec)
body_str, node_variable = self.rule_body_from(when_spec)
head_clauses = self._setup.spec_clauses(dependency_spec, body=False)
@@ -3592,11 +3639,9 @@ def rule_body_from(self, when_spec: "spack.spec.Spec") -> Tuple[str, str]:
# (avoid adding virtuals everywhere, if a single edge needs it)
_, provider, virtual = clause.args
clause.args = "virtual_on_edge", node_placeholder, provider, virtual
body_str = (
f" {f',{os.linesep} '.join(str(x) for x in body_clauses)},\n"
f" not external({node_variable}),\n"
f" not runtime(Package)"
).replace(f'"{node_placeholder}"', f"{node_variable}")
body_str = ",\n".join(f" {x}" for x in body_clauses)
body_str += f",\n not external({node_variable})"
body_str = body_str.replace(f'"{node_placeholder}"', f"{node_variable}")
for old, replacement in when_substitutions.items():
body_str = body_str.replace(old, replacement)
return body_str, node_variable
@@ -3686,20 +3731,21 @@ def consume_facts(self):
"""Consume the facts collected by this object, and emits rules and
facts for the runtimes.
"""
self._setup.gen.h2("Runtimes: declarations")
runtime_pkgs = sorted(
{x.name for x in self.injected_dependencies if not spack.repo.PATH.is_virtual(x.name)}
)
for runtime_pkg in runtime_pkgs:
self._setup.gen.fact(fn.runtime(runtime_pkg))
self._setup.gen.newline()
self._setup.gen.h2("Runtimes: rules")
self._setup.gen.newline()
for rule in self.rules:
self._setup.gen.append(rule)
self._setup.gen.newline()
self._setup.gen.h2("Runtimes: conditions")
for runtime_pkg in spack.repo.PATH.packages_with_tags("runtime"):
self._setup.gen.fact(fn.runtime(runtime_pkg))
self._setup.gen.fact(fn.possible_in_link_run(runtime_pkg))
self._setup.gen.newline()
# Inject version rules for runtimes (versions are declared based
# on the available compilers)
self._setup.pkg_version_rules(runtime_pkg)
self._setup.gen.h2("Runtimes: requirements")
for imposed_spec, when_spec in sorted(self.runtime_conditions):
msg = f"{when_spec} requires {imposed_spec} at runtime"
_ = self._setup.condition(when_spec, imposed_spec=imposed_spec, msg=msg)
@@ -3786,13 +3832,13 @@ def node_os(self, node, os):
def node_target(self, node, target):
self._arch(node).target = target
def variant_selected(self, node, name, value, variant_type, variant_id):
def variant_selected(self, node, name: str, value: str, variant_type: str, variant_id):
spec = self._specs[node]
variant = spec.variants.get(name)
if not variant:
spec.variants[name] = vt.VariantType(variant_type).variant_class(name, value)
spec.variants[name] = vt.VariantValue.from_concretizer(name, value, variant_type)
else:
assert variant_type == vt.VariantType.MULTI.value, (
assert variant_type == "multi", (
f"Can't have multiple values for single-valued variant: "
f"{node}, {name}, {value}, {variant_type}, {variant_id}"
)
@@ -3834,7 +3880,7 @@ def virtual_on_edge(self, parent_node, provider_node, virtual):
provider_spec = self._specs[provider_node]
dependencies = [x for x in dependencies if id(x.spec) == id(provider_spec)]
assert len(dependencies) == 1, f"{virtual}: {provider_node.pkg}"
dependencies[0].update_virtuals((virtual,))
dependencies[0].update_virtuals(virtual)
def reorder_flags(self):
"""For each spec, determine the order of compiler flags applied to it.
@@ -4171,10 +4217,10 @@ def _inject_patches_variant(root: spack.spec.Spec) -> None:
continue
patches = list(spec_to_patches[id(spec)])
variant: vt.MultiValuedVariant = spec.variants.setdefault(
variant: vt.VariantValue = spec.variants.setdefault(
"patches", vt.MultiValuedVariant("patches", ())
)
variant.value = tuple(p.sha256 for p in patches)
variant.set(*(p.sha256 for p in patches))
# FIXME: Monkey patches variant to store patches order
ordered_hashes = [(*p.ordering_key, p.sha256) for p in patches if p.ordering_key]
ordered_hashes.sort()
@@ -4642,13 +4688,9 @@ def solve_in_rounds(
break
if not result.specs:
# This is also a problem: no specs were solved for, which
# means we would be in a loop if we tried again
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: a subset of input specs could not"
f" be solved for.\n\t{unsolved_str}"
)
# This is also a problem: no specs were solved for, which means we would be in a
# loop if we tried again
raise OutputDoesNotSatisfyInputError(result.unsolved_specs)
input_specs = list(x for (x, y) in result.unsolved_specs)
for spec in result.specs:
@@ -4678,6 +4720,19 @@ def __init__(self, msg):
self.constraint_type = None
class OutputDoesNotSatisfyInputError(InternalConcretizerError):
def __init__(
self, input_to_output: List[Tuple[spack.spec.Spec, Optional[spack.spec.Spec]]]
) -> None:
self.input_to_output = input_to_output
super().__init__(
"internal solver error: the solver completed but produced specs"
" that do not satisfy the request. Please report a bug at "
f"https://github.com/spack/spack/issues\n\t{Result.format_unsolved(input_to_output)}"
)
class SolverError(InternalConcretizerError):
"""For cases where the solver is unable to produce a solution.

View File

@@ -175,12 +175,23 @@ trigger_node(TriggerID, Node, Node) :-
% Since we trigger the existence of literal nodes from a condition, we need to construct the condition_set/2
mentioned_in_literal(Root, Mentioned) :- mentioned_in_literal(TriggerID, Root, Mentioned), solve_literal(TriggerID).
condition_set(node(min_dupe_id, Root), node(min_dupe_id, Root)) :- mentioned_in_literal(Root, Root).
literal_node(Root, node(min_dupe_id, Root)) :- mentioned_in_literal(Root, Root).
1 { condition_set(node(min_dupe_id, Root), node(0..Y-1, Mentioned)) : max_dupes(Mentioned, Y) } 1 :-
1 { literal_node(Root, node(0..Y-1, Mentioned)) : max_dupes(Mentioned, Y) } 1 :-
mentioned_in_literal(Root, Mentioned), Mentioned != Root,
internal_error("must have exactly one condition_set for literals").
1 { build_dependency_of_literal_node(LiteralNode, node(0..Y-1, BuildDependency)) : max_dupes(BuildDependency, Y) } 1 :-
literal_node(Root, LiteralNode),
build(LiteralNode),
attr("build_requirement", LiteralNode, build_requirement("node", BuildDependency)).
condition_set(node(min_dupe_id, Root), LiteralNode) :- literal_node(Root, LiteralNode).
condition_set(LiteralNode, BuildNode) :- build_dependency_of_literal_node(LiteralNode, BuildNode).
:- build_dependency_of_literal_node(LiteralNode, BuildNode),
not attr("depends_on", LiteralNode, BuildNode, "build").
% Discriminate between "roots" that have been explicitly requested, and roots that are deduced from "virtual roots"
explicitly_requested_root(node(min_dupe_id, Package)) :-
solve_literal(TriggerID),
@@ -472,10 +483,35 @@ provider(ProviderNode, VirtualNode) :- attr("provider_set", ProviderNode, Virtua
imposed_constraint(ID, "depends_on", A1, A2, A3),
internal_error("Build deps must land in exactly one duplicate").
1 { build_requirement(node(X, Parent), node(0..Y-1, BuildDependency)) : max_dupes(BuildDependency, Y) } 1
% If the parent is built, then we have a build_requirement on another node. For concrete nodes,
% or external nodes, we don't since we are trimming their build dependencies.
1 { attr("depends_on", node(X, Parent), node(0..Y-1, BuildDependency), "build") : max_dupes(BuildDependency, Y) } 1
:- attr("build_requirement", node(X, Parent), build_requirement("node", BuildDependency)),
impose(ID, node(X, Parent)),
imposed_constraint(ID,"build_requirement",Parent,_).
build(node(X, Parent)),
not external(node(X, Parent)).
:- attr("build_requirement", ParentNode, build_requirement("node", BuildDependency)),
concrete(ParentNode),
not attr("concrete_build_dependency", ParentNode, BuildDependency, _).
:- attr("build_requirement", ParentNode, build_requirement("node_version_satisfies", BuildDependency, Constraint)),
attr("concrete_build_dependency", ParentNode, BuildDependency, BuildDependencyHash),
not 1 { pkg_fact(BuildDependency, version_satisfies(Constraint, Version)) : hash_attr(BuildDependencyHash, "version", BuildDependency, Version) } 1.
:- attr("build_requirement", ParentNode, build_requirement("provider_set", BuildDependency, Virtual)),
attr("concrete_build_dependency", ParentNode, BuildDependency, BuildDependencyHash),
attr("virtual_on_build_edge", ParentNode, BuildDependency, Virtual),
not 1 { pkg_fact(BuildDependency, version_satisfies(Constraint, Version)) : hash_attr(BuildDependencyHash, "version", BuildDependency, Version) } 1.
% Asking for gcc@10 %gcc@9 shouldn't give us back an external gcc@10, just because of the hack
% we have on externals
:- attr("build_requirement", node(X, Parent), build_requirement("node", BuildDependency)),
Parent == BuildDependency,
external(node(X, Parent)).
build_requirement(node(X, Parent), node(Y, BuildDependency)) :-
attr("depends_on", node(X, Parent), node(Y, BuildDependency), "build"),
attr("build_requirement", node(X, Parent), build_requirement("node", BuildDependency)).
1 { virtual_build_requirement(ParentNode, node(0..Y-1, Virtual)) : max_dupes(Virtual, Y) } 1
:- attr("dependency_holds", ParentNode, Virtual, "build"),
@@ -496,7 +532,6 @@ attr("node_version_satisfies", node(X, BuildDependency), Constraint) :-
attr("build_requirement", ParentNode, build_requirement("node_version_satisfies", BuildDependency, Constraint)),
build_requirement(ParentNode, node(X, BuildDependency)).
attr("depends_on", node(X, Parent), node(Y, BuildDependency), "build") :- build_requirement(node(X, Parent), node(Y, BuildDependency)).
1 { attr("provider_set", node(X, BuildDependency), node(0..Y-1, Virtual)) : max_dupes(Virtual, Y) } 1 :-
attr("build_requirement", ParentNode, build_requirement("provider_set", BuildDependency, Virtual)),
@@ -882,6 +917,12 @@ requirement_weight(node(ID, Package), Group, W) :-
requirement_policy(Package, Group, "one_of"),
requirement_group_satisfied(node(ID, Package), Group).
{ attr("build_requirement", node(ID, Package), BuildRequirement) : condition_requirement(TriggerID, "build_requirement", Package, BuildRequirement) } :-
pkg_fact(Package, condition_trigger(ConditionID, TriggerID)),
requirement_group_member(ConditionID, Package, Group),
activate_requirement(node(ID, Package), Group),
requirement_group(Package, Group).
requirement_group_satisfied(node(ID, Package), X) :-
1 { condition_holds(Y, node(ID, Package)) : requirement_group_member(Y, Package, X) } ,
requirement_policy(Package, X, "any_of"),

View File

@@ -18,8 +18,6 @@
import spack.store
from spack.error import SpackError
RUNTIME_TAG = "runtime"
class PossibleGraph(NamedTuple):
real_pkgs: Set[str]
@@ -50,7 +48,8 @@ def possible_dependencies(
) -> PossibleGraph:
"""Returns the set of possible dependencies, and the set of possible virtuals.
Both sets always include runtime packages, which may be injected by compilers.
Runtime packages, which may be injected by compilers, need to be added to specs if
the dependency is not explicit in the package.py recipe.
Args:
transitive: return transitive dependencies if True, only direct dependencies if False
@@ -70,14 +69,9 @@ class NoStaticAnalysis(PossibleDependencyGraph):
def __init__(self, *, configuration: spack.config.Configuration, repo: spack.repo.RepoPath):
self.configuration = configuration
self.repo = repo
self.runtime_pkgs = set(self.repo.packages_with_tags(RUNTIME_TAG))
self.runtime_virtuals = set()
self._platform_condition = spack.spec.Spec(
f"platform={spack.platforms.host()} target={archspec.cpu.host().family}:"
)
for x in self.runtime_pkgs:
pkg_class = self.repo.get_pkg_class(x)
self.runtime_virtuals.update(pkg_class.provided_virtual_names())
try:
self.libc_pkgs = [x.name for x in self.providers_for("libc")]
@@ -91,8 +85,10 @@ def is_virtual(self, name: str) -> bool:
def is_allowed_on_this_platform(self, *, pkg_name: str) -> bool:
"""Returns true if a package is allowed on the current host"""
pkg_cls = self.repo.get_pkg_class(pkg_name)
no_condition = spack.spec.Spec()
for when_spec, conditions in pkg_cls.requirements.items():
if not when_spec.intersects(self._platform_condition):
# Restrict analysis to unconditional requirements
if when_spec != no_condition:
continue
for requirements, _, _ in conditions:
if not any(x.intersects(self._platform_condition) for x in requirements):
@@ -214,8 +210,6 @@ def possible_dependencies(
for root, children in edges.items():
real_packages.update(x for x in children if self._is_possible(pkg_name=x))
virtuals.update(self.runtime_virtuals)
real_packages = real_packages | self.runtime_pkgs
return PossibleGraph(real_pkgs=real_packages, virtuals=virtuals, edges=edges)
def _package_list(self, specs: Tuple[Union[spack.spec.Spec, str], ...]) -> List[str]:
@@ -470,7 +464,7 @@ def possible_packages_facts(self, gen, fn):
gen.fact(fn.max_dupes(package_name, 1))
gen.newline()
gen.h2("Packages with at multiple possible nodes (build-tools)")
gen.h2("Packages with multiple possible nodes (build-tools)")
default = spack.config.CONFIG.get("concretizer:duplicates:max_dupes:default", 2)
for package_name in sorted(self.possible_dependencies() & build_tools):
max_dupes = spack.config.CONFIG.get(

View File

@@ -111,22 +111,14 @@
__all__ = [
"CompilerSpec",
"Spec",
"SpecParseError",
"UnsupportedPropagationError",
"DuplicateDependencyError",
"DuplicateCompilerSpecError",
"UnsupportedCompilerError",
"DuplicateArchitectureError",
"InconsistentSpecError",
"InvalidDependencyError",
"NoProviderError",
"MultipleProviderError",
"UnsatisfiableSpecNameError",
"UnsatisfiableVersionSpecError",
"UnsatisfiableCompilerSpecError",
"UnsatisfiableCompilerFlagSpecError",
"UnsatisfiableArchitectureSpecError",
"UnsatisfiableProviderSpecError",
"UnsatisfiableDependencySpecError",
"AmbiguousHashError",
"InvalidHashError",
@@ -206,7 +198,7 @@ class InstallStatus(enum.Enum):
installed = "@g{[+]} "
upstream = "@g{[^]} "
external = "@g{[e]} "
external = "@M{[e]} "
absent = "@K{ - } "
missing = "@r{[-]} "
@@ -754,11 +746,17 @@ def update_deptypes(self, depflag: dt.DepFlag) -> bool:
self.depflag = new
return True
def update_virtuals(self, virtuals: Iterable[str]) -> bool:
def update_virtuals(self, virtuals: Union[str, Iterable[str]]) -> bool:
"""Update the list of provided virtuals"""
old = self.virtuals
self.virtuals = tuple(sorted(set(virtuals).union(self.virtuals)))
return old != self.virtuals
if isinstance(virtuals, str):
union = {virtuals, *self.virtuals}
else:
union = {*virtuals, *self.virtuals}
if len(union) == len(old):
return False
self.virtuals = tuple(sorted(union))
return True
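The explicit str branch matters because a bare string is itself an iterable, of characters; the old set(virtuals) would silently explode a single virtual name. A self-contained illustration:

# The pitfall the new overload avoids: strings iterate per character.
virtuals = "mpi"
assert set(virtuals) == {"m", "p", "i"}  # what the old code would have built
assert {virtuals} == {"mpi"}             # what the new code builds

# Hypothetical edge usage after the change: both forms are now safe.
# edge.update_virtuals("mpi")
# edge.update_virtuals(("mpi", "lapack"))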
def copy(self) -> "DependencySpec":
"""Return a copy of this edge"""
@@ -1022,7 +1020,7 @@ def select(
parent: Optional[str] = None,
child: Optional[str] = None,
depflag: dt.DepFlag = dt.ALL,
virtuals: Optional[Sequence[str]] = None,
virtuals: Optional[Union[str, Sequence[str]]] = None,
) -> List[DependencySpec]:
"""Selects a list of edges and returns them.
@@ -1041,7 +1039,7 @@ def select(
parent: name of the parent package
child: name of the child package
depflag: allowed dependency types in flag form
virtuals: list of virtuals on the edge
virtuals: list of virtuals or specific virtual on the edge
"""
if not depflag:
return []
@@ -1062,7 +1060,10 @@ def select(
# Filter by virtuals
if virtuals is not None:
selected = (dep for dep in selected if any(v in dep.virtuals for v in virtuals))
if isinstance(virtuals, str):
selected = (dep for dep in selected if virtuals in dep.virtuals)
else:
selected = (dep for dep in selected if any(v in dep.virtuals for v in virtuals))
return list(selected)
@@ -1587,7 +1588,11 @@ def _get_dependency(self, name):
return deps[0]
def edges_from_dependents(
self, name=None, depflag: dt.DepFlag = dt.ALL, *, virtuals: Optional[List[str]] = None
self,
name=None,
depflag: dt.DepFlag = dt.ALL,
*,
virtuals: Optional[Union[str, Sequence[str]]] = None,
) -> List[DependencySpec]:
"""Return a list of edges connecting this node in the DAG
to parents.
@@ -1602,7 +1607,11 @@ def edges_from_dependents(
]
def edges_to_dependencies(
self, name=None, depflag: dt.DepFlag = dt.ALL, *, virtuals: Optional[Sequence[str]] = None
self,
name=None,
depflag: dt.DepFlag = dt.ALL,
*,
virtuals: Optional[Union[str, Sequence[str]]] = None,
) -> List[DependencySpec]:
"""Returns a list of edges connecting this node in the DAG to children.
@@ -1644,7 +1653,7 @@ def dependencies(
name=None,
deptype: Union[dt.DepTypes, dt.DepFlag] = dt.ALL,
*,
virtuals: Optional[Sequence[str]] = None,
virtuals: Optional[Union[str, Sequence[str]]] = None,
) -> List["Spec"]:
"""Returns a list of direct dependencies (nodes in the DAG)
@@ -1689,18 +1698,19 @@ def _dependencies_dict(self, depflag: dt.DepFlag = dt.ALL):
result[key] = list(group)
return result
def _add_flag(self, name, value, propagate):
"""Called by the parser to add a known flag.
Known flags currently include "arch"
"""
def _add_flag(
self, name: str, value: Union[str, bool], propagate: bool, concrete: bool
) -> None:
"""Called by the parser to add a known flag"""
if propagate and name in vt.reserved_names:
if propagate and name in vt.RESERVED_NAMES:
raise UnsupportedPropagationError(
f"Propagation with '==' is not supported for '{name}'."
)
valid_flags = FlagMap.valid_compiler_flags()
if name == "arch" or name == "architecture":
assert type(value) is str, "architecture must have a string value"
parts = tuple(value.split("-"))
plat, os, tgt = parts if len(parts) == 3 else (None, None, value)
self._set_architecture(platform=plat, os=os, target=tgt)
@@ -1714,19 +1724,15 @@ def _add_flag(self, name, value, propagate):
self.namespace = value
elif name in valid_flags:
assert self.compiler_flags is not None
assert type(value) is str, f"{name} must have a string value"
flags_and_propagation = spack.compilers.flags.tokenize_flags(value, propagate)
flag_group = " ".join(x for (x, y) in flags_and_propagation)
for flag, propagation in flags_and_propagation:
self.compiler_flags.add_flag(name, flag, propagation, flag_group)
else:
# FIXME:
# All other flags represent variants. 'foo=true' and 'foo=false'
# map to '+foo' and '~foo' respectively. As such they need a
# BoolValuedVariant instance.
if str(value).upper() == "TRUE" or str(value).upper() == "FALSE":
self.variants[name] = vt.BoolValuedVariant(name, value, propagate)
else:
self.variants[name] = vt.AbstractVariant(name, value, propagate)
self.variants[name] = vt.VariantValue.from_string_or_bool(
name, value, propagate=propagate, concrete=concrete
)
def _set_architecture(self, **kwargs):
"""Called by the parser to set the architecture."""
@@ -2334,6 +2340,7 @@ def to_node_dict(self, hash=ht.dag_hash):
[v.name for v in self.variants.values() if v.propagate], flag_names
)
)
d["abstract"] = sorted(v.name for v in self.variants.values() if not v.concrete)
if self.external:
d["external"] = {
@@ -3007,9 +3014,8 @@ def ensure_valid_variants(spec):
# but are not necessarily recorded by the package's class
propagate_variants = [name for name, variant in spec.variants.items() if variant.propagate]
not_existing = set(spec.variants) - (
set(pkg_variants) | set(vt.reserved_names) | set(propagate_variants)
)
not_existing = set(spec.variants)
not_existing.difference_update(pkg_variants, vt.RESERVED_NAMES, propagate_variants)
if not_existing:
raise vt.UnknownVariantError(
@@ -3061,7 +3067,7 @@ def constrain(self, other, deps=True):
raise UnsatisfiableVersionSpecError(self.versions, other.versions)
for v in [x for x in other.variants if x in self.variants]:
if not self.variants[v].compatible(other.variants[v]):
if not self.variants[v].intersects(other.variants[v]):
raise vt.UnsatisfiableVariantSpecError(self.variants[v], other.variants[v])
sarch, oarch = self.architecture, other.architecture
@@ -4476,7 +4482,7 @@ def __init__(self, spec: Spec):
def __setitem__(self, name, vspec):
# Raise a TypeError if vspec is not of the right type
if not isinstance(vspec, vt.AbstractVariant):
if not isinstance(vspec, vt.VariantValue):
raise TypeError(
"VariantMap accepts only values of variant types "
f"[got {type(vspec).__name__} instead]"
@@ -4586,8 +4592,7 @@ def constrain(self, other: "VariantMap") -> bool:
changed = False
for k in other:
if k in self:
# If they are not compatible raise an error
if not self[k].compatible(other[k]):
if not self[k].intersects(other[k]):
raise vt.UnsatisfiableVariantSpecError(self[k], other[k])
# If they are compatible merge them
changed |= self[k].constrain(other[k])
@@ -4617,7 +4622,7 @@ def __str__(self):
bool_keys = []
kv_keys = []
for key in sorted_keys:
if isinstance(self[key].value, bool):
if self[key].type == vt.VariantType.BOOL:
bool_keys.append(key)
else:
kv_keys.append(key)
@@ -4650,9 +4655,10 @@ def substitute_abstract_variants(spec: Spec):
unknown = []
for name, v in spec.variants.items():
if name == "dev_path":
spec.variants.substitute(vt.SingleValuedVariant(name, v._original_value))
v.type = vt.VariantType.SINGLE
v.concrete = True
continue
elif name in vt.reserved_names:
elif name in vt.RESERVED_NAMES:
continue
variant_defs = spack.repo.PATH.get_pkg_class(spec.fullname).variant_definitions(name)
@@ -4673,7 +4679,7 @@ def substitute_abstract_variants(spec: Spec):
if rest:
continue
new_variant = pkg_variant.make_variant(v._original_value)
new_variant = pkg_variant.make_variant(*v.values)
pkg_variant.validate_or_raise(new_variant, spec.name)
spec.variants.substitute(new_variant)
@@ -4791,6 +4797,7 @@ def from_node_dict(cls, node):
spec.architecture = ArchSpec.from_dict(node)
propagated_names = node.get("propagate", [])
abstract_variants = set(node.get("abstract", ()))
for name, values in node.get("parameters", {}).items():
propagate = name in propagated_names
if name in _valid_compiler_flags:
@@ -4798,8 +4805,8 @@ def from_node_dict(cls, node):
for val in values:
spec.compiler_flags.add_flag(name, val, propagate)
else:
spec.variants[name] = vt.MultiValuedVariant.from_node_dict(
name, values, propagate=propagate
spec.variants[name] = vt.VariantValue.from_node_dict(
name, values, propagate=propagate, abstract=name in abstract_variants
)
spec.external_path = None
@@ -4824,7 +4831,7 @@ def from_node_dict(cls, node):
patches = node["patches"]
if len(patches) > 0:
mvar = spec.variants.setdefault("patches", vt.MultiValuedVariant("patches", ()))
mvar.value = patches
mvar.set(*patches)
# FIXME: Monkey patches mvar to store patches order
mvar._patches_in_order_of_appearance = patches
@@ -5149,25 +5156,6 @@ def eval_conditional(string):
return eval(string, valid_variables)
class SpecParseError(spack.error.SpecError):
"""Wrapper for ParseError for when we're parsing specs."""
def __init__(self, parse_error):
super().__init__(parse_error.message)
self.string = parse_error.string
self.pos = parse_error.pos
@property
def long_message(self):
return "\n".join(
[
" Encountered when parsing spec:",
" %s" % self.string,
" %s^" % (" " * self.pos),
]
)
class InvalidVariantForSpecError(spack.error.SpecError):
"""Raised when an invalid conditional variant is specified."""
@@ -5185,14 +5173,6 @@ class DuplicateDependencyError(spack.error.SpecError):
"""Raised when the same dependency occurs in a spec twice."""
class MultipleVersionError(spack.error.SpecError):
"""Raised when version constraints occur in a spec twice."""
class DuplicateCompilerSpecError(spack.error.SpecError):
"""Raised when the same compiler occurs in a spec twice."""
class UnsupportedCompilerError(spack.error.SpecError):
"""Raised when the user asks for a compiler spack doesn't know about."""
@@ -5201,11 +5181,6 @@ class DuplicateArchitectureError(spack.error.SpecError):
"""Raised when the same architecture occurs in a spec twice."""
class InconsistentSpecError(spack.error.SpecError):
"""Raised when two nodes in the same spec DAG have inconsistent
constraints."""
class InvalidDependencyError(spack.error.SpecError):
"""Raised when a dependency in a spec is not actually a dependency
of the package."""
@@ -5217,30 +5192,6 @@ def __init__(self, pkg, deps):
)
class NoProviderError(spack.error.SpecError):
"""Raised when there is no package that provides a particular
virtual dependency.
"""
def __init__(self, vpkg):
super().__init__("No providers found for virtual package: '%s'" % vpkg)
self.vpkg = vpkg
class MultipleProviderError(spack.error.SpecError):
"""Raised when there is no package that provides a particular
virtual dependency.
"""
def __init__(self, vpkg, providers):
"""Takes the name of the vpkg"""
super().__init__(
"Multiple providers found for '%s': %s" % (vpkg, [str(s) for s in providers])
)
self.vpkg = vpkg
self.providers = providers
class UnsatisfiableSpecNameError(spack.error.UnsatisfiableSpecError):
"""Raised when two specs aren't even for the same package."""
@@ -5255,20 +5206,6 @@ def __init__(self, provided, required):
super().__init__(provided, required, "version")
class UnsatisfiableCompilerSpecError(spack.error.UnsatisfiableSpecError):
"""Raised when a spec compiler conflicts with package constraints."""
def __init__(self, provided, required):
super().__init__(provided, required, "compiler")
class UnsatisfiableCompilerFlagSpecError(spack.error.UnsatisfiableSpecError):
"""Raised when a spec variant conflicts with package constraints."""
def __init__(self, provided, required):
super().__init__(provided, required, "compiler_flags")
class UnsatisfiableArchitectureSpecError(spack.error.UnsatisfiableSpecError):
"""Raised when a spec architecture conflicts with package constraints."""
@@ -5276,14 +5213,6 @@ def __init__(self, provided, required):
super().__init__(provided, required, "architecture")
class UnsatisfiableProviderSpecError(spack.error.UnsatisfiableSpecError):
"""Raised when a provider is supplied but constraints don't match
a vpkg requirement"""
def __init__(self, provided, required):
super().__init__(provided, required, "provider")
# TODO: get rid of this and be more specific about particular incompatible
# dep constraints
class UnsatisfiableDependencySpecError(spack.error.UnsatisfiableSpecError):

View File

@@ -62,7 +62,7 @@
import sys
import traceback
import warnings
from typing import Iterator, List, Optional, Tuple
from typing import Iterator, List, Optional, Tuple, Union
from llnl.util.tty import color
@@ -99,8 +99,7 @@
VERSION_RANGE = rf"(?:(?:{VERSION})?:(?:{VERSION}(?!\s*=))?)"
VERSION_LIST = rf"(?:{VERSION_RANGE}|{VERSION})(?:\s*,\s*(?:{VERSION_RANGE}|{VERSION}))*"
#: Regex with groups to use for splitting (optionally propagated) key-value pairs
SPLIT_KVP = re.compile(rf"^({NAME})(==?)(.*)$")
SPLIT_KVP = re.compile(rf"^({NAME})(:?==?)(.*)$")
#: Regex with groups to use for splitting %[virtuals=...] tokens
SPLIT_COMPILER_TOKEN = re.compile(rf"^%\[virtuals=({VALUE}|{QUOTED_VALUE})]\s*(.*)$")
@@ -135,8 +134,8 @@ class SpecTokens(TokenBase):
# Variants
PROPAGATED_BOOL_VARIANT = rf"(?:(?:\+\+|~~|--)\s*{NAME})"
BOOL_VARIANT = rf"(?:[~+-]\s*{NAME})"
PROPAGATED_KEY_VALUE_PAIR = rf"(?:{NAME}==(?:{VALUE}|{QUOTED_VALUE}))"
KEY_VALUE_PAIR = rf"(?:{NAME}=(?:{VALUE}|{QUOTED_VALUE}))"
PROPAGATED_KEY_VALUE_PAIR = rf"(?:{NAME}:?==(?:{VALUE}|{QUOTED_VALUE}))"
KEY_VALUE_PAIR = rf"(?:{NAME}:?=(?:{VALUE}|{QUOTED_VALUE}))"
# Compilers
COMPILER_AND_VERSION = rf"(?:%\s*(?:{NAME})(?:[\s]*)@\s*(?:{VERSION_LIST}))"
COMPILER = rf"(?:%\s*(?:{NAME}))"
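The :? added to SPLIT_KVP and the token regexes lets the parser accept name:=value and name:==value, where the trailing : marks the variant value as concrete. A simplified, self-contained demo of the split (the NAME subpattern here is a simplified stand-in for Spack's real one):

import re

SPLIT_KVP = re.compile(r"^([A-Za-z_][A-Za-z0-9_\-]*)(:?==?)(.*)$")

for token in ("foo=bar", "foo:=bar", "foo==bar", "foo:==bar"):
    name, op, value = SPLIT_KVP.match(token).groups()
    concrete = op.startswith(":")   # `:` means: treat the value as concrete
    propagated = op.endswith("==")  # `==` means: propagate to dependencies
    print(token, "->", name, value, concrete, propagated)
# foo=bar  -> abstract, not propagated    foo:=bar  -> concrete, not propagated
# foo==bar -> abstract, propagated        foo:==bar -> concrete, propagated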
@@ -370,10 +369,10 @@ def raise_parsing_error(string: str, cause: Optional[Exception] = None):
"""Raise a spec parsing error with token context."""
raise SpecParsingError(string, self.ctx.current_token, self.literal_str) from cause
def add_flag(name: str, value: str, propagate: bool):
def add_flag(name: str, value: Union[str, bool], propagate: bool, concrete: bool):
"""Wrapper around ``Spec._add_flag()`` that adds parser context to errors raised."""
try:
initial_spec._add_flag(name, value, propagate)
initial_spec._add_flag(name, value, propagate, concrete)
except Exception as e:
raise_parsing_error(str(e), e)
@@ -428,29 +427,34 @@ def warn_if_after_compiler(token: str):
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.BOOL_VARIANT):
name = self.ctx.current_token.value[1:].strip()
variant_value = self.ctx.current_token.value[0] == "+"
add_flag(self.ctx.current_token.value[1:].strip(), variant_value, propagate=False)
add_flag(name, variant_value, propagate=False, concrete=True)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.PROPAGATED_BOOL_VARIANT):
name = self.ctx.current_token.value[2:].strip()
variant_value = self.ctx.current_token.value[0:2] == "++"
add_flag(self.ctx.current_token.value[2:].strip(), variant_value, propagate=True)
add_flag(name, variant_value, propagate=True, concrete=True)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
assert match, "SPLIT_KVP and KEY_VALUE_PAIR do not agree."
name, value = self.ctx.current_token.value.split("=", maxsplit=1)
concrete = name.endswith(":")
if concrete:
name = name[:-1]
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=False)
add_flag(
name, strip_quotes_and_unescape(value), propagate=False, concrete=concrete
)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.PROPAGATED_KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
assert match, "SPLIT_KVP and PROPAGATED_KEY_VALUE_PAIR do not agree."
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=True)
name, value = self.ctx.current_token.value.split("==", maxsplit=1)
concrete = name.endswith(":")
if concrete:
name = name[:-1]
add_flag(name, strip_quotes_and_unescape(value), propagate=True, concrete=concrete)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.expect(SpecTokens.DAG_HASH):
@@ -509,7 +513,8 @@ def parse(self):
while True:
if self.ctx.accept(SpecTokens.KEY_VALUE_PAIR):
name, value = self.ctx.current_token.value.split("=", maxsplit=1)
name = name.strip("'\" ")
if name.endswith(":"):
name = name[:-1]
value = value.strip("'\" ").split(",")
attributes[name] = value
if name not in ("deptypes", "virtuals"):

View File

@@ -1,9 +1,12 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import multiprocessing
import os
import posixpath
import sys
from typing import Dict, Optional, Tuple
import pytest
@@ -777,3 +780,139 @@ def test_optimization_flags_are_using_node_target(default_mock_concretization, m
assert len(actions) == 1 and isinstance(actions[0], spack.util.environment.SetEnv)
assert actions[0].value == "-march=x86-64 -mtune=generic"
@pytest.mark.regression("49827")
@pytest.mark.parametrize(
"gcc_config,expected_rpaths",
[
(
"""\
gcc:
externals:
- spec: gcc@14.2.0 languages=c
prefix: /fake/path1
extra_attributes:
compilers:
c: /fake/path1
extra_rpaths:
- /extra/rpaths1
- /extra/rpaths2
""",
"/extra/rpaths1:/extra/rpaths2",
),
(
"""\
gcc:
externals:
- spec: gcc@14.2.0 languages=c
prefix: /fake/path1
extra_attributes:
compilers:
c: /fake/path1
""",
None,
),
],
)
@pytest.mark.not_on_windows("Windows doesn't use the compiler-wrapper")
def test_extra_rpaths_is_set(
working_env, mutable_config, mock_packages, gcc_config, expected_rpaths
):
"""Tests that using a compiler with an 'extra_rpaths' section will set the corresponding
SPACK_COMPILER_EXTRA_RPATHS variable for the wrapper.
"""
cfg_data = syaml.load_config(gcc_config)
spack.config.set("packages", cfg_data)
mpich = spack.concretize.concretize_one("mpich %gcc@14")
spack.build_environment.setup_package(mpich.package, dirty=False)
if expected_rpaths is not None:
assert os.environ["SPACK_COMPILER_EXTRA_RPATHS"] == expected_rpaths
else:
assert "SPACK_COMPILER_EXTRA_RPATHS" not in os.environ
class _TestProcess:
calls: Dict[str, int] = collections.defaultdict(int)
terminated = False
runtime = 0
def __init__(self, *, target, args):
self.alive = None
self.exitcode = 0
self._reset()
def start(self):
self.calls["start"] += 1
self.alive = True
def is_alive(self):
self.calls["is_alive"] += 1
return self.alive
def join(self, timeout: Optional[int] = None):
self.calls["join"] += 1
if timeout is not None and timeout > self.runtime:
self.alive = False
def terminate(self):
self.calls["terminate"] += 1
self._set_terminated()
self.alive = False
@classmethod
def _set_terminated(cls):
cls.terminated = True
@classmethod
def _reset(cls):
cls.calls.clear()
cls.terminated = False
class _TestPipe:
def close(self):
pass
def recv(self):
if _TestProcess.terminated is True:
return 1
return 0
def _pipe_fn(*, duplex: bool = False) -> Tuple[_TestPipe, _TestPipe]:
return _TestPipe(), _TestPipe()
@pytest.fixture()
def mock_build_process(monkeypatch):
monkeypatch.setattr(spack.build_environment, "BuildProcess", _TestProcess)
monkeypatch.setattr(multiprocessing, "Pipe", _pipe_fn)
def _factory(*, runtime: int):
_TestProcess.runtime = runtime
return _factory
@pytest.mark.parametrize(
"runtime,timeout,expected_result,expected_calls",
[
# execution time < timeout
(2, 5, 0, {"start": 1, "join": 1, "is_alive": 1}),
# execution time > timeout
(5, 2, 1, {"start": 1, "join": 2, "is_alive": 1, "terminate": 1}),
],
)
def test_build_process_timeout(
mock_build_process, runtime, timeout, expected_result, expected_calls
):
"""Tests that we make the correct function calls in different timeout scenarios."""
mock_build_process(runtime=runtime)
result = spack.build_environment.start_build_process(
pkg=None, function=None, kwargs={}, timeout=timeout
)
assert result == expected_result
assert _TestProcess.calls == expected_calls

View File

@@ -873,10 +873,6 @@ def test_push_to_build_cache(
ci.copy_stage_logs_to_artifacts(concrete_spec, str(logs_dir))
assert "spack-build-out.txt.gz" in os.listdir(logs_dir)
dl_dir = scratch / "download_dir"
buildcache_cmd("download", "--spec-file", json_path, "--path", str(dl_dir))
assert len(os.listdir(dl_dir)) == 2
def test_push_to_build_cache_exceptions(monkeypatch, tmp_path, capsys):
def push_or_raise(*args, **kwargs):

View File

@@ -62,7 +62,7 @@
(
["-t", "intel", "/test-intel"],
"test-intel",
[r"TestIntel(IntelPackage)", r"setup_environment"],
[r"TestIntel(IntelOneApiPackage)", r"setup_environment"],
),
(
["-t", "makefile", "/test-makefile"],

View File

@@ -38,10 +38,9 @@
(["--transitive", "--deptype=link", "dtbuild1"], {"dtlink2"}),
],
)
def test_direct_dependencies(cli_args, expected, mock_runtimes):
def test_direct_dependencies(cli_args, expected, mock_packages):
out = dependencies(*cli_args)
result = set(re.split(r"\s+", out.strip()))
expected.update(mock_runtimes)
assert expected == result

View File

@@ -24,8 +24,6 @@ def test_it_just_runs(pkg):
(
("mpi",),
[
"intel-mpi",
"intel-parallel-studio",
"mpich",
"mpilander",
"mvapich2",

View File

@@ -83,3 +83,12 @@ def tests_compiler_conversion_modules(mock_compiler):
compiler_spec = CompilerFactory.from_legacy_yaml(mock_compiler)[0]
assert compiler_spec.external
assert compiler_spec.external_modules == modules
@pytest.mark.regression("49717")
def tests_compiler_conversion_corrupted_paths(mock_compiler):
"""Tests that compiler entries with corrupted path do not raise"""
mock_compiler["paths"] = {"cc": "gcc", "cxx": "g++", "fc": "gfortran", "f77": "gfortran"}
# Test this call doesn't raise
compiler_spec = CompilerFactory.from_legacy_yaml(mock_compiler)
assert compiler_spec == []

View File

@@ -1831,10 +1831,7 @@ def test_solve_in_rounds_all_unsolved(self, monkeypatch, mock_packages):
monkeypatch.setattr(spack.solver.asp.Result, "unsolved_specs", simulate_unsolved_property)
monkeypatch.setattr(spack.solver.asp.Result, "specs", list())
with pytest.raises(
spack.solver.asp.InternalConcretizerError,
match="a subset of input specs could not be solved for",
):
with pytest.raises(spack.solver.asp.OutputDoesNotSatisfyInputError):
list(solver.solve_in_rounds(specs))
def test_coconcretize_reuse_and_virtuals(self):
@@ -3336,3 +3333,50 @@ def test_specifying_compilers_with_virtuals_syntax(default_mock_concretization):
assert mpich["fortran"].satisfies("gcc")
assert mpich["c"].satisfies("llvm")
assert mpich["cxx"].satisfies("llvm")
@pytest.mark.regression("49847")
@pytest.mark.xfail(sys.platform == "win32", reason="issues with install mockery")
def test_reuse_when_input_specifies_build_dep(install_mockery, do_not_check_runtimes_on_reuse):
"""Test that we can reuse a spec when specifying build dependencies in the input"""
pkgb_old = spack.concretize.concretize_one(spack.spec.Spec("pkg-b@0.9 %gcc@9"))
PackageInstaller([pkgb_old.package], fake=True, explicit=True).install()
with spack.config.override("concretizer:reuse", True):
result = spack.concretize.concretize_one("pkg-b %gcc")
assert pkgb_old.dag_hash() == result.dag_hash()
result = spack.concretize.concretize_one("pkg-a ^pkg-b %gcc@9")
assert pkgb_old.dag_hash() == result["pkg-b"].dag_hash()
assert result.satisfies("%gcc@9")
result = spack.concretize.concretize_one("pkg-a %gcc@10 ^pkg-b %gcc@9")
assert pkgb_old.dag_hash() == result["pkg-b"].dag_hash()
@pytest.mark.regression("49847")
def test_reuse_when_requiring_build_dep(
install_mockery, do_not_check_runtimes_on_reuse, mutable_config
):
"""Test that we can reuse a spec when specifying build dependencies in requirements"""
mutable_config.set("packages:all:require", "%gcc")
pkgb_old = spack.concretize.concretize_one(spack.spec.Spec("pkg-b@0.9"))
PackageInstaller([pkgb_old.package], fake=True, explicit=True).install()
with spack.config.override("concretizer:reuse", True):
result = spack.concretize.concretize_one("pkg-b")
assert pkgb_old.dag_hash() == result.dag_hash(), result.tree()
@pytest.mark.regression("50167")
def test_input_analysis_and_conditional_requirements(default_mock_concretization):
"""Tests that input analysis doesn't account for conditional requirement
to discard possible dependencies.
If the requirement is conditional, and impossible to achieve on the current
platform, the valid search space is still the complement of the condition that
activates the requirement.
"""
libceed = default_mock_concretization("libceed")
assert libceed["libxsmm"].satisfies("@main")
assert libceed["libxsmm"].satisfies("platform=test")

View File

@@ -2,11 +2,15 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from io import StringIO
import pytest
import spack.concretize
import spack.config
import spack.main
import spack.solver.asp
import spack.spec
version_error_messages = [
"Cannot satisfy",
@@ -60,3 +64,31 @@ def test_error_messages(error_messages, config_set, spec, mock_packages, mutable
for em in error_messages:
assert em in str(e.value)
def test_internal_error_handling_formatting(tmp_path):
log = StringIO()
input_to_output = [
(spack.spec.Spec("foo+x"), spack.spec.Spec("foo@=1.0~x")),
(spack.spec.Spec("bar+y"), spack.spec.Spec("x@=1.0~y")),
(spack.spec.Spec("baz+z"), None),
]
spack.main._handle_solver_bug(
spack.solver.asp.OutputDoesNotSatisfyInputError(input_to_output), root=tmp_path, out=log
)
output = log.getvalue()
assert "the following specs were not solved:\n - baz+z\n" in output
assert (
"the following specs were concretized, but do not satisfy the input:\n"
" - foo+x\n"
" - bar+y\n"
) in output
files = {f.name: str(f) for f in tmp_path.glob("spack-asp-*/*.json")}
assert {"input-1.json", "input-2.json", "output-1.json", "output-2.json"} == set(files.keys())
assert spack.spec.Spec.from_specfile(files["input-1.json"]) == spack.spec.Spec("foo+x")
assert spack.spec.Spec.from_specfile(files["input-2.json"]) == spack.spec.Spec("bar+y")
assert spack.spec.Spec.from_specfile(files["output-1.json"]) == spack.spec.Spec("foo@=1.0~x")
assert spack.spec.Spec.from_specfile(files["output-2.json"]) == spack.spec.Spec("x@=1.0~y")

View File

@@ -1239,3 +1239,68 @@ def test_virtual_requirement_respects_any_of(concretize_scope, mock_packages):
with pytest.raises(spack.error.SpackError):
spack.concretize.concretize_one("mpileaks ^[virtuals=mpi] zmpi")
@pytest.mark.parametrize(
"packages_yaml,expected_reuse,expected_contraints",
[
(
"""
packages:
all:
require:
- "%gcc"
""",
True,
# To minimize installed specs we reuse the pkg-b compiler, since the requirement allows it
["%gcc@9"],
),
(
"""
packages:
all:
require:
- "%gcc@10"
""",
False,
["%gcc@10"],
),
(
"""
packages:
all:
require:
- "%gcc"
pkg-a:
require:
- "%gcc@10"
""",
True,
["%gcc@10"],
),
],
)
@pytest.mark.regression("49847")
def test_requirements_on_compilers_and_reuse(
concretize_scope, mock_packages, packages_yaml, expected_reuse, expected_constraints
):
"""Tests that we can require compilers with `%` in configuration files, and still get reuse
of specs (even though reused specs have no build dependency in the ASP encoding).
"""
input_spec = "pkg-a"
reused_spec = spack.concretize.concretize_one("pkg-b@0.9 %gcc@9")
reused_nodes = list(reused_spec.traverse())
update_packages_config(packages_yaml)
root_specs = [Spec(input_spec)]
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, root_specs, reuse=reused_nodes)
pkga = result.specs[0]
is_pkgb_reused = pkga["pkg-b"].dag_hash() == reused_spec.dag_hash()
assert is_pkgb_reused == expected_reuse
for c in expected_constraints:
assert pkga.satisfies(c), pkga.tree()

View File

@@ -47,10 +47,10 @@ class Grads(AutotoolsPackage):
depends_on('readline')
depends_on('pkgconfig', type='build')
def setup_build_environment(self, env):
def setup_build_environment(self, env: EnvironmentModifications) -> None:
env.set('SUPPLIBS', '/')
def setup_run_environment(self, env):
def setup_run_environment(self, env: EnvironmentModifications) -> None:
env.set('GADDIR', self.prefix.data)
@run_after('install')

View File

@@ -517,7 +517,7 @@ class Llvm(CMakePackage, CudaPackage):
return (None, flags, None)
return (flags, None, None)
def setup_build_environment(self, env):
def setup_build_environment(self, env: EnvironmentModifications) -> None:
"""When using %clang, add only its ld.lld-$ver and/or ld.lld to our PATH"""
if self.compiler.name in ["clang", "apple-clang"]:
for lld in "ld.lld-{0}".format(self.compiler.version.version[0]), "ld.lld":
@@ -528,7 +528,7 @@ class Llvm(CMakePackage, CudaPackage):
os.symlink(bin, sym)
env.prepend_path("PATH", self.stage.path)
def setup_run_environment(self, env):
def setup_run_environment(self, env: EnvironmentModifications) -> None:
if "+clang" in self.spec:
env.set("CC", join_path(self.spec.prefix.bin, "clang"))
env.set("CXX", join_path(self.spec.prefix.bin, "clang++"))

View File

@@ -318,7 +318,7 @@ class Mfem(Package, CudaPackage, ROCmPackage):
patch('mfem-4.0.0-makefile-syntax-fix.patch', when='@4.0.0')
phases = ['configure', 'build', 'install']
def setup_build_environment(self, env):
def setup_build_environment(self, env: EnvironmentModifications) -> None:
env.unset('MFEM_DIR')
env.unset('MFEM_BUILD_DIR')

View File

@@ -281,7 +281,7 @@ class PyTorch(PythonPackage, CudaPackage):
"caffe2/CMakeLists.txt",
)
def setup_build_environment(self, env):
def setup_build_environment(self, env: EnvironmentModifications) -> None:
"""Set environment variables used to control the build.
PyTorch's ``setup.py`` is a thin wrapper around ``cmake``.

View File

@@ -440,7 +440,7 @@ class Trilinos(CMakePackage, CudaPackage):
url = "https://github.com/trilinos/Trilinos/archive/trilinos-release-{0}.tar.gz"
return url.format(version.dashed)
def setup_dependent_run_environment(self, env, dependent_spec):
def setup_dependent_run_environment(self, env: EnvironmentModifications, dependent_spec: Spec) -> None:
if "+cuda" in self.spec:
# currently Trilinos doesn't perform the memory fence so
# it relies on blocking CUDA kernel launch. This is needed
@@ -453,7 +453,7 @@ class Trilinos(CMakePackage, CudaPackage):
else:
self.spec.kokkos_cxx = spack_cxx
def setup_build_environment(self, env):
def setup_build_environment(self, env: EnvironmentModifications) -> None:
spec = self.spec
if "+cuda" in spec and "+wrapper" in spec:
if "+mpi" in spec:
@@ -847,7 +847,7 @@ class Trilinos(CMakePackage, CudaPackage):
)
filter_file(r"-lpytrilinos", "", "%s/Makefile.export.Trilinos" % self.prefix.include)
def setup_run_environment(self, env):
def setup_run_environment(self, env: EnvironmentModifications) -> None:
if "+exodus" in self.spec:
env.prepend_path("PYTHONPATH", self.prefix.lib)

View File

@@ -20,7 +20,7 @@
SpackEnvironmentViewError,
_error_on_nonempty_view_dir,
)
from spack.spec_list import UndefinedReferenceError
from spack.environment.list import UndefinedReferenceError
pytestmark = pytest.mark.not_on_windows("Envs are not supported on windows")
@@ -107,7 +107,8 @@ def test_env_change_spec_in_definition(tmp_path, mock_packages, mutable_mock_env
assert any(x.intersects("mpileaks@2.1%gcc") for x in e.user_specs)
e.change_existing_spec(spack.spec.Spec("mpileaks@2.2"), list_name="desired_specs")
with e:
e.change_existing_spec(spack.spec.Spec("mpileaks@2.2"), list_name="desired_specs")
e.write()
# Ensure changed specs are in memory
@@ -776,10 +777,8 @@ def test_env_with_include_def_missing(mutable_mock_env_path, mock_packages):
"""
)
e = ev.Environment(env_path)
with e:
with pytest.raises(UndefinedReferenceError, match=r"which does not appear"):
e.concretize()
with pytest.raises(UndefinedReferenceError, match=r"which is not defined"):
_ = ev.Environment(env_path)
@pytest.mark.regression("41292")

View File

@@ -2,6 +2,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from io import StringIO
import pytest
from spack import fetch_strategy
@@ -13,3 +15,136 @@ def test_fetchstrategy_bad_url_scheme():
with pytest.raises(ValueError):
fetcher = fetch_strategy.from_url_scheme("bogus-scheme://example.com/a/b/c") # noqa: F841
@pytest.mark.parametrize(
"expected,total_bytes",
[
(" 0.00 B", 0),
(" 999.00 B", 999),
(" 1.00 KB", 1000),
(" 2.05 KB", 2048),
(" 1.00 MB", 1e6),
(" 12.30 MB", 1.23e7),
(" 1.23 GB", 1.23e9),
(" 999.99 GB", 9.9999e11),
("5000.00 GB", 5e12),
],
)
def test_format_bytes(expected, total_bytes):
assert fetch_strategy._format_bytes(total_bytes) == expected
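The table above pins the helper down almost completely: decimal (1000-based) units, two decimals, a right-aligned field of width seven (so "5000.00 GB" exactly fills it and "999.00 B" gets one leading space), and GB as the largest unit. A sketch consistent with these cases, not the actual fetch_strategy._format_bytes:

    def format_bytes(total_bytes: float) -> str:
        # GB is the cap: very large values simply widen the field.
        for threshold, unit in ((1e9, "GB"), (1e6, "MB"), (1e3, "KB")):
            if total_bytes >= threshold:
                return f"{total_bytes / threshold:7.2f} {unit}"
        return f"{total_bytes:7.2f} B"

    assert format_bytes(2048) == "   2.05 KB"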
@pytest.mark.parametrize(
"expected,total_bytes,elapsed",
[
(" 0.0 B/s", 0, 0), # no time passed -- defaults to 1s.
(" 0.0 B/s", 0, 1),
(" 999.0 B/s", 999, 1),
(" 1.0 KB/s", 1000, 1),
(" 500.0 B/s", 1000, 2),
(" 2.0 KB/s", 2048, 1),
(" 1.0 MB/s", 1e6, 1),
(" 500.0 KB/s", 1e6, 2),
(" 12.3 MB/s", 1.23e7, 1),
(" 1.2 GB/s", 1.23e9, 1),
(" 999.9 GB/s", 9.999e11, 1),
("5000.0 GB/s", 5e12, 1),
],
)
def test_format_speed(expected, total_bytes, elapsed):
assert fetch_strategy._format_speed(total_bytes, elapsed) == expected
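Speeds follow the same scheme with one decimal and a "/s" suffix, and the first row shows that zero elapsed time is treated as one second. A sketch matching the table:

    def format_speed(total_bytes: float, elapsed: float) -> str:
        speed = total_bytes / (elapsed or 1)  # no time passed: assume 1s
        for threshold, unit in ((1e9, "GB/s"), (1e6, "MB/s"), (1e3, "KB/s")):
            if speed >= threshold:
                return f"{speed / threshold:6.1f} {unit}"
        return f"{speed:6.1f} B/s"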
def test_fetch_progress_unknown_size():
# time stamps in seconds, with 0.1s delta except 1.5 -> 1.55.
time_stamps = iter([1.0, 1.5, 1.55, 2.0, 3.0, 5.0, 5.5, 5.5])
progress = fetch_strategy.FetchProgress(total_bytes=None, get_time=lambda: next(time_stamps))
assert progress.start_time == 1.0
out = StringIO()
progress.advance(1000, out)
assert progress.last_printed == 1.5
progress.advance(50, out)
assert progress.last_printed == 1.5 # does not print, too early after last print
progress.advance(2000, out)
assert progress.last_printed == 2.0
progress.advance(3000, out)
assert progress.last_printed == 3.0
progress.advance(4000, out)
assert progress.last_printed == 5.0
progress.advance(4000, out)
assert progress.last_printed == 5.5
progress.print(final=True, out=out) # finalize download
outputs = [
"\r [ | ] 1.00 KB @ 2.0 KB/s",
"\r [ / ] 3.05 KB @ 3.0 KB/s",
"\r [ - ] 6.05 KB @ 3.0 KB/s",
"\r [ \\ ] 10.05 KB @ 2.5 KB/s", # have to escape \ here but is aligned in output
"\r [ | ] 14.05 KB @ 3.1 KB/s",
"\r [100%] 14.05 KB @ 3.1 KB/s\n", # final print: no spinner; newline
]
assert out.getvalue() == "".join(outputs)
def test_fetch_progress_known_size():
time_stamps = iter([1.0, 1.5, 3.0, 4.0, 4.0])
progress = fetch_strategy.FetchProgress(total_bytes=6000, get_time=lambda: next(time_stamps))
out = StringIO()
progress.advance(1000, out) # time 1.5
progress.advance(2000, out) # time 3.0
progress.advance(3000, out) # time 4.0
progress.print(final=True, out=out)
outputs = [
"\r [ 17%] 1.00 KB @ 2.0 KB/s",
"\r [ 50%] 3.00 KB @ 1.5 KB/s",
"\r [100%] 6.00 KB @ 2.0 KB/s",
"\r [100%] 6.00 KB @ 2.0 KB/s\n", # final print has newline
]
assert out.getvalue() == "".join(outputs)
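Together these two tests document the FetchProgress contract: construct it with an optional total and an injectable clock, feed advance() the byte count of each chunk (it throttles its own terminal updates), and finish with print(final=True). A plausible call pattern; the download loop, 'response', and 'local_file' are illustrative only:

    import sys
    import time

    progress = fetch_strategy.FetchProgress(total_bytes=None, get_time=time.monotonic)
    for chunk in iter(lambda: response.read(8192), b""):  # hypothetical stream
        local_file.write(chunk)
        progress.advance(len(chunk), sys.stdout)  # prints a rate-limited status line
    progress.print(final=True, out=sys.stdout)    # '[100%] ...' plus trailing newline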
def test_fetch_progress_disabled():
"""When disabled, FetchProgress shouldn't print anything when advanced"""
def get_time():
raise RuntimeError("Should not be called")
progress = fetch_strategy.FetchProgress(enabled=False, get_time=get_time)
out = StringIO()
progress.advance(1000, out)
progress.advance(2000, out)
progress.print(final=True, out=out)
assert progress.last_printed == 0
assert not out.getvalue()
@pytest.mark.parametrize(
"header,value,total_bytes",
[
("Content-Length", "1234", 1234),
("Content-Length", "0", 0),
("Content-Length", "-10", 0),
("Content-Length", "not a number", 0),
("Not-Content-Length", "1234", 0),
],
)
def test_fetch_progress_from_headers(header, value, total_bytes):
time_stamps = iter([1.0, 1.5, 3.0, 4.0, 4.0])
progress = fetch_strategy.FetchProgress.from_headers(
{header: value}, get_time=lambda: next(time_stamps), enabled=True
)
assert progress.total_bytes == total_bytes
assert progress.enabled
assert progress.start_time == 1.0
def test_fetch_progress_from_headers_disabled():
progress = fetch_strategy.FetchProgress.from_headers(
{"Content-Length": "1234"}, get_time=lambda: 1.0, enabled=False
)
assert not progress.enabled
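The parametrization spells out the header rule: a missing, non-numeric, or negative Content-Length degrades to a total of 0 (unknown size) rather than an error. Logic consistent with those five cases might look like this sketch (the real code is FetchProgress.from_headers):

    def total_from_headers(headers: dict) -> int:
        try:
            return max(0, int(headers.get("Content-Length", 0)))
        except ValueError:  # e.g. "not a number"
            return 0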

@@ -111,14 +111,13 @@ def mpi_names(mock_inspector):
("dtbuild1", {"allowed_deps": dt.LINK}, {"dtbuild1", "dtlink2"}),
],
)
-def test_possible_dependencies(pkg_name, fn_kwargs, expected, mock_runtimes, mock_inspector):
+def test_possible_dependencies(pkg_name, fn_kwargs, expected, mock_inspector):
"""Tests possible nodes of mpileaks, under different scenarios."""
-expected.update(mock_runtimes)
result, *_ = mock_inspector.possible_dependencies(pkg_name, **fn_kwargs)
assert expected == result
-def test_possible_dependencies_virtual(mock_inspector, mock_packages, mock_runtimes, mpi_names):
+def test_possible_dependencies_virtual(mock_inspector, mock_packages, mpi_names):
expected = set(mpi_names)
for name in mpi_names:
expected.update(
@@ -126,7 +125,6 @@ def test_possible_dependencies_virtual(mock_inspector, mock_packages, mock_runti
for dep in mock_packages.get_pkg_class(name).dependencies_by_name()
if not mock_packages.is_virtual(dep)
)
-expected.update(mock_runtimes)
expected.update(s.name for s in mock_packages.providers_for("c"))
real_pkgs, *_ = mock_inspector.possible_dependencies(
@@ -146,7 +144,6 @@ def test_possible_dependencies_with_multiple_classes(
pkgs = ["dt-diamond", "mpileaks"]
expected = set(mpileaks_possible_deps)
expected.update({"dt-diamond", "dt-diamond-left", "dt-diamond-right", "dt-diamond-bottom"})
-expected.update(mock_packages.packages_with_tags("runtime"))
real_pkgs, *_ = mock_inspector.possible_dependencies(*pkgs, allowed_deps=dt.ALL)
assert set(expected) == real_pkgs
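The thread running through this file: compiler runtimes are no longer unconditionally folded into every package's possible-dependency set, so the tests stop adding mock_runtimes (and the "runtime"-tagged packages) to their expectations. In other words, after this change a query like the one above behaves roughly as:

    result, *_ = mock_inspector.possible_dependencies("mpileaks", allowed_deps=dt.ALL)
    # runtime packages no longer appear unless something actually depends on them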

@@ -6,53 +6,56 @@
import pytest
import spack.concretize
+from spack.environment.list import SpecListParser
from spack.installer import PackageInstaller
from spack.spec import Spec
-from spack.spec_list import SpecList
+DEFAULT_EXPANSION = [
+"mpileaks",
+"zmpi@1.0",
+"mpich@3.0",
+{"matrix": [["hypre"], ["%gcc@4.5.0", "%clang@3.3"]]},
+"libelf",
+]
+DEFAULT_CONSTRAINTS = [
+[Spec("mpileaks")],
+[Spec("zmpi@1.0")],
+[Spec("mpich@3.0")],
+[Spec("hypre"), Spec("%gcc@4.5.0")],
+[Spec("hypre"), Spec("%clang@3.3")],
+[Spec("libelf")],
+]
+DEFAULT_SPECS = [
+Spec("mpileaks"),
+Spec("zmpi@1.0"),
+Spec("mpich@3.0"),
+Spec("hypre%gcc@4.5.0"),
+Spec("hypre%clang@3.3"),
+Spec("libelf"),
+]
+@pytest.fixture()
+def parser_and_speclist():
+"""Default configuration of parser and user spec list for tests"""
+parser = SpecListParser()
+parser.parse_definitions(
+data=[
+{"gccs": ["%gcc@4.5.0"]},
+{"clangs": ["%clang@3.3"]},
+{"mpis": ["zmpi@1.0", "mpich@3.0"]},
+]
+)
+result = parser.parse_user_specs(
+name="specs",
+yaml_list=["mpileaks", "$mpis", {"matrix": [["hypre"], ["$gccs", "$clangs"]]}, "libelf"],
+)
+return parser, result
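The fixture doubles as documentation for the two-step API that replaces passing a reference dict to SpecList: named definitions are registered once with parse_definitions, then any list using `$name` references is expanded with parse_user_specs. For instance, reusing the definitions above for a second, hypothetical list:

    extra = parser.parse_user_specs(name="extra_specs", yaml_list=["$mpis"])
    assert extra.specs == [Spec("zmpi@1.0"), Spec("mpich@3.0")]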
class TestSpecList:
-default_input = ["mpileaks", "$mpis", {"matrix": [["hypre"], ["$gccs", "$clangs"]]}, "libelf"]
-default_reference = {
-"gccs": SpecList("gccs", ["%gcc@4.5.0"]),
-"clangs": SpecList("clangs", ["%clang@3.3"]),
-"mpis": SpecList("mpis", ["zmpi@1.0", "mpich@3.0"]),
-}
-default_expansion = [
-"mpileaks",
-"zmpi@1.0",
-"mpich@3.0",
-{"matrix": [["hypre"], ["%gcc@4.5.0", "%clang@3.3"]]},
-"libelf",
-]
-default_constraints = [
-[Spec("mpileaks")],
-[Spec("zmpi@1.0")],
-[Spec("mpich@3.0")],
-[Spec("hypre"), Spec("%gcc@4.5.0")],
-[Spec("hypre"), Spec("%clang@3.3")],
-[Spec("libelf")],
-]
-default_specs = [
-Spec("mpileaks"),
-Spec("zmpi@1.0"),
-Spec("mpich@3.0"),
-Spec("hypre%gcc@4.5.0"),
-Spec("hypre%clang@3.3"),
-Spec("libelf"),
-]
-def test_spec_list_expansions(self):
-speclist = SpecList("specs", self.default_input, self.default_reference)
-assert speclist.specs_as_yaml_list == self.default_expansion
-assert speclist.specs_as_constraints == self.default_constraints
-assert speclist.specs == self.default_specs
@pytest.mark.regression("28749")
@pytest.mark.parametrize(
"specs,expected",
@@ -86,116 +89,87 @@ def test_spec_list_expansions(self):
],
)
def test_spec_list_constraint_ordering(self, specs, expected):
speclist = SpecList("specs", specs)
expected_specs = [Spec(x) for x in expected]
assert speclist.specs == expected_specs
result = SpecListParser().parse_user_specs(name="specs", yaml_list=specs)
assert result.specs == [Spec(x) for x in expected]
-def test_spec_list_add(self):
-speclist = SpecList("specs", self.default_input, self.default_reference)
+def test_mock_spec_list(self, parser_and_speclist):
+"""Tests expected properties on the default mock spec list"""
+parser, mock_list = parser_and_speclist
+assert mock_list.specs_as_yaml_list == DEFAULT_EXPANSION
+assert mock_list.specs_as_constraints == DEFAULT_CONSTRAINTS
+assert mock_list.specs == DEFAULT_SPECS
-assert speclist.specs_as_yaml_list == self.default_expansion
-assert speclist.specs_as_constraints == self.default_constraints
-assert speclist.specs == self.default_specs
+def test_spec_list_add(self, parser_and_speclist):
+parser, mock_list = parser_and_speclist
+mock_list.add("libdwarf")
-speclist.add("libdwarf")
+assert mock_list.specs_as_yaml_list == DEFAULT_EXPANSION + ["libdwarf"]
+assert mock_list.specs_as_constraints == DEFAULT_CONSTRAINTS + [[Spec("libdwarf")]]
+assert mock_list.specs == DEFAULT_SPECS + [Spec("libdwarf")]
-assert speclist.specs_as_yaml_list == self.default_expansion + ["libdwarf"]
-assert speclist.specs_as_constraints == self.default_constraints + [[Spec("libdwarf")]]
-assert speclist.specs == self.default_specs + [Spec("libdwarf")]
+def test_spec_list_remove(self, parser_and_speclist):
+parser, mock_list = parser_and_speclist
+mock_list.remove("libelf")
-def test_spec_list_remove(self):
-speclist = SpecList("specs", self.default_input, self.default_reference)
+assert mock_list.specs_as_yaml_list + ["libelf"] == DEFAULT_EXPANSION
+assert mock_list.specs_as_constraints + [[Spec("libelf")]] == DEFAULT_CONSTRAINTS
+assert mock_list.specs + [Spec("libelf")] == DEFAULT_SPECS
-assert speclist.specs_as_yaml_list == self.default_expansion
-assert speclist.specs_as_constraints == self.default_constraints
-assert speclist.specs == self.default_specs
-speclist.remove("libelf")
-assert speclist.specs_as_yaml_list + ["libelf"] == self.default_expansion
-assert speclist.specs_as_constraints + [[Spec("libelf")]] == self.default_constraints
-assert speclist.specs + [Spec("libelf")] == self.default_specs
-def test_spec_list_update_reference(self):
-speclist = SpecList("specs", self.default_input, self.default_reference)
-assert speclist.specs_as_yaml_list == self.default_expansion
-assert speclist.specs_as_constraints == self.default_constraints
-assert speclist.specs == self.default_specs
-new_mpis = SpecList("mpis", self.default_reference["mpis"].yaml_list)
-new_mpis.add("mpich@3.3")
-new_reference = self.default_reference.copy()
-new_reference["mpis"] = new_mpis
-speclist.update_reference(new_reference)
-expansion = list(self.default_expansion)
-expansion.insert(3, "mpich@3.3")
-constraints = list(self.default_constraints)
-constraints.insert(3, [Spec("mpich@3.3")])
-specs = list(self.default_specs)
-specs.insert(3, Spec("mpich@3.3"))
-assert speclist.specs_as_yaml_list == expansion
-assert speclist.specs_as_constraints == constraints
-assert speclist.specs == specs
-def test_spec_list_extension(self):
-speclist = SpecList("specs", self.default_input, self.default_reference)
-assert speclist.specs_as_yaml_list == self.default_expansion
-assert speclist.specs_as_constraints == self.default_constraints
-assert speclist.specs == self.default_specs
-new_ref = self.default_reference.copy()
-otherlist = SpecList("specs", ["zlib", {"matrix": [["callpath"], ["%intel@18"]]}], new_ref)
-speclist.extend(otherlist)
-assert speclist.specs_as_yaml_list == (
-self.default_expansion + otherlist.specs_as_yaml_list
+def test_spec_list_extension(self, parser_and_speclist):
+parser, mock_list = parser_and_speclist
+other_list = parser.parse_user_specs(
+name="specs", yaml_list=[{"matrix": [["callpath"], ["%intel@18"]]}]
)
-assert speclist.specs == self.default_specs + otherlist.specs
-assert speclist._reference is new_ref
+mock_list.extend(other_list)
+assert mock_list.specs_as_yaml_list == (DEFAULT_EXPANSION + other_list.specs_as_yaml_list)
+assert mock_list.specs == DEFAULT_SPECS + other_list.specs
+def test_spec_list_nested_matrices(self, parser_and_speclist):
+parser, _ = parser_and_speclist
-def test_spec_list_nested_matrices(self):
inner_matrix = [{"matrix": [["zlib", "libelf"], ["%gcc", "%intel"]]}]
outer_addition = ["+shared", "~shared"]
outer_matrix = [{"matrix": [inner_matrix, outer_addition]}]
-speclist = SpecList("specs", outer_matrix)
+result = parser.parse_user_specs(name="specs", yaml_list=outer_matrix)
expected_components = itertools.product(
["zlib", "libelf"], ["%gcc", "%intel"], ["+shared", "~shared"]
)
expected = [Spec(" ".join(combo)) for combo in expected_components]
-assert set(speclist.specs) == set(expected)
+assert set(result.specs) == set(expected)
@pytest.mark.regression("16897")
def test_spec_list_recursion_specs_as_constraints(self):
input = ["mpileaks", "$mpis", {"matrix": [["hypre"], ["$%gccs", "$%clangs"]]}, "libelf"]
-reference = {
-"gccs": SpecList("gccs", ["gcc@4.5.0"]),
-"clangs": SpecList("clangs", ["clang@3.3"]),
-"mpis": SpecList("mpis", ["zmpi@1.0", "mpich@3.0"]),
-}
-speclist = SpecList("specs", input, reference)
-assert speclist.specs_as_yaml_list == self.default_expansion
-assert speclist.specs_as_constraints == self.default_constraints
-assert speclist.specs == self.default_specs
-def test_spec_list_matrix_exclude(self, mock_packages):
-# Test on non-boolean variants for regression for #16841
-matrix = [
-{"matrix": [["multivalue-variant"], ["foo=bar", "foo=baz"]], "exclude": ["foo=bar"]}
+definitions = [
+{"gccs": ["gcc@4.5.0"]},
+{"clangs": ["clang@3.3"]},
+{"mpis": ["zmpi@1.0", "mpich@3.0"]},
]
-speclist = SpecList("specs", matrix)
-assert len(speclist.specs) == 1
+parser = SpecListParser()
+parser.parse_definitions(data=definitions)
+result = parser.parse_user_specs(name="specs", yaml_list=input)
+assert result.specs_as_yaml_list == DEFAULT_EXPANSION
+assert result.specs_as_constraints == DEFAULT_CONSTRAINTS
+assert result.specs == DEFAULT_SPECS
+@pytest.mark.regression("16841")
+def test_spec_list_matrix_exclude(self, mock_packages):
+parser = SpecListParser()
+result = parser.parse_user_specs(
+name="specs",
+yaml_list=[
+{
+"matrix": [["multivalue-variant"], ["foo=bar", "foo=baz"]],
+"exclude": ["foo=bar"],
+}
+],
+)
+assert len(result.specs) == 1
def test_spec_list_exclude_with_abstract_hashes(self, mock_packages, install_mockery):
# Put mpich in the database so it can be referred to by hash.
@@ -205,9 +179,10 @@ def test_spec_list_exclude_with_abstract_hashes(self, mock_packages, install_moc
# Create matrix and exclude +debug, which excludes the first mpich after its abstract hash
# is resolved.
-speclist = SpecList(
-"specs",
-[
+parser = SpecListParser()
+result = parser.parse_user_specs(
+name="specs",
+yaml_list=[
{
"matrix": [
["mpileaks"],
@@ -220,5 +195,5 @@ def test_spec_list_exclude_with_abstract_hashes(self, mock_packages, install_moc
)
# Ensure that only mpich~debug is selected, and that the assembled spec remains abstract.
-assert len(speclist.specs) == 1
-assert speclist.specs[0] == Spec(f"mpileaks ^callpath ^mpich/{mpich_2.dag_hash(5)}")
+assert len(result.specs) == 1
+assert result.specs[0] == Spec(f"mpileaks ^callpath ^mpich/{mpich_2.dag_hash(5)}")

@@ -626,43 +626,32 @@ def test_propagate_reserved_variant_names(self, spec_string):
with pytest.raises(spack.spec_parser.SpecParsingError, match="Propagation"):
Spec(spec_string)
-def test_unsatisfiable_multi_value_variant(self, default_mock_concretization):
+def test_multivalued_variant_1(self, default_mock_concretization):
# Semantics for a multi-valued variant is different
# Depending on whether the spec is concrete or not
-a = default_mock_concretization('multivalue-variant foo="bar"')
-spec_str = 'multivalue-variant foo="bar,baz"'
-b = Spec(spec_str)
+a = default_mock_concretization("multivalue-variant foo=bar")
+b = Spec("multivalue-variant foo=bar,baz")
assert not a.satisfies(b)
-assert not a.satisfies(spec_str)
# A concrete spec cannot be constrained further
with pytest.raises(UnsatisfiableSpecError):
a.constrain(b)
a = Spec('multivalue-variant foo="bar"')
spec_str = 'multivalue-variant foo="bar,baz"'
b = Spec(spec_str)
def test_multivalued_variant_2(self):
a = Spec("multivalue-variant foo=bar")
b = Spec("multivalue-variant foo=bar,baz")
# The specs are abstract and they **could** be constrained
-assert a.satisfies(b)
-assert a.satisfies(spec_str)
+assert b.satisfies(a) and not a.satisfies(b)
# An abstract spec can instead be constrained
assert a.constrain(b)
-a = default_mock_concretization('multivalue-variant foo="bar,baz"')
-spec_str = 'multivalue-variant foo="bar,baz,quux"'
-b = Spec(spec_str)
+def test_multivalued_variant_3(self, default_mock_concretization):
+a = default_mock_concretization("multivalue-variant foo=bar,baz")
+b = Spec("multivalue-variant foo=bar,baz,quux")
assert not a.satisfies(b)
-assert not a.satisfies(spec_str)
# A concrete spec cannot be constrained further
with pytest.raises(UnsatisfiableSpecError):
a.constrain(b)
a = Spec('multivalue-variant foo="bar,baz"')
spec_str = 'multivalue-variant foo="bar,baz,quux"'
b = Spec(spec_str)
def test_multivalued_variant_4(self):
a = Spec("multivalue-variant foo=bar,baz")
b = Spec("multivalue-variant foo=bar,baz,quux")
# The specs are abstract and they **could** be constrained
assert a.intersects(b)
-assert a.intersects(spec_str)
# An abstract spec can instead be constrained
assert a.constrain(b)
# ...but will fail during concretization if there are
@@ -670,15 +659,14 @@ def test_unsatisfiable_multi_value_variant(self, default_mock_concretization):
with pytest.raises(InvalidVariantValueError):
spack.concretize.concretize_one(a)
+def test_multivalued_variant_5(self):
# This time we'll try to set a single-valued variant
a = Spec('multivalue-variant fee="bar"')
spec_str = 'multivalue-variant fee="baz"'
b = Spec(spec_str)
a = Spec("multivalue-variant fee=bar")
b = Spec("multivalue-variant fee=baz")
# The specs are abstract and they **could** be constrained,
# as before concretization I don't know which type of variant
# I have (if it is not a BV)
assert a.intersects(b)
-assert a.intersects(spec_str)
# A variant cannot be parsed as single-valued until we try to
# concretize. This means that we can constrain the variant above
assert a.constrain(b)
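Summarizing the semantics these renamed tests carve up: for abstract specs a multi-valued foo=bar,baz means "at least these values", so the spec with more values satisfies the one with fewer and constrain() merges value sets; once concrete the set is exact, so it neither satisfies a larger set nor accepts further constraints. In shorthand, mirroring the asserts above:

    small, big = Spec("multivalue-variant foo=bar"), Spec("multivalue-variant foo=bar,baz")
    assert big.satisfies(small) and not small.satisfies(big)  # superset satisfies subset
    assert small.constrain(big)  # abstract: value sets merge, small becomes foo=bar,baz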
@@ -985,13 +973,10 @@ def test_spec_formatting_bad_formats(self, default_mock_concretization, fmt_str)
with pytest.raises(SpecFormatStringError):
spec.format(fmt_str)
-def test_combination_of_wildcard_or_none(self):
-# Test that using 'none' and another value raises
-with pytest.raises(spack.spec_parser.SpecParsingError, match="cannot be combined"):
-Spec("multivalue-variant foo=none,bar")
-# Test that using wildcard and another value raises
-with pytest.raises(spack.spec_parser.SpecParsingError, match="cannot be combined"):
+def test_wildcard_is_invalid_variant_value(self):
+"""The spec string x=* is parsed as a multi-valued variant with values the empty set.
+That excludes * as a literal variant value."""
+with pytest.raises(spack.spec_parser.SpecParsingError, match="cannot use reserved value"):
Spec("multivalue-variant foo=*,bar")
def test_errors_in_variant_directive(self):
@@ -1380,6 +1365,18 @@ def test_splice_swap_names_mismatch_virtuals(self, default_mock_concretization,
with pytest.raises(spack.spec.SpliceError, match="virtual"):
vt.splice(vh, transitive)
def test_adaptor_optflags(self):
"""Tests that we can obtain the list of optflags, and debugflags,
from the compiler adaptor, and that this list is taken from the
appropriate compiler package.
"""
# pkg-a depends on c, so only the gcc compiler should be chosen
spec = spack.concretize.concretize_one(Spec("pkg-a %gcc"))
assert "-Otestopt" in spec.package.compiler.opt_flags
# This is not set, make sure we get an empty list
for x in spec.package.compiler.debug_flags:
pass
def test_spec_override(self):
init_spec = Spec("pkg-a foo=baz foobar=baz cflags=-O3 cxxflags=-O1")
change_spec = Spec("pkg-a foo=fee cflags=-O2")
@@ -1960,6 +1957,35 @@ def test_edge_equality_does_not_depend_on_virtual_order():
assert tuple(sorted(edge2.virtuals)) == edge1.virtuals
def test_update_virtuals():
parent, child = Spec("parent"), Spec("child")
edge = DependencySpec(parent, child, depflag=0, virtuals=("mpi", "lapack"))
assert edge.update_virtuals("blas")
assert edge.virtuals == ("blas", "lapack", "mpi")
assert edge.update_virtuals(("c", "fortran", "mpi", "lapack"))
assert edge.virtuals == ("blas", "c", "fortran", "lapack", "mpi")
assert not edge.update_virtuals("mpi")
assert not edge.update_virtuals(("c", "fortran", "mpi", "lapack"))
assert edge.virtuals == ("blas", "c", "fortran", "lapack", "mpi")
def test_virtual_queries_work_for_strings_and_lists():
"""Ensure that ``dependencies()`` works with both virtuals=str and virtuals=[str, ...]."""
parent, child = Spec("parent"), Spec("child")
parent._add_dependency(
child, depflag=dt.BUILD, virtuals=("cxx", "fortran") # multi-char dep names
)
assert not parent.dependencies(virtuals="c") # not in virtuals but shares a char with cxx
for lang in ["cxx", "fortran"]:
assert parent.dependencies(virtuals=lang) # string arg
assert parent.edges_to_dependencies(virtuals=lang) # string arg
assert parent.dependencies(virtuals=[lang]) # list arg
assert parent.edges_to_dependencies(virtuals=[lang]) # string arg
def test_old_format_strings_trigger_error(default_mock_concretization):
s = spack.concretize.concretize_one("pkg-a")
with pytest.raises(SpecFormatStringError):

@@ -26,15 +26,9 @@
)
from spack.tokenize import Token
-FAIL_ON_WINDOWS = pytest.mark.xfail(
-sys.platform == "win32",
-raises=(SpecTokenizationError, spack.spec.InvalidHashError),
-reason="Unix style path on Windows",
-)
+SKIP_ON_WINDOWS = pytest.mark.skipif(sys.platform == "win32", reason="Unix style path on Windows")
-FAIL_ON_UNIX = pytest.mark.xfail(
-sys.platform != "win32", raises=SpecTokenizationError, reason="Windows style path on Unix"
-)
+SKIP_ON_UNIX = pytest.mark.skipif(sys.platform != "win32", reason="Windows style path on Unix")
def simple_package_name(name):
@@ -639,6 +633,23 @@ def _specfile_for(spec_str, filename):
],
"zlib %[virtuals=fortran] gcc@14.1 %[virtuals=c,cxx] clang",
),
# test := and :== syntax for key value pairs
(
"gcc languages:=c,c++",
[
Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, "gcc"),
Token(SpecTokens.KEY_VALUE_PAIR, "languages:=c,c++"),
],
"gcc languages:='c,c++'",
),
(
"gcc languages:==c,c++",
[
Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, "gcc"),
Token(SpecTokens.PROPAGATED_KEY_VALUE_PAIR, "languages:==c,c++"),
],
"gcc languages:=='c,c++'",
),
],
)
def test_parse_single_spec(spec_str, tokens, expected_roundtrip, mock_git_test_package):
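The two added cases exercise the concrete-value syntax: languages:=c,c++ tokenizes as a single KEY_VALUE_PAIR and languages:==c,c++ as its propagated twin, and the roundtrip column shows the comma-separated values coming back quoted. A hedged illustration of that roundtrip, assuming (as the test's third column suggests) that printing the parsed spec yields the quoted form:

    s = Spec("gcc languages:=c,c++")
    assert str(s) == "gcc languages:='c,c++'"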
@@ -1060,56 +1071,56 @@ def test_error_conditions(text, match_string):
[
# Specfile related errors
pytest.param(
"/bogus/path/libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_WINDOWS
"/bogus/path/libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_WINDOWS
),
pytest.param("../../libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_WINDOWS),
pytest.param("./libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_WINDOWS),
pytest.param("../../libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_WINDOWS),
pytest.param("./libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_WINDOWS),
pytest.param(
"libfoo ^/bogus/path/libdwarf.yaml",
spack.spec.NoSuchSpecFileError,
-marks=FAIL_ON_WINDOWS,
+marks=SKIP_ON_WINDOWS,
),
pytest.param(
"libfoo ^../../libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_WINDOWS
"libfoo ^../../libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_WINDOWS
),
pytest.param(
"libfoo ^./libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_WINDOWS
"libfoo ^./libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_WINDOWS
),
pytest.param(
"/bogus/path/libdwarf.yamlfoobar",
spack.spec.NoSuchSpecFileError,
-marks=FAIL_ON_WINDOWS,
+marks=SKIP_ON_WINDOWS,
),
pytest.param(
"libdwarf^/bogus/path/libelf.yamlfoobar ^/path/to/bogus.yaml",
spack.spec.NoSuchSpecFileError,
-marks=FAIL_ON_WINDOWS,
+marks=SKIP_ON_WINDOWS,
),
pytest.param(
"c:\\bogus\\path\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_UNIX
"c:\\bogus\\path\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_UNIX
),
pytest.param("..\\..\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_UNIX),
pytest.param(".\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_UNIX),
pytest.param("..\\..\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_UNIX),
pytest.param(".\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_UNIX),
pytest.param(
"libfoo ^c:\\bogus\\path\\libdwarf.yaml",
spack.spec.NoSuchSpecFileError,
-marks=FAIL_ON_UNIX,
+marks=SKIP_ON_UNIX,
),
pytest.param(
"libfoo ^..\\..\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_UNIX
"libfoo ^..\\..\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_UNIX
),
pytest.param(
"libfoo ^.\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_UNIX
"libfoo ^.\\libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=SKIP_ON_UNIX
),
pytest.param(
"c:\\bogus\\path\\libdwarf.yamlfoobar",
spack.spec.SpecFilenameError,
-marks=FAIL_ON_UNIX,
+marks=SKIP_ON_UNIX,
),
pytest.param(
"libdwarf^c:\\bogus\\path\\libelf.yamlfoobar ^c:\\path\\to\\bogus.yaml",
spack.spec.SpecFilenameError,
-marks=FAIL_ON_UNIX,
+marks=SKIP_ON_UNIX,
),
],
)
