Compare commits

374 Commits

Author SHA1 Message Date
Wouter Deconinck
7ad08213dc py-krb5: add v0.7.0 (#46799)
* py-krb5: add v0.7.0

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-10-05 21:24:26 +02:00
dependabot[bot]
dce3cf6f31 build(deps): bump docker/setup-buildx-action from 3.7.0 to 3.7.1 (#46796)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.7.0 to 3.7.1.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](8026d2bc36...c47758b77c)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-05 10:35:12 -05:00
Todd Gamblin
d763d6f738 gc: restrict to specific specs (#46790)
`spack gc` has so far been a global or environment-specific thing.
This adds the ability to restrict garbage collection to specific specs,
e.g. if you *just* want to get rid of all your unused python installations,
you could write:

```console
spack gc python
```

- [x] add `constraint` arg to `spack gc`
- [x] add a simple test

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-04 17:43:10 -07:00
Matt Thompson
d87464e995 mapl: add 2.49.0 (#46787)
* mapl: add 2.49.0
* Remove unneeded depends_on
2024-10-04 15:54:56 -07:00
Wouter Deconinck
280fd1cd68 py-rpds-py: add v0.18.1 (#46786) 2024-10-04 15:37:45 -06:00
Wouter Deconinck
bfbd0a4d4c py-cookiecutter: add v2.6.0 (fix CVE) (#46762)
* py-cookiecutter: add v2.6.0

* py-poyo: add v0.5.0

* py-poyo: fix style
2024-10-04 22:34:48 +02:00
Satish Balay
be21b0b3bf hdf5: use spack installed zlib (#46612)
Avoid hdf5 searching all search paths for ZLIB.cmake config files (including /usr/lib) before it looks for zlib without CMake config files, which is how Spack installs it
2024-10-04 22:55:46 +03:00
Wouter Deconinck
8f798c01ec freetype: add v2.13.3 (#46751) 2024-10-04 13:28:22 -06:00
Axel Huebl
853a7b2567 WarpX: 24.10 (#46763)
* WarpX: 24.10
   This updates WarpX and dependencies for the 24.10 release.
   New features:
   - EB runtime control: we can now compile with EB on by default,
     because it is not an incompatible binary option anymore
   - Catalyst2 support: AMReX/WarpX 24.09+ support Catalyst2 through
     the existing Conduit bindings
* Fix Typo in Variant
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Improve Python Dep Version Ranges
* Add Missing `-DWarpX_CATALYST`
* AMReX: Missing CMake Options for Vis

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-10-04 11:54:42 -07:00
Massimiliano Culpo
1756aeb45a re2c: rework package to use multiple build systems (#46748)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-10-04 20:48:58 +03:00
Alberto Sartori
7a9a634c94 justbuild: add version 1.3.2 (#46777) 2024-10-04 10:25:30 -07:00
downloadico
4caf718626 abinit: fixed "if" statement bug (#46785)
fixed "if" statement when selecting the linear algebra flavor.
2024-10-04 10:24:06 -07:00
Manuela Kuhn
5867d90ccf py-rsatoolbox: add v0.2.0 (#46780) 2024-10-04 19:01:15 +02:00
Adrien Bernede
8a9bfd162d raja-perf: Fix support for SYCL (#46782) 2024-10-04 17:49:01 +02:00
Mikael Simberg
7bc1316ce4 mold: add 2.34.1 (#46783) 2024-10-04 09:45:36 -06:00
Manuela Kuhn
c9d312e9c1 laynii: add new package (#46781) 2024-10-04 17:42:26 +02:00
Manuela Kuhn
792b576224 py-glmsingle: Add v1.2 (#46776) 2024-10-04 17:40:40 +02:00
Mikael Simberg
5c66f6b994 pika: add 0.29.0 (#46778) 2024-10-04 09:26:57 -06:00
Harmen Stoppels
f17e76e9d8 clingo: fix build with Python 3.13+ (#46775)
Python 3.9 deprecated `PyEval_InitThreads` and kept it as a no-op, and
Python 3.13 deleted the function.

The patch removes calls to it.
2024-10-04 13:26:15 +02:00
Massimiliano Culpo
905e7b9b45 Separate bootstrapping tests for Windows from other platforms (#46750)
This allows us to keep the workflow file tidier and avoid
using indirections to perform platform-specific operations.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-10-04 10:35:07 +02:00
Mikael Simberg
ad294bc19f umpire: re-instate lost depend on camp ~rocm when umpire itself has ~rocm (#46747) 2024-10-04 01:59:28 -06:00
Satish Balay
8dd978ddb9 py-mpi4py: add v4.0.0 (#46652)
* py-mpi4py: add v4.0.0
* sensei: update mpi4py dependency
   build with py-mpi4py@4.0.0 due to fatal no such file or directory error
* petsc4py: update license, and remove C++/Fortran dependency
2024-10-04 00:42:07 -06:00
Vicente Bolea
b0c48b66c2 paraview: add new v5.13.1 release (#46695) 2024-10-04 00:30:03 -06:00
dependabot[bot]
96880e0b65 build(deps): bump docker/setup-buildx-action from 3.6.1 to 3.7.0 (#46768)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.6.1 to 3.7.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](988b5a0280...8026d2bc36)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-04 07:26:49 +02:00
Wouter Deconinck
48de9d48e2 armadillo: add v12.8.4, v14.0.2 (#46761)
* armadillo: add v12.8.4, v14.0.2
* armadillo: fix git

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-10-03 19:51:09 -06:00
Teague Sterling
72581ded8f duckdb: add v1.1.1, remove 0.7.1->0.8.1, update variants (#46743)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-10-03 19:02:51 -06:00
Harmen Stoppels
8c4ff56d9f update CHANGELOG.md (#46758) 2024-10-03 18:01:46 -07:00
Satish Balay
d2a551c047 hiop: add v1.1.0 (#46764) 2024-10-03 18:32:40 -06:00
Adam J. Stewart
9dfe096d3a py-keras: add v3.6 (#46767) 2024-10-03 18:20:22 -06:00
Teague Sterling
62f5b18491 at-spi2-core: add v2.52.0,2.54.0 (#46769)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-10-03 17:15:56 -07:00
Wouter Deconinck
8f5af6eb7a doxygen: add v1.12.0 (#46752) 2024-10-03 19:59:35 -04:00
Wouter Deconinck
d3a0904790 catch2: add v3.7.1 (#46759) 2024-10-03 17:44:18 -06:00
Todd Gamblin
3c2a682876 Better docs and typing for _setup_pkg_and_run (#46709)
There was a bit of mystery surrounding the arguments for `_setup_pkg_and_run`. It passes
two file descriptors for handling `gmake`'s job server in child processes, but they are
unused in the method.

It turns out that there are good reasons to do this -- depending on the multiprocessing
backend, these file descriptors may be closed in the child if they're not passed
directly to it.
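A minimal sketch of why this matters (hypothetical names, assuming a POSIX `fork` start method; this is not Spack's actual code):

```python
import multiprocessing
import os


def child_build(jobserver_read_fd: int, jobserver_write_fd: int) -> None:
    # Stand-in for `_setup_pkg_and_run`: the child reaches gmake's jobserver
    # pipe ends because they were handed to it explicitly as arguments.
    os.write(jobserver_write_fd, b"+")
    print("child sees fds:", jobserver_read_fd, jobserver_write_fd)


if __name__ == "__main__":
    read_fd, write_fd = os.pipe()
    # Passing the descriptors directly keeps them referenced for the child;
    # depending on the multiprocessing backend, unreferenced descriptors may
    # not be available there.
    ctx = multiprocessing.get_context("fork")
    proc = ctx.Process(target=child_build, args=(read_fd, write_fd))
    proc.start()
    proc.join()
```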

- [x] Document all arguments to `_setup_pkg_and_run`.
- [x] Add type hints for `_setup_pkg_and_run`.
- [x] Refactor exception handling in `_setup_pkg_and_run` so it's easier to add type
      hints. `exc_info()` was problematic because it *can* return `None` (just not
      in the context where it's used).  `mypy` was too dumb to notice this.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-03 14:36:53 -07:00
Niklas Bölter
a52ec2a9cc Update some package homepage URLS (#46683)
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-10-03 15:00:53 -06:00
Alec Scott
43a9c6cb66 gh: add v2.58.0, drop GoBuilder (#46720)
* gh: drop GoBuilder now that GoPackage supports legacy attributes

* gh: add v2.58.0
2024-10-03 09:00:57 -07:00
m-shunji
4cace0cb62 libxsmm: fix for aarch64 environment (#42312)
Co-authored-by: inada-yoshie <inada.yoshie@fujitsu.com>
2024-10-03 07:30:06 -06:00
kwryankrattiger
482e2fbde8 Noop jobs should do less work (#46732) 2024-10-03 11:59:27 +00:00
Ashim Mahara
ad75e8fc95 py-igv-notebook: new package (#46723) 2024-10-03 05:44:16 -06:00
Wouter Deconinck
af43f6be0d apptainer: add v1.3.4 (#46730) 2024-10-03 05:36:33 -06:00
AcriusWinter
76243bfcd7 Stand-alone testing: remove deprecated methods (#45752)
* Stand-alone testing: remove deprecated methods
* audit: replace deprecated test method checks for test callbacks (AcriusWinter)
* install_test: replace deprecated with new test method name starts (AcriusWinter)
* package_base: removed deprecated test methods (AcriusWinter)
* test/package_class: remove deprecated test methods (AcriusWinter)
* test/reporters: remove deprecated test methods (AcriusWinter)
* reporters/extract: remove deprecated test-related methods (AcriusWinter)
* py-test-callback: rename test in callbacks and output (AcriusWinter)
* reporters/extract: remove deprecated expected failure output capture (AcriusWinter)
* stand-alone test cleanup: f-string, remove deprecation warning, etc. (AcriusWinter)
* stand-alone tests: f-string fix and remove unused imports (AcriusWinter)
* package_base: finish removing the rest of deprecated run_test method (AcriusWinter)
* py-test-callback: remove stand-alone test method (AcriusWinter)
* package_base: remove now unused imports (AcriusWinter)
* Python: test_imports replaces test (tldahlgren)
* mptensor, trivial-smoke-test: replace deprecated cache_extra_test_sources method (tldahlgren)
* trivial-smoke-test: replace deprecated install_test_root method (tldahlgren)
* docs: update perl and scons package's test methods (tldahlgren)
* builder: remove deprecated test() method (tldahlgren)
* Update legion and mfem test data for stand-alone tests (tldahlgren)
* py-test-callback: restore the test method
* Audit/stand-alone testing: Fix and added stand-alone test audit checks
   - fix audit failure message for build-time test callback check
   - remove empty test method check during stand-alone testing
   - change build-time test callback check to package_properties
   - add test method docstring audit check and mock fail-test-audit-docstring pkg
   - add test method implementation audit check and mock fail-test-audit-impl pkg
   - add new mock packages to test_package_audits and test_test_list_all
* audits: loosen docstring content constraints
* Add missing test method docstring
* py-test-callback: resolve builder issues
* Audits: Add checks for use of deprecated stand-alone test methods; improve output for other test-related checks
* Fix style issues
* test_test_list_all: add new fail-test-audit-deprecated package

---------

Signed-off-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-10-03 11:36:18 +00:00
Adam J. Stewart
feabcb884a py-torchaudio: add support for CUDA 12.5+ (#46746) 2024-10-03 13:16:40 +02:00
Daniel Beltrán
1c065f2da9 ambertools: new package (#42684)
* new builtin package: ambertools

* fixes for the style test

* yet more changes for the style test

* hope this is the last fix for the style test

* netlib-xblas is a dependency, it needs a depends_on("m4", type="build")

* ambertools: Add new setuptool dependency, limit python to <= 3.10 (does not build with 3.11+)

---------

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-10-03 05:13:46 -06:00
Teague Sterling
401e7b4477 vep: new package (#44522)
* Adding the perl-bio-db-bigfile package

* Adding perl-bio-ensembl package

* Adding perl-bio-ensembl-variation package

* Adding the perl-bio-ensembl-funcgen package

* Adding perl-bio-ensembl-io package

* Fixing checksums

* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies

* rollback

* Update package.py

* Update package.py

* vep: add v110,v111,v112

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* Reverting variants

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* Rename package.py to package.py

* Update package.py

* Update package.py

* Update package.py

* Fix variant installation and dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Removing unneeded dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2024-10-03 11:07:54 +01:00
Mikael Simberg
1b0020f3ee pika: remove git dependency (#46745) 2024-10-03 03:01:45 -06:00
Mark W. Krentel
d21577803f hpcviewer: add version 2024.09 (#46744) 2024-10-03 09:58:40 +02:00
Wouter Deconinck
d3518f866b ospray: add v3.2.0 (and dependencies) (#46676) 2024-10-02 22:48:21 -06:00
Adam J. Stewart
93bf5b302d py-pyproj: add v3.7.0 (#46690) 2024-10-02 22:40:22 -06:00
Wouter Deconinck
6e0efdff61 py-jinja2: add v3.1.4 (fix CVE) (#46668)
* py-jinja2: add v3.1.4 (fix CVE)

* py-jinja2: fix py-flit-core upper limit

* py-jinja2: url_for_version due to lowercase jinja2
2024-10-02 22:32:27 -06:00
Wouter Deconinck
95f16f203a dcmtk: add v3.6.8 (fix CVE) (#46627)
* dcmtk: add v3.6.8 (fix CVE)

* dcmtk: apply tiff patch to new version too
* dcmtk: apply tiff patch with a closed version range since the fix is in upstream master
2024-10-02 22:08:17 -06:00
Todd Gamblin
322a83c808 style: fix black configuration (#46740)
We mostly use `spack style` and `spack style --fix`, but it's nice to also be able to
run plain old `black .` in the repo.

- [x] Fix includes and excludes `pyproject.toml` so that we *only* cover files we expect
  to be blackened.

Note that `spack style` is still likely the better way to go, because it looks at `git
status` and tells black to only check files that changed from `develop`. `black` with
`pyproject.toml` won't do that. Of course, you can always manually specify which files
you want blackened.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-02 20:22:54 -06:00
Tapish Narwal
aa53007f82 alpaka: add v0.9.0, v1.0.0, v1.1.0, v1.2.0 (#46607) 2024-10-03 04:03:30 +02:00
Juan Miguel Carceller
12fd940e81 edm4hep: fix concretization by adding a when inside the conditional (#46710) 2024-10-03 02:57:11 +02:00
Alec Scott
e193320ebb yq: add v4.44.3, fix go dependency constraints (#46738) 2024-10-03 02:27:21 +02:00
Matt Thompson
77839303ca mepo: add 2.1.0 (#46726) 2024-10-02 18:17:35 -06:00
Vasileios Karakasis
e572189112 Add more ReFrame versions (#46706) 2024-10-02 16:56:32 -07:00
Alec Scott
20a18b5710 scc: add v3.4.0, drop generated dependencies (#46733) 2024-10-03 01:48:49 +02:00
Ken Raffenetti
e17d81692f MPICH: Add version 4.2.3 (#46718) 2024-10-02 17:43:33 -06:00
Alec Scott
4bf6c61ea0 hugo: add v0.135.0, switch to GoPackage (#46735) 2024-10-02 16:32:16 -07:00
Alec Scott
21b03d149e lazygit: add v0.44.1 (#46736) 2024-10-02 17:25:35 -06:00
Alec Scott
21fbebd273 git: fix zsh shell completions (#46725) 2024-10-02 17:01:39 -06:00
Alec Scott
9326e9211c glow: fix package to install completions and not source dir (#46728) 2024-10-03 00:10:56 +02:00
kwryankrattiger
742e8142cf Do not fail noop jobs in Spack CI (#46713) 2024-10-02 16:04:21 -06:00
Alec Scott
c30a17a302 rclone: add v1.68.1, convert to a GoPackage (#46721)
* rclone: convert to a GoPackage
* rclone: add v1.68.1
2024-10-02 14:21:37 -07:00
Alec Scott
beecc5dc87 kubectl: add v1.31.1, convert to a GoPackage (#46722) 2024-10-02 14:19:55 -07:00
Alec Scott
dfb0f58254 direnv: convert to GoPackage (#46724) 2024-10-02 14:18:20 -07:00
Alec Scott
31f0905f3f litestream: drop GoBuilder now that GoPackage is fixed (#46727) 2024-10-02 14:15:10 -07:00
Alec Scott
726bf85d15 restic: convert to GoPackage (#46719) 2024-10-02 14:04:30 -07:00
Auriane R.
d187a8ab68 stdexec: Add nvhpc@24.09 version (#46714) 2024-10-02 13:57:47 -06:00
Mosè Giordano
1245a2e058 libuv-julia: Add v1.48.0 (#46705) 2024-10-02 11:47:50 -06:00
Mosè Giordano
f8bd11c18f nghttp2: Add v1.59.0 (#46704) 2024-10-02 11:47:36 -06:00
kjrstory
787863e176 fds: add 6.9.0,6.9.1 and openmp (#46717)
* fds: add 6.9.0,6.9.1 and openmp
* typo and style fix
2024-10-02 11:47:12 -06:00
dependabot[bot]
1da7d3bfe3 build(deps): bump codecov/codecov-action from 4.5.0 to 4.6.0 (#46708)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.5.0 to 4.6.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](e28ff129e5...b9fd7d16f6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-02 11:40:43 -06:00
Wouter Deconinck
26bc91fe9b py-rucio-clients: new package (and dependencies) (#46585)
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2024-10-02 11:27:23 -06:00
Ray Huang
470774687d fix 'command -v' not in csh (#45845) 2024-10-02 19:15:35 +02:00
Alex Tyler Chapman
20aec1536a ipopt: add 3.14.12 to 3.14.14, improve mumps integration (#46673) 2024-10-02 19:11:39 +02:00
Mosè Giordano
9e1082b625 libblastrampoline: Add v5.9.0, v5.10.1, v5.11.0 (#46702) 2024-10-02 11:09:09 -06:00
Arne Becker
f8f13ad8aa postgresql: Add icu4c dependency for versions 16+ (#46691)
* postgresql: Add icu4c dependency for versions 16+

* postgresql: make ICU an option

* postgresql: ICU variant only needed for v16+

* postgresql: Check for negated option

Check for negated option instead of negating the test

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-10-02 10:08:32 -07:00
Harmen Stoppels
fcf72201d3 aspell_dict: remove unused import (#46716) 2024-10-02 17:49:04 +02:00
Teague Sterling
1c528719cb perl-bio-ensembl-variation: new package (#44507)
* Adding the perl-bio-db-bigfile package

* Adding perl-bio-ensembl-variation package

* Adding perl-bio-ensembl-io package

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Reverting variants

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Rename package.py to package.py

* Update package.py

* Update package.py

* Update package.py

* Fix variant installation and dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Removing unneeded dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* Update package.py

* perl-bio-ensembl: update sha256 of 112

* perl-bio-ensembl-variation: add perl-bio-ensembl-funcgen@{vers}

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-10-02 15:21:06 +01:00
Auriane R.
f3faeb0f77 py-flash-attn: add v2.6.3 (#46487)
* [py-flash-attn] Add version 2.6.3

* Update dependencies according to the latest version

* Add max_jobs environmental variable to avoid oom error

---------

Co-authored-by: aurianer <8trash-can8@protonmail.ch>
2024-10-02 16:11:50 +02:00
Wouter Deconinck
b1707c2e3c dbus: fix url; add v1.14.10, v1.15.10 (meson); fix CVEs (#46625) 2024-10-02 15:48:04 +02:00
Asher Mancinelli
0093dd74e3 cbqn: new package (#46603) 2024-10-02 15:40:54 +02:00
Alec Scott
f92c5d7a2e gh: add shell completions (#46701) 2024-10-02 15:34:30 +02:00
Alec Scott
dcdc678b19 GoPackage: fix attributes to allow for custom build_directory values (#46707) 2024-10-02 06:23:13 -06:00
Weston Ortiz
8e3e7a5541 goma: add v7.8.2 (#46696) 2024-10-02 06:05:32 -06:00
Andrew W Elble
4016172938 cuda: add conflict for aarch64 with gcc-13.2.0 and cuda@12.4 (#46694)
https://github.com/spack/spack/pull/39666#issuecomment-2377609263
2024-10-02 06:05:07 -06:00
Matthieu Dorier
95ea678d77 kafka: add v2.13-3.7.1, v2.13-3.8.0, remove v2.13-3.7.0 (#46692)
* kafka: added new versions

* kafka: add back deprecated version

* kafka: fixing style
2024-10-02 05:34:10 -06:00
Harmen Stoppels
8ccf244c6b Revert "cc: ensure that RPATHs passed to linker are unique" (#46712)
* Revert "`cc`: ensure that RPATHs passed to linker are unique"

This reverts commit 2613a14c43.

* Revert "`cc`: simplify ordered list handling"

This reverts commit a76a48c42e.
2024-10-02 11:50:09 +02:00
Wouter Deconinck
308dfeb2a6 py-django: add v5.0.9, v5.1.1 (fix CVEs) (#46671)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-10-02 10:28:20 +02:00
Harmen Stoppels
118a9e8db7 tests: fix wrong install name (#46711) 2024-10-02 10:09:17 +02:00
Derek Ryan Strong
6740ea79af Add openblas v0.3.28 with patch (#46495) 2024-10-02 01:40:03 -06:00
Adam J. Stewart
6af8526aa1 py-rasterio: add v1.4.1 (#46693) 2024-10-02 01:04:35 -06:00
Weiqun Zhang
f73cb0ded4 amrex: add v24.10 (#46697) 2024-10-01 22:18:31 -06:00
Thomas-Ulrich
0a683c14a4 add easi 1.4 and 1.5 (#46688) 2024-10-01 21:30:31 -06:00
Thomas-Ulrich
9cf5ba82c7 fix #46681 (#46682) 2024-10-01 21:23:04 -06:00
renjithravindrankannath
104b5c3f68 yaml-cpp library path is lib64 in centos (#46669) 2024-10-01 16:45:49 -07:00
Tamara Dahlgren
c8efea117f Docs: environment update (#44087)
Updated the terminology for the two types of environments to be
consistent with that used in the tutorial for the last three years.

Additionally:
* changed 'anonymous' to 'independent' in environment command+test for consistency.
2024-10-01 22:26:31 +02:00
Harmen Stoppels
486ff2beac gha: brew gnupg2 is now gnupg (#46689) 2024-10-01 22:23:56 +02:00
Wouter Deconinck
12334099f7 umoci: fix my github handle (#46664) 2024-10-01 11:34:50 -06:00
Harmen Stoppels
e4f571b1ee build_environment: small cleanup and add todo comments (#46687) 2024-10-01 18:35:23 +02:00
dependabot[bot]
f8b9178630 build(deps): bump actions/checkout from 4.1.7 to 4.2.0 (#46675)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.7 to 4.2.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v4.1.7...d632683dd7b4114ad314bca15554477dd762a938)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-01 08:49:18 -06:00
Bernhard Kaindl
ed57db9498 mlpack depends on hermes-shm, and it needs pkgconfig 2024-10-01 16:39:08 +02:00
Wouter Deconinck
731abee6b4 mlpack: add v4.5.0 2024-10-01 16:39:08 +02:00
Sreenivasa Murthy Kolam
60469f83e2 ROCm packages: Bump up the version for ROCm-6.2.1 release (#46553) 2024-10-01 16:30:24 +02:00
Wouter Deconinck
f2f3af19c3 py-uwsgi: add v2.0.27 (fix CVE) (#46667) 2024-10-01 08:18:35 -06:00
suzannepaterno
cbbdf38f4f totalview: add v2024.1-linux-arm64 -> v2024.2-x86-64 (#45490)
* Update package.py

Update to pull TotalView tar files from AWS instead of requiring the user to download them ahead of time. Use the new RLM license type and only allow installs of versions that use it: 2024.1 and 2024.2. The user selects the platform together with the version, as on the TotalView downloads website.

* Update package.py

Update to pass style test

* Update package.py

fixing style

* Updating to pass style check

removing more spaces to pass style check

* final style fixes

fixing the last 2 style errors

* Typo

Typo correction to pass style check

* Remove new line

removing new line character

* Ran black to reformat

Ran black to clear errors

* Changing to use sha256

Updating to use sha256 checksums for all TotalView files.
2024-10-01 16:12:02 +02:00
Harmen Stoppels
44618e31c8 stable_partition: use TypeVar (#46686) 2024-10-01 15:58:24 +02:00
Wouter Deconinck
5bc105c01c gfal2: new package (and dependencies) (#46559) 2024-10-01 12:01:10 +02:00
Wouter Deconinck
4e3a8b1928 ecdsautils: add v0.4.1 (fix CVEs) (#46628)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-10-01 11:55:14 +02:00
Stephen Nicholas Swatman
59179764d7 acts dependencies: new versions as of 2024/09/30 (#46661)
* acts dependencies: new versions as of 2024/09/30

This commit adds new versions of acts, actsvg, and detray.

* Add vecmem version, patch detray version
2024-09-30 20:40:35 -06:00
Massimiliano Culpo
acce67241e Do not use single letter packages in unit-tests (#46665)
#45205 already removed previous use of single letter packages
from unit tests, in view of reserving `c` as a language (see #45191).

Some use of them has been re-introduced accidentally in #46382, and
is making unit-tests fail in the feature branch #45189 since there
`c` is a virtual package.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-10-01 02:33:24 +00:00
Wouter Deconinck
73036f9567 ffmpeg: add v7.0.2 (fix CVE) (#46629)
* ffmpeg: add v7.0.2 (fix CVE)
* ffmpeg: fix license directive for gpl variant
2024-09-30 18:14:46 -07:00
Harmen Stoppels
d40ea48db7 grpc/protobuf: new versions (#46408) 2024-09-30 19:08:21 -06:00
Wouter Deconinck
fdb5178f99 py-onnx: use out of source tree build for CMake part (#45266)
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2024-09-30 19:07:57 -06:00
Afzal Patel
7a3525a053 patch clang++ path 2024-10-01 03:02:10 +02:00
Luc Berger
394ed4f90d Kokkos sanity checks (#44589)
* Kokkos: adding some sanity checks
   We can pretty much guarentee that if bin, include or lib directory
   is missing, something is wrong. Additionally KokkosCore_config.h
   and Kokkos_Core.hpp. I guess technically we could look for all
   public headers at least but that seems a bit overkill as well?
* Kokkos Kernels: adding sanity checks
* Remove check for lib directory since it might end up being lib64
* Also remove lib from kokkos-kernels sanity check
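A minimal sketch of the sanity-check idea (illustrative paths and function name, not the package's exact code):

```python
import os


def sanity_check(prefix: str) -> None:
    # Fail loudly if key install artifacts are missing from the prefix.
    expected = [
        "bin",
        "include",
        "include/KokkosCore_config.h",
        "include/Kokkos_Core.hpp",
    ]
    missing = [p for p in expected if not os.path.exists(os.path.join(prefix, p))]
    if missing:
        raise RuntimeError(f"Kokkos install looks broken, missing: {missing}")
```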
2024-09-30 16:10:20 -07:00
Adrien Bernede
d7f5dbaf89 Update and standardize implementation of RADIUSS packages (#45648)
* Add latest releases of Camp, RAJA, Umpire, CHAI and CARE
* Address review comments + blt requirement in Umpire
* CARE @develop & @main: Submodules -> False
* Changes in Umpire
* Changes in RAJA
* Changes in CHAI
* Changes in RAJA: prefer 'spec.satisfies' to 'in spec'
   This is due to a non-equivalence in Spack with providers like mpi.
   See e.g. https://github.com/spack/spack/pull/46126
* Changes in Umpire: prefer 'spec.satisfies' to 'in spec'
   This is due to a non-equivalence in Spack with providers like mpi.
   See e.g. https://github.com/spack/spack/pull/46126
* Changes in CARE:
   Still need to update to CachedCMakePackage based on RADIUSS Spack Configs version
* Missing change in RAJA + changes in fmt
* Fix syntax
* Changes in Camp
* Fix style
* CHAI: when ~raja, turn off RAJA in build system
* Fix: Ascent@0.9.3 does not support RAJA@2024.07.0
* Enforce same version constraint on Umpire as for RAJA
* Enforce preferred version of vtk-m in ascent 0.9.3
* Migrate CARE package to CachedCMakePackage
* Fix style in CARE package
* CARE: Apply changes for uniform implementation across RADIUSS projects
* Caliper: move to CachedCMakePackage, from RADIUSS Spack Configs
* Adapt RAJA Perf to spack CI
* Activate CHAI, CARE and RAJAPerf in Spack CI
* Fixes and diffs with RADIUSS Spack Configs
* Caliper: fix
* Caliper : fix + RAJAPerf : style
* RAJAPerf: fixes
* Update maintainers
* raja-perf: fix license header
* raja-perf: Fix variant naming openmp_target -> omptarget
* raja-perf: style and blt dependency versions
* CARE: benchmark and examples off by default (like tests)
* CARE: fix missing variable
* Update var/spack/repos/builtin/packages/raja-perf/package.py
* CARE: fix branch name
* Revert changes in MFEM to pass CI
* Fix CXX17 condition in RAJA + add sycl option in RAJAPerf

---------

Co-authored-by: Rich Hornung <hornung1@llnl.gov>
2024-09-30 15:16:24 -07:00
Teague Sterling
cb43019455 cbindgen: new package plus 1 dependency package (#45393)
* cbindgen: new package
* Attempting to add rust dependencies for cbindgen
* adding rust-toml min rust version
* Removing dependencies that don't install with cargo
* cleanup broken packages

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-30 14:57:51 -07:00
Pranav Sivaraman
5303ec9aa5 neocmakelsp: new package (#46623)
* neocmake: add new package
* neocmake: add description, homepage, and license
* neocmakelsp: remove boilerplate
2024-09-30 14:39:25 -07:00
Harmen Stoppels
02499c72c9 avoid rpath'ing default search paths (#44686)
On sysroot systems like gentoo prefix, as well as nix/guix, our "is
system path" logic is broken because it's static.

Talking about "the system paths" is not helpful; we have to talk
about the default search paths of the dynamic linker instead.

If glibc is recent enough, we can query the dynamic loader's default
search paths, which is a much more robust way to avoid registering
rpaths to system dirs, which can shadow Spack dirs.

This PR adds an **additional** filter on rpaths the compiler wrapper
adds, dropping rpaths that are default search paths. The PR **does
not** remove any of the original `is_system_path` code yet.
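A minimal sketch of that additional filter (hypothetical names; not the compiler wrapper's actual code, which is shell):

```python
def drop_default_search_paths(rpaths, default_paths):
    # Keep order, but drop any rpath that is already a default search path
    # of the dynamic loader (e.g. /lib64, /usr/lib64 on a glibc system).
    defaults = {p.rstrip("/") for p in default_paths}
    return [r for r in rpaths if r.rstrip("/") not in defaults]


print(drop_default_search_paths(
    ["/opt/spack/openssl-3.3/lib", "/usr/lib64"],
    ["/lib64", "/usr/lib64"],
))  # -> ['/opt/spack/openssl-3.3/lib']
```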

This fixes issues where build systems run just-built executables
linked against their *not-yet-installed libraries*, typically:

```
LD_LIBRARY_PATH=. ./exe
```

which happens in `perl`, `python`, and other non-cmake packages.
If a default path is rpath'ed, it takes precedence over
`LD_LIBRARY_PATH`, and a system library gets loaded instead
of the just-built library in the stage dir, breaking the build. If
default paths are not rpath'ed, then LD_LIBRARY_PATH takes
precedence, as is desired.

This PR additionally fixes an inconsistency in rpaths between
cmake and non-cmake packages. The cmake build system
computed rpaths by itself, but used a different order than
computed for the compiler wrapper. In fact it's not necessary
to compute rpaths at all, since we let cmake do that thanks to
`CMAKE_INSTALL_RPATH_USE_LINK_PATH`. This covers rpaths
for all dependencies. The only install rpaths we need to set are
`<install prefix>/{lib,lib64}`, which cmake unfortunately omits,
although it could also know these. Also, cmake does *not*
delete rpaths added by the toolchain (i.e. Spack's compiler
wrapper), so I don't think it should be controversial to simplify
things.
2024-09-30 20:32:50 +02:00
Wouter Deconinck
f1275536a2 libexif: add v0.6.24 (fix CVEs) (#46634)
* libexif: add v0.6.24 (fix CVEs)

* libexif: operator Version -> spec.satisfies

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-09-30 11:14:38 -07:00
Wouter Deconinck
a88239affc hazelcast: add v5.5.0 (fix CVE) (#46666) 2024-09-30 11:10:50 -07:00
Wouter Deconinck
260ea91b83 qpdf: add v11.9.1 (fix CVE) (#46645) 2024-09-30 10:01:15 -07:00
John Jolly
dea075a8b0 tmux: Fix macOS header guard collision build bug 2024-09-30 18:55:03 +02:00
H. Joe Lee
5f333c3632 hermes-shm: new package (#46601) 2024-09-30 18:46:01 +02:00
Wouter Deconinck
59b5cb0962 rabbitmq-c: add v0.14.0 (fix CVE) (#46648) 2024-09-30 09:38:24 -07:00
Benjamin Rodenberg
16410cda18 py-pyprecice: add v2.5.0.3 -> v3.1.1 (#45105)
* Add missing 2.5 releases and 3.x releases.

* Update var/spack/repos/builtin/packages/py-pyprecice/package.py

* Update package.py

Add license (again)
2024-09-30 09:18:30 -07:00
Wouter Deconinck
100c8b7630 routinator: add v0.14.0 (fix CVE) (#46649) 2024-09-30 10:17:04 -04:00
Thomas-Ulrich
47591f73da impalajit, hipsycl, tandem: various updates (#45013) 2024-09-30 11:17:46 +02:00
AMD Toolchain Support
d3b4f0bebc nwchem: add v7.2.3 (#46479)
Co-authored-by: viveshar <vivek.sharma2@amd.com>
Co-authored-by: Vivek Sharma <129405591+VivekAMD@users.noreply.github.com>
2024-09-30 11:15:49 +02:00
Wouter Deconinck
5c52ded1be pacparser: add v1.4.5 (#46639) 2024-09-30 03:04:28 -06:00
Bernhard Kaindl
ff7e1b5918 shadow: add 4.16, fix build when libbsd installed on hosts (#46621) 2024-09-30 10:53:32 +02:00
Wouter Deconinck
6ea8079919 cups: fix url; add v2.4.10 (#46624)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-09-30 10:47:53 +02:00
Wouter Deconinck
d48cf4ea3e flume: add v1.11.0 (#46630) 2024-09-30 10:44:16 +02:00
Wouter Deconinck
41044b6f84 flink: add v1.20.0 (#46631) 2024-09-30 10:43:04 +02:00
Wouter Deconinck
338c331115 libpcap: add v1.10.5 (#46635) 2024-09-30 10:42:15 +02:00
Wouter Deconinck
563f76fbe5 libressl: add v3.7.3, v3.8.4, v3.9.2 (#46636) 2024-09-30 10:41:22 +02:00
Wouter Deconinck
6a2fb38673 net-snmp: add v5.9.4 (#46638) 2024-09-30 10:40:34 +02:00
Wouter Deconinck
429c5149ce perl-dbi: add v1.645 (#46640) 2024-09-30 10:39:32 +02:00
Wouter Deconinck
043fea4ef9 postgresql: add v12.20, v13.16, v14.13, v15.8, v16.4 (fix CVEs) (#46641) 2024-09-30 10:34:21 +02:00
Wouter Deconinck
4aa1f92c9c gdk-pixbuf: add v2.42.12 (#46633) 2024-09-30 10:32:56 +02:00
stepanvanecek
92d3e0ca6f sys-sage add v0.5.0 (#46659)
Co-authored-by: Stepan Vanecek <stepan.vanecek@tum.de>
2024-09-30 10:28:44 +02:00
Wouter Deconinck
a28b9ec8f3 py-mechanize: add v0.4.10 (#46642) 2024-09-30 10:25:38 +02:00
Wouter Deconinck
d7452f68a0 py-pycryptodome: add v3.20.0 (#46643) 2024-09-30 10:24:59 +02:00
Wouter Deconinck
4c333fb535 py-werkzeug: add v3.0.4 (#46644) 2024-09-30 10:23:51 +02:00
Wouter Deconinck
f32bdea867 qt-*: add v6.7.3 (#46646) 2024-09-30 10:22:53 +02:00
Wouter Deconinck
05ad0618cc unixodbc: add v2.3.12 (#46650) 2024-09-30 10:18:22 +02:00
Wouter Deconinck
e0987e0350 umoci: fix url; add v0.4.7 (#46651) 2024-09-30 10:17:13 +02:00
Christoph Junghans
519ed7f611 votca: add v2024.2 (#46653) 2024-09-30 09:35:38 +02:00
Wouter Deconinck
ecbc21a7ce rpm: add v4.18.2 (#46654) 2024-09-30 09:34:51 +02:00
Satish Balay
56c61763d7 slepc, py-slepc4py, petsc, py-petsc4py add v3.22.0 (#46647)
Co-authored-by: Jose E. Roman <jroman@dsic.upv.es>
2024-09-30 09:33:58 +02:00
Wouter Deconinck
8168b17ddf libvips: add v8.15.3 (#46655) 2024-09-30 09:32:20 +02:00
SXS Bot
d2a5b07d7f spectre: add v2024.09.29 (#46657)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-09-30 09:27:13 +02:00
Wouter Deconinck
b570ca3cf3 hive: add v3.1.3 (fix CVEs) (#46632) 2024-09-28 22:59:54 -06:00
Pranav Sivaraman
343586d832 fd: add minimum rustc version requirement (#46622) 2024-09-28 22:38:48 -06:00
etiennemlb
a8d02bd3b0 cp2k: fix libs type issue (#46079)
* Fix CP2K list/LibraryList issue

* [@spackbot] updating style on behalf of etiennemlb
2024-09-28 10:02:51 -07:00
Maciej Wójcik
e45019f246 snakemake: add new version and update plugins (#43437) 2024-09-28 17:12:37 +02:00
MatthewLieber
f31a99f188 mvapich: Add pmix variant (#45531) 2024-09-28 15:17:10 +02:00
Adam J. Stewart
6c4abef75c py-protobuf: drop +cpp, always require protobuf (#46501)
* py-protobuf: drop +cpp, always require protobuf

* Fix remaining reference to +cpp
2024-09-28 15:13:42 +02:00
Pranav Sivaraman
4f6789bb38 highway: remove HWY_CMAKE_ARM7 (not supported) (#45815)
Spack currently does not detect armv7 architectures, so remove it.
2024-09-28 14:43:01 +02:00
Pasquale Claudio Africa
aff3fd8efd trilinos: Fix missing #include <cstdint> for gcc-13+ (#46165) 2024-09-28 14:33:43 +02:00
Marie Houillon
3bacced861 opencarp, meshtool, py-carputils: New openCARP version v16.0 (#45820)
Co-authored-by: openCARP consortium <info@opencarp.org>
2024-09-28 14:20:49 +02:00
dependabot[bot]
849b82d242 build(deps): bump docker/build-push-action from 6.7.0 to 6.8.0 (#46618)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.7.0 to 6.8.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](5cd11c3a4c...32945a3392)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-28 05:57:35 -06:00
wspear
114ac5908a gpgme: fix dep on libassuan up to @2.5.7 (#45703)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2024-09-28 05:22:51 -06:00
Cristian Di Pietrantonio
19712d3461 Singularity: set shared loop devices to yes. (#45845)
https://docs.sylabs.io/guides/main/admin-guide/configfiles.html#loop-devices

shared loop devices: This allows containers running the same image
to share a single loop device. This minimizes loop device usage and
helps optimize kernel cache usage.
Enabling this feature can be particularly useful for large MPI jobs.
2024-09-28 13:14:19 +02:00
kwryankrattiger
e08a72a333 dav-sdk: Add new DaV SDK Package (#45465)
This package is a post-ECP port of the Data and Vis SDK. The new
DAV SDK drops the ECP prefix and some previously included packages.
2024-09-28 13:05:31 +02:00
Cristian Di Pietrantonio
1b6345828d AMBER: Add conflict clause to avoid using X11 on Cray (#45844) 2024-09-28 12:40:19 +02:00
Victor Lopez Herrero
04f4493c5b dlb: add v3.4.1 (#46032) 2024-09-28 12:17:39 +02:00
Mike Renfro
44bff90089 gcc: add 11.5.0 (#46614) 2024-09-27 19:38:28 -06:00
Asher Mancinelli
9290294ada sbcl: new package (#46611)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-27 19:23:11 -06:00
Alec Scott
04fb52aeca ci: simplify coverage CI/CD job (#46441)
* ci: simplify coverage CI/CD job
* Fix typo in dependent job
2024-09-28 01:13:33 +00:00
Pierre Augier
98854582e3 py-fluidsim-core and py-fluidsim: add new packages (#46438) 2024-09-28 02:28:41 +02:00
Alec Scott
4eb9bb5f9a rust: fix bootstrap dependency version typo (#46620) 2024-09-27 17:16:02 -07:00
Asher Mancinelli
13a58476ee sbcl-bootstrap: add darwin binaries (#46617)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-27 17:56:01 -06:00
Alec Scott
5c8d22c597 Better shell completion support for packages (#44756) 2024-09-27 16:02:37 -07:00
Greg Becker
07e964c688 Spec.splice: Allow splices when multiples nodes in the DAG share a name (#46382)
The current `Spec.splice` model is very limited by the inability to splice specs that
contain multiple nodes with the same name.  This is an artifact of the original
algorithm design predating the separate concretization of build dependencies,
which was the first feature to allow multiple specs in a DAG to share a name.

This PR provides a complete reimplementation of `Spec.splice` to avoid that
limitation. At the same time, the new algorithm ensures that build dependencies
for spliced specs are not changed, since the splice by definition cannot change
the build-time information of the spec. This is handled by splitting the dependency
edges and link/run edges into separate dependencies as needed.
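A highly simplified sketch of the edge-splitting idea (plain data classes, not Spack's actual `Spec` machinery):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Edge:
    parent: str
    child: str
    deptypes: frozenset


def split_for_splice(edge: Edge, replacement: str):
    build = edge.deptypes & {"build"}
    linkrun = edge.deptypes & {"link", "run"}
    new_edges = []
    if build:
        # Build-time information must not change: keep the original child.
        new_edges.append(Edge(edge.parent, edge.child, frozenset(build)))
    if linkrun:
        # Link/run edges are rewired to the spliced-in node.
        new_edges.append(Edge(edge.parent, replacement, frozenset(linkrun)))
    return new_edges


edge = Edge("caller", "zlib/abc123", frozenset({"build", "link"}))
print(split_for_splice(edge, "zlib-ng/def456"))
```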

Signed-off-by: Gregory Becker <becker33@llnl.gov>
2024-09-27 13:58:43 -07:00
kwryankrattiger
35b2750407 CI: Add documentation for adding new stacks and runners (#42179)
* CI: Add documentation for adding new stacks and runners
* More docs for runner registration

---------

Co-authored-by: Zack Galbreath <zack.galbreath@kitware.com>
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2024-09-27 13:09:41 -07:00
Massimiliano Culpo
639854ba8b spec: simplify string formatting (#46609)
This PR shortens the string representation for concrete specs,
in order to make it more legible.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-27 12:59:14 -07:00
Beat Reichenbach
cd5c85ba13 openimageio: update to v2.5.15.0, old versions not buildable anymore (#46045)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-27 20:21:17 +02:00
Teague Sterling
cdba31280b rust-bindgen: add v0.66.0 -> v0.69.4 (#45392)
* rust-bindgen: add v0.66.0,v0.66.1,v0.68.1,v0.69.0-v0.69.4 & change build system to cargo

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix dep

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* [@spackbot] updating style on behalf of teaguesterling

* Update var/spack/repos/builtin/packages/rust-bindgen/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-09-27 09:16:07 -07:00
Alec Scott
9b5f15abec docs: add --depth=2 to reduce download size (#46605)
* docs: add --depth=2 to reduce download size

* Add note to tell users about --depth=2 and manyFiles

* Fix inline code in info block
2024-09-27 09:09:19 -07:00
Mikael Simberg
9ad1d0c813 gperftools: Add 2.16 (#46606) 2024-09-27 07:49:23 -06:00
AMD Toolchain Support
00fae6dd79 charmpp: build fix for aocc (#45826) 2024-09-27 15:07:22 +02:00
Todd Gamblin
2613a14c43 cc: ensure that RPATHs passed to linker are unique
macOS Sequoia's linker will complain if RPATHs on the CLI are specified more than once.
To avoid errors due to this, make `cc` only append unique RPATHs to the final args list.

This required a few improvements to the logic in `cc`:

1. List functions in `cc` didn't have any way to append unique elements to a list. Add a
   `contains()` shell function that works like our other list functions. Use it to implement
   an optional `"unique"` argument to `append()` and an `extend_unique()`. Use that to add
   RPATHs to the `args_list`.

2. In the pure `ld` case, we weren't actually parsing `RPATH` arguments separately as we
   do for `ccld`. Fix this by adding *another* nested case statement for raw `RPATH`
   parsing. There are now 3 places where we deal with `-rpath` and friends, but I don't
   see a great way to unify them, as `-Wl,`, `-Xlinker`, and raw `-rpath` arguments are
   all ever so slightly different.

3. Fix ordering of assertions to make `pytest` diffs more intelligible. The meaning of
   `+` and `-` in diffs changed in `pytest` 6.0 and the "preferred" order for assertions
   became `assert actual == expected` instead of the other way around.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-27 05:27:26 -07:00
Todd Gamblin
a76a48c42e cc: simplify ordered list handling
`cc` divides most paths up into system paths, spack managed paths, and other paths.
This gets really repetitive and makes the code hard to read. Simplify the script
by adding some functions to do most of the redundant work for us.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-27 05:27:26 -07:00
Tamara Dahlgren
0619a8bd4f axom/stand-alone tests: build and run in test stage directory (#46421)
* axom/stand-alone tests: build and run in test stage directory
* Removed unused glob
* axom/stand-alone tests: add example_stage_dir variable for clarity
2024-09-27 05:42:42 -06:00
吴坎
a3dc9e1eb8 py-altair: add v5.4.1 (#46461)
* Update py-altair@5.4.1

* Update

* Update var/spack/repos/builtin/packages/py-altair/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/py-altair/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-27 05:37:58 -06:00
Elliott Slaughter
e129df38ce legion: add 24.09.0 (#46592) 2024-09-27 05:13:01 -06:00
Ashim Mahara
9762e8719b - added rust versions (#46597) 2024-09-27 04:53:58 -06:00
dslarm
044dcee7ad acfl: remove gcc oriented PATH / LD_LIBRARY_PATH (#46594) 2024-09-27 04:48:36 -06:00
H. Joe Lee
32b6381b9b py-jarvis-util: add master version (#46599) 2024-09-27 04:38:41 -06:00
Stephen Sachs
532e6b6aa9 mpas-model: enable oneapi compiler (#46457)
Co-authored-by: stephenmsachs <stephenmsachs@users.noreply.github.com>
2024-09-27 03:18:23 -06:00
Richard Berger
7d6231b38a aspell: various fixes and updates (#46383)
SimpleFilesystemView was producing an error due to looking for a
<prefix>/lib/.spack folder. Also, view_destination had no effect and
wasn't called. Changed this by instead patching in the correct
installation prefix for dictionaries.

Since aspell is using the resolved path of the executable prefix, the
runtime environment variable ASPELL_CONF is set to correct the prefix
when in a view. With this change aspell can now find installed
dictionaries. Verified with:

aspell dump config
aspell dump dicts
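A minimal sketch of how such a runtime prefix correction can be expressed, assuming Spack's standard `setup_run_environment` hook (illustrative skeleton only, not the actual package change):

```python
from spack.package import *


class Aspell(AutotoolsPackage):
    """Illustrative skeleton only."""

    def setup_run_environment(self, env):
        # Point aspell at its install prefix so dictionaries resolve correctly,
        # even when the binary is reached through a view.
        env.set("ASPELL_CONF", f"prefix {self.prefix}")
```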
2024-09-27 10:48:08 +02:00
Richard Berger
653556e881 texlive: fixup mtxrun for 2024 version (#46465) 2024-09-27 10:23:21 +02:00
BOUDAOUD34
13feee8364 CP2K: fix .mod file incompatibility on ROCm by using USE, INTRINSIC d… (#45848)
Co-authored-by: U-PALLAS\boudaoud <boudaoud@pc44.pallas.cines.fr>
2024-09-27 10:19:38 +02:00
Asher Mancinelli
62abcfb05f ocaml: add new versions (#46534) 2024-09-27 10:14:37 +02:00
Brett Viren
3637c087c5 ollama: add v0.3.9, and cuda variant (#46204)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: brettviren <brettviren@users.noreply.github.com>
Co-authored-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-27 10:13:54 +02:00
Wouter Deconinck
a835a0ed31 hudi: fix url (#46524) 2024-09-27 09:39:18 +02:00
kenche-linaro
7b3ab0e575 linaro-forge: added version 24.0.5 (#46588) 2024-09-27 09:26:53 +02:00
jmuddnv
0aa7963ed1 nvhpc: add v24.9 (#46586) 2024-09-27 08:28:12 +02:00
dependabot[bot]
941c8daabe build(deps): bump actions/checkout from 4.1.7 to 4.2.0 (#46584)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.7 to 4.2.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](692973e3d9...d632683dd7)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-27 08:22:47 +02:00
Alex Richert
29d85ba552 openblas: %intel@2021: conflict with avx512 (#44883) 2024-09-27 07:12:36 +02:00
Gavin John
1b01680be1 py-nanoplot: add v1.43.0 (#46160)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-26 22:57:46 -06:00
Robert Cohn
e439f8ea87 intel-oneapi-runtime: add depends on libirc (#46589) 2024-09-27 05:40:48 +02:00
arezaii
1d06a324b5 Chapel 2.2 update (#46593)
* shorten version number validations per reviewer feedback
* rename set_lib_path per reviewer feedback
* Add E4S tag
* Set CHPL_CUDA_PATH to ensure Chapel installer finds the right package
* Update ROCm dependency for Chapel 2.2
* Fix llvm-amdgpu and CHPL_TARGET_* for llvm=bundled
* Ensure CHPL_TARGET_COMPILER is set to "llvm" when required (llvm=spack
   or +cuda or +rocm).
* Ensure CHPL_TARGET_{CC,CXX} are only set when using llvm=spack or llvm=none
* Use hip.prefix to set CHPL_ROCM_PATH
   Since we might not directly depend on llvm-amdgpu, thus it might
   not appear in our spec
* limit m4 dependency to +gmp
* limit names of env vars created from variants
* Ensure that +cuda and +rocm variants are Sticky

The concretizer should never be permitted to select GPU support, because
it's only meaningful and functional when the appropriate hardware is actually
available, and the concretizer cannot reliably determine that.

Also: Chapel's GPU support includes a lot of complicated dependencies
and constraints, so leaving that choice free to the concretizer leads to a lot
of extraneous and confusing messages when failing to concretize a
non-GPU-enabled spec.

Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
2024-09-27 05:39:14 +02:00
Adam J. Stewart
20f90dcda2 LLVM: mark cuda_arch compatibility (#46397) 2024-09-27 05:30:47 +02:00
Asher Mancinelli
6f5f6a65b3 sbcl-bootstrap: new package (#46582)
Add pre-built sbcl for x86 and arm for various glibc versions, making
way for an actual sbcl built from source.

Also switch to using set_env in a context manager instead of setting the
environment variable directly for the build environment. I hit an issue with the
build system due to this in the sbcl package, so this pre-empts the same issue
here.
2024-09-27 05:27:48 +02:00
Alec Scott
7360afb668 developer-tools-ci: remove version constraint on Emacs (#46590) 2024-09-26 15:57:28 -07:00
Juan Miguel Carceller
f25e586f0a madgraph5: add newer versions and a pythia8 variant (#41128)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-26 12:24:01 -06:00
Cameron Smith
0191e15a6a omega-h: add version scorec.10.8.5 and test support (#45990) 2024-09-26 19:07:40 +02:00
psakievich
ea6e39805a Add a custom hook for dev_path changes (#46529)
* Add a custom hook for dev_path changes

Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-09-26 08:59:13 -07:00
Thomas Madlener
bbd205543b py-onnx: build 1.15 with c++17 / c++20 when needed (#46571) 2024-09-26 15:29:58 +02:00
Juan Miguel Carceller
b95160cd86 gaudi: add a patch for missing includes for @37:38 (#46365) 2024-09-26 15:08:59 +02:00
Wouter Deconinck
201840367d parallel: add v20240822 (#46355) 2024-09-26 14:41:50 +02:00
Thomas Madlener
5ebf45861f gaudi: Add version 39.0 and adapt dependencies and variants accordingly (#46572)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-26 06:38:28 -06:00
Tobias Ribizel
ec0d97ae82 hwloc: Disable levelzero explicitly if not requested (#46530)
The configure script will otherwise pick up external levelzero libraries and may break dependent libraries like pmix.
2024-09-26 06:33:44 -06:00
Henri Menke
8290e7d947 bigdft-psolver: fix build failure (#46482)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-26 05:53:32 -06:00
Adam J. Stewart
95966ce10a py-pandas: add v2.2.3 (#46508) 2024-09-26 13:17:23 +02:00
Derek Ryan Strong
dce2f4ca7c strace: add v6.11 and mpers variant (disabled by default) (#46472)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-26 04:23:01 -06:00
Adam J. Stewart
7dc549d926 py-rasterio: add v1.4.0 (#46587) 2024-09-26 03:38:18 -06:00
Derek Ryan Strong
aa0e605956 fix: Disable native host build optimisations for fio (#46444) 2024-09-26 11:30:51 +02:00
snehring
5e56fa839d entrezdirect: add v22.6.20240912 (#46470)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-09-26 11:14:27 +02:00
Olivier Cessenat
c62ed8bb2f ngspice: add version 43 (#46499) 2024-09-26 10:36:08 +02:00
Wouter Deconinck
6872da419d spherepack: fix url, checksum, add flags (#46525) 2024-09-26 10:21:19 +02:00
Wouter Deconinck
e2aa11518a drill: fix url, add non-log4j-affected v1.20.3, v1.21.2 (#46531) 2024-09-26 09:42:28 +02:00
Dom Heinzeller
d4233a3048 py-arch and py-pandas-datareader: New packages (#46557) 2024-09-26 09:34:04 +02:00
Mikael Simberg
badb3bcee7 dla-future: Adapt lapack/scalapack CMake variables for master branch and next version (#46570)
* dla-future: Add DLAF_ prefix to LAPACK_LIBRARY CMake variable in newer versions

* dla-future: Use spec.satisfies to check version constraint for LAPACK_LIBRARY variable prefix

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>

---------

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
2024-09-26 01:33:27 -06:00
Ben Cowan
4b05a2b78f py-ipython: Add 8.26, 8.27 (#46566) 2024-09-26 09:26:54 +02:00
Matt Thompson
777b5b0c39 mapl: add 2.48.0 (#46563) 2024-09-26 09:09:06 +02:00
James Shen
cd9446b5d4 genie: add v3.4.2 (#46568) 2024-09-26 09:07:34 +02:00
Brian Spilner
37ede80d3f add v2.4.4 (#46581) 2024-09-26 01:03:15 -06:00
Alec Scott
4eeb5b9665 emacs: fix dependency on texinfo when @29.4: (#46419) 2024-09-25 19:37:27 -06:00
Satish Balay
2778e530ad llvm: add v19.1.0 (#46504)
llvm@19 removed LLVM_ENABLE_TERMINFO,
so specify spec["ncurses"].libs via the CMake option LLDB_CURSES_LIBS
2024-09-25 17:39:47 -06:00
Wouter Deconinck
af9b359478 apr: add v1.7.5, deprecate older versions due to CVE (#46532) 2024-09-25 23:47:12 +02:00
dmagdavector
54ffc635e2 iproute2: add new versions (#46541) 2024-09-25 23:38:31 +02:00
Wouter Deconinck
71f73f5e40 cryptopp: fix url (#46562) 2024-09-25 23:31:14 +02:00
Richard Berger
b83c5237e9 Updates to several Python packages (#46404)
* py-sphinx-tabs: new version 3.4.5

* py-sphinx-design: new versions 0.5.0, 0.6.0, and 0.6.1

* py-requests: new version 2.32.3

* py-dnspython: new version 2.6.1

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* py-hatch-vcs: new version 0.4.0

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-25 15:22:32 -06:00
Matthieu Dorier
6bf005e042 faiss: add 1.8.0 and fix the dependency on cmake (#46578) 2024-09-25 23:15:49 +02:00
Pranav Sivaraman
2d8b4dbe3b zoxide: add v0.9.5 and v0.9.6 (#46579) 2024-09-25 20:11:47 +02:00
Tobias Ribizel
0fab5cadb3 llvm: Add conflict related to Python distutils removal (#46528)
Python 3.12 removed the `distutils` module, which is required
by the build process of LLVM <= 14: add a conflict with it for +python.

Also fix the build so it does not pick up host tools, such as an incompatible Python from the host.

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-25 19:13:57 +02:00
Jen Herting
ade2513dcb py-nltk: add v3.9.1 (#46302) 2024-09-25 19:12:16 +02:00
Stephen Sachs
74b76cc6b3 quantum-espresso: restrict oneapi compiler to supported versions (#46456) 2024-09-25 18:06:14 +02:00
Mikael Simberg
edf6729943 mimalloc: Add 2.1.7 (#46577) 2024-09-25 08:29:52 -06:00
Andrew W Elble
fd087107ea py-tensorflow: fix linking with ubuntu's gcc (#45437)
gcc on Ubuntu has fix-cortex-a53-843419 set by default; this causes linking
issues (symbol relocation errors) for TensorFlow, even when compiling for different
CPUs.
2024-09-25 15:19:58 +02:00
Andrey Perestoronin
a342da5642 added new packages (#46440) 2024-09-25 04:55:34 -06:00
Mikael Simberg
5f7501c4f7 mold: Add 2.34.0 (#46569) 2024-09-25 02:53:35 -06:00
Harmen Stoppels
f01dbe2c35 Remove a few redundant imports (#46512) 2024-09-25 10:05:11 +02:00
afzpatel
a474034023 py-jaxlib: add external ROCm support (#46467)
* add external ROCm support for py-jaxlib

* fix style

* remove fork releases
2024-09-24 16:38:58 -06:00
Samuel Browne
022eca1cfe Fix off-by-one padding bug (#46560)
If `add_padding()` is allowed to return a path with a trailing path
separator, it will get collapsed elsewhere in Spack. This can lead to
buildcache entries that have RPATHs that are too short to be replaced by
other users whose install root happens to be padded to the correct
length.  Detect this and replace the trailing path separator with a
concrete path character.
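A minimal sketch of the fix described above (illustrative names, not the actual `add_padding()` code):

```python
import os


def strip_trailing_separator(padded_root: str, fill: str = "_") -> str:
    # If padding produced a trailing separator, replace it with an ordinary
    # character so later path normalization cannot shorten the padded root.
    if padded_root.endswith(os.sep):
        return padded_root[:-1] + fill
    return padded_root


print(strip_trailing_separator("/opt/spack/padding/xyzzy/"))  # .../xyzzy_
```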

Signed-off-by: Samuel E. Browne <sebrown@sandia.gov>
2024-09-24 15:37:52 -06:00
Martin Pokorny
f49b10ee43 casacore-measures: new package, fix wget dependency (#46125)
Co-authored-by: Bernhard Kaindl <bernhard.kaindl@cloud.com>
2024-09-24 15:14:02 -06:00
etiennemlb
ff4e311e72 kahip: Add patch to fix missing include (#46084) 2024-09-24 22:45:08 +02:00
Thomas Madlener
43d1cdb0bd dd4hep: Add tag for version 1.30 (#46429) 2024-09-24 14:19:29 -06:00
Asher Mancinelli
08c4f81596 opam: Add versions 2.1.4 to 2.2.1 (#46535)
Also: set the build and install directories to the source directory
because the build system unfortunately expects the `src_ext` directory
to be under the current working directory when building the bundled
third-party libraries, even when the configure script is run from
another directory.

@scemama pointed out that 'make' just calls 'dune' which is already
parallel, so make itself should not have more than one job.

opam@:2.1 need 'make lib-ext' for cmdliner, above it's obsolete.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@cloud.com>
2024-09-24 13:45:56 -06:00
Pierre Augier
63986d31ef py-pyfftw: add v0.14.0 (#46336) 2024-09-24 21:33:47 +02:00
Gregory Lee
f50f5859f3 dakota: add v6.20 and update boost version deps (#46551) 2024-09-24 20:41:47 +02:00
Derek Ryan Strong
99c4f2c3ff openmpi: Enable built-in atomics by default (#46186) 2024-09-24 20:37:01 +02:00
afzpatel
3b61a1f778 omniperf: New package (with deps) (#46239)
Co-authored-by: Bernhard Kaindl <bernhard.kaindl@cloud.com>
2024-09-24 12:21:15 -06:00
Rocco Meli
3771278d50 plumed: add v2.9.2 (#46544) 2024-09-24 19:35:28 +02:00
Adam J. Stewart
3649a195d3 ML CI: Ubuntu 22 -> 24 (#45644) 2024-09-24 19:13:28 +02:00
Wouter Deconinck
2b9a3b6057 httpd: add v2.4.62, deprecate versions due to CVE, rm deprecated versions (#46533) 2024-09-24 18:52:27 +02:00
teddy
a300359627 add package py-non-regression-test-tools (#46180)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-09-24 10:15:44 -06:00
teddy
ad75bc022f py-mgmetis: New package (#46178) 2024-09-24 18:01:02 +02:00
Robert Underwood
befbbec2b3 py-pybind11: set correct prefix in pc files (#46298)
The detection logic for the prefix used in pybind11 is broken for Spack,
resulting in an empty prefix.  However, the package provides an escape
hatch in the form of `prefix_for_pc_file`. Use this escape hatch to
provide the correct path; spack will always know better than pybind11's
CMake.

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-24 16:47:01 +02:00
Robert Underwood
f0e51a35c7 opencv+hdf: depend on mpi when ^hdf5+mpi (#46244)
When OpenCV depends on HDF5+mpi, it needs the MPI compilers available.
2024-09-24 16:33:30 +02:00
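The gist of the fix as a hedged sketch; the real ``opencv`` recipe is much larger, and the exact ``when=`` condition here is an assumption based on the commit title:

```python
from spack.package import *


class Opencv(CMakePackage):
    """Sketch of the relevant part of the recipe only."""

    variant("hdf", default=False, description="Include HDF5 support")

    # When the HDF5 we build against is MPI-enabled, its headers drag in
    # mpi.h, so OpenCV itself needs MPI available at build/link time.
    depends_on("mpi", when="+hdf ^hdf5+mpi")
```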
Kelly (KT) Thompson
d5b8b0600a random123: Add support for HIP/rocm. (#46284) 2024-09-24 16:29:16 +02:00
Adam J. Stewart
0b575f60a5 libsodium: add v1.0.20 (#46455) 2024-09-24 09:07:02 -05:00
Darren Bolduc
2d16e15659 google-cloud-cpp: new package (#46285) 2024-09-24 16:05:58 +02:00
Thomas Madlener
5d9c534018 podio: add version 1.1, edm4hep: add version 0.99.1 (#46502)
* podio: Add tag for 1.1

* edm4hep: Add 0.99.1 tag and update c++ standards

* Avoid repetition of cxxstd values

* Remove stray assert from developing
2024-09-24 15:56:21 +02:00
Rocco Meli
8471b1b471 patch (#46545) 2024-09-24 15:52:44 +02:00
Thomas Madlener
1c6b6d0a08 lcio: Add version 2.22.2 (#46556)
Add the latest tag for LCIO
2024-09-24 15:44:49 +02:00
Adam J. Stewart
d7031dcd09 py-pybind11: add v2.13.5 (#46230)
* py-pybind11: add v2.13

* Remove build-tools tag
2024-09-24 08:21:17 -05:00
Adam J. Stewart
6692eb7b6f py-pillow-simd: add v9.5.0 (#46543) 2024-09-24 15:07:41 +02:00
Daniel Arndt
728da2ff87 trilinos: add cuda_constexpr variant (#45812) 2024-09-24 14:26:16 +02:00
Todd Gamblin
c070ddac97 database: don't call socket.getfqdn() on every write (#46554)
We've seen `getfqdn()` cause slowdowns on macOS in CI when added elsewhere. It's also
called by database.py every time we write the DB file.

- [x] replace the call with a memoized version so that it is only called once per process.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-23 23:59:07 -07:00
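A minimal sketch of the approach using ``functools`` (Spack ships its own ``llnl.util.lang.memoized`` decorator; the wrapper name here is hypothetical):

```python
import functools
import socket


@functools.lru_cache(maxsize=None)
def cached_fqdn() -> str:
    # socket.getfqdn() can trigger slow DNS lookups (notably on macOS);
    # caching means a process pays that cost at most once.
    return socket.getfqdn()


# Every database write can now call cached_fqdn() cheaply after the first call.
print(cached_fqdn())
```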
Massimiliano Culpo
679770b02c solver: use a new heuristic (#46548)
This PR introduces a new heuristic for the solver, which behaves better when
compilers are treated as nodes. Apparently, it performs better also on `develop`,
where compilers are still node attributes.

The new heuristic:
- Sets an initial priority for guessing a few attributes. The order is "nodes" (300), 
  "dependencies" (150), "virtual dependencies" (60), "version" and "variants" (30), and
  "targets" and "compilers" (1). This initial priority decays over time during the solve, and
  falls back to the defaults.

- By default, it considers most guessed facts as "false". For instance, by default a node
  doesn't exist in the optimal answer set, or a version is not picked as a node version etc.

- There are certain conditions that override the default heuristic using the _priority_ of
  a rule, which previously we didn't use. For instance, by default we guess that a
  `attr("variant", Node, Variant, Value)` is false, but if we know that the node is already
  in the answer set, and the value is the default one, then we guess it is true.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-23 23:44:47 -07:00
Justin Cook
971577d853 spec: fix spelling (#46550)
Signed-off-by: Justin Cook <jscook@lbl.gov>
2024-09-24 07:06:54 +02:00
Kyoko Nagahashi
2c36a8aac3 new package: msr-safe (#46289)
This includes a test_linux699 variant which "activates" a version
that pulls from a repository other than the official repository.
This version is required to work with Linux kernel version
6.9.9 or later. Future official `msr-safe` versions are expected
to support later Linux kernel versions.
2024-09-23 16:04:33 -06:00
Kyoko Nagahashi
bda94e4067 linux-external-modules: update maintainers (#46474) 2024-09-23 15:35:41 -06:00
Adam J. Stewart
93ab70b07c py-dm-tree: support externally-installed pybind11 and abseil-cpp (#46431) 2024-09-23 12:11:38 -04:00
Stephen Nicholas Swatman
44215de24e acts dependencies: new versions as of 2024/09/23 (#46538)
This commit adds some new versions of acts, detray, and vecmem.
2024-09-23 07:16:55 -05:00
Chris Marsh
5b77ce15c7 py-cffi: Add macos patch from cffi-feedstock, add version 1.17.1, update depe (#46484)
* Add macos patch from cffi-feedstock, add version 1.17.1, update depends_on versions

* missing patch

* Use a url for the patch

* Remove 3.12 support
2024-09-23 11:15:55 +02:00
Wouter Deconinck
c118c7733b *: no loop over files with filter_file(*files) (#46420)
* *: no loop over files with filter_file(*files)
* scalpel: revert
2024-09-22 11:04:23 -06:00
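The idiom being applied across packages, sketched with made-up filenames: ``filter_file`` accepts any number of filenames, so the per-file loop is unnecessary.

```python
from llnl.util.filesystem import filter_file

files = ["Makefile", "config/Makefile.inc", "tests/Makefile"]

# Before: one call per file
for f in files:
    filter_file(r"^CC\s*=.*", "CC = cc", f)

# After: a single call, passing the filenames as varargs
filter_file(r"^CC\s*=.*", "CC = cc", *files)
```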
Joseph Wang
f73f0f861d qd: add new versions and pull from main source tree (#46451)
* qd: add new versions and pull from main source tree

* add comment to that sha256 identical is intentional
2024-09-22 10:05:26 -06:00
Adam J. Stewart
2269d424f9 py-jaxlib: add GCC 13 support (#46433) 2024-09-22 09:55:06 -06:00
potter-s
d4ad167567 bcftools: Add runtime dependency gffutils (#46255)
Co-authored-by: Simon Potter <sp39sanger.ac.uk>
2024-09-22 17:50:24 +02:00
Sajid Ali
5811d754d9 libwebsockets: add v4.3.3 (#46380)
Co-authored-by: Bernhard Kaindl <bernhard.kaindl@cloud.com>
2024-09-22 09:44:23 -06:00
Jen Herting
315f3a0b4d py-jiter: new package (#46308) 2024-09-22 17:42:12 +02:00
Jen Herting
3a353c2a04 py-striprtf: New package (#46304) 2024-09-22 17:33:55 +02:00
Jen Herting
29aefd8d86 py-dirtyjson: new package (#46305) 2024-09-22 17:33:00 +02:00
Jen Herting
ba978964e5 py-typing-extensions: added 4.12.2 (#46309) 2024-09-22 17:30:50 +02:00
Jen Herting
ac0a1ff3a2 py-beautifulsoup4: added 4.12.3 (#46310) 2024-09-22 17:29:53 +02:00
Christophe Prud'homme
f2474584bf gsmsh: add missing png, jpeg and zlib deps (#46395) 2024-09-22 17:23:19 +02:00
Adam J. Stewart
61c07becc5 py-tensorflow: add GCC 13 conflict (#46435) 2024-09-22 09:17:32 -06:00
Richard Berger
98b149d711 py-sphinx-fortran: new package (#46401) 2024-09-22 16:52:23 +02:00
Richard Berger
a608f83bfc py-pyenchant: new package (#46400) 2024-09-22 16:39:37 +02:00
Tuomas Koskela
f2c132af2d fftw: Apply fix for missing FFTW3LibraryDepends.cmake (#46477) 2024-09-22 16:35:17 +02:00
Adam J. Stewart
873cb5c1a0 py-horovod: support newer torch, gcc (#46432) 2024-09-22 08:26:25 -06:00
Richard Berger
c2eea41848 py-linkchecker: new package (#46403) 2024-09-22 15:59:02 +02:00
Adam J. Stewart
d62a03bbf8 py-fiona: add v1.10.1 (#46425) 2024-09-22 15:28:55 +02:00
Wouter Deconinck
4e48ed73c6 static-analysis-suite: delete: no longer available (#46519) 2024-09-22 14:56:02 +02:00
Thomas Bouvier
8328851391 py-nvidia-dali: update to v1.41.0 (#46369)
* py-nvidia-dali: update to v1.41.0

* py-nvidia-dali: drop unnecessary 'preferred' attribute
2024-09-22 06:51:09 -06:00
Derek Ryan Strong
73125df0ec fpart: Confirm license and c dependency (#46509) 2024-09-22 14:15:56 +02:00
Wouter Deconinck
639990c385 bird: change url and checksums, add v2.15.1 (#46513) 2024-09-22 14:10:26 +02:00
Wouter Deconinck
34525388fe codec2: fix url; add v1.2.0 (#46514) 2024-09-22 14:06:57 +02:00
Wouter Deconinck
2375f873bf grackle: fix url, checksums, deps and sbang (#46516) 2024-09-22 14:01:28 +02:00
Wouter Deconinck
3e0331b250 goblin-hmc-sim: fix url (#46515) 2024-09-22 13:58:07 +02:00
Wouter Deconinck
c302013c5b yajl: fix url (#46518) 2024-09-22 13:54:57 +02:00
Wouter Deconinck
87d389fe78 shc: fix url (#46517) 2024-09-22 13:54:10 +02:00
Wouter Deconinck
27c590d2dc testdfsio: fix url and switch to be deprecated (#46520) 2024-09-22 13:42:21 +02:00
Wouter Deconinck
960f206a68 evemu: fix url (#46521) 2024-09-22 13:35:31 +02:00
Wouter Deconinck
1ccfb1444a py-falcon: fix url (#46522) 2024-09-22 13:34:38 +02:00
Wouter Deconinck
17199e7fed py-cftime: fix url (#46523) 2024-09-22 13:32:14 +02:00
Wouter Deconinck
b88971e125 tinker: add v8.7.2 (#46527) 2024-09-22 13:28:52 +02:00
Juan Miguel Carceller
f4ddb54293 opendatadetector: Add an env variable pointing to the share directory (#46511)
* opendatadetector: Add an env variable pointing to the share directory

* Rename the new variable to OPENDATADETECTOR_DATA and use join_path

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-09-22 05:51:25 -05:00
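Roughly what the change amounts to in the recipe; the exact subdirectory under the prefix is an assumption in this sketch:

```python
from spack.package import *


class Opendatadetector(CMakePackage):
    """Sketch of the run-environment hook only."""

    def setup_run_environment(self, env):
        # Point consumers at the installed detector data; the precise
        # share/ subdirectory is an assumption here.
        env.set("OPENDATADETECTOR_DATA", join_path(self.prefix, "share"))
```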
Juan Miguel Carceller
478d1fd8ff geant4: Add a patch for twisted tubes (#45368)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-09-22 12:10:37 +02:00
afzpatel
3c7357225a py-onnxruntime: add v1.18.0 -> v1.18.3 and add ROCm support (#46448)
* add ROCm support for py-onnxruntime

* add new versions of py-onnxruntime

* add review changes
2024-09-21 17:23:59 -05:00
Adam J. Stewart
8a3128eb70 Bazel: add GCC 13 support for v6 (#46430)
* Bazel: add GCC 13 support for v6
* Fix offline builds
2024-09-21 11:24:38 -06:00
Adam J. Stewart
096ab11961 py-onnx: link to external protobuf (#46434) 2024-09-21 10:59:20 -06:00
Adam J. Stewart
9577fd8b8a py-ruff: add v0.6.5 (#46459) 2024-09-21 10:39:32 -06:00
Adam J. Stewart
8088fb8ccc py-cmocean: add v4.0.3 (#46454) 2024-09-21 10:18:27 -06:00
Massimiliano Culpo
b93c57cab9 Remove spack.target from code (#46503)
The `spack.target.Target` class is a weird entity, that is just needed to:

1. Sort microarchitectures in lists deterministically
2. Being able to use microarchitectures in hashed containers

This PR removes it, and uses `archspec.cpu.Microarchitecture` directly. To sort lists, we use a proper `key=` when needed. Being able to use `Microarchitecture` objects in sets is achieved by updating the external `archspec`.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-21 14:05:41 +02:00
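What this means for callers, as a small sketch; the sort key shown is just one reasonable choice:

```python
import archspec.cpu

targets = [archspec.cpu.TARGETS[name] for name in ("icelake", "zen2", "neoverse_v1")]

# Deterministic ordering: pass an explicit key= instead of a wrapper class
ordered = sorted(targets, key=lambda uarch: uarch.name)

# Hashed containers: with the updated archspec, Microarchitecture objects
# can go straight into sets and dict keys
seen = set(targets)

print([t.name for t in ordered], len(seen))
```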
Wouter Deconinck
35ae2743d9 cans: update homepage and url (#46488) 2024-09-21 08:45:29 +02:00
Wouter Deconinck
cd3bd453d3 chatterbug: update homepage and git (#46489) 2024-09-21 08:44:49 +02:00
Stephen Herbener
fc6cd7c51f Introduce the bufr_query library from NOAA-EMC (#45920)
* Introduce the bufr_query library from NOAA-EMC (#461)

This PR adds in a new package.py script for the new bufr_query library from NOAA-EMC. This is being used by JEDI and other applications.

* Add explicit build dependency spec to the pybind11 depends_on spec

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Convert patch file to the URL form which pulls the changes from github.

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Added new version (0.0.3) and removed obsolete site-packages.patch file

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-20 15:46:42 -06:00
John W. Parent
cfee88a5bb Docs/Windows: Clarify supported shells and caveats (#46381)
While the existing getting started guide does in fact reference the
powershell support, it's a footnote and easily missed. This PR adds
explicit, upfront mentions of the powershell support. Additionally
this PR adds notes about some of the issues with certain components
of the spec syntax when using CMD.
2024-09-20 11:55:17 -07:00
Harmen Stoppels
7711730f2c spack.modules: move get_module up (#46428) 2024-09-20 19:30:09 +02:00
Massimiliano Culpo
14e8902854 gcc: simplify setup_run_environment (#46505)
If the spec is external, it has extra attributes. If not, we know
which names are used. In both cases we don't need to search again
for executables.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-20 18:30:43 +02:00
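A hedged sketch of the idea only; the attribute layout under ``extra_attributes`` is an assumption, and the real recipe handles more compilers and languages:

```python
from spack.package import *


class Gcc(Package):
    """Sketch of the run-environment logic only."""

    def setup_run_environment(self, env):
        if self.spec.external:
            # Externals already record their compiler paths; no need to
            # re-scan the filesystem for executables.
            compilers = self.spec.extra_attributes.get("compilers", {})
            cc = compilers.get("c")
        else:
            # A Spack-built gcc installs its driver under a known name in bin/.
            cc = join_path(self.prefix.bin, "gcc")
        if cc:
            env.set("CC", cc)
```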
eugeneswalker
97961555dc petsc@3.21 +rocm: add patches for rocm 6+ (#46486) 2024-09-20 18:12:56 +02:00
Wouter Deconinck
51df7e088a gptune: don't make git attribute an Executable (#46492)
* gptune: don't make `git` attribute an Executable

* gptune: fine, I'll fix style myself then
2024-09-20 11:02:22 -05:00
Massimiliano Culpo
b28583bc58 Remove code from Compiler that is not used anymore (#45982)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-20 10:00:34 +02:00
Tamara Dahlgren
f9f6f094c3 do_install: post #46423 cleanup (#46496) 2024-09-20 09:45:31 +02:00
Jon Rood
e780a83ac6 amr-wind: use CMAKE_CUDA_ARCHITECTURES (#46442) 2024-09-19 17:58:57 -06:00
Adam J. Stewart
7f57a85514 py-pyogrio: add missing GDAL dependency (#46458) 2024-09-19 18:25:33 -05:00
Harmen Stoppels
e4927b35d1 package_base: break dependency on installer (#46423)
Removes `spack.package_base.PackageBase.do_{install,deprecate}` in favor of
`spack.installer.PackageInstaller.install` and `spack.installer.deprecate` resp.

That drops a dependency of `spack.package_base` on `spack.installer`, which is
necessary to get rid of circular dependencies in Spack.

Also change the signature of `PackageInstaller.__init__` from taking a dict as
positional argument, to an explicit list of keyword arguments.
2024-09-19 23:25:36 +02:00
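Roughly how call sites change under the new API (``pkg`` stands for an already-concretized package object, and the keyword arguments beyond ``explicit`` are illustrative):

```python
from spack.installer import PackageInstaller

# Before: driven through the package object itself
#     pkg.do_install(explicit=True, fail_fast=True)

# After: the installer takes the packages plus explicit keyword arguments
PackageInstaller([pkg], explicit=True, fail_fast=True).install()
```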
etiennemlb
098ad7ffc0 quantum-espresso: ensure no space in HDF5 lib variable (#46089)
* Ensure no space in HDF5 lib variable.
* QE patch fix
2024-09-19 14:00:34 -07:00
Harmen Stoppels
db7aece186 require spec in develop entry (#46485) 2024-09-19 20:11:22 +02:00
Ivan Maidanski
0e9f131b44 bdw-gc: add v8.2.8 (#46286) 2024-09-19 18:48:34 +02:00
Harmen Stoppels
1d18f571ae url join: fix oci scheme (#46483)
* url.py: also special case oci scheme in join

* avoid fetching keys from oci mirror
2024-09-19 16:06:44 +02:00
Harmen Stoppels
586360a8fe Revert "For "when:" and install_environment.json: Support fully qualified hos…" (#46478)
This reverts commit 6b0011c8f1.

It caused a major performance penalty in unit test time on macOS (about 30 minutes).
2024-09-19 15:34:12 +02:00
Harmen Stoppels
b1db22d406 run-unit-tests: no xdist if coverage (#46480)
xdist only slows down unit tests under coverage
2024-09-19 14:01:33 +02:00
Harmen Stoppels
7395656663 docs: refer to upstreams.yaml in chain.rst title (#46475) 2024-09-19 13:08:05 +02:00
Harmen Stoppels
d0b736607b spack.util.url: fix join breakage in python 3.12.6 (#46453) 2024-09-19 12:29:56 +02:00
Harmen Stoppels
fe5d7881f5 avoid multiprocessing in tests (#46439)
- silences a few pytest warnings related to forking in xdist
- speeds up a few tests / avoids possible oversubscription in xdist
2024-09-19 10:34:23 +02:00
Harmen Stoppels
28e3295fb0 Add GHA for circular imports regressions (#46436) 2024-09-18 09:07:23 +02:00
Massimiliano Culpo
314a3fbe77 Bump archspec to latest commit (#46445)
This should fix an issue with Neoverse XX detection

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-18 00:54:39 +00:00
Auriane R.
6df831ef00 Replace if ... in spec with spec.satisfies in h* and i* packages (#46387)
* Replace if ... in spec with spec.satisfies in h* packages

* Replace if ... in spec with spec.satisfies in i* packages
2024-09-17 12:39:22 -05:00
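The pattern being rewritten, sketched with a made-up package: ``satisfies`` constrains the package's own spec, whereas the old ``in`` check also matches any dependency in the DAG.

```python
from spack.package import *


class Example(AutotoolsPackage):
    """Sketch of the idiom only."""

    variant("mpi", default=True, description="Enable MPI support")
    depends_on("mpi", when="+mpi")

    def configure_args(self):
        # Old idiom: ``if "+mpi" in self.spec`` (also matches dependencies).
        # Preferred: constrain this package's own spec explicitly.
        if self.spec.satisfies("+mpi"):
            return ["--enable-mpi"]
        return ["--disable-mpi"]
```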
Todd Gamblin
9818002219 variants: Unify metadata dictionaries to index by when (#44425)
Continuing the work started in #40326, this changes the structure
of Variant metadata on Packages from a single variant definition
per name with a list of `when` specs:

```
name: (Variant, [when_spec, ...])
```

to a Variant definition per `when_spec` per name:

```
when_spec: { name: Variant }
```

With this change, everything on a package *except* versions is
 keyed by `when` spec. This:

1. makes things consistent, in that conditional things are (nearly)
   all modeled in the same way; and

2. fixes an issue where we would lose information about multiple
   variant definitions in a package (see #38302). We can now have,
   e.g., different defaults for the same variant in different
   versions of a package.

Some notes:

1. This required some pretty deep changes to the solver. Previously,
   the solver's job was to select value(s) for a single variant definition
   per name per package. Now, the solver needs to:

   a. Determine which variant definition should be used for a given node,
      which can depend on the node's version, compiler, target, other variants, etc.
   b. Select valid value(s) for variants for each node based on the selected
      variant definition.

   When multiple variant definitions are enabled via their `when=` clause, we will
   always prefer the *last* matching definition, by declaration order in packages. This
   is implemented by adding a `precedence` to each variant at definition time, and we
   ensure they are added to the solver in order of precedence.

   This has the effect that variant definitions from derived classes are preferred over
   definitions from superclasses, and the last definition within the same class sticks.
   This matches python semantics. Some examples:

    ```python
    class ROCmPackage(PackageBase):
        variant("amdgpu_target", ..., when="+rocm")

    class Hipblas(ROCmPackage):
        variant("amdgpu_target", ...)
    ```

   The global variant in `hipblas` will always supersede the `when="+rocm"` variant in
   `ROCmPackage`. If `hipblas`'s variant was also conditional on `+rocm` (as it probably
   should be), we would again filter out the definition from `ROCmPackage` because it
   could never be activated. If you instead have:

    ```python
    class ROCmPackage(PackageBase):
        variant("amdgpu_target", ..., when="+rocm")

    class Hipblas(ROCmPackage):
        variant("amdgpu_target", ..., when="+rocm+foo")
    ```

   The variant on `hipblas` will win for `+rocm+foo` but the one on `ROCmPackage` will
   win with `rocm~foo`.

   So, *if* we can statically determine if a variant is overridden, we filter it out.
   This isn't strictly necessary, as the solver can handle many definitions fine, but
   this reduces the complexity of the problem instance presented to `clingo`, and
   simplifies output in `spack info` for derived packages. e.g., `spack info hipblas`
   now shows only one definition of `amdgpu_target` where before it showed two, one of
   which would never be used.

2. Nearly all access to the `variants` dictionary on packages has been refactored to
   use the following class methods on `PackageBase`:
    * `variant_names(cls) -> List[str]`: get all variant names for a package
    * `has_variant(cls, name) -> bool`: whether a package has a variant with a given name
    * `variant_definitions(cls, name: str) -> List[Tuple[Spec, Variant]]`: all definitions
      of variant `name` that are possible, along with their `when` specs.
    * `variant_items() -> `: iterate over `pkg.variants.items()`, with impossible variants
      filtered out.

   Consolidating to these methods seems to simplify the code a lot.

3. The solver does a lot more validation on variant values at setup time now. In
   particular, it checks whether a variant value on a spec is valid given the other
   constraints on that spec. This allowed us to remove the crufty logic in
   `update_variant_validate`, which was needed because we previously didn't *know* after
   a solve which variant definition had been used. Now, variant values from solves are
   constructed strictly based on which variant definition was selected -- no more
   heuristics.

4. The same prevalidation can now be done in package audits, and you can run:

   ```
   spack audit packages --strict-variants
   ```

   This turns up around 18 different places where a variant specification isn't valid
   given the conditions on variant definitions in packages. I haven't fixed those here
   but will open a separate PR to iterate on them. I plan to make strict checking the
   defaults once all existing package issues are resolved. It's not clear to me that
   strict checking should be the default for the prevalidation done at solve time.

There are a few other changes here that might be of interest:

  1. The `generator` variant in `CMakePackage` is now only defined when `build_system=cmake`.
  2. `spack info` has been updated to support the new metadata layout.
  3.  split out variant propagation into its own `.lp` file in the `solver` code.
  4. Add better typing and clean up code for variant types in `variant.py`.
  5. Add tests for new variant behavior.
2024-09-17 09:59:05 -07:00
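A small sketch of what the new layout enables in a recipe, together with the accessors listed above; the package and variant names are made up:

```python
from spack.package import *


class Example(Package):
    # Two definitions of the same variant, keyed by their when= condition:
    # older releases default to ~shiny, newer ones default to +shiny.
    variant("shiny", default=False, when="@:1", description="Enable shiny mode")
    variant("shiny", default=True, when="@2:", description="Enable shiny mode")


# Introspection goes through the new PackageBase class methods:
names = Example.variant_names()              # ["shiny"]
present = Example.has_variant("shiny")       # True
defs = Example.variant_definitions("shiny")  # [(when spec, Variant), ...]
```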
Harmen Stoppels
1768b923f1 cloud_pipelines/.gitlab-ci.yml: run spack arch (#46437) 2024-09-17 17:07:26 +02:00
Harmen Stoppels
aa6651fe27 drop main dep from build_environment/subprocess_context (#46426) 2024-09-17 17:06:16 +02:00
Harmen Stoppels
3ded2fc9c5 untangle spack.config / spack.util.cpus & spack.spec (#46427) 2024-09-17 17:06:00 +02:00
Harmen Stoppels
623c5a4d24 package_base.py: do not depend on spack.environment (#46424) 2024-09-17 14:43:03 +02:00
Harmen Stoppels
673565aefe imports: automate missing imports (#46410) 2024-09-17 07:45:59 +02:00
Todd Gamblin
930e711771 coverage: only upload to codecov once (#46385)
Historically, every PR, push, etc. to Spack generates a bunch of jobs, each of which
uploads its coverage report to codecov independently. This means that we get annoying
partial coverage numbers when only a few of the jobs have finished, and frequently
codecov is bad at understanding when to merge reports for a given PR. The numbers on the
site can be weird as a result.

This restructures our coverage handling so that we do all the merging ourselves and
upload exactly one report per GitHub actions workflow. In practice, that means that
every push to every PR will get exactly one coverage report and exactly one coverage
number reported. I think this will at least partially restore people's faith in what
codecov is telling them, and it might even make codecov handle Spack a bit better, since
this reduces the report burden by ~7x.

- [x] test and audit jobs now upload artifacts for coverage
- [x] add a new job that downloads artifacts and merges coverage reports together
- [x] set `paths` section of `pyproject.toml` so that cross-platform clone locations are merged
- [x] upload to codecov once, at the end of the workflow

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-16 19:18:21 -07:00
Satish Balay
b1905186a6 kokkos, kokkos-kernels, kokkos-nvcc-wrapper: add v4.4.01 (#46377)
* kokkos, kokkos-kernels, kokkos-nvcc-wrapper: add v4.4.01

* trilinos: update @[master,develop] dependency on kokkos

==> Error: InstallError: For Trilinos@[master,develop], ^kokkos version in spec must match version in Trilinos source code. Specify ^kokkos@4.4.01 for trilinos@[master,develop] instead of ^kokkos@4.4.00.
2024-09-16 13:53:00 -06:00
Richard Berger
176b7f8854 lammps-example-plugin: add new versions, fix bug (#46331) 2024-09-16 12:15:02 -07:00
Wouter Deconinck
0cb4db950f nghttp2: add v1.62.1, v1.63.0 (#46358) 2024-09-16 12:13:24 -07:00
Wouter Deconinck
bba66f9dae libxslt: add through v1.1.42 (now at gnome.org) (#46364)
* libxslt: add through v1.1.42 (now at gnome.org)
* libxslt: add v1.1.35 (apparently forgotten)
2024-09-16 12:09:47 -07:00
Wouter Deconinck
9c222aee67 libedit: add v3.1-20240517, v3.1-20240808 (#46366) 2024-09-16 12:03:39 -07:00
Jen Herting
fcc28d72e8 [py-httpx] Dependency fixes and simplifications (#46367) 2024-09-16 12:02:22 -07:00
Satish Balay
8bc897cee1 hipfft: update @master dependency wrt rocfft (#46376)
* add master branch to rocfft and ensure its dependency on that branch for hip and rocm-cmake
* ensure hipfft@master uses rocm-cmake@master
2024-09-16 12:00:26 -07:00
David Collins
2b70f8367c Use the correct variable in configure() in bash package.py (#46384) 2024-09-16 11:44:27 -07:00
Teague Sterling
a9a23f4565 duckdb: add v1.1.0, deprecate v0.10.0 (#46391)
* duckdb: add v1.0.0, v0.10.3
* Adding issue reference
* duckdb: add v1.1.0

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-16 11:42:27 -07:00
Andrey Prokopenko
0d86ecf122 arborx: add 1.7 (#46392) 2024-09-16 11:40:39 -07:00
Wouter Deconinck
654d6f1397 qt: add v5.15.15 (#46405) 2024-09-16 12:38:13 -06:00
kwryankrattiger
ade9c8da0e Revert "allow failure for cray-sles (#46411)" (#46413)
This reverts commit 576251f0da.
2024-09-16 19:42:24 +02:00
Greg Becker
f4f59b7f18 openblas and others: change flag_handler idiom to respect incoming flags (#46211)
* openblas: fix flag_handler to respect flags

* arpack-ng: fix flag_handler to respect flags

* czmq: fix flag_handler to respect flags

* flex: fix flag_handler to respect flags

* json-c: fix flag_handler to respect flags

* mpifileutils: fix flag_handler to respect flags

* netlib-scalapack: fix flag_handler to respect flags

* sed: fix flag_handler to respect flags

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-09-16 09:14:24 -07:00
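The before/after idiom, sketched for a generic package; the injected flag and the compiler condition are illustrative:

```python
from spack.package import *


class Example(AutotoolsPackage):
    """Sketch of the flag_handler idiom only."""

    def flag_handler(self, name, flags):
        # Old idiom: build a fresh list, silently dropping incoming flags
        #     my_flags = []
        #     if name == "cflags":
        #         my_flags.append("-fcommon")
        #     return (my_flags, None, None)

        # Fixed idiom: extend the incoming list so user/spec flags survive
        if name == "cflags" and self.spec.satisfies("%gcc"):
            flags.append("-fcommon")
        return (flags, None, None)
```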
Massimiliano Culpo
61d6c5486c Add repositories for "requirements" and "flag mixing" unit tests (#46412) 2024-09-16 17:59:35 +02:00
Harmen Stoppels
576251f0da allow failure for cray-sles (#46411) 2024-09-16 17:17:36 +02:00
Harmen Stoppels
55e0ef1e64 Add missing & remove redundant imports (#46407) 2024-09-16 12:54:16 +02:00
Harmen Stoppels
ef0e54726d freexl: add missing deps (#46330) 2024-09-16 12:12:38 +02:00
Harmen Stoppels
8225b18985 Fix a few circular deps (#46373) 2024-09-16 09:15:51 +02:00
Satish Balay
02320b18f3 petsc, mfem: update rocm dependency (#46324)
* petsc: configure requires rocm-core/rocm_version.h to detect ROCM_VERSION_MAJOR.ROCM_VERSION_MINOR.ROCM_VERSION_PATCH

* mfem: add dependency on rocprim (as needed via petsc dependency)

In file included from linalg/petsc.cpp:19:
In file included from linalg/linalg.hpp:65:
In file included from linalg/petsc.hpp:48:
In file included from /scratch/svcpetsc/spack.x/opt/spack/linux-ubuntu22.04-x86_64/gcc-11.4.0/petsc-3.22.0-7dsxwizo24ycnqvwnsscupuh4i7yusrh/include/petscsystypes.h:530:
In file included from /scratch/svcpetsc/spack.x/opt/spack/linux-ubuntu22.04-x86_64/gcc-11.4.0/rocthrust-6.1.2-ux5nmi4utw27oaqmz3sfjmhb6hyt5zed/include/thrust/complex.h:30:
/scratch/svcpetsc/spack.x/opt/spack/linux-ubuntu22.04-x86_64/gcc-11.4.0/rocthrust-6.1.2-ux5nmi4utw27oaqmz3sfjmhb6hyt5zed/include/thrust/detail/type_traits.h:29:10: fatal error: 'rocprim/detail/match_result_type.hpp' file not found
   29 | #include <rocprim/detail/match_result_type.hpp>
      |          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2024-09-15 13:30:55 -05:00
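On the mfem side, the fix boils down to one extra conditional dependency, roughly (sketch only, not the full recipe):

```python
from spack.package import *


class Mfem(Package):
    """Sketch of the relevant declarations only."""

    variant("rocm", default=False, description="Enable ROCm/HIP support")

    # rocthrust's headers (reached via the petsc dependency) include rocprim
    # headers, so state the dependency explicitly for ROCm builds.
    depends_on("rocprim", when="+rocm")
```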
Harmen Stoppels
b4e32706db fetch_strategy: show the effective URL on checksum validation failure (#46349) 2024-09-15 20:26:02 +02:00
Debojyoti Ghosh
bcde9a3afb Updated HyPar package (#46394)
* updated HyPar repo links

* updated configure args with and without MPI

* updated checksum for zipped source
2024-09-15 10:45:57 -05:00
858 changed files with 13400 additions and 5887 deletions

View File

@@ -28,7 +28,7 @@ jobs:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: ${{inputs.python_version}}
@@ -40,6 +40,8 @@ jobs:
run: |
python -m pip install --upgrade pywin32
- name: Package audits (with coverage)
env:
COVERAGE_FILE: coverage/.coverage-audits-${{ matrix.system.os }}
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
@@ -47,27 +49,26 @@ jobs:
coverage run $(which spack) audit configs
coverage run $(which spack) -d audit externals
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit configs
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
. share/spack/setup-env.sh
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit configs
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
if: ${{ inputs.with_coverage == 'true' }}
- uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
with:
flags: unittests,audits
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
name: coverage-audits-${{ matrix.system.os }}
path: coverage
include-hidden-files: true

View File

@@ -37,7 +37,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- name: Bootstrap clingo
@@ -53,33 +53,27 @@ jobs:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest' }}
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: "3.12"
- name: Bootstrap clingo
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
${{ env.VALIDATE_LAST_EXIT }}
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
tree $HOME/.spack/bootstrap/store/
gnupg-sources:
runs-on: ${{ matrix.runner }}
@@ -96,7 +90,7 @@ jobs:
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: sudo rm -rf $(command -v gpg gpg2 patchelf)
- name: Checkout
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- name: Bootstrap GnuPG
@@ -112,10 +106,10 @@ jobs:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest'}}
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
@@ -124,13 +118,8 @@ jobs:
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Setup Windows
if: ${{ matrix.runner == 'windows-latest' }}
run: |
Remove-Item -Path (Get-Command gpg).Path
Remove-Item -Path (Get-Command file).Path
- name: Checkout
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
@@ -142,20 +131,11 @@ jobs:
3.11
3.12
- name: Set bootstrap sources
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
run: |
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack bootstrap disable github-actions-v0.4
- name: Disable from source bootstrap
if: ${{ matrix.runner != 'windows-latest' }}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
- name: Bootstrap clingo
# No binary clingo on Windows yet
if: ${{ matrix.runner != 'windows-latest' }}
run: |
set -e
for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
@@ -178,24 +158,48 @@ jobs:
fi
done
- name: Bootstrap GnuPG
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
source share/spack/setup-env.sh
spack -d gpg list
${{ env.VALIDATE_LAST_EXIT }}
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
tree $HOME/.spack/bootstrap/store/
- name: Bootstrap File
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
source share/spack/setup-env.sh
spack -d python share/spack/qa/bootstrap-file.py
${{ env.VALIDATE_LAST_EXIT }}
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
tree $HOME/.spack/bootstrap/store/
windows:
runs-on: "windows-latest"
steps:
- name: Checkout
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: "3.12"
- name: Setup Windows
run: |
Remove-Item -Path (Get-Command gpg).Path
Remove-Item -Path (Get-Command file).Path
- name: Bootstrap clingo
run: |
./share/spack/setup-env.ps1
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
./share/spack/qa/validate_last_exit.ps1
tree $env:userprofile/.spack/bootstrap/store/
- name: Bootstrap GnuPG
run: |
./share/spack/setup-env.ps1
spack -d gpg list
./share/spack/qa/validate_last_exit.ps1
tree $env:userprofile/.spack/bootstrap/store/
- name: Bootstrap File
run: |
./share/spack/setup-env.ps1
spack -d python share/spack/qa/bootstrap-file.py
./share/spack/qa/validate_last_exit.ps1
tree $env:userprofile/.spack/bootstrap/store/

View File

@@ -55,7 +55,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -96,7 +96,7 @@ jobs:
uses: docker/setup-qemu-action@49b3bc8e6bdd4a60e6116a5414239cba5943d3cf
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@988b5a0280414f521da01fcc63a27aeeb4b104db
uses: docker/setup-buildx-action@c47758b77c9736f4b2ef4073d4d51994fabfe349
- name: Log in to GitHub Container Registry
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
@@ -113,7 +113,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@5cd11c3a4ced054e52742c5fd54dca954e0edd85
uses: docker/build-push-action@32945a339266b759abcbdc89316275140b0fc960
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -15,18 +15,6 @@ concurrency:
cancel-in-progress: true
jobs:
prechecks:
needs: [ changes ]
uses: ./.github/workflows/valid-style.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
all-prechecks:
needs: [ prechecks ]
runs-on: ubuntu-latest
steps:
- name: Success
run: "true"
# Check which files have been updated by the PR
changes:
runs-on: ubuntu-latest
@@ -36,7 +24,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -79,13 +67,34 @@ jobs:
needs: [ prechecks, changes ]
uses: ./.github/workflows/bootstrap.yml
secrets: inherit
unit-tests:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
secrets: inherit
all:
needs: [ unit-tests, bootstrap ]
prechecks:
needs: [ changes ]
uses: ./.github/workflows/valid-style.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
all-prechecks:
needs: [ prechecks ]
runs-on: ubuntu-latest
steps:
- name: Success
run: "true"
coverage:
needs: [ unit-tests, prechecks ]
uses: ./.github/workflows/coverage.yml
secrets: inherit
all:
needs: [ coverage, bootstrap ]
runs-on: ubuntu-latest
steps:
- name: Success

.github/workflows/coverage.yml (new file, 34 additions)
View File

@@ -0,0 +1,34 @@
name: coverage
on:
workflow_call:
jobs:
# Upload coverage reports to codecov once as a single bundle
upload:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
cache: 'pip'
- name: Install python dependencies
run: pip install -r .github/workflows/requirements/coverage/requirements.txt
- name: Download coverage artifact files
uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16
with:
pattern: coverage-*
path: coverage
merge-multiple: true
- run: ls -la coverage
- run: coverage combine -a coverage/.coverage*
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@b9fd7d16f6d7d1b5d2bec1a2887e65ceed900238
with:
verbose: true

View File

@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3

View File

@@ -0,0 +1 @@
coverage==7.6.1

View File

@@ -40,7 +40,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
@@ -76,19 +76,20 @@ jobs:
SPACK_PYTHON: python
SPACK_TEST_PARALLEL: 2
COVERAGE: true
COVERAGE_FILE: coverage/.coverage-${{ matrix.os }}-python${{ matrix.python-version }}
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
- uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
name: coverage-${{ matrix.os }}-python${{ matrix.python-version }}
path: coverage
include-hidden-files: true
# Test shell integration
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
@@ -112,11 +113,11 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
- uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
name: coverage-shell
path: coverage
include-hidden-files: true
# Test RHEL8 UBI with platform Python. This job is run
# only on PRs modifying core Spack
@@ -129,7 +130,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
- name: Setup repo and non-root user
run: |
git --version
@@ -148,7 +149,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
@@ -170,13 +171,14 @@ jobs:
- name: Run unit tests (full suite with coverage)
env:
COVERAGE: true
COVERAGE_FILE: coverage/.coverage-clingo-cffi
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
- uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
name: coverage-clingo-cffi
path: coverage
include-hidden-files: true
# Run unit tests on MacOS
macos:
runs-on: ${{ matrix.os }}
@@ -185,7 +187,7 @@ jobs:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
@@ -197,10 +199,11 @@ jobs:
pip install --upgrade pytest coverage[toml] pytest-xdist pytest-cov
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg2 kcov
brew install dash fish gcc gnupg kcov
- name: Run unit tests
env:
SPACK_TEST_PARALLEL: 4
COVERAGE_FILE: coverage/.coverage-${{ matrix.os }}-python${{ matrix.python-version }}
run: |
git --version
. .github/workflows/bin/setup_git.sh
@@ -209,11 +212,11 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
- uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
name: coverage-${{ matrix.os }}-python${{ matrix.python-version }}
path: coverage
include-hidden-files: true
# Run unit tests on Windows
windows:
defaults:
@@ -222,7 +225,7 @@ jobs:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
runs-on: windows-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
@@ -235,13 +238,13 @@ jobs:
run: |
./.github/workflows/bin/setup_git.ps1
- name: Unit Test
env:
COVERAGE_FILE: coverage/.coverage-windows
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
- uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
name: coverage-windows
path: coverage
include-hidden-files: true

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 0
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
@@ -70,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
- name: Setup repo and non-root user
run: |
git --version
@@ -87,3 +87,62 @@ jobs:
spack -d bootstrap now --dev
spack style -t black
spack unit-test -V
import-check:
runs-on: ubuntu-latest
steps:
- uses: julia-actions/setup-julia@v2
with:
version: '1.10'
- uses: julia-actions/cache@v2
# PR: use the base of the PR as the old commit
- name: Checkout PR base commit
if: github.event_name == 'pull_request'
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
ref: ${{ github.event.pull_request.base.sha }}
path: old
# not a PR: use the previous commit as the old commit
- name: Checkout previous commit
if: github.event_name != 'pull_request'
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
fetch-depth: 2
path: old
- name: Checkout previous commit
if: github.event_name != 'pull_request'
run: git -C old reset --hard HEAD^
- name: Checkout new commit
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
path: new
- name: Install circular import checker
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
with:
repository: haampie/circular-import-fighter
ref: 555519c6fd5564fd2eb844e7b87e84f4d12602e2
path: circular-import-fighter
- name: Install dependencies
working-directory: circular-import-fighter
run: make -j dependencies
- name: Import cycles before
working-directory: circular-import-fighter
run: make SPACK_ROOT=../old && cp solution solution.old
- name: Import cycles after
working-directory: circular-import-fighter
run: make clean-graph && make SPACK_ROOT=../new && cp solution solution.new
- name: Compare import cycles
working-directory: circular-import-fighter
run: |
edges_before="$(grep -oP 'edges to delete: \K\d+' solution.old)"
edges_after="$(grep -oP 'edges to delete: \K\d+' solution.new)"
if [ "$edges_after" -gt "$edges_before" ]; then
printf '\033[1;31mImport check failed: %s imports need to be deleted, ' "$edges_after"
printf 'previously this was %s\033[0m\n' "$edges_before"
printf 'Compare \033[1;97m"Import cycles before"\033[0m and '
printf '\033[1;97m"Import cycles after"\033[0m to see problematic imports.\n'
exit 1
else
printf '\033[1;32mImport check passed: %s <= %s\033[0m\n' "$edges_after" "$edges_before"
fi

View File

@@ -1,3 +1,64 @@
# v0.22.2 (2024-09-21)
## Bugfixes
- Forward compatibility with Spack 0.23 packages with language dependencies (#45205, #45191)
- Forward compatibility with `urllib` from Python 3.12.6+ (#46453, #46483)
- Bump vendored `archspec` for better aarch64 support (#45721, #46445)
- Support macOS Sequoia (#45018, #45127)
- Fix regression in `{variants.X}` and `{variants.X.value}` format strings (#46206)
- Ensure shell escaping of environment variable values in load and activate commands (#42780)
- Fix an issue where `spec[pkg]` considers specs outside the current DAG (#45090)
- Do not halt concretization on unknown variants in externals (#45326)
- Improve validation of `develop` config section (#46485)
- Explicitly disable `ccache` if turned off in config, to avoid cache pollution (#45275)
- Improve backwards compatibility in `include_concrete` (#45766)
- Fix issue where package tags were sometimes repeated (#45160)
- Make `setup-env.sh` "sourced only" by dropping execution bits (#45641)
- Make certain source/binary fetch errors recoverable instead of a hard error (#45683)
- Remove debug statements in package hash computation (#45235)
- Remove redundant clingo warnings (#45269)
- Remove hard-coded layout version (#45645)
- Do not initialize previous store state in `use_store` (#45268)
- Docs improvements (#46475)
## Package updates
- `chapel` major update (#42197, #44931, #45304)
# v0.22.1 (2024-07-04)
## Bugfixes
- Fix reuse of externals on Linux (#44316)
- Ensure parent gcc-runtime version >= child (#44834, #44870)
- Ensure the latest gcc-runtime is rpath'ed when multiple exist among link deps (#44219)
- Improve version detection of glibc (#44154)
- Improve heuristics for solver (#44893, #44976, #45023)
- Make strong preferences override reuse (#44373)
- Reduce verbosity when C compiler is missing (#44182)
- Make missing ccache executable an error when required (#44740)
- Make every environment view containing `python` a `venv` (#44382)
- Fix external detection for compilers with os but no target (#44156)
- Fix version optimization for roots (#44272)
- Handle common implementations of pagination of tags in OCI build caches (#43136)
- Apply fetched patches to develop specs (#44950)
- Avoid Windows wrappers for filesystem utilities on non-Windows (#44126)
- Fix issue with long filenames in build caches on Windows (#43851)
- Fix formatting issue in `spack audit` (#45045)
- CI fixes (#44582, #43965, #43967, #44279, #44213)
## Package updates
- protobuf: fix 3.4:3.21 patch checksum (#44443)
- protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
- git: bump v2.39 to 2.45; deprecate unsafe versions (#44248)
- gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
- Remove mesa18 and libosmesa (#44264)
- Enforce consistency of `gl` providers (#44307)
- Require libiconv for iconv (#44335, #45026).
Notice that glibc/musl also provide iconv, but are not guaranteed to be
complete. Set `packages:iconv:require:[glibc]` to restore the old behavior.
- py-matplotlib: qualify when to do a post install (#44191)
- rust: fix v1.78.0 instructions (#44127)
- suite-sparse: improve setting of the `libs` property (#44214)
- netlib-lapack: provide blas and lapack together (#44981)
# v0.22.0 (2024-05-12)
@@ -319,6 +380,16 @@
* 344 committers to packages
* 45 committers to core
# v0.21.3 (2024-10-02)
## Bugfixes
- Forward compatibility with Spack 0.23 packages with language dependencies (#45205, #45191)
- Forward compatibility with `urllib` from Python 3.12.6+ (#46453, #46483)
- Bump `archspec` to 0.2.5-dev for better aarch64 and Windows support (#42854, #44005,
#45721, #46445)
- Support macOS Sequoia (#45018, #45127, #43862)
- CI and test maintenance (#42909, #42728, #46711, #41943, #43363)
# v0.21.2 (2024-03-01)
## Bugfixes

View File

@@ -46,13 +46,18 @@ See the
[Feature Overview](https://spack.readthedocs.io/en/latest/features.html)
for examples and highlights.
To install spack and your first package, make sure you have Python.
To install spack and your first package, make sure you have Python & Git.
Then:
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git
$ cd spack/bin
$ ./spack install zlib
> [!TIP]
> `-c feature.manyFiles=true` improves git's performance on repositories with 1,000+ files.
>
> `--depth=2` prunes the git history to reduce the size of the Spack installation.
Documentation
----------------

View File

@@ -1175,6 +1175,17 @@ unspecified version, but packages can depend on other packages with
could depend on ``mpich@1.2:`` if it can only build with version
``1.2`` or higher of ``mpich``.
.. note:: Windows Spec Syntax Caveats
Windows has a few idiosyncrasies when it comes to the Spack spec syntax and the use of certain shells.
Spack's spec dependency syntax uses the caret (``^``) character; however, this is an escape character in CMD,
so it must be escaped with an additional caret (i.e. ``^^``).
CMD will also attempt to interpret strings containing ``=`` characters. Any spec including this symbol
must be double quoted.
Note: all of these issues are unique to CMD; they can be avoided by using PowerShell.
For more context on these caveats see the related issues: `carat <https://github.com/spack/spack/issues/42833>`_ and `equals <https://github.com/spack/spack/issues/43348>`_
Below are more details about the specifiers that you can add to specs.
.. _version-specifier:

View File

@@ -130,14 +130,19 @@ before or after a particular phase. For example, in ``perl``, we see:
@run_after("install")
def install_cpanm(self):
spec = self.spec
if spec.satisfies("+cpanm"):
with working_dir(join_path("cpanm", "cpanm")):
perl = spec["perl"].command
perl("Makefile.PL")
make()
make("install")
spec = self.spec
maker = make
cpan_dir = join_path("cpanm", "cpanm")
if sys.platform == "win32":
maker = nmake
cpan_dir = join_path(self.stage.source_path, cpan_dir)
cpan_dir = windows_sfn(cpan_dir)
if "+cpanm" in spec:
with working_dir(cpan_dir):
perl = spec["perl"].command
perl("Makefile.PL")
maker()
maker("install")
This extra step automatically installs ``cpanm`` in addition to the
base Perl installation.
@@ -176,8 +181,14 @@ In the ``perl`` package, we can see:
@run_after("build")
@on_package_attributes(run_tests=True)
def test(self):
make("test")
def build_test(self):
if sys.platform == "win32":
win32_dir = os.path.join(self.stage.source_path, "win32")
win32_dir = windows_sfn(win32_dir)
with working_dir(win32_dir):
nmake("test", ignore_quotes=True)
else:
make("test")
As you can guess, this runs ``make test`` *after* building the package,
if and only if testing is requested. Again, this is not specific to

View File

@@ -49,14 +49,14 @@ following phases:
#. ``install`` - install the package
Package developers often add unit tests that can be invoked with
``scons test`` or ``scons check``. Spack provides a ``test`` method
``scons test`` or ``scons check``. Spack provides a ``build_test`` method
to handle this. Since we don't know which one the package developer
chose, the ``test`` method does nothing by default, but can be easily
chose, the ``build_test`` method does nothing by default, but can be easily
overridden like so:
.. code-block:: python
def test(self):
def build_test(self):
scons("check")

View File

@@ -5,9 +5,9 @@
.. chain:
============================
Chaining Spack Installations
============================
=============================================
Chaining Spack Installations (upstreams.yaml)
=============================================
You can point your Spack installation to another installation to use any
packages that are installed there. To register the other Spack instance,

View File

@@ -219,6 +219,9 @@ def setup(sphinx):
("py:class", "spack.install_test.Pb"),
("py:class", "spack.filesystem_view.SimpleFilesystemView"),
("py:class", "spack.traverse.EdgeAndDepth"),
("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
# TypeVar that is not handled correctly
("py:class", "llnl.util.lang.T"),
]
# The reST default role (used for this markup: `text`) to use for all documents.

View File

@@ -316,6 +316,215 @@ documentation tests to make sure there are no errors. Documentation changes can
in some obfuscated warning messages. If you don't understand what they mean, feel free
to ask when you submit your PR.
.. _spack-builders-and-pipelines:
^^^^^^^^^
GitLab CI
^^^^^^^^^
""""""""""""""""""
Build Cache Stacks
""""""""""""""""""
Spack welcomes the contribution of software stacks of interest to the community. These
stacks are used to test package recipes and generate publicly available build caches.
Spack uses GitLab CI for managing the orchestration of build jobs.
GitLab Entry Point
~~~~~~~~~~~~~~~~~~
Add a stack entry point to ``share/spack/gitlab/cloud_pipelines/.gitlab-ci.yml``. There
are two stages required for each new stack: the generation stage and the build stage.
The generate stage is defined using the job template ``.generate`` configured with
environment variables defining the name of the stack in ``SPACK_CI_STACK_NAME`` and the
platform (``SPACK_TARGET_PLATFORM``) and architecture (``SPACK_TARGET_ARCH``) configuration,
and the tags associated with the class of runners to build on.
.. note::
The ``SPACK_CI_STACK_NAME`` must match the name of the directory containing the
stack's ``spack.yaml``.
.. note::
The platform and architecture variables are specified in order to select the
correct configurations from the generic configurations used in Spack CI. The
configurations currently available are:
* ``.cray_rhel_zen4``
* ``.cray_sles_zen4``
* ``.darwin_aarch64``
* ``.darwin_x86_64``
* ``.linux_aarch64``
* ``.linux_icelake``
* ``.linux_neoverse_n1``
* ``.linux_neoverse_v1``
* ``.linux_neoverse_v2``
* ``.linux_power``
* ``.linux_skylake``
* ``.linux_x86_64``
* ``.linux_x86_64_v4``
New configurations can be added to accommodate new platforms and architectures.
The build stage is defined as a trigger job that consumes the GitLab CI pipeline generated in
the generate stage for this stack. Build stage jobs use the ``.build`` job template which
handles the basic configuration.
An example entry point for a new stack called ``my-super-cool-stack``:
.. code-block:: yaml
.my-super-cool-stack:
extends: [ ".linux_x86_64_v3" ]
variables:
SPACK_CI_STACK_NAME: my-super-cool-stack
tags: [ "all", "tags", "your", "job", "needs"]
my-super-cool-stack-generate:
extends: [ ".generate", ".my-super-cool-stack" ]
image: my-super-cool-stack-image:0.0.1
my-super-cool-stack-build:
extends: [ ".build", ".my-super-cool-stack" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: my-super-cool-stack-generate
strategy: depend
needs:
- artifacts: True
job: my-super-cool-stack-generate
Stack Configuration
~~~~~~~~~~~~~~~~~~~
The stack configuration is a spack environment file with two additional sections added.
Stack configurations should be located in ``share/spack/gitlab/cloud_pipelines/stacks/<stack_name>/spack.yaml``.
The ``ci`` section is generally used to define stack-specific mappings such as image or tags.
For more information on what can go into the ``ci`` section, refer to the docs on pipelines.
The ``cdash`` section is used for defining where to upload the results of builds. Spack configures
most of the details for posting pipeline results to
`cdash.spack.io <https://cdash.spack.io/index.php?project=Spack+Testing>`_. The only
requirement in the stack configuration is to define a ``build-group`` that is unique;
this is usually the long name of the stack.
An example stack that builds ``zlib``.
.. code-block:: yaml
spack:
view: false
packages:
all:
require: ["%gcc", "target=x86_64_v3"]
specs:
- zlib
ci:
pipeline-gen:
- build-job:
image: my-super-cool-stack-image:0.0.1
cdash:
build-group: My Super Cool Stack
.. note::
The ``image`` used in the ``*-generate`` job must match exactly the ``image`` used in the ``build-job``.
When the images do not match, the build job may fail.
"""""""""""""""""""
Registering Runners
"""""""""""""""""""
Contributing computational resources to Spack's CI build farm is one way to help expand the
capabilities and offerings of the public Spack build caches. Currently, Spack utilizes Linux runners
from AWS, Google, and the University of Oregon (UO).
Runners require four key pieces:
* Runner Registration Token
* Accurate tags
* OIDC Authentication script
* GPG keys
Minimum GitLab Runner Version: ``16.1.0``
`Installation instructions <https://docs.gitlab.com/runner/install/>`_
Registration Token
~~~~~~~~~~~~~~~~~~
The first step to contribute new runners is to open an issue in the `spack infrastructure <https://github.com/spack/spack-infrastructure/issues/new?assignees=&labels=runner-registration&projects=&template=runner_registration.yml>`_
project. This will be reported to the spack infrastructure team, who will guide users through the process
of registering new runners for Spack CI.
The information needed to register a runner is the motivation for the new resources, a semi-detailed description of
the runner, and finally the point of contact for maintaining the software on the runner.
The point of contact will then work with the infrastructure team to obtain runner registration token(s) for interacting
with Spack's GitLab instance. Once the runner is active, this point of contact will also be responsible for updating the
GitLab runner software to keep pace with Spack's GitLab.
Tagging
~~~~~~~
In the initial stages of runner registration it is important to **exclude** the special tag ``spack``. This will prevent
the new runner(s) from being picked up for production CI jobs while they are configured and evaluated. Once it is determined
that a runner is ready for production use, the ``spack`` tag will be added.
Because GitLab has no concept of tag exclusion, runners that provide specialized resources also require specialized tags.
For example, a basic CPU-only x86_64 runner may have the tag ``x86_64`` associated with it. However, a runner containing a
CUDA-capable GPU may have the tag ``x86_64-cuda`` to denote that it should only be used for packages that will benefit from
a CUDA-capable resource.
OIDC
~~~~
Spack runners use OIDC authentication for connecting to the appropriate AWS bucket
which is used for coordinating the communication of binaries between build jobs. In
order to configure OIDC authentication, Spack CI runners use a Python script with minimal
dependencies. This script can be configured for runners as shown below using the ``pre_build_script``.
.. code-block:: toml
[[runners]]
pre_build_script = """
echo 'Executing Spack pre-build setup script'
for cmd in "${PY3:-}" python3 python; do
if command -v > /dev/null "$cmd"; then
export PY3="$(command -v "$cmd")"
break
fi
done
if [ -z "${PY3:-}" ]; then
echo "Unable to find python3 executable"
exit 1
fi
$PY3 -c "import urllib.request;urllib.request.urlretrieve('https://raw.githubusercontent.com/spack/spack-infrastructure/main/scripts/gitlab_runner_pre_build/pre_build.py', 'pre_build.py')"
$PY3 pre_build.py > envvars
. ./envvars
rm -f envvars
unset GITLAB_OIDC_TOKEN
"""
GPG Keys
~~~~~~~~
Runners that may be utilized for ``protected`` CI require the registration of an intermediate signing key that
can be used to sign packages. For more information on package signing read :ref:`key_architecture`.
--------
Coverage
--------

View File

@@ -5,49 +5,56 @@
.. _environments:
=========================
Environments (spack.yaml)
=========================
=====================================
Environments (spack.yaml, spack.lock)
=====================================
An environment is used to group together a set of specs for the
purpose of building, rebuilding and deploying in a coherent fashion.
Environments provide a number of advantages over the *à la carte*
approach of building and loading individual Spack modules:
An environment is used to group a set of specs intended for some purpose
to be built, rebuilt, and deployed in a coherent fashion. Environments
define aspects of the installation of the software, such as:
#. Environments separate the steps of (a) choosing what to
install, (b) concretizing, and (c) installing. This allows
Environments to remain stable and repeatable, even if Spack packages
are upgraded: specs are only re-concretized when the user
explicitly asks for it. It is even possible to reliably
transport environments between different computers running
different versions of Spack!
#. Environments allow several specs to be built at once; a more robust
solution than ad-hoc scripts making multiple calls to ``spack
install``.
#. An Environment that is built as a whole can be loaded as a whole
into the user environment. An Environment can be built to maintain
a filesystem view of its packages, and the environment can load
that view into the user environment at activation time. Spack can
also generate a script to load all modules related to an
environment.
#. *which* specs to install;
#. *how* those specs are configured; and
#. *where* the concretized software will be installed.
Aggregating this information into an environment for processing has advantages
over the *à la carte* approach of building and loading individual Spack modules.
With environments, you concretize, install, or load (activate) all of the
specs with a single command. Concretization fully configures the specs
and dependencies of the environment in preparation for installing the
software. This is a more robust solution than ad-hoc installation scripts.
And you can share an environment or even re-use it on a different computer.
Environment definitions, especially *how* specs are configured, allow the
software to remain stable and repeatable even when Spack packages are upgraded. Changes are only picked up when the environment is explicitly re-concretized.
Defining *where* specs are installed supports a filesystem view of the
environment. Yet Spack maintains a single installation of the software that
can be re-used across multiple environments.
Activating an environment determines *when* all of the associated (and
installed) specs are loaded, which limits the software loaded to those specs
actually needed by the environment. Spack can even generate a script to
load all modules related to an environment.
Other packaging systems also provide environments that are similar in
some ways to Spack environments; for example, `Conda environments
<https://conda.io/docs/user-guide/tasks/manage-environments.html>`_ or
`Python Virtual Environments
<https://docs.python.org/3/tutorial/venv.html>`_. Spack environments
provide some distinctive features:
provide some distinctive features though:
#. A spec installed "in" an environment is no different from the same
spec installed anywhere else in Spack. Environments are assembled
simply by collecting together a set of specs.
#. Spack Environments may contain more than one spec of the same
spec installed anywhere else in Spack.
#. Spack environments may contain more than one spec of the same
package.
Spack uses a "manifest and lock" model similar to `Bundler gemfiles
<https://bundler.io/man/gemfile.5.html>`_ and other package
managers. The user input file is named ``spack.yaml`` and the lock
file is named ``spack.lock``
<https://bundler.io/man/gemfile.5.html>`_ and other package managers.
The environment's user input file (or manifest) is named ``spack.yaml``.
The lock file, which contains the fully configured and concretized specs,
is named ``spack.lock``.
.. _environments-using:
@@ -68,55 +75,60 @@ An environment is created by:
$ spack env create myenv
Spack then creates the directory ``var/spack/environments/myenv``.
The directory ``$SPACK_ROOT/var/spack/environments/myenv`` is created
to manage the environment.
.. note::
All managed environments by default are stored in the ``var/spack/environments`` folder.
This location can be changed by setting the ``environments_root`` variable in ``config.yaml``.
All managed environments by default are stored in the
``$SPACK_ROOT/var/spack/environments`` folder. This location can be changed
by setting the ``environments_root`` variable in ``config.yaml``.
In the ``var/spack/environments/myenv`` directory, Spack creates the
file ``spack.yaml`` and the hidden directory ``.spack-env``.
Spack stores metadata in the ``.spack-env`` directory. User
interaction will occur through the ``spack.yaml`` file and the Spack
commands that affect it. When the environment is concretized, Spack
will create a file ``spack.lock`` with the concrete information for
Spack creates the file ``spack.yaml``, hidden directory ``.spack-env``, and
``spack.lock`` file under ``$SPACK_ROOT/var/spack/environments/myenv``. User
interaction occurs through the ``spack.yaml`` file and the Spack commands
that affect it. Metadata and, by default, the view are stored in the
``.spack-env`` directory. When the environment is concretized, Spack creates
the ``spack.lock`` file with the fully configured specs and dependencies for
the environment.
In addition to being the default location for the view associated with
an Environment, the ``.spack-env`` directory also contains:
The ``.spack-env`` subdirectory also contains:
* ``repo/``: A repo consisting of the Spack packages used in this
environment. This allows the environment to build the same, in
theory, even on different versions of Spack with different
* ``repo/``: A subdirectory acting as the repo consisting of the Spack
packages used in the environment. It allows the environment to build
the same, in theory, even on different versions of Spack with different
packages!
* ``logs/``: A directory containing the build logs for the packages
in this Environment.
* ``logs/``: A subdirectory containing the build logs for the packages
in this environment.
Spack Environments can also be created from either a manifest file
(usually but not necessarily named, ``spack.yaml``) or a lockfile.
To create an Environment from a manifest:
Spack Environments can also be created from either the user input (manifest)
file or the lockfile. Create an environment from a manifest using:
.. code-block:: console
$ spack env create myenv spack.yaml
To create an Environment from a ``spack.lock`` lockfile:
The resulting environment is guaranteed to have the same root specs as
the original but may concretize differently in the presence of different
explicit or default configuration settings (e.g., a different version of
Spack or for a different user account).
Create an environment from a ``spack.lock`` file using:
.. code-block:: console
$ spack env create myenv spack.lock
Either of these commands can also take a full path to the
initialization file.
The resulting environment, when on the same or a compatible machine, is
guaranteed to initially have the same concrete specs as the original.
A Spack Environment created from a ``spack.yaml`` manifest is
guaranteed to have the same root specs as the original Environment,
but may concretize differently. A Spack Environment created from a
``spack.lock`` lockfile is guaranteed to have the same concrete specs
as the original Environment. Either may obviously then differ as the
user modifies it.
.. note::
Environment creation also accepts a full path to the file.
If the path is not under the ``$SPACK_ROOT/var/spack/environments``
directory then the source is referred to as an
:ref:`independent environment <independent_environments>`.
^^^^^^^^^^^^^^^^^^^^^^^^^
Activating an Environment
@@ -129,7 +141,7 @@ To activate an environment, use the following command:
$ spack env activate myenv
By default, ``spack env activate`` will load the view associated
with the Environment into the user environment. The ``-v,
with the environment into the user environment. The ``-v,
--with-view`` argument ensures this behavior, and the ``-V,
--without-view`` argument activates the environment without changing
the user environment variables.
@@ -142,8 +154,11 @@ user's prompt to begin with the environment name in brackets.
$ spack env activate -p myenv
[myenv] $ ...
The ``activate`` command can also be used to create a new environment if it does not already
exist.
The ``activate`` command can also be used to create a new environment, if it is
not already defined, by adding the ``--create`` flag. Managed and independent
environments can both be created using the same flags that ``spack env create``
accepts. If an environment already exists, Spack will simply activate it
and ignore the create-specific flags.
.. code-block:: console
@@ -168,49 +183,50 @@ or the shortcut alias
If the environment was activated with its view, deactivating the
environment will remove the view from the user environment.
^^^^^^^^^^^^^^^^^^^^^^
Anonymous Environments
^^^^^^^^^^^^^^^^^^^^^^
.. _independent_environments:
Apart from managed environments, Spack also supports anonymous environments.
^^^^^^^^^^^^^^^^^^^^^^^^
Independent Environments
^^^^^^^^^^^^^^^^^^^^^^^^
Anonymous environments can be placed in any directory of choice.
Independent environments can be located in any directory outside of Spack.
.. note::
When uninstalling packages, Spack asks the user to confirm the removal of packages
that are still used in a managed environment. This is not the case for anonymous
that are still used in a managed environment. This is not the case for independent
environments.
To create an anonymous environment, use one of the following commands:
To create an independent environment, use one of the following commands:
.. code-block:: console
$ spack env create --dir my_env
$ spack env create ./my_env
As a shorthand, you can also create an anonymous environment upon activation if it does not
As a shorthand, you can also create an independent environment upon activation if it does not
already exist:
.. code-block:: console
$ spack env activate --create ./my_env
For convenience, Spack can also place an anonymous environment in a temporary directory for you:
For convenience, Spack can also place an independent environment in a temporary directory for you:
.. code-block:: console
$ spack env activate --temp
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Environment Sensitive Commands
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^
Environment-Aware Commands
^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack commands are environment sensitive. For example, the ``find``
command shows only the specs in the active Environment if an
Environment has been activated. Similarly, the ``install`` and
``uninstall`` commands act on the active environment.
Spack commands are environment-aware. For example, the ``find``
command shows only the specs in the active environment if an
environment has been activated. Otherwise it shows all specs in
the Spack instance. The same rule applies to the ``install`` and
``uninstall`` commands.
.. code-block:: console
@@ -255,32 +271,33 @@ Environment has been activated. Similarly, the ``install`` and
Note that when we installed the abstract spec ``zlib@1.2.8``, it was
presented as a root of the Environment. All explicitly installed
packages will be listed as roots of the Environment.
presented as a root of the environment. All explicitly installed
packages will be listed as roots of the environment.
All of the Spack commands that act on the list of installed specs are
Environment-sensitive in this way, including ``install``,
``uninstall``, ``find``, ``extensions``, and more. In the
environment-aware in this way, including ``install``,
``uninstall``, ``find``, ``extensions``, etcetera. In the
:ref:`environment-configuration` section we will discuss
Environment-sensitive commands further.
environment-aware commands further.
^^^^^^^^^^^^^^^^^^^^^
Adding Abstract Specs
^^^^^^^^^^^^^^^^^^^^^
An abstract spec is the user-specified spec before Spack has applied
any defaults or dependency information.
An abstract spec is the user-specified spec before Spack applies
defaults or dependency information.
Users can add abstract specs to an Environment using the ``spack add``
command. The most important component of an Environment is a list of
Users can add abstract specs to an environment using the ``spack add``
command. The most important component of an environment is a list of
abstract specs.
Adding a spec adds to the manifest (the ``spack.yaml`` file), which is
used to define the roots of the Environment, but does not affect the
concrete specs in the lockfile, nor does it install the spec.
Adding a spec adds it as a root spec of the environment in the user
input file (``spack.yaml``). It does not affect the concrete specs
in the lock file (``spack.lock``) and it does not install the spec.
The ``spack add`` command is environment aware. It adds to the
currently active environment. All environment aware commands can also
The ``spack add`` command is environment-aware. It adds the spec to the
currently active environment. An error is generated if there isn't an
active environment. All environment-aware commands can also
be called using the ``spack -e`` flag to specify the environment.
.. code-block:: console
@@ -300,11 +317,11 @@ or
Concretizing
^^^^^^^^^^^^
Once some user specs have been added to an environment, they can be concretized.
There are at the moment three different modes of operation to concretize an environment,
which are explained in details in :ref:`environments_concretization_config`.
Regardless of which mode of operation has been chosen, the following
command will ensure all the root specs are concretized according to the
Once user specs have been added to an environment, they can be concretized.
There are three different modes of operation to concretize an environment,
explained in detail in :ref:`environments_concretization_config`.
Regardless of which mode of operation is chosen, the following
command will ensure all of the root specs are concretized according to the
constraints that are prescribed in the configuration:
.. code-block:: console
@@ -313,16 +330,15 @@ constraints that are prescribed in the configuration:
In the case of specs that are not concretized together, the command
above will concretize only the specs that were added and not yet
concretized. Forcing a re-concretization of all the specs can be done
instead with this command:
concretized. Forcing a re-concretization of all of the specs can be done
by adding the ``-f`` option:
.. code-block:: console
[myenv]$ spack concretize -f
When the ``-f`` flag is not used to reconcretize all specs, Spack
guarantees that already concretized specs are unchanged in the
environment.
Without the option, Spack guarantees that already concretized specs are
unchanged in the environment.
The ``concretize`` command does not install any packages. For packages
that have already been installed outside of the environment, the
@@ -355,16 +371,16 @@ installed specs using the ``-c`` (``--concretized``) flag.
Installing an Environment
^^^^^^^^^^^^^^^^^^^^^^^^^
In addition to installing individual specs into an Environment, one
can install the entire Environment at once using the command
In addition to adding individual specs to an environment, one
can install the entire environment at once using the command
.. code-block:: console
[myenv]$ spack install
If the Environment has been concretized, Spack will install the
concretized specs. Otherwise, ``spack install`` will first concretize
the Environment and then install the concretized specs.
If the environment has been concretized, Spack will install the
concretized specs. Otherwise, ``spack install`` will concretize
the environment before installing the concretized specs.
.. note::
@@ -385,17 +401,17 @@ the Environment and then install the concretized specs.
As it installs, ``spack install`` creates symbolic links in the
``logs/`` directory in the Environment, allowing for easy inspection
``logs/`` directory in the environment, allowing for easy inspection
of build logs related to that environment. The ``spack install``
command also stores a Spack repo containing the ``package.py`` file
used at install time for each package in the ``repos/`` directory in
the Environment.
the environment.
The ``--no-add`` option can be used in a concrete environment to tell
spack to install specs already present in the environment but not to
add any new root specs to the environment. For root specs provided
to ``spack install`` on the command line, ``--no-add`` is the default,
while for dependency specs on the other hand, it is optional. In other
while for dependency specs, it is optional. In other
words, if there is an unambiguous match in the active concrete environment
for a root spec provided to ``spack install`` on the command line, spack
does not require you to specify the ``--no-add`` option to prevent the spec
@@ -414,7 +430,13 @@ default, it will also clone the package to a subdirectory in the
environment. This package will have a special variant ``dev_path``
set, and Spack will ensure the package and its dependents are rebuilt
any time the environment is installed if the package's local source
code has been modified. Spack ensures that all instances of a
code has been modified. Spack's native check for modifications
is whether the source's ``mtime`` is newer than the installation.
A custom check can be created by overriding the ``detect_dev_src_change`` method
in your package class. This is particularly useful for projects that use custom Spack repos
to drive development and want to optimize performance.
Spack ensures that all instances of a
developed package in the environment are concretized to match the
version (and other constraints) passed as the spec argument to the
``spack develop`` command.
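A minimal sketch of such an override is shown below; the exact signature is an assumption here
(a no-argument method returning a boolean), and the marker-file convention is purely
illustrative, so consult the base class before relying on it.

.. code-block:: python

   import os

   from spack.package import *


   class MyLib(Package):
       """Hypothetical package illustrating a custom dev-source-change check."""

       def detect_dev_src_change(self) -> bool:
           # Assumed contract: return True when the local development source
           # has changed and the package (and its dependents) should rebuild.
           # This sketch treats a developer-maintained marker file as "changed".
           marker = os.path.join(self.stage.source_path, ".rebuild-needed")
           return os.path.exists(marker)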
@@ -424,7 +446,7 @@ also be used as valid concrete versions (see :ref:`version-specifier`).
This means that for a package ``foo``, ``spack develop foo@git.main`` will clone
the ``main`` branch of the package, and ``spack install`` will install from
that git clone if ``foo`` is in the environment.
Further development on ``foo`` can be tested by reinstalling the environment,
Further development on ``foo`` can be tested by re-installing the environment,
and eventually committed and pushed to the upstream git repo.
If the package being developed supports out-of-source builds then users can use the
@@ -609,7 +631,7 @@ manipulate configuration inline in the ``spack.yaml`` file.
Inline configurations
^^^^^^^^^^^^^^^^^^^^^
Inline Environment-scope configuration is done using the same yaml
Inline environment-scope configuration is done using the same YAML
format as standard Spack configuration scopes, covered in the
:ref:`configuration` section. Each section is contained under a
top-level YAML object with its name. For example, a ``spack.yaml``
@@ -634,7 +656,7 @@ Included configurations
Spack environments allow an ``include`` heading in their yaml
schema. This heading pulls in external configuration files and applies
them to the Environment.
them to the environment.
.. code-block:: yaml
@@ -661,7 +683,7 @@ have higher precedence, as the included configs are applied in reverse order.
Manually Editing the Specs List
-------------------------------
The list of abstract/root specs in the Environment is maintained in
The list of abstract/root specs in the environment is maintained in
the ``spack.yaml`` manifest under the heading ``specs``.
.. code-block:: yaml
@@ -769,7 +791,7 @@ evaluates to the cross-product of those specs. Spec matrices also
contain an ``excludes`` directive, which eliminates certain
combinations from the evaluated result.
The following two Environment manifests are identical:
The following two environment manifests are identical:
.. code-block:: yaml
@@ -844,7 +866,7 @@ files are identical.
In short files like the example, it may be easier to simply list the
included specs. However, for more complicated examples involving many
packages across many toolchains, separately factored lists make
Environments substantially more manageable.
environments substantially more manageable.
Additionally, the ``-l`` option to the ``spack add`` command allows
one to add to named lists in the definitions section of the manifest
@@ -893,9 +915,8 @@ The valid variables for a ``when`` clause are:
#. ``env``. The user environment (usually ``os.environ`` in Python).
#. ``hostname``. The hostname of the system.
#. ``full_hostname``. The fully qualified hostname of the system.
#. ``hostname``. The hostname of the system (if ``hostname`` is an
executable in the user's PATH).
^^^^^^^^^^^^^^^^^^^^^^^^
SpecLists as Constraints
@@ -1061,7 +1082,7 @@ true``). The argument ``--without-view`` can be used to create an
environment without any view configured.
The ``spack env view`` command can be used to change the managed views
of an Environment. The subcommand ``spack env view enable`` will add a
of an environment. The subcommand ``spack env view enable`` will add a
view named ``default`` to an environment. It takes an optional
argument to specify the path for the new default view. The subcommand
``spack env view disable`` will remove the view named ``default`` from
@@ -1229,7 +1250,7 @@ gets installed and is available for use in the ``env`` target.
$(SPACK) -e . env depfile -o $@ --make-prefix spack
env: spack/env
$(info Environment installed!)
$(info environment installed!)
clean:
rm -rf spack.lock env.mk spack/

View File

@@ -61,10 +61,15 @@ Getting Spack is easy. You can clone it from the `github repository
.. code-block:: console
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git
This will create a directory called ``spack``.
.. note::
``-c feature.manyFiles=true`` improves git's performance on repositories with 1,000+ files.
``--depth=2`` prunes the git history to reduce the size of the Spack installation.
.. _shell-support:
^^^^^^^^^^^^^
@@ -1475,16 +1480,14 @@ in a Windows CMD prompt.
Step 3: Run and configure Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To use Spack, run ``bin\spack_cmd.bat`` (you may need to Run as Administrator) from the top-level spack
directory. This will provide a Windows command prompt with an environment properly set up with Spack
and its prerequisites. If you receive a warning message that Python is not in your ``PATH``
On Windows, Spack supports both primary native shells: PowerShell and the traditional command prompt.
To use Spack, pick your favorite shell and run ``bin\spack_cmd.bat`` or ``share/spack/setup-env.ps1``
(you may need to Run as Administrator) from the top-level Spack
directory. This will provide a Spack-enabled shell. If you receive a warning message that Python is not in your ``PATH``
(which may happen if you installed Python from the website and not the Windows Store) add the location
of the Python executable to your ``PATH`` now. You can permanently add Python to your ``PATH`` variable
by using the ``Edit the system environment variables`` utility in Windows Control Panel.
.. note::
Alternatively, Powershell can be used in place of CMD
To configure Spack, first run the following command inside the Spack console:
.. code-block:: console
@@ -1549,7 +1552,7 @@ and not tabs, so ensure that this is the case when editing one directly.
.. note:: Cygwin
The use of Cygwin is not officially supported by Spack and is not tested.
However Spack will not throw an error, so use if choosing to use Spack
However, Spack will not prevent this, so if you choose to use Spack
with Cygwin, know that no functionality is guaranteed.
^^^^^^^^^^^^^^^^^
@@ -1563,21 +1566,12 @@ Spack console via:
spack install cpuinfo
If in the previous step, you did not have CMake or Ninja installed, running the command above should bootstrap both packages
If, in the previous step, you did not have CMake or Ninja installed, running the command above should install both packages.
"""""""""""""""""""""""""""
Windows Compatible Packages
"""""""""""""""""""""""""""
.. note:: Spec Syntax Caveats
Windows has a few idiosyncrasies when it comes to the Spack spec syntax and the use of certain shells.
See the Spack spec syntax doc for more information.
Not all Spack packages currently have Windows support. Some are inherently incompatible with the
platform, and others simply have yet to be ported. To view the current set of packages with Windows
support, the list command should be used via ``spack list -t windows``. If there's a package you'd like
to install on Windows that is not in that list, feel free to reach out to request the port or contribute
the port yourself.
.. note::
This is by no means a comprehensive list; some packages may have ports that were not tagged,
while others may just work out of the box on Windows and have not been tagged as such.
^^^^^^^^^^^^^^
For developers
@@ -1587,6 +1581,3 @@ The intent is to provide a Windows installer that will automatically set up
Python, Git, and Spack, instead of requiring the user to do so manually.
Instructions for creating the installer are at
https://github.com/spack/spack/blob/develop/lib/spack/spack/cmd/installer/README.md
Alternatively, a pre-built copy of the Windows installer is available as an artifact of Spack's Windows CI
at each run of the CI on develop or any PR.

View File

@@ -39,10 +39,15 @@ package:
.. code-block:: console
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git
$ cd spack/bin
$ ./spack install libelf
.. note::
``-c feature.manyFiles=true`` improves git's performance on repositories with 1,000+ files.
``--depth=2`` prunes the git history to reduce the size of the Spack installation.
If you're new to spack and want to start using it, see :doc:`getting_started`,
or refer to the full manual below.

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.5-dev (commit 7e6740012b897ae4a950f0bba7e9726b767e921f)
* Version: 0.2.5-dev (commit bceb39528ac49dd0c876b2e9bf3e7482e9c2be4a)
astunparse
----------------

View File

@@ -115,6 +115,9 @@ def __eq__(self, other):
and self.cpu_part == other.cpu_part
)
def __hash__(self):
return hash(self.name)
@coerce_target_names
def __ne__(self, other):
return not self == other

View File

@@ -2844,8 +2844,7 @@
"asimdrdm",
"lrcpc",
"dcpop",
"asimddp",
"ssbs"
"asimddp"
],
"compilers" : {
"gcc": [
@@ -2942,7 +2941,6 @@
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"dcpodp",
"svei8mm",
"svebf16",
@@ -3010,7 +3008,7 @@
},
{
"versions": "11:",
"flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
"flags" : "-march=armv8.4-a+sve+fp16+bf16+crypto+i8mm+rng"
},
{
"versions": "12:",
@@ -3066,7 +3064,6 @@
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"dcpodp",
"sve2",
@@ -3179,7 +3176,6 @@
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"dcpodp",
"sve2",

View File

@@ -12,7 +12,7 @@
import sys
import traceback
from datetime import datetime, timedelta
from typing import Any, Callable, Iterable, List, Tuple
from typing import Callable, Iterable, List, Tuple, TypeVar
# Ignore emacs backups when listing modules
ignore_modules = r"^\.#|~$"
@@ -879,9 +879,12 @@ def enum(**kwargs):
return type("Enum", (object,), kwargs)
T = TypeVar("T")
def stable_partition(
input_iterable: Iterable, predicate_fn: Callable[[Any], bool]
) -> Tuple[List[Any], List[Any]]:
input_iterable: Iterable[T], predicate_fn: Callable[[T], bool]
) -> Tuple[List[T], List[T]]:
"""Partition the input iterable according to a custom predicate.
Args:
@@ -893,12 +896,13 @@ def stable_partition(
Tuple of the list of elements evaluating to True, and
list of elements evaluating to False.
"""
true_items, false_items = [], []
true_items: List[T] = []
false_items: List[T] = []
for item in input_iterable:
if predicate_fn(item):
true_items.append(item)
continue
false_items.append(item)
else:
false_items.append(item)
return true_items, false_items
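# Illustrative usage (not part of the original diff): the TypeVar above lets
# callers keep a precise element type in both returned lists.
evens, odds = stable_partition([1, 2, 3, 4, 5], lambda n: n % 2 == 0)
assert evens == [2, 4] and odds == [1, 3, 5]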

View File

@@ -3,6 +3,13 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
from typing import Optional
import spack.paths
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.23.0.dev0"
spack_version = __version__
@@ -19,4 +26,47 @@ def __try_int(v):
spack_version_info = tuple([__try_int(v) for v in __version__.split(".")])
__all__ = ["spack_version_info", "spack_version"]
def get_spack_commit() -> Optional[str]:
"""Get the Spack git commit sha.
Returns:
(str or None) the commit sha if available, otherwise None
"""
git_path = os.path.join(spack.paths.prefix, ".git")
if not os.path.exists(git_path):
return None
git = spack.util.git.git()
if not git:
return None
rev = git(
"-C",
spack.paths.prefix,
"rev-parse",
"HEAD",
output=str,
error=os.devnull,
fail_on_error=False,
)
if git.returncode != 0:
return None
match = re.match(r"[a-f\d]{7,}$", rev)
return match.group(0) if match else None
def get_version() -> str:
"""Get a descriptive version of this instance of Spack.
Outputs '<PEP440 version> (<git commit sha>)'.
The commit sha is only added when available.
"""
commit = get_spack_commit()
if commit:
return f"{spack_version} ({commit})"
return spack_version
__all__ = ["spack_version_info", "spack_version", "get_version", "get_spack_commit"]

View File

@@ -39,6 +39,7 @@ def _search_duplicate_compilers(error_cls):
import collections
import collections.abc
import glob
import inspect
import io
import itertools
import os
@@ -50,10 +51,12 @@ def _search_duplicate_compilers(error_cls):
from urllib.request import urlopen
import llnl.util.lang
from llnl.string import plural
import spack.builder
import spack.config
import spack.fetch_strategy
import spack.patch
import spack.paths
import spack.repo
import spack.spec
import spack.util.crypto
@@ -281,7 +284,7 @@ def _avoid_mismatched_variants(error_cls):
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for variant in current_spec.variants.values():
# Variant does not exist at all
if variant.name not in pkg_cls.variants:
if variant.name not in pkg_cls.variant_names():
summary = (
f"Setting a preference for the '{pkg_name}' package to the "
f"non-existing variant '{variant.name}'"
@@ -290,9 +293,8 @@ def _avoid_mismatched_variants(error_cls):
continue
# Variant cannot accept this value
s = spack.spec.Spec(pkg_name)
try:
s.update_variant_validate(variant.name, variant.value)
spack.variant.prevalidate_variant_value(pkg_cls, variant, strict=True)
except Exception:
summary = (
f"Setting the variant '{variant.name}' of the '{pkg_name}' package "
@@ -386,6 +388,14 @@ def _make_config_error(config_data, summary, error_cls):
)
package_deprecated_attributes = AuditClass(
group="packages",
tag="PKG-DEPRECATED-ATTRIBUTES",
description="Sanity checks to preclude use of deprecated package attributes",
kwargs=("pkgs",),
)
package_properties = AuditClass(
group="packages",
tag="PKG-PROPERTIES",
@@ -404,22 +414,23 @@ def _make_config_error(config_data, summary, error_cls):
)
@package_directives
@package_properties
def _check_build_test_callbacks(pkgs, error_cls):
"""Ensure stand-alone test method is not included in build-time callbacks"""
"""Ensure stand-alone test methods are not included in build-time callbacks.
Test methods are for checking the installed software as stand-alone tests.
They could also be called during the post-install phase of a build.
"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
test_callbacks = getattr(pkg_cls, "build_time_test_callbacks", None)
# TODO (post-34236): "test*"->"test_*" once remove deprecated methods
# TODO (post-34236): "test"->"test_" once remove deprecated methods
has_test_method = test_callbacks and any([m.startswith("test") for m in test_callbacks])
has_test_method = test_callbacks and any([m.startswith("test_") for m in test_callbacks])
if has_test_method:
msg = '{0} package contains "test*" method(s) in ' "build_time_test_callbacks"
instr = 'Remove all methods whose names start with "test" from: [{0}]'.format(
", ".join(test_callbacks)
)
msg = f"Package {pkg_name} includes stand-alone test methods in build-time checks."
callbacks = ", ".join(test_callbacks)
instr = f"Remove the following from 'build_time_test_callbacks': {callbacks}"
errors.append(error_cls(msg.format(pkg_name), [instr]))
return errors
@@ -517,6 +528,46 @@ def _search_for_reserved_attributes_names_in_packages(pkgs, error_cls):
return errors
@package_deprecated_attributes
def _search_for_deprecated_package_methods(pkgs, error_cls):
"""Ensure the package doesn't define or use deprecated methods"""
DEPRECATED_METHOD = (("test", "a name starting with 'test_'"),)
DEPRECATED_USE = (
("self.cache_extra_test_sources(", "cache_extra_test_sources(self, ..)"),
("self.install_test_root(", "install_test_root(self, ..)"),
("self.run_test(", "test_part(self, ..)"),
)
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
methods = inspect.getmembers(pkg_cls, predicate=lambda x: inspect.isfunction(x))
method_errors = collections.defaultdict(list)
for name, function in methods:
for deprecated_name, alternate in DEPRECATED_METHOD:
if name == deprecated_name:
msg = f"Rename '{deprecated_name}' method to {alternate} instead."
method_errors[name].append(msg)
source = inspect.getsource(function)
for deprecated_name, alternate in DEPRECATED_USE:
if deprecated_name in source:
msg = f"Change '{deprecated_name}' to '{alternate}' in '{name}' method."
method_errors[name].append(msg)
num_methods = len(method_errors)
if num_methods > 0:
methods = plural(num_methods, "method", show_n=False)
error_msg = (
f"Package '{pkg_name}' implements or uses unsupported deprecated {methods}."
)
instr = [f"Make changes to '{pkg_cls.__module__}':"]
for name in sorted(method_errors):
instr.extend([f" {msg}" for msg in method_errors[name]])
errors.append(error_cls(error_msg, instr))
return errors
@package_properties
def _ensure_all_package_names_are_lowercase(pkgs, error_cls):
"""Ensure package names are lowercase and consistent"""
@@ -662,9 +713,15 @@ def _ensure_env_methods_are_ported_to_builders(pkgs, error_cls):
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
buildsystem_variant, _ = pkg_cls.variants["build_system"]
buildsystem_names = [getattr(x, "value", x) for x in buildsystem_variant.values]
builder_cls_names = [spack.builder.BUILDER_CLS[x].__name__ for x in buildsystem_names]
# values are either Value objects (for conditional values) or the values themselves
build_system_names = set(
v.value if isinstance(v, spack.variant.Value) else v
for _, variant in pkg_cls.variant_definitions("build_system")
for v in variant.values
)
builder_cls_names = [spack.builder.BUILDER_CLS[x].__name__ for x in build_system_names]
module = pkg_cls.module
has_builders_in_package_py = any(
getattr(module, name, False) for name in builder_cls_names
@@ -765,6 +822,89 @@ def _uses_deprecated_globals(pkgs, error_cls):
return errors
@package_properties
def _ensure_test_docstring(pkgs, error_cls):
"""Ensure stand-alone test methods have a docstring.
The docstring of a test method is implicitly used as the description of
the corresponding test part during test results reporting.
"""
doc_regex = r'\s+("""[^"]+""")'
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
methods = inspect.getmembers(pkg_cls, predicate=lambda x: inspect.isfunction(x))
method_names = []
for name, test_fn in methods:
if not name.startswith("test_"):
continue
# Ensure the test method has a docstring
source = inspect.getsource(test_fn)
match = re.search(doc_regex, source)
if match is None or len(match.group(0).replace('"', "").strip()) == 0:
method_names.append(name)
num_methods = len(method_names)
if num_methods > 0:
methods = plural(num_methods, "method", show_n=False)
docstrings = plural(num_methods, "docstring", show_n=False)
msg = f"Package {pkg_name} has test {methods} with empty or missing {docstrings}."
names = ", ".join(method_names)
instr = [
"Docstrings are used as descriptions in test outputs.",
f"Add a concise summary to the following {methods} in '{pkg_cls.__module__}':",
f"{names}",
]
errors.append(error_cls(msg, instr))
return errors
@package_properties
def _ensure_test_implemented(pkgs, error_cls):
"""Ensure stand-alone test methods are implemented.
The test method is also required to be non-empty.
"""
def skip(line):
ln = line.strip()
return ln.startswith("#") or "pass" in ln
doc_regex = r'\s+("""[^"]+""")'
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
methods = inspect.getmembers(pkg_cls, predicate=lambda x: inspect.isfunction(x))
method_names = []
for name, test_fn in methods:
if not name.startswith("test_"):
continue
source = inspect.getsource(test_fn)
# Attempt to ensure the test method is implemented.
impl = re.sub(doc_regex, r"", source).splitlines()[1:]
lines = [ln.strip() for ln in impl if not skip(ln)]
if not lines:
method_names.append(name)
num_methods = len(method_names)
if num_methods > 0:
methods = plural(num_methods, "method", show_n=False)
msg = f"Package {pkg_name} has empty or missing test {methods}."
names = ", ".join(method_names)
instr = [
f"Implement or remove the following {methods} from '{pkg_cls.__module__}': {names}"
]
errors.append(error_cls(msg, instr))
return errors
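# Illustrative only (not part of audit.py): in a package recipe (package.py,
# with ``from spack.package import *`` in scope), a stand-alone test method
# that satisfies both audits above looks like the following -- the name starts
# with "test_", the docstring doubles as the test-part description, and the
# body is non-empty. The binary name is hypothetical.
#
#   def test_version(self):
#       """check that the installed binary reports a version"""
#       example = which(self.prefix.bin.example)
#       example("--version")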
@package_https_directives
def _linting_package_file(pkgs, error_cls):
"""Check for correctness of links"""
@@ -931,20 +1071,22 @@ def check_virtual_with_variants(spec, msg):
# check variants
dependency_variants = dep.spec.variants
for name, value in dependency_variants.items():
for name, variant in dependency_variants.items():
try:
v, _ = dependency_pkg_cls.variants[name]
v.validate_or_raise(value, pkg_cls=dependency_pkg_cls)
spack.variant.prevalidate_variant_value(
dependency_pkg_cls, variant, dep.spec, strict=True
)
except Exception as e:
summary = (
f"{pkg_name}: wrong variant used for dependency in 'depends_on()'"
)
error_msg = str(e)
if isinstance(e, KeyError):
error_msg = (
f"variant {str(e).strip()} does not exist in package {dep_name}"
f" in package '{dep_name}'"
)
error_msg += f" in package '{dep_name}'"
errors.append(
error_cls(summary=summary, details=[error_msg, f"in {filename}"])
@@ -956,39 +1098,38 @@ def check_virtual_with_variants(spec, msg):
@package_directives
def _ensure_variant_defaults_are_parsable(pkgs, error_cls):
"""Ensures that variant defaults are present and parsable from cli"""
def check_variant(pkg_cls, variant, vname):
# bool is a subclass of int in python. Permitting a default that is an instance
# of 'int' means both foo=false and foo=0 are accepted. Other falsish values are
# not allowed, since they can't be parsed from CLI ('foo=')
default_is_parsable = isinstance(variant.default, int) or variant.default
if not default_is_parsable:
msg = f"Variant '{vname}' of package '{pkg_cls.name}' has an unparsable default value"
return [error_cls(msg, [])]
try:
vspec = variant.make_default()
except spack.variant.MultipleValuesInExclusiveVariantError:
msg = f"Can't create default value for variant '{vname}' in package '{pkg_cls.name}'"
return [error_cls(msg, [])]
try:
variant.validate_or_raise(vspec, pkg_cls.name)
except spack.variant.InvalidVariantValueError:
msg = "Default value of variant '{vname}' in package '{pkg.name}' is invalid"
question = "Is it among the allowed values?"
return [error_cls(msg, [question])]
return []
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for variant_name, entry in pkg_cls.variants.items():
variant, _ = entry
default_is_parsable = (
# Permitting a default that is an instance on 'int' permits
# to have foo=false or foo=0. Other falsish values are
# not allowed, since they can't be parsed from cli ('foo=')
isinstance(variant.default, int)
or variant.default
)
if not default_is_parsable:
error_msg = "Variant '{}' of package '{}' has a bad default value"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
continue
try:
vspec = variant.make_default()
except spack.variant.MultipleValuesInExclusiveVariantError:
error_msg = "Cannot create a default value for the variant '{}' in package '{}'"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
continue
try:
variant.validate_or_raise(vspec, pkg_cls=pkg_cls)
except spack.variant.InvalidVariantValueError:
error_msg = (
"The default value of the variant '{}' in package '{}' failed validation"
)
question = "Is it among the allowed values?"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), [question]))
for vname in pkg_cls.variant_names():
for _, variant_def in pkg_cls.variant_definitions(vname):
errors.extend(check_variant(pkg_cls, variant_def, vname))
return errors
@@ -998,11 +1139,11 @@ def _ensure_variants_have_descriptions(pkgs, error_cls):
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for variant_name, entry in pkg_cls.variants.items():
variant, _ = entry
if not variant.description:
error_msg = "Variant '{}' in package '{}' is missing a description"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
for name in pkg_cls.variant_names():
for when, variant in pkg_cls.variant_definitions(name):
if not variant.description:
msg = f"Variant '{name}' in package '{pkg_name}' is missing a description"
errors.append(error_cls(msg, []))
return errors
@@ -1059,29 +1200,26 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
variant_exceptions = (
spack.variant.InconsistentValidationError,
spack.variant.MultipleValuesInExclusiveVariantError,
spack.variant.InvalidVariantValueError,
KeyError,
)
errors = []
variant_names = pkg.variant_names()
summary = f"{pkg.name}: wrong variant in '{directive}' directive"
filename = spack.repo.PATH.filename_for_package_name(pkg.name)
for name, v in constraint.variants.items():
if name not in variant_names:
msg = f"variant {name} does not exist in {pkg.name}"
errors.append(error_cls(summary=summary, details=[msg, f"in {filename}"]))
continue
try:
variant, _ = pkg.variants[name]
variant.validate_or_raise(v, pkg_cls=pkg)
except variant_exceptions as e:
summary = pkg.name + ': wrong variant in "{0}" directive'
summary = summary.format(directive)
filename = spack.repo.PATH.filename_for_package_name(pkg.name)
error_msg = str(e).strip()
if isinstance(e, KeyError):
error_msg = "the variant {0} does not exist".format(error_msg)
err = error_cls(summary=summary, details=[error_msg, "in " + filename])
errors.append(err)
spack.variant.prevalidate_variant_value(pkg, v, constraint, strict=True)
except (
spack.variant.InconsistentValidationError,
spack.variant.MultipleValuesInExclusiveVariantError,
spack.variant.InvalidVariantValueError,
) as e:
msg = str(e).strip()
errors.append(error_cls(summary=summary, details=[msg, f"in {filename}"]))
return errors
@@ -1119,9 +1257,10 @@ def _extracts_errors(triggers, summary):
for dname in dnames
)
for vname, (variant, triggers) in pkg_cls.variants.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{vname}' variant"
errors.extend(_extracts_errors(triggers, summary))
for when, variants_by_name in pkg_cls.variants.items():
for vname, variant in variants_by_name.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{vname}' variant"
errors.extend(_extracts_errors([when], summary))
for when, providers, details in _error_items(pkg_cls.provided):
errors.extend(

View File

@@ -33,7 +33,6 @@
from llnl.util.symlink import readlink
import spack.caches
import spack.cmd
import spack.config as config
import spack.database as spack_db
import spack.error
@@ -44,9 +43,9 @@
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
import spack.paths
import spack.platforms
import spack.relocate as relocate
import spack.repo
import spack.spec
import spack.stage
import spack.store
@@ -1447,7 +1446,9 @@ def _oci_push_pkg_blob(
filename = os.path.join(tmpdir, f"{spec.dag_hash()}.tar.gz")
# Create an oci.image.layer aka tarball of the package
compressed_tarfile_checksum, tarfile_checksum = spack.oci.oci.create_tarball(spec, filename)
compressed_tarfile_checksum, tarfile_checksum = _do_create_tarball(
filename, spec.prefix, get_buildinfo_dict(spec)
)
blob = spack.oci.oci.Blob(
Digest.from_sha256(compressed_tarfile_checksum),
@@ -2697,6 +2698,9 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
for mirror in mirror_collection.values():
fetch_url = mirror.fetch_url
# TODO: oci:// does not support signing.
if fetch_url.startswith("oci://"):
continue
keys_url = url_util.join(
fetch_url, BUILD_CACHE_RELATIVE_PATH, BUILD_CACHE_KEYS_RELATIVE_PATH
)

View File

@@ -14,6 +14,7 @@
import spack.compilers
import spack.config
import spack.environment
import spack.modules
import spack.paths
import spack.platforms
import spack.repo

View File

@@ -37,21 +37,16 @@
import spack.binary_distribution
import spack.config
import spack.detection
import spack.environment
import spack.modules
import spack.paths
import spack.platforms
import spack.platforms.linux
import spack.repo
import spack.spec
import spack.store
import spack.user_environment
import spack.util.environment
import spack.util.executable
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
import spack.version
from spack.installer import PackageInstaller
from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
from .clingo import ClingoBootstrapConcretizer
@@ -283,7 +278,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
# Install the spec that should make the module importable
with spack.config.override(self.mirror_scope):
concrete_spec.package.do_install(fail_fast=True)
PackageInstaller([concrete_spec.package], fail_fast=True).install()
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
@@ -306,7 +301,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):
concrete_spec.package.do_install()
PackageInstaller([concrete_spec.package], fail_fast=True).install()
if _executables_in_store(executables, concrete_spec, query_info=info):
self.last_search = info
return True

View File

@@ -14,9 +14,9 @@
from llnl.util import tty
import spack.environment
import spack.spec
import spack.tengine
import spack.util.cpus
import spack.util.executable
import spack.util.path
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -37,13 +37,16 @@
import multiprocessing
import os
import re
import stat
import sys
import traceback
import types
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import Dict, List, Set, Tuple
from typing import Callable, Dict, List, Optional, Set, Tuple
import archspec.cpu
import llnl.util.tty as tty
from llnl.string import plural
@@ -53,6 +56,7 @@
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd
import spack.build_systems._checks
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
@@ -61,26 +65,21 @@
import spack.config
import spack.deptypes as dt
import spack.error
import spack.main
import spack.multimethod
import spack.package_base
import spack.paths
import spack.platforms
import spack.repo
import spack.schema.environment
import spack.spec
import spack.stage
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.executable
import spack.util.path
import spack.util.pattern
import spack.util.libc
from spack import traverse
from spack.context import Context
from spack.error import NoHeadersError, NoLibrariesError
from spack.error import InstallError, NoHeadersError, NoLibrariesError
from spack.install_test import spack_install_test_log
from spack.installer import InstallError
from spack.util.cpus import determine_number_of_jobs
from spack.util.environment import (
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
@@ -363,7 +362,7 @@ def set_compiler_environment_variables(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
isa_arg = spec.architecture.target.optimization_flags(compiler)
isa_arg = optimization_flags(compiler, spec.target)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
@@ -408,6 +407,65 @@ def set_compiler_environment_variables(pkg, env):
return env
def optimization_flags(compiler, target):
if spack.compilers.is_mixed_toolchain(compiler):
msg = (
"microarchitecture specific optimizations are not "
"supported yet on mixed compiler toolchains [check"
f" {compiler.name}@{compiler.version} for further details]"
)
tty.debug(msg)
return ""
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
compiler_version = compiler.version
version_number, suffix = archspec.cpu.version_components(compiler.version)
if not version_number or suffix:
try:
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
try:
result = target.optimization_flags(compiler.name, compiler_version.dotted_numeric_string)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
result = ""
return result
class FilterDefaultDynamicLinkerSearchPaths:
"""Remove rpaths to directories that are default search paths of the dynamic linker."""
def __init__(self, dynamic_linker: Optional[str]) -> None:
# Identify directories by (inode, device) tuple, which handles symlinks too.
self.default_path_identifiers: Set[Tuple[int, int]] = set()
if not dynamic_linker:
return
for path in spack.util.libc.default_search_paths_from_dynamic_linker(dynamic_linker):
try:
s = os.stat(path)
if stat.S_ISDIR(s.st_mode):
self.default_path_identifiers.add((s.st_ino, s.st_dev))
except OSError:
continue
def is_dynamic_loader_default_path(self, p: str) -> bool:
try:
s = os.stat(p)
return (s.st_ino, s.st_dev) in self.default_path_identifiers
except OSError:
return False
def __call__(self, dirs: List[str]) -> List[str]:
if not self.default_path_identifiers:
return dirs
return [p for p in dirs if not self.is_dynamic_loader_default_path(p)]
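# Illustrative usage (not part of the original diff): construct the filter with
# the target libc's dynamic linker (the path below is an assumption for a typical
# glibc x86_64 system) and drop rpath directories it already searches by default.
rpath_filter = FilterDefaultDynamicLinkerSearchPaths("/lib64/ld-linux-x86-64.so.2")
filtered_rpaths = rpath_filter(["/usr/lib64", "/opt/mypkg/lib"])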
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper (which have the prefix
`SPACK_`) and also add the compiler wrappers to PATH.
@@ -455,7 +513,7 @@ def set_wrapper_variables(pkg, env):
env.set(SPACK_DEBUG, "TRUE")
env.set(SPACK_SHORT_SPEC, pkg.spec.short_spec)
env.set(SPACK_DEBUG_LOG_ID, pkg.spec.format("{name}-{hash:7}"))
env.set(SPACK_DEBUG_LOG_DIR, spack.main.spack_working_dir)
env.set(SPACK_DEBUG_LOG_DIR, spack.paths.spack_working_dir)
if spack.config.get("config:ccache"):
# Enable ccache in the compiler wrapper
@@ -465,69 +523,71 @@ def set_wrapper_variables(pkg, env):
env.set("CCACHE_DISABLE", "1")
# Gather information about various types of dependencies
link_deps = set(pkg.spec.traverse(root=False, deptype=("link")))
rpath_deps = get_rpath_deps(pkg)
rpath_hashes = set(s.dag_hash() for s in get_rpath_deps(pkg))
link_deps = pkg.spec.traverse(root=False, order="topo", deptype=dt.LINK)
external_link_deps, nonexternal_link_deps = stable_partition(link_deps, lambda d: d.external)
link_dirs = []
include_dirs = []
rpath_dirs = []
def _prepend_all(list_to_modify, items_to_add):
# Update the original list (creating a new list would be faster but
# may not be convenient)
for item in reversed(list(items_to_add)):
list_to_modify.insert(0, item)
for dep in chain(external_link_deps, nonexternal_link_deps):
# TODO: is_system_path is wrong, but even if we knew default -L, -I flags from the compiler
# and default search dirs from the dynamic linker, it's not obvious how to avoid a possibly
# expensive search in `query.libs.directories` and `query.headers.directories`, which is
# what this branch is trying to avoid.
if is_system_path(dep.prefix):
continue
# TODO: as of Spack 0.22, multiple instances of the same package may occur among the link
# deps, so keying by name is wrong. In practice it is not problematic: we obtain the same
# gcc-runtime / glibc here, and repeatedly add the same dirs that are later deduped.
query = pkg.spec[dep.name]
dep_link_dirs = []
try:
# Locating libraries can be time consuming, so log start and finish.
tty.debug(f"Collecting libraries for {dep.name}")
dep_link_dirs.extend(query.libs.directories)
tty.debug(f"Libraries for {dep.name} have been collected.")
except NoLibrariesError:
tty.debug(f"No libraries found for {dep.name}")
def update_compiler_args_for_dep(dep):
if dep in link_deps and (not is_system_path(dep.prefix)):
query = pkg.spec[dep.name]
dep_link_dirs = list()
try:
# In some circumstances (particularly for externals) finding
# libraries packages can be time consuming, so indicate that
# we are performing this operation (and also report when it
# finishes).
tty.debug("Collecting libraries for {0}".format(dep.name))
dep_link_dirs.extend(query.libs.directories)
tty.debug("Libraries for {0} have been collected.".format(dep.name))
except NoLibrariesError:
tty.debug("No libraries found for {0}".format(dep.name))
for default_lib_dir in ("lib", "lib64"):
default_lib_prefix = os.path.join(dep.prefix, default_lib_dir)
if os.path.isdir(default_lib_prefix):
dep_link_dirs.append(default_lib_prefix)
for default_lib_dir in ["lib", "lib64"]:
default_lib_prefix = os.path.join(dep.prefix, default_lib_dir)
if os.path.isdir(default_lib_prefix):
dep_link_dirs.append(default_lib_prefix)
link_dirs[:0] = dep_link_dirs
if dep.dag_hash() in rpath_hashes:
rpath_dirs[:0] = dep_link_dirs
_prepend_all(link_dirs, dep_link_dirs)
if dep in rpath_deps:
_prepend_all(rpath_dirs, dep_link_dirs)
try:
tty.debug(f"Collecting headers for {dep.name}")
include_dirs[:0] = query.headers.directories
tty.debug(f"Headers for {dep.name} have been collected.")
except NoHeadersError:
tty.debug(f"No headers found for {dep.name}")
try:
_prepend_all(include_dirs, query.headers.directories)
except NoHeadersError:
tty.debug("No headers found for {0}".format(dep.name))
for dspec in pkg.spec.traverse(root=False, order="post"):
if dspec.external:
update_compiler_args_for_dep(dspec)
# Just above, we prepended entries for -L/-rpath for externals. We
# now do this for non-external packages so that Spack-built packages
# are searched first for libraries etc.
for dspec in pkg.spec.traverse(root=False, order="post"):
if not dspec.external:
update_compiler_args_for_dep(dspec)
# The top-level package is always RPATHed. It hasn't been installed yet
# so the RPATHs are added unconditionally (e.g. even though lib64/ may
# not be created for the install).
for libdir in ["lib64", "lib"]:
# The top-level package is heuristically rpath'ed.
for libdir in ("lib64", "lib"):
lib_path = os.path.join(pkg.prefix, libdir)
rpath_dirs.insert(0, lib_path)
filter_default_dynamic_linker_search_paths = FilterDefaultDynamicLinkerSearchPaths(
pkg.compiler.default_dynamic_linker
)
# TODO: filter_system_paths is again wrong (and probably unnecessary due to the is_system_path
# branch above). link_dirs should be filtered with entries from _parse_link_paths.
link_dirs = list(dedupe(filter_system_paths(link_dirs)))
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
rpath_dirs = filter_default_dynamic_linker_search_paths(rpath_dirs)
# TODO: implicit_rpaths is prefiltered by is_system_path, that should be removed in favor of
# just this filter.
implicit_rpaths = filter_default_dynamic_linker_search_paths(pkg.compiler.implicit_rpaths())
if implicit_rpaths:
env.set("SPACK_COMPILER_IMPLICIT_RPATHS", ":".join(implicit_rpaths))
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
@@ -562,7 +622,7 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
module.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
module.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
jobs = determine_number_of_jobs(parallel=pkg.parallel)
jobs = spack.config.determine_number_of_jobs(parallel=pkg.parallel)
module.make_jobs = jobs
# TODO: make these build deps that can be installed if not found.
@@ -788,7 +848,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
# Platform specific setup goes before package specific setup. This is for setting
# defaults like MACOSX_DEPLOYMENT_TARGET on macOS.
platform = spack.platforms.by_name(pkg.spec.architecture.platform)
target = platform.target(pkg.spec.architecture.target)
platform.setup_platform_environment(pkg, env_mods)
tty.debug("setup_package: grabbing modifications from dependencies")
@@ -813,15 +872,8 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
for mod in pkg.compiler.modules:
load_module(mod)
if target and target.module_name:
load_module(target.module_name)
load_external_modules(pkg)
implicit_rpaths = pkg.compiler.implicit_rpaths()
if implicit_rpaths:
env_mods.set("SPACK_COMPILER_IMPLICIT_RPATHS", ":".join(implicit_rpaths))
# Make sure nothing's strange about the Spack environment.
validate(env_mods, tty.warn)
env_mods.apply_modifications()
@@ -1116,8 +1168,51 @@ def get_cmake_prefix_path(pkg):
def _setup_pkg_and_run(
serialized_pkg, function, kwargs, write_pipe, input_multiprocess_fd, jsfd1, jsfd2
serialized_pkg: "spack.subprocess_context.PackageInstallContext",
function: Callable,
kwargs: Dict,
write_pipe: multiprocessing.connection.Connection,
input_multiprocess_fd: Optional[MultiProcessFd],
jsfd1: Optional[MultiProcessFd],
jsfd2: Optional[MultiProcessFd],
):
"""Main entry point in the child process for Spack builds.
``_setup_pkg_and_run`` is called by the child process created in
``start_build_process()``, and its main job is to run ``function()`` on behalf of
some Spack installation (see :ref:`spack.installer.PackageInstaller._install_task`).
The child process is passed a ``write_pipe``, on which it's expected to send one of
the following:
* ``StopPhase``: error raised by a build process indicating it's stopping at a
particular build phase.
* ``BaseException``: any exception raised by a child build process, which will be
wrapped in ``ChildError`` (which adds a bunch of debug info and log context) and
raised in the parent.
* The return value of ``function()``, which can be anything (except an exception).
This is returned to the caller.
Note: ``jsfd1`` and ``jsfd2`` are passed solely to ensure that the child process
does not close these file descriptors. Some ``multiprocessing`` backends will close
them automatically in the child if they are not passed at process creation time.
Arguments:
serialized_pkg: Spack package install context object (serialized form of the
package that we'll build in the child process).
function: function to call in the child process; serialized_pkg is passed to
this as the first argument.
kwargs: additional keyword arguments to pass to ``function()``.
write_pipe: multiprocessing ``Connection`` to the parent process, on which the
child *must* send a result (or an error) back.
input_multiprocess_fd: stdin from the parent (not passed currently on Windows)
jsfd1: gmake Jobserver file descriptor 1.
jsfd2: gmake Jobserver file descriptor 2.
"""
context: str = kwargs.get("context", "build")
try:
@@ -1139,17 +1234,18 @@ def _setup_pkg_and_run(
return_value = function(pkg, kwargs)
write_pipe.send(return_value)
except StopPhase as e:
except spack.error.StopPhase as e:
# Do not create a full ChildError from this, it's not an error
# it's a control statement.
write_pipe.send(e)
except BaseException:
except BaseException as e:
# catch ANYTHING that goes wrong in the child process
exc_type, exc, tb = sys.exc_info()
# Need to unwind the traceback in the child because traceback
# objects can't be sent to the parent.
tb_string = traceback.format_exc()
exc_type = type(e)
tb = e.__traceback__
tb_string = traceback.format_exception(exc_type, e, tb)
# build up some context from the offending package so we can
# show that, too.
@@ -1166,8 +1262,8 @@ def _setup_pkg_and_run(
elif context == "test":
logfile = os.path.join(pkg.test_suite.stage, pkg.test_suite.test_log_name(pkg.spec))
error_msg = str(exc)
if isinstance(exc, (spack.multimethod.NoSuchMethodError, AttributeError)):
error_msg = str(e)
if isinstance(e, (spack.multimethod.NoSuchMethodError, AttributeError)):
process = "test the installation" if context == "test" else "build from sources"
error_msg = (
"The '{}' package cannot find an attribute while trying to {}. "
@@ -1177,7 +1273,7 @@ def _setup_pkg_and_run(
"More information at https://spack.readthedocs.io/en/latest/packaging_guide.html#installation-procedure"
).format(pkg.name, process, context)
error_msg = colorize("@*R{{{}}}".format(error_msg))
error_msg = "{}\n\n{}".format(str(exc), error_msg)
error_msg = "{}\n\n{}".format(str(e), error_msg)
# make a pickleable exception to send to parent.
msg = "%s: %s" % (exc_type.__name__, error_msg)
@@ -1300,7 +1396,7 @@ def exitcode_msg(p):
p.join()
# If returns a StopPhase, raise it
if isinstance(child_result, StopPhase):
if isinstance(child_result, spack.error.StopPhase):
# do not print
raise child_result
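The docstring above describes the pipe protocol between the build child and the parent. A minimal, self-contained sketch of that pattern using plain ``multiprocessing`` (a toy analogue, not Spack code):

```python
# Toy version of the protocol: the child sends either a result or an exception
# over a Connection; the parent re-raises whatever exception it receives.
import multiprocessing


def child(write_pipe, x):
    try:
        result = x * x          # stand-in for function(pkg, kwargs)
        write_pipe.send(result)
    except BaseException as e:
        write_pipe.send(e)      # errors travel over the same pipe


if __name__ == "__main__":
    read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
    proc = multiprocessing.Process(target=child, args=(write_pipe, 7))
    proc.start()
    result = read_pipe.recv()
    proc.join()
    if isinstance(result, BaseException):
        raise result
    print(result)  # 49
```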
@@ -1509,17 +1605,6 @@ def _make_child_error(msg, module, name, traceback, log, log_type, context):
return ChildError(msg, module, name, traceback, log, log_type, context)
class StopPhase(spack.error.SpackError):
"""Pickle-able exception to control stopped builds."""
def __reduce__(self):
return _make_stop_phase, (self.message, self.long_message)
def _make_stop_phase(msg, long_msg):
return StopPhase(msg, long_msg)
def write_log_summary(out, log_type, log, last=None):
errors, warnings = parse_log_events(log)
nerr = len(errors)


@@ -8,7 +8,7 @@
import llnl.util.lang
import spack.builder
import spack.installer
import spack.error
import spack.relocate
import spack.spec
import spack.store
@@ -34,7 +34,7 @@ def check_paths(path_list, filetype, predicate):
if not predicate(abs_path):
msg = "Install failed for {0}. No such {1} in prefix: {2}"
msg = msg.format(pkg.name, filetype, path)
raise spack.installer.InstallError(msg)
raise spack.error.InstallError(msg)
check_paths(pkg.sanity_check_is_file, "file", os.path.isfile)
check_paths(pkg.sanity_check_is_dir, "directory", os.path.isdir)
@@ -42,7 +42,7 @@ def check_paths(path_list, filetype, predicate):
ignore_file = llnl.util.lang.match_predicate(spack.store.STORE.layout.hidden_file_regexes)
if all(map(ignore_file, os.listdir(pkg.prefix))):
msg = "Install failed for {0}. Nothing was installed!"
raise spack.installer.InstallError(msg.format(pkg.name))
raise spack.error.InstallError(msg.format(pkg.name))
def apply_macos_rpath_fixups(builder: spack.builder.Builder):


@@ -2,10 +2,11 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import llnl.util.filesystem as fs
import spack.directives
import spack.package_base
import spack.util.executable
from .autotools import AutotoolsBuilder, AutotoolsPackage
@@ -46,18 +47,12 @@ class AspellDictPackage(AutotoolsPackage):
#: Override the default autotools builder
AutotoolsBuilder = AspellBuilder
def view_destination(self, view):
aspell_spec = self.spec["aspell"]
if view.get_projection_for_spec(aspell_spec) != aspell_spec.prefix:
raise spack.package_base.ExtensionError(
"aspell does not support non-global extensions"
)
aspell = aspell_spec.command
return aspell("dump", "config", "dict-dir", output=str).strip()
def view_source(self):
return self.prefix.lib
def patch(self):
fs.filter_file(r"^dictdir=.*$", "dictdir=/lib", "configure")
fs.filter_file(r"^datadir=.*$", "datadir=/lib", "configure")
aspell_spec = self.spec["aspell"]
aspell = aspell_spec.command
dictdir = aspell("dump", "config", "dict-dir", output=str).strip()
datadir = aspell("dump", "config", "data-dir", output=str).strip()
dictdir = os.path.relpath(dictdir, aspell_spec.prefix)
datadir = os.path.relpath(datadir, aspell_spec.prefix)
fs.filter_file(r"^dictdir=.*$", f"dictdir=/{dictdir}", "configure")
fs.filter_file(r"^datadir=.*$", f"datadir=/{datadir}", "configure")


@@ -13,6 +13,7 @@
import spack.build_environment
import spack.builder
import spack.error
import spack.package_base
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
@@ -248,7 +249,7 @@ def runs_ok(script_abs_path):
# An external gnuconfig may not have a prefix.
if gnuconfig_dir is None:
raise spack.build_environment.InstallError(
raise spack.error.InstallError(
"Spack could not find substitutes for GNU config files because no "
"prefix is available for the `gnuconfig` package. Make sure you set a "
"prefix path instead of modules for external `gnuconfig`."
@@ -268,7 +269,7 @@ def runs_ok(script_abs_path):
msg += (
" or the `gnuconfig` package prefix is misconfigured as" " an external package"
)
raise spack.build_environment.InstallError(msg)
raise spack.error.InstallError(msg)
# Filter working substitutes
candidates = [f for f in candidates if runs_ok(f)]
@@ -293,9 +294,7 @@ def runs_ok(script_abs_path):
and set the prefix to the directory containing the `config.guess` and
`config.sub` files.
"""
raise spack.build_environment.InstallError(
msg.format(", ".join(to_be_found), self.name)
)
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.name))
# Copy the good files over the bad ones
for abs_path in to_be_patched:
@@ -688,9 +687,8 @@ def _activate_or_not(
variant = variant or name
# Defensively look that the name passed as argument is among
# variants
if variant not in self.pkg.variants:
# Defensively check that the name passed as argument is among the variants
if not self.pkg.has_variant(variant):
msg = '"{0}" is not a variant of "{1}"'
raise KeyError(msg.format(variant, self.pkg.name))
@@ -699,27 +697,19 @@ def _activate_or_not(
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
variant_desc, _ = self.pkg.variants[variant]
if set(variant_desc.values) == set((True, False)):
vdef = self.pkg.get_variant(variant)
if set(vdef.values) == set((True, False)):
# BoolValuedVariant carry information about a single option.
# Nonetheless, for uniformity of treatment we'll package them
# in an iterable of one element.
condition = "+{name}".format(name=variant)
options = [(name, condition in spec)]
options = [(name, f"+{variant}" in spec)]
else:
condition = "{variant}={value}"
# "feature_values" is used to track values which correspond to
# features which can be enabled or disabled as understood by the
# package's build system. It excludes values which have special
# meanings and do not correspond to features (e.g. "none")
feature_values = (
getattr(variant_desc.values, "feature_values", None) or variant_desc.values
)
options = [
(value, condition.format(variant=variant, value=value) in spec)
for value in feature_values
]
feature_values = getattr(vdef.values, "feature_values", None) or vdef.values
options = [(value, f"{variant}={value}" in spec) for value in feature_values]
# For each allowed value in the list of values
for option_value, activated in options:
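For context, the public Autotools helpers that funnel into `_activate_or_not` can be illustrated with a hypothetical package fragment; the `Example` class and `mpi` variant are assumptions, not part of this diff:

```python
# Hypothetical package.py fragment; with_or_without() ultimately calls the
# _activate_or_not() helper shown above.
from spack.package import AutotoolsPackage, variant


class Example(AutotoolsPackage):
    variant("mpi", default=True, description="Enable MPI support")

    def configure_args(self):
        # +mpi -> ["--with-mpi"], ~mpi -> ["--without-mpi"]
        return self.with_or_without("mpi")
```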


@@ -89,7 +89,7 @@ def define_cmake_cache_from_variant(self, cmake_var, variant=None, comment=""):
if variant is None:
variant = cmake_var.lower()
if variant not in self.pkg.variants:
if not self.pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))
if variant not in self.pkg.spec.variants:


@@ -15,6 +15,7 @@
import spack.build_environment
import spack.builder
import spack.deptypes as dt
import spack.error
import spack.package_base
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
@@ -145,6 +146,7 @@ def _values(x):
default=default,
values=_values,
description="the build system generator to use",
when="build_system=cmake",
)
for x in not_used:
conflicts(f"generator={x}")
@@ -344,7 +346,7 @@ def std_args(pkg, generator=None):
msg = "Invalid CMake generator: '{0}'\n".format(generator)
msg += "CMakePackage currently supports the following "
msg += "primary generators: '{0}'".format("', '".join(valid_primary_generators))
raise spack.package_base.InstallError(msg)
raise spack.error.InstallError(msg)
try:
build_type = pkg.spec.variants["build_type"].value
@@ -504,7 +506,7 @@ def define_from_variant(self, cmake_var, variant=None):
if variant is None:
variant = cmake_var.lower()
if variant not in self.pkg.variants:
if not self.pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))
if variant not in self.pkg.spec.variants:
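Similarly, a hypothetical CMake package fragment showing `define_from_variant`, which performs the variant lookup guarded above (the class, variant, and CMake option names are illustrative):

```python
# Hypothetical package.py fragment; define_from_variant() maps a variant value
# to a CMake cache definition.
from spack.package import CMakePackage, variant


class Example(CMakePackage):
    variant("shared", default=True, description="Build shared libraries")

    def cmake_args(self):
        # +shared -> "-DBUILD_SHARED_LIBS:BOOL=ON"
        # ~shared -> "-DBUILD_SHARED_LIBS:BOOL=OFF"
        return [self.define_from_variant("BUILD_SHARED_LIBS", "shared")]
```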


@@ -14,6 +14,7 @@
import spack.compiler
import spack.package_base
import spack.util.executable
# Local "type" for type hints
Path = Union[str, pathlib.Path]


@@ -1,192 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import importlib
import inspect
import os
import shutil
import time
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.build_systems.cmake
import spack.builder
import spack.util.log_parse
from spack.builder import run_after
from spack.directives import depends_on, requires, variant
from spack.package import CMakePackage
class CTestBuilder(spack.build_systems.cmake.CMakeBuilder):
"""
This builder mirrors the behavior of a CMakeBuilder, but all commands are run through
CTest. This ensures that XML files are created through CTest, which provides a unified
buildstamp and improved XML output over the Spack-generated ones.
An additional phase is added for running tests post-installation. This allows for things
like regression tests that can be used to monitor differences in behavior/performance
without failing the install.
"""
phases = ("cmake", "build", "install", "analysis")
@property
def std_cmake_args(self):
"""
Args to always supply to CMake. CDash args don't do anything if you don't submit.
TODO: work out how to get the track, build, and site mapped correctly.
Currently this ignores the Spack flags and ingests the CTestConfig.cmake files;
in ExaWind it is hooked into additional infrastructure.
The Spack flags are not accessible to the package as far as I can currently tell.
"""
args = super().std_cmake_args
if self.spec.variants["cdash_submit"].value:
args.extend(
[
"-D",
f"BUILDNAME={self.pkg.spec.name}",
"-D",
f"CTEST_BUILD_OPTIONS={self.pkg.spec.short_spec}",
"-D",
"SITE=TODO",
]
)
return args
def ctest_args(self):
args = ["-T", "Test"]
args.append("--stop-time")
overall_test_timeout = 60 * 60 * 4 # 4 hours TODO should probably be a variant
args.append(time.strftime("%H:%M:%S", time.localtime(time.time() + overall_test_timeout)))
args.append("-VV") # make sure lots of output can go to the log
# A way to pass additional information to the ctest execution.
# For example, in ExaWind we default to running unit tests, but for nightly tests
# we expand to our regression test suite through this variant.
extra_args = self.pkg.spec.variants["ctest_args"].value
if extra_args:
args.extend(extra_args.split())
return args
@property
def build_args(self):
"""
CTest arguments that translate to running up to the end of the build phase through CTest.
"""
args = [
"--group",
self.pkg.spec.name,
"-T",
"Start",
"-T",
"Configure",
"-T",
"Build",
"-VV",
]
return args
@property
def submit_args(self):
"""
CTest arguments just for submission. This lets us split the phases, whereas the default CTest behavior is to configure, build, test, and submit from a single command.
"""
args = ["-T", "Submit", "-V"]
return args
def submit_cdash(self, pkg, spec, prefix):
ctest = Executable(self.spec["cmake"].prefix.bin.ctest)
ctest.add_default_env("CTEST_PARALLEL_LEVEL", str(make_jobs))
build_env = os.environ.copy()
ctest(*self.submit_args, env=build_env)
def build(self, pkg, spec, prefix):
"""
The only reason to run through the CTest interface is if we want to submit to CDash with
unified CTest XMLs.
If we aren't going to submit, we can just run as the CMakeBuilder.
"""
if self.pkg.spec.variants["cdash_submit"].value:
ctest = Executable(self.spec["cmake"].prefix.bin.ctest)
ctest.add_default_env("CMAKE_BUILD_PARALLEL_LEVEL", str(make_jobs))
with fs.working_dir(self.build_directory):
build_env = os.environ.copy()
# Run the build through ctest, but still submit if there are build failures where Spack would stop.
# Check for errors and submit to CDash if there are failures.
output = ctest(
*self.build_args, env=build_env, output=str.split, error=str.split
).split("\n")
errors, warnings = spack.util.log_parse.parse_log_events(output)
if len(errors) > 0:
errs = [str(e) for e in errors]
tty.warn(f"Errors: {errs}")
tty.warn(f"returncode {ctest.returncode}")
self.submit_cdash(pkg, spec, prefix)
raise BaseException(f"{self.pkg.spec.name} had build errors")
else:
super().build(pkg, spec, prefix)
def analysis(self, pkg, spec, prefix):
"""
This method currently runs tests post-install to avoid the undesired side effect
of failing installs for failed tests with Spack's built-in testing infrastructure.
"""
with working_dir(self.build_directory):
args = self.ctest_args()
tty.debug("{} running CTest".format(self.pkg.spec.name))
tty.debug("Running:: ctest" + " ".join(args))
ctest = Executable(self.spec["cmake"].prefix.bin.ctest)
ctest.add_default_env("CTEST_PARALLEL_LEVEL", str(make_jobs))
ctest.add_default_env("CMAKE_BUILD_PARALLEL_LEVEL", str(make_jobs))
build_env = os.environ.copy()
ctest(*args, "-j", str(make_jobs), env=build_env, fail_on_error=False)
if self.pkg.spec.variants["cdash_submit"].value:
self.submit_cdash(pkg, spec, prefix)
class CtestPackage(CMakePackage):
"""
This package's default behavior is to act as a standard CMakePackage.
"""
CMakeBuilder = CTestBuilder
variant("cdash_submit", default=False, description="Submit results to cdash")
variant("ctest_args", default="", description="quoted string of arguments to send to ctest")
def setup_build_environment(self, env):
env.prepend_path("PYTHONPATH", os.environ["EXAWIND_MANAGER"])
def do_clean(self):
"""
A nice feature for development builds. Can be omitted from the final product.
"""
super().do_clean()
if not self.stage.managed_by_spack:
build_artifacts = glob.glob(os.path.join(self.stage.source_path, "spack-*"))
for f in build_artifacts:
if os.path.isfile(f):
os.remove(f)
if os.path.isdir(f):
shutil.rmtree(f)
ccjson = os.path.join(self.stage.source_path, "compile_commands.json")
if os.path.isfile(ccjson):
os.remove(ccjson)
@run_after("cmake")
def copy_compile_commands(self):
"""
A nice feature for development builds. Can be omitted from the final product.
"""
if self.spec.satisfies("dev_path=*"):
target = os.path.join(self.stage.source_path, "compile_commands.json")
source = os.path.join(self.build_directory, "compile_commands.json")
if os.path.isfile(source):
shutil.copyfile(source, target)


@@ -241,6 +241,11 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%intel@19.2:", when="+cuda ^cuda@:11.1.0")
conflicts("%intel@2021:", when="+cuda ^cuda@:11.4.0")
# ARM
# https://github.com/spack/spack/pull/39666#issuecomment-2377609263
# Might need to be expanded to other gcc versions
conflicts("%gcc@13.2.0", when="+cuda ^cuda@:12.4 target=aarch64:")
# XL is mostly relevant for ppc64le Linux
conflicts("%xl@:12,14:", when="+cuda ^cuda@:9.1")
conflicts("%xl@:12,14:15,17:", when="+cuda ^cuda@9.2")


@@ -44,16 +44,27 @@ class GoBuilder(BaseBuilder):
+-----------------------------------------------+--------------------+
| **Method** | **Purpose** |
+===============================================+====================+
| :py:meth:`~.GoBuilder.build_args` | Specify arguments |
| :py:attr:`~.GoBuilder.build_args` | Specify arguments |
| | to ``go build`` |
+-----------------------------------------------+--------------------+
| :py:meth:`~.GoBuilder.check_args` | Specify arguments |
| :py:attr:`~.GoBuilder.check_args` | Specify arguments |
| | to ``go test`` |
+-----------------------------------------------+--------------------+
"""
phases = ("build", "install")
#: Names associated with package methods in the old build-system format
legacy_methods = ("check", "installcheck")
#: Names associated with package attributes in the old build-system format
legacy_attributes = (
"build_args",
"check_args",
"build_directory",
"install_time_test_callbacks",
)
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
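A hypothetical Go package fragment using the attributes documented in the table above; the class name and flags are examples only:

```python
# Hypothetical package.py fragment; build_args/check_args are consumed by GoBuilder.
from spack.package import GoPackage


class Example(GoPackage):
    @property
    def build_args(self):
        # Extra arguments passed to `go build`
        return ["-ldflags", "-s -w"]

    @property
    def check_args(self):
        # Extra arguments passed to `go test`
        return ["-v"]
```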


@@ -22,9 +22,10 @@
install,
)
import spack.builder
import spack.error
from spack.build_environment import dso_suffix
from spack.package_base import InstallError
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix


@@ -15,7 +15,7 @@
import spack.util.path
from spack.build_environment import dso_suffix
from spack.directives import conflicts, license, redistribute, variant
from spack.package_base import InstallError
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable


@@ -24,6 +24,8 @@
import spack.detection
import spack.multimethod
import spack.package_base
import spack.platforms
import spack.repo
import spack.spec
import spack.store
from spack.directives import build_system, depends_on, extends
@@ -337,7 +339,7 @@ class PythonPackage(PythonExtension):
legacy_buildsystem = "python_pip"
#: Callback names for install-time test
install_time_test_callbacks = ["test"]
install_time_test_callbacks = ["test_imports"]
build_system("python_pip")
@@ -427,7 +429,7 @@ class PythonPipBuilder(BaseBuilder):
phases = ("install",)
#: Names associated with package methods in the old build-system format
legacy_methods = ("test",)
legacy_methods = ("test_imports",)
#: Same as legacy_methods, but the signature is different
legacy_long_methods = ("install_options", "global_options", "config_settings")
@@ -436,7 +438,7 @@ class PythonPipBuilder(BaseBuilder):
legacy_attributes = ("archive_files", "build_directory", "install_time_test_callbacks")
#: Callback names for install-time test
install_time_test_callbacks = ["test"]
install_time_test_callbacks = ["test_imports"]
@staticmethod
def std_args(cls) -> List[str]:


@@ -11,9 +11,9 @@
import spack.builder
from spack.build_environment import SPACK_NO_PARALLEL_MAKE
from spack.config import determine_number_of_jobs
from spack.directives import build_system, extends, maintainers
from spack.package_base import PackageBase
from spack.util.cpus import determine_number_of_jobs
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError


@@ -10,7 +10,7 @@
from llnl.util import lang
import spack.build_environment
import spack.error
import spack.multimethod
#: Builder classes, as registered by the "builder" decorator
@@ -461,15 +461,13 @@ def _on_phase_start(self, instance):
# If a phase has a matching stop_before_phase attribute,
# stop the installation process raising a StopPhase
if getattr(instance, "stop_before_phase", None) == self.name:
raise spack.build_environment.StopPhase(
"Stopping before '{0}' phase".format(self.name)
)
raise spack.error.StopPhase("Stopping before '{0}' phase".format(self.name))
def _on_phase_exit(self, instance):
# If a phase has a matching last_phase attribute,
# stop the installation process raising a StopPhase
if getattr(instance, "last_phase", None) == self.name:
raise spack.build_environment.StopPhase("Stopping at '{0}' phase".format(self.name))
raise spack.error.StopPhase("Stopping at '{0}' phase".format(self.name))
def copy(self):
return copy.deepcopy(self)
@@ -523,10 +521,6 @@ def stage(self):
def prefix(self):
return self.pkg.prefix
def test(self):
# Defer tests to virtual and concrete packages
pass
def setup_build_environment(self, env):
"""Sets up the build environment for a package.


@@ -11,9 +11,7 @@
from llnl.util.filesystem import mkdirp
import spack.config
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.paths
import spack.util.file_cache
import spack.util.path


@@ -31,6 +31,7 @@
import spack
import spack.binary_distribution as bindist
import spack.concretize
import spack.config as cfg
import spack.environment as ev
import spack.main
@@ -38,7 +39,6 @@
import spack.paths
import spack.repo
import spack.spec
import spack.stage
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
@@ -1219,8 +1219,8 @@ def main_script_replacements(cmd):
# Capture the version of Spack used to generate the pipeline, that can be
# passed to `git checkout` for version consistency. If we aren't in a Git
# repository, presume we are a Spack release and use the Git tag instead.
spack_version = spack.main.get_version()
version_to_clone = spack.main.get_spack_commit() or f"v{spack.spack_version}"
spack_version = spack.get_version()
version_to_clone = spack.get_spack_commit() or f"v{spack.spack_version}"
output_object["variables"] = {
"SPACK_ARTIFACTS_ROOT": rel_artifacts_root,
@@ -1272,7 +1272,9 @@ def main_script_replacements(cmd):
else:
# No jobs were generated
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
noop_job["retry"] = service_job_retries
# If this job fails ignore the status and carry on
noop_job["retry"] = 0
noop_job["allow_failure"] = True
if copy_only_pipeline and config_deprecated:
tty.debug("Generating no-op job as copy-only is unsupported here.")


@@ -17,7 +17,7 @@
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.config
import spack.config # breaks a cycle.
import spack.environment as ev
import spack.error
import spack.extensions


@@ -11,6 +11,7 @@
import llnl.util.tty.color as color
import spack.platforms
import spack.spec
description = "print architecture information about this machine"
section = "system"


@@ -16,11 +16,11 @@
import spack.bootstrap.config
import spack.bootstrap.core
import spack.config
import spack.main
import spack.mirror
import spack.spec
import spack.stage
import spack.util.path
import spack.util.spack_yaml
from spack.cmd.common import arguments
description = "manage bootstrap configuration"


@@ -23,14 +23,9 @@
import spack.error
import spack.mirror
import spack.oci.oci
import spack.oci.opener
import spack.relocate
import spack.repo
import spack.spec
import spack.stage
import spack.store
import spack.user_environment
import spack.util.crypto
import spack.util.parallel
import spack.util.url as url_util
import spack.util.web as web_util


@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd
import spack.spec
from spack.cmd.common import arguments
description = "change an existing spec in an environment"


@@ -15,7 +15,6 @@
import spack.repo
import spack.spec
import spack.stage
import spack.util.crypto
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.package_base import (


@@ -19,7 +19,6 @@
import spack.cmd.buildcache as buildcache
import spack.config as cfg
import spack.environment as ev
import spack.environment.depfile
import spack.hash_types as ht
import spack.mirror
import spack.util.gpg as gpg_util


@@ -10,11 +10,9 @@
import llnl.util.filesystem
import llnl.util.tty as tty
import spack.bootstrap
import spack.caches
import spack.cmd.test
import spack.cmd
import spack.config
import spack.repo
import spack.stage
import spack.store
import spack.util.path


@@ -17,6 +17,7 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.config
import spack.main
import spack.paths
import spack.platforms


@@ -15,7 +15,6 @@
import spack.deptypes as dt
import spack.environment as ev
import spack.mirror
import spack.modules
import spack.reporters
import spack.spec
import spack.store


@@ -9,6 +9,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.spec
display_args = {"long": True, "show_flags": False, "variants": False, "indent": 4}


@@ -10,7 +10,6 @@
import spack.cmd
import spack.deptypes as dt
import spack.error
import spack.paths
import spack.spec
import spack.store
from spack import build_environment, traverse


@@ -13,9 +13,9 @@
import spack.config
import spack.environment as ev
import spack.repo
import spack.error
import spack.schema.env
import spack.schema.packages
import spack.spec
import spack.store
import spack.util.spack_yaml as syaml
from spack.cmd.common import arguments
@@ -256,7 +256,7 @@ def config_remove(args):
existing.pop(value, None)
else:
# This should be impossible to reach
raise spack.config.ConfigError("Config has nested non-dict values")
raise spack.error.ConfigError("Config has nested non-dict values")
spack.config.set(path, existing, scope)
@@ -340,7 +340,7 @@ def _config_change(config_path, match_spec_str=None):
if not changed:
existing_requirements = spack.config.get(key_path)
if isinstance(existing_requirements, str):
raise spack.config.ConfigError(
raise spack.error.ConfigError(
"'config change' needs to append a requirement,"
" but existing require: config is not a list"
)
@@ -536,11 +536,11 @@ def config_prefer_upstream(args):
# Get and list all the variants that differ from the default.
variants = []
for var_name, variant in spec.variants.items():
if var_name in ["patches"] or var_name not in spec.package.variants:
if var_name in ["patches"] or not spec.package.has_variant(var_name):
continue
variant_desc, _ = spec.package.variants[var_name]
if variant.value != variant_desc.default:
vdef = spec.package.get_variant(var_name)
if variant.value != vdef.default:
variants.append(str(variant))
variants.sort()
variants = " ".join(variants)


@@ -13,7 +13,6 @@
import spack.repo
import spack.stage
import spack.util.web
from spack.spec import Spec
from spack.url import (
UndetectableNameError,


@@ -13,11 +13,12 @@
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
import spack.config
import spack
import spack.paths
import spack.platforms
import spack.spec
import spack.store
import spack.util.git
from spack.main import get_version
from spack.util.executable import which
description = "debugging commands for troubleshooting Spack"
@@ -89,7 +90,7 @@ def report(args):
host_os = host_platform.operating_system("frontend")
host_target = host_platform.target("frontend")
architecture = spack.spec.ArchSpec((str(host_platform), str(host_os), str(host_target)))
print("* **Spack:**", get_version())
print("* **Spack:**", spack.get_version())
print("* **Python:**", platform.python_version())
print("* **Platform:**", architecture)


@@ -11,7 +11,6 @@
import spack.cmd
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments


@@ -20,6 +20,7 @@
import spack.cmd
import spack.environment as ev
import spack.installer
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
@@ -142,4 +143,4 @@ def deprecate(parser, args):
tty.die("Will not deprecate any packages.")
for dcate, dcator in zip(all_deprecate, all_deprecators):
dcate.package.do_deprecate(dcator, symlink)
spack.installer.deprecate(dcate, dcator, symlink)


@@ -8,10 +8,13 @@
import llnl.util.tty as tty
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments
import spack.config
import spack.repo
from spack.cmd.common import arguments
from spack.installer import PackageInstaller
description = "developer build: build from code in current working directory"
section = "build"
@@ -129,9 +132,9 @@ def dev_build(self, args):
elif args.test == "root":
tests = [spec.name for spec in specs]
spec.package.do_install(
PackageInstaller(
[spec.package],
tests=tests,
make_jobs=args.jobs,
keep_prefix=args.keep_prefix,
install_deps=not args.ignore_deps,
verbose=not args.quiet,
@@ -139,7 +142,7 @@ def dev_build(self, args):
stop_before=args.before,
skip_patch=args.skip_patch,
stop_at=args.until,
)
).install()
# drop into the build environment of the package?
if args.shell is not None:


@@ -10,7 +10,6 @@
import spack.cmd
import spack.config
import spack.fetch_strategy
import spack.package_base
import spack.repo
import spack.spec
import spack.stage


@@ -12,7 +12,6 @@
import spack.cmd
import spack.environment as ev
import spack.solver.asp as asp
import spack.util.environment
import spack.util.spack_json as sjson
from spack.cmd.common import arguments


@@ -21,15 +21,12 @@
import spack.cmd
import spack.cmd.common
import spack.cmd.common.arguments
import spack.cmd.install
import spack.cmd.modules
import spack.cmd.uninstall
import spack.config
import spack.environment as ev
import spack.environment.depfile as depfile
import spack.environment.environment
import spack.environment.shell
import spack.schema.env
import spack.spec
import spack.tengine
from spack.cmd.common import arguments
from spack.util.environment import EnvironmentModifications
@@ -273,7 +270,7 @@ def env_activate_setup_parser(subparser):
nargs="?",
default=None,
help=(
"name of managed environment or directory of the anonymous env"
"name of managed environment or directory of the independent env"
" (when using --dir/-d) to activate"
),
)
@@ -543,7 +540,7 @@ def env_rename_setup_parser(subparser):
def env_rename(args):
"""Rename an environment.
This renames a managed environment or moves an anonymous environment.
This renames a managed environment or moves an independent environment.
"""
# Directory option has been specified


@@ -18,9 +18,9 @@
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.package_base
import spack.repo
import spack.spec
import spack.util.environment
from spack.cmd.common import arguments
description = "manage external packages in Spack configuration"


@@ -8,7 +8,6 @@
import spack.cmd
import spack.config
import spack.environment as ev
import spack.repo
import spack.traverse
from spack.cmd.common import arguments


@@ -10,10 +10,11 @@
import llnl.util.tty as tty
import llnl.util.tty.color as color
import spack.bootstrap
import spack.cmd as cmd
import spack.config
import spack.environment as ev
import spack.repo
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses


@@ -41,7 +41,7 @@ def setup_parser(subparser):
help="do not remove installed build-only dependencies of roots\n"
"(default is to keep only link & run dependencies)",
)
spack.cmd.common.arguments.add_common_arguments(subparser, ["yes_to_all"])
spack.cmd.common.arguments.add_common_arguments(subparser, ["yes_to_all", "constraint"])
def roots_from_environments(args, active_env):
@@ -97,6 +97,12 @@ def gc(parser, args):
root_hashes = None
specs = spack.store.STORE.db.unused_specs(root_hashes=root_hashes, deptype=deptype)
# limit search to constraint specs if provided
if args.constraint:
hashes = set(spec.dag_hash() for spec in args.specs())
specs = [spec for spec in specs if spec.dag_hash() in hashes]
if not specs:
tty.msg("There are no unused specs. Spack's store is clean.")
return


@@ -16,7 +16,7 @@
import spack.install_test
import spack.repo
import spack.spec
import spack.version
import spack.variant
from spack.cmd.common import arguments
from spack.package_base import preferred_version
@@ -333,26 +333,6 @@ def _fmt_variant(variant, max_name_default_len, indent, when=None, out=None):
out.write("\n")
def _variants_by_name_when(pkg):
"""Adaptor to get variants keyed by { name: { when: { [Variant...] } }."""
# TODO: replace with pkg.variants_by_name(when=True) when unified directive dicts are merged.
variants = {}
for name, (variant, whens) in sorted(pkg.variants.items()):
for when in whens:
variants.setdefault(name, {}).setdefault(when, []).append(variant)
return variants
def _variants_by_when_name(pkg):
"""Adaptor to get variants keyed by { when: { name: Variant } }"""
# TODO: replace with pkg.variants when unified directive dicts are merged.
variants = {}
for name, (variant, whens) in pkg.variants.items():
for when in whens:
variants.setdefault(when, {})[name] = variant
return variants
def _print_variants_header(pkg):
"""output variants"""
@@ -363,32 +343,22 @@ def _print_variants_header(pkg):
color.cprint("")
color.cprint(section_title("Variants:"))
variants_by_name = _variants_by_name_when(pkg)
# Calculate the max length of the "name [default]" part of the variant display
# This lets us know where to print variant values.
max_name_default_len = max(
color.clen(_fmt_name_and_default(variant))
for name, when_variants in variants_by_name.items()
for variants in when_variants.values()
for variant in variants
for name in pkg.variant_names()
for _, variant in pkg.variant_definitions(name)
)
return max_name_default_len, variants_by_name
def _unconstrained_ver_first(item):
"""sort key that puts specs with open version ranges first"""
spec, _ = item
return (spack.version.any_version not in spec.versions, spec)
return max_name_default_len
def print_variants_grouped_by_when(pkg):
max_name_default_len, _ = _print_variants_header(pkg)
max_name_default_len = _print_variants_header(pkg)
indent = 4
variants = _variants_by_when_name(pkg)
for when, variants_by_name in sorted(variants.items(), key=_unconstrained_ver_first):
for when, variants_by_name in pkg.variant_items():
padded_values = max_name_default_len + 4
start_indent = indent
@@ -406,15 +376,14 @@ def print_variants_grouped_by_when(pkg):
def print_variants_by_name(pkg):
max_name_default_len, variants_by_name = _print_variants_header(pkg)
max_name_default_len = _print_variants_header(pkg)
max_name_default_len += 4
indent = 4
for name, when_variants in variants_by_name.items():
for when, variants in sorted(when_variants.items(), key=_unconstrained_ver_first):
for variant in variants:
_fmt_variant(variant, max_name_default_len, indent, when, out=sys.stdout)
sys.stdout.write("\n")
for name in pkg.variant_names():
for when, variant in pkg.variant_definitions(name):
_fmt_variant(variant, max_name_default_len, indent, when, out=sys.stdout)
sys.stdout.write("\n")
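The rewritten helpers above rely on the new variant introspection API (`variant_names`, `variant_definitions`) shown in this diff. A hedged sketch of querying it directly; `zlib-ng` is just an example package name:

```python
# Assumes a working Spack checkout with the variant API introduced in this diff.
import spack.repo

pkg_cls = spack.repo.PATH.get_pkg_class("zlib-ng")
for name in pkg_cls.variant_names():
    for when, vdef in pkg_cls.variant_definitions(name):
        # Each variant may have several definitions, each guarded by a `when` spec.
        print(f"{name}: default={vdef.default} when={when}")
```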
def print_variants(pkg, args):


@@ -13,18 +13,15 @@
from llnl.string import plural
from llnl.util import lang, tty
import spack.build_environment
import spack.cmd
import spack.config
import spack.environment as ev
import spack.fetch_strategy
import spack.package_base
import spack.paths
import spack.report
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.error import SpackError
from spack.error import InstallError, SpackError
from spack.installer import PackageInstaller
description = "build and install packages"
@@ -287,7 +284,7 @@ def require_user_confirmation_for_overwrite(concrete_specs, args):
tty.die("Reinstallation aborted.")
def _dump_log_on_error(e: spack.build_environment.InstallError):
def _dump_log_on_error(e: InstallError):
e.print_context()
assert e.pkg, "Expected InstallError to include the associated package"
if not os.path.exists(e.pkg.log_path):
@@ -352,7 +349,7 @@ def reporter_factory(specs):
install_with_active_env(env, args, install_kwargs, reporter_factory)
else:
install_without_active_env(args, install_kwargs, reporter_factory)
except spack.build_environment.InstallError as e:
except InstallError as e:
if args.show_log_on_error:
_dump_log_on_error(e)
raise
@@ -477,5 +474,5 @@ def install_without_active_env(args, install_kwargs, reporter_factory):
installs = [s.package for s in concrete_specs]
install_kwargs["explicit"] = [s.dag_hash() for s in concrete_specs]
builder = PackageInstaller(installs, install_kwargs)
builder = PackageInstaller(installs, **install_kwargs)
builder.install()
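Several hunks in this diff replace `pkg.do_install(...)` with the `PackageInstaller` API. A call-shape sketch, assuming `pkg` is an already concretized spec's package object (keyword names taken from the dev-build hunk above):

```python
# Call-shape sketch only; `pkg` is assumed to be a concrete spec's package.
from spack.installer import PackageInstaller

PackageInstaller(
    [pkg],              # packages to install
    keep_prefix=False,  # discard a partially installed prefix on failure
    install_deps=True,  # also install dependencies
    verbose=True,
).install()
```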


@@ -6,11 +6,10 @@
import sys
import spack.cmd
import spack.cmd.find
import spack.cmd.common
import spack.environment as ev
import spack.store
import spack.user_environment as uenv
import spack.util.environment
from spack.cmd.common import arguments
description = "add package to the user environment"


@@ -8,9 +8,6 @@
from llnl.util import tty
import spack.cmd
import spack.error
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses


@@ -17,7 +17,6 @@
import spack.mirror
import spack.repo
import spack.spec
import spack.util.path
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.error import SpackError


@@ -15,6 +15,7 @@
import spack.cmd
import spack.config
import spack.error
import spack.modules
import spack.modules.common
import spack.repo
@@ -124,13 +125,13 @@ def check_module_set_name(name):
names = [k for k in modules if k != "prefix_inspections"]
if not names:
raise spack.config.ConfigError(
raise spack.error.ConfigError(
f"Module set configuration is missing. Cannot use module set '{name}'"
)
pretty_names = "', '".join(names)
raise spack.config.ConfigError(
raise spack.error.ConfigError(
f"Cannot use invalid module set '{name}'.",
f"Valid module set names are: '{pretty_names}'.",
)
@@ -172,7 +173,7 @@ def loads(module_type, specs, args, out=None):
modules = list(
(
spec,
spack.modules.common.get_module(
spack.modules.get_module(
module_type,
spec,
get_full_path=False,
@@ -221,7 +222,7 @@ def find(module_type, specs, args):
try:
modules = [
spack.modules.common.get_module(
spack.modules.get_module(
module_type,
spec,
args.full_path,
@@ -232,7 +233,7 @@ def find(module_type, specs, args):
]
modules.append(
spack.modules.common.get_module(
spack.modules.get_module(
module_type,
single_spec,
args.full_path,


@@ -9,7 +9,6 @@
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.traverse
from spack.cmd.common import arguments


@@ -12,7 +12,6 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.paths
import spack.repo
import spack.util.executable as exe
import spack.util.package_hash as ph


@@ -78,8 +78,8 @@ def python(parser, args, unknown_args):
# Run user choice of interpreter
if args.python_interpreter == "ipython":
return spack.cmd.python.ipython_interpreter(args)
return spack.cmd.python.python_interpreter(args)
return ipython_interpreter(args)
return python_interpreter(args)
def ipython_interpreter(args):


@@ -6,7 +6,6 @@
import llnl.util.tty as tty
import spack.cmd
import spack.repo
from spack.cmd.common import arguments
description = "revert checked out package source code"


@@ -12,11 +12,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package_base
import spack.solver.asp as asp
import spack.spec
from spack.cmd.common import arguments
description = "concretize a specs using an ASP solver"


@@ -14,6 +14,7 @@
import spack.hash_types as ht
import spack.spec
import spack.store
import spack.traverse
from spack.cmd.common import arguments
description = "show what would be installed, given a spec"


@@ -11,8 +11,6 @@
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.stage
import spack.traverse
from spack.cmd.common import arguments


@@ -9,8 +9,8 @@
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
import spack.environment
import spack.repo
import spack.store
import spack.tag
description = "show package tags and associated packages"


@@ -15,11 +15,12 @@
from llnl.util.tty import colify
import spack.cmd
import spack.config
import spack.environment as ev
import spack.install_test
import spack.package_base
import spack.repo
import spack.report
import spack.store
from spack.cmd.common import arguments
description = "run spack's tests for an install"


@@ -10,6 +10,7 @@
from llnl.util.filesystem import working_dir
import spack
import spack.cmd
import spack.config
import spack.paths
import spack.util.git


@@ -6,6 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.config
from spack.cmd.common import arguments
description = "remove specs from an environment"


@@ -10,6 +10,8 @@
import re
import sys
import spack.extensions
try:
import pytest
except ImportError:


@@ -7,9 +7,10 @@
import sys
import spack.cmd
import spack.cmd.common
import spack.error
import spack.store
import spack.user_environment as uenv
import spack.util.environment
from spack.cmd.common import arguments
description = "remove package from the user environment"


@@ -6,6 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.environment as ev
import spack.store
import spack.verify


@@ -202,18 +202,6 @@ class Compiler:
support for specific compilers, their possible names, arguments,
and how to identify the particular type of compiler."""
# Subclasses use possible names of C compiler
cc_names: List[str] = []
# Subclasses use possible names of C++ compiler
cxx_names: List[str] = []
# Subclasses use possible names of Fortran 77 compiler
f77_names: List[str] = []
# Subclasses use possible names of Fortran 90 compiler
fc_names: List[str] = []
# Optional prefix regexes for searching for this type of compiler.
# Prefixes are sometimes used for toolchains
prefixes: List[str] = []
@@ -427,14 +415,19 @@ def implicit_rpaths(self) -> List[str]:
return list(paths_containing_libs(link_dirs, all_required_libs))
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
def default_dynamic_linker(self) -> Optional[str]:
"""Determine default dynamic linker from compiler link line"""
output = self.compiler_verbose_output
if not output:
return None
dynamic_linker = spack.util.libc.parse_dynamic_linker(output)
return spack.util.libc.parse_dynamic_linker(output)
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
dynamic_linker = self.default_dynamic_linker
if not dynamic_linker:
return None
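The new `default_dynamic_linker` property feeds the compiler's verbose link output to `spack.util.libc.parse_dynamic_linker`. An illustrative call; the verbose output string below is a trimmed, fabricated `gcc -v` style line, not real compiler output:

```python
# Illustrative only; the link line below is a made-up example.
import spack.util.libc

verbose_output = (
    "/usr/bin/ld -dynamic-linker /lib64/ld-linux-x86-64.so.2 /tmp/ccExample.o -lc"
)
print(spack.util.libc.parse_dynamic_linker(verbose_output))
# expected: /lib64/ld-linux-x86-64.so.2
```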
@@ -619,18 +612,6 @@ def extract_version_from_output(cls, output):
def cc_version(cls, cc):
return cls.default_version(cc)
@classmethod
def cxx_version(cls, cxx):
return cls.default_version(cxx)
@classmethod
def f77_version(cls, f77):
return cls.default_version(f77)
@classmethod
def fc_version(cls, fc):
return cls.default_version(fc)
@classmethod
def search_regexps(cls, language):
# Compile all the regular expressions used for files beforehand.

Some files were not shown because too many files have changed in this diff.