Compare commits

...

178 Commits

Author SHA1 Message Date
Wouter Deconinck
23b436074b netcdf-c: add comment to PR with CMake option rename 2025-03-06 09:50:52 -06:00
Wouter Deconinck
a593b751b9 netcdf-c: prefix CMake options with NETCDF_ 2025-03-05 22:06:37 -06:00
Wouter Deconinck
2e8aafc544 seacas: depends_on netcdf-c@:4.9.2 when @:2024-08-15 2025-03-05 20:08:23 -06:00
Wouter Deconinck
2c8b93fa66 netcdf-c: add v4.9.3 2025-03-05 20:05:55 -06:00
Weiqun Zhang
170a276f18 amrex: add v25.03 (#49252)
Starting from amrex-25.03, FFT is enabled by default in spack build.
2025-03-05 15:53:25 -08:00
Massimiliano Culpo
313524dc6d qrupdate: update to use oneapi packages (#49304)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 13:44:37 -05:00
Massimiliano Culpo
5aae6e25a5 arpack-ng: update to use oneapi packages (#49302)
Also, remove deprecated versions

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 13:44:13 -05:00
Massimiliano Culpo
b58a52b6ce abinit: update to use oneapi packages (#49301)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 13:44:01 -05:00
Chris White
32760e2885 sundials: expand patch when rule (#49296) 2025-03-05 16:13:19 +01:00
Harmen Stoppels
125feb125c Define Package API version (#49274)
Defines `spack.package_api_version` and `spack.min_package_api_version` 
as tuples (major, minor). 

These are, respectively, the current Package API version implemented by this version
of Spack and the minimal Package API version it is backwards compatible with.

Repositories can optionally define:
```yaml
repo:
    namespace: my_repo
    api: v1.2
```
which indicates they are compatible with versions of Spack that implement 
Package API `>= 1.2` and `< 2.0`. When the `api` key is omitted, the default 
`v1.0` is assumed.
2025-03-05 15:42:48 +01:00
Wouter Deconinck
8677063142 QtPackage: modify QT_ADDITIONAL_PACKAGES_PREFIX_PATH handling (#49297)
* QtPackage: mv QT_ADDITIONAL_PACKAGES_PREFIX_PATH handling

* geomodel: support Qt6

* qt-base: rm import re
2025-03-05 09:09:32 -05:00
Massimiliano Culpo
f015b18230 hydrogen: update to use oneapi packages (#49293)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 09:06:32 +01:00
Massimiliano Culpo
aa9e610fa6 elemental: remove deprecated package (#49291)
This package has not been maintained since 2016.

We maintain an active fork in the hydrogen
package, so remove this one.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-05 08:36:05 +01:00
Wouter Deconinck
7d62045c30 py-networkx: add up to v3.4.2 (#49289)
* py-networkx: add new versions up to 3.4.2
* py-networkx: add more requirements
* py-networkx: fix typo
* py-networkx: fix python and py-setuptools dependencies

---------

Co-authored-by: Joseph C Wang <joequant@gmail.com>
2025-03-04 17:02:54 -08:00
Chris Marsh
5b03173b99 r-packages: add missing gettext dependencies (#48910)
* add gettext dependency

* typo

* style
2025-03-04 17:07:01 -06:00
mvlopri
36fcdb8cfa Update the incorrect sha for the SEACAS package.py (#49292)
The sha256sum for the 2025-02-27 version of SEACAS is incorrect
due to the movement of the tagged version.
2025-03-04 16:03:28 -07:00
Chris Marsh
7d5b17fbf2 py-rpy2: Add 3.5.17 (#48911)
* Update rpy2 to newest version and clean up package

* Add me as maintainer

* Update depends section as per review. Add ipython variant. Fix some ranges and add support for python 3.9. Deprecated outdated versions

* refine depends_on and remove redundant version info

* style
2025-03-04 15:58:12 -07:00
Piotr Sacharuk
d6e3292955 flux-sched: Apply workarounds for oneAPI compiler for problem with build (#49282) 2025-03-04 15:28:33 -07:00
Chris Marsh
60f54df964 Explicitly depend on gettext for libintl (#48908) 2025-03-04 16:25:31 -06:00
Wouter Deconinck
487df807cc veccore: add typo fix for clang (#49288)
* veccore: add typo fix for clang

* veccore: apply ScalarWrapper.h patch for all compilers

---------

Co-authored-by: Joseph C Wang <joequant@gmail.com>
2025-03-04 14:35:47 -07:00
Zack Galbreath
cacdf84964 ci: add support for high priority local mirror (#49264) 2025-03-04 14:47:37 -06:00
fbrechin
e2293c758f Adding ability for repo paths from a manifest file to be expanded when creating an environment. (#49084)
* Adding ability for repo paths from a manifest file to be expanded when creating an environment.

A unit test was added to check that an environment variable will be expanded.
Also, a bug was fixed in the expansion of develop paths where if an environment variable
was in the path that then produced an absolute path the path would not be extended.

* Fixing new unit test for env repo var substitution

* Adding ability for repo paths from a manifest file to be expanded when creating an environment.

A unit test was added to check that an environment variable will be expanded.
Also, a bug was fixed in the expansion of develop paths where if an environment variable
was in the path that then produced an absolute path the path would not be extended.

* Messed up resolving last rebase
2025-03-04 09:52:28 -08:00
Harmen Stoppels
f5a275adf5 gitignore: remove *_archive (#49278) 2025-03-04 18:37:18 +01:00
Paul
615ced32cd protobuf: add v3.29.3 (#49246) 2025-03-04 11:29:53 -06:00
Massimiliano Culpo
bc04d963e5 Remove debug print statements in unit-tests (#49280)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-04 18:29:45 +01:00
Taillefumier Mathieu
11051ce5c7 CP2K: Add GRPP support (#49232) 2025-03-04 06:54:27 -07:00
Adam J. Stewart
631bddc52e py-pyarrow: add v19.0.1 (#49149)
* py-pyarrow: add v19.0.1

* Environment variables no longer needed either

* Remove py-pyarrow variants
2025-03-04 13:20:52 +01:00
Adam J. Stewart
b5f40aa7fb OpenCV: fix +cuda build (#49146) 2025-03-04 13:19:57 +01:00
Adam J. Stewart
57e0798af2 py-pip: mark Python 3.12+ support (#49148) 2025-03-04 13:18:38 +01:00
Chris White
0161b662f7 conduit: do not pass link flags to ar (#49263) 2025-03-03 19:53:11 -07:00
afzpatel
aa55b19680 fix +asan in ROCm packages (#48745)
* fix asan for hsa-rocr-dev
* add libclang_rt.asan-x86_64.so to LD_LIBRARY_PATH
* fix +asan for hipsparselt
* fix rocm-openmp-extras asan and add rccl +asan support
* add missing comgr build env variables
* add missing rocm-smi-lib build env variables
* minor dependency change
* fix style
2025-03-03 17:57:34 -08:00
dependabot[bot]
8cfffd88fa build(deps): bump pytest from 8.3.4 to 8.3.5 in /lib/spack/docs (#49268)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.4 to 8.3.5.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.4...8.3.5)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 19:18:42 -06:00
dependabot[bot]
2f8dcb8097 build(deps): bump python-levenshtein in /lib/spack/docs (#49269)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.26.1 to 0.27.1.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.26.1...v0.27.1)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 19:17:48 -06:00
dependabot[bot]
5b70fa8cc8 build(deps): bump sphinx from 8.2.1 to 8.2.3 in /lib/spack/docs (#49270)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 8.2.1 to 8.2.3.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v8.2.1...v8.2.3)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 19:17:08 -06:00
Adam J. Stewart
b4025e89ed py-torchmetrics: add v1.6.2 (#49262) 2025-03-03 19:15:49 -06:00
Eric Berquist
8db74e1b2f tmux: add 3.5a, 3.5, and 3.3 (#49259)
* tmux: add 3.5a, 3.5, and 3.3

* tmux: patch is in releases from 3.5 onward

* tmux: versions 3.5 and newer can use jemalloc
2025-03-03 19:12:45 -06:00
Wouter Deconinck
1fcfbadba7 qwt: add v6.2.0, v6.3.0, support Qt6 (#45604)
* qwt: support building against Qt6

* qwt: fix style

* qwt: depends_on qt-base+opengl+widgets when +opengl

* visit: patch for missing cmath include

---------

Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2025-03-03 16:25:48 -08:00
Chris White
13ec35873f Axom: Changes from Axom repository (#49183)
* pull in new changes from axom project

* add new versions

* convert more conditionals to spec.satisfies

-------------
Co-authored-by: white238 <white238@users.noreply.github.com>
2025-03-03 15:47:45 -08:00
Philip Fackler
f96b6eac2b xolotl: new package (#48876)
* Adding xolotl package

* [@spackbot] updating style on behalf of PhilipFackler

* Removing redundant text

* Add blank line

* Update var/spack/repos/builtin/packages/xolotl/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/xolotl/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Switch to CudaPackage and remove source dir from runtime env

* [@spackbot] updating style on behalf of PhilipFackler

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-03 15:18:28 -06:00
Rocco Meli
933a1a5cd9 update (#49261) 2025-03-03 10:38:10 -07:00
Stephen Nicholas Swatman
b2b9914efc acts dependencies: new versions as of 2025/03/03 (#49253)
This commit adds ACTS version 39.2.0 and detray version 0.89.0.
2025-03-03 09:32:59 -07:00
Rocco Meli
9ce9596981 multicharge: add v0.3.1 (#49255)
* multicharge: add v0.3.1

* fix url
2025-03-03 15:32:29 +01:00
Wouter Deconinck
fc30fe1f6b librsvg: add v2.56.4, v2.57.3, v2.58.2 (#45734)
* librsvg: add v2.56.4, v2.57.3, v2.58.2

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-03-02 14:08:43 -08:00
Paul
25a4b98359 jacamar-ci: add v0.25.0 (#49248) 2025-03-02 14:50:43 -06:00
Adam J. Stewart
05c34b7312 py-pymc3: not compatible with numpy 2 (#49225) 2025-03-01 13:43:05 -06:00
Tahmid Khan
b22842af56 globalarrays: Add variant cxx which adds the --enable-cxx flag (#49241) 2025-03-01 13:16:04 -06:00
Vanessasaurus
0bef028692 Automated deployment to update package flux-sched 2025-02-28 (#49229)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-03-01 08:48:41 -07:00
Vanessasaurus
935facd069 Automated deployment to update package flux-security 2025-02-28 (#49230)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-03-01 08:47:19 -07:00
Adam J. Stewart
87e5255bbc py-matplotlib: add v3.10.1 (#49233) 2025-03-01 16:22:49 +01:00
dependabot[bot]
b42f0d793d build(deps): bump isort in /.github/workflows/requirements/style (#49212)
Bumps [isort](https://github.com/PyCQA/isort) from 6.0.0 to 6.0.1.
- [Release notes](https://github.com/PyCQA/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/PyCQA/isort/compare/6.0.0...6.0.1)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-01 08:18:06 -07:00
dependabot[bot]
ccca0d3354 build(deps): bump isort from 6.0.0 to 6.0.1 in /lib/spack/docs (#49213)
Bumps [isort](https://github.com/PyCQA/isort) from 6.0.0 to 6.0.1.
- [Release notes](https://github.com/PyCQA/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/PyCQA/isort/compare/6.0.0...6.0.1)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-01 08:17:39 -07:00
HELICS-bot
9699bbc7b9 helics: Add version 3.6.1 (#49231)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-03-01 08:16:21 -07:00
Raffaele Solcà
c7e251de9f Add dla-future v0.8.0 (#49235) 2025-03-01 08:14:52 -07:00
Robert Maaskant
d788b15529 libmd: add version 1.1.0 (#49239)
Release notes can be read at https://archive.hadrons.org/software/libmd/libmd-1.1.0.announce
2025-03-01 08:11:12 -07:00
Harmen Stoppels
8e7489bc17 Revert "Honor cmake_prefix_paths property if available (#42569)" (#49237)
This reverts commit fe171a560b.
2025-02-28 23:33:02 +01:00
John W. Parent
d234df62d7 Solver: Cache Concretization Results (#48198)
Concretizer caching for reusing solver results
2025-02-28 12:42:00 -06:00
Mikhail Titov
4a5922a0ec py-radical-*: new version 1.90 (#48586)
* rct: update packages (RE, RG, RP, RS, RU) with new version 1.90

* radical: added `url_for_version` for older versions

* radical: set latest versions for `radical.pilot` and `radical.utils`

* radical: fixed `url_for_version` setup

* radical: set the latest version for `radical.entk`

* radical: fixed style for `url_for_version`

* Apply suggestions from code review (python version dependency)

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-28 07:38:45 -07:00
John W. Parent
5bd184aaaf Windows Rpath: Allow package test rpaths (#47072)
On Windows, libraries search their directory for dependencies, and
we help libraries in Spack-built packages locate their dependencies
by symlinking them into the dependent's directory (we refer to this
as simulated RPATHing).

We extend the convenience functionality here to support base library
directories outside of the package prefix: this is primarily for
running tests in the build directory (which is not located inside
of the final install prefix chosen by spack).
2025-02-27 19:16:00 -08:00
Mikael Simberg
464c3b96fa fmt: Add 11.1.4 (#49218) 2025-02-27 19:12:26 -06:00
Scott Wittenburg
60544a4e84 ci: avoid py-mpi4py tests on darwin (#49227) 2025-02-27 18:07:59 -07:00
Greg Sjaardema
a664d98f37 seacas: new version with change set support (#49224)
This release contains modifications to most of the SEACAS applications to support ChangeSets to some degree.
See https://github.com/SandiaLabs/seacas/wiki/Dynamic_Topology for information about Change Sets and
See https://github.com/SandiaLabs/seacas/wiki/Supporting-Change-Sets for information about how the various seacas applications are supporting the use or creation of change sets.

The release also includes various other small changes including formatting, portability, installation, TPL version updates, and spelling.
2025-02-27 18:02:51 -07:00
Sinan
0e3d7efb0f alps: add conflict (#48751)
Co-authored-by: Sinan81 <Sinan@world>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2025-02-27 17:57:55 -07:00
Chris Green
a8cd0b99f3 New recipes for PlantUML and py-sphinxcontrib-plantuml (#49204)
* new-recipe: plantuml
* new-recipe: py-sphinxcontrib-plantuml
2025-02-27 16:57:23 -08:00
Alec Scott
a43df598a1 rust: add v1.85.0 (#49158) 2025-02-27 13:18:23 -06:00
Alec Scott
a7163cd0fa gnutls: add master, improve styling (#49080) 2025-02-27 13:13:23 -06:00
Kyle Knoepfel
fe171a560b Honor cmake_prefix_paths property if available (#42569)
* Honor package-specified cmake_prefix_paths at runtime

* Add paths in the correct order and prune duplicates

* Normalize paths for windows' sake
2025-02-27 11:11:22 -07:00
Ritwik Patil
24abc3294a sendme: new package (#49133)
* add sendme package

* style fix

* add docstring for test function

* changed maintainer string, run test after install

* removed redundant test

* Follow the common package license header format

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-27 09:59:09 -07:00
MatthewLieber
2dea0073b2 mvapich-plus: new package (#48507)
* add mvapich-plus 4.0

* run spack style fix

* fix license issue

* change styling of mvapich-plus package based on review

* using spack style --fix

* fix more typos

* Apply suggestions from code review

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Adding CudaPackage and RocmPackage Mixins

---------

Co-authored-by: Matt Lieber <lieber.31@osu.edu>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-27 09:36:26 -07:00
Massimiliano Culpo
31ecefbfd2 heppdt: add dependency on C (#49219)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-27 08:10:19 -06:00
Harmen Stoppels
7363047b82 schema: additionalKeysAreSpecs (#49221)
Currently we validate all keys as specs, but it's meant to validate only additional keys in all cases.
2025-02-27 12:17:25 +01:00
Massimiliano Culpo
12fe7aef65 pipelines: extract changes from compiler as nodes (#49222)
* Split requirements to get better error messages in case of unsat solves.
* use list requirements instead of string
* activate static_analysis in a few pipelines

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-27 12:13:34 +01:00
Harmen Stoppels
5da4f18188 schema/modules.py: remove lmod props from tcl schema (#49220) 2025-02-27 10:48:22 +01:00
Marc T. Henry de Frahan
61c54ed28b Remove FSI variant from Nalu-Wind (#49209)
* Remove fsi variant from Nalu-Wind

* fix exawind
2025-02-26 16:57:56 -07:00
eugeneswalker
677caec3c6 Ci reactivate darwin pipelines (#48453)
* ci: darwin stacks: update tags following system updates

* disable SPACK_CI_DISABLE_STACKS; only enable *darwin* stacks for testing

* manually chmod u+w tmp/ before cleanup due to issue#49147

* comment out failing specs for now

* re-enable logic for disabling stacks

* add explanatory comment for darwin after_script additions

* remove more darwin-only targetting

* restore build_stage to default location

* move build-job-remove out of individual darwin stacks into darwin top level config

* keep build_stage in $spack/tmp for now
2025-02-26 17:34:22 -06:00
eugeneswalker
b914bd6638 e4s oneapi ci stack: mpi: require intel-oneapi-mpi (#49043)
* e4s oneapi ci stack: mpi: require intel-oneapi-mpi

* nrm ^py-scipy cflags="-Wno-error=incompatible-function-pointer-types"

* add explanatory comment
2025-02-26 23:07:57 +00:00
Massimiliano Culpo
3caa3132aa python: allow it as a build-tool again (#49201)
Python was removed from being a build tool in #46980, due to issues
when reusing specs. This PR adds a new rule to match the interpreter
among different Python packages, in clingo.

It also adds a bunch of new "build-tools", so that specs like:
```
py-matplotlib backend=tkagg
```
can be concretized in one go.

Modifications:
- [x] Make `py-matplotlib backend=tkagg` concretizable
- [x] Add unit-tests to ensure situations like in #46980 do not happen

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-26 15:04:31 -08:00
Massimiliano Culpo
dbd531112c Assign priorities to configuration scopes (take 2) (#49187)
Currently, the custom config scopes are pushed at the top when constructing
configuration, and are demoted whenever a context manager activating an
environment is used - see #48414 for details. Workflows that rely on the order
in the [docs](https://spack.readthedocs.io/en/latest/configuration.html#custom-scopes)
are thus fragile, and may break.

This PR allows assigning priorities to scopes, and ensures that scopes of lower priorities
are always "below" scopes of higher priorities. When scopes have the same priority,
what matters is the insertion order.

Modifications:
- [x] Add a mapping that iterates over keys according to priorities set when
      adding the key/value pair
- [x] Use that mapping to allow assigning priorities to configuration scopes
- [x] Assign different priorities for different kind of scopes, to fix a bug, and
      add a regression test
- [x] Simplify `Configuration` constructor
- [x] Remove `Configuration.pop_scope`

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-26 10:52:19 -08:00
psakievich
ae5e121502 Preserve --lines (#49194)
This does not propagate in parsing. Open to other ideas.
2025-02-26 17:48:01 +00:00
snehring
929cfc8e5a relion: add v5.0.0 (#49174)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-02-26 09:17:54 -08:00
Chris Marsh
bad28e7f9f py-natsort: add new variant +icu and dependent package (#48907)
* Add new package py-pyicu to support new py-natsort variant +icu

* note version req location

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* bound icu variant

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-26 10:42:13 -06:00
Patrick Lavin
3d63fe91b0 sst-elements: add support for --enable-ariel-mpi flag (#49135) 2025-02-26 07:30:51 -07:00
Mikhail Titov
95af020310 py-psij-python: new version 0.9.9 (#48610)
* py-psij-python: new version 0.9.9

* Update var/spack/repos/builtin/packages/py-psij-python/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* fixed py3.8 dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-26 08:13:48 -06:00
Seth R. Johnson
2147b9d95e g4vg: new version 1.0.3 (#49195) 2025-02-25 16:27:41 -06:00
psakievich
68636e7c19 lua-lpeg inheritance fix (#49065)
The parent class function doesn't return the path to the config file. This is one potential fix, or we can add the return back to base builder.
2025-02-25 15:14:54 -07:00
Elsa Gonsiorowski, PhD
f56675648a mpifileutils: add v0.12 (#49132)
* mpifileutils: update for v0.12 release

* removed @adammoody from maintainers
2025-02-25 13:39:25 -07:00
Tara Drwenski
3a219d114d Petsc: add in hipblas dependency on hipblas-common (#49017) 2025-02-25 10:35:36 -06:00
Wouter Deconinck
3cefa7047c davix: add v0.8.8, v0.8.9, v0.8.10 (#49057)
* davix: add v0.8.8, v0.8.9, v0.8.10

* davix: url_for_version

* davix: depends on googletest when @0.8.8: (type test, maybe build)

* davix: define DAVIX_TESTS
2025-02-25 10:05:31 -06:00
Cédric Chevalier
35013773ba Fix setup.fish syntax (#49176)
* Fix setup.fish syntax

* Simplify conditional in share/spack/setup-env.fish

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-25 07:11:55 -06:00
AMD Toolchain Support
e28379e98b python: limit parallellism in compileall (#48441) 2025-02-25 13:54:59 +01:00
afzpatel
93329d7f99 add ck variant to miopen-hip (#49143) 2025-02-25 05:48:49 -07:00
Massimiliano Culpo
9e508b0321 Revert "Assign priorities to configuration scopes (#48420)" (#49185)
All the build jobs in pipelines are apparently relying on the bug that was fixed.

The issue was not caught in the PR because generation jobs were fine, and
there was nothing to rebuild.

Reverting to fix pipelines in a new PR.

This reverts commit 3ad99d75f9.
2025-02-25 02:33:41 -08:00
Adam J. Stewart
2c26c429a7 py-sphinx: add v8.2.0 (#49107) 2025-02-25 10:44:58 +01:00
dependabot[bot]
1cc63e2b7c build(deps): bump sphinx from 8.2.0 to 8.2.1 in /lib/spack/docs (#49180)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 8.2.0 to 8.2.1.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/v8.2.1/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v8.2.0...v8.2.1)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-25 02:33:03 -07:00
Harmen Stoppels
4e311a22d0 spec.py: remove VariantMap.concrete (#49170)
VariantMap.concrete is unused, and would be incorrect if it were used
due to conditional variants.

Just let the Spec dictate what is concrete and what is not.
2025-02-25 10:18:06 +01:00
Massimiliano Culpo
3ad99d75f9 Assign priorities to configuration scopes (#48420)
Currently, environments can end up with higher priority than `-C` custom
config scopes and `-c` command line arguments sometimes. This shouldn't
happen -- those explicit CLI scopes should override active environments.

Up to now, configuration behaved like a stack, where scopes could only be
pushed at the top. This PR allows assigning priorities to scopes, and ensures
that scopes of lower priorities are always "below" scopes of higher priorities.

When scopes have the same priority, what matters is the insertion order.

Modifications:
- [x] Add a mapping that iterates over keys according to priorities set when
      adding the key/value pair
- [x] Use that mapping to allow assigning priorities to configuration scopes
- [x] Assign different priorities for different kind of scopes, to fix a bug, and
      add a regression test
- [x] Simplify `Configuration` constructor
- [x] Remove `Configuration.pop_scope`
- [x] Remove `unify:false` from custom `-C` scope in pipelines

On the last modification: on `develop`, pipelines are relying on the environment
being able to override `-C` scopes, which is a bug. After this fix, we need to be
explicit about the unification strategy in each stack, and remove the blanket
`unify:false` from the highest priority scope

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-25 00:58:16 -08:00
Adam J. Stewart
b79c01077d py-sympy: add v1.13.1 (#48951)
* py-sympy: add v1.13.1
2025-02-24 15:09:00 -08:00
Jim Galarowicz
4385f36b8d survey: add latest releases and python path settings for building with autoload none. Ref issue: 42535 (#48050)
* Update survey package file with latest releases and python path settings for building with autoload none.
* Submitting reformatted file.
* update survey package file with libmonitor dependency changes, take out py-gpustat, and minor comment change.
* Trigger build.
2025-02-24 15:04:40 -08:00
Axel Huebl
a85f1cfa4b WarpX 25.02 (#48917)
* pyAMReX: 25.02
* PICMI: 0.33.0
* WarpX: 25.02
* `amrex +fft` depends on `pkgconfig`
* Updated CMake logic uses `pkgconfig`
2025-02-24 14:48:27 -08:00
Melven Roehrig-Zoellner
13524fa8ed gcc: fix package.py for gcc@:9 (#49173) 2025-02-24 15:44:04 -07:00
Mikael Simberg
738c73975e mimalloc: Add new versions (#49168) 2025-02-24 13:04:22 -07:00
Mikael Simberg
bf9d72f87b ut: Add 2.3.0 (#49169) 2025-02-24 12:59:31 -07:00
Mikael Simberg
674cca3c4a asio: add 1.32.0 (#49167) 2025-02-24 12:39:18 -07:00
Cory Quammen
7a95e2beb5 paraview: add patch for Intel Classic compilers (#49116)
ParaView 5.12.0 through 5.13.2 do not compile. See
https://gitlab.kitware.com/vtk/vtk/-/issues/19620.
2025-02-24 11:27:03 -06:00
Adam J. Stewart
5ab71814a9 py-torchgeo: correct pyvista dep (#49140) 2025-02-24 09:06:33 -08:00
Harmen Stoppels
e783a2851d Revert "Repo.packages_with_tags: do not construct a set of all packages (#49141)" (#49172)
This reverts commit 0da5bafaf2.
2025-02-24 16:46:41 +01:00
Stephen Nicholas Swatman
29e3a28071 vecmem: add v1.14.0 (#49166)
This commit adds version 1.14.0 of the vecmem package.
2025-02-24 08:08:52 -06:00
Harmen Stoppels
4e7a5e9362 spack verify libraries: verify dependencies of installed packages can be resolved (#49124)
Currently, we have `config:shared_linking:missing_library_policy` to error
or warn when shared libraries cannot be resolved upon install.

The new `spack verify libraries` command allows users to run this post
install hook at any point in time to check whether their current
installations can resolve shared libs in rpaths.
2025-02-24 11:28:06 +01:00
Harmen Stoppels
89d1dfa340 python: deprecate old patch versions, remove patches that do not apply (#48958) 2025-02-24 03:23:05 -07:00
Harmen Stoppels
974abc8067 Add typehints for directory_layout / Spec.prefix (#48652) 2025-02-24 09:47:07 +00:00
Massimiliano Culpo
2f9ad5f34d spec.py: fix virtual reconstruction for old specs (#49103)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-02-24 01:23:37 -07:00
Harmen Stoppels
9555ceeb8a glib: various fixes (#48840)
* remove preferred to allow seamless python@3.12 usage

* glib: remove deprecated versions

* glib: use extends because python-venv is pulled in from build deps and put into path

* dont patch patch versions, use new patch releases containing the fix instead

* restrict patch of shebangs, group relevant bits together

* simplify lowerbound

* fix pinned glib version

---------

Co-authored-by: Chris Marsh <chrismarsh.c2@gmail.com>
2025-02-24 09:17:45 +01:00
Harmen Stoppels
6cd74efa90 Spec.ensure_external_path_if_external, Spec.inject_patches_variant -> spack.solver.asp (#48988)
y
2025-02-24 08:36:23 +01:00
Wouter Deconinck
3b3735a2cc root: add v6.34.04 (#49163)
* root: add v6.34.04

* root: add conflict for gcc-15 with earlier versions

---------

Co-authored-by: Patrick Gartung <gartung@fnal.gov>
2025-02-23 22:17:47 -07:00
dependabot[bot]
2ffbc0d053 build(deps): bump mypy in /.github/workflows/requirements/style (#49165)
Bumps [mypy](https://github.com/python/mypy) from 1.11.2 to 1.15.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.11.2...v1.15.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-23 22:29:24 -06:00
Dom Heinzeller
a92419ffe4 Partial bug fix + conflict for compiling node-js@21: with gcc@11.2 (#48494)
* Bug fix for compiling node-js@21: with gcc@11.2 (var/spack/repos/builtin/packages/node-js/package.py var/spack/repos/builtin/packages/node-js/wasm-compiler-gcc11p2.patch)

Since this bug fix is not sufficient, add a conflict for node-js@21: with gcc@11.2

* In var/spack/repos/builtin/packages/node-js/package.py, restrict patch wasm-compiler-gcc11p2.patch to versions 21:22 for gcc@11.2
2025-02-23 19:13:56 -06:00
Juan Miguel Carceller
92c16d085f gtkplus: add conflict with GCC 14 (#48661)
* gtkplus: add conflict with GCC 14

* gtkplus: conflict gcc@14: when @:3.24.35

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-23 15:28:55 -07:00
Adam J. Stewart
c94024d51d py-timm: add v1.0.15 (#49159) 2025-02-23 14:43:33 -07:00
Harmen Stoppels
11915ca568 apr-util: add missing libxcrypt (#49160) 2025-02-23 13:30:09 -07:00
Buldram
4729b6e837 chafa: new package (#49162)
* chafa: new package

* Require at least one of +shared/+static
2025-02-23 13:29:50 -07:00
Seth R. Johnson
2f1978cf2f celeritas: add 'develop' branch (#49004)
* Revert "REVERTME: move celeritas changes to another branch"

This reverts commit a063e43aaf.

* Use predicted g4vg version

* Use

* fixup! Use predicted g4vg version

* Use spec for versions and improve dependency specification
2025-02-23 13:29:31 -07:00
Chase Phelps
d4045c1ef3 py-perfdump: new package (#49035)
* py-perfdump: new package

* Update package.py

style

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update package.py

remove unneeded rpath and pythonroot

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-23 13:29:13 -07:00
Pranav Sivaraman
a0f8aaf4e7 setup-env.fish: fix version checking for completions (#48806) 2025-02-23 20:28:51 +00:00
Wouter Deconinck
b7a5e9ca03 root: add v6.34.00, v6.34.02 (#48129)
* root: add v6.34.00, v6.34.02

* root: prefer 6.32.08

* root: updated variants

* root: treat v6.34 as stable, no preference for v6.32

* root: add variants geom, geombuilder

* delphes: depends on root +geom +opengl

* dd4hep: depends on root +geom

* pandoramonitoring: depends on root +geom

* root: actually pass geom, geombuilder to cmake

---------

Co-authored-by: Patrick Gartung <gartung@fnal.gov>
2025-02-23 12:18:10 -06:00
Adam J. Stewart
7e4b8aa020 py-pyproj: add v3.7.1 (#49066) 2025-02-22 20:46:00 +01:00
Derek Ryan Strong
f5aa15034e Add fpart v1.7.0 (#49119) 2025-02-22 10:54:52 -06:00
Phil Tooley
f210be30d8 LLVM,GCC: Keep stable-series releases a bit longer (#49113) 2025-02-22 09:54:36 -06:00
Richard Berger
c63741a089 py-sphinx-rtd-dark-mode: add version 1.3.0 (#49136) 2025-02-22 08:31:13 -06:00
Andrey Perestoronin
4c99ffd81f new impi intel package 2021.14.2 release (#49114) 2025-02-22 08:02:53 -05:00
Nils Vu
1331332dcf libxsmm: update URL (#49155) 2025-02-22 02:04:49 -07:00
dmagdavector
910a4e6d22 slirp4netns: add v1.2.3, v1.3.1 (#48569) 2025-02-21 16:00:28 -08:00
Piotr Sacharuk
93f1ec20aa Update openturns versions (#48872) 2025-02-21 15:59:23 -08:00
Harmen Stoppels
9edbe5aed1 liburing: requires(...) (#49041)
* liburing: requires

* Update var/spack/repos/builtin/packages/liburing/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-21 15:45:54 -08:00
Krishna Chilleri
a574c7610b py-schema-salad: add v8.7.20241021092521 and py-mypy: add v1.12.1 (#49127)
* add new versions of dependencies

* modify pypi url for newest version

* add option for url depending on version number

* add version ranges of dependencies

* [@spackbot] updating style on behalf of kchilleri

* remove unnecessary py-cache-control version number
2025-02-21 16:07:32 -07:00
Alec Scott
4742f053af emacs: improve gui variant to cover both linux and macos (#49054)
* emacs: improve gui variant to cover both linux and macos

* emacs: fix optional deps type
2025-02-21 16:16:44 -06:00
Alec Scott
b06c5c7e81 fzf: fix go cache protection to allow delete (#49151) 2025-02-21 13:52:33 -07:00
Tobias Ribizel
03fa150185 typst: add v0.13.0 (#49134)
* spack: add version 0.13

* typst: fix version order

Co-authored-by: Alec Scott <hi@alecbcs.com>

* typst: more precise version requirements

* typst: use build_directory

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-02-21 11:25:50 -08:00
ByteHamster
b304a2d854 Fix installing rust@nightly (#49098)
Installing `rust@nightly` fails because the package file declares a conflict of rust versions older than `:1.64` with `gcc>=13`. However, because `nightly` is alphanumerically smaller than any actual version number, `nightly` is incorrectly detected to have a conflict with `gcc>=13` as well. Marking `nightly` as an infinity version instead solves this.
2025-02-21 09:53:46 -08:00
Ryan Krattiger
1fa1864b37 Reproducer should deduce artifact root from concrete environment (#45281)
* Reproducer should deduce artifact root from concrete environment

* Add documentation on the layout of the artifacts directory

* Use dag hash in the container name

* Add reproducer options to improve local testing

* --use-local-head allows running reproducer with
  the current Spack HEAD commit rather than computing
  a commit for the reproducer

* Add test to verify commits and recreating reproduction environment

* Add test for non-merge commit case

* ci reproduce-build: Drop overwrite option
in favor of throwing an error if the working dir is non-empty
2025-02-21 10:46:43 -06:00
Harmen Stoppels
0da5bafaf2 Repo.packages_with_tags: do not construct a set of all packages (#49141) 2025-02-21 16:23:42 +01:00
Massimiliano Culpo
f4614a4931 Extract some package changes from compiler as deps (#49138) 2025-02-21 12:52:34 +01:00
Harmen Stoppels
b8ec69112f Extracted changes from 45189 (#49137)
Co-Authored-By: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-21 10:58:00 +01:00
Massimiliano Culpo
a3645fd372 Make BaseConfiguration pickleable (#47545)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-21 09:54:22 +01:00
dependabot[bot]
9bcd86071f build(deps): bump sphinx from 8.1.3 to 8.2.0 in /lib/spack/docs (#49118)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 8.1.3 to 8.2.0.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v8.1.3...v8.2.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 16:42:00 -08:00
Robert Maaskant
b126335800 go: add v1.24.0 (#49104)
* go-bootstrap: add v1.22.12

* go: add v1.24.0

* Reformat package, don't deprecate versions without an active CVE

---------

Co-authored-by: Alec Scott <scott112@llnl.gov>
2025-02-20 16:07:52 -08:00
Derek Ryan Strong
d32b6099b3 sw4: fix build options (#48774)
* Fix sw4 build options

* Update constraint to hdf5@1.14

* Change edit to setup_build_environment

* Use append_flags

* Fix style
2025-02-20 15:02:52 -07:00
Harmen Stoppels
3e8cb852b0 spec.py: use json.dumps directly to avoid hash breakage (#48884) 2025-02-20 17:39:07 +01:00
Richard Berger
c8d7aa1772 lammps: fix pace link dep 2025-02-20 17:37:19 +01:00
Buldram
ec836d740f py-tensorflow: patch for v2.15 build errors (#49001)
* py-tensorflow: patch for v2.15 build errors with new compilers

* py-tensorflow: patch for v2.15 build errors with new compilers

* py-tensorflow: fix clang build and add clang version constraints

* py-tensorflow: use compiler wrapper

* py-tensorflow: relax clang conflict
2025-02-20 13:13:04 +01:00
Teague Sterling
cacdaaf3a9 bcftools: add v1.21, v1.20 (#49070)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-20 01:58:21 -07:00
Dave Keeshan
165e6b1d5e xnedit: new package (#41255) 2025-02-20 08:03:11 +01:00
Paul
70f5300cf2 Add new package for Jacamar CI (#48424) 2025-02-19 22:18:15 -06:00
Buldram
81e08167e2 nim: fix Musl build with new compilers (#48487)
* nim: fix build with new compilers

* narrow condition for disabling warnings

* move flags into offending module

disables warnings also for compiling projects other than the Nim compiler when necessary

* specify different versions of pthread modules

* instead patch SysThread type

* adapt patch for old Nim versions

* Specify hypothetical `:@0.19.6` for patch version constraint
2025-02-19 22:16:12 -06:00
Alex Richert
e9d8c5767b crtm-fix: 3.1.1.2 (#48755)
* crtm-fix: 3.1.1.2

* correct checksum

* exclude test files

* Update package.py
2025-02-19 22:13:08 -06:00
Teague Sterling
4cefa973cd htslib: add v1.21 (#49056)
* Adding variants based on configure flags and an option to compile with PIC

* Adding GCS to htslib

* Revisions from review

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updating descriptions, fixing flags, fixing version and variant conditions

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* htslib: add v1.21

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-02-19 21:43:10 -06:00
Pruvost Florent
a2bd221ee4 chameleon: update to 1.3.0 (#49112) 2025-02-19 20:03:23 -07:00
Dom Heinzeller
adbb41c7df Bug fixes for fckit (disable finalization of DDTs) and ectrans (disable use of Fortran contiguous keyword) (#49111)
* In var/spack/repos/builtin/packages/ectrans/package.py, always set cmake argument ECTRANS_HAVE_CONTIGUOUS_ISSUE to turn off problematic use of Fortran 'contiguous' keyword
* In var/spack/repos/builtin/packages/ectrans/package.py, always set cmake argument ENABLE_FINAL=OFF to turn off problematic finalization of derived data types
* Update links to issues in fckit and ectrans
* Fix wrong cmake argument for ECTRANS_HAVE_CONTIGUOUS_ISSUE in var/spack/repos/builtin/packages/ectrans/package.py
2025-02-19 20:03:05 -07:00
Joseph Wang
2554c7bd21 py-onnxruntime: add v1.18.0 -> v1.19.2 (#46329)
* py-onnxruntime: add new versions

* py-onnxruntime: add constraints

* py-onnxruntime: fix typo

* py-onnxruntime: fix style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 12:02:27 -08:00
Garth N. Wells
e274e855f1 py-nanobind: add v2.5.0 (#48953)
Also removes inactive maintainer.
2025-02-19 11:47:25 -08:00
Matthew L. Curry
e76ebf2cf7 xz: Work around ASM declaration issue with NVHPC (#49006)
This commit works around an issue described below that xz encounters
during compilation with nvhpc.

https://forums.developer.nvidia.com/t/problem-in-inline-assembly-when-using-multiple-asm-declarations/210952
2025-02-19 11:38:29 -08:00
Pranav Sivaraman
11ba5ebbcd jsoncons: new package (#49105) 2025-02-19 11:13:08 -08:00
Adam J. Stewart
53262b968b py-scikit-image: add v0.25.2 (#49101) 2025-02-19 11:12:07 -08:00
etiennemlb
39620085d4 Add new packages: PDI (and dependencies/plugins) (#48710)
* Add new packages: PDI
* Fix style and  typos
* License and pdi python version/shebang issue
* Version update
* 1.8.0 cutoff and dependency simplifications
* Remove unused guard
2025-02-19 09:56:03 -08:00
Krishna Chilleri
78c985fce4 hpc-beeflow: New package (#49036)
* hpc-beeflow: New package

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* add new version of py-fastapi

* Update var/spack/repos/builtin/packages/hpc-beeflow/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 11:18:01 -06:00
Wouter Deconinck
f9e4d3898a cargo: avoid need to use super().build_args with std_build_args (#49071)
* cargo: avoid need to use super().build_args with std_build_args

* cargo: fix style

* jujutsu: avoid need for super().build_args
2025-02-19 09:01:56 -08:00
Lehman Garrison
75c3d0a053 py-yt: add 4.4.0 and dependencies (#47571)
* py-ewah-bool-utils: add new package

* py-extension-helpers: add 1.2.0

* py-regions: add new package

* py-erfa: add 2.0.1.5

* py-yt: add 4.4.0

* py-yt: respect build_jobs
2025-02-19 08:14:35 -07:00
Stephen Nicholas Swatman
6afe002c94 vecmem: fix SYCL compiler specification (#49108)
This commit adds an additional requirement to the vecmem package,
requiring the OneAPI compiler iff the `sycl` variant is turned on. This
allows us to correctly set the non-standard `SYCLCXX` environment
variable.
2025-02-19 08:54:47 -06:00
Alexandre DENIS
f76e01707a mpi-sync-clocks: new package (#47834)
* mpi_sync_clocks: new package

* mpi_sync_clocks: move package to the right location

* mpi_sync_clocks: add copyright header

* mpi-sync-clocks: rename package mpi_sync_clocks -> mpi-sync-clocks to comply with naming convention

* mpi-sync-clocks: update copyright

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpi-sync-clocks: streamline autogen

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 08:32:04 -06:00
Alexandre DENIS
738ca8e2c2 mpibenchmark: new package (#47835)
* mpibenchmark: new package

* mpibenchmark: add copyright header

* mpibenchmark: move the package to the right location

* mpibenchmark: explicitely disable CUDA & ROCm

* mpibenchmark: update copyright

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: streamline management of --enable/--disable

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: streamline autogen

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* mpibenchmark: fix coding style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-19 08:31:15 -06:00
Seth R. Johnson
817df233fb g4vg/vecgeom: add version1.0.2 and patch cuda build failures (#49110)
* vecgeom: remove old patches and add patch for CUDA 11

* g4vg: add 1.0.2
2025-02-19 06:29:01 -07:00
Adam J. Stewart
5ea4d04450 py-scipy: add v1.15.2 (#49074) 2025-02-19 13:08:20 +01:00
Taillefumier Mathieu
49bf5a349e cp2k: fine graining control of the GPU modules (#48925)
Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2025-02-19 09:29:31 +01:00
Robert Maaskant
2427b9649d ca-certificates-mozilla: add 2024-12-31 and deprecate older (#49096) 2025-02-19 00:34:18 -07:00
Robert Maaskant
0cec2c9fc6 glab: v1.52.0 and v1.53.0 (#49094) 2025-02-19 00:34:05 -07:00
Harmen Stoppels
c97be2a9d7 checksum.py tests: extract add_versions_to_pkg fixture (#49100) 2025-02-19 07:33:50 +00:00
Joe Schoonover
3fbdfc464b fluidnumerics-self: new package (#48636)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: fluidnumerics-joe <fluidnumerics-joe@users.noreply.github.com>
2025-02-19 00:23:38 -07:00
Buldram
c4449cb201 chez-scheme: new package (#49067)
* chez-scheme: new package

* Add separate zuo package, correct dep flags, disable threads and libffi by default

* zuo: add +big, parallelize build
2025-02-18 16:05:37 -06:00
Christian Heusel
1601193e12 likwid: Fix the perms script (#48666)
* likwid: Fix the perms script

The script loops over the path (which includes the prefix), but
additionally adds the prefix up front which results in duplicate paths
and a double "/" in the command like in the following one:

    chown root:root /opt/csg/spack/opt/spack/linux-debian12-zen2/gcc-12.2.0/likwid-5.4.1-xfc6quebnf2kosydl3ospaeoskxnxwhn//opt/csg/spack/opt/spack/linux-debian12-zen2/gcc-12.2.0/likwid-5.4.1-xfc6quebnf2kosydl3ospaeoskxnxwhn/sbin/likwid-accessD

Additionally the path is currently not quoted which can potentially
result in word splitting for weird paths.

Signed-off-by: Christian Heusel <christian@heusel.eu>

* likwid: Make the perm scripts' name unique

Also move it into the proper binary folder as per the Filesystem
Hierarchy Standard.

Signed-off-by: Christian Heusel <christian@heusel.eu>

---------

Signed-off-by: Christian Heusel <christian@heusel.eu>
2025-02-18 13:18:03 -07:00
Alex Richert
9c5b3ccb4e wgrib2: add cmake builder (#48447)
* wgrib2: add cmake builder

* wgrib2 add maintainer

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* style fixes; add tar.gz for old vers

* license update

* wgrib2: don't restrict openjpeg variant by version

* Update var/spack/repos/builtin/packages/wgrib2/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/wgrib2/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-02-18 13:12:42 -07:00
324 changed files with 6458 additions and 4130 deletions

View File

@@ -1,7 +1,7 @@
 black==25.1.0
 clingo==5.7.1
 flake8==7.1.2
-isort==6.0.0
-mypy==1.11.2
+isort==6.0.1
+mypy==1.15.0
 types-six==1.17.0.20241205
 vermin==1.6.0

.gitignore vendored
View File

@@ -201,7 +201,6 @@ tramp
 # Org-mode
 .org-id-locations
-*_archive
 # flymake-mode
 *_flymake.*

View File

@@ -54,9 +54,15 @@ concretizer:
# Regular packages
cmake: 2
gmake: 2
python: 2
python-venv: 2
py-cython: 2
py-flit-core: 2
py-pip: 2
py-setuptools: 2
py-wheel: 2
xcb-proto: 2
# Compilers
gcc: 2
llvm: 2
# Option to specify compatibility between operating systems for reuse of compilers and packages

View File

@@ -1761,19 +1761,24 @@ Verifying installations
 The ``spack verify`` command can be used to verify the validity of
 Spack-installed packages any time after installation.
+^^^^^^^^^^^^^^^^^^^^^^^^^
+``spack verify manifest``
+^^^^^^^^^^^^^^^^^^^^^^^^^
 At installation time, Spack creates a manifest of every file in the
 installation prefix. For links, Spack tracks the mode, ownership, and
 destination. For directories, Spack tracks the mode, and
 ownership. For files, Spack tracks the mode, ownership, modification
-time, hash, and size. The Spack verify command will check, for every
-file in each package, whether any of those attributes have changed. It
-will also check for newly added files or deleted files from the
-installation prefix. Spack can either check all installed packages
+time, hash, and size. The ``spack verify manifest`` command will check,
+for every file in each package, whether any of those attributes have
+changed. It will also check for newly added files or deleted files from
+the installation prefix. Spack can either check all installed packages
 using the `-a,--all` or accept specs listed on the command line to
 verify.
-The ``spack verify`` command can also verify for individual files that
-they haven't been altered since installation time. If the given file
+The ``spack verify manifest`` command can also verify for individual files
+that they haven't been altered since installation time. If the given file
 is not in a Spack installation prefix, Spack will report that it is
 not owned by any package. To check individual files instead of specs,
 use the ``-f,--files`` option.
@@ -1788,6 +1793,22 @@ check only local packages (as opposed to those used transparently from
 ``upstream`` spack instances) and the ``-j,--json`` option to output
 machine-readable json data for any errors.
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+``spack verify libraries``
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+The ``spack verify libraries`` command can be used to verify that packages
+do not have accidental system dependencies. This command scans the install
+prefixes of packages for executables and shared libraries, and resolves
+their needed libraries in their RPATHs. When needed libraries cannot be
+located, an error is reported. This typically indicates that a package
+was linked against a system library, instead of a library provided by
+a Spack package.
+This verification can also be enabled as a post-install hook by setting
+``config:shared_linking:missing_library_policy`` to ``error`` or ``warn``
+in :ref:`config.yaml <config-yaml>`.
 -----------------------
 Filesystem requirements
 -----------------------
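The post-install form of this check is configured in ``config.yaml``. Below is a minimal, hedged sketch assuming the nested key layout implied by the ``config:shared_linking:missing_library_policy`` path quoted above; use ``warn`` instead of ``error`` to only report unresolved libraries rather than fail:

```yaml
# Hypothetical config.yaml snippet: run the shared-library resolution
# check as a post-install hook (key nesting assumed from the
# config:shared_linking:missing_library_policy path in the docs above).
config:
  shared_linking:
    missing_library_policy: error   # or "warn"
```

With this in place, the same resolution check that ``spack verify libraries`` runs on demand is applied automatically after each install.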

View File

@@ -223,6 +223,10 @@ def setup(sphinx):
("py:class", "spack.compiler.CompilerCache"),
# TypeVar that is not handled correctly
("py:class", "llnl.util.lang.T"),
("py:class", "llnl.util.lang.KT"),
("py:class", "llnl.util.lang.VT"),
("py:obj", "llnl.util.lang.KT"),
("py:obj", "llnl.util.lang.VT"),
]
# The reST default role (used for this markup: `text`) to use for all documents.

View File

@@ -125,6 +125,8 @@ are stored in ``$spack/var/spack/cache``. These are stored indefinitely
by default. Can be purged with :ref:`spack clean --downloads
<cmd-spack-clean>`.
.. _Misc Cache:
--------------------
``misc_cache``
--------------------
@@ -334,3 +336,52 @@ create a new alias called ``inst`` that will always call ``install -v``:
aliases:
inst: install -v
-------------------------------
``concretization_cache:enable``
-------------------------------
When set to ``true``, Spack will utilize a cache of solver outputs from
successful concretization runs. When enabled, Spack will check the concretization
cache prior to running the solver. If a previous request to solve a given
problem is present in the cache, Spack will load the concrete specs and other
solver data from the cache rather than running the solver. Specs not previously
concretized will be added to the cache on a successful solve. The cache additionally
holds solver statistics, so commands like ``spack solve`` will still return information
about the run that produced a given solver result.
This cache is a subcache of the :ref:`Misc Cache` and as such will be cleaned when the Misc
Cache is cleaned.
When ``false`` or omitted, all concretization requests will be performed from scratch.
----------------------------
``concretization_cache:url``
----------------------------
Path to the location where Spack will root the concretization cache. Currently this only supports
paths on the local filesystem.
Default location is under the :ref:`Misc Cache` at: ``$misc_cache/concretization``
------------------------------------
``concretization_cache:entry_limit``
------------------------------------
Sets a limit on the number of concretization results that Spack will cache. The limit is evaluated
after each concretization run; if Spack has stored more results than the limit allows, the
oldest concretization results are pruned until 10% of the limit has been removed.
Setting this value to 0 disables the automatic pruning. It is expected users will be
responsible for maintaining this cache.
-----------------------------------
``concretization_cache:size_limit``
-----------------------------------
Sets a limit on the size of the concretization cache in bytes. The limit is evaluated
after each concretization run; if Spack has stored more results than the limit allows, the
oldest concretization results are pruned until 10% of the limit has been removed.
Setting this value to 0 disables the automatic pruning. It is expected users will be
responsible for maintaining this cache.
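Taken together, these options might be expressed in ``config.yaml`` roughly as follows; this is a hedged sketch that assumes the settings nest under the top-level ``config:`` section alongside ``misc_cache``, and the limit values are purely illustrative:

```yaml
# Hypothetical example: enable the concretization cache and bound its growth.
config:
  concretization_cache:
    enable: true
    url: $misc_cache/concretization   # default location; local filesystem paths only
    entry_limit: 1000                 # illustrative value; 0 disables automatic pruning
    size_limit: 1073741824            # illustrative value in bytes; 0 disables automatic pruning
```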

View File

@@ -820,6 +820,69 @@ presence of a ``SPACK_CDASH_AUTH_TOKEN`` environment variable during the
build group on CDash called "Release Testing" (that group will be created if
it didn't already exist).
.. _ci_artifacts:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
CI Artifacts Directory Layout
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
When running the CI build using the command ``spack ci rebuild`` a number of directories are created for
storing data generated during the CI job. The default root directory for artifacts is ``job_scratch_root``.
This can be overridden by passing the argument ``--artifacts-root`` to the ``spack ci generate`` command
or by setting the ``SPACK_ARTIFACTS_ROOT`` environment variable in the build job scripts.
The top-level directories under the artifact root are ``concrete_environment``, ``logs``, ``reproduction``,
``tests``, and ``user_data``. Spack does not restrict what is written to any of these directories, nor does
it require user-specified files to be written to any specific directory.
------------------------
``concrete_environment``
------------------------
The directory ``concrete_environment`` is used to communicate the ``spack.yaml`` processed by ``spack ci generate`` and
the concrete ``spack.lock`` for the CI environment.
--------
``logs``
--------
The directory ``logs`` contains the spack build log, ``spack-build-out.txt``, and the spack build environment
modification file, ``spack-build-mod-env.txt``. Additionally, all files specified by the package's ``Builder``
property ``archive_files`` are also copied here (e.g. ``CMakeCache.txt`` in ``CMakeBuilder``).
----------------
``reproduction``
----------------
The directory ``reproduction`` is used to store the files needed by the ``spack reproduce-build`` command.
This includes ``repro.json``, copies of all of the files in ``concrete_environment``, the concrete spec
JSON file for the current spec being built, and all of the files written in the artifacts root directory.
The ``repro.json`` file is not versioned and is only designed to work with the version of Spack that the CI was run with.
An example of what a ``repro.json`` may look like is shown here.
.. code:: json
{
"job_name": "adios2@2.9.2 /feaevuj %gcc@11.4.0 arch=linux-ubuntu20.04-x86_64_v3 E4S ROCm External",
"job_spec_json": "adios2.json",
"ci_project_dir": "/builds/spack/spack"
}
---------
``tests``
---------
The directory ``tests`` is used to store output from running ``spack test <job spec>``. This may or may not have
data in it depending on the package that was built and the availability of tests.
-------------
``user_data``
-------------
The directory ``user_data`` is used to store everything else that shouldn't be copied to the ``reproduction`` directory.
Users may use this to store additional logs or metrics or other types of files generated by the build job.
-------------------------------------
Using a custom spack in your pipeline
-------------------------------------

View File

@@ -1,13 +1,13 @@
sphinx==8.1.3
sphinx==8.2.3
sphinxcontrib-programoutput==0.18
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.2
python-levenshtein==0.26.1
python-levenshtein==0.27.1
docutils==0.21.2
pygments==2.19.1
urllib3==2.3.0
pytest==8.3.4
isort==6.0.0
pytest==8.3.5
isort==6.0.1
black==25.1.0
flake8==7.1.2
mypy==1.11.1

View File

@@ -7,6 +7,7 @@
import fnmatch
import glob
import hashlib
import io
import itertools
import numbers
import os
@@ -20,6 +21,7 @@
from contextlib import contextmanager
from itertools import accumulate
from typing import (
IO,
Callable,
Deque,
Dict,
@@ -2454,26 +2456,69 @@ class WindowsSimulatedRPath:
and vice versa.
"""
def __init__(self, package, link_install_prefix=True):
def __init__(
self,
package,
base_modification_prefix: Optional[Union[str, pathlib.Path]] = None,
link_install_prefix: bool = True,
):
"""
Args:
package (spack.package_base.PackageBase): Package requiring links
base_modification_prefix (str|pathlib.Path): Path representation indicating
the root directory in which to establish the simulated rpath, i.e. where the
symlinks that comprise the "rpath" behavior will be installed.
Note: This option is mutually exclusive with `link_install_prefix`; using
both is an error.
Default: None
link_install_prefix (bool): Link against package's own install or stage root.
Packages that run their own executables during build and require rpaths to
the build directory during build time require this option. Default: install
the build directory during build time require this option.
Default: install
root
Note: This option is mutually exclusive with `base_modification_prefix`; using
both is an error.
"""
self.pkg = package
self._addl_rpaths = set()
self._addl_rpaths: set[str] = set()
if link_install_prefix and base_modification_prefix:
raise RuntimeError(
"Invalid combination of arguments given to WindowsSimulated RPath.\n"
"Select either `link_install_prefix` to create an install prefix rpath"
" or specify a `base_modification_prefix` for any other link type. "
"Specifying both arguments is invalid."
)
if not (link_install_prefix or base_modification_prefix):
raise RuntimeError(
"Insufficient arguments given to WindowsSimulatedRpath.\n"
"WindowsSimulatedRPath requires one of link_install_prefix"
" or base_modification_prefix to be specified."
" Neither was provided."
)
self.link_install_prefix = link_install_prefix
self._additional_library_dependents = set()
if base_modification_prefix:
self.base_modification_prefix = pathlib.Path(base_modification_prefix)
else:
self.base_modification_prefix = pathlib.Path(self.pkg.prefix)
self._additional_library_dependents: set[pathlib.Path] = set()
if not self.link_install_prefix:
tty.debug(f"Generating rpath for non install context: {base_modification_prefix}")
@property
def library_dependents(self):
"""
Set of directories where package binaries/libraries are located.
"""
return set([pathlib.Path(self.pkg.prefix.bin)]) | self._additional_library_dependents
base_pths = set()
if self.link_install_prefix:
base_pths.add(pathlib.Path(self.pkg.prefix.bin))
base_pths |= self._additional_library_dependents
return base_pths
def add_library_dependent(self, *dest):
"""
@@ -2489,6 +2534,12 @@ def add_library_dependent(self, *dest):
new_pth = pathlib.Path(pth).parent
else:
new_pth = pathlib.Path(pth)
path_is_in_prefix = new_pth.is_relative_to(self.base_modification_prefix)
if not path_is_in_prefix:
raise RuntimeError(
f"Attempting to generate rpath symlink out of rpath context:\
{str(self.base_modification_prefix)}"
)
self._additional_library_dependents.add(new_pth)
@property
@@ -2577,6 +2628,33 @@ def establish_link(self):
self._link(library, lib_dir)
def make_package_test_rpath(pkg, test_dir: Union[str, pathlib.Path]):
"""Establishes a temp Windows simulated rpath for the pkg in the testing directory
so an executable can test the libraries/executables with proper access
to dependent dlls
Note: this is a no-op on all other platforms besides Windows
Args:
pkg (spack.package_base.PackageBase): the package for which the rpath should be computed
test_dir: the testing directory in which we should construct an rpath
"""
# link_install_prefix as false ensures we're not linking into the install prefix
mini_rpath = WindowsSimulatedRPath(pkg, link_install_prefix=False)
# add the testing directory as a location to install rpath symlinks
mini_rpath.add_library_dependent(test_dir)
# check for whether build_directory is available, if not
# assume the stage root is the build dir
build_dir_attr = getattr(pkg, "build_directory", None)
build_directory = build_dir_attr if build_dir_attr else pkg.stage.path
# add the build dir & build dir bin
mini_rpath.add_rpath(os.path.join(build_directory, "bin"))
mini_rpath.add_rpath(os.path.join(build_directory))
# construct rpath
mini_rpath.establish_link()
@system_path_filter
@memoized
def can_access_dir(path):
@@ -2805,6 +2883,20 @@ def keep_modification_time(*filenames):
os.utime(f, (os.path.getatime(f), mtime))
@contextmanager
def temporary_file_position(stream):
orig_pos = stream.tell()
yield
stream.seek(orig_pos)
@contextmanager
def current_file_position(stream: IO[str], loc: int, relative_to=io.SEEK_CUR):
with temporary_file_position(stream):
stream.seek(loc, relative_to)
yield
@contextmanager
def temporary_dir(
suffix: Optional[str] = None, prefix: Optional[str] = None, dir: Optional[str] = None

View File

@@ -14,7 +14,7 @@
import typing
import warnings
from datetime import datetime, timedelta
from typing import Callable, Dict, Iterable, List, Tuple, TypeVar
from typing import Callable, Dict, Iterable, List, Mapping, Optional, Tuple, TypeVar
# Ignore emacs backups when listing modules
ignore_modules = r"^\.#|~$"
@@ -1080,3 +1080,88 @@ def __set__(self, instance, value):
def factory(self, instance, owner):
raise NotImplementedError("must be implemented by derived classes")
KT = TypeVar("KT")
VT = TypeVar("VT")
class PriorityOrderedMapping(Mapping[KT, VT]):
"""Mapping that iterates over key according to an integer priority. If the priority is
the same for two keys, insertion order is what matters.
The priority is set when the key/value pair is added. If not set, the highest current priority
is used.
"""
_data: Dict[KT, VT]
_priorities: List[Tuple[int, KT]]
def __init__(self) -> None:
self._data = {}
# Tuple of (priority, key)
self._priorities = []
def __getitem__(self, key: KT) -> VT:
return self._data[key]
def __len__(self) -> int:
return len(self._data)
def __iter__(self):
yield from (key for _, key in self._priorities)
def __reversed__(self):
yield from (key for _, key in reversed(self._priorities))
def reversed_keys(self):
"""Iterates over keys from the highest priority, to the lowest."""
return reversed(self)
def reversed_values(self):
"""Iterates over values from the highest priority, to the lowest."""
yield from (self._data[key] for _, key in reversed(self._priorities))
def _highest_priority(self) -> int:
if not self._priorities:
return 0
result, _ = self._priorities[-1]
return result
def add(self, key: KT, *, value: VT, priority: Optional[int] = None) -> None:
"""Adds a key/value pair to the mapping, with a specific priority.
If the priority is None, then it is assumed to be the highest priority value currently
in the container.
Raises:
ValueError: when the same priority is already in the mapping
"""
if priority is None:
priority = self._highest_priority()
if key in self._data:
self.remove(key)
self._priorities.append((priority, key))
# We rely on sort being stable
self._priorities.sort(key=lambda x: x[0])
self._data[key] = value
assert len(self._data) == len(self._priorities)
def remove(self, key: KT) -> VT:
"""Removes a key from the mapping.
Returns:
The value associated with the key being removed
Raises:
KeyError: if the key is not in the mapping
"""
if key not in self._data:
raise KeyError(f"cannot find {key}")
popped_item = self._data.pop(key)
self._priorities = [(p, k) for p, k in self._priorities if k != key]
assert len(self._data) == len(self._priorities)
return popped_item
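
A minimal usage sketch of ``PriorityOrderedMapping`` (assuming, as the later ``spack.config``
changes in this diff suggest, that the class lives in ``llnl.util.lang``; keys and values here
are purely illustrative):

from llnl.util.lang import PriorityOrderedMapping

scopes = PriorityOrderedMapping()
scopes.add("_builtin", value="defaults", priority=0)
scopes.add("command_line", value="cli overrides", priority=4)
scopes.add("config_files", value="yaml scopes", priority=1)

list(scopes)                   # ['_builtin', 'config_files', 'command_line'] -- lowest priority first
list(scopes.reversed_keys())   # ['command_line', 'config_files', '_builtin']
scopes["config_files"]         # 'yaml scopes'
scopes.remove("config_files")  # returns 'yaml scopes' and drops the key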

View File

@@ -13,6 +13,18 @@
__version__ = "1.0.0.dev0"
spack_version = __version__
#: The current Package API version implemented by this version of Spack. The Package API defines
#: the Python interface for packages as well as the layout of package repositories. The minor
#: version is incremented when the package API is extended in a backwards-compatible way. The major
#: version is incremented upon breaking changes. This version is changed independently from the
#: Spack version.
package_api_version = (1, 0)
#: The minimum Package API version that this version of Spack is compatible with. This should
#: always be a tuple of the form ``(major, 0)``, since compatibility with vX.Y implies
#: compatibility with vX.0.
min_package_api_version = (1, 0)
def __try_int(v):
try:
@@ -79,4 +91,6 @@ def get_short_version() -> str:
"get_version",
"get_spack_commit",
"get_short_version",
"package_api_version",
"min_package_api_version",
]

View File

@@ -292,7 +292,12 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
# Install the spec that should make the module importable
with spack.config.override(self.mirror_scope):
PackageInstaller([concrete_spec.package], fail_fast=True).install()
PackageInstaller(
[concrete_spec.package],
fail_fast=True,
package_use_cache=False,
dependencies_use_cache=False,
).install()
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
@@ -362,6 +367,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
if create_bootstrapper(current_config).try_import(module, abstract_spec):
return

View File

@@ -12,6 +12,7 @@
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import depends_on
from .cmake import CMakeBuilder, CMakePackage
@@ -371,6 +372,10 @@ class CachedCMakePackage(CMakePackage):
CMakeBuilder = CachedCMakeBuilder
# These dependencies are assumed in the builder
depends_on("c", type="build")
depends_on("cxx", type="build")
def flag_handler(self, name, flags):
if name in ("cflags", "cxxflags", "cppflags", "fflags"):
return None, None, None # handled in the cmake cache

View File

@@ -70,10 +70,16 @@ def build_directory(self):
"""Return the directory containing the main Cargo.toml."""
return self.pkg.stage.source_path
@property
def std_build_args(self):
"""Standard arguments for ``cargo build`` provided as a property for
convenience of package writers."""
return ["-j", str(self.pkg.module.make_jobs)]
@property
def build_args(self):
"""Arguments for ``cargo build``."""
return ["-j", str(self.pkg.module.make_jobs)]
return []
@property
def check_args(self):
@@ -88,7 +94,9 @@ def build(
) -> None:
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
pkg.module.cargo(
"install", "--root", "out", "--path", ".", *self.std_build_args, *self.build_args
)
def install(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix

View File

@@ -142,7 +142,7 @@ def setup_run_environment(self, env):
$ source {prefix}/{component}/{version}/env/vars.sh
"""
# Only if environment modifications are desired (default is +envmods)
if "~envmods" not in self.spec:
if "+envmods" in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
self.component_prefix.env.join("vars.sh"), *self.env_script_args

View File

@@ -616,7 +616,7 @@ def copy_test_logs_to_artifacts(test_stage, job_test_dir):
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
def download_and_extract_artifacts(url, work_dir):
def download_and_extract_artifacts(url, work_dir) -> str:
"""Look for gitlab artifacts.zip at the given url, and attempt to download
and extract the contents into the given work_dir
@@ -624,6 +624,10 @@ def download_and_extract_artifacts(url, work_dir):
url (str): Complete url to artifacts.zip file
work_dir (str): Path to destination where artifacts should be extracted
Output:
Artifacts root path relative to the archive root
"""
tty.msg(f"Fetching artifacts from: {url}")
@@ -641,13 +645,25 @@ def download_and_extract_artifacts(url, work_dir):
response = urlopen(request, timeout=SPACK_CDASH_TIMEOUT)
with open(artifacts_zip_path, "wb") as out_file:
shutil.copyfileobj(response, out_file)
with zipfile.ZipFile(artifacts_zip_path) as zip_file:
zip_file.extractall(work_dir)
# Get the artifact root
artifact_root = ""
for f in zip_file.filelist:
if "spack.lock" in f.filename:
artifact_root = os.path.dirname(os.path.dirname(f.filename))
break
except OSError as e:
raise SpackError(f"Error fetching artifacts: {e}")
finally:
try:
os.remove(artifacts_zip_path)
except FileNotFoundError:
# If the file doesn't exist we are already raising
pass
with zipfile.ZipFile(artifacts_zip_path) as zip_file:
zip_file.extractall(work_dir)
os.remove(artifacts_zip_path)
return artifact_root
def get_spack_info():
@@ -761,7 +777,7 @@ def setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None):
return True
def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime, use_local_head):
"""Given a url to gitlab artifacts.zip from a failed 'spack ci rebuild' job,
attempt to setup an environment in which the failure can be reproduced
locally. This entails the following:
@@ -775,8 +791,11 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
commands to run to reproduce the build once inside the container.
"""
work_dir = os.path.realpath(work_dir)
if os.path.exists(work_dir) and os.listdir(work_dir):
raise SpackError(f"Cannot run reproducer in non-emptry working dir:\n {work_dir}")
platform_script_ext = "ps1" if IS_WINDOWS else "sh"
download_and_extract_artifacts(url, work_dir)
artifact_root = download_and_extract_artifacts(url, work_dir)
gpg_path = None
if gpg_url:
@@ -838,6 +857,9 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
with open(repro_file, encoding="utf-8") as fd:
repro_details = json.load(fd)
spec_file = fs.find(work_dir, repro_details["job_spec_json"])[0]
reproducer_spec = spack.spec.Spec.from_specfile(spec_file)
repro_dir = os.path.dirname(repro_file)
rel_repro_dir = repro_dir.replace(work_dir, "").lstrip(os.path.sep)
@@ -898,17 +920,20 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
commit_regex = re.compile(r"commit\s+([^\s]+)")
merge_commit_regex = re.compile(r"Merge\s+([^\s]+)\s+into\s+([^\s]+)")
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
if use_local_head:
commit_1 = "HEAD"
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
if m:
commit_1 = m.group(1)
setup_result = False
if commit_1:
@@ -983,6 +1008,8 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
"entrypoint", entrypoint_script, work_dir, run=False, exit_on_failure=False
)
# Attempt to create a unique name for the reproducer container
container_suffix = "_" + reproducer_spec.dag_hash() if reproducer_spec else ""
docker_command = [
runtime,
"run",
@@ -990,14 +1017,14 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
"-t",
"--rm",
"--name",
"spack_reproducer",
f"spack_reproducer{container_suffix}",
"-v",
":".join([work_dir, mounted_workdir, "Z"]),
"-v",
":".join(
[
os.path.join(work_dir, "jobs_scratch_dir"),
os.path.join(mount_as_dir, "jobs_scratch_dir"),
os.path.join(work_dir, artifact_root),
os.path.join(mount_as_dir, artifact_root),
"Z",
]
),

View File

@@ -176,6 +176,11 @@ def setup_parser(subparser):
reproduce.add_argument(
"-s", "--autostart", help="Run docker reproducer automatically", action="store_true"
)
reproduce.add_argument(
"--use-local-head",
help="Use the HEAD of the local Spack instead of reproducing a commit",
action="store_true",
)
gpg_group = reproduce.add_mutually_exclusive_group(required=False)
gpg_group.add_argument(
"--gpg-file", help="Path to public GPG key for validating binary cache installs"
@@ -608,7 +613,12 @@ def ci_reproduce(args):
gpg_key_url = None
return spack_ci.reproduce_ci_job(
args.job_url, args.working_dir, args.autostart, gpg_key_url, args.runtime
args.job_url,
args.working_dir,
args.autostart,
gpg_key_url,
args.runtime,
args.use_local_head,
)

View File

@@ -528,7 +528,6 @@ def __call__(self, parser, namespace, values, option_string):
# the const from the constructor or a value from the CLI.
# Note that this is only called if the argument is actually
# specified on the command line.
spack.config.CONFIG.ensure_scope_ordering()
spack.config.set(self.config_path, self.const, scope="command_line")

View File

@@ -216,7 +216,7 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
if pytest is None:
spack.bootstrap.ensure_environment_dependencies()
import pytest

View File

@@ -2,35 +2,48 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import io
from typing import List, Optional
import llnl.util.tty as tty
from llnl.string import plural
from llnl.util.filesystem import visit_directory_tree
import spack.cmd
import spack.environment as ev
import spack.spec
import spack.store
import spack.verify
import spack.verify_libraries
from spack.cmd.common import arguments
description = "check that all spack packages are on disk as installed"
description = "verify spack installations on disk"
section = "admin"
level = "long"
MANIFEST_SUBPARSER: Optional[argparse.ArgumentParser] = None
def setup_parser(subparser):
setup_parser.parser = subparser
subparser.add_argument(
def setup_parser(subparser: argparse.ArgumentParser):
global MANIFEST_SUBPARSER
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="verify_command")
MANIFEST_SUBPARSER = sp.add_parser(
"manifest", help=verify_manifest.__doc__, description=verify_manifest.__doc__
)
MANIFEST_SUBPARSER.add_argument(
"-l", "--local", action="store_true", help="verify only locally installed packages"
)
subparser.add_argument(
MANIFEST_SUBPARSER.add_argument(
"-j", "--json", action="store_true", help="ouptut json-formatted errors"
)
subparser.add_argument("-a", "--all", action="store_true", help="verify all packages")
subparser.add_argument(
MANIFEST_SUBPARSER.add_argument("-a", "--all", action="store_true", help="verify all packages")
MANIFEST_SUBPARSER.add_argument(
"specs_or_files", nargs=argparse.REMAINDER, help="specs or files to verify"
)
type = subparser.add_mutually_exclusive_group()
type.add_argument(
manifest_sp_type = MANIFEST_SUBPARSER.add_mutually_exclusive_group()
manifest_sp_type.add_argument(
"-s",
"--specs",
action="store_const",
@@ -39,7 +52,7 @@ def setup_parser(subparser):
default="specs",
help="treat entries as specs (default)",
)
type.add_argument(
manifest_sp_type.add_argument(
"-f",
"--files",
action="store_const",
@@ -49,14 +62,67 @@ def setup_parser(subparser):
help="treat entries as absolute filenames\n\ncannot be used with '-a'",
)
libraries_subparser = sp.add_parser(
"libraries", help=verify_libraries.__doc__, description=verify_libraries.__doc__
)
arguments.add_common_arguments(libraries_subparser, ["constraint"])
def verify(parser, args):
cmd = args.verify_command
if cmd == "libraries":
return verify_libraries(args)
elif cmd == "manifest":
return verify_manifest(args)
parser.error("invalid verify subcommand")
def verify_libraries(args):
"""verify that shared libraries of install packages can be located in rpaths (Linux only)"""
specs_from_db = [s for s in args.specs(installed=True) if not s.external]
tty.info(f"Checking {len(specs_from_db)} packages for shared library resolution")
errors = 0
for spec in specs_from_db:
try:
pkg = spec.package
except Exception:
tty.warn(f"Skipping {spec.cformat('{name}{@version}{/hash}')} due to missing package")
error_msg = _verify_libraries(spec, pkg.unresolved_libraries)
if error_msg is not None:
errors += 1
tty.error(error_msg)
if errors:
tty.error(f"Cannot resolve shared libraries in {plural(errors, 'package')}")
return 1
def _verify_libraries(spec: spack.spec.Spec, unresolved_libraries: List[str]) -> Optional[str]:
"""Go over the prefix of the installed spec and verify its shared libraries can be resolved."""
visitor = spack.verify_libraries.ResolveSharedElfLibDepsVisitor(
[*spack.verify_libraries.ALLOW_UNRESOLVED, *unresolved_libraries]
)
visit_directory_tree(spec.prefix, visitor)
if not visitor.problems:
return None
output = io.StringIO()
visitor.write(output, indent=4, brief=True)
message = output.getvalue().rstrip()
return f"{spec.cformat('{name}{@version}{/hash}')}: {spec.prefix}:\n{message}"
def verify_manifest(args):
"""verify that install directories have not been modified since installation"""
local = args.local
if args.type == "files":
if args.all:
setup_parser.parser.print_help()
return 1
MANIFEST_SUBPARSER.error("cannot use --all with --files")
for file in args.specs_or_files:
results = spack.verify.check_file_manifest(file)
@@ -87,8 +153,7 @@ def verify(parser, args):
env = ev.active_environment()
specs = list(map(lambda x: spack.cmd.disambiguate_spec(x, env, local=local), spec_args))
else:
setup_parser.parser.print_help()
return 1
MANIFEST_SUBPARSER.error("use --all or specify specs to verify")
for spec in specs:
tty.debug("Verifying package %s")

View File

@@ -66,6 +66,8 @@
import spack.util.web as web_util
from spack.util.cpus import cpus_available
from .enums import ConfigScopePriority
#: Dict from section names -> schema for that section
SECTION_SCHEMAS: Dict[str, Any] = {
"compilers": spack.schema.compilers.schema,
@@ -408,26 +410,18 @@ def _method(self, *args, **kwargs):
return _method
class Configuration:
"""A full Spack configuration, from a hierarchy of config files.
ScopeWithOptionalPriority = Union[ConfigScope, Tuple[int, ConfigScope]]
ScopeWithPriority = Tuple[int, ConfigScope]
This class makes it easy to add a new scope on top of an existing one.
"""
class Configuration:
"""A hierarchical configuration, merging a number of scopes at different priorities."""
# convert to typing.OrderedDict when we drop 3.6, or OrderedDict when we reach 3.9
scopes: Dict[str, ConfigScope]
scopes: lang.PriorityOrderedMapping[str, ConfigScope]
def __init__(self, *scopes: ConfigScope) -> None:
"""Initialize a configuration with an initial list of scopes.
Args:
scopes: list of scopes to add to this
Configuration, ordered from lowest to highest precedence
"""
self.scopes = collections.OrderedDict()
for scope in scopes:
self.push_scope(scope)
def __init__(self) -> None:
self.scopes = lang.PriorityOrderedMapping()
self.format_updates: Dict[str, List[ConfigScope]] = collections.defaultdict(list)
def ensure_unwrapped(self) -> "Configuration":
@@ -435,36 +429,31 @@ def ensure_unwrapped(self) -> "Configuration":
return self
def highest(self) -> ConfigScope:
"""Scope with highest precedence"""
return next(reversed(self.scopes.values())) # type: ignore
"""Scope with the highest precedence"""
return next(self.scopes.reversed_values()) # type: ignore
@_config_mutator
def ensure_scope_ordering(self):
"""Ensure that scope order matches documented precedent"""
# FIXME: We also need to consider that custom configurations and other orderings
# may not be preserved correctly
if "command_line" in self.scopes:
# TODO (when dropping python 3.6): self.scopes.move_to_end
self.scopes["command_line"] = self.remove_scope("command_line")
def push_scope(self, scope: ConfigScope, priority: Optional[int] = None) -> None:
"""Adds a scope to the Configuration, at a given priority.
@_config_mutator
def push_scope(self, scope: ConfigScope) -> None:
"""Add a higher precedence scope to the Configuration."""
tty.debug(f"[CONFIGURATION: PUSH SCOPE]: {str(scope)}", level=2)
self.scopes[scope.name] = scope
If a priority is not given, it is assumed to be the current highest priority.
@_config_mutator
def pop_scope(self) -> ConfigScope:
"""Remove the highest precedence scope and return it."""
name, scope = self.scopes.popitem(last=True) # type: ignore[call-arg]
tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
return scope
Args:
scope: scope to be added
priority: priority of the scope
"""
tty.debug(f"[CONFIGURATION: PUSH SCOPE]: {str(scope)}, priority={priority}", level=2)
self.scopes.add(scope.name, value=scope, priority=priority)
@_config_mutator
def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
"""Remove scope by name; has no effect when ``scope_name`` does not exist"""
scope = self.scopes.pop(scope_name, None)
tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
"""Removes a scope by name, and returns it. If the scope does not exist, returns None."""
try:
scope = self.scopes.remove(scope_name)
tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
except KeyError as e:
tty.debug(f"[CONFIGURATION: POP SCOPE]: {e}", level=2)
return None
return scope
@property
@@ -473,15 +462,13 @@ def writable_scopes(self) -> Generator[ConfigScope, None, None]:
return (s for s in self.scopes.values() if s.writable)
def highest_precedence_scope(self) -> ConfigScope:
"""Writable scope with highest precedence."""
return next(s for s in reversed(self.scopes.values()) if s.writable) # type: ignore
"""Writable scope with the highest precedence."""
return next(s for s in self.scopes.reversed_values() if s.writable)
def highest_precedence_non_platform_scope(self) -> ConfigScope:
"""Writable non-platform scope with highest precedence"""
"""Writable non-platform scope with the highest precedence"""
return next(
s
for s in reversed(self.scopes.values()) # type: ignore
if s.writable and not s.is_platform_dependent
s for s in self.scopes.reversed_values() if s.writable and not s.is_platform_dependent
)
def matching_scopes(self, reg_expr) -> List[ConfigScope]:
@@ -748,7 +735,7 @@ def override(
"""
if isinstance(path_or_scope, ConfigScope):
overrides = path_or_scope
CONFIG.push_scope(path_or_scope)
CONFIG.push_scope(path_or_scope, priority=None)
else:
base_name = _OVERRIDES_BASE_NAME
# Ensure the new override gets a unique scope name
@@ -762,7 +749,7 @@ def override(
break
overrides = InternalConfigScope(scope_name)
CONFIG.push_scope(overrides)
CONFIG.push_scope(overrides, priority=None)
CONFIG.set(path_or_scope, value, scope=scope_name)
try:
@@ -772,13 +759,15 @@ def override(
assert scope is overrides
def _add_platform_scope(cfg: Configuration, name: str, path: str, writable: bool = True) -> None:
def _add_platform_scope(
cfg: Configuration, name: str, path: str, priority: ConfigScopePriority, writable: bool = True
) -> None:
"""Add a platform-specific subdirectory for the current platform."""
platform = spack.platforms.host().name
scope = DirectoryConfigScope(
f"{name}/{platform}", os.path.join(path, platform), writable=writable
)
cfg.push_scope(scope)
cfg.push_scope(scope, priority=priority)
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
@@ -813,11 +802,10 @@ def create() -> Configuration:
it. It is bundled inside a function so that configuration can be
initialized lazily.
"""
cfg = Configuration()
# first do the builtin, hardcoded defaults
builtin = InternalConfigScope("_builtin", CONFIG_DEFAULTS)
cfg.push_scope(builtin)
cfg = create_from(
(ConfigScopePriority.BUILTIN, InternalConfigScope("_builtin", CONFIG_DEFAULTS))
)
# Builtin paths to configuration files in Spack
configuration_paths = [
@@ -847,10 +835,9 @@ def create() -> Configuration:
# add each scope and its platform-specific directory
for name, path in configuration_paths:
cfg.push_scope(DirectoryConfigScope(name, path))
# Each scope can have per-platfom overrides in subdirectories
_add_platform_scope(cfg, name, path)
cfg.push_scope(DirectoryConfigScope(name, path), priority=ConfigScopePriority.CONFIG_FILES)
# Each scope can have per-platform overrides in subdirectories
_add_platform_scope(cfg, name, path, priority=ConfigScopePriority.CONFIG_FILES)
return cfg
@@ -955,7 +942,7 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:
return CONFIG.set(path, value, scope)
def scopes() -> Dict[str, ConfigScope]:
def scopes() -> lang.PriorityOrderedMapping[str, ConfigScope]:
"""Convenience function to get list of configuration scopes."""
return CONFIG.scopes
@@ -1409,7 +1396,7 @@ def ensure_latest_format_fn(section: str) -> Callable[[YamlConfigDict], bool]:
@contextlib.contextmanager
def use_configuration(
*scopes_or_paths: Union[ConfigScope, str]
*scopes_or_paths: Union[ScopeWithOptionalPriority, str]
) -> Generator[Configuration, None, None]:
"""Use the configuration scopes passed as arguments within the context manager.
@@ -1424,7 +1411,7 @@ def use_configuration(
global CONFIG
# Normalize input and construct a Configuration object
configuration = _config_from(scopes_or_paths)
configuration = create_from(*scopes_or_paths)
CONFIG.clear_caches(), configuration.clear_caches()
saved_config, CONFIG = CONFIG, configuration
@@ -1435,23 +1422,44 @@ def use_configuration(
CONFIG = saved_config
def _normalize_input(entry: Union[ScopeWithOptionalPriority, str]) -> ScopeWithPriority:
if isinstance(entry, tuple):
return entry
default_priority = ConfigScopePriority.CONFIG_FILES
if isinstance(entry, ConfigScope):
return default_priority, entry
# Otherwise we need to construct it
path = os.path.normpath(entry)
assert os.path.isdir(path), f'"{path}" must be a directory'
name = os.path.basename(path)
return default_priority, DirectoryConfigScope(name, path)
@lang.memoized
def _config_from(scopes_or_paths: List[Union[ConfigScope, str]]) -> Configuration:
scopes = []
for scope_or_path in scopes_or_paths:
# If we have a config scope we are already done
if isinstance(scope_or_path, ConfigScope):
scopes.append(scope_or_path)
continue
def create_from(*scopes_or_paths: Union[ScopeWithOptionalPriority, str]) -> Configuration:
"""Creates a configuration object from the scopes passed in input.
# Otherwise we need to construct it
path = os.path.normpath(scope_or_path)
assert os.path.isdir(path), f'"{path}" must be a directory'
name = os.path.basename(path)
scopes.append(DirectoryConfigScope(name, path))
Args:
*scopes_or_paths: either a tuple of (priority, ConfigScope), or a ConfigScope, or a string.
If priority is not given, it is assumed to be ConfigScopePriority.CONFIG_FILES. If a
string is given, a DirectoryConfigScope is created from it.
configuration = Configuration(*scopes)
return configuration
Examples:
>>> builtin_scope = InternalConfigScope("_builtin", {"config": {"build_jobs": 1}})
>>> cl_scope = InternalConfigScope("command_line", {"config": {"build_jobs": 10}})
>>> cfg = create_from(
... (ConfigScopePriority.COMMAND_LINE, cl_scope),
... (ConfigScopePriority.BUILTIN, builtin_scope)
... )
"""
scopes_with_priority = [_normalize_input(x) for x in scopes_or_paths]
result = Configuration()
for priority, scope in scopes_with_priority:
result.push_scope(scope, priority=priority)
return result
def raw_github_gitlab_url(url: str) -> str:

View File

@@ -243,7 +243,7 @@ def prefix_from_path(self, *, path: str) -> str:
raise NotImplementedError("must be implemented by derived classes")
def detect_specs(
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
self, *, pkg: Type["spack.package_base.PackageBase"], paths: Iterable[str]
) -> List["spack.spec.Spec"]:
"""Given a list of files matching the search patterns, returns a list of detected specs.

View File

@@ -25,7 +25,7 @@
}
def _check_concrete(spec):
def _check_concrete(spec: "spack.spec.Spec") -> None:
"""If the spec is not concrete, raise a ValueError"""
if not spec.concrete:
raise ValueError("Specs passed to a DirectoryLayout must be concrete!")
@@ -51,7 +51,7 @@ def specs_from_metadata_dirs(root: str) -> List["spack.spec.Spec"]:
spec = _get_spec(prefix)
if spec:
spec.prefix = prefix
spec.set_prefix(prefix)
specs.append(spec)
continue
@@ -84,7 +84,7 @@ class DirectoryLayout:
def __init__(
self,
root,
root: str,
*,
projections: Optional[Dict[str, str]] = None,
hash_length: Optional[int] = None,
@@ -120,17 +120,17 @@ def __init__(
self.manifest_file_name = "install_manifest.json"
@property
def hidden_file_regexes(self):
def hidden_file_regexes(self) -> Tuple[str]:
return ("^{0}$".format(re.escape(self.metadata_dir)),)
def relative_path_for_spec(self, spec):
def relative_path_for_spec(self, spec: "spack.spec.Spec") -> str:
_check_concrete(spec)
projection = spack.projections.get_projection(self.projections, spec)
path = spec.format_path(projection)
return str(Path(path))
def write_spec(self, spec, path):
def write_spec(self, spec: "spack.spec.Spec", path: str) -> None:
"""Write a spec out to a file."""
_check_concrete(spec)
with open(path, "w", encoding="utf-8") as f:
@@ -138,7 +138,7 @@ def write_spec(self, spec, path):
# the full provenance, so it's available if we want it later
spec.to_json(f, hash=ht.dag_hash)
def write_host_environment(self, spec):
def write_host_environment(self, spec: "spack.spec.Spec") -> None:
"""The host environment is a json file with os, kernel, and spack
versioning. We use it in the case that an analysis later needs to
easily access this information.
@@ -148,7 +148,7 @@ def write_host_environment(self, spec):
with open(env_file, "w", encoding="utf-8") as fd:
sjson.dump(environ, fd)
def read_spec(self, path):
def read_spec(self, path: str) -> "spack.spec.Spec":
"""Read the contents of a file and parse them as a spec"""
try:
with open(path, encoding="utf-8") as f:
@@ -159,26 +159,28 @@ def read_spec(self, path):
# Too late for conversion; spec_file_path() already called.
spec = spack.spec.Spec.from_yaml(f)
else:
raise SpecReadError(
"Did not recognize spec file extension:" " {0}".format(extension)
)
raise SpecReadError(f"Did not recognize spec file extension: {extension}")
except Exception as e:
if spack.config.get("config:debug"):
raise
raise SpecReadError("Unable to read file: %s" % path, "Cause: " + str(e))
raise SpecReadError(f"Unable to read file: {path}", f"Cause: {e}")
# Specs read from actual installations are always concrete
spec._mark_concrete()
return spec
def spec_file_path(self, spec):
def spec_file_path(self, spec: "spack.spec.Spec") -> str:
"""Gets full path to spec file"""
_check_concrete(spec)
yaml_path = os.path.join(self.metadata_path(spec), self._spec_file_name_yaml)
json_path = os.path.join(self.metadata_path(spec), self.spec_file_name)
return yaml_path if os.path.exists(yaml_path) else json_path
def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
def deprecated_file_path(
self,
deprecated_spec: "spack.spec.Spec",
deprecator_spec: Optional["spack.spec.Spec"] = None,
) -> str:
"""Gets full path to spec file for deprecated spec
If the deprecator_spec is provided, use that. Otherwise, assume
@@ -212,16 +214,16 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
return yaml_path if os.path.exists(yaml_path) else json_path
def metadata_path(self, spec):
def metadata_path(self, spec: "spack.spec.Spec") -> str:
return os.path.join(spec.prefix, self.metadata_dir)
def env_metadata_path(self, spec):
def env_metadata_path(self, spec: "spack.spec.Spec") -> str:
return os.path.join(self.metadata_path(spec), "install_environment.json")
def build_packages_path(self, spec):
def build_packages_path(self, spec: "spack.spec.Spec") -> str:
return os.path.join(self.metadata_path(spec), self.packages_dir)
def create_install_directory(self, spec):
def create_install_directory(self, spec: "spack.spec.Spec") -> None:
_check_concrete(spec)
# Create install directory with properly configured permissions
@@ -239,7 +241,7 @@ def create_install_directory(self, spec):
self.write_spec(spec, self.spec_file_path(spec))
def ensure_installed(self, spec):
def ensure_installed(self, spec: "spack.spec.Spec") -> None:
"""
Throws InconsistentInstallDirectoryError if:
1. spec prefix does not exist
@@ -266,7 +268,7 @@ def ensure_installed(self, spec):
"Spec file in %s does not match hash!" % spec_file_path
)
def path_for_spec(self, spec):
def path_for_spec(self, spec: "spack.spec.Spec") -> str:
"""Return absolute path from the root to a directory for the spec."""
_check_concrete(spec)
@@ -277,23 +279,13 @@ def path_for_spec(self, spec):
assert not path.startswith(self.root)
return os.path.join(self.root, path)
def remove_install_directory(self, spec, deprecated=False):
def remove_install_directory(self, spec: "spack.spec.Spec", deprecated: bool = False) -> None:
"""Removes a prefix and any empty parent directories from the root.
Raises RemoveFailedError if something goes wrong.
"""
path = self.path_for_spec(spec)
assert path.startswith(self.root)
# Windows readonly files cannot be removed by Python
# directly, change permissions before attempting to remove
if sys.platform == "win32":
kwargs = {
"ignore_errors": False,
"onerror": fs.readonly_file_handler(ignore_errors=False),
}
else:
kwargs = {} # the default value for ignore_errors is false
if deprecated:
if os.path.exists(path):
try:
@@ -304,7 +296,16 @@ def remove_install_directory(self, spec, deprecated=False):
raise RemoveFailedError(spec, path, e) from e
elif os.path.exists(path):
try:
shutil.rmtree(path, **kwargs)
if sys.platform == "win32":
# Windows readonly files cannot be removed by Python
# directly, change permissions before attempting to remove
shutil.rmtree(
path,
ignore_errors=False,
onerror=fs.readonly_file_handler(ignore_errors=False),
)
else:
shutil.rmtree(path)
except OSError as e:
raise RemoveFailedError(spec, path, e) from e

View File

@@ -12,3 +12,13 @@ class InstallRecordStatus(enum.Flag):
DEPRECATED = enum.auto()
MISSING = enum.auto()
ANY = INSTALLED | DEPRECATED | MISSING
class ConfigScopePriority(enum.IntEnum):
"""Priorities of the different kind of config scopes used by Spack"""
BUILTIN = 0
CONFIG_FILES = 1
CUSTOM = 2
ENVIRONMENT = 3
COMMAND_LINE = 4
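
Because ``ConfigScopePriority`` is an ``IntEnum``, its members can be passed directly as the
``priority`` argument of ``Configuration.push_scope`` shown earlier in this diff; scopes pushed
at a higher value take precedence regardless of push order. A short sketch (scope names are
illustrative, and the import path assumes the relative ``.enums`` imports used elsewhere in this
diff resolve to ``spack.enums``):

import spack.config
from spack.enums import ConfigScopePriority

cfg = spack.config.Configuration()
cfg.push_scope(spack.config.InternalConfigScope("command_line"), priority=ConfigScopePriority.COMMAND_LINE)
cfg.push_scope(spack.config.InternalConfigScope("_builtin"), priority=ConfigScopePriority.BUILTIN)
cfg.push_scope(spack.config.InternalConfigScope("env"), priority=ConfigScopePriority.ENVIRONMENT)

list(cfg.scopes)    # ['_builtin', 'env', 'command_line'] -- lowest priority first
cfg.highest().name  # 'command_line'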

View File

@@ -51,6 +51,8 @@
from spack.spec_list import SpecList
from spack.util.path import substitute_path_variables
from ..enums import ConfigScopePriority
SpecPair = spack.concretize.SpecPair
#: environment variable used to indicate the active environment
@@ -387,6 +389,7 @@ def create_in_dir(
# dev paths in this environment to refer to their original
# locations.
_rewrite_relative_dev_paths_on_relocation(env, init_file_dir)
_rewrite_relative_repos_paths_on_relocation(env, init_file_dir)
return env
@@ -403,8 +406,8 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
dev_path = substitute_path_variables(entry["path"])
expanded_path = spack.util.path.canonicalize_path(dev_path, default_wd=init_file_dir)
# Skip if the expanded path is the same (e.g. when absolute)
if dev_path == expanded_path:
# Skip if the substituted and expanded path is the same (e.g. when absolute)
if entry["path"] == expanded_path:
continue
tty.debug("Expanding develop path for {0} to {1}".format(name, expanded_path))
@@ -419,6 +422,34 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
env._re_read()
def _rewrite_relative_repos_paths_on_relocation(env, init_file_dir):
"""When initializing the environment from a manifest file and we plan
to store the environment in a different directory, we have to rewrite
relative repo paths to absolute ones and expand environment variables."""
with env:
repos_specs = spack.config.get("repos", default={}, scope=env.scope_name)
if not repos_specs:
return
for i, entry in enumerate(repos_specs):
repo_path = substitute_path_variables(entry)
expanded_path = spack.util.path.canonicalize_path(repo_path, default_wd=init_file_dir)
# Skip if the substituted and expanded path is the same (e.g. when absolute)
if entry == expanded_path:
continue
tty.debug("Expanding repo path for {0} to {1}".format(entry, expanded_path))
repos_specs[i] = expanded_path
spack.config.set("repos", repos_specs, scope=env.scope_name)
env.repos_specs = None
# If we changed the environment's spack.yaml scope, that will not be reflected
# in the manifest that we read
env._re_read()
def environment_dir_from_name(name: str, exists_ok: bool = True) -> str:
"""Returns the directory associated with a named environment.
@@ -2392,6 +2423,8 @@ def invalidate_repository_cache(self):
def __enter__(self):
self._previous_active = _active_environment
if self._previous_active:
deactivate()
activate(self)
return self
@@ -3067,14 +3100,12 @@ def env_config_scopes(self) -> List[spack.config.ConfigScope]:
def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.push_scope(scope)
spack.config.CONFIG.ensure_scope_ordering()
spack.config.CONFIG.push_scope(scope, priority=ConfigScopePriority.ENVIRONMENT)
def deactivate_config_scope(self) -> None:
"""Remove any of the manifest's scopes from the global config path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.remove_scope(scope.name)
spack.config.CONFIG.ensure_scope_ordering()
@contextlib.contextmanager
def use_config(self):

View File

@@ -10,7 +10,7 @@
import stat
import sys
import tempfile
from typing import Callable, Dict, Optional
from typing import Callable, Dict, List, Optional
from typing_extensions import Literal
@@ -78,7 +78,7 @@ def view_copy(
# Order of this dict is somewhat irrelevant
prefix_to_projection = {
s.prefix: view.get_projection_for_spec(s)
str(s.prefix): view.get_projection_for_spec(s)
for s in spec.traverse(root=True, order="breadth")
if not s.external
}
@@ -185,7 +185,7 @@ def __init__(
def link(self, src: str, dst: str, spec: Optional[spack.spec.Spec] = None) -> None:
self._link(src, dst, self, spec)
def add_specs(self, *specs, **kwargs):
def add_specs(self, *specs: spack.spec.Spec, **kwargs) -> None:
"""
Add given specs to view.
@@ -200,19 +200,19 @@ def add_specs(self, *specs, **kwargs):
"""
raise NotImplementedError
def add_standalone(self, spec):
def add_standalone(self, spec: spack.spec.Spec) -> bool:
"""
Add (link) a standalone package into this view.
"""
raise NotImplementedError
def check_added(self, spec):
def check_added(self, spec: spack.spec.Spec) -> bool:
"""
Check if the given concrete spec is active in this view.
"""
raise NotImplementedError
def remove_specs(self, *specs, **kwargs):
def remove_specs(self, *specs: spack.spec.Spec, **kwargs) -> None:
"""
Removes given specs from view.
@@ -231,25 +231,25 @@ def remove_specs(self, *specs, **kwargs):
"""
raise NotImplementedError
def remove_standalone(self, spec):
def remove_standalone(self, spec: spack.spec.Spec) -> None:
"""
Remove (unlink) a standalone package from this view.
"""
raise NotImplementedError
def get_projection_for_spec(self, spec):
def get_projection_for_spec(self, spec: spack.spec.Spec) -> str:
"""
Get the projection in this view for a spec.
"""
raise NotImplementedError
def get_all_specs(self):
def get_all_specs(self) -> List[spack.spec.Spec]:
"""
Get all specs currently active in this view.
"""
raise NotImplementedError
def get_spec(self, spec):
def get_spec(self, spec: spack.spec.Spec) -> Optional[spack.spec.Spec]:
"""
Return the actual spec linked in this view (i.e. do not look it up
in the database by name).
@@ -263,7 +263,7 @@ def get_spec(self, spec):
"""
raise NotImplementedError
def print_status(self, *specs, **kwargs):
def print_status(self, *specs: spack.spec.Spec, **kwargs) -> None:
"""
Print a short summary about the given specs, detailing whether..
* ..they are active in the view.
@@ -694,7 +694,7 @@ def _sanity_check_view_projection(self, specs):
raise ConflictingSpecsError(current_spec, conflicting_spec)
seen[metadata_dir] = current_spec
def add_specs(self, *specs: spack.spec.Spec) -> None:
def add_specs(self, *specs, **kwargs) -> None:
"""Link a root-to-leaf topologically ordered list of specs into the view."""
assert all((s.concrete for s in specs))
if len(specs) == 0:
@@ -831,7 +831,7 @@ def get_projection_for_spec(self, spec):
#####################
# utility functions #
#####################
def get_spec_from_file(filename):
def get_spec_from_file(filename) -> Optional[spack.spec.Spec]:
try:
with open(filename, "r", encoding="utf-8") as f:
return spack.spec.Spec.from_yaml(f)

View File

@@ -2,198 +2,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import fnmatch
import io
import os
import re
from typing import Dict, List, Union
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, visit_directory_tree
from llnl.util.lang import stable_partition
from llnl.util.filesystem import visit_directory_tree
import spack.config
import spack.error
import spack.util.elf as elf
#: Patterns for names of libraries that are allowed to be unresolved when *just* looking at RPATHs
#: added by Spack. These are libraries outside of Spack's control, and assumed to be located in
#: default search paths of the dynamic linker.
ALLOW_UNRESOLVED = [
# kernel
"linux-vdso.so.*",
"libselinux.so.*",
# musl libc
"ld-musl-*.so.*",
# glibc
"ld-linux*.so.*",
"ld64.so.*",
"libanl.so.*",
"libc.so.*",
"libdl.so.*",
"libm.so.*",
"libmemusage.so.*",
"libmvec.so.*",
"libnsl.so.*",
"libnss_compat.so.*",
"libnss_db.so.*",
"libnss_dns.so.*",
"libnss_files.so.*",
"libnss_hesiod.so.*",
"libpcprofile.so.*",
"libpthread.so.*",
"libresolv.so.*",
"librt.so.*",
"libSegFault.so.*",
"libthread_db.so.*",
"libutil.so.*",
# gcc -- this is required even with gcc-runtime, because e.g. libstdc++ depends on libgcc_s,
# but the binaries we copy from the compiler don't have an $ORIGIN rpath.
"libasan.so.*",
"libatomic.so.*",
"libcc1.so.*",
"libgcc_s.so.*",
"libgfortran.so.*",
"libgomp.so.*",
"libitm.so.*",
"liblsan.so.*",
"libquadmath.so.*",
"libssp.so.*",
"libstdc++.so.*",
"libtsan.so.*",
"libubsan.so.*",
# systemd
"libudev.so.*",
# cuda driver
"libcuda.so.*",
]
def is_compatible(parent: elf.ElfFile, child: elf.ElfFile) -> bool:
return (
child.elf_hdr.e_type == elf.ELF_CONSTANTS.ET_DYN
and parent.is_little_endian == child.is_little_endian
and parent.is_64_bit == child.is_64_bit
and parent.elf_hdr.e_machine == child.elf_hdr.e_machine
)
def candidate_matches(current_elf: elf.ElfFile, candidate_path: bytes) -> bool:
try:
with open(candidate_path, "rb") as g:
return is_compatible(current_elf, elf.parse_elf(g))
except (OSError, elf.ElfParsingError):
return False
class Problem:
def __init__(
self, resolved: Dict[bytes, bytes], unresolved: List[bytes], relative_rpaths: List[bytes]
) -> None:
self.resolved = resolved
self.unresolved = unresolved
self.relative_rpaths = relative_rpaths
class ResolveSharedElfLibDepsVisitor(BaseDirectoryVisitor):
def __init__(self, allow_unresolved_patterns: List[str]) -> None:
self.problems: Dict[str, Problem] = {}
self._allow_unresolved_regex = re.compile(
"|".join(fnmatch.translate(x) for x in allow_unresolved_patterns)
)
def allow_unresolved(self, needed: bytes) -> bool:
try:
name = needed.decode("utf-8")
except UnicodeDecodeError:
return False
return bool(self._allow_unresolved_regex.match(name))
def visit_file(self, root: str, rel_path: str, depth: int) -> None:
# We work with byte strings for paths.
path = os.path.join(root, rel_path).encode("utf-8")
# For $ORIGIN interpolation: should not have trailing dir seperator.
origin = os.path.dirname(path)
# Retrieve the needed libs + rpaths.
try:
with open(path, "rb") as f:
parsed_elf = elf.parse_elf(f, interpreter=False, dynamic_section=True)
except (OSError, elf.ElfParsingError):
# Not dealing with an invalid ELF file.
return
# If there's no needed libs all is good
if not parsed_elf.has_needed:
return
# Get the needed libs and rpaths (notice: byte strings)
# Don't force an encoding cause paths are just a bag of bytes.
needed_libs = parsed_elf.dt_needed_strs
rpaths = parsed_elf.dt_rpath_str.split(b":") if parsed_elf.has_rpath else []
# We only interpolate $ORIGIN, not $LIB and $PLATFORM, they're not really
# supported in general. Also remove empty paths.
rpaths = [x.replace(b"$ORIGIN", origin) for x in rpaths if x]
# Do not allow relative rpaths (they are relative to the current working directory)
rpaths, relative_rpaths = stable_partition(rpaths, os.path.isabs)
# If there's a / in the needed lib, it's opened directly, otherwise it needs
# a search.
direct_libs, search_libs = stable_partition(needed_libs, lambda x: b"/" in x)
# Do not allow relative paths in direct libs (they are relative to the current working
# directory)
direct_libs, unresolved = stable_partition(direct_libs, os.path.isabs)
resolved: Dict[bytes, bytes] = {}
for lib in search_libs:
if self.allow_unresolved(lib):
continue
for rpath in rpaths:
candidate = os.path.join(rpath, lib)
if candidate_matches(parsed_elf, candidate):
resolved[lib] = candidate
break
else:
unresolved.append(lib)
# Check if directly opened libs are compatible
for lib in direct_libs:
if candidate_matches(parsed_elf, lib):
resolved[lib] = lib
else:
unresolved.append(lib)
if unresolved or relative_rpaths:
self.problems[rel_path] = Problem(resolved, unresolved, relative_rpaths)
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
pass
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# There can be binaries in .spack/test which shouldn't be checked.
if rel_path == ".spack":
return False
return True
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
return False
class CannotLocateSharedLibraries(spack.error.SpackError):
pass
def maybe_decode(byte_str: bytes) -> Union[str, bytes]:
try:
return byte_str.decode("utf-8")
except UnicodeDecodeError:
return byte_str
import spack.verify_libraries
def post_install(spec, explicit):
@@ -204,36 +20,23 @@ def post_install(spec, explicit):
if policy == "ignore" or spec.external or spec.platform not in ("linux", "freebsd"):
return
visitor = ResolveSharedElfLibDepsVisitor(
[*ALLOW_UNRESOLVED, *spec.package.unresolved_libraries]
visitor = spack.verify_libraries.ResolveSharedElfLibDepsVisitor(
[*spack.verify_libraries.ALLOW_UNRESOLVED, *spec.package.unresolved_libraries]
)
visit_directory_tree(spec.prefix, visitor)
# All good?
if not visitor.problems:
return
# For now just list the issues (print it in ldd style, except we don't recurse)
output = io.StringIO()
output.write("not all executables and libraries can resolve their dependencies:\n")
for path, problem in visitor.problems.items():
output.write(path)
output.write("\n")
for needed, full_path in problem.resolved.items():
output.write(" ")
if needed == full_path:
output.write(maybe_decode(needed))
else:
output.write(f"{maybe_decode(needed)} => {maybe_decode(full_path)}")
output.write("\n")
for not_found in problem.unresolved:
output.write(f" {maybe_decode(not_found)} => not found\n")
for relative_rpath in problem.relative_rpaths:
output.write(f" {maybe_decode(relative_rpath)} => relative rpath\n")
output = io.StringIO("not all executables and libraries can resolve their dependencies:\n")
visitor.write(output)
message = output.getvalue().strip()
if policy == "error":
raise CannotLocateSharedLibraries(message)
tty.warn(message)
class CannotLocateSharedLibraries(spack.error.SpackError):
pass

View File

@@ -47,6 +47,8 @@
import spack.util.environment
import spack.util.lock
from .enums import ConfigScopePriority
#: names of profile statistics
stat_names = pstats.Stats.sort_arg_dict_default
@@ -872,14 +874,19 @@ def add_command_line_scopes(
scopes = ev.environment_path_scopes(name, path)
if scopes is None:
if os.path.isdir(path): # directory with config files
cfg.push_scope(spack.config.DirectoryConfigScope(name, path, writable=False))
spack.config._add_platform_scope(cfg, name, path, writable=False)
cfg.push_scope(
spack.config.DirectoryConfigScope(name, path, writable=False),
priority=ConfigScopePriority.CUSTOM,
)
spack.config._add_platform_scope(
cfg, name, path, priority=ConfigScopePriority.CUSTOM, writable=False
)
continue
else:
raise spack.error.ConfigError(f"Invalid configuration scope: {path}")
for scope in scopes:
cfg.push_scope(scope)
cfg.push_scope(scope, priority=ConfigScopePriority.CUSTOM)
def _main(argv=None):
@@ -952,7 +959,9 @@ def _main(argv=None):
# Push scopes from the command line last
if args.config_scopes:
add_command_line_scopes(spack.config.CONFIG, args.config_scopes)
spack.config.CONFIG.push_scope(spack.config.InternalConfigScope("command_line"))
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope("command_line"), priority=ConfigScopePriority.COMMAND_LINE
)
setup_main_options(args)
# ------------------------------------------------------------------------
@@ -998,6 +1007,7 @@ def finish_parse_and_run(parser, cmd_name, main_args, env_format_error):
args, unknown = parser.parse_known_args(main_args.command)
# we need to inherit verbose since the install command checks for it
args.verbose = main_args.verbose
args.lines = main_args.lines
# Now that we know what command this is and what its args are, determine
# whether we can continue with a bad environment and raise if not.

View File

@@ -330,18 +330,17 @@ class BaseConfiguration:
default_projections = {"all": "{name}/{version}-{compiler.name}-{compiler.version}"}
def __init__(self, spec: spack.spec.Spec, module_set_name: str, explicit: bool) -> None:
# Module where type(self) is defined
m = inspect.getmodule(self)
assert m is not None # make mypy happy
self.module = m
# Spec for which we want to generate a module file
self.spec = spec
self.name = module_set_name
self.explicit = explicit
# Dictionary of configuration options that should be applied
# to the spec
# Dictionary of configuration options that should be applied to the spec
self.conf = merge_config_rules(self.module.configuration(self.name), self.spec)
@property
def module(self):
return inspect.getmodule(self)
@property
def projections(self):
"""Projection from specs to module names"""
@@ -775,10 +774,6 @@ def __init__(
) -> None:
self.spec = spec
# This class is meant to be derived. Get the module of the
# actual writer.
self.module = inspect.getmodule(self)
assert self.module is not None # make mypy happy
m = self.module
# Create the triplet of configuration/layout/context
@@ -816,6 +811,10 @@ def __init__(
name = type(self).__name__
raise ModulercHeaderNotDefined(msg.format(name))
@property
def module(self):
return inspect.getmodule(self)
def _get_template(self):
"""Gets the template that will be rendered for this spec."""
# Get templates and put them in the order of importance:

View File

@@ -125,9 +125,10 @@ def windows_establish_runtime_linkage(self):
# Spack should in general not modify things it has not installed
# we can reasonably expect externals to have their link interface properly established
if sys.platform == "win32" and not self.spec.external:
self.win_rpath.add_library_dependent(*self.win_add_library_dependent())
self.win_rpath.add_rpath(*self.win_add_rpath())
self.win_rpath.establish_link()
win_rpath = fsys.WindowsSimulatedRPath(self)
win_rpath.add_library_dependent(*self.win_add_library_dependent())
win_rpath.add_rpath(*self.win_add_rpath())
win_rpath.establish_link()
#: Registers which are the detectable packages, by repo and package name
@@ -742,7 +743,6 @@ def __init__(self, spec):
# Set up timing variables
self._fetch_time = 0.0
self.win_rpath = fsys.WindowsSimulatedRPath(self)
super().__init__()
def __getitem__(self, key: str) -> "PackageBase":

View File

@@ -83,6 +83,7 @@ def __init__(
level: int,
working_dir: str,
reverse: bool = False,
ordering_key: Optional[Tuple[str, int]] = None,
) -> None:
"""Initialize a new Patch instance.
@@ -92,6 +93,7 @@ def __init__(
level: patch level
working_dir: relative path *within* the stage to change to
reverse: reverse the patch
ordering_key: key used to ensure patches are applied in a consistent order
"""
# validate level (must be an integer >= 0)
if not isinstance(level, int) or not level >= 0:
@@ -105,6 +107,13 @@ def __init__(
self.working_dir = working_dir
self.reverse = reverse
# The ordering key is passed when executing package.py directives, and is only relevant
# after a solve to build concrete specs with consistently ordered patches. For concrete
# specs read from a file, we add patches in the order of its patches variants and the
# ordering_key is irrelevant. In that case, use a default value so we don't need to branch
# on whether ordering_key is None where it's used, just to make static analysis happy.
self.ordering_key: Tuple[str, int] = ordering_key or ("", 0)
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
@@ -202,9 +211,8 @@ def __init__(
msg += "package %s.%s does not exist." % (pkg.namespace, pkg.name)
raise ValueError(msg)
super().__init__(pkg, abs_path, level, working_dir, reverse)
super().__init__(pkg, abs_path, level, working_dir, reverse, ordering_key)
self.path = abs_path
self.ordering_key = ordering_key
@property
def sha256(self) -> str:
@@ -266,13 +274,11 @@ def __init__(
archive_sha256: sha256 sum of the *archive*, if the patch is compressed
(only required for compressed URL patches)
"""
super().__init__(pkg, url, level, working_dir, reverse)
super().__init__(pkg, url, level, working_dir, reverse, ordering_key)
self.url = url
self._stage: Optional["spack.stage.Stage"] = None
self.ordering_key = ordering_key
if allowed_archive(self.url) and not archive_sha256:
raise spack.error.PatchDirectiveError(
"Compressed patches require 'archive_sha256' "

View File

@@ -108,6 +108,8 @@ def _get_user_cache_path():
#: transient caches for Spack data (virtual cache, patch sha256 lookup, etc.)
default_misc_cache_path = os.path.join(user_cache_path, "cache")
#: concretization cache for Spack concretizations
default_conc_cache_path = os.path.join(default_misc_cache_path, "concretization")
# Below paths pull configuration from the host environment.
#

View File

@@ -32,6 +32,7 @@
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
import spack
import spack.caches
import spack.config
import spack.error
@@ -49,6 +50,8 @@
#: Package modules are imported as spack.pkg.<repo-namespace>.<pkg-name>
ROOT_PYTHON_NAMESPACE = "spack.pkg"
_API_REGEX = re.compile(r"^v(\d+)\.(\d+)$")
def python_package_for_repo(namespace):
"""Returns the full namespace of a repository, given its relative one
@@ -909,19 +912,52 @@ def __reduce__(self):
return RepoPath.unmarshal, self.marshal()
def _parse_package_api_version(
config: Dict[str, Any],
min_api: Tuple[int, int] = spack.min_package_api_version,
max_api: Tuple[int, int] = spack.package_api_version,
) -> Tuple[int, int]:
api = config.get("api")
if api is None:
package_api = (1, 0)
else:
if not isinstance(api, str):
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
api_match = _API_REGEX.match(api)
if api_match is None:
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
package_api = (int(api_match.group(1)), int(api_match.group(2)))
if min_api <= package_api <= max_api:
return package_api
min_str = ".".join(str(i) for i in min_api)
max_str = ".".join(str(i) for i in max_api)
curr_str = ".".join(str(i) for i in package_api)
raise BadRepoError(
f"Package API v{curr_str} is not supported by this version of Spack ("
f"must be between v{min_str} and v{max_str})"
)
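
A minimal usage sketch of the version parsing above, with hypothetical bounds (1, 0) and (2, 0) standing in for the minimum and current Package API:

    _parse_package_api_version({}, (1, 0), (2, 0))               # -> (1, 0), the default when "api" is omitted
    _parse_package_api_version({"api": "v1.2"}, (1, 0), (2, 0))  # -> (1, 2)
    _parse_package_api_version({"api": "1.2"}, (1, 0), (2, 0))   # raises BadRepoError (must be of the form vX.Y)
    _parse_package_api_version({"api": "v3.0"}, (1, 0), (2, 0))  # raises BadRepoError (outside the supported range)
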
class Repo:
"""Class representing a package repository in the filesystem.
Each package repository must have a top-level configuration file
called `repo.yaml`.
Each package repository must have a top-level configuration file called `repo.yaml`.
Currently, `repo.yaml` must define:
It contains the following keys:
`namespace`:
A Python namespace where the repository's packages should live.
`subdirectory`:
An optional subdirectory name where packages are placed
`api`:
A string of the form vX.Y that indicates the Package API version. The default is "v1.0".
For the repo to be compatible with the current version of Spack, the version must be
greater than or equal to :py:data:`spack.min_package_api_version` and less than or equal to
:py:data:`spack.package_api_version`.
"""
def __init__(
@@ -958,7 +994,7 @@ def check(condition, msg):
f"{os.path.join(root, repo_config_name)} must define a namespace.",
)
self.namespace = config["namespace"]
self.namespace: str = config["namespace"]
check(
re.match(r"[a-zA-Z][a-zA-Z0-9_.]+", self.namespace),
f"Invalid namespace '{self.namespace}' in repo '{self.root}'. "
@@ -971,12 +1007,14 @@ def check(condition, msg):
# Keep name components around for checking prefixes.
self._names = self.full_namespace.split(".")
packages_dir = config.get("subdirectory", packages_dir_name)
packages_dir: str = config.get("subdirectory", packages_dir_name)
self.packages_path = os.path.join(self.root, packages_dir)
check(
os.path.isdir(self.packages_path), f"No directory '{packages_dir}' found in '{root}'"
)
self.package_api = _parse_package_api_version(config)
# Class attribute overrides by package name
self.overrides = overrides or {}
@@ -1026,7 +1064,7 @@ def is_prefix(self, fullname: str) -> bool:
parts = fullname.split(".")
return self._names[: len(parts)] == parts
def _read_config(self) -> Dict[str, str]:
def _read_config(self) -> Dict[str, Any]:
"""Check for a YAML config file in this db's root directory."""
try:
with open(self.config_file, encoding="utf-8") as reponame_file:

View File

@@ -177,7 +177,7 @@ def build_report_for_package(self, report_dir, package, duration):
# something went wrong pre-cdash "configure" phase b/c we have an exception and only
# "update" was encounterd.
# dump the report in the configure line so teams can see what the issue is
if len(phases_encountered) == 1 and package["exception"]:
if len(phases_encountered) == 1 and package.get("exception"):
# TODO this mapping is not ideal since these are pre-configure errors
# we need to determine if a more appropriate cdash phase can be utilized
# for now we will add a message to the log explaining this

View File

@@ -7,8 +7,7 @@
import warnings
import jsonschema
import llnl.util.lang
import jsonschema.validators
from spack.error import SpecSyntaxError
@@ -18,59 +17,59 @@ class DeprecationMessage(typing.NamedTuple):
error: bool
# jsonschema is imported lazily as it is heavy to import
# and increases the start-up time
def _make_validator():
def _validate_spec(validator, is_spec, instance, schema):
"""Check if the attributes on instance are valid specs."""
import spack.spec_parser
def _validate_spec(validator, is_spec, instance, schema):
"""Check if all additional keys are valid specs."""
import spack.spec_parser
if not validator.is_type(instance, "object"):
return
if not validator.is_type(instance, "object"):
return
for spec_str in instance:
try:
spack.spec_parser.parse(spec_str)
except SpecSyntaxError:
yield jsonschema.ValidationError(f"the key '{spec_str}' is not a valid spec")
properties = schema.get("properties") or {}
def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
return
if not deprecated:
return
deprecations = {
name: DeprecationMessage(message=x["message"], error=x["error"])
for x in deprecated
for name in x["names"]
}
# Get a list of the deprecated properties, return if there is none
issues = [entry for entry in instance if entry in deprecations]
if not issues:
return
# Process issues
errors = []
for name in issues:
msg = deprecations[name].message.format(name=name)
if deprecations[name].error:
errors.append(msg)
else:
warnings.warn(msg)
if errors:
yield jsonschema.ValidationError("\n".join(errors))
return jsonschema.validators.extend(
jsonschema.Draft7Validator,
{"validate_spec": _validate_spec, "deprecatedProperties": _deprecated_properties},
)
for spec_str in instance:
if spec_str in properties:
continue
try:
spack.spec_parser.parse(spec_str)
except SpecSyntaxError:
yield jsonschema.ValidationError(f"the key '{spec_str}' is not a valid spec")
Validator = llnl.util.lang.Singleton(_make_validator)
def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
return
if not deprecated:
return
deprecations = {
name: DeprecationMessage(message=x["message"], error=x["error"])
for x in deprecated
for name in x["names"]
}
# Get a list of the deprecated properties, return if there is none
issues = [entry for entry in instance if entry in deprecations]
if not issues:
return
# Process issues
errors = []
for name in issues:
msg = deprecations[name].message.format(name=name)
if deprecations[name].error:
errors.append(msg)
else:
warnings.warn(msg)
if errors:
yield jsonschema.ValidationError("\n".join(errors))
Validator = jsonschema.validators.extend(
jsonschema.Draft7Validator,
{"additionalKeysAreSpecs": _validate_spec, "deprecatedProperties": _deprecated_properties},
)
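
A short usage sketch of the extended validator (hypothetical schema and instance; "^^oops" is assumed to be an invalid spec string):

    schema = {
        "type": "object",
        "additionalKeysAreSpecs": True,
        "additionalProperties": {"type": "string"},
    }
    errors = list(Validator(schema).iter_errors({"gcc@12:": "suffix", "^^oops": "bad"}))
    # keys listed under "properties" are skipped; every other key must parse as a spec
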
def _append(string: str) -> bool:

View File

@@ -58,6 +58,15 @@
{"type": "string"}, # deprecated
]
},
"concretization_cache": {
"type": "object",
"properties": {
"enable": {"type": "boolean"},
"url": {"type": "string"},
"entry_limit": {"type": "integer", "minimum": 0},
"size_limit": {"type": "integer", "minimum": 0},
},
},
"install_hash_length": {"type": "integer", "minimum": 1},
"install_path_scheme": {"type": "string"}, # deprecated
"build_stage": {

View File

@@ -39,7 +39,7 @@
"load": array_of_strings,
"suffixes": {
"type": "object",
"validate_spec": True,
"additionalKeysAreSpecs": True,
"additionalProperties": {"type": "string"}, # key
},
"environment": spack.schema.environment.definition,
@@ -48,40 +48,44 @@
projections_scheme = spack.schema.projections.properties["projections"]
module_type_configuration: Dict = {
common_props = {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"},
"projections": projections_scheme,
"all": module_file_configuration,
}
tcl_configuration = {
"type": "object",
"default": {},
"validate_spec": True,
"properties": {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"},
"projections": projections_scheme,
"all": module_file_configuration,
},
"additionalKeysAreSpecs": True,
"properties": {**common_props},
"additionalProperties": module_file_configuration,
}
tcl_configuration = module_type_configuration.copy()
lmod_configuration = module_type_configuration.copy()
lmod_configuration["properties"].update(
{
lmod_configuration = {
"type": "object",
"default": {},
"additionalKeysAreSpecs": True,
"properties": {
**common_props,
"core_compilers": array_of_strings,
"hierarchy": array_of_strings,
"core_specs": array_of_strings,
"filter_hierarchy_specs": {
"type": "object",
"validate_spec": True,
"additionalKeysAreSpecs": True,
"additionalProperties": array_of_strings,
},
}
)
},
"additionalProperties": module_file_configuration,
}
module_config_properties = {
"use_view": {"anyOf": [{"type": "string"}, {"type": "boolean"}]},

View File

@@ -5,9 +5,12 @@
import collections.abc
import copy
import enum
import errno
import functools
import hashlib
import io
import itertools
import json
import os
import pathlib
import pprint
@@ -17,12 +20,25 @@
import typing
import warnings
from contextlib import contextmanager
from typing import Callable, Dict, Iterator, List, NamedTuple, Optional, Set, Tuple, Type, Union
from typing import (
IO,
Callable,
Dict,
Iterator,
List,
NamedTuple,
Optional,
Set,
Tuple,
Type,
Union,
)
import archspec.cpu
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import current_file_position
from llnl.util.lang import elide_list
import spack
@@ -34,21 +50,27 @@
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.hash_types as ht
import spack.package_base
import spack.package_prefs
import spack.patch
import spack.paths
import spack.platforms
import spack.repo
import spack.solver.splicing
import spack.spec
import spack.store
import spack.util.crypto
import spack.util.hash
import spack.util.libc
import spack.util.module_cmd as md
import spack.util.path
import spack.util.timer
import spack.variant as vt
import spack.version as vn
import spack.version.git_ref_lookup
from spack import traverse
from spack.util.file_cache import FileCache
from .core import (
AspFunction,
@@ -536,6 +558,365 @@ def format_unsolved(unsolved_specs):
msg += "\n\t(No candidate specs from solver)"
return msg
def to_dict(self, test: bool = False) -> dict:
"""Produces dict representation of Result object
Does not include anything related to unsatisfiability as we
are only interested in storing satisfiable results
"""
serial_node_arg = (
lambda node_dict: f"""{{"id": "{node_dict.id}", "pkg": "{node_dict.pkg}"}}"""
)
spec_hash_type = ht.process_hash if test else ht.dag_hash
ret = dict()
ret["asp"] = self.asp
ret["criteria"] = self.criteria
ret["optimal"] = self.optimal
ret["warnings"] = self.warnings
ret["nmodels"] = self.nmodels
ret["abstract_specs"] = [str(x) for x in self.abstract_specs]
ret["satisfiable"] = self.satisfiable
serial_answers = []
for answer in self.answers:
serial_answer = answer[:2]
serial_answer_dict = {}
for node, spec in answer[2].items():
serial_answer_dict[serial_node_arg(node)] = spec.to_dict(hash=spec_hash_type)
serial_answer = serial_answer + (serial_answer_dict,)
serial_answers.append(serial_answer)
ret["answers"] = serial_answers
ret["specs_by_input"] = {}
input_specs = {} if not self.specs_by_input else self.specs_by_input
for input, spec in input_specs.items():
ret["specs_by_input"][str(input)] = spec.to_dict(hash=spec_hash_type)
return ret
@staticmethod
def from_dict(obj: dict):
"""Returns Result object from compatible dictionary"""
def _dict_to_node_argument(dict):
id = dict["id"]
pkg = dict["pkg"]
return NodeArgument(id=id, pkg=pkg)
def _str_to_spec(spec_str):
return spack.spec.Spec(spec_str)
def _dict_to_spec(spec_dict):
loaded_spec = spack.spec.Spec.from_dict(spec_dict)
_ensure_external_path_if_external(loaded_spec)
spack.spec.Spec.ensure_no_deprecated(loaded_spec)
return loaded_spec
asp = obj.get("asp")
spec_list = obj.get("abstract_specs")
if not spec_list:
raise RuntimeError("Invalid json for concretization Result object")
if spec_list:
spec_list = [_str_to_spec(x) for x in spec_list]
result = Result(spec_list, asp)
result.criteria = obj.get("criteria")
result.optimal = obj.get("optimal")
result.warnings = obj.get("warnings")
result.nmodels = obj.get("nmodels")
result.satisfiable = obj.get("satisfiable")
result._unsolved_specs = []
answers = []
for answer in obj.get("answers", []):
loaded_answer = answer[:2]
answer_node_dict = {}
for node, spec in answer[2].items():
answer_node_dict[_dict_to_node_argument(json.loads(node))] = _dict_to_spec(spec)
loaded_answer.append(answer_node_dict)
answers.append(tuple(loaded_answer))
result.answers = answers
result._concrete_specs_by_input = {}
result._concrete_specs = []
for input, spec in obj.get("specs_by_input", {}).items():
result._concrete_specs_by_input[_str_to_spec(input)] = _dict_to_spec(spec)
result._concrete_specs.append(_dict_to_spec(spec))
return result
class ConcretizationCache:
"""Store for Spack concretization results and statistics
Serializes solver result objects and statistics to json and stores
at a given endpoint in a cache associated by the sha256 of the
asp problem and the involved control files.
"""
def __init__(self, root: Union[str, None] = None):
if not root:
root = spack.config.get(
"config:concretization_cache:url", spack.paths.default_conc_cache_path
)
self.root = pathlib.Path(spack.util.path.canonicalize_path(root))
self._fc = FileCache(self.root)
self._cache_manifest = ".cache_manifest"
self._manifest_queue: List[Tuple[pathlib.Path, int]] = []
def cleanup(self):
"""Prunes the concretization cache according to configured size and entry
count limits. Cleanup is done in FIFO ordering."""
# TODO: determine a better default
entry_limit = spack.config.get("config:concretization_cache:entry_limit", 1000)
bytes_limit = spack.config.get("config:concretization_cache:size_limit", 3e8)
# lock the entire cache as we're removing a lot of data from the
# manifest and cache itself
with self._fc.read_transaction(self._cache_manifest) as f:
count, cache_bytes = self._extract_cache_metadata(f)
if not count or not cache_bytes:
return
entry_count = int(count)
manifest_bytes = int(cache_bytes)
# move beyond the metadata entry
f.readline()
if entry_count > entry_limit and entry_limit > 0:
with self._fc.write_transaction(self._cache_manifest) as (old, new):
# prune the oldest 10% or until we have removed 10% of
# total bytes starting from oldest entry
# TODO: make this configurable?
prune_count = entry_limit // 10
lines_to_prune = f.readlines(prune_count)
for i, line in enumerate(lines_to_prune):
sha, cache_entry_bytes = self._parse_manifest_entry(line)
if sha and cache_entry_bytes:
cache_path = self._cache_path_from_hash(sha)
if self._fc.remove(cache_path):
entry_count -= 1
manifest_bytes -= int(cache_entry_bytes)
else:
tty.warn(
f"Invalid concretization cache entry: '{line}' on line: {i+1}"
)
self._write_manifest(f, entry_count, manifest_bytes)
elif manifest_bytes > bytes_limit and bytes_limit > 0:
with self._fc.write_transaction(self._cache_manifest) as (old, new):
# take 10% of current size off
prune_amount = bytes_limit // 10
total_pruned = 0
i = 0
while total_pruned < prune_amount:
sha, manifest_cache_bytes = self._parse_manifest_entry(f.readline())
if sha and manifest_cache_bytes:
entry_bytes = int(manifest_cache_bytes)
cache_path = self.root / sha[:2] / sha
if self._safe_remove(cache_path):
entry_count -= 1
manifest_bytes -= entry_bytes
total_pruned += entry_bytes
else:
tty.warn(
"Invalid concretization cache entry "
f"'{sha} {manifest_cache_bytes}' on line: {i}"
)
i += 1
self._write_manifest(f, entry_count, manifest_bytes)
for cache_dir in self.root.iterdir():
if cache_dir.is_dir() and not any(cache_dir.iterdir()):
self._safe_remove(cache_dir)
def cache_entries(self):
"""Generator producing cache entries"""
for cache_dir in self.root.iterdir():
# ensure component is cache entry directory
# not metadata file
if cache_dir.is_dir():
for cache_entry in cache_dir.iterdir():
if not cache_entry.is_dir():
yield cache_entry
else:
raise RuntimeError(
"Improperly formed concretization cache. "
f"Directory {cache_entry.name} is improperly located "
"within the concretization cache."
)
def _parse_manifest_entry(self, line):
"""Returns parsed manifest entry lines
with handling for invalid reads."""
if line:
cache_values = line.strip("\n").split(" ")
if len(cache_values) < 2:
tty.warn(f"Invalid cache entry at {line}")
return None, None
return None, None
def _write_manifest(self, manifest_file, entry_count, entry_bytes):
"""Writes new concretization cache manifest file.
Arguments:
manifest_file: IO stream, opened for reading
and writing, wrapping the manifest file,
with its cursor at call time set to the location
where the manifest should be truncated
entry_count: new total entry count
entry_bytes: new total entry bytes count
"""
persisted_entries = manifest_file.readlines()
manifest_file.truncate(0)
manifest_file.write(f"{entry_count} {entry_bytes}\n")
manifest_file.writelines(persisted_entries)
def _results_from_cache(self, cache_entry_buffer: IO[str]) -> Union[Result, None]:
"""Returns a Results object from the concretizer cache
Reads the cache hit and uses `Result`'s own deserializer
to produce a new Result object
"""
with current_file_position(cache_entry_buffer, 0):
cache_str = cache_entry_buffer.read()
# TODO: Should this be an error if None?
# Same for _stats_from_cache
if cache_str:
cache_entry = json.loads(cache_str)
result_json = cache_entry["results"]
return Result.from_dict(result_json)
return None
def _stats_from_cache(self, cache_entry_buffer: IO[str]) -> Union[List, None]:
"""Returns concretization statistic from the
concretization associated with the cache.
Deserialzes the the json representation of the
statistics covering the cached concretization run
and returns the Python data structures
"""
with current_file_position(cache_entry_buffer, 0):
cache_str = cache_entry_buffer.read()
if cache_str:
return json.loads(cache_str)["statistics"]
return None
def _extract_cache_metadata(self, cache_stream: IO[str]):
"""Extracts and returns cache entry count and bytes count from head of manifest
file"""
# make sure we're always reading from the beginning of the stream
# concretization cache manifest data lives at the top of the file
with current_file_position(cache_stream, 0):
return self._parse_manifest_entry(cache_stream.readline())
def _prefix_digest(self, problem: str) -> Tuple[str, str]:
"""Return the first two characters of, and the full, sha256 of the given asp problem"""
prob_digest = hashlib.sha256(problem.encode()).hexdigest()
prefix = prob_digest[:2]
return prefix, prob_digest
def _cache_path_from_problem(self, problem: str) -> pathlib.Path:
"""Returns a Path object representing the path to the cache
entry for the given problem"""
prefix, digest = self._prefix_digest(problem)
return pathlib.Path(prefix) / digest
def _cache_path_from_hash(self, hash: str) -> pathlib.Path:
"""Returns a Path object representing the cache entry
corresponding to the given sha256 hash"""
return pathlib.Path(hash[:2]) / hash
def _lock_prefix_from_cache_path(self, cache_path: str):
"""Returns the bit location corresponding to a given cache entry path
for file locking"""
return spack.util.hash.base32_prefix_bits(
spack.util.hash.b32_hash(cache_path), spack.util.crypto.bit_length(sys.maxsize)
)
def flush_manifest(self):
"""Updates the concretization cache manifest file after a cache write operation
Updates the current byte count and entry counts and writes to the head of the
manifest file"""
manifest_file = self.root / self._cache_manifest
manifest_file.touch(exist_ok=True)
with open(manifest_file, "r+", encoding="utf-8") as f:
# check if manifest is empty
count, cache_bytes = self._extract_cache_metadata(f)
if not count or not cache_bytes:
# cache is uninitialized
count = 0
cache_bytes = 0
f.seek(0, io.SEEK_END)
for manifest_update in self._manifest_queue:
entry_path, entry_bytes = manifest_update
count += 1
cache_bytes += entry_bytes
f.write(f"{entry_path.name} {entry_bytes}")
f.seek(0, io.SEEK_SET)
new_stats = f"{int(count)+1} {int(cache_bytes)}\n"
f.write(new_stats)
def _register_cache_update(self, cache_path: pathlib.Path, bytes_written: int):
"""Adds manifest entry to update queue for later updates to the manifest"""
self._manifest_queue.append((cache_path, bytes_written))
def _safe_remove(self, cache_dir: pathlib.Path):
"""Removes cache entries with handling for the case where the entry has been
removed already or there are multiple cache entries in a directory"""
try:
if cache_dir.is_dir():
cache_dir.rmdir()
else:
cache_dir.unlink()
return True
except FileNotFoundError:
# This is acceptable, removal is idempotent
pass
except OSError as e:
if e.errno == errno.ENOTEMPTY:
# there exists another cache entry in this directory, don't clean yet
pass
return False
def store(self, problem: str, result: Result, statistics: List, test: bool = False):
"""Creates entry in concretization cache for problem if none exists,
storing the concretization Result object and statistics in the cache
as serialized json joined as a single file.
Hash membership is computed based on the sha256 of the provided asp
problem.
"""
cache_path = self._cache_path_from_problem(problem)
if self._fc.init_entry(cache_path):
# if an entry for this conc hash exists already, we don't want
# to overwrite it, just exit
tty.debug(f"Cache entry {cache_path} exists, will not be overwritten")
return
with self._fc.write_transaction(cache_path) as (old, new):
if old:
# Entry for this conc hash exists already, do not overwrite
tty.debug(f"Cache entry {cache_path} exists, will not be overwritten")
return
cache_dict = {"results": result.to_dict(test=test), "statistics": statistics}
bytes_written = new.write(json.dumps(cache_dict))
self._register_cache_update(cache_path, bytes_written)
def fetch(self, problem: str) -> Union[Tuple[Result, List], Tuple[None, None]]:
"""Returns the concretization cache result for a lookup based on the given problem.
Checks the concretization cache for the given problem, and either returns the
Python objects cached on disk representing the concretization results and statistics
or returns None if no cache entry was found.
"""
cache_path = self._cache_path_from_problem(problem)
result, statistics = None, None
with self._fc.read_transaction(cache_path) as f:
if f:
result = self._results_from_cache(f)
statistics = self._stats_from_cache(f)
if result and statistics:
tty.debug(f"Concretization cache hit at {str(cache_path)}")
return result, statistics
tty.debug(f"Concretization cache miss at {str(cache_path)}")
return None, None
CONC_CACHE: ConcretizationCache = llnl.util.lang.Singleton(
lambda: ConcretizationCache()
) # type: ignore
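
A hedged round-trip sketch of the cache (names like problem_repr, result, and statistics are assumed to be in scope; entries live under <root>/<sha256[:2]>/<sha256> of the ordered problem):

    cache = ConcretizationCache()                  # root defaults to config:concretization_cache:url
    cache.store(problem_repr, result, statistics)  # no-op if an entry for this problem already exists
    cache.flush_manifest()                         # fold the queued entry into the manifest header
    cached_result, cached_stats = cache.fetch(problem_repr)  # (None, None) on a cache miss
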
def _normalize_packages_yaml(packages_yaml):
normalized_yaml = copy.copy(packages_yaml)
@@ -804,6 +1185,15 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
if sys.platform == "win32":
tty.debug("Ensuring basic dependencies {win-sdk, wgl} available")
spack.bootstrap.core.ensure_winsdk_external_or_raise()
control_files = ["concretize.lp", "heuristic.lp", "display.lp"]
if not setup.concretize_everything:
control_files.append("when_possible.lp")
if using_libc_compatibility():
control_files.append("libc_compatibility.lp")
else:
control_files.append("os_compatibility.lp")
if setup.enable_splicing:
control_files.append("splices.lp")
timer.start("setup")
asp_problem = setup.setup(specs, reuse=reuse, allow_deprecated=allow_deprecated)
@@ -813,123 +1203,133 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
return Result(specs), None, None
timer.stop("setup")
timer.start("load")
# Add the problem instance
self.control.add("base", [], asp_problem)
# Load the file itself
timer.start("cache-check")
timer.start("ordering")
# ensure deterministic output
problem_repr = "\n".join(sorted(asp_problem.split("\n")))
timer.stop("ordering")
parent_dir = os.path.dirname(__file__)
self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
full_path = lambda x: os.path.join(parent_dir, x)
abs_control_files = [full_path(x) for x in control_files]
for ctrl_file in abs_control_files:
with open(ctrl_file, "r+", encoding="utf-8") as f:
problem_repr += "\n" + f.read()
# Binary compatibility is based on libc on Linux, and on the os tag elsewhere
if using_libc_compatibility():
self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
else:
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
if setup.enable_splicing:
self.control.load(os.path.join(parent_dir, "splices.lp"))
result = None
conc_cache_enabled = spack.config.get("config:concretization_cache:enable", True)
if conc_cache_enabled:
result, concretization_stats = CONC_CACHE.fetch(problem_repr)
timer.stop("load")
timer.stop("cache-check")
if not result:
timer.start("load")
# Add the problem instance
self.control.add("base", [], asp_problem)
# Load the files
[self.control.load(lp) for lp in abs_control_files]
timer.stop("load")
# Grounding is the first step in the solve -- it turns our facts
# and first-order logic rules into propositional logic.
timer.start("ground")
self.control.ground([("base", [])])
timer.stop("ground")
# Grounding is the first step in the solve -- it turns our facts
# and first-order logic rules into propositional logic.
timer.start("ground")
self.control.ground([("base", [])])
timer.stop("ground")
# With a grounded program, we can run the solve.
models = [] # stable models if things go well
cores = [] # unsatisfiable cores if they do not
# With a grounded program, we can run the solve.
models = [] # stable models if things go well
cores = [] # unsatisfiable cores if they do not
def on_model(model):
models.append((model.cost, model.symbols(shown=True, terms=True)))
def on_model(model):
models.append((model.cost, model.symbols(shown=True, terms=True)))
solve_kwargs = {
"assumptions": setup.assumptions,
"on_model": on_model,
"on_core": cores.append,
}
solve_kwargs = {
"assumptions": setup.assumptions,
"on_model": on_model,
"on_core": cores.append,
}
if clingo_cffi():
solve_kwargs["on_unsat"] = cores.append
if clingo_cffi():
solve_kwargs["on_unsat"] = cores.append
timer.start("solve")
time_limit = spack.config.CONFIG.get("concretizer:timeout", -1)
error_on_timeout = spack.config.CONFIG.get("concretizer:error_on_timeout", True)
# Spack uses 0 to set no time limit, clingo API uses -1
if time_limit == 0:
time_limit = -1
with self.control.solve(**solve_kwargs, async_=True) as handle:
finished = handle.wait(time_limit)
if not finished:
specs_str = ", ".join(llnl.util.lang.elide_list([str(s) for s in specs], 4))
header = f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
if error_on_timeout:
raise UnsatisfiableSpecError(f"{header}, stopping concretization")
warnings.warn(f"{header}, using the best configuration found so far")
handle.cancel()
timer.start("solve")
time_limit = spack.config.CONFIG.get("concretizer:timeout", -1)
error_on_timeout = spack.config.CONFIG.get("concretizer:error_on_timeout", True)
# Spack uses 0 to set no time limit, clingo API uses -1
if time_limit == 0:
time_limit = -1
with self.control.solve(**solve_kwargs, async_=True) as handle:
finished = handle.wait(time_limit)
if not finished:
specs_str = ", ".join(llnl.util.lang.elide_list([str(s) for s in specs], 4))
header = (
f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
)
if error_on_timeout:
raise UnsatisfiableSpecError(f"{header}, stopping concretization")
warnings.warn(f"{header}, using the best configuration found so far")
handle.cancel()
solve_result = handle.get()
timer.stop("solve")
solve_result = handle.get()
timer.stop("solve")
# once done, construct the solve result
result = Result(specs)
result.satisfiable = solve_result.satisfiable
# once done, construct the solve result
result = Result(specs)
result.satisfiable = solve_result.satisfiable
if result.satisfiable:
timer.start("construct_specs")
# get the best model
builder = SpecBuilder(specs, hash_lookup=setup.reusable_and_possible)
min_cost, best_model = min(models)
if result.satisfiable:
timer.start("construct_specs")
# get the best model
builder = SpecBuilder(specs, hash_lookup=setup.reusable_and_possible)
min_cost, best_model = min(models)
# first check for errors
error_handler = ErrorHandler(best_model, specs)
error_handler.raise_if_errors()
# first check for errors
error_handler = ErrorHandler(best_model, specs)
error_handler.raise_if_errors()
# build specs from spec attributes in the model
spec_attrs = [(name, tuple(rest)) for name, *rest in extract_args(best_model, "attr")]
answers = builder.build_specs(spec_attrs)
# build specs from spec attributes in the model
spec_attrs = [
(name, tuple(rest)) for name, *rest in extract_args(best_model, "attr")
]
answers = builder.build_specs(spec_attrs)
# add best spec to the results
result.answers.append((list(min_cost), 0, answers))
# add best spec to the results
result.answers.append((list(min_cost), 0, answers))
# get optimization criteria
criteria_args = extract_args(best_model, "opt_criterion")
result.criteria = build_criteria_names(min_cost, criteria_args)
# get optimization criteria
criteria_args = extract_args(best_model, "opt_criterion")
result.criteria = build_criteria_names(min_cost, criteria_args)
# record the number of models the solver considered
result.nmodels = len(models)
# record the number of models the solver considered
result.nmodels = len(models)
# record the possible dependencies in the solve
result.possible_dependencies = setup.pkgs
timer.stop("construct_specs")
timer.stop()
elif cores:
result.control = self.control
result.cores.extend(cores)
# record the possible dependencies in the solve
result.possible_dependencies = setup.pkgs
timer.stop("construct_specs")
timer.stop()
elif cores:
result.control = self.control
result.cores.extend(cores)
result.raise_if_unsat()
if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: the solver completed but produced specs"
" that do not satisfy the request. Please report a bug at "
f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
)
if conc_cache_enabled:
CONC_CACHE.store(problem_repr, result, self.control.statistics, test=setup.tests)
concretization_stats = self.control.statistics
if output.timers:
timer.write_tty()
print()
if output.stats:
print("Statistics:")
pprint.pprint(self.control.statistics)
result.raise_if_unsat()
if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: the solver completed but produced specs"
" that do not satisfy the request. Please report a bug at "
f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
)
return result, timer, self.control.statistics
pprint.pprint(concretization_stats)
return result, timer, concretization_stats
class ConcreteSpecsByHash(collections.abc.Mapping):
@@ -1371,7 +1771,7 @@ def effect_rules(self):
return
self.gen.h2("Imposed requirements")
for name in self._effect_cache:
for name in sorted(self._effect_cache):
cache = self._effect_cache[name]
for (spec_str, _), (effect_id, requirements) in cache.items():
self.gen.fact(fn.pkg_fact(name, fn.effect_id(effect_id)))
@@ -1424,8 +1824,8 @@ def define_variant(
elif isinstance(values, vt.DisjointSetsOfValues):
union = set()
for sid, s in enumerate(values.sets):
for value in s:
for sid, s in enumerate(sorted(values.sets)):
for value in sorted(s):
pkg_fact(fn.variant_value_from_disjoint_sets(vid, value, sid))
union.update(s)
values = union
@@ -1606,7 +2006,7 @@ def package_provider_rules(self, pkg):
self.gen.fact(fn.pkg_fact(pkg.name, fn.possible_provider(vpkg_name)))
for when, provided in pkg.provided.items():
for vpkg in provided:
for vpkg in sorted(provided):
if vpkg.name not in self.possible_virtuals:
continue
@@ -1621,8 +2021,8 @@ def package_provider_rules(self, pkg):
condition_id = self.condition(
when, required_name=pkg.name, msg="Virtuals are provided together"
)
for set_id, virtuals_together in enumerate(sets_of_virtuals):
for name in virtuals_together:
for set_id, virtuals_together in enumerate(sorted(sets_of_virtuals)):
for name in sorted(virtuals_together):
self.gen.fact(
fn.pkg_fact(pkg.name, fn.provided_together(condition_id, set_id, name))
)
@@ -1656,13 +2056,16 @@ def track_dependencies(input_spec, requirements):
return requirements + [fn.attr("track_dependencies", input_spec.name)]
def dependency_holds(input_spec, requirements):
return remove_node(input_spec, requirements) + [
result = remove_node(input_spec, requirements) + [
fn.attr(
"dependency_holds", pkg.name, input_spec.name, dt.flag_to_string(t)
)
for t in dt.ALL_FLAGS
if t & depflag
]
if input_spec.name not in pkg.extendees:
return result
return result + [fn.attr("extends", pkg.name, input_spec.name)]
context = ConditionContext()
context.source = ConstraintOrigin.append_type_suffix(
@@ -1729,7 +2132,7 @@ def package_splice_rules(self, pkg):
for map in pkg.variants.values():
for k in map:
filt_match_variants.add(k)
filt_match_variants = list(filt_match_variants)
filt_match_variants = sorted(filt_match_variants)
variant_constraints = self._gen_match_variant_splice_constraints(
pkg, cond, spec_to_splice, hash_var, splice_node, filt_match_variants
)
@@ -2259,7 +2662,7 @@ def define_package_versions_and_validate_preferences(
):
"""Declare any versions in specs not declared in packages."""
packages_yaml = spack.config.get("packages")
for pkg_name in possible_pkgs:
for pkg_name in sorted(possible_pkgs):
pkg_cls = self.pkg_class(pkg_name)
# All the versions from the corresponding package.py file. Since concepts
@@ -2587,7 +2990,7 @@ def define_variant_values(self):
"""
# Tell the concretizer about possible values from specs seen in spec_clauses().
# We might want to order these facts by pkg and name if we are debugging.
for pkg_name, variant_def_id, value in self.variant_values_from_specs:
for pkg_name, variant_def_id, value in sorted(self.variant_values_from_specs):
try:
vid = self.variant_ids_by_def_id[variant_def_id]
except KeyError:
@@ -2625,6 +3028,8 @@ def concrete_specs(self):
# Declare as possible parts of specs that are not in package.py
# - Add versions to possible versions
# - Add OS to possible OS's
# is traverse deterministic?
for dep in spec.traverse():
self.possible_versions[dep.name].add(dep.version)
if isinstance(dep.version, vn.GitVersion):
@@ -2862,7 +3267,7 @@ def define_runtime_constraints(self):
recorder.consume_facts()
def literal_specs(self, specs):
for spec in specs:
for spec in sorted(specs):
self.gen.h2("Spec: %s" % str(spec))
condition_id = next(self._id_counter)
trigger_id = next(self._id_counter)
@@ -3363,7 +3768,7 @@ def consume_facts(self):
# on the available compilers)
self._setup.pkg_version_rules(runtime_pkg)
for imposed_spec, when_spec in self.runtime_conditions:
for imposed_spec, when_spec in sorted(self.runtime_conditions):
msg = f"{when_spec} requires {imposed_spec} at runtime"
_ = self._setup.condition(when_spec, imposed_spec=imposed_spec, msg=msg)
@@ -3702,11 +4107,11 @@ def build_specs(self, function_tuples):
roots = [spec.root for spec in self._specs.values()]
roots = dict((id(r), r) for r in roots)
for root in roots.values():
spack.spec.Spec.inject_patches_variant(root)
_inject_patches_variant(root)
# Add external paths to specs with just external modules
for s in self._specs.values():
spack.spec.Spec.ensure_external_path_if_external(s)
_ensure_external_path_if_external(s)
for s in self._specs.values():
_develop_specs_from_env(s, ev.active_environment())
@@ -3778,6 +4183,92 @@ def execute_explicit_splices(self):
return specs
def _inject_patches_variant(root: spack.spec.Spec) -> None:
# This dictionary will store object IDs rather than Specs as keys
# since the Spec __hash__ will change as patches are added to them
spec_to_patches: Dict[int, Set[spack.patch.Patch]] = {}
for s in root.traverse():
# After concretizing, assign namespaces to anything left.
# Note that this doesn't count as a "change". The repository
# configuration is constant throughout a spack run, and
# normalize and concretize evaluate Packages using Repo.get(),
# which respects precedence. So, a namespace assignment isn't
# changing how a package name would have been interpreted and
# we can do it as late as possible to allow as much
# compatibility across repositories as possible.
if s.namespace is None:
s.namespace = spack.repo.PATH.repo_for_pkg(s.name).namespace
if s.concrete:
continue
# Add any patches from the package to the spec.
node_patches = {
patch
for cond, patch_list in spack.repo.PATH.get_pkg_class(s.fullname).patches.items()
if s.satisfies(cond)
for patch in patch_list
}
if node_patches:
spec_to_patches[id(s)] = node_patches
# Also record all patches required on dependencies by depends_on(..., patch=...)
for dspec in root.traverse_edges(deptype=dt.ALL, cover="edges", root=False):
if dspec.spec.concrete:
continue
pkg_deps = spack.repo.PATH.get_pkg_class(dspec.parent.fullname).dependencies
edge_patches: List[spack.patch.Patch] = []
for cond, deps_by_name in pkg_deps.items():
if not dspec.parent.satisfies(cond):
continue
dependency = deps_by_name.get(dspec.spec.name)
if not dependency:
continue
for pcond, patch_list in dependency.patches.items():
if dspec.spec.satisfies(pcond):
edge_patches.extend(patch_list)
if edge_patches:
spec_to_patches.setdefault(id(dspec.spec), set()).update(edge_patches)
for spec in root.traverse():
if id(spec) not in spec_to_patches:
continue
patches = list(spec_to_patches[id(spec)])
variant: vt.MultiValuedVariant = spec.variants.setdefault(
"patches", vt.MultiValuedVariant("patches", ())
)
variant.value = tuple(p.sha256 for p in patches)
# FIXME: Monkey patches variant to store patches order
ordered_hashes = [(*p.ordering_key, p.sha256) for p in patches if p.ordering_key]
ordered_hashes.sort()
tty.debug(
f"Ordered hashes [{spec.name}]: "
+ ", ".join("/".join(str(e) for e in t) for t in ordered_hashes)
)
setattr(
variant, "_patches_in_order_of_appearance", [sha256 for _, _, sha256 in ordered_hashes]
)
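
A small worked example of the ordering applied above (hypothetical ordering keys and hashes):

    patches = [("", 1, "bbbb"), ("", 0, "aaaa")]      # (*ordering_key, sha256)
    sorted(patches)                                   # -> [("", 0, "aaaa"), ("", 1, "bbbb")]
    [sha for _, _, sha in sorted(patches)]            # -> ["aaaa", "bbbb"]
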
def _ensure_external_path_if_external(spec: spack.spec.Spec) -> None:
if not spec.external_modules or spec.external_path:
return
# Get the path from the module; the package can override the default
# (this is mostly needed for Cray)
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
package = pkg_cls(spec)
spec.external_path = getattr(package, "external_prefix", None) or md.path_from_modules(
spec.external_modules
)
def _develop_specs_from_env(spec, env):
dev_info = env.dev_specs.get(spec.name, {}) if env else {}
if not dev_info:
@@ -4134,6 +4625,9 @@ def solve_with_stats(
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)
CONC_CACHE.flush_manifest()
CONC_CACHE.cleanup()
return self.driver.solve(
setup, specs, reuse=reusable_specs, output=output, allow_deprecated=allow_deprecated
)
@@ -4203,6 +4697,9 @@ def solve_in_rounds(
for spec in result.specs:
reusable_specs.extend(spec.traverse())
CONC_CACHE.flush_manifest()
CONC_CACHE.cleanup()
class UnsatisfiableSpecError(spack.error.UnsatisfiableSpecError):
"""There was an issue with the spec that was requested (i.e. a user error)."""

View File

@@ -524,6 +524,16 @@ error(10, "'{0}' is not a valid dependency for any package in the DAG", Package)
:- attr("node", node(ID, Package)),
not needed(node(ID, Package)).
% Extensions depending on each other must all extend the same node (e.g. all Python packages
% depending on each other must depend on the same Python interpreter)
error(100, "{0} and {1} must depend on the same {2}", ExtensionParent, ExtensionChild, ExtendeePackage)
:- depends_on(ExtensionParent, ExtensionChild),
attr("extends", ExtensionParent, ExtendeePackage),
depends_on(ExtensionParent, node(X, ExtendeePackage)),
depends_on(ExtensionChild, node(Y, ExtendeePackage)),
X != Y.
#defined dependency_type/2.
%-----------------------------------------------------------------------------

View File

@@ -52,6 +52,7 @@
import enum
import io
import itertools
import json
import os
import pathlib
import platform
@@ -98,7 +99,6 @@
import spack.traverse
import spack.util.executable
import spack.util.hash
import spack.util.module_cmd as md
import spack.util.prefix
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
@@ -798,7 +798,7 @@ def update_deptypes(self, depflag: dt.DepFlag) -> bool:
self.depflag = new
return True
def update_virtuals(self, virtuals: Tuple[str, ...]) -> bool:
def update_virtuals(self, virtuals: Iterable[str]) -> bool:
"""Update the list of provided virtuals"""
old = self.virtuals
self.virtuals = tuple(sorted(set(virtuals).union(self.virtuals)))
@@ -2118,20 +2118,20 @@ def cshort_spec(self):
return self.cformat(spec_format)
@property
def prefix(self):
def prefix(self) -> spack.util.prefix.Prefix:
if not self._concrete:
raise spack.error.SpecError("Spec is not concrete: " + str(self))
raise spack.error.SpecError(f"Spec is not concrete: {self}")
if self._prefix is None:
upstream, record = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
_, record = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
if record and record.path:
self.prefix = record.path
self.set_prefix(record.path)
else:
self.prefix = spack.store.STORE.layout.path_for_spec(self)
self.set_prefix(spack.store.STORE.layout.path_for_spec(self))
assert self._prefix is not None
return self._prefix
@prefix.setter
def prefix(self, value):
def set_prefix(self, value: str) -> None:
self._prefix = spack.util.prefix.Prefix(llnl.path.convert_to_platform_path(value))
def spec_hash(self, hash):
@@ -2145,7 +2145,9 @@ def spec_hash(self, hash):
if hash.override is not None:
return hash.override(self)
node_dict = self.to_node_dict(hash=hash)
json_text = sjson.dump(node_dict)
json_text = json.dumps(
node_dict, ensure_ascii=True, indent=None, separators=(",", ":"), sort_keys=False
)
# This implements "frankenhashes", preserving the last 7 characters of the
# original hash when splicing so that we can avoid relocation issues
out = spack.util.hash.b32_hash(json_text)
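
For illustration, the compact separators make the hashed payload whitespace-free; with a hypothetical node dict:

    import json
    json.dumps({"name": "zlib", "version": "1.3.1"}, ensure_ascii=True, indent=None,
               separators=(",", ":"), sort_keys=False)
    # -> '{"name":"zlib","version":"1.3.1"}'
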
@@ -2735,7 +2737,7 @@ def spec_and_dependency_types(
return spec_builder(spec_dict)
@staticmethod
def from_dict(data):
def from_dict(data) -> "Spec":
"""Construct a spec from JSON/YAML.
Args:
@@ -2758,7 +2760,7 @@ def from_dict(data):
return spec
@staticmethod
def from_yaml(stream):
def from_yaml(stream) -> "Spec":
"""Construct a spec from YAML.
Args:
@@ -2768,7 +2770,7 @@ def from_yaml(stream):
return Spec.from_dict(data)
@staticmethod
def from_json(stream):
def from_json(stream) -> "Spec":
"""Construct a spec from JSON.
Args:
@@ -2778,7 +2780,7 @@ def from_json(stream):
data = sjson.load(stream)
return Spec.from_dict(data)
except Exception as e:
raise sjson.SpackJSONError("error parsing JSON spec:", str(e)) from e
raise sjson.SpackJSONError("error parsing JSON spec:", e) from e
@staticmethod
def extract_json_from_clearsig(data):
@@ -2842,94 +2844,6 @@ def _patches_assigned(self):
return True
@staticmethod
def inject_patches_variant(root):
# This dictionary will store object IDs rather than Specs as keys
# since the Spec __hash__ will change as patches are added to them
spec_to_patches = {}
for s in root.traverse():
# After concretizing, assign namespaces to anything left.
# Note that this doesn't count as a "change". The repository
# configuration is constant throughout a spack run, and
# normalize and concretize evaluate Packages using Repo.get(),
# which respects precedence. So, a namespace assignment isn't
# changing how a package name would have been interpreted and
# we can do it as late as possible to allow as much
# compatibility across repositories as possible.
if s.namespace is None:
s.namespace = spack.repo.PATH.repo_for_pkg(s.name).namespace
if s.concrete:
continue
# Add any patches from the package to the spec.
patches = set()
for cond, patch_list in spack.repo.PATH.get_pkg_class(s.fullname).patches.items():
if s.satisfies(cond):
for patch in patch_list:
patches.add(patch)
if patches:
spec_to_patches[id(s)] = patches
# Also record all patches required on dependencies by
# depends_on(..., patch=...)
for dspec in root.traverse_edges(deptype=all, cover="edges", root=False):
if dspec.spec.concrete:
continue
pkg_deps = spack.repo.PATH.get_pkg_class(dspec.parent.fullname).dependencies
patches = []
for cond, deps_by_name in pkg_deps.items():
if not dspec.parent.satisfies(cond):
continue
dependency = deps_by_name.get(dspec.spec.name)
if not dependency:
continue
for pcond, patch_list in dependency.patches.items():
if dspec.spec.satisfies(pcond):
patches.extend(patch_list)
if patches:
all_patches = spec_to_patches.setdefault(id(dspec.spec), set())
for patch in patches:
all_patches.add(patch)
for spec in root.traverse():
if id(spec) not in spec_to_patches:
continue
patches = list(lang.dedupe(spec_to_patches[id(spec)]))
mvar = spec.variants.setdefault("patches", vt.MultiValuedVariant("patches", ()))
mvar.value = tuple(p.sha256 for p in patches)
# FIXME: Monkey patches mvar to store patches order
full_order_keys = list(tuple(p.ordering_key) + (p.sha256,) for p in patches)
ordered_hashes = sorted(full_order_keys)
tty.debug(
"Ordered hashes [{0}]: ".format(spec.name)
+ ", ".join("/".join(str(e) for e in t) for t in ordered_hashes)
)
mvar._patches_in_order_of_appearance = list(t[-1] for t in ordered_hashes)
@staticmethod
def ensure_external_path_if_external(external_spec):
if external_spec.external_modules and not external_spec.external_path:
compiler = spack.compilers.compiler_for_spec(
external_spec.compiler, external_spec.architecture
)
for mod in compiler.modules:
md.load_module(mod)
# Get the path from the module the package can override the default
# (this is mostly needed for Cray)
pkg_cls = spack.repo.PATH.get_pkg_class(external_spec.name)
package = pkg_cls(external_spec)
external_spec.external_path = getattr(
package, "external_prefix", md.path_from_modules(external_spec.external_modules)
)
@staticmethod
def ensure_no_deprecated(root):
"""Raise if a deprecated spec is in the dag.
@@ -4701,17 +4615,6 @@ def constrain(self, other: "VariantMap") -> bool:
return changed
@property
def concrete(self):
"""Returns True if the spec is concrete in terms of variants.
Returns:
bool: True or False
"""
return self.spec._concrete or all(
v in self for v in spack.repo.PATH.get_pkg_class(self.spec.fullname).variant_names()
)
def copy(self) -> "VariantMap":
clone = VariantMap(self.spec)
for name, variant in self.items():
@@ -4838,33 +4741,51 @@ def merge_abstract_anonymous_specs(*abstract_specs: Spec):
return merged_spec
def reconstruct_virtuals_on_edges(spec):
"""Reconstruct virtuals on edges. Used to read from old DB and reindex.
def reconstruct_virtuals_on_edges(spec: Spec) -> None:
"""Reconstruct virtuals on edges. Used to read from old DB and reindex."""
virtuals_needed: Dict[str, Set[str]] = {}
virtuals_provided: Dict[str, Set[str]] = {}
for edge in spec.traverse_edges(cover="edges", root=False):
parent_key = edge.parent.dag_hash()
if parent_key not in virtuals_needed:
# Construct which virtuals are needed by parent
virtuals_needed[parent_key] = set()
try:
parent_pkg = edge.parent.package
except Exception as e:
warnings.warn(
f"cannot reconstruct virtual dependencies on {edge.parent.name}: {e}"
)
continue
Args:
spec: spec on which we want to reconstruct virtuals
"""
# Collect all possible virtuals
possible_virtuals = set()
for node in spec.traverse():
try:
possible_virtuals.update(
{x for x in node.package.dependencies if spack.repo.PATH.is_virtual(x)}
virtuals_needed[parent_key].update(
name
for name, when_deps in parent_pkg.dependencies_by_name(when=True).items()
if spack.repo.PATH.is_virtual(name)
and any(edge.parent.satisfies(x) for x in when_deps)
)
except Exception as e:
warnings.warn(f"cannot reconstruct virtual dependencies on package {node.name}: {e}")
if not virtuals_needed[parent_key]:
continue
# Assume all incoming edges to provider are marked with virtuals=
for vspec in possible_virtuals:
try:
provider = spec[vspec]
except KeyError:
# Virtual not in the DAG
child_key = edge.spec.dag_hash()
if child_key not in virtuals_provided:
virtuals_provided[child_key] = set()
try:
child_pkg = edge.spec.package
except Exception as e:
warnings.warn(
f"cannot reconstruct virtual dependencies on {edge.parent.name}: {e}"
)
continue
virtuals_provided[child_key].update(x.name for x in child_pkg.virtuals_provided)
if not virtuals_provided[child_key]:
continue
for edge in provider.edges_from_dependents():
edge.update_virtuals([vspec])
virtuals_to_add = virtuals_needed[parent_key] & virtuals_provided[child_key]
if virtuals_to_add:
edge.update_virtuals(virtuals_to_add)
class SpecfileReaderBase:

View File

@@ -569,7 +569,6 @@ def test_FetchCacheError_only_accepts_lists_of_errors():
def test_FetchCacheError_pretty_printing_multiple():
e = bindist.FetchCacheError([RuntimeError("Oops!"), TypeError("Trouble!")])
str_e = str(e)
print("'" + str_e + "'")
assert "Multiple errors" in str_e
assert "Error 1: RuntimeError: Oops!" in str_e
assert "Error 2: TypeError: Trouble!" in str_e

View File

@@ -388,7 +388,7 @@ def test_wrapper_variables(
root = spack.concretize.concretize_one("dt-diamond")
for s in root.traverse():
s.prefix = "/{0}-prefix/".format(s.name)
s.set_prefix(f"/{s.name}-prefix/")
dep_pkg = root["dt-diamond-left"].package
dep_lib_paths = ["/test/path/to/ex1.so", "/test/path/to/subdir/ex2.so"]
@@ -396,7 +396,7 @@ def test_wrapper_variables(
dep_libs = LibraryList(dep_lib_paths)
dep2_pkg = root["dt-diamond-right"].package
dep2_pkg.spec.prefix = str(installation_dir_with_headers)
dep2_pkg.spec.set_prefix(str(installation_dir_with_headers))
setattr(dep_pkg, "libs", dep_libs)
try:
@@ -542,7 +542,7 @@ def test_build_jobs_sequential_is_sequential():
spack.config.determine_number_of_jobs(
parallel=False,
max_cpus=8,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("command_line", {"config": {"build_jobs": 8}}),
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 8}}),
),
@@ -556,7 +556,7 @@ def test_build_jobs_command_line_overrides():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=1,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("command_line", {"config": {"build_jobs": 10}}),
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 1}}),
),
@@ -567,7 +567,7 @@ def test_build_jobs_command_line_overrides():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=100,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("command_line", {"config": {"build_jobs": 10}}),
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 100}}),
),
@@ -581,7 +581,7 @@ def test_build_jobs_defaults():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=10,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 1}})
),
)
@@ -591,7 +591,7 @@ def test_build_jobs_defaults():
spack.config.determine_number_of_jobs(
parallel=True,
max_cpus=10,
config=spack.config.Configuration(
config=spack.config.create_from(
spack.config.InternalConfigScope("defaults", {"config": {"build_jobs": 100}})
),
)

View File

@@ -403,8 +403,8 @@ def test_autoreconf_search_path_args_multiple(default_mock_concretization, tmpdi
aclocal_fst = str(tmpdir.mkdir("fst").mkdir("share").mkdir("aclocal"))
aclocal_snd = str(tmpdir.mkdir("snd").mkdir("share").mkdir("aclocal"))
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.prefix = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_one.set_prefix(str(tmpdir.join("fst")))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == [
"-I",
aclocal_fst,
@@ -422,8 +422,8 @@ def test_autoreconf_search_path_args_skip_automake(default_mock_concretization,
aclocal_snd = str(tmpdir.mkdir("snd").mkdir("share").mkdir("aclocal"))
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.name = "automake"
build_dep_one.prefix = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_one.set_prefix(str(tmpdir.join("fst")))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == ["-I", aclocal_snd]
@@ -434,7 +434,7 @@ def test_autoreconf_search_path_args_external_order(default_mock_concretization,
aclocal_snd = str(tmpdir.mkdir("snd").mkdir("share").mkdir("aclocal"))
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.external_path = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == [
"-I",
aclocal_snd,
@@ -447,8 +447,8 @@ def test_autoreconf_search_path_skip_nonexisting(default_mock_concretization, tm
"""Skip -I flags for non-existing directories"""
spec = default_mock_concretization("dttop")
build_dep_one, build_dep_two = spec.dependencies(deptype="build")
build_dep_one.prefix = str(tmpdir.join("fst"))
build_dep_two.prefix = str(tmpdir.join("snd"))
build_dep_one.set_prefix(str(tmpdir.join("fst")))
build_dep_two.set_prefix(str(tmpdir.join("snd")))
assert spack.build_systems.autotools._autoreconf_search_path_args(spec) == []

View File

@@ -210,7 +210,6 @@ def check_args_contents(cc, args, must_contain, must_not_contain):
"""
with set_env(SPACK_TEST_COMMAND="dump-args"):
cc_modified_args = cc(*args, output=str).strip().split("\n")
print(cc_modified_args)
for a in must_contain:
assert a in cc_modified_args
for a in must_not_contain:

View File

@@ -347,7 +347,6 @@ def test_get_spec_filter_list(mutable_mock_env_path, mutable_mock_repo):
for key, val in expectations.items():
affected_specs = ci.get_spec_filter_list(e1, touched, dependent_traverse_depth=key)
affected_pkg_names = set([s.name for s in affected_specs])
print(f"{key}: {affected_pkg_names}")
assert affected_pkg_names == val


@@ -12,7 +12,7 @@
build_env = SpackCommand("build-env")
@pytest.mark.parametrize("pkg", [("zlib",), ("zlib", "--")])
@pytest.mark.parametrize("pkg", [("pkg-c",), ("pkg-c", "--")])
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_it_just_runs(pkg):
build_env(*pkg)
@@ -38,7 +38,7 @@ def test_build_env_requires_a_spec(args):
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_dump(shell_as, shell, tmpdir):
with tmpdir.as_cwd():
build_env("--dump", _out_file, "zlib")
build_env("--dump", _out_file, "pkg-c")
with open(_out_file, encoding="utf-8") as f:
if shell == "pwsh":
assert any(line.startswith("$Env:PATH") for line in f.readlines())
@@ -51,7 +51,7 @@ def test_dump(shell_as, shell, tmpdir):
@pytest.mark.usefixtures("config", "mock_packages", "working_env")
def test_pickle(tmpdir):
with tmpdir.as_cwd():
build_env("--pickle", _out_file, "zlib")
build_env("--pickle", _out_file, "pkg-c")
environment = pickle.load(open(_out_file, "rb"))
assert isinstance(environment, dict)
assert "PATH" in environment


@@ -148,7 +148,7 @@ def test_update_key_index(
s = spack.concretize.concretize_one("libdwarf")
# Install a package
install(s.name)
install("--fake", s.name)
# Put installed package in the buildcache, which, because we're signing
# it, should result in the public key getting pushed to the buildcache
@@ -178,7 +178,7 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
s = spack.concretize.concretize_one("libdwarf")
# Install and generate build cache index
PackageInstaller([s.package], explicit=True).install()
PackageInstaller([s.package], fake=True, explicit=True).install()
metadata_file = spack.binary_distribution.tarball_name(s, ".spec.json")
@@ -214,13 +214,11 @@ def verify_mirror_contents():
if in_env_pkg in p:
found_pkg = True
if not found_pkg:
print("Expected to find {0} in {1}".format(in_env_pkg, dest_mirror_dir))
assert False
assert found_pkg, f"Expected to find {in_env_pkg} in {dest_mirror_dir}"
# Install a package and put it in the buildcache
s = spack.concretize.concretize_one(out_env_pkg)
install(s.name)
install("--fake", s.name)
buildcache("push", "-u", "-f", src_mirror_url, s.name)
env("create", "test")


@@ -23,7 +23,15 @@
@pytest.fixture
def can_fetch_versions(monkeypatch):
def no_add(monkeypatch):
def add_versions_to_pkg(pkg, version_lines, open_in_editor):
raise AssertionError("Should not be called")
monkeypatch.setattr(spack.cmd.checksum, "add_versions_to_pkg", add_versions_to_pkg)
@pytest.fixture
def can_fetch_versions(monkeypatch, no_add):
"""Fake successful version detection."""
def fetch_remote_versions(pkg, concurrency):
@@ -38,10 +46,6 @@ def get_checksums_for_versions(url_by_version, package_name, **kwargs):
def url_exists(url, curl=None):
return True
def add_versions_to_pkg(pkg, version_lines, open_in_editor):
raise AssertionError("Should not be called")
monkeypatch.setattr(spack.cmd.checksum, "add_versions_to_pkg", add_versions_to_pkg)
monkeypatch.setattr(
spack.package_base.PackageBase, "fetch_remote_versions", fetch_remote_versions
)
@@ -50,7 +54,7 @@ def add_versions_to_pkg(pkg, version_lines, open_in_editor):
@pytest.fixture
def cannot_fetch_versions(monkeypatch):
def cannot_fetch_versions(monkeypatch, no_add):
"""Fake unsuccessful version detection."""
def fetch_remote_versions(pkg, concurrency):
@@ -283,11 +287,7 @@ def test_checksum_versions(mock_packages, can_fetch_versions, monkeypatch):
assert "version(" in output
def test_checksum_missing_version(mock_packages, cannot_fetch_versions, monkeypatch):
def add_versions_to_pkg(pkg, version_lines, open_in_editor):
raise AssertionError("Should not be called")
monkeypatch.setattr(spack.cmd.checksum, "add_versions_to_pkg", add_versions_to_pkg)
def test_checksum_missing_version(mock_packages, cannot_fetch_versions):
output = spack_checksum("preferred-test", "99.99.99", fail_on_error=False)
assert "Could not find any remote versions" in output
output = spack_checksum("--add-to-package", "preferred-test", "99.99.99", fail_on_error=False)
@@ -295,9 +295,6 @@ def add_versions_to_pkg(pkg, version_lines, open_in_editor):
def test_checksum_deprecated_version(mock_packages, can_fetch_versions):
def add_versions_to_pkg(pkg, version_lines, open_in_editor):
raise AssertionError("Should not be called")
output = spack_checksum("deprecated-versions", "1.1.0", fail_on_error=False)
assert "Version 1.1.0 is deprecated" in output
output = spack_checksum(


@@ -28,6 +28,7 @@
from spack.ci.generator_registry import generator
from spack.cmd.ci import FAILED_CREATE_BUILDCACHE_CODE
from spack.database import INDEX_JSON_FILE
from spack.error import SpackError
from spack.schema.buildcache_spec import schema as specfile_schema
from spack.schema.database_index import schema as db_idx_schema
from spack.spec import Spec
@@ -170,7 +171,9 @@ def test_ci_generate_with_env(ci_generate_test, tmp_path, mock_binary_index):
url: https://my.fake.cdash
project: Not used
site: Nothing
"""
""",
"--artifacts-root",
str(tmp_path / "my_artifacts_root"),
)
yaml_contents = syaml.load(outputfile.read_text())
@@ -192,7 +195,7 @@ def test_ci_generate_with_env(ci_generate_test, tmp_path, mock_binary_index):
assert "variables" in yaml_contents
assert "SPACK_ARTIFACTS_ROOT" in yaml_contents["variables"]
assert yaml_contents["variables"]["SPACK_ARTIFACTS_ROOT"] == "jobs_scratch_dir"
assert yaml_contents["variables"]["SPACK_ARTIFACTS_ROOT"] == "my_artifacts_root"
def test_ci_generate_with_env_missing_section(ci_generate_test, tmp_path, mock_binary_index):
@@ -1062,7 +1065,7 @@ def test_ci_rebuild_index(
with open(tmp_path / "spec.json", "w", encoding="utf-8") as f:
f.write(concrete_spec.to_json(hash=ht.dag_hash))
install_cmd("--add", "-f", str(tmp_path / "spec.json"))
install_cmd("--fake", "--add", "-f", str(tmp_path / "spec.json"))
buildcache_cmd("push", "-u", "-f", mirror_url, "callpath")
ci_cmd("rebuild-index")
@@ -1322,44 +1325,50 @@ def test_ci_reproduce(
env.concretize()
env.write()
repro_dir.mkdir()
def fake_download_and_extract_artifacts(url, work_dir, merge_commit_test=True):
with working_dir(tmp_path), ev.Environment(".") as env:
if not os.path.exists(repro_dir):
repro_dir.mkdir()
job_spec = env.concrete_roots()[0]
with open(repro_dir / "archivefiles.json", "w", encoding="utf-8") as f:
f.write(job_spec.to_json(hash=ht.dag_hash))
job_spec = env.concrete_roots()[0]
with open(repro_dir / "archivefiles.json", "w", encoding="utf-8") as f:
f.write(job_spec.to_json(hash=ht.dag_hash))
artifacts_root = repro_dir / "jobs_scratch_dir"
pipeline_path = artifacts_root / "pipeline.yml"
artifacts_root = repro_dir / "scratch_dir"
pipeline_path = artifacts_root / "pipeline.yml"
ci_cmd(
"generate",
"--output-file",
str(pipeline_path),
"--artifacts-root",
str(artifacts_root),
)
job_name = gitlab_generator.get_job_name(job_spec)
with open(repro_dir / "repro.json", "w", encoding="utf-8") as f:
f.write(
json.dumps(
{
"job_name": job_name,
"job_spec_json": "archivefiles.json",
"ci_project_dir": str(repro_dir),
}
ci_cmd(
"generate",
"--output-file",
str(pipeline_path),
"--artifacts-root",
str(artifacts_root),
)
)
with open(repro_dir / "install.sh", "w", encoding="utf-8") as f:
f.write("#!/bin/sh\n\n#fake install\nspack install blah\n")
job_name = gitlab_generator.get_job_name(job_spec)
with open(repro_dir / "spack_info.txt", "w", encoding="utf-8") as f:
f.write(f"\nMerge {last_two_git_commits[1]} into {last_two_git_commits[0]}\n\n")
with open(repro_dir / "repro.json", "w", encoding="utf-8") as f:
f.write(
json.dumps(
{
"job_name": job_name,
"job_spec_json": "archivefiles.json",
"ci_project_dir": str(repro_dir),
}
)
)
def fake_download_and_extract_artifacts(url, work_dir):
pass
with open(repro_dir / "install.sh", "w", encoding="utf-8") as f:
f.write("#!/bin/sh\n\n#fake install\nspack install blah\n")
with open(repro_dir / "spack_info.txt", "w", encoding="utf-8") as f:
if merge_commit_test:
f.write(
f"\nMerge {last_two_git_commits[1]} into {last_two_git_commits[0]}\n\n"
)
else:
f.write(f"\ncommit {last_two_git_commits[1]}\n\n")
return "jobs_scratch_dir"
monkeypatch.setattr(ci, "download_and_extract_artifacts", fake_download_and_extract_artifacts)
rep_out = ci_cmd(
@@ -1375,6 +1384,64 @@ def fake_download_and_extract_artifacts(url, work_dir):
# Make sure we tell the user where it is when not in interactive mode
assert f"$ {repro_dir}/start.sh" in rep_out
# Ensure the correct commits are used
assert f"checkout_commit: {last_two_git_commits[0]}" in rep_out
assert f"merge_commit: {last_two_git_commits[1]}" in rep_out
# Test re-running in dirty working dir
with pytest.raises(SpackError, match=f"{repro_dir}"):
rep_out = ci_cmd(
"reproduce-build",
"https://example.com/api/v1/projects/1/jobs/2/artifacts",
"--working-dir",
str(repro_dir),
output=str,
)
# Cleanup between tests
shutil.rmtree(repro_dir)
# Test --use-local-head
rep_out = ci_cmd(
"reproduce-build",
"https://example.com/api/v1/projects/1/jobs/2/artifacts",
"--use-local-head",
"--working-dir",
str(repro_dir),
output=str,
)
# Make sure we are checking out the HEAD commit without a merge commit
assert "checkout_commit: HEAD" in rep_out
assert "merge_commit: None" in rep_out
# Test the case where the spack_info.txt is not a merge commit
monkeypatch.setattr(
ci,
"download_and_extract_artifacts",
lambda url, wd: fake_download_and_extract_artifacts(url, wd, False),
)
# Cleanup between tests
shutil.rmtree(repro_dir)
rep_out = ci_cmd(
"reproduce-build",
"https://example.com/api/v1/projects/1/jobs/2/artifacts",
"--working-dir",
str(repro_dir),
output=str,
)
# Make sure the script was generated
assert (repro_dir / "start.sh").exists()
# Make sure we tell the user where it is when not in interactive mode
assert f"$ {repro_dir}/start.sh" in rep_out
# Ensure the correct commit is used (different than HEAD)
assert f"checkout_commit: {last_two_git_commits[1]}" in rep_out
assert "merge_commit: None" in rep_out
@pytest.mark.parametrize(
"url_in,url_out",


@@ -335,7 +335,7 @@ def test_config_add_override_leaf_from_file(mutable_empty_config, tmpdir):
def test_config_add_update_dict_from_file(mutable_empty_config, tmpdir):
config("add", "packages:all:compiler:[gcc]")
config("add", "packages:all:require:['%gcc']")
# contents to add to file
contents = """spack:
@@ -357,7 +357,7 @@ def test_config_add_update_dict_from_file(mutable_empty_config, tmpdir):
expected = """packages:
all:
target: [x86_64]
compiler: [gcc]
require: ['%gcc']
"""
assert expected == output
@@ -606,7 +606,6 @@ def test_config_prefer_upstream(
packages = syaml.load(open(cfg_file, encoding="utf-8"))["packages"]
# Make sure only the non-default variants are set.
assert packages["all"] == {"compiler": ["gcc@=10.2.1"]}
assert packages["boost"] == {"variants": "+debug +graph", "version": ["1.63.0"]}
assert packages["dependency-install"] == {"version": ["2.0"]}
# Ensure that neither variant gets listed for hdf5, since they conflict


@@ -17,16 +17,16 @@
def test_deprecate(mock_packages, mock_archive, mock_fetch, install_mockery):
install("libelf@0.8.13")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.10")
all_installed = spack.store.STORE.db.query()
all_installed = spack.store.STORE.db.query("libelf")
assert len(all_installed) == 2
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
non_deprecated = spack.store.STORE.db.query("libelf")
all_available = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.ANY)
assert all_available == all_installed
assert non_deprecated == spack.store.STORE.db.query("libelf@0.8.13")
@@ -39,24 +39,24 @@ def test_deprecate_fails_no_such_package(mock_packages, mock_archive, mock_fetch
output = deprecate("-y", "libelf@0.8.10", "libelf@0.8.13", fail_on_error=False)
assert "Spec 'libelf@0.8.10' matches no installed packages" in output
install("libelf@0.8.10")
install("--fake", "libelf@0.8.10")
output = deprecate("-y", "libelf@0.8.10", "libelf@0.8.13", fail_on_error=False)
assert "Spec 'libelf@0.8.13' matches no installed packages" in output
def test_deprecate_install(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that the ```-i`` option allows us to deprecate in favor of a spec
that is not yet installed."""
install("libelf@0.8.10")
to_deprecate = spack.store.STORE.db.query()
def test_deprecate_install(mock_packages, mock_archive, mock_fetch, install_mockery, monkeypatch):
"""Tests that the -i option allows us to deprecate in favor of a spec
that is not yet installed.
"""
install("--fake", "libelf@0.8.10")
to_deprecate = spack.store.STORE.db.query("libelf")
assert len(to_deprecate) == 1
deprecate("-y", "-i", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
deprecated = spack.store.STORE.db.query(installed=InstallRecordStatus.DEPRECATED)
non_deprecated = spack.store.STORE.db.query("libelf")
deprecated = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.DEPRECATED)
assert deprecated == to_deprecate
assert len(non_deprecated) == 1
assert non_deprecated[0].satisfies("libelf@0.8.13")
@@ -64,8 +64,8 @@ def test_deprecate_install(mock_packages, mock_archive, mock_fetch, install_mock
def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Test that the deprecate command deprecates all dependencies properly."""
install("libdwarf@20130729 ^libelf@0.8.13")
install("libdwarf@20130207 ^libelf@0.8.10")
install("--fake", "libdwarf@20130729 ^libelf@0.8.13")
install("--fake", "libdwarf@20130207 ^libelf@0.8.10")
new_spec = spack.concretize.concretize_one("libdwarf@20130729^libelf@0.8.13")
old_spec = spack.concretize.concretize_one("libdwarf@20130207^libelf@0.8.10")
@@ -81,14 +81,14 @@ def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery
assert all_available == all_installed
assert sorted(all_available) == sorted(deprecated + non_deprecated)
assert sorted(non_deprecated) == sorted(list(new_spec.traverse()))
assert sorted(deprecated) == sorted(list(old_spec.traverse()))
assert sorted(non_deprecated) == sorted(new_spec.traverse())
assert sorted(deprecated) == sorted([old_spec, old_spec["libelf"]])
def test_uninstall_deprecated(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that we can still uninstall deprecated packages."""
install("libelf@0.8.13")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.10")
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
@@ -104,9 +104,9 @@ def test_uninstall_deprecated(mock_packages, mock_archive, mock_fetch, install_m
def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that we can re-deprecate a spec to change its deprecator."""
install("libelf@0.8.13")
install("libelf@0.8.12")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.12")
install("--fake", "libelf@0.8.10")
deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
@@ -117,8 +117,8 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
non_deprecated = spack.store.STORE.db.query("libelf")
all_available = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.ANY)
assert len(non_deprecated) == 2
assert len(all_available) == 3
@@ -129,9 +129,9 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that when a deprecator spec is deprecated, its deprecatee specs
are updated to point to the new deprecator."""
install("libelf@0.8.13")
install("libelf@0.8.12")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.12")
install("--fake", "libelf@0.8.10")
first_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
second_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.12")
@@ -144,8 +144,8 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
deprecate("-y", "libelf@0.8.12", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
non_deprecated = spack.store.STORE.db.query("libelf")
all_available = spack.store.STORE.db.query("libelf", installed=InstallRecordStatus.ANY)
assert len(non_deprecated) == 1
assert len(all_available) == 3
@@ -158,8 +158,8 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
def test_concretize_deprecated(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Tests that the concretizer throws an error if we concretize to a
deprecated spec"""
install("libelf@0.8.13")
install("libelf@0.8.10")
install("--fake", "libelf@0.8.13")
install("--fake", "libelf@0.8.10")
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")


@@ -127,16 +127,15 @@ def test_dev_build_before_until(tmpdir, install_mockery):
assert not_installed in out
def print_spack_cc(*args):
# Eat arguments and print environment variable to test
print(os.environ.get("CC", ""))
def _print_spack_short_spec(*args):
print(f"SPACK_SHORT_SPEC={os.environ['SPACK_SHORT_SPEC']}")
def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery, working_env):
monkeypatch.setattr(os, "execvp", print_spack_cc)
monkeypatch.setattr(os, "execvp", _print_spack_short_spec)
with tmpdir.as_cwd():
output = dev_build("-b", "edit", "--drop-in", "sh", "dev-build-test-install@0.0.0")
assert os.path.join("lib", "spack", "env") in output
assert "SPACK_SHORT_SPEC=dev-build-test-install@0.0.0" in output
def test_dev_build_fails_already_installed(tmpdir, install_mockery):


@@ -194,7 +194,7 @@ def test_diff_cmd(install_mockery, mock_fetch, mock_archive, mock_packages):
def test_load_first(install_mockery, mock_fetch, mock_archive, mock_packages):
"""Test with and without the --first option"""
install_cmd("mpileaks")
install_cmd("--fake", "mpileaks")
# Only one version of mpileaks will work
diff_cmd("mpileaks", "mpileaks")


@@ -1750,7 +1750,10 @@ def check_stage(spec):
spec = spack.concretize.concretize_one(spec)
for dep in spec.traverse():
stage_name = f"{stage_prefix}{dep.name}-{dep.version}-{dep.dag_hash()}"
assert os.path.isdir(os.path.join(root, stage_name))
if dep.external:
assert not os.path.exists(os.path.join(root, stage_name))
else:
assert os.path.isdir(os.path.join(root, stage_name))
check_stage("mpileaks")
check_stage("zmpi")
@@ -3075,11 +3078,10 @@ def test_stack_view_activate_from_default(
assert "FOOBAR=mpileaks" in shell
def test_envvar_set_in_activate(tmpdir, mock_fetch, mock_packages, mock_archive, install_mockery):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w", encoding="utf-8") as f:
f.write(
"""\
def test_envvar_set_in_activate(tmp_path, mock_packages, install_mockery):
spack_yaml = tmp_path / "spack.yaml"
spack_yaml.write_text(
"""
spack:
specs:
- cmake%gcc
@@ -3087,21 +3089,21 @@ def test_envvar_set_in_activate(tmpdir, mock_fetch, mock_packages, mock_archive,
set:
ENVAR_SET_IN_ENV_LOAD: "True"
"""
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
install()
)
test_env = ev.read("test")
output = env("activate", "--sh", "test")
env("create", "test", str(spack_yaml))
with ev.read("test"):
install("--fake")
assert "ENVAR_SET_IN_ENV_LOAD=True" in output
test_env = ev.read("test")
output = env("activate", "--sh", "test")
with test_env:
with spack.util.environment.set_env(ENVAR_SET_IN_ENV_LOAD="True"):
output = env("deactivate", "--sh")
assert "unset ENVAR_SET_IN_ENV_LOAD" in output
assert "ENVAR_SET_IN_ENV_LOAD=True" in output
with test_env:
with spack.util.environment.set_env(ENVAR_SET_IN_ENV_LOAD="True"):
output = env("deactivate", "--sh")
assert "unset ENVAR_SET_IN_ENV_LOAD" in output
def test_stack_view_no_activate_without_default(


@@ -233,21 +233,27 @@ def test_display_json_deps(database, capsys):
@pytest.mark.db
def test_find_format(database, config):
output = find("--format", "{name}-{^mpi.name}", "mpileaks")
assert set(output.strip().split("\n")) == set(
["mpileaks-zmpi", "mpileaks-mpich", "mpileaks-mpich2"]
)
assert set(output.strip().split("\n")) == {
"mpileaks-zmpi",
"mpileaks-mpich",
"mpileaks-mpich2",
}
output = find("--format", "{name}-{version}-{compiler.name}-{^mpi.name}", "mpileaks")
assert "installed package" not in output
assert set(output.strip().split("\n")) == set(
["mpileaks-2.3-gcc-zmpi", "mpileaks-2.3-gcc-mpich", "mpileaks-2.3-gcc-mpich2"]
)
assert set(output.strip().split("\n")) == {
"mpileaks-2.3-gcc-zmpi",
"mpileaks-2.3-gcc-mpich",
"mpileaks-2.3-gcc-mpich2",
}
output = find("--format", "{name}-{^mpi.name}-{hash:7}", "mpileaks")
elements = output.strip().split("\n")
assert set(e[:-7] for e in elements) == set(
["mpileaks-zmpi-", "mpileaks-mpich-", "mpileaks-mpich2-"]
)
assert set(e[:-7] for e in elements) == {
"mpileaks-zmpi-",
"mpileaks-mpich-",
"mpileaks-mpich2-",
}
# hashes are in base32
for e in elements:
@@ -348,7 +354,7 @@ def test_find_prefix_in_env(
"""Test `find` formats requiring concrete specs work in environments."""
env("create", "test")
with ev.read("test"):
install("--add", "mpileaks")
install("--fake", "--add", "mpileaks")
find("-p")
find("-l")
find("-L")


@@ -898,7 +898,6 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
specfile = "./spec.json"
with open(specfile, "w", encoding="utf-8") as f:
f.write(spec.to_json())
print(spec.to_json())
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")


@@ -42,7 +42,7 @@ def mock_pkg_git_repo(git, tmp_path_factory):
repo_dir = root_dir / "builtin.mock"
shutil.copytree(spack.paths.mock_packages_path, str(repo_dir))
repo_cache = spack.util.file_cache.FileCache(str(root_dir / "cache"))
repo_cache = spack.util.file_cache.FileCache(root_dir / "cache")
mock_repo = spack.repo.RepoPath(str(repo_dir), cache=repo_cache)
mock_repo_packages = mock_repo.repos[0].packages_path


@@ -5,9 +5,13 @@
import pytest
import spack.config
import spack.environment as ev
import spack.main
from spack.main import SpackCommand
repo = spack.main.SpackCommand("repo")
env = SpackCommand("env")
def test_help_option():
@@ -33,3 +37,33 @@ def test_create_add_list_remove(mutable_config, tmpdir):
repo("remove", "--scope=site", str(tmpdir))
output = repo("list", "--scope=site", output=str)
assert "mockrepo" not in output
def test_env_repo_path_vars_substitution(
tmpdir, install_mockery, mutable_mock_env_path, monkeypatch
):
"""Test Spack correctly substitues repo paths with environment variables when creating an
environment from a manifest file."""
monkeypatch.setenv("CUSTOM_REPO_PATH", ".")
# setup environment from spack.yaml
envdir = tmpdir.mkdir("env")
with envdir.as_cwd():
with open("spack.yaml", "w", encoding="utf-8") as f:
f.write(
"""\
spack:
specs: []
repos:
- $CUSTOM_REPO_PATH
"""
)
# creating env from manifest file
env("create", "test", "./spack.yaml")
# check that repo path was correctly substituted with the environment variable
current_dir = os.getcwd()
with ev.read("test") as newenv:
repos_specs = spack.config.get("repos", default={}, scope=newenv.scope_name)
assert current_dir in repos_specs


@@ -50,7 +50,7 @@ def test_list_long(capsys):
def test_list_long_with_pytest_arg(capsys):
with capsys.disabled():
output = spack_test("--list-long", cmd_test_py)
print(output)
assert "unit_test.py::\n" in output
assert "test_list" in output
assert "test_list_with_pytest_arg" in output


@@ -4,19 +4,31 @@
"""Tests for the `spack verify` command"""
import os
import platform
import pytest
import llnl.util.filesystem as fs
import spack.cmd.verify
import spack.concretize
import spack.installer
import spack.store
import spack.util.executable
import spack.util.spack_json as sjson
import spack.verify
from spack.main import SpackCommand
from spack.main import SpackCommand, SpackCommandError
verify = SpackCommand("verify")
install = SpackCommand("install")
def skip_unless_linux(f):
return pytest.mark.skipif(
str(platform.system()) != "Linux", reason="only tested on linux for now"
)(f)
def test_single_file_verify_cmd(tmpdir):
# Test the verify command interface to verifying a single file.
filedir = os.path.join(str(tmpdir), "a", "b", "c", "d")
@@ -36,15 +48,14 @@ def test_single_file_verify_cmd(tmpdir):
with open(manifest_file, "w", encoding="utf-8") as f:
sjson.dump({filepath: data}, f)
results = verify("-f", filepath, fail_on_error=False)
print(results)
results = verify("manifest", "-f", filepath, fail_on_error=False)
assert not results
os.utime(filepath, (0, 0))
with open(filepath, "w", encoding="utf-8") as f:
f.write("I changed.")
results = verify("-f", filepath, fail_on_error=False)
results = verify("manifest", "-f", filepath, fail_on_error=False)
expected = ["hash"]
mtime = os.stat(filepath).st_mtime
@@ -55,7 +66,7 @@ def test_single_file_verify_cmd(tmpdir):
assert filepath in results
assert all(x in results for x in expected)
results = verify("-fj", filepath, fail_on_error=False)
results = verify("manifest", "-fj", filepath, fail_on_error=False)
res = sjson.load(results)
assert len(res) == 1
errors = res.pop(filepath)
@@ -69,18 +80,68 @@ def test_single_spec_verify_cmd(tmpdir, mock_packages, mock_archive, mock_fetch,
prefix = s.prefix
hash = s.dag_hash()
results = verify("/%s" % hash, fail_on_error=False)
results = verify("manifest", "/%s" % hash, fail_on_error=False)
assert not results
new_file = os.path.join(prefix, "new_file_for_verify_test")
with open(new_file, "w", encoding="utf-8") as f:
f.write("New file")
results = verify("/%s" % hash, fail_on_error=False)
results = verify("manifest", "/%s" % hash, fail_on_error=False)
assert new_file in results
assert "added" in results
results = verify("-j", "/%s" % hash, fail_on_error=False)
results = verify("manifest", "-j", "/%s" % hash, fail_on_error=False)
res = sjson.load(results)
assert len(res) == 1
assert res[new_file] == ["added"]
@pytest.mark.requires_executables("gcc")
@skip_unless_linux
def test_libraries(tmp_path, install_mockery, mock_fetch):
gcc = spack.util.executable.which("gcc", required=True)
s = spack.concretize.concretize_one("libelf")
spack.installer.PackageInstaller([s.package]).install()
os.mkdir(s.prefix.bin)
# There are no ELF files so the verification should pass
verify("libraries", f"/{s.dag_hash()}")
# Now put main_with_rpath linking to libf.so inside the prefix and verify again. This should
# work because libf.so can be located in the rpath.
(tmp_path / "f.c").write_text("void f(void){return;}")
(tmp_path / "main.c").write_text("void f(void); int main(void){f();return 0;}")
gcc("-shared", "-fPIC", "-o", str(tmp_path / "libf.so"), str(tmp_path / "f.c"))
gcc(
"-o",
str(s.prefix.bin.main_with_rpath),
str(tmp_path / "main.c"),
"-L",
str(tmp_path),
f"-Wl,-rpath,{tmp_path}",
"-lf",
)
verify("libraries", f"/{s.dag_hash()}")
# Now put main_without_rpath linking to libf.so inside the prefix and verify again. This should
# fail because libf.so cannot be located in the rpath.
gcc(
"-o",
str(s.prefix.bin.main_without_rpath),
str(tmp_path / "main.c"),
"-L",
str(tmp_path),
"-lf",
)
with pytest.raises(SpackCommandError):
verify("libraries", f"/{s.dag_hash()}")
# Check the error message
msg = spack.cmd.verify._verify_libraries(s, [])
assert msg is not None and "libf.so => not found" in msg
# And check that we can make it pass by ignoring it.
assert spack.cmd.verify._verify_libraries(s, ["libf.so"]) is None
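For context, a minimal sketch of how the helper exercised above could be called outside the test, assuming an installed concrete Spec s (hypothetical) and the _verify_libraries(spec, ignore_list) signature shown in the assertions:

import spack.cmd.verify

error = spack.cmd.verify._verify_libraries(s, ["libf.so"])  # s is an installed, concrete Spec
if error is None:
    print("all shared library dependencies resolved")
else:
    print(error)  # expected to contain lines such as "libf.so => not found"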


@@ -2018,7 +2018,6 @@ def test_git_ref_version_is_equivalent_to_specified_version(self, git_ref):
s = Spec("develop-branch-version@git.%s=develop" % git_ref)
c = spack.concretize.concretize_one(s)
assert git_ref in str(c)
print(str(c))
assert s.satisfies("@develop")
assert s.satisfies("@0.1:")
@@ -2888,6 +2887,23 @@ def test_specifying_different_versions_build_deps(self):
assert any(x.satisfies(hdf5_str) for x in result.specs)
assert any(x.satisfies(pinned_str) for x in result.specs)
@pytest.mark.regression("44289")
def test_all_extensions_depend_on_same_extendee(self):
"""Tests that we don't reuse dependencies that bring in a different extendee"""
setuptools = spack.concretize.concretize_one("py-setuptools ^python@3.10")
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(
setup, [Spec("py-floating ^python@3.11")], reuse=list(setuptools.traverse())
)
assert len(result.specs) == 1
floating = result.specs[0]
assert all(setuptools.dag_hash() != x.dag_hash() for x in floating.traverse())
pythons = [x for x in floating.traverse() if x.name == "python"]
assert len(pythons) == 1 and pythons[0].satisfies("@3.11")
@pytest.mark.parametrize(
"v_str,v_opts,checksummed",
@@ -3238,3 +3254,54 @@ def test_spec_unification(unify, mutable_config, mock_packages):
maybe_fails = pytest.raises if unify is True else llnl.util.lang.nullcontext
with maybe_fails(spack.solver.asp.UnsatisfiableSpecError):
_ = spack.cmd.parse_specs([a_restricted, b], concretize=True)
def test_concretization_cache_roundtrip(use_concretization_cache, monkeypatch, mutable_config):
"""Tests whether we can write the results of a clingo solve to the cache
and load the same spec request from the cache to produce identical specs"""
# Force determinism:
# Solver setup is normally non-deterministic due to non-determinism in
# asp solver setup logic generation. The only other inputs to the cache keys are
# the .lp files, which are invariant over the course of this test.
# This method forces the same setup to be produced for the same specs
# which gives us a guarantee of cache hits, as it removes the only
# element of non-deterministic solver setup for the same spec
# Basically just a quick and dirty memoization
solver_setup = spack.solver.asp.SpackSolverSetup.setup
def _setup(self, specs, *, reuse=None, allow_deprecated=False):
if not getattr(_setup, "cache_setup", None):
cache_setup = solver_setup(self, specs, reuse=reuse, allow_deprecated=allow_deprecated)
setattr(_setup, "cache_setup", cache_setup)
return getattr(_setup, "cache_setup")
# monkeypatch our forced determinism setup method into solver setup
monkeypatch.setattr(spack.solver.asp.SpackSolverSetup, "setup", _setup)
assert spack.config.get("config:concretization_cache:enable")
# run one standard concretization to populate the cache and the setup method
# memoization
h = spack.concretize.concretize_one("hdf5")
# due to our forced determinism above, we should not be observing
# cache misses, assert that we're not storing any new cache entries
def _ensure_no_store(self, problem: str, result, statistics, test=False):
# always throw, we never want to reach this code path
assert False, "Concretization cache hit expected"
# Assert that we're actually hitting the cache
cache_fetch = spack.solver.asp.ConcretizationCache.fetch
def _ensure_cache_hits(self, problem: str):
result, statistics = cache_fetch(self, problem)
assert result, "Expected successful concretization cache hit"
assert statistics, "Expected statistics to be non null on cache hit"
return result, statistics
monkeypatch.setattr(spack.solver.asp.ConcretizationCache, "store", _ensure_no_store)
monkeypatch.setattr(spack.solver.asp.ConcretizationCache, "fetch", _ensure_cache_hits)
# ensure subsequent concretizations of the same spec produce the same spec
# object
for _ in range(5):
assert h == spack.concretize.concretize_one("hdf5")
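As a usage note, the cache exercised here is enabled through configuration; a minimal sketch using the same config keys as the use_concretization_cache fixture added later in this diff (the cache path is an illustrative value):

import spack.concretize
import spack.config

spack.config.set("config:concretization_cache:enable", True)
spack.config.set("config:concretization_cache:path", "/tmp/spack-concretization-cache")

# Repeated identical requests are expected to be served from the cache.
first = spack.concretize.concretize_one("hdf5")
second = spack.concretize.concretize_one("hdf5")
assert first == second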


@@ -33,6 +33,8 @@
import spack.util.path as spack_path
import spack.util.spack_yaml as syaml
from ..enums import ConfigScopePriority
# sample config data
config_low = {
"config": {
@@ -710,10 +712,7 @@ def assert_marked(obj):
def test_internal_config_from_data():
config = spack.config.Configuration()
# add an internal config initialized from an inline dict
config.push_scope(
config = spack.config.create_from(
spack.config.InternalConfigScope(
"_builtin", {"config": {"verify_ssl": False, "build_jobs": 6}}
)
@@ -1445,7 +1444,7 @@ def test_config_path_dsl(path, it_should_work, expected_parsed):
@pytest.mark.regression("48254")
def test_env_activation_preserves_config_scopes(mutable_mock_env_path):
def test_env_activation_preserves_command_line_scope(mutable_mock_env_path):
"""Check that the "command_line" scope remains the highest priority scope, when we activate,
or deactivate, environments.
"""
@@ -1469,3 +1468,51 @@ def test_env_activation_preserves_config_scopes(mutable_mock_env_path):
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
@pytest.mark.regression("48414")
@pytest.mark.regression("49188")
def test_env_activation_preserves_config_scopes(mutable_mock_env_path):
"""Check that the priority of scopes is respected when merging configuration files."""
custom_scope = spack.config.InternalConfigScope("custom_scope")
spack.config.CONFIG.push_scope(custom_scope, priority=ConfigScopePriority.CUSTOM)
expected_scopes_without_env = ["custom_scope", "command_line"]
expected_scopes_with_first_env = ["custom_scope", "env:test", "command_line"]
expected_scopes_with_second_env = ["custom_scope", "env:test-2", "command_line"]
def highest_priority_scopes(config, *, nscopes):
return list(config.scopes)[-nscopes:]
assert highest_priority_scopes(spack.config.CONFIG, nscopes=2) == expected_scopes_without_env
# Creating an environment pushes a new scope
ev.create("test")
with ev.read("test"):
assert (
highest_priority_scopes(spack.config.CONFIG, nscopes=3)
== expected_scopes_with_first_env
)
# No active environment pops the scope
with ev.no_active_environment():
assert (
highest_priority_scopes(spack.config.CONFIG, nscopes=2)
== expected_scopes_without_env
)
assert (
highest_priority_scopes(spack.config.CONFIG, nscopes=3)
== expected_scopes_with_first_env
)
# Switch the environment to another one
ev.create("test-2")
with ev.read("test-2"):
assert (
highest_priority_scopes(spack.config.CONFIG, nscopes=3)
== expected_scopes_with_second_env
)
assert (
highest_priority_scopes(spack.config.CONFIG, nscopes=3)
== expected_scopes_with_first_env
)
assert highest_priority_scopes(spack.config.CONFIG, nscopes=2) == expected_scopes_without_env
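A minimal sketch of the priority-aware scope API used in this test, assuming ConfigScopePriority is importable as spack.enums (the absolute form of the relative import above) and that the custom priority sits below the command-line scope, as the expected orderings indicate:

import spack.config
from spack.enums import ConfigScopePriority

scope = spack.config.InternalConfigScope("my_custom", {"config": {"build_jobs": 4}})
spack.config.CONFIG.push_scope(scope, priority=ConfigScopePriority.CUSTOM)

# Command-line settings still win over the custom scope, which in turn
# overrides lower-priority scopes.
print(spack.config.CONFIG.get("config:build_jobs"))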


@@ -66,6 +66,8 @@
from spack.main import SpackCommand
from spack.util.pattern import Bunch
from ..enums import ConfigScopePriority
mirror_cmd = SpackCommand("mirror")
@@ -339,6 +341,16 @@ def pytest_collection_modifyitems(config, items):
item.add_marker(skip_as_slow)
@pytest.fixture(scope="function")
def use_concretization_cache(mutable_config, tmpdir):
"""Enables the use of the concretization cache"""
spack.config.set("config:concretization_cache:enable", True)
# ensure we have an isolated concretization cache
new_conc_cache_loc = str(tmpdir.mkdir("concretization"))
spack.config.set("config:concretization_cache:path", new_conc_cache_loc)
yield
#
# These fixtures are applied to all tests
#
@@ -723,11 +735,23 @@ def configuration_dir(tmpdir_factory, linux_os):
def _create_mock_configuration_scopes(configuration_dir):
"""Create the configuration scopes used in `config` and `mutable_config`."""
return [
spack.config.InternalConfigScope("_builtin", spack.config.CONFIG_DEFAULTS),
spack.config.DirectoryConfigScope("site", str(configuration_dir.join("site"))),
spack.config.DirectoryConfigScope("system", str(configuration_dir.join("system"))),
spack.config.DirectoryConfigScope("user", str(configuration_dir.join("user"))),
spack.config.InternalConfigScope("command_line"),
(
ConfigScopePriority.BUILTIN,
spack.config.InternalConfigScope("_builtin", spack.config.CONFIG_DEFAULTS),
),
(
ConfigScopePriority.CONFIG_FILES,
spack.config.DirectoryConfigScope("site", str(configuration_dir.join("site"))),
),
(
ConfigScopePriority.CONFIG_FILES,
spack.config.DirectoryConfigScope("system", str(configuration_dir.join("system"))),
),
(
ConfigScopePriority.CONFIG_FILES,
spack.config.DirectoryConfigScope("user", str(configuration_dir.join("user"))),
),
(ConfigScopePriority.COMMAND_LINE, spack.config.InternalConfigScope("command_line")),
]
@@ -794,13 +818,11 @@ def mock_wsdk_externals(monkeypatch_session):
def concretize_scope(mutable_config, tmpdir):
"""Adds a scope for concretization preferences"""
tmpdir.ensure_dir("concretize")
mutable_config.push_scope(
with spack.config.override(
spack.config.DirectoryConfigScope("concretize", str(tmpdir.join("concretize")))
)
):
yield str(tmpdir.join("concretize"))
yield str(tmpdir.join("concretize"))
mutable_config.pop_scope()
spack.repo.PATH._provider_index = None
@@ -2126,8 +2148,7 @@ def _c_compiler_always_exists():
@pytest.fixture(scope="session")
def mock_test_cache(tmp_path_factory):
cache_dir = tmp_path_factory.mktemp("cache")
print(cache_dir)
return spack.util.file_cache.FileCache(str(cache_dir))
return spack.util.file_cache.FileCache(cache_dir)
class MockHTTPResponse(io.IOBase):


@@ -14,3 +14,5 @@ config:
checksum: true
dirty: false
locks: {1}
concretization_cache:
enable: false


@@ -161,7 +161,7 @@ def test_handle_unknown_package(temporary_store, config, mock_packages, tmp_path
"""
layout = temporary_store.layout
repo_cache = spack.util.file_cache.FileCache(str(tmp_path / "cache"))
repo_cache = spack.util.file_cache.FileCache(tmp_path / "cache")
mock_db = spack.repo.RepoPath(spack.paths.mock_packages_path, cache=repo_cache)
not_in_mock = set.difference(


@@ -519,7 +519,9 @@ def test_error_message_when_using_too_new_lockfile(tmp_path):
("when_possible", True),
],
)
def test_environment_concretizer_scheme_used(tmp_path, unify_in_lower_scope, unify_in_spack_yaml):
def test_environment_concretizer_scheme_used(
tmp_path, mutable_config, unify_in_lower_scope, unify_in_spack_yaml
):
"""Tests that "unify" settings in spack.yaml always take precedence over settings in lower
configuration scopes.
"""
@@ -533,10 +535,11 @@ def test_environment_concretizer_scheme_used(tmp_path, unify_in_lower_scope, uni
unify: {str(unify_in_spack_yaml).lower()}
"""
)
with spack.config.override("concretizer:unify", unify_in_lower_scope):
with ev.Environment(manifest.parent) as e:
assert e.unify == unify_in_spack_yaml
mutable_config.set("concretizer:unify", unify_in_lower_scope)
assert mutable_config.get("concretizer:unify") == unify_in_lower_scope
with ev.Environment(manifest.parent) as e:
assert mutable_config.get("concretizer:unify") == unify_in_spack_yaml
assert e.unify == unify_in_spack_yaml
@pytest.mark.parametrize("unify_in_config", [True, False, "when_possible"])


@@ -977,7 +977,6 @@ class MyBuildException(Exception):
def _install_fail_my_build_exception(installer, task, install_status, **kwargs):
print(task, task.pkg.name)
if task.pkg.name == "pkg-a":
raise MyBuildException("mock internal package build error for pkg-a")
else:


@@ -364,3 +364,44 @@ def test_fnmatch_multiple():
assert not regex.match("libbar.so.1")
assert not regex.match("libfoo.solibbar.so")
assert not regex.match("libbaz.so")
class TestPriorityOrderedMapping:
@pytest.mark.parametrize(
"elements,expected",
[
# Push out-of-order with explicit, and different, priorities
([("b", 2), ("a", 1), ("d", 4), ("c", 3)], ["a", "b", "c", "d"]),
# Push in-order with priority=None
([("a", None), ("b", None), ("c", None), ("d", None)], ["a", "b", "c", "d"]),
# Mix explicit and implicit priorities
([("b", 2), ("c", None), ("a", 1), ("d", None)], ["a", "b", "c", "d"]),
([("b", 10), ("c", None), ("a", -20), ("d", None)], ["a", "b", "c", "d"]),
([("b", 10), ("c", None), ("a", 20), ("d", None)], ["b", "c", "a", "d"]),
# Adding the same key twice with different priorities
([("b", 10), ("c", None), ("a", 20), ("d", None), ("a", -20)], ["a", "b", "c", "d"]),
# Adding the same key twice, no priorities
([("b", None), ("a", None), ("b", None)], ["a", "b"]),
],
)
def test_iteration_order(self, elements, expected):
"""Tests that the iteration order respects priorities, no matter the insertion order."""
m = llnl.util.lang.PriorityOrderedMapping()
for key, priority in elements:
m.add(key, value=None, priority=priority)
assert list(m) == expected
def test_reverse_iteration(self):
"""Tests that we can conveniently use reverse iteration"""
m = llnl.util.lang.PriorityOrderedMapping()
for key, value in [("a", 1), ("b", 2), ("c", 3)]:
m.add(key, value=value)
assert list(m) == ["a", "b", "c"]
assert list(reversed(m)) == ["c", "b", "a"]
assert list(m.keys()) == ["a", "b", "c"]
assert list(m.reversed_keys()) == ["c", "b", "a"]
assert list(m.values()) == [1, 2, 3]
assert list(m.reversed_values()) == [3, 2, 1]
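For illustration, a small usage sketch of the mapping exercised above, restricted to the operations the tests rely on (keys and priorities are arbitrary examples):

import llnl.util.lang

m = llnl.util.lang.PriorityOrderedMapping()
m.add("defaults", value=1, priority=-10)   # explicit low priority
m.add("site", value=2)                     # no priority: appended after existing entries
m.add("command_line", value=3, priority=10)

# Iteration follows priority order, regardless of insertion order.
assert list(m.keys()) == ["defaults", "site", "command_line"]
assert list(m.reversed_values()) == [3, 2, 1]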


@@ -2,6 +2,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import pickle
import stat
import pytest
@@ -223,3 +224,10 @@ def test_check_module_set_name(mutable_config):
with pytest.raises(spack.error.ConfigError, match=msg):
spack.cmd.modules.check_module_set_name("third")
@pytest.mark.parametrize("module_type", ["tcl", "lmod"])
def test_module_writers_are_pickleable(default_mock_concretization, module_type):
s = default_mock_concretization("mpileaks")
writer = spack.modules.module_types[module_type](s, "default")
assert pickle.loads(pickle.dumps(writer)).spec == s


@@ -34,7 +34,7 @@ def extra_repo(tmp_path_factory, request):
subdirectory: '{request.param}'
"""
)
repo_cache = spack.util.file_cache.FileCache(str(cache_dir))
repo_cache = spack.util.file_cache.FileCache(cache_dir)
return spack.repo.Repo(str(repo_dir), cache=repo_cache), request.param
@@ -194,7 +194,7 @@ def _repo_paths(repos):
repo_paths, namespaces = _repo_paths(repos)
repo_cache = spack.util.file_cache.FileCache(str(tmp_path / "cache"))
repo_cache = spack.util.file_cache.FileCache(tmp_path / "cache")
repo_path = spack.repo.RepoPath(*repo_paths, cache=repo_cache)
assert len(repo_path.repos) == len(namespaces)
assert [x.namespace for x in repo_path.repos] == namespaces
@@ -319,3 +319,48 @@ def test_get_repo(self, mock_test_cache):
# foo is not there, raise
with pytest.raises(spack.repo.UnknownNamespaceError):
repo.get_repo("foo")
def test_parse_package_api_version():
"""Test that we raise an error if a repository has a version that is not supported."""
# valid version
assert spack.repo._parse_package_api_version(
{"api": "v1.2"}, min_api=(1, 0), max_api=(2, 3)
) == (1, 2)
# too new and too old
with pytest.raises(
spack.repo.BadRepoError,
match=r"Package API v2.4 is not supported .* \(must be between v1.0 and v2.3\)",
):
spack.repo._parse_package_api_version({"api": "v2.4"}, min_api=(1, 0), max_api=(2, 3))
with pytest.raises(
spack.repo.BadRepoError,
match=r"Package API v0.9 is not supported .* \(must be between v1.0 and v2.3\)",
):
spack.repo._parse_package_api_version({"api": "v0.9"}, min_api=(1, 0), max_api=(2, 3))
# default to v1.0 if not specified
assert spack.repo._parse_package_api_version({}, min_api=(1, 0), max_api=(2, 3)) == (1, 0)
# if v1.0 support is dropped we should also raise
with pytest.raises(
spack.repo.BadRepoError,
match=r"Package API v1.0 is not supported .* \(must be between v2.0 and v2.3\)",
):
spack.repo._parse_package_api_version({}, min_api=(2, 0), max_api=(2, 3))
# finally test invalid input
with pytest.raises(spack.repo.BadRepoError, match="Invalid Package API version"):
spack.repo._parse_package_api_version({"api": "v2"}, min_api=(1, 0), max_api=(3, 3))
with pytest.raises(spack.repo.BadRepoError, match="Invalid Package API version"):
spack.repo._parse_package_api_version({"api": 2.0}, min_api=(1, 0), max_api=(3, 3))
def test_repo_package_api_version(tmp_path: pathlib.Path):
"""Test that we can specify the API version of a repository."""
(tmp_path / "example" / "packages").mkdir(parents=True)
(tmp_path / "example" / "repo.yaml").write_text(
"""\
repo:
namespace: example
"""
)
cache = spack.util.file_cache.FileCache(tmp_path / "cache")
assert spack.repo.Repo(str(tmp_path / "example"), cache=cache).package_api == (1, 0)


@@ -27,9 +27,7 @@ def check_spliced_spec_prefixes(spliced_spec):
text_file_path = os.path.join(node.prefix, node.name)
with open(text_file_path, "r", encoding="utf-8") as f:
text = f.read()
print(text)
for modded_spec in node.traverse(root=True, deptype=dt.ALL & ~dt.BUILD):
print(modded_spec)
assert modded_spec.prefix in text


@@ -17,7 +17,7 @@
def validate_spec_schema():
return {
"type": "object",
"validate_spec": True,
"additionalKeysAreSpecs": True,
"patternProperties": {r"\w[\w-]*": {"type": "string"}},
}
@@ -34,7 +34,7 @@ def module_suffixes_schema():
"type": "object",
"properties": {
"suffixes": {
"validate_spec": True,
"additionalKeysAreSpecs": True,
"patternProperties": {r"\w[\w-]*": {"type": "string"}},
}
},


@@ -427,6 +427,9 @@ def test_load_json_specfiles(specfile, expected_hash, reader_cls):
openmpi_edges = s2.edges_to_dependencies(name="openmpi")
assert len(openmpi_edges) == 1
# Check that virtuals have been reconstructed
assert "mpi" in openmpi_edges[0].virtuals
# The virtuals attribute must be a tuple, when read from a
# JSON or YAML file, not a list
for edge in s2.traverse_edges():


@@ -149,11 +149,8 @@ def test_reverse_environment_modifications(working_env):
os.environ.clear()
os.environ.update(start_env)
print(os.environ)
to_reverse.apply_modifications()
print(os.environ)
reversal.apply_modifications()
print(os.environ)
start_env.pop("UNSET")
assert os.environ == start_env


@@ -257,7 +257,6 @@ def test_core_lib_files():
names.append(os.path.join(test_dir, n))
for filename in names:
print("Testing %s" % filename)
source = read_pyfile(filename)
check_ast_roundtrip(source)


@@ -685,22 +685,6 @@ def test_str(self) -> None:
c["shared"] = BoolValuedVariant("shared", True)
assert str(c) == "+shared feebar=foo foo=bar,baz foobar=fee"
def test_concrete(self, mock_packages, config) -> None:
spec = Spec("pkg-a")
assert not VariantMap(spec).concrete
# concrete if associated spec is concrete
spec = spack.concretize.concretize_one(spec)
assert VariantMap(spec).concrete
# concrete if all variants are present (even if spec not concrete)
spec._mark_concrete(False)
assert spec.variants.concrete
# remove a variant to test the condition
del spec.variants["foo"]
assert not spec.variants.concrete
def test_disjoint_set_initialization_errors():
# Constructing from non-disjoint sets should raise an exception


@@ -133,7 +133,7 @@ def test_check_prefix_manifest(tmpdir):
spec = spack.spec.Spec("libelf")
spec._mark_concrete()
spec.prefix = prefix
spec.set_prefix(prefix)
results = spack.verify.check_spec_manifest(spec)
assert results.has_errors()


@@ -35,8 +35,8 @@ def test_view_with_spec_not_contributing_files(mock_packages, tmpdir):
a = Spec("pkg-a")
b = Spec("pkg-b")
a.prefix = os.path.join(tmpdir, "a")
b.prefix = os.path.join(tmpdir, "b")
a.set_prefix(os.path.join(tmpdir, "a"))
b.set_prefix(os.path.join(tmpdir, "b"))
a._mark_concrete()
b._mark_concrete()


@@ -68,7 +68,9 @@ def project_env_mods(
*specs: spack.spec.Spec, view, env: environment.EnvironmentModifications
) -> None:
"""Given a list of environment modifications, project paths changes to the view."""
prefix_to_prefix = {s.prefix: view.get_projection_for_spec(s) for s in specs if not s.external}
prefix_to_prefix = {
str(s.prefix): view.get_projection_for_spec(s) for s in specs if not s.external
}
# Avoid empty regex if all external
if not prefix_to_prefix:
return


@@ -5,16 +5,17 @@
import errno
import math
import os
import pathlib
import shutil
from typing import IO, Optional, Tuple
from typing import IO, Dict, Optional, Tuple, Union
from llnl.util.filesystem import mkdirp, rename
from llnl.util.filesystem import rename
from spack.error import SpackError
from spack.util.lock import Lock, ReadTransaction, WriteTransaction
def _maybe_open(path: str) -> Optional[IO[str]]:
def _maybe_open(path: Union[str, pathlib.Path]) -> Optional[IO[str]]:
try:
return open(path, "r", encoding="utf-8")
except OSError as e:
@@ -24,7 +25,7 @@ def _maybe_open(path: str) -> Optional[IO[str]]:
class ReadContextManager:
def __init__(self, path: str) -> None:
def __init__(self, path: Union[str, pathlib.Path]) -> None:
self.path = path
def __enter__(self) -> Optional[IO[str]]:
@@ -70,7 +71,7 @@ class FileCache:
"""
def __init__(self, root, timeout=120):
def __init__(self, root: Union[str, pathlib.Path], timeout=120):
"""Create a file cache object.
This will create the cache directory if it does not exist yet.
@@ -82,58 +83,60 @@ def __init__(self, root, timeout=120):
for cache files, this specifies how long Spack should wait
before assuming that there is a deadlock.
"""
self.root = root.rstrip(os.path.sep)
if not os.path.exists(self.root):
mkdirp(self.root)
if isinstance(root, str):
root = pathlib.Path(root)
self.root = root
self.root.mkdir(parents=True, exist_ok=True)
self._locks = {}
self._locks: Dict[Union[pathlib.Path, str], Lock] = {}
self.lock_timeout = timeout
def destroy(self):
"""Remove all files under the cache root."""
for f in os.listdir(self.root):
path = os.path.join(self.root, f)
if os.path.isdir(path):
shutil.rmtree(path, True)
for f in self.root.iterdir():
if f.is_dir():
shutil.rmtree(f, True)
else:
os.remove(path)
f.unlink()
def cache_path(self, key):
def cache_path(self, key: Union[str, pathlib.Path]):
"""Path to the file in the cache for a particular key."""
return os.path.join(self.root, key)
return self.root / key
def _lock_path(self, key):
def _lock_path(self, key: Union[str, pathlib.Path]):
"""Path to the file in the cache for a particular key."""
keyfile = os.path.basename(key)
keydir = os.path.dirname(key)
return os.path.join(self.root, keydir, "." + keyfile + ".lock")
return self.root / keydir / ("." + keyfile + ".lock")
def _get_lock(self, key):
def _get_lock(self, key: Union[str, pathlib.Path]):
"""Create a lock for a key, if necessary, and return a lock object."""
if key not in self._locks:
self._locks[key] = Lock(self._lock_path(key), default_timeout=self.lock_timeout)
self._locks[key] = Lock(str(self._lock_path(key)), default_timeout=self.lock_timeout)
return self._locks[key]
def init_entry(self, key):
def init_entry(self, key: Union[str, pathlib.Path]):
"""Ensure we can access a cache file. Create a lock for it if needed.
Return whether the cache file exists yet or not.
"""
cache_path = self.cache_path(key)
# Avoid using pathlib here to allow the logic below to
# function as is
# TODO: Maybe refactor the following logic for pathlib
exists = os.path.exists(cache_path)
if exists:
if not os.path.isfile(cache_path):
if not cache_path.is_file():
raise CacheError("Cache file is not a file: %s" % cache_path)
if not os.access(cache_path, os.R_OK):
raise CacheError("Cannot access cache file: %s" % cache_path)
else:
# if the file is hierarchical, make parent directories
parent = os.path.dirname(cache_path)
if parent.rstrip(os.path.sep) != self.root:
mkdirp(parent)
parent = cache_path.parent
if parent != self.root:
parent.mkdir(parents=True, exist_ok=True)
if not os.access(parent, os.R_OK | os.W_OK):
raise CacheError("Cannot access cache directory: %s" % parent)
@@ -142,7 +145,7 @@ def init_entry(self, key):
self._get_lock(key)
return exists
def read_transaction(self, key):
def read_transaction(self, key: Union[str, pathlib.Path]):
"""Get a read transaction on a file cache item.
Returns a ReadTransaction context manager and opens the cache file for
@@ -153,9 +156,11 @@ def read_transaction(self, key):
"""
path = self.cache_path(key)
return ReadTransaction(self._get_lock(key), acquire=lambda: ReadContextManager(path))
return ReadTransaction(
self._get_lock(key), acquire=lambda: ReadContextManager(path) # type: ignore
)
def write_transaction(self, key):
def write_transaction(self, key: Union[str, pathlib.Path]):
"""Get a write transaction on a file cache item.
Returns a WriteTransaction context manager that opens a temporary file
@@ -167,9 +172,11 @@ def write_transaction(self, key):
if os.path.exists(path) and not os.access(path, os.W_OK):
raise CacheError(f"Insufficient permissions to write to file cache at {path}")
return WriteTransaction(self._get_lock(key), acquire=lambda: WriteContextManager(path))
return WriteTransaction(
self._get_lock(key), acquire=lambda: WriteContextManager(path) # type: ignore
)
def mtime(self, key) -> float:
def mtime(self, key: Union[str, pathlib.Path]) -> float:
"""Return modification time of cache file, or -inf if it does not exist.
Time is in units returned by os.stat in the mtime field, which is
@@ -179,14 +186,14 @@ def mtime(self, key) -> float:
if not self.init_entry(key):
return -math.inf
else:
return os.stat(self.cache_path(key)).st_mtime
return self.cache_path(key).stat().st_mtime
def remove(self, key):
def remove(self, key: Union[str, pathlib.Path]):
file = self.cache_path(key)
lock = self._get_lock(key)
try:
lock.acquire_write()
os.unlink(file)
file.unlink()
except OSError as e:
# File not found is OK, so remove is idempotent.
if e.errno != errno.ENOENT:
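To illustrate the pathlib-based interface introduced above, a minimal usage sketch, assuming the read and write transactions still yield file objects the way existing callers use them (the cache location is illustrative):

import pathlib

from spack.util.file_cache import FileCache

cache = FileCache(pathlib.Path("/tmp/example-cache"))

cache.init_entry("index.json")  # ensures the entry is accessible and creates its lock
with cache.write_transaction("index.json") as (old, new):
    new.write("{}")

# Read it back under a read lock.
with cache.read_transaction("index.json") as f:
    contents = f.read() if f else None

print(cache.mtime("index.json"), contents)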


@@ -0,0 +1,212 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import fnmatch
import os
import re
from typing import IO, Dict, List
from llnl.util.filesystem import BaseDirectoryVisitor
from llnl.util.lang import stable_partition
import spack.util.elf as elf
#: Patterns for names of libraries that are allowed to be unresolved when *just* looking at RPATHs
#: added by Spack. These are libraries outside of Spack's control, and assumed to be located in
#: default search paths of the dynamic linker.
ALLOW_UNRESOLVED = [
# kernel
"linux-vdso.so.*",
"libselinux.so.*",
# musl libc
"ld-musl-*.so.*",
# glibc
"ld-linux*.so.*",
"ld64.so.*",
"libanl.so.*",
"libc.so.*",
"libdl.so.*",
"libm.so.*",
"libmemusage.so.*",
"libmvec.so.*",
"libnsl.so.*",
"libnss_compat.so.*",
"libnss_db.so.*",
"libnss_dns.so.*",
"libnss_files.so.*",
"libnss_hesiod.so.*",
"libpcprofile.so.*",
"libpthread.so.*",
"libresolv.so.*",
"librt.so.*",
"libSegFault.so.*",
"libthread_db.so.*",
"libutil.so.*",
# gcc -- this is required even with gcc-runtime, because e.g. libstdc++ depends on libgcc_s,
# but the binaries we copy from the compiler don't have an $ORIGIN rpath.
"libasan.so.*",
"libatomic.so.*",
"libcc1.so.*",
"libgcc_s.so.*",
"libgfortran.so.*",
"libgomp.so.*",
"libitm.so.*",
"liblsan.so.*",
"libquadmath.so.*",
"libssp.so.*",
"libstdc++.so.*",
"libtsan.so.*",
"libubsan.so.*",
# systemd
"libudev.so.*",
# cuda driver
"libcuda.so.*",
# intel-oneapi-runtime
"libur_loader.so.*",
]
def is_compatible(parent: elf.ElfFile, child: elf.ElfFile) -> bool:
return (
child.elf_hdr.e_type == elf.ELF_CONSTANTS.ET_DYN
and parent.is_little_endian == child.is_little_endian
and parent.is_64_bit == child.is_64_bit
and parent.elf_hdr.e_machine == child.elf_hdr.e_machine
)
def candidate_matches(current_elf: elf.ElfFile, candidate_path: bytes) -> bool:
try:
with open(candidate_path, "rb") as g:
return is_compatible(current_elf, elf.parse_elf(g))
except (OSError, elf.ElfParsingError):
return False
class Problem:
def __init__(
self, resolved: Dict[bytes, bytes], unresolved: List[bytes], relative_rpaths: List[bytes]
) -> None:
self.resolved = resolved
self.unresolved = unresolved
self.relative_rpaths = relative_rpaths
class ResolveSharedElfLibDepsVisitor(BaseDirectoryVisitor):
def __init__(self, allow_unresolved_patterns: List[str]) -> None:
self.problems: Dict[str, Problem] = {}
self._allow_unresolved_regex = re.compile(
"|".join(fnmatch.translate(x) for x in allow_unresolved_patterns)
)
def allow_unresolved(self, needed: bytes) -> bool:
try:
name = needed.decode("utf-8")
except UnicodeDecodeError:
return False
return bool(self._allow_unresolved_regex.match(name))
def visit_file(self, root: str, rel_path: str, depth: int) -> None:
# We work with byte strings for paths.
path = os.path.join(root, rel_path).encode("utf-8")
# For $ORIGIN interpolation: should not have a trailing dir separator.
origin = os.path.dirname(path)
# Retrieve the needed libs + rpaths.
try:
with open(path, "rb") as f:
parsed_elf = elf.parse_elf(f, interpreter=False, dynamic_section=True)
except (OSError, elf.ElfParsingError):
# Not dealing with an invalid ELF file.
return
# If there are no needed libs, all is good
if not parsed_elf.has_needed:
return
# Get the needed libs and rpaths (notice: byte strings)
# Don't force an encoding, because paths are just a bag of bytes.
needed_libs = parsed_elf.dt_needed_strs
rpaths = parsed_elf.dt_rpath_str.split(b":") if parsed_elf.has_rpath else []
# We only interpolate $ORIGIN, not $LIB and $PLATFORM, they're not really
# supported in general. Also remove empty paths.
rpaths = [x.replace(b"$ORIGIN", origin) for x in rpaths if x]
# Do not allow relative rpaths (they are relative to the current working directory)
rpaths, relative_rpaths = stable_partition(rpaths, os.path.isabs)
# If there's a / in the needed lib, it's opened directly, otherwise it needs
# a search.
direct_libs, search_libs = stable_partition(needed_libs, lambda x: b"/" in x)
# Do not allow relative paths in direct libs (they are relative to the current working
# directory)
direct_libs, unresolved = stable_partition(direct_libs, os.path.isabs)
resolved: Dict[bytes, bytes] = {}
for lib in search_libs:
if self.allow_unresolved(lib):
continue
for rpath in rpaths:
candidate = os.path.join(rpath, lib)
if candidate_matches(parsed_elf, candidate):
resolved[lib] = candidate
break
else:
unresolved.append(lib)
# Check if directly opened libs are compatible
for lib in direct_libs:
if candidate_matches(parsed_elf, lib):
resolved[lib] = lib
else:
unresolved.append(lib)
if unresolved or relative_rpaths:
self.problems[rel_path] = Problem(resolved, unresolved, relative_rpaths)
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
pass
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# There can be binaries in .spack/test which shouldn't be checked.
if rel_path == ".spack":
return False
return True
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
return False
def write(self, output: IO[str], *, indent=0, brief: bool = False) -> None:
indent_str = " " * indent
for path, problem in self.problems.items():
output.write(indent_str)
output.write(path)
output.write("\n")
if not brief:
for needed, full_path in problem.resolved.items():
output.write(indent_str)
output.write(" ")
if needed == full_path:
output.write(_decode_or_raw(needed))
else:
output.write(f"{_decode_or_raw(needed)} => {_decode_or_raw(full_path)}")
output.write("\n")
for not_found in problem.unresolved:
output.write(indent_str)
output.write(f" {_decode_or_raw(not_found)} => not found\n")
for relative_rpath in problem.relative_rpaths:
output.write(indent_str)
output.write(f" {_decode_or_raw(relative_rpath)} => relative rpath\n")
def _decode_or_raw(byte_str: bytes) -> str:
try:
return byte_str.decode("utf-8")
except UnicodeDecodeError:
return f"{byte_str!r}"
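
A minimal usage sketch for the checker above (illustrative only, not part of this changeset): it assumes llnl.util.filesystem.visit_directory_tree(root, visitor) walks a directory tree and calls the visitor's visit_file/before_visit_dir hooks; the install prefix path is a made-up example.

import sys

from llnl.util.filesystem import visit_directory_tree

# Reuse the default allow-list; entries are shell-style globs, so e.g. "libc.so.*"
# matches "libc.so.6" and glibc itself is never reported as unresolved.
visitor = ResolveSharedElfLibDepsVisitor(ALLOW_UNRESOLVED)
assert visitor.allow_unresolved(b"libc.so.6")

# Hypothetical install prefix -- substitute a real one.
visit_directory_tree("/opt/spack/linux-x86_64/zlib-1.3.1-abcdef", visitor)

# Any binary whose DT_NEEDED entries could not be resolved through its RPATHs,
# or that carries a relative RPATH, ends up in visitor.problems.
if visitor.problems:
    visitor.write(sys.stdout, indent=2)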

View File

@@ -10,7 +10,7 @@
COMMIT_VERSION = re.compile(r"^[a-f0-9]{40}$")
# Infinity-like versions. The order in the list implies the comparison rules
infinity_versions = ["stable", "trunk", "head", "master", "main", "develop"]
infinity_versions = ["stable", "nightly", "trunk", "head", "master", "main", "develop"]
iv_min_len = min(len(s) for s in infinity_versions)
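
A hedged sketch of what the added "nightly" entry implies for version comparison (assumes a Spack checkout on sys.path, and that entries later in infinity_versions compare higher, as the develop-at-the-end ordering suggests):

from spack.version import Version

# Infinity-like names rank above any numeric release.
assert Version("develop") > Version("999.9")

# With "nightly" inserted right after "stable", the expected order is
# stable < nightly < trunk < head < master < main < develop.
assert Version("stable") < Version("nightly") < Version("trunk")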

View File

@@ -426,7 +426,7 @@ developer-tools-aarch64-linux-gnu-build:
SPACK_CI_STACK_NAME: developer-tools-darwin
developer-tools-darwin-generate:
tags: [ "macos-sonoma", "apple-clang-16", "aarch64-macos" ]
tags: [ "macos-sequoia", "apple-clang-16", "aarch64-macos" ]
extends: [ ".developer-tools-darwin", ".generate-base"]
developer-tools-darwin-build:
@@ -686,7 +686,7 @@ ml-linux-aarch64-cuda-build:
SPACK_CI_STACK_NAME: ml-darwin-aarch64-mps
ml-darwin-aarch64-mps-generate:
tags: [ "macos-sonoma", "apple-clang-16", "aarch64-macos" ]
tags: [ "macos-sequoia", "apple-clang-16", "aarch64-macos" ]
extends: [ ".ml-darwin-aarch64-mps", ".generate-base"]
ml-darwin-aarch64-mps-build:
@@ -910,7 +910,7 @@ bootstrap-x86_64-linux-gnu-build:
SPACK_CI_STACK_NAME: bootstrap-aarch64-darwin
bootstrap-aarch64-darwin-generate:
tags: [ "macos-sonoma", "apple-clang-16", "aarch64-macos" ]
tags: [ "macos-sequoia", "apple-clang-16", "aarch64-macos" ]
extends: [.bootstrap-aarch64-darwin, .generate-base]
bootstrap-aarch64-darwin-build:

View File

@@ -20,8 +20,9 @@ ci:
- k=$CI_GPG_KEY_ROOT/intermediate_ci_signing_key.gpg; [[ -r $k ]] && spack gpg trust $k
- k=$CI_GPG_KEY_ROOT/spack_public_key.gpg; [[ -r $k ]] && spack gpg trust $k
script::
- - spack config blame mirrors
- spack --color=always --backtrace ci rebuild -j ${SPACK_BUILD_JOBS} --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
- - if [ -n "$SPACK_EXTRA_MIRROR" ]; then spack mirror add local "$SPACK_EXTRA_MIRROR"; fi
- spack config blame mirrors
- - spack --color=always --backtrace ci rebuild -j ${SPACK_BUILD_JOBS} --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
after_script:
- - cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true

View File

@@ -1,3 +1,5 @@
config:
build_stage:
- $spack/tmp/stage
install_tree:
root: $spack/opt/spack

View File

@@ -1,6 +1,20 @@
ci:
broken-tests-packages:
- mpich
- openmpi
- py-mpi4py
pipeline-gen:
- build-job-remove:
tags: [spack]
- build-job:
tags: [ "macos-sonoma", "apple-clang-16", "aarch64-macos" ]
tags: [ "macos-sequoia", "apple-clang-16", "aarch64-macos" ]
# after_script intended to ensure all stage files are properly cleaned up,
# including those that may have been created as read-only by `go mod`
# as part of installation of a golang package
# see: https://github.com/spack/spack/issues/49147
after_script-:
- - if [[ -d tmp ]] ; then chmod -R u+w tmp ; else echo tmp not found ; fi
- ./bin/spack clean -a
- build-job-remove:
image:: macos-run-on-metal

View File

@@ -52,10 +52,12 @@ spack:
packages:
acfl:
require:
- '%gcc target=aarch64'
- "%gcc"
- "target=aarch64"
gromacs:
require:
- gromacs@2024.3 %gcc ^armpl-gcc ^openmpi
- gromacs@2024.3 ^armpl-gcc ^openmpi
- "%gcc"
libfabric:
buildable: true
externals:
@@ -67,13 +69,14 @@ spack:
variants: ~lldb
mpas-model:
require:
- precision=single %gcc ^parallelio+pnetcdf
- precision=single ^parallelio+pnetcdf
- "%gcc"
mpich:
require:
- mpich pmi=pmi2 device=ch4 netmod=ofi +slurm
nvhpc:
require:
- nvhpc %gcc target=aarch64
- "target=aarch64"
openfoam:
require:
- openfoam ^scotch@6.0.9
@@ -85,7 +88,7 @@ spack:
# require:
# - one_of: ["palace cxxflags=\"-include cstdint\" ^fmt@9.1.0"]
pmix:
require: 'pmix@3:'
require: "pmix@3:"
quantum-espresso:
require:
- quantum-espresso@6.6 %gcc ^armpl-gcc

View File

@@ -82,6 +82,12 @@ spack:
require:
- lammps_sizes=bigbig +molecule +kspace +rigid +asphere +opt +openmp +openmp-package fft=mkl ^intel-oneapi-mkl
- one_of: [+intel target=x86_64_v4, target=x86_64_v3]
bison:
require:
- "%gcc"
boost:
require:
- "%gcc"
libfabric:
buildable: true
externals:

View File

@@ -25,8 +25,6 @@ spack:
ci:
pipeline-gen:
- build-job-remove:
tags: [spack, public]
- build-job:
variables:
CI_GPG_KEY_ROOT: /etc/protected-runner

View File

@@ -2,25 +2,19 @@ spack:
view: false
packages:
all:
require: target=x86_64_v3
definitions:
- default_specs:
- 'uncrustify build_system=autotools'
- 'uncrustify build_system=cmake'
- lz4 # MakefilePackage
- mpich~fortran # AutotoolsPackage
- py-setuptools # PythonPackage
- openjpeg # CMakePackage
- r-rcpp # RPackage
- ruby-rake # RubyPackage
- perl-data-dumper # PerlPackage
- arch:
- '%gcc'
require:
- target=x86_64_v3
specs:
- matrix:
- - $default_specs
- - $arch
- 'uncrustify build_system=autotools'
- 'uncrustify build_system=cmake'
- lz4 # MakefilePackage
- mpich~fortran # AutotoolsPackage
- py-setuptools # PythonPackage
- openjpeg # CMakePackage
- r-rcpp # RPackage
- ruby-rake # RubyPackage
- perl-data-dumper # PerlPackage
cdash:
build-group: Build Systems

View File

@@ -2,79 +2,75 @@ spack:
view: false
packages:
all:
require: target=aarch64
require:
- target=aarch64
prefer:
- '%gcc'
concretizer:
unify: true
reuse: false
definitions:
- default_specs:
# editors
- neovim~no_luajit
- py-pynvim
- emacs+json+native+treesitter # note, pulls in gcc
# - tree-sitter is a dep, should also have cli but no package
- nano # just in case
# tags and scope search helpers
- universal-ctags # only maintained ctags, works better with c++
- direnv
# runtimes and compilers
- python
- llvm+link_llvm_dylib~lld~lldb~polly+python build_type=MinSizeRel # for clangd, clang-format
- node-js # for editor plugins etc., pyright language server
- npm
- cmake
- libtool
- go # to build fzf, gh, hub
- rust+dev # fd, ripgrep, hyperfine, exa, rust-analyzer
- binutils+ld+gold+plugins # support linking with built gcc
# styling and lints
- astyle
- cppcheck
- uncrustify
- py-fprettify
- py-fortran-language-server
- py-python-lsp-server
# cli dev tools
- ripgrep
- gh
- fd
- bfs
- fzf
- tree
- jq
- py-yq
- hub
- ncdu
- eza
- lsd
- hyperfine
- htop
- tmux
- ccache
# ensure we can use a jobserver build and do this fast
- gmake
- ninja # should be @kitware, can't be because of meson requirement
- openssl certs=system # must be this, system external does not work
- libtree
- patchelf
- sed
- which
- elfutils
- fontconfig
- font-util
- gdb
- flex
- graphviz
- doxygen
- meson
- arch:
- '%gcc target=aarch64'
specs:
- matrix:
- - $default_specs
- - $arch
# editors
- neovim~no_luajit
- py-pynvim
- emacs+json+native+treesitter # note, pulls in gcc
# - tree-sitter is a dep, should also have cli but no package
- nano # just in case
# tags and scope search helpers
- universal-ctags # only maintained ctags, works better with c++
- direnv
# runtimes and compilers
- python
- llvm+link_llvm_dylib~lld~lldb~polly+python build_type=MinSizeRel # for clangd, clang-format
- node-js # for editor plugins etc., pyright language server
- npm
- cmake
- libtool
- go # to build fzf, gh, hub
- rust+dev # fd, ripgrep, hyperfine, exa, rust-analyzer
- binutils+ld+gold+plugins # support linking with built gcc
# styling and lints
- astyle
- cppcheck
- uncrustify
- py-fprettify
- py-fortran-language-server
- py-python-lsp-server
# cli dev tools
- ripgrep
- gh
- fd
- bfs
- fzf
- tree
- jq
- py-yq
- hub
- ncdu
- eza
- lsd
- hyperfine
- htop
- tmux
- ccache
# ensure we can use a jobserver build and do this fast
- gmake
- ninja # should be @kitware, can't be because of meson requirement
- openssl certs=system # must be this, system external does not work
- libtree
- patchelf
- sed
- which
- elfutils
- fontconfig
- font-util
- gdb
- flex
- graphviz
- doxygen
- meson
ci:
pipeline-gen:

View File

@@ -9,7 +9,7 @@ spack:
reuse: false
specs:
# editors
- neovim~no_luajit
#- neovim~no_luajit # build fails: https://github.com/spack/spack/pull/48453#issuecomment-2624788262
- py-pynvim
- emacs+json~native+treesitter # TODO native not supported until gcc builds on darwin
# - tree-sitter is a dep, should also have cli but no package
@@ -64,8 +64,6 @@ spack:
ci:
pipeline-gen:
- build-job-remove:
tags: [ spack, public ]
- build-job:
variables:
CI_GPG_KEY_ROOT: /etc/protected-runner

View File

@@ -1,86 +1,79 @@
spack:
view: false
packages:
all:
require:
- target=x86_64_v3
- ~cuda
- ~rocm
prefer:
- "%gcc"
concretizer:
unify: true
reuse: false
static_analysis: true
definitions:
- default_specs:
# editors
- neovim~no_luajit
- py-pynvim
- emacs+json+native+treesitter # note, pulls in gcc
# - tree-sitter is a dep, should also have cli but no package
- nano # just in case
# tags and scope search helpers
- universal-ctags # only maintained ctags, works better with c++
- direnv
# runtimes and compilers
- python
- llvm+link_llvm_dylib~lld~lldb~polly+python build_type=MinSizeRel # for clangd, clang-format
- node-js # for editor plugins etc., pyright language server
- npm
- cmake
- libtool
- go # to build fzf, gh, hub
- rust+dev # fd, ripgrep, hyperfine, exa, rust-analyzer
- binutils+ld+gold+plugins # support linking with built gcc
# styling and lints
- astyle
- cppcheck
- uncrustify
- py-fprettify
- py-fortran-language-server
- py-python-lsp-server
# cli dev tools
- ripgrep
- gh
- fd
- bfs
- fzf
- tree
- jq
- py-yq
- hub
- ncdu
- eza
- lsd
- hyperfine
- htop
- tmux
- ccache
# ensure we can use a jobserver build and do this fast
- gmake
- ninja # should be @kitware, can't be because of meson requirement
- openssl certs=system # must be this, system external does not work
- libtree
- patchelf
- sed
- which
- elfutils
- fontconfig
- font-util
- gdb
- flex
- graphviz
- doxygen
- meson
- arch:
- '%gcc target=x86_64_v3'
specs:
- matrix:
- - $default_specs
- - $arch
# editors
- neovim~no_luajit
- py-pynvim
- emacs+json+native+treesitter # note, pulls in gcc
# - tree-sitter is a dep, should also have cli but no package
- nano # just in case
# tags and scope search helpers
- universal-ctags # only maintained ctags, works better with c++
- direnv
# runtimes and compilers
- python
- llvm+link_llvm_dylib~lld~lldb~polly+python build_type=MinSizeRel # for clangd, clang-format
- node-js # for editor plugins etc., pyright language server
- npm
- cmake
- libtool
- go # to build fzf, gh, hub
- rust+dev # fd, ripgrep, hyperfine, exa, rust-analyzer
- binutils+ld+gold+plugins # support linking with built gcc
# styling and lints
- astyle
- cppcheck
- uncrustify
- py-fprettify
- py-fortran-language-server
- py-python-lsp-server
# cli dev tools
- ripgrep
- gh
- fd
- bfs
- fzf
- tree
- jq
- py-yq
- hub
- ncdu
- eza
- lsd
- hyperfine
- htop
- tmux
- ccache
# ensure we can use a jobserver build and do this fast
- gmake
- ninja # should be @kitware, can't be because of meson requirement
- openssl certs=system # must be this, system external does not work
- libtree
- patchelf
- sed
- which
- elfutils
- fontconfig
- font-util
- gdb
- flex
- graphviz
- doxygen
- meson
ci:
pipeline-gen:

View File

@@ -5,6 +5,7 @@ spack:
view: false
concretizer:
static_analysis: true
reuse: false
unify: false
@@ -14,8 +15,9 @@ spack:
packages:
all:
require: "%cce@18.0.0 target=x86_64_v3"
compiler: [cce]
require:
- target=x86_64_v3
- "%cce"
providers:
blas: [cray-libsci]
lapack: [cray-libsci]
@@ -23,6 +25,21 @@ spack:
tbb: [intel-tbb]
scalapack: [netlib-scalapack]
variants: +mpi
# Virtuals
blas:
require:
- cray-libsci
lapack:
require:
- cray-libsci
mpi:
require:
- cray-mpich
scalapack:
require:
- netlib-scalapack
ncurses:
require: +termlib ldflags=-Wl,--undefined-version
tbb:
@@ -33,21 +50,28 @@ spack:
variants: +python +filesystem +iostreams +system
elfutils:
variants: ~nls
require: "%gcc"
require:
- target=x86_64_v3
- "%gcc"
gcc-runtime:
require: "%gcc"
require:
- target=x86_64_v3
- "%gcc"
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
mgard:
require:
- target=x86_64_v3
- "@2023-01-10:"
mpich:
variants: ~wrapperrpath
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "~qt ^[virtuals=gl] osmesa"
require:
- "~qt ^[virtuals=gl] osmesa"
- target=x86_64_v3
trilinos:
require:
- one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack
@@ -58,6 +82,7 @@ spack:
- one_of: [~ml ~muelu ~zoltan2 ~teko, +ml +muelu +zoltan2 +teko]
- one_of: [+superlu-dist, ~superlu-dist]
- one_of: [+shylu, ~shylu]
- target=x86_64_v3
specs:
# CPU
@@ -140,8 +165,8 @@ spack:
# - gptune # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
# - hpctoolkit # dyninst requires %gcc
# - hpx max_cpu_count=512 networking=mpi # libxcrypt-4.4.35
# - lammps # lammps-20240829.1: Reversed (or previously applied) patch detected! Assume -R? [n]
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
# - lammps # lammps-20240829.1: Reversed (or previously applied) patch detected! Assume -R? [n]
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
# - mgard +serial +openmp +timing +unstructured ~cuda # mgard
# - nrm # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
# - nvhpc # requires %gcc

View File

@@ -7,7 +7,9 @@ spack:
packages:
all:
require: '%gcc target=neoverse_v2'
require:
- "%gcc"
- target=neoverse_v2
providers:
blas: [openblas]
mpi: [mpich]

View File

@@ -4,12 +4,13 @@ spack:
concretizer:
reuse: false
unify: false
static_analysis: false
packages:
all:
require:
- "target=x86_64_v3"
- "%oneapi"
- target=x86_64_v3
- '%oneapi'
providers:
blas: [openblas]
tbb: [intel-tbb]
@@ -38,32 +39,43 @@ spack:
xz:
variants: +pic
mpi:
require: 'mpich@4: target=x86_64_v3'
mpich:
require: '~wrapperrpath ~hwloc target=x86_64_v3'
require: intel-oneapi-mpi
intel-oneapi-mpi:
buildable: false
externals:
- spec: intel-oneapi-mpi@2021.13.1
prefix: /opt/intel/oneapi
unzip:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
binutils:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
variants: +ld +gold +headers +libiberty ~nls
llvm:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
ruby:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
rust:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
krb5:
require: '%gcc target=x86_64_v3'
papi:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
openssh:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
dyninst:
require: "%gcc target=x86_64_v3"
require:
- '%gcc target=x86_64_v3'
bison:
require: '%gcc target=x86_64_v3'
require:
- '%gcc target=x86_64_v3'
paraview:
require: "+examples %oneapi target=x86_64_v3"
require:
- +examples target=x86_64_v3
specs:
# CPU
@@ -128,7 +140,7 @@ spack:
- nccmp
- nco
- netlib-scalapack
- nrm
- nrm ^py-scipy cflags="-Wno-error=incompatible-function-pointer-types" # py-scipy@1.8.1 fails without cflags here
- nwchem
- omega-h
- openfoam

View File

@@ -7,7 +7,8 @@ spack:
packages:
all:
require: '%gcc target=x86_64_v3'
require:
- 'target=x86_64_v3'
providers:
blas: [openblas]
variants: +mpi
@@ -21,7 +22,9 @@ spack:
variants: threads=openmp
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 +examples ~qt ^[virtuals=gl] osmesa %gcc target=x86_64_v3"
require:
- "@5.11 +examples ~qt ^[virtuals=gl] osmesa"
- 'target=x86_64_v3'
# ROCm
comgr:

View File

@@ -9,7 +9,8 @@ spack:
packages:
all:
require:
- '%gcc target=x86_64_v3'
- "%gcc"
- target=x86_64_v3
variants: +mpi
mpi:
require:

View File

@@ -8,7 +8,9 @@ spack:
packages:
all:
require: '%gcc target=x86_64_v3'
require:
- "%gcc"
- target=x86_64_v3
providers:
blas: [openblas]
mpi: [mpich]

View File

@@ -9,13 +9,13 @@ spack:
- ~cuda
- ~rocm
mpi:
require: openmpi
require: mpich
openblas:
require: ~fortran
specs:
# Horovod
- py-horovod
# - py-horovod # https://github.com/spack/spack/pull/48453#issuecomment-2676023970
# Hugging Face
- py-transformers
@@ -45,18 +45,18 @@ spack:
- py-segmentation-models-pytorch
- py-timm
- py-torch
- py-torch-cluster
# - py-torch-cluster # https://github.com/spack/spack/pull/48453#issuecomment-2676023970
- py-torch-geometric
- py-torch-nvidia-apex
- py-torch-scatter
- py-torch-sparse
- py-torch-spline-conv
# - py-torch-nvidia-apex # https://github.com/spack/spack/pull/48453#issuecomment-2676023970
# - py-torch-scatter # https://github.com/spack/spack/pull/48453#issuecomment-2676023970
# - py-torch-sparse # https://github.com/spack/spack/pull/48453#issuecomment-2676023970
# - py-torch-spline-conv # https://github.com/spack/spack/pull/48453#issuecomment-2676023970
- py-torchaudio
- py-torchdata
- py-torchfile
- py-torchgeo
- py-torchmetrics
- py-torchtext
# - py-torchtext # https://github.com/spack/spack/pull/48453#issuecomment-2676023970
- py-torchvision
- py-vector-quantize-pytorch
@@ -84,8 +84,6 @@ spack:
ci:
pipeline-gen:
- build-job-remove:
tags: [ spack, public ]
- build-job:
variables:
CI_GPG_KEY_ROOT: /etc/protected-runner

Some files were not shown because too many files have changed in this diff.