Compare commits

462 Commits

Author SHA1 Message Date
dependabot[bot]
53b8f91c02 build(deps): bump docker/login-action from 3.2.0 to 3.3.0 (#45378)
Bumps [docker/login-action](https://github.com/docker/login-action) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](0d4c9c5ea7...9780b0c442)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 13:40:12 -04:00
Todd Gamblin
a841ddd00c spack pkg grep: don't warn when grepping for quoted strings (#45412)
The `Executable` class emits a warning when you pass quoted arguments to it:

```
> spack pkg grep '"namespace"'
==> Warning: Quotes in command arguments can confuse scripts like configure.
  The following arguments may cause problems when executed:
      "namespace"
  Quotes aren't needed because spack doesn't use a shell. Consider removing them.
  If multiple levels of quotation are required, use `ignore_quotes=True`.
```

This is to warn new package authors who aren't used to calling build commands in python.
It's not useful for `spack pkg grep`, where we really are passing args on the command
line, and if we pass a quoted string, we probably meant to.

- [x] make `ignore_quotes` an instance variable, not just an argument to `__call__`
- [x] set `ignore_quotes` to `True` in `spack pkg grep`
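
A minimal sketch of the shape of this change, assuming a heavily simplified `Executable` (the real class lives in `spack.util.executable`; details here are illustrative):

```
class Executable:
    """Simplified stand-in for spack.util.executable.Executable."""

    def __init__(self, name):
        self.name = name
        self.ignore_quotes = False  # now instance state, not only a __call__ kwarg

    def __call__(self, *args, **kwargs):
        # a per-call override still wins; otherwise use the instance default
        ignore_quotes = kwargs.pop("ignore_quotes", self.ignore_quotes)
        if not ignore_quotes and any('"' in a or "'" in a for a in args):
            print("==> Warning: Quotes in command arguments can confuse scripts.")
        # ... execute self.name with args, without a shell ...
```

`spack pkg grep` can then set `ignore_quotes = True` once on the grep executable instead of threading it through every call.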

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-24 08:11:32 -07:00
snehring
39455768b2 hybpiper: change package type, add version 2.1.8 (#45262)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-24 09:16:54 -05:00
afzpatel
e529a454eb CI: add ML ROCm stack (#45302)
* add ML ROCm stack

* add suggested changes

* remove py-torch and py-tensorflow-estimator

* add TF_ROCM_AMDGPU_TARGETS env variable and remove packages from pipeline

* remove py-jax and py-xgboost
2024-07-24 16:16:15 +02:00
AcriusWinter
1b5dc396e3 uftrace: change to executable declaration (#45403) 2024-07-23 18:32:06 -06:00
Wouter Deconinck
15a3ac0512 py-arrow: add v1.3.0 (switch to flit-core) (#45123) 2024-07-23 17:50:21 -06:00
Thomas Applencourt
52f149266f ruby: add v3.3.4 (#45334)
* Ruby Add 3.3.4

* Update package.py

* Update package.py
2024-07-23 17:38:52 -06:00
Teague Sterling
8d33c2e7c0 generate-ninja: new package (#45391)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-23 14:47:04 -07:00
Karol Krizka
b3d82dc3a8 ROOT should add_include_path virtual glu for consistency. (#45057) 2024-07-23 14:12:54 -05:00
eugeneswalker
0fb44529bb e4s rocm external ci stack: upgrade to v6.1.2 (#45356)
* e4s rocm external ci stack: upgrade to v6.1.2

* magma: add rocm-core 6.1.2
2024-07-23 09:52:52 -07:00
dependabot[bot]
6ea944bf17 build(deps): bump pytest from 8.2.2 to 8.3.1 in /lib/spack/docs (#45377)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.2.2 to 8.3.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.2.2...8.3.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:55 -04:00
dependabot[bot]
8c6177c47f build(deps): bump sphinx from 7.4.6 to 7.4.7 in /lib/spack/docs (#45376)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 7.4.6 to 7.4.7.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.4.6...v7.4.7)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:43 -04:00
dependabot[bot]
b65d9f1524 build(deps): bump docker/setup-buildx-action from 3.4.0 to 3.5.0 (#45379)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.4.0 to 3.5.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](4fd812986e...aa33708b10)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:18 -04:00
dependabot[bot]
03e5dddf24 build(deps): bump docker/build-push-action from 6.4.1 to 6.5.0 (#45380)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.4.1 to 6.5.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](1ca370b3a9...5176d81f87)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:04 -04:00
dependabot[bot]
7bb892f7b3 build(deps): bump docker/setup-qemu-action from 3.1.0 to 3.2.0 (#45381)
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](5927c834f5...49b3bc8e6b)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:52:42 -04:00
Wouter Deconinck
66ed8ebbd9 gh: convert to GoPackage (#45351)
* gh: convert to GoPackage

* gh: fix style
2024-07-23 11:49:05 -04:00
Wouter Deconinck
0d326f83b6 GoPackage: default -modcacherw to ensure stage cleanup (#45350) 2024-07-23 11:48:52 -04:00
Dom Heinzeller
fc0955b125 Update and clean up hdf-eos2 package.py to fix build errors with Intel oneAPI (#45399) 2024-07-23 07:54:12 -07:00
Mikael Simberg
13ba1b96c3 fmt: add 11.0.2 (#45396) 2024-07-23 07:46:45 -07:00
Mikael Simberg
d66d169027 Remove # generated comments from many packages, add some missing depends_on("cxx") (#45395) 2024-07-23 10:31:22 +02:00
Manuela Kuhn
6decd6aaa1 py-mne-bids: add new package (#45374) 2024-07-22 17:11:18 -06:00
AcriusWinter
3c0ffa8652 uftrace: new test API, add 0.16 with libtraceevent, fix build (#45364)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-07-22 16:31:38 -06:00
Christoph Junghans
4917e3f664 votca: add v2024.1 (#45341) 2024-07-22 14:22:46 -06:00
Adam J. Stewart
b2a14b456e py-numpy: add v2.0.1 (#45354) 2024-07-22 14:04:09 -06:00
Teague Sterling
ab1580a37f linux-pam: add v1.5.1, v1.5.3, v1.6.0, v1.6.1 and additional variants (#45344)
* package/linux-pam: dependencies
* Adding variants to linux-pam
* Updating linux-pam variants
* Fixing variants for linux-pam after testing
* clean up flag handling
* clean up terrible tabs
* cant use default_args for compiler dependencies
* Change selinux to off by default

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-22 12:57:39 -07:00
Teague Sterling
c1f979cd54 libwnck: new package (#44641) 2024-07-22 21:27:45 +02:00
Teague Sterling
d124338ecb libgudev: new package (#44639) 2024-07-22 21:24:42 +02:00
Brad Geltz
d001f14514 geopm: Add v3.1 and update the needed dependencies (#44793) 2024-07-22 21:23:11 +02:00
Jordan Galby
c43205d6de asciidoc: Fix asciidoc@10 install (#44926) 2024-07-22 21:16:08 +02:00
snehring
54f1af5a29 libint: Fix build for 2.6.0, add libs property (#45034)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-22 20:55:34 +02:00
Alex Seaton
350661f027 heyoka: add current 4.0.x and 5.0.0 releases (#45314) 2024-07-22 20:16:39 +02:00
Philipp Edelmann
3699df2651 rayleigh: add v1.2.0 (#45333) 2024-07-22 20:05:44 +02:00
Andrew W Elble
63197fea3e dedisp: fix preinstall: it only takes self (#45328)
too many arguments here, only takes "self".
2024-07-22 20:00:14 +02:00
Martin Lang
f044194b06 libvdwxc: Fix configure with gcc-14 (#45093) 2024-07-22 19:54:49 +02:00
William Moses
29302c13e9 Update Enzyme to 0.0.135 (#45346) 2024-07-22 09:35:40 -07:00
Diego Alvarez S.
c4808de2ff Add nextflow 24.04.3 (#45357) 2024-07-22 09:29:46 -07:00
Alex Leute
c390a4530e py-colored: Added new version (#45361) 2024-07-22 09:17:13 -07:00
Teague Sterling
be771d5d6f node-js: add v17.9.1, v20.15.0, v21.7.3, v22.3.0, v22.4.0 (#45007)
* Adding new versions and compilation conflict for nodejs
* Update failed version for gcc14
* Updating conflicts notes for correctness/clarity/format
* Applying spack-ized versions of fix in https://github.com/nodejs/node/issues/52223 to address CI failures
* Update fix-old-glibc-random-headers.patch
* Update package.py
* Update fix-old-glibc-random-headers.patch
* Update fix-old-glibc-random-headers.patch
* Adding conflict for older glibc
* Fixing patch for older systems, need to undef
* Removing overly strict conflicts

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-22 08:32:34 -07:00
Wouter Deconinck
8b45fa089e geant4: support Qt5 and Qt6 (#45352)
* geant4: support qt5 and qt6
* geant4: update conflict msg
2024-07-22 13:38:46 +01:00
dependabot[bot]
0d04223ccd build(deps): bump mypy from 1.10.1 to 1.11.0 in /lib/spack/docs (#45337)
Bumps [mypy](https://github.com/python/mypy) from 1.10.1 to 1.11.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.10.1...v1.11)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-20 11:25:31 -05:00
Peter Scheibel
5ef222d62f Testing: omit test on windows (#45340)
Re-disable a test that was enabled in #45031
2024-07-20 00:32:37 -07:00
Nicole C.
6810e9ed2e Windows Tests: enable more cmd tests on Windows (#45031)
* Several tests can be enabled on Windows with no changes to logic
  (either the test logic or logic being tested)
* Test for `spack location` requires modification of the test logic,
  but with a minor change can be enabled on Windows
2024-07-19 18:08:28 -07:00
Andrew W Elble
a6c638d0fa sqlite: fix AttributeError when +functions (#45327)
using self.compiler.cc_pic_flag here results in these errors:

==> sqlite: Executing phase: 'install'
==> Error: AttributeError: 'AutotoolsBuilder' object has no attribute 'compiler'

change it to self.pkg.compiler.cc_pic_flag instead.
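
As a sketch, that is the one-line change described above (the surrounding `cflags` context is illustrative):

```
# before: AutotoolsBuilder has no `compiler` attribute
cflags.append(self.compiler.cc_pic_flag)
# after: reach the compiler through the package object instead
cflags.append(self.pkg.compiler.cc_pic_flag)
```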
2024-07-19 16:24:22 -06:00
Rémi Lacroix
fa8a512945 NCCL: add version 2.22.3-1. (#45322) 2024-07-19 12:18:16 -06:00
Wouter Deconinck
24b73da9e6 docs: util/environment.py: use re.Pattern[str] instead of re (#45329)
* docs: util/environment.py: use `re.Pattern[str]` instead of `re`

* docs: sphinx==7.4.6
2024-07-19 20:03:07 +02:00
afzpatel
4447d3339c change 2.16-rocm-enhanced to 2.16.1-rocm-enhanced (#45320) 2024-07-19 09:45:31 -06:00
Alex Leute
d82c9e7f2a usearch: new version (#45308)
* usearch: new version
   Manual download no longer required for @12:
2024-07-19 01:42:58 -06:00
AcriusWinter
6828a7402a improved-rdock: new test API (#45239)
* improved-rdock: new test API
* Make test subpart names and or descriptions more accurate

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-18 16:06:45 -06:00
dmagdavector
a0d62a40dd sqlite: add some newer releases (#45297)
Included: 3.46.0 (most current), 3.45.3, 3.45.1 (for possible compat with Ubuntu 24.04 LTS), 3.44.2.
2024-07-18 15:06:24 -06:00
eugeneswalker
712dcf6b8d e4s ci: enable some disabled specs (#44934)
* e4s ci: enable some disabled specs

* comment out cp2k +cuda due to unsupported cuda_arch

* comment out dealii+cuda due to concretize error

* work through concretize errors

* e4s: comment out failing builds

* e4s stack: disabled non-building specs

* comment out failing specs

* comment out failing specs

* cleanup comments
2024-07-18 20:58:10 +00:00
Wouter Deconinck
ad1fc34199 rust: rework external find to require both rustc and cargo (#45286)
* rust: rework external find to require both rustc and cargo

* rust: handle unable to parse version

* [@spackbot] updating style on behalf of wdconinc

* rust: not x or not y -> not (x and y)

Co-authored-by: Alec Scott <hi@alecbcs.com>

* rust: pick first rustc found

Co-authored-by: Alec Scott <hi@alecbcs.com>

* rust: list comprehensions

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-07-18 14:19:50 -06:00
Adam J. Stewart
ab723b25d0 py-tensorflow: add v2.17.0 (#45201) 2024-07-18 12:54:52 -07:00
Wouter Deconinck
016673f419 py-cryptography: add v41.0.7, v42.0.8; py-setuptools-rust: add v1.7.0, v1.8.1, v1.9.0 (#45282) 2024-07-18 12:18:27 -07:00
Adam J. Stewart
7bc6d62e9b py-sphinx: add v7.4 (#45255) 2024-07-18 12:48:18 -06:00
Bill Williams
fca9cc3e0e Allow remapping of compiler names (#45299)
CCE in Spack corresponds to Cray on the Score-P configure line. Other compiler-name remappings can be added.

Co-authored-by: William Williams <william.williams@tu-dresden.de>
2024-07-18 11:48:51 -06:00
snehring
2a178bfbb0 mafft: add version 7.525 (#45258)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-18 10:25:53 -07:00
snehring
3381879358 spades: add version 4.0.0 and new variants (#45278)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-18 10:24:05 -07:00
Matt Thompson
ed9058618a mapl: add v2.47.1 (#45280)
* mapl: add 2.47.1
* Approve compiler depends_on
2024-07-18 10:21:15 -07:00
Harmen Stoppels
a4c99bad6a git packages: add language dep (#45294) 2024-07-18 19:17:54 +02:00
Pranav Sivaraman
f31f58ff26 magic-enum: add version 0.9.6 (#45284)
* magic-enum: add version 0.9.6
* magic-enum: add maintainer

---------

Co-authored-by: pranav-sivaraman <pranav-sivaraman@users.noreply.github.com>
2024-07-18 10:15:35 -07:00
Wouter Deconinck
f84918da4b harfbuzz: add v9.0.0 (#45290)
* harfbuzz: add v9.0.0
* harfbuzz: do not patch non-existing Makefile beyond v8
2024-07-18 10:10:18 -07:00
Richard Berger
80a237e250 netlib-lapack: add pic variant (#45291) 2024-07-18 10:06:25 -07:00
Cameron Smith
f52d3b26c3 pumi: language dependencies (#45301)
Signed-off-by: Cameron Smith <smithc11@rpi.edu>
2024-07-18 11:04:57 -06:00
AcriusWinter
2029d714a0 rocm-opencl: old to new test API (#45065)
* rocm-opencl: old to new test API
* Run tests from test stage directory

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-18 09:35:10 -07:00
Alec Scott
31ef1df74f go: remove invalid deps (#45279)
* go: remove invalid deps

* go: add dependencies sed and grep
2024-07-18 09:18:05 -07:00
Richard Berger
00ae96a7cb libmesh: add v1.7.1, and fixes (#45292)
* libmesh: add missing v1.7.1 release

* libmesh: avoid pulling in petsc if only +mpi or +metis is given

* libmesh: add shared variant

Co-authored-by: rbberger <rbberger@users.noreply.github.com>
2024-07-18 13:46:39 +02:00
Seth R. Johnson
8d2a6d6744 sethrj: update maintained package language dependencies (#45289) 2024-07-18 10:55:20 +02:00
Massimiliano Culpo
9443e31b1e Do not initialize previous store state in "use_store" (#45268)
The "use_store" context manager is used to swap the value
of a global variable (spack.store.STORE), while keeping
another global variable consistent (spack.config.CONFIG).

When doing that it tries to evaluate the previous value
of the store, if that was not done already. This is wrong,
since the configuration might be in an "intermediate" state
that was never meant to trigger side effects.

Remove that operation, and add a unit test to
prevent regressions.
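
A minimal sketch of the fixed pattern, with stand-in names for the globals (the real manager swaps spack.store.STORE while keeping spack.config.CONFIG consistent):

```
import contextlib

STORE = None  # stand-in for the global spack.store.STORE

@contextlib.contextmanager
def use_store(new_store):
    global STORE
    previous = STORE  # keep the previous reference as-is; never force-evaluate it
    STORE = new_store
    try:
        yield new_store
    finally:
        STORE = previous  # restore on exit
```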
2024-07-18 07:18:14 +02:00
Wouter Deconinck
2d8ca8af69 qt-*: add v6.7.1, v6.7.2 (#45288) 2024-07-17 22:03:13 -06:00
dependabot[bot]
de4d4695c4 build(deps): bump docker/build-push-action from 6.4.0 to 6.4.1 (#45283)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.4.0 to 6.4.1.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](a254f8ca60...1ca370b3a9)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-17 21:43:12 -06:00
MichaelLaufer
c8cf85223f py-pyfr: add v2.0.3 (#45274)
* py-pyfr: add v2.0.3
2024-07-17 18:36:40 -06:00
Wouter Deconinck
b869538544 environment: handle view root at existing directory better (#45263)
- remove empty dir if exists at view root
- error better if non-empty dir

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-07-17 23:17:30 +02:00
Adam J. Stewart
4710cbb281 py-lightning: setuptools required at run-time (#45260) 2024-07-17 11:11:42 -07:00
Massimiliano Culpo
9ae1014e55 Run minimization of weights only on known targets (#45269)
This prevents excessive output from clingo of the kind:

.../spack/lib/spack/spack/solver/concretize.lp:1640:5-11: info: tuple ignored:
  #sup@2
2024-07-17 11:10:00 -07:00
afzpatel
813c0dd031 hipsparselt, composable-kernel: add netlib-lapack test dependency and enable ck test (#45273)
* add netlib-lapack dependency to hipsparselt and enable ck test
* fix cmake args
2024-07-17 10:49:28 -07:00
Lucas Frérot
91071933d0 tamaas: added version 2.8.0 and petsc variant (#45267)
* tamaas: added version 2.8.0
* tamaas: added +petsc variant for extra solvers
2024-07-17 10:30:40 -07:00
snehring
df5bac3e6c giflib: remove convert call in doc generation (#45276)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-17 09:42:20 -07:00
Harmen Stoppels
7b9f8abce5 Add depends_on([c,cxx,fortran]) (#45217)
Add language dependencies `c`, `cxx`, and `fortran`.

These `depends_on` statements are auto-generated based on file extensions found
in source tarballs/zipfiles.

The `# generated` comment can be removed by package maintainers after
validating correctness.
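
For example, a package whose sources contain C and C++ files might gain directives like these (illustrative package name):

```
from spack.package import *

class Example(AutotoolsPackage):
    depends_on("c", type="build")    # generated
    depends_on("cxx", type="build")  # generated
```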
2024-07-17 16:07:43 +02:00
Harmen Stoppels
a2f9d4b6a1 pixman: unconditional --with-pic (#45272) 2024-07-17 16:02:50 +02:00
Harmen Stoppels
77e16d55c1 warpx: fix openpmd backward compat bound (#45271) 2024-07-17 15:31:43 +02:00
Stephen Nicholas Swatman
ecb2442566 detray: new package (#45024)
* detray: new package

This commit adds the detray package, a detector description library for
HEP experiments that is designed to be GPU-friendly.

* Update var/spack/repos/builtin/packages/detray/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/detray/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-17 07:42:19 -05:00
Alec Scott
89c0b4accf libgcrypt: conflict with darwin when @1.11.0 (#45264)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-07-16 23:45:13 -06:00
downloadico
8e5b51395a abinit: add version 10.0.7 (#45250)
* abinit: add version 10.0.7
* abinit: simplified version constraint for applying rm_march_settings_v9.patch
2024-07-16 15:08:29 -07:00
AcriusWinter
c2ada0f15a parflow: Old test method to new test method (#44933)
* parflow: Old test method to new test method
* add output checker
* made req. changes
2024-07-16 12:43:31 -07:00
afzpatel
6d3541c5fd fix hipblas test (#44666)
* fix hipblas test
* add rocm-openmp-extras dependencies
2024-07-16 12:39:08 -07:00
snehring
31e4149067 vasp: add new version 6.4.3 and rework package (#44937)
* vasp: add new version 6.4.3 and rework package
* vasp: remove redundant cuda dep
* vasp: bump fftlib variant version restriction
* vasp: honor the still existing scalapack variant

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-16 12:37:44 -07:00
otsukay
c9fba9ec79 fujitsu.patch is no longer needed for versions>=4.5 (#45154) 2024-07-16 12:18:39 -07:00
Wouter Deconinck
282627714e gaudi: depends_on python +dbm (#45238) 2024-07-16 12:13:17 -07:00
fpruvost
714dd783f9 pastix: new version v6.4.0 (#45246) 2024-07-16 12:06:30 -07:00
Harmen Stoppels
40b390903d gmake: generic CXX, fix build.sh, deprecate 4.0, add 4.1 (#45137)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-16 18:58:57 +02:00
Massimiliano Culpo
ce1b569b69 Fix order of deserialization in subprocess context (#45229)
Since the MetaPathFinder now owns a lazily constructed RepoPath object, we need to deserialize environments before the package that needs to be restored. Previously we were relying on globals being inconsistent in a way that happened to let the entire process work.
2024-07-16 10:15:29 -06:00
Harmen Stoppels
b539eb5aab concretizer: show input specs on error (#45245) 2024-07-16 14:04:56 +02:00
Seth R. Johnson
e992e1efbd Celeritas: new version 0.4.4 (#45234) 2024-07-16 04:23:02 -06:00
Alec Scott
33a52dd836 pass: switch to git based versions to fix changing checksum in tarball (#45237) 2024-07-16 04:18:09 -06:00
Wouter Deconinck
b5f06fb3bc py-mpmath: add v1.3.0; depends_on py-setuptools for old versions (#45232) 2024-07-16 04:07:40 -06:00
afzpatel
494817b616 correct test binary name (#45240) 2024-07-16 04:03:02 -06:00
Wouter Deconinck
02470a5aae geant4: add v11.3.0.beta (#45087)
* geant4: add v11.3.0.beta

* geant4: vecgeom@1.2.8: when 11.3:

* geant4-data: add v11.3.0

* g4particlexs: add v4.1

* g4emlow: add v8.6

* g4nudexlib, g4urrpt: add v1.0

* [@spackbot] updating style on behalf of wdconinc

* geant4: immediately deprecate geant4-11.3.0.beta

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-16 10:03:49 +01:00
Massimiliano Culpo
42232a8ab6 Fix error message for test log in child process (#45233)
If we don't have a log, we'll mask the real error with
another caused by using None as an argument to os.path.join
2024-07-16 06:58:36 +02:00
Matthieu Dorier
cb64df45c8 toml11: adds new versions (#45056) 2024-07-16 06:40:45 +02:00
Wouter Deconinck
a11da7bdb9 cmd/dependents.py: remove unused loop over all packages (#45166) 2024-07-16 06:38:01 +02:00
Matthew Lesko
9a22ae11c6 openmpi: fix pmix version check in v5 (#44928)
* OpenMPI 5 w/ PRRTE 3 series PMIX version check fix

OpenMPI fails to compile otherwise when targeting external PMIX 4.2.6 and likely others.

```
  >> 3369    base/ess_base_bootstrap.c:72:14: error: static declaration of 'pmix_getline' follows non-static declaration
     3370       72 | static char *pmix_getline(FILE *fp)
     3371          |              ^
     3372    /opt/pmix/include/pmix/src/util/pmix_string_copy.h:83:19: note: previous declaration is here
     3373       83 | PMIX_EXPORT char *pmix_getline(FILE *fp);
     3374          |                   ^
     3375    1 error generated.
  >> 3376    make[4]: *** [Makefile:820: base/ess_base_bootstrap.lo] Error 1
```

Upstream PRRTE fix (not released yet): https://github.com/openpmix/prrte/pull/1957
Upstream OpenMPI issue: https://github.com/open-mpi/ompi/issues/12537 ("fixed in next release")

Co-authored-by: Shahzeb Siddiqui <shahzebmsiddiqui@gmail.com>
2024-07-16 06:33:12 +02:00
Stephen Sachs
318a7e0e30 wrf: explicit conflict oneapi + older versions (#44787)
The patch which enables icx/ifx compilers is only added for `wrf@4.4:`. This PR prints a useful message at concretization time instead of failing the installation later on.

Co-authored-by: stephenmsachs <stephenmsachs@users.noreply.github.com>
2024-07-16 06:28:54 +02:00
Wouter Deconinck
e976f351f8 py-ipython: depends_on python +sqlite3 when @8: (#45231) 2024-07-16 06:26:41 +02:00
Wouter Deconinck
437341d40e py-nodeenv: depends_on python +ssl (#45225) 2024-07-16 06:23:21 +02:00
dependabot[bot]
9d7ea1a28b build(deps): bump docker/build-push-action from 6.3.0 to 6.4.0 (#45243)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.3.0 to 6.4.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](1a162644f9...a254f8ca60)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 06:14:27 +02:00
AcriusWinter
d85668f096 slate: changed stand-alone test from old to new API (#44953)
* slate: changed from old to new format
* make code tighter
* replace assert method
* SkipTest plus other cleanup

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-15 21:10:50 -06:00
Alex Richert
5c3a23a481 pixman: add shared, pic variants (#44889)
* Add shared/pic variants to pixman
* add +shared~pic conflict
2024-07-15 17:36:45 -07:00
AcriusWinter
8be1f26ac6 tix: old to new test API (#45223) 2024-07-15 17:31:02 -07:00
Adam J. Stewart
35bd21fc64 py-tensorflow-estimator: correct dependencies (#44185) 2024-07-15 22:12:16 +02:00
Adam J. Stewart
652170fb54 DCMTK: fix build with libtiff (#45213) 2024-07-15 22:02:19 +02:00
Harmen Stoppels
d4e6c29f25 unparser.py: remove print statements (#45235) 2024-07-15 21:55:11 +02:00
Stephen Nicholas Swatman
c12772e73f vecmem: add infrastructure for working with SYCL (#45058)
* vecmem: add infrastructure for working with SYCL

The vecmem package uses an unorthodox build system where, instead of
expecting a SYCL-capable compiler in the `CXX` environment variable, it
expects one in `SYCLCXX`. It also needs the correct SYCL flags to be
set. This commit adds a custom build environment for the vecmem package
which allows it to be built in this way. I've also added an extra CMake
flag to ensure that the build system doesn't download any unwanted
dependencies.
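
A sketch of the kind of build-environment hook this describes; `setup_build_environment` is Spack's standard hook, while the exact compiler and flag choices here are assumptions:

```
def setup_build_environment(self, env):
    # vecmem expects the SYCL-capable compiler in SYCLCXX rather than CXX
    env.set("SYCLCXX", self.compiler.cxx)  # illustrative compiler choice
    env.set("SYCLFLAGS", "-fsycl")         # illustrative SYCL flags
```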

* Update var/spack/repos/builtin/packages/vecmem/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-15 14:42:19 -05:00
pauleonix
a26ac1dbcc cuda: add v12.5.1 (#44342)
- Add explicit conflict on ppc64le for 12.5 and newer.
- Update/fix intel compiler conflict to reflect that intel@2021 is compatible
  only since 11.4.1 and not since 11.1.1.
- Add intel compiler conflicts to reflect strict support matrix since
  12.2.0.
2024-07-15 20:04:47 +02:00
Teague Sterling
2afaeba292 zip: add patch for gcc@14: (#45006) 2024-07-15 19:42:41 +02:00
Martin Lang
a14e76b98d FFTW: missing function declaration in pfft patch (#45095) 2024-07-15 18:50:35 +02:00
renjithravindrankannath
9a3a759ed3 Updating rocm-opencl to 6.1.2 (#45219) 2024-07-15 09:27:53 -07:00
Michael Kuhn
72f17d6961 rocksdb: add 9.4.0 (#45230) 2024-07-15 09:26:50 -07:00
Harmen Stoppels
1b967a9d98 iconv: remove requirement (#45206)
no longer necessary after 5c53973220
2024-07-15 17:45:32 +02:00
Harmen Stoppels
bb954390ec sqlite: fix url_for_version (#45228) 2024-07-15 17:45:15 +02:00
Wouter Deconinck
bf9b6940c9 abseil-cpp: patch to avoid googletest build dependency (#45103) 2024-07-15 17:09:21 +02:00
simonLeary42
22980b9e65 rust: update cmake dependency ranges (#45145) 2024-07-15 17:03:23 +02:00
Vanessasaurus
483426f771 flux-sched: add v0.36.0 (#45161) 2024-07-15 17:01:07 +02:00
rfbgo
1e5b976eb7 py-pytorch-lightning: add v2.0.7 (#45175) 2024-07-15 16:59:03 +02:00
Patrick Diehl
2aa6939b96 hpx: add instrumentation=thread_debug (#45199)
Co-authored-by: Hartmut Kaiser <hartmut.kaiser@gmail.com>
2024-07-15 14:47:13 +02:00
Simo Tuomisto
f7e601d352 google-cloud-cli: fix unquoted value in env variable (#45207) 2024-07-15 14:41:44 +02:00
Manuela Kuhn
c4082931e3 r-colourpicker: add 1.3.0 (#45209) 2024-07-15 14:39:28 +02:00
Julien Cortial
cee3e5436b perl-json: add optional dependency on perl-json-xs (#45050) 2024-07-15 14:23:25 +02:00
Adam J. Stewart
613fa56bfc py-shapely: add v2.0.5 (#45224) 2024-07-15 11:38:10 +02:00
Michael Kuhn
0752d94bbf libelf: fix build with GCC 14 (#45226) 2024-07-15 10:18:36 +02:00
afzpatel
3bf1a03760 py-tensorflow: change py-tensorflow@2.16-rocm-enhanced to use tarball instead of branch (#45218)
* change py-tensorflow@2.16-rocm-enhanced to use tarball instead of branch

* remove revert_fd6b0a4.patch and use github commit patch url
2024-07-13 12:19:19 +02:00
John W. Parent
e2844e2fef bootstrap ci: add exit code validation for windows (#45221) 2024-07-12 23:07:45 -06:00
AcriusWinter
2ca733bbc1 rocm-clang-ocl: old to new test API (#44938)
* rocm-clang-ocl: old to new test format
* Minor cleanup

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-12 15:56:38 -06:00
Dom Heinzeller
e2b6eca420 qt: Add support for compiling @5.15.14 with Intel oneAPI compilers (icx, icpx) (#45195) 2024-07-12 15:35:01 -06:00
Dom Heinzeller
67cb19614e Update fckit to build with Intel oneAPI compilers (icx, icpx) (#45196) 2024-07-12 13:37:58 -06:00
Manuela Kuhn
e464461c19 py-pyqt6: add v6.7.0 (#45212) 2024-07-12 11:03:28 -07:00
Manuela Kuhn
6efe88f7a1 py-multiecho: add v0.29 (#45216) 2024-07-12 10:42:04 -07:00
Harmen Stoppels
0ce35dafe1 Add c to the list of languages (#45191) 2024-07-12 15:25:41 +02:00
pauleonix
49e419b2df cuda: add maintainer (#45211) 2024-07-12 06:11:10 -07:00
Massimiliano Culpo
d9033d8dac llvm: detect short executable names (#45171)
Also, remove annotations for "ld.lld" and "lldb"
2024-07-12 14:03:00 +02:00
Harmen Stoppels
517b7fb0c9 directives: types, avoid redundant parsing (#45208) 2024-07-12 13:35:16 +02:00
Massimiliano Culpo
568e79a1e3 gcc: consider link when detecting compilers (#45169) 2024-07-12 11:30:56 +02:00
Harmen Stoppels
9c8846b37b Add pkg- prefix to builtin.mock a b c d ... (#45205) 2024-07-12 11:27:40 +02:00
Tamara Dahlgren
737b70cbbf Buildcache: remove deprecated --allow-root and preview subcommand (#45204) 2024-07-11 18:19:04 -07:00
Dom Heinzeller
03d2212881 Bug fix for mapl: configure mvapich2 (#45164)
* Bug fix for mapl: configure mvapich2
* Update var/spack/repos/builtin/packages/mapl/package.py

---------

Co-authored-by: Matt Thompson <fortran@gmail.com>
2024-07-11 15:03:37 -07:00
Lev Gorenstein
b8d10916af Remove some explicit dependencies (#45146)
As discussed in https://github.com/spack/spack/pull/44881#issuecomment-2218411735 a `spack install py-globus-cli` fails to concretize on an Ubuntu 22.04 under Windows WSL2 because of overly strict explicit dependencies.

Let's try to remove them here (since these are "just in case" and in all honesty should be handled by `py-globus-sdk` anyways).
2024-07-11 11:19:47 -07:00
Wouter Deconinck
4fe5f35c2f xrootd: add v5.7.0 (#45078)
* xrootd: add v5.7.0
* xrootd: new variant +ec, depends_on isa-l
2024-07-11 11:12:27 -07:00
Greg Sjaardema
cea1d8b935 seacas: new version to fix some portability bugs (#45179)
* Now builds with latest fmt release (11.0.1)
* Missing array include in nem_spread
* Fix timestep consistency check in file-per-rank case if one or more dbs have no timesteps.
2024-07-11 09:52:07 -07:00
dependabot[bot]
e7946a3a41 build(deps): bump actions/checkout from 4.1.6 to 4.1.7 (#44693)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.6 to 4.1.7.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](a5ac7e51b4...692973e3d9)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 15:32:04 +02:00
Harmen Stoppels
5c53973220 concretize.lp: drop 0 weight of external providers (#45025)
If an external happens to be a provider of anything, the solver would
set its weight to 0, meaning that it is most preferred, even if
packages.yaml config disagrees.

That was done so that `spack external find mpich` would be sufficient to
pick it up as an mpi provider.

That may have made sense for mpi specifically, but doesn't make sense
for other virtuals. For example `glibc` provides iconv, and is an
external by design, but it's better to use libiconv as a separate
package as a provider.

Therefore, drop this rule, and instead let users add config:

```
mpi:
  require: [mpich]
```

or

```
mpi:
  buildable: false
```

which is well-documented.
2024-07-11 15:29:56 +02:00
Harmen Stoppels
278a38f4af external find --not-buildable: mark virtuals (#45159)
This change makes `spack external find --not-buildable` mark virtuals
provided by detected packages as non-buildable, so that it's sufficient
for users to let spack detect say mpich and have the concretizer pick it
up as mpi provider even when openmpi is "more preferred".
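
The effect, expressed in packages.yaml terms, might look like this (spec and prefix invented):

```
packages:
  mpich:
    buildable: false
    externals:
    - spec: mpich@4.1.2
      prefix: /usr
  mpi:
    buildable: false  # the provided virtual is now marked non-buildable too
```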
2024-07-11 15:19:55 +02:00
Paolo
39bbedf517 acfl: update the headers property (#44653)
Consistently with ArmPL@24:, the include directory for acfl@24: has
changed to `include`. This change updates the headers property
accordingly, distinguishing the include path of releases prior to
24.04 from that of newer ones.
2024-07-11 11:20:34 +02:00
Massimiliano Culpo
2153f6056d checksum: fix circular imports on macOS (#45187) 2024-07-11 10:49:29 +02:00
Adam J. Stewart
2be9b41362 py-tensorboard: update numpy compatibility (#45092) 2024-07-11 10:32:35 +02:00
AcriusWinter
f9fa024fc5 rocm-cmake: changed test API from old to new (#44939)
* rocm-cmake: changed test format from old to new
* Rename cmake variable
* post-conflict resolution: remove remaining version check

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-11 00:39:08 -06:00
Jack Morrison
7c7ac27900 MPICH: Add version 4.2.2 (#45040) 2024-07-10 21:32:09 -07:00
Derek Ryan Strong
253e8b1f2a Add libjpeg-turbo v3.0.3, v3.0.2, v3.0.1 (#44990) 2024-07-10 21:29:36 -07:00
dependabot[bot]
60e75c9234 build(deps): bump docker/login-action from 3.1.0 to 3.2.0 (#44424)
Bumps [docker/login-action](https://github.com/docker/login-action) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](e92390c5fb...0d4c9c5ea7)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 05:42:04 +02:00
dependabot[bot]
fb89337b04 build(deps): bump actions/setup-python from 5.1.0 to 5.1.1 (#45182)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5.1.0 to 5.1.1.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](82c7e631bb...39cd14951b)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 05:39:08 +02:00
Kyle Knoepfel
53f71fc4a7 Use ROOT_LIBRARY_PATH and adjust other environment variables (#45109)
* Use ROOT_LIBRARY_PATH and adjust other environment variables

* Accommodate versions older than ROOT 6.26

* Use os instead of pathlib
2024-07-10 19:40:52 -06:00
AcriusWinter
12e7c1569c pumi: new test API (#45181)
* pumi: new test API

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-10 17:57:05 -06:00
Nicole C
2eb566b884 Spack on Windows: update dev_build tests to run on Windows (#45039) 2024-07-10 16:52:01 -07:00
Hariharan Devarajan
a7444873b9 brahma: add 0.0.4 and 0.0.5 (#45168)
* Added Release 0.0.4 and 0.0.5
* Changed requirement for gotcha
   use gotcha 1.0.5 for 0.0.2 and 0.0.3
* Combine gotcha 1.0.7 for master and develop

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-10 14:30:32 -06:00
Sreenivasa Murthy Kolam
397ff11d6d rpp package - fix the add_tests build failure for 6.1 rocm rel (#44738)
* rpp package - fix the add_tests build failure for 6.1 rel
* fix test build failure
2024-07-10 13:29:14 -07:00
renjithravindrankannath
285563ad01 Need to configure rsmiBindings.py.in similar to rsmiBindingsInit.py.in (#45131) 2024-07-10 13:21:28 -07:00
Teague Sterling
c3111ac0b4 py-janus: new package (#44520)
* py-janus: add v0.7.0, v1.0.0
* Incorporating changes from review, including:
  https://github.com/spack/spack/pull/44520#pullrequestreview-2095028464
2024-07-10 13:16:12 -07:00
HELICS-bot
6505e7e02a helics: Add version 3.5.3 (#45142)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-07-10 12:19:39 -07:00
Alex Richert
c82889058c bacio: recipe updates (#45150) 2024-07-10 12:18:23 -07:00
renjithravindrankannath
2f4c20567c Correcting sha256sum for 6.1.2 (#45152) 2024-07-10 12:13:03 -07:00
Adam J. Stewart
a7d6a1188b py-rtree: add v1.3.0 (#45157) 2024-07-10 12:09:56 -07:00
Matthieu Dorier
4f5244920f py-configspace: new versions (#45165) 2024-07-10 12:05:04 -07:00
Adam J. Stewart
feecb60b9e py-pyvista: declare numpy 2 support (#45158) 2024-07-10 12:03:38 -07:00
John W. Parent
4f18cab8d2 Cpuinfo: static build when on Windows (#44899)
* Mirror cpuinfo CI for msvc
2024-07-10 12:46:13 -05:00
Harmen Stoppels
e6f1b4e63a Avoid duplicate detectable tag (#45160)
In case of inheritance, the static `tags` property may be updated
multiple times, and it turns out builder classes magically inherit from
traditional package classes.
2024-07-10 18:14:28 +02:00
Stephen Nicholas Swatman
c458e985af Set LD_LIBRARY_PATH for OneAPI compiler (#45059)
While trying to build packages with the OneAPI compiler version 2024.1 I
ran into the following error, indicating that the compiler is unable to
find some necessary libraries:

```
/storage/Software/oneapi/2024.1/compiler/2024.1/bin/sycl-post-link: error
  while loading shared libraries: libonnxruntime.1.12.22.721.so: cannot open
  shared object file: No such file or directory

  icpx: error: unable to execute command: No such file or directory

  icpx: error: sycl-post-link command failed due to signal (use -v to see
  invocation)
```

Indeed, `libonnxruntime.1.12.22.721.so` does come bundled with the
OneAPI compiler, but it is not available in the build environment by
default. In this commit, I update the custom environment created by
OneAPI to include the `lib/` directory in which these libraries reside
in the `LD_LIBRARY_PATH` environment variable.
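
A hedged sketch of that environment update (the relative location of the lib/ directory is an assumption):

```
import os

def setup_custom_environment(self, pkg, env):
    # make the compiler's bundled shared libraries (e.g. libonnxruntime)
    # visible to helper tools like sycl-post-link
    bin_dir = os.path.dirname(self.cxx)  # directory containing icpx
    env.prepend_path("LD_LIBRARY_PATH", os.path.join(bin_dir, "..", "lib"))
```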
2024-07-10 06:50:07 -06:00
Stephen Nicholas Swatman
5f234e16d0 dfelibs: add Boost as a testing dependency (#45133)
In my enthusiasm to add dfelibs to Spack, I didn't realise that the
unit tests of dfelibs use Boost and, as such, Boost is required as a
testing dependency.
2024-07-10 06:32:45 -06:00
Massimiliano Culpo
9001e9328a Remove unnecessary copy.deepcopy calls (#45135) 2024-07-10 09:33:48 +02:00
AcriusWinter
b237ee3689 octopus: old to new test API (#45143)
* octopus: old to new test API
* Minor simplifications and cleanup

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-09 19:02:36 -07:00
Massimiliano Culpo
aa911eca40 Add compatibility of sequoia with previous macOS versions (#45127)
* Add compatibility of sequoia with previous macOS versions

* Add compatibility of sequoia with previous macOS versions
2024-07-09 17:48:24 -07:00
Wouter Deconinck
6362c615f5 git: add several new patch-level versions (#45107)
* git: add new patch-level versions

* git: deprecate older previous with broken git lfs
2024-07-09 15:43:31 -06:00
Mikael Simberg
544be90469 fmt: add 11.0.1 (#45089)
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
2024-07-09 13:21:50 -06:00
Peter Scheibel
56a1663cd9 spack find -c: search all concretized-but-not-installed specs (#44713)
Originally if you had `x -> y -> z`, and an env with `x` in its speclist that is concretized but not installed, then `spack find -c y` would not show anything. This was intended: `spack find` has until now only ever listed installed specs (and `-c` was for adding a preamble section about roots).

This changes `spack find` so:

* `-c` makes it search through all concretized specs in the env (in a sense it is anticipated that a concretized environment would serve as a "speculative" DB and users may want to query it like they query the DB outside of envs)
* Adds a `-i/--install-status` option, equivalent to `-I` from `spack spec`
* Shows install status for either `-c` or `-i`
* As a side effect of the prior point, `spack find -i` can now distinguish different installation states (upstream/external)

Examples:

```
$ spack find -r
==> In environment findtest
==> 1 root specs
 -  raja

==> 6 installed packages (not shown)
==> 12 concretized packages to be installed (not shown)
```

```
$ spack find
==> In environment findtest
==> 1 root specs
 -  raja

-- darwin-ventura-m1 / apple-clang@14.0.3 -----------------------
berkeley-db@18.1.40  bzip2@1.0.8  diffutils@3.10  gmake@4.4.1  gnuconfig@2022-09-17  libiconv@1.17
==> 6 installed packages
==> 12 concretized packages to be installed (show with `spack find -c`)
```

```
$ spack find -c
==> In environment findtest
==> 1 root specs
 -  raja

-- darwin-ventura-m1 / apple-clang@14.0.3 -----------------------
[+]  berkeley-db@18.1.40  [+]  bzip2@1.0.8      -   cmake@3.29.4  [+]  diffutils@3.10  [+]  gmake@4.4.1           [+]  libiconv@1.17   -   nghttp2@1.62.0   -   pkgconf@2.2.0    -   readline@8.2
 -   blt@0.6.2             -   camp@2024.02.1   -   curl@8.7.1     -   gdbm@1.23       [+]  gnuconfig@2022-09-17   -   ncurses@6.5     -   perl@5.38.2      -   raja@2024.02.2   -   zlib-ng@2.1.6
==> 6 installed packages
==> 12 concretized packages to be installed


```

```
$ spack -E find
...
==> 82 installed packages
```
2024-07-09 11:53:20 -07:00
Rocco Meli
f9a46d61fa charmpp: add v8.0.0 (#45097)
* charmpp v8.0.0

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-07-09 10:21:41 -06:00
Mikael Simberg
a81451ba1f pika: add v0.26.0 (#45104) 2024-07-09 10:01:57 -06:00
Rocco Meli
b11e370888 namd 3.0 (#45096) 2024-07-09 09:54:29 -06:00
Massimiliano Culpo
54ee7d4165 Remove the "install_mockery_mutable_config" fixture (#45129)
This fixture was introduced in #16429, and made
redundant in #39024
2024-07-09 11:23:49 +02:00
Massimiliano Culpo
15efcbe042 Fix conflicting use of config and mutable_config fixtures in unit tests (#45106)
and add a fixture to detect use of conflicting fixtures
2024-07-09 09:51:04 +02:00
Alec Scott
7c5fbee327 Improve organization of CI workflow scripts and pip requirements (#45037) 2024-07-09 04:46:09 +02:00
Satish Balay
b19c4cdcf6 petsc, py-petsc4py: add v3.21.3 (#44954)
* petsc, py-petsc4py: add v3.21.3
* py-petsc4py: requires cython v3 since v3.20
2024-07-08 15:40:36 -07:00
Harmen Stoppels
3212cf86f4 environments.rst: go from simple to advanced (#45004)
* environments.rst: go from simple to advanced
* improvements
* notes about activation
2024-07-08 15:36:18 -07:00
Auriane R
fbceae7773 [py-datasets] Add py-datasets version 2.20.0 (#44903)
* Add py-datasets version 2.20.0

* Add dependency requirements for version 2.20 + refactor

* Add missing tqdm and requests versions and to install latest py-datasets

* Add missing python requirements for 2.8.0 and 2.20.0
2024-07-08 15:21:14 -07:00
George Young
b921d1a920 gtfsort: new package (#45062)
* gtfsort: new rust package @0.2.2

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-07-08 15:07:22 -07:00
Robert Cohn
8128b549a5 [intel-oneapi-dpct] correct 2024.2.0 hash (#45100) 2024-07-08 13:46:55 -07:00
Alex Richert
7405d95035 ip2: deprecate package, fix sp dependency (#45064) 2024-07-08 22:18:15 +02:00
Harmen Stoppels
a04b12a3ef spec.py: print right deptype in tree (#45091)
Fix a bug where Spec.tree with cover=nodes reduces deptypes from all
in-edges, including from nodes not reachable from the root, which almost
always happens for concrete specs
2024-07-08 18:25:57 +02:00
Massimiliano Culpo
cbf8f2326d pinentry: add v1.3.1 (#45073) 2024-07-08 08:58:14 -07:00
Harmen Stoppels
297874bfed spec.py: fix __getitem__ looking outside of dag (#45090)
`Spec.__getitem__` queries dependent edges, which almost always point to
nodes outside the sub-dag considered. It should only ever look at edges
being traversed.
2024-07-08 14:53:51 +02:00
Massimiliano Culpo
74398d74ac Add type-hints to RepoPath (#45068)
* Also, fix a bug with use_repositories + import spack.pkg
2024-07-08 11:48:39 +02:00
afzpatel
cef9c36183 kripke: update version to 1.2.7 (#44791)
* initial commit to update kripke to 1.2.7
* fix style errors
2024-07-08 02:16:24 -06:00
Wouter Deconinck
5e7430975a zlib-ng: add v2.1.7, v2.2.1 (#45076) 2024-07-08 09:40:44 +02:00
Adam J. Stewart
1456f6dba1 py-scikit-learn: add v1.5.1 (#45016) 2024-07-08 09:29:11 +02:00
Hariharan Devarajan
daf74a60ca cpp-logger: add v0.0.4 (#45033) 2024-07-08 09:28:33 +02:00
Niclas Jansson
87df95c097 neko: add v0.8.0 (#45086)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-07-08 01:26:44 -06:00
Richard Berger
9b49576875 legion: bugfix for +cuda+cuda_unsupported_compiler (#45036)
When using a newer Clang for Kokkos than supported by a given CUDA version, the
CUDA compiler detection in Legion's CMake still needs to be passed
CMAKE_CUDA_FLAGS to pass the compiler check.
2024-07-08 09:25:09 +02:00
Hariharan Devarajan
065cbf79fc gotcha: add v1.0.7 (#45043) 2024-07-08 09:22:19 +02:00
Rocco Meli
09b89e87a4 DLA-Future-Fortran: add v0.2.0 (#45055) 2024-07-08 09:16:31 +02:00
Adam J. Stewart
ddab6156a6 py-numpy: add v2.0.0 (#44735) 2024-07-08 09:14:50 +02:00
Harry Sharma
10cdfff0d1 feat: add diamond@2.1.[8,9] (#45047) 2024-07-08 09:05:32 +02:00
Adam J. Stewart
3328416976 py-matplotlib: add v3.9.1 (#45060) 2024-07-08 08:59:10 +02:00
Wouter Deconinck
094a621f3c acts: add v35.1.0, v35.2.0 (#44963)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-08 08:46:57 +02:00
Wouter Deconinck
87ce5d8ccb py-webcolors: add v24.6.0 (#45075) 2024-07-08 08:45:15 +02:00
Wouter Deconinck
bcdc92e25f vc: add v1.4.5 (#45077) 2024-07-08 08:43:16 +02:00
Wouter Deconinck
a323fab135 util-linux{-uuid}: add v2.40.2 (#45079)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-08 08:40:14 +02:00
Wouter Deconinck
d42031b075 assimp: add v5.4.2 (#45081)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-08 08:33:06 +02:00
Jonathon Anderson
efbb18aa25 hpctoolkit: minor fixes for build failures (#45070) 2024-07-08 08:22:11 +02:00
Harmen Stoppels
8a430f89b3 spack -C <env>: use env config w/o activation (#45046)
Precedence:

1. Named environment
2. Anonymous environment
3. Generic directory
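
Hedged usage illustration (environment name and specs are made up):

```
# use an environment's configuration without activating it
$ spack -C myenv spec hdf5
$ spack -C /path/to/env-dir config get packages
```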
2024-07-06 22:02:25 -07:00
dependabot[bot]
aeaa922eef build(deps): bump actions/upload-artifact from 4.3.3 to 4.3.4 (#45069)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4.3.3 to 4.3.4.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](65462800fd...0b2256b8c0)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-06 08:20:30 +02:00
Hadrien G
a6d5a34be3 Remove myself from maintainer lists (#45071) 2024-07-06 08:17:31 +02:00
Todd Gamblin
ba79542f3c spack gc: remove debug print statement (#45067)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-05 22:36:45 +02:00
Auriane R
dc10c8a1ed [py-transformers] Add newer versions (#45022)
* Add newer versions for py-transformers

* Add dependencies needed for py-transformers latest version

* Enforce dependencies requirements for py-transformers newer versions
2024-07-05 14:52:24 +02:00
Auriane R
5ab814505e py-flash-attn: add v2.5.6 -> main (#44894)
* Add latest releases of py-flash-attn

* Add main branch for flash attention

* Add additional requirements
2024-07-05 06:19:01 -06:00
Harmen Stoppels
1d8bdcfc04 config: fix class hierarchy (#45044)
1. Avoid that `self.path` is of type `Optional[str]`
2. Simplify immutable config with a property.
2024-07-05 12:41:13 +02:00
Massimiliano Culpo
95cf341b50 Inject dependencies in repo classes (#45053) 2024-07-05 12:00:41 +02:00
Benjamin Fovet
a134485b1b remove trailing dot in gmsh 4.13.0 shasum (#45054) 2024-07-05 10:18:11 +02:00
dependabot[bot]
d39edeb9a1 build(deps): bump docker/setup-buildx-action from 3.3.0 to 3.4.0 (#45051)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.3.0 to 3.4.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](d70bba72b1...4fd812986e)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 19:28:35 -05:00
dependabot[bot]
453fb27be2 build(deps): bump docker/build-push-action from 6.2.0 to 6.3.0 (#45042)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.2.0 to 6.3.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](15560696de...1a162644f9)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 19:28:21 -05:00
dependabot[bot]
831f04fb7d build(deps): bump docker/setup-qemu-action from 3.0.0 to 3.1.0 (#45041)
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.0.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](68827325e0...5927c834f5)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 19:28:03 -05:00
Wouter Deconinck
04044a9744 containers: rm centos7 since EOL (#45049) 2024-07-04 22:22:23 +02:00
Massimiliano Culpo
8077285a63 flux-core: remove deprecated versions (#45014) 2024-07-04 22:15:55 +02:00
Paul R. C. Kent
537926c1a7 py-pyscf: add v2.6.0, v2.6.1, v2.6.2 (#45032)
* pyscf: add 260 261 262

* Fix max version typo
2024-07-04 12:43:27 -05:00
Stephen Nicholas Swatman
02ff3d7b1e acts-algebra-plugins: new package (#44861)
This commit adds the `acts-algebra-plugins` package which provides the
Acts project with linear algebra functionality.
2024-07-04 09:51:18 -05:00
Jordan Galby
491cb278f3 spack audit packages: Fix message (#45045)
Fix message formatting of the "virtual dependency cannot have variants" error.
2024-07-04 14:30:30 +02:00
Harmen Stoppels
ed1ebefd8f dray: deprecate and simplify (#45015) 2024-07-04 11:43:34 +02:00
Harmen Stoppels
36d64fcbd4 iconv: require libiconv on linux (#45026)
otherwise it is still picked up from glibc as it is external
2024-07-04 08:20:27 +02:00
Massimiliano Culpo
c5cdc2c0a2 Heuristic decays to default over time (#45023)
This modifies the heuristic to decay to the clingo default
over time. The hope is that this helps with specs whose
optimal solution has a high penalty.

Let the target and compiler heuristics decay too; do not
guess the compiler.
2024-07-04 08:19:52 +02:00
Carsten Uphoff
0eca86f64f tiny-tensor-compiler: fix missing dependencies (#44465)
* tiny-tensor-compiler: fix missing dependencies
* tiny-tensor-compiler: Remove dependency type for opencl c headers
* Re2c@3.1 has a mandatory Python interpreter dependency
* Remove re2c@3.1

---------

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-07-03 19:08:45 -06:00
Stephen Nicholas Swatman
50027d76a5 dfelibs: new package (#44860)
* dfelibs: new package

This commit adds the `dfelibs` package, which is an Acts project fork
of dfelibs.

* dfelibs: docstring typo

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-03 15:37:25 -06:00
Robert Cohn
b4748de5a9 [openfoam]: use latest cgal (#45003)
* [openfoam]: use latest cgal
* add version checks for CGAL
2024-07-03 14:04:38 -07:00
Teague Sterling
8f2532c624 Fix #44715 by removing the librsvg dependency, which is only needed for testing. Also fixing typo. (#44989)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-03 13:52:16 -07:00
Jordan Galby
0726513334 gettext: Fix unvendor libxml2 (#44924)
* gettext: Fix unvendor libxml2
* gettext: Fix build with external libxml2
* Revert "gettext: Fix build with external libxml2"
   This reverts commit c209ad65cb.
2024-07-03 13:40:55 -07:00
Adam J. Stewart
f951f38883 Add support for macOS Sequoia (#45018) 2024-07-03 10:59:26 -07:00
afzpatel
5262412c13 py-tensorflow: remove patch file for 2.16-rocm-enhanced (#44783)
* remove patch file for py-tensorflow@2.16-rocm-enhanced

* add changes

* fix style errors

* remove 2.14-rocm-enhanced version and add patch file

* fix stlye error

* remove jit patch

* add 2.14-rocm-enhanced version
2024-07-03 12:19:07 +02:00
Harmen Stoppels
f022b93249 cmake: add patch to allow static linking with -DCMAKE_INSTALL_RPATH set (#44900) 2024-07-03 10:46:07 +02:00
Vanessasaurus
60b5e98182 flux-core: add v0.64.0 (#45012)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-07-03 01:55:20 -06:00
mvlopri
682acae9fd seacas/package.py: Update default version sha to latest (#45009)
Update to use the latest tag of SEACAS through Spack.
2024-07-03 01:50:27 -06:00
Rocco Meli
c0f80e9117 elsi: improve package and add external libOMM (#44865)
Add MatrixSwitch package
Add libOMM package
Use libOMM as external in ELSI
Add include paths for all libraries
Fortran modules need one directory up (can't use libs.directories since they are just the include/ directories)
2024-07-03 08:56:30 +02:00
Teague Sterling
dcf13af459 kentutils: add v455, v460, v464, v465 + package updates (#44413)
This addresses a few issues in the kentutils package:
 - Issue #44372
 - Cleaning up formatting and styles
 - Removing old versions that are not available anymore
 - Removing newer versions that are also not available anymore
 - The installer does not install the static libs
   that are expected by packages that depend on kentutils
 - There are conflicts and less-than-desirable dependencies
   that can be addressed via variants

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-03 08:40:02 +02:00
Auriane R
a2e4fb6b95 py-huggingface-hub: add v0.23.4 (#44905) 2024-07-03 08:35:50 +02:00
Adam J. Stewart
9dd92f493a py-tensorflow: fix numa build (#44607) 2024-07-03 08:34:34 +02:00
Robert Cohn
b23e832002 intel-oneapi-compilers: remove all patching (#45008) 2024-07-03 07:34:25 +02:00
Derek Ryan Strong
ae2f626168 Add giflib v5.2.2 (#44994) 2024-07-02 14:45:11 -07:00
Derek Ryan Strong
a73930da81 Add libwebp v1.4.0, v1.3.2, v1.3.1, v1.3.0 (#44993) 2024-07-02 14:43:45 -07:00
Derek Ryan Strong
921d446196 Add lcms v2.16, v2.15, v2.14 (#44992) 2024-07-02 14:41:02 -07:00
Derek Ryan Strong
8e2ea5de9d Add openjpeg v2.5.2, v2.5.1 (#44991) 2024-07-02 14:39:19 -07:00
Alex Richert
fd0baca222 fontconfig: add pic variant and support static deps (#44884)
* fontconfig: add pic variant; allow static dependencies
* Revise fontconfig to clean up flags
* Revise fontconfig to remove unnecessary flag
* Update fontconfig
* update fontconfig logic for static deps
2024-07-02 14:06:53 -07:00
Harmen Stoppels
d2fc0d6a35 Revert "e4s ci: reduce size due to 5mb gitlab artifact limit (#44986)" (#45001)
This reverts commit 18de6a480b.
2024-07-02 21:05:10 +00:00
renjithravindrankannath
eedc9e0eaf Bump-up the version for RoCm-6.1.2 release (#44849)
* Bumping up to ROCm 6.1.2
* Bump up rocmlir & amdsmi to 6.1.2
* Removing llvm dependency from amdsmi
* Removed redundant for loops and brought rocm-cmake in for loop
* Removing version check of deprecated versions
2024-07-02 14:02:25 -07:00
Massimiliano Culpo
6b85f6b405 ci: deprecate the --dependencies and --optimize option (#45005) 2024-07-02 22:06:52 +02:00
Matt Thompson
5686a6b928 mepo: new package (#44879)
* mepo: add new package
* Add mepo 2.0.0rc4
* Update dependencies
2024-07-02 12:18:40 -07:00
Hector Barrios
ad665c6af1 fix static linking when using Intel OneAPI mkl (#36991) 2024-07-02 13:55:30 -04:00
Mikael Simberg
d78d6db61e fmt: add 11.0.0 (#44997) 2024-07-02 11:51:37 -06:00
Wouter Deconinck
f47c307bf4 fastor: new package (#44984)
* fastor: new package

* fastor: apply code suggestions and fix style
2024-07-02 09:33:09 -06:00
Harmen Stoppels
5b4edb9499 queue -> stack (#45002) 2024-07-02 16:41:29 +02:00
Harmen Stoppels
a6e6093922 spack_yaml.py: fix default_flow_style (#44998) 2024-07-02 14:01:13 +02:00
Harmen Stoppels
2e8b4e660e spack_yaml: add anchorify function (#44995)
This adds spack.util.spack_yaml.anchorify, which takes a non-cyclic
dict/list structure, and replaces identical values with (back)
references to the first instance, so that yaml serialization will use
anchors.

`repr` is used to identify sub-DAGs, which in principle has quadratic
complexity in the depth of the graph, but in practice the depth is O(1), so
this should not matter.

Then this is used in CI to reduce the size of generated YAML files to
30% of their original size.
2024-07-02 14:00:19 +02:00
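A rough sketch of the idea, based only on the description above (the real implementation lives in `spack.util.spack_yaml` and may differ):

```python
# Sketch: replace identical sub-structures with references to the first
# occurrence, keyed by repr(), so a YAML dumper emits anchors/aliases
# instead of repeated copies.
def anchorify(data, seen=None):
    seen = {} if seen is None else seen
    items = data.items() if isinstance(data, dict) else enumerate(data)
    for key, value in items:
        if isinstance(value, (dict, list)):
            rep = repr(value)
            if rep in seen:
                data[key] = seen[rep]  # back-reference to first instance
            else:
                seen[rep] = value
                anchorify(value, seen)
    return data
```

A dumper like PyYAML then serializes the shared objects as `&id001` / `*id001` anchor-alias pairs, which is what shrinks the generated files.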
Harmen Stoppels
0ca1ee8b91 Pipelines: update configuration for aws-isc (#44999)
1.18 is not a string in YAML

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-07-02 12:39:35 +02:00
Adam J. Stewart
a322672259 py-pillow: add v10.4.0 (#44983) 2024-07-02 03:59:24 -06:00
Dom Heinzeller
6cab86d0c1 New package py-pyhdf (#44877)
* Add var/spack/repos/builtin/packages/py-pyhdf/package.py
* Address reviewer comments regarding dependencies for new package py-pyhdf
2024-07-02 01:19:04 -06:00
Robert Cohn
86e7e2e070 [intel-oneapi-mpi]: restore support for classic wrapper names (#44982)
* [intel-oneapi-mpi]: add back support for classic wrappers

* spack style fixes

* add error checking
2024-07-01 18:54:14 -06:00
Cody Balos
69fca439f4 care: add v0.13.1, v0.13.0 and v0.12.0 (#44936)
* care: add v0.13.0 and v0.12.0
* add maintainer
* add 0.13.1
* update dependencies
2024-07-01 16:45:01 -06:00
Vicente Bolea
3b90fb589f vtk: update vtk to 9.3.1 (#44966)
* vtk: add 9.3.1 version

* vtk: update maintainers
2024-07-01 16:20:31 -06:00
Harmen Stoppels
fff126204c cmd/develop.py: fix readability (#44980)
stage[0] is assumed to be for sources; stages 1 and onwards are
patches/resources. Make that a bit more clear.
2024-07-01 12:14:00 -07:00
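A two-line sketch of the convention described above (names assumed for illustration):

```python
# Sketch of the composite-stage layout (names assumed):
source_stage = composite_stage[0]      # stage 0: the package sources
resource_stages = composite_stage[1:]  # stages 1+: patches and resources
```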
Szabolcs Horvát
98e626cf67 igraph: add 0.10.13 (#44975) 2024-07-01 12:10:38 -07:00
Richard Berger
d8fe628a95 LAMMPS updates (#44977)
* lammps: add new versions
* lammps: add directories for plugins
* lammps-example-plugin: new package
   Example illustrating how to install a LAMMPS plugin, based on the
   example part of the LAMMPS distribution.
2024-07-01 11:57:26 -07:00
Weiqun Zhang
4c378840e3 amrex: add v24.07 (#44985) 2024-07-01 11:50:28 -07:00
eugeneswalker
18de6a480b e4s ci: reduce size due to 5mb gitlab artifact limit (#44986) 2024-07-01 20:28:56 +02:00
Kun Wu
c6da4d586b cuda: add v12.5.0 (#44971)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-07-01 17:45:04 +00:00
Tamara Dahlgren
61d6fc70e8 Docs: include cmake spec property for the command (#44956) 2024-07-01 10:28:26 -07:00
Vicente Bolea
7c65655c7e adios2: add patch to support rocm6 (#44941)
* adios2: add patch to support rocm6

* e4s rocm ci: re-enable adios2 +rocm

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2024-07-01 09:20:48 -07:00
snehring
76ca264b72 py-tesorter: add post install hmmpress step (#44940)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-01 16:51:26 +02:00
Vicente Bolea
aa58d3c170 vtk-m: update vtk-m to 2.2.0-rc1 (#44875) 2024-07-01 16:49:22 +02:00
Harmen Stoppels
a32b898a00 curl: remove deprecated versions (#44630) 2024-07-01 16:31:54 +02:00
Adam J. Stewart
6fcd43ee64 py-tensorflow: add v2.16.2 (#44974) 2024-07-01 15:50:52 +02:00
Richard Berger
f1f9f00d43 flecsi: cleanup spackage (#44873) 2024-07-01 15:49:59 +02:00
renjithravindrankannath
8ff27f9257 libffi: enable pic (#44970)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-07-01 07:01:12 -06:00
Adam J. Stewart
78c62532c7 py-scipy: add v1.14.0 (#44967) 2024-07-01 14:21:41 +02:00
Harmen Stoppels
a7f327dced zlib-ng: add patch for lld 17+ (#44684)
- add new patch for recent lld
- remove workaround for lld
- drop old versions
- drop old patches
2024-07-01 12:59:53 +02:00
Harmen Stoppels
310c435396 netlib-lapack: provide blas and lapack together (#44981)
If netlib-lapack is built with ~external-blas, it internally links
liblapack.so with libblas.so, meaning that whenever netlib-lapack is
used as a lapack provider, the package must also be a blas provider.

Conversely, using netlib-lapack as a blas provider does not imply that it
must also provide lapack, but nothing is lost by disallowing that.
2024-07-01 12:53:03 +02:00
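In package-recipe terms, the coupling could be expressed roughly like this (a sketch; not necessarily the exact directives used in the commit):

```python
# Sketch: with the internal BLAS, blas and lapack are provided together;
# with an external BLAS, only lapack is provided.
provides("lapack", "blas", when="~external-blas")
provides("lapack", when="+external-blas")
```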
Harmen Stoppels
fa3f27e8e7 archive.py: undo unrelated changes from #43851 (#44947) 2024-07-01 11:35:32 +02:00
Harmen Stoppels
6b0fefff29 Use composite stage also for develop specs (#44950) 2024-07-01 11:24:16 +02:00
Massimiliano Culpo
f613316282 Flag propagation: restrict to link/run (#44925)
In practice, people don't care about having their build dependencies reinstalled with, say, cflags=-O3 just because that is set on the input spec, so propagation is restricted to link/run deps only.

Also simplify the encoding in asp.
2024-07-01 09:57:51 +02:00
Massimiliano Culpo
1b5b74390f neoverse-v1: restore py-cinemasci (#44976)
Use a different tactic for determining conflicts.

Give higher priority to setting very old versions to False.
2024-07-01 09:53:24 +02:00
Adam J. Stewart
b57f88cb89 JAX: add v0.4.30 (#44964) 2024-07-01 09:02:48 +02:00
Adam J. Stewart
03afc2a1e6 py-scikit-image: add v0.24.0 (#44965) 2024-07-01 09:02:32 +02:00
Adam J. Stewart
1f6ed9324d py-kornia: add v0.7.3 (#44973) 2024-07-01 09:02:12 +02:00
Adam J. Stewart
5559772afa GDAL: add v3.9.1 (#44969) 2024-07-01 09:01:37 +02:00
Adam J. Stewart
8728631fe0 py-keras: add v3.4 (#44968) 2024-07-01 09:01:18 +02:00
Harmen Stoppels
e34d9cbe5f Remove DIYStage (#44949) 2024-07-01 08:33:56 +02:00
Wouter Deconinck
0efba09990 root: depends_on veccore@0.4.0: resp. @0.4.2: when certain versions (#36667) 2024-06-30 19:37:22 -06:00
Wouter Deconinck
a9cb80d792 xl: avoid matching "_r" in tempfile.TemporaryDirectory during audit (#44831) 2024-06-30 17:49:01 -06:00
Teague Sterling
dae6fe711c gobject-introspection: add v1.60.2 (#44643)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-30 20:46:02 +02:00
Massimiliano Culpo
c9a24bc6c5 neoverse-v1: comment out py-cinemasci (#44972) 2024-06-30 10:23:35 +02:00
AcriusWinter
00663f29a9 Strumpack: Changed old test method to new test method (#44874)
* added try except
* Resolve style issues

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-28 12:54:03 -07:00
Elliott Slaughter
15a48990b6 legion: Add 24.06.0. (#44951) 2024-06-28 12:20:11 -06:00
fpruvost
af0b898c2e composyx: fix cmake options prefix (#44948) 2024-06-28 09:49:08 -07:00
Massimiliano Culpo
ddf8384bc6 libgpg-error, libassuan: fixes for darwin (#44946) 2024-06-28 18:27:12 +02:00
Mikael Simberg
670f92f42b mold: add v2.32.1 (#44920) 2024-06-28 10:24:22 -06:00
Dominic Hofer
c81b0e3d2a icon: add 2024.01-1 (#44919) 2024-06-28 09:52:53 -06:00
joscot-linaro
f56d804d85 linaro-forge: added 24.0.2 version (#44922) 2024-06-28 09:47:47 -06:00
sarahtang-amzn
b57f08f22b WRF4.5.x: Limit the patching of ADIOS function to 4.5.0 (#44929) 2024-06-28 08:58:25 -06:00
Harmen Stoppels
34f3b8fdd0 Add missing v0.22.0 changelog (#44945) 2024-06-28 13:31:38 +02:00
Harmen Stoppels
0b3b49b4e0 installer.py: handle external roots the same (#44917)
There was logic not to enqueue build requests for externals if they occur as
roots. That's unnecessary.
2024-06-28 11:26:56 +02:00
Massimiliano Culpo
fa96422702 libassuan: add v3.0.0 (#44913)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-06-28 10:44:25 +02:00
Massimiliano Culpo
e12168ed24 libgcrypt: add v1.11.0 (#44915)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-06-28 10:43:57 +02:00
Massimiliano Culpo
c0f2df8e0a libgpg-error: add v1.50 (#44914) 2024-06-28 10:39:19 +02:00
Massimiliano Culpo
8807ade98f libksba: add v1.6.7 (#44916) 2024-06-28 10:38:43 +02:00
arezaii
13356ddbcc update chapel package for v2.1 (#44931) 2024-06-27 21:29:11 -06:00
AcriusWinter
974033be80 hiop: Corrected test name, added docstring, and changed test to new API (#44765)
* Changed test method from old method to new method
* Corrected test name and added docstring
* Split test method into stand-alone tests
* Added SkipTest
* code refactor
* removed repeated code
* Remove exe from test part purpose

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-27 18:02:42 -06:00
Alan Dayton
8755fc7291 Add new releases of CHAI (#44911) 2024-06-27 17:47:28 -06:00
Chris Marsh
17c02fe759 vtk: Fix proj dependency ranges (#44370)
* Improve proj version selection to avoid pairing too new a proj with the older API usage in vtk9

* Proj constraint range around 8.2.1


---------

Co-authored-by: John Parent <john.parent@kitware.com>
2024-06-27 15:00:56 -04:00
Dan Lipsa
4c7d18a772 Spack on Windows: fix "spack load --list" and "spack unload" (#35720)
Fix the following on Windows:

* `spack load --list` (this printed 0 packages even if packages were
  loaded)
* `spack unload <package>` (this said that the package is not loaded
  even if it was)

Update unit tests for `spack load` to also run on Windows (specifically
for ".bat"). This involved refactoring a few tests to parameterize
based on whether the unit tests are being run on a Windows system
(and to account for batch syntax).
2024-06-27 11:44:36 -07:00
dependabot[bot]
b28b26c39a build(deps): bump docker/build-push-action from 5.3.0 to 6.2.0 (#44910)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.3.0 to 6.2.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](2cdde995de...15560696de)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-27 13:06:39 -05:00
Teague Sterling
23aed605ec plink2: add v2.00a5.10, v2.00a5.11 (#44707)
* Adding additional versions to plink2 and switching to tarballs to allow for better version detection in the future

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-27 10:48:20 -07:00
Massimiliano Culpo
fdb8d565aa zig: add v0.13.0 (#44912) 2024-06-27 10:38:12 -07:00
Benjamin Fovet
9b08296236 gmsh: add v4.13.1 (#44901)
Also fix gmsh 4.13.0 checksum

Co-authored-by: Benjamin Fovet <benjamin.fovet@cea.fr>
2024-06-27 11:03:34 -06:00
Julien Cortial
c82d8c63fa lcov: Add missing perl dependency (#44896)
Date::Parse, provided by TimeDate, is an undocumented dependency
of lcov 2
2024-06-27 07:08:06 -06:00
Julien Cortial
7a8989bbfc mumps: add v5.7.2 (#44897) 2024-06-27 07:07:51 -06:00
Matthew Whitlock
22c86074c8 Fix bug in logfile parsing (#42706)
* Fix bug in logfile parsing

* Cite ECMA-48 standards for CSI

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-27 07:07:30 -06:00
Massimiliano Culpo
ef9e449322 Ensure parent runtime version >= child (#44834)
Fixes a bug where old gcc-runtime libraries would be loaded at runtime even though newer ones are required by dependencies, breaking the binaries.
2024-06-27 14:58:42 +02:00
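The invariant being enforced above can be sketched like this (names assumed; the real fix is encoded in the solver):

```python
# Sketch: a spec's copy of a runtime package must be at least as new as
# the copy required by any of its dependencies, or the older libraries
# may be loaded in place of the newer ones at runtime.
def runtime_consistent(spec, runtime="gcc-runtime"):
    if runtime not in spec:
        return True  # no runtime dependency at all
    return all(
        spec[runtime].version >= dep[runtime].version
        for dep in spec.dependencies()
        if runtime in dep
    )
```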
Henrique BR
6b73195478 Fixed typo in hypre package (#44921)
Co-authored-by: Henrique <henrique.bergallo-rocha@ukaea.uk>
2024-06-27 12:59:11 +02:00
Emil Briggs
c7b9bf6a77 rmgdft: add v6.1.0 (#44797) 2024-06-27 12:58:35 +02:00
Thomas Madlener
a84c91b259 podio: add version 1.0.1 (#44854) 2024-06-27 12:55:40 +02:00
Michael Kuhn
e6566dfd67 gcc: add 12.4.0 (#44882) 2024-06-27 12:42:55 +02:00
dependabot[bot]
d6419f32b8 build(deps): bump mypy from 1.10.0 to 1.10.1 in /lib/spack/docs (#44885)
Bumps [mypy](https://github.com/python/mypy) from 1.10.0 to 1.10.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.10.0...v1.10.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-27 12:40:14 +02:00
Miguel
60ed682577 Fix typo in docs (#44891)
Fixes on the rst formatting of the make_jobs
2024-06-27 12:32:16 +02:00
Harmen Stoppels
6c1fa8c30b mongo-c-driver: overhaul package recipe (#44866) 2024-06-27 12:30:40 +02:00
Harmen Stoppels
09167fe8ac ci: mgard build failure w/ system zlib (#44918)
The pcluster pipeline runs spack external find and autodetects the most ancient of zlib versions on the system: 1.2.7, released May 3, 2012.

It turns out some packages require parts of the zlib API added within the last decade.
2024-06-27 12:19:17 +02:00
Dom Heinzeller
eb7951818d py-poetry, py-numexpr, and py-metpy: correct dependencies (#44651) 2024-06-27 09:40:17 +00:00
Auriane R
6959656d51 py-pyyaml: add v6.0.1 (#44906) 2024-06-27 11:13:20 +02:00
Wouter Deconinck
f916b50491 containers: centos:stream -> centos:stream9 (#44876) 2024-06-27 10:38:56 +02:00
Jiakun Yan
7160e1d3e7 lci: new package (#44041) 2024-06-27 10:31:14 +02:00
Cody Balos
16369d50a7 sundials: add v7.1.1 (#44814)
* sundials: add 7.1.0

* add patch

* remove patch, add 7.1.1 instead of 7.1.0
2024-06-26 15:44:57 -07:00
eugeneswalker
35fc371222 update e4s stacks (#44813)
* update e4s stacks

* adios2 +rocm: disable kokkos due to spack issue #44832

* comment out mgard+cuda due to spack issue #44833

* comment out cabana, legion, arborx due to kokkos spack issue #44832

* comment out slepc, petsc due to petsc spack issue #44600

* comment out adios2+rocm due to kokkos rocm spack issue #44832

* comment out kokkos due to spack issue #44832
2024-06-26 15:24:34 -07:00
Harmen Stoppels
4bbf4a5e79 otf2: fix package (#44856)
- extend python
- forward compat bound with python due to imp import
- restore -O2 flags (CFLAGS do not compose)
2024-06-26 17:28:51 +02:00
Alex Richert
e5bd79b011 py-jcb: new package (#44880)
* Add py-jcb: JEDI Configuration Builder
* Update maintainers
2024-06-26 08:18:40 -06:00
Dom Heinzeller
2267b40bda New package py-globus-cli, update py-globus-sdk (#44881)
* Add var/spack/repos/builtin/packages/py-globus-cli/package.py
* Update var/spack/repos/builtin/packages/py-globus-sdk: add version 3.25.0
* For py-globus-cli: make the py-typing-extension dependency conditional on the Python version, and allow Python versions newer than 3.10

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-26 08:13:03 -06:00
Massimiliano Culpo
b0b6016e12 ASP-based solver: add a generic rule for propagation (#44870)
This adds a generic propagate/2 rule to propagate any
fact to children in the DAG.
2024-06-26 16:02:00 +02:00
Harmen Stoppels
c24265fe7e opennurbs: multiple build systems (#44871) 2024-06-26 15:34:20 +02:00
Jon Rood
9d0102ac89 nalu-wind: fix linking when using trilinos-solvers variant (#44887) 2024-06-26 07:33:50 -06:00
Mikael Simberg
52b8b3ed8d stdexec: Don't skip build stage on main since no longer header only (#44892) 2024-06-26 14:50:44 +02:00
Harmen Stoppels
380030c59a mpifileutils: cmakepackage (#44863) 2024-06-26 14:28:29 +02:00
Harmen Stoppels
867a813328 kallisto: simplify (#44862) 2024-06-26 14:27:12 +02:00
Harmen Stoppels
ded3fa50a3 jasper: multiple build systems (#44859) 2024-06-26 14:25:23 +02:00
Harmen Stoppels
84653e8d9f cutlang: remove (#44853) 2024-06-26 14:24:37 +02:00
Harmen Stoppels
ac0fd7138f oce: deprecate (#44864) 2024-06-26 14:19:41 +02:00
Massimiliano Culpo
c0f9f47b8c Simplify and improve solver heuristic (#44893)
When we changed how to deal with errors in November,
we didn't realize that for an unconstrained choice
rule it is more important in the heuristic to guess
what is NOT in the answer set, since it will be the
majority of options.

Previously this was following automatically from what
was in the answer set, via `1 { ... } 1` cardinality
constraints.

Here we improve the heuristic and the solve time for specs.
2024-06-26 13:08:41 +02:00
Daniel Ahlin
7e9d24a145 gromacs: maintainer change (#44890)
Adding mabraham as maintainer; Mark will bring a lot of GROMACS knowledge. Removing danielahlin, as I am currently not actively working on this.
2024-06-26 01:44:35 -06:00
Simon Pintarelli
99405e6a4d nlcglib: add v1.1.0 (#44580) 2024-06-26 09:42:26 +02:00
Simon Frasch
7500a4853c spla: add v1.6.1 (#44878) 2024-06-26 09:40:01 +02:00
psakievich
54d192e026 Steal source was not assigning the package class (#44886)
Fetcher was missing the package class assignment
2024-06-25 23:17:56 -07:00
Harmen Stoppels
e6a8eba72d eztrace: multiple build systems (#44855) 2024-06-26 06:12:02 +02:00
Wouter Deconinck
4d604c8c9f root: add v6.30.08, v6.32.02 (#44807) 2024-06-25 14:57:44 -07:00
fpruvost
b081e0046f Move/rename maphyspp to composyx (#44836) 2024-06-25 13:58:54 -07:00
Tamara Dahlgren
baf82c0245 Docs: Update stand-alone test information (#44755)
Update and slightly reorganize stand-alone test information to include new and improved examples and more links that can be used in PR feedback.
2024-06-25 13:33:41 -07:00
Hector Martinez-Seara
884a0a392d rdma-core: Patched CMakeLists.txt to find drm include at libdrm - archlinux (#44839)
* Patched CMakeLists.txt to find drm include at libdrm - archlinux
* Restricted usage patch to versions 34 onwards
* Update var/spack/repos/builtin/packages/rdma-core/libdrm.patch

---------

Co-authored-by: Mark Abraham <Mark.J.Abraham@gmail.com>
2024-06-25 13:06:38 -07:00
Harmen Stoppels
824f2a5652 memaxes: cmakepackage (#44869) 2024-06-25 12:55:54 -07:00
Wouter Deconinck
b894acf1fc geant4: add v11.2.2, incl g4ndl v4.7.1 (#44830)
* geant4: add v11.2.2

* g4ndl: add v4.7.1

* geant4-data, g4ndl: use =4.7 for 11.2.0:11.2.1, 4.7.1 for 11.2.2
2024-06-25 14:35:31 -04:00
Miguel
536856874c add documentation for make_jobs variable (#44838)
* add documentation for make_jobs variable

* apply suggested changes

* Update packaging_guide.rst

add suggestions to the documentation

* Update packaging_guide.rst

fix missing quotes in the documentation

* suggestions to packaging_guide.rst
2024-06-25 11:24:49 -06:00
eugeneswalker
8d1aaef8b8 e4s ci: paraview: require +examples (#44847) 2024-06-25 15:12:56 +00:00
Simon Pintarelli
3216c4362e tiled-mm: add v2.3.1 (#44867) 2024-06-25 09:08:14 -06:00
Harmen Stoppels
2e822e65fd libbson: cmake & autotools package (#44852) 2024-06-25 14:54:22 +02:00
Harmen Stoppels
41fde4db8c glog: remove custom cmake stuff (#44858) 2024-06-25 14:52:28 +02:00
Wouter Deconinck
81125c3bd8 hepmc3: pass root variant cxxstd as HEPMC3_CXX_STANDARD (#44806)
* hepmc3: pass `root` variant `cxxstd` as `HEPMC3_CXX_STANDARD`

* hepmc3: when @:3.2.3 +rootio, depends_on root cxxstd=11

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-06-25 05:50:36 -05:00
Nicole C
d8b0df6f5b Make url_fetch tests work on Windows (#44809) 2024-06-25 07:34:11 +00:00
Martin Pokorny
99e3fdb180 Legion: add redop_half variant, and add missing build dependency (#44792)
* legion: add redop_half variant
* legion: conditionally add py-setuptools as build dependency
   Dependency exists only for '+python' variant.
2024-06-25 01:33:54 -06:00
Paul Romano
64cfdc07cb OpenMC: add v0.15.0 and v0.14.0 (#44840) 2024-06-25 01:19:23 -06:00
Harmen Stoppels
59fbbdd9ce blitz: use cmake (#44804) 2024-06-25 09:11:14 +02:00
Melven Roehrig-Zoellner
adedf58297 py-heat: add v1.4.1 (#44835) 2024-06-25 01:10:18 -06:00
Wouter Deconinck
f13ea8aa75 py-scikit-build-core: typing-extension dependency fix (#44817) 2024-06-25 09:00:03 +02:00
Wouter Deconinck
109a4d52b5 vecmem: add v0.25.0 -> v1.6.0 (#43528) 2024-06-25 08:48:53 +02:00
Davis Herring
7f2117e2cf flecsi: add v2.3.0 (#44798) 2024-06-25 08:34:02 +02:00
Rocco Meli
dfab3e8829 ELSI: improve package (#44717)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option

* elsi

* elsi improvements

* [@spackbot] updating style on behalf of RMeli

* remove test variants

* fix ntpoly and use externals as default

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-06-25 07:38:56 +02:00
Martin Pokorny
822622f07a thrust: add v1.17.x (#44561) 2024-06-25 07:35:29 +02:00
Vicente Bolea
c4194e4f58 vtk-m: support kokkos@4: (#44850) 2024-06-25 07:21:20 +02:00
Raffaele Solcà
05c7ff4595 dla-future: add v0.6.0 (#44841) 2024-06-24 22:33:14 -06:00
Matt Thompson
a6ac78c7c6 mapl: add 2.47.0 (#44842)
* mapl: add 2.47.0
* Fix typo
2024-06-24 22:32:46 -06:00
eugeneswalker
8479122e71 ecp rocm ci: add ecp-dav back + use updated container image w rocm-llvm-dev installed (#44827) 2024-06-24 16:48:40 -06:00
Teague Sterling
3d91bfea75 libgtop: new package (#44642)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-24 15:43:27 -06:00
Wouter Deconinck
27e28b33ee py-fsspec: add v2024.3.1, v2024.5.0 (switch to hatchling) (#44491) 2024-06-24 18:16:37 +02:00
Teague Sterling
eaf8ac7407 py-orjson: add v3.8.14, v3.9.15, v3.10.3 and dependencies (#44514)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies

* rollback

* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies

* Update package.py
2024-06-24 18:15:11 +02:00
John W. Parent
1e9d550bc6 cmake: add v3.29.6 (#44616) 2024-06-24 09:19:14 -06:00
Harmen Stoppels
0282fe9efd spec_list: do not resolve abstract hashes (#44760) 2024-06-24 16:43:01 +02:00
eugeneswalker
ad3d9c83fe e4s: add glvis (#44812) 2024-06-24 05:57:54 -07:00
Adam J. Stewart
910b923c5d JAX: add v0.4.29 (#44683)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2024-06-24 08:43:45 +02:00
snehring
c1f1e1396d libint: add v2.9.0 (#44776)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-24 08:36:13 +02:00
Neil Flood
3139dbdd39 py-rios: add v2.0.2 (#44779) 2024-06-24 08:35:00 +02:00
Teague Sterling
cc6dcdcab9 htslib: add variants (#44501)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-24 08:34:10 +02:00
Wouter Deconinck
57b83e5fb2 root: depends_on libc only on linux (#44823) 2024-06-22 20:39:08 +02:00
Andrey Perestoronin
55fe73586e add packages from intel-oneapi-2024.2.0 release (#44789) 2024-06-21 16:17:54 -06:00
Auriane R
3be497344a py-triton: add main (#44774)
* Add triton env variable to build with clang

* Add git url and main branch for py-triton
2024-06-21 08:57:58 -07:00
Carlos Bederián
05413689b9 python: add v3.12.4 (#44801) 2024-06-21 17:00:39 +02:00
Alberto Invernizzi
50da223888 lua-lpeg: add latest version (#44803) 2024-06-21 07:16:08 -06:00
Alberto Invernizzi
719b260cf1 tree-sitter: new versions (#44802) 2024-06-21 06:52:37 -06:00
Tamara Dahlgren
3848c41494 Bugfix: test_is_externally_detectable needs to use mockpackages (#44795) 2024-06-21 12:24:46 +02:00
Veselin Dobrev
da720cafd8 openssl: add latest OpenSSL versions, deprecate previous ones (#44620) 2024-06-21 08:32:23 +02:00
AcriusWinter
8f0b029308 openpmd-api: Changed from old to new test API (#44764)
* changed from old to new test API
* changed test name so it works
* small docstring change
* fixed skiptest check
* made tests check output
2024-06-20 15:06:48 -07:00
Pierre Blanchard
901f4b789d sleef: add v3.6.1 (#44784)
Update preferred package to SLEEF 3.6.1
2024-06-20 14:47:54 -07:00
kwryankrattiger
6e6fef1b0e ParaView: add 5.13.0-RC1 (#44785)
* ParaView: add 5.13.0-RC1
2024-06-20 14:44:26 -07:00
George Young
ae131a5c7c py-multiqc: updating to @1.21, adding new dependency py-pyyaml-env (#44768)
* py-multiqc: updating to @1.21, adding new dependency py-pyyaml-env
* Correcting dependencies
* Drop matplotlib depends_on() for versions of multiqc not included in package.py

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-06-20 14:28:54 -07:00
MichaelLaufer
ed59e43e1d py-pyfr: Add v2.0.2 (#44487)
* py-gimmik: add v3.2.1
* py-pyfr: add v2.0.2
2024-06-20 14:22:29 -07:00
Loris Ercole
5e8f9ed1c7 elpa: new version and recipe tuning (#44749)
Add version 2024.03.001 and fix #43902.

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2024-06-20 14:16:10 -07:00
Brian Van Essen
9c31ff74c4 Allow zlib to find external installations. (#44694)
* Allow zlib to find external installations.
* Apply suggestions from code review
* Fixed the determine_version function to loop over all of the potential
libraries that could be installed by zlib.

---------

Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
2024-06-20 13:40:23 -07:00
AcriusWinter
90c8fe0182 snakemake: Changed test method name and added docstring (#44723)
* snakemake: Changed test method name and added docstring
2024-06-20 10:28:29 -07:00
John W. Parent
58db81c323 bootstrap: test building clingo from sources on windows (#44624) 2024-06-20 16:39:58 +00:00
snehring
e867008819 votca: tighten up boost dep (#44777)
* votca: tighten up boost dep

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-20 10:13:08 -06:00
Paul R. C. Kent
9910e06b25 Update package.py (#44772) 2024-06-20 17:41:15 +02:00
Harmen Stoppels
a3ece7ff4d cmake: remove version deprecated in 0.22 (#44628) 2024-06-20 17:15:26 +02:00
Robert Manson-Sawko
00f0ca2060 openmpi: add with-lsf-libdir config option (#44563) 2024-06-20 08:25:25 -06:00
Derek Ryan Strong
35557ac21c pmix: add v5.0.2, v4.2.9, v4.2.8, v4.2.7 (#44576) 2024-06-20 14:49:00 +02:00
Josh Bowden
54662f7ae1 damaris: add v1.11.0 (#44610)
Co-authored-by: Joshua Bowden <joshua-chales.bowden@inria.fr>
2024-06-20 14:30:11 +02:00
Wouter Deconinck
9cdb2a8dbb dd4hep: depends_on root +root7 in some cases (#43671)
* dd4hep: depends_on root +root7 in some cases

* dd4hep: avoid self-referential whens -> conflicts

* dd4hep: avoid conflict with (narrower) depends_on root@6.27:
2024-06-20 14:22:47 +02:00
Marcel Breyer
ffdab20294 googletest: fix checking a variant condition (#44781)
Change "pthread" to "+pthread" since otherwise the spec is never satisfied and pthread is never used.
2024-06-20 05:59:53 -06:00
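The distinction behind the googletest fix above, as a sketch:

```python
# A bare name is parsed as a package-name constraint, not a variant,
# so the old condition could never be true for googletest itself.
spec.satisfies("pthread")    # wrong: "is this the pthread package?"
spec.satisfies("+pthread")   # right: "is the pthread variant enabled?"
```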
Dominic Hofer
e91a69a756 py-tabulate: add v0.8.10, v0.9.0 (#44604) 2024-06-20 13:51:50 +02:00
Matthew Lesko
7665076339 ESMF: set ESMF_LAPACK_LIBPATH if +external-lapack (#44672)
This PR sets the `ESMF_LAPACK_LIBPATH` variable (from the `lapack` spec
attributes), removing the previous FIXME and commented example.
2024-06-20 13:48:38 +02:00
Thomas Madlener
49ba2d84a0 edm4hep: Make edm4hep extend python after upstream changes (#44681) 2024-06-20 13:47:14 +02:00
Robert Cohn
0cec923e0a intel-oneapi-mpi: update mpi wrappers to icx variants (#44688) 2024-06-20 13:46:32 +02:00
Nick Hagerty
b97b001dad hipfort: add non-system gcc support (#44612) 2024-06-20 13:45:18 +02:00
Stephen Sachs
113e231abe Remove deprecated intel-* packages (#44700)
Packages will be removed with https://github.com/spack/spack/pull/44689. This PR
makes sure that the `aws-pcluster-x86_64_v4` stack still works as expected.
2024-06-20 07:07:27 -04:00
Adam J. Stewart
f43ca7a554 py-lightning: add v2.3 (#44731)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2024-06-20 11:32:53 +02:00
Robert Cohn
3c2c215619 [intel-*] remove deprecated packages (#44689)
* [intel-*] remove deprecated packages

* undo delete of mkl, mpi, parallel-studio
2024-06-20 11:31:38 +02:00
Frédéric Simonis
a2b3a004bf precice: add version 3.1.2 (#44583) 2024-06-20 11:17:11 +02:00
Harmen Stoppels
f650133f83 build_environment: fix ccache error handling (#44740) 2024-06-20 11:16:24 +02:00
Thomas Helfer
81f9d5baa5 tfel: add support for versions up to 4.2.1 (#44578) 2024-06-20 11:09:50 +02:00
psakievich
3938a85ff8 nalu-wind: update submodules (#44687)
Co-authored-by: psakievich <psakievich@users.noreply.github.com>
2024-06-20 10:41:44 +02:00
Thomas Madlener
84cb604b19 podio: Add version 1.0 (#44780)
Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>
2024-06-20 02:31:55 -06:00
Alec Scott
093504d9a0 go: drop deprecated versions prior to v0.22 release, clean up build location (#44778) 2024-06-20 10:06:54 +02:00
Alex Richert
70a93a746d py-wxflow: new package (#44754)
* Add py-wxflow, a set of tools used in weather workflows
* Update package.py
* add 0.2.0
* add unit testing
2024-06-19 16:19:31 -06:00
Adam J. Stewart
4326efddbf py-maturin: add v1.6.0 (#44734)
* py-maturin: add v1.6.0
* bzip2 only on macOS
2024-06-19 15:02:59 -07:00
Joe Schoonover
6f51b543f0 Update hohqmesh and feq-parse packages (#44737)
* Update hohqmesh and feq-parse versions
   Update feq-parse homepage to new documentation page
   Update feq-parse license to be consistent with 2.2.2 release

---------

Co-authored-by: fluidnumerics-joe <fluidnumerics-joe@users.noreply.github.com>
2024-06-19 14:54:58 -07:00
Christian Glusa
3316e49ad3 Binder: Add newer version (#44741) 2024-06-19 11:14:43 -07:00
Stephen Nicholas Swatman
0c4a91cd18 actsvg: add versions up to 0.4.44 (#44770)
This commit adds new versions of actsvg. I have also taken the liberty
of adding myself as a maintainer, as I work actively on the ACTS project and
its dependencies, and would like to start keeping the Spack specs more
up to date.
2024-06-19 11:13:17 -07:00
Dave Keeshan
07c6c3ebac verilator: add v5.026 (#44757) 2024-06-19 11:07:56 -07:00
SXS Bot
e3a4a07616 spectre: add v2024.06.18 (#44761)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-06-19 11:06:12 -07:00
Stephen Nicholas Swatman
57467139e5 covfie: new package (#44771)
This commit adds the covfie package which facilitates the
high-performance storage of vector fields and other structured
multi-dimensional data.
2024-06-19 10:34:32 -07:00
Carlos Bederián
e8bc53f37b ucx: add v1.17.0 (#44767) 2024-06-19 13:24:54 +02:00
Dan Bonachea
3b78515fd4 upcxx package: Add resilience to broken libfabric (#44618)
Some systems have a libfabric install that doesn't work, so don't fail
hard if a call to `fi_info` fails (e.g. due to missing shared libraries).
2024-06-18 18:10:04 -07:00
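A sketch of the resilience described above, using Spack's `Executable` wrapper (the surrounding package logic is assumed):

```python
# Sketch: probe the libfabric providers, but tolerate a broken fi_info.
from spack.util.executable import Executable, ProcessError

fi_info = Executable("fi_info")
try:
    providers = fi_info("-l", output=str, error=str)
except ProcessError:
    providers = ""  # a failing fi_info just means: no usable providers
```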
downloadico
6b052c3af9 Abinit fix hdf5 (#44763)
* abinit: fix locating HDF5
Remove the check in the configure script that locates HDF5; replaced by
using Spack to locate the package.
2024-06-18 16:28:11 -06:00
AcriusWinter
37e2d46d7d Legion: Reformatted Old To New Test Method and skipping tests (#44733)
* Legion: reformatted old test method to match new test method
* Updated docstring and how cmake file is opened
2024-06-18 12:18:56 -07:00
eugeneswalker
a389eb5a08 e4s external rocm ci: bump rocm stack to v6.1.1 (#44449)
* e4s external rocm ci: bump rocm stack to v6.1.1

* comment out exago+rocm due to issue with raja@0.14.0 see spack issue #44593

* comment out adios2+rocm due to spack issue #44594

* comment out petsc+rocm due to spack issue #44600

* comment out sundials+rocm due to spack issue #44601

* comment out slepc+rocm due to petsc spack issue #44600

* comment out tau+rocm due to spack issue #44659

* comment out ecp-data-vis-sdk due to spack issue #44745

* packages: register rocm-core as external

* re-enable tau due to issue #44659 having been resolved

* use latest ci image: ecpe4s/ubuntu22.04-runner-amd64-gcc-11.4-rocm6.1.1:2024.06.17

* comment out paraview due to spack issue #44745

* comment out ecp-data-vis-sdk +vtkm due to issue https://gitlab.spack.io/spack/spack/-/jobs/11632511
2024-06-18 11:43:03 -07:00
Neil Flood
d57f174ca3 py-rios: add 1.4.17, v2.0.1 (#44679)
* Update for 2.0.1
* cloudpickle dependency is only 'run'
* Follow new formatting guidelines
* black wants trailing commas
* Simplified version ranges, as recommended by @tldahlgren
2024-06-18 10:35:43 -06:00
AcriusWinter
e6ae42b1eb vtk-m: Changed test method names and skipping non-applicable tests from old to new approach (#44705)
* vtk-m: Changed test method names and skipping non-applicable tests from old to new approach

-----
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-18 09:40:17 -06:00
George Young
d911b9c48d seacr: new package @1.4-b2 (#42677)
* seacr: new package @1.4-b2

* Update var/spack/repos/builtin/packages/seacr/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-06-18 07:37:58 -07:00
Tim Haines
3b35b7f4fa Boost: switch from jfrog to boost.io for downloads (#44728)
The jfrog hosting will be shut down in Dec 2024.
2024-06-18 08:40:03 +02:00
dependabot[bot]
a82fb33b31 build(deps): bump flake8 from 7.0.0 to 7.1.0 in /lib/spack/docs (#44752) 2024-06-18 08:27:15 +02:00
dependabot[bot]
5f35a90529 build(deps): bump urllib3 from 2.2.1 to 2.2.2 in /lib/spack/docs (#44751) 2024-06-18 08:27:04 +02:00
dependabot[bot]
81c620b61b build(deps): bump flake8 from 7.0.0 to 7.1.0 in /.github/workflows/style (#44750) 2024-06-18 08:26:18 +02:00
Chris White
12866eb0d6 ascent: add v0.9.3 (#44571)
* add new ascent version
* add requirement for new version of umpire/raja
* add patch for vtk-m dependency
2024-06-17 19:37:50 -06:00
arezaii
24b8d0666e Chapel package: major update (#42197)
* add cray detection taken from upcxx
* add CUDA/ROCm support
* add numerous pass-through options to Chapel build,
  like gpu_mem_strategy, comm_substrate, etc.; all variants are
  translated to analogous CHPL_* environment variables. As a side
  effect, this defines a number of environment variables that are
  not actually used by Chapel.
* Define LD_LIBRARY_PATH, LIBRARY_PATH, and PKG_CONFIG_PATH to
  help programs built with Chapel properly locate needed runtime
  dependencies

---------

Co-authored-by: bonachea <dobonachea@lbl.gov>
2024-06-17 18:09:05 -06:00
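The variant-to-environment translation mentioned in the Chapel entry above could look roughly like this (a sketch; the two variant names come from the entry, everything else is assumed):

```python
# Sketch: forward each Spack variant as an analogous CHPL_* variable.
def setup_build_environment(self, env):
    for name in ("gpu_mem_strategy", "comm_substrate"):
        env.set("CHPL_" + name.upper(), self.spec.variants[name].value)
```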
Tamara Dahlgren
0f2c7248c8 Bugfix: omega-h stand-alone tests: ensure proper ordering (#44748) 2024-06-17 15:17:30 -06:00
Alec Scott
8cc4ad3ac5 pass: install autocompletion for all shells (#44744) 2024-06-17 12:35:02 -07:00
Dan Lipsa
9ba90e322e PROJ: add new versions, improve tiff patches (#42767)
* Projsync needs curl

This fixes a proj compilation error from paraview

* tiff does not set TIFF_INCLUDE_DIR tested by proj

* TIFF patch exists for proj 9.2

* Add comments with location of proj patch

* Use TIFF patches for different cmake versions

* Use modules starting with cmake 3.28 and variables before that

* Fix black style

* Simpler patches, add newer proj

* Remove duplicated flag

* Deprecate PROJ 4.8 and older

* Remove PRIVATE

* Deprecate PROJ 7.0.0

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-06-17 15:33:22 +02:00
Taillefumier Mathieu
d9c6b40d8e [cp2k] Enforce exclusion of libxsmm main (#44739)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-06-17 07:30:15 -06:00
Alberto Invernizzi
d792e1f052 update minimum libvterm version for 0.10.0 (#44720)
7a5effb0f9
2024-06-15 21:15:50 -07:00
4073 changed files with 20076 additions and 8713 deletions

View File

@@ -12,6 +12,7 @@ updates:
interval: "daily"
# Requirements to run style checks
- package-ecosystem: "pip"
directory: "/.github/workflows/style"
directories:
- "/.github/workflows/requirements/*"
schedule:
interval: "daily"

View File

@@ -28,8 +28,8 @@ jobs:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages

View File

@@ -37,7 +37,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- name: Bootstrap clingo
@@ -53,27 +53,33 @@ jobs:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest' }}
run: |
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: "3.12"
- name: Bootstrap clingo
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
source share/spack/setup-env.sh
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
${{ env.VALIDATE_LAST_EXIT }}
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
gnupg-sources:
runs-on: ${{ matrix.runner }}
@@ -90,7 +96,7 @@ jobs:
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: sudo rm -rf $(command -v gpg gpg2 patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- name: Bootstrap GnuPG
@@ -119,10 +125,10 @@ jobs:
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: |
3.8
@@ -148,7 +154,7 @@ jobs:
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
./bin/spack-tmpconfig -b ./.github/workflows/bin/bootstrap-test.sh
export PATH="$old_path"
fi
fi
@@ -162,4 +168,3 @@ jobs:
source share/spack/setup-env.sh
spack -d gpg list
tree ~/.spack/bootstrap/store/

View File

@@ -40,8 +40,7 @@ jobs:
# 1: Platforms to build for
# 2: Base image (e.g. ubuntu:22.04)
dockerfile: [[amazon-linux, 'linux/amd64,linux/arm64', 'amazonlinux:2'],
[centos7, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:7'],
[centos-stream, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream'],
[centos-stream9, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream9'],
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
@@ -56,7 +55,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -77,7 +76,7 @@ jobs:
env:
SPACK_YAML_OS: "${{ matrix.dockerfile[2] }}"
run: |
.github/workflows/generate_spack_yaml_containerize.sh
.github/workflows/bin/generate_spack_yaml_containerize.sh
. share/spack/setup-env.sh
mkdir -p dockerfiles/${{ matrix.dockerfile[0] }}
spack containerize --last-stage=bootstrap | tee dockerfiles/${{ matrix.dockerfile[0] }}/Dockerfile
@@ -88,19 +87,19 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
uses: actions/upload-artifact@0b2256b8c012f0828dc542b3febcab082c67f72b
with:
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
uses: docker/setup-qemu-action@49b3bc8e6bdd4a60e6116a5414239cba5943d3cf
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@d70bba72b1f3fd22344832f00baa16ece964efeb
uses: docker/setup-buildx-action@aa33708b10e362ff993539393ff100fa93ed6a27
- name: Log in to GitHub Container Registry
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -108,13 +107,13 @@ jobs:
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@2cdde995de11925a030ce8070c3d77a52ffcf1c0
uses: docker/build-push-action@5176d81f87c23d6fc96624dfdbcd9f3830bbe445
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
@@ -127,7 +126,7 @@ jobs:
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
uses: actions/upload-artifact/merge@0b2256b8c012f0828dc542b3febcab082c67f72b
with:
name: dockerfiles
pattern: dockerfiles_*

View File

@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -53,6 +53,13 @@ jobs:
- 'var/spack/repos/builtin/packages/clingo/**'
- 'var/spack/repos/builtin/packages/python/**'
- 'var/spack/repos/builtin/packages/re2c/**'
- 'var/spack/repos/builtin/packages/gnupg/**'
- 'var/spack/repos/builtin/packages/libassuan/**'
- 'var/spack/repos/builtin/packages/libgcrypt/**'
- 'var/spack/repos/builtin/packages/libgpg-error/**'
- 'var/spack/repos/builtin/packages/libksba/**'
- 'var/spack/repos/builtin/packages/npth/**'
- 'var/spack/repos/builtin/packages/pinentry/**'
- 'lib/spack/**'
- 'share/spack/**'
- '.github/workflows/bootstrap.yml'

View File

@@ -1,8 +0,0 @@
#!/usr/bin/env sh
. share/spack/setup-env.sh
echo -e "config:\n build_jobs: 2" > etc/spack/config.yaml
spack config add "packages:all:target:[x86_64]"
spack compiler find
spack compiler info apple-clang
spack debug report
spack solve zlib

View File

@@ -14,10 +14,10 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,6 +1,6 @@
black==24.4.2
clingo==5.7.1
flake8==7.0.0
flake8==7.1.0
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20240513

View File

@@ -51,10 +51,10 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -72,7 +72,7 @@ jobs:
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Bootstrap clingo
if: ${{ matrix.concretizer == 'clingo' }}
env:
@@ -100,10 +100,10 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
- name: Install System packages
@@ -118,7 +118,7 @@ jobs:
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Run shell tests
env:
COVERAGE: true
@@ -141,13 +141,13 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
useradd spack-test
chown -R spack-test .
- name: Run unit tests
@@ -160,10 +160,10 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
- name: Install System packages
@@ -178,7 +178,7 @@ jobs:
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Run unit tests (full suite with coverage)
env:
COVERAGE: true
@@ -198,10 +198,10 @@ jobs:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -217,7 +217,7 @@ jobs:
SPACK_TEST_PARALLEL: 4
run: |
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
. share/spack/setup-env.sh
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
@@ -236,10 +236,10 @@ jobs:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: 3.9
- name: Install Python packages
@@ -247,7 +247,7 @@ jobs:
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
./.github/workflows/bin/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml

View File

@@ -18,15 +18,15 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
cache: 'pip'
- name: Install Python Packages
run: |
pip install --upgrade pip setuptools
pip install -r .github/workflows/style/requirements.txt
pip install -r .github/workflows/requirements/style/requirements.txt
- name: vermin (Spack's Core)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
@@ -35,22 +35,22 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
cache: 'pip'
- name: Install Python packages
run: |
pip install --upgrade pip setuptools
pip install -r .github/workflows/style/requirements.txt
pip install -r .github/workflows/requirements/style/requirements.txt
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Run style tests
run: |
share/spack/qa/run-style-tests
@@ -70,13 +70,13 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
useradd spack-test
chown -R spack-test .
- name: Bootstrap Spack development environment

View File

@@ -1,3 +1,324 @@
# v0.22.0 (2024-05-12)
`v0.22.0` is a major feature release.
## Features in this release
1. **Compiler dependencies**
We are in the process of making compilers proper dependencies in Spack, and a number
of changes in `v0.22` support that effort. You may notice nodes in your dependency
graphs for compiler runtime libraries like `gcc-runtime` or `libgfortran`, and you
may notice that Spack graphs now include `libc`. We've also begun moving compiler
configuration from `compilers.yaml` to `packages.yaml` to make it consistent with
other externals. We are trying to do this with the least disruption possible, so
your existing `compilers.yaml` files should still work. We expect to be done with
this transition by the `v0.23` release in November.
* #41104: Packages compiled with `%gcc` on Linux, macOS and FreeBSD now depend on a
new package `gcc-runtime`, which contains a copy of the shared compiler runtime
libraries. This enables gcc runtime libraries to be installed and relocated when
using a build cache. When building minimal Spack-generated container images it is
no longer necessary to install libgfortran, libgomp etc. using the system package
manager.
* #42062: Packages compiled with `%oneapi` now depend on a new package
`intel-oneapi-runtime`. This is similar to `gcc-runtime`, and the runtimes can
provide virtuals and compilers can inject dependencies on virtuals into compiled
packages. This allows us to model library soname compatibility and allows
compilers like `%oneapi` to provide virtuals like `sycl` (which can also be
provided by standalone libraries). Note that until we have an agreement in place
with Intel, Intel packages are marked `redistribute(source=False, binary=False)`
and must be downloaded outside of Spack.
* #43272: changes to the optimization criteria of the solver improve the hit-rate of
buildcaches by a fair amount. The solver more relaxed compatibility rules and will
not try to strictly match compilers or targets of reused specs. Users can still
enforce the previous strict behavior with `require:` sections in `packages.yaml`.
Note that to enforce correct linking, Spack will *not* reuse old `%gcc` and
`%oneapi` specs that do not have the runtime libraries as a dependency.
* #43539: Spack will reuse specs built with compilers that are *not* explicitly
configured in `compilers.yaml`. Because we can now keep runtime libraries in build
cache, we do not require you to also have a local configured compiler to *use* the
runtime libraries. This improves reuse in buildcaches and avoids conflicts with OS
updates that happen underneath Spack.
* #43190: binary compatibility on `linux` is now based on the `libc` version,
instead of on the `os` tag. Spack builds now detect the host `libc` (`glibc` or
`musl`) and add it as an implicit external node in the dependency graph. Binaries
with a `libc` with the same name and a version less than or equal to that of the
detected `libc` can be reused. This is only on `linux`, not `macos` or `Windows`.
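In code terms, the stated rule reads roughly as follows (a sketch, not the solver's actual encoding):
```python
# Sketch of the libc reuse rule: same libc name, and the binary's libc
# version must not exceed the host's detected libc version.
def libc_compatible(host_libc, binary_libc):
    return (
        binary_libc.name == host_libc.name
        and binary_libc.version <= host_libc.version
    )
```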
* #43464: each package that can provide a compiler is now detectable using `spack
external find`. External packages defining compiler paths are effectively used as
compilers, and `spack external find -t compiler` can be used as a substitute for
`spack compiler find`. More details on this transition are in
[the docs](https://spack.readthedocs.io/en/latest/getting_started.html#manual-compiler-configuration)
2. **Improved `spack find` UI for Environments**
If you're working in an environment, you likely care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized
We've tweaked `spack find` in environments to show this information much more
clearly. Installation status is shown next to each root, so you can see what is
installed. Roots are also shown in bold in the list of installed packages. There is
also a new option for `spack find -r` / `--only-roots` that will only show env
roots, if you don't want to look at all the installed specs.
More details in #42334.
3. **Improved command-line string quoting**
We are making some breaking changes to how Spack parses specs on the CLI in order to
respect shell quoting instead of trying to fight it. If you (sadly) had to write
something like this on the command line:
```
spack install zlib cflags=\"-O2 -g\"
```
That will now result in an error, but you can now write what you probably expected
to work in the first place:
```
spack install zlib cflags="-O2 -g"
```
Quoted strings can now also include special characters, so you can supply flags like:
```
spack install zlib ldflags='-Wl,-rpath=$ORIGIN/_libs'
```
To reduce ambiguity in parsing, we now require that you *not* put spaces around `=`
and `==` when specifying flags or variants. This would have worked before but will now
result in an error:
```
spack install zlib cflags = "-O2 -g"
```
More details and discussion in #30634.
4. **Revert default `spack install` behavior to `--reuse`**
We changed the default concretizer behavior from `--reuse` to `--reuse-deps` in
#30990 (in `v0.20`), which meant that *every* `spack install` invocation would
attempt to build a new version of the requested package / any environment roots.
While this is a common ask for *upgrading* and for *developer* workflows, we don't
think it should be the default for a package manager.
We are going to try to stick to this policy:
1. Prioritize reuse and build as little as possible by default.
2. Only upgrade or install duplicates if they are explicitly asked for, or if there
is a known security issue that necessitates an upgrade.
With the install command you now have three options:
* `--reuse` (default): reuse as many existing installations as possible.
* `--reuse-deps` / `--fresh-roots`: upgrade (freshen) roots but reuse dependencies if possible.
* `--fresh`: install fresh versions of requested packages (roots) and their dependencies.
We've also introduced `--fresh-roots` as an alias for `--reuse-deps` to make it more clear
that it may give you fresh versions. More details in #41302 and #43988.
5. **More control over reused specs**
You can now control which packages to reuse and how. There is a new
`concretizer:reuse` config option, which accepts the following properties:
- `roots`: `true` to reuse roots, `false` to reuse just dependencies
- `exclude`: list of constraints used to select which specs *not* to reuse
- `include`: list of constraints used to select which specs *to* reuse
- `from`: list of sources for reused specs (some combination of `local`,
`buildcache`, or `external`)
For example, to reuse only specs compiled with GCC, you could write:
```yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
```
Or, if `openmpi` must be used from externals, and it must be the only external used:
```yaml
concretizer:
reuse:
roots: true
from:
- type: local
exclude: ["openmpi"]
- type: buildcache
exclude: ["openmpi"]
- type: external
include: ["openmpi"]
```
6. **New `redistribute()` directive**
Some packages can't be redistributed in source or binary form. We need an explicit
way to say that in a package.
Now there is a `redistribute()` directive so that package authors can write:
```python
class MyPackage(Package):
    redistribute(source=False, binary=False)
```
Like other directives, this works with `when=`:
```python
class MyPackage(Package):
    # 12.0 and higher are proprietary
    redistribute(source=False, binary=False, when="@12.0:")

    # can't redistribute when we depend on some proprietary dependency
    redistribute(source=False, binary=False, when="^proprietary-dependency")
```
More in #20185.
7. **New `conflict:` and `prefer:` syntax for package preferences**
Previously, you could express conflicts and preferences in `packages.yaml` through
some contortions with `require:`:
```yaml
packages:
  zlib-ng:
    require:
    - one_of: ["%clang", "@:"]   # conflict on %clang
    - any_of: ["+shared", "@:"]  # strong preference for +shared
```
You can now use `conflict:` and `prefer:` for a much more readable configuration:
```yaml
packages:
  zlib-ng:
    conflict:
    - "%clang"
    prefer:
    - "+shared"
```
See [the documentation](https://spack.readthedocs.io/en/latest/packages_yaml.html#conflicts-and-strong-preferences)
and #41832 for more details.
8. **`include_concrete` in environments**
You may want to build on the *concrete* contents of another environment without
changing that environment. You can now include the concrete specs from another
environment's `spack.lock` with `include_concrete`:
```yaml
spack:
  specs: []
  concretizer:
    unify: true
  include_concrete:
  - /path/to/environment1
  - /path/to/environment2
```
Now, when *this* environment is concretized, it will bring in the already concrete
specs from `environment1` and `environment2`, and build on top of them without
changing them. This is useful if you have phased deployments, where old deployments
should not be modified but you want to use as many of them as possible. More details
in #33768.
9. **`python-venv` isolation**
Spack has unique requirements for Python because it:
1. installs every package in its own independent directory, and
2. allows users to register *external* python installations.
External installations may contain their own installed packages that can interfere
with Spack installations, and some distributions (Debian and Ubuntu) even change the
`sysconfig` in ways that alter the installation layout of installed Python packages
(e.g., with the addition of a `/local` prefix on Debian or Ubuntu). To isolate Spack
from these and other issues, we now insert a small `python-venv` package in between
`python` and packages that need to install Python code. This isolates Spack's build
environment, isolates Spack from any issues with an external python, and resolves a
large number of issues we've had with Python installations.
See #40773 for further details.
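A quick way to see the new node in a concretized DAG (the package name is arbitrary):
```
spack spec py-numpy | grep python-venv   # python-venv now sits between python and the package
```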
## New commands, options, and directives
* Allow packages to be pushed to build cache after install from source (#42423)
* `spack develop`: stage build artifacts in same root as non-dev builds (#41373)
* Don't delete `spack develop` build artifacts after install (#43424)
* `spack find`: add options for local/upstream only (#42999)
* `spack logs`: print log files for packages (either partially built or installed) (#42202)
* `patch`: support reversing patches (#43040)
* `develop`: Add -b/--build-directory option to set build_directory package attribute (#39606)
* `spack list`: add `--namespace` / `--repo` option (#41948)
* directives: add `checked_by` field to `license()`, add some license checks
* `spack gc`: add options for environments and build dependencies (#41731)
* Add `--create` to `spack env activate` (#40896)
## Performance improvements
* environment.py: fix excessive re-reads (#43746)
* ruamel yaml: fix quadratic complexity bug (#43745)
* Refactor to improve `spec format` speed (#43712)
* Do not acquire a write lock on the env post install if no views (#43505)
* asp.py: fewer calls to `spec.copy()` (#43715)
* spec.py: early return in `__str__`
* avoid `jinja2` import at startup unless needed (#43237)
## Other new features of note
* `archspec`: update to `v0.2.4`: support for Windows, bugfixes for `neoverse-v1` and
`neoverse-v2` detection.
* `spack config get`/`blame`: with no args, show entire config
* `spack env create <env>`: dir if dir-like (#44024)
* ASP-based solver: update os compatibility for macOS (#43862)
* Add handling of custom ssl certs in urllib ops (#42953)
* Add ability to rename environments (#43296)
* Add config option and compiler support to reuse across OS's (#42693)
* Support for prereleases (#43140)
* Only reuse externals when configured (#41707)
* Environments: Add support for including views (#42250)
## Binary caches
* Build cache: make signed/unsigned a mirror property (#41507)
* tools stack
## Removals, deprecations, and syntax changes
* remove `dpcpp` compiler and package (#43418)
* spack load: remove --only argument (#42120)
## Notable Bugfixes
* repo.py: drop deleted packages from provider cache (#43779)
* Allow `+` in module file names (#41999)
* `cmd/python`: use runpy to allow multiprocessing in scripts (#41789)
* Show extension commands with spack -h (#41726)
* Support environment variable expansion inside module projections (#42917)
* Alert user to failed concretizations (#42655)
* shell: fix zsh color formatting for PS1 in environments (#39497)
* spack mirror create --all: include patches (#41579)
## Spack community stats
* 7,994 total packages; 525 since `v0.21.0`
* 178 new Python packages, 5 new R packages
* 358 people contributed to this release
* 344 committers to packages
* 45 committers to core
# v0.21.2 (2024-03-01)
## Bugfixes

View File

@@ -188,25 +188,27 @@ if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :end_switch
:case_load
:: If args contain --sh, --csh, or -h/--help: just execute.
if defined _sp_args (
if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:-h=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:--bat=%" (
goto :default_case
)
if NOT defined _sp_args (
exit /B 0
)
:: If args contain --bat, or -h/--help: just execute.
if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:-h=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:--bat=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:--list=%" (
goto :default_case
)
for /f "tokens=* USEBACKQ" %%I in (
`python "%spack%" %_sp_flags% %_sp_subcommand% --bat %_sp_args%`) do %%I
`python "%spack%" %_sp_flags% %_sp_subcommand% --bat %_sp_args%`
) do %%I
goto :end_switch
:case_unload
goto :case_load
:default_case
python "%spack%" %_sp_flags% %_sp_subcommand% %_sp_args%
goto :end_switch

View File

@@ -203,12 +203,9 @@ The OS that are currently supported are summarized in the table below:
* - Ubuntu 24.04
- ``ubuntu:24.04``
- ``spack/ubuntu-noble``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
* - CentOS Stream
- ``quay.io/centos/centos:stream``
- ``spack/centos-stream``
* - CentOS Stream9
- ``quay.io/centos/centos:stream9``
- ``spack/centos-stream9``
* - openSUSE Leap
- ``opensuse/leap``
- ``spack/leap15``

View File

@@ -931,32 +931,84 @@ This allows for a much-needed reduction in redundancy between packages
and constraints.
----------------
Filesystem Views
----------------
-----------------
Environment Views
-----------------
Spack Environments can define filesystem views, which provide a direct access point
for software similar to the directory hierarchy that might exist under ``/usr/local``.
Filesystem views are updated every time the environment is written out to the lock
file ``spack.lock``, so the concrete environment and the view are always compatible.
The files of the view's installed packages are brought into the view by symbolic or
hard links, referencing the original Spack installation, or by copy.
Spack Environments can have an associated filesystem view, which is a directory
with a more traditional structure ``<view>/bin``, ``<view>/lib``, ``<view>/include``
in which all files of the installed packages are linked.
By default a view is created for each environment, thanks to the ``view: true``
option in the ``spack.yaml`` manifest file:
.. code-block:: yaml

   spack:
     specs: [perl, python]
     view: true
The view is created in a hidden directory ``.spack-env/view`` relative to the environment.
If you've used ``spack env activate``, you may have already interacted with this view. Spack
prepends its ``<view>/bin`` dir to ``PATH`` when the environment is activated, so that
you can directly run executables from all installed packages in the environment.
Views are highly customizable: you can control where they are put, modify their structure,
include and exclude specs, change how files are linked, and you can even generate multiple
views for a single environment.
.. _configuring_environment_views:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Configuration in ``spack.yaml``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^
Minimal view configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^
The Spack Environment manifest file has a top-level keyword
``view``. Each entry under that heading is a **view descriptor**, headed
by a name. Any number of views may be defined under the ``view`` heading.
The view descriptor contains the root of the view, and
optionally the projections for the view, ``select`` and
``exclude`` lists for the view and link information via ``link`` and
The minimal configuration
.. code-block:: yaml

   spack:
     # ...
     view: true
lets Spack generate a single view with default settings under the
``.spack-env/view`` directory of the environment.
Another short way to configure a view is to specify just where to put it:
.. code-block:: yaml

   spack:
     # ...
     view: /path/to/view
Views can also be disabled by setting ``view: false``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Advanced view configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^^
One or more **view descriptors** can be defined under ``view``, keyed by a name.
The example from the previous section with ``view: /path/to/view`` is equivalent
to defining a view descriptor named ``default`` with a ``root`` attribute:
.. code-block:: yaml

   spack:
     # ...
     view:
       default:               # name of the view
         root: /path/to/view  # view descriptor attribute
The ``default`` view descriptor name is special: when you ``spack env activate`` your
environment, this view will be used to update (among other things) your ``PATH``
variable.
View descriptors must contain the root of the view, and optionally projections,
``select`` and ``exclude`` lists and link information via ``link`` and
``link_type``.
For example, in the following manifest
As a more advanced example, in the following manifest
file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given
@@ -1001,59 +1053,10 @@ of ``hardlink`` or ``copy``.
when the environment is not activated, and linked libraries will be located
*outside* of the view thanks to rpaths.
There are two shorthands for environments with a single view. If the
environment at ``/path/to/env`` has a single view, with a root at
``/path/to/env/.spack-env/view``, with default selection and exclusion
and the default projection, we can put ``view: True`` in the
environment manifest. Similarly, if the environment has a view with a
different root, but default selection, exclusion, and projections, the
manifest can say ``view: /path/to/view``. These views are
automatically named ``default``, so that
.. code-block:: yaml

   spack:
     # ...
     view: True

is equivalent to

.. code-block:: yaml

   spack:
     # ...
     view:
       default:
         root: .spack-env/view

and

.. code-block:: yaml

   spack:
     # ...
     view: /path/to/view

is equivalent to

.. code-block:: yaml

   spack:
     # ...
     view:
       default:
         root: /path/to/view
By default, Spack environments are configured with ``view: True`` in
the manifest. Environments can be configured without views using
``view: False``. For backwards compatibility reasons, environments
with no ``view`` key are treated the same as ``view: True``.
From the command line, the ``spack env create`` command takes an
argument ``--with-view [PATH]`` that sets the path for a single, default
view. If no path is specified, the default path is used (``view:
True``). The argument ``--without-view`` can be used to create an
true``). The argument ``--without-view`` can be used to create an
environment without any view configured.
The ``spack env view`` command can be used to manage the views
@@ -1119,11 +1122,18 @@ the projection under ``all`` before reaching those entries.
Activating environment views
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``spack env activate`` command will put the default view for the
environment into the user's path, in addition to activating the
environment for Spack commands. The arguments ``-v,--with-view`` and
``-V,--without-view`` can be used to tune this behavior. The default
behavior is to activate with the environment view if there is one.
The ``spack env activate <env>`` command has two effects:
1. It activates the environment so that further Spack commands such
as ``spack install`` will run in the context of the environment.
2. It activates the view so that environment variables such as
``PATH`` are updated to include the view.
Without further arguments, the ``default`` view of the environment is
activated. If a view with a different name has to be activated,
``spack env activate --with-view <name> <env>`` can be
used instead. You can also activate the environment without modifying
further environment variables using ``--without-view``.
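For example (the environment and view names here are illustrative):

.. code-block:: console

   $ spack env activate myenv                    # activate env and its default view
   $ spack env activate --with-view mpis myenv   # activate the view named "mpis" instead
   $ spack env activate --without-view myenv     # activate without touching PATH etc.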
The environment variables affected by the ``spack env activate``
command and the paths that are used to update them are determined by
@@ -1146,8 +1156,8 @@ relevant variable if the path exists. For this reason, it is not
recommended to use non-default projections with the default view of an
environment.
The ``spack env deactivate`` command will remove the default view of
the environment from the user's path.
The ``spack env deactivate`` command will remove the active view of
the Spack environment from the user's environment variables.
.. _env-generate-depfile:
@@ -1306,7 +1316,7 @@ index once every package is pushed. Note how this target uses the generated
example/push/%: example/install/%
@mkdir -p $(dir $@)
$(info About to push $(SPEC) to a buildcache)
$(SPACK) -e . buildcache push --allow-root --only=package $(BUILDCACHE_DIR) /$(HASH)
$(SPACK) -e . buildcache push --only=package $(BUILDCACHE_DIR) /$(HASH)
@touch $@
push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS))

View File

@@ -2344,6 +2344,27 @@ you set ``parallel`` to ``False`` at the package level, then each call
to ``make()`` will be sequential by default, but packagers can call
``make(parallel=True)`` to override it.
Note that the ``--jobs`` option works out of the box for all standard
build systems. If you are using a non-standard build system instead, you
can use the variable ``make_jobs`` to extract the number of jobs specified
by the ``--jobs`` option:
.. code-block:: python
   :emphasize-lines: 7, 11
   :linenos:

   class Xios(Package):
       ...
       def install(self, spec, prefix):
           ...
           options = [
               ...
               '--jobs', str(make_jobs),
           ]
           ...
           make_xios = Executable("./make_xios")
           make_xios(*options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Install-level build parallelism
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -5173,12 +5194,6 @@ installed executable. The check is implemented as follows:
reframe = Executable(self.prefix.bin.reframe)
reframe("-l")
.. warning::
The API for adding tests is not yet considered stable and may change
in future releases.
""""""""""""""""""""""""""""""""
Checking build-time test results
""""""""""""""""""""""""""""""""
@@ -5216,38 +5231,42 @@ be left in the build stage directory as illustrated below:
Stand-alone tests
^^^^^^^^^^^^^^^^^
While build-time tests are integrated with the build process, stand-alone
While build-time tests are integrated with the installation process, stand-alone
tests are expected to run days, weeks, even months after the software is
installed. The goal is to provide a mechanism for gaining confidence that
packages work as installed **and** *continue* to work as the underlying
software evolves. Packages can add and inherit stand-alone tests. The
`spack test`` command is used to manage stand-alone testing.
``spack test`` command is used for stand-alone testing.
.. note::
.. admonition:: Stand-alone test methods should complete within a few minutes.
Execution speed is important since these tests are intended to quickly
assess whether installed specs work on the system. Consequently, they
should run relatively quickly -- as in on the order of at most a few
minutes -- while ideally executing all, or at least key aspects of the
installed software.
assess whether installed specs work on the system. Spack cannot spare
resources for more extensive testing of packages included in CI stacks.
.. note::
Failing stand-alone tests indicate problems with the installation and,
therefore, there is no reason to proceed with more resource-intensive
tests until those have been investigated.
Passing stand-alone tests indicate that more thorough testing, such
as running extensive unit or regression tests, or tests that run at
scale can proceed without wasting resources on a problematic installation.
Consequently, stand-alone tests should run relatively quickly -- as in
on the order of at most a few minutes -- while testing at least key aspects
of the installed software. Save more extensive testing for other tools.
Tests are defined in the package using methods with names beginning ``test_``.
This allows Spack to support multiple independent checks, or parts. Files
needed for testing, such as source, data, and expected outputs, may be saved
from the build and or stored with the package in the repository. Regardless
of origin, these files are automatically copied to the spec's test stage
directory prior to execution of the test method(s). Spack also provides some
helper functions to facilitate processing.
directory prior to execution of the test method(s). Spack also provides helper
functions to facilitate common processing.
.. tip::
**The status of stand-alone tests can be used to guide follow-up testing efforts.**
Passing stand-alone tests justify performing more thorough testing, such
as running extensive unit or regression tests or tests that run at scale,
when available. These tests are outside of the scope of Spack packaging.
Failing stand-alone tests indicate problems with the installation and,
therefore, no reason to proceed with more resource-intensive tests until
the failures have been investigated.
.. _configure-test-stage:
@@ -5255,30 +5274,26 @@ helper functions to facilitate processing.
Configuring the test stage directory
""""""""""""""""""""""""""""""""""""
Stand-alone tests utilize a test stage directory for building, running,
and tracking results in the same way Spack uses a build stage directory.
The default test stage root directory, ``~/.spack/test``, is defined in
:ref:`etc/spack/defaults/config.yaml <config-yaml>`. This location is
customizable by adding or changing the ``test_stage`` path in the high-level
``config`` of the appropriate ``config.yaml`` file such that:
Stand-alone tests utilize a test stage directory to build, run, and track
tests in the same way Spack uses a build stage directory to install software.
The default test stage root directory, ``$HOME/.spack/test``, is defined in
:ref:`config.yaml <config-yaml>`. This location is customizable by adding or
changing the ``test_stage`` path such that:
.. code-block:: yaml

   config:
     test_stage: /path/to/test/stage
Packages can use the ``self.test_suite.stage`` property to access this setting.
Other package properties that provide access to spec-specific subdirectories
and files are described in :ref:`accessing staged files <accessing-files>`.
Packages can use the ``self.test_suite.stage`` property to access the path.
.. note::
.. admonition:: Each spec being tested has its own test stage directory.
The test stage path is the root directory for the **entire suite**.
In other words, it is the root directory for **all specs** being
tested by the ``spack test run`` command. Each spec gets its own
stage subdirectory. Use ``self.test_suite.test_dir_for_spec(self.spec)``
to access the spec-specific test stage directory.
The ``config:test_stage`` option is the path to the root of a
**test suite**'s stage directories.
Other package properties that provide paths to spec-specific subdirectories
and files are described in :ref:`accessing-files`.
.. _adding-standalone-tests:
@@ -5291,61 +5306,144 @@ Test recipes are defined in the package using methods with names beginning
Each method has access to the information Spack tracks on the package, such
as options, compilers, and dependencies, supporting the customization of tests
to the build. Standard python ``assert`` statements and other error reporting
mechanisms are available. Such exceptions are automatically caught and reported
mechanisms can be used. These exceptions are automatically caught and reported
as test failures.
Each test method is an implicit test part named by the method and whose
purpose is the method's docstring. Providing a purpose gives context for
aiding debugging. A test method may contain embedded test parts. Spack
outputs the test name and purpose prior to running each test method and
any embedded test parts. For example, ``MyPackage`` below provides two basic
examples of installation tests: ``test_always_fails`` and ``test_example``.
As the name indicates, the first always fails. The second simply runs the
installed example.
Each test method is an *implicit test part* named by the method. Its purpose
is the method's docstring. Providing a meaningful purpose for the test gives
context that can aid debugging. Spack outputs both the name and purpose at the
start of test execution so it's also important that the docstring/purpose be
brief.
.. tip::
We recommend naming test methods so it is clear *what* is being tested.
For example, if a test method is building and or running an executable
called ``example``, then call the method ``test_example``. This, together
with a similarly meaningful test purpose, will aid test comprehension,
debugging, and maintainability.
Stand-alone tests run in an environment that provides access to information
on the installed software, such as build options, dependencies, and compilers.
Build options and dependencies are accessed using the same spec checks used
by build recipes. Examples of checking :ref:`variant settings <variants>` and
:ref:`spec constraints <testing-specs>` can be found at the provided links.
.. admonition:: Spack automatically sets up the test stage directory and environment.
Spack automatically creates the test stage directory and copies
relevant files *prior to* running tests. It can also ensure build
dependencies are available **if** necessary.
The path to the test stage is configurable (see :ref:`configure-test-stage`).
Files that Spack knows to copy are those saved from the build (see
:ref:`cache_extra_test_sources`) and those added to the package repository
(see :ref:`cache_custom_files`).
Spack will use the value of the ``test_requires_compiler`` property to
determine whether it needs to also set up build dependencies (see
:ref:`test-build-tests`).
The ``MyPackage`` package below provides two basic test examples:
``test_example`` and ``test_example2``. The first runs the installed
``example`` and ensures its output contains an expected string. The second
runs ``example2`` without checking output so is only concerned with confirming
the executable runs successfully. If the installed spec is not expected to have
``example2``, then the check at the top of the method will raise a special
``SkipTest`` exception, which is captured to facilitate reporting skipped test
parts to tools like CDash.
.. code-block:: python

   class MyPackage(Package):
       ...

       def test_always_fails(self):
           """use assert to always fail"""
           assert False

       def test_example(self):
           """run installed example"""
           """ensure installed example works"""
           expected = "Done."
           example = which(self.prefix.bin.example)
           example()

           # Capture stdout and stderr from running the Executable
           # and check that the expected output was produced.
           out = example(output=str.split, error=str.split)
           assert expected in out, f"Expected '{expected}' in the output"

       def test_example2(self):
           """run installed example2"""
           if self.spec.satisfies("@:1.0"):
               # Raise SkipTest to ensure flagging the test as skipped for
               # test reporting purposes.
               raise SkipTest("Test is only available for v1.1 on")

           example2 = which(self.prefix.bin.example2)
           example2()
Output showing the identification of each test part after running the tests
is illustrated below.
.. code-block:: console
$ spack test run --alias mypackage mypackage@1.0
$ spack test run --alias mypackage mypackage@2.0
==> Spack test mypackage
...
$ spack test results -l mypackage
==> Results for test suite 'mypackage':
...
==> [2023-03-10-16:03:56.625204] test: test_always_fails: use assert to always fail
==> [2024-03-10-16:03:56.625439] test: test_example: ensure installed example works
...
FAILED
==> [2023-03-10-16:03:56.625439] test: test_example: run installed example
PASSED: MyPackage::test_example
==> [2024-03-10-16:03:56.625439] test: test_example2: run installed example2
...
PASSED
PASSED: MyPackage::test_example2
.. admonition:: Do NOT implement tests that must run in the installation prefix.
.. note::
Use of the package spec's installation prefix for building and running
tests is **strongly discouraged**. Doing so causes permission errors for
shared spack instances *and* facilities that install the software in
read-only file systems or directories.
If ``MyPackage`` were a recipe for a library, the tests should build
an example or test program that is then executed.
Instead, start these test methods by explicitly copying the needed files
from the installation prefix to the test stage directory. Note the test
stage directory is the current directory when the test is executed with
the ``spack test run`` command.
A test method can include test parts using the ``test_part`` context manager.
Each part is treated as an independent check to allow subsequent test parts
to execute even after a test part fails.
.. admonition:: Test methods for library packages should build test executables.
.. _test-part:
Stand-alone tests for library packages *should* build test executables
that utilize the *installed* library. Doing so ensures the tests follow
a similar build process that users of the library would follow.
For more information on how to do this, see :ref:`test-build-tests`.
.. tip::
If you want to see more examples from packages with stand-alone tests, run
``spack pkg grep "def\stest" | sed "s/\/package.py.*//g" | sort -u``
from the command line to get a list of the packages.
.. _adding-standalone-test-parts:
"""""""""""""""""""""""""""""
Adding stand-alone test parts
"""""""""""""""""""""""""""""
Sometimes dependencies between steps of a test lend themselves to being
broken into parts. Tracking the pass/fail status of each part may aid
debugging. Spack provides a ``test_part`` context manager for use within
test methods.
Each test part is independently run, tracked, and reported. Test parts are
executed in the order they appear. If one fails, subsequent test parts are
still performed even if they would also fail. This allows tools like CDash
to track and report the status of test parts across runs. The pass/fail status
of the enclosing test is derived from the statuses of the embedded test parts.
.. admonition:: Test method and test part names **must** be unique.
Test results reporting requires that test methods and embedded test parts
within a package have unique names.
The signature for ``test_part`` is:
@@ -5367,40 +5465,68 @@ where each argument has the following meaning:
* ``work_dir`` is the path to the directory in which the test will run.
The default of ``None``, or ``"."``, corresponds to the spec's test
stage (i.e., ``self.test_suite.test_dir_for_spec(self.spec)``.
stage (i.e., ``self.test_suite.test_dir_for_spec(self.spec)``).
.. admonition:: Tests should **not** run under the installation directory.
.. admonition:: Start test part names with the name of the enclosing test.
Use of the package spec's installation directory for building and running
tests is **strongly** discouraged. Doing so causes permission errors for
shared spack instances *and* facilities that install the software in
read-only file systems or directories.
We **highly recommend** starting the names of test parts with the name
of the enclosing test. Doing so helps with the comprehension, readability
and debugging of test results.
Suppose ``MyPackage`` actually installs two examples we want to use for tests.
These checks can be implemented as separate checks or, as illustrated below,
embedded test parts.
Suppose ``MyPackage`` installs multiple executables that need to run in a
specific order since the outputs from one are inputs of others. Further suppose
we want to add an integration test that runs the executables in order. We can
accomplish this goal by implementing a stand-alone test method consisting of
test parts for each executable as follows:
.. code-block:: python

   class MyPackage(Package):
       ...

       def test_example(self):
           """run installed examples"""
           for example in ["ex1", "ex2"]:
               with test_part(
                   self,
                   f"test_example_{example}",
                   purpose=f"run installed {example}",
               ):
                   exe = which(join_path(self.prefix.bin, example))
                   exe()

       def test_series(self):
           """run setup, perform, and report"""
In this case, there will be an implicit test part for ``test_example``
and separate sub-parts for ``ex1`` and ``ex2``. The second sub-part
will be executed regardless of whether the first passes. The test
log for a run where the first executable fails and the second passes
is illustrated below.
           with test_part(self, "test_series_setup", purpose="setup operation"):
               exe = which(self.prefix.bin.setup)
               exe()

           with test_part(self, "test_series_run", purpose="perform operation"):
               exe = which(self.prefix.bin.run)
               exe()

           with test_part(self, "test_series_report", purpose="generate report"):
               exe = which(self.prefix.bin.report)
               exe()
The result is that ``test_series`` runs the following executables in order: ``setup``,
``run``, and ``report``. In this case no options are passed to any of the
executables and no outputs from running them are checked. Consequently, the
implementation could be simplified with a for-loop as follows:
.. code-block:: python

   class MyPackage(Package):
       ...

       def test_series(self):
           """execute series setup, run, and report"""
           for exe, reason in [
               ("setup", "setup operation"),
               ("run", "perform operation"),
               ("report", "generate report"),
           ]:
               with test_part(self, f"test_series_{exe}", purpose=reason):
                   exe = which(self.prefix.bin.join(exe))
                   exe()
In both cases, since we're using a context manager, each test part in
``test_series`` will execute regardless of the status of the other test
parts.
Now let's look at the output from running the stand-alone tests where
the second test part, ``test_series_run``, fails.
.. code-block:: console
@@ -5410,50 +5536,68 @@ is illustrated below.
$ spack test results -l mypackage
==> Results for test suite 'mypackage':
...
==> [2023-03-10-16:03:56.625204] test: test_example: run installed examples
==> [2023-03-10-16:03:56.625439] test: test_example_ex1: run installed ex1
==> [2024-03-10-16:03:56.625204] test: test_series: execute series setup, run, and report
==> [2024-03-10-16:03:56.625439] test: test_series_setup: setup operation
...
FAILED
==> [2023-03-10-16:03:56.625555] test: test_example_ex2: run installed ex2
PASSED: MyPackage::test_series_setup
==> [2024-03-10-16:03:56.625555] test: test_series_run: perform operation
...
PASSED
FAILED: MyPackage::test_series_run
==> [2024-03-10-16:03:57.003456] test: test_series_report: generate report
...
FAILED: MyPackage::test_series_report
FAILED: MyPackage::test_series
...
.. warning::
Since test parts depended on the success of previous parts, we see that the
failure of one results in the failure of subsequent checks and the overall
result of the test method, ``test_series``, is failure.
Test results reporting requires that each test method and embedded
test part for a package have a unique name.
.. tip::
Stand-alone tests run in an environment that provides access to information
Spack has on how the software was built, such as build options, dependencies,
and compilers. Build options and dependencies are accessed with the normal
spec checks. Examples of checking :ref:`variant settings <variants>` and
:ref:`spec constraints <testing-specs>` can be found at the provided links.
Accessing compilers in stand-alone tests that are used by the build requires
setting a package property as described :ref:`below <test-compilation>`.
If you want to see more examples from packages using ``test_part``, run
``spack pkg grep "test_part(" | sed "s/\/package.py.*//g" | sort -u``
from the command line to get a list of the packages.
.. _test-build-tests:
.. _test-compilation:
"""""""""""""""""""""""""""""""""""""
Building and running test executables
"""""""""""""""""""""""""""""""""""""
"""""""""""""""""""""""""
Enabling test compilation
"""""""""""""""""""""""""
.. admonition:: Re-use build-time sources and (small) input data sets when possible.
If you want to build and run binaries in tests, then you'll need to tell
Spack to load the package's compiler configuration. This is accomplished
by setting the package's ``test_requires_compiler`` property to ``True``.
We **highly recommend** re-using build-time test sources and pared down
input files for testing installed software. These files are easier
to keep synchronized with software capabilities when they reside
within the software's repository. More information on saving files from
the installation process can be found at :ref:`cache_extra_test_sources`.
Setting the property to ``True`` ensures access to the compiler through
canonical environment variables (e.g., ``CC``, ``CXX``, ``FC``, ``F77``).
It also gives access to build dependencies like ``cmake`` through their
``spec objects`` (e.g., ``self.spec["cmake"].prefix.bin.cmake``).
If that is not possible, you can add test-related files to the package
repository (see :ref:`cache_custom_files`). It will be important to
remember to maintain them so they work across listed or supported versions
of the package.
.. note::
Packages that build libraries are good examples of cases where you'll want
to build test executables from the installed software before running them.
Doing so requires you to let Spack know it needs to load the package's
compiler configuration. This is accomplished by setting the package's
``test_requires_compiler`` property to ``True``.
The ``test_requires_compiler`` property should be added at the top of
the package near other attributes, such as the ``homepage`` and ``url``.
.. admonition:: ``test_requires_compiler = True`` is required to build test executables.
Below illustrates using this feature to compile an example.
Setting the property to ``True`` ensures access to the compiler through
canonical environment variables (e.g., ``CC``, ``CXX``, ``FC``, ``F77``).
It also gives access to build dependencies like ``cmake`` through their
``spec objects`` (e.g., ``self.spec["cmake"].prefix.bin.cmake`` for the
path or ``self.spec["cmake"].command`` for the ``Executable`` instance).
Be sure to add the property at the top of the package class under other
properties like the ``homepage``.
The example below, which ignores how ``cxx-example.cpp`` is acquired,
illustrates the basic process of compiling a test executable using the
installed library before running it.
.. code-block:: python
@@ -5477,28 +5621,22 @@ Below illustrates using this feature to compile an example.
cxx_example = which(exe)
cxx_example()
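Since the compare view elides most of this code block, here is a minimal sketch of
the pattern it illustrates (the ``cxx-example`` name, file names, and compile flags
are assumptions based on the surrounding examples, not the exact docs code):

.. code-block:: python

   import os

   class MyLibrary(Package):
       ...
       test_requires_compiler = True

       def test_cxx_example(self):
           """build and run cxx-example against the installed library"""
           exe = "cxx-example"
           # CXX is available because test_requires_compiler is True
           cxx = which(os.environ["CXX"])
           cxx(
               f"-L{self.prefix.lib}",
               f"-I{self.prefix.include}",
               f"{exe}.cpp",
               "-o", exe,
           )
           cxx_example = which(exe)
           cxx_example()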
Typically the files used to build and or run test executables are either
cached from the installation (see :ref:`cache_extra_test_sources`) or added
to the package repository (see :ref:`cache_custom_files`). There is nothing
preventing the use of both.
.. _cache_extra_test_sources:
"""""""""""""""""""""""
Saving build-time files
"""""""""""""""""""""""
""""""""""""""""""""""""""""""""""""
Saving build- and install-time files
""""""""""""""""""""""""""""""""""""
.. note::
We highly recommend re-using build-time test sources and pared down
input files for testing installed software. These files are easier
to keep synchronized with software capabilities since they reside
within the software's repository.
If that is not possible, you can add test-related files to the package
repository (see :ref:`adding custom files <cache_custom_files>`). It
will be important to maintain them so they work across listed or supported
versions of the package.
You can use the ``cache_extra_test_sources`` helper to copy directories
and or files from the source build stage directory to the package's
installation directory.
You can use the ``cache_extra_test_sources`` helper routine to copy
directories and or files from the source build stage directory to the
package's installation directory. Spack will automatically copy these
files for you when it sets up the test stage directory and before it
begins running the tests.
The signature for ``cache_extra_test_sources`` is:
@@ -5513,46 +5651,69 @@ where each argument has the following meaning:
* ``srcs`` is a string *or* a list of strings corresponding to the
paths of subdirectories and or files needed for stand-alone testing.
The paths must be relative to the staged source directory. Contents of
subdirectories and files are copied to a special test cache subdirectory
of the installation prefix. They are automatically copied to the appropriate
relative paths under the test stage directory prior to executing stand-alone
tests.
.. warning::
For example, a package method for copying everything in the ``tests``
subdirectory plus the ``foo.c`` and ``bar.c`` files from ``examples``
and using ``foo.c`` in a test method is illustrated below.
Paths provided in the ``srcs`` argument **must be relative** to the
staged source directory. They will be copied to the equivalent relative
location under the test stage directory prior to test execution.
Contents of subdirectories and files are copied to a special test cache
subdirectory of the installation prefix. They are automatically copied to
the appropriate relative paths under the test stage directory prior to
executing stand-alone tests.
.. tip::
*Perform test-related conversions once when copying files.*
If one or more of the copied files needs to be modified to reference
the installed software, it is recommended that those changes be made
to the cached files **once** in the post-``install`` copy method
**after** the call to ``cache_extra_test_sources``. This will reduce
the amount of unnecessary work in the test method **and** avoid problems
running stand-alone tests in shared instances and facility deployments.
The ``filter_file`` function can be quite useful for such changes
(see :ref:`file-filtering`).
Below is a basic example of a test that relies on files from the installation.
This package method re-uses the contents of the ``examples`` subdirectory,
which is assumed to have all of the files implemented to allow ``make`` to
compile and link ``foo.c`` and ``bar.c`` against the package's installed
library.
.. code-block:: python

   class MyLibPackage(Package):
   class MyLibPackage(MakefilePackage):
       ...

       @run_after("install")
       def copy_test_files(self):
           srcs = ["tests",
                   join_path("examples", "foo.c"),
                   join_path("examples", "bar.c")]
           cache_extra_test_sources(self, srcs)
           cache_extra_test_sources(self, "examples")

       def test_foo(self):
           exe = "foo"
           src_dir = self.test_suite.current_test_cache_dir.examples
           with working_dir(src_dir):
               cc = which(os.environ["CC"])
               cc(
                   f"-L{self.prefix.lib}",
                   f"-I{self.prefix.include}",
                   f"{exe}.c",
                   "-o", exe
               )
               foo = which(exe)
               foo()

       def test_example(self):
           """build and run the examples"""
           examples_dir = self.test_suite.current_test_cache_dir.examples
           with working_dir(examples_dir):
               make = which("make")
               make()
In this case, the method copies the associated files from the build
stage, **after** the software is installed, to the package's test
cache directory. Then ``test_foo`` builds ``foo`` using ``foo.c``
before running the program.
               for program in ["foo", "bar"]:
                   with test_part(
                       self,
                       f"test_example_{program}",
                       purpose=f"ensure {program} runs"
                   ):
                       exe = Executable(program)
                       exe()
In this case, ``copy_test_files`` copies the associated files from the
build stage to the package's test cache directory under the installation
prefix. Running ``spack test run`` for the package results in Spack copying
the directory and its contents to the test stage directory. The
``working_dir`` context manager ensures the commands within it are executed
from the ``examples_dir``. The test builds the software using ``make`` before
running each executable, ``foo`` and ``bar``, as independent test parts.
.. note::
@@ -5561,43 +5722,18 @@ before running the program.
The key to copying files for stand-alone testing at build time is use
of the ``run_after`` directive, which ensures the associated files are
copied **after** the provided build stage where the files **and**
installation prefix are available.
copied **after** the provided build stage (``install``) when the installation
prefix **and** files are available.
These paths are **automatically copied** from cache to the test stage
directory prior to the execution of any stand-alone tests. Tests access
the files using the ``self.test_suite.current_test_cache_dir`` property.
In our example above, test methods can use the following paths to reference
the copy of each entry listed in ``srcs``, respectively:
The test method uses the path contained in the package's
``self.test_suite.current_test_cache_dir`` property for the root directory
of the copied files. In this case, that's the ``examples`` subdirectory.
* ``self.test_suite.current_test_cache_dir.tests``
* ``join_path(self.test_suite.current_test_cache_dir.examples, "foo.c")``
* ``join_path(self.test_suite.current_test_cache_dir.examples, "bar.c")``
.. admonition:: Library packages should build stand-alone tests
Library developers will want to build the associated tests
against their **installed** libraries before running them.
.. note::
While source and input files are generally recommended, binaries
**may** also be cached by the build process. Only you, as the package
writer or maintainer, know whether these files would be appropriate
for testing the installed software weeks to months later.
.. note::
If one or more of the copied files needs to be modified to reference
the installed software, it is recommended that those changes be made
to the cached files **once** in the ``copy_test_sources`` method and
***after** the call to ``cache_extra_test_sources()``. This will
reduce the amount of unnecessary work in the test method **and** avoid
problems testing in shared instances and facility deployments.
The ``filter_file`` function can be quite useful for such changes.
See :ref:`file manipulation <file-manipulation>`.
.. tip::
If you want to see more examples from packages that cache build files, run
``spack pkg grep cache_extra_test_sources | sed "s/\/package.py.*//g" | sort -u``
from the command line to get a list of the packages.
.. _cache_custom_files:
@@ -5605,8 +5741,9 @@ the copy of each entry listed in ``srcs``, respectively:
Adding custom files
"""""""""""""""""""
In some cases it can be useful to have files that can be used to build or
check the results of tests. Examples include:
Sometimes it is helpful or necessary to include custom files for building and
or checking the results of tests as part of the package. Examples of the types
of files that might be useful are:
- test source files
- test input files
@@ -5614,17 +5751,15 @@ check the results of tests. Examples include:
- expected test outputs
While obtaining such files from the software repository is preferred (see
:ref:`adding build-time files <cache_extra_test_sources>`), there are
circumstances where that is not feasible (e.g., the software is not being
actively maintained). When test files can't be obtained from the repository
or as a supplement to files that can, Spack supports the inclusion of
additional files under the ``test`` subdirectory of the package in the
Spack repository.
:ref:`cache_extra_test_sources`), there are circumstances where doing so is not
feasible such as when the software is not being actively maintained. When test
files cannot be obtained from the repository or there is a need to supplement
files that can, Spack supports the inclusion of additional files under the
``test`` subdirectory of the package in the Spack repository.
Spack **automatically copies** the contents of that directory to the
test staging directory prior to running stand-alone tests. Test methods
access those files using the ``self.test_suite.current_test_data_dir``
property as shown below.
The following example assumes a ``custom-example.cpp`` is saved in the ``MyLibrary``
package's ``test`` subdirectory. It also assumes the program simply needs to
be compiled and linked against the installed ``MyLibrary`` software.
.. code-block:: python
@@ -5634,17 +5769,29 @@ property as shown below.
   test_requires_compiler = True
   ...

   def test_example(self):
   def test_custom_example(self):
       """build and run custom-example"""
       data_dir = self.test_suite.current_test_data_dir
       src_dir = self.test_suite.current_test_data_dir
       exe = "custom-example"
       src = datadir.join(f"{exe}.cpp")
       ...
       # TODO: Build custom-example using src and exe
       ...
       custom_example = which(exe)
       custom_example()
       with working_dir(src_dir):
           cc = which(os.environ["CC"])
           cc(
               f"-L{self.prefix.lib}",
               f"-I{self.prefix.include}",
               f"{exe}.cpp",
               "-o", exe
           )

           custom_example = Executable(exe)
           custom_example()
In this case, ``spack test run`` for the package results in Spack copying
the contents of the ``test`` subdirectory to the test stage directory path
in ``self.test_suite.current_test_data_dir`` before calling
``test_custom_example``. Use of the ``working_dir`` context manager
ensures the commands to build and run the program are performed from
within the appropriate subdirectory of the test stage.
.. _expected_test_output_from_file:
@@ -5653,9 +5800,8 @@ Reading expected output from a file
"""""""""""""""""""""""""""""""""""
The helper function ``get_escaped_text_output`` is available for packages
to retrieve and properly format the text from a file that contains the
expected output from running an executable that may contain special
characters.
to retrieve properly formatted text from a file potentially containing
special characters.
The signature for ``get_escaped_text_output`` is:
@@ -5665,10 +5811,13 @@ The signature for ``get_escaped_text_output`` is:
where ``filename`` is the path to the file containing the expected output.
The ``filename`` for a :ref:`custom file <cache_custom_files>` can be
accessed by tests using the ``self.test_suite.current_test_data_dir``
property. The example below illustrates how to read a file that was
added to the package's ``test`` subdirectory.
The path provided to ``filename`` for one of the copied custom files
(:ref:`custom file <cache_custom_files>`) is in the path rooted at
``self.test_suite.current_test_data_dir``.
The example below shows how to reference both the custom database
(``packages.db``) and expected output (``dump.out``) files Spack copies
to the test stage:
.. code-block:: python
@@ -5690,8 +5839,9 @@ added to the package's ``test`` subdirectory.
   for exp in expected:
       assert re.search(exp, out), f"Expected '{exp}' in output"
If the file was instead copied from the ``tests`` subdirectory of the staged
source code, the path would be obtained as shown below.
If the files were instead cached from installing the software, the paths to the
two files would be found under the ``self.test_suite.current_test_cache_dir``
directory as shown below:
.. code-block:: python
@@ -5699,17 +5849,24 @@ source code, the path would be obtained as shown below.
"""check example table dump"""
test_cache_dir = self.test_suite.current_test_cache_dir
db_filename = test_cache_dir.join("packages.db")
..
expected = get_escaped_text_output(test_cache_dir.join("dump.out"))
...
Alternatively, if the file was copied to the ``share/tests`` subdirectory
as part of the installation process, the test could access the path as
follows:
Alternatively, if both files had been installed by the software into the
``share/tests`` subdirectory of the installation prefix, the paths to the
two files would be referenced as follows:
.. code-block:: python

   def test_example(self):
       """check example table dump"""
       db_filename = join_path(self.prefix.share.tests, "packages.db")
       db_filename = self.prefix.share.tests.join("packages.db")
       ...
       expected = get_escaped_text_output(
           self.prefix.share.tests.join("dump.out")
       )
       ...
.. _check_outputs:
@@ -5717,9 +5874,9 @@ follows:
Comparing expected to actual outputs
""""""""""""""""""""""""""""""""""""
The helper function ``check_outputs`` is available for packages to ensure
the expected outputs from running an executable are contained within the
actual outputs.
The ``check_outputs`` helper routine is available for packages to ensure
multiple expected outputs from running an executable are contained within
the actual outputs.
The signature for ``check_outputs`` is:
@@ -5745,11 +5902,17 @@ Invoking the method is the equivalent of:
if errors:
raise RuntimeError("\n ".join(errors))
.. tip::
If you want to see more examples from packages that use this helper, run
``spack pkg grep check_outputs | sed "s/\/package.py.*//g" | sort -u``
from the command line to get a list of the packages.
.. _accessing-files:
"""""""""""""""""""""""""""""""""""""""""
Accessing package- and test-related files
Finding package- and test-related files
"""""""""""""""""""""""""""""""""""""""""
You may need to access files from one or more locations when writing
@@ -5758,8 +5921,7 @@ include test source files or includes them but has no way to build the
executables using the installed headers and libraries. In these cases
you may need to reference the files relative to one or more root directory.
The table below lists relevant path properties and provides additional
examples of their use.
:ref:`Reading expected output <expected_test_output_from_file>` provides
examples of their use. See :ref:`expected_test_output_from_file` for
examples of accessing files saved from the software repository, package
repository, and installation.
@@ -5788,7 +5950,6 @@ repository, and installation.
- ``self.test_suite.current_test_data_dir``
- ``join_path(self.test_suite.current_test_data_dir, "hello.f90")``
.. _inheriting-tests:
""""""""""""""""""""""""""""
@@ -5831,7 +5992,7 @@ maintainers provide additional stand-alone tests customized to the package.
.. warning::
Any package that implements a test method with the same name as an
inherited method overrides the inherited method. If that is not the
inherited method will override the inherited method. If that is not the
goal and you are not explicitly calling and adding functionality to
the inherited method for the test, then make sure that all test methods
and embedded test parts have unique test names.
@@ -5996,6 +6157,8 @@ running:
This is already part of the boilerplate for packages created with
``spack create``.
.. _file-filtering:
^^^^^^^^^^^^^^^^^^^
Filtering functions
^^^^^^^^^^^^^^^^^^^

View File

@@ -253,17 +253,6 @@ can easily happen if it is not updated frequently, this behavior ensures that
spack has a way to know for certain about the status of any concrete spec on
the remote mirror, but can slow down pipeline generation significantly.
The ``--optimize`` argument is experimental and runs the generated pipeline
document through a series of optimization passes designed to reduce the size
of the generated file.
The ``--dependencies`` is also experimental and disables what in Gitlab is
referred to as DAG scheduling, internally using the ``dependencies`` keyword
rather than ``needs`` to list dependency jobs. The drawback of using this option
is that before any job can begin, all jobs in previous stages must first
complete. The benefit is that Gitlab allows more dependencies to be listed
when using ``dependencies`` instead of ``needs``.
The optional ``--output-file`` argument should be an absolute path (including
file name) to the generated pipeline, and if not given, the default is
``./.gitlab-ci.yml``.

View File

@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinx==7.4.7
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.18.0
urllib3==2.2.1
pytest==8.2.2
urllib3==2.2.2
pytest==8.3.1
isort==5.13.2
black==24.4.2
flake8==7.0.0
mypy==1.10.0
flake8==7.1.0
mypy==1.11.0

View File

@@ -33,8 +33,23 @@
pass
esc, bell, lbracket, bslash, newline = r"\x1b", r"\x07", r"\[", r"\\", r"\n"
# Ansi Control Sequence Introducers (CSI) are a well-defined format
# Standard ECMA-48: Control Functions for Character-Imaging I/O Devices, section 5.4
# https://www.ecma-international.org/wp-content/uploads/ECMA-48_5th_edition_june_1991.pdf
csi_pre = f"{esc}{lbracket}"
csi_param, csi_inter, csi_post = r"[0-?]", r"[ -/]", r"[@-~]"
ansi_csi = f"{csi_pre}{csi_param}*{csi_inter}*{csi_post}"
# General ansi escape sequences have well-defined prefixes,
# but content and suffixes are less reliable.
# Conservatively assume they end with either "<ESC>\" or "<BELL>",
# with no intervening "<ESC>"/"<BELL>" keys or newlines
esc_pre = f"{esc}[@-_]"
esc_content = f"[^{esc}{bell}{newline}]"
esc_post = f"(?:{esc}{bslash}|{bell})"
ansi_esc = f"{esc_pre}{esc_content}*{esc_post}"
# Use this to strip escape sequences
_escape = re.compile(r"\x1b[^m]*m|\x1b\[?1034h|\x1b\][0-9]+;[^\x07]*\x07")
_escape = re.compile(f"{ansi_csi}|{ansi_esc}")
# control characters for enabling/disabling echo
#
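A standalone sketch of how the patterns above behave (not part of the diff; the
sample strings are made up):
```python
import re

esc, bell, lbracket, bslash, newline = r"\x1b", r"\x07", r"\[", r"\\", r"\n"
csi_pre = f"{esc}{lbracket}"
csi_param, csi_inter, csi_post = r"[0-?]", r"[ -/]", r"[@-~]"
ansi_csi = f"{csi_pre}{csi_param}*{csi_inter}*{csi_post}"   # e.g. "\x1b[1;31m"
esc_pre, esc_content = f"{esc}[@-_]", f"[^{esc}{bell}{newline}]"
esc_post = f"(?:{esc}{bslash}|{bell})"
ansi_esc = f"{esc_pre}{esc_content}*{esc_post}"             # e.g. "\x1b]0;title\x07"
_escape = re.compile(f"{ansi_csi}|{ansi_esc}")

# CSI color codes are stripped; plain text survives
assert _escape.sub("", "\x1b[1;31mred\x1b[0m") == "red"
# OSC window-title sequences terminated by BEL are stripped too
assert _escape.sub("", "\x1b]0;title\x07text") == "text"
```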

View File

@@ -791,7 +791,7 @@ def check_virtual_with_variants(spec, msg):
return
error = error_cls(
f"{pkg_name}: {msg}",
f"remove variants from '{spec}' in depends_on directive in {filename}",
[f"remove variants from '{spec}' in depends_on directive in {filename}"],
)
errors.append(error)

View File

@@ -129,10 +129,10 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
configuration_paths = (spack.config.CONFIGURATION_DEFAULTS_PATH, ("bootstrap", _config_path()))
for name, path in configuration_paths:
platform = spack.platforms.host().name
platform_scope = spack.config.ConfigScope(
"/".join([name, platform]), os.path.join(path, platform)
platform_scope = spack.config.DirectoryConfigScope(
f"{name}/{platform}", os.path.join(path, platform)
)
generic_scope = spack.config.ConfigScope(name, path)
generic_scope = spack.config.DirectoryConfigScope(name, path)
config_scopes.extend([generic_scope, platform_scope])
msg = "[BOOTSTRAP CONFIG SCOPE] name={0}, path={1}"
tty.debug(msg.format(generic_scope.name, generic_scope.path))

View File

@@ -72,6 +72,7 @@
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.executable
import spack.util.path
import spack.util.pattern
from spack import traverse
@@ -458,10 +459,7 @@ def set_wrapper_variables(pkg, env):
# Find ccache binary and hand it to build environment
if spack.config.get("config:ccache"):
ccache = Executable("ccache")
if not ccache:
raise RuntimeError("No ccache binary found in PATH")
env.set(SPACK_CCACHE_BINARY, ccache)
env.set(SPACK_CCACHE_BINARY, spack.util.executable.which_string("ccache", required=True))
# Gather information about various types of dependencies
link_deps = set(pkg.spec.traverse(root=False, deptype=("link")))
@@ -1475,7 +1473,7 @@ def long_message(self):
out.write(" {0}\n".format(self.log_name))
# Also output the test log path IF it exists
if self.context != "test":
if self.context != "test" and have_log:
test_log = join_path(os.path.dirname(self.log_name), spack_install_test_log)
if os.path.isfile(test_log):
out.write("\nSee test log for details:\n")

View File

@@ -124,6 +124,8 @@ def cuda_flags(arch_list):
# minimum supported versions
conflicts("%gcc@:4", when="+cuda ^cuda@11.0:")
conflicts("%gcc@:5", when="+cuda ^cuda@11.4:")
conflicts("%gcc@:7.2", when="+cuda ^cuda@12.4:")
conflicts("%clang@:6", when="+cuda ^cuda@12.2:")
# maximum supported version
# NOTE:
@@ -136,14 +138,14 @@ def cuda_flags(arch_list):
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.4")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.5")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.4")
conflicts("%clang@18:", when="+cuda ^cuda@:12.5")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")
@@ -211,12 +213,16 @@ def cuda_flags(arch_list):
conflicts("%intel@19.0:", when="+cuda ^cuda@:10.0")
conflicts("%intel@19.1:", when="+cuda ^cuda@:10.1")
conflicts("%intel@19.2:", when="+cuda ^cuda@:11.1.0")
conflicts("%intel@2021:", when="+cuda ^cuda@:11.4.0")
# XL is mostly relevant for ppc64le Linux
conflicts("%xl@:12,14:", when="+cuda ^cuda@:9.1")
conflicts("%xl@:12,14:15,17:", when="+cuda ^cuda@9.2")
conflicts("%xl@:12,17:", when="+cuda ^cuda@:11.1.0")
# PowerPC.
conflicts("target=ppc64le", when="+cuda ^cuda@12.5:")
# Darwin.
# TODO: add missing conflicts for %apple-clang cuda@:10
conflicts("platform=darwin", when="+cuda ^cuda@11.0.2: ")
conflicts("platform=darwin", when="+cuda ^cuda@11.0.2:")


@@ -72,7 +72,7 @@ def build_directory(self):
def build_args(self):
"""Arguments for ``go build``."""
# Pass ldflags -s = --strip-all and -w = --no-warnings by default
return ["-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
return ["-modcacherw", "-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
@property
def check_args(self):


@@ -34,6 +34,8 @@ def _misc_cache():
return spack.util.file_cache.FileCache(path)
FileCacheType = Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton]
#: Spack's cache for small data
MISC_CACHE: Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton] = (
llnl.util.lang.Singleton(_misc_cache)


@@ -22,6 +22,8 @@
from urllib.parse import urlencode
from urllib.request import HTTPHandler, Request, build_opener
import ruamel.yaml
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -551,10 +553,9 @@ def generate_gitlab_ci_yaml(
env,
print_summary,
output_file,
*,
prune_dag=False,
check_index_only=False,
run_optimizer=False,
use_dependencies=False,
artifacts_root=None,
remote_mirror_override=None,
):
@@ -575,12 +576,6 @@ def generate_gitlab_ci_yaml(
this mode results in faster yaml generation time). Otherwise, also
check each spec directly by url (useful if there is no index or it
might be out of date).
run_optimizer (bool): If True, post-process the generated yaml to try
to reduce the size (attempts to collect repeated configuration
and replace it with definitions).
use_dependencies (bool): If true, use "dependencies" rather than "needs"
("needs" allows DAG scheduling). Useful if gitlab instance cannot
be configured to handle more than a few "needs" per job.
artifacts_root (str): Path where artifacts like logs, environment
files (spack.yaml, spack.lock), etc should be written. GitLab
requires this to be within the project directory.
@@ -814,7 +809,8 @@ def ensure_expected_target_path(path):
cli_scopes = [
os.path.relpath(s.path, concrete_env_dir)
for s in cfg.scopes().values()
if isinstance(s, cfg.ImmutableConfigScope)
if not s.writable
and isinstance(s, (cfg.DirectoryConfigScope))
and s.path not in env_includes
and os.path.exists(s.path)
]
@@ -1271,17 +1267,6 @@ def main_script_replacements(cmd):
with open(copy_specs_file, "w") as fd:
fd.write(json.dumps(buildcache_copies))
# TODO(opadron): remove this or refactor
if run_optimizer:
import spack.ci_optimization as ci_opt
output_object = ci_opt.optimizer(output_object)
# TODO(opadron): remove this or refactor
if use_dependencies:
import spack.ci_needs_workaround as cinw
output_object = cinw.needs_to_dependencies(output_object)
else:
# No jobs were generated
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
@@ -1310,8 +1295,11 @@ def main_script_replacements(cmd):
if not rebuild_everything:
sys.exit(1)
with open(output_file, "w") as outf:
outf.write(syaml.dump(sorted_output, default_flow_style=True))
# Minimize yaml output size through use of anchors
syaml.anchorify(sorted_output)
with open(output_file, "w") as f:
ruamel.yaml.YAML().dump(sorted_output, f)
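The optimizer passes are replaced by YAML anchors: `syaml.anchorify` is assumed here to rewrite identical subtrees as shared objects, which ruamel then emits once and aliases elsewhere. A minimal, self-contained sketch of the effect:

```python
import io

import ruamel.yaml

# Two jobs referencing the *same* list object, as anchorify would arrange:
shared_script = ["spack ci rebuild"]
doc = {"job-a": {"script": shared_script}, "job-b": {"script": shared_script}}

buf = io.StringIO()
ruamel.yaml.YAML().dump(doc, buf)
print(buf.getvalue())
# job-a:
#   script: &id001
#   - spack ci rebuild
# job-b:
#   script: *id001
```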
def _url_encode_string(input_string):


@@ -1,34 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
get_job_name = lambda needs_entry: (
needs_entry.get("job")
if (isinstance(needs_entry, collections.abc.Mapping) and needs_entry.get("artifacts", True))
else needs_entry if isinstance(needs_entry, str) else None
)
def convert_job(job_entry):
if not isinstance(job_entry, collections.abc.Mapping):
return job_entry
needs = job_entry.get("needs")
if needs is None:
return job_entry
new_job = {}
new_job.update(job_entry)
del new_job["needs"]
new_job["dependencies"] = list(
filter((lambda x: x is not None), (get_job_name(needs_entry) for needs_entry in needs))
)
return new_job
def needs_to_dependencies(yaml):
return dict((k, convert_job(v)) for k, v in yaml.items())
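For reference, a worked example of the rewrite this removed workaround performed (assuming the definitions above are in scope):

```python
job = {"script": ["make"], "needs": [{"job": "build", "artifacts": True}, "setup"]}
assert convert_job(job) == {"script": ["make"], "dependencies": ["build", "setup"]}
```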


@@ -1,363 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import collections.abc
import copy
import hashlib
import spack.util.spack_yaml as syaml
def sort_yaml_obj(obj):
if isinstance(obj, collections.abc.Mapping):
return syaml.syaml_dict(
(k, sort_yaml_obj(v)) for k, v in sorted(obj.items(), key=(lambda item: str(item[0])))
)
if isinstance(obj, collections.abc.Sequence) and not isinstance(obj, str):
return syaml.syaml_list(sort_yaml_obj(x) for x in obj)
return obj
def matches(obj, proto):
"""Returns True if the test object "obj" matches the prototype object
"proto".
If obj and proto are mappings, obj matches proto if (key in obj) and
(obj[key] matches proto[key]) for every key in proto.
If obj and proto are sequences, obj matches proto if they are of the same
length and (a matches b) for every (a,b) in zip(obj, proto).
Otherwise, obj matches proto if obj == proto.
Precondition: proto must not have any reference cycles
"""
if isinstance(obj, collections.abc.Mapping):
if not isinstance(proto, collections.abc.Mapping):
return False
return all((key in obj and matches(obj[key], val)) for key, val in proto.items())
if isinstance(obj, collections.abc.Sequence) and not isinstance(obj, str):
if not (isinstance(proto, collections.abc.Sequence) and not isinstance(proto, str)):
return False
if len(obj) != len(proto):
return False
return all(matches(obj[index], val) for index, val in enumerate(proto))
return obj == proto
def subkeys(obj, proto):
"""Returns the test mapping "obj" after factoring out the items it has in
common with the prototype mapping "proto".
Consider a recursive merge operation, merge(a, b) on mappings a and b, that
returns a mapping, m, whose keys are the union of the keys of a and b, and
for every such key, "k", its corresponding value is:
- merge(a[key], b[key]) if a[key] and b[key] are mappings, or
- b[key] if (key in b) and not matches(a[key], b[key]),
or
- a[key] otherwise
If obj and proto are mappings, the returned object is the smallest object,
"a", such that merge(a, proto) matches obj.
Otherwise, obj is returned.
"""
if not (
isinstance(obj, collections.abc.Mapping) and isinstance(proto, collections.abc.Mapping)
):
return obj
new_obj = {}
for key, value in obj.items():
if key not in proto:
new_obj[key] = value
continue
if matches(value, proto[key]) and matches(proto[key], value):
continue
if isinstance(value, collections.abc.Mapping):
new_obj[key] = subkeys(value, proto[key])
continue
new_obj[key] = value
return new_obj
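A worked example of the two helpers together (definitions above assumed in scope):

```python
job = {"tags": ["x86_64"], "script": ["make"], "retry": 2}
proto = {"tags": ["x86_64"], "script": ["make"]}

assert matches(job, proto)                  # every key in proto matches in job
assert subkeys(job, proto) == {"retry": 2}  # the shared part factors out
```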
def add_extends(yaml, key):
"""Modifies the given object "yaml" so that it includes an "extends" key
whose value features "key".
If "extends" is not in yaml, then yaml is modified such that
yaml["extends"] == key.
If yaml["extends"] is a str, then yaml is modified such that
yaml["extends"] == [yaml["extends"], key]
If yaml["extends"] is a list that does not include key, then key is
appended to the list.
Otherwise, yaml is left unchanged.
"""
has_key = "extends" in yaml
extends = yaml.get("extends")
if has_key and not isinstance(extends, (str, collections.abc.Sequence)):
return
if extends is None:
yaml["extends"] = key
return
if isinstance(extends, str):
if extends != key:
yaml["extends"] = [extends, key]
return
if key not in extends:
extends.append(key)
def common_subobject(yaml, sub):
"""Factor prototype object "sub" out of the values of mapping "yaml".
Consider a modified copy of yaml, "new", where for each key, "key" in yaml:
- If yaml[key] matches sub, then new[key] = subkeys(yaml[key], sub).
- Otherwise, new[key] = yaml[key].
If the above match criterion is not satisfied for any such key, then (yaml,
None) is returned and the yaml object is left unchanged.
Otherwise, each matching value in new is modified as in
add_extends(new[key], common_key), and then new[common_key] is set to sub.
The common_key value is chosen such that it does not match any preexisting
key in new. In this case, (new, common_key) is returned.
"""
match_list = set(k for k, v in yaml.items() if matches(v, sub))
if not match_list:
return yaml, None
common_prefix = ".c"
common_index = 0
while True:
common_key = "".join((common_prefix, str(common_index)))
if common_key not in yaml:
break
common_index += 1
new_yaml = {}
for key, val in yaml.items():
new_yaml[key] = copy.deepcopy(val)
if not matches(val, sub):
continue
new_yaml[key] = subkeys(new_yaml[key], sub)
add_extends(new_yaml[key], common_key)
new_yaml[common_key] = sub
return new_yaml, common_key
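Putting it together, a small worked example (definitions above in scope):

```python
jobs = {
    "job-a": {"tags": ["x86_64"], "script": ["make a"]},
    "job-b": {"tags": ["x86_64"], "script": ["make b"]},
}
new, key = common_subobject(jobs, {"tags": ["x86_64"]})
assert key == ".c0"
assert new[".c0"] == {"tags": ["x86_64"]}
assert new["job-a"] == {"script": ["make a"], "extends": ".c0"}
```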
def print_delta(name, old, new, applied=None):
delta = new - old
reldelta = (1000 * delta) // old
reldelta = (reldelta // 10, reldelta % 10)
if applied is None:
applied = new <= old
print(
"\n".join(
(
"{0} {1}:",
" before: {2: 10d}",
" after : {3: 10d}",
" delta : {4:+10d} ({5:=+3d}.{6}%)",
)
).format(name, ("+" if applied else "x"), old, new, delta, reldelta[0], reldelta[1])
)
def try_optimization_pass(name, yaml, optimization_pass, *args, **kwargs):
"""Try applying an optimization pass and return information about the
result
"name" is a string describing the nature of the pass. If it is a non-empty
string, summary statistics are also printed to stdout.
"yaml" is the object to apply the pass to.
"optimization_pass" is the function implementing the pass to be applied.
"args" and "kwargs" are the additional arguments to pass to optimization
pass. The pass is applied as
>>> (new_yaml, *other_results) = optimization_pass(yaml, *args, **kwargs)
The pass's results are greedily rejected if it does not modify the original
yaml document, or if it produces a yaml document that serializes to a
larger string.
Returns (new_yaml, yaml, applied, other_results) if applied, or
(yaml, new_yaml, applied, other_results) otherwise.
"""
result = optimization_pass(yaml, *args, **kwargs)
new_yaml, other_results = result[0], result[1:]
if new_yaml is yaml:
# pass was not applied
return (yaml, new_yaml, False, other_results)
pre_size = len(syaml.dump_config(sort_yaml_obj(yaml), default_flow_style=True))
post_size = len(syaml.dump_config(sort_yaml_obj(new_yaml), default_flow_style=True))
# pass makes the size worse: not applying
applied = post_size <= pre_size
if applied:
yaml, new_yaml = new_yaml, yaml
if name:
print_delta(name, pre_size, post_size, applied)
return (yaml, new_yaml, applied, other_results)
def build_histogram(iterator, key):
"""Builds a histogram of values given an iterable of mappings and a key.
For each mapping "m" with key "key" in iterator, the value m[key] is
considered.
Returns a list of tuples (hash, count, proportion, value), where
- "hash" is a sha1sum hash of the value.
- "count" is the number of occurences of values that hash to "hash".
- "proportion" is the proportion of all values considered above that
hash to "hash".
- "value" is one of the values considered above that hash to "hash".
Which value is chosen when multiple values hash to the same "hash" is
undefined.
The list is sorted in descending order by count, yielding the most
frequently occurring hashes first.
"""
buckets = collections.defaultdict(int)
values = {}
num_objects = 0
for obj in iterator:
num_objects += 1
try:
val = obj[key]
except (KeyError, TypeError):
continue
value_hash = hashlib.sha1()
value_hash.update(syaml.dump_config(sort_yaml_obj(val)).encode())
value_hash = value_hash.hexdigest()
buckets[value_hash] += 1
values[value_hash] = val
return [
(h, buckets[h], float(buckets[h]) / num_objects, values[h])
for h in sorted(buckets.keys(), key=lambda k: -buckets[k])
]
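And a sketch of the histogram helper; note it hashes values via their YAML dump, so `syaml` must be importable:

```python
jobs = [{"tags": ["x86_64"]}, {"tags": ["x86_64"]}, {"tags": ["aarch64"]}, {}]
top_hash, count, proportion, value = build_histogram(jobs, "tags")[0]
assert (count, value) == (2, ["x86_64"])
assert proportion == 0.5  # 2 of all 4 objects considered, keyless ones included
```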
def optimizer(yaml):
original_size = len(syaml.dump_config(sort_yaml_obj(yaml), default_flow_style=True))
# try factoring out commonly repeated portions
common_job = {
"variables": {"SPACK_COMPILER_ACTION": "NONE"},
"after_script": ['rm -rf "./spack"'],
"artifacts": {"paths": ["jobs_scratch_dir", "cdash_report"], "when": "always"},
}
# look for a list of tags that appear frequently
_, count, proportion, tags = next(iter(build_histogram(yaml.values(), "tags")), (None,) * 4)
# If a list of tags is found, and there are more than one job that uses it,
# *and* the jobs that do use it represent at least 70% of all jobs, then
# add the list to the prototype object.
if tags and count > 1 and proportion >= 0.70:
common_job["tags"] = tags
# apply common object factorization
yaml, other, applied, rest = try_optimization_pass(
"general common object factorization", yaml, common_subobject, common_job
)
# look for a common script, and try factoring that out
_, count, proportion, script = next(
iter(build_histogram(yaml.values(), "script")), (None,) * 4
)
if script and count > 1 and proportion >= 0.70:
yaml, other, applied, rest = try_optimization_pass(
"script factorization", yaml, common_subobject, {"script": script}
)
# look for a common before_script, and try factoring that out
_, count, proportion, script = next(
iter(build_histogram(yaml.values(), "before_script")), (None,) * 4
)
if script and count > 1 and proportion >= 0.70:
yaml, other, applied, rest = try_optimization_pass(
"before_script factorization", yaml, common_subobject, {"before_script": script}
)
# Look specifically for the SPACK_ROOT_SPEC environment variables.
# Try to factor them out.
h = build_histogram(
(getattr(val, "get", lambda *args: {})("variables") for val in yaml.values()),
"SPACK_ROOT_SPEC",
)
# In this case, we try to factor out *all* instances of the SPACK_ROOT_SPEC
# environment variable; not just the one that appears with the greatest
# frequency. We only require that more than 1 job uses a given instance's
# value, because we expect the value to be very large, and so expect even
# few-to-one factorizations to yield large space savings.
counter = 0
for _, count, proportion, spec in h:
if count <= 1:
continue
counter += 1
yaml, other, applied, rest = try_optimization_pass(
"SPACK_ROOT_SPEC factorization ({count})".format(count=counter),
yaml,
common_subobject,
{"variables": {"SPACK_ROOT_SPEC": spec}},
)
new_size = len(syaml.dump_config(sort_yaml_obj(yaml), default_flow_style=True))
print("\n")
print_delta("overall summary", original_size, new_size)
print("\n")
return yaml


@@ -336,6 +336,7 @@ def display_specs(specs, args=None, **kwargs):
groups (bool): display specs grouped by arch/compiler (default True)
decorator (typing.Callable): function to call to decorate specs
all_headers (bool): show headers even when arch/compiler aren't defined
status_fn (typing.Callable): if provided, prepend install-status info
output (typing.IO): A file object to write to. Default is ``sys.stdout``
"""
@@ -359,6 +360,7 @@ def get_arg(name, default=None):
groups = get_arg("groups", True)
all_headers = get_arg("all_headers", False)
output = get_arg("output", sys.stdout)
status_fn = get_arg("status_fn", None)
decorator = get_arg("decorator", None)
if decorator is None:
@@ -386,6 +388,13 @@ def get_arg(name, default=None):
def fmt(s, depth=0):
"""Formatter function for all output specs"""
string = ""
if status_fn:
# This was copied from spec.tree's colorization logic
# then shortened because it seems like status_fn should
# always return an InstallStatus
string += colorize(status_fn(s).value)
if hashes:
string += gray_hash(s, hlen) + " "
string += depth * " "
@@ -444,7 +453,7 @@ def format_list(specs):
def filter_loaded_specs(specs):
"""Filter a list of specs returning only those that are
currently loaded."""
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(":")
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(os.pathsep)
return [x for x in specs if x.dag_hash() in hashes]
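Splitting on `os.pathsep` makes the loaded-hash list parse correctly on Windows as well:

```python
import os

# os.pathsep is ":" on POSIX and ";" on Windows
joined = os.pathsep.join(["abc123", "def456"])
assert joined.split(os.pathsep) == ["abc123", "def456"]
```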


@@ -165,7 +165,7 @@ def _reset(args):
if not ok_to_continue:
raise RuntimeError("Aborting")
for scope in spack.config.CONFIG.file_scopes:
for scope in spack.config.CONFIG.writable_scopes:
# The default scope should stay untouched
if scope.name == "defaults":
continue


@@ -70,12 +70,6 @@ def setup_parser(subparser: argparse.ArgumentParser):
push = subparsers.add_parser("push", aliases=["create"], help=push_fn.__doc__)
push.add_argument("-f", "--force", action="store_true", help="overwrite tarball if it exists")
push.add_argument(
"--allow-root",
"-a",
action="store_true",
help="allow install root string in binary files after RPATH substitution",
)
push_sign = push.add_mutually_exclusive_group(required=False)
push_sign.add_argument(
"--unsigned",
@@ -190,10 +184,6 @@ def setup_parser(subparser: argparse.ArgumentParser):
keys.add_argument("-f", "--force", action="store_true", help="force new download of keys")
keys.set_defaults(func=keys_fn)
preview = subparsers.add_parser("preview", help=preview_fn.__doc__)
arguments.add_common_arguments(preview, ["installed_specs"])
preview.set_defaults(func=preview_fn)
# Check if binaries need to be rebuilt on remote mirror
check = subparsers.add_parser("check", help=check_fn.__doc__)
check.add_argument(
@@ -404,11 +394,6 @@ def push_fn(args):
else:
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
if args.allow_root:
tty.warn(
"The flag `--allow-root` is the default in Spack 0.21, will be removed in Spack 0.22"
)
mirror: spack.mirror.Mirror = args.mirror
# Check if this is an OCI image.
@@ -960,14 +945,6 @@ def keys_fn(args):
bindist.get_keys(args.install, args.trust, args.force)
def preview_fn(args):
"""analyze an installed spec and reports whether executables and libraries are relocatable"""
tty.warn(
"`spack buildcache preview` is deprecated since `spack buildcache push --allow-root` is "
"now the default. This command will be removed in Spack 0.22"
)
def check_fn(args: argparse.Namespace):
"""check specs against remote binary mirror(s) to see if any need to be rebuilt


@@ -6,6 +6,7 @@
import json
import os
import shutil
import warnings
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
@@ -73,7 +74,7 @@ def setup_parser(subparser):
"--optimize",
action="store_true",
default=False,
help="(experimental) optimize the gitlab yaml file for size\n\n"
help="(DEPRECATED) optimize the gitlab yaml file for size\n\n"
"run the generated document through a series of optimization passes "
"designed to reduce the size of the generated file",
)
@@ -81,7 +82,7 @@ def setup_parser(subparser):
"--dependencies",
action="store_true",
default=False,
help="(experimental) disable DAG scheduling (use 'plain' dependencies)",
help="(DEPRECATED) disable DAG scheduling (use 'plain' dependencies)",
)
generate.add_argument(
"--buildcache-destination",
@@ -200,6 +201,18 @@ def ci_generate(args):
before invoking this command. the value must be the CDash authorization token needed to create
a build group and register all generated jobs under it
"""
if args.optimize:
warnings.warn(
"The --optimize option has been deprecated, and currently has no effect. "
"It will be removed in Spack v0.24."
)
if args.dependencies:
warnings.warn(
"The --dependencies option has been deprecated, and currently has no effect. "
"It will be removed in Spack v0.24."
)
env = spack.cmd.require_active_env(cmd_name="ci generate")
if args.copy_to:
@@ -212,8 +225,6 @@ def ci_generate(args):
output_file = args.output_file
copy_yaml_to = args.copy_to
run_optimizer = args.optimize
use_dependencies = args.dependencies
prune_dag = args.prune_dag
index_only = args.index_only
artifacts_root = args.artifacts_root
@@ -234,8 +245,6 @@ def ci_generate(args):
output_file,
prune_dag=prune_dag,
check_index_only=index_only,
run_optimizer=run_optimizer,
use_dependencies=use_dependencies,
artifacts_root=artifacts_root,
remote_mirror_override=buildcache_destination,
)


@@ -156,7 +156,7 @@ def print_flattened_configuration(*, blame: bool) -> None:
"""
env = ev.active_environment()
if env is not None:
pristine = env.manifest.pristine_yaml_content
pristine = env.manifest.yaml_content
flattened = pristine.copy()
flattened[spack.schema.env.TOP_LEVEL_KEY] = pristine[spack.schema.env.TOP_LEVEL_KEY].copy()
else:
@@ -264,7 +264,9 @@ def config_remove(args):
def _can_update_config_file(scope: spack.config.ConfigScope, cfg_file):
if isinstance(scope, spack.config.SingleFileScope):
return fs.can_access(cfg_file)
return fs.can_write_to_dir(scope.path) and fs.can_access(cfg_file)
elif isinstance(scope, spack.config.DirectoryConfigScope):
return fs.can_write_to_dir(scope.path) and fs.can_access(cfg_file)
return False
def _config_change_requires_scope(path, spec, scope, match_spec=None):
@@ -362,14 +364,11 @@ def config_change(args):
def config_update(args):
# Read the configuration files
spack.config.CONFIG.get_config(args.section, scope=args.scope)
updates: List[spack.config.ConfigScope] = list(
filter(
lambda s: not isinstance(
s, (spack.config.InternalConfigScope, spack.config.ImmutableConfigScope)
),
spack.config.CONFIG.format_updates[args.section],
)
)
updates: List[spack.config.ConfigScope] = [
x
for x in spack.config.CONFIG.format_updates[args.section]
if not isinstance(x, spack.config.InternalConfigScope) and x.writable
]
cannot_overwrite, skip_system_scope = [], False
for scope in updates:
@@ -447,7 +446,7 @@ def _can_revert_update(scope_dir, cfg_file, bkp_file):
def config_revert(args):
scopes = [args.scope] if args.scope else [x.name for x in spack.config.CONFIG.file_scopes]
scopes = [args.scope] if args.scope else [x.name for x in spack.config.CONFIG.writable_scopes]
# Search for backup files in the configuration scopes
Entry = collections.namedtuple("Entry", ["scope", "cfg", "bkp"])


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import sys
@@ -934,7 +933,7 @@ def get_repository(args, name):
# Figure out where the new package should live
repo_path = args.repo
if repo_path is not None:
repo = spack.repo.Repo(repo_path)
repo = spack.repo.from_path(repo_path)
if spec.namespace and spec.namespace != repo.namespace:
tty.die(
"Can't create package with namespace {0} in repo with "
@@ -942,9 +941,7 @@ def get_repository(args, name):
)
else:
if spec.namespace:
repo = spack.repo.PATH.get_repo(spec.namespace, None)
if not repo:
tty.die("Unknown namespace: '{0}'".format(spec.namespace))
repo = spack.repo.PATH.get_repo(spec.namespace)
else:
repo = spack.repo.PATH.first_repo()


@@ -47,16 +47,6 @@ def inverted_dependencies():
dependents of, e.g., `mpi`, but virtuals are not included as
actual dependents.
"""
dag = {}
for pkg_cls in spack.repo.PATH.all_package_classes():
dag.setdefault(pkg_cls.name, set())
for dep in pkg_cls.dependencies_by_name():
deps = [dep]
# expand virtuals if necessary
if spack.repo.PATH.is_virtual(dep):
deps += [s.name for s in spack.repo.PATH.providers_for(dep)]
dag = collections.defaultdict(set)
for pkg_cls in spack.repo.PATH.all_package_classes():
for _, deps_by_name in pkg_cls.dependencies.items():


@@ -9,6 +9,8 @@
import spack.cmd
import spack.config
import spack.fetch_strategy
import spack.repo
import spack.spec
import spack.util.path
import spack.version
@@ -69,13 +71,15 @@ def _retrieve_develop_source(spec, abspath):
# We construct a package class ourselves, rather than asking for
# Spec.package, since Spec only allows this when it is concrete
package = pkg_cls(spec)
if isinstance(package.stage[0].fetcher, spack.fetch_strategy.GitFetchStrategy):
package.stage[0].fetcher.get_full_repo = True
source_stage = package.stage[0]
if isinstance(source_stage.fetcher, spack.fetch_strategy.GitFetchStrategy):
source_stage.fetcher.get_full_repo = True
# If we retrieved this version before and cached it, we may have
# done so without cloning the full git repo; likewise, any
# mirror might store an instance with truncated history.
package.stage[0].disable_mirrors()
source_stage.disable_mirrors()
source_stage.fetcher.set_package(package)
package.stage.steal_source(abspath)


@@ -123,7 +123,7 @@ def edit(parser, args):
spack.util.editor.editor(*paths)
elif names:
if args.repo:
repo = spack.repo.Repo(args.repo)
repo = spack.repo.from_path(args.repo)
elif args.namespace:
repo = spack.repo.PATH.get_repo(args.namespace)
else:


@@ -7,7 +7,7 @@
import os
import re
import sys
from typing import List, Optional
from typing import List, Optional, Set
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
@@ -19,6 +19,7 @@
import spack.detection
import spack.error
import spack.repo
import spack.spec
import spack.util.environment
from spack.cmd.common import arguments
@@ -138,14 +139,26 @@ def external_find(args):
candidate_packages, path_hints=args.path, max_workers=args.jobs
)
new_entries = spack.detection.update_configuration(
new_specs = spack.detection.update_configuration(
detected_packages, scope=args.scope, buildable=not args.not_buildable
)
if new_entries:
# If the user runs `spack external find --not-buildable mpich` we also mark `mpi` non-buildable
# to avoid that the concretizer picks a different mpi provider.
if new_specs and args.not_buildable:
virtuals: Set[str] = {
virtual.name
for new_spec in new_specs
for virtual_specs in spack.repo.PATH.get_pkg_class(new_spec.name).provided.values()
for virtual in virtual_specs
}
new_virtuals = spack.detection.set_virtuals_nonbuildable(virtuals, scope=args.scope)
new_specs.extend(spack.spec.Spec(name) for name in new_virtuals)
if new_specs:
path = spack.config.CONFIG.get_config_filename(args.scope, "packages")
msg = "The following specs have been detected on this system and added to {0}"
tty.msg(msg.format(path))
spack.cmd.display_specs(new_entries)
tty.msg(f"The following specs have been detected on this system and added to {path}")
spack.cmd.display_specs(new_specs)
else:
tty.msg("No new external packages detected")


@@ -46,6 +46,10 @@ def setup_parser(subparser):
help="output specs as machine-readable json records",
)
subparser.add_argument(
"-I", "--install-status", action="store_true", help="show install status of packages"
)
subparser.add_argument(
"-d", "--deps", action="store_true", help="output dependencies along with found specs"
)
@@ -293,25 +297,24 @@ def root_decorator(spec, string):
)
print()
if args.show_concretized:
tty.msg("Concretized roots")
cmd.display_specs(env.specs_by_hash.values(), args, decorator=decorator)
print()
# Display a header for the installed packages section IF there are installed
# packages. If there aren't any, we'll just end up printing "0 installed packages"
# later.
if results and not args.only_roots:
tty.msg("Installed packages")
def find(parser, args):
q_args = query_arguments(args)
results = args.specs(**q_args)
env = ev.active_environment()
if not env and args.only_roots:
tty.die("-r / --only-roots requires an active environment")
if not env and args.show_concretized:
tty.die("-c / --show-concretized requires an active environment")
if env:
if args.constraint:
init_specs = spack.cmd.parse_specs(args.constraint)
results = env.all_matching_specs(*init_specs)
else:
results = env.all_specs()
else:
q_args = query_arguments(args)
results = args.specs(**q_args)
decorator = make_env_decorator(env) if env else lambda s, f: f
@@ -332,6 +335,11 @@ def find(parser, args):
if args.loaded:
results = spack.cmd.filter_loaded_specs(results)
if args.install_status or args.show_concretized:
status_fn = spack.spec.Spec.install_status
else:
status_fn = None
# Display the result
if args.json:
cmd.display_specs_as_json(results, deps=args.deps)
@@ -340,12 +348,34 @@ def find(parser, args):
if env:
display_env(env, args, decorator, results)
count_suffix = " (not shown)"
if not args.only_roots:
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = ""
display_results = results
if not args.show_concretized:
display_results = list(x for x in results if x.installed)
cmd.display_specs(
display_results, args, decorator=decorator, all_headers=True, status_fn=status_fn
)
# print number of installed packages last (as the list may be long)
if sys.stdout.isatty() and args.groups:
installed_suffix = ""
concretized_suffix = " to be installed"
if args.only_roots:
installed_suffix += " (not shown)"
concretized_suffix += " (not shown)"
else:
if env and not args.show_concretized:
concretized_suffix += " (show with `spack find -c`)"
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(results, pkg_type, suffix=count_suffix)
spack.cmd.print_how_many_pkgs(
list(x for x in results if x.installed), pkg_type, suffix=installed_suffix
)
if env:
spack.cmd.print_how_many_pkgs(
list(x for x in results if not x.installed),
"concretized",
suffix=concretized_suffix,
)


@@ -56,7 +56,6 @@ def roots_from_environments(args, active_env):
# -e says "also preserve things needed by this particular env"
for env_name_or_dir in args.except_environment:
print("HMM", env_name_or_dir)
if ev.exists(env_name_or_dir):
env = ev.read(env_name_or_dir)
elif ev.is_env_dir(env_name_or_dir):


@@ -169,7 +169,9 @@ def pkg_hash(args):
def get_grep(required=False):
"""Get a grep command to use with ``spack pkg grep``."""
return exe.which(os.environ.get("SPACK_GREP") or "grep", required=required)
grep = exe.which(os.environ.get("SPACK_GREP") or "grep", required=required)
grep.ignore_quotes = True # allow `spack pkg grep '"quoted string"'` without warning
return grep
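Since `ignore_quotes` is now an instance attribute, callers can opt out of the quoting warning per executable. A hedged sketch:

```python
import os

import spack.util.executable as exe

grep = exe.which(os.environ.get("SPACK_GREP") or "grep", required=False)
if grep is not None:
    grep.ignore_quotes = True
    # e.g. grep('"namespace"', "-r", "var/spack/repos")  # no warning emitted
```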
def pkg_grep(args, unknown_args):


@@ -91,7 +91,7 @@ def repo_add(args):
tty.die("Not a Spack repository: %s" % path)
# Make sure it's actually a spack repository by constructing it.
repo = spack.repo.Repo(canon_path)
repo = spack.repo.from_path(canon_path)
# If that succeeds, finally add it to the configuration.
repos = spack.config.get("repos", scope=args.scope)
@@ -124,7 +124,7 @@ def repo_remove(args):
# If it is a namespace, remove corresponding repo
for path in repos:
try:
repo = spack.repo.Repo(path)
repo = spack.repo.from_path(path)
if repo.namespace == namespace_or_path:
repos.remove(path)
spack.config.set("repos", repos, args.scope)
@@ -142,7 +142,7 @@ def repo_list(args):
repos = []
for r in roots:
try:
repos.append(spack.repo.Repo(r))
repos.append(spack.repo.from_path(r))
except spack.repo.RepoError:
continue


@@ -71,7 +71,7 @@ def unload(parser, args):
"Cannot specify specs on command line when unloading all specs with '--all'"
)
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(":")
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(os.pathsep)
if args.specs:
specs = [
spack.cmd.disambiguate_spec_from_hashes(spec, hashes)


@@ -18,7 +18,6 @@
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
import spack.compilers
import spack.error
import spack.schema.environment
import spack.spec


@@ -260,7 +260,7 @@ def _init_compiler_config(
def compiler_config_files():
config_files = list()
config = spack.config.CONFIG
for scope in config.file_scopes:
for scope in config.writable_scopes:
name = scope.name
compiler_config = config.get("compilers", scope=name)
if compiler_config:
@@ -488,7 +488,7 @@ def supported_compilers_for_host_platform() -> List[str]:
return supported_compilers_for_platform(host_plat)
def supported_compilers_for_platform(platform: spack.platforms.Platform) -> List[str]:
def supported_compilers_for_platform(platform: "spack.platforms.Platform") -> List[str]:
"""Return a set of compiler class objects supported by Spack
that are also supported by the provided platform


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from os.path import dirname
from os.path import dirname, join
from llnl.util import tty
@@ -135,8 +135,12 @@ def setup_custom_environment(self, pkg, env):
# It is located in the same directory as the driver. Error message:
# clang++: error: unable to execute command:
# Executable "sycl-post-link" doesn't exist!
if self.cxx:
# also ensures that shared objects and libraries required by the compiler,
# e.g. libonnx, can be found successfully
# due to a fix, this is no longer required for OneAPI versions >= 2024.2
if self.cxx and pkg.spec.satisfies("%oneapi@:2024.1"):
env.prepend_path("PATH", dirname(self.cxx))
env.prepend_path("LD_LIBRARY_PATH", join(dirname(dirname(self.cxx)), "lib"))
# 2024 release bumped the libsycl version because of an ABI
# change, 2024 compilers are required. You will see this


@@ -35,11 +35,10 @@
import os
import re
import sys
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Type, Union
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
from llnl.util import filesystem, lang, tty
import spack.compilers
import spack.paths
import spack.platforms
import spack.schema
@@ -117,21 +116,39 @@
class ConfigScope:
"""This class represents a configuration scope.
def __init__(self, name: str) -> None:
self.name = name
self.writable = False
self.sections = syaml.syaml_dict()
A scope is one directory containing named configuration files.
Each file is a config "section" (e.g., mirrors, compilers, etc.).
"""
def get_section_filename(self, section: str) -> str:
raise NotImplementedError
def __init__(self, name, path) -> None:
self.name = name # scope name.
self.path = path # path to directory containing configs.
self.sections = syaml.syaml_dict() # sections read from config files.
def get_section(self, section: str) -> Optional[YamlConfigDict]:
raise NotImplementedError
def _write_section(self, section: str) -> None:
raise NotImplementedError
@property
def is_platform_dependent(self) -> bool:
"""Returns true if the scope name is platform specific"""
return os.sep in self.name
return False
def clear(self) -> None:
"""Empty cached config information."""
self.sections = syaml.syaml_dict()
def __repr__(self) -> str:
return f"<ConfigScope: {self.name}>"
class DirectoryConfigScope(ConfigScope):
"""Config scope backed by a directory containing one file per section."""
def __init__(self, name: str, path: str, *, writable: bool = True) -> None:
super().__init__(name)
self.path = path
self.writable = writable
def get_section_filename(self, section: str) -> str:
"""Returns the filename associated with a given section"""
@@ -148,14 +165,15 @@ def get_section(self, section: str) -> Optional[YamlConfigDict]:
return self.sections[section]
def _write_section(self, section: str) -> None:
if not self.writable:
raise ConfigError(f"Cannot write to immutable scope {self}")
filename = self.get_section_filename(section)
data = self.get_section(section)
if data is None:
return
# We copy data here to avoid adding defaults at write time
validate_data = copy.deepcopy(data)
validate(validate_data, SECTION_SCHEMAS[section])
validate(data, SECTION_SCHEMAS[section])
try:
filesystem.mkdirp(self.path)
@@ -164,19 +182,23 @@ def _write_section(self, section: str) -> None:
except (syaml.SpackYAMLError, OSError) as e:
raise ConfigFileError(f"cannot write to '{filename}'") from e
def clear(self) -> None:
"""Empty cached config information."""
self.sections = syaml.syaml_dict()
def __repr__(self) -> str:
return f"<ConfigScope: {self.name}: {self.path}>"
@property
def is_platform_dependent(self) -> bool:
"""Returns true if the scope name is platform specific"""
return "/" in self.name
class SingleFileScope(ConfigScope):
"""This class represents a configuration scope in a single YAML file."""
def __init__(
self, name: str, path: str, schema: YamlConfigDict, yaml_path: Optional[List[str]] = None
self,
name: str,
path: str,
schema: YamlConfigDict,
*,
yaml_path: Optional[List[str]] = None,
writable: bool = True,
) -> None:
"""Similar to ``ConfigScope`` but can be embedded in another schema.
@@ -195,15 +217,13 @@ def __init__(
config:
install_tree: $spack/opt/spack
"""
super().__init__(name, path)
super().__init__(name)
self._raw_data: Optional[YamlConfigDict] = None
self.schema = schema
self.path = path
self.writable = writable
self.yaml_path = yaml_path or []
@property
def is_platform_dependent(self) -> bool:
return False
def get_section_filename(self, section) -> str:
return self.path
@@ -257,6 +277,8 @@ def get_section(self, section: str) -> Optional[YamlConfigDict]:
return self.sections.get(section, None)
def _write_section(self, section: str) -> None:
if not self.writable:
raise ConfigError(f"Cannot write to immutable scope {self}")
data_to_write: Optional[YamlConfigDict] = self._raw_data
# If there is no existing data, this section SingleFileScope has never
@@ -301,19 +323,6 @@ def __repr__(self) -> str:
return f"<SingleFileScope: {self.name}: {self.path}>"
class ImmutableConfigScope(ConfigScope):
"""A configuration scope that cannot be written to.
This is used for ConfigScopes passed on the command line.
"""
def _write_section(self, section) -> None:
raise ConfigError(f"Cannot write to immutable scope {self}")
def __repr__(self) -> str:
return f"<ImmutableConfigScope: {self.name}: {self.path}>"
class InternalConfigScope(ConfigScope):
"""An internal configuration scope that is not persisted to a file.
@@ -323,7 +332,7 @@ class InternalConfigScope(ConfigScope):
"""
def __init__(self, name: str, data: Optional[YamlConfigDict] = None) -> None:
super().__init__(name, None)
super().__init__(name)
self.sections = syaml.syaml_dict()
if data is not None:
@@ -333,9 +342,6 @@ def __init__(self, name: str, data: Optional[YamlConfigDict] = None) -> None:
validate({section: dsec}, SECTION_SCHEMAS[section])
self.sections[section] = _mark_internal(syaml.syaml_dict({section: dsec}), name)
def get_section_filename(self, section: str) -> str:
raise NotImplementedError("Cannot get filename for InternalConfigScope.")
def get_section(self, section: str) -> Optional[YamlConfigDict]:
"""Just reads from an internal dictionary."""
if section not in self.sections:
@@ -440,27 +446,21 @@ def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
return scope
@property
def file_scopes(self) -> List[ConfigScope]:
"""List of writable scopes with an associated file."""
return [
s
for s in self.scopes.values()
if (type(s) is ConfigScope or type(s) is SingleFileScope)
]
def writable_scopes(self) -> Generator[ConfigScope, None, None]:
"""Generator of writable scopes with an associated file."""
return (s for s in self.scopes.values() if s.writable)
def highest_precedence_scope(self) -> ConfigScope:
"""Non-internal scope with highest precedence."""
return next(reversed(self.file_scopes))
"""Writable scope with highest precedence."""
return next(s for s in reversed(self.scopes.values()) if s.writable) # type: ignore
def highest_precedence_non_platform_scope(self) -> ConfigScope:
"""Non-internal non-platform scope with highest precedence
Platform-specific scopes are of the form scope/platform"""
generator = reversed(self.file_scopes)
highest = next(generator)
while highest and highest.is_platform_dependent:
highest = next(generator)
return highest
"""Writable non-platform scope with highest precedence"""
return next(
s
for s in reversed(self.scopes.values()) # type: ignore
if s.writable and not s.is_platform_dependent
)
def matching_scopes(self, reg_expr) -> List[ConfigScope]:
"""
@@ -755,13 +755,14 @@ def override(
def _add_platform_scope(
cfg: Union[Configuration, lang.Singleton], scope_type: Type[ConfigScope], name: str, path: str
cfg: Union[Configuration, lang.Singleton], name: str, path: str, writable: bool = True
) -> None:
"""Add a platform-specific subdirectory for the current platform."""
platform = spack.platforms.host().name
plat_name = os.path.join(name, platform)
plat_path = os.path.join(path, platform)
cfg.push_scope(scope_type(plat_name, plat_path))
scope = DirectoryConfigScope(
f"{name}/{platform}", os.path.join(path, platform), writable=writable
)
cfg.push_scope(scope)
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
@@ -792,22 +793,27 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
def _add_command_line_scopes(
cfg: Union[Configuration, lang.Singleton], command_line_scopes: List[str]
) -> None:
"""Add additional scopes from the --config-scope argument.
"""Add additional scopes from the --config-scope argument, either envs or dirs."""
import spack.environment.environment as env # circular import
Command line scopes are named after their position in the arg list.
"""
for i, path in enumerate(command_line_scopes):
# We ensure that these scopes exist and are readable, as they are
# provided on the command line by the user.
if not os.path.isdir(path):
raise ConfigError(f"config scope is not a directory: '{path}'")
elif not os.access(path, os.R_OK):
raise ConfigError(f"config scope is not readable: '{path}'")
name = f"cmd_scope_{i}"
# name based on order on the command line
name = f"cmd_scope_{i:d}"
cfg.push_scope(ImmutableConfigScope(name, path))
_add_platform_scope(cfg, ImmutableConfigScope, name, path)
if env.exists(path): # managed environment
manifest = env.EnvironmentManifestFile(env.root(path))
elif env.is_env_dir(path): # anonymous environment
manifest = env.EnvironmentManifestFile(path)
elif os.path.isdir(path): # directory with config files
cfg.push_scope(DirectoryConfigScope(name, path, writable=False))
_add_platform_scope(cfg, name, path, writable=False)
continue
else:
raise ConfigError(f"Invalid configuration scope: {path}")
for scope in manifest.env_config_scopes:
scope.name = f"{name}:{scope.name}"
scope.writable = False
cfg.push_scope(scope)
def create() -> Configuration:
@@ -851,10 +857,10 @@ def create() -> Configuration:
# add each scope and its platform-specific directory
for name, path in configuration_paths:
cfg.push_scope(ConfigScope(name, path))
cfg.push_scope(DirectoryConfigScope(name, path))
# Each scope can have per-platform overrides in subdirectories
_add_platform_scope(cfg, ConfigScope, name, path)
_add_platform_scope(cfg, name, path)
# add command-line scopes
_add_command_line_scopes(cfg, COMMAND_LINE_SCOPES)
@@ -969,7 +975,7 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:
def add_default_platform_scope(platform: str) -> None:
plat_name = os.path.join("defaults", platform)
plat_path = os.path.join(CONFIGURATION_DEFAULTS_PATH[1], platform)
CONFIG.push_scope(ConfigScope(plat_name, plat_path))
CONFIG.push_scope(DirectoryConfigScope(plat_name, plat_path))
def scopes() -> Dict[str, ConfigScope]:
@@ -978,19 +984,10 @@ def scopes() -> Dict[str, ConfigScope]:
def writable_scopes() -> List[ConfigScope]:
"""
Return list of writable scopes. Higher-priority scopes come first in the
list.
"""
return list(
reversed(
list(
x
for x in CONFIG.scopes.values()
if not isinstance(x, (InternalConfigScope, ImmutableConfigScope))
)
)
)
"""Return list of writable scopes. Higher-priority scopes come first in the list."""
scopes = [x for x in CONFIG.scopes.values() if x.writable]
scopes.reverse()
return scopes
def writable_scope_names() -> List[str]:
@@ -1080,11 +1077,8 @@ def validate(
"""
import jsonschema
# Validate a copy to avoid adding defaults
# This allows us to round-trip data without adding to it.
test_data = syaml.deepcopy(data)
try:
spack.schema.Validator(schema).validate(test_data)
spack.schema.Validator(schema).validate(data)
except jsonschema.ValidationError as e:
if hasattr(e.instance, "lc"):
line_number = e.instance.lc.line + 1
@@ -1093,7 +1087,7 @@ def validate(
raise ConfigFormatError(e, data, filename, line_number) from e
# return the validated data so that we can access the raw data
# mostly relevant for environments
return test_data
return data
def read_config_file(
@@ -1599,7 +1593,7 @@ def _config_from(scopes_or_paths: List[Union[ConfigScope, str]]) -> Configuratio
path = os.path.normpath(scope_or_path)
assert os.path.isdir(path), f'"{path}" must be a directory'
name = os.path.basename(path)
scopes.append(ConfigScope(name, path))
scopes.append(DirectoryConfigScope(name, path))
configuration = Configuration(*scopes)
return configuration
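A quick sketch of the reworked scope hierarchy: writability is now a plain attribute, and platform dependence is derived from the scope name rather than the class:

```python
import spack.config

scope = spack.config.DirectoryConfigScope("cmd_scope_0", "/tmp/cfg", writable=False)
assert not scope.writable               # replaces the old ImmutableConfigScope
assert not scope.is_platform_dependent  # no "/" in the name

plat = spack.config.DirectoryConfigScope("cmd_scope_0/linux", "/tmp/cfg/linux")
assert plat.is_platform_dependent
```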

View File

@@ -78,24 +78,17 @@
"image": "quay.io/almalinuxorg/almalinux:8"
}
},
"centos:stream": {
"centos:stream9": {
"bootstrap": {
"template": "container/centos_stream.dockerfile",
"image": "quay.io/centos/centos:stream"
"template": "container/centos_stream9.dockerfile",
"image": "quay.io/centos/centos:stream9"
},
"os_package_manager": "dnf_epel",
"build": "spack/centos-stream",
"build": "spack/centos-stream9",
"final": {
"image": "quay.io/centos/centos:stream"
"image": "quay.io/centos/centos:stream9"
}
},
"centos:7": {
"bootstrap": {
"template": "container/centos_7.dockerfile"
},
"os_package_manager": "yum",
"build": "spack/centos7"
},
"opensuse/leap:15": {
"bootstrap": {
"template": "container/leap-15.dockerfile"

View File

@@ -2,7 +2,12 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from .common import DetectedPackage, executable_prefix, update_configuration
from .common import (
DetectedPackage,
executable_prefix,
set_virtuals_nonbuildable,
update_configuration,
)
from .path import by_path, executables_in_path
from .test import detection_tests
@@ -12,5 +17,6 @@
"executables_in_path",
"executable_prefix",
"update_configuration",
"set_virtuals_nonbuildable",
"detection_tests",
]

View File

@@ -252,6 +252,27 @@ def update_configuration(
return all_new_specs
def set_virtuals_nonbuildable(virtuals: Set[str], scope: Optional[str] = None) -> List[str]:
"""Update packages:virtual:buildable:False for the provided virtual packages, if the property
is not set by the user. Returns the list of virtual packages that have been updated."""
packages = spack.config.get("packages")
new_config = {}
for virtual in virtuals:
# If the user has set the buildable prop do not override it
if virtual in packages and "buildable" in packages[virtual]:
continue
new_config[virtual] = {"buildable": False}
# Update the provided scope
spack.config.set(
"packages",
spack.config.merge_yaml(spack.config.get("packages", scope=scope), new_config),
scope=scope,
)
return list(new_config.keys())
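A hedged usage sketch of the new helper (scope name assumed):

```python
import spack.detection

# Sets packages:<virtual>:buildable: false unless the user already pinned it,
# and returns the names that were actually updated.
updated = spack.detection.set_virtuals_nonbuildable({"mpi"}, scope="user")
print(updated)  # ["mpi"] when the user had not set packages:mpi:buildable
```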
def _windows_drive() -> str:
"""Return Windows drive string extracted from the PROGRAMFILES environment variable,
which is guaranteed to be defined for all logins.

View File

@@ -12,7 +12,7 @@
import re
import sys
import warnings
from typing import Dict, List, Optional, Set, Tuple
from typing import Dict, List, Optional, Set, Tuple, Type
import llnl.util.filesystem
import llnl.util.lang
@@ -200,7 +200,7 @@ class Finder:
def default_path_hints(self) -> List[str]:
return []
def search_patterns(self, *, pkg: "spack.package_base.PackageBase") -> List[str]:
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
"""Returns the list of patterns used to match candidate files.
Args:
@@ -226,7 +226,7 @@ def prefix_from_path(self, *, path: str) -> str:
raise NotImplementedError("must be implemented by derived classes")
def detect_specs(
self, *, pkg: "spack.package_base.PackageBase", paths: List[str]
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
) -> List[DetectedPackage]:
"""Given a list of files matching the search patterns, returns a list of detected specs.
@@ -327,7 +327,7 @@ class ExecutablesFinder(Finder):
def default_path_hints(self) -> List[str]:
return spack.util.environment.get_path("PATH")
def search_patterns(self, *, pkg: "spack.package_base.PackageBase") -> List[str]:
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
result = []
if hasattr(pkg, "executables") and hasattr(pkg, "platform_executables"):
result = pkg.platform_executables()
@@ -356,7 +356,7 @@ class LibrariesFinder(Finder):
DYLD_LIBRARY_PATH, DYLD_FALLBACK_LIBRARY_PATH, and standard system library paths
"""
def search_patterns(self, *, pkg: "spack.package_base.PackageBase") -> List[str]:
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
result = []
if hasattr(pkg, "libraries"):
result = pkg.libraries


@@ -90,14 +90,14 @@ class OpenMpi(Package):
_patch_order_index = 0
SpecType = Union["spack.spec.Spec", str]
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union["spack.spec.Spec", str, bool]]
Patcher = Callable[[Union["spack.package_base.PackageBase", Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx")
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
@@ -475,7 +475,7 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: "spack.package_base.PackageBase",
spec: SpecType,
spec: "spack.spec.Spec",
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
@@ -485,11 +485,10 @@ def _depends_on(
if not when_spec:
return
dep_spec = spack.spec.Spec(spec)
if not dep_spec.name:
raise DependencyError("Invalid dependency specification in package '%s':" % pkg.name, spec)
if pkg.name == dep_spec.name:
raise CircularReferenceError("Package '%s' cannot depend on itself." % pkg.name)
if not spec.name:
raise DependencyError(f"Invalid dependency specification in package '{pkg.name}':", spec)
if pkg.name == spec.name:
raise CircularReferenceError(f"Package '{pkg.name}' cannot depend on itself.")
depflag = dt.canonicalize(type)
@@ -505,7 +504,7 @@ def _depends_on(
# ensure `Spec.virtual` is a valid thing to call in a directive.
# For now, we comment out the following check to allow for virtual packages
# with package files.
# if patches and dep_spec.virtual:
# if patches and spec.virtual:
# raise DependencyPatchError("Cannot patch a virtual dependency.")
# ensure patches is a list
@@ -520,13 +519,13 @@ def _depends_on(
# this is where we actually add the dependency to this package
deps_by_name = pkg.dependencies.setdefault(when_spec, {})
dependency = deps_by_name.get(dep_spec.name)
dependency = deps_by_name.get(spec.name)
if not dependency:
dependency = Dependency(pkg, dep_spec, depflag=depflag)
deps_by_name[dep_spec.name] = dependency
dependency = Dependency(pkg, spec, depflag=depflag)
deps_by_name[spec.name] = dependency
else:
dependency.spec.constrain(dep_spec, deps=False)
dependency.spec.constrain(spec, deps=False)
dependency.depflag |= depflag
# apply patches to the dependency
@@ -591,12 +590,13 @@ def depends_on(
@see The section "Dependency specs" in the Spack Packaging Guide.
"""
if spack.spec.Spec(spec).name in SUPPORTED_LANGUAGES:
dep_spec = spack.spec.Spec(spec)
if dep_spec.name in SUPPORTED_LANGUAGES:
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
_depends_on(pkg, spec, when=when, type=type, patches=patches)
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
@@ -666,25 +666,24 @@ def extends(spec, when=None, type=("build", "run"), patches=None):
keyword arguments can be passed to extends() so that extension
packages can pass parameters to the extendee's extension
mechanism.
"""
mechanism."""
def _execute_extends(pkg):
when_spec = _make_when_spec(when)
if not when_spec:
return
_depends_on(pkg, spec, when=when, type=type, patches=patches)
spec_obj = spack.spec.Spec(spec)
dep_spec = spack.spec.Spec(spec)
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
# When extending python, also add a dependency on python-venv. This is done so that
# Spack environment views are Python virtual environments.
if spec_obj.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, "python-venv", when=when, type=("build", "run"))
if dep_spec.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, spack.spec.Spec("python-venv"), when=when, type=("build", "run"))
# TODO: the values of the extendees dictionary are not used. Remove in next refactor.
pkg.extendees[spec_obj.name] = (spec_obj, None)
pkg.extendees[dep_spec.name] = (dep_spec, None)
return _execute_extends
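With `"c"` added to `SUPPORTED_LANGUAGES`, a package file can declare the C toolchain like any other build-only language dependency. A hypothetical package sketch:

```python
from spack.package import *  # standard package-file preamble

class HypotheticalLib(Package):
    """Illustrative only; not a real package."""

    depends_on("c", type="build")    # language dependencies must be build-type
    depends_on("cxx", type="build")
    depends_on("mpi")                # an ordinary (virtual) dependency
```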


@@ -5,7 +5,7 @@
import collections
import collections.abc
import contextlib
import copy
import errno
import os
import pathlib
import re
@@ -24,6 +24,7 @@
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import readlink, symlink
import spack.caches
import spack.cmd
import spack.compilers
import spack.concretize
@@ -268,9 +269,7 @@ def root(name):
def exists(name):
"""Whether an environment with this name exists or not."""
if not valid_env_name(name):
return False
return os.path.isdir(root(name))
return valid_env_name(name) and os.path.isdir(_root(name))
def active(name):
@@ -529,8 +528,8 @@ def _read_yaml(str_or_file):
)
filename = getattr(str_or_file, "name", None)
default_data = spack.config.validate(data, spack.schema.env.schema, filename)
return data, default_data
spack.config.validate(data, spack.schema.env.schema, filename)
return data
def _write_yaml(data, str_or_file):
@@ -790,6 +789,23 @@ def regenerate(self, concrete_roots: List[Spec]) -> None:
root_dirname = os.path.dirname(self.root)
tmp_symlink_name = os.path.join(root_dirname, "._view_link")
# Remove self.root if is it an empty dir, since we need a symlink there. Note that rmdir
# fails if self.root is a symlink.
try:
os.rmdir(self.root)
except (FileNotFoundError, NotADirectoryError):
pass
except OSError as e:
if e.errno == errno.ENOTEMPTY:
msg = "it is a non-empty directory"
elif e.errno == errno.EACCES:
msg = "of insufficient permissions"
else:
raise
raise SpackEnvironmentViewError(
f"The environment view in {self.root} cannot not be created because {msg}."
) from e
# Create a new view
try:
fs.mkdirp(new_root)
@@ -921,7 +937,7 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
def _load_manifest_file(self):
"""Instantiate and load the manifest file contents into memory."""
with lk.ReadTransaction(self.txlock):
self.manifest = EnvironmentManifestFile(self.path)
self.manifest = EnvironmentManifestFile(self.path, self.name)
with self.manifest.use_config():
self._read()
@@ -958,18 +974,25 @@ def write_transaction(self):
"""Get a write lock context manager for use in a `with` block."""
return lk.WriteTransaction(self.txlock, acquire=self._re_read)
def _process_definition(self, item):
def _process_definition(self, entry):
"""Process a single spec definition item."""
entry = copy.deepcopy(item)
when = _eval_conditional(entry.pop("when", "True"))
assert len(entry) == 1
when_string = entry.get("when")
if when_string is not None:
when = _eval_conditional(when_string)
assert len([x for x in entry if x != "when"]) == 1
else:
when = True
assert len(entry) == 1
if when:
name, spec_list = next(iter(entry.items()))
user_specs = SpecList(name, spec_list, self.spec_lists.copy())
if name in self.spec_lists:
self.spec_lists[name].extend(user_specs)
else:
self.spec_lists[name] = user_specs
for name, spec_list in entry.items():
if name == "when":
continue
user_specs = SpecList(name, spec_list, self.spec_lists.copy())
if name in self.spec_lists:
self.spec_lists[name].extend(user_specs)
else:
self.spec_lists[name] = user_specs
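The new code reads `when` in place instead of popping it from a deep copy; the accepted entry shape is unchanged, for example:

```python
# A definitions entry as it appears in spack.yaml, parsed into a dict:
entry = {"packages": ["hdf5", "zlib"], "when": "platform=linux"}
# -> defines the "packages" spec list only when the condition evaluates true
```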
def _process_view(self, env_view: Optional[Union[bool, str, Dict]]):
"""Process view option(s), which can be boolean, string, or None.
@@ -2542,7 +2565,7 @@ def _concretize_task(packed_arguments) -> Tuple[int, Spec, float]:
def make_repo_path(root):
"""Make a RepoPath from the repo subdirectories in an environment."""
path = spack.repo.RepoPath()
path = spack.repo.RepoPath(cache=spack.caches.MISC_CACHE)
if os.path.isdir(root):
for repo_root in os.listdir(root):
@@ -2551,7 +2574,7 @@ def make_repo_path(root):
if not os.path.isdir(repo_root):
continue
repo = spack.repo.Repo(repo_root)
repo = spack.repo.from_path(repo_root)
path.put_last(repo)
return path
@@ -2752,10 +2775,11 @@ def from_lockfile(manifest_dir: Union[pathlib.Path, str]) -> "EnvironmentManifes
manifest.flush()
return manifest
def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
def __init__(self, manifest_dir: Union[pathlib.Path, str], name: Optional[str] = None) -> None:
self.manifest_dir = pathlib.Path(manifest_dir)
self.name = name or str(manifest_dir)
self.manifest_file = self.manifest_dir / manifest_name
self.scope_name = f"env:{environment_name(self.manifest_dir)}"
self.scope_name = f"env:{self.name}"
self.config_stage_dir = os.path.join(env_subdir_path(manifest_dir), "config")
#: Configuration scopes associated with this environment. Note that these are not
@@ -2767,12 +2791,8 @@ def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
raise SpackEnvironmentError(msg)
with self.manifest_file.open() as f:
raw, with_defaults_added = _read_yaml(f)
self.yaml_content = _read_yaml(f)
#: Pristine YAML content, without defaults being added
self.pristine_yaml_content = raw
#: YAML content with defaults added by Spack, if they're missing
self.yaml_content = with_defaults_added
self.changed = False
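A hedged illustration of the constructor change: the environment name is now threaded into the manifest, defaults to the directory path, and determines the config scope name directly (the class name below is made up):

```python
import pathlib
from typing import Optional

class ManifestSketch:
    def __init__(self, manifest_dir, name: Optional[str] = None) -> None:
        self.manifest_dir = pathlib.Path(manifest_dir)
        self.name = name or str(manifest_dir)  # fall back to the path
        self.scope_name = f"env:{self.name}"   # no environment_name() lookup anymore

print(ManifestSketch("/tmp/myenv").scope_name)         # env:/tmp/myenv
print(ManifestSketch("/tmp/myenv", "dev").scope_name)  # env:dev
```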
def _all_matches(self, user_spec: str) -> List[str]:
@@ -2786,7 +2806,7 @@ def _all_matches(self, user_spec: str) -> List[str]:
ValueError: if no equivalent match is found
"""
result = []
for yaml_spec_str in self.pristine_configuration["specs"]:
for yaml_spec_str in self.configuration["specs"]:
if Spec(yaml_spec_str) == Spec(user_spec):
result.append(yaml_spec_str)
@@ -2801,7 +2821,6 @@ def add_user_spec(self, user_spec: str) -> None:
Args:
user_spec: user spec to be appended
"""
self.pristine_configuration.setdefault("specs", []).append(user_spec)
self.configuration.setdefault("specs", []).append(user_spec)
self.changed = True
@@ -2816,7 +2835,6 @@ def remove_user_spec(self, user_spec: str) -> None:
"""
try:
for key in self._all_matches(user_spec):
self.pristine_configuration["specs"].remove(key)
self.configuration["specs"].remove(key)
except ValueError as e:
msg = f"cannot remove {user_spec} from {self}, no such spec exists"
@@ -2834,7 +2852,6 @@ def override_user_spec(self, user_spec: str, idx: int) -> None:
SpackEnvironmentError: when the user spec cannot be overridden
"""
try:
self.pristine_configuration["specs"][idx] = user_spec
self.configuration["specs"][idx] = user_spec
except ValueError as e:
msg = f"cannot override {user_spec} from {self}"
@@ -2847,10 +2864,10 @@ def set_include_concrete(self, include_concrete: List[str]) -> None:
Args:
include_concrete: list of already existing concrete environments to include
"""
self.pristine_configuration[included_concrete_name] = []
self.configuration[included_concrete_name] = []
for env_path in include_concrete:
self.pristine_configuration[included_concrete_name].append(env_path)
self.configuration[included_concrete_name].append(env_path)
self.changed = True
@@ -2864,14 +2881,13 @@ def add_definition(self, user_spec: str, list_name: str) -> None:
Raises:
SpackEnvironmentError: if no valid definition exists already
"""
defs = self.pristine_configuration.get("definitions", [])
defs = self.configuration.get("definitions", [])
msg = f"cannot add {user_spec} to the '{list_name}' definition, no valid list exists"
for idx, item in self._iterate_on_definitions(defs, list_name=list_name, err_msg=msg):
item[list_name].append(user_spec)
break
self.configuration["definitions"][idx][list_name].append(user_spec)
self.changed = True
def remove_definition(self, user_spec: str, list_name: str) -> None:
@@ -2885,7 +2901,7 @@ def remove_definition(self, user_spec: str, list_name: str) -> None:
SpackEnvironmentError: if the user spec cannot be removed from the list,
or the list does not exist
"""
defs = self.pristine_configuration.get("definitions", [])
defs = self.configuration.get("definitions", [])
msg = (
f"cannot remove {user_spec} from the '{list_name}' definition, "
f"no valid list exists"
@@ -2898,7 +2914,6 @@ def remove_definition(self, user_spec: str, list_name: str) -> None:
except ValueError:
pass
self.configuration["definitions"][idx][list_name].remove(user_spec)
self.changed = True
def override_definition(self, user_spec: str, *, override: str, list_name: str) -> None:
@@ -2913,7 +2928,7 @@ def override_definition(self, user_spec: str, *, override: str, list_name: str)
Raises:
SpackEnvironmentError: if the user spec cannot be overridden
"""
defs = self.pristine_configuration.get("definitions", [])
defs = self.configuration.get("definitions", [])
msg = f"cannot override {user_spec} with {override} in the '{list_name}' definition"
for idx, item in self._iterate_on_definitions(defs, list_name=list_name, err_msg=msg):
@@ -2924,7 +2939,6 @@ def override_definition(self, user_spec: str, *, override: str, list_name: str)
except ValueError:
pass
self.configuration["definitions"][idx][list_name][sub_index] = override
self.changed = True
def _iterate_on_definitions(self, definitions, *, list_name, err_msg):
@@ -2956,7 +2970,6 @@ def set_default_view(self, view: Union[bool, str, pathlib.Path, Dict[str, str]])
True the default view is used for the environment, if False there's no view.
"""
if isinstance(view, dict):
self.pristine_configuration["view"][default_view_name].update(view)
self.configuration["view"][default_view_name].update(view)
self.changed = True
return
@@ -2964,15 +2977,13 @@ def set_default_view(self, view: Union[bool, str, pathlib.Path, Dict[str, str]])
if not isinstance(view, bool):
view = str(view)
self.pristine_configuration["view"] = view
self.configuration["view"] = view
self.changed = True
def remove_default_view(self) -> None:
"""Removes the default view from the manifest file"""
view_data = self.pristine_configuration.get("view")
view_data = self.configuration.get("view")
if isinstance(view_data, collections.abc.Mapping):
self.pristine_configuration["view"].pop(default_view_name)
self.configuration["view"].pop(default_view_name)
self.changed = True
return
@@ -2985,17 +2996,12 @@ def flush(self) -> None:
return
with fs.write_tmp_and_move(os.path.realpath(self.manifest_file)) as f:
_write_yaml(self.pristine_yaml_content, f)
_write_yaml(self.yaml_content, f)
self.changed = False
@property
def pristine_configuration(self):
"""Return the dictionaries in the pristine YAML, without the top level attribute"""
return self.pristine_yaml_content[TOP_LEVEL_KEY]
@property
def configuration(self):
"""Return the dictionaries in the YAML, without the top level attribute"""
"""Return the dictionaries in the pristine YAML, without the top level attribute"""
return self.yaml_content[TOP_LEVEL_KEY]
def __len__(self):
@@ -3027,12 +3033,11 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
SpackEnvironmentError: if the manifest includes a remote file but
no configuration stage directory has been identified
"""
scopes = []
scopes: List[spack.config.ConfigScope] = []
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
includes = self[TOP_LEVEL_KEY].get("include", [])
env_name = environment_name(self.manifest_dir)
missing = []
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
@@ -3095,24 +3100,22 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = "env:%s:%s" % (env_name, os.path.basename(config_path))
tty.debug("Creating ConfigScope {0} for '{1}'".format(config_name, config_path))
scope = spack.config.ConfigScope(config_name, config_path)
config_name = f"env:{self.name}:{os.path.basename(config_path)}"
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
scopes.append(spack.config.DirectoryConfigScope(config_name, config_path))
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = "env:%s:%s" % (env_name, config_path)
tty.debug(
"Creating SingleFileScope {0} for '{1}'".format(config_name, config_path)
)
scope = spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
config_name = f"env:{self.name}:{config_path}"
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
scopes.append(
spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
)
)
else:
missing.append(config_path)
continue
scopes.append(scope)
if missing:
msg = "Detected {0} missing include path(s):".format(len(missing))
msg += "\n {0}".format("\n ".join(missing))
@@ -3129,7 +3132,10 @@ def env_config_scopes(self) -> List[spack.config.ConfigScope]:
scopes: List[spack.config.ConfigScope] = [
*self.included_config_scopes,
spack.config.SingleFileScope(
self.scope_name, str(self.manifest_file), spack.schema.env.schema, [TOP_LEVEL_KEY]
self.scope_name,
str(self.manifest_file),
spack.schema.env.schema,
yaml_path=[TOP_LEVEL_KEY],
),
]
ensure_no_disallowed_env_config_mods(scopes)
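To make the precedence implied above concrete, a toy sketch of scope assembly (strings stand in for ConfigScope objects; the ordering, not the API, is the point):

```python
# Includes are loaded in reverse, so the first-listed include ends up
# latest in the list (highest precedence among includes); the
# environment's own spack.yaml scope is appended last of all.
includes = ["base.yaml", "site/", "overrides.yaml"]  # illustrative include: entries
scopes = [f"env:demo:{path}" for path in reversed(includes)]
scopes.append("env:demo")  # the manifest's own SingleFileScope
print(scopes)
# ['env:demo:overrides.yaml', 'env:demo:site/', 'env:demo:base.yaml', 'env:demo']
```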

View File

@@ -582,7 +582,7 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
# Create a source repo and get the pkg directory out of it.
try:
source_repo = spack.repo.Repo(source_repo_root)
source_repo = spack.repo.from_path(source_repo_root)
source_pkg_dir = source_repo.dirname_for_package_name(node.name)
except spack.repo.RepoError as err:
tty.debug(f"Failed to create source repo for {node.name}: {str(err)}")
@@ -593,7 +593,7 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
dest_repo_root = os.path.join(path, node.namespace)
if not os.path.exists(dest_repo_root):
spack.repo.create_repo(dest_repo_root)
repo = spack.repo.Repo(dest_repo_root)
repo = spack.repo.from_path(dest_repo_root)
# Get the location of the package in the dest repo.
dest_pkg_dir = repo.dirname_for_package_name(node.name)
@@ -1542,17 +1542,6 @@ def _add_tasks(self, request: BuildRequest, all_deps):
tty.warn(f"Installation request refused: {str(err)}")
return
# Skip out early if the spec is not being installed locally (i.e., if
# external or upstream).
#
# External and upstream packages need to get flagged as installed to
# ensure proper status tracking for environment build.
explicit = request.pkg.spec.dag_hash() in request.install_args.get("explicit", [])
not_local = _handle_external_and_upstream(request.pkg, explicit)
if not_local:
self._flag_installed(request.pkg)
return
install_compilers = spack.config.get("config:install_missing_compilers", False)
install_deps = request.install_args.get("install_deps")
@@ -2029,11 +2018,10 @@ def install(self) -> None:
# Skip the installation if the spec is not being installed locally
# (i.e., if external or upstream) BUT flag it as installed since
# some package likely depends on it.
if not task.explicit:
if _handle_external_and_upstream(pkg, False):
term_status.clear()
self._flag_installed(pkg, task.dependents)
continue
if _handle_external_and_upstream(pkg, task.explicit):
term_status.clear()
self._flag_installed(pkg, task.dependents)
continue
# Flag a failed spec. Do not need an (install) prefix lock since
# assume using a separate (failed) prefix lock file.

View File

@@ -444,8 +444,9 @@ def make_argument_parser(**kwargs):
"--config-scope",
dest="config_scopes",
action="append",
metavar="DIR",
help="add a custom configuration scope",
metavar="DIR|ENV",
help="add directory or environment as read-only configuration scope, without activating "
"the environment.",
)
parser.add_argument(
"-d",

View File

@@ -143,6 +143,7 @@ def __init__(self):
"12": "monterey",
"13": "ventura",
"14": "sonoma",
"15": "sequoia",
}
version = macos_version()

View File

@@ -35,6 +35,7 @@
import spack.compilers
import spack.config
import spack.dependency
import spack.deptypes as dt
import spack.directives
import spack.directory_layout
@@ -199,10 +200,10 @@ def __init__(cls, name, bases, attr_dict):
# assumed to be detectable
if hasattr(cls, "executables") or hasattr(cls, "libraries"):
# Append a tag to each detectable package, so that finding them is faster
if hasattr(cls, "tags"):
getattr(cls, "tags").append(DetectablePackageMeta.TAG)
else:
if not hasattr(cls, "tags"):
setattr(cls, "tags", [DetectablePackageMeta.TAG])
elif DetectablePackageMeta.TAG not in cls.tags:
cls.tags.append(DetectablePackageMeta.TAG)
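A self-contained sketch of the corrected tag logic above: the detectable tag is created when missing and appended at most once, so subclasses that inherit a tags list are not re-tagged (all names are illustrative):

```python
TAG = "detectable"

class DetectableMeta(type):
    def __init__(cls, name, bases, attr_dict):
        if hasattr(cls, "executables") or hasattr(cls, "libraries"):
            if not hasattr(cls, "tags"):
                cls.tags = [TAG]
            elif TAG not in cls.tags:
                cls.tags.append(TAG)  # no duplicate append on subclasses
        super().__init__(name, bases, attr_dict)

class Tool(metaclass=DetectableMeta):
    executables = [r"^tool$"]
    tags = ["build-tools"]

class DerivedTool(Tool):  # inherits tags; TAG is already present
    pass

print(Tool.tags)  # ['build-tools', 'detectable']
```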
@classmethod
def platform_executables(cls):
@@ -748,11 +749,6 @@ def __init__(self, spec):
self._fetch_time = 0.0
self.win_rpath = fsys.WindowsSimulatedRPath(self)
if self.is_extension:
pkg_cls = spack.repo.PATH.get_pkg_class(self.extendee_spec.name)
pkg_cls(self.extendee_spec)._check_extendable()
super().__init__()
@classmethod
@@ -1141,10 +1137,9 @@ def _make_stage(self):
if not link_format:
link_format = "build-{arch}-{hash:7}"
stage_link = self.spec.format_path(link_format)
return DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
# To fetch the current version
source_stage = self._make_root_stage(self.fetcher)
source_stage = DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
else:
source_stage = self._make_root_stage(self.fetcher)
# all_stages is source + resources + patches
all_stages = StageComposite()
@@ -1473,10 +1468,8 @@ def do_fetch(self, mirror_only=False):
return
checksum = spack.config.get("config:checksum")
fetch = self.stage.needs_fetching
if (
checksum
and fetch
and (self.version not in self.versions)
and (not isinstance(self.version, GitVersion))
):
@@ -1583,13 +1576,11 @@ def do_patch(self):
tty.debug("Patching failed last time. Restaging.")
self.stage.restage()
else:
# develop specs/ DIYStages may have patch failures but
# should never be restaged
msg = (
"A patch failure was detected in %s." % self.name
+ " Build errors may occur due to this."
# develop specs may have patch failures but should never be restaged
tty.warn(
f"A patch failure was detected in {self.name}."
" Build errors may occur due to this."
)
tty.warn(msg)
return
# If this file exists, then we already applied all the patches.
@@ -2393,10 +2384,6 @@ def do_deprecate(self, deprecator, link_fn):
PackageBase.uninstall_by_spec(spec, force=True, deprecator=deprecator)
link_fn(deprecator.prefix, spec.prefix)
def _check_extendable(self):
if not self.extendable:
raise ValueError("Package %s is not extendable!" % self.name)
def view(self):
"""Create a view with the prefix of this package as the root.
Extensions added to this view will modify the installation prefix of

View File

@@ -9,7 +9,7 @@
import os.path
import pathlib
import sys
from typing import Any, Dict, Optional, Tuple, Type
from typing import Any, Dict, Optional, Tuple, Type, Union
import llnl.util.filesystem
from llnl.url import allowed_archive
@@ -65,6 +65,9 @@ def apply_patch(
patch(*args)
PatchPackageType = Union["spack.package_base.PackageBase", Type["spack.package_base.PackageBase"]]
class Patch:
"""Base class for patches.
@@ -77,7 +80,7 @@ class Patch:
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: PatchPackageType,
path_or_url: str,
level: int,
working_dir: str,
@@ -159,7 +162,7 @@ class FilePatch(Patch):
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: PatchPackageType,
relative_path: str,
level: int,
working_dir: str,
@@ -183,7 +186,7 @@ def __init__(
abs_path: Optional[str] = None
# At different times we call FilePatch on instances and classes
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
for cls in inspect.getmro(pkg_cls):
for cls in inspect.getmro(pkg_cls): # type: ignore
if not hasattr(cls, "module"):
# We've gone too far up the MRO
break
@@ -242,7 +245,7 @@ class UrlPatch(Patch):
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: PatchPackageType,
url: str,
level: int = 1,
*,
@@ -361,8 +364,9 @@ def from_dict(
"""
repository = repository or spack.repo.PATH
owner = dictionary.get("owner")
if "owner" not in dictionary:
raise ValueError("Invalid patch dictionary: %s" % dictionary)
if owner is None:
raise ValueError(f"Invalid patch dictionary: {dictionary}")
assert isinstance(owner, str)
pkg_cls = repository.get_pkg_class(owner)
if "url" in dictionary:

View File

@@ -25,7 +25,8 @@
import traceback
import types
import uuid
from typing import Any, Dict, List, Set, Tuple, Union
import warnings
from typing import Any, Dict, Generator, List, Optional, Set, Tuple, Type, Union
import llnl.path
import llnl.util.filesystem as fs
@@ -126,11 +127,35 @@ def exec_module(self, module):
class ReposFinder:
"""MetaPathFinder class that loads a Python module corresponding to a Spack package
"""MetaPathFinder class that loads a Python module corresponding to a Spack package.
Return a loader based on the inspection of the current global repository list.
Returns a loader based on the inspection of the current repository list.
"""
def __init__(self):
self._repo_init = _path
self._repo = None
@property
def current_repository(self):
if self._repo is None:
self._repo = self._repo_init()
return self._repo
@current_repository.setter
def current_repository(self, value):
self._repo = value
@contextlib.contextmanager
def switch_repo(self, substitute: "RepoType"):
"""Switch the current repository list for the duration of the context manager."""
old = self.current_repository
try:
self.current_repository = substitute
yield
finally:
self.current_repository = old
def find_spec(self, fullname, python_path, target=None):
# "target" is not None only when calling importlib.reload()
if target is not None:
@@ -149,9 +174,14 @@ def compute_loader(self, fullname):
# namespaces are added to repo, and package modules are leaves.
namespace, dot, module_name = fullname.rpartition(".")
# If it's a module in some repo, or if it is the repo's
# namespace, let the repo handle it.
for repo in PATH.repos:
# If it's a module in some repo, or if it is the repo's namespace, let the repo handle it.
is_repo_path = isinstance(self.current_repository, RepoPath)
if is_repo_path:
repos = self.current_repository.repos
else:
repos = [self.current_repository]
for repo in repos:
# We are using the namespace of the repo and the repo contains the package
if namespace == repo.full_namespace:
# With 2 nested conditionals we can call "repo.real_name" only once
@@ -165,7 +195,7 @@ def compute_loader(self, fullname):
# No repo provides the namespace, but it is a valid prefix of
# something in the RepoPath.
if PATH.by_namespace.is_prefix(fullname):
if is_repo_path and self.current_repository.by_namespace.is_prefix(fullname):
return SpackNamespaceLoader()
return None
@@ -560,7 +590,7 @@ def __init__(
self,
package_checker: FastPackageChecker,
namespace: str,
cache: spack.util.file_cache.FileCache,
cache: "spack.caches.FileCacheType",
):
self.checker = package_checker
self.packages_path = self.checker.packages_path
@@ -645,33 +675,39 @@ class RepoPath:
repository.
Args:
repos (list): list Repo objects or paths to put in this RepoPath
repos: list Repo objects or paths to put in this RepoPath
cache: file cache associated with this repository
overrides: dict mapping package name to class attribute overrides for that package
"""
def __init__(self, *repos, **kwargs):
cache = kwargs.get("cache", spack.caches.MISC_CACHE)
self.repos = []
def __init__(
self,
*repos: Union[str, "Repo"],
cache: "spack.caches.FileCacheType",
overrides: Optional[Dict[str, Any]] = None,
) -> None:
self.repos: List[Repo] = []
self.by_namespace = nm.NamespaceTrie()
self._provider_index = None
self._patch_index = None
self._tag_index = None
self._provider_index: Optional[spack.provider_index.ProviderIndex] = None
self._patch_index: Optional[spack.patch.PatchCache] = None
self._tag_index: Optional[spack.tag.TagIndex] = None
# Add each repo to this path.
for repo in repos:
try:
if isinstance(repo, str):
repo = Repo(repo, cache=cache)
repo = Repo(repo, cache=cache, overrides=overrides)
repo.finder(self)
self.put_last(repo)
except RepoError as e:
tty.warn(
"Failed to initialize repository: '%s'." % repo,
f"Failed to initialize repository: '{repo}'.",
e.message,
"To remove the bad repository, run this command:",
" spack repo rm %s" % repo,
f" spack repo rm {repo}",
)
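With the keyword-only signature above, construction now looks like this minimal sketch (paths illustrative); the file cache must be passed explicitly:

```python
import spack.caches
import spack.repo

repo_path = spack.repo.RepoPath(
    "/path/to/a/repo",               # strings are turned into Repo objects
    cache=spack.caches.MISC_CACHE,   # now a required keyword argument
    overrides=None,                  # optional per-package attribute overrides
)
```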
def put_first(self, repo):
def put_first(self, repo: "Repo") -> None:
"""Add repo first in the search path."""
if isinstance(repo, RepoPath):
for r in reversed(repo.repos):
@@ -699,50 +735,34 @@ def remove(self, repo):
if repo in self.repos:
self.repos.remove(repo)
def get_repo(self, namespace, default=NOT_PROVIDED):
"""Get a repository by namespace.
Arguments:
namespace:
Look up this namespace in the RepoPath, and return it if found.
Optional Arguments:
default:
If default is provided, return it when the namespace
isn't found. If not, raise an UnknownNamespaceError.
"""
def get_repo(self, namespace: str) -> "Repo":
"""Get a repository by namespace."""
full_namespace = python_package_for_repo(namespace)
if full_namespace not in self.by_namespace:
if default == NOT_PROVIDED:
raise UnknownNamespaceError(namespace)
return default
raise UnknownNamespaceError(namespace)
return self.by_namespace[full_namespace]
def first_repo(self):
def first_repo(self) -> Optional["Repo"]:
"""Get the first repo in precedence order."""
return self.repos[0] if self.repos else None
@llnl.util.lang.memoized
def _all_package_names_set(self, include_virtuals):
def _all_package_names_set(self, include_virtuals) -> Set[str]:
return {name for repo in self.repos for name in repo.all_package_names(include_virtuals)}
@llnl.util.lang.memoized
def _all_package_names(self, include_virtuals):
def _all_package_names(self, include_virtuals: bool) -> List[str]:
"""Return all unique package names in all repositories."""
return sorted(self._all_package_names_set(include_virtuals), key=lambda n: n.lower())
def all_package_names(self, include_virtuals=False):
def all_package_names(self, include_virtuals: bool = False) -> List[str]:
return self._all_package_names(include_virtuals)
def package_path(self, name):
def package_path(self, name: str) -> str:
"""Get path to package.py file for this repo."""
return self.repo_for_pkg(name).package_path(name)
def all_package_paths(self):
def all_package_paths(self) -> Generator[str, None, None]:
for name in self.all_package_names():
yield self.package_path(name)
@@ -758,53 +778,52 @@ def packages_with_tags(self, *tags: str, full: bool = False) -> Set[str]:
for pkg in repo.packages_with_tags(*tags)
}
def all_package_classes(self):
def all_package_classes(self) -> Generator[Type["spack.package_base.PackageBase"], None, None]:
for name in self.all_package_names():
yield self.get_pkg_class(name)
@property
def provider_index(self):
def provider_index(self) -> spack.provider_index.ProviderIndex:
"""Merged ProviderIndex from all Repos in the RepoPath."""
if self._provider_index is None:
self._provider_index = spack.provider_index.ProviderIndex(repository=self)
for repo in reversed(self.repos):
self._provider_index.merge(repo.provider_index)
return self._provider_index
@property
def tag_index(self):
def tag_index(self) -> spack.tag.TagIndex:
"""Merged TagIndex from all Repos in the RepoPath."""
if self._tag_index is None:
self._tag_index = spack.tag.TagIndex(repository=self)
for repo in reversed(self.repos):
self._tag_index.merge(repo.tag_index)
return self._tag_index
@property
def patch_index(self):
def patch_index(self) -> spack.patch.PatchCache:
"""Merged PatchIndex from all Repos in the RepoPath."""
if self._patch_index is None:
self._patch_index = spack.patch.PatchCache(repository=self)
for repo in reversed(self.repos):
self._patch_index.update(repo.patch_index)
return self._patch_index
@autospec
def providers_for(self, vpkg_spec):
def providers_for(self, virtual_spec: "spack.spec.Spec") -> List["spack.spec.Spec"]:
providers = [
spec
for spec in self.provider_index.providers_for(vpkg_spec)
for spec in self.provider_index.providers_for(virtual_spec)
if spec.name in self._all_package_names_set(include_virtuals=False)
]
if not providers:
raise UnknownPackageError(vpkg_spec.fullname)
raise UnknownPackageError(virtual_spec.fullname)
return providers
@autospec
def extensions_for(self, extendee_spec):
def extensions_for(
self, extendee_spec: "spack.spec.Spec"
) -> List["spack.package_base.PackageBase"]:
return [
pkg_cls(spack.spec.Spec(pkg_cls.name))
for pkg_cls in self.all_package_classes()
@@ -815,7 +834,7 @@ def last_mtime(self):
"""Time a package file in this repo was last updated."""
return max(repo.last_mtime() for repo in self.repos)
def repo_for_pkg(self, spec):
def repo_for_pkg(self, spec: Union[str, "spack.spec.Spec"]) -> "Repo":
"""Given a spec, get the repository for its package."""
# We don't @_autospec this function b/c it's called very frequently
# and we want to avoid parsing str's into Specs unnecessarily.
@@ -840,17 +859,20 @@ def repo_for_pkg(self, spec):
return repo
# If the package isn't in any repo, return the one with
# highest precedence. This is for commands like `spack edit`
# that can operate on packages that don't exist yet.
return self.first_repo()
selected = self.first_repo()
if selected is None:
raise UnknownPackageError(name)
return selected
def get(self, spec):
def get(self, spec: "spack.spec.Spec") -> "spack.package_base.PackageBase":
"""Returns the package associated with the supplied spec."""
msg = "RepoPath.get can only be called on concrete specs"
assert isinstance(spec, spack.spec.Spec) and spec.concrete, msg
return self.repo_for_pkg(spec).get(spec)
def get_pkg_class(self, pkg_name):
def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]:
"""Find a class for the spec's package and return the class object."""
return self.repo_for_pkg(pkg_name).get_pkg_class(pkg_name)
@@ -863,26 +885,26 @@ def dump_provenance(self, spec, path):
"""
return self.repo_for_pkg(spec).dump_provenance(spec, path)
def dirname_for_package_name(self, pkg_name):
def dirname_for_package_name(self, pkg_name: str) -> str:
return self.repo_for_pkg(pkg_name).dirname_for_package_name(pkg_name)
def filename_for_package_name(self, pkg_name):
def filename_for_package_name(self, pkg_name: str) -> str:
return self.repo_for_pkg(pkg_name).filename_for_package_name(pkg_name)
def exists(self, pkg_name):
def exists(self, pkg_name: str) -> bool:
"""Whether package with the give name exists in the path's repos.
Note that virtual packages do not "exist".
"""
return any(repo.exists(pkg_name) for repo in self.repos)
def _have_name(self, pkg_name):
def _have_name(self, pkg_name: str) -> bool:
have_name = pkg_name is not None
if have_name and not isinstance(pkg_name, str):
raise ValueError("is_virtual(): expected package name, got %s" % type(pkg_name))
raise ValueError(f"is_virtual(): expected package name, got {type(pkg_name)}")
return have_name
def is_virtual(self, pkg_name):
def is_virtual(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that
@@ -894,7 +916,7 @@ def is_virtual(self, pkg_name):
have_name = self._have_name(pkg_name)
return have_name and pkg_name in self.provider_index
def is_virtual_safe(self, pkg_name):
def is_virtual_safe(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function doesn't use the provider index.
@@ -915,18 +937,28 @@ class Repo:
Each package repository must have a top-level configuration file
called `repo.yaml`.
Currently, `repo.yaml` this must define:
Currently, `repo.yaml` must define:
`namespace`:
A Python namespace where the repository's packages should live.
`subdirectory`:
An optional subdirectory name where packages are placed
"""
def __init__(self, root, cache=None):
def __init__(
self,
root: str,
*,
cache: "spack.caches.FileCacheType",
overrides: Optional[Dict[str, Any]] = None,
) -> None:
"""Instantiate a package repository from a filesystem path.
Args:
root: the root directory of the repository
cache: file cache associated with this repository
overrides: dict mapping package name to class attribute overrides for that package
"""
# Root directory, containing _repo.yaml and package dirs
# Allow roots to be spack-relative by starting with '$spack'
@@ -939,20 +971,20 @@ def check(condition, msg):
# Validate repository layout.
self.config_file = os.path.join(self.root, repo_config_name)
check(os.path.isfile(self.config_file), "No %s found in '%s'" % (repo_config_name, root))
check(os.path.isfile(self.config_file), f"No {repo_config_name} found in '{root}'")
# Read configuration and validate namespace
config = self._read_config()
check(
"namespace" in config,
"%s must define a namespace." % os.path.join(root, repo_config_name),
f"{os.path.join(root, repo_config_name)} must define a namespace.",
)
self.namespace = config["namespace"]
check(
re.match(r"[a-zA-Z][a-zA-Z0-9_.]+", self.namespace),
("Invalid namespace '%s' in repo '%s'. " % (self.namespace, self.root))
+ "Namespaces must be valid python identifiers separated by '.'",
f"Invalid namespace '{self.namespace}' in repo '{self.root}'. "
"Namespaces must be valid python identifiers separated by '.'",
)
# Set up 'full_namespace' to include the super-namespace
@@ -964,23 +996,26 @@ def check(condition, msg):
packages_dir = config.get("subdirectory", packages_dir_name)
self.packages_path = os.path.join(self.root, packages_dir)
check(
os.path.isdir(self.packages_path),
"No directory '%s' found in '%s'" % (packages_dir, root),
os.path.isdir(self.packages_path), f"No directory '{packages_dir}' found in '{root}'"
)
# These are internal cache variables.
self._modules = {}
self._classes = {}
self._instances = {}
# Class attribute overrides by package name
self.overrides = overrides or {}
# Optional reference to a RepoPath to influence module import from spack.pkg
self._finder: Optional[RepoPath] = None
# Map that goes from a package name to the corresponding file stat
self._fast_package_checker = None
self._fast_package_checker: Optional[FastPackageChecker] = None
# Indexes for this repository, computed lazily
self._repo_index = None
self._cache = cache or spack.caches.MISC_CACHE
self._repo_index: Optional[RepoIndex] = None
self._cache = cache
def real_name(self, import_name):
def finder(self, value: RepoPath) -> None:
self._finder = value
def real_name(self, import_name: str) -> Optional[str]:
"""Allow users to import Spack packages using Python identifiers.
A python identifier might map to many different Spack package
@@ -999,18 +1034,21 @@ def real_name(self, import_name):
return import_name
options = nm.possible_spack_module_names(import_name)
options.remove(import_name)
try:
options.remove(import_name)
except ValueError:
pass
for name in options:
if name in self:
return name
return None
def is_prefix(self, fullname):
def is_prefix(self, fullname: str) -> bool:
"""True if fullname is a prefix of this Repo's namespace."""
parts = fullname.split(".")
return self._names[: len(parts)] == parts
def _read_config(self):
def _read_config(self) -> Dict[str, str]:
"""Check for a YAML config file in this db's root directory."""
try:
with open(self.config_file) as reponame_file:
@@ -1021,14 +1059,14 @@ def _read_config(self):
or "repo" not in yaml_data
or not isinstance(yaml_data["repo"], dict)
):
tty.die("Invalid %s in repository %s" % (repo_config_name, self.root))
tty.die(f"Invalid {repo_config_name} in repository {self.root}")
return yaml_data["repo"]
except IOError:
tty.die("Error reading %s when opening %s" % (self.config_file, self.root))
tty.die(f"Error reading {self.config_file} when opening {self.root}")
def get(self, spec):
def get(self, spec: "spack.spec.Spec") -> "spack.package_base.PackageBase":
"""Returns the package associated with the supplied spec."""
msg = "Repo.get can only be called on concrete specs"
assert isinstance(spec, spack.spec.Spec) and spec.concrete, msg
@@ -1049,16 +1087,13 @@ def get(self, spec):
# pass these through as their error messages will be fine.
raise
except Exception as e:
tty.debug(e)
# Make sure other errors in constructors hit the error
# handler by wrapping them
if spack.config.get("config:debug"):
sys.excepthook(*sys.exc_info())
raise FailedConstructorError(spec.fullname, *sys.exc_info())
tty.debug(e)
raise FailedConstructorError(spec.fullname, *sys.exc_info()) from e
@autospec
def dump_provenance(self, spec, path):
def dump_provenance(self, spec: "spack.spec.Spec", path: str) -> None:
"""Dump provenance information for a spec to a particular path.
This dumps the package file and any associated patch files.
@@ -1066,7 +1101,7 @@ def dump_provenance(self, spec, path):
"""
if spec.namespace and spec.namespace != self.namespace:
raise UnknownPackageError(
"Repository %s does not contain package %s." % (self.namespace, spec.fullname)
f"Repository {self.namespace} does not contain package {spec.fullname}."
)
package_path = self.filename_for_package_name(spec.name)
@@ -1083,17 +1118,13 @@ def dump_provenance(self, spec, path):
if os.path.exists(patch.path):
fs.install(patch.path, path)
else:
tty.warn("Patch file did not exist: %s" % patch.path)
warnings.warn(f"Patch file did not exist: {patch.path}")
# Install the package.py file itself.
fs.install(self.filename_for_package_name(spec.name), path)
def purge(self):
"""Clear entire package instance cache."""
self._instances.clear()
@property
def index(self):
def index(self) -> RepoIndex:
"""Construct the index for this repo lazily."""
if self._repo_index is None:
self._repo_index = RepoIndex(self._pkg_checker, self.namespace, cache=self._cache)
@@ -1103,42 +1134,40 @@ def index(self):
return self._repo_index
@property
def provider_index(self):
def provider_index(self) -> spack.provider_index.ProviderIndex:
"""A provider index with names *specific* to this repo."""
return self.index["providers"]
@property
def tag_index(self):
def tag_index(self) -> spack.tag.TagIndex:
"""Index of tags and which packages they're defined on."""
return self.index["tags"]
@property
def patch_index(self):
def patch_index(self) -> spack.patch.PatchCache:
"""Index of patches and packages they're defined on."""
return self.index["patches"]
@autospec
def providers_for(self, vpkg_spec):
def providers_for(self, vpkg_spec: "spack.spec.Spec") -> List["spack.spec.Spec"]:
providers = self.provider_index.providers_for(vpkg_spec)
if not providers:
raise UnknownPackageError(vpkg_spec.fullname)
return providers
@autospec
def extensions_for(self, extendee_spec):
return [
pkg_cls(spack.spec.Spec(pkg_cls.name))
for pkg_cls in self.all_package_classes()
if pkg_cls(spack.spec.Spec(pkg_cls.name)).extends(extendee_spec)
]
def extensions_for(
self, extendee_spec: "spack.spec.Spec"
) -> List["spack.package_base.PackageBase"]:
result = [pkg_cls(spack.spec.Spec(pkg_cls.name)) for pkg_cls in self.all_package_classes()]
return [x for x in result if x.extends(extendee_spec)]
def dirname_for_package_name(self, pkg_name):
"""Get the directory name for a particular package. This is the
directory that contains its package.py file."""
def dirname_for_package_name(self, pkg_name: str) -> str:
"""Given a package name, get the directory containing its package.py file."""
_, unqualified_name = self.partition_package_name(pkg_name)
return os.path.join(self.packages_path, unqualified_name)
def filename_for_package_name(self, pkg_name):
def filename_for_package_name(self, pkg_name: str) -> str:
"""Get the filename for the module we should load for a particular
package. Packages for a Repo live in
``$root/<package_name>/package.py``
@@ -1151,23 +1180,23 @@ def filename_for_package_name(self, pkg_name):
return os.path.join(pkg_dir, package_file_name)
@property
def _pkg_checker(self):
def _pkg_checker(self) -> FastPackageChecker:
if self._fast_package_checker is None:
self._fast_package_checker = FastPackageChecker(self.packages_path)
return self._fast_package_checker
def all_package_names(self, include_virtuals=False):
def all_package_names(self, include_virtuals: bool = False) -> List[str]:
"""Returns a sorted list of all package names in the Repo."""
names = sorted(self._pkg_checker.keys())
if include_virtuals:
return names
return [x for x in names if not self.is_virtual(x)]
def package_path(self, name):
def package_path(self, name: str) -> str:
"""Get path to package.py file for this repo."""
return os.path.join(self.packages_path, name, package_file_name)
def all_package_paths(self):
def all_package_paths(self) -> Generator[str, None, None]:
for name in self.all_package_names():
yield self.package_path(name)
@@ -1176,7 +1205,7 @@ def packages_with_tags(self, *tags: str) -> Set[str]:
v.intersection_update(*(self.tag_index[tag.lower()] for tag in tags))
return v
def all_package_classes(self):
def all_package_classes(self) -> Generator[Type["spack.package_base.PackageBase"], None, None]:
"""Iterator over all package *classes* in the repository.
Use this with care, because loading packages is slow.
@@ -1184,7 +1213,7 @@ def all_package_classes(self):
for name in self.all_package_names():
yield self.get_pkg_class(name)
def exists(self, pkg_name):
def exists(self, pkg_name: str) -> bool:
"""Whether a package with the supplied name exists."""
if pkg_name is None:
return False
@@ -1201,28 +1230,22 @@ def last_mtime(self):
"""Time a package file in this repo was last updated."""
return self._pkg_checker.last_mtime()
def is_virtual(self, pkg_name):
def is_virtual(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that
is used to construct the provider index, use the ``is_virtual_safe`` function.
Args:
pkg_name (str): name of the package we want to check
"""
return pkg_name in self.provider_index
def is_virtual_safe(self, pkg_name):
def is_virtual_safe(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function doesn't use the provider index.
Args:
pkg_name (str): name of the package we want to check
"""
return not self.exists(pkg_name) or self.get_pkg_class(pkg_name).virtual
def get_pkg_class(self, pkg_name):
def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]:
"""Get the class for the package out of its module.
First loads (or fetches from cache) a module for the
@@ -1234,7 +1257,8 @@ def get_pkg_class(self, pkg_name):
fullname = f"{self.full_namespace}.{pkg_name}"
try:
module = importlib.import_module(fullname)
with REPOS_FINDER.switch_repo(self._finder or self):
module = importlib.import_module(fullname)
except ImportError:
raise UnknownPackageError(fullname)
except Exception as e:
@@ -1245,26 +1269,21 @@ def get_pkg_class(self, pkg_name):
if not inspect.isclass(cls):
tty.die(f"{pkg_name}.{class_name} is not a class")
new_cfg_settings = (
spack.config.get("packages").get(pkg_name, {}).get("package_attributes", {})
)
# Clear any prior changes to class attributes in case the class was loaded from the
# same repo, but with different overrides
overridden_attrs = getattr(cls, "overridden_attrs", {})
attrs_exclusively_from_config = getattr(cls, "attrs_exclusively_from_config", [])
# Clear any prior changes to class attributes in case the config has
# since changed
for key, val in overridden_attrs.items():
setattr(cls, key, val)
for key in attrs_exclusively_from_config:
delattr(cls, key)
# Keep track of every class attribute that is overridden by the config:
# if the config changes between calls to this method, we make sure to
# restore the original config values (in case the new config no longer
# sets attributes that it used to)
# Keep track of every class attribute that is overridden: if different overrides
# dictionaries are used on the same physical repo, we make sure to restore the original
# config values
new_overridden_attrs = {}
new_attrs_exclusively_from_config = set()
for key, val in new_cfg_settings.items():
for key, val in self.overrides.get(pkg_name, {}).items():
if hasattr(cls, key):
new_overridden_attrs[key] = getattr(cls, key)
else:
@@ -1291,13 +1310,13 @@ def partition_package_name(self, pkg_name: str) -> Tuple[str, str]:
return namespace, pkg_name
def __str__(self):
return "[Repo '%s' at '%s']" % (self.namespace, self.root)
def __str__(self) -> str:
return f"Repo '{self.namespace}' at {self.root}"
def __repr__(self):
def __repr__(self) -> str:
return self.__str__()
def __contains__(self, pkg_name):
def __contains__(self, pkg_name: str) -> bool:
return self.exists(pkg_name)
@@ -1373,12 +1392,17 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
return full_path, namespace
def from_path(path: str) -> "Repo":
"""Returns a repository from the path passed as input. Injects the global misc cache."""
return Repo(path, cache=spack.caches.MISC_CACHE)
def create_or_construct(path, namespace=None):
"""Create a repository, or just return a Repo if it already exists."""
if not os.path.exists(path):
fs.mkdirp(path)
create_repo(path, namespace)
return Repo(path)
return from_path(path)
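A sketch of the two helpers in use (paths illustrative): `from_path` wires in the global misc cache, so most call sites no longer construct `Repo` directly.

```python
import spack.repo

repo = spack.repo.from_path("/path/to/existing/repo")
# create_or_construct first creates the on-disk layout if needed,
# then returns a Repo built through the same helper.
new_repo = spack.repo.create_or_construct("/path/to/new/repo", namespace="demo")
```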
def _path(configuration=None):
@@ -1387,7 +1411,9 @@ def _path(configuration=None):
return create(configuration=configuration)
def create(configuration):
def create(
configuration: Union["spack.config.Configuration", llnl.util.lang.Singleton]
) -> RepoPath:
"""Create a RepoPath from a configuration object.
Args:
@@ -1396,7 +1422,17 @@ def create(configuration):
repo_dirs = configuration.get("repos")
if not repo_dirs:
raise NoRepoConfiguredError("Spack configuration contains no package repositories.")
return RepoPath(*repo_dirs)
overrides = {}
for pkg_name, data in configuration.get("packages").items():
if pkg_name == "all":
continue
value = data.get("package_attributes", {})
if not value:
continue
overrides[pkg_name] = value
return RepoPath(*repo_dirs, cache=spack.caches.MISC_CACHE, overrides=overrides)
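A stand-alone sketch of the override harvesting above, run against an illustrative `packages:` dict rather than a real configuration object:

```python
packages = {
    "all": {"providers": {"mpi": ["openmpi"]}},                    # skipped
    "openmpi": {"package_attributes": {"canonical_url": "..."}},  # harvested
    "zlib": {"version": ["1.3"]},                                  # no attributes, skipped
}

overrides = {}
for pkg_name, data in packages.items():
    if pkg_name == "all":
        continue
    value = data.get("package_attributes", {})
    if not value:
        continue
    overrides[pkg_name] = value

print(overrides)  # {'openmpi': {'canonical_url': '...'}}
```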
#: Singleton repo path instance
@@ -1413,20 +1449,20 @@ def all_package_names(include_virtuals=False):
@contextlib.contextmanager
def use_repositories(*paths_and_repos, **kwargs):
def use_repositories(
*paths_and_repos: Union[str, Repo], override: bool = True
) -> Generator[RepoPath, None, None]:
"""Use the repositories passed as arguments within the context manager.
Args:
*paths_and_repos: paths to the repositories to be used, or
already constructed Repo objects
override (bool): if True use only the repositories passed as input,
override: if True use only the repositories passed as input,
if False add them to the top of the list of current repositories.
Returns:
Corresponding RepoPath object
"""
global PATH
# TODO (Python 2.7): remove this kwargs on deprecation of Python 2.7 support
override = kwargs.get("override", True)
paths = [getattr(x, "root", x) for x in paths_and_repos]
scope_name = "use-repo-{}".format(uuid.uuid4())
repos_key = "repos:" if override else "repos"
@@ -1435,7 +1471,8 @@ def use_repositories(*paths_and_repos, **kwargs):
)
PATH, saved = create(configuration=spack.config.CONFIG), PATH
try:
yield PATH
with REPOS_FINDER.switch_repo(PATH): # type: ignore
yield PATH
finally:
spack.config.CONFIG.remove_scope(scope_name=scope_name)
PATH = saved
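Typical use after the signature cleanup (path illustrative): `override` is now a proper keyword-only argument instead of being pulled out of `**kwargs`.

```python
import spack.repo

with spack.repo.use_repositories("/path/to/mock/repo", override=True) as repo_path:
    names = repo_path.all_package_names()  # sees only the mock repo
```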
@@ -1535,10 +1572,9 @@ class UnknownNamespaceError(UnknownEntityError):
"""Raised when we encounter an unknown namespace"""
def __init__(self, namespace, name=None):
msg, long_msg = "Unknown namespace: {}".format(namespace), None
msg, long_msg = f"Unknown namespace: {namespace}", None
if name == "yaml":
long_msg = "Did you mean to specify a filename with './{}.{}'?"
long_msg = long_msg.format(namespace, name)
long_msg = f"Did you mean to specify a filename with './{namespace}.{name}'?"
super().__init__(msg, long_msg)

View File

@@ -23,6 +23,7 @@
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import elide_list
import spack
import spack.binary_distribution
@@ -621,8 +622,9 @@ def _external_config_with_implicit_externals(configuration):
class ErrorHandler:
def __init__(self, model):
def __init__(self, model, input_specs: List[spack.spec.Spec]):
self.model = model
self.input_specs = input_specs
self.full_model = None
def multiple_values_error(self, attribute, pkg):
@@ -709,12 +711,13 @@ def handle_error(self, msg, *args):
return msg
def message(self, errors) -> str:
messages = [
f" {idx+1: 2}. {self.handle_error(msg, *args)}"
input_specs = ", ".join(elide_list([f"`{s}`" for s in self.input_specs], 5))
header = f"failed to concretize {input_specs} for the following reasons:"
messages = (
f" {idx+1:2}. {self.handle_error(msg, *args)}"
for idx, (_, msg, args) in enumerate(errors)
]
header = "concretization failed for the following reasons:\n"
return "\n".join([header] + messages)
)
return "\n".join((header, *messages))
def raise_if_errors(self):
initial_error_args = extract_args(self.model, "error")
@@ -750,7 +753,7 @@ def on_model(model):
f"unexpected error during concretization [{str(e)}]. "
f"Please report a bug at https://github.com/spack/spack/issues"
)
raise spack.error.SpackError(msg)
raise spack.error.SpackError(msg) from e
raise UnsatisfiableSpecError(msg)
@@ -846,8 +849,6 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
parent_dir = os.path.dirname(__file__)
self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
self.control.load(os.path.join(parent_dir, "heuristic_separate.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
@@ -896,7 +897,7 @@ def on_model(model):
min_cost, best_model = min(models)
# first check for errors
error_handler = ErrorHandler(best_model)
error_handler = ErrorHandler(best_model, specs)
error_handler.raise_if_errors()
# build specs from spec attributes in the model
@@ -1882,11 +1883,8 @@ def _spec_clauses(
)
clauses.append(f.variant_value(spec.name, vname, value))
if variant.propagate:
clauses.append(
f.variant_propagation_candidate(spec.name, vname, value, spec.name)
)
clauses.append(f.propagate(spec.name, fn.variant_value(vname, value)))
# Tell the concretizer that this is a possible value for the
# variant, to account for things like int/str values where we
@@ -1919,9 +1917,12 @@ def _spec_clauses(
for flag_type, flags in spec.compiler_flags.items():
for flag in flags:
clauses.append(f.node_flag(spec.name, flag_type, flag))
clauses.append(f.node_flag_source(spec.name, flag_type, spec.name))
if not spec.concrete and flag.propagate is True:
clauses.append(f.node_flag_propagate(spec.name, flag_type))
clauses.append(
f.propagate(
spec.name, fn.node_flag(flag_type, flag), fn.edge_types("link", "run")
)
)
# dependencies
if spec.concrete:
@@ -2746,9 +2747,7 @@ class _Head:
node_compiler = fn.attr("node_compiler_set")
node_compiler_version = fn.attr("node_compiler_version_set")
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
propagate = fn.attr("propagate")
class _Body:
@@ -2763,9 +2762,7 @@ class _Body:
node_compiler = fn.attr("node_compiler")
node_compiler_version = fn.attr("node_compiler_version")
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
propagate = fn.attr("propagate")
class ProblemInstanceBuilder:
@@ -3239,6 +3236,39 @@ def requires(self, impose: str, *, when: str):
self.runtime_conditions.add((imposed_spec, when_spec))
self.reset()
def propagate(self, constraint_str: str, *, when: str):
msg = "the 'propagate' method can be called only with pkg('*')"
assert self.current_package == "*", msg
when_spec = spack.spec.Spec(when)
assert when_spec.name is None, "only anonymous when specs are accepted"
placeholder = "XXX"
node_variable = "node(ID, Package)"
when_spec.name = placeholder
body_clauses = self._setup.spec_clauses(when_spec, body=True)
body_str = (
f" {f',{os.linesep} '.join(str(x) for x in body_clauses)},\n"
f" not external({node_variable}),\n"
f" not runtime(Package)"
).replace(f'"{placeholder}"', f"{node_variable}")
constraint_spec = spack.spec.Spec(constraint_str)
assert constraint_spec.name is None, "only anonymous constraint specs are accepted"
constraint_spec.name = placeholder
constraint_clauses = self._setup.spec_clauses(constraint_spec, body=False)
for clause in constraint_clauses:
if clause.args[0] == "node_compiler_version_satisfies":
self._setup.compiler_version_constraints.add(constraint_spec.compiler)
args = f'"{constraint_spec.compiler.name}", "{constraint_spec.compiler.versions}"'
head_str = f"propagate({node_variable}, node_compiler_version_satisfies({args}))"
rule = f"{head_str} :-\n{body_str}.\n\n"
self.rules.append(rule)
self.reset()
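The placeholder trick above is easy to miss: clauses are generated for a spec with a dummy name, then the quoted dummy is textually rewritten into the ASP node variable so the emitted rule ranges over every node. A toy illustration (the clause string is made up):

```python
placeholder = "XXX"
node_variable = "node(ID, Package)"

clause = f'attr("variant_value", "{placeholder}", "shared", "True")'  # illustrative clause
rule_body = clause.replace(f'"{placeholder}"', node_variable)
print(rule_body)  # attr("variant_value", node(ID, Package), "shared", "True")
```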
def consume_facts(self):
"""Consume the facts collected by this object, and emits rules and
facts for the runtimes.
@@ -3318,6 +3348,8 @@ def hash(self, node, h):
def node(self, node):
if node not in self._specs:
self._specs[node] = spack.spec.Spec(node.pkg)
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
self._specs[node].compiler_flags[flag_type] = []
def _arch(self, node):
arch = self._specs[node].architecture
@@ -3370,9 +3402,6 @@ def node_flag(self, node, flag_type, flag):
def node_flag_source(self, node, flag_type, source):
self._flag_sources[(node, flag_type)].add(source)
def no_flags(self, node, flag_type):
self._specs[node].compiler_flags[flag_type] = []
def external_spec_selected(self, node, idx):
"""This means that the external spec and index idx has been selected for this package."""
packages_yaml = _external_config_with_implicit_externals(spack.config.CONFIG)
@@ -3465,7 +3494,7 @@ def reorder_flags(self):
ordered_compiler_flags = list(llnl.util.lang.dedupe(from_compiler + from_sources))
compiler_flags = spec.compiler_flags.get(flag_type, [])
msg = "%s does not equal %s" % (set(compiler_flags), set(ordered_compiler_flags))
msg = f"{set(compiler_flags)} does not equal {set(ordered_compiler_flags)}"
assert set(compiler_flags) == set(ordered_compiler_flags), msg
spec.compiler_flags.update({flag_type: ordered_compiler_flags})
@@ -3535,9 +3564,8 @@ def build_specs(self, function_tuples):
# do not bother calling actions on it except for node_flag_source,
# since node_flag_source is tracking information not in the spec itself
spec = self._specs.get(args[0])
if spec and spec.concrete:
if name != "node_flag_source":
continue
if spec and spec.concrete and name != "node_flag_source":
continue
action(*args)

View File

@@ -29,7 +29,6 @@
:- attr("variant_value", PackageNode, _, _), not attr("node", PackageNode).
:- attr("node_flag_compiler_default", PackageNode), not attr("node", PackageNode).
:- attr("node_flag", PackageNode, _, _), not attr("node", PackageNode).
:- attr("no_flags", PackageNode, _), not attr("node", PackageNode).
:- attr("external_spec_selected", PackageNode, _), not attr("node", PackageNode).
:- attr("depends_on", ParentNode, _, _), not attr("node", ParentNode).
:- attr("depends_on", _, ChildNode, _), not attr("node", ChildNode).
@@ -612,25 +611,18 @@ do_not_impose(EffectID, node(X, Package))
% Virtual dependency weights
%-----------------------------------------------------------------------------
% A provider may have different possible weights depending on whether it's an external
% or not, or on preferences expressed in packages.yaml etc. This rule ensures that
% A provider has different possible weights depending on its preference. This rule ensures that
% we select the weight, among the possible ones, that minimizes the overall objective function.
1 { provider_weight(DependencyNode, VirtualNode, Weight) :
possible_provider_weight(DependencyNode, VirtualNode, Weight, _) } 1
:- provider(DependencyNode, VirtualNode), internal_error("Package provider weights must be unique").
% A provider that is an external can use a weight of 0
possible_provider_weight(DependencyNode, VirtualNode, 0, "external")
:- provider(DependencyNode, VirtualNode),
external(DependencyNode).
% A provider mentioned in the default configuration can use a weight
% according to its priority in the list of providers
% Any configured provider has a weight based on index in the preference list
possible_provider_weight(node(ProviderID, Provider), node(VirtualID, Virtual), Weight, "default")
:- provider(node(ProviderID, Provider), node(VirtualID, Virtual)),
default_provider_preference(Virtual, Provider, Weight).
% Any provider can use 100 as a weight, which is very high and discourage its use
% Any non-configured provider has a default weight of 100
possible_provider_weight(node(ProviderID, Provider), VirtualNode, 100, "fallback")
:- provider(node(ProviderID, Provider), VirtualNode).
@@ -812,37 +804,6 @@ node_has_variant(node(ID, Package), Variant) :-
pkg_fact(Package, variant(Variant)),
attr("node", node(ID, Package)).
% Variant propagation is forwarded to dependencies
attr("variant_propagation_candidate", PackageNode, Variant, Value, Source) :-
attr("node", PackageNode),
depends_on(ParentNode, PackageNode),
attr("variant_value", node(_, Source), Variant, Value),
attr("variant_propagation_candidate", ParentNode, Variant, _, Source).
% If the node is a candidate, and it has the variant and value,
% then those variant and value should be propagated
attr("variant_propagate", node(ID, Package), Variant, Value, Source) :-
attr("variant_propagation_candidate", node(ID, Package), Variant, Value, Source),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% Propagate the value, if there is the corresponding attribute
attr("variant_value", PackageNode, Variant, Value) :- attr("variant_propagate", PackageNode, Variant, Value, _).
% If a variant is propagated, we cannot have extraneous values (this is for multi valued variants)
variant_is_propagated(PackageNode, Variant) :- attr("variant_propagate", PackageNode, Variant, _, _).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not attr("variant_propagate", PackageNode, Variant, Value, _).
% Cannot receive different values from different sources on the same variant
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, Package, Value1, Value2) :-
attr("variant_propagate", node(X, Package), Variant, Value1, Source1),
attr("variant_propagate", node(X, Package), Variant, Value2, Source2),
node_has_variant(node(X, Package), Variant),
Value1 < Value2, Source1 < Source2.
% a variant cannot be set if it is not a variant on the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
:- attr("variant_set", node(X, Package), Variant),
@@ -920,7 +881,7 @@ variant_not_default(node(ID, Package), Variant, Value)
% variants set explicitly on the CLI don't count as non-default
not attr("variant_set", node(ID, Package), Variant, Value),
% variant values forced by propagation don't count as non-default
not attr("variant_propagate", node(ID, Package), Variant, Value, _),
not propagate(node(ID, Package), variant_value(Variant, Value)),
% variants set on externals that we could use don't count as non-default
% this makes spack prefer to use an external over rebuilding with the
% default configuration
@@ -933,7 +894,7 @@ variant_default_not_used(node(ID, Package), Variant, Value)
:- variant_default_value(Package, Variant, Value),
node_has_variant(node(ID, Package), Variant),
not attr("variant_value", node(ID, Package), Variant, Value),
not attr("variant_propagate", node(ID, Package), Variant, _, _),
not propagate(node(ID, Package), variant_value(Variant, _)),
attr("node", node(ID, Package)).
% The variant is set in an external spec
@@ -990,6 +951,101 @@ pkg_fact(Package, variant_single_value("dev_path"))
#defined variant_default_value/3.
#defined variant_default_value_from_packages_yaml/3.
%-----------------------------------------------------------------------------
% Propagation semantics
%-----------------------------------------------------------------------------
% Propagation roots have a corresponding attr("propagate", ...)
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute).
propagate(RootNode, PropagatedAttribute, EdgeTypes) :- attr("propagate", RootNode, PropagatedAttribute, EdgeTypes).
% Propagate an attribute along edges to child nodes
propagate(ChildNode, PropagatedAttribute) :-
propagate(ParentNode, PropagatedAttribute),
depends_on(ParentNode, ChildNode).
propagate(ChildNode, PropagatedAttribute, edge_types(DepType1, DepType2)) :-
propagate(ParentNode, PropagatedAttribute, edge_types(DepType1, DepType2)),
depends_on(ParentNode, ChildNode),
1 { attr("depends_on", ParentNode, ChildNode, DepType1); attr("depends_on", ParentNode, ChildNode, DepType2) }.
%-----------------------------------------------------------------------------
% Activation of propagated values
%-----------------------------------------------------------------------------
%----
% Variants
%----
% If a variant is propagated, and can be accepted, set its value
attr("variant_value", node(ID, Package), Variant, Value) :-
propagate(node(ID, Package), variant_value(Variant, Value)),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% If a variant is propagated, we cannot have extraneous values
variant_is_propagated(PackageNode, Variant) :-
attr("variant_value", PackageNode, Variant, Value),
propagate(PackageNode, variant_value(Variant, Value)),
not attr("variant_set", PackageNode, Variant).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not propagate(PackageNode, variant_value(Variant, Value)).
%----
% Flags
%----
% A propagated flag implies:
% 1. The same flag type is not set on this node
% 2. This node has the same compiler as the propagation source
propagated_flag(node(PackageID, Package), node_flag(FlagType, Flag), SourceNode) :-
propagate(node(PackageID, Package), node_flag(FlagType, Flag), _),
not attr("node_flag_set", node(PackageID, Package), FlagType, _),
% Same compiler as propagation source
node_compiler(node(PackageID, Package), CompilerID),
node_compiler(SourceNode, CompilerID),
attr("propagate", SourceNode, node_flag(FlagType, Flag), _),
node(PackageID, Package) != SourceNode,
not runtime(Package).
attr("node_flag", PackageNode, FlagType, Flag) :- propagated_flag(PackageNode, node_flag(FlagType, Flag), _).
attr("node_flag_source", PackageNode, FlagType, SourceNode) :- propagated_flag(PackageNode, node_flag(FlagType, _), SourceNode).
% Cannot propagate the same flag from two distinct sources
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, Package, FlagType) :-
propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source1)),
propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source2)),
Source1 < Source2.
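The error rule above rejects models in which one node would receive the same flag type from two distinct propagation sources; `Source1 < Source2` is the usual symmetry-breaking guard so each conflicting pair is reported once. A small Python sketch of the same pairwise check (node names are invented):

```python
# Sketch: flag two distinct sources propagating the same flag type to
# one node; min/max plays the role of the Source1 < Source2 guard.
propagated_flags = [
    # (target, flag_type, source)
    ("zlib", "cflags", "mpileaks"),
    ("zlib", "cflags", "callpath"),
    ("zlib", "ldflags", "mpileaks"),
]

conflicts = {
    (target, ftype, min(s1, s2), max(s1, s2))
    for target, ftype, s1 in propagated_flags
    for target2, ftype2, s2 in propagated_flags
    if (target, ftype) == (target2, ftype2) and s1 != s2
}
for target, ftype, s1, s2 in sorted(conflicts):
    print(f"{s1} and {s2} cannot both propagate compiler flags '{ftype}' to {target}")
# -> callpath and mpileaks cannot both propagate compiler flags 'cflags' to zlib
```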
%----
% Compiler constraints
%----
attr("node_compiler_version_satisfies", node(ID, Package), Compiler, Version) :-
propagate(node(ID, Package), node_compiler_version_satisfies(Compiler, Version)),
node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
not runtime(Package),
not external(Package).
%-----------------------------------------------------------------------------
% Runtimes
%-----------------------------------------------------------------------------
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% If we build packages, the runtime nodes must use an available compiler
1 { node_compiler(PackageNode, CompilerID) : build(PackageNode), not external(PackageNode) } :-
has_built_packages(),
runtime(RuntimePackage),
node_compiler(node(_, RuntimePackage), CompilerID).
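In plain terms: as soon as anything in the DAG is actually built, every compiler attached to a runtime node must also be the compiler of at least one built, non-external package. A compact sketch of the invariant (names are illustrative):

```python
# Sketch: each runtime node's compiler must also be the compiler of at
# least one built (non-external) package in the DAG.
node_compiler = {"gcc-runtime": "gcc@12.3.0", "zlib": "gcc@12.3.0", "cmake": "clang@16"}
built = {"zlib", "cmake"}  # non-external packages we build
runtimes = {"gcc-runtime"}

for runtime in runtimes:
    assert any(node_compiler[pkg] == node_compiler[runtime] for pkg in built)
```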
%-----------------------------------------------------------------------------
% Platform semantics
%-----------------------------------------------------------------------------
@@ -1091,10 +1147,18 @@ attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).
% each node has the weight of its assigned target
node_target_weight(node(ID, Package), Weight)
:- attr("node", node(ID, Package)),
attr("node_target", node(ID, Package), Target),
target_weight(Target, Weight).
target_weight(Target, 0)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
attr("node_target_set", PackageNode, Target).
node_target_weight(PackageNode, MinWeight)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
target(Target),
MinWeight = #min { Weight : target_weight(Target, Weight) }.
:- attr("node_target", PackageNode, Target), not node_target_weight(PackageNode, _).
% compatibility rules for targets among nodes
node_target_match(ParentNode, DependencyNode)
@@ -1156,12 +1220,12 @@ error(10, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, _).
not compiler_version_satisfies(Compiler, Constraint, _).
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, ID),
@@ -1242,45 +1306,9 @@ error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_miss
% Compiler flags
%-----------------------------------------------------------------------------
% propagate flags when compilers match
can_inherit_flags(PackageNode, DependencyNode, FlagType)
:- same_compiler(PackageNode, DependencyNode),
not attr("node_flag_set", DependencyNode, FlagType, _),
flag_type(FlagType).
same_compiler(PackageNode, DependencyNode)
:- depends_on(PackageNode, DependencyNode),
node_compiler(PackageNode, CompilerID),
node_compiler(DependencyNode, CompilerID),
compiler_id(CompilerID).
node_flag_inherited(DependencyNode, FlagType, Flag)
:- attr("node_flag_set", PackageNode, FlagType, Flag),
can_inherit_flags(PackageNode, DependencyNode, FlagType),
attr("node_flag_propagate", PackageNode, FlagType).
% Ensure propagation
:- node_flag_inherited(PackageNode, FlagType, Flag),
can_inherit_flags(PackageNode, DependencyNode, FlagType),
attr("node_flag_propagate", PackageNode, FlagType).
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, Package, FlagType) :-
depends_on(Source1, Package),
depends_on(Source2, Package),
attr("node_flag_propagate", Source1, FlagType),
attr("node_flag_propagate", Source2, FlagType),
can_inherit_flags(Source1, Package, FlagType),
can_inherit_flags(Source2, Package, FlagType),
Source1 < Source2.
% remember where flags came from
attr("node_flag_source", PackageNode, FlagType, PackageNode)
:- attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_source", DependencyNode, FlagType, Q)
:- attr("node_flag_source", PackageNode, FlagType, Q),
node_flag_inherited(DependencyNode, FlagType, _),
attr("node_flag_propagate", PackageNode, FlagType).
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag", PackageNode, FlagType, _), attr("hash", PackageNode, _).
% compiler flags from compilers.yaml are put on nodes if compiler matches
attr("node_flag", PackageNode, FlagType, Flag)
@@ -1300,15 +1328,8 @@ attr("node_flag_compiler_default", PackageNode)
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
% if a flag is set to something or inherited, it's included
% Flag set to something
attr("node_flag", PackageNode, FlagType, Flag) :- attr("node_flag_set", PackageNode, FlagType, Flag).
attr("node_flag", PackageNode, FlagType, Flag) :- node_flag_inherited(PackageNode, FlagType, Flag).
% if no node flags are set for a type, there are no flags.
attr("no_flags", PackageNode, FlagType)
:- not attr("node_flag", PackageNode, FlagType, _),
attr("node", PackageNode),
flag_type(FlagType).
#defined compiler_flag/3.
@@ -1497,7 +1518,7 @@ opt_criterion(45, "preferred providers (non-roots)").
}.
% Try to minimize the number of compiler mismatches in the DAG.
opt_criterion(40, "compiler mismatches that are not from CLI").
opt_criterion(40, "compiler mismatches that are not required").
#minimize{ 0@240: #true }.
#minimize{ 0@40: #true }.
#minimize{
@@ -1507,7 +1528,7 @@ opt_criterion(40, "compiler mismatches that are not from CLI").
not runtime(Dependency)
}.
opt_criterion(39, "compiler mismatches that are not from CLI").
opt_criterion(39, "compiler mismatches that are required").
#minimize{ 0@239: #true }.
#minimize{ 0@39: #true }.
#minimize{

View File

@@ -4,21 +4,35 @@
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed up solves (node with ID 0)
% Heuristic to speed up solves
%=============================================================================
% No duplicates by default (most of them will be true)
#heuristic attr("node", node(PackageID, Package)). [100, init]
#heuristic attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("virtual_node", node(VirtualID, Virtual)). [100, init]
#heuristic attr("node", node(1..X-1, Package)) : max_dupes(Package, X), not virtual(Package), X > 1. [-1, sign]
#heuristic attr("virtual_node", node(1..X-1, Package)) : max_dupes(Package, X), virtual(Package) , X > 1. [-1, sign]
%-----------------
% Domain heuristic
%-----------------
% Pick preferred version
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)). [40, init]
#heuristic version_weight(node(PackageID, Package), 0) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
% Root node
#heuristic attr("version", node(0, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic version_weight(node(0, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic attr("variant_value", node(0, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("root", node(0, Package)). [35, true]
#heuristic attr("node_target", node(0, Package), Target) : target_weight(Target, 0), attr("root", node(0, Package)). [35, true]
#heuristic node_target_weight(node(0, Package), 0) : attr("root", node(0, Package)). [35, true]
#heuristic node_compiler(node(0, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("root", node(0, Package)). [35, true]
% Use default variants
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, true]
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : not variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, false]
% Providers
#heuristic attr("node", node(0, Package)) : default_provider_preference(Virtual, Package, 0), possible_in_link_run(Package). [30, true]
% Use default operating system and platform
#heuristic attr("node_os", node(PackageID, Package), OS) : os(OS, 0), attr("root", node(PackageID, Package)). [40, true]
#heuristic attr("node_platform", node(PackageID, Package), Platform) : allowed_platform(Platform), attr("root", node(PackageID, Package)). [40, true]
% Use default targets
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [30, init]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, 0), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
% Use the default compilers
#heuristic node_compiler(node(PackageID, Package), ID) : compiler_weight(ID, 0), compiler_id(ID), attr("node", node(PackageID, Package)). [30, init]

View File

@@ -1,24 +0,0 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed up solves (node with ID > 0)
%=============================================================================
% node(ID, _)
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
% node(ID, _), split build dependencies
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]

View File

@@ -18,9 +18,6 @@ error(100, "Cannot reuse {0} since we cannot determine libc compatibility", Reus
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% A libc is needed in the DAG
:- has_built_packages(), not provider(_, node(0, "libc")).

View File

@@ -12,6 +12,7 @@
%=============================================================================
% macOS
os_compatible("sequoia", "sonoma").
os_compatible("sonoma", "ventura").
os_compatible("ventura", "monterey").
os_compatible("monterey", "bigsur").

View File

@@ -1332,6 +1332,12 @@ def tree(
if color is None:
color = clr.get_color_when()
# reduce deptypes over all in-edges when covering nodes
if show_types and cover == "nodes":
deptype_lookup: Dict[str, dt.DepFlag] = collections.defaultdict(dt.DepFlag)
for edge in traverse.traverse_edges(specs, cover="edges", deptype=deptypes, root=False):
deptype_lookup[edge.spec.dag_hash()] |= edge.depflag
for d, dep_spec in traverse.traverse_tree(
sorted(specs), cover=cover, deptype=deptypes, depth_first=depth_first, key=key
):
@@ -1358,11 +1364,7 @@ def tree(
if show_types:
if cover == "nodes":
# when only covering nodes, we merge dependency types
# from all dependents before showing them.
depflag = 0
for ds in node.edges_from_dependents():
depflag |= ds.depflag
depflag = deptype_lookup[dep_spec.spec.dag_hash()]
else:
# when covering edges or paths, we show dependency
# types only for the edge through which we visited
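The replacement above trades a per-node scan of `edges_from_dependents()` for one pre-pass that OR-s every in-edge's dependency type into a dictionary keyed by DAG hash, so printing each tree line becomes a single lookup. The merging idiom in isolation (flag values here are invented; the real code uses `dt.DepFlag`):

```python
# Sketch of the depflag merge: dependency types are bit flags, so all
# in-edges of a node reduce with bitwise OR in one pass.
import collections

BUILD, LINK, RUN, TEST = 1, 2, 4, 8  # illustrative flag values

in_edges = [("zlib-hash", BUILD), ("zlib-hash", LINK), ("cmake-hash", BUILD)]

deptype_lookup = collections.defaultdict(int)
for node_hash, depflag in in_edges:
    deptype_lookup[node_hash] |= depflag

assert deptype_lookup["zlib-hash"] == BUILD | LINK
```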
@@ -4258,29 +4260,21 @@ def __getitem__(self, name: str):
csv = query_parameters.pop().strip()
query_parameters = re.split(r"\s*,\s*", csv)
# In some cases a package appears multiple times in the same DAG for *distinct*
# specs. For example, a build-type dependency may itself depend on a package
# the current spec depends on, but their specs may differ. Therefore we iterate
# in an order here that prioritizes the build, test and runtime dependencies;
# only when we don't find the package do we consider the full DAG.
order = lambda: itertools.chain(
self.traverse(deptype="link"),
self.dependencies(deptype=dt.BUILD | dt.RUN | dt.TEST),
self.traverse(), # fall back to a full search
self.traverse_edges(deptype=dt.LINK, order="breadth", cover="edges"),
self.edges_to_dependencies(depflag=dt.BUILD | dt.RUN | dt.TEST),
self.traverse_edges(deptype=dt.ALL, order="breadth", cover="edges"),
)
# Consider runtime dependencies and direct build/test deps before transitive dependencies,
# and prefer matches closest to the root.
try:
child: Spec = next(
itertools.chain(
# Regular specs
(x for x in order() if x.name == name),
(
x
for x in order()
if (not x.virtual)
and any(name in edge.virtuals for edge in x.edges_from_dependents())
),
(x for x in order() if (not x.virtual) and x.package.provides(name)),
e.spec
for e in itertools.chain(
(e for e in order() if e.spec.name == name or name in e.virtuals),
# for historical reasons
(e for e in order() if e.spec.concrete and e.spec.package.provides(name)),
)
)
except StopIteration:
@@ -4656,7 +4650,7 @@ def colored_str(self):
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
def install_status(self):
def install_status(self) -> InstallStatus:
"""Helper for tree to print DB install status."""
if not self.concrete:
return InstallStatus.absent
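The `__getitem__` rewrite above searches edges instead of specs, in priority order: breadth-first link-type edges (so matches nearest the root win), then direct build/run/test edges, then the whole DAG; an edge matches when the requested name equals the spec's name or appears among the edge's virtuals. A condensed sketch of that first-match-in-priority-order pattern (the `Edge` type and tiers are stand-ins, not Spack's classes):

```python
# Sketch: chain candidate tiers in priority order and return the first
# edge whose spec name or provided virtuals match the query.
import itertools
from typing import Iterable, NamedTuple, Tuple

class Edge(NamedTuple):
    name: str
    virtuals: Tuple[str, ...]

def lookup(name: str, *tiers: Iterable[Edge]) -> Edge:
    candidates = itertools.chain.from_iterable(tiers)
    return next(e for e in candidates if e.name == name or name in e.virtuals)

link_edges = [Edge("zlib", ()), Edge("mpich", ("mpi",))]
full_dag = [Edge("cmake", ())]
assert lookup("mpi", link_edges, full_dag).name == "mpich"
```

As in the real code, an exhausted search raises `StopIteration`, which the caller turns into a lookup error.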

View File

@@ -212,10 +212,7 @@ def _expand_matrix_constraints(matrix_config):
results = []
for combo in itertools.product(*expanded_rows):
# Construct a combined spec to test against excludes
flat_combo = [constraint for constraint_list in combo for constraint in constraint_list]
# Resolve abstract hashes so we can exclude by their concrete properties
flat_combo = [Spec(x).lookup_hash() for x in flat_combo]
flat_combo = [Spec(constraint) for constraints in combo for constraint in constraints]
test_spec = flat_combo[0].copy()
for constraint in flat_combo[1:]:
@@ -231,7 +228,9 @@ def _expand_matrix_constraints(matrix_config):
spack.variant.substitute_abstract_variants(test_spec)
except spack.variant.UnknownVariantError:
pass
if any(test_spec.satisfies(x) for x in excludes):
# Resolve abstract hashes for exclusion criteria
if any(test_spec.lookup_hash().satisfies(x) for x in excludes):
continue
if sigil:
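`_expand_matrix_constraints` builds one candidate spec per row combination via `itertools.product`, then drops combinations matching an exclude; after this change, abstract hashes are resolved only when an exclude is actually checked (`test_spec.lookup_hash().satisfies(x)`) rather than for every combination up front. A toy version of the expand-then-exclude flow (the set-membership check stands in for `Spec.satisfies`):

```python
# Toy expand-and-exclude: each matrix row contributes one constraint per
# combination; a combination is dropped when it matches an exclude.
import itertools

rows = [["zlib", "libpng"], ["+shared", "~shared"]]
excludes = {("libpng", "~shared")}

kept = [combo for combo in itertools.product(*rows) if combo not in excludes]
for combo in kept:
    print(" ".join(combo))  # zlib +shared, zlib ~shared, libpng +shared
```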

View File

@@ -346,8 +346,6 @@ class Stage(LockableStagingDir):
similar, and are intended to persist for only one run of spack.
"""
#: Most staging is managed by Spack. DIYStage is one exception.
needs_fetching = True
requires_patch_success = True
def __init__(
@@ -772,8 +770,6 @@ def __init__(self):
"cache_mirror",
"steal_source",
"disable_mirrors",
"needs_fetching",
"requires_patch_success",
]
)
@@ -812,6 +808,10 @@ def path(self):
def archive_file(self):
return self[0].archive_file
@property
def requires_patch_success(self):
return self[0].requires_patch_success
@property
def keep(self):
return self[0].keep
@@ -822,64 +822,7 @@ def keep(self, value):
item.keep = value
class DIYStage:
"""
Simple class that allows any directory to be a spack stage. Consequently,
it does not expect or require that the source path adhere to the standard
directory naming convention.
"""
needs_fetching = False
requires_patch_success = False
def __init__(self, path):
if path is None:
raise ValueError("Cannot construct DIYStage without a path.")
elif not os.path.isdir(path):
raise StagePathError("The stage path directory does not exist:", path)
self.archive_file = None
self.path = path
self.source_path = path
self.created = True
# DIY stages do nothing as context managers.
def __enter__(self):
pass
def __exit__(self, exc_type, exc_val, exc_tb):
pass
def fetch(self, *args, **kwargs):
tty.debug("No need to fetch for DIY.")
def check(self):
tty.debug("No checksum needed for DIY.")
def expand_archive(self):
tty.debug("Using source directory: {0}".format(self.source_path))
@property
def expanded(self):
"""Returns True since the source_path must exist."""
return True
def restage(self):
raise RestageError("Cannot restage a DIY stage.")
def create(self):
self.created = True
def destroy(self):
# No need to destroy DIY stage.
pass
def cache_local(self):
tty.debug("Sources for DIY stages are not cached")
class DevelopStage(LockableStagingDir):
needs_fetching = False
requires_patch_success = False
def __init__(self, name, dev_path, reference_link):

View File

@@ -371,7 +371,6 @@ def use_store(
data.update(extra_data)
# Swap the store with the one just constructed and return it
ensure_singleton_created()
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope(name=scope_name, data={"config": {"install_tree": data}})
)

View File

@@ -79,9 +79,11 @@ def restore(self):
self.test_state.restore()
spack.main.spack_working_dir = self.spack_working_dir
env = pickle.load(self.serialized_env) if _SERIALIZE else self.env
pkg = pickle.load(self.serialized_pkg) if _SERIALIZE else self.pkg
if env:
spack.environment.activate(env)
# The order of operations is important, since the package might be retrieved
# from a repo defined within the environment configuration
pkg = pickle.load(self.serialized_pkg) if _SERIALIZE else self.pkg
return pkg

View File

@@ -213,7 +213,9 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
spec = Spec(f"a %gcc@10 foobar=bar target={root_target_range} ^b target={dep_target_range}")
spec = Spec(
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
assert spec.target == spec["b"].target == result
assert spec.target == spec["pkg-b"].target == result

View File

@@ -105,18 +105,18 @@ def config_directory(tmpdir_factory):
@pytest.fixture(scope="function")
def default_config(tmpdir, config_directory, monkeypatch, install_mockery_mutable_config):
# This fixture depends on install_mockery_mutable_config to ensure
def default_config(tmpdir, config_directory, monkeypatch, install_mockery):
# This fixture depends on install_mockery to ensure
# there is a clear order of initialization. The substitution of the
# config scopes here is done on top of the substitution that comes with
# install_mockery_mutable_config
# install_mockery
mutable_dir = tmpdir.mkdir("mutable_config").join("tmp")
config_directory.copy(mutable_dir)
cfg = spack.config.Configuration(
*[
spack.config.ConfigScope(name, str(mutable_dir))
for name in ["site/%s" % platform.system().lower(), "site", "user"]
spack.config.DirectoryConfigScope(name, str(mutable_dir))
for name in [f"site/{platform.system().lower()}", "site", "user"]
]
)
@@ -398,9 +398,7 @@ def fake_dag_hash(spec, length=None):
return "tal4c7h4z0gqmixb1eqa92mjoybxn5l6"[:length]
@pytest.mark.usefixtures(
"install_mockery_mutable_config", "mock_packages", "mock_fetch", "test_mirror"
)
@pytest.mark.usefixtures("install_mockery", "mock_packages", "mock_fetch", "test_mirror")
def test_spec_needs_rebuild(monkeypatch, tmpdir):
"""Make sure needs_rebuild properly compares remote hash
against locally computed one, avoiding unnecessary rebuilds"""
@@ -429,7 +427,7 @@ def test_spec_needs_rebuild(monkeypatch, tmpdir):
assert rebuild
@pytest.mark.usefixtures("install_mockery_mutable_config", "mock_packages", "mock_fetch")
@pytest.mark.usefixtures("install_mockery", "mock_packages", "mock_fetch")
def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
"""Ensure spack buildcache index only reports available packages"""
@@ -587,9 +585,7 @@ def test_update_sbang(tmpdir, test_mirror):
str(archspec.cpu.host().family) != "x86_64",
reason="test data uses gcc 4.5.0 which does not support aarch64",
)
def test_install_legacy_buildcache_layout(
mutable_config, compiler_factory, install_mockery_mutable_config
):
def test_install_legacy_buildcache_layout(mutable_config, compiler_factory, install_mockery):
"""Legacy buildcache layout involved a nested archive structure
where the .spack file contained a repeated spec.json and another
compressed archive file containing the install tree. This test

View File

@@ -228,3 +228,25 @@ def test_source_is_disabled(mutable_config):
spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
@pytest.mark.regression("45247")
def test_use_store_does_not_try_writing_outside_root(tmp_path, monkeypatch, mutable_config):
"""Tests that when we use the 'use_store' context manager, there is no attempt at creating
a Store outside the given root.
"""
initial_store = mutable_config.get("config:install_tree:root")
user_store = tmp_path / "store"
fn = spack.store.Store.__init__
def _checked_init(self, root, *args, **kwargs):
fn(self, root, *args, **kwargs)
assert self.root == str(user_store)
monkeypatch.setattr(spack.store.Store, "__init__", _checked_init)
spack.store.reinitialize()
with spack.store.use_store(user_store):
assert spack.config.CONFIG.get("config:install_tree:root") == str(user_store)
assert spack.config.CONFIG.get("config:install_tree:root") == initial_store

View File

@@ -177,7 +177,7 @@ def _set_wrong_cc(x):
def test_setup_dependent_package_inherited_modules(
config, working_env, mock_packages, install_mockery, mock_fetch
working_env, mock_packages, install_mockery, mock_fetch
):
# This will raise on regression
s = spack.spec.Spec("cmake-client-inheritor").concretized()
@@ -457,14 +457,14 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
# a foobar=bar (parallel = False)
# |
# b (parallel =True)
s = default_mock_concretization("a foobar=bar")
s = default_mock_concretization("pkg-a foobar=bar")
spack.build_environment.set_package_py_globals(s.package, context=Context.BUILD)
assert s["a"].package.module.make_jobs == 1
assert s["pkg-a"].package.module.make_jobs == 1
spack.build_environment.set_package_py_globals(s["b"].package, context=Context.BUILD)
assert s["b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["b"].package.parallel
spack.build_environment.set_package_py_globals(s["pkg-b"].package, context=Context.BUILD)
assert s["pkg-b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["pkg-b"].package.parallel
)

View File

@@ -94,10 +94,10 @@ def test_negative_ninja_check(self, input_dir, test_dir, concretize_and_setup):
@pytest.mark.not_on_windows("autotools not available on windows")
@pytest.mark.usefixtures("config", "mock_packages")
@pytest.mark.usefixtures("mock_packages")
class TestAutotoolsPackage:
def test_with_or_without(self, default_mock_concretization):
s = default_mock_concretization("a")
s = default_mock_concretization("pkg-a")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature
@@ -129,7 +129,7 @@ def activate(value):
assert "--without-lorem-ipsum" in options
def test_none_is_allowed(self, default_mock_concretization):
s = default_mock_concretization("a foo=none")
s = default_mock_concretization("pkg-a foo=none")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature
@@ -139,11 +139,9 @@ def test_none_is_allowed(self, default_mock_concretization):
assert "--without-baz" in options
assert "--no-fee" in options
def test_libtool_archive_files_are_deleted_by_default(
self, default_mock_concretization, mutable_database
):
def test_libtool_archive_files_are_deleted_by_default(self, mutable_database):
# Install a package that creates a mock libtool archive
s = default_mock_concretization("libtool-deletion")
s = Spec("libtool-deletion").concretized()
s.package.do_install(explicit=True)
# Assert the libtool archive is not there and we have
@@ -154,25 +152,23 @@ def test_libtool_archive_files_are_deleted_by_default(
assert libtool_deletion_log
def test_libtool_archive_files_might_be_installed_on_demand(
self, mutable_database, monkeypatch, default_mock_concretization
self, mutable_database, monkeypatch
):
# Install a package that creates a mock libtool archive,
# patch its package to preserve the installation
s = default_mock_concretization("libtool-deletion")
s = Spec("libtool-deletion").concretized()
monkeypatch.setattr(type(s.package.builder), "install_libtool_archives", True)
s.package.do_install(explicit=True)
# Assert libtool archives are installed
assert os.path.exists(s.package.builder.libtool_archive_file)
def test_autotools_gnuconfig_replacement(self, default_mock_concretization, mutable_database):
def test_autotools_gnuconfig_replacement(self, mutable_database):
"""
Tests whether only broken config.sub and config.guess are replaced with
files from working alternatives from the gnuconfig package.
"""
s = default_mock_concretization(
"autotools-config-replacement +patch_config_files +gnuconfig"
)
s = Spec("autotools-config-replacement +patch_config_files +gnuconfig").concretized()
s.package.do_install()
with open(os.path.join(s.prefix.broken, "config.sub")) as f:
@@ -187,15 +183,11 @@ def test_autotools_gnuconfig_replacement(self, default_mock_concretization, muta
with open(os.path.join(s.prefix.working, "config.guess")) as f:
assert "gnuconfig version of config.guess" not in f.read()
def test_autotools_gnuconfig_replacement_disabled(
self, default_mock_concretization, mutable_database
):
def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):
"""
Tests that disabling patch_config_files skips the config.sub and config.guess replacement.
"""
s = default_mock_concretization(
"autotools-config-replacement ~patch_config_files +gnuconfig"
)
s = Spec("autotools-config-replacement ~patch_config_files +gnuconfig").concretized()
s.package.do_install()
with open(os.path.join(s.prefix.broken, "config.sub")) as f:

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import os
import subprocess
@@ -11,15 +10,12 @@
import llnl.util.filesystem as fs
import spack.ci as ci
import spack.ci_needs_workaround as cinw
import spack.ci_optimization as ci_opt
import spack.config
import spack.environment as ev
import spack.error
import spack.paths as spack_paths
import spack.util.git
import spack.util.gpg
import spack.util.spack_yaml as syaml
@pytest.fixture
@@ -203,165 +199,7 @@ def __call__(self, *args, **kwargs):
assert "Unable to merge {0}".format(c1) in err
@pytest.mark.parametrize("obj, proto", [({}, [])])
def test_ci_opt_argument_checking(obj, proto):
"""Check that matches() and subkeys() return False when `proto` is not a dict."""
assert not ci_opt.matches(obj, proto)
assert not ci_opt.subkeys(obj, proto)
@pytest.mark.parametrize("yaml", [{"extends": 1}])
def test_ci_opt_add_extends_non_sequence(yaml):
"""Check that add_extends() exits if 'extends' is not a sequence."""
yaml_copy = yaml.copy()
ci_opt.add_extends(yaml, None)
assert yaml == yaml_copy
def test_ci_workarounds():
fake_root_spec = "x" * 544
fake_spack_ref = "x" * 40
common_variables = {"SPACK_IS_PR_PIPELINE": "False"}
common_before_script = [
'git clone "https://github.com/spack/spack"',
" && ".join(("pushd ./spack", 'git checkout "{ref}"'.format(ref=fake_spack_ref), "popd")),
'. "./spack/share/spack/setup-env.sh"',
]
def make_build_job(name, deps, stage, use_artifact_buildcache, optimize, use_dependencies):
variables = common_variables.copy()
variables["SPACK_JOB_SPEC_PKG_NAME"] = name
result = {
"stage": stage,
"tags": ["tag-0", "tag-1"],
"artifacts": {
"paths": ["jobs_scratch_dir", "cdash_report", name + ".spec.json", name],
"when": "always",
},
"retry": {"max": 2, "when": ["always"]},
"after_script": ['rm -rf "./spack"'],
"script": ["spack ci rebuild"],
"image": {"name": "spack/centos7", "entrypoint": [""]},
}
if optimize:
result["extends"] = [".c0", ".c1"]
else:
variables["SPACK_ROOT_SPEC"] = fake_root_spec
result["before_script"] = common_before_script
result["variables"] = variables
if use_dependencies:
result["dependencies"] = list(deps) if use_artifact_buildcache else []
else:
result["needs"] = [{"job": dep, "artifacts": use_artifact_buildcache} for dep in deps]
return {name: result}
def make_rebuild_index_job(use_artifact_buildcache, optimize, use_dependencies):
result = {
"stage": "stage-rebuild-index",
"script": "spack buildcache update-index s3://mirror",
"tags": ["tag-0", "tag-1"],
"image": {"name": "spack/centos7", "entrypoint": [""]},
"after_script": ['rm -rf "./spack"'],
}
if optimize:
result["extends"] = ".c0"
else:
result["before_script"] = common_before_script
return {"rebuild-index": result}
def make_factored_jobs(optimize):
return (
{
".c0": {"before_script": common_before_script},
".c1": {"variables": {"SPACK_ROOT_SPEC": fake_root_spec}},
}
if optimize
else {}
)
def make_stage_list(num_build_stages):
return {
"stages": (
["-".join(("stage", str(i))) for i in range(num_build_stages)]
+ ["stage-rebuild-index"]
)
}
def make_yaml_obj(use_artifact_buildcache, optimize, use_dependencies):
result = {}
result.update(
make_build_job(
"pkg-a", [], "stage-0", use_artifact_buildcache, optimize, use_dependencies
)
)
result.update(
make_build_job(
"pkg-b", ["pkg-a"], "stage-1", use_artifact_buildcache, optimize, use_dependencies
)
)
result.update(
make_build_job(
"pkg-c",
["pkg-a", "pkg-b"],
"stage-2",
use_artifact_buildcache,
optimize,
use_dependencies,
)
)
result.update(make_rebuild_index_job(use_artifact_buildcache, optimize, use_dependencies))
result.update(make_factored_jobs(optimize))
result.update(make_stage_list(3))
return result
# test every combination of:
# use artifact buildcache: true or false
# run optimization pass: true or false
# convert needs to dependencies: true or false
for use_ab in (False, True):
original = make_yaml_obj(
use_artifact_buildcache=use_ab, optimize=False, use_dependencies=False
)
for opt, deps in itertools.product(*(((False, True),) * 2)):
# neither optimizing nor converting needs->dependencies
if not (opt or deps):
# therefore, nothing to test
continue
predicted = make_yaml_obj(
use_artifact_buildcache=use_ab, optimize=opt, use_dependencies=deps
)
actual = original.copy()
if opt:
actual = ci_opt.optimizer(actual)
if deps:
actual = cinw.needs_to_dependencies(actual)
predicted = syaml.dump_config(ci_opt.sort_yaml_obj(predicted), default_flow_style=True)
actual = syaml.dump_config(ci_opt.sort_yaml_obj(actual), default_flow_style=True)
assert predicted == actual
def test_get_spec_filter_list(mutable_mock_env_path, config, mutable_mock_repo):
def test_get_spec_filter_list(mutable_mock_env_path, mutable_mock_repo):
"""Test that given an active environment and list of touched pkgs,
we get the right list of possibly-changed env specs"""
e1 = ev.create("test")
@@ -415,7 +253,7 @@ def test_get_spec_filter_list(mutable_mock_env_path, config, mutable_mock_repo):
@pytest.mark.regression("29947")
def test_affected_specs_on_first_concretization(mutable_mock_env_path, mock_packages, config):
def test_affected_specs_on_first_concretization(mutable_mock_env_path, mock_packages):
e = ev.create("first_concretization")
e.add("mpileaks~shared")
e.add("mpileaks+shared")
@@ -484,12 +322,12 @@ def test_ci_run_standalone_tests_missing_requirements(
@pytest.mark.not_on_windows("Reliance on bash script not supported on Windows")
def test_ci_run_standalone_tests_not_installed_junit(
tmp_path, repro_dir, working_env, default_mock_concretization, mock_test_stage, capfd
tmp_path, repro_dir, working_env, mock_test_stage, capfd, mock_packages
):
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": default_mock_concretization("printing-package"),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"repro_dir": str(repro_dir),
"fail_fast": True,
}
@@ -502,13 +340,13 @@ def test_ci_run_standalone_tests_not_installed_junit(
@pytest.mark.not_on_windows("Reliance on bash script not supported on Windows")
def test_ci_run_standalone_tests_not_installed_cdash(
tmp_path, repro_dir, working_env, default_mock_concretization, mock_test_stage, capfd
tmp_path, repro_dir, working_env, mock_test_stage, capfd, mock_packages
):
"""Test run_standalone_tests with cdash and related options."""
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": default_mock_concretization("printing-package"),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"repro_dir": str(repro_dir),
}

View File

@@ -48,11 +48,6 @@ def mock_get_specs_multiarch(database, monkeypatch):
monkeypatch.setattr(spack.binary_distribution, "update_cache_and_get_specs", lambda: specs)
def test_buildcache_preview_just_runs():
# TODO: remove in Spack 0.21
buildcache("preview", "mpileaks")
@pytest.mark.db
@pytest.mark.regression("13757")
def test_buildcache_list_duplicates(mock_get_specs, capsys):
@@ -189,12 +184,7 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
def test_buildcache_sync(
mutable_mock_env_path,
install_mockery_mutable_config,
mock_packages,
mock_fetch,
mock_stage,
tmpdir,
mutable_mock_env_path, install_mockery, mock_packages, mock_fetch, mock_stage, tmpdir
):
"""
Make sure buildcache sync works in an environment-aware manner, ignoring
@@ -323,7 +313,7 @@ def manifest_insert(manifest, spec, dest_url):
def test_buildcache_create_install(
mutable_mock_env_path,
install_mockery_mutable_config,
install_mockery,
mock_packages,
mock_fetch,
mock_stage,

View File

@@ -83,7 +83,6 @@ def test_checksum_args(arguments, expected):
assert check == expected
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
@pytest.mark.parametrize(
"arguments,expected",
[

View File

@@ -106,24 +106,24 @@ def test_specs_staging(config, tmpdir):
"""
builder = repo.MockRepositoryBuilder(tmpdir)
builder.add_package("g")
builder.add_package("f")
builder.add_package("e")
builder.add_package("d", dependencies=[("f", None, None), ("g", None, None)])
builder.add_package("c")
builder.add_package("b", dependencies=[("d", None, None), ("e", None, None)])
builder.add_package("a", dependencies=[("b", None, None), ("c", None, None)])
builder.add_package("pkg-g")
builder.add_package("pkg-f")
builder.add_package("pkg-e")
builder.add_package("pkg-d", dependencies=[("pkg-f", None, None), ("pkg-g", None, None)])
builder.add_package("pkg-c")
builder.add_package("pkg-b", dependencies=[("pkg-d", None, None), ("pkg-e", None, None)])
builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)])
with repo.use_repositories(builder.root):
spec_a = Spec("a").concretized()
spec_a = Spec("pkg-a").concretized()
spec_a_label = ci._spec_ci_label(spec_a)
spec_b_label = ci._spec_ci_label(spec_a["b"])
spec_c_label = ci._spec_ci_label(spec_a["c"])
spec_d_label = ci._spec_ci_label(spec_a["d"])
spec_e_label = ci._spec_ci_label(spec_a["e"])
spec_f_label = ci._spec_ci_label(spec_a["f"])
spec_g_label = ci._spec_ci_label(spec_a["g"])
spec_b_label = ci._spec_ci_label(spec_a["pkg-b"])
spec_c_label = ci._spec_ci_label(spec_a["pkg-c"])
spec_d_label = ci._spec_ci_label(spec_a["pkg-d"])
spec_e_label = ci._spec_ci_label(spec_a["pkg-e"])
spec_f_label = ci._spec_ci_label(spec_a["pkg-f"])
spec_g_label = ci._spec_ci_label(spec_a["pkg-g"])
spec_labels, dependencies, stages = ci.stage_spec_jobs([spec_a])
@@ -748,7 +748,7 @@ def test_ci_rebuild_mock_success(
tmpdir,
working_env,
mutable_mock_env_path,
install_mockery_mutable_config,
install_mockery,
mock_gnupghome,
mock_stage,
mock_fetch,
@@ -782,7 +782,7 @@ def test_ci_rebuild_mock_failure_to_push(
tmpdir,
working_env,
mutable_mock_env_path,
install_mockery_mutable_config,
install_mockery,
mock_gnupghome,
mock_stage,
mock_fetch,
@@ -820,7 +820,7 @@ def test_ci_rebuild(
tmpdir,
working_env,
mutable_mock_env_path,
install_mockery_mutable_config,
install_mockery,
mock_packages,
monkeypatch,
mock_gnupghome,
@@ -1019,7 +1019,7 @@ def fake_dl_method(spec, *args, **kwargs):
def test_ci_generate_mirror_override(
tmpdir,
mutable_mock_env_path,
install_mockery_mutable_config,
install_mockery,
mock_packages,
mock_fetch,
mock_stage,
@@ -1104,7 +1104,7 @@ def test_ci_generate_mirror_override(
def test_push_to_build_cache(
tmpdir,
mutable_mock_env_path,
install_mockery_mutable_config,
install_mockery,
mock_packages,
mock_fetch,
mock_stage,
@@ -1290,7 +1290,7 @@ def test_ci_generate_override_runner_attrs(
spack:
specs:
- flatten-deps
- a
- pkg-a
mirrors:
some-mirror: https://my.fake.mirror
ci:
@@ -1307,12 +1307,12 @@ def test_ci_generate_override_runner_attrs(
- match:
- dependency-install
- match:
- a
- pkg-a
build-job:
tags:
- specific-a-2
- match:
- a
- pkg-a
build-job-remove:
tags:
- toplevel2
@@ -1372,8 +1372,8 @@ def test_ci_generate_override_runner_attrs(
assert global_vars["SPACK_CHECKOUT_VERSION"] == git_version or "v0.20.0.test0"
for ci_key in yaml_contents.keys():
if ci_key.startswith("a"):
# Make sure a's attributes override variables, and all the
if ci_key.startswith("pkg-a"):
# Make sure pkg-a's attributes override variables, and all the
# scripts. Also, make sure the 'toplevel' tag doesn't
# appear twice, but that a's specific extra tag does appear
the_elt = yaml_contents[ci_key]
@@ -1432,55 +1432,6 @@ def test_ci_generate_override_runner_attrs(
assert the_elt["after_script"][0] == "post step one"
def test_ci_generate_with_workarounds(
tmpdir, mutable_mock_env_path, install_mockery, mock_packages, monkeypatch, ci_base_environment
):
"""Make sure the post-processing cli workarounds do what they should"""
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
specs:
- callpath%gcc@=9.5
mirrors:
some-mirror: https://my.fake.mirror
ci:
pipeline-gen:
- submapping:
- match: ['%gcc@9.5']
build-job:
tags:
- donotcare
image: donotcare
enable-artifacts-buildcache: true
"""
)
with tmpdir.as_cwd():
env_cmd("create", "test", "./spack.yaml")
outputfile = str(tmpdir.join(".gitlab-ci.yml"))
with ev.read("test"):
ci_cmd("generate", "--output-file", outputfile, "--dependencies")
with open(outputfile) as f:
contents = f.read()
yaml_contents = syaml.load(contents)
found_one = False
non_rebuild_keys = ["workflow", "stages", "variables", "rebuild-index"]
for ci_key in yaml_contents.keys():
if ci_key not in non_rebuild_keys:
found_one = True
job_obj = yaml_contents[ci_key]
assert "needs" not in job_obj
assert "dependencies" in job_obj
assert found_one is True
@pytest.mark.disable_clean_stage_check
def test_ci_rebuild_index(
tmpdir,
@@ -1830,7 +1781,7 @@ def test_ci_generate_read_broken_specs_url(
tmpdir, mutable_mock_env_path, install_mockery, mock_packages, monkeypatch, ci_base_environment
):
"""Verify that `broken-specs-url` works as intended"""
spec_a = Spec("a")
spec_a = Spec("pkg-a")
spec_a.concretize()
a_dag_hash = spec_a.dag_hash()
@@ -1856,7 +1807,7 @@ def test_ci_generate_read_broken_specs_url(
spack:
specs:
- flatten-deps
- a
- pkg-a
mirrors:
some-mirror: https://my.fake.mirror
ci:
@@ -1864,9 +1815,9 @@ def test_ci_generate_read_broken_specs_url(
pipeline-gen:
- submapping:
- match:
- a
- pkg-a
- flatten-deps
- b
- pkg-b
- dependency-install
build-job:
tags:

View File

@@ -72,7 +72,6 @@ def test_parse_spec_flags_with_spaces(specs, cflags, propagation, negated_varian
assert "~{0}".format(v) in s
@pytest.mark.usefixtures("config")
def test_match_spec_env(mock_packages, mutable_mock_env_path):
"""
Concretize a spec with non-default options in an environment. Make
@@ -81,44 +80,42 @@ def test_match_spec_env(mock_packages, mutable_mock_env_path):
"""
# Initial sanity check: we are planning on choosing a non-default
# value, so make sure that is in fact not the default.
check_defaults = spack.cmd.parse_specs(["a"], concretize=True)[0]
check_defaults = spack.cmd.parse_specs(["pkg-a"], concretize=True)[0]
assert not check_defaults.satisfies("foobar=baz")
e = ev.create("test")
e.add("a foobar=baz")
e.add("pkg-a foobar=baz")
e.concretize()
with e:
env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0])
assert env_spec.satisfies("foobar=baz")
assert env_spec.concrete
@pytest.mark.usefixtures("config")
def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path):
e = ev.create("test")
e.add("a foobar=baz")
e.add("a foobar=fee")
e.add("pkg-a foobar=baz")
e.add("pkg-a foobar=fee")
e.concretize()
with e:
with pytest.raises(ev.SpackEnvironmentError) as exc_info:
spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0])
assert "matches multiple specs" in exc_info.value.message
@pytest.mark.usefixtures("config")
def test_root_and_dep_match_returns_root(mock_packages, mutable_mock_env_path):
e = ev.create("test")
e.add("b@0.9")
e.add("a foobar=bar") # Depends on b, should choose b@1.0
e.add("pkg-b@0.9")
e.add("pkg-a foobar=bar") # Depends on b, should choose b@1.0
e.concretize()
with e:
# This query matches the root b and b as a dependency of a. In that
# case the root instance should be preferred.
env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b"])[0])
env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b"])[0])
assert env_spec1.satisfies("@0.9")
env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b@1.0"])[0])
env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b@1.0"])[0])
assert env_spec2

View File

@@ -10,7 +10,7 @@
from spack import spack_version
from spack.main import SpackCommand
pytestmark = pytest.mark.usefixtures("config", "mutable_mock_repo")
pytestmark = pytest.mark.usefixtures("mutable_config", "mutable_mock_repo")
env = SpackCommand("env")
add = SpackCommand("add")
@@ -51,8 +51,8 @@ def test_concretize_root_test_dependencies_are_concretized(unify, mutable_mock_e
with ev.read("test") as e:
e.unify = unify
add("a")
add("b")
add("pkg-a")
add("pkg-b")
concretize("--test", "root")
assert e.matching_spec("test-dependency")

View File

@@ -640,4 +640,4 @@ def update_config(data):
config("update", "-y", "config")
with ev.Environment(str(tmpdir)) as e:
assert not e.manifest.pristine_yaml_content["spack"]["config"]["ccache"]
assert not e.manifest.yaml_content["spack"]["config"]["ccache"]

View File

@@ -12,29 +12,29 @@
@pytest.fixture(scope="function")
def test_env(mutable_mock_env_path, config, mock_packages):
def test_env(mutable_mock_env_path, mock_packages):
ev.create("test")
with ev.read("test") as e:
e.add("a@2.0 foobar=bar ^b@1.0")
e.add("a@1.0 foobar=bar ^b@0.9")
e.add("pkg-a@2.0 foobar=bar ^pkg-b@1.0")
e.add("pkg-a@1.0 foobar=bar ^pkg-b@0.9")
e.concretize()
e.write()
def test_deconcretize_dep(test_env):
with ev.read("test") as e:
deconcretize("-y", "b@1.0")
deconcretize("-y", "pkg-b@1.0")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 1
assert specs[0].satisfies("a@1.0")
assert specs[0].satisfies("pkg-a@1.0")
def test_deconcretize_all_dep(test_env):
with ev.read("test") as e:
with pytest.raises(SpackCommandError):
deconcretize("-y", "b")
deconcretize("-y", "--all", "b")
deconcretize("-y", "pkg-b")
deconcretize("-y", "--all", "pkg-b")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 0
@@ -42,27 +42,27 @@ def test_deconcretize_all_dep(test_env):
def test_deconcretize_root(test_env):
with ev.read("test") as e:
output = deconcretize("-y", "--root", "b@1.0")
output = deconcretize("-y", "--root", "pkg-b@1.0")
assert "No matching specs to deconcretize" in output
assert len(e.concretized_order) == 2
deconcretize("-y", "--root", "a@2.0")
deconcretize("-y", "--root", "pkg-a@2.0")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 1
assert specs[0].satisfies("a@1.0")
assert specs[0].satisfies("pkg-a@1.0")
def test_deconcretize_all_root(test_env):
with ev.read("test") as e:
with pytest.raises(SpackCommandError):
deconcretize("-y", "--root", "a")
deconcretize("-y", "--root", "pkg-a")
output = deconcretize("-y", "--root", "--all", "b")
output = deconcretize("-y", "--root", "--all", "pkg-b")
assert "No matching specs to deconcretize" in output
assert len(e.concretized_order) == 2
deconcretize("-y", "--root", "--all", "a")
deconcretize("-y", "--root", "--all", "pkg-a")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 0

View File

@@ -14,8 +14,6 @@
deprecate = SpackCommand("deprecate")
find = SpackCommand("find")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
def test_deprecate(mock_packages, mock_archive, mock_fetch, install_mockery):
install("libelf@0.8.13")

View File

@@ -20,10 +20,7 @@
install = SpackCommand("install")
env = SpackCommand("env")
pytestmark = [
pytest.mark.not_on_windows("does not run on windows"),
pytest.mark.disable_clean_stage_check,
]
pytestmark = [pytest.mark.disable_clean_stage_check]
def test_dev_build_basics(tmpdir, install_mockery):
@@ -96,7 +93,7 @@ def test_dev_build_until_last_phase(tmpdir, install_mockery):
assert os.path.exists(str(tmpdir))
def test_dev_build_before_until(tmpdir, install_mockery, capsys):
def test_dev_build_before_until(tmpdir, install_mockery):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
with tmpdir.as_cwd():
@@ -129,7 +126,7 @@ def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery,
monkeypatch.setattr(os, "execvp", print_spack_cc)
with tmpdir.as_cwd():
output = dev_build("-b", "edit", "--drop-in", "sh", "dev-build-test-install@0.0.0")
assert "lib/spack/env" in output
assert os.path.join("lib", "spack", "env") in output
def test_dev_build_fails_already_installed(tmpdir, install_mockery):

View File

@@ -181,7 +181,6 @@ def test_diff_cmd(install_mockery, mock_fetch, mock_archive, mock_packages):
assert ["hash", "mpileaks %s" % specB.dag_hash()] in c["b_not_a"]
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
def test_load_first(install_mockery, mock_fetch, mock_archive, mock_packages):
"""Test with and without the --first option"""
install_cmd("mpileaks")

View File

@@ -15,9 +15,9 @@
def test_edit_packages(monkeypatch, mock_packages: spack.repo.RepoPath):
"""Test spack edit a b"""
path_a = mock_packages.filename_for_package_name("a")
path_b = mock_packages.filename_for_package_name("b")
"""Test spack edit pkg-a pkg-b"""
path_a = mock_packages.filename_for_package_name("pkg-a")
path_b = mock_packages.filename_for_package_name("pkg-b")
called = False
def editor(*args: str, **kwargs):
@@ -27,7 +27,7 @@ def editor(*args: str, **kwargs):
assert args[1] == path_b
monkeypatch.setattr(spack.util.editor, "editor", editor)
edit("a", "b")
edit("pkg-a", "pkg-b")
assert called

View File

@@ -28,7 +28,9 @@
import spack.package_base
import spack.paths
import spack.repo
import spack.store
import spack.util.spack_json as sjson
import spack.util.spack_yaml
from spack.cmd.env import _env_create
from spack.main import SpackCommand, SpackCommandError
from spack.spec import Spec
@@ -40,7 +42,7 @@
# TODO-27021
# everything here uses the mock_env_path
pytestmark = [
pytest.mark.usefixtures("mutable_mock_env_path", "config", "mutable_mock_repo"),
pytest.mark.usefixtures("mutable_config", "mutable_mock_env_path", "mutable_mock_repo"),
pytest.mark.maybeslow,
pytest.mark.not_on_windows("Envs unsupported on Window"),
]
@@ -501,7 +503,7 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
"""\
spack:
specs:
- a
- pkg-a
- depb
"""
)
@@ -520,8 +522,8 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
depb = spack.store.STORE.db.query_one("depb", installed=True)
assert depb, "Expected depb to be installed"
a = spack.store.STORE.db.query_one("a", installed=True)
assert a, "Expected a to be installed"
a = spack.store.STORE.db.query_one("pkg-a", installed=True)
assert a, "Expected pkg-a to be installed"
def test_remove_after_concretize():
@@ -812,7 +814,6 @@ def test_init_from_yaml(environment_from_manifest):
assert not e2.specs_by_hash
@pytest.mark.usefixtures("config")
def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
fake_prefix = tmp_path / "a-prefix"
fake_bin = fake_prefix / "bin"
@@ -825,7 +826,7 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
"""\
spack:
specs:
- a
- pkg-a
view: true
"""
)
@@ -833,9 +834,9 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
external_config = io.StringIO(
"""\
packages:
a:
pkg-a:
externals:
- spec: a@2.0
- spec: pkg-a@2.0
prefix: {a_prefix}
buildable: false
""".format(
@@ -1559,7 +1560,6 @@ def test_uninstall_removes_from_env(mock_stage, mock_fetch, install_mockery):
assert not test.user_specs
@pytest.mark.usefixtures("config")
def test_indirect_build_dep(tmp_path):
"""Simple case of X->Y->Z where Y is a build/link dep and Z is a
build-only dep. Make sure this concrete DAG is preserved when writing the
@@ -1587,7 +1587,6 @@ def test_indirect_build_dep(tmp_path):
assert x_env_spec == x_concretized
@pytest.mark.usefixtures("config")
def test_store_different_build_deps(tmp_path):
r"""Ensure that an environment can store two instances of a build-only
dependency::
@@ -2328,7 +2327,7 @@ def test_stack_yaml_force_remove_from_matrix(tmpdir):
assert mpileaks_spec not in after_conc
def test_stack_concretize_extraneous_deps(tmpdir, config, mock_packages):
def test_stack_concretize_extraneous_deps(tmpdir, mock_packages):
# FIXME: The new concretizer doesn't yet handle soft
# FIXME: constraints for stacks
# FIXME: This now works for statically-determinable invalid deps
@@ -2367,7 +2366,7 @@ def test_stack_concretize_extraneous_deps(tmpdir, config, mock_packages):
assert concrete.satisfies("^mpi")
def test_stack_concretize_extraneous_variants(tmpdir, config, mock_packages):
def test_stack_concretize_extraneous_variants(tmpdir, mock_packages):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
@@ -2399,7 +2398,7 @@ def test_stack_concretize_extraneous_variants(tmpdir, config, mock_packages):
assert concrete.variants["shared"].value == user.variants["shared"].value
def test_stack_concretize_extraneous_variants_with_dash(tmpdir, config, mock_packages):
def test_stack_concretize_extraneous_variants_with_dash(tmpdir, mock_packages):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
@@ -3750,7 +3749,7 @@ def test_environment_query_spec_by_hash(mock_stage, mock_fetch, install_mockery)
@pytest.mark.parametrize("lockfile", ["v1", "v2", "v3"])
def test_read_old_lock_and_write_new(config, tmpdir, lockfile):
def test_read_old_lock_and_write_new(tmpdir, lockfile):
# v1 lockfiles stored by a coarse DAG hash that did not include build deps.
# They could not represent multiple build deps with different build hashes.
#
@@ -3816,7 +3815,7 @@ def test_read_old_lock_and_write_new(config, tmpdir, lockfile):
assert old_hashes == hashes
def test_read_v1_lock_creates_backup(config, tmp_path):
def test_read_v1_lock_creates_backup(tmp_path):
"""When reading a version-1 lockfile, make sure that a backup of that file
is created.
"""
@@ -4199,7 +4198,7 @@ def test_env_include_packages_url(
assert "openmpi" in cfg["all"]["providers"]["mpi"]
def test_relative_view_path_on_command_line_is_made_absolute(tmp_path, config):
def test_relative_view_path_on_command_line_is_made_absolute(tmp_path):
with fs.working_dir(str(tmp_path)):
env("create", "--with-view", "view", "--dir", "env")
environment = ev.Environment(os.path.join(".", "env"))

View File

@@ -24,7 +24,7 @@ def python_database(mock_packages, mutable_database):
@pytest.mark.not_on_windows("All Fetchers Failed")
@pytest.mark.db
def test_extensions(mock_packages, python_database, config, capsys):
def test_extensions(mock_packages, python_database, capsys):
ext2 = Spec("py-extension2").concretized()
def check_output(ni):

View File

@@ -11,6 +11,7 @@
from llnl.util.filesystem import getuid, touch
import spack
import spack.cmd.external
import spack.detection
import spack.detection.path
from spack.main import SpackCommand
@@ -311,3 +312,29 @@ def test_failures_in_scanning_do_not_result_in_an_error(
assert "cmake" in output
assert "3.23.3" in output
assert "3.19.1" not in output
def test_detect_virtuals(mock_executable, mutable_config, monkeypatch):
"""Test whether external find --not-buildable sets virtuals as non-buildable (unless user
config sets them to buildable)"""
mpich = mock_executable("mpichversion", output="echo MPICH Version: 4.0.2")
prefix = os.path.dirname(mpich)
external("find", "--path", prefix, "--not-buildable", "mpich")
# Check that mpich was correctly detected
mpich = mutable_config.get("packages:mpich")
assert mpich["buildable"] is False
assert Spec(mpich["externals"][0]["spec"]).satisfies("mpich@4.0.2")
# Check that the virtual package mpi was marked as non-buildable
assert mutable_config.get("packages:mpi:buildable") is False
# Delete the mpich entry, and set mpi explicitly to buildable
mutable_config.set("packages:mpich", {})
mutable_config.set("packages:mpi:buildable", True)
# Run the detection again
external("find", "--path", prefix, "--not-buildable", "mpich")
# Check that the mpi:buildable entry was not overwritten
assert mutable_config.get("packages:mpi:buildable") is True
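
For reference, a minimal sketch of the kind of packages configuration this test expects `external find --path <prefix> --not-buildable mpich` to produce; the prefix below is illustrative, and only the version comes from the mocked `mpichversion` output:

```
# Sketch of the expected `packages` config after detection (illustrative prefix):
expected = {
    "mpich": {
        "buildable": False,
        "externals": [{"spec": "mpich@4.0.2", "prefix": "/illustrative/prefix"}],
    },
    # The virtual provided by mpich is locked down as well:
    "mpi": {"buildable": False},
}
```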

@@ -9,7 +9,9 @@
from spack.main import SpackCommand, SpackCommandError
# everything here uses the mock_env_path
pytestmark = pytest.mark.usefixtures("mutable_mock_env_path", "config", "mutable_mock_repo")
pytestmark = pytest.mark.usefixtures(
"mutable_mock_env_path", "mutable_config", "mutable_mock_repo"
)
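
The swap from `config` to `mutable_config` here matters because a module-level `pytestmark` applies its marks to every test in the file. A self-contained sketch of that pytest mechanism (the fixture name is invented for illustration):

```
import pytest

@pytest.fixture
def demo_env():  # invented fixture, for illustration only
    yield {"ready": True}

# A module-level pytestmark applies its marks to every test in this file,
# so each test gets demo_env without naming it as a parameter.
pytestmark = pytest.mark.usefixtures("demo_env")

def test_runs_with_fixture():
    assert True
```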
@pytest.mark.disable_clean_stage_check

@@ -337,7 +337,7 @@ def test_find_command_basic_usage(database):
@pytest.mark.not_on_windows("envirnment is not yet supported on windows")
@pytest.mark.regression("9875")
def test_find_prefix_in_env(
mutable_mock_env_path, install_mockery, mock_fetch, mock_packages, mock_archive, config
mutable_mock_env_path, install_mockery, mock_fetch, mock_packages, mock_archive
):
"""Test `find` formats requiring concrete specs work in environments."""
env("create", "test")
@@ -349,7 +349,7 @@ def test_find_prefix_in_env(
# Would throw error on regression
def test_find_specs_include_concrete_env(mutable_mock_env_path, config, mutable_mock_repo, tmpdir):
def test_find_specs_include_concrete_env(mutable_mock_env_path, mutable_mock_repo, tmpdir):
path = tmpdir.join("spack.yaml")
with tmpdir.as_cwd():
@@ -393,9 +393,7 @@ def test_find_specs_include_concrete_env(mutable_mock_env_path, config, mutable_
assert "libelf" in output
def test_find_specs_nested_include_concrete_env(
mutable_mock_env_path, config, mutable_mock_repo, tmpdir
):
def test_find_specs_nested_include_concrete_env(mutable_mock_env_path, mutable_mock_repo, tmpdir):
path = tmpdir.join("spack.yaml")
with tmpdir.as_cwd():
@@ -434,7 +432,7 @@ def test_find_loaded(database, working_env):
output = find("--loaded", "--group")
assert output == ""
os.environ[uenv.spack_loaded_hashes_var] = ":".join(
os.environ[uenv.spack_loaded_hashes_var] = os.pathsep.join(
[x.dag_hash() for x in spack.store.STORE.db.query()]
)
output = find("--loaded")
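
The join-separator change above is what makes the loaded-hashes variable portable: `os.pathsep` is `:` on POSIX and `;` on Windows, matching how PATH-style variables are split on each platform. A quick sketch:

```
import os

hashes = ["abc123", "def456"]  # stand-in DAG hashes
joined = os.pathsep.join(hashes)  # "abc123:def456" on POSIX, "abc123;def456" on Windows
assert joined.split(os.pathsep) == hashes
```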

@@ -20,13 +20,13 @@
@pytest.mark.db
def test_gc_without_build_dependency(config, mutable_database):
def test_gc_without_build_dependency(mutable_database):
assert "There are no unused specs." in gc("-yb")
assert "There are no unused specs." in gc("-y")
@pytest.mark.db
def test_gc_with_build_dependency(config, mutable_database):
def test_gc_with_build_dependency(mutable_database):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s.package.do_install(fake=True, explicit=True)
@@ -37,7 +37,7 @@ def test_gc_with_build_dependency(config, mutable_database):
@pytest.mark.db
def test_gc_with_environment(config, mutable_database, mutable_mock_env_path):
def test_gc_with_environment(mutable_database, mutable_mock_env_path):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s.package.do_install(fake=True, explicit=True)
@@ -53,7 +53,7 @@ def test_gc_with_environment(config, mutable_database, mutable_mock_env_path):
@pytest.mark.db
def test_gc_with_build_dependency_in_environment(config, mutable_database, mutable_mock_env_path):
def test_gc_with_build_dependency_in_environment(mutable_database, mutable_mock_env_path):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s.package.do_install(fake=True, explicit=True)
@@ -78,7 +78,7 @@ def test_gc_with_build_dependency_in_environment(config, mutable_database, mutab
@pytest.mark.db
def test_gc_except_any_environments(config, mutable_database, mutable_mock_env_path):
def test_gc_except_any_environments(mutable_database, mutable_mock_env_path):
"""Tests whether the garbage collector can remove all specs except those still needed in some
environment (needed in the sense of roots + link/run deps)."""
assert mutable_database.query_local("zmpi")
@@ -105,7 +105,7 @@ def test_gc_except_any_environments(config, mutable_database, mutable_mock_env_p
@pytest.mark.db
def test_gc_except_specific_environments(config, mutable_database, mutable_mock_env_path):
def test_gc_except_specific_environments(mutable_database, mutable_mock_env_path):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s.package.do_install(fake=True, explicit=True)
@@ -125,14 +125,14 @@ def test_gc_except_specific_environments(config, mutable_database, mutable_mock_
@pytest.mark.db
def test_gc_except_nonexisting_dir_env(config, mutable_database, mutable_mock_env_path, tmpdir):
def test_gc_except_nonexisting_dir_env(mutable_database, mutable_mock_env_path, tmpdir):
output = gc("-ye", tmpdir.strpath, fail_on_error=False)
assert "No such environment" in output
assert gc.returncode == 1
@pytest.mark.db
def test_gc_except_specific_dir_env(config, mutable_database, mutable_mock_env_path, tmpdir):
def test_gc_except_specific_dir_env(mutable_database, mutable_mock_env_path, tmpdir):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s.package.do_install(fake=True, explicit=True)

@@ -57,9 +57,9 @@ def test_info_noversion(mock_packages, print_buffer):
@pytest.mark.parametrize(
"pkg_query,expected", [("zlib", "False"), ("gcc", "True (version, variants)")]
"pkg_query,expected", [("zlib", "False"), ("find-externals1", "True (version)")]
)
def test_is_externally_detectable(pkg_query, expected, parser, print_buffer):
def test_is_externally_detectable(mock_packages, pkg_query, expected, parser, print_buffer):
args = parser.parse_args(["--detectable", pkg_query])
spack.cmd.info.info(parser, args)

@@ -49,7 +49,7 @@ def noop(*args, **kwargs):
def test_install_package_and_dependency(
tmpdir, mock_packages, mock_archive, mock_fetch, config, install_mockery
tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery
):
log = "test"
with tmpdir.as_cwd():
@@ -89,11 +89,11 @@ def check(pkg):
assert pkg.run_tests
monkeypatch.setattr(spack.package_base.PackageBase, "unit_test_check", check)
install("--test=all", "a")
install("--test=all", "pkg-a")
def test_install_package_already_installed(
tmpdir, mock_packages, mock_archive, mock_fetch, config, install_mockery
tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery
):
with tmpdir.as_cwd():
install("libdwarf")
@@ -149,7 +149,7 @@ def test_package_output(tmpdir, capsys, install_mockery, mock_fetch):
@pytest.mark.disable_clean_stage_check
def test_install_output_on_build_error(
mock_packages, mock_archive, mock_fetch, config, install_mockery, capfd
mock_packages, mock_archive, mock_fetch, install_mockery, capfd
):
"""
This test used to assume receiving full output, but since we've updated
@@ -163,9 +163,7 @@ def test_install_output_on_build_error(
@pytest.mark.disable_clean_stage_check
def test_install_output_on_python_error(
mock_packages, mock_archive, mock_fetch, config, install_mockery
):
def test_install_output_on_python_error(mock_packages, mock_archive, mock_fetch, install_mockery):
out = install("failing-build", fail_on_error=False)
assert isinstance(install.error, spack.build_environment.ChildError)
assert install.error.name == "InstallError"
@@ -173,7 +171,7 @@ def test_install_output_on_python_error(
@pytest.mark.disable_clean_stage_check
def test_install_with_source(mock_packages, mock_archive, mock_fetch, config, install_mockery):
def test_install_with_source(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Verify that source has been copied into place."""
install("--source", "--keep-stage", "trivial-install-test-package")
spec = Spec("trivial-install-test-package").concretized()
@@ -183,7 +181,7 @@ def test_install_with_source(mock_packages, mock_archive, mock_fetch, config, in
)
def test_install_env_variables(mock_packages, mock_archive, mock_fetch, config, install_mockery):
def test_install_env_variables(mock_packages, mock_archive, mock_fetch, install_mockery):
spec = Spec("libdwarf")
spec.concretize()
install("libdwarf")
@@ -191,9 +189,7 @@ def test_install_env_variables(mock_packages, mock_archive, mock_fetch, config,
@pytest.mark.disable_clean_stage_check
def test_show_log_on_error(
mock_packages, mock_archive, mock_fetch, config, install_mockery, capfd
):
def test_show_log_on_error(mock_packages, mock_archive, mock_fetch, install_mockery, capfd):
"""
Make sure --show-log-on-error works.
"""
@@ -206,7 +202,7 @@ def test_show_log_on_error(
assert "See build log for details:" in out
def test_install_overwrite(mock_packages, mock_archive, mock_fetch, config, install_mockery):
def test_install_overwrite(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
spec = Spec("libdwarf")
spec.concretize()
@@ -240,9 +236,7 @@ def test_install_overwrite(mock_packages, mock_archive, mock_fetch, config, inst
assert fs.hash_directory(spec.prefix, ignore=ignores) != bad_md5
def test_install_overwrite_not_installed(
mock_packages, mock_archive, mock_fetch, config, install_mockery
):
def test_install_overwrite_not_installed(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
spec = Spec("libdwarf")
spec.concretize()
@@ -277,9 +271,7 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
assert content == "[0]" # contents are weird for another test
def test_install_overwrite_multiple(
mock_packages, mock_archive, mock_fetch, config, install_mockery
):
def test_install_overwrite_multiple(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
libdwarf = Spec("libdwarf")
libdwarf.concretize()
@@ -337,18 +329,14 @@ def test_install_overwrite_multiple(
assert cm_hash != bad_cmake_md5
@pytest.mark.usefixtures(
"mock_packages", "mock_archive", "mock_fetch", "config", "install_mockery"
)
@pytest.mark.usefixtures("mock_packages", "mock_archive", "mock_fetch", "install_mockery")
def test_install_conflicts(conflict_spec):
# Make sure that spec with conflicts raises a SpackError
with pytest.raises(SpackError):
install(conflict_spec)
@pytest.mark.usefixtures(
"mock_packages", "mock_archive", "mock_fetch", "config", "install_mockery"
)
@pytest.mark.usefixtures("mock_packages", "mock_archive", "mock_fetch", "install_mockery")
def test_install_invalid_spec(invalid_spec):
# Make sure that invalid specs raise a SpackError
with pytest.raises(SpecSyntaxError, match="unexpected tokens"):
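
As used above, the `match` argument of `pytest.raises` is applied as a regular-expression search against the string form of the raised exception. A self-contained sketch (the parser below is invented for illustration):

```
import pytest

def parse(text):  # invented stand-in for a spec parser
    raise ValueError(f"unexpected tokens in {text!r}")

def test_parse_reports_unexpected_tokens():
    with pytest.raises(ValueError, match="unexpected tokens"):
        parse("libdwarf @@@")
```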
@@ -390,9 +378,7 @@ def test_install_from_file(spec, concretize, error_code, tmpdir):
@pytest.mark.disable_clean_stage_check
@pytest.mark.usefixtures(
"mock_packages", "mock_archive", "mock_fetch", "config", "install_mockery"
)
@pytest.mark.usefixtures("mock_packages", "mock_archive", "mock_fetch", "install_mockery")
@pytest.mark.parametrize(
"exc_typename,msg",
[("RuntimeError", "something weird happened"), ("ValueError", "spec is not concrete")],
@@ -448,7 +434,6 @@ def test_junit_output_with_errors(
mock_archive,
mock_fetch,
install_mockery,
config,
tmpdir,
monkeypatch,
):
@@ -509,9 +494,7 @@ def test_install_mix_cli_and_files(clispecs, filespecs, tmpdir):
assert install.returncode == 0
def test_extra_files_are_archived(
mock_packages, mock_archive, mock_fetch, config, install_mockery
):
def test_extra_files_are_archived(mock_packages, mock_archive, mock_fetch, install_mockery):
s = Spec("archive-files")
s.concretize()
@@ -570,100 +553,96 @@ def test_cdash_upload_build_error(tmpdir, mock_fetch, install_mockery, capfd):
@pytest.mark.disable_clean_stage_check
def test_cdash_upload_clean_build(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capturing of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
install("--log-file=cdash_reports", "--log-format=cdash", "a")
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "</Build>" in content
assert "<Text>" not in content
with capfd.disabled(), tmpdir.as_cwd():
install("--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "</Build>" in content
assert "<Text>" not in content
@pytest.mark.disable_clean_stage_check
def test_cdash_upload_extra_params(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capture of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert 'Site BuildName="my_custom_build - a"' in content
assert 'Name="my_custom_site"' in content
assert "-my_custom_track" in content
with capfd.disabled(), tmpdir.as_cwd():
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"pkg-a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert 'Site BuildName="my_custom_build - pkg-a"' in content
assert 'Name="my_custom_site"' in content
assert "-my_custom_track" in content
@pytest.mark.disable_clean_stage_check
def test_cdash_buildstamp_param(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capture of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
cdash_track = "some_mocked_track"
buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-buildstamp={0}".format(buildstamp),
"a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert buildstamp in content
with capfd.disabled(), tmpdir.as_cwd():
cdash_track = "some_mocked_track"
buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-buildstamp={0}".format(buildstamp),
"pkg-a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert buildstamp in content
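
For context, the buildstamp the test constructs is just a `strftime` timestamp with the track name appended; a standalone sketch of the same construction:

```
import time

track = "some_mocked_track"
stamp = time.strftime("%Y%m%d-%H%M-{0}".format(track), time.localtime())
# e.g. "20240724-1340-some_mocked_track"
```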
@pytest.mark.disable_clean_stage_check
def test_cdash_install_from_spec_json(
tmpdir, mock_fetch, install_mockery, capfd, mock_packages, mock_archive, config
tmpdir, mock_fetch, install_mockery, capfd, mock_packages, mock_archive
):
# capfd interferes with Spack's capturing
with capfd.disabled():
with tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
with capfd.disabled(), tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
pkg_spec = Spec("a")
pkg_spec.concretize()
pkg_spec = Spec("pkg-a")
pkg_spec.concretize()
with open(spec_json_path, "w") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
with open(spec_json_path, "w") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
install(
"--log-format=cdash",
"--log-file=cdash_reports",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"-f",
spec_json_path,
)
install(
"--log-format=cdash",
"--log-file=cdash_reports",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"-f",
spec_json_path,
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
install_command_regex = re.compile(
r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
)
m = install_command_regex.search(content)
assert m
install_command = m.group(1)
assert "a@" in install_command
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
install_command_regex = re.compile(
r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
)
m = install_command_regex.search(content)
assert m
install_command = m.group(1)
assert "pkg-a@" in install_command
@pytest.mark.disable_clean_stage_check
@@ -795,15 +774,15 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# ^libdwarf
# ^mpich
# libelf@0.8.10
# a~bvv
# ^b
# a
# ^b
# pkg-a~bvv
# ^pkg-b
# pkg-a
# ^pkg-b
e = ev.create("test", with_view=False)
e.add("mpileaks")
e.add("libelf@0.8.10") # so env has both root and dep libelf specs
e.add("a")
e.add("a ~bvv")
e.add("pkg-a")
e.add("pkg-a ~bvv")
e.concretize()
e.write()
env_specs = e.all_specs()
@@ -814,9 +793,9 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# First find and remember some target concrete specs in the environment
for e_spec in env_specs:
if e_spec.satisfies(Spec("a ~bvv")):
if e_spec.satisfies(Spec("pkg-a ~bvv")):
a_spec = e_spec
elif e_spec.name == "b":
elif e_spec.name == "pkg-b":
b_spec = e_spec
elif e_spec.satisfies(Spec("mpi")):
mpi_spec = e_spec
@@ -839,8 +818,8 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
assert "You can add specs to the environment with 'spack add " in inst_out
# Without --add, ensure that two packages "a" get installed
inst_out = install("a", output=str)
assert len([x for x in e.all_specs() if x.installed and x.name == "a"]) == 2
inst_out = install("pkg-a", output=str)
assert len([x for x in e.all_specs() if x.installed and x.name == "pkg-a"]) == 2
# Install an unambiguous dependency spec (that already exists as a dep
# in the environment) and make sure it gets installed (w/ deps),
@@ -873,7 +852,7 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# root of the environment as well as installed.
assert b_spec not in e.roots()
install("--add", "b")
install("--add", "pkg-b")
assert b_spec in e.roots()
assert b_spec not in e.uninstalled_specs()
@@ -908,7 +887,7 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
# capfd interferes with Spack's capturing
with tmpdir.as_cwd(), capfd.disabled():
monkeypatch.setenv("SPACK_CDASH_AUTH_TOKEN", "asdf")
out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "a")
out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
assert "Using CDash auth token from environment" in out
@@ -916,54 +895,42 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
@pytest.mark.disable_clean_stage_check
def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capturing of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
# Test would fail if install raised an error.
with capfd.disabled(), tmpdir.as_cwd():
# Test would fail if install raised an error.
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = spack.spec.Spec("configure-warning").concretized()
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w") as f:
f.write(spec.to_json())
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = Spec("configure-warning").concretized()
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w") as f:
f.write(spec.to_json())
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "foo: No such file or directory" in content
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "foo: No such file or directory" in content
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
def test_compiler_bootstrap(
install_mockery_mutable_config,
mock_packages,
mock_fetch,
mock_archive,
mutable_config,
monkeypatch,
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch
):
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()
# Test succeeds if it does not raise an error
install("a%gcc@=12.0")
install("pkg-a%gcc@=12.0")
@pytest.mark.not_on_windows("Binary mirrors not supported on windows")
def test_compiler_bootstrap_from_binary_mirror(
install_mockery_mutable_config,
mock_packages,
mock_fetch,
mock_archive,
mutable_config,
monkeypatch,
tmpdir,
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch, tmpdir
):
"""
Make sure installing a compiler from a buildcache registers it as a compiler
@@ -992,19 +959,14 @@ def test_compiler_bootstrap_from_binary_mirror(
# Now make sure that when the compiler is installed from binary mirror,
# it also gets configured as a compiler. Test succeeds if it does not
# raise an error
install("--no-check-signature", "--cache-only", "--only", "dependencies", "b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "b%gcc@10.2.0")
install("--no-check-signature", "--cache-only", "--only", "dependencies", "pkg-b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "pkg-b%gcc@10.2.0")
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@pytest.mark.regression("16221")
def test_compiler_bootstrap_already_installed(
install_mockery_mutable_config,
mock_packages,
mock_fetch,
mock_archive,
mutable_config,
monkeypatch,
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch
):
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
@@ -1013,7 +975,7 @@ def test_compiler_bootstrap_already_installed(
# Test succeeds if it does not raise an error
install("gcc@=12.0")
install("a%gcc@=12.0")
install("pkg-a%gcc@=12.0")
def test_install_fails_no_args(tmpdir):
@@ -1104,13 +1066,7 @@ def test_installation_fail_tests(install_mockery, mock_fetch, name, method):
@pytest.mark.not_on_windows("Buildcache not supported on windows")
def test_install_use_buildcache(
capsys,
mock_packages,
mock_fetch,
mock_archive,
mock_binary_index,
tmpdir,
install_mockery_mutable_config,
capsys, mock_packages, mock_fetch, mock_archive, mock_binary_index, tmpdir, install_mockery
):
"""
Make sure installing with use-buildcache behaves correctly.
@@ -1183,19 +1139,19 @@ def install_use_buildcache(opt):
@pytest.mark.not_on_windows("Windows logger I/O operation on closed file when install fails")
@pytest.mark.regression("34006")
@pytest.mark.disable_clean_stage_check
def test_padded_install_runtests_root(install_mockery_mutable_config, mock_fetch):
def test_padded_install_runtests_root(install_mockery, mock_fetch):
spack.config.set("config:install_tree:padded_length", 255)
output = install("--test=root", "--no-cache", "test-build-callbacks", fail_on_error=False)
assert output.count("method not implemented") == 1
@pytest.mark.regression("35337")
def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
def test_report_filename_for_cdash(install_mockery, mock_fetch):
"""Test that the temporary file used to write the XML for CDash is not the upload URL"""
parser = argparse.ArgumentParser()
spack.cmd.install.setup_parser(parser)
args = parser.parse_args(
["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "a"]
["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "pkg-a"]
)
specs = spack.cmd.install.concrete_specs_from_cli(args, {})
filename = spack.cmd.install.report_filename(args, specs)

Some files were not shown because too many files have changed in this diff.