Compare commits


197 Commits

Author SHA1 Message Date
Alex Richert
a6cfeabc10 cairo: add shared and pic variants (#40302) 2023-10-13 14:21:43 -06:00
Martin Aumüller
a3a29006aa wayland: dot is a build dependency (#39854)
* wayland: dot is a build dependency

otherwise this build failure happens:
../spack-src/doc/meson.build:5:6: ERROR: Program 'dot' not found or not executable

* wayland: make building of documentation optional

renders several dependencies optional
2023-10-13 21:57:13 +02:00
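
A minimal sketch of how such a change typically looks in a Spack recipe; the `doc` variant name and the doxygen/graphviz pair are assumptions for illustration, not the literal diff:

```python
# Hypothetical package.py fragment: build docs only on request, so the
# doc toolchain (including `dot` from graphviz) is needed only for +doc.
variant("doc", default=False, description="Build the documentation")

depends_on("doxygen", type="build", when="+doc")   # assumed doc generator
depends_on("graphviz", type="build", when="+doc")  # provides `dot`
```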
Harmen Stoppels
a5cb7a9816 spack checksum: improve interactive filtering (#40403)
* spack checksum: improve interactive filtering

* fix signature of executable

* Fix restart when using editor

* Don't show [x version(s) are new] when no known versions (e.g. in spack create <url>)

* Test ^D in test_checksum_interactive_quit_from_ask_each

* formatting

* colorize / skip header on invalid command

* show original total, not modified total

* use colify for command list

* Warn about possible URL changes

* show possible URL change as comment

* make mypy happy

* drop numbers

* [o]pen editor -> [e]dit
2023-10-13 19:43:22 +00:00
Gabriel Cretin
edf4aa9f52 Fpocket: fix installation (#40499)
* Fpocket: fix edit() positional args + add install()

* Remove comments

* Fix line too long

* Fix line too long

* Remove extension specification in version

Co-authored-by: Alec Scott <alec@bcs.sh>

* Use f-strings

Co-authored-by: Alec Scott <alec@bcs.sh>

* Fix styling

* Use the default MakefilePackage install stage

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-10-13 11:30:20 -07:00
Dom Heinzeller
02c680ec3a texinfo package: fix external detection (#40470)
A complete texinfo install includes both `info` and `makeinfo`. Some
system installations of texinfo may exclude one or the other. This
updates the external finding logic to require both.
2023-10-13 10:39:08 -07:00
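
A sketch of the rule being enforced, not Spack's actual detection code: treat an external texinfo as usable only when both executables resolve.

```python
import shutil

# A complete texinfo ships both tools; reject externals missing either one.
def is_complete_texinfo() -> bool:
    return all(shutil.which(exe) for exe in ("info", "makeinfo"))
```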
David Huber
8248e180ca Add gsi-ncdiag v1.1.2 (#40508) 2023-10-13 08:48:23 -07:00
Harmen Stoppels
c9677b2465 Expand multiple build systems section (#39589)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-13 14:59:44 +02:00
Massimiliano Culpo
3752fe9e42 Better error message when wrong platform is used (#40492)
fixes #40299
2023-10-13 11:18:55 +02:00
Matthew Chan
8a0de10f60 containerize: update docs to activate env before using container templates (#40493) 2023-10-13 06:59:44 +00:00
Adam J. Stewart
6aa8d76e32 py-cmocean: add v3.0.3 (#40482) 2023-10-12 20:32:48 -06:00
Nils Vu
fb1d0f60d9 catch2: add +pic and +shared options (#40337)
Also add latest version
2023-10-12 20:13:01 -06:00
Martin Aumüller
728eaa515f ospray: new versions 2.11.0 and 2.12.0 (#40394)
* openimagedenoise: checksum 2.0.1
* ospray: new versions 2.11.0 and 2.12.0
  - both depend on embree@4
  - also update dependency versions for rkcommon, openvkl, openimagedenoise and ispc
  - expose that the dependency on openvkl is optional since @2.11 with variant "volumes"
* ospray: limit embree to @3 for ospray @:2.10
2023-10-12 14:13:15 -07:00
Julius Plehn
7c354095a9 Updates Variorum to 0.7.0 (#40488) 2023-10-12 11:27:06 -06:00
Harmen Stoppels
64ef33767f modules:prefix_inspections: allow empty dict (#40485)
Currently

```
modules:
  prefix_inspections:: {}
```

gives you the builtin defaults instead of no mapping.
2023-10-12 09:28:16 -07:00
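
For context, a single colon merges a section with the builtin defaults, while Spack's `::` syntax replaces the section outright; a sketch of the two spellings this fix tells apart:

```yaml
modules:
  # one colon: merged on top of the builtin default inspections
  prefix_inspections:
    bin: [PATH]
```

```yaml
modules:
  # two colons: replace the defaults entirely; with this fix an empty
  # dict now really yields no mapping instead of the builtin defaults
  prefix_inspections:: {}
```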
Dennis Klein
265432f7b7 libzmq: Revert "libzmq: make location of libsodium explicit (#34553)" (#40477)
and make variants independent of upstream defaults
2023-10-12 09:15:00 -06:00
dependabot[bot]
aa7dfdb5c7 build(deps): bump python-levenshtein in /lib/spack/docs (#40461)
Bumps [python-levenshtein](https://github.com/maxbachmann/python-Levenshtein) from 0.22.0 to 0.23.0.
- [Release notes](https://github.com/maxbachmann/python-Levenshtein/releases)
- [Changelog](https://github.com/maxbachmann/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/maxbachmann/python-Levenshtein/compare/v0.22.0...v0.23.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-12 14:38:33 +00:00
Alec Scott
bfe37435a4 go: add v1.21.3 and deprecate previous versions due to CVE-2023-39533 (#40454) 2023-10-12 07:40:04 -06:00
Adam J. Stewart
285a50f862 PyTorch: fix build with Xcode 15 (#40460) 2023-10-12 07:27:03 -06:00
Harmen Stoppels
995e82e72b gettext: Add 0.22.3 and fix KeyError: "shared" (#39423)
After the merge of #37957 (Add static and pic variants), if a gettext install
from a build before that merge is present, building any package that uses
gettext fails with KeyError: "shared", because self.spec.variants["shared"]
is accessed without checking that the old installation actually has the new
variant; the code assumes the key variants["shared"] always exists.

Fix it with a fallback to the default of True, and update gettext to v0.22.3

Co-authored-by: Bernharad Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2023-10-12 04:40:38 -06:00
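
A sketch of the described fallback, assuming recipe code along these lines (not the literal patch):

```python
# Installs that predate #37957 have no "shared" variant recorded, so
# don't index the variants dict blindly.
if "shared" in self.spec.variants:
    shared = self.spec.variants["shared"].value
else:
    shared = True  # the variant's default, assumed for old installs
```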
Massimiliano Culpo
3935e047c6 Remove deprecated "extra_instructions" option for containers (#40365) 2023-10-12 12:12:15 +02:00
Harmen Stoppels
0fd2427d9b clingo: fix build with Python 3.12 (#40154) 2023-10-12 12:11:22 +02:00
Alec Scott
30d29d0201 acfl: use f-strings (#40433) 2023-10-12 11:30:22 +02:00
Tim Haines
3e1f2392d4 must: add versions 1.8.0 and 1.9.0 (#40141) 2023-10-11 22:34:22 -07:00
Lydéric Debusschère
6a12a40208 [add] py-cylc-rose: new recipe (#39980)
* [add] py-cylc-rose: new recipe

* py-cylc-rose: update recipe

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-11 22:31:46 -07:00
Leonhard Reichenbach
90e73391c2 opendatadetector: add version v3.0.0 (#39693) 2023-10-11 22:29:00 -07:00
Alec Scott
deec1b7c2e adios2: use f-strings (#40437) 2023-10-11 21:08:00 -06:00
Scott Wittenburg
d9cb1a1070 buildcache: Tell servers not to cache index or hash (#40339) 2023-10-11 18:13:57 -06:00
afzpatel
01747b50df fix ck build for 5.6.1 (#40304)
* initial commit to fix ck build for 5.6.1
* disable mlir for miopen-hip
* use satisfies for checking specs and add nlohmann-json dependency for 5.4 onwards
2023-10-11 14:43:21 -07:00
Alex Richert
df01a11e07 apr-util: Fix spack install apr-util +crypto ^openssl~shared (#40301) 2023-10-11 23:38:28 +02:00
Victor Brunini
7a4b479724 cmake: drop CMAKE_STATIC_LINKER_FLAGS (#40423)
Those flags end up being passed to ar, which does not understand linker
arguments. This was making ldflags largely unusable for statically
linked cmake packages.
2023-10-11 23:30:44 +02:00
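
Roughly, the reasoning: static archives are produced by `ar`, so linker flags may only go to the link steps that invoke a real linker. A hypothetical illustration of the resulting argument split (flag values invented):

```python
ldflags = "-Wl,-rpath,/opt/example/lib"  # assumed example ldflags

args = [
    f"-DCMAKE_EXE_LINKER_FLAGS={ldflags}",
    f"-DCMAKE_SHARED_LINKER_FLAGS={ldflags}",
    # intentionally no -DCMAKE_STATIC_LINKER_FLAGS: that value is handed
    # to `ar`, which does not understand linker arguments
]
```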
Harmen Stoppels
89e34d56a1 curl: add v8.4.0, allow r@8.3: to use it (#40442)
* curl: 8.4.0

* fix r curl upperbound range
2023-10-11 21:04:09 +02:00
Miroslav Stoyanov
a5853ee51a update for the tasmanian versions (#40453) 2023-10-11 12:33:31 -06:00
Alec Scott
537ab48167 acts: use f-strings (#40434) 2023-10-11 11:03:30 -07:00
Alec Scott
e43a090877 abduco: use f-string (#40432) 2023-10-11 19:51:26 +02:00
Alec Scott
275a2f35b5 adiak: use f-strings (#40435) 2023-10-11 10:48:39 -07:00
Alec Scott
dae746bb96 abacus: use f-string (#40431) 2023-10-11 19:48:32 +02:00
Alec Scott
3923b81d87 adios: use f-strings (#40436) 2023-10-11 10:47:42 -07:00
Alec Scott
5d582a5e48 7zip: use f-strings (#40430) 2023-10-11 19:46:23 +02:00
Alec Scott
7dbc712fba 3proxy: use f-strings (#40429) 2023-10-11 19:44:57 +02:00
Alec Scott
639ef9e24a adol-c: use f-strings (#40438) 2023-10-11 10:43:31 -07:00
Alex Richert
86d2200523 krb5: Fix spack install krb5 ^openssl~shared (#40306) 2023-10-11 19:37:48 +02:00
Massimiliano Culpo
fe6860e0d7 cmake: add v3.27.7 (#40441) 2023-10-11 10:20:49 -07:00
Manuela Kuhn
8f2e68aeb8 c-blosc2: add 2.10.5 (#40428) 2023-10-11 19:19:58 +02:00
Laura Weber
bc4c887452 tecplot: Add version 2023r1 (#40425) 2023-10-11 10:05:37 -07:00
Alec Scott
b3534b4435 restic: add v0.16.0 (#40439) 2023-10-11 19:04:26 +02:00
Massimiliano Culpo
861bb4d35a Update bootstrap buildcache to support Python 3.12 (#40404)
* Add support for Python 3.12
* Use optimized build of clingo
2023-10-11 19:03:17 +02:00
Harmen Stoppels
65e7ec0509 spider: respect <base> tag (#40443) 2023-10-11 08:49:50 -07:00
Manuela Kuhn
1ab8886695 qt-base: fix-build without opengl (#40421) 2023-10-11 08:56:59 -04:00
dependabot[bot]
26136c337f build(deps): bump mypy from 1.5.1 to 1.6.0 in /.github/workflows/style (#40422)
Bumps [mypy](https://github.com/python/mypy) from 1.5.1 to 1.6.0.
- [Commits](https://github.com/python/mypy/compare/v1.5.1...v1.6.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-11 06:35:51 -06:00
dependabot[bot]
e3b71b32aa build(deps): bump mypy from 1.5.1 to 1.6.0 in /lib/spack/docs (#40424)
Bumps [mypy](https://github.com/python/mypy) from 1.5.1 to 1.6.0.
- [Commits](https://github.com/python/mypy/compare/v1.5.1...v1.6.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-11 12:35:41 +00:00
Alec Scott
6d1711f4c2 Update legacy .format() calls to fstrings in installer.py (#40426) 2023-10-11 12:35:37 +00:00
Massimiliano Culpo
26f291ef25 spack buildcache: fix a typo in a function call (#40446)
fixes #40415
2023-10-11 13:09:21 +02:00
Martin Aumüller
da030617a1 botan: checksum 3.2.0 (#40417) 2023-10-10 21:52:51 -06:00
Martin Aumüller
1ebfcd3b18 openvkl: add 1.3.2 (#40392)
* openvkl: add 1.3.2

works with (and requires) embree@4

* openvkl: simplify formatting with f-string

thank you for the suggestion in the review
2023-10-10 21:24:17 -06:00
Edward Hartnett
d385a57da3 w3emc: add v2.11.0 (#40376)
* added version 2.11.0

* more fixes
2023-10-10 15:11:14 -07:00
eugeneswalker
37df8bfc73 e4s arm stack: duplicate and target neoverse v1 (#40369)
* e4s arm stack: duplicate and target both neoverse n1, v1

* remove neoverse_n1 target until issue #40397 is resolved
2023-10-10 14:32:51 -07:00
Adam J. Stewart
b781a530a1 GCC: fix build with Apple Clang 15 (#40318) 2023-10-10 22:35:15 +02:00
Harmen Stoppels
390b0aa25c More helpful error when patch lookup fails (#40379) 2023-10-10 21:09:04 +02:00
Adam J. Stewart
620835e30c py-jupyter-packaging: remove duplicate packages (#38671)
* py-jupyter-packaging: remove duplicate packages

* Allow py-jupyter-packaging to be duplicated in DAG

* Deprecate version of py-jupyterlab that requires py-jupyter-packaging at run-time
2023-10-10 13:50:22 -05:00
Adam J. Stewart
da10487219 Update PyTorch ecosystem (#40321)
* Update PyTorch ecosystem

* py-pybind11: better documentation of supported compilers

* py-torchdata: add v0.7.0

* Black fixes

* py-torchtext: fix Python reqs

* py-horovod: py-torch 2.1.0 not yet supported
2023-10-10 13:45:32 -05:00
Miroslav Stoyanov
4d51810888 find rocm fix (#40388)
* find rocm fix
* format fix
* style fix
* formatting is broken
2023-10-10 10:42:26 -07:00
Harmen Stoppels
6c7b2e1056 git: optimize build by not setting CFLAGS (#40387) 2023-10-10 11:48:06 +02:00
Carlos Bederián
749e99bf11 python: add 3.11.6 (#40384) 2023-10-10 09:53:04 +02:00
Martin Aumüller
6db8e0a61e embree: checksum 4.3.0 (#40395) 2023-10-09 20:44:10 -06:00
Martin Aumüller
6fe914421a rkcommon: checksum 0.11.0 (#40391) 2023-10-09 20:43:56 -06:00
Andrew W Elble
9275f180bb openmm: new version 8.0.0 (#40396) 2023-10-09 20:33:40 -06:00
Tom Epperly
2541b42fc2 Add a new release sha256 hash (#37680) 2023-10-09 20:28:45 -06:00
Auriane R
fb340f130b Add pika 0.19.1 (#40385) 2023-10-09 20:23:54 -06:00
Dennis Klein
d2ddd99ef6 libzmq: add v4.3.5 (#40383) 2023-10-09 17:30:46 -06:00
kenche-linaro
492a8111b9 linaro-forge: added package file for rebranded product (#39587) 2023-10-09 12:14:14 -07:00
George Young
d846664165 paintor: new package @3.0 (#40359)
* paintor: new package @3.0
* Update var/spack/repos/builtin/packages/paintor/package.py
---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-09 09:54:44 -07:00
Vanessasaurus
31b3e4898b Add: flux-pmix 0.4.0 (#40323)
* Automated deployment to update package flux-pmix 2023-10-05

* Pin exactly to flux-core 0.49.0 when between 0.3 and 0.4

* Update var/spack/repos/builtin/packages/flux-pmix/package.py

Co-authored-by: Mark Grondona <mark.grondona@gmail.com>

* Update var/spack/repos/builtin/packages/flux-pmix/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: Mark Grondona <mark.grondona@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-09 09:29:18 -07:00
Harmen Stoppels
82f1267486 unparse: drop python 3.3 remnants (#40331) 2023-10-09 08:22:27 -07:00
Harmen Stoppels
19202b2528 docs: update Spack prerequisites (#40381) 2023-10-09 13:41:36 +00:00
Adam J. Stewart
831cbec71f py-pydevtool: add new package (#40377) 2023-10-09 15:09:59 +02:00
Patrick Broderick
bb2ff802e2 fftx: add v1.1.3 (#40283) 2023-10-09 15:06:59 +02:00
dependabot[bot]
83e9537f57 build(deps): bump python-levenshtein in /lib/spack/docs (#40220)
Bumps [python-levenshtein](https://github.com/maxbachmann/python-Levenshtein) from 0.21.1 to 0.22.0.
- [Release notes](https://github.com/maxbachmann/python-Levenshtein/releases)
- [Changelog](https://github.com/maxbachmann/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/maxbachmann/python-Levenshtein/compare/v0.21.1...v0.22.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-09 15:02:14 +02:00
Gavin John
3488e83deb py-s3cmd: Add new versions (#40212)
* Add new versions of py-s3cmd

* Use correct hashes
2023-10-09 07:46:38 -05:00
Jacob King
c116eee921 nimrod-aai: add v23.9. (#40303) 2023-10-09 14:39:40 +02:00
Adam J. Stewart
9cb291b41b py-jsonargparse: add v4.25.0 (#40185) 2023-10-09 14:33:45 +02:00
Thomas Dickerson
c0f1072dc7 racket packages: fix typo after multiple build systems support (#40088) 2023-10-09 14:21:13 +02:00
Jordan Galby
3108036533 elfutils: fix +debuginfod again with new libarchive versions (#40314) 2023-10-09 14:17:58 +02:00
Mike Renfro
215c699307 velvet: improved variants (#40225) 2023-10-09 12:47:54 +02:00
jmuddnv
f609093c6e Adding NVIDIA HPC SDK 23.9 (#40371) 2023-10-09 12:35:59 +02:00
Joe Schoonover
eb4fd98f09 feq-parse: add v2.0.3 (#40230) 2023-10-09 12:31:23 +02:00
Harmen Stoppels
08da9a854a parser: use non-capturing groups (#40373) 2023-10-09 07:18:27 +02:00
Mark (he/his) C. Miller
3a18fe04cc Update CITATION.cff with conf dates (#40375)
Add `start-date` and `end-date` to citation
2023-10-08 18:04:25 -07:00
Harmen Stoppels
512e41a84a py-setuptools: sdist + rpath patch backport (#40205) 2023-10-08 11:47:36 -06:00
Alex Richert
8089aedde1 gettext: Add static and pic options (#37957) 2023-10-08 17:42:21 +02:00
Manuela Kuhn
6b9e103305 py-tables: add 3.9.0 (#40340)
* py-tables: add 3.9.0

* Add conflict with apple-clang
2023-10-08 10:28:13 -05:00
Jen Herting
00396fbe6c [py-tokenizers] added version 0.13.3 (#40360) 2023-10-08 10:27:18 -05:00
Jen Herting
a3be9cb853 [py-tensorboardx] Added version 2.6.2.2 (#39731)
* [py-tensorboardx] Added version 2.6.2.2

* [py-tensorboardx] flake8

* [py-tensorboardx] requires py-setuptools-scm
2023-10-08 10:21:55 -05:00
Jen Herting
81f58229ab py-torch-sparse: add v0.6.17 (#39495)
* [py-torch-sparse] New version 0.6.17

* [py-torch-sparse] added dependency on parallel-hashmap

* [py-torch-sparse]

- spack only supports python@3.7:
- py-pytest-runner only needed with old versions
2023-10-08 10:20:46 -05:00
Jen Herting
2eb16a8ea2 [py-lvis] New package (#39080)
* [py-lvis] New package

* [py-lvis] flake8

* [py-lvis] os agnostic

* [py-lvis] added comment for imported dependency

* [py-lvis] style fix
2023-10-08 10:18:01 -05:00
Manuela Kuhn
9db782f8d9 py-bids-validator: add 1.13.1 (#40356)
* py-bids-validator: add 1.13.1

* Fix style
2023-10-08 10:16:08 -05:00
Lydéric Debusschère
633df54520 [add] py-cylc-flow: new recipe (#39986)
* [add] py-cylc-flow: new recipe

* py-cylc-flow: fix py-protobuf version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cylc-flow: fix py-colorama version

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylc-flow: Update dependence on py-aiofiles

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylc-flow: Update dependence on py-pyzmq

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylcflow: remove useless dependence

* py-cylc-flow: fix indent

* py-cylc-flow: fix argument in depends_on; move lines

* py-cylc-flow: fix the type of the dependence py-setuptools

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-08 10:06:13 -05:00
Mark (he/his) C. Miller
e2a7f2ee9a Update CITATION.cff (#40363)
You will note the `Cite this repository` link is not working.

This commit fixes the underlying file...

* `authors` was not indented
* `authors` required by `preferred-citation`
* `authors` list required at top level (I simply duplicated)
* `"USA"` not correct country code
* `month` requires an integer month number
* Added URL to the actual pdf of the cited paper
* Used `identifiers` for doi and LLNL doc number
* added `abstract` copied from paper

Various fixes were confirmed by `cffconvert` using `docker run -v $(pwd):/app citationcff/cffconvert --validate`
2023-10-07 17:44:31 -07:00
Massimiliano Culpo
28c49930e2 Remove warning for custom module configuration, when no module is enabled (#40358)
The warning was added in v0.20 and was slated for removal in v0.21
2023-10-07 09:21:04 +02:00
Ken Raffenetti
6c1868f8ae yaksa: add version 0.3 (#40368) 2023-10-06 22:28:15 -06:00
Massimiliano Culpo
4f992475f4 Python add v3.11.5 (#40330) 2023-10-06 18:03:00 -06:00
Alex Richert
7a358c9005 Change 'exit' to 'return' in setup-env.sh (#36137)
* Change 'exit' to 'return' in `setup-env.sh` to avoid losing the shell in some cases when sourcing it twice.
2023-10-06 16:19:19 -07:00
Veselin Dobrev
b5079614b0 MFEM: add new version v4.6 (#40170)
* [mfem] Initial changes for v4.6

* [@spackbot] updating style on behalf of v-dobrev

* [mfem] Set the proper download link for v4.6
2023-10-06 15:31:23 -07:00
Alex Richert
482525d0f9 Update bufr recipe (#40033)
* Update bufr recipe
* Add v12.0.1
* style fixes
* remove test-related functionality for bufr
* Re-add testing

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-06 15:04:51 -07:00
snehring
599220924d wise2: adding new package wise2 (#40341) 2023-10-06 14:24:44 -07:00
Harmen Stoppels
d341be83e5 VersionRange: improve error message for empty range (#40345) 2023-10-06 14:19:49 -07:00
George Young
b027d7d0de metal: new package @2020-05-05 (#40355)
* metal: new package
* style

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-10-06 14:14:45 -07:00
Howard Pritchard
0357df0c8b openmpi: add 4.1.6 release (#40361)
related to #40232

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2023-10-06 13:06:37 -07:00
Edward Hartnett
f70ae6e3c4 g2: updated for 3.4.8 release (#40366)
* updated for 3.4.8 release
2023-10-06 13:02:17 -07:00
Manuela Kuhn
921ed1c21b py-expecttest: new package (#40347) 2023-10-06 14:34:47 -05:00
Manuela Kuhn
c95d43771a py-asttokens: add 2.4.0 (#40349) 2023-10-06 14:33:56 -05:00
Manuela Kuhn
db3d816f8b py-argcomplete: add 3.1.2 (#40348) 2023-10-06 14:32:21 -05:00
Manuela Kuhn
1d6a142608 py-anyio: add 4.0.0 (#40346) 2023-10-06 14:27:53 -05:00
George Young
98271c3712 topaz: new package @0.2.5 (#40352)
* topaz: new package @0.2.5

* switching over to pypi

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-10-06 13:00:03 -06:00
Manuela Kuhn
e3f6df884e py-zipp: add 3.17.0 (#40278)
* py-zipp: add 3.17.0

* Re-add python@3.7 dependency
2023-10-06 13:54:25 -05:00
Sam Gillingham
b0f36b2cd9 RIOS: add recent versions (#40243)
* add recent versions of RIOS

* fix depends_on syntax

* fix typo

* fix sha and add parallel variant

* remove self

* try doing in one
2023-10-06 13:52:15 -05:00
Lydéric Debusschère
5524492e25 [add] py-metomi-rose: new recipe, required by py-cylc-rose (#39981)
* [add] py-metomi-rose: new recipe, required by py-cylc-rose

* py-metomi-rose: remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-06 13:49:59 -05:00
Sam Gillingham
112f045352 py-tuiview: add recent versions of tuiview (#40244)
* add recent versions of tuiview and remove Qt4 version

* reformat

* fix stray tabs

* add back a deprecated 1.1.7

* tabs

* more tabs

* reformat

* comma
2023-10-06 13:49:06 -05:00
Harmen Stoppels
72ed8711a7 unparse: drop python 2 remnants (#40329) 2023-10-06 09:42:47 -07:00
Harmen Stoppels
55e0c2c900 openssh: 9.5p1 (#40354) 2023-10-06 09:37:42 -07:00
Massimiliano Culpo
e20c05fcdf Make "minimal" the default duplicate strategy (#39621)
* Allow branching out of the "generic build" unification set

For cases like the one in https://github.com/spack/spack/pull/39661
we need to relax rules on unification sets.

The issue is that, right now, nodes in the "generic build" unification
set are unified together with their build dependencies. This was done
out of caution to avoid the risk of circular dependencies, which would
ultimately cause a very slow solve.

For build-tools like Cython, however, the build dependency is masked
by a long chain of "build, run" dependencies that belong in the
"generic build" unification space.

To allow splitting on cases like this, we relax the rule disallowing
branching out of the "generic build" unification set.

* Fix issue with pure build virtual dependencies

Pure build virtual dependencies were not accounted for properly in the
list of possible virtuals. This caused some facts connecting virtuals
to the corresponding providers not to be emitted, which in the end
led to unsat problems.

* Fixed a few issues in packages

py-gevent: restore dependency on py-cython@3
jsoncpp: fix typo in build dependency
ecp-data-vis-sdk: update spack.yaml and cmake recipe
py-statsmodels: add v0.13.5

* Make dependency on "blt" of type "build"
2023-10-06 10:24:21 +02:00
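
The knob in question lives in `concretizer.yaml`; after this change the shipped default is:

```yaml
concretizer:
  duplicates:
    # "minimal" lets nodes tagged as build-tools (cmake, py-setuptools,
    # ...) appear more than once in the DAG; "none" forces one node each
    strategy: minimal
```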
Harmen Stoppels
36183eac40 python: add 3.12.0 (but keep 3.11 preferred) (#40282) 2023-10-06 09:44:09 +02:00
Alex Richert
7254c76b68 ecFlow update (#40305)
* Support static openssl for ecflow

* Update ecflow/static openssl

* Update ssl settings in ecflow

* add pic variant for ecflow

* style fix

* Update package.py

* Update package.py
2023-10-05 20:04:44 -07:00
eugeneswalker
e0e6ff5a68 Revert "cray rhel: disable due to runner issues (#40324)" (#40335)
This reverts commit bf7f54449b.
2023-10-05 10:32:52 -07:00
snehring
b0d49d4973 pbmpi: adding new version and maintainer (#40319) 2023-10-05 10:13:50 -07:00
Harmen Stoppels
4ce5d14066 unparse: drop python 3.4 remnants (#40333) 2023-10-05 09:52:23 -07:00
Thomas Madlener
9e9653ac58 whizard: Make sure to detect LCIO if requested (#40316) 2023-10-05 08:53:56 -07:00
Auriane R
bec873aec9 Add pika 0.19.0 (#40313) 2023-10-05 09:34:22 +02:00
Harmen Stoppels
bf7f54449b cray rhel: disable due to runner issues (#40324) 2023-10-05 08:45:33 +02:00
Cameron Rutherford
9f0e3c0fed exago: add logging variant. (#40188) 2023-10-05 06:57:17 +02:00
eugeneswalker
79e7da9420 trilinos: add variant to build tests (#40284)
* trilinos: add variant: testing

* trilinos: rename +testing to +test
2023-10-04 16:32:30 -05:00
Harmen Stoppels
0f43074f3e Improve build isolation in PythonPipBuilder (#40224)
We run pip with `--no-build-isolation` because we don't want pip to
install build deps.

As a consequence, when pip runs hooks, it runs hooks of *any* package it
can find in `sys.path`.

For Spack-built Python this includes user site packages -- there
shouldn't be any system site packages. So in this case it suffices to
set the environment variable PYTHONNOUSERSITE=1.

For external Python, more needs to be done, because there is no env
variable that disables both system and user site packages; setting the
`python -S` flag doesn't work, because pip runs subprocesses that don't
inherit this flag (and there is no API to know if -S was passed).

So, for external Python, an empty venv is created before invoking pip in
Spack's build env, which ensures that pip can no longer see anything but
standard libraries and `PYTHONPATH`.

The downside of this is that pip will generate shebangs that point to
the python executable from the venv. So, for external python an extra
step is necessary where we fix up shebangs post install.
2023-10-04 14:38:50 -05:00
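
A rough sketch of the two cases described above; the paths and the pip invocation are illustrative rather than Spack's actual code, and it assumes pip itself is reachable via `PYTHONPATH`:

```python
import os
import subprocess
import venv

env = dict(os.environ)
env["PYTHONNOUSERSITE"] = "1"  # Spack-built Python: hide user site-packages

# External Python: a bare venv (no pip seeded) leaves only the standard
# library and PYTHONPATH visible to pip and its subprocesses.
venv.create("/tmp/empty-venv", with_pip=False)
subprocess.run(
    ["/tmp/empty-venv/bin/python", "-m", "pip", "install",
     "--no-build-isolation", "--no-deps", "."],
    env=env,
    check=True,
)
```

The shebang fix-up mentioned at the end is then needed because scripts generated this way point at the venv's interpreter.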
Ken Raffenetti
d297098504 yaksa: Allow unsupported host compiler with CUDA (#40298)
Fixes #40272.
2023-10-04 09:37:58 -07:00
Josh Bowden
284eaf1afe Damaris release v1.9.2 (#40285)
* Update to latest dot versions and improved installation of Damaris python module damaris4py

* fix for visit dependency typo

* whitespace check

* whitespace check

* fix for style issue

* reviewer suggestions for integrating Python added

* suggestion for boost depends statement added
2023-10-04 09:34:43 -06:00
Dom Heinzeller
da637dba84 Add new package awscli-v2 and its missing dependency awscrt (#40288)
* Add new package awscli-v2 and its missing dependency awscrt

* Remove boilerplate comments from awscli-v2 and awscrt packages

* Fix typos in var/spack/repos/builtin/packages/awscli-v2/package.py

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Address reviewer comments

* Remove py-pip version dependency from var/spack/repos/builtin/packages/awscli-v2/package.py

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-04 09:34:23 -06:00
Harmen Stoppels
931fce2c24 py-isort: needs setuptools build dep before v5 (#40234)
* py-isort: needs setuptools build dep before v5

Detected in #40224.

In the past, system setuptools could be picked up when using an external
python, so py-isort@4 would install fine. With the linked PR, pip can
only consider packages that Spack controls from PYTHONPATH, so the issue
of missing py-setuptools showed up.

* py-importlib-metadata: fix lowerbounds on python

* review

* py-isort unconditionally add optional setuptools dep to prevent picking up user package at runtime

* style

* drop optional py-setuptools run dep
2023-10-04 03:31:46 -06:00
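
In recipe terms, the first bullet amounts to something like the following sketch (the exact version ranges are in the PR):

```python
# isort@4 still needs setuptools at build time; isort@5+ does not.
depends_on("py-setuptools", type="build", when="@:4")
```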
Adam J. Stewart
42fbf17c82 py-einops: add v0.7.0 (#40296) 2023-10-04 04:28:49 -05:00
Harmen Stoppels
d9cacf664c petsc: add conflict on rocm 5.6: for now (#40300)
hipsparse@5.6.0 changed the hipsparseSpSV_solve() API, but the change was reverted in 5.6.1
2023-10-04 09:59:59 +02:00
Scott Wittenburg
7bf6780de2 ci: pull E4S images from github instead of dockerhub (#40307) 2023-10-04 09:55:06 +02:00
eugeneswalker
91178d40f3 e4s arm: disable bricks due to target=aarch64 not being respected (#40308) 2023-10-04 09:51:14 +02:00
kwryankrattiger
2817cd2936 ADIOS2: v2.8 is not compatible with HDF5 v1.14: (#40258) 2023-10-03 16:45:24 -05:00
Scott Wittenburg
92a6ddcbc3 ci: Change how job names appear in gitlab (#39963) 2023-10-03 15:16:41 -06:00
Sinan
58017f484c fix_qgis_build_with_pysip5 (#39941)
* fix_qgis_build_with_pysip5

* build fails with newer protobuf

* somehow findgdal can figure this out.

* Update var/spack/repos/builtin/packages/qgis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix gdal lib again

* qgis needs QtPositioning provided by qt+location option

* fix FindPyQt5 cmake file

* fix bug

* fix qsci sip issue

* fix bug

* blackify

* improve

* add latest LTR

* add build dep

* revert until bug is fixed

* specify proj version for qgis 3.28

* improve gdal libs search via indicating gdal-config

* make flake happy

* improve deps

* add 3.28.11, improve style

* fix style

* [@spackbot] updating style on behalf of Sinan81

---------

Co-authored-by: Sinan81 <Sinan@world>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2023-10-03 14:21:51 -05:00
George Young
86d2e1af97 falco: new package @1.2.1 (#40289)
* falco: new package @1.2.1
* specifying gmake
* Replacing homepage - readthedocs is empty

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-10-03 11:26:49 -07:00
Adam J. Stewart
bf23be291b py-tables: add v3.8.0 (#40295) 2023-10-03 11:12:55 -07:00
Lydéric Debusschère
3b32a9918c [add] py-graphene: new recipe, required by py-cylc-flow (#39988) 2023-10-03 11:21:36 -06:00
Jordan Galby
f0260c84b4 Fix binutils regression on +gas~ld fix (#40292) 2023-10-03 17:28:55 +02:00
eugeneswalker
8746c75db0 e4s ci stacks: sync with e4s-23.08 (#40263)
* e4s amd64 gcc ci stack: sync with e4s-23.08

* e4s amd64 oneapi ci stack: sync with e4s-23.08

* e4s ppc64 gcc ci stack: sync with e4s-23.08

* add new ci stack: e4s amd64 gcc w/ external rocm

* add new ci stack: e4s arm gcc ci

* updates

* py-scipy: -fvisibility issue is resolved in 2023.1.0: #39464

* paraview oneapi fails

* comment out pkgs that fail to build on power

* fix arm stack name

* fix cabana +cuda specification

* comment out failing specs

* visit fails build on arm

* comment out slepc arm builds due to make issue

* comment out failing dealii arm builds
2023-10-03 08:12:51 -07:00
Gavin John
e8f230199f py-biom-format: Fix package file issues and bring up to date (#39962)
* Fix python versioning issue for py-biom-format

* Update deps according to feedback

* Remove version requirement for py-cython

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Only depend on py-six for versions >=2.1.10

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add py-future as non-dependency for 2.1.15

* There we are. Everything anyone could ever want

* Missed cython version change

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-03 09:55:15 -05:00
GeorgeJuniorGG
1e3c7abc1c Adding dynaconf module (#39762)
* Adding dynaconf module

* Fixing style

* Adding dependency
2023-10-03 06:23:29 -05:00
Manuela Kuhn
12e51da102 py-yarl: add 1.9.2 (#40277) 2023-10-03 06:22:20 -05:00
Tom Payerle
992291c738 py-dipy: Update version to support python@3.10 (#40229)
* py-dipy: Update version to support python@3.10

py-dipy only adds support for python@3.10 in py-dipy@1.5.0

See #40228

* py-dipy: fix formatting issues

* py-dipy: another formatting fix

* py-dipy: Use depends instead of conflicts

* py-dipy: formatting fix

* py-dipy: Updating for @1.7.0

Added new minimum version requirements for
py-cython
py-numpy
py-scipy
py-h5py
as suggested by @manuelakuhn
(py-nibabel min version unchanged for @1.7.0)
2023-10-03 06:10:14 -05:00
Manuela Kuhn
78e63fa257 py-wcwidth: add 0.2.7 (#40256) 2023-10-03 06:04:30 -05:00
Manuela Kuhn
487ea8b263 py-virtualenv: add 20.24.5 (#40255)
* py-virtualenv: add 20.22.0

* py-platformdirs: add 3.5.3

* py-filelock: add 3.12.4

* py-distlib: add 0.3.7
2023-10-03 06:03:35 -05:00
Manuela Kuhn
0d877b4184 py-websocket-client: add 1.6.3 (#40274) 2023-10-03 05:58:53 -05:00
Manuela Kuhn
994544f208 py-werkzeug: add 2.3.7 and 3.0.0 (#40275) 2023-10-03 05:57:27 -05:00
Manuela Kuhn
36bb2a5d09 py-wrapt: add 1.15.0 (#40276) 2023-10-03 05:55:28 -05:00
Stephen Hudson
071c1c38dc py-libEnsemble: add v1.0.0 (#40203)
* py-libEnsemble: add v1.0.0

* Add version ranges for deps

* Add comments to variants

* Put when before type

* Fix line lengths

* Re-format

* Update var/spack/repos/builtin/packages/py-libensemble/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Update var/spack/repos/builtin/packages/py-libensemble/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-10-03 05:48:17 -05:00
Lydéric Debusschère
b480ae2b7d [add] py-graphql-relay: new package (#39807)
* [add] py-graphql-relay: new package

* py-graphql-relay: Update package.py

Remove leftovers from version 3.2.0:
* archive name in pypi
* dependencies

* [fix] py-graphql-relay: remove constraint on python version; add dependence py-rx because of ModuleNotFoundError during spack test

* [fix] py-graphql-relay: remove py-rx dependence; py-graphql-core: add dependencies for version 2.3.2

* py-graphql-core: Update from review; set build backend, py-poetry for version 3: and py-setuptools for version 2

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-03 05:45:06 -05:00
Lydéric Debusschère
7a390f503d [add] py-metomi-isodatetime: new recipe, required by py-cylc-flow (#39990)
* [add] py-metomi-isodatetime: new recipe, required by py-cylc-flow

* py-metomi-isodatetime: use sources from pypi instead of github
2023-10-03 05:42:46 -05:00
dependabot[bot]
b7cb3462d4 build(deps): bump actions/setup-python from 4.7.0 to 4.7.1 (#40287)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.7.0 to 4.7.1.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](61a6322f88...65d7f2d534)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-03 12:28:02 +02:00
dependabot[bot]
f2230100ac build(deps): bump urllib3 from 2.0.5 to 2.0.6 in /lib/spack/docs (#40286)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.5 to 2.0.6.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/v2.0.5...2.0.6)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-03 12:27:36 +02:00
Massimiliano Culpo
4b06862a7f Deactivate Cray sles, due to unavailable runner (#40291)
This reverts commit 0274091204.
2023-10-03 11:06:36 +02:00
Harmen Stoppels
06057d6dba Buildcache tarballs with rootfs structure (#39341)
Two changes in this PR:

1. Register absolute paths in tarballs, which makes it easier
   to use them as container image layers, or rootfs in general, outside
   of Spack. Spack supports this already on develop.
2. Assemble the tarfile entries "by hand", which has a few advantages:
   1. Avoid reading `/etc/passwd`, `/etc/groups`, `/etc/nsswitch.conf`,
      which `tar.add(dir)` does _for each file it adds_
   2. Reduce the number of stat calls per file added by a factor two,
      compared to `tar.add`, which should help with slow, shared filesystems
      where these calls are expensive
   3. Create normalized `TarInfo` entries from the start, instead of letting
      Python create them and patching them after the fact
   4. Don't recurse into subdirs before processing files, to avoid
      keeping nested directories opened (this changes the tar entry
      order slightly; it's like sorting by `(not is_dir, name)`)
2023-10-03 09:56:18 +02:00
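
A sketch of what assembling entries "by hand" can look like with Python's tarfile module; the normalization choices shown are illustrative, not Spack's exact code:

```python
import os
import tarfile

def add_file(tar: tarfile.TarFile, path: str, arcname: str) -> None:
    st = os.lstat(path)  # a single stat per file, reused for all fields
    info = tarfile.TarInfo(name=arcname)
    info.size = st.st_size
    info.mode = st.st_mode & 0o777
    info.uid = info.gid = 0       # normalized from the start, so no
    info.uname = info.gname = ""  # /etc/passwd or /etc/groups lookups
    info.mtime = 0                # reproducible archives
    with open(path, "rb") as f:
        tar.addfile(info, f)      # bypasses tar.add() entirely
```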
Sam Reeve
bb03a1768b Add AdditiveFOAM package (#39295)
* Add AdditiveFOAM package
* Add AdditiveFOAM build and install

Co-authored-by: kjrstory <kjrstory@gmail.com>
Co-authored-by: Knapp, Gerry <knappgl@ornl.gov>

---------

Co-authored-by: kjrstory <kjrstory@gmail.com>
Co-authored-by: Knapp, Gerry <knappgl@ornl.gov>
2023-10-02 16:27:13 -07:00
Adam J. Stewart
75ed26258c py-torchgeo: add v0.5.0 (#40267)
* py-torchgeo: add v0.5.0

* Better documentation of dependency quirks
2023-10-02 15:23:49 -05:00
eugeneswalker
1da8477a3c vtk-m@2.0: depend on kokkos@3.7:3.9 per issue #40268 (#40281) 2023-10-02 12:26:55 -07:00
Christoph Weber
4c111554ae Add ipm package (#40069) 2023-10-02 11:46:17 -07:00
renjithravindrankannath
615312fcee Rocm 5.6.0 & 5.6.1 release updates (#39673)
* 5.6.0 updates
* Rocm 5.6.0 updates
* Style and audit corrections for 5.6
* Patching smi path for tests.
* Style correction
* 5.6.1 updates
* Updated hip tests for the ci build failure
   Updated hiprand with the release tag
   Addressed the review comment on rocsolver
* Adding rocm-smi path for 5.6
* Adding the patch file
* Setting library directory uniform
* gl depends on mesa but it should not be llvm variant
* Fix for the issue 39520 by setting CMAKE_INSTALL_LIBDIR=lib
* i1 muls can sometimes happen after SCEV. They resulted in
   ISel failures because we were missing the patterns for them.
* 5.6.0 & 5.6.1 updates for migraphx, miopen-hip, mivisionx
* Revert "5.6.0 & 5.6.1 updates for migraphx, miopen-hip, mivisionx"
   This reverts commit f54c9c6c67.
* Revert operator mixup fix
* Splitting compiler-rt-linkage-for-host and operator mixup patch
* Adding missing patch for reverting operator mixup
* 5.6 update for composable-kernel,migraphx,miopen-hip and mivisionx
* Updating rvs, rcd and rccl for 5.6.1. adding comment for llvm patch
2023-10-02 11:36:51 -07:00
Joseph Wang
453625014d fix lhapdf package (#37384)
* fix lhapdf package
* [@spackbot] updating style on behalf of joequant

---------

Co-authored-by: joequant <joequant@users.noreply.github.com>
2023-10-02 11:33:54 -07:00
G-Ragghianti
1b75651af6 Implemented +sycl for slate (#39927)
* Implemented +sycl for slate

* style

* style

* add slate +sycl to ci

* slate +sycl: explicitly specify +mpi

* comment out slate+sycl+mpi in e4s oneapi stack

* removing requirement of intel-oneapi-mpi

* Added slate+sycl to e4s oneapi stack

* Removing obsolete comment

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-10-02 07:35:00 -07:00
Thomas Madlener
b3e3604f46 libtirpc: Add latest version and allow for builds on macOS (#40189) 2023-10-02 15:02:33 +02:00
Harmen Stoppels
6c4ce379ca buildcache: ignore errors of newer buildcache version (#40279)
Currently buildcaches are forward incompatible in an annoying way: spack
errors out when trying to use them.

With this change, you just get a warning.
2023-10-02 14:51:48 +02:00
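
Conceptually the change is from a hard failure to a diagnostic, along these lines (names hypothetical):

```python
import warnings

SUPPORTED_LAYOUT_VERSION = 1  # hypothetical constant

def check_layout_version(found: int) -> None:
    if found > SUPPORTED_LAYOUT_VERSION:
        # Entries produced by a newer Spack are skipped with a warning
        # instead of aborting the whole buildcache operation.
        warnings.warn(f"buildcache layout v{found} is newer than supported")
```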
Harmen Stoppels
a9dcba76ce py-importlib-metadata: add patch releases (#40249)
importlib_metadata 4.11.4 fixes some nasty issues with
`metadata.entry_points(...)`, so ensure we have those bugfixes.
2023-10-02 10:09:36 +02:00
Harmen Stoppels
32f21f2a01 Spack python 3.12: PEP 695 unparse support (#40155) 2023-10-02 00:25:52 -07:00
Greg Sjaardema
e60bbd1bfc Cgns fix no fortran config (#40241)
* SEACAS: Update package.py to handle new SEACAS project name

The base project name for the SEACAS project has changed from
"SEACASProj" to "SEACAS" as of @2022-10-14, so the package
needed to be updated to use the new project name when needed.

The refactor also changes several `"-DSome_CMAKE_Option:BOOL=ON"` arguments to `define("Some_CMAKE_Option", True)`.

* CGNS: If fortran not enabled, do not specify parallel fortran compiler

* Update package formatting as suggested by black

* Accept suggested change
2023-10-02 16:03:47 +09:00
Freifrau von Bleifrei
71c5b948d0 discotec package: add selalib variant (#40222) 2023-10-02 07:14:50 +02:00
Tom Payerle
726d6b9881 zoltan: Fixes for #40198 (#40199)
Fix an issue in configure_args which resulted in duplicate "--with-ldflags" arguments (with different values) being passed to configure, and extend the fix to similar arguments.

Also, repeat some flags from "--with-libs" in "--with-ldflags": when the flags were only in "--with-libs", they did not seem to be picked up everywhere. I suspect this is a bug in the configure script, but adding them in both locations seems to solve it and should not have any adverse effects.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-02 07:11:36 +02:00
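
A toy illustration of the duplicate-argument problem and the shape of the fix (flag values invented):

```python
# Before: two --with-ldflags options with different values reach
# configure, which effectively honors only one of them.
args = ["--with-ldflags=-L/opt/example/lib"]
args.append("--with-ldflags=-lpthread")

# After: accumulate the flags first and emit the option exactly once.
ldflags = ["-L/opt/example/lib", "-lpthread"]
args = [f"--with-ldflags={' '.join(ldflags)}"]
```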
Sinan
aff64c02e8 xv: new package (#40032)
Co-authored-by: Sinan81 <Sinan@world>
2023-10-02 07:06:00 +02:00
Matthew Archer
31ae5cba91 add maintainers to py-nanobind (#40235) 2023-10-02 07:03:49 +02:00
Sam Reeve
0a91d2411a cabana: add v0.6 (#40168) 2023-10-02 06:57:55 +02:00
snehring
5f3af3d5e4 perl-dbd-mysql update to 4.050 (#40245)
* perl-devel-checklib: adding new package perl-devel-checklib

* perl-dbd-mysql: adding new version 4.050 and a new build dep
2023-10-02 06:45:11 +02:00
Satish Balay
37158cb913 petsc,py-petsc4py,slepc,py-slepc4py: add version 3.20.0 (#40260) 2023-10-02 06:28:37 +02:00
Wouter Deconinck
a596e16a37 libxkbcommon: new versions 1.4.1, 1.5.0 (#40273) 2023-10-02 05:54:55 +02:00
Weiqun Zhang
4e69f5121f amrex: add v23.10 (#40270) 2023-10-02 05:53:16 +02:00
Wouter Deconinck
2a0f4393c3 assimp: new version 5.3.1 (#40271) 2023-10-02 05:52:30 +02:00
Wouter Deconinck
c9e1e7d90c acts: impose cxxstd variant on geant4 dependency (#39767) 2023-10-02 05:49:51 +02:00
Juan Miguel Carceller
7170f2252c rivet: remove deprecated versions and clean up recipe (#39861)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2023-10-02 05:48:35 +02:00
eugeneswalker
b09073e01e py-pandas@0.24.2 %oneapi: add cflag=-Wno-error=implicit-function-declaration (#39470) 2023-10-01 14:42:56 -06:00
eugeneswalker
2d509dc3eb py-scipy: -fvisibility issue is resolved in 2023.1.0: (#39464)
* py-scipy: -fvisibility issue is resolved in 2023.1.0:

* e4s oneapi ci: add py-scipy
2023-10-01 17:29:21 +00:00
Martin Aumüller
8a9d45cc29 embree: fix linux build on aarch64 for 3.13.5 (#39749)
- upstream patch does not apply cleanly to older versions, not required
  for newer ones
- also add conflict for older versions, except for 3.13.3 which works by
  chance
2023-10-01 09:04:37 -07:00
Wouter Deconinck
b25f8643ff geant4, vecgeom: support variant cxxstd=20 (#39785) 2023-10-01 21:33:37 +09:00
Juan Miguel Carceller
9120b6644d qt: change version for opengl dependencies (#39718) 2023-10-01 21:31:11 +09:00
eugeneswalker
68dbd25f5f e4s cray sles ci: expand spec list (#40162)
* e4s cray sles stack: expand spec list

* remove unnecessary packages:trilinos:one_of
2023-09-30 21:32:33 -07:00
Todd Gamblin
9e54134daf docs: Replace package list with packages.spack.io (#40251)
For a long time, the docs have generated a huge, static HTML package list. It has some
disadvantages:

* It's slow to load
* It's slow to build
* It's hard to search

We now have a nice website that can tell us about Spack packages, and it's searchable so
users can easily find the one or two packages out of 7400 that they're looking for. We
should link to this instead of including a static package list page in the docs.

- [x] Replace package list link with link to packages.spack.io
- [x] Remove `package_list.html` generation from `conf.py`.
- [x] Add a new section for "Links" to the docs.
- [x] Remove docstring notes from contribution guide (we haven't generated RST
      for package docstrings for a while)
- [x] Remove references to `package-list` from docs.
2023-10-01 05:36:22 +02:00
eugeneswalker
08a9345fcc e4s ci: add packages: drishti, dxt-explorer (#39597)
* e4s ci: add packages: drishti, dxt-explorer

* e4s oneapi ci: comment out dxt-explorer until r%oneapi issue #40257 is resolved
2023-09-30 06:15:58 +09:00
John W. Parent
7d072cc16f Windows: detect all available SDK versions (#39823)
Currently, Windows SDK detection will only pick up SDK versions
related to the current version of Windows Spack is running on.
However, in some circumstances, we want to detect other versions
of the SDK, for example when compiling on Windows 11 for Windows
10 to ensure an API is compatible with Win10.
2023-09-29 20:17:10 +00:00
snehring
d81f457e7a modeltest-ng: adding new version, swapping maintainer (#40217)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2023-09-29 20:19:29 +02:00
360 changed files with 13150 additions and 3705 deletions


@@ -23,7 +23,7 @@ jobs:
         operating_system: ["ubuntu-latest", "macos-latest"]
     steps:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
       with:
         python-version: ${{inputs.python_version}}
     - name: Install Python packages


@@ -42,8 +42,8 @@ jobs:
       shell: runuser -u spack-test -- bash {0}
       run: |
         source share/spack/setup-env.sh
+        spack bootstrap disable github-actions-v0.5
         spack bootstrap disable github-actions-v0.4
-        spack bootstrap disable github-actions-v0.3
         spack external find cmake bison
         spack -d solve zlib
         tree ~/.spack/bootstrap/store/
@@ -80,8 +80,8 @@ jobs:
       shell: runuser -u spack-test -- bash {0}
       run: |
         source share/spack/setup-env.sh
+        spack bootstrap disable github-actions-v0.5
         spack bootstrap disable github-actions-v0.4
-        spack bootstrap disable github-actions-v0.3
         spack external find cmake bison
         spack -d solve zlib
         tree ~/.spack/bootstrap/store/
@@ -145,8 +145,8 @@ jobs:
     - name: Bootstrap clingo
       run: |
         source share/spack/setup-env.sh
+        spack bootstrap disable github-actions-v0.5
         spack bootstrap disable github-actions-v0.4
-        spack bootstrap disable github-actions-v0.3
         spack external find cmake bison
         spack -d solve zlib
         tree ~/.spack/bootstrap/store/
@@ -163,8 +163,8 @@ jobs:
       run: |
         source share/spack/setup-env.sh
         export PATH=/usr/local/opt/bison@2.7/bin:$PATH
+        spack bootstrap disable github-actions-v0.5
         spack bootstrap disable github-actions-v0.4
-        spack bootstrap disable github-actions-v0.3
         spack external find --not-buildable cmake bison
         spack -d solve zlib
         tree ~/.spack/bootstrap/store/
@@ -265,6 +265,7 @@ jobs:
       shell: runuser -u spack-test -- bash {0}
       run: |
         source share/spack/setup-env.sh
+        spack bootstrap disable github-actions-v0.4
         spack bootstrap disable spack-install
         spack -d gpg list
         tree ~/.spack/bootstrap/store/
@@ -302,8 +303,8 @@ jobs:
       run: |
         source share/spack/setup-env.sh
         spack solve zlib
+        spack bootstrap disable github-actions-v0.5
         spack bootstrap disable github-actions-v0.4
-        spack bootstrap disable github-actions-v0.3
         spack -d gpg list
         tree ~/.spack/bootstrap/store/
@@ -320,6 +321,7 @@ jobs:
     - name: Bootstrap GnuPG
       run: |
         source share/spack/setup-env.sh
+        spack bootstrap disable github-actions-v0.4
         spack bootstrap disable spack-install
         spack -d gpg list
         tree ~/.spack/bootstrap/store/
@@ -338,8 +340,8 @@ jobs:
       run: |
         source share/spack/setup-env.sh
         spack solve zlib
+        spack bootstrap disable github-actions-v0.5
         spack bootstrap disable github-actions-v0.4
-        spack bootstrap disable github-actions-v0.3
         spack -d gpg list
         tree ~/.spack/bootstrap/store/


@@ -17,7 +17,7 @@ jobs:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
       with:
         python-version: 3.9
     - name: Install Python packages


@@ -2,6 +2,6 @@ black==23.9.1
 clingo==5.6.2
 flake8==6.1.0
 isort==5.12.0
-mypy==1.5.1
+mypy==1.6.0
 types-six==1.16.21.9
 vermin==1.5.2


@@ -15,7 +15,7 @@ jobs:
     strategy:
       matrix:
         os: [ubuntu-latest]
-        python-version: ['3.7', '3.8', '3.9', '3.10', '3.11']
+        python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
         concretizer: ['clingo']
         on_develop:
         - ${{ github.ref == 'refs/heads/develop' }}
@@ -45,12 +45,16 @@ jobs:
           os: ubuntu-latest
           concretizer: 'clingo'
           on_develop: false
+        - python-version: '3.11'
+          os: ubuntu-latest
+          concretizer: 'clingo'
+          on_develop: false
     steps:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
       with:
         python-version: ${{ matrix.python-version }}
     - name: Install System packages
@@ -97,7 +101,7 @@ jobs:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
       with:
         python-version: '3.11'
     - name: Install System packages
@@ -155,7 +159,7 @@ jobs:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
       with:
         python-version: '3.11'
     - name: Install System packages
@@ -185,12 +189,12 @@ jobs:
     runs-on: macos-latest
     strategy:
       matrix:
-        python-version: ["3.10"]
+        python-version: ["3.11"]
     steps:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608 # @v2
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
       with:
         python-version: ${{ matrix.python-version }}
     - name: Install Python packages


@@ -19,7 +19,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
       with:
         python-version: '3.11'
         cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
       with:
         python-version: '3.11'
         cache: 'pip'


@@ -18,7 +18,7 @@ jobs:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
       with:
         python-version: 3.9
     - name: Install Python packages
@@ -42,7 +42,7 @@ jobs:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
       with:
         python-version: 3.9
     - name: Install Python packages
@@ -66,7 +66,7 @@ jobs:
     - uses: actions/checkout@8ade135a41bc03ea155e62e844d188df1ea18608
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
+    - uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
       with:
         python-version: 3.9
     - name: Install Python packages


@@ -27,12 +27,53 @@
 # And here's the CITATION.cff format:
 #
 cff-version: 1.2.0
+type: software
 message: "If you are referencing Spack in a publication, please cite the paper below."
+title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
+abstract: >-
+  Large HPC centers spend considerable time supporting software for thousands of users, but the complexity of HPC software is quickly outpacing the capabilities of existing software management tools.
+  Scientific applications require specific versions of compilers, MPI, and other dependency libraries, so using a single, standard software stack is infeasible.
+  However, managing many configurations is difficult because the configuration space is combinatorial in size.
+  We introduce Spack, a tool used at Lawrence Livermore National Laboratory to manage this complexity.
+  Spack provides a novel, recursive specification syntax to invoke parametric builds of packages and dependencies.
+  It allows any number of builds to coexist on the same system, and it ensures that installed packages can find their dependencies, regardless of the environment.
+  We show through real-world use cases that Spack supports diverse and demanding applications, bringing order to HPC software chaos.
 preferred-citation:
+  title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
   type: conference-paper
   doi: "10.1145/2807591.2807623"
-  url: "https://github.com/spack/spack"
+  url: "https://tgamblin.github.io/pubs/spack-sc15.pdf"
+  authors:
+  - family-names: "Gamblin"
+    given-names: "Todd"
+  - family-names: "LeGendre"
+    given-names: "Matthew"
+  - family-names: "Collette"
+    given-names: "Michael R."
+  - family-names: "Lee"
+    given-names: "Gregory L."
+  - family-names: "Moody"
+    given-names: "Adam"
+  - family-names: "de Supinski"
+    given-names: "Bronis R."
+  - family-names: "Futral"
+    given-names: "Scott"
+  conference:
+    name: "Supercomputing 2015 (SC15)"
+    city: "Austin"
+    region: "Texas"
+    country: "US"
+    date-start: 2015-11-15
+    date-end: 2015-11-20
+  month: 11
+  year: 2015
+  identifiers:
+  - description: "The concept DOI of the work."
+    type: doi
+    value: 10.1145/2807591.2807623
+  - description: "The DOE Document Release Number of the work"
+    type: other
+    value: "LLNL-CONF-669890"
 authors:
 - family-names: "Gamblin"
   given-names: "Todd"
 - family-names: "LeGendre"
@@ -47,12 +88,3 @@ preferred-citation:
   given-names: "Bronis R."
 - family-names: "Futral"
   given-names: "Scott"
-title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
-conference:
-  name: "Supercomputing 2015 (SC15)"
-  city: "Austin"
-  region: "Texas"
-  country: "USA"
-month: November 15-20
-year: 2015
-notes: LLNL-CONF-669890


@@ -9,15 +9,15 @@ bootstrap:
   # may not be able to bootstrap all the software that Spack needs,
   # depending on its type.
   sources:
+  - name: 'github-actions-v0.5'
+    metadata: $spack/share/spack/bootstrap/github-actions-v0.5
   - name: 'github-actions-v0.4'
     metadata: $spack/share/spack/bootstrap/github-actions-v0.4
-  - name: 'github-actions-v0.3'
-    metadata: $spack/share/spack/bootstrap/github-actions-v0.3
   - name: 'spack-install'
     metadata: $spack/share/spack/bootstrap/spack-install
   trusted:
     # By default we trust bootstrapping from sources and from binaries
     # produced on Github via the workflow
+    github-actions-v0.5: true
     github-actions-v0.4: true
-    github-actions-v0.3: true
     spack-install: true


@@ -41,4 +41,4 @@ concretizer:
     # "none": allows a single node for any package in the DAG.
     # "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
     # "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
-    strategy: none
+    strategy: minimal


@@ -1,4 +1,3 @@
-package_list.html
 command_index.rst
 spack*.rst
 llnl*.rst


@@ -45,7 +45,8 @@ Listing available packages
 To install software with Spack, you need to know what software is
 available. You can see a list of available package names at the
-:ref:`package-list` webpage, or using the ``spack list`` command.
+`packages.spack.io <https://packages.spack.io>`_ website, or
+using the ``spack list`` command.
 
 .. _cmd-spack-list:
 
@@ -60,7 +61,7 @@ can install:
    :ellipsis: 10
 
 There are thousands of them, so we've truncated the output above, but you
-can find a :ref:`full list here <package-list>`.
+can find a `full list here <https://packages.spack.io>`_.
 Packages are listed by name in alphabetical order.
 A pattern to match with no wildcards, ``*`` or ``?``,
 will be treated as though it started and ended with


@@ -3,6 +3,103 @@
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _concretizer-options:
==========================================
Concretization Settings (concretizer.yaml)
==========================================
The ``concretizer.yaml`` configuration file allows to customize aspects of the
algorithm used to select the dependencies you install. The default configuration
is the following:
.. literalinclude:: _spack_root/etc/spack/defaults/concretizer.yaml
:language: yaml
--------------------------------
Reuse already installed packages
--------------------------------
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
.. code-block:: console
$ spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
.. code-block:: console
$ spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: true`` is the default.
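
The same preference can be made persistent in ``concretizer.yaml``. A minimal
sketch that disables reuse by default:

.. code-block:: yaml

   concretizer:
     reuse: false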
------------------------------------------
Selection of the target microarchitectures
------------------------------------------
The options under the ``targets`` attribute control which targets are considered during a solve.
Currently the options in this section are only configurable from the ``concretizer.yaml`` file
and there are no corresponding command line arguments to enable them for a single solve.
The ``granularity`` option can take two possible values: ``microarchitectures`` and ``generic``.
If set to:
.. code-block:: yaml
concretizer:
targets:
granularity: microarchitectures
Spack will consider all the microarchitectures known to ``archspec`` to label nodes for
compatibility. If instead the option is set to:
.. code-block:: yaml
concretizer:
targets:
granularity: generic
Spack will consider only generic microarchitectures. For instance, when running on a
Haswell node, Spack will consider ``haswell`` as the best target in the former case and
``x86_64_v3`` as the best target in the latter case.
The ``host_compatible`` option is a Boolean option that determines whether or not the
microarchitectures considered during the solve are constrained to be compatible with the
host Spack is currently running on. For instance, if this option is set to ``true``, a
user cannot concretize for ``target=icelake`` while running on a Haswell node.
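
As a minimal sketch, allowing targets that are not compatible with the current
host would look like:

.. code-block:: yaml

   concretizer:
     targets:
       host_compatible: false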
---------------
Duplicate nodes
---------------
The ``duplicates`` attribute controls whether the DAG can contain multiple configurations of
the same package. This is mainly relevant for build dependencies, which may have their version
pinned by some nodes, and thus be required at different versions by different nodes in the same
DAG.
The ``strategy`` option controls how the solver deals with duplicates. If the value is ``none``,
then a single configuration per package is allowed in the DAG. This means, for instance, that only
a single ``cmake`` or a single ``py-setuptools`` version is allowed. The result would be a slightly
faster concretization, at the expense of making a few specs unsolvable.
If the value is ``minimal``, Spack will allow packages tagged as ``build-tools`` to have duplicates.
This makes it possible, for instance, to concretize specs whose nodes require different,
incompatible ranges of some build tool. In the figure below, the latest ``py-shapely`` requires a newer
``py-setuptools``, while ``py-numpy`` still needs an older version:
.. figure:: images/shapely_duplicates.svg
:scale: 70 %
:align: center
Up to Spack v0.20 ``duplicates:strategy:none`` was the default (and only) behavior. From Spack v0.21 the
default behavior is ``duplicates:strategy:minimal``.
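
To pin the strategy explicitly, for instance to restore the pre-v0.21 behavior,
it can be set in ``concretizer.yaml``. A minimal sketch:

.. code-block:: yaml

   concretizer:
     duplicates:
       strategy: none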
.. _build-settings:
================================
@@ -232,76 +329,6 @@ Specific limitations include:
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
.. _concretizer-options:
----------------------
Concretizer options
----------------------
``packages.yaml`` gives the concretizer preferences for specific packages,
but you can also use ``concretizer.yaml`` to customize aspects of the
algorithm it uses to select the dependencies you install:
.. literalinclude:: _spack_root/etc/spack/defaults/concretizer.yaml
:language: yaml
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Reuse already installed packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
.. code-block:: console
% spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
.. code-block:: console
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: true`` is the default.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Selection of the target microarchitectures
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The options under the ``targets`` attribute control which targets are considered during a solve.
Currently the options in this section are only configurable from the ``concretizer.yaml`` file
and there are no corresponding command line arguments to enable them for a single solve.
The ``granularity`` option can take two possible values: ``microarchitectures`` and ``generic``.
If set to:
.. code-block:: yaml
concretizer:
targets:
granularity: microarchitectures
Spack will consider all the microarchitectures known to ``archspec`` to label nodes for
compatibility. If instead the option is set to:
.. code-block:: yaml
concretizer:
targets:
granularity: generic
Spack will consider only generic microarchitectures. For instance, when running on a
Haswell node, Spack will consider ``haswell`` as the best target in the former case and
``x86_64_v3`` as the best target in the latter case.
The ``host_compatible`` option is a Boolean option that determines whether or not the
microarchitectures considered during the solve are constrained to be compatible with the
host Spack is currently running on. For instance, if this option is set to ``true``, a
user cannot concretize for ``target=icelake`` while running on a Haswell node.
.. _package-requirements:
--------------------


@@ -25,8 +25,8 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. See the :ref:`package-list` for the full list of available
oneAPI packages or use::
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI


@@ -48,9 +48,6 @@
os.environ["COLIFY_SIZE"] = "25x120"
os.environ["COLUMNS"] = "120"
# Generate full package list if needed
subprocess.call(["spack", "list", "--format=html", "--update=package_list.html"])
# Generate a command index if an update is needed
subprocess.call(
[


@@ -212,18 +212,12 @@ under the ``container`` attribute of environments:
final:
- libgomp
# Extra instructions
extra_instructions:
final: |
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ "' >> ~/.bashrc
# Labels for the image
labels:
app: "gromacs"
mpi: "mpich"
A detailed description of the options available can be found in the
:ref:`container_config_options` section.
A detailed description of the options available can be found in the :ref:`container_config_options` section.
-------------------
Setting Base Images
@@ -525,6 +519,13 @@ the example below:
COPY data /share/myapp/data
{% endblock %}
The Dockerfile is generated by running:
.. code-block:: console
$ spack -e /opt/environment containerize
Note that the environment must be active for spack to read the template.
The recipe that gets generated contains the two extra instructions that we added in our template extension:
.. code-block:: Dockerfile


@@ -310,53 +310,11 @@ Once all of the dependencies are installed, you can try building the documentati
$ make clean
$ make
If you see any warning or error messages, you will have to correct those before
your PR is accepted.
If you are editing the documentation, you should obviously be running the
documentation tests. But even if you are simply adding a new package, your
changes could cause the documentation tests to fail:
.. code-block:: console
package_list.rst:8745: WARNING: Block quote ends without a blank line; unexpected unindent.
At first, this error message will mean nothing to you, since you didn't edit
that file. Until you look at line 8745 of the file in question:
.. code-block:: rst
Description:
NetCDF is a set of software libraries and self-describing, machine-
independent data formats that support the creation, access, and sharing
of array-oriented scientific data.
Our documentation includes :ref:`a list of all Spack packages <package-list>`.
If you add a new package, its docstring is added to this page. The problem in
this case was that the docstring looked like:
.. code-block:: python
class Netcdf(Package):
"""
NetCDF is a set of software libraries and self-describing,
machine-independent data formats that support the creation,
access, and sharing of array-oriented scientific data.
"""
Docstrings cannot start with a newline character, or else Sphinx will complain.
Instead, they should look like:
.. code-block:: python
class Netcdf(Package):
"""NetCDF is a set of software libraries and self-describing,
machine-independent data formats that support the creation,
access, and sharing of array-oriented scientific data."""
Documentation changes can result in much more obfuscated warning messages.
If you don't understand what they mean, feel free to ask when you submit
your PR.
If you see any warning or error messages, you will have to correct those before your PR
is accepted. If you are editing the documentation, you should be running the
documentation tests to make sure there are no errors. Documentation changes can result
in some obfuscated warning messages. If you don't understand what they mean, feel free
to ask when you submit your PR.
--------
Coverage

(large file diff suppressed)

(image file added: 108 KiB)

@@ -54,9 +54,16 @@ or refer to the full manual below.
features
getting_started
basic_usage
Tutorial: Spack 101 <https://spack-tutorial.readthedocs.io>
replace_conda_homebrew
.. toctree::
:maxdepth: 2
:caption: Links
Tutorial (spack-tutorial.rtfd.io) <https://spack-tutorial.readthedocs.io>
Packages (packages.spack.io) <https://packages.spack.io>
Binaries (binaries.spack.io) <https://cache.spack.io>
.. toctree::
:maxdepth: 2
:caption: Reference
@@ -72,7 +79,6 @@ or refer to the full manual below.
repositories
binary_caches
command_index
package_list
chain
extensions
pipelines


@@ -1,17 +0,0 @@
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _package-list:
============
Package List
============
This is a list of things you can install using Spack. It is
automatically generated based on the packages in this Spack
version.
.. raw:: html
:file: package_list.html


@@ -3635,7 +3635,8 @@ regardless of the build system. The arguments for the phase are:
The arguments ``spec`` and ``prefix`` are passed only for convenience, as they always
correspond to ``self.spec`` and ``self.spec.prefix`` respectively.
If the ``package.py`` encodes builders explicitly, the signature for a phase changes slightly:
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
@@ -3645,56 +3646,6 @@ If the ``package.py`` encodes builders explicitly, the signature for a phase cha
In this case the package is passed as the second argument, and ``self`` is the builder instance.
.. _multiple_build_systems:
^^^^^^^^^^^^^^^^^^^^^^
Multiple build systems
^^^^^^^^^^^^^^^^^^^^^^
There are cases where a package actively supports two build systems, or changes build systems
as it evolves, or needs different build systems on different platforms. Spack allows dealing with
these cases natively, if a recipe is written using builders explicitly.
For instance, software that supports two build systems unconditionally should derive from
both ``*Package`` base classes, and declare the possible use of multiple build systems using
a directive:
.. code-block:: python
class ArpackNg(CMakePackage, AutotoolsPackage):
build_system("cmake", "autotools", default="cmake")
In this case the software can be built with both ``autotools`` and ``cmake``. Since the package
supports multiple build systems, it is necessary to declare which one is the default. The ``package.py``
will likely contain some overriding of default builder methods:
.. code-block:: python
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
pass
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
pass
In more complex cases it might happen that the build system changes according to certain conditions,
for instance across versions. That can be expressed with conditional variant values:
.. code-block:: python
class ArpackNg(CMakePackage, AutotoolsPackage):
build_system(
conditional("cmake", when="@0.64:"),
conditional("autotools", when="@:0.63"),
default="cmake",
)
In the example, the directive imposes a change from ``Autotools`` to ``CMake`` when going
from ``v0.63`` to ``v0.64``.
^^^^^^^^^^^^^^^^^^
Mixin base classes
^^^^^^^^^^^^^^^^^^
@@ -3741,6 +3692,106 @@ for instance:
In the example above ``Cp2k`` inherits all the conflicts and variants that ``CudaPackage`` defines.
.. _multiple_build_systems:
----------------------
Multiple build systems
----------------------
There are cases where a package actively supports two build systems, or changes build systems
as it evolves, or needs different build systems on different platforms. Spack allows dealing with
these cases by splitting the build instructions into separate builder classes.
For instance, software that supports two build systems unconditionally should derive from
both ``*Package`` base classes, and declare the possible use of multiple build systems using
a directive:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
variant("my_feature", default=True)
build_system("cmake", "autotools", default="cmake")
In this case the software can be built with both ``autotools`` and ``cmake``. Since the package
supports multiple build systems, it is necessary to declare which one is the default.
Additional build instructions are split into separate builder classes:
.. code-block:: python
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
return [
self.define_from_variant("MY_FEATURE", "my_feature")
]
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
return self.with_or_without("my-feature", variant="my_feature")
In this example, ``spack install example +my_feature build_system=cmake`` will
pick the ``CMakeBuilder`` and invoke ``cmake -DMY_FEATURE:BOOL=ON``.
Similarly, ``spack install example +my_feature build_system=autotools`` will pick
the ``AutotoolsBuilder`` and invoke ``./configure --with-my-feature``.
Dependencies are always specified in the package class. When some dependencies
depend on the choice of the build system, it is possible to use ``when`` conditions as
usual:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
build_system("cmake", "autotools", default="cmake")
# Runtime dependencies
depends_on("ncurses")
depends_on("libxml2")
# Lower bounds for cmake only apply when using cmake as the build system
with when("build_system=cmake"):
depends_on("cmake@3.18:", when="@2.0:", type="build")
depends_on("cmake@3:", type="build")
# Specify extra build dependencies used only in the configure script
with when("build_system=autotools"):
depends_on("perl", type="build")
depends_on("pkgconfig", type="build")
Very often projects switch from one build system to another, or add support
for a new build system from a certain version, which means that the choice
of the build system typically depends on a version range. Those situations can
be handled by using conditional values in the ``build_system`` directive:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
build_system(
conditional("cmake", when="@0.64:"),
conditional("autotools", when="@:0.63"),
default="cmake",
)
In the example, the directive imposes a change from ``Autotools`` to ``CMake`` when going
from ``v0.63`` to ``v0.64``.
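
With such a declaration, the concretizer picks the build system from the version.
As an illustrative sketch (output elided):

.. code-block:: console

   $ spack spec example@0.63   # solves with build_system=autotools
   $ spack spec example@0.64   # solves with build_system=cmake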
The ``build_system`` can be used as an ordinary variant, which also means that it can
be used in ``depends_on`` statements. This can be useful when a package *requires* that
its dependency has a CMake config file, meaning that the dependent can only build when the
dependency is built with CMake, and not Autotools. In that case, you can force the choice
of the build system in the dependent:
.. code-block:: python
class Dependent(CMakePackage):
depends_on("example build_system=cmake")
.. _install-environment:
-----------------------


@@ -4,7 +4,7 @@
SPDX-License-Identifier: (Apache-2.0 OR MIT)
=====================================
Using Spack to Replace Homebrew/Conda
Spack for Homebrew/Conda Users
=====================================
Spack is an incredibly powerful package manager, designed for supercomputers
@@ -191,18 +191,18 @@ The ``--fresh`` flag tells Spack to use the latest version of every package
where possible instead of trying to optimize for reuse of existing installed
packages.
The ``--force`` flag in addition tells Spack to overwrite its previous
concretization decisions, allowing you to choose a new version of Python.
If any of the new packages like Bash are already installed, ``spack install``
The ``--force`` flag in addition tells Spack to overwrite its previous
concretization decisions, allowing you to choose a new version of Python.
If any of the new packages like Bash are already installed, ``spack install``
won't re-install them, it will keep the symlinks in place.
-----------------------------------
Updating & Cleaning Up Old Packages
-----------------------------------
If you're looking to mimic the behavior of Homebrew, you may also want to
clean up out-of-date packages from your environment after an upgrade. To
upgrade your entire software stack within an environment and clean up old
If you're looking to mimic the behavior of Homebrew, you may also want to
clean up out-of-date packages from your environment after an upgrade. To
upgrade your entire software stack within an environment and clean up old
package versions, simply run the following commands:
.. code-block:: console
@@ -212,9 +212,9 @@ package versions, simply run the following commands:
$ spack concretize --fresh --force
$ spack install
$ spack gc
Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
spack's garbage collection system that these packages should be cleaned up.
Don't worry however, this will not remove your entire environment.
@@ -223,8 +223,8 @@ a fresh concretization and will re-mark any packages that should remain
installed as "explicitly" installed.
**Note:** if you use multiple spack environments you should re-run ``spack install``
in each of your environments prior to running ``spack gc`` to prevent spack
from uninstalling any shared packages that are no longer required by the
in each of your environments prior to running ``spack gc`` to prevent spack
from uninstalling any shared packages that are no longer required by the
environment you just upgraded.
--------------


@@ -2,12 +2,12 @@ sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==1.3.0
python-levenshtein==0.21.1
python-levenshtein==0.23.0
docutils==0.18.1
pygments==2.16.1
urllib3==2.0.5
urllib3==2.0.6
pytest==7.4.2
isort==5.12.0
black==23.9.1
flake8==6.1.0
mypy==1.5.1
mypy==1.6.0


@@ -1,9 +1,8 @@
Name, Supported Versions, Notes, Requirement Reason
Python, 3.6--3.11, , Interpreter for Spack
Python, 3.6--3.12, , Interpreter for Spack
C/C++ Compilers, , , Building software
make, , , Build software
patch, , , Build software
bash, , , Compiler wrappers
tar, , , Extract/create archives
gzip, , , Compress/Decompress archives
unzip, , , Compress/Decompress archives


@@ -23,7 +23,7 @@
import warnings
from contextlib import closing, contextmanager
from gzip import GzipFile
from typing import List, NamedTuple, Optional, Union
from typing import Dict, List, NamedTuple, Optional, Tuple, Union
from urllib.error import HTTPError, URLError
import llnl.util.filesystem as fsys
@@ -216,11 +216,11 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
with self._index_file_cache.read_transaction(cache_key):
db._read_from_file(cache_path)
except spack_db.InvalidDatabaseVersionError as e:
msg = (
tty.warn(
f"you need a newer Spack version to read the buildcache index for the "
f"following mirror: '{mirror_url}'. {e.database_version_message}"
)
raise BuildcacheIndexError(msg) from e
return
spec_list = db.query_local(installed=False, in_buildcache=True)
@@ -625,8 +625,7 @@ def buildinfo_file_name(prefix):
"""
Filename of the binary package meta-data file
"""
name = os.path.join(prefix, ".spack/binary_distribution")
return name
return os.path.join(prefix, ".spack/binary_distribution")
def read_buildinfo_file(prefix):
@@ -914,7 +913,7 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
index_json_path,
url_util.join(cache_prefix, "index.json"),
keep_original=False,
extra_args={"ContentType": "application/json"},
extra_args={"ContentType": "application/json", "CacheControl": "no-cache"},
)
# Push the hash
@@ -922,7 +921,7 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
index_hash_path,
url_util.join(cache_prefix, "index.json.hash"),
keep_original=False,
extra_args={"ContentType": "text/plain"},
extra_args={"ContentType": "text/plain", "CacheControl": "no-cache"},
)
@@ -1158,57 +1157,99 @@ def gzip_compressed_tarfile(path):
yield tar
def deterministic_tarinfo(tarinfo: tarfile.TarInfo):
# We only add files, symlinks, hardlinks, and directories
# No character devices, block devices and FIFOs should ever enter a tarball.
if tarinfo.isdev():
return None
# For distribution, it makes no sense to include user/group data, since (a) they don't exist
# on other machines, and (b) they lead to surprises as `tar x` run as root will change
# ownership if it can. We want to extract as the current user. By setting owner to root,
# root will extract as root, and non-privileged user will extract as themselves.
tarinfo.uid = 0
tarinfo.gid = 0
tarinfo.uname = ""
tarinfo.gname = ""
# Reset mtime to epoch time, our prefixes are not truly immutable, so files may get
# touched; as long as the content does not change, this ensures we get stable tarballs.
tarinfo.mtime = 0
# Normalize mode
if tarinfo.isfile() or tarinfo.islnk():
# If user can execute, use 0o755; else 0o644
# This is to avoid potentially unsafe world-writable & executable files that may get
# extracted when Python or tar is run with privileges
tarinfo.mode = 0o644 if tarinfo.mode & 0o100 == 0 else 0o755
else: # symbolic link and directories
tarinfo.mode = 0o755
return tarinfo
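# A usage sketch (illustrative, not part of this change): passing the function as a
# tarfile filter normalizes every entry added to the archive, e.g.:
#
#   with gzip_compressed_tarfile("/tmp/pkg.tar.gz") as tar:
#       tar.add(name="/opt/prefix", arcname="prefix", filter=deterministic_tarinfo)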
def _tarinfo_name(p: str):
return p.lstrip("/")
def tar_add_metadata(tar: tarfile.TarFile, path: str, data: dict):
# Serialize buildinfo for the tarball
bstring = syaml.dump(data, default_flow_style=True).encode("utf-8")
tarinfo = tarfile.TarInfo(name=path)
tarinfo.size = len(bstring)
tar.addfile(deterministic_tarinfo(tarinfo), io.BytesIO(bstring))
def tarfile_of_spec_prefix(tar: tarfile.TarFile, prefix: str) -> None:
"""Create a tarfile of an install prefix of a spec. Skips existing buildinfo file.
Only adds regular files, symlinks and dirs. Skips devices, fifos. Preserves hardlinks.
Normalizes permissions like git. Tar entries are added in depth-first pre-order, with
dir entries partitioned by file | dir, and sorted alphabetically, for reproducibility.
Partitioning ensures only one dir is in memory at a time, and sorting improves compression.
Args:
tar: tarfile object to add files to
prefix: absolute install prefix of spec"""
if not os.path.isabs(prefix) or not os.path.isdir(prefix):
raise ValueError(f"prefix '{prefix}' must be an absolute path to a directory")
hardlink_to_tarinfo_name: Dict[Tuple[int, int], str] = dict()
stat_key = lambda stat: (stat.st_dev, stat.st_ino)
try: # skip buildinfo file if it exists
files_to_skip = [stat_key(os.lstat(buildinfo_file_name(prefix)))]
except OSError:
files_to_skip = []
dir_stack = [prefix]
while dir_stack:
dir = dir_stack.pop()
# Add the dir before its contents
dir_info = tarfile.TarInfo(_tarinfo_name(dir))
dir_info.type = tarfile.DIRTYPE
dir_info.mode = 0o755
tar.addfile(dir_info)
# Sort by name: reproducible & improves compression
with os.scandir(dir) as it:
entries = sorted(it, key=lambda entry: entry.name)
new_dirs = []
for entry in entries:
if entry.is_dir(follow_symlinks=False):
new_dirs.append(entry.path)
continue
file_info = tarfile.TarInfo(_tarinfo_name(entry.path))
s = entry.stat(follow_symlinks=False)
# Skip existing binary distribution files.
id = stat_key(s)
if id in files_to_skip:
continue
# Normalize the mode
file_info.mode = 0o644 if s.st_mode & 0o100 == 0 else 0o755
if entry.is_symlink():
file_info.type = tarfile.SYMTYPE
file_info.linkname = os.readlink(entry.path)
tar.addfile(file_info)
elif entry.is_file(follow_symlinks=False):
# Deduplicate hardlinks
if s.st_nlink > 1:
if id in hardlink_to_tarinfo_name:
file_info.type = tarfile.LNKTYPE
file_info.linkname = hardlink_to_tarinfo_name[id]
tar.addfile(file_info)
continue
hardlink_to_tarinfo_name[id] = file_info.name
# If file not yet seen, copy it.
file_info.type = tarfile.REGTYPE
file_info.size = s.st_size
with open(entry.path, "rb") as f:
tar.addfile(file_info, f)
dir_stack.extend(reversed(new_dirs)) # we pop, so reverse to stay alphabetical
def deterministic_tarinfo_without_buildinfo(tarinfo: tarfile.TarInfo):
"""Skip buildinfo file when creating a tarball, and normalize other tarinfo fields."""
if tarinfo.name.endswith("/.spack/binary_distribution"):
return None
return deterministic_tarinfo(tarinfo)
def _do_create_tarball(tarfile_path: str, binaries_dir: str, pkg_dir: str, buildinfo: dict):
def _do_create_tarball(tarfile_path: str, binaries_dir: str, buildinfo: dict):
with gzip_compressed_tarfile(tarfile_path) as tar:
tar.add(name=binaries_dir, arcname=pkg_dir, filter=deterministic_tarinfo_without_buildinfo)
tar_add_metadata(tar, buildinfo_file_name(pkg_dir), buildinfo)
# Tarball the install prefix
tarfile_of_spec_prefix(tar, binaries_dir)
# Serialize buildinfo for the tarball
bstring = syaml.dump(buildinfo, default_flow_style=True).encode("utf-8")
tarinfo = tarfile.TarInfo(name=_tarinfo_name(buildinfo_file_name(binaries_dir)))
tarinfo.type = tarfile.REGTYPE
tarinfo.size = len(bstring)
tarinfo.mode = 0o644
tar.addfile(tarinfo, io.BytesIO(bstring))
class PushOptions(NamedTuple):
@@ -1280,14 +1321,12 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
):
raise NoOverwriteException(url_util.format(remote_specfile_path))
pkg_dir = os.path.basename(spec.prefix.rstrip(os.path.sep))
binaries_dir = spec.prefix
# create info for later relocation and create tar
buildinfo = get_buildinfo_dict(spec)
_do_create_tarball(tarfile_path, binaries_dir, pkg_dir, buildinfo)
_do_create_tarball(tarfile_path, binaries_dir, buildinfo)
# get the sha256 checksum of the tarball
checksum = checksum_tarball(tarfile_path)


@@ -228,7 +228,7 @@ def _install_and_test(
if not abstract_spec.intersects(candidate_spec):
continue
if python_spec is not None and python_spec not in abstract_spec:
if python_spec is not None and not abstract_spec.intersects(f"^{python_spec}"):
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:


@@ -142,10 +142,10 @@ def flags_to_build_system_args(self, flags):
# We specify for each of them.
if flags["ldflags"]:
ldflags = " ".join(flags["ldflags"])
ld_string = "-DCMAKE_{0}_LINKER_FLAGS={1}"
# cmake has separate linker arguments for types of builds.
for type in ["EXE", "MODULE", "SHARED", "STATIC"]:
self.cmake_flag_args.append(ld_string.format(type, ldflags))
self.cmake_flag_args.append(f"-DCMAKE_EXE_LINKER_FLAGS={ldflags}")
self.cmake_flag_args.append(f"-DCMAKE_MODULE_LINKER_FLAGS={ldflags}")
self.cmake_flag_args.append(f"-DCMAKE_SHARED_LINKER_FLAGS={ldflags}")
# CMake has libs options separated by language. Apply ours to each.
if flags["ldlibs"]:


@@ -10,7 +10,7 @@
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on, variant
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BaseBuilder, execute_build_time_tests
@@ -47,6 +47,13 @@ class MesonPackage(spack.package_base.PackageBase):
variant("strip", default=False, description="Strip targets on install")
depends_on("meson", type="build")
depends_on("ninja", type="build")
# Python detection in meson requires distutils to be importable, but distutils no longer
# exists in Python 3.12. In Spack, we can't use setuptools as distutils replacement,
# because the distutils-precedence.pth startup file that setuptools ships with is not run
# when setuptools is in PYTHONPATH; it has to be in system site-packages. In a future meson
# release, the distutils requirement will be dropped, so this conflict can be relaxed.
# We have patches to make it work with meson 1.1 and above.
conflicts("^python@3.12:", when="^meson@:1.0")
def flags_to_build_system_args(self, flags):
"""Produces a list of all command line arguments to pass the specified


@@ -6,6 +6,7 @@
import os
import re
import shutil
import stat
from typing import Optional
import archspec
@@ -25,6 +26,7 @@
from spack.directives import build_system, depends_on, extends, maintainers
from spack.error import NoHeadersError, NoLibrariesError, SpecError
from spack.install_test import test_part
from spack.util.executable import Executable
from spack.version import Version
from ._checks import BaseBuilder, execute_install_time_tests
@@ -351,6 +353,51 @@ def libs(self):
raise NoLibrariesError(msg.format(self.spec.name, root))
def fixup_shebangs(path: str, old_interpreter: bytes, new_interpreter: bytes):
# Recurse into the install prefix and fixup shebangs
exe = stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH
dirs = [path]
hardlinks = set()
while dirs:
with os.scandir(dirs.pop()) as entries:
for entry in entries:
if entry.is_dir(follow_symlinks=False):
dirs.append(entry.path)
continue
# Only consider files, not symlinks
if not entry.is_file(follow_symlinks=False):
continue
lstat = entry.stat(follow_symlinks=False)
# Skip over files that are not executable
if not (lstat.st_mode & exe):
continue
# Don't modify hardlinks more than once
if lstat.st_nlink > 1:
key = (lstat.st_ino, lstat.st_dev)
if key in hardlinks:
continue
hardlinks.add(key)
# Finally replace shebangs if any.
with open(entry.path, "rb+") as f:
contents = f.read(2)
if contents != b"#!":
continue
contents += f.read()
if old_interpreter not in contents:
continue
f.seek(0)
f.write(contents.replace(old_interpreter, new_interpreter))
f.truncate()
@spack.builder.builder("python_pip")
class PythonPipBuilder(BaseBuilder):
phases = ("install",)
@@ -447,8 +494,36 @@ def global_options(self, spec, prefix):
"""
return []
@property
def _build_venv_path(self):
"""Return the path to the virtual environment used for building when
python is external."""
return os.path.join(self.spec.package.stage.path, "build_env")
@property
def _build_venv_python(self) -> Executable:
"""Return the Python executable in the build virtual environment when
python is external."""
return Executable(os.path.join(self._build_venv_path, "bin", "python"))
def install(self, pkg, spec, prefix):
"""Install everything from build directory."""
python: Executable = spec["python"].command
# Since we invoke pip with --no-build-isolation, we have to make sure that pip cannot
# execute hooks from user and system site-packages.
if spec["python"].external:
# There are no environment variables to disable the system site-packages, so we use a
# virtual environment instead. The downside of this approach is that pip produces
# incorrect shebangs that refer to the virtual environment, which we have to fix up.
python("-m", "venv", "--without-pip", self._build_venv_path)
pip = self._build_venv_python
else:
# For a Spack managed Python, system site-packages is empty/unused by design, so it
# suffices to disable user site-packages, for which there is an environment variable.
pip = python
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
args = PythonPipBuilder.std_args(pkg) + ["--prefix=" + prefix]
@@ -472,8 +547,31 @@ def install(self, pkg, spec, prefix):
else:
args.append(".")
pip = inspect.getmodule(pkg).pip
with fs.working_dir(self.build_directory):
pip(*args)
@spack.builder.run_after("install")
def fixup_shebangs_pointing_to_build(self):
"""When installing a package using an external python, we use a temporary virtual
environment which improves build isolation. The downside is that pip produces shebangs
that point to the temporary virtual environment. This method fixes them up to point to the
underlying Python."""
# No need to fixup shebangs if no build venv was used. (this post install function also
# runs when install was overridden in another package, so check existence of the venv path)
if not os.path.exists(self._build_venv_path):
return
# Use sys.executable, since that's what pip uses.
interpreter = (
lambda python: python("-c", "import sys; print(sys.executable)", output=str)
.strip()
.encode("utf-8")
)
fixup_shebangs(
path=self.spec.prefix,
old_interpreter=interpreter(self._build_venv_python),
new_interpreter=interpreter(self.spec["python"].command),
)
spack.builder.run_after("install")(execute_install_time_tests)


@@ -64,7 +64,7 @@ class RacketBuilder(spack.builder.Builder):
@property
def subdirectory(self):
if self.racket_name:
if self.pkg.racket_name:
return "pkgs/{0}".format(self.pkg.racket_name)
return None


@@ -50,6 +50,9 @@
TEMP_STORAGE_MIRROR_NAME = "ci_temporary_mirror"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
SHARED_PR_MIRROR_URL = "s3://spack-binaries-prs/shared_pr_mirror"
JOB_NAME_FORMAT = (
"{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{arch=architecture}"
)
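# Illustrative (hypothetical) job name produced by formatting a spec with JOB_NAME_FORMAT:
#   "zlib@1.2.13 /abcdefg %gcc@12.2.0 arch=linux-ubuntu22.04-x86_64"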
spack_gpg = spack.main.SpackCommand("gpg")
spack_compiler = spack.main.SpackCommand("compiler")
@@ -69,48 +72,23 @@ def __exit__(self, exc_type, exc_value, exc_traceback):
return False
def get_job_name(spec, osarch, build_group):
"""Given the necessary parts, format the gitlab job name
def get_job_name(spec: spack.spec.Spec, build_group: str = ""):
"""Given a spec and possibly a build group, return the job name. If the
resulting name is longer than 255 characters, it will be truncated.
Arguments:
spec (spack.spec.Spec): Spec job will build
osarch: Architecture TODO: (this is a spack.spec.ArchSpec,
but sphinx doesn't recognize the type and fails).
build_group (str): Name of build group this job belongs to (a CDash
notion)
Returns: The job name
"""
item_idx = 0
format_str = ""
format_args = []
format_str += "{{{0}}}".format(item_idx)
format_args.append(spec.name)
item_idx += 1
format_str += "/{{{0}}}".format(item_idx)
format_args.append(spec.dag_hash(7))
item_idx += 1
format_str += " {{{0}}}".format(item_idx)
format_args.append(spec.version)
item_idx += 1
format_str += " {{{0}}}".format(item_idx)
format_args.append(spec.compiler)
item_idx += 1
format_str += " {{{0}}}".format(item_idx)
format_args.append(osarch)
item_idx += 1
job_name = spec.format(JOB_NAME_FORMAT)
if build_group:
format_str += " {{{0}}}".format(item_idx)
format_args.append(build_group)
item_idx += 1
job_name = "{0} {1}".format(job_name, build_group)
return format_str.format(*format_args)
return job_name[:255]
def _remove_reserved_tags(tags):
@@ -337,7 +315,7 @@ def _spec_matches(spec, match_string):
def _format_job_needs(
dep_jobs, osname, build_group, prune_dag, rebuild_decisions, enable_artifacts_buildcache
dep_jobs, build_group, prune_dag, rebuild_decisions, enable_artifacts_buildcache
):
needs_list = []
for dep_job in dep_jobs:
@@ -347,7 +325,7 @@ def _format_job_needs(
if not prune_dag or rebuild:
needs_list.append(
{
"job": get_job_name(dep_job, dep_job.architecture, build_group),
"job": get_job_name(dep_job, build_group),
"artifacts": enable_artifacts_buildcache,
}
)
@@ -1023,8 +1001,7 @@ def main_script_replacements(cmd):
if "after_script" in job_object:
job_object["after_script"] = _unpack_script(job_object["after_script"])
osname = str(release_spec.architecture)
job_name = get_job_name(release_spec, osname, build_group)
job_name = get_job_name(release_spec, build_group)
job_vars = job_object.setdefault("variables", {})
job_vars["SPACK_JOB_SPEC_DAG_HASH"] = release_spec_dag_hash
@@ -1051,7 +1028,6 @@ def main_script_replacements(cmd):
job_object["needs"].extend(
_format_job_needs(
dep_jobs,
osname,
build_group,
prune_dag,
rebuild_decisions,


@@ -268,7 +268,7 @@ def _matching_specs(specs: List[Spec]) -> List[Spec]:
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
def push_fn(args):
def push_fn(args: argparse.Namespace):
"""create a binary package and push it to a mirror"""
if args.spec_file:
tty.warn(
@@ -414,7 +414,7 @@ def preview_fn(args):
)
def check_fn(args):
def check_fn(args: argparse.Namespace):
"""check specs against remote binary mirror(s) to see if any need to be rebuilt
this command uses the process exit code to indicate its result, specifically, if the
@@ -429,7 +429,7 @@ def check_fn(args):
specs = spack.cmd.parse_specs(args.spec or args.spec_file)
if specs:
specs = _matching_specs(specs, specs)
specs = _matching_specs(specs)
else:
specs = spack.cmd.require_active_env("buildcache check").all_specs()


@@ -7,6 +7,7 @@
import re
import sys
import llnl.string
import llnl.util.lang
from llnl.util import tty
@@ -15,6 +16,7 @@
import spack.spec
import spack.stage
import spack.util.crypto
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.package_base import PackageBase, deprecated_version, preferred_version
from spack.util.editor import editor
@@ -128,18 +130,38 @@ def checksum(parser, args):
remote_versions = pkg.fetch_remote_versions(args.jobs)
url_dict = remote_versions
# A spidered URL can differ from the package.py *computed* URL, pointing to different tarballs.
# For example, GitHub release pages sometimes have multiple tarballs with different shasum:
# - releases/download/1.0/<pkg>-1.0.tar.gz (uploaded tarball)
# - archive/refs/tags/1.0.tar.gz (generated tarball)
# We want to ensure that `spack checksum` and `spack install` ultimately use the same URL, so
# here we check whether the crawled and computed URLs disagree, and if so, prioritize the
# former if that URL exists (verified by sending a HEAD request).
url_changed_for_version = set()
for version, url in url_dict.items():
possible_urls = pkg.all_urls_for_version(version)
if url not in possible_urls:
for possible_url in possible_urls:
if web_util.url_exists(possible_url):
url_dict[version] = possible_url
break
else:
url_changed_for_version.add(version)
if not url_dict:
tty.die(f"Could not find any remote versions for {pkg.name}")
# print an empty line to create a new output section block
print()
elif len(url_dict) > 1 and not args.batch and sys.stdin.isatty():
filtered_url_dict = spack.stage.interactive_version_filter(
url_dict, pkg.versions, url_changes=url_changed_for_version
)
if filtered_url_dict is None:
exit(0)
url_dict = filtered_url_dict
else:
tty.info(f"Found {llnl.string.plural(len(url_dict), 'version')} of {pkg.name}")
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
pkg.name,
keep_stage=args.keep_stage,
batch=(args.batch or len(versions) > 0 or len(url_dict) == 1),
fetch_options=pkg.fetch_options,
url_dict, pkg.name, keep_stage=args.keep_stage, fetch_options=pkg.fetch_options
)
if args.verify:


@@ -5,6 +5,7 @@
import os
import re
import sys
import urllib.parse
import llnl.util.tty as tty
@@ -823,6 +824,11 @@ def get_versions(args, name):
# Find available versions
try:
url_dict = spack.url.find_versions_of_archive(args.url)
if len(url_dict) > 1 and not args.batch and sys.stdin.isatty():
url_dict_filtered = spack.stage.interactive_version_filter(url_dict)
if url_dict_filtered is None:
exit(0)
url_dict = url_dict_filtered
except UndetectableVersionError:
# Use fake versions
tty.warn("Couldn't detect version in: {0}".format(args.url))
@@ -834,11 +840,7 @@ def get_versions(args, name):
url_dict = {version: args.url}
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
name,
first_stage_function=guesser,
keep_stage=args.keep_stage,
batch=(args.batch or len(url_dict) == 1),
url_dict, name, first_stage_function=guesser, keep_stage=args.keep_stage
)
versions = get_version_lines(version_hashes, url_dict)


@@ -112,9 +112,7 @@ def load(parser, args):
if "dependencies" in args.things_to_load:
include_roots = "package" in args.things_to_load
specs = [
dep
for spec in specs
for dep in spec.traverse(root=include_roots, order="post", deptype=("run"))
dep for spec in specs for dep in spec.traverse(root=include_roots, order="post")
]
env_mod = spack.util.environment.EnvironmentModifications()


@@ -272,13 +272,6 @@ def _os_pkg_manager(self):
raise spack.error.SpackError(msg)
return os_pkg_manager
@tengine.context_property
def extra_instructions(self):
Extras = namedtuple("Extra", ["build", "final"])
extras = self.container_config.get("extra_instructions", {})
build, final = extras.get("build", None), extras.get("final", None)
return Extras(build=build, final=final)
@tengine.context_property
def labels(self):
return self.container_config.get("labels", {})


@@ -299,36 +299,36 @@ def find_windows_compiler_bundled_packages() -> List[str]:
class WindowsKitExternalPaths:
plat_major_ver = None
if sys.platform == "win32":
plat_major_ver = str(winOs.windows_version()[0])
@staticmethod
def find_windows_kit_roots() -> Optional[str]:
def find_windows_kit_roots() -> List[str]:
"""Return Windows kit root, typically %programfiles%\\Windows Kits\\10|11\\"""
if sys.platform != "win32":
return None
return []
program_files = os.environ["PROGRAMFILES(x86)"]
kit_base = os.path.join(
program_files, "Windows Kits", WindowsKitExternalPaths.plat_major_ver
)
return kit_base
kit_base = os.path.join(program_files, "Windows Kits", "**")
return glob.glob(kit_base)
@staticmethod
def find_windows_kit_bin_paths(kit_base: Optional[str] = None) -> List[str]:
"""Returns Windows kit bin directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
assert kit_base is not None, "unexpected value for kit_base"
kit_bin = os.path.join(kit_base, "bin")
return glob.glob(os.path.join(kit_bin, "[0-9]*", "*\\"))
assert kit_base, "Unexpectedly empty value for Windows kit base path"
kit_paths = []
for kit in kit_base:
kit_bin = os.path.join(kit, "bin")
kit_paths.extend(glob.glob(os.path.join(kit_bin, "[0-9]*", "*\\")))
return kit_paths
@staticmethod
def find_windows_kit_lib_paths(kit_base: Optional[str] = None) -> List[str]:
"""Returns Windows kit lib directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
assert kit_base is not None, "unexpected value for kit_base"
kit_lib = os.path.join(kit_base, "Lib")
return glob.glob(os.path.join(kit_lib, "[0-9]*", "*", "*\\"))
assert kit_base, "Unexpectedly empty value for Windows kit base path"
kit_paths = []
for kit in kit_base:
kit_lib = os.path.join(kit, "Lib")
kit_paths.extend(glob.glob(os.path.join(kit_lib, "[0-9]*", "*", "*\\")))
return kit_paths
@staticmethod
def find_windows_driver_development_kit_paths() -> List[str]:
@@ -347,23 +347,30 @@ def find_windows_kit_reg_installed_roots_paths() -> List[str]:
if not reg:
# couldn't find key, return empty list
return []
return WindowsKitExternalPaths.find_windows_kit_lib_paths(
reg.get_value("KitsRoot%s" % WindowsKitExternalPaths.plat_major_ver).value
)
kit_root_reg = re.compile(r"KitsRoot[0-9]+")
root_paths = []
for kit_root in filter(kit_root_reg.match, reg.get_values().keys()):
root_paths.extend(
WindowsKitExternalPaths.find_windows_kit_lib_paths(reg.get_value(kit_root).value)
)
return root_paths
@staticmethod
def find_windows_kit_reg_sdk_paths() -> List[str]:
reg = spack.util.windows_registry.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft\\Microsoft SDKs\\Windows\\v%s.0"
% WindowsKitExternalPaths.plat_major_ver,
sdk_paths = []
sdk_regex = re.compile(r"v[0-9]+\.[0-9]+")
windows_reg = spack.util.windows_registry.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft\\Microsoft SDKs\\Windows",
root_key=spack.util.windows_registry.HKEY.HKEY_LOCAL_MACHINE,
)
if not reg:
# couldn't find key, return empty list
return []
return WindowsKitExternalPaths.find_windows_kit_lib_paths(
reg.get_value("InstallationFolder").value
)
for key in filter(sdk_regex.match, [x.name for x in windows_reg.get_subkeys()]):
reg = windows_reg.get_subkey(key)
sdk_paths.extend(
WindowsKitExternalPaths.find_windows_kit_lib_paths(
reg.get_value("InstallationFolder").value
)
)
return sdk_paths
def find_win32_additional_install_paths() -> List[str]:


@@ -11,7 +11,6 @@
def _for_each_enabled(spec, method_name, explicit=None):
"""Calls a method for each enabled module"""
spack.modules.ensure_modules_are_enabled_or_warn()
set_names = set(spack.config.get("modules", {}).keys())
for name in set_names:
enabled = spack.config.get("modules:%s:enable" % name)


@@ -131,12 +131,12 @@ def set_term_title(self, text: str):
if not sys.stdout.isatty():
return
status = "{0} {1}".format(text, self.get_progress())
sys.stdout.write("\033]0;Spack: {0}\007".format(status))
status = f"{text} {self.get_progress()}"
sys.stdout.write(f"\x1b]0;Spack: {status}\x07")
sys.stdout.flush()
def get_progress(self) -> str:
return "[{0}/{1}]".format(self.pkg_num, self.pkg_count)
return f"[{self.pkg_num}/{self.pkg_count}]"
class TermStatusLine:
@@ -175,7 +175,7 @@ def clear(self):
# Move the cursor to the beginning of the first "Waiting for" message and clear
# everything after it.
sys.stdout.write("\x1b[%sF\x1b[J" % lines)
sys.stdout.write(f"\x1b[{lines}F\x1b[J")
sys.stdout.flush()
@@ -220,14 +220,13 @@ def _handle_external_and_upstream(pkg: "spack.package_base.PackageBase", explici
# consists in module file generation and registration in the DB.
if pkg.spec.external:
_process_external_package(pkg, explicit)
_print_installed_pkg("{0} (external {1})".format(pkg.prefix, package_id(pkg)))
_print_installed_pkg(f"{pkg.prefix} (external {package_id(pkg)})")
return True
if pkg.spec.installed_upstream:
tty.verbose(
"{0} is installed in an upstream Spack instance at {1}".format(
package_id(pkg), pkg.spec.prefix
)
f"{package_id(pkg)} is installed in an upstream Spack instance at "
f"{pkg.spec.prefix}"
)
_print_installed_pkg(pkg.prefix)
@@ -296,7 +295,7 @@ def _packages_needed_to_bootstrap_compiler(
package is the bootstrap compiler (``True``) or one of its dependencies
(``False``). The list will be empty if there are no compilers.
"""
tty.debug("Bootstrapping {0} compiler".format(compiler))
tty.debug(f"Bootstrapping {compiler} compiler")
compilers = spack.compilers.compilers_for_spec(compiler, arch_spec=architecture)
if compilers:
return []
@@ -305,9 +304,9 @@ def _packages_needed_to_bootstrap_compiler(
# Set the architecture for the compiler package in a way that allows the
# concretizer to back off if needed for the older bootstrapping compiler
dep.constrain("platform=%s" % str(architecture.platform))
dep.constrain("os=%s" % str(architecture.os))
dep.constrain("target=%s:" % architecture.target.microarchitecture.family.name)
dep.constrain(f"platform={str(architecture.platform)}")
dep.constrain(f"os={str(architecture.os)}")
dep.constrain(f"target={architecture.target.microarchitecture.family.name}:")
# concrete CompilerSpec has less info than concrete Spec
# concretize as Spec to add that information
dep.concretize()
@@ -340,15 +339,15 @@ def _hms(seconds: int) -> str:
if m:
parts.append("%dm" % m)
if s:
parts.append("%.2fs" % s)
parts.append(f"{s:.2f}s")
return " ".join(parts)
def _log_prefix(pkg_name) -> str:
"""Prefix of the form "[pid]: [pkg name]: ..." when printing a status update during
the build."""
pid = "{0}: ".format(os.getpid()) if tty.show_pid() else ""
return "{0}{1}:".format(pid, pkg_name)
pid = f"{os.getpid()}: " if tty.show_pid() else ""
return f"{pid}{pkg_name}:"
def _print_installed_pkg(message: str) -> None:
@@ -375,9 +374,9 @@ def print_install_test_log(pkg: "spack.package_base.PackageBase") -> None:
def _print_timer(pre: str, pkg_id: str, timer: timer.BaseTimer) -> None:
phases = ["{}: {}.".format(p.capitalize(), _hms(timer.duration(p))) for p in timer.phases]
phases.append("Total: {}".format(_hms(timer.duration())))
tty.msg("{0} Successfully installed {1}".format(pre, pkg_id), " ".join(phases))
phases = [f"{p.capitalize()}: {_hms(timer.duration(p))}." for p in timer.phases]
phases.append(f"Total: {_hms(timer.duration())}")
tty.msg(f"{pre} Successfully installed {pkg_id}", " ".join(phases))
def _install_from_cache(
@@ -402,14 +401,14 @@ def _install_from_cache(
)
pkg_id = package_id(pkg)
if not installed_from_cache:
pre = "No binary for {0} found".format(pkg_id)
pre = f"No binary for {pkg_id} found"
if cache_only:
tty.die("{0} when cache-only specified".format(pre))
tty.die(f"{pre} when cache-only specified")
tty.msg("{0}: installing from source".format(pre))
tty.msg(f"{pre}: installing from source")
return False
t.stop()
tty.debug("Successfully extracted {0} from binary cache".format(pkg_id))
tty.debug(f"Successfully extracted {pkg_id} from binary cache")
_write_timer_json(pkg, t, True)
_print_timer(pre=_log_prefix(pkg.name), pkg_id=pkg_id, timer=t)
@@ -430,19 +429,19 @@ def _process_external_package(pkg: "spack.package_base.PackageBase", explicit: b
"""
assert pkg.spec.external, "Expected to post-install/register an external package."
pre = "{s.name}@{s.version} :".format(s=pkg.spec)
pre = f"{pkg.spec.name}@{pkg.spec.version} :"
spec = pkg.spec
if spec.external_modules:
tty.msg("{0} has external module in {1}".format(pre, spec.external_modules))
tty.debug("{0} is actually installed in {1}".format(pre, spec.external_path))
tty.msg(f"{pre} has external module in {spec.external_modules}")
tty.debug(f"{pre} is actually installed in {spec.external_path}")
else:
tty.debug("{0} externally installed in {1}".format(pre, spec.external_path))
tty.debug(f"{pre} externally installed in {spec.external_path}")
try:
# Check if the package was already registered in the DB.
# If this is the case, then only make explicit if required.
tty.debug("{0} already registered in DB".format(pre))
tty.debug(f"{pre} already registered in DB")
record = spack.store.STORE.db.get_record(spec)
if explicit and not record.explicit:
spack.store.STORE.db.update_explicit(spec, explicit)
@@ -451,11 +450,11 @@ def _process_external_package(pkg: "spack.package_base.PackageBase", explicit: b
# If not, register it and generate the module file.
# For external packages we just need to run
# post-install hooks to generate module files.
tty.debug("{0} generating module file".format(pre))
tty.debug(f"{pre} generating module file")
spack.hooks.post_install(spec, explicit)
# Add to the DB
tty.debug("{0} registering into DB".format(pre))
tty.debug(f"{pre} registering into DB")
spack.store.STORE.db.add(spec, None, explicit=explicit)
@@ -490,7 +489,7 @@ def _process_binary_cache_tarball(
if download_result is None:
return False
tty.msg("Extracting {0} from binary cache".format(package_id(pkg)))
tty.msg(f"Extracting {package_id(pkg)} from binary cache")
with timer.measure("install"), spack.util.path.filter_padding():
binary_distribution.extract_tarball(
@@ -522,7 +521,7 @@ def _try_install_from_binary_cache(
if not spack.mirror.MirrorCollection(binary=True):
return False
tty.debug("Searching for binary cache of {0}".format(package_id(pkg)))
tty.debug(f"Searching for binary cache of {package_id(pkg)}")
with timer.measure("search"):
matches = binary_distribution.get_mirrors_for_spec(pkg.spec, index_only=True)
@@ -590,9 +589,9 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
source_repo = spack.repo.Repo(source_repo_root)
source_pkg_dir = source_repo.dirname_for_package_name(node.name)
except spack.repo.RepoError as err:
tty.debug("Failed to create source repo for {0}: {1}".format(node.name, str(err)))
tty.debug(f"Failed to create source repo for {node.name}: {str(err)}")
source_pkg_dir = None
tty.warn("Warning: Couldn't copy in provenance for {0}".format(node.name))
tty.warn(f"Warning: Couldn't copy in provenance for {node.name}")
# Create a destination repository
dest_repo_root = os.path.join(path, node.namespace)
@@ -632,7 +631,7 @@ def install_msg(name: str, pid: int, install_status: InstallStatus) -> str:
Return: Colorized installing message
"""
pre = "{0}: ".format(pid) if tty.show_pid() else ""
pre = f"{pid}: " if tty.show_pid() else ""
post = (
" @*{%s}" % install_status.get_progress()
if install_status and spack.config.get("config:install_status", True)
@@ -698,7 +697,7 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
# in the stage tree (not arbitrary files)
abs_expr = os.path.realpath(glob_expr)
if os.path.realpath(pkg.stage.path) not in abs_expr:
errors.write("[OUTSIDE SOURCE PATH]: {0}\n".format(glob_expr))
errors.write(f"[OUTSIDE SOURCE PATH]: {glob_expr}\n")
continue
# Now that we are sure that the path is within the correct
# folder, make it relative and check for matches
@@ -718,14 +717,14 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
# Here try to be conservative, and avoid discarding
# the whole install procedure because of copying a
# single file failed
errors.write("[FAILED TO ARCHIVE]: {0}".format(f))
errors.write(f"[FAILED TO ARCHIVE]: {f}")
if errors.getvalue():
error_file = os.path.join(target_dir, "errors.txt")
fs.mkdirp(target_dir)
with open(error_file, "w") as err:
err.write(errors.getvalue())
tty.warn("Errors occurred when archiving files.\n\t" "See: {0}".format(error_file))
tty.warn(f"Errors occurred when archiving files.\n\tSee: {error_file}")
dump_packages(pkg.spec, packages_dir)
@@ -761,11 +760,11 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
"""
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
raise ValueError(f"{str(pkg)} must be a package")
self.pkg = pkg
if not self.pkg.spec.concrete:
raise ValueError("{0} must have a concrete spec".format(self.pkg.name))
raise ValueError(f"{self.pkg.name} must have a concrete spec")
# Cache the package phase options with the explicit package,
# popping the options to ensure installation of associated
@@ -797,14 +796,14 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
def __repr__(self) -> str:
"""Returns a formal representation of the build request."""
rep = "{0}(".format(self.__class__.__name__)
rep = f"{self.__class__.__name__}("
for attr, value in self.__dict__.items():
rep += "{0}={1}, ".format(attr, value.__repr__())
return "{0})".format(rep.strip(", "))
rep += f"{attr}={value.__repr__()}, "
return f"{rep.strip(', ')})"
def __str__(self) -> str:
"""Returns a printable version of the build request."""
return "package={0}, install_args={1}".format(self.pkg.name, self.install_args)
return f"package={self.pkg.name}, install_args={self.install_args}"
def _add_default_args(self) -> None:
"""Ensure standard install options are set to at least the default."""
@@ -930,18 +929,18 @@ def __init__(
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
raise ValueError(f"{str(pkg)} must be a package")
self.pkg = pkg
if not self.pkg.spec.concrete:
raise ValueError("{0} must have a concrete spec".format(self.pkg.name))
raise ValueError(f"{self.pkg.name} must have a concrete spec")
# The "unique" identifier for the task's package
self.pkg_id = package_id(self.pkg)
# The explicit build request associated with the package
if not isinstance(request, BuildRequest):
raise ValueError("{0} must have a build request".format(str(pkg)))
raise ValueError(f"{str(pkg)} must have a build request")
self.request = request
@@ -949,8 +948,9 @@ def __init__(
# ensure priority queue invariants when tasks are "removed" from the
# queue.
if status == STATUS_REMOVED:
msg = "Cannot create a build task for {0} with status '{1}'"
raise InstallError(msg.format(self.pkg_id, status), pkg=pkg)
raise InstallError(
f"Cannot create a build task for {self.pkg_id} with status '{status}'", pkg=pkg
)
self.status = status
@@ -964,9 +964,9 @@ def __init__(
# to support tracking of parallel, multi-spec, environment installs.
self.dependents = set(get_dependent_ids(self.pkg.spec))
tty.debug("Pkg id {0} has the following dependents:".format(self.pkg_id))
tty.debug(f"Pkg id {self.pkg_id} has the following dependents:")
for dep_id in self.dependents:
tty.debug("- {0}".format(dep_id))
tty.debug(f"- {dep_id}")
# Set of dependencies
#
@@ -988,9 +988,9 @@ def __init__(
if not spack.compilers.compilers_for_spec(compiler_spec, arch_spec=arch_spec):
# The compiler is in the queue, identify it as dependency
dep = spack.compilers.pkg_spec_for_compiler(compiler_spec)
dep.constrain("platform=%s" % str(arch_spec.platform))
dep.constrain("os=%s" % str(arch_spec.os))
dep.constrain("target=%s:" % arch_spec.target.microarchitecture.family.name)
dep.constrain(f"platform={str(arch_spec.platform)}")
dep.constrain(f"os={str(arch_spec.os)}")
dep.constrain(f"target={arch_spec.target.microarchitecture.family.name}:")
dep.concretize()
dep_id = package_id(dep.package)
self.dependencies.add(dep_id)
@@ -1026,14 +1026,14 @@ def __ne__(self, other):
def __repr__(self) -> str:
"""Returns a formal representation of the build task."""
rep = "{0}(".format(self.__class__.__name__)
rep = f"{self.__class__.__name__}("
for attr, value in self.__dict__.items():
rep += "{0}={1}, ".format(attr, value.__repr__())
return "{0})".format(rep.strip(", "))
rep += f"{attr}={value.__repr__()}, "
return f"{rep.strip(', ')})"
def __str__(self) -> str:
"""Returns a printable version of the build task."""
dependencies = "#dependencies={0}".format(len(self.dependencies))
dependencies = f"#dependencies={len(self.dependencies)}"
return "priority={0}, status={1}, start={2}, {3}".format(
self.priority, self.status, self.start, dependencies
)
@@ -1056,7 +1056,7 @@ def add_dependent(self, pkg_id: str) -> None:
pkg_id: package identifier of the dependent package
"""
if pkg_id != self.pkg_id and pkg_id not in self.dependents:
tty.debug("Adding {0} as a dependent of {1}".format(pkg_id, self.pkg_id))
tty.debug(f"Adding {pkg_id} as a dependent of {self.pkg_id}")
self.dependents.add(pkg_id)
def flag_installed(self, installed: List[str]) -> None:
@@ -1070,9 +1070,8 @@ def flag_installed(self, installed: List[str]) -> None:
for pkg_id in now_installed:
self.uninstalled_deps.remove(pkg_id)
tty.debug(
"{0}: Removed {1} from uninstalled deps list: {2}".format(
self.pkg_id, pkg_id, self.uninstalled_deps
),
f"{self.pkg_id}: Removed {pkg_id} from uninstalled deps list: "
f"{self.uninstalled_deps}",
level=2,
)
@@ -1170,18 +1169,18 @@ def __init__(self, installs: List[Tuple["spack.package_base.PackageBase", dict]]
def __repr__(self) -> str:
"""Returns a formal representation of the package installer."""
rep = "{0}(".format(self.__class__.__name__)
rep = f"{self.__class__.__name__}("
for attr, value in self.__dict__.items():
rep += "{0}={1}, ".format(attr, value.__repr__())
return "{0})".format(rep.strip(", "))
rep += f"{attr}={value.__repr__()}, "
return f"{rep.strip(', ')})"
def __str__(self) -> str:
"""Returns a printable version of the package installer."""
requests = "#requests={0}".format(len(self.build_requests))
tasks = "#tasks={0}".format(len(self.build_tasks))
failed = "failed ({0}) = {1}".format(len(self.failed), self.failed)
installed = "installed ({0}) = {1}".format(len(self.installed), self.installed)
return "{0}: {1}; {2}; {3}; {4}".format(self.pid, requests, tasks, installed, failed)
requests = f"#requests={len(self.build_requests)}"
tasks = f"#tasks={len(self.build_tasks)}"
failed = f"failed ({len(self.failed)}) = {self.failed}"
installed = f"installed ({len(self.installed)}) = {self.installed}"
return f"{self.pid}: {requests}; {tasks}; {installed}; {failed}"
def _add_bootstrap_compilers(
self,
@@ -1226,9 +1225,7 @@ def _modify_existing_task(self, pkgid: str, attr, value) -> None:
for i, tup in enumerate(self.build_pq):
key, task = tup
if task.pkg_id == pkgid:
tty.debug(
"Modifying task for {0} to treat it as a compiler".format(pkgid), level=2
)
tty.debug(f"Modifying task for {pkgid} to treat it as a compiler", level=2)
setattr(task, attr, value)
self.build_pq[i] = (key, task)
@@ -1293,7 +1290,7 @@ def _check_deps_status(self, request: BuildRequest) -> None:
# Check for failure since a prefix lock is not required
if spack.store.STORE.failure_tracker.has_failed(dep):
action = "'spack install' the dependency"
msg = "{0} is marked as an install failure: {1}".format(dep_id, action)
msg = f"{dep_id} is marked as an install failure: {action}"
raise InstallError(err.format(request.pkg_id, msg), pkg=dep_pkg)
# Attempt to get a read lock to ensure another process does not
@@ -1301,7 +1298,7 @@ def _check_deps_status(self, request: BuildRequest) -> None:
# installed
ltype, lock = self._ensure_locked("read", dep_pkg)
if lock is None:
msg = "{0} is write locked by another process".format(dep_id)
msg = f"{dep_id} is write locked by another process"
raise InstallError(err.format(request.pkg_id, msg), pkg=request.pkg)
# Flag external and upstream packages as being installed
@@ -1320,7 +1317,7 @@ def _check_deps_status(self, request: BuildRequest) -> None:
or rec.installation_time > request.overwrite_time
)
):
tty.debug("Flagging {0} as installed per the database".format(dep_id))
tty.debug(f"Flagging {dep_id} as installed per the database")
self._flag_installed(dep_pkg)
else:
lock.release_read()
@@ -1356,9 +1353,9 @@ def _prepare_for_install(self, task: BuildTask) -> None:
# Ensure there is no other installed spec with the same prefix dir
if spack.store.STORE.db.is_occupied_install_prefix(task.pkg.spec.prefix):
raise InstallError(
"Install prefix collision for {0}".format(task.pkg_id),
long_msg="Prefix directory {0} already used by another "
"installed spec.".format(task.pkg.spec.prefix),
f"Install prefix collision for {task.pkg_id}",
long_msg=f"Prefix directory {task.pkg.spec.prefix} already "
"used by another installed spec.",
pkg=task.pkg,
)
@@ -1368,7 +1365,7 @@ def _prepare_for_install(self, task: BuildTask) -> None:
if not keep_prefix:
task.pkg.remove_prefix()
else:
tty.debug("{0} is partially installed".format(task.pkg_id))
tty.debug(f"{task.pkg_id} is partially installed")
# Destroy the stage for a locally installed, non-DIYStage, package
if restage and task.pkg.stage.managed_by_spack:
@@ -1413,9 +1410,8 @@ def _cleanup_failed(self, pkg_id: str) -> None:
lock = self.failed.get(pkg_id, None)
if lock is not None:
err = "{0} exception when removing failure tracking for {1}: {2}"
msg = "Removing failure mark on {0}"
try:
tty.verbose(msg.format(pkg_id))
tty.verbose(f"Removing failure mark on {pkg_id}")
lock.release_write()
except Exception as exc:
tty.warn(err.format(exc.__class__.__name__, pkg_id, str(exc)))
@@ -1442,19 +1438,19 @@ def _ensure_install_ready(self, pkg: "spack.package_base.PackageBase") -> None:
pkg: the package being locally installed
"""
pkg_id = package_id(pkg)
pre = "{0} cannot be installed locally:".format(pkg_id)
pre = f"{pkg_id} cannot be installed locally:"
# External packages cannot be installed locally.
if pkg.spec.external:
raise ExternalPackageError("{0} {1}".format(pre, "is external"))
raise ExternalPackageError(f"{pre} is external")
# Upstream packages cannot be installed locally.
if pkg.spec.installed_upstream:
raise UpstreamPackageError("{0} {1}".format(pre, "is upstream"))
raise UpstreamPackageError(f"{pre} is upstream")
# The package must have a prefix lock at this stage.
if pkg_id not in self.locks:
raise InstallLockError("{0} {1}".format(pre, "not locked"))
raise InstallLockError(f"{pre} not locked")
def _ensure_locked(
self, lock_type: str, pkg: "spack.package_base.PackageBase"
@@ -1481,14 +1477,14 @@ def _ensure_locked(
assert lock_type in [
"read",
"write",
], '"{0}" is not a supported package management lock type'.format(lock_type)
], f'"{lock_type}" is not a supported package management lock type'
pkg_id = package_id(pkg)
ltype, lock = self.locks.get(pkg_id, (lock_type, None))
if lock and ltype == lock_type:
return ltype, lock
desc = "{0} lock".format(lock_type)
desc = f"{lock_type} lock"
msg = "{0} a {1} on {2} with timeout {3}"
err = "Failed to {0} a {1} for {2} due to {3}: {4}"
@@ -1507,11 +1503,7 @@ def _ensure_locked(
op = "acquire"
lock = spack.store.STORE.prefix_locker.lock(pkg.spec, timeout)
if timeout != lock.default_timeout:
tty.warn(
"Expected prefix lock timeout {0}, not {1}".format(
timeout, lock.default_timeout
)
)
tty.warn(f"Expected prefix lock timeout {timeout}, not {lock.default_timeout}")
if lock_type == "read":
lock.acquire_read()
else:
@@ -1536,7 +1528,7 @@ def _ensure_locked(
tty.debug(msg.format("Upgrading to", desc, pkg_id, pretty_seconds(timeout or 0)))
op = "upgrade to"
lock.upgrade_read_to_write(timeout)
tty.debug("{0} is now {1} locked".format(pkg_id, lock_type))
tty.debug(f"{pkg_id} is now {lock_type} locked")
except (lk.LockDowngradeError, lk.LockTimeoutError) as exc:
tty.debug(err.format(op, desc, pkg_id, exc.__class__.__name__, str(exc)))
@@ -1561,14 +1553,14 @@ def _add_tasks(self, request: BuildRequest, all_deps):
all_deps (defaultdict(set)): dictionary of all dependencies and
associated dependents
"""
tty.debug("Initializing the build queue for {0}".format(request.pkg.name))
tty.debug(f"Initializing the build queue for {request.pkg.name}")
# Ensure not attempting to perform an installation when user didn't
# want to go that far for the requested package.
try:
_check_last_phase(request.pkg)
except BadInstallPhase as err:
tty.warn("Installation request refused: {0}".format(str(err)))
tty.warn(f"Installation request refused: {str(err)}")
return
# Skip out early if the spec is not being installed locally (i.e., if
@@ -1719,9 +1711,9 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
# A StopPhase exception means that do_install was asked to
# stop early from clients, and is not an error at this point
spack.hooks.on_install_failure(task.request.pkg.spec)
pid = "{0}: ".format(self.pid) if tty.show_pid() else ""
tty.debug("{0}{1}".format(pid, str(e)))
tty.debug("Package stage directory: {0}".format(pkg.stage.source_path))
pid = f"{self.pid}: " if tty.show_pid() else ""
tty.debug(f"{pid}{str(e)}")
tty.debug(f"Package stage directory: {pkg.stage.source_path}")
def _next_is_pri0(self) -> bool:
"""
@@ -1816,7 +1808,7 @@ def _remove_task(self, pkg_id: str) -> Optional[BuildTask]:
pkg_id: identifier for the package to be removed
"""
if pkg_id in self.build_tasks:
tty.debug("Removing build task for {0} from list".format(pkg_id))
tty.debug(f"Removing build task for {pkg_id} from list")
task = self.build_tasks.pop(pkg_id)
task.status = STATUS_REMOVED
return task
@@ -1832,10 +1824,8 @@ def _requeue_task(self, task: BuildTask, install_status: InstallStatus) -> None:
"""
if task.status not in [STATUS_INSTALLED, STATUS_INSTALLING]:
tty.debug(
"{0} {1}".format(
install_msg(task.pkg_id, self.pid, install_status),
"in progress by another process",
)
f"{install_msg(task.pkg_id, self.pid, install_status)} "
"in progress by another process"
)
new_task = task.next_attempt(self.installed)
@@ -1852,7 +1842,7 @@ def _setup_install_dir(self, pkg: "spack.package_base.PackageBase") -> None:
"""
if not os.path.exists(pkg.spec.prefix):
path = spack.util.path.debug_padded_filter(pkg.spec.prefix)
tty.debug("Creating the installation directory {0}".format(path))
tty.debug(f"Creating the installation directory {path}")
spack.store.STORE.layout.create_install_directory(pkg.spec)
else:
# Set the proper group for the prefix
@@ -1888,8 +1878,8 @@ def _update_failed(
exc: optional exception if associated with the failure
"""
pkg_id = task.pkg_id
err = "" if exc is None else ": {0}".format(str(exc))
tty.debug("Flagging {0} as failed{1}".format(pkg_id, err))
err = "" if exc is None else f": {str(exc)}"
tty.debug(f"Flagging {pkg_id} as failed{err}")
if mark:
self.failed[pkg_id] = spack.store.STORE.failure_tracker.mark(task.pkg.spec)
else:
@@ -1898,14 +1888,14 @@ def _update_failed(
for dep_id in task.dependents:
if dep_id in self.build_tasks:
tty.warn("Skipping build of {0} since {1} failed".format(dep_id, pkg_id))
tty.warn(f"Skipping build of {dep_id} since {pkg_id} failed")
# Ensure the dependent's uninstalled dependents are
# up-to-date and their build tasks removed.
dep_task = self.build_tasks[dep_id]
self._update_failed(dep_task, mark)
self._remove_task(dep_id)
else:
tty.debug("No build task for {0} to skip since {1} failed".format(dep_id, pkg_id))
tty.debug(f"No build task for {dep_id} to skip since {pkg_id} failed")
def _update_installed(self, task: BuildTask) -> None:
"""
@@ -1935,23 +1925,21 @@ def _flag_installed(
# Already determined the package has been installed
return
tty.debug("Flagging {0} as installed".format(pkg_id))
tty.debug(f"Flagging {pkg_id} as installed")
self.installed.add(pkg_id)
# Update affected dependents
dependent_ids = dependent_ids or get_dependent_ids(pkg.spec)
for dep_id in set(dependent_ids):
tty.debug("Removing {0} from {1}'s uninstalled dependencies.".format(pkg_id, dep_id))
tty.debug(f"Removing {pkg_id} from {dep_id}'s uninstalled dependencies.")
if dep_id in self.build_tasks:
# Ensure the dependent's uninstalled dependencies are
# up-to-date. This will require requeueing the task.
dep_task = self.build_tasks[dep_id]
self._push_task(dep_task.next_attempt(self.installed))
else:
tty.debug(
"{0} has no build task to update for {1}'s success".format(dep_id, pkg_id)
)
tty.debug(f"{dep_id} has no build task to update for {pkg_id}'s success")
def _init_queue(self) -> None:
"""Initialize the build queue from the list of build requests."""
@@ -2032,8 +2020,8 @@ def install(self) -> None:
pkg, pkg_id, spec = task.pkg, task.pkg_id, task.pkg.spec
install_status.next_pkg(pkg)
install_status.set_term_title("Processing {0}".format(pkg.name))
tty.debug("Processing {0}: task={1}".format(pkg_id, task))
install_status.set_term_title(f"Processing {pkg.name}")
tty.debug(f"Processing {pkg_id}: task={task}")
# Ensure that the current spec has NO uninstalled dependencies,
# which is assumed to be reflected directly in its priority.
#
@@ -2045,24 +2033,19 @@ def install(self) -> None:
if task.priority != 0:
term_status.clear()
tty.error(
"Detected uninstalled dependencies for {0}: {1}".format(
pkg_id, task.uninstalled_deps
)
f"Detected uninstalled dependencies for {pkg_id}: " f"{task.uninstalled_deps}"
)
left = [dep_id for dep_id in task.uninstalled_deps if dep_id not in self.installed]
if not left:
tty.warn(
"{0} does NOT actually have any uninstalled deps" " left".format(pkg_id)
)
tty.warn(f"{pkg_id} does NOT actually have any uninstalled deps left")
dep_str = "dependencies" if task.priority > 1 else "dependency"
# Hook to indicate task failure, but without an exception
spack.hooks.on_install_failure(task.request.pkg.spec)
raise InstallError(
"Cannot proceed with {0}: {1} uninstalled {2}: {3}".format(
pkg_id, task.priority, dep_str, ",".join(task.uninstalled_deps)
),
f"Cannot proceed with {pkg_id}: {task.priority} uninstalled "
f"{dep_str}: {','.join(task.uninstalled_deps)}",
pkg=pkg,
)
@@ -2079,7 +2062,7 @@ def install(self) -> None:
# assume using a separate (failed) prefix lock file.
if pkg_id in self.failed or spack.store.STORE.failure_tracker.has_failed(spec):
term_status.clear()
tty.warn("{0} failed to install".format(pkg_id))
tty.warn(f"{pkg_id} failed to install")
self._update_failed(task)
# Mark that the package failed
@@ -2096,7 +2079,7 @@ def install(self) -> None:
# another process is likely (un)installing the spec or has
# determined the spec has already been installed (though the
# other process may be hung).
install_status.set_term_title("Acquiring lock for {0}".format(pkg.name))
install_status.set_term_title(f"Acquiring lock for {pkg.name}")
term_status.add(pkg_id)
ltype, lock = self._ensure_locked("write", pkg)
if lock is None:
@@ -2119,7 +2102,7 @@ def install(self) -> None:
task.request.overwrite_time = time.time()
# Determine state of installation artifacts and adjust accordingly.
install_status.set_term_title("Preparing {0}".format(pkg.name))
install_status.set_term_title(f"Preparing {pkg.name}")
self._prepare_for_install(task)
# Flag an already installed package
@@ -2165,7 +2148,7 @@ def install(self) -> None:
# Proceed with the installation since we have an exclusive write
# lock on the package.
install_status.set_term_title("Installing {0}".format(pkg.name))
install_status.set_term_title(f"Installing {pkg.name}")
try:
action = self._install_action(task)
@@ -2186,8 +2169,9 @@ def install(self) -> None:
except KeyboardInterrupt as exc:
# The build has been terminated with a Ctrl-C so terminate
# regardless of the number of remaining specs.
err = "Failed to install {0} due to {1}: {2}"
tty.error(err.format(pkg.name, exc.__class__.__name__, str(exc)))
tty.error(
f"Failed to install {pkg.name} due to " f"{exc.__class__.__name__}: {str(exc)}"
)
spack.hooks.on_install_cancel(task.request.pkg.spec)
raise
@@ -2196,9 +2180,10 @@ def install(self) -> None:
raise
# Checking hash on downloaded binary failed.
err = "Failed to install {0} from binary cache due to {1}:"
err += " Requeueing to install from source."
tty.error(err.format(pkg.name, str(exc)))
tty.error(
f"Failed to install {pkg.name} from binary cache due "
f"to {str(exc)}: Requeueing to install from source."
)
# this overrides a full method, which is ugly.
task.use_cache = False # type: ignore[misc]
self._requeue_task(task, install_status)
@@ -2216,13 +2201,12 @@ def install(self) -> None:
# lower levels -- skip printing if already printed.
# TODO: sort out this and SpackError.print_context()
tty.error(
"Failed to install {0} due to {1}: {2}".format(
pkg.name, exc.__class__.__name__, str(exc)
)
f"Failed to install {pkg.name} due to "
f"{exc.__class__.__name__}: {str(exc)}"
)
# Terminate if requested to do so on the first failure.
if self.fail_fast:
raise InstallError("{0}: {1}".format(fail_fast_err, str(exc)), pkg=pkg)
raise InstallError(f"{fail_fast_err}: {str(exc)}", pkg=pkg)
# Terminate at this point if the single explicit spec has
# failed to install.
@@ -2261,17 +2245,17 @@ def install(self) -> None:
if failed_explicits or missing:
for _, pkg_id, err in failed_explicits:
tty.error("{0}: {1}".format(pkg_id, err))
tty.error(f"{pkg_id}: {err}")
for _, pkg_id in missing:
tty.error("{0}: Package was not installed".format(pkg_id))
tty.error(f"{pkg_id}: Package was not installed")
if len(failed_explicits) > 0:
pkg = failed_explicits[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_explicits]
tty.debug(
"Associating installation failure with first failed "
"explicit package ({0}) from {1}".format(ids[0], ", ".join(ids))
f"explicit package ({ids[0]}) from {', '.join(ids)}"
)
elif len(missing) > 0:
@@ -2279,7 +2263,7 @@ def install(self) -> None:
ids = [pkg_id for _, pkg_id in missing]
tty.debug(
"Associating installation failure with first "
"missing package ({0}) from {1}".format(ids[0], ", ".join(ids))
f"missing package ({ids[0]}) from {', '.join(ids)}"
)
raise InstallError(
@@ -2357,7 +2341,7 @@ def run(self) -> bool:
self.timer.stop("stage")
tty.debug(
"{0} Building {1} [{2}]".format(self.pre, self.pkg_id, self.pkg.build_system_class) # type: ignore[attr-defined] # noqa: E501
f"{self.pre} Building {self.pkg_id} [{self.pkg.build_system_class}]" # type: ignore[attr-defined] # noqa: E501
)
# get verbosity from do_install() parameter or saved value
@@ -2402,7 +2386,7 @@ def _install_source(self) -> None:
return
src_target = os.path.join(pkg.spec.prefix, "share", pkg.name, "src")
tty.debug("{0} Copying source to {1}".format(self.pre, src_target))
tty.debug(f"{self.pre} Copying source to {src_target}")
fs.install_tree(
pkg.stage.source_path, src_target, allow_broken_symlinks=(sys.platform != "win32")
@@ -2464,8 +2448,7 @@ def _real_install(self) -> None:
with logger.force_echo():
inner_debug_level = tty.debug_level()
tty.set_debug(debug_level)
msg = "{0} Executing phase: '{1}'"
tty.msg(msg.format(self.pre, phase_fn.name))
tty.msg(f"{self.pre} Executing phase: '{phase_fn.name}'")
tty.set_debug(inner_debug_level)
# Catch any errors to report to logging
@@ -2539,12 +2522,9 @@ def install(self):
except fs.CouldNotRestoreDirectoryBackup as e:
self.database.remove(self.task.pkg.spec)
tty.error(
"Recovery of install dir of {0} failed due to "
"{1}: {2}. The spec is now uninstalled.".format(
self.task.pkg.name,
e.outer_exception.__class__.__name__,
str(e.outer_exception),
)
f"Recovery of install dir of {self.task.pkg.name} failed due to "
f"{e.outer_exception.__class__.__name__}: {str(e.outer_exception)}. "
"The spec is now uninstalled."
)
# Unwrap the actual installation exception.
@@ -2567,7 +2547,7 @@ class BadInstallPhase(InstallError):
"""Raised for an install phase option is not allowed for a package."""
def __init__(self, pkg_name, phase):
super().__init__("'{0}' is not a valid phase for package {1}".format(phase, pkg_name))
super().__init__(f"'{phase}' is not a valid phase for package {pkg_name}")
class ExternalPackageError(InstallError):
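The bulk of the installer.py hunks above is a mechanical conversion from str.format (and %-interpolation) to f-strings; for these call sites the two forms are equivalent. A minimal sketch of the pattern, with hypothetical values:

    pkg_id, status = "zlib-1.2.13", "installed"  # hypothetical values
    assert "{0}: {1}".format(pkg_id, status) == f"{pkg_id}: {status}"
    assert "%s: %s" % (pkg_id, status) == f"{pkg_id}: {status}"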

View File

@@ -7,15 +7,10 @@
include Tcl non-hierarchical modules, Lua hierarchical modules, and others.
"""
from .common import disable_modules, ensure_modules_are_enabled_or_warn
from .common import disable_modules
from .lmod import LmodModulefileWriter
from .tcl import TclModulefileWriter
__all__ = [
"TclModulefileWriter",
"LmodModulefileWriter",
"disable_modules",
"ensure_modules_are_enabled_or_warn",
]
__all__ = ["TclModulefileWriter", "LmodModulefileWriter", "disable_modules"]
module_types = {"tcl": TclModulefileWriter, "lmod": LmodModulefileWriter}

View File

@@ -33,10 +33,8 @@
import datetime
import inspect
import os.path
import pathlib
import re
import string
import warnings
from typing import Optional
import llnl.util.filesystem
@@ -820,43 +818,6 @@ def verbose(self):
return self.conf.verbose
def ensure_modules_are_enabled_or_warn():
"""Ensures that, if a custom configuration file is found with custom configuration for the
default tcl module set, then tcl module file generation is enabled. Otherwise, a warning
is emitted.
"""
# TODO (v0.21 - Remove this function)
# Check if TCL module generation is enabled, return early if it is
enabled = spack.config.get("modules:default:enable", [])
if "tcl" in enabled:
return
# Check if we have custom TCL module sections
for scope in spack.config.CONFIG.file_scopes:
# Skip default configuration
if scope.name.startswith("default"):
continue
data = spack.config.get("modules:default:tcl", scope=scope.name)
if data:
config_file = pathlib.Path(scope.path)
if not scope.name.startswith("env"):
config_file = config_file / "modules.yaml"
break
else:
return
# If we are here we have a custom "modules" section in "config_file"
msg = (
f"detected custom TCL modules configuration in {config_file}, while TCL module file "
f"generation for the default module set is disabled. "
f"In Spack v0.20 module file generation has been disabled by default. To enable "
f"it run:\n\n\t$ spack config add 'modules:default:enable:[tcl]'\n"
)
warnings.warn(msg)
class BaseModuleFileWriter:
def __init__(self, spec, module_set_name, explicit=None):
self.spec = spec
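The deleted ensure_modules_are_enabled_or_warn helper above scans configuration scopes with a for/else loop: the else branch runs only when the loop finishes without a break, i.e. when no non-default scope carried a custom modules:default:tcl section. A small illustration of that control flow, with hypothetical scope names:

    scopes = ["defaults", "system", "user"]  # hypothetical scope names
    for scope in scopes:
        if scope == "user":                  # stand-in for "has custom tcl config"
            found = scope
            break
    else:
        found = None                         # reached only if the loop never broke
    assert found == "user"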

View File

@@ -73,10 +73,10 @@
#: Valid name for specs and variants. Here we are not using
#: the previous "w[\w.-]*" since that would match most
#: characters that can be part of a word in any language
IDENTIFIER = r"([a-zA-Z_0-9][a-zA-Z_0-9\-]*)"
DOTTED_IDENTIFIER = rf"({IDENTIFIER}(\.{IDENTIFIER})+)"
GIT_HASH = r"([A-Fa-f0-9]{40})"
GIT_VERSION = rf"((git\.({DOTTED_IDENTIFIER}|{IDENTIFIER}))|({GIT_HASH}))"
IDENTIFIER = r"(?:[a-zA-Z_0-9][a-zA-Z_0-9\-]*)"
DOTTED_IDENTIFIER = rf"(?:{IDENTIFIER}(?:\.{IDENTIFIER})+)"
GIT_HASH = r"(?:[A-Fa-f0-9]{40})"
GIT_VERSION = rf"(?:(?:git\.(?:{DOTTED_IDENTIFIER}|{IDENTIFIER}))|(?:{GIT_HASH}))"
NAME = r"[a-zA-Z_0-9][a-zA-Z_0-9\-.]*"
@@ -85,15 +85,15 @@
#: A filename starts either with a "." or a "/" or a "{name}/",
# or on Windows, a drive letter followed by a colon and "\"
# or "." or {name}\
WINDOWS_FILENAME = r"(\.|[a-zA-Z0-9-_]*\\|[a-zA-Z]:\\)([a-zA-Z0-9-_\.\\]*)(\.json|\.yaml)"
UNIX_FILENAME = r"(\.|\/|[a-zA-Z0-9-_]*\/)([a-zA-Z0-9-_\.\/]*)(\.json|\.yaml)"
WINDOWS_FILENAME = r"(?:\.|[a-zA-Z0-9-_]*\\|[a-zA-Z]:\\)(?:[a-zA-Z0-9-_\.\\]*)(?:\.json|\.yaml)"
UNIX_FILENAME = r"(?:\.|\/|[a-zA-Z0-9-_]*\/)(?:[a-zA-Z0-9-_\.\/]*)(?:\.json|\.yaml)"
if not IS_WINDOWS:
FILENAME = UNIX_FILENAME
else:
FILENAME = WINDOWS_FILENAME
VALUE = r"([a-zA-Z_0-9\-+\*.,:=\~\/\\]+)"
QUOTED_VALUE = r"[\"']+([a-zA-Z_0-9\-+\*.,:=\~\/\\\s]+)[\"']+"
VALUE = r"(?:[a-zA-Z_0-9\-+\*.,:=\~\/\\]+)"
QUOTED_VALUE = r"[\"']+(?:[a-zA-Z_0-9\-+\*.,:=\~\/\\\s]+)[\"']+"
VERSION = r"=?([a-zA-Z0-9_][a-zA-Z_0-9\-\.]*\b)"
VERSION_RANGE = rf"({VERSION}\s*:\s*{VERSION}(?!\s*=)|:\s*{VERSION}(?!\s*=)|{VERSION}\s*:|:)"
@@ -125,34 +125,34 @@ class TokenType(TokenBase):
"""
# Dependency
DEPENDENCY = r"(\^)"
DEPENDENCY = r"(?:\^)"
# Version
VERSION_HASH_PAIR = rf"(@({GIT_VERSION})=({VERSION}))"
VERSION = rf"(@\s*({VERSION_LIST}))"
VERSION_HASH_PAIR = rf"(?:@(?:{GIT_VERSION})=(?:{VERSION}))"
VERSION = rf"(?:@\s*(?:{VERSION_LIST}))"
# Variants
PROPAGATED_BOOL_VARIANT = rf"((\+\+|~~|--)\s*{NAME})"
BOOL_VARIANT = rf"([~+-]\s*{NAME})"
PROPAGATED_KEY_VALUE_PAIR = rf"({NAME}\s*==\s*({VALUE}|{QUOTED_VALUE}))"
KEY_VALUE_PAIR = rf"({NAME}\s*=\s*({VALUE}|{QUOTED_VALUE}))"
PROPAGATED_BOOL_VARIANT = rf"(?:(?:\+\+|~~|--)\s*{NAME})"
BOOL_VARIANT = rf"(?:[~+-]\s*{NAME})"
PROPAGATED_KEY_VALUE_PAIR = rf"(?:{NAME}\s*==\s*(?:{VALUE}|{QUOTED_VALUE}))"
KEY_VALUE_PAIR = rf"(?:{NAME}\s*=\s*(?:{VALUE}|{QUOTED_VALUE}))"
# Compilers
COMPILER_AND_VERSION = rf"(%\s*({NAME})([\s]*)@\s*({VERSION_LIST}))"
COMPILER = rf"(%\s*({NAME}))"
COMPILER_AND_VERSION = rf"(?:%\s*(?:{NAME})(?:[\s]*)@\s*(?:{VERSION_LIST}))"
COMPILER = rf"(?:%\s*(?:{NAME}))"
# FILENAME
FILENAME = rf"({FILENAME})"
FILENAME = rf"(?:{FILENAME})"
# Package name
FULLY_QUALIFIED_PACKAGE_NAME = rf"({DOTTED_IDENTIFIER})"
UNQUALIFIED_PACKAGE_NAME = rf"({IDENTIFIER})"
FULLY_QUALIFIED_PACKAGE_NAME = rf"(?:{DOTTED_IDENTIFIER})"
UNQUALIFIED_PACKAGE_NAME = rf"(?:{IDENTIFIER})"
# DAG hash
DAG_HASH = rf"(/({HASH}))"
DAG_HASH = rf"(?:/(?:{HASH}))"
# White spaces
WS = r"(\s+)"
WS = r"(?:\s+)"
class ErrorTokenType(TokenBase):
"""Enum with regexes for error analysis"""
# Unexpected character
UNEXPECTED = r"(.[\s]*)"
UNEXPECTED = r"(?:.[\s]*)"
class Token:
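The parser hunk above rewrites every grouping in the token regexes from capturing (...) to non-capturing (?:...). When fragments like IDENTIFIER are composed into larger alternations, capturing groups accumulate and shift group numbers, and re.findall returns per-group tuples instead of whole matches; non-capturing groups avoid both. For example:

    import re
    # Capturing groups: findall returns per-group tuples.
    assert re.findall(r"(b)(an)*", "banana") == [("b", "an")]
    # Non-capturing groups: findall returns the whole match.
    assert re.findall(r"(?:b)(?:an)*", "banana") == ["banan"]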

View File

@@ -312,21 +312,19 @@ def from_json(cls, stream, repository):
def to_json(self, stream):
sjson.dump({"patches": self.index}, stream)
def patch_for_package(self, sha256, pkg):
def patch_for_package(self, sha256: str, pkg):
"""Look up a patch in the index and build a patch object for it.
Arguments:
sha256 (str): sha256 hash to look up
sha256: sha256 hash to look up
pkg (spack.package_base.PackageBase): Package object to get patch for.
We build patch objects lazily because building them requires that
we have information about the package's location in its repo.
"""
we have information about the package's location in its repo."""
sha_index = self.index.get(sha256)
if not sha_index:
raise NoSuchPatchError(
"Couldn't find patch for package %s with sha256: %s" % (pkg.fullname, sha256)
raise PatchLookupError(
f"Couldn't find patch for package {pkg.fullname} with sha256: {sha256}"
)
# Find patches for this class or any class it inherits from
@@ -335,8 +333,8 @@ def patch_for_package(self, sha256, pkg):
if patch_dict:
break
else:
raise NoSuchPatchError(
"Couldn't find patch for package %s with sha256: %s" % (pkg.fullname, sha256)
raise PatchLookupError(
f"Couldn't find patch for package {pkg.fullname} with sha256: {sha256}"
)
# add the sha256 back (we take it out on write to save space,
@@ -405,5 +403,9 @@ class NoSuchPatchError(spack.error.SpackError):
"""Raised when a patch file doesn't exist."""
class PatchLookupError(NoSuchPatchError):
"""Raised when a patch file cannot be located from sha256."""
class PatchDirectiveError(spack.error.SpackError):
"""Raised when the wrong arguments are suppled to the patch directive."""

View File

@@ -68,12 +68,6 @@
"labels": {"type": "object"},
# Use a custom template to render the recipe
"template": {"type": "string", "default": None},
# Add a custom extra section at the bottom of a stage
"extra_instructions": {
"type": "object",
"additionalProperties": False,
"properties": {"build": {"type": "string"}, "final": {"type": "string"}},
},
# Reserved for properties that are specific to each format
"singularity": {
"type": "object",
@@ -89,15 +83,6 @@
"docker": {"type": "object", "additionalProperties": False, "default": {}},
"depfile": {"type": "boolean", "default": False},
},
"deprecatedProperties": {
"properties": ["extra_instructions"],
"message": (
"container:extra_instructions has been deprecated and will be removed "
"in Spack v0.21. Set container:template appropriately to use custom Jinja2 "
"templates instead."
),
"error": False,
},
}
properties = {"container": container_schema}

View File

@@ -2595,6 +2595,7 @@ class SpecBuilder:
r"^node_compiler$",
r"^package_hash$",
r"^root$",
r"^variant_default_value_from_cli$",
r"^virtual_node$",
r"^virtual_root$",
]

View File

@@ -20,7 +20,7 @@
% Integrity constraints on DAG nodes
:- attr("root", PackageNode), not attr("node", PackageNode).
:- attr("version", PackageNode), not attr("node", PackageNode).
:- attr("version", PackageNode, _), not attr("node", PackageNode), not attr("virtual_node", PackageNode).
:- attr("node_version_satisfies", PackageNode), not attr("node", PackageNode).
:- attr("hash", PackageNode, _), not attr("node", PackageNode).
:- attr("node_platform", PackageNode, _), not attr("node", PackageNode).
@@ -58,7 +58,6 @@ unification_set(SetID, ChildNode) :- attr("depends_on", ParentNode, ChildNode, T
unification_set(("build", node(X, Child)), node(X, Child))
:- attr("depends_on", ParentNode, node(X, Child), Type),
Type == "build",
SetID != "generic_build",
multiple_unification_sets(Child),
unification_set(SetID, ParentNode).
@@ -68,18 +67,18 @@ unification_set("generic_build", node(X, Child))
not multiple_unification_sets(Child),
unification_set(_, ParentNode).
% Any dependency of type "build" in a unification set that is in the leaf unification set,
% stays in that unification set
unification_set(SetID, ChildNode)
:- attr("depends_on", ParentNode, ChildNode, Type),
Type == "build",
SetID == "generic_build",
unification_set(SetID, ParentNode).
unification_set(SetID, VirtualNode)
:- provider(PackageNode, VirtualNode),
unification_set(SetID, PackageNode).
% Do not allow split dependencies, for now. This ensures that we don't construct graphs where e.g.
% a python extension depends on setuptools@63.4 as a run dependency, but uses e.g. setuptools@68
% as a build dependency.
%
% We'll need to relax the rule before we get to actual cross-compilation
:- depends_on(ParentNode, node(X, Dependency)), depends_on(ParentNode, node(Y, Dependency)), X < Y.
#defined multiple_unification_sets/1.
%----
@@ -924,7 +923,8 @@ pkg_fact(Package, variant_single_value("dev_path"))
%-----------------------------------------------------------------------------
% if no platform is set, fall back to the default
:- attr("node_platform", _, Platform), not allowed_platform(Platform).
error(100, "platform '{0}' is not allowed on the current host", Platform)
:- attr("node_platform", _, Platform), not allowed_platform(Platform).
attr("node_platform", PackageNode, Platform)
:- attr("node", PackageNode),

View File

@@ -5,6 +5,8 @@
import collections
from typing import List, Set
from llnl.util import lang
import spack.deptypes as dt
import spack.package_base
import spack.repo
@@ -95,8 +97,17 @@ def _compute_cache_values(self):
)
self._link_run_virtuals.update(self._possible_virtuals)
for x in self._link_run:
current = spack.repo.PATH.get_pkg_class(x).dependencies_of_type(dt.BUILD)
self._direct_build.update(current)
build_dependencies = spack.repo.PATH.get_pkg_class(x).dependencies_of_type(dt.BUILD)
virtuals, reals = lang.stable_partition(
build_dependencies, spack.repo.PATH.is_virtual_safe
)
self._possible_virtuals.update(virtuals)
for virtual_dep in virtuals:
providers = spack.repo.PATH.providers_for(virtual_dep)
self._direct_build.update(str(x) for x in providers)
self._direct_build.update(reals)
self._total_build = set(
spack.package_base.possible_dependencies(
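llnl.util.lang.stable_partition, as used above, splits an iterable into (matching, non_matching) lists while preserving input order; the hunk uses it to separate virtual build dependencies (which need provider expansion) from concrete ones. A sketch with made-up package names:

    from llnl.util.lang import stable_partition

    deps = ["mpi", "zlib", "blas"]  # hypothetical dependency names
    virtuals, reals = stable_partition(deps, lambda d: d in ("mpi", "blas"))
    assert virtuals == ["mpi", "blas"] and reals == ["zlib"]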

View File

@@ -74,6 +74,7 @@
import spack.deptypes as dt
import spack.error
import spack.hash_types as ht
import spack.patch
import spack.paths
import spack.platforms
import spack.provider_index
@@ -1604,13 +1605,20 @@ def _add_dependency(self, spec: "Spec", *, depflag: dt.DepFlag, virtuals: Tuple[
try:
dspec = next(dspec for dspec in orig if depflag == dspec.depflag)
except StopIteration:
raise DuplicateDependencyError("Cannot depend on '%s' twice" % spec)
current_deps = ", ".join(
dt.flag_to_chars(x.depflag) + " " + x.spec.short_spec for x in orig
)
raise DuplicateDependencyError(
f"{self.short_spec} cannot depend on '{spec.short_spec}' multiple times.\n"
f"\tRequired: {dt.flag_to_chars(depflag)}\n"
f"\tDependency: {current_deps}"
)
try:
dspec.spec.constrain(spec)
except spack.error.UnsatisfiableSpecError:
raise DuplicateDependencyError(
"Cannot depend on incompatible specs '%s' and '%s'" % (dspec.spec, spec)
f"Cannot depend on incompatible specs '{dspec.spec}' and '{spec}'"
)
def add_dependency_edge(
@@ -3664,7 +3672,7 @@ def _autospec(self, spec_like):
return spec_like
return Spec(spec_like)
def intersects(self, other: "Spec", deps: bool = True) -> bool:
def intersects(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
"""Return True if there exists at least one concrete spec that matches both
self and other, otherwise False.
@@ -3787,7 +3795,7 @@ def _intersects_dependencies(self, other):
return True
def satisfies(self, other: "Spec", deps: bool = True) -> bool:
def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
"""Return True if all concrete specs matching self also match other, otherwise False.
Args:
@@ -3899,7 +3907,15 @@ def patches(self):
for sha256 in self.variants["patches"]._patches_in_order_of_appearance:
index = spack.repo.PATH.patch_index
pkg_cls = spack.repo.PATH.get_pkg_class(self.name)
patch = index.patch_for_package(sha256, pkg_cls)
try:
patch = index.patch_for_package(sha256, pkg_cls)
except spack.patch.PatchLookupError as e:
raise spack.error.SpecError(
f"{e}. This usually means the patch was modified or removed. "
"To fix this, either reconcretize or use the original package "
"repository"
) from e
self._patches.append(patch)
return self._patches
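The except/raise-from pattern above chains the low-level PatchLookupError onto the user-facing SpecError, so the traceback shows both the lookup failure and the remediation advice. The pattern in isolation, with stand-in error types:

    def patches_for(sha256: str):
        try:
            raise KeyError(sha256)  # stand-in for the failed index lookup
        except KeyError as e:
            # `from e` records the original error as __cause__ in the traceback.
            raise RuntimeError(f"{e}: reconcretize to fix") from e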

View File

@@ -7,12 +7,13 @@
import getpass
import glob
import hashlib
import io
import os
import shutil
import stat
import sys
import tempfile
from typing import Callable, Dict, Iterable, Optional
from typing import Callable, Dict, Iterable, Optional, Set
import llnl.string
import llnl.util.lang
@@ -27,6 +28,8 @@
partition_path,
remove_linked_tree,
)
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.caches
import spack.config
@@ -35,11 +38,14 @@
import spack.mirror
import spack.paths
import spack.spec
import spack.stage
import spack.util.lock
import spack.util.path as sup
import spack.util.pattern as pattern
import spack.util.url as url_util
from spack.util.crypto import bit_length, prefix_bits
from spack.util.editor import editor, executable
from spack.version import StandardVersion, VersionList
# The well-known stage source subdirectory name.
_source_path_subdir = "spack-src"
@@ -860,11 +866,187 @@ def purge():
os.remove(stage_path)
def interactive_version_filter(
url_dict: Dict[StandardVersion, str],
known_versions: Iterable[StandardVersion] = (),
*,
url_changes: Set[StandardVersion] = set(),
input: Callable[..., str] = input,
) -> Optional[Dict[StandardVersion, str]]:
"""Interactively filter the list of spidered versions.
Args:
url_dict: Dictionary of versions to URLs
known_versions: Versions that can be skipped because they are already known
Returns:
Filtered dictionary of versions to URLs or None if the user wants to quit
"""
# Find length of longest string in the list for padding
sorted_and_filtered = sorted(url_dict.keys(), reverse=True)
version_filter = VersionList([":"])
max_len = max(len(str(v)) for v in sorted_and_filtered)
orig_url_dict = url_dict # only copy when using editor to modify
print_header = True
VERSION_COLOR = spack.spec.VERSION_COLOR
while True:
if print_header:
has_filter = version_filter != VersionList([":"])
header = []
if len(sorted_and_filtered) == len(orig_url_dict):
header.append(
f"Selected {llnl.string.plural(len(sorted_and_filtered), 'version')}"
)
else:
header.append(
f"Selected {len(sorted_and_filtered)} of {len(orig_url_dict)} versions"
)
if known_versions:
num_new = sum(1 for v in sorted_and_filtered if v not in known_versions)
header.append(f"{llnl.string.plural(num_new, 'new version')}")
if has_filter:
header.append(colorize(f"Filtered by {VERSION_COLOR}{version_filter}@."))
version_with_url = [
colorize(
f"{VERSION_COLOR}{str(v):{max_len}}@. {url_dict[v]}"
f"{' @K{# NOTE: change of URL}' if v in url_changes else ''}"
)
for v in sorted_and_filtered
]
tty.msg(". ".join(header), *llnl.util.lang.elide_list(version_with_url))
print()
print_header = True
print("commands:")
commands = (
"@*b{[c]}hecksum",
"@*b{[e]}dit",
"@*b{[f]}ilter",
"@*b{[a]}sk each",
"@*b{[n]}ew only",
"@*b{[r]}estart",
"@*b{[q]}uit",
)
colify(list(map(colorize, commands)), indent=2)
try:
command = input(colorize("@*g{command>} ")).strip().lower()
except EOFError:
print()
command = "q"
if command == "c":
break
elif command == "e":
# Create a temporary file in the stage dir with lines of the form
# <version> <url>
# which the user can modify. Once the editor is closed, the file is
# read back in and the versions to url dict is updated.
# Create a temporary file by hashing its contents.
buffer = io.StringIO()
buffer.write("# Edit this file to change the versions and urls to fetch\n")
for v in sorted_and_filtered:
buffer.write(f"{str(v):{max_len}} {url_dict[v]}\n")
data = buffer.getvalue().encode("utf-8")
short_hash = hashlib.sha1(data).hexdigest()[:7]
filename = f"{spack.stage.stage_prefix}versions-{short_hash}.txt"
filepath = os.path.join(spack.stage.get_stage_root(), filename)
# Write contents
with open(filepath, "wb") as f:
f.write(data)
# Open editor
editor(filepath, exec_fn=executable)
# Read back in
with open(filepath, "r") as f:
orig_url_dict, url_dict = url_dict, {}
for line in f:
line = line.strip()
# Skip empty lines and comments
if not line or line.startswith("#"):
continue
try:
version, url = line.split(None, 1)
except ValueError:
tty.warn(f"Couldn't parse: {line}")
continue
try:
url_dict[StandardVersion.from_string(version)] = url
except ValueError:
tty.warn(f"Invalid version: {version}")
continue
sorted_and_filtered = sorted(url_dict.keys(), reverse=True)
os.unlink(filepath)
elif command == "f":
tty.msg(
colorize(
f"Examples filters: {VERSION_COLOR}1.2@. "
f"or {VERSION_COLOR}1.1:1.3@. "
f"or {VERSION_COLOR}=1.2, 1.2.2:@."
)
)
try:
# Allow a leading @ version specifier
filter_spec = input(colorize("@*g{filter>} ")).strip().lstrip("@")
except EOFError:
print()
continue
try:
version_filter.intersect(VersionList([filter_spec]))
except ValueError:
tty.warn(f"Invalid version specifier: {filter_spec}")
continue
# Apply filter
sorted_and_filtered = [v for v in sorted_and_filtered if v.satisfies(version_filter)]
elif command == "a":
i = 0
while i < len(sorted_and_filtered):
v = sorted_and_filtered[i]
try:
answer = input(f" {str(v):{max_len}} {url_dict[v]} [Y/n]? ").strip().lower()
except EOFError:
# If ^D, don't fully exit, but go back to the command prompt, now with possibly
# fewer versions
print()
break
if answer in ("n", "no"):
del sorted_and_filtered[i]
elif answer in ("y", "yes", ""):
i += 1
else:
# Went over each version, so go to checksumming
break
elif command == "n":
sorted_and_filtered = [v for v in sorted_and_filtered if v not in known_versions]
elif command == "r":
url_dict = orig_url_dict
sorted_and_filtered = sorted(url_dict.keys(), reverse=True)
version_filter = VersionList([":"])
elif command == "q":
try:
if input("Really quit [y/N]? ").strip().lower() in ("y", "yes"):
return None
except EOFError:
print()
return None
else:
tty.warn(f"Ignoring invalid command: {command}")
print_header = False
continue
return {v: url_dict[v] for v in sorted_and_filtered}
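Because the prompt loop reads through the injectable input callable, a whole interactive session can be driven without a terminal; this is exactly how the tests further down exercise it. A minimal sketch (version values and URLs are illustrative):

    from spack.stage import interactive_version_filter
    from spack.version import Version

    urls = {
        Version("1.1"): "https://example.com/pkg-1.1.tar.gz",
        Version("1.0"): "https://example.com/pkg-1.0.tar.gz",
    }
    answers = iter(["f", "@1.1:", "c"])  # filter to 1.1:, then [c]hecksum
    selected = interactive_version_filter(urls, input=lambda _: next(answers))
    assert selected == {Version("1.1"): "https://example.com/pkg-1.1.tar.gz"}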
def get_checksums_for_versions(
url_by_version: Dict[str, str],
package_name: str,
*,
batch: bool = False,
first_stage_function: Optional[Callable[[Stage, str], None]] = None,
keep_stage: bool = False,
concurrency: Optional[int] = None,
@@ -890,32 +1072,7 @@ def get_checksums_for_versions(
Returns:
A dictionary mapping each version to the corresponding checksum
"""
sorted_versions = sorted(url_by_version.keys(), reverse=True)
# Find length of longest string in the list for padding
max_len = max(len(str(v)) for v in sorted_versions)
num_ver = len(sorted_versions)
tty.msg(
f"Found {llnl.string.plural(num_ver, 'version')} of {package_name}:",
"",
*llnl.util.lang.elide_list(
["{0:{1}} {2}".format(str(v), max_len, url_by_version[v]) for v in sorted_versions]
),
)
print()
if batch:
archives_to_fetch = len(sorted_versions)
else:
archives_to_fetch = tty.get_number(
"How many would you like to checksum?", default=1, abort="q"
)
if not archives_to_fetch:
tty.die("Aborted.")
versions = sorted_versions[:archives_to_fetch]
versions = sorted(url_by_version.keys(), reverse=True)
search_arguments = [(url_by_version[v], v) for v in versions]
version_hashes, errors = {}, []

View File

@@ -899,22 +899,21 @@ def test_tarball_doesnt_include_buildinfo_twice(tmpdir):
tarball = str(tmpdir.join("prefix.tar.gz"))
bindist._do_create_tarball(
tarfile_path=tarball,
binaries_dir=str(p),
pkg_dir="my-pkg-prefix",
buildinfo={"metadata": "new"},
tarfile_path=tarball, binaries_dir=p.strpath, buildinfo={"metadata": "new"}
)
expected_prefix = p.strpath.lstrip("/")
# Verify we don't have a repeated binary_distribution file,
# and that the tarball contains the new one, not the old one.
with tarfile.open(tarball) as tar:
assert syaml.load(tar.extractfile("my-pkg-prefix/.spack/binary_distribution")) == {
assert syaml.load(tar.extractfile(f"{expected_prefix}/.spack/binary_distribution")) == {
"metadata": "new"
}
assert tar.getnames() == [
"my-pkg-prefix",
"my-pkg-prefix/.spack",
"my-pkg-prefix/.spack/binary_distribution",
f"{expected_prefix}",
f"{expected_prefix}/.spack",
f"{expected_prefix}/.spack/binary_distribution",
]
@@ -935,15 +934,17 @@ def test_reproducible_tarball_is_reproducible(tmpdir):
# Create a tarball with a certain mtime of bin/app
os.utime(app, times=(0, 0))
bindist._do_create_tarball(tarball_1, binaries_dir=p, pkg_dir="pkg", buildinfo=buildinfo)
bindist._do_create_tarball(tarball_1, binaries_dir=p.strpath, buildinfo=buildinfo)
# Do it another time with different mtime of bin/app
os.utime(app, times=(10, 10))
bindist._do_create_tarball(tarball_2, binaries_dir=p, pkg_dir="pkg", buildinfo=buildinfo)
bindist._do_create_tarball(tarball_2, binaries_dir=p.strpath, buildinfo=buildinfo)
# They should be bitwise identical:
assert filecmp.cmp(tarball_1, tarball_2, shallow=False)
expected_prefix = p.strpath.lstrip("/")
# Sanity check for contents:
with tarfile.open(tarball_1, mode="r") as f:
for m in f.getmembers():
@@ -951,11 +952,11 @@ def test_reproducible_tarball_is_reproducible(tmpdir):
assert m.uname == m.gname == ""
assert set(f.getnames()) == {
"pkg",
"pkg/bin",
"pkg/bin/app",
"pkg/.spack",
"pkg/.spack/binary_distribution",
f"{expected_prefix}",
f"{expected_prefix}/bin",
f"{expected_prefix}/bin/app",
f"{expected_prefix}/.spack",
f"{expected_prefix}/.spack/binary_distribution",
}
@@ -979,21 +980,23 @@ def test_tarball_normalized_permissions(tmpdir):
with open(data, "w", opener=lambda path, flags: os.open(path, flags, 0o477)) as f:
f.write("hello world")
bindist._do_create_tarball(tarball, binaries_dir=p, pkg_dir="pkg", buildinfo={})
bindist._do_create_tarball(tarball, binaries_dir=p.strpath, buildinfo={})
expected_prefix = p.strpath.lstrip("/")
with tarfile.open(tarball) as tar:
path_to_member = {member.name: member for member in tar.getmembers()}
# directories should have 0o755
assert path_to_member["pkg"].mode == 0o755
assert path_to_member["pkg/bin"].mode == 0o755
assert path_to_member["pkg/.spack"].mode == 0o755
assert path_to_member[f"{expected_prefix}"].mode == 0o755
assert path_to_member[f"{expected_prefix}/bin"].mode == 0o755
assert path_to_member[f"{expected_prefix}/.spack"].mode == 0o755
# executable-by-user files should be 0o755
assert path_to_member["pkg/bin/app"].mode == 0o755
assert path_to_member[f"{expected_prefix}/bin/app"].mode == 0o755
# not-executable-by-user files should be 0o644
assert path_to_member["pkg/share/file"].mode == 0o644
assert path_to_member[f"{expected_prefix}/share/file"].mode == 0o644
def test_tarball_common_prefix(dummy_prefix, tmpdir):
@@ -1062,3 +1065,50 @@ def test_tarfile_with_files_outside_common_prefix(tmpdir, dummy_prefix):
ValueError, match="Tarball contains file /etc/config_file outside of prefix"
):
bindist._ensure_common_prefix(tarfile.open("broken.tar", mode="r"))
def test_tarfile_of_spec_prefix(tmpdir):
"""Tests whether hardlinks, symlinks, files and dirs are added correctly,
and that the order of entries is correct."""
prefix = tmpdir.mkdir("prefix")
prefix.ensure("a_directory", dir=True).join("file").write("hello")
prefix.ensure("c_directory", dir=True).join("file").write("hello")
prefix.ensure("b_directory", dir=True).join("file").write("hello")
prefix.join("file").write("hello")
os.symlink(prefix.join("file"), prefix.join("symlink"))
os.link(prefix.join("file"), prefix.join("hardlink"))
file = tmpdir.join("example.tar")
with tarfile.open(file, mode="w") as tar:
bindist.tarfile_of_spec_prefix(tar, prefix.strpath)
expected_prefix = prefix.strpath.lstrip("/")
with tarfile.open(file, mode="r") as tar:
# Verify that entries are added in depth-first pre-order, files preceding dirs,
# entries ordered alphabetically
assert tar.getnames() == [
f"{expected_prefix}",
f"{expected_prefix}/file",
f"{expected_prefix}/hardlink",
f"{expected_prefix}/symlink",
f"{expected_prefix}/a_directory",
f"{expected_prefix}/a_directory/file",
f"{expected_prefix}/b_directory",
f"{expected_prefix}/b_directory/file",
f"{expected_prefix}/c_directory",
f"{expected_prefix}/c_directory/file",
]
# Check that the types are all correct
assert tar.getmember(f"{expected_prefix}").isdir()
assert tar.getmember(f"{expected_prefix}/file").isreg()
assert tar.getmember(f"{expected_prefix}/hardlink").islnk()
assert tar.getmember(f"{expected_prefix}/symlink").issym()
assert tar.getmember(f"{expected_prefix}/a_directory").isdir()
assert tar.getmember(f"{expected_prefix}/a_directory/file").isreg()
assert tar.getmember(f"{expected_prefix}/b_directory").isdir()
assert tar.getmember(f"{expected_prefix}/b_directory/file").isreg()
assert tar.getmember(f"{expected_prefix}/c_directory").isdir()
assert tar.getmember(f"{expected_prefix}/c_directory/file").isreg()

View File

@@ -169,7 +169,7 @@ def test_remove_and_add_a_source(mutable_config):
assert not sources
# Add it back and check we restored the initial state
_bootstrap("add", "github-actions", "$spack/share/spack/bootstrap/github-actions-v0.3")
_bootstrap("add", "github-actions", "$spack/share/spack/bootstrap/github-actions-v0.5")
sources = spack.bootstrap.core.bootstrapping_sources()
assert len(sources) == 1

View File

@@ -7,12 +7,12 @@
import pytest
import llnl.util.tty as tty
import spack.cmd.checksum
import spack.repo
import spack.spec
from spack.main import SpackCommand
from spack.stage import interactive_version_filter
from spack.version import Version
spack_checksum = SpackCommand("checksum")
@@ -56,18 +56,134 @@ def test_checksum(arguments, expected, mock_packages, mock_clone_repo, mock_stag
assert "version(" in output
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
def test_checksum_interactive(mock_packages, mock_fetch, mock_stage, monkeypatch):
# TODO: mock_fetch doesn't actually work with stage, working around by ignoring
# fail_on_error for now
def _get_number(*args, **kwargs):
return 1
def input_from_commands(*commands):
"""Create a function that returns the next command from a list of inputs for interactive spack
checksum. If None is encountered, this is equivalent to EOF / ^D."""
commands = iter(commands)
monkeypatch.setattr(tty, "get_number", _get_number)
def _input(prompt):
cmd = next(commands)
if cmd is None:
raise EOFError
assert isinstance(cmd, str)
return cmd
output = spack_checksum("preferred-test", fail_on_error=False)
assert "version of preferred-test" in output
assert "version(" in output
return _input
def test_checksum_interactive_filter():
# Filter effectively by 1:1.0, then checksum.
input = input_from_commands("f", "@1:", "f", "@:1.0", "c")
assert interactive_version_filter(
{
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0.1"): "https://www.example.com/pkg-1.0.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
},
input=input,
) == {
Version("1.0.1"): "https://www.example.com/pkg-1.0.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
}
def test_checksum_interactive_return_from_filter_prompt():
# Enter and then exit filter subcommand.
input = input_from_commands("f", None, "c")
assert interactive_version_filter(
{
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0.1"): "https://www.example.com/pkg-1.0.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
},
input=input,
) == {
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0.1"): "https://www.example.com/pkg-1.0.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
}
def test_checksum_interactive_quit_returns_none():
# Quit after filtering something out (y to confirm quit)
input = input_from_commands("f", "@1:", "q", "y")
assert (
interactive_version_filter(
{
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
},
input=input,
)
is None
)
def test_checksum_interactive_reset_resets():
# Filter 1:, then reset, then filter :0, should just give 0.9 (it was filtered out
# before reset)
input = input_from_commands("f", "@1:", "r", "f", ":0", "c")
assert interactive_version_filter(
{
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
},
input=input,
) == {Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz"}
def test_checksum_interactive_ask_each():
# Ask each should run on the filtered list. First select 1.x, then select only the second
# entry, which is 1.0.1.
input = input_from_commands("f", "@1:", "a", "n", "y", "n")
assert interactive_version_filter(
{
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0.1"): "https://www.example.com/pkg-1.0.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
},
input=input,
) == {Version("1.0.1"): "https://www.example.com/pkg-1.0.1.tar.gz"}
def test_checksum_interactive_quit_from_ask_each():
# Enter ask each mode, select the second item, then quit from submenu, then checksum, which
# should still include the last item at which ask each stopped.
input = input_from_commands("a", "n", "y", None, "c")
assert interactive_version_filter(
{
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
},
input=input,
) == {
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
}
def test_checksum_interactive_new_only():
# The 1.0 version is known already, and should be dropped on `n`.
input = input_from_commands("n", "c")
assert interactive_version_filter(
{
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("1.0"): "https://www.example.com/pkg-1.0.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
},
known_versions=[Version("1.0")],
input=input,
) == {
Version("1.1"): "https://www.example.com/pkg-1.1.tar.gz",
Version("0.9"): "https://www.example.com/pkg-0.9.tar.gz",
}
def test_checksum_versions(mock_packages, mock_clone_repo, mock_fetch, mock_stage):

View File

@@ -1990,8 +1990,7 @@ def test_ci_reproduce(
ci_cmd("generate", "--output-file", pipeline_path, "--artifacts-root", artifacts_root)
target_name = spack.platforms.test.Test.default
job_name = ci.get_job_name(job_spec, "test-debian6-%s" % target_name, None)
job_name = ci.get_job_name(job_spec)
repro_file = os.path.join(working_dir.strpath, "repro.json")
repro_details = {

View File

@@ -2121,12 +2121,9 @@ def duplicates_test_repository():
@pytest.mark.usefixtures("mutable_config", "duplicates_test_repository")
@pytest.mark.only_clingo("Not supported by the original concretizer")
class TestConcretizeSeparately:
@pytest.mark.parametrize("strategy", ["minimal", "full"])
@pytest.mark.skipif(
os.environ.get("SPACK_TEST_SOLVER") == "original",
reason="Not supported by the original concretizer",
)
def test_two_gmake(self, strategy):
"""Tests that we can concretize a spec with nodes using the same build
dependency pinned at different versions.
@@ -2151,10 +2148,6 @@ def test_two_gmake(self, strategy):
assert len(pinned_gmake) == 1 and pinned_gmake[0].satisfies("@=3.0")
@pytest.mark.parametrize("strategy", ["minimal", "full"])
@pytest.mark.skipif(
os.environ.get("SPACK_TEST_SOLVER") == "original",
reason="Not supported by the original concretizer",
)
def test_two_setuptools(self, strategy):
"""Tests that we can concretize separate build dependencies, when we are dealing
with extensions.
@@ -2191,10 +2184,6 @@ def test_two_setuptools(self, strategy):
gmake = s["python"].dependencies(name="gmake", deptype="build")
assert len(gmake) == 1 and gmake[0].satisfies("@=3.0")
@pytest.mark.skipif(
os.environ.get("SPACK_TEST_SOLVER") == "original",
reason="Not supported by the original concretizer",
)
def test_solution_without_cycles(self):
"""Tests that when we concretize a spec with cycles, a fallback kicks in to recompute
a solution without cycles.
@@ -2207,6 +2196,21 @@ def test_solution_without_cycles(self):
assert s["cycle-a"].satisfies("~cycle")
assert s["cycle-b"].satisfies("+cycle")
@pytest.mark.parametrize("strategy", ["minimal", "full"])
def test_pure_build_virtual_dependency(self, strategy):
"""Tests that we can concretize a pure build virtual dependency, and ensures that
pure build virtual dependencies are accounted in the list of possible virtual
dependencies.
virtual-build@1.0
| [type=build, virtual=pkgconfig]
pkg-config@1.0
"""
spack.config.CONFIG.set("concretizer:duplicates:strategy", strategy)
s = Spec("virtual-build").concretized()
assert s["pkgconfig"].name == "pkg-config"
@pytest.mark.parametrize(
"v_str,v_opts,checksummed",

View File

@@ -82,23 +82,6 @@ def test_strip_is_set_from_config(minimal_configuration):
assert writer.strip is False
def test_extra_instructions_is_set_from_config(minimal_configuration):
writer = writers.create(minimal_configuration)
assert writer.extra_instructions == (None, None)
test_line = "RUN echo Hello world!"
e = minimal_configuration["spack"]["container"]
e["extra_instructions"] = {}
e["extra_instructions"]["build"] = test_line
writer = writers.create(minimal_configuration)
assert writer.extra_instructions == (test_line, None)
e["extra_instructions"]["final"] = test_line
del e["extra_instructions"]["build"]
writer = writers.create(minimal_configuration)
assert writer.extra_instructions == (None, test_line)
def test_custom_base_images(minimal_configuration):
"""Test setting custom base images from configuration file"""
minimal_configuration["spack"]["container"]["images"] = {

View File

@@ -1,5 +1,5 @@
bootstrap:
sources:
- name: 'github-actions'
metadata: $spack/share/spack/bootstrap/github-actions-v0.3
metadata: $spack/share/spack/bootstrap/github-actions-v0.5
trusted: {}


@@ -4,4 +4,4 @@ concretizer:
granularity: microarchitectures
host_compatible: false
duplicates:
strategy: none
strategy: minimal


@@ -121,7 +121,6 @@ def test_ld_flags_cmake(self, temp_env):
"-DCMAKE_EXE_LINKER_FLAGS=-mthreads",
"-DCMAKE_MODULE_LINKER_FLAGS=-mthreads",
"-DCMAKE_SHARED_LINKER_FLAGS=-mthreads",
"-DCMAKE_STATIC_LINKER_FLAGS=-mthreads",
}
def test_ld_libs_cmake(self, temp_env):


@@ -17,6 +17,7 @@
import spack.package_base
import spack.spec
from spack.version import (
EmptyRangeError,
GitVersion,
StandardVersion,
Version,
@@ -695,9 +696,9 @@ def test_version_range_nonempty():
def test_empty_version_range_raises():
with pytest.raises(ValueError):
with pytest.raises(EmptyRangeError, match="2:1.0 is an empty range"):
assert VersionRange("2", "1.0")
with pytest.raises(ValueError):
with pytest.raises(EmptyRangeError, match="2:1.0 is an empty range"):
assert ver("2:1.0")


@@ -647,7 +647,7 @@ def find_versions_of_archive(
list_urls |= additional_list_urls
# Grab some web pages to scrape.
pages, links = spack.util.web.spider(list_urls, depth=list_depth, concurrency=concurrency)
_, links = spack.util.web.spider(list_urls, depth=list_depth, concurrency=concurrency)
# Scrape them for archive URLs
regexes = []


@@ -26,8 +26,8 @@ def prefix_inspections(platform):
A dictionary mapping subdirectory names to lists of environment
variables to modify with that directory if it exists.
"""
inspections = spack.config.get("modules:prefix_inspections", {})
if inspections:
inspections = spack.config.get("modules:prefix_inspections")
if isinstance(inspections, dict):
return inspections
inspections = {

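The hunk above distinguishes an explicitly configured empty mapping from no configuration at all: with the old truthiness test, an empty dict fell through to the builtin defaults. A small sketch of the before/after behavior, using a plain dict as a stand-in for the config API (names illustrative, not Spack's API):

    DEFAULTS = {"bin": ["PATH"]}  # stand-in for the builtin inspections

    def old_lookup(config):
        inspections = config.get("modules:prefix_inspections", {})
        if inspections:  # an explicit {} is falsy and leaks the defaults
            return inspections
        return DEFAULTS

    def new_lookup(config):
        inspections = config.get("modules:prefix_inspections")
        if isinstance(inspections, dict):  # any dict, including {}, wins
            return inspections
        return DEFAULTS

    cfg = {"modules:prefix_inspections": {}}
    assert old_lookup(cfg) == DEFAULTS   # empty mapping was ignored
    assert new_lookup(cfg) == {}         # empty mapping is honored
    assert new_lookup({}) == DEFAULTS    # unset key still gets the defaults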

@@ -61,7 +61,7 @@ def executable(exe: str, args: List[str]) -> int:
return cmd.returncode
def editor(*args: List[str], exec_fn: Callable[[str, List[str]], int] = os.execv) -> bool:
def editor(*args: str, exec_fn: Callable[[str, List[str]], int] = os.execv) -> bool:
"""Invoke the user's editor.
This will try to execute the following, in order:

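The signature fix above corrects a common variadic-typing slip: the annotation on `*args` describes each positional argument, not the collected tuple. A brief illustration with hypothetical names, standard library only:

    import os
    from typing import Callable, List

    def editor_old(*args: List[str]) -> bool:
        # Wrong: claims every positional argument is itself a List[str].
        return bool(args)

    def editor_new(*args: str, exec_fn: Callable[[str, List[str]], int] = os.execv) -> bool:
        # Right: each argument is a str; inside the body, args is Tuple[str, ...].
        return exec_fn("/usr/bin/vi", ["vi", *args]) == 0

    # mypy accepts editor_new("notes.txt") but rejects editor_old("notes.txt").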

@@ -270,16 +270,6 @@ def visit_Assert(self, node):
self.write(", ")
self.dispatch(node.msg)
def visit_Exec(self, node):
self.fill("exec ")
self.dispatch(node.body)
if node.globals:
self.write(" in ")
self.dispatch(node.globals)
if node.locals:
self.write(", ")
self.dispatch(node.locals)
def visit_Global(self, node):
self.fill("global ")
interleave(lambda: self.write(", "), self.write, node.names)
@@ -338,31 +328,6 @@ def visit_Try(self, node):
with self.block():
self.dispatch(node.finalbody)
def visit_TryExcept(self, node):
self.fill("try")
with self.block():
self.dispatch(node.body)
for ex in node.handlers:
self.dispatch(ex)
if node.orelse:
self.fill("else")
with self.block():
self.dispatch(node.orelse)
def visit_TryFinally(self, node):
if len(node.body) == 1 and isinstance(node.body[0], ast.TryExcept):
# try-except-finally
self.dispatch(node.body)
else:
self.fill("try")
with self.block():
self.dispatch(node.body)
self.fill("finally")
with self.block():
self.dispatch(node.finalbody)
def visit_ExceptHandler(self, node):
self.fill("except")
if node.type:
@@ -380,6 +345,10 @@ def visit_ClassDef(self, node):
self.fill("@")
self.dispatch(deco)
self.fill("class " + node.name)
if getattr(node, "type_params", False):
self.write("[")
interleave(lambda: self.write(", "), self.dispatch, node.type_params)
self.write("]")
with self.delimit_if("(", ")", condition=node.bases or node.keywords):
comma = False
for e in node.bases:
@@ -394,21 +363,6 @@ def visit_ClassDef(self, node):
else:
comma = True
self.dispatch(e)
if sys.version_info[:2] < (3, 5):
if node.starargs:
if comma:
self.write(", ")
else:
comma = True
self.write("*")
self.dispatch(node.starargs)
if node.kwargs:
if comma:
self.write(", ")
else:
comma = True
self.write("**")
self.dispatch(node.kwargs)
with self.block():
self.dispatch(node.body)
@@ -425,6 +379,10 @@ def __FunctionDef_helper(self, node, fill_suffix):
self.dispatch(deco)
def_str = fill_suffix + " " + node.name
self.fill(def_str)
if getattr(node, "type_params", False):
self.write("[")
interleave(lambda: self.write(", "), self.dispatch, node.type_params)
self.write("]")
with self.delimit("(", ")"):
self.dispatch(node.args)
if getattr(node, "returns", False):
@@ -640,11 +598,6 @@ def visit_Name(self, node):
def visit_NameConstant(self, node):
self.write(repr(node.value))
def visit_Repr(self, node):
self.write("`")
self.dispatch(node.value)
self.write("`")
def _write_constant(self, value):
if isinstance(value, (float, complex)):
# Substitute overflowing decimal literal for AST infinities.
@@ -985,16 +938,10 @@ def visit_arguments(self, node):
self.write(", ")
self.write("*")
if node.vararg:
if hasattr(node.vararg, "arg"):
self.write(node.vararg.arg)
if node.vararg.annotation:
self.write(": ")
self.dispatch(node.vararg.annotation)
else:
self.write(node.vararg)
if getattr(node, "varargannotation", None):
self.write(": ")
self.dispatch(node.varargannotation)
self.write(node.vararg.arg)
if node.vararg.annotation:
self.write(": ")
self.dispatch(node.vararg.annotation)
# keyword-only arguments
if getattr(node, "kwonlyargs", False):
@@ -1014,16 +961,10 @@ def visit_arguments(self, node):
first = False
else:
self.write(", ")
if hasattr(node.kwarg, "arg"):
self.write("**" + node.kwarg.arg)
if node.kwarg.annotation:
self.write(": ")
self.dispatch(node.kwarg.annotation)
else:
self.write("**" + node.kwarg)
if getattr(node, "kwargannotation", None):
self.write(": ")
self.dispatch(node.kwargannotation)
self.write("**" + node.kwarg.arg)
if node.kwarg.annotation:
self.write(": ")
self.dispatch(node.kwarg.annotation)
def visit_keyword(self, node):
if node.arg is None:
@@ -1138,3 +1079,23 @@ def visit_MatchOr(self, node):
with self.require_parens(_Precedence.BOR, node):
self.set_precedence(pnext(_Precedence.BOR), *node.patterns)
interleave(lambda: self.write(" | "), self.dispatch, node.patterns)
def visit_TypeAlias(self, node):
self.fill("type ")
self.dispatch(node.name)
self.write(" = ")
self.dispatch(node.value)
def visit_TypeVar(self, node):
self.write(node.name)
if node.bound:
self.write(": ")
self.dispatch(node.bound)
def visit_TypeVarTuple(self, node):
self.write("*")
self.write(node.name)
def visit_ParamSpec(self, node):
self.write("**")
self.write(node.name)

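The removed visitors covered long-dead Python 2 constructs (the `exec` statement, backtick repr, the split TryExcept/TryFinally nodes), while the added ones handle PEP 695 type parameters from Python 3.12. A round-trip sketch, assuming Python 3.12 for parsing and that the vendored unparser is importable as `spack.util.unparse.unparse`:

    import ast
    import textwrap

    from spack.util.unparse import unparse  # the module patched above

    source = textwrap.dedent(
        """\
        type Pair[T] = tuple[T, T]

        class Box[T]:
            value: T

        def first[T](items: list[T]) -> T:
            return items[0]
        """
    )
    tree = ast.parse(source)  # PEP 695 syntax parses on Python >= 3.12
    print(unparse(tree))      # exercises visit_TypeAlias/visit_TypeVar/type_params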

@@ -110,19 +110,28 @@ def handle_starttag(self, tag, attrs):
self.links.append(val)
class IncludeFragmentParser(HTMLParser):
class ExtractMetadataParser(HTMLParser):
"""This parser takes an HTML page and selects the include-fragments,
used on GitHub, https://github.github.io/include-fragment-element."""
used on GitHub, https://github.github.io/include-fragment-element,
as well as a possible base url."""
def __init__(self):
super().__init__()
self.links = []
self.fragments = []
self.base_url = None
def handle_starttag(self, tag, attrs):
# <include-fragment src="..." />
if tag == "include-fragment":
for attr, val in attrs:
if attr == "src":
self.links.append(val)
self.fragments.append(val)
# <base href="..." />
elif tag == "base":
for attr, val in attrs:
if attr == "href":
self.base_url = val
def read_from_url(url, accept_content_type=None):
@@ -625,12 +634,15 @@ def _spider(url: urllib.parse.ParseResult, collect_nested: bool, _visited: Set[s
# Parse out the include-fragments in the page
# https://github.github.io/include-fragment-element
include_fragment_parser = IncludeFragmentParser()
include_fragment_parser.feed(page)
metadata_parser = ExtractMetadataParser()
metadata_parser.feed(page)
# Change of base URL due to <base href="..." /> tag
response_url = metadata_parser.base_url or response_url
fragments = set()
while include_fragment_parser.links:
raw_link = include_fragment_parser.links.pop()
while metadata_parser.fragments:
raw_link = metadata_parser.fragments.pop()
abs_link = url_util.join(response_url, raw_link.strip(), resolve_href=True)
try:

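A self-contained sketch of the parser behavior introduced above (the real class lives in `spack.util.web`), runnable with the standard library alone:

    from html.parser import HTMLParser

    class ExtractMetadataParser(HTMLParser):
        """Collect <include-fragment src="..."> fragments and an optional <base href>."""

        def __init__(self):
            super().__init__()
            self.fragments = []
            self.base_url = None

        def handle_starttag(self, tag, attrs):
            if tag == "include-fragment":
                self.fragments += [val for attr, val in attrs if attr == "src"]
            elif tag == "base":
                for attr, val in attrs:
                    if attr == "href":
                        self.base_url = val

    parser = ExtractMetadataParser()
    parser.feed('<base href="https://example.com/repo/"/><include-fragment src="./tags"/>')
    assert parser.base_url == "https://example.com/repo/"
    assert parser.fragments == ["./tags"]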

@@ -16,6 +16,7 @@
"""
from .common import (
EmptyRangeError,
VersionChecksumError,
VersionError,
VersionLookupError,
@@ -54,5 +55,6 @@
"VersionError",
"VersionChecksumError",
"VersionLookupError",
"EmptyRangeError",
"any_version",
]


@@ -35,3 +35,7 @@ class VersionChecksumError(VersionError):
class VersionLookupError(VersionError):
"""Raised for errors looking up git commits as versions."""
class EmptyRangeError(VersionError):
"""Raised when constructing an empty version range."""


@@ -12,6 +12,7 @@
from .common import (
COMMIT_VERSION,
EmptyRangeError,
VersionLookupError,
infinity_versions,
is_git_version,
@@ -595,14 +596,17 @@ def up_to(self, index) -> StandardVersion:
class ClosedOpenRange:
def __init__(self, lo: StandardVersion, hi: StandardVersion):
if hi < lo:
raise ValueError(f"{lo}:{hi} is an empty range")
raise EmptyRangeError(f"{lo}..{hi} is an empty range")
self.lo: StandardVersion = lo
self.hi: StandardVersion = hi
@classmethod
def from_version_range(cls, lo: StandardVersion, hi: StandardVersion):
"""Construct ClosedOpenRange from lo:hi range."""
return ClosedOpenRange(lo, next_version(hi))
try:
return ClosedOpenRange(lo, next_version(hi))
except EmptyRangeError as e:
raise EmptyRangeError(f"{lo}:{hi} is an empty range") from e
def __str__(self):
# This simplifies 3.1:<3.2 to 3.1:3.1 to 3.1

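Together with the test change earlier in this diff, the empty-range condition now raises a dedicated exception and reports the range in its user-facing `lo:hi` spelling. A short check, assuming a Spack checkout on the Python path:

    import pytest

    from spack.version import EmptyRangeError, ver

    # lo > hi makes the closed-open range empty; from_version_range re-raises
    # with the original "lo:hi" rendering instead of the internal bounds.
    with pytest.raises(EmptyRangeError, match="2:1.0 is an empty range"):
        ver("2:1.0")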

@@ -1,268 +0,0 @@
{
"verified": [
{
"binaries": [
[
"clingo-bootstrap",
"i5rx6vbyw7cyg3snajcpnuozo7l3lcab",
"c55d1c76adb82ac9fbe67725641ef7e4fe1ae11e2e8da0dc93a3efe362549127"
]
],
"python": "python@3.10",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"xoxkdgo3n332ewhbh7pz2zuevrjxkrke",
"b50e2fba026e85af3f99b3c412b4f0c88ec2fbce15b48eeb75072f1d3737f3cc"
]
],
"python": "python@3.5",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"sgmirxbu3bpn4rdpfs6jlyycfrkfxl5i",
"b0a574df6f5d59491a685a31a8ed99fb345c850a91df62ef232fbe0cca716ed1"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"5hn7hszlizeqq3leqi6lrdmyy5ssv6zs",
"36e24bc3bd27b125fdeb30d51d2554e44288877c0ce6df5a878bb4e8a1d5847a"
]
],
"python": "python@3.7",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"qk3ecxakadq4naakng6mhdfkwauef3dn",
"9d974c0d2b546d18f0ec35e08d5ba114bf2867f7ff7c7ea990b79d019ece6380"
]
],
"python": "python@3.8",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"2omdsvzshkn2u3l5vwvwoey4es5cowfu",
"cbf72eb932ac847f87b1640f8e70e26f5261967288f7d6db19206ef352e36a88"
]
],
"python": "python@3.9",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"ifgzrctoh2ibrmitp6ushrvrnaeqtkr7",
"1c609df7351286fe09aa3452fa7ed7fedf903e9fa12cde89b916a0fc4c022949"
]
],
"python": "python@3.10",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"esfzjhodgh5be22hvh3trg2ojzrmhzwt",
"8d070cdb2a5103cde3e6f873b1eb11d25f60464f3059d8643f943e5c9a9ec76c"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"5b4uhkhrvtvdmsnctjx2isrxciy6v2o2",
"336b8b1202a8a28a0e34a98e5780ae0e2b2370b342ce67434551009b1a7c8db9"
]
],
"python": "python@3.7",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"czapgrrey6llnsu2m4qaamv3so2lybxm",
"16bdfe4b08ee8da38f3e2c7d5cc44a38d87696cc2b6de0971a4de25efb8ad8e4"
]
],
"python": "python@3.8",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"7za6vsetahbghs4d2qe4ajtf2iyiacwx",
"730ae7e6096ec8b83a0fc9464dda62bd6c2fec1f8565bb291f4d1ffe7746703b"
]
],
"python": "python@3.9",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"zulnxrmchldtasffqw6qacmgg4y2qumj",
"8988325db53c0c650f64372c21571ac85e9ba4577975d14ae7dba8ab7728b5fc"
]
],
"python": "python@3.10",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"lx54ebqzwtjpfgch7kagoxkmul56z7fa",
"81d64229299e76f9dc81f88d286bc94725e7cbcbb29ad0d66aaeaff73dd6473a"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"isu2rjoicl4xzmbl3k2c4bg35gvejkgz",
"fcc4b052832cfd327d11f657c2b7715d981b0894ed03bbce18b23a842c7d706d"
]
],
"python": "python@3.7",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"ob3k3g2wjy7cw33lfobjar44sqmojyth",
"f51fd6256bfd3afc8470614d87df61e5c9dd582fcc70f707ca66ba2b7255da12"
]
],
"python": "python@3.8",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"norpsmparkl5dfuzdqj4537o77vjbgsl",
"477c041857b60f29ff9d6c7d2982b7eb49a2e02ebbc98af11488c32e2fb24081"
]
],
"python": "python@3.9",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"gypv5loj2ml73duq6sr76yg5rj25te2m",
"c855d7d32aadec37c41e51f19b83558b32bc0b946a9565dba0e659c6820bd6c3"
]
],
"python": "python@2.7+ucs4",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"rjopyx7hum3hqhgsdyw3st7frdfgrv3p",
"0e555f9bc99b4e4152939b30b2257f4f353941d152659e716bf6123c0ce11a60"
]
],
"python": "python@2.7~ucs4",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"2l45t4kw3cqqwj6vbxhfwhzlo6b3q2p4",
"6cb90de5a3d123b7408cfef693a9a78bb69c66abbfed746c1e85aa0acb848d03"
]
],
"python": "python@3.10",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"4psiezojm7dexequtbnav77wvgcajigq",
"b3fc33b5482357613294becb54968bd74de638abeae69e27c6c4319046a7e352"
]
],
"python": "python@3.5",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"dzhvhynye4z7oalowdcy5zt25lej3m2n",
"61c5f3e80bcc7acfc65e335f1910762df2cc5ded9d7e1e5977380a24de553dd7"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"dtwevigmwgke4g6ee5byktpmzmrp2kvx",
"636937244b58611ec2eedb4422a1076fcaf09f3998593befb5a6ff1a74e1d5f7"
]
],
"python": "python@3.7",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"shqedxgvjnhiwdcdrvjhbd73jaevv7wt",
"b3615b2a94a8a15fddaa74cf4d9f9b3a516467a843cdeab597f72dcf6be5e31d"
]
],
"python": "python@3.8",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"z6v6zvc6awioeompbvo735b4flr3yuyz",
"1389192bd74c1f7059d95c4a41500201cbc2905cbba553678613e0b7e3b96c71"
]
],
"python": "python@3.9",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
}
]
}


@@ -1,204 +0,0 @@
{
"verified": [
{
"binaries": [
[
"libiconv",
"d6dhoguolmllbzy2h6pnvjm3tti6uy6f",
"7fe765a87945991d4e57782ed67c4bf42a10f95582eecd6f57de80a545bde821"
],
[
"npth",
"x6fb7zx6n7mos5knvi6wlnaadd7r2szx",
"fd1e5a62107339f45219c32ba20b5e82aa0880c31ac86d1b245d388ca4546990"
],
[
"zlib",
"c5wm3jilx6zsers3sfgdisjqusoza4wr",
"7500a717c62736872aa65df4599f797ef67b21086dd6236b4c7712cfffac9bf3"
],
[
"libassuan",
"3qv4bprobfwes37clg764cfipdzjdbto",
"d85cd9d2c63a296300d4dcbd667421956df241109daef5e12d3ca63fa241cb14"
],
[
"libgcrypt",
"3y4ubdgxvgpvhxr3bk4l5mkw4gv42n7e",
"9dad7c2635344957c4db68378964d3af84ea052d45dbe8ded9a6e6e47211daa8"
],
[
"libgpg-error",
"doido34kfwsvwpj4c4jcocahjb5ltebw",
"20e5c238bee91d2a841f0b4bd0358ded59a0bd665d7f251fd9cd42f83e0b283b"
],
[
"libksba",
"mttecm7gzdv544lbzcoahchnboxysrvi",
"1c0ae64e828a597e4cf15dd997c66cd677e41f68c63db09b9551480a197052a2"
],
[
"pinentry",
"se7xgv7yf4ywpjnbv7voxgeuuvs77ahb",
"2fd13fbee7ca2361dc5dd09708c72d0489611301b60635cb0206bc5b94add884"
],
[
"gnupg",
"yannph34bpaqkhsv5mz2icwhy3epiqxd",
"1de8b4e119fa3455d0170466fa0fb8e04957fab740aec32535b4667279312b3f"
]
],
"spec": "gnupg@2.3: %apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"zlib",
"t2hjzsyf3txkg64e4bq3nihe26rzzdws",
"171e720840a28af50b62141be77bc525e666cffd1fbbe2ee62673214e8b0280f"
],
[
"libiconv",
"yjdji2wj4njz72fyrg46jlz5f5wfbhfr",
"94c773c3d0294cf248ec1f3e9862669dfa743fe1a76de580d9425c14c8f7dcd2"
],
[
"npth",
"kx3vzmpysee7jxwsudarthrmyop6hzgc",
"f8cc6204fa449ce576d450396ec2cad40a75d5712c1381a61ed1681a54f9c79f"
],
[
"libassuan",
"e5n5l5ftzwxs4ego5furrdbegphb6hxp",
"ef0428874aa81bcb9944deed88e1fc639f629fe3d522cab3c281235ae2a53db9"
],
[
"libgcrypt",
"wyncpahrpqsmpk4b7nlhg5ekkjzyjdzs",
"2309548c51a17f580f036445b701feb85d2bc552b9c4404418c2f223666cfe3b"
],
[
"libgpg-error",
"vhcdd6jkbiday2seg3rlkbzpf6jzfdx7",
"79dd719538d9223d6287c0bba07b981944ab6d3ab11e5060274f1b7c727daf55"
],
[
"libksba",
"azcgpgncynoox3dce45hkz46bp2tb5rr",
"15d301f201a5162234261fcfccd579b0ff484131444a0b6f5c0006224bb155d6"
],
[
"pinentry",
"e3z5ekbv4jlsie4qooubcfvsk2sb6t7l",
"5fd27b8e47934b06554e84f1374a90a93e71e60a14dbde672a8da414b27b97f4"
],
[
"gnupg",
"i5agfvsmzdokuooaqhlh6vro5giwei2t",
"f1bde7a1f0c84c1bbcde5757a96cf7a3e9157c2cfa9907fde799aa8e04c0d51f"
]
],
"spec": "gnupg@2.3: %gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"zlib",
"v5rr6ba37tudzfuv2jszwikgcl4wd3cd",
"371ad4b277af7b97c7871b9931f2764c97362620c7990c5ad8fdb5c42a1d30dc"
],
[
"libiconv",
"bvcnx2e4bumjcgya4dczdhjb3fhqyass",
"65a00b717b3a4ee1b5ab9f84163722bdfea8eb20a2eecc9cf657c0eaac0227e9"
],
[
"npth",
"dkb6ez6a4c3iyrv67llwf5mzmynqdmtj",
"4d77351661d0e0130b1c89fb6c6a944aee41d701ef80d056d3fc0178a7f36075"
],
[
"libassuan",
"tuydcxdbb5jfvw3gri7y24b233kgotgd",
"d8775e7c1dd252437c6fa0781675b1d2202cfc0c8190e60d248928b6fca8bc9f"
],
[
"libgcrypt",
"kgxmg4eukwx6nn3bdera3j7cf7hxfy6n",
"6046523f10ed54be50b0211c27191b3422886984fc0c00aed1a85d1f121c42e6"
],
[
"libgpg-error",
"ewhrwnltlrzkpqyix2vbkf4ruq6b6ea3",
"3f3bbbf1a3cb82d39313e39bcbe3dad94a176130fc0e9a8045417d6223fb4f31"
],
[
"libksba",
"onxt5ry2fotgwiognwmhxlgnekuvtviq",
"3a4df13f8b880441d1df4b234a4ca01de7601d84a6627185c2b3191a34445d40"
],
[
"pinentry",
"fm3m4rsszzxxakcpssd34jbbe4ihrhac",
"73afa46176a7ec8f02d01a2caad3e400dc18c3c8a53f92b88a9aa9e3653db3e6"
],
[
"gnupg",
"gwr65ovh4wbxjgniaoqlbt3yla6rdikj",
"7a3f7afe69ca67797a339c04028ca45a9630933020b57cb56e28453197fe8a57"
]
],
"spec": "gnupg@2.3: %gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"libiconv",
"vec3ac6t4ag3lb7ycvisafthqmpci74b",
"35d184218e525d8aaea60082fd2d0f1e80449ec32746cceda2ea0ca106e9a095"
],
[
"npth",
"jx3kmy3ilc66rgg5mqtbed5z6qwt3vrd",
"74c2c1b087667661da3e24ac83bcecf1bc2d10d69e7678d1fd232875fe295135"
],
[
"zlib",
"wnpbp4pu7xca24goggcy773d2y4pobbd",
"bcbd5310e8c5e75cbf33d8155448b212486dc543469d6df7e56dcecb6112ee88"
],
[
"libassuan",
"ynn33wutdtoo2lbjjoizgslintxst2zl",
"ac3b060690c6da0c64dcf35da047b84cc81793118fb9ff29b993f3fb9efdc258"
],
[
"libgcrypt",
"zzofcjer43vsxwj27c3rxapjxhsz4hlx",
"4b1977d815f657c2d6af540ea4b4ce80838cadcf4ada72a8ba142a7441e571ea"
],
[
"libgpg-error",
"gzr2ucybgks5jquvf4lv7iprxq5vx5le",
"a12ecb5cfd083a29d042fd309ebb5ab8fd4ace0b68b27f89b857e9a84d75b5be"
],
[
"libksba",
"hw4u4pam6mp3henpw476axtqaahfdy64",
"5424caf98a2d48e0ed0b9134353c242328ebeef6d2b31808d58969165e809b47"
],
[
"pinentry",
"hffsjitsewdgoijwgzvub6vpjwm33ywr",
"8ed7504b5b2d13ab7e1f4a0e27a882c33c5a6ebfcb43c51269333c0d6d5e1448"
],
[
"gnupg",
"lge4h2kjgvssyspnvutq6t3q2xual5oc",
"6080ce00fcc24185e4051a30f6d52982f86f46eee6d8a2dc4d83ab08d8195be8"
]
],
"spec": "gnupg@2.3: %gcc platform=linux target=x86_64"
}
]
}


@@ -0,0 +1,389 @@
{
"verified": [
{
"binaries": [
[
"clingo-bootstrap",
"riu2vekwzrloc3fktlf6v7kwv6fja7lp",
"7527bc4d2d75671162fe0db3de04c5d3e1e6ab7991dfd85442c302c698febb45"
]
],
"python": "python@3.10.13",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"sgf6pgn4ihfcbxutxhevp36n3orfpdkw",
"958531adcb449094bca7703f8f08d0f55a18f9a4c0f10a175ae4190d20982891"
]
],
"python": "python@3.11.5",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"ie4wlhhnb4snroymbnjksajwvoid6omx",
"4af14c3375a211ead3d2b4a31b59683744adcb79b820cc0c6b168ab162a7d983"
]
],
"python": "python@3.12.0",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"5ke32podcipzxxwrj6uzm324bxegbwca",
"a4106c42ee68d07c3d954ab73fe305ca4204f44d90b58fd91a8f784d9b96e7e3"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"scu4cnnf5axmjgozqc7cccpqnj5nc5tj",
"54de4ca141b92222c8f1729e9e336c8a71dad9efa641e76438fcfb79bb58fc7f"
]
],
"python": "python@3.7.17",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"ajbswc25irhmhbc4qibdcr6ohsvpcdku",
"8b9e7af163a4259256eca4b4a1a92b5d95463a5cf467be2a11c64ab536ca5b04"
]
],
"python": "python@3.8.18",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"vwkuxa5z4pj7vviwsmrpw2r6kbbqej2p",
"a3f10024ff859e15b79ccd06c970a5f0e6ba11b0eae423f096ec9a35863816d2"
]
],
"python": "python@3.9.18",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"attdjmyzpfnhoobadw55pgg4hwkyp7zk",
"f3258af3a648b47f12285dd3f048b685ed652b2b55b53861ac9913926de0f1c3"
]
],
"python": "python@3.10",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"w4vnbsxjgkhsmgwozudzcsqlvccjsec4",
"19322c2c951fc80234963ac068c78442df57ac63055325b24a39ab705d27a5b9"
]
],
"python": "python@3.11",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"dw7ez2xcx6e5dxo3n4jin7pdbo3ihwtw",
"c368edda4b3c8fd767f5f0f098ea416864b088c767dc43135df49cf5f6ef4c93"
]
],
"python": "python@3.12",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"audrlxaw3ny3kyjkf6kqywumhokcxh3p",
"db2f44966ec104ffe57c0911f0b1e0d3d052753f4c46c30c0890dfb26d547b09"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"al7brxvvvhih5nlxvtfkavufqc3pe5t2",
"4e09b6d50d42c898e075fd20f7c7eddf91cb80edfd2d1326d26fd779e4d1ffed"
]
],
"python": "python@3.7",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"v3ctpkyogl542wjibng6m2h2426spjbb",
"d9ceb4f9ca23ef1dcc33872e5410ccfef6ea0360247d3e8faedf1751fb1ae4ca"
]
],
"python": "python@3.8",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"zxo5ih5ac6r7lj6miwyx36ot7s6a4dcw",
"f8f5e124d0e7bada34ff687a05e80b2fe207ce4d26205dab09b144edb148f05e"
]
],
"python": "python@3.9",
"spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"wki4qcy3wzpoxav3auxt2u7yb4sk3xcc",
"f5b9251eb51c60a71f7a0359c252f48c1a1121c426e1e6f9181808c626cb5fef"
]
],
"python": "python@3.10.13",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"gun6hbksmsecau5wjyrmxodq4hxievzx",
"28839ec43db444d6725bde3fcff99adadf61a392d967041fb16f0ffc0afa2f9d"
]
],
"python": "python@3.11.5",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"er73owosuqfmmkxvuw3f7sqnvvj6s4xp",
"99264d48c290256bf16e202c155bf3f8c88fdbbe9894d901344d0db7258abce3"
]
],
"python": "python@3.12.0",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"kv6l7qttuzk7zxkxi5fhff52qso3pj7m",
"59aa052e89d3c698fdd35e30ac21a896c8e49bbcc2f589a8f777bd5dafff2af7"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"uw5o2z753otspa3lmmy2bdodh5munkir",
"7a8b6359ce83463541ff68c221296fe9875adf28ea2b2c1416229750cf4935d2"
]
],
"python": "python@3.7.17",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"d63pp2l453bfygh6q7afwdj5mw7lhsns",
"425bef3a8605732b2fbe74cdd77ef6a359cbdb62800490bbd05620a57da35b0c"
]
],
"python": "python@3.8.18",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"nap44jiznzwlma6n75uxbpznppazs7av",
"316d940ca9af8c6b3bc50f8fdaadba02b0e955c4f24345a63a1a6715b01a752c"
]
],
"python": "python@3.9.18",
"spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"clingo-bootstrap",
"qhvnw4yowmk2tofg3u7a4uomisktgzw5",
"d30ec81385377521dd2d1ac091546cc2dec6a852ad31f35c24c65919f94fbf64"
]
],
"python": "python@3.10.13",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"b3y37ryfuhjq6ljbkq7piglsafg5stgw",
"3c2f9cca3a6d37685fdf7d7dffb7a0505336c32562715069004631c446e46a7c"
]
],
"python": "python@3.11.5",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"dbloojtq5kcfd3pjmj4pislgpzrcvjpn",
"f8aeba80e6c106b769adba164702db94e077255fe1a22d6d265ccc3172b4ab1a"
]
],
"python": "python@3.12.0",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"gtlngzdb7iggcjmaottob54qi3b24blt",
"3efc534ba293ee51156971b8c19a597ebcb237b003c98e3c215a49a88064dfd1"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"4ab4wobwa7bvhlkrmhdp2dwgtcq5rpzo",
"3dc6539a989701ec1d83d644a79953af912c11fe6046a8d720970faf8e477991"
]
],
"python": "python@3.7.17",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"fgout3h4mt4i64xaovqrpcsdy3ly2aml",
"ade67f0623e941b16f2dd531270b4863de8befd56a9a47bd87af85345bc8bed6"
]
],
"python": "python@3.8.18",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"5fv2q4agg4b4g53f4zhnymrbv6ogiwpy",
"18047d48538a770f014cce73756258c1a320d4ac143abef3c5d8bc09dd7a03cc"
]
],
"python": "python@3.9.18",
"spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"clingo-bootstrap",
"smkmkb5xqz4v2f7tl22g4e2ghamglox5",
"a850c80c7a48dab506f807cc936b9e54e6f5640fe96543ff58281c046140f112"
]
],
"python": "python@3.10.13",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"55qeu52pkt5shpwd7ulugv7wzt5j7vqd",
"e5e1a10b3b2d543b1555f5caef9ac1a9ccdcddb36a1278d3bf68bf0e9f490626"
]
],
"python": "python@3.11.5",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"zcw5ieomfwwpzpzpabetix2plfqzpvwd",
"ed409165109488d13afe8ef12edd3b373ed08967903dc802889523b5d3bccd14"
]
],
"python": "python@3.12.0",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"t4yf34cuvquqp5xd66zybmcfyhwbdlsf",
"b14e26e86bcfdac98b3a55109996265683f32910d3452e034ddc0d328bf62d67"
]
],
"python": "python@3.6",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"grkrpj76lxsxa753uzndwfmrj3pwvyhp",
"11a535d4a8a9dbb18c2f995e10bc90b27b6ebc61f7ac2090f15db9b4f9be1a64"
]
],
"python": "python@3.7.17",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"zowwoarrf3hvo6i3iereolfujr42iyro",
"154d3a725f02c1775644d99a0b74f9e2cdf6736989a264ccfd5d9a8bce77a16b"
]
],
"python": "python@3.8.18",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
},
{
"binaries": [
[
"clingo-bootstrap",
"bhqgwuvef354fwuxq7heeighavunpber",
"399dec8cb6b8cd1b03737e68ea32e6ed69030b57e5f05d983e8856024143ea78"
]
],
"python": "python@3.9.18",
"spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
}
]
}


@@ -0,0 +1,254 @@
{
"verified": [
{
"binaries": [
[
"libgpg-error",
"stcmj3wdfxrohn2a53ecvsfsxe7rzrn4",
"942b0f0918798f0a5f007de0f104d71273e6988165c7a34a874e0846b1aa8977"
],
[
"libassuan",
"z27suzptvelnavipmldx6dcntiwqmguq",
"c703d6b534e89e383893913fb3b71b47322726c5e19f69178e4d1a3a42a76426"
],
[
"libgcrypt",
"if4uocx75kk6nc5vwvvuxq4dvaoljxkm",
"a2320f8cfc8201d15c0e9e244b824ce3d76542c148f4f0631648987957759f07"
],
[
"libiconv",
"nccvt7adwkq5anilrjspffdzl4hggon5",
"e23aa0184eb6661331bc850292fa22579005fd8ed62efd4c0c7a87489d8acaf6"
],
[
"libksba",
"lbfaarmpo2tupbezmqhfjvyspvwepv4r",
"96888ed37642a2425e2262a5904b82a38f9eecfb18a900493e32d4ab742f994b"
],
[
"npth",
"yc7h5c7cp7mupstvh5wlujp3xqet3xxq",
"3ac8e284878c5a556e38aab706e4303daf0a4d2bbb9fac2644495f8a362f9988"
],
[
"pinentry",
"rlo36pidutbjxxc3atooiwruaptfwmml",
"70114fe6c9e8723daa960f1a3dc36ed8b5a6c6f9cc828d43f79b8f59f7363605"
],
[
"zlib-ng",
"hewnrm76ju4qcjaezxole5htrulkij25",
"7babbe4d3d6e58631a944472356c07f0f4ad4a0759eaeefcf8584f33cce51ca6"
],
[
"gnupg",
"5cguax2vflgy2cwmt2ikvixtynommlmr",
"23fdd223493f441fa2e5f82d7e02837ecfad831fbfa4c27c175b3e294ed977d1"
]
],
"spec": "gnupg@2.3: %apple-clang platform=darwin target=aarch64"
},
{
"binaries": [
[
"libgpg-error",
"7yjoei55i6wxycmzbopyrw7nrquc22ac",
"c29cfe32521a4a1e2108c711233964c27ca74ffc7505eea86cb8c047ace5715b"
],
[
"libassuan",
"b4pkkugfhdtitffvlh4o3dexmthr6rmk",
"27ee6fc272f011f9ad4f000dc54961cccd67b34d6f24f316ca7faf26673bf98b"
],
[
"libgcrypt",
"uqjmpmpeta3w7c66m4e5jojopngpibvp",
"d73fbb6e9327faec75af450d602b663ed6bb65ac9657bd795034a53f6acd32c8"
],
[
"libiconv",
"rfsiwcq6tlw6to42a3uxw7wcmcyk5m6r",
"1f0176395130ed8b919538fa4b1cbda9f0ff8b836e51097258efc8cf5e11f753"
],
[
"libksba",
"gsobopcvr2p7d7rpgrbk2ulrnhvrpt6u",
"0e404a8353f91918f385db8cf661f53f91ffd805798fcd83fb1168a1f1758fe8"
],
[
"npth",
"gib2edyujm2oymkvu2hllm2yeghttvn3",
"e04e579e514cd965baf71b7f160b063bff8b116e991e6931c6919cd5f3270e59"
],
[
"pinentry",
"5ndbckveeaywx77rqmujglfnqwpxu3t6",
"0ec02dca08ad2e8b3dd1c71195ed3fe3bb8856b746726708f5e5d450619e1285"
],
[
"zlib-ng",
"fg366ys6nx3hthuiix4xooi6xx4qe5d2",
"cc372a21608885182233c7800355c7c0bbaff47ea16e190827a9618b0c4703e2"
],
[
"gnupg",
"2x5ftl46zcnxk6knz5y3nuhyn7zcttk3",
"b9481e122e2cb26f69b70505830d0fcc0d200aadbb6c6572339825f17ad1e52d"
]
],
"spec": "gnupg@2.3: %apple-clang platform=darwin target=x86_64"
},
{
"binaries": [
[
"libgpg-error",
"b7o5zrguyniw5362eey3peglzhlmig7l",
"b4373f2b0a2567b3b87e6bfc934135ce7790432aea58c802139bb5352f24b6a9"
],
[
"libassuan",
"6k2arop3mjwfhe4cwga6a775ud5m4scp",
"1e5143d35b0938a206ecf1ecb39b77e732629897d2b936cb8274239770055d90"
],
[
"libgcrypt",
"eh5h3zisjkupzr2pgqarvgs2fm7pun5r",
"b57eff265b48d0472243babfd1221c7c16189a4e324ea26e65d1a0a8c1391020"
],
[
"libiconv",
"vgk2zgjeflpnksj3lywuwdzs2nez63qv",
"d153953c40c630fd2bf271f3de901d7671f80e8161cf746cb54afbf28d934d03"
],
[
"libksba",
"au3xdl4oyfbxat6dknp3mldid7gupgt5",
"f1b1a1a02138109bc41b0b2ba54e689b43f35e2828f58b5de74280ce754fac0b"
],
[
"npth",
"ja7cauk7yhhyj7msnprlirue7cn3jpnj",
"cf6fd998a8f92ce1cf34c63db09c77b1891bf8f5915deef03c0cae5492bd691b"
],
[
"pinentry",
"6yo4flozla2tvw3ojkh2atvnfxuqx6ym",
"e78826a269109b3d67a54b1d01ff0a93be043dddcb4f52d329770ae1f75313f3"
],
[
"zlib-ng",
"4cgenrt3rcinueq6peyolxhegnryoeem",
"918a1e48f823806f1562c95569953a4658b2fbc54a2606a09bcd7e259b62f492"
],
[
"gnupg",
"lrmigjenpqj5fy4ojcs5jy6doktiu4qz",
"228ccb475932f7f40a64e9d87dec045931cc57f71b1dfd4b4c3926107222d96c"
]
],
"spec": "gnupg@2.3: %gcc platform=linux target=aarch64"
},
{
"binaries": [
[
"libgpg-error",
"km6l24czfhnmlya74nu6cxwufgimyhzz",
"23c3b7b487b36b9b03eeebbcc484adc6c8190c1bbcaa458943847148c915c6b2"
],
[
"libassuan",
"crkk525xdgsn2k5s4xqdaxkudz6pjqbm",
"ae3048a8059c0709d3efe832de1a8f82594373ba853d4bc2dfa05fb9dbfbc782"
],
[
"libgcrypt",
"4s5lkowqilor35fscjwvtmg4wasdknkc",
"62d3d13278d60d0329af1a9649b06591153ff68de4584f57777d13d693c7012e"
],
[
"libiconv",
"kbijqx45l3n64dlhenbuwgqpmf434g2d",
"dddf581a14a35b85cb69a8c785dd8e250f41e6de7697e34bb0ab2a942e0c2128"
],
[
"libksba",
"jnll3rfuh6xhgqxbwfnpizammcwloxjc",
"6200f2b6150aaf6d0e69771dfd5621582bd99ed0024fe83e7bc777cb66cabb29"
],
[
"npth",
"6j6b4hbkhwkb5gfigysqgn5lpu3i4kw5",
"0be0c70f3d9d45c4fe7490d8fdb8d7584de6324c3bfac8d884072409799c9951"
],
[
"pinentry",
"cdpcdd4iah6jot4odehm3xmulw3t3e32",
"5b447c770d0f705fbc97564fccdfbb0dfff8b6f8e2b4abbea326a538bc1bff80"
],
[
"zlib-ng",
"ogchs3i5tosoqrtsp3czp2azxvm7icig",
"acfa12c4e73560416e1169b37adabfbec5ee9a580a684b23e75d7591d8e39a03"
],
[
"gnupg",
"jwpu2wrofbwylpztltmi257benj2wp6z",
"98e2bcb4064ec0830d896938bc1fe5264dac611da71ea546b9ca03349b752041"
]
],
"spec": "gnupg@2.3: %gcc platform=linux target=ppc64le"
},
{
"binaries": [
[
"libgpg-error",
"dwcgnnqt364enpf5554dio7kklspmrko",
"bfe9b506ccba0cca619133a3d2e05aa23c929749428bf6eecbff0c6985447009"
],
[
"libassuan",
"yl5rfsfuxd6if36h7rap7zbbpbfztkpw",
"4343dabbeed0851885992acd7b63fd74cb9d1acc06501a8af934e7e103801a15"
],
[
"libgcrypt",
"ka3t3dq73bkz4bs5ilyz6kymkypgbzxl",
"ec1bcc324e9f9d660395e2c586094431361a02196da43fce91be41cca5da9636"
],
[
"libiconv",
"5tog27ephuzc4j6kdxavhjsjm2kd5nu6",
"928fab3c32a1ae09651bb8491ee3855ccaf3c57a146ee72a289a073accd3fc8f"
],
[
"libksba",
"4ezfhjkmfc4fr34ozzl5q6b4x6jqqmsw",
"3045841c50c19a41beb0f32b4e8a960901397b95e82af3a73817babf35d4cfca"
],
[
"npth",
"bn4zrugdajgpk5dssoeccbl7o2gfgmcp",
"ef90ef85a818456afbff709b4a0757a077d69fd3c07d1b7612e1d461d837c46f"
],
[
"pinentry",
"cdwqocmusjomjjavnz6nn764oo54j5xj",
"b251047c1cb4be1bb884a7843d4419fae40fdbe5e1d36904e35f5e3fef5e4ced"
],
[
"zlib-ng",
"ozawh46coczjwtlul27msr3swe6pl6l5",
"0a397b53d64ac8191a36de8b32c5ced28a4c7a6dbafe9396dd897c55bcf7a168"
],
[
"gnupg",
"jra2dbsvpr5c5gj3ittejusa2mjh2sf5",
"054fac6eaad7c862ea4661461d847fb069876eb114209416b015748266f7d166"
]
],
"spec": "gnupg@2.3: %gcc platform=linux target=x86_64"
}
]
}


@@ -3,6 +3,6 @@ description: |
Buildcache generated from a public workflow using Github Actions.
The sha256 checksum of binaries is checked before installation.
info:
url: https://mirror.spack.io/bootstrap/github-actions/v0.3
url: https://mirror.spack.io/bootstrap/github-actions/v0.5
homepage: https://github.com/spack/spack-bootstrap-mirrors
releases: https://github.com/spack/spack-bootstrap-mirrors/releases


@@ -4,8 +4,8 @@
"binaries": [
[
"patchelf",
"cn4gsqzdnnffk7ynvbcai6wrt5ehqqrl",
"8c6a28cbe8133d719be27ded11159f0aa2c97ed1d0881119ae0ebd71f8ccc755"
"4txke6ixd2zg2yzg33l3fqnjyassono7",
"102800775f789cc293e244899f39a22f0b7a19373305ef0497ca3189223123f3"
]
],
"spec": "patchelf@0.13: %gcc platform=linux target=aarch64"
@@ -14,8 +14,8 @@
"binaries": [
[
"patchelf",
"mgq6n2heyvcx2ebdpchkbknwwn3u63s6",
"1d4ea9167fb8345a178c1352e0377cc37ef2b421935cf2b48fb6fa03a94fca3d"
"tnbgxc22uebqsiwrhchf3nieatuqlsrr",
"91cf0a9d4750c04575c5ed3bcdefc4754e1cf9d1cd1bf197eb1fe20ccaa869f1"
]
],
"spec": "patchelf@0.13: %gcc platform=linux target=ppc64le"
@@ -24,8 +24,8 @@
"binaries": [
[
"patchelf",
"htk62k7efo2z22kh6kmhaselru7bfkuc",
"833df21b20eaa7999ac4c5779ae26aa90397d9027aebaa686a428589befda693"
"afv7arjarb7nzmlh7c5slkfxykybuqce",
"73f4bde46b843c96521e3f5c31ab94756491404c1ad6429c9f61dbafbbfa6470"
]
],
"spec": "patchelf@0.13: %gcc platform=linux target=x86_64"


@@ -165,6 +165,10 @@ default:
extends: [ ".generate-base" ]
tags: ["spack", "public", "medium", "aarch64"]
.generate-neoverse_v1:
extends: [ ".generate-base" ]
tags: ["spack", "public", "medium", "aarch64", "graviton3"]
.generate-deprecated:
extends: [ ".base-job" ]
stage: generate
@@ -287,7 +291,7 @@ protected-publish:
e4s-generate:
extends: [ ".e4s", ".generate-x86_64"]
image: ghcr.io/spack/ubuntu20.04-runner-x86_64:2023-01-01
image: ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4:2023.08.01
e4s-build:
extends: [ ".e4s", ".build" ]
@@ -300,6 +304,52 @@ e4s-build:
- artifacts: True
job: e4s-generate
########################################
# E4S Neoverse V1 pipeline
########################################
.e4s-neoverse_v1:
extends: [ ".linux_neoverse_v1" ]
variables:
SPACK_CI_STACK_NAME: e4s-neoverse_v1
e4s-neoverse_v1-generate:
extends: [ ".e4s-neoverse_v1", ".generate-neoverse_v1" ]
image: ghcr.io/spack/ubuntu20.04-runner-arm64-gcc-11.4:2023.08.01
e4s-neoverse_v1-build:
extends: [ ".e4s-neoverse_v1", ".build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: e4s-neoverse_v1-generate
strategy: depend
needs:
- artifacts: True
job: e4s-neoverse_v1-generate
########################################
# E4S ROCm External pipeline
########################################
.e4s-rocm-external:
extends: [ ".linux_x86_64_v3" ]
variables:
SPACK_CI_STACK_NAME: e4s-rocm-external
e4s-rocm-external-generate:
extends: [ ".e4s-rocm-external", ".generate-x86_64"]
image: ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4-rocm5.4.3:2023.08.01
e4s-rocm-external-build:
extends: [ ".e4s-rocm-external", ".build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: e4s-rocm-external-generate
strategy: depend
needs:
- artifacts: True
job: e4s-rocm-external-generate
########################################
# GPU Testing Pipeline
########################################
@@ -333,7 +383,7 @@ gpu-tests-build:
e4s-oneapi-generate:
extends: [ ".e4s-oneapi", ".generate-x86_64"]
image: ecpe4s/ubuntu20.04-runner-x86_64-oneapi:2023.07.21
image: ghcr.io/spack/ubuntu20.04-runner-amd64-oneapi-2023.2.1:2023.08.01
e4s-oneapi-build:
extends: [ ".e4s-oneapi", ".build" ]
@@ -350,7 +400,7 @@ e4s-oneapi-build:
# E4S on Power
########################################
.e4s-power-generate-tags-and-image:
image: { "name": "ecpe4s/ubuntu20.04-runner-ppc64le:2023-01-01", "entrypoint": [""] }
image: { "name": "ghcr.io/spack/ubuntu20.04-runner-ppc64-gcc-11.4:2023.08.01", "entrypoint": [""] }
tags: ["spack", "public", "large", "ppc64le"]
.e4s-power:
@@ -827,16 +877,16 @@ e4s-cray-rhel-build:
variables:
SPACK_CI_STACK_NAME: e4s-cray-sles
e4s-cray-sles-generate:
extends: [ ".generate-cray-sles", ".e4s-cray-sles" ]
# e4s-cray-sles-generate:
# extends: [ ".generate-cray-sles", ".e4s-cray-sles" ]
e4s-cray-sles-build:
extends: [ ".build", ".e4s-cray-sles" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: e4s-cray-sles-generate
strategy: depend
needs:
- artifacts: True
job: e4s-cray-sles-generate
# e4s-cray-sles-build:
# extends: [ ".build", ".e4s-cray-sles" ]
# trigger:
# include:
# - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
# job: e4s-cray-sles-generate
# strategy: depend
# needs:
# - artifacts: True
# job: e4s-cray-sles-generate


@@ -4,22 +4,16 @@ spack:
cmake:
variants: ~ownlibs
ecp-data-vis-sdk:
require:
- one_of:
- +ascent +adios2 +cinema +darshan +faodel +hdf5 +pnetcdf +sensei +sz +unifyfs
+veloc +vtkm +zfp
- one_of:
- +paraview ~visit
- ~paraview +visit
require: "+ascent +adios2 +cinema +darshan +faodel +hdf5 +pnetcdf +sensei +sz +unifyfs +veloc +vtkm +zfp"
hdf5:
require:
- one_of: ['@1.14', '@1.12']
mesa:
require: +glx +osmesa +opengl ~opengles +llvm
require: "+glx +osmesa +opengl ~opengles +llvm"
libosmesa:
require: mesa +osmesa
require: "mesa +osmesa"
libglx:
require: mesa +glx
require: "mesa +glx"
ospray:
require: '@2.8.0 +denoiser +mpi'
llvm:
@@ -57,9 +51,11 @@ spack:
# Test ParaView and VisIt builds with different GL backends
- matrix:
- [$sdk_base_spec]
- ["+paraview ~visit"]
- [$^paraview_specs]
- matrix:
- [$sdk_base_spec]
- ["~paraview +visit"]
- [$^visit_specs]
mirrors: {mirror: s3://spack-binaries/develop/data-vis-sdk}


@@ -20,42 +20,156 @@ spack:
target: [zen4]
variants: +mpi
tbb:
require: "intel-tbb"
binutils:
variants: +ld +gold +headers +libiberty ~nls
hdf5:
variants: +fortran +hl +shared
libunwind:
variants: +pic +xz
ncurses:
require: '@6.3 +termlib'
openblas:
require: '@0.3.20'
variants: threads=openmp
xz:
variants: +pic
boost:
variants: +python +filesystem +iostreams +system
cuda:
version: [11.7.0]
elfutils:
variants: +bzip2 ~nls +xz
require: '%gcc'
require: "%gcc"
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
mpich:
variants: ~wrapperrpath
ncurses:
variants: +termlib
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 ~qt+osmesa"
python:
version: [3.8.13]
trilinos:
require:
- one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack
+intrepid +intrepid2 +isorropia +kokkos +minitensor +nox +piro +phalanx
+rol +rythmos +sacado +stk +shards +stratimikos +tempus +tpetra
+trilinoscouplings +zoltan]
- one_of: [gotype=long_long, gotype=all]
- one_of: [~ml ~muelu ~zoltan2 ~teko, +ml +muelu +zoltan2 +teko]
xz:
variants: +pic
mesa:
version: [21.3.8]
unzip:
require: '%gcc'
require: "%gcc"
specs:
- adios2
- amrex
# CPU
- adios
- aml
- arborx
- argobots
- bolt
- butterflypack
- boost +python +filesystem +iostreams +system
- cabana
- chai ~benchmarks ~tests
- conduit
- datatransferkit
- flecsi
- fortrilinos
- ginkgo
- globalarrays
- gmp
- gotcha
- h5bench
- hdf5-vol-async
- hdf5-vol-cache
- hdf5-vol-log
- heffte +fftw
- hypre
- kokkos
- kokkos-kernels
- kokkos +openmp
- kokkos-kernels +openmp
- lammps
- legion
- libnrm
- libquo
- libunwind
- mercury
- metall
- mfem
- mgard +serial +openmp +timing +unstructured ~cuda
- mpark-variant
- mpifileutils ~xattr
- nccmp
- nco
- netlib-scalapack
- omega-h
- openmpi
- openpmd-api
- papi
- papyrus
- pdt
- pumi
- qthreads scheduler=distrib
- raja
- slate ~cuda
- stc
- sundials
- superlu
- superlu-dist
# - flux-core # python cray sles issue
- swig
- swig@4.0.2-fortran
- sz3
- tasmanian
- trilinos +belos +ifpack2 +stokhos
- turbine
- umap
- umpire
- veloc
- wannier90
# ERRORS
# - caliper # caliper: ModuleNotFoundError: No module named 'math'; src/mpi/services/mpiwrap/CMakeFiles/caliper-mpiwrap.dir/build.make:77: src/mpi/services/mpiwrap/Wrapper.cpp] Error 1
# - charliecloud # python: Could not find platform dependent libraries <exec_prefix>
# - flit # python: Could not find platform dependent libraries <exec_prefix>
# - flux-core # python: Could not find platform dependent libraries <exec_prefix>
# - hpx max_cpu_count=512 networking=mpi # python: Could not find platform dependent libraries <exec_prefix>
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # python: Could not find platform dependent libraries <exec_prefix>
# - petsc # petsc: SyntaxError: (unicode error) \N escapes not supported (can't load unicodedata module)
# - plumed # python: Could not find platform dependent libraries <exec_prefix>
# - precice # petsc: SyntaxError: (unicode error) \N escapes not supported (can't load unicodedata module)
# - py-h5py +mpi # python: Could not find platform dependent libraries <exec_prefix>
# - py-h5py ~mpi # python: Could not find platform dependent libraries <exec_prefix>
# - py-libensemble +mpi +nlopt # python: Could not find platform dependent libraries <exec_prefix>
# - py-petsc4py # python: Could not find platform dependent libraries <exec_prefix>
# - slepc # petsc: SyntaxError: (unicode error) \N escapes not supported (can't load unicodedata module)
# - tau +mpi +python # tau: ERROR: Cannot find python library (libpython*.[so|dylib]
# HOLDING THESE BACK UNTIL CRAY SLES CAPACITY IS EXPANDED AT UO
# - alquimia
# - amrex
# - archer
# - axom
# - bricks
# - dealii
# - dyninst
# - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp ^hdf5@1.14 # llvm@14.0.6: ?;
# - exaworks
# - gasnet
# - gptune
# - hpctoolkit
# - nrm
# - nvhpc
# - parsec ~cuda
# - phist
# - plasma
# - py-jupyterhub
# - py-warpx
# - quantum-espresso
# - scr
# - strumpack ~slate
# - upcxx
# - variorum
# - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-cray-sles" }


@@ -0,0 +1,351 @@
spack:
view: false
concretizer:
reuse: false
unify: false
packages:
all:
require: '%gcc@11.4.0 target=neoverse_v1'
providers:
blas: [openblas]
mpi: [mpich]
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
openblas:
variants: threads=openmp
trilinos:
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
tbb:
require: intel-tbb
boost:
version: [1.79.0]
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
cxxstd=17 visibility=global
libffi:
require: "@3.4.4"
vtk-m:
require: "+examples"
cuda:
version: [11.8.0]
compilers:
- compiler:
spec: gcc@11.4.0
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags: {}
operating_system: ubuntu20.04
target: aarch64
modules: []
environment: {}
extra_rpaths: []
specs:
# CPU
- adios
- alquimia
- aml
- amrex
- arborx
- argobots
- ascent # ecp dav
- axom
- bolt
- boost
- butterflypack
- cabana
- caliper
- chai ~benchmarks ~tests
- charliecloud
- conduit
- datatransferkit
- dyninst
- ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp # +visit: ?
- exaworks
- flecsi
- flit
- flux-core
- fortrilinos
- gasnet
- ginkgo
- globalarrays
- gmp
- gotcha
- gptune ~mpispawn
- h5bench
- hdf5-vol-async
- hdf5-vol-cache
- hdf5-vol-log
- heffte +fftw
- hpctoolkit
- hpx networking=mpi
- hypre
- kokkos +openmp
- kokkos-kernels +openmp
- lammps
- lbann
- legion
- libnrm
- libquo
- libunwind
- loki
- mercury
- metall
- mfem
- mgard +serial +openmp +timing +unstructured ~cuda
- mpark-variant
- mpifileutils ~xattr
- nccmp
- nco
- netlib-scalapack
- nrm
- nvhpc
- omega-h
- openfoam
- openmpi
- openpmd-api
- papi
- papyrus
- parsec ~cuda
- pdt
- petsc
- phist
- plasma
- plumed
- precice
- pruners-ninja
- pumi
- py-h5py
- py-jupyterhub
- py-libensemble
- py-petsc4py
- py-warpx
- qthreads scheduler=distrib
- quantum-espresso
- raja
- rempi
- scr
- slate ~cuda
- slepc
- stc
- strumpack ~slate
- sundials
- superlu
- superlu-dist
- swig@4.0.2-fortran
- sz3
- tasmanian
- tau +mpi +python
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap
- umpire
- upcxx
- wannier90
- xyce +mpi +shared +pymi +pymi_static_tpls
# INCLUDED IN ECP DAV CPU
- adios2
- darshan-runtime
- darshan-util
- faodel
- hdf5
- libcatalyst
- parallel-netcdf
- paraview
- py-cinemasci
- sz
- unifyfs
- veloc
# - visit # silo: https://github.com/spack/spack/issues/39538
- vtk-m
- zfp
# --
# - archer # part of llvm +omp_tsan
# - bricks ~cuda # not respecting target=aarch64?
# - dealii # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - geopm # geopm: https://github.com/spack/spack/issues/38795
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp # py-numcodecs@0.7.3: gcc: error: unrecognized command-line option '-mno-sse2'
# - variorum # variorum: https://github.com/spack/spack/issues/38786
# CUDA NOARCH
- flux-core +cuda
- hpctoolkit +cuda
- papi +cuda
- tau +mpi +cuda
# --
# - bricks +cuda # not respecting target=aarch64?
# - legion +cuda # legion: needs NVIDIA driver
# CUDA 75
- amrex +cuda cuda_arch=75
- arborx +cuda cuda_arch=75 ^kokkos +wrapper
- cabana +cuda cuda_arch=75 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=75
- caliper +cuda cuda_arch=75
- chai ~benchmarks ~tests +cuda cuda_arch=75 ^umpire ~shared
- flecsi +cuda cuda_arch=75
- ginkgo +cuda cuda_arch=75
- heffte +cuda cuda_arch=75
- hpx +cuda cuda_arch=75
- hypre +cuda cuda_arch=75
- kokkos +wrapper +cuda cuda_arch=75
- kokkos-kernels +cuda cuda_arch=75 ^kokkos +wrapper +cuda cuda_arch=75
- magma +cuda cuda_arch=75
- mfem +cuda cuda_arch=75
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=75
- omega-h +cuda cuda_arch=75
- parsec +cuda cuda_arch=75
- petsc +cuda cuda_arch=75
- raja +cuda cuda_arch=75
- slate +cuda cuda_arch=75
- strumpack ~slate +cuda cuda_arch=75
- sundials +cuda cuda_arch=75
- superlu-dist +cuda cuda_arch=75
- tasmanian +cuda cuda_arch=75
- trilinos +cuda cuda_arch=75
- umpire ~shared +cuda cuda_arch=75
# INCLUDED IN ECP DAV CUDA
- adios2 +cuda cuda_arch=75
- paraview +cuda cuda_arch=75
- vtk-m +cuda cuda_arch=75
- zfp +cuda cuda_arch=75
# --
# - ascent +cuda cuda_arch=75 # ascent: https://github.com/spack/spack/issues/38045
# - axom +cuda cuda_arch=75 # axom: https://github.com/spack/spack/issues/29520
# - cusz +cuda cuda_arch=75 # cusz: https://github.com/spack/spack/issues/38787
# - dealii +cuda cuda_arch=75 # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - ecp-data-vis-sdk +adios2 +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=75 # embree: https://github.com/spack/spack/issues/39534
# - lammps +cuda cuda_arch=75 # lammps: needs NVIDIA driver
# - lbann +cuda cuda_arch=75 # lbann: https://github.com/spack/spack/issues/38788
# - libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf ~cusz +mgard +cuda cuda_arch=75 # libpressio: CMake Error at CMakeLists.txt:498 (find_library): Could not find CUFile_LIBRARY using the following names: cufile ; +cusz: https://github.com/spack/spack/issues/38787
# - py-torch +cuda cuda_arch=75 # skipped, installed by other means
# - slepc +cuda cuda_arch=75 # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - upcxx +cuda cuda_arch=75 # upcxx: needs NVIDIA driver
# CUDA 80
- amrex +cuda cuda_arch=80
- arborx +cuda cuda_arch=80 ^kokkos +wrapper
- cabana +cuda cuda_arch=80 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=80
- caliper +cuda cuda_arch=80
- chai ~benchmarks ~tests +cuda cuda_arch=80 ^umpire ~shared
- flecsi +cuda cuda_arch=80
- ginkgo +cuda cuda_arch=80
- heffte +cuda cuda_arch=80
- hpx +cuda cuda_arch=80
- hypre +cuda cuda_arch=80
- kokkos +wrapper +cuda cuda_arch=80
- kokkos-kernels +cuda cuda_arch=80 ^kokkos +wrapper +cuda cuda_arch=80
- magma +cuda cuda_arch=80
- mfem +cuda cuda_arch=80
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=80
- omega-h +cuda cuda_arch=80
- parsec +cuda cuda_arch=80
- petsc +cuda cuda_arch=80
- raja +cuda cuda_arch=80
- slate +cuda cuda_arch=80
- strumpack ~slate +cuda cuda_arch=80
- sundials +cuda cuda_arch=80
- superlu-dist +cuda cuda_arch=80
- tasmanian +cuda cuda_arch=80
- trilinos +cuda cuda_arch=80
- umpire ~shared +cuda cuda_arch=80
# INCLUDED IN ECP DAV CUDA
- adios2 +cuda cuda_arch=80
- paraview +cuda cuda_arch=80
- vtk-m +cuda cuda_arch=80
- zfp +cuda cuda_arch=80
# --
# - ascent +cuda cuda_arch=80 # ascent: https://github.com/spack/spack/issues/38045
# - axom +cuda cuda_arch=80 # axom: https://github.com/spack/spack/issues/29520
# - cusz +cuda cuda_arch=80 # cusz: https://github.com/spack/spack/issues/38787
# - dealii +cuda cuda_arch=80 # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - ecp-data-vis-sdk +adios2 +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=80 # embree: https://github.com/spack/spack/issues/39534
# - lammps +cuda cuda_arch=80 # lammps: needs NVIDIA driver
# - lbann +cuda cuda_arch=80 # lbann: https://github.com/spack/spack/issues/38788
# - libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf ~cusz +mgard +cuda cuda_arch=80 # libpressio: CMake Error at CMakeLists.txt:498 (find_library): Could not find CUFile_LIBRARY using the following names: cufile ; +cusz: https://github.com/spack/spack/issues/38787
# - py-torch +cuda cuda_arch=80 # skipped, installed by other means
# - slepc +cuda cuda_arch=80 # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - upcxx +cuda cuda_arch=80 # upcxx: needs NVIDIA driver
# CUDA 90
- amrex +cuda cuda_arch=90
- arborx +cuda cuda_arch=90 ^kokkos +wrapper
- cabana +cuda cuda_arch=90 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=90
- caliper +cuda cuda_arch=90
- chai ~benchmarks ~tests +cuda cuda_arch=90 ^umpire ~shared
- flecsi +cuda cuda_arch=90
- ginkgo +cuda cuda_arch=90
- heffte +cuda cuda_arch=90
- hpx +cuda cuda_arch=90
- kokkos +wrapper +cuda cuda_arch=90
- kokkos-kernels +cuda cuda_arch=90 ^kokkos +wrapper +cuda cuda_arch=90
- magma +cuda cuda_arch=90
- mfem +cuda cuda_arch=90
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=90
- parsec +cuda cuda_arch=90
- petsc +cuda cuda_arch=90
- raja +cuda cuda_arch=90
- slate +cuda cuda_arch=90
- strumpack ~slate +cuda cuda_arch=90
- sundials +cuda cuda_arch=90
- superlu-dist +cuda cuda_arch=90
- trilinos +cuda cuda_arch=90
- umpire ~shared +cuda cuda_arch=90
# INCLUDED IN ECP DAV CUDA
- adios2 +cuda cuda_arch=90
# - paraview +cuda cuda_arch=90 # paraview: InstallError: Incompatible cuda_arch=90
- vtk-m +cuda cuda_arch=90
- zfp +cuda cuda_arch=90
# --
# - ascent +cuda cuda_arch=90 # ascent: https://github.com/spack/spack/issues/38045
# - axom +cuda cuda_arch=90 # axom: https://github.com/spack/spack/issues/29520
# - cusz +cuda cuda_arch=90 # cusz: https://github.com/spack/spack/issues/38787
# - dealii +cuda cuda_arch=90 # dealii: https://github.com/spack/spack/issues/39532
# - ecp-data-vis-sdk +adios2 +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=90 # embree: https://github.com/spack/spack/issues/39534
# - hypre +cuda cuda_arch=90 # concretizer: hypre +cuda requires cuda@:11, but cuda_arch=90 requires cuda@12:
# - lammps +cuda cuda_arch=90 # lammps: needs NVIDIA driver
# - lbann +cuda cuda_arch=90 # concretizer: Cannot select a single "version" for package "lbann"
# - libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf ~cusz +mgard +cuda cuda_arch=90 # libpressio: CMake Error at CMakeLists.txt:498 (find_library): Could not find CUFile_LIBRARY using the following names: cufile ; +cusz: https://github.com/spack/spack/issues/38787
# - omega-h +cuda cuda_arch=90 # omega-h: https://github.com/spack/spack/issues/39535
# - py-torch +cuda cuda_arch=90 # skipped, installed by other means
# - slepc +cuda cuda_arch=90 # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - tasmanian +cuda cuda_arch=90 # tasmanian: conflicts with cuda@12
# - upcxx +cuda cuda_arch=90 # upcxx: needs NVIDIA driver
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-arm-neoverse_v1" }
ci:
pipeline-gen:
- build-job:
image: "ghcr.io/spack/ubuntu20.04-runner-arm64-gcc-11.4:2023.08.01"
cdash:
build-group: E4S ARM Neoverse V1


@@ -1,22 +1,51 @@
spack:
view: false
concretizer:
reuse: false
unify: false
compilers:
- compiler:
spec: oneapi@2023.2.1
paths:
cc: /opt/intel/oneapi/compiler/2023.2.1/linux/bin/icx
cxx: /opt/intel/oneapi/compiler/2023.2.1/linux/bin/icpx
f77: /opt/intel/oneapi/compiler/2023.2.1/linux/bin/ifx
fc: /opt/intel/oneapi/compiler/2023.2.1/linux/bin/ifx
flags: {}
operating_system: ubuntu20.04
target: x86_64
modules: []
environment: {}
extra_rpaths: []
- compiler:
spec: gcc@=11.4.0
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags: {}
operating_system: ubuntu20.04
target: x86_64
modules: []
environment: {}
extra_rpaths: []
packages:
all:
require: '%oneapi'
require: '%oneapi target=x86_64_v3'
providers:
blas: [openblas]
mpi: [mpich]
tbb: [intel-tbb]
target: [x86_64]
variants: +mpi
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
require: "%gcc"
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
@@ -34,15 +63,12 @@ spack:
variants: +pic
mesa:
version: [21.3.8]
hdf5:
require: "%gcc"
variants: +fortran +hl +shared
mpi:
require: "mpich"
require: 'mpich@4:'
mpich:
require: '@4.1.1 ~wrapperrpath ~hwloc'
require: '~wrapperrpath ~hwloc'
py-cryptography:
require: '@38.0'
require: '@38.0.1'
unzip:
require: '%gcc'
binutils:
@@ -60,40 +86,12 @@ spack:
require: '%gcc'
openssh:
require: '%gcc'
bison:
require: '%gcc'
libffi:
require: "@3.4.4"
dyninst:
require: "%gcc"
compilers:
- compiler:
spec: oneapi@2023.2.0
paths:
cc: /opt/intel/oneapi/compiler/2023.2.0/linux/bin/icx
cxx: /opt/intel/oneapi/compiler/2023.2.0/linux/bin/icpx
f77: /opt/intel/oneapi/compiler/2023.2.0/linux/bin/ifx
fc: /opt/intel/oneapi/compiler/2023.2.0/linux/bin/ifx
flags: {}
operating_system: ubuntu20.04
target: x86_64
modules: []
environment: {}
extra_rpaths: []
- compiler:
spec: gcc@11.4.0
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags: {}
operating_system: ubuntu20.04
target: x86_64
modules: []
environment: {}
extra_rpaths: []
bison:
require: '%gcc'
specs:
# CPU
@@ -101,7 +99,6 @@ spack:
- aml
- amrex
- arborx
- archer
- argobots
- axom
- bolt
@@ -114,17 +111,21 @@ spack:
- charliecloud
- conduit
- datatransferkit
- drishti
- exaworks
- flecsi
- flit
- flux-core
- fortrilinos
- gasnet
- ginkgo
- globalarrays
- gmp
- gotcha
- gptune ~mpispawn
- h5bench
- hdf5-vol-async
- hdf5-vol-cache
- hdf5-vol-log
- heffte +fftw
- hpx networking=mpi
@@ -134,22 +135,22 @@ spack:
- lammps
- lbann
- legion
- libnrm
- libnrm
- libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp
- libquo
- libunwind
- loki
- mercury
- metall
- mfem
- mgard +serial +openmp +timing +unstructured ~cuda
- mpark-variant
- mpifileutils ~xattr
- nccmp
- nco
- netlib-scalapack
- nrm
- omega-h
- openmpi
- openpmd-api
- papi
- papyrus
- parsec ~cuda
@@ -159,14 +160,18 @@ spack:
- plasma
- plumed
- precice
- pruners-ninja
- pumi
- py-h5py
- py-jupyterhub
- py-libensemble
- py-petsc4py
- py-warpx
- qthreads scheduler=distrib
- quantum-espresso
- raja
- rempi
- scr
- slate ~cuda
- slepc
- stc
@@ -174,53 +179,46 @@ spack:
- sundials
- superlu
- superlu-dist
- swig@4.0.2-fortran
- sz3
- tasmanian
- trilinos@13.0.1 +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- tau +mpi +python
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap
- umpire
- variorum
- wannier90
- xyce +mpi +shared +pymi +pymi_static_tpls
# INCLUDED IN ECP DAV CPU
# - adios2
# - ascent
# - darshan-runtime
# - darshan-util
# - faodel
# - hdf5
# - libcatalyst
# - parallel-netcdf
# - paraview
# - py-cinemasci
# - sz
# - unifyfs
# - veloc
# - visit
# - vtk-m ~openmp # https://github.com/spack/spack/issues/31830
# - zfp
- adios2 # mgard: mgard.tpp:63:48: error: non-constant-expression cannot be narrowed from type 'int' to 'unsigned long' in initializer list [-Wc++11-narrowing]
- ascent
- darshan-runtime
- darshan-util
- faodel
- hdf5
- libcatalyst
- parallel-netcdf
# - paraview # paraview: VTK/ThirdParty/cgns/vtkcgns/src/adfh/ADFH.c:2002:23: error: incompatible function pointer types passing 'herr_t (hid_t, const char *, const H5L_info1_t *, void *)' (aka 'int (long, const char *, const H5L_info1_t *, void *)') to parameter of type 'H5L_iterate2_t' (aka 'int (*)(long, const char *,const H5L_info2_t *, void *)') [-Wincompatible-function-pointer-types]
- py-cinemasci
- sz
- unifyfs
- veloc
# - visit # silo: https://github.com/spack/spack/issues/39538
- vtk-m ~openmp # https://github.com/spack/spack/issues/31830
- zfp
# --
# - alquimia # pflotran: pflotran/hdf5_aux.F90(5): error #7013: This module file was not generated by any release of this compiler. [HDF5]
# - dealii # intel-tbb: icpx: error: unknown argument: '-flifetime-dse=1'
# - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc +visit +vtkm +zfp # sz: hdf5-filter/H5Z-SZ/src/H5Z_SZ.c:24:9: error: call to undeclared function 'gettimeofday'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
# - geopm # geopm: In file included from src/ProfileTable.cpp:34: ./src/ProfileTable.hpp:79:45: error: no type named 'string' in namespace 'std'
# - ginkgo # ginkgo: icpx: error: clang frontend command failed with exit code 139 (use -v to see invocation)
# - gptune ~mpispawn # py-scipy: for_main.c:(.text+0x19): undefined reference to `MAIN__'
# - hdf5-vol-cache # /H5VLcache_ext.c:580:9: error: incompatible function pointer types initializing 'herr_t (*)(const void *, uint64_t *)' (aka 'int (*)(const void *, unsigned long *)') with an expression of type 'herr_t (const void *, unsigned int *)' (aka 'int (const void *, unsigned int *)') [-Wincompatible-function-pointer-types]
# - hpctoolkit # intel-tbb: icpx: error: unknown argument: '-flifetime-dse=1'
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp # py-numcodecs: c-blosc/internal-complibs/zlib-1.2.8/gzread.c:30:15: error: call to undeclared function 'read'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
# - nrm # py-scipy: for_main.c:(.text+0x19): undefined reference to `MAIN__'
# - openfoam # adios2: patch failed
# - pruners-ninja # pruners-ninja: ninja_test_pingpong.c:79:5: error: call to undeclared library function 'memset' with type 'void *(void *, int, unsigned long)'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
# - py-jupyterhub # py-ruamel-yaml-clib: setuptools/dist.py:287: SetuptoolsDeprecationWarning: The namespace_packages parameter is deprecated, consider using implicit namespaces instead (PEP 420). See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
# - py-warpx ^warpx dims=2 # py-scipy: for_main.c:(.text+0x19): undefined reference to `MAIN__'
# - py-warpx ^warpx dims=3 # py-scipy: for_main.c:(.text+0x19): undefined reference to `MAIN__'
# - py-warpx ^warpx dims=rz # py-scipy: for_main.c:(.text+0x19): undefined reference to `MAIN__'
# - scr # libyogrt: configure: error: slurm is not in specified location!
# - tau +mpi +python # tau: x86_64/lib/Makefile.tau-icpx-papi-mpi-pthread-python-pdt: No such file or directory
# - upcxx # upcxx: /opt/intel/oneapi/mpi/2021.9.0//libfabric/bin/fi_info: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
# - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu # cmake/tps.cmake:220 (message): Unable to compile against Trilinos. It is possible Trilinos was not properly configured, or the environment has changed since Trilinos was installed. See the CMake log files for more information.
# - alquimia # pflotran: https://github.com/spack/spack/issues/39474
# - archer # subsumed under llvm +libomp_tsan
# - dealii # dealii: https://github.com/spack/spack/issues/39482
# - dxt-explorer # r: https://github.com/spack/spack/issues/40257
# - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc +visit +vtkm +zfp # embree: CMake Error at CMakeLists.txt:215 (MESSAGE): Unsupported compiler: IntelLLVM; qt: qtbase/src/corelib/global/qendian.h:333:54: error: incomplete type 'std::numeric_limits' used in nested name specifier
# - geopm # geopm issue: https://github.com/spack/spack/issues/38795
# - hpctoolkit # dyninst@12.3.0%gcc: /usr/bin/ld: libiberty/./d-demangle.c:142: undefined reference to `_intel_fast_memcpy'; can't mix intel-tbb@%oneapi with dyninst%gcc
# - mgard +serial +openmp +timing +unstructured ~cuda # mgard: mgard.tpp:63:48: error: non-constant-expression cannot be narrowed from type 'int' to 'unsigned long' in initializer list [-Wc++11-narrowing]
# - openfoam # cgal: https://github.com/spack/spack/issues/39481
# - openpmd-api # mgard: mgard.tpp:63:48: error: non-constant-expression cannot be narrowed from type 'int' to 'unsigned long' in initializer list [-Wc++11-narrowing]
# - swig@4.0.2-fortran # ?
# - upcxx # upcxx: /opt/intel/oneapi/mpi/2021.10.0//libfabric/bin/fi_info: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
# GPU
- aml +ze
@@ -229,23 +227,21 @@ spack:
- cabana +sycl ^kokkos +sycl +openmp cxxstd=17 +tests +examples
- kokkos +sycl +openmp cxxstd=17 +tests +examples
- kokkos-kernels build_type=Release %oneapi ^kokkos +sycl +openmp cxxstd=17 +tests +examples
- tau +mpi +opencl +level_zero ~pdt # tau: requires libdrm.so to be installed
- slate +sycl
# --
# - ginkgo +oneapi # InstallError: Ginkgo's oneAPI backend requires the DPC++ compiler as the main CXX compiler.
# - hpctoolkit +level_zero # intel-tbb: icpx: error: unknown argument: '-flifetime-dse=1'
# - hpctoolkit +level_zero # dyninst@12.3.0%gcc: /usr/bin/ld: libiberty/./d-demangle.c:142: undefined reference to `_intel_fast_memcpy'; can't mix intel-tbb@%oneapi with dyninst%gcc
# - sundials +sycl cxxstd=17 # sundials: include/sunmemory/sunmemory_sycl.h:20:10: fatal error: 'CL/sycl.hpp' file not found
# - tau +mpi +opencl +level_zero ~pdt # builds ok in container, but needs libdrm, will update container
# SKIPPED
# - nvhpc
# - dyninst # only %gcc
- py-scipy
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-oneapi" }
ci:
pipeline-gen:
- build-job:
image: ecpe4s/ubuntu20.04-runner-x86_64-oneapi:2023.07.21
image: ghcr.io/spack/ubuntu20.04-runner-amd64-oneapi-2023.2.1:2023.08.01
cdash:
build-group: E4S OneAPI


@@ -1,19 +1,35 @@
spack:
view: false
concretizer:
reuse: false
unify: false
compilers:
- compiler:
spec: gcc@9.4.0
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags: {}
operating_system: ubuntu20.04
target: ppc64le
modules: []
environment: {}
extra_rpaths: []
packages:
all:
compiler: [gcc@11.1.0]
require: "%gcc@9.4.0 target=ppc64le"
compiler: [gcc@9.4.0]
providers:
blas: [openblas]
mpi: [mpich]
target: [ppc64le]
variants: +mpi cuda_arch=70
tbb:
require: intel-tbb
binutils:
variants: +ld +gold +headers +libiberty ~nls
cuda:
version: [11.7.0]
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
@@ -22,30 +38,34 @@ spack:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
mpich:
variants: ~wrapperrpath
ncurses:
variants: +termlib
openblas:
variants: threads=openmp
paraview:
require: '@5.11 ~qt+osmesa'
trilinos:
require:
- one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack
+intrepid +intrepid2 +isorropia +kokkos +minitensor +nox +piro +phalanx
+rol +rythmos +sacado +stk +shards +stratimikos +tempus +tpetra
+trilinoscouplings +zoltan]
- one_of: [gotype=long_long, gotype=all]
- one_of: [~ml ~muelu ~zoltan2 ~teko, +ml +muelu +zoltan2 +teko]
- one_of: [+superlu-dist, ~superlu-dist]
- one_of: [+shylu, ~shylu]
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
faodel:
require: ~tcmalloc # needed for ppc64le
require: "~tcmalloc"
tbb:
require: intel-tbb
libffi:
require: "@3.4.4"
vtk-m:
require: "+examples"
cuda:
require: "@11.4.4"
specs:
# CPU
@@ -57,6 +77,8 @@ spack:
- argobots
- axom
- bolt
- boost
- bricks
- butterflypack
- cabana
- caliper
@@ -64,8 +86,10 @@ spack:
- charliecloud
- conduit
- datatransferkit
- drishti
- dxt-explorer
- dyninst
- ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 ~paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp # +paraview fails: FAILED: VTK/Filters/Statistics/CMakeFiles/FiltersStatistics-objects.dir/vtkPCAStatistics.cxx.o: /tmp/ccgvkIk5.s: Assembler messages: /tmp/ccgvkIk5.s:260012: Error: invalid machine `power10'
# - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 ~paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp # +visit: libext, libxkbfile, libxrender, libxt, silo (https://github.com/spack/spack/issues/39538), cairo
- exaworks
- flecsi
- flit
@@ -83,15 +107,17 @@ spack:
- hdf5-vol-log
- heffte +fftw
- hpctoolkit
- hpx max_cpu_count=512 networking=mpi
- hpx networking=mpi
- hypre
- kokkos +openmp
- kokkos-kernels +openmp
- lammps
- lbann
- legion
- libnrm
- libquo
- libunwind
- loki
- mercury
- metall
- mfem
@@ -104,20 +130,23 @@ spack:
- nrm
- nvhpc
- omega-h
- openfoam
- openmpi
- openpmd-api
- papi
- papyrus
- paraview ~cuda ~rocm
- parsec ~cuda
- pdt
- petsc
- phist
- plasma
- plumed
- precice
- pruners-ninja
- pumi
- py-h5py
- py-jupyterhub
- py-libensemble +mpi +nlopt
- py-libensemble
- py-petsc4py
- py-warpx
- qthreads scheduler=distrib
@@ -132,84 +161,102 @@ spack:
- sundials
- superlu
- superlu-dist
- swig
- swig@4.0.2-fortran
- sz3
- tasmanian
- tau +mpi +python
- trilinos +belos +ifpack2 +stokhos
- tau +mpi +python # tau: has issue with `spack env depfile` build
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap
- umpire
- upcxx
- wannier90
- xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu
- xyce +mpi +shared +pymi +pymi_static_tpls
# INCLUDED IN ECP DAV CPU
- adios2
- ascent
- darshan-runtime
- darshan-util
- faodel
- hdf5
- libcatalyst
- parallel-netcdf
- paraview
- py-cinemasci
- sz
- unifyfs
- veloc
# - visit # libext, libxkbfile, libxrender, libxt, silo (https://github.com/spack/spack/issues/39538), cairo
- vtk-m
- zfp
# --
# - archer # part of llvm +omp_tsan
# - dealii # fltk: https://github.com/spack/spack/issues/38791
# - geopm # geopm: https://github.com/spack/spack/issues/38798
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp # py-numcodecs: gcc: error: unrecognized command line option '-mno-sse2'; did you mean '-mno-isel'? gcc: error: unrecognized command line option '-mno-avx2'
# - phist +mpi # ghost@develop: gcc-9: error: unrecognized command line option '-march=native'; did you mean '-mcpu=native'?
# - variorum # variorum: https://github.com/spack/spack/issues/38786
# CUDA
- amrex +cuda
- arborx +cuda ^kokkos +wrapper
- cabana +cuda ^kokkos +wrapper +cuda_lambda +cuda
- caliper +cuda
- chai ~benchmarks ~tests +cuda ^umpire ~shared
- ecp-data-vis-sdk +cuda cuda_arch=70 +adios2 +hdf5 ~paraview +vtkm +zfp # +paraview fails: FAILED: VTK/Filters/Statistics/CMakeFiles/FiltersStatistics-objects.dir/vtkPCAStatistics.cxx.o; /tmp/ccjmJhb6.s: Assembler messages: /tmp/ccjmJhb6.s:260012: Error: invalid machine `power10'
- flecsi +cuda
# CUDA NOARCH
- bricks +cuda
- cabana +cuda ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=70
- flux-core +cuda
- ginkgo +cuda
- heffte +cuda
- hpctoolkit +cuda
- hpx max_cpu_count=512 +cuda
- hypre +cuda
- kokkos +wrapper +cuda
- kokkos-kernels +cuda ^kokkos +wrapper +cuda +cuda_lambda
- magma +cuda
- mfem +cuda
- mgard +serial +openmp +timing +unstructured +cuda
- omega-h +cuda
- papi +cuda
- petsc +cuda
- py-torch +cuda
- raja +cuda
- slate +cuda
- slepc +cuda
- strumpack ~slate +cuda
- sundials +cuda
- superlu-dist +cuda
- tasmanian +cuda
- tau +mpi +cuda
- "trilinos@13.4.0: +belos +ifpack2 +stokhos +cuda"
- umpire ~shared +cuda
- parsec +cuda
# CPU FAILURES
# - archer # llvm@8
# - bricks # bricks
# - geopm # geopm
# - hdf5-vol-daos # hdf5-vol-daos: vhost/vhost_user.c:65:32: error: array size missing in 'vhost_message_handlers'
# - loki # loki
# - precice # precice
# - pruners-ninja # pruners-ninja
# - variorum # Intel/variorum_cpuid.c:11:5: error: impossible constraint in 'asm'
- tau +mpi +cuda # tau: has issue with `spack env depfile` build
# --
# bricks: VSBrick-7pt.py-Scalar-8x8x8-1:30:3: error: 'vfloat512' was not declared in this scope
# fltk: /usr/bin/ld: ../lib/libfltk_png.a(pngrutil.o): in function `png_read_filter_row': pngrutil.c:(.text.png_read_filter_row+0x90): undefined reference to `png_init_filter_functions_vsx'
# geopm: libtool.m4: error: problem compiling CXX test program
# llvm@8: clang/lib/Lex/Lexer.cpp:2547:34: error: ISO C++ forbids declaration of 'type name' with no type [-fpermissive]
# loki: include/loki/SmallObj.h:462:57: error: ISO C++17 does not allow dynamic exception specifications
# precice: /tmp/ccYNMwgE.s: Assembler messages: /tmp/ccYNMwgE.s:278115: Error: invalid machine `power10'
# pruners-ninja: test/ninja_test_util.c:34: multiple definition of `a';
# - legion +cuda # legion: needs NVIDIA driver
# CUDA FAILURES
# - bricks +cuda # bricks
# - dealii +cuda # fltk
# CUDA 70
- amrex +cuda cuda_arch=70
- arborx +cuda cuda_arch=70 ^kokkos +wrapper
- caliper +cuda cuda_arch=70
- chai ~benchmarks ~tests +cuda cuda_arch=70 ^umpire ~shared
- ecp-data-vis-sdk ~rocm +adios2 ~ascent +hdf5 +vtkm +zfp ~paraview +cuda cuda_arch=70
- flecsi +cuda cuda_arch=70
- ginkgo +cuda cuda_arch=70
- heffte +cuda cuda_arch=70
- hpx +cuda cuda_arch=70
- hypre +cuda cuda_arch=70
- kokkos +wrapper +cuda cuda_arch=70
- kokkos-kernels +cuda cuda_arch=70 ^kokkos +wrapper +cuda cuda_arch=70
- magma +cuda cuda_arch=70
- mfem +cuda cuda_arch=70
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=70
- omega-h +cuda cuda_arch=70
- parsec +cuda cuda_arch=70
- petsc +cuda cuda_arch=70
- raja +cuda cuda_arch=70
- slate +cuda cuda_arch=70
- slepc +cuda cuda_arch=70
- strumpack ~slate +cuda cuda_arch=70
- sundials +cuda cuda_arch=70
- superlu-dist +cuda cuda_arch=70
- tasmanian +cuda cuda_arch=70
- umpire ~shared +cuda cuda_arch=70
# INCLUDED IN ECP DAV CUDA
- adios2 +cuda cuda_arch=70
# - ascent +cuda cuda_arch=70 # ascent: https://github.com/spack/spack/issues/38045
- paraview +cuda cuda_arch=70
- vtk-m +cuda cuda_arch=70
- zfp +cuda cuda_arch=70
# --
# bricks: VSBrick-7pt.py-Scalar-8x8x8-1:30:3: error: 'vfloat512' was not declared in this scope
# - axom +cuda cuda_arch=70 # axom: https://github.com/spack/spack/issues/29520
# - cusz +cuda cuda_arch=70 # cusz: https://github.com/spack/spack/issues/38787
# - dealii +cuda cuda_arch=70 # fltk: https://github.com/spack/spack/issues/38791
# - lammps +cuda cuda_arch=70 # lammps: needs NVIDIA driver
# - lbann +cuda cuda_arch=70 # lbann: https://github.com/spack/spack/issues/38788
# - libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=70 ^cusz +cuda cuda_arch=70 # depends_on("cuda@11.7.1:", when="+cuda")
# - py-torch +cuda cuda_arch=70 # skipped
# - trilinos +cuda cuda_arch=70 # trilinos: https://github.com/trilinos/Trilinos/issues/11630
# - upcxx +cuda cuda_arch=70 # upcxx: needs NVIDIA driver
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-power" }
ci:
pipeline-gen:
- build-job:
image: ecpe4s/ubuntu20.04-runner-ppc64le:2023-01-01
image: ghcr.io/spack/ubuntu20.04-runner-ppc64-gcc-11.4:2023.08.01
cdash:
build-group: E4S Power


@@ -0,0 +1,346 @@
spack:
view: false
concretizer:
reuse: false
unify: false
compilers:
- compiler:
spec: gcc@=11.4.0
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags: {}
operating_system: ubuntu20.04
target: x86_64
modules: []
environment: {}
extra_rpaths: []
packages:
all:
require: '%gcc target=x86_64_v3'
providers:
blas: [openblas]
mpi: [mpich]
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
openblas:
variants: threads=openmp
trilinos:
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
tbb:
require: intel-tbb
boost:
version: [1.79.0]
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
cxxstd=17 visibility=global
libffi:
require: "@3.4.4"
vtk-m:
require: "+examples"
cuda:
version: [11.8.0]
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 ~qt+osmesa"
# ROCm 5.4.3
comgr:
buildable: false
externals:
- spec: comgr@5.4.3
prefix: /opt/rocm-5.4.3/
hip-rocclr:
buildable: false
externals:
- spec: hip-rocclr@5.4.3
prefix: /opt/rocm-5.4.3/hip
hipblas:
buildable: false
externals:
- spec: hipblas@5.4.3
prefix: /opt/rocm-5.4.3/
hipcub:
buildable: false
externals:
- spec: hipcub@5.4.3
prefix: /opt/rocm-5.4.3/
hipfft:
buildable: false
externals:
- spec: hipfft@5.4.3
prefix: /opt/rocm-5.4.3/
hipsparse:
buildable: false
externals:
- spec: hipsparse@5.4.3
prefix: /opt/rocm-5.4.3/
miopen-hip:
buildable: false
externals:
- spec: hip-rocclr@5.4.3
prefix: /opt/rocm-5.4.3/
miopengemm:
buildable: false
externals:
- spec: miopengemm@5.4.3
prefix: /opt/rocm-5.4.3/
rccl:
buildable: false
externals:
- spec: rccl@5.4.3
prefix: /opt/rocm-5.4.3/
rocblas:
buildable: false
externals:
- spec: rocblas@5.4.3
prefix: /opt/rocm-5.4.3/
rocfft:
buildable: false
externals:
- spec: rocfft@5.4.3
prefix: /opt/rocm-5.4.3/
rocm-clang-ocl:
buildable: false
externals:
- spec: rocm-clang-ocl@5.4.3
prefix: /opt/rocm-5.4.3/
rocm-cmake:
buildable: false
externals:
- spec: rocm-cmake@5.4.3
prefix: /opt/rocm-5.4.3/
rocm-dbgapi:
buildable: false
externals:
- spec: rocm-dbgapi@5.4.3
prefix: /opt/rocm-5.4.3/
rocm-debug-agent:
buildable: false
externals:
- spec: rocm-debug-agent@5.4.3
prefix: /opt/rocm-5.4.3/
rocm-device-libs:
buildable: false
externals:
- spec: rocm-device-libs@5.4.3
prefix: /opt/rocm-5.4.3/
rocm-gdb:
buildable: false
externals:
- spec: rocm-gdb@5.4.3
prefix: /opt/rocm-5.4.3/
rocm-opencl:
buildable: false
externals:
- spec: rocm-opencl@5.4.3
prefix: /opt/rocm-5.4.3/opencl
rocm-smi-lib:
buildable: false
externals:
- spec: rocm-smi-lib@5.4.3
prefix: /opt/rocm-5.4.3/
hip:
buildable: false
externals:
- spec: hip@5.4.3
prefix: /opt/rocm-5.4.3
extra_attributes:
compilers:
c: /opt/rocm-5.4.3/llvm/bin/clang++
c++: /opt/rocm-5.4.3/llvm/bin/clang++
hip: /opt/rocm-5.4.3/hip/bin/hipcc
hipify-clang:
buildable: false
externals:
- spec: hipify-clang@5.4.3
prefix: /opt/rocm-5.4.3
llvm-amdgpu:
buildable: false
externals:
- spec: llvm-amdgpu@5.4.3
prefix: /opt/rocm-5.4.3/llvm
extra_attributes:
compilers:
c: /opt/rocm-5.4.3/llvm/bin/clang++
cxx: /opt/rocm-5.4.3/llvm/bin/clang++
hsakmt-roct:
buildable: false
externals:
- spec: hsakmt-roct@5.4.3
prefix: /opt/rocm-5.4.3/
hsa-rocr-dev:
buildable: false
externals:
- spec: hsa-rocr-dev@5.4.3
prefix: /opt/rocm-5.4.3/
extra_attributes:
compilers:
c: /opt/rocm-5.4.3/llvm/bin/clang++
cxx: /opt/rocm-5.4.3/llvm/bin/clang++
roctracer-dev-api:
buildable: false
externals:
- spec: roctracer-dev-api@5.4.3
prefix: /opt/rocm-5.4.3
roctracer-dev:
buildable: false
externals:
- spec: roctracer-dev@5.4.3
prefix: /opt/rocm-5.4.3
rocprim:
buildable: false
externals:
- spec: rocprim@5.4.3
prefix: /opt/rocm-5.4.3
rocrand:
buildable: false
externals:
- spec: rocrand@5.4.3
prefix: /opt/rocm-5.4.3
hipsolver:
buildable: false
externals:
- spec: hipsolver@5.4.3
prefix: /opt/rocm-5.4.3
rocsolver:
buildable: false
externals:
- spec: rocsolver@5.4.3
prefix: /opt/rocm-5.4.3
rocsparse:
buildable: false
externals:
- spec: rocsparse@5.4.3
prefix: /opt/rocm-5.4.3
rocthrust:
buildable: false
externals:
- spec: rocthrust@5.4.3
prefix: /opt/rocm-5.4.3
rocprofiler-dev:
buildable: false
externals:
- spec: rocprofiler-dev@5.4.3
prefix: /opt/rocm-5.4.3
specs:
# ROCM NOARCH
- hpctoolkit +rocm
- tau +mpi +rocm # tau: has issue with `spack env depfile` build
# ROCM 908
- amrex +rocm amdgpu_target=gfx908
- arborx +rocm amdgpu_target=gfx908
- cabana +rocm amdgpu_target=gfx908
- caliper +rocm amdgpu_target=gfx908
- chai ~benchmarks +rocm amdgpu_target=gfx908
- ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx908
- gasnet +rocm amdgpu_target=gfx908
- ginkgo +rocm amdgpu_target=gfx908
- heffte +rocm amdgpu_target=gfx908
- hpx +rocm amdgpu_target=gfx908
- hypre +rocm amdgpu_target=gfx908
- kokkos +rocm amdgpu_target=gfx908
- legion +rocm amdgpu_target=gfx908
- magma ~cuda +rocm amdgpu_target=gfx908
- mfem +rocm amdgpu_target=gfx908
- petsc +rocm amdgpu_target=gfx908
- raja ~openmp +rocm amdgpu_target=gfx908
- slate +rocm amdgpu_target=gfx908
- slepc +rocm amdgpu_target=gfx908 ^petsc +rocm amdgpu_target=gfx908
- strumpack ~slate +rocm amdgpu_target=gfx908
- sundials +rocm amdgpu_target=gfx908
- superlu-dist +rocm amdgpu_target=gfx908
- tasmanian ~openmp +rocm amdgpu_target=gfx908
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx908
- umpire +rocm amdgpu_target=gfx908
- upcxx +rocm amdgpu_target=gfx908
# INCLUDED IN ECP DAV ROCM
# - hdf5
# - hdf5-vol-async
# - hdf5-vol-cache
# - hdf5-vol-log
# - libcatalyst
- paraview +rocm amdgpu_target=gfx908
# - vtk-m ~openmp +rocm amdgpu_target=gfx908 # vtk-m: https://github.com/spack/spack/issues/40268
# --
# - lbann ~cuda +rocm amdgpu_target=gfx908 # aluminum: https://github.com/spack/spack/issues/38807
# - papi +rocm amdgpu_target=gfx908 # papi: https://github.com/spack/spack/issues/27898
# ROCM 90a
- amrex +rocm amdgpu_target=gfx90a
- arborx +rocm amdgpu_target=gfx90a
- cabana +rocm amdgpu_target=gfx90a
- caliper +rocm amdgpu_target=gfx90a
- chai ~benchmarks +rocm amdgpu_target=gfx90a
- ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx90a
- gasnet +rocm amdgpu_target=gfx90a
- ginkgo +rocm amdgpu_target=gfx90a
- heffte +rocm amdgpu_target=gfx90a
- hpx +rocm amdgpu_target=gfx90a
- hypre +rocm amdgpu_target=gfx90a
- kokkos +rocm amdgpu_target=gfx90a
- legion +rocm amdgpu_target=gfx90a
- magma ~cuda +rocm amdgpu_target=gfx90a
- mfem +rocm amdgpu_target=gfx90a
- petsc +rocm amdgpu_target=gfx90a
- raja ~openmp +rocm amdgpu_target=gfx90a
- slate +rocm amdgpu_target=gfx90a
- slepc +rocm amdgpu_target=gfx90a ^petsc +rocm amdgpu_target=gfx90a
- strumpack ~slate +rocm amdgpu_target=gfx90a
- sundials +rocm amdgpu_target=gfx90a
- superlu-dist +rocm amdgpu_target=gfx90a
- tasmanian ~openmp +rocm amdgpu_target=gfx90a
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx90a
- umpire +rocm amdgpu_target=gfx90a
- upcxx +rocm amdgpu_target=gfx90a
# INCLUDED IN ECP DAV ROCM
# - hdf5
# - hdf5-vol-async
# - hdf5-vol-cache
# - hdf5-vol-log
# - libcatalyst
- paraview +rocm amdgpu_target=gfx90a
# - vtk-m ~openmp +rocm amdgpu_target=gfx90a # vtk-m: https://github.com/spack/spack/issues/40268
# --
# - lbann ~cuda +rocm amdgpu_target=gfx90a # aluminum: https://github.com/spack/spack/issues/38807
# - papi +rocm amdgpu_target=gfx90a # papi: https://github.com/spack/spack/issues/27898
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-rocm-external" }
ci:
pipeline-gen:
- build-job:
image: "ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4-rocm5.4.3:2023.08.01"
cdash:
build-group: E4S ROCm External


@@ -1,21 +1,34 @@
spack:
view: false
concretizer:
reuse: false
unify: false
compilers:
- compiler:
spec: gcc@=11.4.0
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags: {}
operating_system: ubuntu20.04
target: x86_64
modules: []
environment: {}
extra_rpaths: []
packages:
all:
compiler: [gcc@11.1.0]
require: '%gcc target=x86_64_v3'
providers:
blas: [openblas]
mpi: [mpich]
require: target=x86_64_v3
variants: +mpi amdgpu_target=gfx90a cuda_arch=80
tbb:
require: "intel-tbb"
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
boost:
variants: +python +filesystem +iostreams +system
cuda:
version: [11.7.0]
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
@@ -24,29 +37,40 @@ spack:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
mpich:
variants: ~wrapperrpath
ncurses:
variants: +termlib
openblas:
variants: threads=openmp
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 ~qt+osmesa"
trilinos:
require:
- one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack
+intrepid +intrepid2 +isorropia +kokkos +minitensor +nox +piro +phalanx
+rol +rythmos +sacado +stk +shards +stratimikos +tempus +tpetra
+trilinoscouplings +zoltan]
- one_of: [gotype=long_long, gotype=all]
- one_of: [~ml ~muelu ~zoltan2 ~teko, +ml +muelu +zoltan2 +teko]
- one_of: [+superlu-dist, ~superlu-dist]
- one_of: [+shylu, ~shylu]
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
tbb:
require: intel-tbb
boost:
version: [1.79.0]
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
cxxstd=17 visibility=global
libffi:
require: "@3.4.4"
vtk-m:
require: "+examples"
cuda:
version: [11.8.0]
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 ~qt+osmesa"
specs:
# CPU
@@ -55,13 +79,12 @@ spack:
- aml
- amrex
- arborx
- archer
- argobots
- axom
- bolt
- bricks
- boost
- bricks ~cuda
- butterflypack
- boost +python +filesystem +iostreams +system
- cabana
- caliper
- chai ~benchmarks ~tests
@@ -69,8 +92,10 @@ spack:
- conduit
- datatransferkit
- dealii
- drishti
- dxt-explorer
- dyninst
- ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp ^hdf5@1.14
- ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc +visit +vtkm +zfp # adios2~cuda, ascent~cuda, darshan-runtime, darshan-util, faodel, hdf5, libcatalyst, parallel-netcdf, paraview~cuda, py-cinemasci, sz, unifyfs, veloc, visit, vtk-m, zfp
- exaworks
- flecsi
- flit
@@ -81,24 +106,25 @@ spack:
- globalarrays
- gmp
- gotcha
- gptune
- gptune ~mpispawn
- h5bench
- hdf5-vol-async
- hdf5-vol-cache
- hdf5-vol-log
- heffte +fftw
- hpctoolkit
- hpx max_cpu_count=512 networking=mpi
- hpx networking=mpi
- hypre
- kokkos +openmp
- kokkos-kernels +openmp
- lammps
- lbann
- legion
- libnrm
- libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed
+lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard
- libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp
- libquo
- libunwind
- loki
- mercury
- metall
- mfem
@@ -111,6 +137,7 @@ spack:
- nrm
- nvhpc
- omega-h
- openfoam
- openmpi
- openpmd-api
- papi
@@ -122,16 +149,17 @@ spack:
- plasma
- plumed
- precice
- pruners-ninja
- pumi
- py-h5py +mpi
- py-h5py ~mpi
- py-h5py
- py-jupyterhub
- py-libensemble +mpi +nlopt
- py-libensemble
- py-petsc4py
- py-warpx
- qthreads scheduler=distrib
- quantum-espresso
- raja
- rempi
- scr
- slate ~cuda
- slepc
@@ -140,107 +168,226 @@ spack:
- sundials
- superlu
- superlu-dist
- swig
- swig@4.0.2-fortran
- sz3
- tasmanian
- tau +mpi +python
- trilinos@13.0.1 +belos +ifpack2 +stokhos
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap
- umpire
- upcxx
- variorum
- veloc
- wannier90
- xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos +shylu
- xyce +mpi +shared +pymi +pymi_static_tpls
# INCLUDED IN ECP DAV CPU
- adios2
- ascent
- darshan-runtime
- darshan-util
- faodel
- hdf5
- libcatalyst
- parallel-netcdf
- paraview
- py-cinemasci
- sz
- unifyfs
- veloc
# - visit # silo: https://github.com/spack/spack/issues/39538
- vtk-m
- zfp
# --
# - archer # subsumed under llvm +libomp_tsan
# - geopm # geopm: https://github.com/spack/spack/issues/38795
# CUDA
- amrex +cuda
- arborx +cuda ^kokkos +wrapper
# CUDA NOARCH
- bricks +cuda
- cabana +cuda ^kokkos +wrapper +cuda_lambda +cuda
- caliper +cuda
- chai ~benchmarks ~tests +cuda ^umpire ~shared
- cusz +cuda
- dealii +cuda
- ecp-data-vis-sdk +cuda ~ascent +adios2 +hdf5 +paraview +sz +vtkm +zfp ^hdf5@1.14 # Removing ascent because RAJA build failure
- flecsi +cuda
- flux-core +cuda
- ginkgo +cuda
- heffte +cuda
- hpctoolkit +cuda
- hpx max_cpu_count=512 +cuda
- hypre +cuda
- kokkos +wrapper +cuda
- kokkos-kernels +cuda ^kokkos +wrapper +cuda +cuda_lambda
- libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua
+openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz
+mgard +cuda ^cusz +cuda
- magma +cuda
- mfem +cuda
- mgard +serial +openmp +timing +unstructured +cuda
- omega-h +cuda
- papi +cuda
- petsc +cuda
- py-torch +cuda
- raja +cuda
- slate +cuda
- slepc +cuda
- strumpack ~slate +cuda
- sundials +cuda
- superlu-dist +cuda
- tasmanian +cuda
- tau +mpi +cuda
- "trilinos@13.4.0: +belos +ifpack2 +stokhos +cuda"
- umpire ~shared +cuda
# --
# - legion +cuda # legion: needs NVIDIA driver
# ROCm
- amrex +rocm
- arborx +rocm
- cabana +rocm
- caliper +rocm
- chai ~benchmarks +rocm
- ecp-data-vis-sdk +adios2 +hdf5 +paraview +pnetcdf +sz +vtkm +zfp +rocm ^hdf5@1.14 # Excludes ascent for now due to C++ standard issues
- gasnet +rocm
- ginkgo +rocm
- heffte +rocm
# CUDA 80
- amrex +cuda cuda_arch=80
- arborx +cuda cuda_arch=80 ^kokkos +wrapper
- cabana +cuda cuda_arch=80 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=80
- caliper +cuda cuda_arch=80
- chai ~benchmarks ~tests +cuda cuda_arch=80 ^umpire ~shared
- cusz +cuda cuda_arch=80
- dealii +cuda cuda_arch=80
- ecp-data-vis-sdk ~rocm +adios2 ~ascent +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=80 # +ascent fails because fides fetch error
- flecsi +cuda cuda_arch=80
- ginkgo +cuda cuda_arch=80
- heffte +cuda cuda_arch=80
- hpx +cuda cuda_arch=80
- hypre +cuda cuda_arch=80
- kokkos +wrapper +cuda cuda_arch=80
- kokkos-kernels +cuda cuda_arch=80 ^kokkos +wrapper +cuda cuda_arch=80
- libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=80 ^cusz +cuda cuda_arch=80
- magma +cuda cuda_arch=80
- mfem +cuda cuda_arch=80
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=80
- omega-h +cuda cuda_arch=80
- parsec +cuda cuda_arch=80
- petsc +cuda cuda_arch=80
- py-torch +cuda cuda_arch=80
- raja +cuda cuda_arch=80
- slate +cuda cuda_arch=80
- slepc +cuda cuda_arch=80
- strumpack ~slate +cuda cuda_arch=80
- sundials +cuda cuda_arch=80
- superlu-dist +cuda cuda_arch=80
- tasmanian +cuda cuda_arch=80
- trilinos +cuda cuda_arch=80
- umpire ~shared +cuda cuda_arch=80
# INCLUDED IN ECP DAV CUDA
# - adios2 +cuda cuda_arch=80
# - ascent +cuda cuda_arch=80 # ascent: https://github.com/spack/spack/issues/38045
# - paraview +cuda cuda_arch=80
# - vtk-m +cuda cuda_arch=80
# - zfp +cuda cuda_arch=80
# --
# - lammps +cuda cuda_arch=80 # lammps: needs NVIDIA driver
# - upcxx +cuda cuda_arch=80 # upcxx: needs NVIDIA driver
# - axom +cuda cuda_arch=80 # axom: https://github.com/spack/spack/issues/29520
# - lbann +cuda cuda_arch=80 # lbann: https://github.com/spack/spack/issues/38788
# CUDA 90
- amrex +cuda cuda_arch=90
- arborx +cuda cuda_arch=90 ^kokkos +wrapper
- cabana +cuda cuda_arch=90 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=90
- caliper +cuda cuda_arch=90
- chai ~benchmarks ~tests +cuda cuda_arch=90 ^umpire ~shared
- cusz +cuda cuda_arch=90
- flecsi +cuda cuda_arch=90
- ginkgo +cuda cuda_arch=90
- heffte +cuda cuda_arch=90
- hpx +cuda cuda_arch=90
- kokkos +wrapper +cuda cuda_arch=90
- kokkos-kernels +cuda cuda_arch=90 ^kokkos +wrapper +cuda cuda_arch=90
- libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=90 ^cusz +cuda cuda_arch=90
- magma +cuda cuda_arch=90
- mfem +cuda cuda_arch=90
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=90
- parsec +cuda cuda_arch=90
- petsc +cuda cuda_arch=90
- py-torch +cuda cuda_arch=90
- raja +cuda cuda_arch=90
- slate +cuda cuda_arch=90
- slepc +cuda cuda_arch=90
- strumpack ~slate +cuda cuda_arch=90
- sundials +cuda cuda_arch=90
- superlu-dist +cuda cuda_arch=90
- trilinos +cuda cuda_arch=90
- umpire ~shared +cuda cuda_arch=90
# INCLUDED IN ECP DAV CUDA
- adios2 +cuda cuda_arch=90
# - ascent +cuda cuda_arch=90 # ascent: https://github.com/spack/spack/issues/38045
# - paraview +cuda cuda_arch=90 # paraview: InstallError: Incompatible cuda_arch=90
- vtk-m +cuda cuda_arch=90
- zfp +cuda cuda_arch=90
# --
# - axom +cuda cuda_arch=90 # axom: https://github.com/spack/spack/issues/29520
# - dealii +cuda cuda_arch=90 # dealii: https://github.com/spack/spack/issues/39532
# - ecp-data-vis-sdk ~rocm +adios2 +ascent +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=90 # paraview: incompatible cuda_arch; vtk-m: CMake Error at CMake/VTKmWrappers.cmake:413 (message): vtkm_cont needs to be built STATIC as CUDA doesn't support virtual methods across dynamic library boundaries. You need to set the CMake option BUILD_SHARED_LIBS to `OFF` or (better) turn VTKm_NO_DEPRECATED_VIRTUAL to `ON`.
# - hypre +cuda cuda_arch=90 # concretizer: hypre +cuda requires cuda@:11, but cuda_arch=90 requires cuda@12:
# - lammps +cuda cuda_arch=90 # lammps: needs NVIDIA driver
# - lbann +cuda cuda_arch=90 # concretizer: Cannot select a single "version" for package "lbann"
# - omega-h +cuda cuda_arch=90 # omega-h: https://github.com/spack/spack/issues/39535
# - tasmanian +cuda cuda_arch=90 # tasmanian: conflicts with cuda@12
# - upcxx +cuda cuda_arch=90 # upcxx: needs NVIDIA driver
# ROCM NOARCH
- hpctoolkit +rocm
- hpx max_cpu_count=512 +rocm
- hypre +rocm
- kokkos +rocm
- magma ~cuda +rocm
- mfem +rocm
- papi +rocm
- petsc +rocm
- raja ~openmp +rocm
- slate +rocm
- slepc +rocm ^petsc +rocm
- strumpack ~slate +rocm
- sundials +rocm
- superlu-dist +rocm
- tasmanian ~openmp +rocm
- tau +mpi +rocm
- "trilinos@13.4.0: +belos ~ifpack2 ~stokhos +rocm"
- umpire +rocm
- upcxx +rocm
- tau +mpi +rocm # tau: has issue with `spack env depfile` build
# CPU failures
# - geopm # /usr/include/x86_64-linux-gnu/bits/string_fortified.h:95:10: error:'__builtin_strncpy' specified bound 512 equals destination size [-Werror=stringop-truncation]
# - hdf5-vol-daos # hdf5-vol-daos: vhost/vhost_user.c:65:32: error: array size missing in 'vhost_message_handlers'
# - loki # ../include/loki/Singleton.h:158:14: warning: 'template<class> class std::auto_ptr' is deprecated: use 'std::unique_ptr' instead [-Wdeprecated-declarations]
# - pruners-ninja # test/ninja_test_util.c:34: multiple definition of `a';
# - rempi # rempi_message_manager.h:53:3: error: 'string' does not name a type
# ROCM 908
- amrex +rocm amdgpu_target=gfx908
- arborx +rocm amdgpu_target=gfx908
- cabana +rocm amdgpu_target=gfx908
- caliper +rocm amdgpu_target=gfx908
- chai ~benchmarks +rocm amdgpu_target=gfx908
- ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx908
- gasnet +rocm amdgpu_target=gfx908
- ginkgo +rocm amdgpu_target=gfx908
- heffte +rocm amdgpu_target=gfx908
- hpx +rocm amdgpu_target=gfx908
- hypre +rocm amdgpu_target=gfx908
- kokkos +rocm amdgpu_target=gfx908
- legion +rocm amdgpu_target=gfx908
- magma ~cuda +rocm amdgpu_target=gfx908
- mfem +rocm amdgpu_target=gfx908
- petsc +rocm amdgpu_target=gfx908
- raja ~openmp +rocm amdgpu_target=gfx908
- slate +rocm amdgpu_target=gfx908
- slepc +rocm amdgpu_target=gfx908 ^petsc +rocm amdgpu_target=gfx908
- strumpack ~slate +rocm amdgpu_target=gfx908
- sundials +rocm amdgpu_target=gfx908
- superlu-dist +rocm amdgpu_target=gfx908
- tasmanian ~openmp +rocm amdgpu_target=gfx908
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx908
- umpire +rocm amdgpu_target=gfx908
- upcxx +rocm amdgpu_target=gfx908
# INCLUDED IN ECP DAV ROCM
# - hdf5
# - hdf5-vol-async
# - hdf5-vol-cache
# - hdf5-vol-log
# - libcatalyst
- paraview +rocm amdgpu_target=gfx908
# - vtk-m ~openmp +rocm amdgpu_target=gfx908 # vtk-m: https://github.com/spack/spack/issues/40268
# --
# - lbann ~cuda +rocm amdgpu_target=gfx908 # aluminum: https://github.com/spack/spack/issues/38807
# - papi +rocm amdgpu_target=gfx908 # papi: https://github.com/spack/spack/issues/27898
# CUDA failures
# - parsec +cuda # parsec/mca/device/cuda/transfer.c:168: multiple definition of `parsec_CUDA_d2h_max_flows';
# ROCM 90a
- amrex +rocm amdgpu_target=gfx90a
- arborx +rocm amdgpu_target=gfx90a
- cabana +rocm amdgpu_target=gfx90a
- caliper +rocm amdgpu_target=gfx90a
- chai ~benchmarks +rocm amdgpu_target=gfx90a
- ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx90a
- gasnet +rocm amdgpu_target=gfx90a
- ginkgo +rocm amdgpu_target=gfx90a
- heffte +rocm amdgpu_target=gfx90a
- hpx +rocm amdgpu_target=gfx90a
- hypre +rocm amdgpu_target=gfx90a
- kokkos +rocm amdgpu_target=gfx90a
- legion +rocm amdgpu_target=gfx90a
- magma ~cuda +rocm amdgpu_target=gfx90a
- mfem +rocm amdgpu_target=gfx90a
- petsc +rocm amdgpu_target=gfx90a
- raja ~openmp +rocm amdgpu_target=gfx90a
- slate +rocm amdgpu_target=gfx90a
- slepc +rocm amdgpu_target=gfx90a ^petsc +rocm amdgpu_target=gfx90a
- strumpack ~slate +rocm amdgpu_target=gfx90a
- sundials +rocm amdgpu_target=gfx90a
- superlu-dist +rocm amdgpu_target=gfx90a
- tasmanian ~openmp +rocm amdgpu_target=gfx90a
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx90a
- umpire +rocm amdgpu_target=gfx90a
- upcxx +rocm amdgpu_target=gfx90a
# INCLUDED IN ECP DAV ROCM
# - hdf5
# - hdf5-vol-async
# - hdf5-vol-cache
# - hdf5-vol-log
# - libcatalyst
- paraview +rocm amdgpu_target=gfx90a
# - vtk-m ~openmp +rocm amdgpu_target=gfx90a # vtk-m: https://github.com/spack/spack/issues/40268
# --
# - lbann ~cuda +rocm amdgpu_target=gfx90a # aluminum: https://github.com/spack/spack/issues/38807
# - papi +rocm amdgpu_target=gfx90a # papi: https://github.com/spack/spack/issues/27898
mirrors: { "mirror": "s3://spack-binaries/develop/e4s" }
ci:
pipeline-gen:
- build-job:
image: "ghcr.io/spack/ubuntu20.04-runner-x86_64:2023-01-01"
image: "ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4:2023.08.01"
cdash:
build-group: E4S


@@ -41,7 +41,7 @@
# prevent infinite recursion when spack shells out (e.g., on cray for modules)
if [ -n "${_sp_initializing:-}" ]; then
exit 0
return 0
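# Note: `return` rather than `exit` is essential here because this script is
# sourced; `exit 0` would terminate the user's shell instead of merely
# skipping re-initialization.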
fi
export _sp_initializing=true


@@ -39,9 +39,6 @@ RUN find -L {{ paths.view }}/* -type f -exec readlink -f '{}' \; | \
RUN cd {{ paths.environment }} && \
spack env activate --sh -d . > activate.sh
{% if extra_instructions.build %}
{{ extra_instructions.build }}
{% endif %}
{% endblock build_stage %}
{% endif %}
@@ -70,10 +67,6 @@ RUN {% if os_package_update %}{{ os_packages_final.update }} \
&& {% endif %}{{ os_packages_final.install }} {{ os_packages_final.list | join | replace('\n', ' ') }} \
&& {{ os_packages_final.clean }}
{% endif %}
{% if extra_instructions.final %}
{{ extra_instructions.final }}
{% endif %}
{% endblock final_stage %}
{% for label, value in labels.items() %}
LABEL "{{ label }}"="{{ value }}"


@@ -39,9 +39,6 @@ EOF
grep 'x-executable\|x-archive\|x-sharedlib' | \
awk -F: '{print $1}' | xargs strip
{% endif %}
{% if extra_instructions.build %}
{{ extra_instructions.build }}
{% endif %}
{% endblock build_stage %}
{% if apps %}
{% for application, help_text in apps.items() %}
@@ -80,9 +77,6 @@ Stage: final
{% endif %}
# Modify the environment without relying on sourcing shell specific files at startup
cat {{ paths.environment }}/environment_modifications.sh >> $SINGULARITY_ENVIRONMENT
{% if extra_instructions.final %}
{{ extra_instructions.final }}
{% endif %}
{% endblock final_stage %}
{% if runscript %}


@@ -24,9 +24,7 @@ class _3proxy(MakefilePackage):
depends_on("m4", type="build")
def build(self, spec, prefix):
make("-f", "Makefile.{0}".format(platform.system()))
make("-f", f"Makefile.{platform.system()}")
def install(self, spec, prefix):
make(
"-f", "Makefile.{0}".format(platform.system()), "prefix={0}".format(prefix), "install"
)
make("-f", f"Makefile.{platform.system()}", f"prefix={prefix}", "install")


@@ -75,8 +75,8 @@ def is_64bit(self):
def build(self, spec, prefix):
link_type = "1" if "static" in spec.variants["link_type"].value else "0"
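# the makefile consumes a literal 1/0 for MY_STATIC_LINK, so convert the
# boolean up front rather than passing Python's True/False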
nmake_args = [
"PLATFORM=%s" % self.plat_arch,
"MY_STATIC_LINK=%s" % link_type,
f"PLATFORM={self.plat_arch}",
f"MY_STATIC_LINK={link_type}",
"NEW_COMPILER=1",
]
# 7-Zip's makefile is configured in such a way that if this value is set


@@ -65,7 +65,7 @@ def edit(self, spec, prefix):
spec["fftw"].prefix,
spec["elpa"].prefix,
inc_var,
"{0}".format(spec["elpa"].version),
f"{spec['elpa'].version}",
spec["cereal"].prefix,
)
)


@@ -21,4 +21,4 @@ class Abduco(MakefilePackage):
version("0.4", sha256="bda3729df116ce41f9a087188d71d934da2693ffb1ebcf33b803055eb478bcbb")
def install(self, spec, prefix):
make("PREFIX={0}".format(prefix), "install")
make(f"PREFIX={prefix}", "install")


@@ -185,8 +185,7 @@ def get_acfl_prefix(spec):
)
else:
return join_path(
spec.prefix,
"arm-linux-compiler-{0}_{1}".format(spec.version, get_os(spec.version.string)),
spec.prefix, f"arm-linux-compiler-{spec.version}_{get_os(spec.version.string)}"
)
@@ -238,7 +237,7 @@ class Acfl(Package):
# Run the installer with the desired install directory
def install(self, spec, prefix):
exe = Executable(
"./arm-compiler-for-linux_{0}_{1}.sh".format(spec.version, get_os(spec.version.string))
f"./arm-compiler-for-linux_{spec.version}_{get_os(spec.version.string)}.sh"
)
exe("--accept", "--force", "--install-to", prefix)


@@ -320,8 +320,12 @@ class Acts(CMakePackage, CudaPackage):
for _cxxstd in _cxxstd_values:
if isinstance(_cxxstd, _ConditionalVariantValues):
for _v in _cxxstd:
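# each conditional value carries its own `when` clause (_v.when), which is
# folded into the dependency constraint alongside the cxxstd match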
depends_on(
f"geant4 cxxstd={_v.value}", when=f"cxxstd={_v.value} {_v.when} ^geant4"
)
depends_on(f"root cxxstd={_v.value}", when=f"cxxstd={_v.value} {_v.when} ^root")
else:
depends_on(f"geant4 cxxstd={_v.value}", when=f"cxxstd={_v.value} {_v.when} ^geant4")
depends_on(f"root cxxstd={_cxxstd}", when=f"cxxstd={_cxxstd} ^root")
# ACTS has been using C++17 for a while, which precludes use of old GCC
@@ -332,15 +336,15 @@ def cmake_args(self):
def cmake_variant(cmake_label, spack_variant):
enabled = spec.satisfies("+" + spack_variant)
return "-DACTS_BUILD_{0}={1}".format(cmake_label, enabled)
return f"-DACTS_BUILD_{cmake_label}={enabled}"
def enable_cmake_variant(cmake_label, spack_variant):
enabled = spec.satisfies(spack_variant)
return "-DACTS_ENABLE_{0}={1}".format(cmake_label, enabled)
return f"-DACTS_ENABLE_{cmake_label}={enabled}"
def example_cmake_variant(cmake_label, spack_variant, type="BUILD"):
enabled = spec.satisfies("+examples +" + spack_variant)
return "-DACTS_{0}_EXAMPLES_{1}={2}".format(type, cmake_label, enabled)
return f"-DACTS_{type}_EXAMPLES_{cmake_label}={enabled}"
def plugin_label(plugin_name):
if spec.satisfies("@0.33:"):
@@ -396,7 +400,7 @@ def plugin_cmake_variant(plugin_name, spack_variant):
]
log_failure_threshold = spec.variants["log_failure_threshold"].value
args.append("-DACTS_LOG_FAILURE_THRESHOLD={0}".format(log_failure_threshold))
args.append(f"-DACTS_LOG_FAILURE_THRESHOLD={log_failure_threshold}")
if spec.satisfies("@19.4.0:"):
args.append("-DACTS_ENABLE_LOG_FAILURE_THRESHOLD=ON")
@@ -427,11 +431,11 @@ def plugin_cmake_variant(plugin_name, spack_variant):
if "+cuda" in spec:
cuda_arch = spec.variants["cuda_arch"].value
if cuda_arch != "none":
args.append("-DCUDA_FLAGS=-arch=sm_{0}".format(cuda_arch[0]))
args.append(f"-DCUDA_FLAGS=-arch=sm_{cuda_arch[0]}")
if "+python" in spec:
python = spec["python"].command.path
args.append("-DPython_EXECUTABLE={0}".format(python))
args.append(f"-DPython_EXECUTABLE={python}")
args.append(self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"))
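
The closure-based helpers in the hunks above keep the mapping from Spack variants to CMake cache entries in one place. A minimal, self-contained sketch of the same pattern (the Demo package and its variants are hypothetical, not taken from the Acts recipe):

from spack.package import *


class Demo(CMakePackage):
    """Hypothetical package illustrating variant-to-CMake-flag helper closures."""

    homepage = "https://example.org"
    url = "https://example.org/demo-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum

    variant("examples", default=False, description="Build the examples")
    variant("fatras", default=False, description="Build the FATRAS module")

    def cmake_args(self):
        spec = self.spec

        def build_flag(label, variant):
            # spec.satisfies("+<variant>") yields a Python bool; CMake accepts
            # the True/False spelling for boolean cache entries
            return f"-DDEMO_BUILD_{label}={spec.satisfies('+' + variant)}"

        return [build_flag("EXAMPLES", "examples"), build_flag("FATRAS", "fatras")]

For one-to-one mappings, Spack's define_from_variant (used on the last line of the hunk above) is the shorter spelling; the closures pay off when a project prefixes its options in a nonstandard way.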


@@ -0,0 +1,4 @@
#!/bin/sh
cd ${0%/*} || exit 1 # Run from this directory
applications/Allwmake $targetType $*


@@ -0,0 +1,5 @@
#!/bin/sh
cd ${0%/*} || exit 1 # Run from this directory
wmake libso solvers/additiveFoam/movingHeatSource
wmake solvers/additiveFoam


@@ -0,0 +1,59 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.package import *
from spack.pkg.builtin.openfoam import add_extra_files
class Additivefoam(Package):
"""AdditiveFOAM is a heat and mass transfer software for Additive Manufacturing (AM)"""
homepage = "https://github.com/ORNL/AdditiveFOAM"
git = "https://github.com/ORNL/AdditiveFOAM.git"
url = "https://github.com/ORNL/AdditiveFOAM/archive/1.0.0.tar.gz"
maintainers("streeve", "colemanjs", "gknapp1")
tags = ["ecp"]
version("main", branch="main")
version("1.0.0", sha256="abbdf1b0230cd2f26f526be76e973f508978611f404fe8ec4ecdd7d5df88724c")
depends_on("openfoam-org@10")
common = ["spack-derived-Allwmake"]
assets = ["applications/Allwmake", "Allwmake"]
build_script = "./spack-derived-Allwmake"
phases = ["configure", "build", "install"]
def patch(self):
add_extra_files(self, self.common, self.assets)
def configure(self, spec, prefix):
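# nothing to configure: the OpenFOAM-style Allwmake script drives the build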
pass
def build(self, spec, prefix):
"""Build with Allwmake script, wrapped to source environment first."""
args = []
if self.parallel: # Parallel build? - pass via environment
os.environ["WM_NCOMPPROCS"] = str(make_jobs)
builder = Executable(self.build_script)
builder(*args)
def install(self, spec, prefix):
"""Install under the prefix directory"""
for f in ["README.md", "LICENSE"]:
if os.path.isfile(f):
install(f, join_path(self.prefix, f))
dirs = ["tutorials", "applications"]
for d in dirs:
if os.path.isdir(d):
install_tree(d, join_path(self.prefix, d), symlinks=True)


@@ -36,8 +36,8 @@ class Adiak(CMakePackage):
def cmake_args(self):
args = []
if self.spec.satisfies("+mpi"):
args.append("-DMPI_CXX_COMPILER=%s" % self.spec["mpi"].mpicxx)
args.append("-DMPI_C_COMPILER=%s" % self.spec["mpi"].mpicc)
args.append(f"-DMPI_CXX_COMPILER={self.spec['mpi'].mpicxx}")
args.append(f"-DMPI_C_COMPILER={self.spec['mpi'].mpicc}")
args.append("-DENABLE_MPI=ON")
else:
args.append("-DENABLE_MPI=OFF")


@@ -119,7 +119,7 @@ def validate(self, spec):
def with_or_without_hdf5(self, activated):
if activated:
return "--with-phdf5={0}".format(self.spec["hdf5"].prefix)
return f"--with-phdf5={self.spec['hdf5'].prefix}"
return "--without-phdf5"
@@ -134,7 +134,7 @@ def configure_args(self):
extra_args = [
# required, otherwise building its python bindings will fail
"CFLAGS={0}".format(self.compiler.cc_pic_flag)
f"CFLAGS={self.compiler.cc_pic_flag}"
]
extra_args += self.enable_or_disable("shared")
@@ -148,7 +148,7 @@ def configure_args(self):
extra_args += self.with_or_without("infiniband")
if "+zlib" in spec:
extra_args.append("--with-zlib={0}".format(spec["zlib-api"].prefix))
extra_args.append(f"--with-zlib={spec['zlib-api'].prefix}")
else:
extra_args.append("--without-zlib")


@@ -109,19 +109,19 @@ class Adios2(CMakePackage, CudaPackage):
depends_on("cmake@3.12.0:", type="build")
for _platform in ["linux", "darwin", "cray"]:
depends_on("pkgconfig", type="build", when="platform=%s" % _platform)
depends_on("pkgconfig", type="build", when=f"platform={_platform}")
variant(
"pic",
default=False,
description="Build pic-enabled static libraries",
when="platform=%s" % _platform,
when=f"platform={_platform}",
)
# libffi and libfabric are not currently supported on Windows
# see Paraview's superbuild handling of libfabric at
# https://gitlab.kitware.com/paraview/paraview-superbuild/-/blob/master/projects/adios2.cmake#L3
depends_on("libffi", when="+sst platform=%s" % _platform) # optional in DILL
depends_on("libffi", when=f"+sst platform={_platform}") # optional in DILL
depends_on(
"libfabric@1.6.0:", when="+sst platform=%s" % _platform
"libfabric@1.6.0:", when=f"+sst platform={_platform}"
) # optional in EVPath and SST
# depends_on('bison', when='+sst') # optional in FFS, broken package
# depends_on('flex', when='+sst') # optional in FFS, depends on BISON
@@ -130,6 +130,7 @@ class Adios2(CMakePackage, CudaPackage):
depends_on("libzmq", when="+dataman")
depends_on("dataspaces@1.8.0:", when="+dataspaces")
depends_on("hdf5@:1.12", when="@:2.8 +hdf5")
depends_on("hdf5~mpi", when="+hdf5~mpi")
depends_on("hdf5+mpi", when="+hdf5+mpi")
@@ -240,8 +241,8 @@ def cmake_args(self):
args.extend(["-DCMAKE_Fortran_SUBMODULE_EXT=.smod", "-DCMAKE_Fortran_SUBMODULE_SEP=."])
if "+python" in spec or self.run_tests:
args.append("-DPYTHON_EXECUTABLE:FILEPATH=%s" % spec["python"].command.path)
args.append("-DPython_EXECUTABLE:FILEPATH=%s" % spec["python"].command.path)
args.append(f"-DPYTHON_EXECUTABLE:FILEPATH={spec['python'].command.path}")
args.append(f"-DPython_EXECUTABLE:FILEPATH={spec['python'].command.path}")
return args


@@ -83,12 +83,12 @@ def configure_args(self):
configure_args = []
if "+boost" in spec:
configure_args.append("--with-boost={0}".format(spec["boost"].prefix))
configure_args.append(f"--with-boost={spec['boost'].prefix}")
else:
configure_args.append("--with-boost=no")
if "+openmp" in spec:
configure_args.append("--with-openmp-flag={0}".format(self.compiler.openmp_flag))
configure_args.append(f"--with-openmp-flag={self.compiler.openmp_flag}")
configure_args.extend(
self.enable_or_disable("advanced-branching", variant="advanced_branching")


@@ -24,6 +24,7 @@ class Amrex(CMakePackage, CudaPackage, ROCmPackage):
maintainers("WeiqunZhang", "asalmgren", "etpalmer63")
version("develop", branch="development")
version("23.10", sha256="3c85aa0ad5f96303e797960a6e0aa37c427f6483f39cdd61dbc2f7ca16357714")
version("23.09", sha256="1a539c2628041b17ad910afd9270332060251c8e346b1482764fdb87a4f25053")
version("23.08", sha256="a83b7249d65ad8b6ac1881377e5f814b6db8ed8410ea5562b8ae9d4ed1f37c29")
version("23.07", sha256="4edb991da51bcaad040f852e42c82834d8605301aa7eeb01cd1512d389a58d90")


@@ -32,6 +32,7 @@ class AprUtil(AutotoolsPackage):
depends_on("postgresql", when="+pgsql")
depends_on("sqlite", when="+sqlite")
depends_on("unixodbc", when="+odbc")
depends_on("pkg-config", type="build", when="+crypto ^openssl~shared")
@property
def libs(self):
@@ -85,6 +86,13 @@ def configure_args(self):
else:
args.append("--without-odbc")
if spec.satisfies("+crypto ^openssl~shared"):
# Need pkg-config to get zlib and -ldl flags
# (see https://dev.apr.apache.narkive.com/pNnO9F1S/configure-bug-openssl)
pkgconf = which("pkg-config")
ssl_libs = pkgconf("--libs", "--static", "openssl", output=str)
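# output=str makes Spack's Executable wrapper return the captured stdout
# so the flags can be appended to LIBS below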
args.append(f"LIBS={ssl_libs}")
return args
def check(self):


@@ -17,6 +17,7 @@ class Assimp(CMakePackage):
maintainers("wdconinc")
version("master", branch="master")
version("5.3.1", sha256="a07666be71afe1ad4bc008c2336b7c688aca391271188eb9108d0c6db1be53f1")
version("5.2.5", sha256="b5219e63ae31d895d60d98001ee5bb809fb2c7b2de1e7f78ceeb600063641e1a")
version("5.2.4", sha256="6a4ff75dc727821f75ef529cea1c4fc0a7b5fc2e0a0b2ff2f6b7993fe6cb54ba")
version("5.2.3", sha256="b20fc41af171f6d8f1f45d4621f18e6934ab7264e71c37cd72fd9832509af2a8")

Some files were not shown because too many files have changed in this diff Show More