Compare commits

...

344 Commits

Author SHA1 Message Date
Todd Gamblin
6cbe4e1311 spec: add {install_status} format attribute
`{install_status}` is handled in a funny way in `Spec.tree()`, and it can't be used in
other useful places like `Spec.format()`.

- [x] Make `{install_status}` a format attribute like most other things we want to print
      about specs.

- [x] Refactor whitespace handling in `Spec.format()` to only strip whitespace that wasn't
      in the original format string (i.e. that was added by our own attributes)
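
A minimal sketch of what this enables, run inside a Spack Python session (`spack python`); the spec and format string are illustrative:

```
# Sketch: with {install_status} promoted to a regular format attribute,
# it composes with any other Spec.format() fields.
import spack.spec

spec = spack.spec.Spec("zlib").concretized()

# Previously only Spec.tree() could render install status; now any
# format string can include it.
print(spec.format("{install_status} {name}{@version}{/hash:7}"))
```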
2024-02-16 22:46:58 -08:00
dependabot[bot]
b88c4792a7 build(deps): bump clingo from 5.6.2 to 5.7.1 in /.github/workflows/style (#42732)
Bumps [clingo](https://github.com/potassco/clingo) from 5.6.2 to 5.7.1.
- [Release notes](https://github.com/potassco/clingo/releases)
- [Changelog](https://github.com/potassco/clingo/blob/master/CHANGES.md)
- [Commits](https://github.com/potassco/clingo/compare/v5.6.2...v5.7.1)

---
updated-dependencies:
- dependency-name: clingo
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 22:24:10 -08:00
dependabot[bot]
ac92e94b00 build(deps): bump pytest from 8.0.0 to 8.0.1 in /lib/spack/docs (#42733)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.0.0 to 8.0.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.0.0...8.0.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 22:23:13 -08:00
Luc Berger
4a01865f7b Kokkos Ecosystem: update for release 4.2.01 (#42711)
* Kokkos Ecosystem: update for release 4.2.01

Will rebase this on top of develop once Kokkos Core PR merges.

* Kokkos Ecosystem: update license statement to reflect current license
2024-02-16 17:28:29 -07:00
Sebastian Pipping
025165e22e expat: Add latest release 2.6.0 with security fixes (#42680)
* expat: Add latest release 2.6.0 with security fixes

* expat: Deprecate vulnerable 2.5.0

* expat: Add past CVEs and where they were fixed
2024-02-16 17:23:46 -07:00
SXS Bot
cda9bc3e1d spectre: add v2024.02.05 (#42496)
* spectre: add v2024.02.05

* [@spackbot] updating style on behalf of sxs-bot

---------

Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-02-16 15:49:06 -07:00
Lydéric Debusschère
caf21dda42 rust: enable vendor (#42365)
* rust: enable vendor

* rust: modify vendor description; move the call of variant

* rust: fix style

* rust: typo

* rust: remove variant 'vendor' to make vendoring the default functionality

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2024-02-16 14:13:44 -08:00
HELICS-bot
e1779a2884 helics: Add version 3.5.0 (#42572)
* helics: Add version 3.5.0

* helics: define CMAKE_CXX_STANDARD=20 when GCC>=13 is used to compile

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Ryan Mast <mast9@llnl.gov>
2024-02-16 14:12:50 -08:00
Adam J. Stewart
f55a018fd9 py-torchmetrics: add v1.3.1 (#42638) 2024-02-16 14:00:43 -08:00
dependabot[bot]
f5964e1dde build(deps): bump black in /.github/workflows/style (#42631)
Bumps [black](https://github.com/psf/black) from 24.1.1 to 24.2.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.1...24.2.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 13:52:38 -08:00
dependabot[bot]
23e0fe2e21 build(deps): bump black from 24.1.1 to 24.2.0 in /lib/spack/docs (#42629)
Bumps [black](https://github.com/psf/black) from 24.1.1 to 24.2.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.1...24.2.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 13:51:50 -08:00
dependabot[bot]
44438e6171 build(deps): bump python-levenshtein in /lib/spack/docs (#42630)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.24.0 to 0.25.0.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.24.0...v0.25.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 13:51:12 -08:00
Martin Lang
af8e000a93 bigdft-futile: new versions 1.9.3, 1.9.4 (#42646) 2024-02-16 13:41:18 -08:00
Adam J. Stewart
375b82a593 py-black: add v24.2.0 (#42639) 2024-02-16 13:39:16 -08:00
Adam J. Stewart
2030e2b089 py-jsonargparse: add v4.27.5 (#42640) 2024-02-16 13:38:17 -08:00
Carlos Bederián
34aba94148 openmpi: add ucc to fabrics (#41889) 2024-02-16 22:37:00 +01:00
Nicolas Morales
43c909e19c Update maintainers for Kokkos and add Kokkos 4.2.1 (#42690)
* update kokkos spack package maintainers

* add Kokkos 4.2.01

* Update var/spack/repos/builtin/packages/kokkos/package.py

Co-authored-by: Luc Berger <lberge@sandia.gov>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
Co-authored-by: Luc Berger <lberge@sandia.gov>
2024-02-16 11:12:36 -07:00
Martin Lang
7c011d304f nfft: new versions 3.5.1, 3.5.2, 3.5.3 (#42645) 2024-02-16 09:58:03 -08:00
Martin Lang
1546fc7e5f dftbplus: new versions 22.2, 23.1 (#42647) 2024-02-16 09:55:53 -08:00
Rob Falgout
75a134f085 Update package.py for hypre release 2.31.0 (#42689) 2024-02-16 11:46:29 -06:00
Alex Richert
d0c4675a9b Add aocc support to ESMF (#42708)
* Add aocc support to ESMF

* Update package.py
2024-02-16 09:33:47 -08:00
WuK
0507c3c63d add new CUTLASS versions (#42715) 2024-02-16 09:23:50 -08:00
dependabot[bot]
59caa93571 build(deps): bump dorny/paths-filter from 3.0.0 to 3.0.1 (#42710)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 3.0.0 to 3.0.1.
- [Release notes](https://github.com/dorny/paths-filter/releases)
- [Changelog](https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md)
- [Commits](0bc4621a31...ebc4d7e9eb)

---
updated-dependencies:
- dependency-name: dorny/paths-filter
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 09:16:35 -08:00
Arne Becker
b4b53a9a9f perl-datetime-format-iso8601: New package (#42719)
Adds DateTime::Format::ISO8601
2024-02-16 09:15:54 -08:00
Todd Gamblin
be1cfffa45 clingo: add version 5.7.0 (#42707)
5.7.0 was just released. It includes a number of changes requested and/or
upstreamed by Spack developers, e.g.:

* API for accessing optimization priorities: https://github.com/potassco/clingo/pull/406
* Hash optimization: https://github.com/potassco/clingo/pull/441
* Contributing Guide: https://github.com/potassco/clingo/pull/465
* Hiding more ELF symbols:
  * https://github.com/potassco/clingo/pull/447
  * https://github.com/potassco/clingo/pull/449
2024-02-16 18:14:19 +01:00
Victor Lopez Herrero
75b7109222 dlb: add v3.4 (#42722) 2024-02-16 09:13:23 -08:00
Mikael Simberg
c56cf8c0d2 Add support for clang with OpenMP and other minor changes to oneapi build system (#42717)
* Add support for clang in oneapi packages with OpenMP

* Add fallback search for libomp in OneApi package with OpenMP threading

* Add requires for the compiler when using threads=openmp in intel-oneapi-mkl

* Cosmetic changes to messages in oneapi.py

* Update error message in oneapi.py

Co-authored-by: Robert Cohn <rscohn2@gmail.com>

* Update another error message in oneapi.py

Co-authored-by: Robert Cohn <rscohn2@gmail.com>

* Inline helper error function in oneapi.py

* Update one more error message in oneapi.py

* Wrap long line in oneapi.py

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
2024-02-16 11:39:31 -05:00
Robert Cohn
5c3df6e8ca [intel-oneapi-mkl] provide omp lib for threads=openmp (#42653) 2024-02-16 08:48:42 +01:00
Tim Fuller
79087d08d9 allow packages to request no submodules be updated (#40409)
* allow packages to request no submodules be updated when self.submodules is a
  callable function

* Extend the test added in Allow more fine-grained control over what submodules are
  updated: part 2 #27293 to include this case

* Update the type signature for the submodules arg of version() in directives.py
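
A short sketch of the resulting pattern (the package name, URL, and version are hypothetical); returning an empty list from the callable now means no submodules are updated:

```
from spack.package import *

def no_submodules(package):
    return []  # request that no submodules be fetched or updated

class MyPackage(Package):  # hypothetical package
    git = "https://example.com/my-package.git"
    version("main", branch="main", submodules=no_submodules)
```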

---------

Co-authored-by: tjfulle <tjfulle@users.noreply.github.com>
2024-02-15 22:36:37 -08:00
Victor Brunini
d31e503e5b develop: Add -b/--build-directory option to set build_directory package attribute (#39606)
* develop: Add -b/--build-directory option to set build_directory package attribute.

* Update docs

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
Co-authored-by: vbrunini <vbrunini@users.noreply.github.com>
2024-02-16 06:30:58 +00:00
Eric Berquist
55b62c2168 SST: only run autoreconf for versions from Git branches (#42712) 2024-02-15 22:27:07 -06:00
Scott Wittenburg
6c3511ee1d Fix spack --profile|--pdb <cmd> (#42662) 2024-02-15 15:15:40 -07:00
Seth R. Johnson
d19fa74909 celeritas: new version 0.4.2 (#42702) 2024-02-15 16:41:10 +00:00
Jemma Stachelek
a2ad2d1c9f docs: fix typo (#42688) 2024-02-15 11:21:51 +01:00
Harmen Stoppels
55863bd680 compilers: fixup order of arguments to satisfies (#42682) 2024-02-15 10:21:06 +00:00
Harmen Stoppels
7d4dcd52d9 elpa: fix checksum (#42674) 2024-02-14 11:56:27 +01:00
Henrik Stooss
5e4e72ddd2 veclibfort: explicitly add platform=darwin as requirement (#42664) 2024-02-14 11:18:04 +01:00
John W. Parent
447c48e2fd VTK: limit patches to v8 (#42505)
* VTK: limit patches to v8

* Finer scope on patch version applicability
2024-02-14 03:36:57 -06:00
Sreenivasa Murthy Kolam
3be4f4db86 Deprecate ROCm 5.1.0 till 5.2.3 (#41794) 2024-02-14 09:03:00 +01:00
John Pennycook
ca97a0fefb cmake: Enable compilation database generation (#42353)
* cmake: Enable CMAKE_EXPORT_COMPILE_COMMANDS

Enabling this option causes CMake to generate a compile_commands.json file
containing a compilation database that can be used to drive third-party tools.

CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5

Exporting compilation databases is only supported for Makefile and Ninja
generators, so check these conditions as well.

CMAKE_EXPORT_COMPILE_COMMANDS is only enabled in supported configurations
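
Roughly what the conditional logic looks like in a CMake-based build (a sketch; the attribute and method names follow Spack's CMake build system loosely and should be treated as assumptions, not the real change):

```
def cmake_args(self):
    args = []
    # compile_commands.json needs CMake >= 3.5 and a Makefile or Ninja
    # generator, so only enable it in supported configurations
    if self.spec.satisfies("^cmake@3.5:") and self.generator in (
        "Unix Makefiles",
        "Ninja",
    ):
        args.append(self.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
    return args
```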
2024-02-13 16:47:40 -07:00
Jon Rood
59f56327fe Remove boost as dependency for trilinos+stk (#39556)
* Remove boost as dependency for trilinos+stk.

* Formatting.

* Put bounds on STK requirement of Boost.
2024-02-13 13:28:25 -07:00
Victor Brunini
e4c871489a boost: Add patch to workaround mpl compiler error with gcc 8.3 and c++17. (#39144)
Works around this issue: https://github.com/boostorg/mpl/issues/44

Co-authored-by: psakievich <psakiev@sandia.gov>
2024-02-13 12:23:13 -07:00
Nichols A. Romero
c048101d90 ELPA package: add patched version from Nov 2023 (#42539) 2024-02-13 10:47:28 -08:00
Alex Richert
e0304bf509 pdt: add aocc support (#42634) 2024-02-13 11:03:46 -07:00
Maciej Wójcik
0e2a9fe26a py-py-tes: add new package (#42531)
* py-py-tes: add new package

* py-py-tes: add constraint on Python
2024-02-13 10:31:11 -07:00
George Young
d1e01d5646 khmer: new package @2.1.1 (#42450)
* khmer: new package @2.1.1

* rationalising patch

* adding dep, modifying patch

* Update var/spack/repos/builtin/packages/khmer/package.py

Indeed!

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:02:44 -07:00
George Young
7b04910f84 possvm: new package @1.2 (#42516)
* possvm: new package @1.2

* black!

* appeasing flake8

* Updating commit ID

* Adding graphing dep

* Update var/spack/repos/builtin/packages/py-markov-clustering/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 09:49:36 -07:00
Maciej Wójcik
0664a2cdb2 py-pysftp: add new package (#42525)
* py-pysftp: add new package

* py-pysftp: correct version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 09:49:14 -07:00
Jeremy Fix
fc38fe1c69 Adds the spack recipe for building the pynpm python package (#42582)
* Adds the spack recipe for building the pynpm python package

* fix license header

* Update var/spack/repos/builtin/packages/py-pynpm/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 09:42:11 -07:00
George Young
29cb1d0ce0 cellpose: new package @2.2.3 (#42403)
* py-fastremap: new package @1.14.1

* py-pyqtgraph: new package @0.13.3

* py-roifile: new package @2024.1.10

* py-superqt: new package @0.6.1

* cellpose: new package @2.2.3

* Appeasing black ...

* Dropping the cuda variant

* Update var/spack/repos/builtin/packages/py-fastremap/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Adding python version dependency

* Update var/spack/repos/builtin/packages/py-pyqtgraph/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pyqtgraph/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-roifile/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-superqt/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Adding missing dep & conflict

* Appeasing black

* Adding missing py- prefix for dep

* Switching over to py-pyqt6

* Switching over to py-pyqt6

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:08:41 -06:00
Dani
ce777e3c89 py-biobb-structure-checking: new package (#42580)
* new builtin package: py-biobb-structure-checking

* Update var/spack/repos/builtin/packages/py-biobb-structure-checking/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* dependencies are also build dependencies

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:05:37 -06:00
Dani
bd44cedd0d new builtin package: py-biobb-io (#42451)
* new builtin package: py-biobb-io

* Update var/spack/repos/builtin/packages/py-biobb-io/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:03:42 -06:00
Dani
316a9a5d11 py-biobb-gromacs: add new package (#42579)
* new builtin package: py-biobb-gromacs

* Update var/spack/repos/builtin/packages/py-biobb-gromacs/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* added setuptools dependency

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:03:06 -06:00
Harmen Stoppels
4a04989bbb PythonExtension.add_files_to_view: link non-executable/non-shebang regular files (#42641) 2024-02-13 12:55:37 +01:00
Jordan Ogas
2c4b529896 squashfuse: add versions (#42589) 2024-02-13 09:03:49 +01:00
Henri Menke
e37c099ddb clfft: workaround compiler error (#42519) 2024-02-13 00:10:51 -07:00
George Young
4d7898a669 psipred: new package @4.02 (#42529)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-12 18:20:33 -08:00
Sreenivasa Murthy Kolam
91b0528abf ROCm packages and dependents: add 6.0.2 release (#42544)
* Bump up the version for rocm-6.0.2 release
* extend the patches that were created for apps to the rocm-6.0.0 and rocm-6.0.2 releases
  (but apply the hipfft patch only for 6.0.0)
2024-02-12 17:54:33 -08:00
Zack Galbreath
8ee3073350 More updates for GitLab CI memory requests (#42425)
* gitlab: remove requests for unreferenced packages

The packages removed in this commit are not built by any of
our current GitLab CI stacks.

* gitlab: update memory requests for "huge" packages

* gitlab: reduce memory requests for overprovisioned packages

* gitlab: more memory for py-torch (again)

* gitlab: update memory but keep CPU the same
2024-02-12 16:41:56 -06:00
Massimiliano Culpo
cb3c014a43 audit: detect self-referential depends_on (#42456) 2024-02-12 21:56:06 +01:00
Victoria Cherkas
2a01e9679a Adds latest releases of eccodes/eckit/metkit (#42618) 2024-02-12 13:41:15 -07:00
Harmen Stoppels
519deac544 Fix multiple issues with Python in views (#42601)
This fixes bugs, performance issues, and removes no longer necessary code.

Short version:

1. Creating views from Python extensions would error if the Spack `opt` dir itself was in some symlinked directory. Use of `realpath` would expand those, and keying into `merge_map` would fail.
2. Creating views from Python extensions (and Python itself, potentially) could fail if the `bin/` dir contains symlinks pointing outside the package prefix -- Spack keyed into `merge_map[target_of_symlink]` incorrectly.
3. In the `python` package the `remove_files_from_view` function was broken after a breaking API change two years ago (#24355). However, the entire function body was redundant anyways, so solved it by removing it.
4. Notions of "global view" (i.e. python extensions being linked into Python's own prefix instead of into a view) are completely outdated, and removed. It used to be supported but was removed years ago.
5. Views for Python extension would _always_ copy non-symlinks in `./bin/*`, which is a big mistake, since all we care about is rewriting shebangs of scripts; we don't want to copy binaries. Now we first check if the file is executable, and then read two bytes to check if it has a shebang, and only if so, copy the entire file and patch up shebangs.

The bug fixes for (1) and (2) basically consist of getting rid of `realpath` entirely, and instead simply keep track of file identifiers of files that are copied/modified in the view. Only after patching up regular files do we iterate over symlinks and check if they target one of those. If so, retarget it to the modified file in the view.
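
A minimal sketch of the check described in point 5 (the helper name is illustrative, not Spack's actual API):

```
import os
import stat

def needs_shebang_patching(path: str) -> bool:
    """Copy and patch only executable regular files that start with '#!'."""
    st = os.lstat(path)
    if not stat.S_ISREG(st.st_mode) or not (st.st_mode & 0o111):
        return False  # symlinks, dirs, non-executables: don't copy
    with open(path, "rb") as f:
        return f.read(2) == b"#!"  # read two bytes to detect a shebang
```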
2024-02-12 19:52:52 +01:00
John Biddiscombe
c33a8dc223 h5hut: fix to work with latest hdf5 (H5_USE_110_API) (#42607) 2024-02-12 04:33:39 -07:00
Richard Berger
742e2fc7e4 caliper: allow newer papi to be used (#42501) 2024-02-12 11:53:57 +01:00
Jean Luca Bez
e90b616428 pdc: add v0.4 (#42508) 2024-02-12 11:52:54 +01:00
Brian Vanderwende
3d037c5150 nco: add v5.1.9 (#42512) 2024-02-12 11:46:07 +01:00
Ivan Maidanski
fd60e97784 bdw-gc: add v8.2.6 (#42524) 2024-02-12 11:38:12 +01:00
Thomas Madlener
c2cda6bc48 lcio: add v2.21 (#42514) 2024-02-12 11:37:10 +01:00
Thomas Madlener
1cd95a4bb7 podio: add 0.99 pre-release version and deprecate all older versions (#42526) 2024-02-12 11:36:30 +01:00
James Beal
727eaf3c82 samtools: add v1.19.2 (#42550)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-12 11:09:05 +01:00
Thomas Madlener
d014671bcb edm4hep: add v0.10.5, deprecate broken v0.10.4 (#42561) 2024-02-12 11:01:44 +01:00
Caetano Melone
4edb073a20 litestream: add new package (#42565) 2024-02-12 10:58:12 +01:00
G-Ragghianti
5d2b9514db Fixed papi release tar hash due to rebuild of tar file (#42567) 2024-02-12 10:57:12 +01:00
Ken Raffenetti
2491855678 mpich: Fix +vci variant for newer releases (#42570)
--with-ch4-max-vcis=default is no longer accepted by MPICH configure
since the 4.1 release. Just omit the option from the +vci variant, since
configure will select the default value in its absence.
2024-02-12 10:55:57 +01:00
Mark Grondona
b23038db53 flux-core: drop czmq,jsonschema requirements for recent versions (#42560)
Problem: Older versions of the flux-core package require czmq and
jsonschema, but these dependencies have been dropped in recent
versions.

Add `when=` arguments to drop these requirements for the appropriate
versions of flux-core.
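
A sketch of the package.py pattern (the version bound below is illustrative, not the real cutoff from this change):

```
from spack.package import *

class FluxCore(AutotoolsPackage):
    # older releases still need these dependencies; newer ones do not
    depends_on("czmq", when="@:0.57")
    depends_on("py-jsonschema", when="@:0.57", type=("build", "run"))
```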
2024-02-12 10:55:00 +01:00
Julien Cortial
6d68dcf13c libuv: update compiler requirements (#42576) 2024-02-12 10:54:10 +01:00
Adam J. Stewart
fbf6db035b py-lightning: add v2.2.0 (#42577) 2024-02-12 10:49:17 +01:00
Richard Berger
1a43fc1e62 lammps: add v20240207 (#42587)
* lammps: correct license

LAMMPS has always been GPL-2.0 only. The clarification was made here:
9a8ac23663
2024-02-12 10:46:05 +01:00
Mikael Simberg
8210853276 pika: add v0.22.2 (#42617) 2024-02-12 02:38:50 -07:00
Sebastian Ehlert
e8bf6ab352 orca: added new versions (#38822) 2024-02-12 10:20:36 +01:00
Sinan
0aa91b99ed freexl: add v2.0.0 (#42574)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-02-12 10:16:31 +01:00
kinagaki-fj
d93b035f14 xmlf90: add version and fix for fujitsu compiler (#42305) 2024-02-12 10:14:11 +01:00
kinagaki-fj
adb0757f72 blitz: fix for fujitsu compiler (#42307)
Co-authored-by: m-shunji <m.shunji@fujitsu.com>
2024-02-12 10:11:26 +01:00
George Malerbo
2369a8f4e5 raylib: add new package (#42594) 2024-02-12 10:07:33 +01:00
Alec Scott
a6844d26e0 coreutils: add v9.4 (#42596) 2024-02-12 10:03:00 +01:00
Alec Scott
95c663224d curl: add v8.6.0 (#42597) 2024-02-12 10:02:26 +01:00
Brian Vanderwende
6dc01f0d94 Fix for tirpc variable in libdap4 (#42511) 2024-02-12 10:01:53 +01:00
Adam J. Stewart
2d3eeecef8 Sleef: fix build on macOS arm64, add test support (#42583) 2024-02-12 09:59:12 +01:00
Alec Scott
167a168a63 bfs: add v3.1 (#42595) 2024-02-12 09:54:45 +01:00
Elsa Gonsiorowski, PhD
10af235895 lwgrp: add autotools deps when building @main (#42564) 2024-02-12 09:40:52 +01:00
Cyrus Harrison
5d6dec5a5c conduit: add v0.9.1 (#42510) 2024-02-12 09:30:04 +01:00
Massimiliano Culpo
6ae0696e2f sirius: fix self-referential dependencies (#42552) 2024-02-12 09:21:14 +01:00
Massimiliano Culpo
9688e91a6b spla: fix self-referential dependencies (#42553) 2024-02-12 09:15:27 +01:00
Alec Scott
c026943c8e glab: add v1.36.0 (#42610) 2024-02-12 09:14:20 +01:00
Alec Scott
662ab2d4c5 xz: add v5.4.6 & v5.4.5 (#42612) 2024-02-12 09:13:49 +01:00
Alec Scott
722b00bbfb go: add v1.22.0 (#42611) 2024-02-12 09:12:31 +01:00
Alberto Invernizzi
6dbb56ba36 nvpl-lapack: fix versioning and add missing dependency (#42599)
* fix wrong versioning
use the version from the docs and not the one extrapolated from the path (i.e. 0.2.0.1)

* nvpl-lapack requires nvpl-blas
propagate matching variants to nvpl-blas dependency
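
A sketch of the propagation pattern (the variant name is an assumption for illustration):

```
# in nvpl-lapack's package.py: forward the matching variant so the
# BLAS and LAPACK libraries agree on the interface
depends_on("nvpl-blas+ilp64", when="+ilp64")
depends_on("nvpl-blas~ilp64", when="~ilp64")
```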

Co-authored-by: albestro <albestro@users.noreply.github.com>
2024-02-12 09:11:47 +01:00
Dave Keeshan
2e5bdd2679 yosys: add v0.38 (#42616) 2024-02-12 09:08:03 +01:00
Maciej Wójcik
97ec167452 py-immutables: fix building, add new versions (#42530)
* py-immutables: fix building, add new versions

* Explain constraint

* py-immutables: Wrap line
2024-02-11 16:07:54 -07:00
Garth N. Wells
2b8dcc0f57 fenicsx: deprecate older versions (#42613)
* Deprecate FEniCSx packages

* Format files

* Simplification
2024-02-11 15:57:39 -07:00
Maciej Wójcik
6312658888 py-configargparse: add new versions (#42533)
* py-configargparse: add new versions

* py-configargparse: Remove unnecessary constraint
2024-02-11 15:52:37 -07:00
Mikhail Titov
213818ffb5 Update package versions: RADICAL-Cybertools (RE, RG, RP, RS, RU) (#41997)
* rct: update packages (RE, RG, RP, RS, RU) with new versions

* rct: updated style

* rct: 1.46 release

* rct: RP 1.46.1 (hotfix applied)

* rct: deprecated old versions

* Update var/spack/repos/builtin/packages/py-radical-entk/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-radical-pilot/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-radical-saga/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* rct: brought back dependencies for deprecated versions

* rct: RP hotfix release 1.46.2

* rct: new stack release 1.47.0

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-11 15:09:20 -06:00
Alex Leute
aed5d1a88d py-gpy: added +plotting variant (#42588)
Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-02-11 14:58:23 -06:00
Stephen Hudson
8b94128625 libEnsemble: add v1.2.0 (#42591) 2024-02-11 14:57:04 -06:00
Maciej Wójcik
4d91fbdf0f py-versioneer-518: add a workaround (#42534) 2024-02-11 14:48:29 -06:00
George Young
d7aa9d38fa py-deeptools: add 3.5.3 (#39951)
* py-deeptools: add 3.5.3

* switching to PyPI

* removing patch for tests

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-11 14:31:46 -06:00
Eric Berquist
b3369ac669 Add pre-commit 3.6.0 (#42404)
* Add pre-commit 3.6.0

* pre-commit 3.6.0 drops support for Python <3.9
2024-02-11 14:29:26 -06:00
George Young
e68fde6f4e py-ucsf-pyem: updating to commit e92cd4d (#42428)
* py-ucsf-pyem: updating to commit e92cd4d

* py-pyfftw: updating to @0.13:1

* py-ucsf-pyem: correcting install of scripts

* Upper limits

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Upper limits

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* updates from github review

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-11 13:01:58 -06:00
Dani
f0e49a54c0 new builtin package: py-simpletraj (#42460)
* new builtin package: py-simpletraj

* long line split in two for the style test

* removed trailing whitespace for the style test
2024-02-11 12:56:31 -06:00
Robert Cohn
2c67571726 [oneapi]: make headers match oneapi vars.sh (#42614)
* [oneapi]: make headers match oneapi vars.sh

* update

* update
2024-02-11 08:23:53 -08:00
Maciej Wójcik
fae6d3780f gromacs: add new version (#42609) 2024-02-10 08:19:12 -07:00
Howard Pritchard
34ba8e9527 openmpi: add 5.0.2 (#42568)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-02-10 14:36:18 +01:00
Massimiliano Culpo
686d1bc1ea mfem: fix self-referential dependencies (#42487) 2024-02-10 13:08:13 +01:00
snehring
9b42f9ab18 viennarna package: add v2.6.4 (#42606) 2024-02-10 02:38:03 -07:00
Alec Scott
1d508e1be3 fzf package: add v0.46.1 (#42603)
* fzf: add v0.46.1
* Standardize vim plugin install
2024-02-09 16:29:20 -08:00
Alec Scott
41f735a4ee gh package: add v2.43.1 (#42604)
* gh: add v2.43.1
* gh: fix go dependency versions
2024-02-09 16:28:06 -08:00
James Taliaferro
e9e6eb613b libsixel: add new package (#40031)
* add new package libsixel

* reorder to group variants and dependencies together

* Switch to using source from fork

The original developer of libsixel, Hayaki Saito (@saitoha), disappeared
in early 2020 and has not updated the repository or been seen since.
However, a fork of the project has been created at libsixel/libsixel,
and that fork has been getting much more regular updates. In this commit
I switch the package to using the better-maintained, now apparently
canonical fork of the project.

That project switched the build system from autotools to Meson, so much
of the build arguments code needed to be rewritten. The newest official
release also contained an error in meson.build which would break on
newer versions of Meson. That error only applied to the libjpeg and
libpng variants, though, so I switched the default to use libgd which
has broader format support anyway.

* blacken

* broke description into multiple lines

* allow libjpeg, etc to be loaded automatically
2024-02-09 12:09:10 -08:00
Chris Marsh
6dd19594ab Resolve unzip build failure with %oneapi (#42593)
* apply patch to all compilers as per spack/issues/42007

* [@spackbot] updating style on behalf of Chrismarsh

* Resolve flake8 style issue from spackbot

* fix flake8 trailing whitespace

---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-02-09 12:04:05 -08:00
Harmen Stoppels
4f0a8fce52 hooks: remove 7 unused hooks (#42575)
These 7 hooks were not used.

- Six of them, related to install phases, were unused after `spack monitor`
  was removed, and the code seems to have bit rotten, as there were reports
  they were not (always?) triggered when they should have been.
- The post environment one was made redundant after spack install for
  environment started following the common code path for generating
  module files in #42147.

It should not be a breaking change to remove, since users cannot define
hooks in extensions; they would have to fork Spack.

If we ever _were_ to make those hooks extendable outside of core Spack,
it would also be better to start with fewer rather than more, because
everything you expose gets relied upon...

Removing those also allows us to rethink what hooks we really need, and
in particular it seems like we need a hook that runs post-install, including
when the spec is inserted into the database.
2024-02-09 20:52:09 +01:00
Massimiliano Culpo
7ff3b17f14 ASP-based solver: fix issue with conditional requirements and trigger conditions (#42566)
The lack of a rule to avoid enforcing requirements on multi-valued variants, when the condition activating the requirement was not met, resulted in multiple optimal solutions. The fix is to prevent imposing a requirement if the `when=` rule activating it is not met.
2024-02-09 20:50:04 +01:00
Samuel Li
f71669175f sperr: add v0.8.1 (#42506) 2024-02-09 18:27:27 +01:00
simonLeary42
27b72b7691 Allow + in module file names (#41999) 2024-02-09 15:50:05 +01:00
Massimiliano Culpo
c5e0270ef0 dd4hep: remove self-referential dependencies (#42483)
This shouldn't be an issue, but express the self-reference
with a conflict.
2024-02-09 15:13:00 +01:00
Harmen Stoppels
c59d2d5b1c docs: overhaul module_file_support.rst (#42585)
The section was highly outdated as it referred to old defaults, and
failed to mention `hide_implicits: true`.

This commit restructures it, moves some deeply nested sections a level
up, and promotes `hide_implicits: true` + `autoload: direct` before
talking about `exclude`.
2024-02-09 13:32:43 +01:00
Paul R. C. Kent
a832d31ccd llvm: add 17.0.5, 17.0.6 (#42571) 2024-02-09 11:29:12 +01:00
Veselin Dobrev
47a8bde4da Add the latest OpenSSL versions: 3.2.1, 3.1.5, 3.0.13 (#42581) 2024-02-09 10:30:01 +01:00
kwryankrattiger
13050a9fb3 CI: Add ability to enable and disable stacks (#42255)
It is useful to enable/disable stacks in order to handle turning
specific stacks on/off based on runner availability, stack stability,
testing requirements, etc.

The disabled stack list takes precedence over the enable stack list. The
assumption is that stacks that are disabled are so due to some
functionality missing or broken for that stack.

The enable stack list implicitly disables all stacks not listed in the
enable list.
2024-02-08 15:26:32 -06:00
Massimiliano Culpo
753e8b53d3 xyce: fix self-referential dependencies (#42557) 2024-02-08 21:04:58 +01:00
Robert Underwood
af49f81ec5 libice fix for older glibc (#42586) 2024-02-08 19:52:37 +01:00
Massimiliano Culpo
db1a7406ca pastix: fix self-referential dependencies (#42546) 2024-02-08 16:27:01 +01:00
Adam J. Stewart
ecc9145d2c spack help --spec: add @= notation (#42584) 2024-02-08 15:57:44 +01:00
Rocco Meli
7090983c67 cp2k: patch for compilation with RelWithDebInfo (#42563) 2024-02-08 15:42:43 +01:00
Sergey Kosukhin
09fdea959f openmpi: add patch fixing MPI_MIN for unsigned long (#32392) 2024-02-08 15:41:46 +01:00
Massimiliano Culpo
e419e4ca93 q-e-sirius: fix self-referential dependencies (#42549) 2024-02-08 12:44:00 +01:00
Massimiliano Culpo
9e8f6e8d54 mpich: fix self-referential dependencies (#42527) 2024-02-08 09:34:54 +01:00
Massimiliano Culpo
753d69856a suite-sparse: fix self-referential dependencies (#42556) 2024-02-08 09:34:26 +01:00
Massimiliano Culpo
6d28caefc7 grib-util: fix self-referential dependencies (#42558) 2024-02-08 09:33:38 +01:00
John W. Parent
7b9eac02ff Windows registry: improve search efficiency & error reporting (#41720)
* Registry queries can fail due to simultaneous access from other
  processes. Wrap some of these accesses and retry when this may
  be the cause (the generated exceptions don't allow pinpointing
  this as the reason, but we add logic to identify cases which
  are definitely unrecoverable, and retry if it is not one of
  these).
* Make recursion optional for most registry search functions; disable
  recursive search in cases where it was originally always recursive.
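
A sketch of the retry idea (illustrative, not Spack's actual code): registry reads can fail transiently under concurrent access, so retry unless the error is clearly unrecoverable.

```
import time
import winreg

def read_value_with_retry(key_path, value_name, retries=3, delay=0.5):
    for attempt in range(retries):
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
                return winreg.QueryValueEx(key, value_name)[0]
        except FileNotFoundError:
            raise  # definitely unrecoverable: the key/value is absent
        except OSError:
            if attempt == retries - 1:
                raise
            time.sleep(delay)  # plausibly transient: back off and retry
```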
2024-02-07 13:26:07 -08:00
Massimiliano Culpo
138f8ba6e2 strumpack: fix self-referential dependencies (#42554) 2024-02-07 19:00:26 +01:00
Jonathon Anderson
046f9e89a1 hpctoolkit: Add dependency on xxhash for @develop (#42513) 2024-02-07 18:05:08 +01:00
Kevin Huck
2cca64d01d apex: add new release, deprecate old options, remove boost (#42538) 2024-02-07 17:17:04 +01:00
Alberto Invernizzi
9aed13adb9 nvpl-blas and nvpl-lapack: new packages for NVidia performance libraries (#41775)
Co-authored-by: Raffaele Solcà <rasolca@cscs.ch>
2024-02-07 17:01:11 +01:00
Massimiliano Culpo
85b6c344bd r: fix self-referential dependencies (#42551) 2024-02-07 16:55:28 +01:00
Massimiliano Culpo
9359c9b9db plink2: fix self-referential dependencies (#42547) 2024-02-07 16:55:09 +01:00
Massimiliano Culpo
d43f62cc5f orc: fix self-referential dependencies (#42542) 2024-02-07 16:54:37 +01:00
Massimiliano Culpo
a30d4612f5 oce: fix self-referential dependencies (#42540) 2024-02-07 16:54:14 +01:00
Massimiliano Culpo
e32009a7e3 openscenegraph: fix self-referential dependencies (#42541) 2024-02-07 16:53:44 +01:00
Massimiliano Culpo
b0f3489d68 py-*: fix self-referential dependencies (#42548) 2024-02-07 16:52:26 +01:00
simonLeary42
9099e4c233 libcatalyst cmake_minimum_required (#42532)
d469357740
2024-02-07 16:10:46 +01:00
Vanessasaurus
a1b895e547 flux-core: add v0.59.0 (#42536)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-02-07 16:10:05 +01:00
Vanessasaurus
df1ff0affa flux-pmix: add v0.5.0 (#42537)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-02-07 16:09:34 +01:00
George Young
95e828f3d8 s4pred: add new package (#42520) 2024-02-07 16:05:14 +01:00
Rocco Meli
a28c6caac0 dbcsr: examples variant (#42543) 2024-02-07 15:54:19 +01:00
Rocco Meli
514260d8cb cp2k+mpi requires dbcsr+mpi (#42545) 2024-02-07 15:35:20 +01:00
Massimiliano Culpo
642ec1918f julia: fix self-referential dependencies (#42486) 2024-02-07 08:46:03 +01:00
Massimiliano Culpo
bdae9f776b netcdf-c: fix self-referential dependencies (#42528) 2024-02-07 08:22:56 +01:00
Massimiliano Culpo
5c86a3cca2 lammps: fix self-referential dependencies (#42521) 2024-02-07 08:22:11 +01:00
Massimiliano Culpo
ea53008604 molgw: fix self-referential dependencies (#42523) 2024-02-06 19:01:44 +01:00
Massimiliano Culpo
01ea8f46e7 ldak: fix self-referential dependencies (#42522) 2024-02-06 17:45:40 +01:00
jalcaraz
ff1e700b56 TAU: Added new test for other variants. (#42503)
Now it tests all GPUs, syscall, and Python.
2024-02-06 08:37:22 -08:00
renjithravindrankannath
5efa723289 mesa: updating llvm dependency for version 23 (#42481) 2024-02-06 14:53:31 +01:00
Massimiliano Culpo
4985f87a52 dbcsr: fix self-referential dependencies (#42482) 2024-02-06 14:41:27 +01:00
Massimiliano Culpo
ae000f963c hpx: remove self-referential dependencies (#42485)
This shouldn't be an issue, but avoid self references
on "^asio".
2024-02-06 13:37:40 +01:00
Olivier Cessenat
0960f691a1 keepassxc: new version 2.7.6 (#42478)
AUTOTYPE is set by default in the 2.7.6 release; it was not previously.
2024-02-05 18:24:42 +01:00
Mosè Giordano
713b19cac7 py-pythran: apply patch to fix compilation with GCC 13 (#42490)
* py-pythran: apply patch to fix compilation with GCC 13

* Update var/spack/repos/builtin/packages/py-pythran/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-05 09:08:41 -07:00
Thomas Madlener
05f1f07e51 edm4hep: add v0.10.4 (#42488) 2024-02-05 07:43:51 -07:00
Mosè Giordano
955a01dfa4 hh-suite: apply patch to fix compilation with GCC 13 (#42489)
This fixes errors like
```
     294    In file included from /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.cpp:8:
  >> 295    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:37:37: error: 'uint16_t' has not been declared
     296       37 |   void writeU16(std::ostream& file, uint16_t);
     297          |                                     ^~~~~~~~
  >> 298    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:38:28: error: 'uint16_t' has not been declared
     299       38 |   void readU16(char** ptr, uint16_t &result);
     300          |                            ^~~~~~~~
  >> 301    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:40:37: error: 'uint32_t' has not been declared
     302       40 |   void writeU32(std::ostream& file, uint32_t);
     303          |                                     ^~~~~~~~
  >> 304    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:41:27: error: 'uint32_t' has not been declared
     305       41 |   void readU32(char**ptr, uint32_t &result);
     306          |                           ^~~~~~~~
```
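
The underlying cause is that GCC 13 no longer transitively includes `<cstdint>`, so headers using `uint16_t`/`uint32_t` need the explicit include. In a Spack package such a fix is typically wired up conditionally (a sketch; the patch file name is hypothetical):

```
from spack.package import *

class HhSuite(CMakePackage):
    # apply the <cstdint> include fix only where it is needed
    patch("a3m_compress-cstdint.patch", when="%gcc@13:")
```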
2024-02-05 07:38:49 -07:00
Massimiliano Culpo
928ee7569c ecp-data-vis-sdk: remove self-referential dependencies (#42484)
This shouldn't be an issue, but express this limitation
with a conflict.
2024-02-05 15:17:46 +01:00
George Young
8190903821 motioncor2: add v1.6.4 (#42380)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-05 14:32:38 +01:00
George Young
473347df41 kentutils: update to @459, update download location & deps (#42448)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-05 04:23:33 -07:00
Adam J. Stewart
1ef52d7c8e py-pytest: add v8.0.0 (#42344) 2024-02-05 12:16:58 +01:00
Tom Payerle
561fe13bad cgal: add v5.3.2 (#42378) 2024-02-05 12:08:27 +01:00
George Young
c15ed38cef star: updating to 2.7.11a (#42011)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-05 12:06:01 +01:00
David Boehme
fa4568d9c9 kripke: add adiak dependency with Caliper enabled (#42414) 2024-02-05 11:17:24 +01:00
Nathalie Furmento
ef4f78a6cd starpu: add release 1.4.4 (#42446) 2024-02-05 11:12:50 +01:00
Adam J. Stewart
da36d069db py-lightly: fix conflict definition (#42449) 2024-02-05 11:11:23 +01:00
Cameron Rutherford
6ff3e17a7d hiop: require RAJA when cuda is enabled (#42407) 2024-02-05 11:04:08 +01:00
snehring
bb200be57d sentieon-genomics: add new version 202308.02 (#42465) 2024-02-05 11:01:00 +01:00
Olivier Cessenat
a238563fdb mpfr: add v4.2.1 (#42479) 2024-02-05 02:59:22 -07:00
Adam J. Stewart
bfc6f1d2a9 py-lightning: add v2.1.4 (#42468) 2024-02-05 10:59:00 +01:00
Olivier Cessenat
99b8a08366 stripack: sets the licence SPDX (#42473) 2024-02-05 10:45:23 +01:00
Olivier Cessenat
dba5ae939d visit-ffp and visit-unv: sets the licence SPDX (#42474) 2024-02-05 10:42:07 +01:00
Olivier Cessenat
7a4aa823d1 ngspice: new version 42 (#42475) 2024-02-05 10:41:06 +01:00
Olivier Cessenat
89e387cb67 latex2html: new version 2024 (#42476) 2024-02-05 10:38:24 +01:00
Olivier Cessenat
19c46de69f nlopt: new version 2.7.1 (#42477) 2024-02-05 10:27:01 +01:00
Michael Kuhn
e35fbfab77 gtkplus: add v3.24.41 and fix CUPS problems (#42480)
Fixes #42297
2024-02-05 10:23:57 +01:00
Massimiliano Culpo
478203dc68 asio: remove self-referential dependencies (#42469)
These shouldn't be an issue, but they can be expressed
in terms of variants on the package.
2024-02-05 10:10:58 +01:00
Massimiliano Culpo
5713ffd143 converge: fix self-referential dependencies (#42471) 2024-02-05 10:03:20 +01:00
Elsa Gonsiorowski, PhD
1711b6dee1 scr: fix @develop dependency versions (#42379) 2024-02-05 09:15:19 +01:00
Massimiliano Culpo
7fbd4afaaa cp2k: fix self-referential dependencies (#42472) 2024-02-05 09:12:47 +01:00
Massimiliano Culpo
f396dbcb4f berkeleygw: fix self-referential dependencies (#42470)
Also, remove a couple of duplicate directives
2024-02-05 09:04:32 +01:00
David Guibert
57fe3430fd py-pymummer: init (#42412)
* py-pymummer: init

* Update py-pymummer copyright date

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-04 06:27:46 -07:00
snehring
8428bef040 py-ete3: adding version 3.1.3 (#42462) 2024-02-04 06:12:19 -06:00
George Young
47c91c9163 ldsc: new package @2.0.1 (#42430)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-04 06:06:45 -06:00
David Guibert
6c57360eac py-pyfastaq: init (#42413) 2024-02-04 05:50:44 -06:00
George Young
624292d685 sicer2: new Python package @1.0.3 (#42383)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-04 05:17:27 -06:00
Mike Renfro
55ecc47dce fix ipyrad numpy dependencies (#42098)
* fix ipyrad numpy dependencies

ipyrad versions through 0.9.90 use np.int, which is deprecated in numpy 1.20.

* fix whitespace

* Correct numpy dependency restrictions
2024-02-04 05:16:22 -06:00
Massimiliano Culpo
f2125882c5 flecsi: fix constraints on mpi providers (#42447) 2024-02-03 12:11:03 +01:00
Massimiliano Culpo
d23cb39a3f quantum-espresso: fix self-referential dependencies (#42458) 2024-02-03 12:09:42 +01:00
Mosè Giordano
7f7d5b899a py-alphafold: use permanent link for openmm patch (#42461) 2024-02-03 11:07:16 +01:00
Harmen Stoppels
c44e854d05 Environment views: dependents before dependencies, resolve identical file conflicts (#42350)
Fix two separate problems:

1. We want to always visit parents before children while creating views
   (when it comes to ignoring conflicts, the first instance generated in
   the view is chosen, and we want the parent instance to have precedence).
   Our preorder traversal does not guarantee that, but our topological-
   order traversal does.
2. For copy style views with packages x depending on y, where
   <x-prefix>/foo is a symlink to <y-prefix>/foo, we want to guarantee
   that:
   * A conflict is not registered
   * <y-prefix>/foo is chosen (otherwise, the "foo" symlink would become
     self-referential if relocated relative to the view root)

   Note that
   * This is an exception to [1] (in this case the dependency instance
     overrides the dependent)
   * Prior to this change, if "foo" was ignored as a conflict, it was
     possible to create this self-referential symlink

Add tests for each of these cases
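
A toy illustration of point 1 (node and file names made up): under a parents-first topological order, "first writer wins" gives the dependent's file precedence.

```
# x depends on y; both provide bin/foo, and the parent x should win
merge_map = {}
topo_order = ["x", "y"]  # parents before children
provides = {"x": ["bin/foo"], "y": ["bin/foo", "bin/bar"]}

for node in topo_order:
    for rel_path in provides[node]:
        merge_map.setdefault(rel_path, node)  # keep the first writer

print(merge_map)  # {'bin/foo': 'x', 'bin/bar': 'y'}
```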
2024-02-03 11:05:45 +01:00
John W. Parent
8fa8dbc269 Sqlite package: export api symbols on Windows (#42299)
* Sqlite requires the user to provide a command line arg (DYNAMIC_SHELL)
  to export shared symbols to import lib from .def
* Add other options recommended by Sqlite docs: 
  https://github.com/sqlite/sqlite/blob/master/doc/compile-for-windows.md
  * Some of these options mean we can restore variants that were
    disabled for Windows (fts, functions, rtree).
2024-02-02 13:27:53 -08:00
jalcaraz
e59303d4ff tau: update test functions (#42432)
* Updated with generic test functions

* Added level_zero test

As the compiler is not installed by TAU, it should be loaded first and run with --dirty. Will check if there is a way to make it work without --dirty.

* Update var/spack/repos/builtin/packages/tau/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-02 11:12:24 -08:00
Massimiliano Culpo
494d943a24 ascent: fix self-referential dependencies (#42457) 2024-02-02 19:25:56 +01:00
kwryankrattiger
d227da5554 CI: Call timing script in after_script (#42166)
The main script body is overwritten for Power. Putting the timing
aggregation in the after_script allows it to be called on all of the
current pipelines.
2024-02-02 12:02:46 -06:00
Olivier Cessenat
714590426f astyle: new version 4.1.11 and build using cmake (#42390)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-02 18:58:27 +01:00
Massimiliano Culpo
9ffe179934 acts: fix self-referential dependencies (#42455) 2024-02-02 18:34:34 +01:00
dependabot[bot]
9c4e44a0ad build(deps): bump codecov/codecov-action from 4.0.0 to 4.0.1 (#42439)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.0 to 4.0.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](f30e4959ba...e0b68c6749)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-02 12:20:29 +01:00
dependabot[bot]
1ef69a8bfb build(deps): bump python-levenshtein in /lib/spack/docs (#42440)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.23.0 to 0.24.0.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.23.0...v0.24.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-02 12:20:02 +01:00
Massimiliano Culpo
55db090206 Extract low-level clingo wrappers from spack.solver.asp (#42429) 2024-02-02 12:19:38 +01:00
Matthew Whitlock
f8ce84860c Update packages_yaml.rst (#42438)
Fix an incorrect example.
2024-02-02 11:05:26 +01:00
Mosè Giordano
795360fe48 openmm: Apply patch use FindCUDAToolkit (#42437) 2024-02-02 11:04:03 +01:00
Axel Huebl
3d3d075496 WarpX: Disable CCache (#42434)
https://github.com/ECP-WarpX/WarpX/pull/4637
2024-02-02 11:02:00 +01:00
Frédéric Simonis
35630c219d preCICE: add v3.0.0 (#42426)
* preCICE: add version 3.0.0
* preCICE: add version conflict for libxml2 2.12.x
2024-02-01 22:56:18 -07:00
stefanfechter
5ef9bb7752 Bugfix: fix build of xforms (#35391)
This additional patch fixes the build of the now unmaintained library
xforms.
2024-02-01 13:07:57 -08:00
kjrstory
5bd5a219a6 Add algorithmic differentiation packages for SU2 (#39975)
* Add algorithmic differentiation packages for SU2
* Simplify checking boolean variants
* spack prefix and spec satisfies methods fix
* spelling fix
* style fix
2024-02-01 13:01:35 -08:00
fpruvost
c1450d26ff Add new package Qrmumps (#42393) 2024-02-01 13:00:14 -08:00
Weiqun Zhang
10d826597a amrex: add v24.02 (#42431) 2024-02-01 12:23:51 -07:00
Adam J. Stewart
00cbcd5dbc py-lightly: add v1.4.25 (#42421) 2024-02-01 11:54:01 -07:00
George Young
6bd8fda597 foldseek: new package @8 (#42422)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-01 11:33:28 -07:00
Rémi Lacroix
8b88255b43 pv: Add version 1.8.5. (#42419)
Switch to "tar.gz" source packages as the "tar.bz2" archives are not available for newer versions.
2024-02-01 09:54:53 -08:00
Rémi Lacroix
bcbd78cea9 ncdu: Add version 1.19 (#42420) 2024-02-01 09:49:57 -08:00
Adam J. Stewart
22ad28bb17 Python: add v3.12.1 (#42397) 2024-02-01 07:43:14 -07:00
Alberto Invernizzi
74bd8a9cf7 neovim: be more specific with lua dependencies (#42401) 2024-02-01 14:11:49 +01:00
Adam J. Stewart
023a6be67d Update PyTorch ecosystem (#42394) 2024-02-01 06:42:13 -06:00
Harmen Stoppels
922a1983f3 docs: git version section title + highlight issues (#42398)
* basic_usage: section title for git versions

* improve highlighting by using a comment instead of invalid syntax
2024-02-01 09:46:17 +01:00
Massimiliano Culpo
2758fc7e14 Remove caching in generated Dockerfiles (#42405) 2024-02-01 09:22:52 +01:00
dependabot[bot]
3541fcae01 build(deps): bump docker/metadata-action from 5.5.0 to 5.5.1 (#42416)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.5.0 to 5.5.1.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](dbef88086f...8e5442c4ef)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-01 09:14:31 +01:00
dependabot[bot]
eea06de6df build(deps): bump codecov/codecov-action from 3.1.6 to 4.0.0 (#42415)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 3.1.6 to 4.0.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](ab904c41d6...f30e4959ba)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-01 09:13:52 +01:00
Alberto Invernizzi
29658eddcc Lua: better specify providers in LuaPackage base class (#42392) 2024-02-01 08:57:12 +01:00
Tamara Dahlgren
2fc0d05a55 Environments: Add support for including views (#42250)
* Environments: Add support for including views (take 2)

* schema type hint fixes
2024-02-01 10:07:16 +09:00
Edward Hartnett
faf64f1a26 g2c: add v1.8.0 (#40761)
includes a variant to build with the future API for v2
2024-02-01 09:28:55 +09:00
Harmen Stoppels
d340523d68 jq: 1.7.1 (#42409) 2024-01-31 15:49:44 -08:00
Dave Sweeris
089fa12ff8 Update OpenSubdiv spec (#42381)
* Update package.py
   Add support for OpenSubdiv 3.5.x
* Fixed Dependencies
   I was getting errors about cmake not being able to find `xf86vm`, and adding a dependency on `libxxf86vm` fixes it.
2024-01-31 15:34:47 -08:00
Julius Plehn
d9528819a3 Adds nvhpc 24.1 (#42388) 2024-01-31 13:40:33 -08:00
Ben Wibking
e3d5ca2997 visit: set minimum silo version to 4.11 (#42072) 2024-01-31 22:05:17 +01:00
Sebastian Grimberg
8fc76ab325 Update recipe for Palace v0.12.0 (#42400) 2024-01-31 12:57:03 -08:00
Hector Martinez-Seara
e77678bd82 kim-api: added paths for bash/zsh completion (#31691)
Co-authored-by: Hector Mtz-Seara <hector@gmail.com>
Co-authored-by: Ryan S. Elliott <relliott@umn.edu>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: hseara <hseara@users.noreply.github.com>
2024-01-31 12:42:59 -07:00
Richard Berger
dafd8dbe87 Add sol2 package (#42402) 2024-01-31 12:09:07 -07:00
Chris White
5034919d23 add new release versions (#42362) 2024-01-31 10:58:07 -08:00
Harmen Stoppels
2bcfb72d38 environment/view: small cleanup (#42395) 2024-01-31 17:22:20 +01:00
Harmen Stoppels
f27bff81ba spack reproduce-build: accept URLs from web interface (#42261)
Sometimes the logs are too long and the copy & paste command is not
shown. In that case I'd like to just copy the failing GitLab job URL from
my browser and pass it to `spack reproduce-build <url>`.
2024-01-31 15:58:51 +01:00
Massimiliano Culpo
5c49bb45c7 ASP-based solver: decouple setup phase from clingo.backend (#41952)
Currently, the `SpackSolverSetup` and the `PyclingoDriver` are more coupled than necessary:
1. The driver object needs a setup object to be injected during a solve, 
2. And the setup object will get a reference back to the driver

This design is necessary because we use the low-level `clingo.backend` interface to setup our problem. This interface though is meant to bypass the grounder and add symbols directly in the grounded table, which is a feature we don't currently use.

The PR simplifies the encoding by having the setup object returning the problem-specific facts / rules as a list of strings, and the driver ingesting them using the [clingo.Control.add](https://potassco.org/clingo/python-api/5.6/clingo/control.html#clingo.control.Control.add) method. This removes any use of the low level interface.

Using this encoding makes it easy to hash the output of the setup phase, since it is returned as a string.
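
A minimal illustration of this encoding with the documented clingo Python API (the program text is a toy, not Spack's actual rules):

```
import clingo

program = "\n".join([
    "node(a). node(b).",
    "edge(a, b).",
    "reach(X) :- node(X), edge(_, X).",
])

ctl = clingo.Control()
ctl.add("base", [], program)   # ingest facts/rules as plain strings
ctl.ground([("base", [])])     # no low-level clingo.backend involved
ctl.solve(on_model=print)      # the string form is also easy to hash
```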
2024-01-31 15:37:59 +01:00
Alex Richert
97fb9565ee py-nbconvert: avoid install-time downloads (#42024)
* py-nbconvert: avoid install-time downloads

* install css files as resources for py-nbconvert

* Update var/spack/repos/builtin/packages/py-nbconvert/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* py-nbconvert: update dependencies for 7.14.1

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update var/spack/repos/builtin/packages/py-nbconvert/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update resources & remove style.min.css file

* Update package.py

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-31 08:28:41 -06:00
Alberto Invernizzi
72eaca23fe environments: develop paths were not getting expanded (#34986) 2024-01-31 15:18:25 +01:00
kwryankrattiger
1f11b3844a CI: Add OIDC capability for deprecated CI (#42371)
This "breaks" the deprecated schema by allowing unknown attributes
to the attributes section of the job types. The breaking change here is
that deprecated stacks will no longer ignore attributes that are unknown
but rather assume the new CI schema behavior of injecting them into the
generated CI configuration. This change is required to secure
authentication in Spack CI.
2024-01-31 15:05:57 +01:00
Rocco Meli
e129a6f47a Add +dlaf variant to cp2k in CI (#42346) 2024-01-31 11:54:45 +01:00
Harmen Stoppels
d983ac35fe ci: bump ghcr.io/spack/linux-ubuntu22.04-x86_64_v2 tag (#42357) 2024-01-31 09:59:03 +01:00
dependabot[bot]
2dcf4f709b build(deps): bump urllib3 from 2.1.0 to 2.2.0 in /lib/spack/docs (#42384)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.1.0 to 2.2.0.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.1.0...2.2.0)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-31 09:55:26 +01:00
dependabot[bot]
a171fe3565 build(deps): bump codecov/codecov-action from 3.1.5 to 3.1.6 (#42385)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 3.1.5 to 3.1.6.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](4fe8c5f003...ab904c41d6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-31 09:54:11 +01:00
Jonathon Anderson
c9aeab58e6 hpctoolkit: refine dependencies (#42354)
* Force Dyninst <=12 before @2024.01

* Remove some +pic requirements

* Use virtual tbb dep
2024-01-31 08:27:05 +01:00
Harmen Stoppels
517dac6ff8 compression.py: refactor + bug fix (#42367)
Improve naming, so it's clear file "extensions" are not taken in the
`PurePath(path).suffix` sense as the original function name suggests,
but rather that the files are opened and their magic bytes are
classified.

Add type hints.

Fix a bug where `stream.read(num_bytes)` was run on the compressed
stream instead of the uncompressed stream, which can potentially break
detection of tar.bz2 files.

Ensure that when peeking into streams for magic bytes, they are reset to
their original position upon return.

Use new API in `spack logs`.
2024-01-31 07:59:07 +01:00
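A minimal sketch of the peek-and-reset pattern described above (the helper name here is illustrative, not the actual Spack API):

```python
import io
from typing import BinaryIO

def peek_magic_bytes(stream: BinaryIO, num_bytes: int) -> bytes:
    """Read up to num_bytes from the stream, then restore its position."""
    pos = stream.tell()
    try:
        return stream.read(num_bytes)
    finally:
        stream.seek(pos)

# classify by magic bytes rather than by file extension
stream = io.BytesIO(b"BZh91AY&SY")  # bzip2 data starts with b"BZh"
assert peek_magic_bytes(stream, 3) == b"BZh"
assert stream.tell() == 0  # position restored for the real reader
```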
jalcaraz
376653ec3d Updated last commit of TAU package with Dyninst test (#42387)
* Updated last commit of TAU package with Dyninst test

* The path to dyninst was missing when loading TAU
2024-01-30 19:07:33 -08:00
Harmen Stoppels
28eea2994f elf: relocate PT_INTERP (#42318)
Relocation of `PT_INTERP` in ELF files already happens to work from long to short path, thanks to generic binary relocation (i.e. find and replace). This PR improves it:

1. Adds logic to grow `PT_INTERP` strings through patchelf (which is only useful if the interpreter and rpath paths are the _only_ paths in the binary that need to be relocated)
2. Makes shrinking `PT_INTERP` cleaner. Before this PR, when you used Spack-built glibc as a link dep and relocated
executables using its dynamic linker, you'd end up with

   ```
   $ file exe
   exe: ELF 64-bit LSB pie executable, ..., interpreter /////////////////////////////////////////////////path/to/glibc/lib/ld-linux.so
   ```

   With this PR you get something sensible:

   ```
   $ file exe
   exe: ELF 64-bit LSB pie executable, ..., interpreter /path/to/glibc/lib/ld-linux.so
   ```

When Spack cannot modify the interpreter or rpath strings in-place, it errors out without modifying the file, and leaves both tasks to patchelf instead.

Also add type hints to `elf.py`.
2024-01-30 22:36:49 +01:00
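For background, a minimal sketch of how `PT_INTERP` can be located in a 64-bit little-endian ELF file (illustrative only; Spack's `elf.py` handles more cases):

```python
import struct

PT_INTERP = 3  # program header type for the interpreter path

def read_pt_interp(path: str):
    """Return the dynamic-linker path of a 64-bit LE ELF file, or None."""
    with open(path, "rb") as f:
        if f.read(4) != b"\x7fELF":
            return None
        f.seek(32)  # e_phoff lives at offset 32 in the ELF64 header
        (e_phoff,) = struct.unpack("<Q", f.read(8))
        f.seek(54)  # e_phentsize at offset 54, e_phnum at 56
        e_phentsize, e_phnum = struct.unpack("<HH", f.read(4))
        for i in range(e_phnum):  # scan the program headers
            f.seek(e_phoff + i * e_phentsize)
            p_type, _flags, p_offset, _va, _pa, p_filesz = struct.unpack(
                "<IIQQQQ", f.read(40)
            )
            if p_type == PT_INTERP:
                f.seek(p_offset)
                return f.read(p_filesz).rstrip(b"\0").decode()
    return None

print(read_pt_interp("/bin/ls"))  # e.g. /lib64/ld-linux-x86-64.so.2
```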
Frédéric Simonis
6d55caabe8 precice: add release v2.5.1 (#42376) 2024-01-30 12:47:59 -08:00
Benjamin Fovet
d8260907df add gmsh v4.12.2 (#42375) 2024-01-30 12:13:07 -08:00
Wileam Y. Phan
9474f4bb33 gtpin: add versions 3.4 and 3.7 (#42373) 2024-01-30 12:05:43 -08:00
Rémi Lacroix
698b71e2fd autoconf: Fix patches' URLs (#42372) 2024-01-30 11:33:48 -08:00
John W. Parent
749301d133 MSVC: Broken ifx needs new $TMP (#42155)
Certain versions of ifx (the majority of those available) have an issue
where they are not compatible with TMP directories containing dot characters,
which precludes their use with CMake.
Remap TMP to point to the stage directory rather than whatever the TMP
default is.
2024-01-30 10:18:54 -08:00
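A sketch of the general approach in a `package.py` (hypothetical excerpt; the real recipe may differ):

```python
from spack.package import *

class Example(CMakePackage):
    """Hypothetical package illustrating the TMP remap described above."""

    def setup_build_environment(self, env):
        # ifx mishandles TMP paths containing '.', so point TMP at the
        # stage directory instead of the system default
        env.set("TMP", self.stage.path)
```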
Massimiliano Culpo
a20c0de6d8 ns-3-dev: rewrite the package to use CMake (#34207) 2024-01-30 17:57:27 +01:00
dependabot[bot]
ad2ae63745 build(deps): bump pytest from 7.4.4 to 8.0.0 in /lib/spack/docs (#42361)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 7.4.4 to 8.0.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/7.4.4...8.0.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-30 08:23:50 -07:00
dependabot[bot]
d56ca4f107 build(deps): bump black from 24.1.0 to 24.1.1 in /lib/spack/docs (#42360)
Bumps [black](https://github.com/psf/black) from 24.1.0 to 24.1.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.0...24.1.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-30 15:23:42 +00:00
dependabot[bot]
e09c3ddec2 build(deps): bump black in /.github/workflows/style (#42359)
Bumps [black](https://github.com/psf/black) from 24.1.0 to 24.1.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.0...24.1.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-30 15:23:30 +00:00
Massimiliano Culpo
1882920c97 elsi: cleanup recipe (#42355) 2024-01-30 14:08:10 +01:00
Matthias Wolf
b70cb60b65 py-bluepyefe, py-igor2: new version, new package igor2 (#42191) 2024-01-30 03:29:07 -07:00
Adam J. Stewart
6a1e64f531 py-black: add v24.1.1 (#42343) 2024-01-30 03:14:02 -07:00
Nathalie Furmento
5217b20901 starpu: add release 1.4.3 (#42339) 2024-01-30 03:09:21 -07:00
James Beal
ed37969925 singularityce: add v4.x (#42347)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-01-30 03:08:57 -07:00
Henrik Finsberg
32230e6520 Add fenics development version and ufl-legacy (#42182)
* Add fenics development version and ufl-legacy

* Make sure python and setuptools are added to ufl-legacy and add version 2022.3.0 as well

* Run black and add maintainer to fenics package

* Fix typo

* Use Fiat version 2019.1.0

* Run black

* Add back master branch of fiat

* Remove master from the list of dolfin versions and add one extra line of each deps instead

* Run black

* Do not specify python version in ufl-legacy

* Remove python dependency from ufl-legacy

* Remove python dependency from ffc

* Add special case for master in ffc

* Run black

* Remove master from loop in ffc

* Run black again
2024-01-30 02:49:51 -07:00
Brad Geltz
7a712a11b9 geopm-service: New package and deprecate geopm (#41788)
* geopm: Mark all as deprecated

- This recipe will be removed in a future release.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add py-sphinx-emoji

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add py-dasbus

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* py-pygobject: Add v3.46.0

- Previous versions error during build phase.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* py-sphinx-tabs: Add new versions

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add geopm-service

- Previous geopm package is now 2 packages:
  geopm-service and geopm-runtime.
- The GEOPM service is designed as a systemd/dbus
  service providing a userspace interface to
  privileged hardware telemetry and configurations.
- Installing via spack will enable some userspace
  testing, but generally most users will want to
  install the GEOPM service via the system package
  manager as root to get full functionality.
- This recipe will enable the creation of the fully
  userspace geopm-runtime recipe which will replace
  the old geopm recipe.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

---------

Signed-off-by: Brad Geltz <brad.geltz@intel.com>
2024-01-30 02:49:33 -07:00
Peter Scheibel
e63d8e6163 "spack logs": print log files for packages (either partially built or installed) (#42202) 2024-01-30 10:42:00 +01:00
Alex Leute
461a9093cd py-textual: Added package py-textual (#42291)
* py-textual: New package py-textual

* py-textual: Depend on py-mdit-py-plugins

* py-textual: Added dependency on python@3.8:3

* py-textual: Added a comment about why there is a dependency on
py-mdit-py-plugins

* py-textual: Ran black

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-01-30 03:19:39 -06:00
Caetano Melone
8edb6ff22a py-pytest-aiohttp: add package (#42313)
* add pytest-aiohttp

* black

* py-setuptools -> py-setuptools-scm

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-30 03:18:32 -06:00
Massimiliano Culpo
1f44be83af Add a PR template with a reference to spackbot commands (#42349) 2024-01-30 09:43:04 +01:00
Benjamin Fovet
1be078d01d gmsh: add v4.12.0 (#41854)
* disable building gmsh with oce for recent versions

Co-authored-by: Benjamin Fovet <benjamin.fovet@cea.fr>
2024-01-30 08:37:37 +01:00
Cristian Le
c07cde3308 Bump spglib version (#42340) 2024-01-29 16:06:29 -07:00
John W. Parent
dda8b1a5b8 VTK package: improve dependency-detection on Windows (#42300)
VTK struggles to consume some Spack-derived packages on Windows;
patch VTK to allow a smoother integration.

Also install the examples, as they are not part of the install
interface.
2024-01-29 14:42:36 -08:00
John W. Parent
ba45277640 gl2ps package: build only one of shared/static on Windows (#36576)
gl2ps tries to build static and shared libs simultaneously with
the same target name on the generator side. This causes a name
clash issue for Ninja on Windows (where the extension is .lib
in both cases).

Add a variant on Windows to force building only one of shared
or static, and patch the CMake build to enable use of this
variant.
2024-01-29 11:49:55 -08:00
Zack Galbreath
f03ae39fd1 Update GitLab memory requests (#42351)
* gitlab: remove commented-out duplicate entries

* gitlab: reclassify some packages from "huge" to "large"

Our observed max memory usage for these packages is as follows:

hipblas: 7.7G
qt: 6.6G
visit: 9.7G

All of these should fit within a "large" request (currently 12G).

* gitlab: remove pango from list of huge packages

This package is not currently built by any of our CI stacks.

* gitlab: update requests for high memory packages

Refine resource requests for memory-intensive packages based on
max memory usage data.
2024-01-29 19:28:58 +00:00
Arne Becker
62145122be perl-sql-translator: New package (#42319)
- Adds perl-sql-translator and its missing deps:
- Adds perl-import-into
- Adds perl-package-variant
- Adds perl-strictures

Built with build-time tests and comes with a simple run-time test.
2024-01-29 10:36:25 -08:00
Gilles Grospellier
44e33c3eb9 dotnet-core-sdk: Update to version 6.0.25 and add binaries for 'aarch64'. (#41739) 2024-01-29 10:10:58 -08:00
fpruvost
ed5ed3e31e Add fabulous, maphyspp, paddle packages. (#42287) 2024-01-29 10:06:35 -08:00
dependabot[bot]
cc866efba1 build(deps): bump dorny/paths-filter from 2.11.1 to 3.0.0 (#42294)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 2.11.1 to 3.0.0.
- [Release notes](https://github.com/dorny/paths-filter/releases)
- [Changelog](https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md)
- [Commits](4512585405...0bc4621a31)

---
updated-dependencies:
- dependency-name: dorny/paths-filter
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-29 16:38:23 +01:00
Martin Diehl
f550262efb update damask to 3.0.0-beta (#42259) 2024-01-29 09:33:22 -06:00
Harmen Stoppels
890ec8d71c traverse: w/o deptype (#42345)
Add the empty deptype `spack.deptypes.NONE`.

Test the case `traverse_nodes(deptype=spack.deptypes.NONE)` to not
traverse dependencies, only de-duplicate.

Use the construct in environment views that otherwise would branch on
whether deps are enabled or not.
2024-01-29 16:31:50 +01:00
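A hedged sketch of the de-duplication use case (`traverse_nodes` and the empty deptype are named above; the remaining details are assumptions):

```python
import spack.deptypes as dt
import spack.traverse as traverse

def dedupe(specs):
    # with deptype=NONE no dependency edges are followed, so traversal
    # only visits (and de-duplicates) the given roots
    return list(traverse.traverse_nodes(specs, deptype=dt.NONE))
```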
dependabot[bot]
62ed5ee318 build(deps): bump codecov/codecov-action from 3.1.4 to 3.1.5 (#42295)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 3.1.4 to 3.1.5.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](eaaf4bedf3...4fe8c5f003)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-29 16:27:44 +01:00
Mosè Giordano
0a10ff70bc openblas: use ARMV8SVE when target supports SVE feature (#42107) 2024-01-29 08:20:32 -07:00
Harmen Stoppels
0718e3459a filesystem: cleanup (#42342)
Type hints and removal of unused code
2024-01-29 14:43:17 +00:00
Rocco Meli
7ec93a496d MDAnalysis: add v2.7.0 (#41907)
* ensure umpire~cuda~rocm when ~cuda~rocm

* MDAnalysis: add v2.7.0

* Apply suggestions from code review

* license

* review changes

* add gsd and py-cmake versions

* Update package.py

* py-cmake

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix and reorder

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 08:31:34 -06:00
Brian Vanderwende
2170386ad9 Improve NCL variant support and fix hdf-eos5 issues (#41259) 2024-01-29 15:28:32 +01:00
WuK
6c48effbf5 add py-vl-convert-python (#42073)
* add py-vl-convert-python

* code format

* Update package.py

add homepage

* Update var/spack/repos/builtin/packages/py-vl-convert-python/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove old versions, add todo

* Update var/spack/repos/builtin/packages/py-vl-convert-python/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove blank line

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 08:18:42 -06:00
Wouter Deconinck
97e3da0e3e ncurses: fix 5.9 and 6.0 on modern compilers (#41982) 2024-01-29 15:18:25 +01:00
Alex Leute
d6ff81ab4d py-metrics: Adding package py-metrics (#42220)
* New package: py-metrics

* [py-metrics] cleaned up extra comment

* Updated copyright and ran black

* py-pathspec: Added version 0.5.5

---------

Co-authored-by: vehrc <vehrc@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-01-29 08:14:03 -06:00
Sam Grayson
6dfc2e7075 openldap: add findutils build dep (#42159) 2024-01-29 15:06:57 +01:00
Matthias Wolf
80e36d47c2 py-frozendict: patch up for Python 3.11 (#42192)
* py-frozendict: patch up for Python 3.11

See also Marco-Sulla/python-frozendict#68, rely on a pure Python
implementation when 3.11+ is used.

* mention related Github issue
2024-01-29 08:03:46 -06:00
snehring
e1826f89d4 tbl2asn: adding runtime dep libidn (#42171) 2024-01-29 14:59:10 +01:00
kjrstory
23df20fe6d of-precice: add new versions, update run environment (#40479) 2024-01-29 14:57:16 +01:00
Wouter Deconinck
10ef7a0466 containers: switch to quay.io/almalinuxorg images (#42193) 2024-01-29 14:55:56 +01:00
Alec Scott
7eba7d4028 go: simplify dependency on git to only run (#42208) 2024-01-29 14:49:45 +01:00
Dani
b09d741aed new builtin package: py-biobb-common (#41993)
* new builtin package: biobb-common

* removed whitespace for the style test

* replaced ' by " for the style test

* changed import line for the style test

* Update var/spack/repos/builtin/packages/biobb-common/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update var/spack/repos/builtin/packages/biobb-common/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update var/spack/repos/builtin/packages/biobb-common/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* removed test

* using pypi instead of git

I hope this is declared the right way.
I tried to test it locally, but I am having problems with a dependency (openblas), so I cannot even check whether this works.

* upgraded version

* Added py prefix

* removed biopython version restriction

* directory with new py-prefixed name

* Delete old non-prefixed package

* Update var/spack/repos/builtin/packages/py-biobb-common/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 13:52:57 +01:00
Sergey Kosukhin
594069961a netcdf-fortran: fix static linking in some cases (#35466)
* netcdf-fortran: enable building against static netcdf-c

* netcdf-fortran: strip the output of nc-config
2024-01-29 12:53:13 +01:00
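A small sketch of why stripping matters when shelling out to `nc-config` (the helper name is illustrative):

```python
import subprocess

def nc_config(*args: str) -> str:
    # nc-config output ends with a newline; strip it before embedding
    # the value in compiler or linker command lines
    out = subprocess.check_output(["nc-config", *args])
    return out.decode("utf-8").strip()

# e.g. nc_config("--libs") -> "-L/path/to/netcdf/lib -lnetcdf"
```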
Sean Koyama
9eb445f0a2 singularityce: add v3.11.4, v3.11.5 (#42251)
Co-authored-by: Servesh Muralidharan <smuralidharan@anl.gov>
2024-01-29 12:31:39 +01:00
Sean Koyama
e791f61c52 gnuplot: add v5.4.10, v6.0.0 (#42252)
Co-authored-by: Servesh Muralidharan <smuralidharan@anl.gov>
2024-01-29 12:30:47 +01:00
Thomas Madlener
b074e18402 py-jupytext: Add version 1.16 and update dependencies (#41552)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 12:26:58 +01:00
Alex Seaton
5b6a289d30 Added heyoka and mppp packages (#41839) 2024-01-29 11:03:48 +01:00
Jonathon Anderson
8e9bce44cc opencl-c-headers: install with CMake (#42173) 2024-01-29 10:47:52 +01:00
Ye Luo
7242238a25 quantum_espresso: relax constraint to allow +openmp ^elpa~openmp (#42322) 2024-01-29 10:41:52 +01:00
Mikael Simberg
ef26ee3f1f pika: Add 0.22.1 (#42338) 2024-01-29 02:37:56 -07:00
Adam J. Stewart
584ff9933a py-pandas: add v2.2.0 (#42205) 2024-01-29 03:34:07 -06:00
dependabot[bot]
bb07776606 build(deps): bump black from 23.12.1 to 24.1.0 in /lib/spack/docs (#42328)
Bumps [black](https://github.com/psf/black) from 23.12.1 to 24.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.12.1...24.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-29 10:28:28 +01:00
Pramod Kumbhar
07df50fbdc neuron: add new versions and clean-up the recipe (#41931)
* update tarball urls, add new versions and update maintainers
* remove unnecessary variant like cross-compile
* use self.define and self.define_from_variant
* improve a regex that erroneously matched -DMPICH_SKIP_MPICXX=1

Co-authored-by: Matthias Wolf <matthias.wolf@epfl.ch>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-01-29 09:46:48 +01:00
Luc Grosheintz
fc731f28cb highfive: add v2.9.0 (#42337) 2024-01-29 01:43:04 -07:00
Tom Payerle
6474d7ef00 hdf5: Make +subfiling variant depend on +mpi (#42324)
Based on CMakeLists.txt, the subfiling VFD requires a parallel HDF5 build,
so make the +subfiling variant depend on +mpi.

Should resolve #42323
2024-01-29 09:41:51 +01:00
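A sketch of how such a constraint is typically expressed in a `package.py` (illustrative; the actual hdf5 recipe is more involved):

```python
from spack.package import *

class Hdf5(CMakePackage):
    """Hypothetical excerpt showing the variant constraint."""

    variant("mpi", default=True, description="Enable parallel HDF5")
    variant("subfiling", default=False, description="Enable the subfiling VFD")

    # the subfiling VFD requires a parallel (MPI) build
    conflicts("~mpi", when="+subfiling")
```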
Jonathon Anderson
0b23bbbbb0 hpctoolkit: update develop and add 2024.01.stable (#42309) 2024-01-29 09:05:02 +01:00
wspear
21406b119c Update tau 2.33.1 hash (#42336) 2024-01-28 09:44:55 -08:00
Adam J. Stewart
2b51980904 Apply black 2024 style to Spack (#42317) 2024-01-27 16:15:35 +01:00
Robert Cohn
1865e228c4 [intel-oneapi-mkl] workaround linking issue for threads=openmp (#42327) 2024-01-26 22:22:51 -07:00
wspear
179a1e423e pdt 3.25.2 (#42330)
Add support for -icpx for oneapi
2024-01-26 21:08:19 -07:00
wspear
bd8de5bf2d Add tau 2.33.1 (#42331) 2024-01-26 20:52:44 -07:00
eugeneswalker
7c8c7eedca xyce: break blis circularity in depends_on (#42321) 2024-01-26 20:22:55 -07:00
Massimiliano Culpo
8c1957c03e gnupg: add v2.4.4 (#42320) 2024-01-26 14:33:48 -08:00
eugeneswalker
803ad69eb1 hydrogen@1.5.3: cmake patch with ESCAPE_QUOTES (#42325) 2024-01-26 13:56:16 -08:00
Chris White
29d784e5fa Axom: Update/merge changes from Axom's repo (#42269)
* update axom package from axom's repo

Co-authored-by: white238 <white238@users.noreply.github.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-01-26 11:08:21 -08:00
Dan LaManna
58b2201710 Stop passing manual AWS credentials to jobs (#42096) 2024-01-26 10:25:37 -07:00
Matthew Thompson
02605d577b hdf5: fix build error on Apple Clang 15 (#42264) 2024-01-26 18:18:57 +01:00
Adam J. Stewart
42de252c12 py-black: add v24.1.0 (#42316) 2024-01-26 07:48:21 -07:00
m-shunji
6c3c06a571 pexsi: fix to build with fujitsu-ssl2 (#42234)
Co-authored-by: inada-yoshie <inada.yoshie@fujitsu.com>
2024-01-26 11:00:22 +01:00
Harmen Stoppels
6a4573ce5a julia: patch for system lld and dsymutil (#42282) 2024-01-26 10:30:27 +01:00
Massimiliano Culpo
e77128dfa2 Run config audits in CI, add a new audit to detect wrongly named external specs (#42289) 2024-01-26 10:21:43 +01:00
Massimiliano Culpo
19df8e45ec Merge virtuals= from duplicate dependencies (#42281)
Previously, for abstract specs like:
```
foo ^[virtuals=a] bar ^[virtuals=b] bar
```
the second requirement was silently discarded on concretization. Now they're merged, and the abstract spec is equivalent to:
```
foo ^[virtuals=a,b] bar
```
2024-01-26 09:48:53 +01:00
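A hedged sketch at the ``Spec`` level (the bracketed edge-attribute syntax comes from the message above; parsing of the duplicated form is assumed):

```python
from spack.spec import Spec

duplicated = Spec("foo ^[virtuals=a] bar ^[virtuals=b] bar")
merged = Spec("foo ^[virtuals=a,b] bar")
# with this change, the duplicated form is treated as the merged one
```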
eugeneswalker
4c7a1f541c e4s oneapi: use ghcr spack registry for runner image (#42267) 2024-01-26 02:02:58 +00:00
Daniele Cesarini
295e36efa3 COUNTDOWN package (#42123)
* Added countdown repo
* Added fixed version of COUNTDOWN
* Style fixes
* Changed maintainer syntax
* Variants listed in alphabetical order by name
* Added conflicts and some reordering
* Fixed conflicts syntax
* Style fixes
* [@spackbot] updating style on behalf of danielecesarini

---------

Co-authored-by: danielecesarini <danielecesarini@users.noreply.github.com>
2024-01-25 13:56:29 -08:00
eugeneswalker
3f47cc8d00 e4s neoverse-v2: use ghcr.io/spack image registry (#42268) 2024-01-25 13:45:29 -08:00
Matthew Thompson
4006020d78 pflogger: Add v1.12 (#42288)
* pflogger: add version 1.12
* Add version
* Return the MPI variant to its default of false
* [@spackbot] updating style on behalf of mathomp4

---------

Co-authored-by: mathomp4 <mathomp4@users.noreply.github.com>
2024-01-25 14:08:48 -07:00
kenche-linaro
6d4fa96aad linaro-forge: added 23.1.1 version (#42283) 2024-01-25 13:45:01 -07:00
Dom Heinzeller
85def2bfc7 Add eccodes@2.32.1 and eccodes@2.33.0 (#42257) 2024-01-25 13:34:10 -07:00
eugeneswalker
266bbad8cd e4s: add gromacs (#42266) 2024-01-25 11:48:25 -08:00
Derek Ryan Strong
1e3b7a6df1 Add fpart 1.6.0 (#42258) 2024-01-25 11:39:14 -08:00
Tom Payerle
00fe864321 cgal: Add version 5.6 (#42277)
Needed for latest version of sfcgal
2024-01-25 11:36:34 -08:00
Adam J. Stewart
3df720e909 py-lightly: add v1.4.26 (#42279) 2024-01-25 11:35:15 -08:00
Alberto Invernizzi
02a6ec7b3c CMake: disable Package Registry (#42149)
CMake may write and read from `~/.cmake` through `export(...)` and read `find_package(...)` respectively. We don't want this as it may influence the build in a non-deterministic way, so disable it for all versions of `cmake`.
2024-01-25 19:04:03 +01:00
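For reference, a hedged sketch of the kind of CMake flags that disable the user package registry (the exact flags Spack passes may differ):

```python
# CMAKE_EXPORT_NO_PACKAGE_REGISTRY stops export() from writing to ~/.cmake;
# CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY stops find_package() from reading it.
registry_off = [
    "-DCMAKE_EXPORT_NO_PACKAGE_REGISTRY=ON",
    "-DCMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY=ON",
]
print(" ".join(["cmake", *registry_off, "/path/to/source"]))
```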
Greg Becker
d3c1f7a872 Fix using sticky variants in externals (#42253) 2024-01-25 17:22:22 +01:00
Robert Cohn
84568b3454 spack find mpiexec for impi (#42284) 2024-01-25 08:21:50 -08:00
Harmen Stoppels
2721b4c10d llvm: disable libomptarget AMDGPU plugin (#42265)
Fixes CI on develop
2024-01-25 08:43:29 +01:00
527 changed files with 9469 additions and 3686 deletions

.github/pull_request_template.md vendored Normal file
View File

@@ -0,0 +1,6 @@
<!--
Remember that `spackbot` can help with your PR in multiple ways:
- `@spackbot help` shows all the commands that are currently available
- `@spackbot fix style` tries to push a commit to fix style issues in this PR
- `@spackbot re-run pipeline` runs the pipelines again, if you have write access to the repository
-->

View File

@@ -43,7 +43,7 @@ jobs:
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d # @v2.1.0
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044 # @v2.1.0
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits

View File

@@ -57,7 +57,7 @@ jobs:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: docker/metadata-action@dbef88086f6cef02e264edb7dbf63250c17cef6c
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
with:
images: |
@@ -118,7 +118,5 @@ jobs:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
push: ${{ github.event_name != 'pull_request' }}
cache-from: type=gha
cache-to: type=gha,mode=max
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}

View File

@@ -40,7 +40,7 @@ jobs:
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@4512585405083f25c027a35db413c2b3b9006d50
- uses: dorny/paths-filter@ebc4d7e9ebcb0b1eb21480bb8f43113e996ac77a
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below

View File

@@ -1,5 +1,5 @@
black==23.12.1
clingo==5.6.2
black==24.2.0
clingo==5.7.1
flake8==7.0.0
isort==5.13.2
mypy==1.8.0

View File

@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
with:
flags: unittests,linux,${{ matrix.concretizer }}
# Test shell integration
@@ -122,7 +122,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
with:
flags: shelltests,linux
@@ -181,7 +181,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d # @v2.1.0
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044 # @v2.1.0
with:
flags: unittests,linux,clingo
# Run unit tests on MacOS
@@ -216,6 +216,6 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
with:
flags: unittests,macos

View File

@@ -33,7 +33,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
with:
flags: unittests,windows
unit-tests-cmd:
@@ -57,7 +57,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
with:
flags: unittests,windows
build-abseil:

View File

@@ -1130,6 +1130,10 @@ A version specifier can also be a list of ranges and specific versions,
separated by commas. For example, ``@1.0:1.5,=1.7.1`` matches any version
in the range ``1.0:1.5`` and the specific version ``1.7.1``.
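For example, checking a concrete version against such a specifier (a sketch; the ``satisfies`` call on an anonymous version spec is assumed here):

.. code-block:: python

   from spack.spec import Spec

   Spec("foo@1.2").satisfies("@1.0:1.5,=1.7.1")    # True: 1.2 is in 1.0:1.5
   Spec("foo@1.7.1").satisfies("@1.0:1.5,=1.7.1")  # True: exact match
   Spec("foo@1.6").satisfies("@1.0:1.5,=1.7.1")    # False: outside both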
^^^^^^^^^^^^
Git versions
^^^^^^^^^^^^
For packages with a ``git`` attribute, ``git`` references
may be specified instead of a numerical version, i.e. branches, tags,
and commits. Spack will stage and build based off the ``git``

View File

@@ -199,6 +199,7 @@ def setup(sphinx):
("py:class", "contextlib.contextmanager"),
("py:class", "module"),
("py:class", "_io.BufferedReader"),
("py:class", "_io.BytesIO"),
("py:class", "unittest.case.TestCase"),
("py:class", "_frozen_importlib_external.SourceFileLoader"),
("py:class", "clingo.Control"),

View File

@@ -357,91 +357,23 @@ If there is a hook that you would like and is missing, you can propose to add a
``pre_install(spec)``
"""""""""""""""""""""
A ``pre_install`` hook is run within an install subprocess, directly before
the install starts. It expects a single argument of a spec, and is run in
a multiprocessing subprocess. Note that if you see ``pre_install`` functions associated with packages these are not hooks
as we have defined them here, but rather callback functions associated with
a package install.
A ``pre_install`` hook is run within the install subprocess, directly before the install starts.
It expects a single argument of a spec.
""""""""""""""""""""""
``post_install(spec)``
""""""""""""""""""""""
"""""""""""""""""""""""""""""""""""""
``post_install(spec, explicit=None)``
"""""""""""""""""""""""""""""""""""""
A ``post_install`` hook is run within an install subprocess, directly after
the install finishes, but before the build stage is removed. If you
write one of these hooks, you should expect it to accept a spec as the only
argument. This is run in a multiprocessing subprocess. This ``post_install`` is
also seen in packages, but in this context not related to the hooks described
here.
A ``post_install`` hook is run within the install subprocess, directly after the install finishes,
but before the build stage is removed and the spec is registered in the database. It expects two
arguments: spec and an optional boolean indicating whether this spec is being installed explicitly.
""""""""""""""""""""""""""""""""""""""""""""""""""""
``pre_uninstall(spec)`` and ``post_uninstall(spec)``
""""""""""""""""""""""""""""""""""""""""""""""""""""
""""""""""""""""""""""""""
``on_install_start(spec)``
""""""""""""""""""""""""""
This hook is run at the beginning of ``lib/spack/spack/installer.py``,
in the install function of a ``PackageInstaller``,
and importantly is not part of a build process, but before it. This is when
we have just newly grabbed the task, and are preparing to install. If you
write a hook of this type, you should provide the spec to it.
.. code-block:: python
def on_install_start(spec):
"""On start of an install, we want to...
"""
print('on_install_start')
""""""""""""""""""""""""""""
``on_install_success(spec)``
""""""""""""""""""""""""""""
This hook is run on a successful install, and is also run inside the build
process, akin to ``post_install``. The main difference is that this hook
is run outside of the context of the stage directory, meaning after the
build stage has been removed and the user is alerted that the install was
successful. If you need to write a hook that is run on success of a particular
phase, you should use ``on_phase_success``.
""""""""""""""""""""""""""""
``on_install_failure(spec)``
""""""""""""""""""""""""""""
This hook is run given an install failure that happens outside of the build
subprocess, but somewhere in ``installer.py`` when something else goes wrong.
If you need to write a hook that is relevant to a failure within a build
process, you would want to instead use ``on_phase_failure``.
"""""""""""""""""""""""""""
``on_install_cancel(spec)``
"""""""""""""""""""""""""""
The same, but triggered if a spec install is cancelled for any reason.
"""""""""""""""""""""""""""""""""""""""""""""""
``on_phase_success(pkg, phase_name, log_file)``
"""""""""""""""""""""""""""""""""""""""""""""""
This hook is run within the install subprocess, and specifically when a phase
successfully finishes. Since we are interested in the package, the name of
the phase, and any output from it, we require:
- **pkg**: the package variable, which also has the attached spec at ``pkg.spec``
- **phase_name**: the name of the phase that was successful (e.g., configure)
- **log_file**: the path to the file with output, in case you need to inspect or otherwise interact with it.
"""""""""""""""""""""""""""""""""""""""""""""
``on_phase_error(pkg, phase_name, log_file)``
"""""""""""""""""""""""""""""""""""""""""""""
In the case of an error during a phase, we might want to trigger some event
with a hook, and this is the purpose of this particular hook. Akin to
``on_phase_success`` we require the same variables - the package that failed,
the name of the phase, and the log file where we might find errors.
These hooks are currently used for cleaning up module files after uninstall.
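For instance, a minimal hook module could look like this (a sketch based on the signatures above; real hooks live under ``lib/spack/spack/hooks``):

.. code-block:: python

   def pre_install(spec):
       # runs in the install subprocess, right before the install starts
       print(f"about to install {spec.name}")

   def post_install(spec, explicit=None):
       # runs after the install finishes, before the build stage is removed
       # and before the spec is registered in the database
       print(f"installed {spec.name} (explicit={explicit})")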
^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -416,6 +416,23 @@ that git clone if ``foo`` is in the environment.
Further development on ``foo`` can be tested by reinstalling the environment,
and eventually committed and pushed to the upstream git repo.
If the package being developed supports out-of-source builds then users can use the
``--build_directory`` flag to control the location and name of the build directory.
This is a shortcut to set the ``package_attributes:build_directory`` in the
``packages`` configuration (see :ref:`assigning-package-attributes`).
The supplied location will become the build-directory for that package in all future builds.
.. warning::
Potential pitfalls of setting the build directory
Spack does not check for out-of-source build compatibility with the packages, and
so the onus of making sure the package supports out-of-source builds is on
the user.
For example, most ``autotool`` and ``makefile`` packages do not support out-of-source builds
while all ``CMake`` packages do.
Understanding these nuances is the responsibility of the software developers, and we strongly
encourage them to only redirect the build directory if they understand their package's
build system.
^^^^^^^
Loading
^^^^^^^
@@ -472,11 +489,11 @@ a ``packages.yaml`` file) could contain:
.. code-block:: yaml
spack:
...
# ...
packages:
all:
compiler: [intel]
...
# ...
This configuration sets the default compiler for all packages to
``intel``.
@@ -822,7 +839,7 @@ directories.
.. code-block:: yaml
spack:
...
# ...
view:
mpis:
root: /path/to/view
@@ -866,7 +883,7 @@ automatically named ``default``, so that
.. code-block:: yaml
spack:
...
# ...
view: True
is equivalent to
@@ -874,7 +891,7 @@ is equivalent to
.. code-block:: yaml
spack:
...
# ...
view:
default:
root: .spack-env/view
@@ -884,7 +901,7 @@ and
.. code-block:: yaml
spack:
...
# ...
view: /path/to/view
is equivalent to
@@ -892,7 +909,7 @@ is equivalent to
.. code-block:: yaml
spack:
...
# ...
view:
default:
root: /path/to/view

View File

@@ -623,7 +623,7 @@ Fortran.
compilers:
- compiler:
...
# ...
paths:
cc: /usr/bin/clang
cxx: /usr/bin/clang++

View File

@@ -10,7 +10,7 @@ Modules (modules.yaml)
======================
The use of module systems to manage user environment in a controlled way
is a common practice at HPC centers that is often embraced also by
is a common practice at HPC centers that is sometimes embraced also by
individual programmers on their development machines. To support this
common practice Spack integrates with `Environment Modules
<http://modules.sourceforge.net/>`_ and `Lmod
@@ -21,14 +21,38 @@ Modules are one of several ways you can use Spack packages. For other
options that may fit your use case better, you should also look at
:ref:`spack load <spack-load>` and :ref:`environments <environments>`.
----------------------------
Using module files via Spack
----------------------------
-----------
Quick start
-----------
If you have installed a supported module system you should be able to
run ``module avail`` to see what module
files have been installed. Here is sample output of those programs,
showing lots of installed packages:
In the current version of Spack, module files are not generated by default. To get started, you
can generate module files for all currently installed packages by running either
.. code-block:: console
$ spack module tcl refresh
or
.. code-block:: console
$ spack module lmod refresh
Spack can also generate module files for all future installations automatically through the
following configuration:
.. code-block:: console
$ spack config add modules:default:enable:[tcl]
or
.. code-block:: console
$ spack config add modules:default:enable:[lmod]
Assuming you have a module system installed, you should now be able to use the ``module`` command
to interact with them:
.. code-block:: console
@@ -65,33 +89,17 @@ scheme used at your site.
Module file customization
-------------------------
Module files are generated by post-install hooks after the successful
installation of a package.
.. note::
Spack only generates modulefiles when a package is installed. If
you attempt to install a package and it is already installed, Spack
will not regenerate modulefiles for the package. This may lead to
inconsistent modulefiles if the Spack module configuration has
changed since the package was installed, either by editing a file
or changing scopes or environments.
Later in this section there is a subsection on :ref:`regenerating
modules <cmd-spack-module-refresh>` that will allow you to bring
your modules to a consistent state.
The table below summarizes the essential information associated with
the different file formats that can be generated by Spack:
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
| | **Hook name** | **Default root directory** | **Default template file** | **Compatible tools** |
+=============================+====================+===============================+==============================================+======================+
| **Tcl - Non-Hierarchical** | ``tcl`` | share/spack/modules | share/spack/templates/modules/modulefile.tcl | Env. Modules/Lmod |
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
| **Lua - Hierarchical** | ``lmod`` | share/spack/lmod | share/spack/templates/modules/modulefile.lua | Lmod |
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
+-----------+--------------+------------------------------+----------------------------------------------+----------------------+
| | Hierarchical | **Default root directory** | **Default template file** | **Compatible tools** |
+===========+==============+==============================+==============================================+======================+
| ``tcl`` | No | share/spack/modules | share/spack/templates/modules/modulefile.tcl | Env. Modules/Lmod |
+-----------+--------------+------------------------------+----------------------------------------------+----------------------+
| ``lmod`` | Yes | share/spack/lmod | share/spack/templates/modules/modulefile.lua | Lmod |
+-----------+--------------+------------------------------+----------------------------------------------+----------------------+
Spack ships with sensible defaults for the generation of module files, but
@@ -102,7 +110,7 @@ In general you can override or extend the default behavior by:
2. writing specific rules in the ``modules.yaml`` configuration file
3. writing your own templates to override or extend the defaults
The former method let you express changes in the run-time environment
The former method lets you express changes in the run-time environment
that are needed to use the installed software properly, e.g. injecting variables
from language interpreters into their extensions. The latter two instead permit to
fine tune the filesystem layout, content and creation of module files to meet
@@ -110,79 +118,62 @@ site specific conventions.
.. _overide-api-calls-in-package-py:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Override API calls in ``package.py``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting environment variables dynamically in ``package.py``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
There are two methods that you can override in any ``package.py`` to affect the
content of the module files generated by Spack. The first one:
There are two methods that you can implement in any ``package.py`` to dynamically affect the
content of the module files generated by Spack. The most important one is
``setup_run_environment``, which can be used to set environment variables in the module file that
depend on the spec:
.. code-block:: python
def setup_run_environment(self, env):
pass
if self.spec.satisfies("+foo"):
env.set("FOO", "bar")
can alter the content of the module file associated with the same package where it is overridden.
The second method:
The second, less commonly used, is ``setup_dependent_run_environment(self, env, dependent_spec)``,
which allows a dependency to set variables in the module file of its dependents. This is typically
used in packages like ``python``, ``r``, or ``perl`` to prepend the dependent's prefix to the
search path of the interpreter (``PYTHONPATH``, ``R_LIBS``, ``PERL5LIB`` resp.), so it can locate
the packages at runtime.
For example, a simplified version of the ``python`` package could look like this:
.. code-block:: python
def setup_dependent_run_environment(self, env, dependent_spec):
pass
if dependent_spec.package.extends(self.spec):
env.prepend_path("PYTHONPATH", dependent_spec.prefix.lib.python)
can instead inject run-time environment modifications in the module files of packages
that depend on it. In both cases you need to fill ``env`` with the desired
list of environment modifications.
.. admonition:: The ``r`` package and callback APIs
An example in which it is crucial to override both methods
is given by the ``r`` package. This package installs libraries and headers
in non-standard locations and it is possible to prepend the appropriate directory
to the corresponding environment variables:
================== =================================
LD_LIBRARY_PATH ``self.prefix/rlib/R/lib``
PKG_CONFIG_PATH ``self.prefix/rlib/pkgconfig``
================== =================================
with the following snippet:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/r/package.py
:pyobject: R.setup_run_environment
The ``r`` package also knows which environment variable should be modified
to make language extensions provided by other packages available, and modifies
it appropriately in the override of the second method:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/r/package.py
:pyobject: R.setup_dependent_run_environment
and would make any package that ``extends("python")`` have its library directory added to the
``PYTHONPATH`` environment variable in the module file. It's much more convenient to set this
variable here, than to repeat it in every Python extension's ``setup_run_environment`` method.
.. _modules-yaml:
^^^^^^^^^^^^^^^^^^^^^^^^^^
Write a configuration file
^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``modules.yaml`` config file and module sets
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The configuration files that control module generation behavior
are named ``modules.yaml``. The default configuration:
The configuration files that control module generation behavior are named ``modules.yaml``. The
default configuration looks like this:
.. literalinclude:: _spack_root/etc/spack/defaults/modules.yaml
:language: yaml
activates the hooks to generate ``tcl`` module files and inspects
the installation folder of each package for the presence of a set of subdirectories
(``bin``, ``man``, ``share/man``, etc.). If any is found its full path is prepended
to the environment variables listed below the folder name.
You can define one or more **module sets**, each of which can be configured separately with regard
to install location, naming scheme, inclusion and exclusion, autoloading, et cetera.
Spack modules can be configured for multiple module sets. The default
module set is named ``default``. All Spack commands which operate on
modules default to apply the ``default`` module set, but can be
applied to any module set in the configuration.
The default module set is aptly named ``default``. All
:ref:`Spack commands that operate on modules <maintaining-module-files>` apply to the ``default``
module set, unless another module set is specified explicitly (with the ``--name`` flag).
"""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^
Changing the modules root
"""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^
As shown in the table above, the default module root for ``lmod`` is
``$spack/share/spack/lmod`` and the default root for ``tcl`` is
@@ -198,7 +189,7 @@ set by changing the ``roots`` key of the configuration.
my_custom_lmod_modules:
roots:
lmod: /path/to/install/custom/lmod/modules
...
# ...
This configuration will create two module sets. The default module set
will install its ``tcl`` modules to ``/path/to/install/tcl/modules``
@@ -224,25 +215,32 @@ location could be confusing to users of your modules. In the next
section, we will discuss enabling and disabling module types (module
file generators) for each module set.
""""""""""""""""""""
Activate other hooks
""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Automatically generating module files
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Any other module file generator shipped with Spack can be activated adding it to the
list under the ``enable`` key in the module file. Currently the only generator that
is not active by default is ``lmod``, which produces hierarchical lua module files.
Each module system can then be configured separately. In fact, you should list configuration
options that affect a particular type of module files under a top level key corresponding
to the generator being customized:
Spack can be configured to automatically generate module files as part of package installation.
This is done by adding the desired module systems to the ``enable`` list.
.. code-block:: yaml
modules:
default:
enable:
- tcl
- lmod
- tcl
- lmod
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Configuring ``tcl`` and ``lmod`` modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can configure the behavior of either module system separately, under a key corresponding to
the generator being customized:
.. code-block:: yaml
modules:
default:
tcl:
# contains environment modules specific customizations
lmod:
@@ -253,16 +251,70 @@ either change the layout of the module files on the filesystem, or they will aff
their content. For the latter point it is possible to use anonymous specs
to fine tune the set of packages on which the modifications should be applied.
.. _autoloading-dependencies:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Autoloading and hiding dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A module file should set the variables that are needed for an application to work. But since an
application often has many dependencies, where should all the environment variables for those be
set? In Spack the rule is that each package sets the runtime variables that are needed by the
package itself, and no more. This way, dependencies can be loaded standalone too, and duplication
of environment variables is avoided.
That means however that if you want to use an application, you need to load the modules for all its
dependencies. Of course this is not something you would want users to do manually.
Since Spack knows the dependency graph of every package, it can easily generate module files that
automatically load the modules for its dependencies recursively. It is enabled by default for both
Lmod and Environment Modules under the ``autoload: direct`` config option. The former system has
builtin support through the ``depends_on`` function; the latter simply uses a ``module load``
statement. Both module systems (at least in newer versions) do reference counting, so that if a
module is loaded by two different modules, it will only be unloaded after the others are.
The ``autoload`` key accepts the values ``none``, ``direct``, and ``all``. To disable it, use
``none``, and to enable, it's best to stick to ``direct``, which only autoloads the direct link and
run type dependencies, relying on recursive autoloading to load the rest.
A common complaint about autoloading is the large number of modules that are visible to the user.
Spack has a solution for this as well: ``hide_implicits: true``. This ensures that only those
packages you've explicitly installed are exposed by ``module avail``, but still allows for
autoloading of hidden dependencies. Lmod should support hiding implicits in general, while
Environment Modules requires version 4.7 or higher.
.. note::
If supported by your module system, we highly encourage the following configuration that enables
autoloading and hiding of implicits. It ensures all runtime variables are set correctly,
including those for dependencies, without overwhelming the user with a large number of available
modules. Further, it makes it easier to get readable module names without collisions; see the
section below on :ref:`modules-projections`.
.. code-block:: yaml
modules:
default:
tcl:
hide_implicits: true
all:
autoload: direct
lmod:
hide_implicits: true
all:
autoload: direct
.. _anonymous_specs:
""""""""""""""""""""""""""""
Selection by anonymous specs
""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting environment variables for selected packages in config
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In the configuration file you can use *anonymous specs* (i.e. specs
that **are not required to have a root package** and are thus used just
to express constraints) to apply certain modifications on a selected set
of the installed software. For instance, in the snippet below:
In the configuration file you can filter particular specs, and make further changes to the
environment variables that go into their module files. This is very powerful when you want to avoid
:ref:`modifying the package itself <overide-api-calls-in-package-py>`, or when you want to set
certain variables on multiple selected packages at once.
For instance, in the snippet below:
.. code-block:: yaml
@@ -305,12 +357,28 @@ the variable ``FOOBAR`` will be unset.
.. note::
Order does matter
The modifications associated with the ``all`` keyword are always evaluated
first, no matter where they appear in the configuration file. All the other
spec constraints are instead evaluated top to bottom.
first, no matter where they appear in the configuration file. All the other changes to
environment variables for matching specs are evaluated from top to bottom.
""""""""""""""""""""""""""""""""""""""""""""
.. warning::
As general advice, it's often better to set as few unnecessary variables as possible. For
example, the following seemingly innocent and potentially useful configuration
.. code-block:: yaml
all:
environment:
set:
"{name}_ROOT": "{prefix}"
sets ``BINUTILS_ROOT`` to its prefix in modules for ``binutils``, which happens to break
the ``gcc`` compiler: it uses this variable as its default search path for certain object
files and libraries, and by merely setting it, everything fails to link.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Exclude or include specific module files
""""""""""""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can use anonymous specs also to prevent module files from being written or
to force them to be written. Consider the case where you want to hide from users
@@ -330,14 +398,19 @@ you will prevent the generation of module files for any package that
is compiled with ``gcc@4.4.7``, with the only exception of any ``gcc``
or any ``llvm`` installation.
It is safe to combine ``exclude`` and ``autoload``
:ref:`mentioned above <autoloading-dependencies>`. When ``exclude`` prevents a module file from being
generated for a dependency, the ``autoload`` feature will simply not generate a statement to load
it.
.. _modules-projections:
"""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Customize the naming of modules
"""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The names of environment modules generated by spack are not always easy to
The names of environment modules generated by Spack are not always easy to
fully comprehend due to the long hash in the name. There are three module
configuration options to help with that. The first is a global setting to
adjust the hash length. It can be set anywhere from 0 to 32 and has a default
@@ -353,6 +426,13 @@ shows how to set hash length in the module file names:
tcl:
hash_length: 7
.. tip::
Using ``hide_implicits: true`` (see :ref:`autoloading-dependencies`) vastly reduces the number of
modules exposed to the user. The hidden modules always contain the hash in their name, and are
not influenced by the ``hash_length`` setting. Hidden implicits thus make it easier to use a
short hash length or no hash at all, without risking name conflicts.
To help make module names more readable, and to help alleviate name conflicts
with a short hash, one can use the ``suffixes`` option in the modules
configuration file. This option will add strings to modules that match a spec.
@@ -365,12 +445,12 @@ For instance, the following config options,
tcl:
all:
suffixes:
^python@2.7.12: 'python-2.7.12'
^python@3.12: 'python-3.12'
^openblas: 'openblas'
will add a ``python-2.7.12`` version string to any packages compiled with
python matching the spec, ``python@2.7.12``. This is useful to know which
version of python a set of python extensions is associated with. Likewise, the
will add a ``python-3.12`` version string to any packages compiled with
Python matching the spec, ``python@3.12``. This is useful to know which
version of Python a set of Python extensions is associated with. Likewise, the
``openblas`` string is attached to any program that has openblas in the spec,
most likely via the ``+blas`` variant specification.
@@ -468,41 +548,11 @@ that are already in the Lmod hierarchy.
For hierarchies that are deeper than three layers ``lmod spider`` may have some issues.
See `this discussion on the Lmod project <https://github.com/TACC/Lmod/issues/114>`_.
""""""""""""""""""""""
Select default modules
""""""""""""""""""""""
By default, when multiple modules of the same name share a directory,
the highest version number will be the default module. This behavior
of the ``module`` command can be overridden with a symlink named
``default`` to the desired default module. If you wish to configure
default modules with Spack, add a ``defaults`` key to your modules
configuration:
.. code-block:: yaml
modules:
my-module-set:
tcl:
defaults:
- gcc@10.2.1
- hdf5@1.2.10+mpi+hl%gcc
These defaults may be arbitrarily specific. For any package that
satisfies a default, Spack will generate the module file in the
appropriate path, and will generate a default symlink to the module
file as well.
.. warning::
If Spack is configured to generate multiple default packages in the
same directory, the last modulefile to be generated will be the
default module.
.. _customize-env-modifications:
"""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Customize environment modifications
"""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can control which prefixes in a Spack package are added to
environment variables with the ``prefix_inspections`` section; this
@@ -600,9 +650,9 @@ stack to users who are likely to inspect the modules to find full
paths to software, when it is desirable to present the users with a
simpler set of paths than those generated by the Spack install tree.
""""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Filter out environment modifications
""""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Modifications to certain environment variables in module files are there by
default, for instance because they are generated by prefix inspections.
@@ -622,49 +672,37 @@ do so by using the ``exclude_env_vars``:
The configuration above will generate module files that will not contain
modifications to either ``CPATH`` or ``LIBRARY_PATH``.
^^^^^^^^^^^^^^^^^^^^^^
Select default modules
^^^^^^^^^^^^^^^^^^^^^^
.. _autoloading-dependencies:
"""""""""""""""""""""
Autoload dependencies
"""""""""""""""""""""
Often it is required for a module to have its (transitive) dependencies loaded as well.
One example where this is useful is when one package needs to use executables provided
by its dependency; when the dependency is autoloaded, the executable will be in the
PATH. Similarly for scripting languages such as Python, packages and their dependencies
have to be loaded together.
Autoloading is enabled by default for Lmod and Environment Modules. The former
has builtin support through the ``depends_on`` function. The latter uses a
``module load`` statement to load and track dependencies.
Autoloading can also be enabled conditionally:
By default, when multiple modules of the same name share a directory,
the highest version number will be the default module. This behavior
of the ``module`` command can be overridden with a symlink named
``default`` to the desired default module. If you wish to configure
default modules with Spack, add a ``defaults`` key to your modules
configuration:
.. code-block:: yaml
modules:
default:
tcl:
all:
autoload: none
^python:
autoload: direct
modules:
my-module-set:
tcl:
defaults:
- gcc@10.2.1
- hdf5@1.2.10+mpi+hl%gcc
The configuration file above will produce module files that will
load their direct dependencies if the package installed depends on ``python``.
The allowed values for the ``autoload`` statement are either ``none``,
``direct`` or ``all``.
These defaults may be arbitrarily specific. For any package that
satisfies a default, Spack will generate the module file in the
appropriate path, and will generate a default symlink to the module
file as well.
.. note::
Tcl prerequisites
In the ``tcl`` section of the configuration file it is possible to use
the ``prerequisites`` directive that accepts the same values as
``autoload``. It will produce module files that have a ``prereq``
statement, which autoloads dependencies on Environment Modules when its
``auto_handling`` configuration option is enabled. If Environment Modules
is installed with Spack, ``auto_handling`` is enabled by default starting
version 4.2. Otherwise it is enabled by default since version 5.0.
.. warning::
If Spack is configured to generate multiple default packages in the
same directory, the last modulefile to be generated will be the
default module.
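To make the three ``autoload`` modes above concrete, here is a small,
hypothetical helper (not Spack's actual implementation) showing which
dependencies each value would pull in:

.. code-block:: python

   def deps_to_autoload(direct_deps, all_deps, mode):
       """Pick the dependencies a module file would autoload (illustrative only)."""
       if mode == "none":
           return []
       if mode == "direct":
           return list(direct_deps)
       if mode == "all":
           return list(all_deps)
       raise ValueError(f"unknown autoload mode: {mode}")

   # Assumed DAG: py-numpy depends directly on python, transitively also on zlib.
   print(deps_to_autoload(["python"], ["python", "zlib"], "direct"))  # ['python']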
.. _maintaining-module-files:
------------------------
Maintaining Module Files
------------------------

View File

@@ -647,6 +647,8 @@ manually placed files within the install prefix are owned by the
assigned group. If no group is assigned, Spack will allow the OS
default behavior to go as expected.
.. _assigning-package-attributes:
----------------------------
Assigning Package Attributes
----------------------------
@@ -657,10 +659,11 @@ You can assign class-level attributes in the configuration:
packages:
  mpileaks:
    package_attributes:
      # Override existing attributes
      url: http://www.somewhereelse.com/mpileaks-1.0.tar.gz
      # ... or add new ones
      x: 1
Attributes set this way will be accessible to any method executed
in the package.py file (e.g. the ``install()`` method). Values for these
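As a hedged illustration (a hypothetical ``package.py`` mirroring the example
above), such attributes are read like any other attribute on the package:

.. code-block:: python

   from spack.package import *


   class Mpileaks(Package):
       """Hypothetical package matching the configuration example above."""

       def install(self, spec, prefix):
           # ``url`` was overridden and ``x`` added via ``package_attributes``,
           # so both are visible as regular attributes on the package instance.
           print(self.url, self.x)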

View File

@@ -810,7 +810,7 @@ generated by ``spack ci generate``. You also want your generated rebuild jobs
.. code-block:: yaml
spack:
  # ...
  ci:
    pipeline-gen:
    - build-job:

View File

@@ -17,7 +17,7 @@ experimental software separately from the built-in repository. Spack
allows you to configure local repositories using either the
``repos.yaml`` or the ``spack repo`` command.
A package repository is a directory structured like this::

   repo/
      repo.yaml

View File

@@ -2,12 +2,12 @@ sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.0
docutils==0.20.1
pygments==2.17.2
urllib3==2.2.0
pytest==8.0.1
isort==5.13.2
black==24.2.0
flake8==7.0.0
mypy==1.8.0

View File

@@ -171,7 +171,7 @@ def polite_path(components: Iterable[str]):
@memoized
def _polite_antipattern():
# A regex of all the characters we don't want in a filename
return re.compile(r"[^A-Za-z0-9_.-]")
return re.compile(r"[^A-Za-z0-9_+.-]")
def polite_filename(filename: str) -> str:
@@ -920,29 +920,35 @@ def get_filetype(path_name):
return output.strip()
@system_path_filter
def has_shebang(path):
    """Returns whether a path has a shebang line. Returns False if the file cannot be opened."""
    try:
        with open(path, "rb") as f:
            return f.read(2) == b"#!"
    except OSError:
        return False


@system_path_filter
def is_nonsymlink_exe_with_shebang(path):
    """Returns whether the path is an executable regular file with a shebang. Returns False too
    when the path is a symlink to a script, and also when the file cannot be opened."""
    try:
        st = os.lstat(path)
    except OSError:
        return False
    # Should not be a symlink
    if stat.S_ISLNK(st.st_mode):
        return False
    # Should be executable
    if not st.st_mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH):
        return False
    return has_shebang(path)
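# Quick illustration of the distinction these predicates draw (a sketch: POSIX-only,
# hypothetical temporary files, not part of the module above).
import os, stat, tempfile

d = tempfile.mkdtemp()
script = os.path.join(d, "run.sh")
with open(script, "w") as f:
    f.write("#!/bin/sh\necho hi\n")
os.chmod(script, os.stat(script).st_mode | stat.S_IXUSR)
link = os.path.join(d, "run-link")
os.symlink(script, link)

print(has_shebang(script))                     # True
print(is_nonsymlink_exe_with_shebang(script))  # True: executable regular file
print(is_nonsymlink_exe_with_shebang(link))    # False: the path itself is a symlink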
@system_path_filter(arg_slice=slice(1))
def chgrp_if_not_world_writable(path, group):
"""chgrp path to group if path is not world writable"""
@@ -1377,120 +1383,89 @@ def traverse_tree(
yield (source_path, dest_path)
def lexists_islink_isdir(path):
    """Computes the tuple (lexists(path), islink(path), isdir(path)) in a minimal
    number of stat calls on unix. Use os.path and symlink.islink methods for windows."""
    if sys.platform == "win32":
        if not os.path.lexists(path):
            return False, False, False
        return os.path.lexists(path), islink(path), os.path.isdir(path)

    # First try to lstat, so we know if it's a link or not.
    try:
        lst = os.lstat(path)
    except (IOError, OSError):
        return False, False, False

    is_link = stat.S_ISLNK(lst.st_mode)

    # Check whether file is a dir.
    if not is_link:
        is_dir = stat.S_ISDIR(lst.st_mode)
        return True, is_link, is_dir

    # Check whether symlink points to a dir.
    try:
        st = os.stat(path)
        is_dir = stat.S_ISDIR(st.st_mode)
    except (IOError, OSError):
        # Dangling symlink (i.e. it lexists but not exists)
        is_dir = False

    return True, is_link, is_dir
class BaseDirectoryVisitor:
    """Base class and interface for :py:func:`visit_directory_tree`."""

    def visit_file(self, root: str, rel_path: str, depth: int) -> None:
        """Handle the non-symlink file at ``os.path.join(root, rel_path)``

        Parameters:
            root: root directory
            rel_path: relative path to current file from ``root``
            depth (int): depth of current file from the ``root`` directory"""
        pass

    def visit_symlinked_file(self, root: str, rel_path: str, depth) -> None:
        """Handle the symlink to a file at ``os.path.join(root, rel_path)``. Note: ``rel_path`` is
        the location of the symlink, not to what it is pointing to. The symlink may be dangling.

        Parameters:
            root: root directory
            rel_path: relative path to current symlink from ``root``
            depth: depth of current symlink from the ``root`` directory"""
        pass

    def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
        """Return True from this function to recurse into the directory at
        os.path.join(root, rel_path). Return False in order not to recurse further.

        Parameters:
            root: root directory
            rel_path: relative path to current directory from ``root``
            depth: depth of current directory from the ``root`` directory

        Returns:
            bool: ``True`` when the directory should be recursed into. ``False`` when
            not"""
        return False

    def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
        """Return ``True`` to recurse into the symlinked directory and ``False`` in order not to.
        Note: ``rel_path`` is the path to the symlink itself. Following symlinked directories
        blindly can cause infinite recursion due to cycles.

        Parameters:
            root: root directory
            rel_path: relative path to current symlink from ``root``
            depth: depth of current symlink from the ``root`` directory

        Returns:
            bool: ``True`` when the directory should be recursed into. ``False`` when
            not"""
        return False

    def after_visit_dir(self, root: str, rel_path: str, depth: int) -> None:
        """Called after recursion into ``rel_path`` finished. This function is not called when
        ``rel_path`` was not recursed into.

        Parameters:
            root: root directory
            rel_path: relative path to current directory from ``root``
            depth: depth of current directory from the ``root`` directory"""
        pass

    def after_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> None:
        """Called after recursion into ``rel_path`` finished. This function is not called when
        ``rel_path`` was not recursed into.

        Parameters:
            root: root directory
            rel_path: relative path to current symlink from ``root``
            depth: depth of current symlink from the ``root`` directory"""
        pass


def visit_directory_tree(
    root: str, visitor: BaseDirectoryVisitor, rel_path: str = "", depth: int = 0
):
    """Recurses the directory root depth-first through a visitor pattern using the interface from
    :py:class:`BaseDirectoryVisitor`

    Parameters:
        root: path of directory to recurse into
        visitor: what visitor to use
        rel_path: current relative path from the root
        depth: current depth from the root
    """
    dir = os.path.join(root, rel_path)
    dir_entries = sorted(os.scandir(dir), key=lambda d: d.name)
@@ -1498,26 +1473,19 @@ def visit_directory_tree(root, visitor, rel_path="", depth=0):
    for f in dir_entries:
        rel_child = os.path.join(rel_path, f.name)
        islink = f.is_symlink()
        # On Windows, symlinks to directories are distinct from symlinks to files, and it is
        # possible to create a broken symlink to a directory (e.g. using os.symlink without
        # `target_is_directory=True`), invoking `isdir` on a symlink on Windows that is broken in
        # this manner will result in an error. In this case we can work around the issue by reading
        # the target and resolving the directory ourselves
        try:
            isdir = f.is_dir()
        except OSError as e:
            if sys.platform == "win32" and hasattr(e, "winerror") and e.winerror == 5 and islink:
                # if path is a symlink, determine destination and evaluate file vs directory
                link_target = resolve_link_target_relative_to_the_link(f)
                # link_target might be relative but resolve_link_target_relative_to_the_link
                # will ensure that if so, that it is relative to the CWD and therefore makes sense
                isdir = os.path.isdir(link_target)
            else:
                raise e
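# Minimal visitor sketch (illustrative, assumes only the interface above): count
# regular files under a directory tree.
class FileCounter(BaseDirectoryVisitor):
    def __init__(self):
        self.count = 0

    def visit_file(self, root, rel_path, depth):
        self.count += 1

    def before_visit_dir(self, root, rel_path, depth):
        return True  # recurse into every directory (symlinked dirs stay skipped)


counter = FileCounter()
visit_directory_tree(os.getcwd(), counter)
print(counter.count)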

View File

@@ -8,7 +8,7 @@
import filecmp
import os
import shutil
from typing import Callable, Dict, List, Optional, Tuple
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, touch, traverse_tree
@@ -51,32 +51,32 @@ class SourceMergeVisitor(BaseDirectoryVisitor):
- A list of merge conflicts in dst/
"""
    def __init__(self, ignore: Optional[Callable[[str], bool]] = None):
        self.ignore = ignore if ignore is not None else lambda f: False

        # When mapping <src root> to <dst root>/<projection>, we need to prepend the <projection>
        # bit to the relative path in the destination dir.
        self.projection: str = ""

        # Two files f and g conflict if they are not os.path.samefile(f, g) and they are both
        # projected to the same destination file. These conflicts are not necessarily fatal, and
        # can be resolved or ignored. For example <prefix>/LICENSE or
        # <site-packages>/<namespace>/__init__.py conflicts can be ignored).
        self.file_conflicts: List[MergeConflict] = []

        # When we have to create a dir where a file is, or a file where a dir is, we have fatal
        # errors, listed here.
        self.fatal_conflicts: List[MergeConflict] = []

        # What directories we have to make; this is an ordered dict, so that we have a fast lookup
        # and can run mkdir in order.
        self.directories: Dict[str, Tuple[str, str]] = {}

        # Files to link. Maps dst_rel to (src_root, src_rel). This is an ordered dict, where files
        # are guaranteed to be grouped by src_root in the order they were visited.
        self.files: Dict[str, Tuple[str, str]] = {}
    def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
        """
        Register a directory if dst / rel_path is not blocked by a file or ignored.
        """
@@ -104,7 +104,7 @@ def before_visit_dir(self, root, rel_path, depth):
        self.directories[proj_rel_path] = (root, rel_path)
        return True

    def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
        """
        Replace symlinked dirs with actual directories when possible in low depths,
        otherwise handle it as a file (i.e. we link to the symlink).
@@ -136,40 +136,56 @@ def before_visit_symlinked_dir(self, root, rel_path, depth):
self.visit_file(root, rel_path, depth)
return False
    def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = False) -> None:
        proj_rel_path = os.path.join(self.projection, rel_path)

        if self.ignore(rel_path):
            pass
        elif proj_rel_path in self.directories:
            # Can't create a file where a dir is; fatal error
            self.fatal_conflicts.append(
                MergeConflict(
                    dst=proj_rel_path,
                    src_a=os.path.join(*self.directories[proj_rel_path]),
                    src_b=os.path.join(root, rel_path),
                )
            )
        elif proj_rel_path in self.files:
            # When two files project to the same path, they conflict iff they are distinct.
            # If they are the same (i.e. one links to the other), register regular files rather
            # than symlinks. The reason is that in copy-type views, we need a copy of the actual
            # file, not the symlink.
            src_a = os.path.join(*self.files[proj_rel_path])
            src_b = os.path.join(root, rel_path)
            try:
                samefile = os.path.samefile(src_a, src_b)
            except OSError:
                samefile = False

            if not samefile:
                # Distinct files produce a conflict.
                self.file_conflicts.append(
                    MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
                )
                return

            if not symlink:
                # Remove the link in favor of the actual file. The del is necessary to maintain the
                # order of the files dict, which is grouped by root.
                del self.files[proj_rel_path]
                self.files[proj_rel_path] = (root, rel_path)
        else:
            # Otherwise register this file to be linked.
            self.files[proj_rel_path] = (root, rel_path)

    def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
        # Treat symlinked files as ordinary files (without "dereferencing")
        self.visit_file(root, rel_path, depth, symlink=True)

    def set_projection(self, projection: str) -> None:
        self.projection = os.path.normpath(projection)

        # Todo, is this how to check in general for empty projection?
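# Sketch of the samefile rule used in visit_file above (POSIX-only, hypothetical
# temporary files): a symlink to an already-registered file is *not* a conflict.
import os, tempfile

d = tempfile.mkdtemp()
a = os.path.join(d, "LICENSE")
open(a, "w").close()
b = os.path.join(d, "LICENSE-link")
os.symlink(a, b)
print(os.path.samefile(a, b))  # True: same inode, so no merge conflict is recorded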
@@ -197,24 +213,19 @@ def set_projection(self, projection):
class DestinationMergeVisitor(BaseDirectoryVisitor):
    """DestinationMergeVisitor takes a SourceMergeVisitor and:

    a. registers additional conflicts when merging to the destination prefix
    b. removes redundant mkdir operations when directories already exist in the destination prefix.

    This also makes sure that symlinked directories in the target prefix will never be merged with
    directories in the sources directories.
    """

    def __init__(self, source_merge_visitor: SourceMergeVisitor):
        self.src = source_merge_visitor

    def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
        # If destination dir is a file in a src dir, add a conflict,
        # and don't traverse deeper
        if rel_path in self.src.files:
@@ -236,7 +247,7 @@ def before_visit_dir(self, root, rel_path, depth):
# don't descend into it.
return False
    def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""
Symlinked directories in the destination prefix should
be seen as files; we should not accidentally merge
@@ -262,7 +273,7 @@ def before_visit_symlinked_dir(self, root, rel_path, depth):
# Never descend into symlinked target dirs.
return False
    def visit_file(self, root: str, rel_path: str, depth: int) -> None:
# Can't merge a file if target already exists
if rel_path in self.src.directories:
src_a_root, src_a_relpath = self.src.directories[rel_path]
@@ -280,7 +291,7 @@ def visit_file(self, root, rel_path, depth):
)
)
    def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
# Treat symlinked files as ordinary files (without "dereferencing")
self.visit_file(root, rel_path, depth)

View File

@@ -244,7 +244,7 @@ def _search_duplicate_specs_in_externals(error_cls):
+ lines
+ ["as they might result in non-deterministic hashes"]
)
except (TypeError, AttributeError):
details = []
errors.append(error_cls(summary=error_msg, details=details))
@@ -292,12 +292,6 @@ def _avoid_mismatched_variants(error_cls):
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
for pkg_name in packages_yaml:
# 'all:' must be more forgiving, since it is setting defaults for everything
if pkg_name == "all" or "variants" not in packages_yaml[pkg_name]:
@@ -317,7 +311,7 @@ def make_error(config_data, summary):
f"Setting a preference for the '{pkg_name}' package to the "
f"non-existing variant '{variant.name}'"
)
errors.append(_make_config_error(preferences, summary, error_cls=error_cls))
continue
# Variant cannot accept this value
@@ -329,11 +323,41 @@ def make_error(config_data, summary):
f"Setting the variant '{variant.name}' of the '{pkg_name}' package "
f"to the invalid value '{str(variant)}'"
)
errors.append(_make_config_error(preferences, summary, error_cls=error_cls))
return errors
@config_packages
def _wrongly_named_spec(error_cls):
    """Warns if the wrong name is used for an external spec"""
    errors = []
    packages_yaml = spack.config.CONFIG.get_config("packages")
    for pkg_name in packages_yaml:
        if pkg_name == "all":
            continue

        externals = packages_yaml[pkg_name].get("externals", [])
        is_virtual = spack.repo.PATH.is_virtual(pkg_name)
        for entry in externals:
            spec = spack.spec.Spec(entry["spec"])
            regular_pkg_is_wrong = not is_virtual and pkg_name != spec.name
            virtual_pkg_is_wrong = is_virtual and not any(
                p.name == spec.name for p in spack.repo.PATH.providers_for(pkg_name)
            )
            if regular_pkg_is_wrong or virtual_pkg_is_wrong:
                summary = f"Wrong external spec detected for '{pkg_name}': {spec}"
                errors.append(_make_config_error(entry, summary, error_cls=error_cls))
    return errors


def _make_config_error(config_data, summary, error_cls):
    s = io.StringIO()
    s.write("Occurring in the following file:\n")
    syaml.dump_config(config_data, stream=s, blame=True)
    return error_cls(summary=summary, details=[s.getvalue()])
#: Sanity checks on package directives
package_directives = AuditClass(
group="packages",
@@ -772,13 +796,33 @@ def check_virtual_with_variants(spec, msg):
except spack.repo.UnknownPackageError:
# This dependency is completely missing, so report
# and continue the analysis
summary = f"{pkg_name}: unknown package '{dep_name}' in 'depends_on' directive"
details = [f" in {filename}"]
errors.append(error_cls(summary=summary, details=details))
continue
# Check for self-referential specs similar to:
#
# depends_on("foo@X.Y", when="^foo+bar")
#
# That would allow clingo to choose whether to have foo@X.Y+bar in the graph.
problematic_edges = [
x for x in when.edges_to_dependencies(dep_name) if not x.virtuals
]
if problematic_edges and not dep.patches:
summary = (
f"{pkg_name}: dependency on '{dep.spec}' when '{when}' is self-referential"
)
details = [
(
f" please specify better using '^[virtuals=...] {dep_name}', or "
f"substitute with an equivalent condition on '{pkg_name}'"
),
f" in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
continue
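# Illustration of the flagged pattern (hypothetical package): a directive like
#
#     depends_on("foo@1.2", when="^foo+bar")
#
# lets the concretizer satisfy the condition by *choosing* foo+bar itself. The audit
# suggests depending via '^[virtuals=...] foo' instead, or moving the condition onto
# the package itself (e.g. when="+bar" if 'bar' is a variant of this package).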
# check variants
dependency_variants = dep.spec.variants
for name, value in dependency_variants.items():

View File

@@ -542,7 +542,7 @@ def verify_patchelf(patchelf: "spack.util.executable.Executable") -> bool:
return version >= spack.version.Version("0.13.1")
def ensure_patchelf_in_path_or_raise() -> spack.util.executable.Executable:
"""Ensure patchelf is in the PATH or raise."""
# The old concretizer is not smart and we're doing its job: if the latest patchelf
# does not concretize because the compiler doesn't support C++17, we try to

View File

@@ -146,7 +146,7 @@ def mypy_root_spec() -> str:
def black_root_spec() -> str:
"""Return the root spec used to bootstrap black"""
return _root_spec("py-black@:23.1.0")
return _root_spec("py-black@:24.1.0")
def flake8_root_spec() -> str:

View File

@@ -199,6 +199,8 @@ def initconfig_mpi_entries(self):
mpiexec = "/usr/bin/srun"
else:
mpiexec = os.path.join(spec["slurm"].prefix.bin, "srun")
elif hasattr(spec["mpi"].package, "mpiexec"):
mpiexec = spec["mpi"].package.mpiexec
else:
mpiexec = os.path.join(spec["mpi"].prefix.bin, "mpirun")
if not os.path.exists(mpiexec):

View File

@@ -58,6 +58,62 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
)
def _supports_compilation_databases(pkg: spack.package_base.PackageBase) -> bool:
    """Check if this package (and CMake) can support compilation databases."""

    # CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5
    if not pkg.spec.satisfies("^cmake@3.5:"):
        return False

    # CMAKE_EXPORT_COMPILE_COMMANDS is only implemented for Makefile and Ninja generators
    if not (pkg.spec.satisfies("generator=make") or pkg.spec.satisfies("generator=ninja")):
        return False

    return True
def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[str]) -> None:
    """Set a few default defines for CMake, depending on its version."""
    cmakes = pkg.spec.dependencies("cmake", dt.BUILD)

    if len(cmakes) != 1:
        return

    cmake = cmakes[0]

    # CMAKE_INTERPROCEDURAL_OPTIMIZATION only exists for CMake >= 3.9
    try:
        ipo = pkg.spec.variants["ipo"].value
    except KeyError:
        ipo = False

    if cmake.satisfies("@3.9:"):
        args.append(CMakeBuilder.define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))

    # Disable Package Registry: export(PACKAGE) may put files in the user's home directory, and
    # find_package may search there. This is not what we want.

    # Do not populate CMake User Package Registry
    if cmake.satisfies("@3.15:"):
        # see https://cmake.org/cmake/help/latest/policy/CMP0090.html
        args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
    elif cmake.satisfies("@3.1:"):
        # see https://cmake.org/cmake/help/latest/variable/CMAKE_EXPORT_NO_PACKAGE_REGISTRY.html
        args.append(CMakeBuilder.define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))

    # Do not use CMake User/System Package Registry
    # https://cmake.org/cmake/help/latest/manual/cmake-packages.7.html#disabling-the-package-registry
    if cmake.satisfies("@3.16:"):
        args.append(CMakeBuilder.define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
    elif cmake.satisfies("@3.1:3.15"):
        args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
        args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))

    # Export a compilation database if supported.
    if _supports_compilation_databases(pkg):
        args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
def generator(*names: str, default: Optional[str] = None):
"""The build system generator to use.
@@ -246,7 +302,10 @@ class CMakeBuilder(BaseBuilder):
@property
def archive_files(self):
"""Files to archive for packages based on CMake"""
        files = [os.path.join(self.build_directory, "CMakeCache.txt")]
        if _supports_compilation_databases(self):
            files.append(os.path.join(self.build_directory, "compile_commands.json"))
        return files
@property
def root_cmakelists_dir(self):
@@ -293,11 +352,6 @@ def std_args(pkg, generator=None):
except KeyError:
build_type = "RelWithDebInfo"
define = CMakeBuilder.define
args = [
"-G",
@@ -306,10 +360,6 @@ def std_args(pkg, generator=None):
define("CMAKE_BUILD_TYPE", build_type),
]
if primary_generator == "Unix Makefiles":
args.append(define("CMAKE_VERBOSE_MAKEFILE", True))
@@ -318,6 +368,7 @@ def std_args(pkg, generator=None):
[define("CMAKE_FIND_FRAMEWORK", "LAST"), define("CMAKE_FIND_APPBUNDLE", "LAST")]
)
_conditional_cmake_defaults(pkg, args)
_maybe_set_python_hints(pkg, args)
# Set up CMake rpath

View File

@@ -218,7 +218,7 @@ def pset_components(self):
"+inspector": " intel-inspector",
"+itac": " intel-itac intel-ta intel-tc" " intel-trace-analyzer intel-trace-collector",
# Trace Analyzer and Collector
"+vtune": " intel-vtune"
"+vtune": " intel-vtune",
# VTune, ..-profiler since 2020, ..-amplifier before
}.items():
if variant in self.spec:

View File

@@ -29,15 +29,12 @@ class LuaPackage(spack.package_base.PackageBase):
with when("build_system=lua"):
depends_on("lua-lang")
extends("lua", when="^lua")
with when("^lua-luajit"):
extends("lua-luajit")
depends_on("luajit")
depends_on("lua-luajit+lualinks")
with when("^lua-luajit-openresty"):
extends("lua-luajit-openresty")
depends_on("luajit")
depends_on("lua-luajit-openresty+lualinks")
with when("^[virtuals=lua-lang] lua"):
extends("lua")
with when("^[virtuals=lua-lang] lua-luajit"):
extends("lua-luajit+lualinks")
with when("^[virtuals=lua-lang] lua-luajit-openresty"):
extends("lua-luajit-openresty+lualinks")
@property
def lua(self):

View File

@@ -9,10 +9,13 @@
import shutil
from os.path import basename, isdir
from llnl.util import tty
from llnl.util.filesystem import HeaderList, LibraryList, find_libraries, join_path, mkdirp
from llnl.util.link_tree import LinkTree
from spack.build_environment import dso_suffix
from spack.directives import conflicts, variant
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -179,16 +182,72 @@ class IntelOneApiLibraryPackage(IntelOneApiPackage):
"""
    def openmp_libs(self):
        """Supply LibraryList for linking OpenMP"""

        # NB: Hunting down explicit library files may be the Spack way of
        # doing things, but it is better to add the compiler defined option
        # e.g. -fopenmp

        # If other packages use openmp, then all the packages need to
        # support the same ABI. Spack usually uses the same compiler
        # for all the packages, but you can force it if necessary:
        #
        # e.g. spack install blaspp%oneapi@2024 ^intel-oneapi-mkl%oneapi@2024
        #
        if self.spec.satisfies("%intel") or self.spec.satisfies("%oneapi"):
            libname = "libiomp5"
        elif self.spec.satisfies("%gcc"):
            libname = "libgomp"
        elif self.spec.satisfies("%clang"):
            libname = "libomp"
        else:
            raise InstallError(
                "OneAPI package with OpenMP threading requires one of %clang, %gcc, %oneapi, "
                "or %intel"
            )

        # query the compiler for the library path
        with self.compiler.compiler_environment():
            omp_lib_path = Executable(self.compiler.cc)(
                "--print-file-name", f"{libname}.{dso_suffix}", output=str
            ).strip()

        # Newer versions of clang do not give the full path to libomp. If that's
        # the case, look in a path relative to the compiler where libomp is
        # typically found. If it's not found there, error out.
        if not os.path.exists(omp_lib_path) and self.spec.satisfies("%clang"):
            compiler_root = os.path.dirname(os.path.dirname(os.path.realpath(self.compiler.cc)))
            omp_lib_path_compiler = os.path.join(compiler_root, "lib", f"{libname}.{dso_suffix}")
            if os.path.exists(omp_lib_path_compiler):
                omp_lib_path = omp_lib_path_compiler

        # if the compiler cannot find the file, it returns the input path
        if not os.path.exists(omp_lib_path):
            raise InstallError(f"OneAPI package cannot locate OpenMP library: {omp_lib_path}")

        omp_libs = LibraryList(omp_lib_path)
        tty.info(f"OneAPI package requires OpenMP library: {omp_libs}")
        return omp_libs
    # find_headers uses heuristics to determine the include directory
    # that does not work for oneapi packages. Use explicit directories
    # instead.
    def header_directories(self, dirs):
        h = HeaderList([])
        h.directories = dirs
        # trilinos passes the directories to cmake, and cmake requires
        # that the directory exists
        for dir in dirs:
            if not isdir(dir):
                raise RuntimeError(f"{dir} does not exist")
        return h

    @property
    def headers(self):
        # This should match the directories added to CPATH by
        # env/vars.sh for the component
        return self.header_directories([self.component_prefix.include])
@property
def libs(self):

View File

@@ -6,7 +6,8 @@
import os
import re
import shutil
import stat
from typing import Dict, Iterable, List, Mapping, Optional, Tuple
import archspec
@@ -136,31 +137,52 @@ def view_file_conflicts(self, view, merge_map):
return conflicts
    def add_files_to_view(self, view, merge_map, skip_if_exists=True):
        # Patch up shebangs to the python linked in the view only if python is built by Spack.
        if not self.extendee_spec or self.extendee_spec.external:
            return super().add_files_to_view(view, merge_map, skip_if_exists)

        # We only patch shebangs in the bin directory.
        copied_files: Dict[Tuple[int, int], str] = {}  # File identifier -> source
        delayed_links: List[Tuple[str, str]] = []  # List of symlinks from merge map

        bin_dir = self.spec.prefix.bin
        python_prefix = self.extendee_spec.prefix

        for src, dst in merge_map.items():
            if skip_if_exists and os.path.lexists(dst):
                continue

            if not fs.path_contains_subdirectory(src, bin_dir):
                view.link(src, dst)
                continue

            s = os.lstat(src)

            # Symlink is delayed because we may need to re-target if its target is copied in view
            if stat.S_ISLNK(s.st_mode):
                delayed_links.append((src, dst))
                continue

            # If it's executable and has a shebang, copy and patch it.
            if (s.st_mode & 0b111) and fs.has_shebang(src):
                copied_files[(s.st_dev, s.st_ino)] = dst
                shutil.copy2(src, dst)
                fs.filter_file(
                    python_prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
                )
            else:
                view.link(src, dst)

        # Finally re-target the symlinks that point to copied files.
        for src, dst in delayed_links:
            try:
                s = os.stat(src)
                target = copied_files[(s.st_dev, s.st_ino)]
            except (OSError, KeyError):
                target = None
            if target:
                os.symlink(target, dst)
            else:
                view.link(src, dst, spec=self.spec)
def remove_files_from_view(self, view, merge_map):
ignore_namespace = False

View File

@@ -35,9 +35,9 @@ def _misc_cache():
#: Spack's cache for small data
MISC_CACHE: Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton] = (
    llnl.util.lang.Singleton(_misc_cache)
)
def fetch_cache_location():
@@ -91,6 +91,6 @@ def symlink(self, mirror_ref):
#: Spack's local cache for downloaded source archives
FETCH_CACHE: Union[spack.fetch_strategy.FsCache, llnl.util.lang.Singleton] = (
    llnl.util.lang.Singleton(_fetch_cache)
)

View File

@@ -7,9 +7,7 @@
get_job_name = lambda needs_entry: (
    needs_entry.get("job")
    if (isinstance(needs_entry, collections.abc.Mapping) and needs_entry.get("artifacts", True))
    else needs_entry if isinstance(needs_entry, str) else None
)

View File

@@ -6,6 +6,7 @@
import json
import os
import shutil
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -157,7 +158,9 @@ def setup_parser(subparser):
description=deindent(ci_reproduce.__doc__),
help=spack.cmd.first_line(ci_reproduce.__doc__),
)
reproduce.add_argument("job_url", help="URL of job artifacts bundle")
reproduce.add_argument(
"job_url", help="URL of GitLab job web page or artifact", type=_gitlab_artifacts_url
)
reproduce.add_argument(
"--runtime",
help="Container runtime to use.",
@@ -792,11 +795,6 @@ def ci_reproduce(args):
artifacts of the provided gitlab pipeline rebuild job's URL will be used to derive
instructions for reproducing the build locally
"""
    # Allow passing GPG key for reproducing protected CI jobs
if args.gpg_file:
gpg_key_url = url_util.path_to_file_url(args.gpg_file)
@@ -805,7 +803,47 @@ def ci_reproduce(args):
else:
gpg_key_url = None
    return spack_ci.reproduce_ci_job(
        args.job_url, args.working_dir, args.autostart, gpg_key_url, args.runtime
    )
def _gitlab_artifacts_url(url: str) -> str:
    """Take either the URL of a job page in the GitLab UI or the URL of its artifacts
    zip file, and return the URL of the artifacts zip file."""
    parsed = urlparse(url)

    if not parsed.scheme or not parsed.netloc:
        raise ValueError(url)

    parts = parsed.path.split("/")

    if len(parts) < 2:
        raise ValueError(url)

    # Just use API endpoints verbatim, they're probably generated by Spack.
    if parts[1] == "api":
        return url

    # If it's a URL to the job in the Gitlab UI, we may need to append the artifacts path.
    minus_idx = parts.index("-")

    # Remove repeated slashes in the remainder
    rest = [p for p in parts[minus_idx + 1 :] if p]

    # Now the format is jobs/X or jobs/X/artifacts/download
    if len(rest) < 2 or rest[0] != "jobs":
        raise ValueError(url)

    if len(rest) == 2:
        # replace jobs/X with jobs/X/artifacts/download
        rest.extend(("artifacts", "download"))

    # Replace the parts and unparse.
    parts[minus_idx + 1 :] = rest

    # Don't allow fragments / queries
    return urlunparse(parsed._replace(path="/".join(parts), fragment="", query=""))
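# Illustrative use (hypothetical URL): a GitLab job page URL gains the artifacts suffix.
print(_gitlab_artifacts_url("https://gitlab.example.com/group/project/-/jobs/12345"))
# -> https://gitlab.example.com/group/project/-/jobs/12345/artifacts/download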
def ci(parser, args):

View File

@@ -115,7 +115,7 @@ def emulate_env_utility(cmd_name, context: Context, args):
f"Not all dependencies of {spec.name} are installed. "
f"Cannot setup {context} environment:",
spec.tree(
install_status=True,
hashlen=7,
hashes=True,
# This shows more than necessary, but we cannot dynamically change deptypes

View File

@@ -8,6 +8,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.config
import spack.spec
import spack.util.path
import spack.version
@@ -21,6 +22,7 @@
def setup_parser(subparser):
subparser.add_argument("-p", "--path", help="source location of package")
subparser.add_argument("-b", "--build-directory", help="build directory for the package")
clone_group = subparser.add_mutually_exclusive_group()
clone_group.add_argument(
@@ -151,4 +153,11 @@ def develop(parser, args):
env = spack.cmd.require_active_env(cmd_name="develop")
tty.debug("Updating develop config for {0} transactionally".format(env.name))
with env.write_transaction():
if args.build_directory is not None:
spack.config.add(
"packages:{}:package_attributes:build_directory:{}".format(
spec.name, args.build_directory
),
env.scope_name,
)
_update_config(spec, path)

View File

@@ -30,6 +30,7 @@
@c{@min:max} version range (inclusive)
@c{@min:} version <min> or higher
@c{@:max} up to version <max> (inclusive)
@c{@=version} exact version
compilers:
@g{%compiler} build with <compiler>

View File

@@ -292,9 +292,11 @@ def head(n, span_id, title, anchor=None):
out.write("<dd>\n")
out.write(
", ".join(
(
d
if d not in pkg_names
else '<a class="reference internal" href="#%s">%s</a>' % (d, d)
)
for d in deps
)
)

View File

@@ -0,0 +1,71 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import gzip
import io
import os
import shutil
import sys
import spack.cmd
import spack.spec
import spack.util.compression as compression
from spack.cmd.common import arguments
from spack.main import SpackCommandError
description = "print out logs for packages"
section = "basic"
level = "long"
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["spec"])
def _dump_byte_stream_to_stdout(instream: io.BufferedIOBase) -> None:
    # Reopen stdout in binary mode so we don't have to worry about encoding
    outstream = os.fdopen(sys.stdout.fileno(), "wb", closefd=False)
    shutil.copyfileobj(instream, outstream)


def _logs(cmdline_spec: spack.spec.Spec, concrete_spec: spack.spec.Spec):
    if concrete_spec.installed:
        log_path = concrete_spec.package.install_log_path
    elif os.path.exists(concrete_spec.package.stage.path):
        # TODO: `spack logs` can currently not show the logs while a package is being built, as
        # the combined log file is only written after the build is finished.
        log_path = concrete_spec.package.log_path
    else:
        raise SpackCommandError(f"{cmdline_spec} is not installed or staged")

    try:
        stream = open(log_path, "rb")
    except OSError as e:
        if e.errno == errno.ENOENT:
            raise SpackCommandError(f"No logs are available for {cmdline_spec}") from e
        raise SpackCommandError(f"Error reading logs for {cmdline_spec}: {e}") from e

    with stream as f:
        ext = compression.extension_from_magic_numbers_by_stream(f, decompress=False)
        if ext and ext != "gz":
            raise SpackCommandError(f"Unsupported storage format for {log_path}: {ext}")

        # If the log file is gzip compressed, wrap it with a decompressor
        _dump_byte_stream_to_stdout(gzip.GzipFile(fileobj=f) if ext == "gz" else f)


def logs(parser, args):
    specs = spack.cmd.parse_specs(args.spec)

    if not specs:
        raise SpackCommandError("You must supply a spec.")

    if len(specs) != 1:
        raise SpackCommandError("Too many specs. Supply only one.")

    concrete_spec = spack.cmd.matching_spec_from_env(specs[0])

    _logs(specs[0], concrete_spec)
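# Rough idea behind the compression check above (illustrative helper, not the one in
# spack.util.compression): gzip streams start with the two magic bytes 0x1f 0x8b.
import gzip, io

def looks_gzipped(stream) -> bool:
    return stream.peek(2)[:2] == b"\x1f\x8b"

buf = io.BufferedReader(io.BytesIO(gzip.compress(b"hello")))
print(looks_gzipped(buf))  # True; the peek does not consume the stream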

View File

@@ -135,8 +135,6 @@ def _process_result(result, show, required_format, kwargs):
def solve(parser, args):
# these are the same options as `spack spec`
fmt = spack.spec.DISPLAY_FORMAT
if args.namespaces:
fmt = "{namespace}." + fmt
@@ -146,7 +144,7 @@ def solve(parser, args):
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"status_fn": install_status_fn if args.install_status else None,
"install_status": args.install_status,
"hashes": args.long or args.very_long,
}

View File

@@ -75,8 +75,6 @@ def setup_parser(subparser):
def spec(parser, args):
fmt = spack.spec.DISPLAY_FORMAT
if args.namespaces:
fmt = "{namespace}." + fmt
@@ -86,7 +84,7 @@ def spec(parser, args):
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"status_fn": install_status_fn if args.install_status else None,
"install_status": args.install_status,
}
# use a read transaction if we are getting install status for every

View File

@@ -514,9 +514,10 @@ def get_compilers(config, cspec=None, arch_spec=None):
for items in config:
items = items["compiler"]
# NOTE: in principle this should be equality not satisfies, but config can still
# be written in old format gcc@10.1.0 instead of gcc@=10.1.0.
# We might use equality here.
if cspec and not spack.spec.parse_with_version_concrete(
items["spec"], compiler=True
).satisfies(cspec):
continue
# If an arch spec is given, confirm that this compiler

View File

@@ -7,6 +7,7 @@
import re
import subprocess
import sys
import tempfile
from typing import Dict, List, Set
import spack.compiler
@@ -15,7 +16,7 @@
import spack.util.executable
from spack.compiler import Compiler
from spack.error import SpackError
from spack.version import Version, VersionRange
avail_fc_version: Set[str] = set()
fc_path: Dict[str, str] = dict()
@@ -292,6 +293,15 @@ def setup_custom_environment(self, pkg, env):
else:
env.set_path(env_var, int_env[env_var].split(os.pathsep))
# certain versions of ifx (2021.3.0:2023.1.0) do not play well with env:TMP
# that has a "." character in the path
# Work around by pointing tmp to the stage for the duration of the build
if self.fc and Version(self.fc_version(self.fc)).satisfies(
VersionRange("2021.3.0", "2023.1.0")
):
new_tmp = tempfile.mkdtemp(dir=pkg.stage.path)
env.set("TMP", new_tmp)
env.set("CC", self.cc)
env.set("CXX", self.cxx)
env.set("FC", self.fc)

View File

@@ -826,7 +826,6 @@ def __init__(self, spec):
class InsufficientArchitectureInfoError(spack.error.SpackError):
"""Raised when details on architecture cannot be collected from the
system"""

View File

@@ -63,10 +63,11 @@
from spack.util.cpus import cpus_available
#: Dict from section names -> schema for that section
SECTION_SCHEMAS: Dict[str, Any] = {
"compilers": spack.schema.compilers.schema,
"concretizer": spack.schema.concretizer.schema,
"definitions": spack.schema.definitions.schema,
"view": spack.schema.view.schema,
"develop": spack.schema.develop.schema,
"mirrors": spack.schema.mirrors.schema,
"repos": spack.schema.repos.schema,
@@ -81,7 +82,7 @@
# Same as above, but including keys for environments
# this allows us to unify config reading between configs and environments
_ALL_SCHEMAS: Dict[str, Any] = copy.deepcopy(SECTION_SCHEMAS)
_ALL_SCHEMAS.update({spack.schema.env.TOP_LEVEL_KEY: spack.schema.env.schema})
#: Path to the default configuration
@@ -1096,7 +1097,7 @@ def read_config_file(
data = syaml.load_config(f)
if data:
if schema is None:
key = next(iter(data))
schema = _ALL_SCHEMAS[key]
validate(data, schema)

View File

@@ -71,7 +71,7 @@
"almalinux:9": {
"bootstrap": {
"template": "container/almalinux_9.dockerfile",
"image": "quay.io/almalinux/almalinux:9"
"image": "quay.io/almalinuxorg/almalinux:9"
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux9",
@@ -79,13 +79,13 @@
"develop": "latest"
},
"final": {
"image": "quay.io/almalinux/almalinux:9"
"image": "quay.io/almalinuxorg/almalinux:9"
}
},
"almalinux:8": {
"bootstrap": {
"template": "container/almalinux_8.dockerfile",
"image": "quay.io/almalinux/almalinux:8"
"image": "quay.io/almalinuxorg/almalinux:8"
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux8",
@@ -93,7 +93,7 @@
"develop": "latest"
},
"final": {
"image": "quay.io/almalinux/almalinux:8"
"image": "quay.io/almalinuxorg/almalinux:8"
}
},
"centos:stream": {

View File

@@ -36,6 +36,9 @@
#: Default dependency type if none is specified
DEFAULT: DepFlag = BUILD | LINK
#: A flag with no dependency types set
NONE: DepFlag = 0
#: An iterator of all flag components
ALL_FLAGS: Tuple[DepFlag, DepFlag, DepFlag, DepFlag] = (BUILD, LINK, RUN, TEST)
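# These constants are bit flags, so they compose with bitwise operators. A sketch
# with assumed powers-of-two values (illustrative, not copied from Spack's sources):
BUILD, LINK, RUN, TEST = 1, 2, 4, 8
DEFAULT = BUILD | LINK

def has_flag(flags: int, flag: int) -> bool:
    """Check whether a dependency-type flag is set."""
    return bool(flags & flag)

print(has_flag(DEFAULT, LINK))  # True
print(has_flag(DEFAULT, RUN))   # False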

View File

@@ -34,7 +34,7 @@ class OpenMpi(Package):
import functools
import os.path
import re
from typing import TYPE_CHECKING, Any, Callable, List, Optional, Set, Tuple, Union
import llnl.util.lang
import llnl.util.tty.color
@@ -57,6 +57,9 @@ class OpenMpi(Package):
VersionLookupError,
)
if TYPE_CHECKING:
import spack.package_base
__all__ = [
"DirectiveError",
"DirectiveMeta",
@@ -349,6 +352,7 @@ def remove_directives(arg):
return _decorator
SubmoduleCallback = Callable[["spack.package_base.PackageBase"], Union[str, List[str], bool]]
directive = DirectiveMeta.directive
@@ -380,7 +384,7 @@ def version(
tag: Optional[str] = None,
branch: Optional[str] = None,
get_full_repo: Optional[bool] = None,
submodules: Union[SubmoduleCallback, Optional[bool]] = None,
submodules_delete: Optional[bool] = None,
# other version control
svn: Optional[str] = None,

View File

@@ -21,7 +21,6 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import symlink
@@ -379,8 +378,8 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
if not dev_specs:
return
    for name, entry in dev_specs.items():
        dev_path = substitute_path_variables(entry["path"])
        expanded_path = spack.util.path.canonicalize_path(dev_path, default_wd=init_file_dir)
# Skip if the expanded path is the same (e.g. when absolute)
if dev_path == expanded_path:
@@ -663,30 +662,26 @@ def __contains__(self, spec):
return True
    def specs_for_view(self, concrete_roots: List[Spec]) -> List[Spec]:
        """Flatten the DAGs of the concrete roots, keep only unique, selected, and installed specs
        in topological order from root to leaf."""
        if self.link == "all":
            deptype = dt.LINK | dt.RUN
        elif self.link == "run":
            deptype = dt.RUN
        else:
            deptype = dt.NONE

        specs = traverse.traverse_nodes(
            concrete_roots, order="topo", deptype=deptype, key=traverse.by_dag_hash
        )

        # Filter selected, installed specs
        with spack.store.STORE.db.read_transaction():
            return [s for s in specs if s in self and s.installed]
    def regenerate(self, concrete_roots: List[Spec]) -> None:
        specs = self.specs_for_view(concrete_roots)
# To ensure there are no conflicts with packages being installed
# that cannot be resolved or have repos that have been removed
@@ -703,14 +698,14 @@ def regenerate(self, concretized_root_specs):
old_root = self._current_root
if new_root == old_root:
tty.debug("View at %s does not need regeneration." % self.root)
tty.debug(f"View at {self.root} does not need regeneration.")
return
_error_on_nonempty_view_dir(new_root)
# construct view at new_root
if specs:
tty.msg("Updating view at {0}".format(self.root))
tty.msg(f"Updating view at {self.root}")
view = self.view(new=new_root)
@@ -720,7 +715,7 @@ def regenerate(self, concretized_root_specs):
# Create a new view
try:
fs.mkdirp(new_root)
view.add_specs(*specs)
# create symlink from tmp_symlink_name to new_root
if os.path.exists(tmp_symlink_name):
@@ -734,7 +729,7 @@ def regenerate(self, concretized_root_specs):
try:
shutil.rmtree(new_root, ignore_errors=True)
os.unlink(tmp_symlink_name)
except OSError:
pass
# Give an informative error message for the typical error case: two specs, same package
@@ -876,9 +871,55 @@ def _process_definition(self, item):
else:
self.spec_lists[name] = user_specs
    def _process_view(self, env_view: Optional[Union[bool, str, Dict]]):
        """Process view option(s), which can be boolean, string, or None.

        A boolean environment view option takes precedence over any that may
        be included. So ``view: True`` results in the default view only. And
        ``view: False`` means the environment will have no view.

        Args:
            env_view: view option provided in the manifest or configuration
        """

        def add_view(name, values):
            """Add the view with the name and the string or dict values."""
            if isinstance(values, str):
                self.views[name] = ViewDescriptor(self.path, values)
            elif isinstance(values, dict):
                self.views[name] = ViewDescriptor.from_dict(self.path, values)
            else:
                tty.error(f"Cannot add view named {name} for {type(values)} values {values}")

        # If the configuration specifies 'view: False' then we are done
        # processing views. If this is called with the environment's own view
        # (versus an included view), then there are to be NO views.
        if env_view is False:
            return

        # If the configuration specifies 'view: True' then only the default
        # view will be created for the environment and we are done processing
        # views.
        if env_view is True:
            add_view(default_view_name, self.view_path_default)
            return

        # Otherwise, the configuration has a subdirectory or dictionary.
        if isinstance(env_view, str):
            add_view(default_view_name, env_view)
        elif env_view:
            for name, values in env_view.items():
                add_view(name, values)

        # If we reach this point without an explicit view option then we
        # provide the default view.
        if self.views == dict():
            self.views[default_view_name] = ViewDescriptor(self.path, self.view_path_default)
def _construct_state_from_manifest(self):
"""Set up user specs and views from the manifest file."""
self.spec_lists = collections.OrderedDict()
self.views = {}
for item in spack.config.get("definitions", []):
self._process_definition(item)
@@ -890,20 +931,7 @@ def _construct_state_from_manifest(self):
)
self.spec_lists[user_speclist_name] = user_specs
enable_view = env_configuration.get("view")
# enable_view can be boolean, string, or None
if enable_view is True or enable_view is None:
self.views = {default_view_name: ViewDescriptor(self.path, self.view_path_default)}
elif isinstance(enable_view, str):
self.views = {default_view_name: ViewDescriptor(self.path, enable_view)}
elif enable_view:
path = self.path
self.views = dict(
(name, ViewDescriptor.from_dict(path, values))
for name, values in enable_view.items()
)
else:
self.views = {}
self._process_view(spack.config.get("view", True))
@property
def user_specs(self):
@@ -2062,7 +2090,6 @@ def write(self, regenerate: bool = True) -> None:
if regenerate:
self.regenerate_views()
spack.hooks.post_env_write(self)
self.new_specs.clear()
@@ -2185,7 +2212,7 @@ def _tree_to_display(spec):
return spec.tree(
recurse_dependencies=True,
format=spack.spec.DISPLAY_FORMAT,
install_status=True,
hashlen=7,
hashes=True,
)

View File

@@ -697,7 +697,6 @@ def __str__(self):
@fetcher
class GitFetchStrategy(VCSFetchStrategy):
"""
Fetch strategy that gets source code from a git repository.
Use like this in a package:
@@ -930,9 +929,12 @@ def clone(self, dest=None, commit=None, branch=None, tag=None, bare=False):
        git_commands = []
        submodules = self.submodules
        if callable(submodules):
            submodules = submodules(self.package)
            if submodules:
                if isinstance(submodules, str):
                    submodules = [submodules]
                git_commands.append(["submodule", "init", "--"] + submodules)
                git_commands.append(["submodule", "update", "--recursive"])
        elif submodules:
            git_commands.append(["submodule", "update", "--init", "--recursive"])
@@ -1089,7 +1091,6 @@ def __str__(self):
@fetcher
class SvnFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a subversion repository.
Use like this in a package:
@@ -1184,7 +1185,6 @@ def __str__(self):
@fetcher
class HgFetchStrategy(VCSFetchStrategy):
"""
Fetch strategy that gets source code from a Mercurial repository.
Use like this in a package:

View File

@@ -32,6 +32,7 @@
from llnl.util.tty.color import colorize
import spack.config
import spack.paths
import spack.projections
import spack.relocate
import spack.schema.projections
@@ -91,16 +92,16 @@ def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
prefix_to_projection[spack.store.STORE.layout.root] = view._root
# This is vestigial code for the *old* location of sbang.
    prefix_to_projection[f"#!/bin/bash {spack.paths.spack_root}/bin/sbang"] = (
        sbang.sbang_shebang_line()
    )
spack.relocate.relocate_text(files=[dst], prefixes=prefix_to_projection)
try:
os.chown(dst, src_stat.st_uid, src_stat.st_gid)
except OSError:
tty.debug("Can't change the permissions for %s" % dst)
tty.debug(f"Can't change the permissions for {dst}")
def view_func_parser(parsed_name):
@@ -112,7 +113,7 @@ def view_func_parser(parsed_name):
elif parsed_name in ("add", "symlink", "soft"):
return view_symlink
else:
raise ValueError("invalid link type for view: '%s'" % parsed_name)
raise ValueError(f"invalid link type for view: '{parsed_name}'")
def inverse_view_func_parser(view_type):
@@ -270,9 +271,10 @@ def __init__(self, root, layout, **kwargs):
# Ensure projections are the same from each source
# Read projections file from view
if self.projections != self.read_projections():
msg = "View at %s has projections file" % self._root
msg += " which does not match projections passed manually."
raise ConflictingProjectionsError(msg)
raise ConflictingProjectionsError(
f"View at {self._root} has projections file"
" which does not match projections passed manually."
)
self._croot = colorize_root(self._root) + " "
@@ -313,11 +315,11 @@ def add_specs(self, *specs, **kwargs):
def add_standalone(self, spec):
if spec.external:
tty.warn(self._croot + "Skipping external package: %s" % colorize_spec(spec))
tty.warn(f"{self._croot}Skipping external package: {colorize_spec(spec)}")
return True
if self.check_added(spec):
tty.warn(self._croot + "Skipping already linked package: %s" % colorize_spec(spec))
tty.warn(f"{self._croot}Skipping already linked package: {colorize_spec(spec)}")
return True
self.merge(spec)
@@ -325,7 +327,7 @@ def add_standalone(self, spec):
self.link_meta_folder(spec)
if self.verbose:
tty.info(self._croot + "Linked package: %s" % colorize_spec(spec))
tty.info(f"{self._croot}Linked package: {colorize_spec(spec)}")
return True
def merge(self, spec, ignore=None):
@@ -393,7 +395,7 @@ def needs_file(spec, file):
for file in files:
if not os.path.lexists(file):
tty.warn("Tried to remove %s which does not exist" % file)
tty.warn(f"Tried to remove {file} which does not exist")
continue
# remove if file is not owned by any other package in the view
@@ -404,7 +406,7 @@ def needs_file(spec, file):
# we are currently removing, as we remove files before unlinking the
# metadata directory.
if len([s for s in specs if needs_file(s, file)]) <= 1:
tty.debug("Removing file " + file)
tty.debug(f"Removing file {file}")
os.remove(file)
def check_added(self, spec):
@@ -477,14 +479,14 @@ def remove_standalone(self, spec):
Remove (unlink) a standalone package from this view.
"""
if not self.check_added(spec):
tty.warn(self._croot + "Skipping package not linked in view: %s" % spec.name)
tty.warn(f"{self._croot}Skipping package not linked in view: {spec.name}")
return
self.unmerge(spec)
self.unlink_meta_folder(spec)
if self.verbose:
tty.info(self._croot + "Removed package: %s" % colorize_spec(spec))
tty.info(f"{self._croot}Removed package: {colorize_spec(spec)}")
def get_projection_for_spec(self, spec):
"""
@@ -558,9 +560,9 @@ def print_conflict(self, spec_active, spec_specified, level="error"):
linked = tty.color.colorize(" (@gLinked@.)", color=color)
specified = tty.color.colorize("(@rSpecified@.)", color=color)
cprint(
self._croot + "Package conflict detected:\n"
"%s %s\n" % (linked, colorize_spec(spec_active))
+ "%s %s" % (specified, colorize_spec(spec_specified))
f"{self._croot}Package conflict detected:\n"
f"{linked} {colorize_spec(spec_active)}\n"
f"{specified} {colorize_spec(spec_specified)}"
)
def print_status(self, *specs, **kwargs):
@@ -572,14 +574,14 @@ def print_status(self, *specs, **kwargs):
for s, v in zip(specs, in_view):
if not v:
tty.error(self._croot + "Package not linked: %s" % s.name)
tty.error(f"{self._croot}Package not linked: {s.name}")
elif s != v:
self.print_conflict(v, s, level="warn")
in_view = list(filter(None, in_view))
if len(specs) > 0:
tty.msg("Packages linked in %s:" % self._croot[:-1])
tty.msg(f"Packages linked in {self._croot[:-1]}:")
# Make a dict with specs keyed by architecture and compiler.
index = index_by(specs, ("architecture", "compiler"))
@@ -589,20 +591,19 @@ def print_status(self, *specs, **kwargs):
if i > 0:
print()
header = "%s{%s} / %s{%s}" % (
spack.spec.ARCHITECTURE_COLOR,
architecture,
spack.spec.COMPILER_COLOR,
compiler,
header = (
f"{spack.spec.ARCHITECTURE_COLOR}{{{architecture}}} "
f"/ {spack.spec.COMPILER_COLOR}{{{compiler}}}"
)
tty.hline(colorize(header), char="-")
specs = index[(architecture, compiler)]
specs.sort()
format_string = "{name}{@version}"
format_string += "{%compiler}{compiler_flags}{variants}"
abbreviated = [s.cformat(format_string) for s in specs]
abbreviated = [
s.cformat("{name}{@version}{%compiler}{compiler_flags}{variants}")
for s in specs
]
# Print one spec per line along with prefix path
width = max(len(s) for s in abbreviated)
@@ -634,22 +635,19 @@ def unlink_meta_folder(self, spec):
class SimpleFilesystemView(FilesystemView):
"""A simple and partial implementation of FilesystemView focused on
performance and immutable views, where specs cannot be removed after they
were added."""
"""A simple and partial implementation of FilesystemView focused on performance and immutable
views, where specs cannot be removed after they were added."""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def _sanity_check_view_projection(self, specs):
"""A very common issue is that we end up with two specs of the same
package, that project to the same prefix. We want to catch that as
early as possible and give a sensible error to the user. Here we use
the metadata dir (.spack) projection as a quick test to see whether
two specs in the view are going to clash. The metadata dir is used
because it's always added by Spack with identical files, so a
guaranteed clash that's easily verified."""
seen = dict()
"""A very common issue is that we end up with two specs of the same package, that project
to the same prefix. We want to catch that as early as possible and give a sensible error to
the user. Here we use the metadata dir (.spack) projection as a quick test to see whether
two specs in the view are going to clash. The metadata dir is used because it's always
added by Spack with identical files, so a guaranteed clash that's easily verified."""
seen = {}
for current_spec in specs:
metadata_dir = self.relative_metadata_dir_for_spec(current_spec)
conflicting_spec = seen.get(metadata_dir)
@@ -657,7 +655,8 @@ def _sanity_check_view_projection(self, specs):
raise ConflictingSpecsError(current_spec, conflicting_spec)
seen[metadata_dir] = current_spec
def add_specs(self, *specs, **kwargs):
def add_specs(self, *specs: spack.spec.Spec) -> None:
"""Link a root-to-leaf topologically ordered list of specs into the view."""
assert all((s.concrete for s in specs))
if len(specs) == 0:
return
@@ -668,9 +667,6 @@ def add_specs(self, *specs, **kwargs):
tty.warn("Skipping external package: " + s.short_spec)
specs = [s for s in specs if not s.external]
if kwargs.get("exclude", None):
specs = set(filter_exclude(specs, kwargs["exclude"]))
self._sanity_check_view_projection(specs)
# Ignore spack meta data folder.
@@ -695,13 +691,11 @@ def skip_list(file):
# Inform about file-file conflicts.
if visitor.file_conflicts:
if self.ignore_conflicts:
tty.debug("{0} file conflicts".format(len(visitor.file_conflicts)))
tty.debug(f"{len(visitor.file_conflicts)} file conflicts")
else:
raise MergeConflictSummary(visitor.file_conflicts)
tty.debug(
"Creating {0} dirs and {1} links".format(len(visitor.directories), len(visitor.files))
)
tty.debug(f"Creating {len(visitor.directories)} dirs and {len(visitor.files)} links")
# Make the directory structure
for dst in visitor.directories:
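
The `_sanity_check_view_projection` logic above boils down to a first-collision check over projected prefixes. A minimal sketch of that pattern, with a hypothetical `projection_of` callable standing in for the metadata-dir projection:

def sanity_check_projections(specs, projection_of):
    seen = {}
    for spec in specs:
        prefix = projection_of(spec)
        if prefix in seen:
            raise ValueError(f"{spec} and {seen[prefix]} both project to {prefix}")
        seen[prefix] = spec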

View File

@@ -15,13 +15,6 @@
* post_install(spec, explicit)
* pre_uninstall(spec)
* post_uninstall(spec)
* on_install_start(spec)
* on_install_success(spec)
* on_install_failure(spec)
* on_phase_success(pkg, phase_name, log_file)
* on_phase_error(pkg, phase_name, log_file)
* post_env_write(env)
This can be used to implement support for things like module
systems (e.g. modules, lmod, etc.) or to add other custom
@@ -78,17 +71,5 @@ def __call__(self, *args, **kwargs):
pre_install = _HookRunner("pre_install")
post_install = _HookRunner("post_install")
# These hooks are run within an install subprocess
pre_uninstall = _HookRunner("pre_uninstall")
post_uninstall = _HookRunner("post_uninstall")
on_phase_success = _HookRunner("on_phase_success")
on_phase_error = _HookRunner("on_phase_error")
# These are hooks in installer.py, before starting install subprocess
on_install_start = _HookRunner("on_install_start")
on_install_success = _HookRunner("on_install_success")
on_install_failure = _HookRunner("on_install_failure")
on_install_cancel = _HookRunner("on_install_cancel")
# Environment hooks
post_env_write = _HookRunner("post_env_write")
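
Since `_HookRunner` dispatches to every module in the `spack.hooks` package, adding a hook is just a matter of defining functions with the names listed above. A minimal sketch, assuming the auto-discovery convention holds and using a hypothetical module name:

# hypothetical file: lib/spack/spack/hooks/log_installs.py
import llnl.util.tty as tty

def post_install(spec, explicit):
    # runs after a package install; `explicit` marks user-requested roots
    tty.msg(f"installed {spec.name}@{spec.version} (explicit={explicit})")

def post_env_write(env):
    tty.msg(f"environment manifest written for {env.path}")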

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import IO, Optional, Tuple
from typing import BinaryIO, Optional, Tuple
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, visit_directory_tree
@@ -18,7 +18,7 @@ def should_keep(path: bytes) -> bool:
return path.startswith(b"$") or (os.path.isabs(path) and os.path.lexists(path))
def _drop_redundant_rpaths(f: IO) -> Optional[Tuple[bytes, bytes]]:
def _drop_redundant_rpaths(f: BinaryIO) -> Optional[Tuple[bytes, bytes]]:
"""Drop redundant entries from rpath.
Args:

View File

@@ -1705,7 +1705,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
except spack.build_environment.StopPhase as e:
# A StopPhase exception means that do_install was asked to
# stop early from clients, and is not an error at this point
spack.hooks.on_install_failure(task.request.pkg.spec)
pid = f"{self.pid}: " if tty.show_pid() else ""
tty.debug(f"{pid}{str(e)}")
tty.debug(f"Package stage directory: {pkg.stage.source_path}")
@@ -2011,7 +2010,6 @@ def install(self) -> None:
if task is None:
continue
spack.hooks.on_install_start(task.request.pkg.spec)
install_args = task.request.install_args
keep_prefix = install_args.get("keep_prefix")
@@ -2037,9 +2035,6 @@ def install(self) -> None:
tty.warn(f"{pkg_id} does NOT actually have any uninstalled deps left")
dep_str = "dependencies" if task.priority > 1 else "dependency"
# Hook to indicate task failure, but without an exception
spack.hooks.on_install_failure(task.request.pkg.spec)
raise InstallError(
f"Cannot proceed with {pkg_id}: {task.priority} uninstalled "
f"{dep_str}: {','.join(task.uninstalled_deps)}",
@@ -2062,11 +2057,6 @@ def install(self) -> None:
tty.warn(f"{pkg_id} failed to install")
self._update_failed(task)
# Mark that the package failed
# TODO: this should also be for the task.pkg, but we don't
# model transitive yet.
spack.hooks.on_install_failure(task.request.pkg.spec)
if self.fail_fast:
raise InstallError(fail_fast_err, pkg=pkg)
@@ -2169,7 +2159,6 @@ def install(self) -> None:
tty.error(
f"Failed to install {pkg.name} due to " f"{exc.__class__.__name__}: {str(exc)}"
)
spack.hooks.on_install_cancel(task.request.pkg.spec)
raise
except binary_distribution.NoChecksumException as exc:
@@ -2188,7 +2177,6 @@ def install(self) -> None:
except (Exception, SystemExit) as exc:
self._update_failed(task, True, exc)
spack.hooks.on_install_failure(task.request.pkg.spec)
# Best effort installs suppress the exception and mark the
# package as a failure.
@@ -2372,9 +2360,6 @@ def run(self) -> bool:
_print_timer(pre=self.pre, pkg_id=self.pkg_id, timer=self.timer)
_print_installed_pkg(self.pkg.prefix)
# Send final status that install is successful
spack.hooks.on_install_success(self.pkg.spec)
# preserve verbosity across runs
return self.echo
@@ -2453,15 +2438,10 @@ def _real_install(self) -> None:
# Catch any errors to report to logging
self.timer.start(phase_fn.name)
phase_fn.execute()
spack.hooks.on_phase_success(pkg, phase_fn.name, log_file)
self.timer.stop(phase_fn.name)
except BaseException:
combine_phase_logs(pkg.phase_log_files, pkg.log_path)
spack.hooks.on_phase_error(pkg, phase_fn.name, log_file)
# phase error indicates install error
spack.hooks.on_install_failure(pkg.spec)
raise
# We assume loggers share echo True/False

View File

@@ -1038,9 +1038,9 @@ def finish_parse_and_run(parser, cmd_name, main_args, env_format_error):
set_working_dir()
# now we can actually execute the command.
if args.spack_profile or args.sorted_profile:
if main_args.spack_profile or main_args.sorted_profile:
_profile_wrapper(command, parser, args, unknown)
elif args.pdb:
elif main_args.pdb:
import pdb
pdb.runctx("_invoke_command(command, parser, args, unknown)", globals(), locals())

View File

@@ -9,6 +9,8 @@
import platform
import subprocess
from llnl.util import tty
from spack.error import SpackError
from spack.util import windows_registry as winreg
from spack.version import Version
@@ -83,11 +85,50 @@ def compiler_search_paths(self):
os.path.join(str(os.getenv("ONEAPI_ROOT")), "compiler", "*", "windows", "bin")
)
)
# Second strategy: Find MSVC via the registry
msft = winreg.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft", winreg.HKEY.HKEY_LOCAL_MACHINE
)
vs_entries = msft.find_subkeys(r"VisualStudio_.*")
def try_query_registry(retry=False):
winreg_report_error = lambda e: tty.debug(
'Windows registry query on "SOFTWARE\\WOW6432Node\\Microsoft"'
f" under HKEY_LOCAL_MACHINE: {str(e)}"
)
try:
# Registry interactions are subject to race conditions, etc., and can generally
# be flaky; do this in a try block to prevent registry issues from interfering
# with compiler detection
msft = winreg.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft", winreg.HKEY.HKEY_LOCAL_MACHINE
)
return msft.find_subkeys(r"VisualStudio_.*", recursive=False)
except OSError as e:
# OSErrors propagated into caller by Spack's registry module are expected
# and indicate a known issue with the registry query
# i.e. user does not have permissions or the key/value
# doesn't exist
winreg_report_error(e)
return []
except winreg.InvalidRegistryOperation as e:
# Other errors raised by Spack's registry module indicate
# an unexpected error type, and are handled specifically
# because the underlying cause is difficult/impossible to
# determine without manually exploring the registry.
# These errors can be spurious (race conditions that may
# resolve on re-execution of the query) or permanent
# (specific types of permission issues), but the registry
# raises the same exception for all such atypical errors.
if retry:
winreg_report_error(e)
return []
vs_entries = try_query_registry()
if not vs_entries:
# Occasional spurious race conditions can arise when reading the MS registry;
# typically these resolve immediately, so we can safely
# retry the registry query without waiting.
# Note: winreg does not support locking
vs_entries = try_query_registry(retry=True)
vs_paths = []
def clean_vs_path(path):
@@ -99,11 +140,8 @@ def clean_vs_path(path):
val = entry.get_subkey("Capabilities").get_value("ApplicationDescription").value
vs_paths.append(clean_vs_path(val))
except FileNotFoundError as e:
if hasattr(e, "winerror"):
if e.winerror == 2:
pass
else:
raise
if hasattr(e, "winerror") and e.winerror == 2:
pass
else:
raise
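
The retry logic above reduces to a generic pattern: treat one class of errors as permanent, retry transient ones a bounded number of times, and give up with an empty result. A condensed sketch of that pattern (not Spack's code):

def query_with_single_retry(query, is_permanent):
    for attempt in (0, 1):
        try:
            return query()
        except Exception as e:
            if is_permanent(e) or attempt == 1:
                return []  # give up quietly; callers treat [] as "not found"
    return []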

View File

@@ -7,6 +7,7 @@
import os
import re
from collections import OrderedDict
from typing import List, Optional
import macholib.mach_o
import macholib.MachO
@@ -47,7 +48,7 @@ def __init__(self, file_path, root_path):
@memoized
def _patchelf():
def _patchelf() -> Optional[executable.Executable]:
"""Return the full path to the patchelf binary, if available, else None."""
import spack.bootstrap
@@ -55,9 +56,7 @@ def _patchelf():
return None
with spack.bootstrap.ensure_bootstrap_configuration():
patchelf = spack.bootstrap.ensure_patchelf_in_path_or_raise()
return patchelf.path
return spack.bootstrap.ensure_patchelf_in_path_or_raise()
def _elf_rpaths_for(path):
@@ -340,31 +339,34 @@ def macholib_get_paths(cur_path):
return (rpaths, deps, ident)
def _set_elf_rpaths(target, rpaths):
"""Replace the original RPATH of the target with the paths passed
as arguments.
def _set_elf_rpaths_and_interpreter(
target: str, rpaths: List[str], interpreter: Optional[str] = None
) -> Optional[str]:
"""Replace the original RPATH of the target with the paths passed as arguments.
Args:
target: target executable. Must be an ELF object.
rpaths: paths to be set in the RPATH
interpreter: optionally set the interpreter
Returns:
A string concatenating the stdout and stderr of the call
to ``patchelf`` if it was invoked
A string concatenating the stdout and stderr of the call to ``patchelf`` if it was invoked
"""
# Join the paths using ':' as a separator
rpaths_str = ":".join(rpaths)
patchelf, output = executable.Executable(_patchelf()), None
try:
# TODO: error handling is not great here?
# TODO: revisit the use of --force-rpath as it might be conditional
# TODO: if we want to support setting RUNPATH from binary packages
patchelf_args = ["--force-rpath", "--set-rpath", rpaths_str, target]
output = patchelf(*patchelf_args, output=str, error=str)
args = ["--force-rpath", "--set-rpath", rpaths_str]
if interpreter:
args.extend(["--set-interpreter", interpreter])
args.append(target)
return _patchelf()(*args, output=str, error=str)
except executable.ProcessError as e:
msg = "patchelf --force-rpath --set-rpath {0} failed with error {1}"
tty.warn(msg.format(target, e))
return output
tty.warn(str(e))
return None
def needs_binary_relocation(m_type, m_subtype):
@@ -501,10 +503,12 @@ def new_relocate_elf_binaries(binaries, prefix_to_prefix):
for path in binaries:
try:
elf.replace_rpath_in_place_or_raise(path, prefix_to_prefix)
except elf.ElfDynamicSectionUpdateFailed as e:
# Fall back to the old `patchelf --set-rpath` method.
_set_elf_rpaths(path, e.new.decode("utf-8").split(":"))
elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix)
except elf.ElfCStringUpdatesFailed as e:
# Fall back to `patchelf --set-rpath ... --set-interpreter ...`
rpaths = e.rpath.new_value.decode("utf-8").split(":") if e.rpath else []
interpreter = e.pt_interp.new_value.decode("utf-8") if e.pt_interp else None
_set_elf_rpaths_and_interpreter(path, rpaths=rpaths, interpreter=interpreter)
def relocate_elf_binaries(
@@ -546,10 +550,10 @@ def relocate_elf_binaries(
new_rpaths = _make_relative(new_binary, new_root, new_norm_rpaths)
# check to see if relative rpaths are changed before rewriting
if sorted(new_rpaths) != sorted(orig_rpaths):
_set_elf_rpaths(new_binary, new_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
else:
new_rpaths = _transform_rpaths(orig_rpaths, orig_root, new_prefixes)
_set_elf_rpaths(new_binary, new_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
def make_link_relative(new_links, orig_links):
@@ -596,7 +600,7 @@ def make_elf_binaries_relative(new_binaries, orig_binaries, orig_layout_root):
orig_rpaths = _elf_rpaths_for(new_binary)
if orig_rpaths:
new_rpaths = _make_relative(orig_binary, orig_layout_root, orig_rpaths)
_set_elf_rpaths(new_binary, new_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
def warn_if_link_cant_be_relocated(link, target):
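
The helper above shells out to `patchelf` with the same flags a standalone script would use. A self-contained sketch via `subprocess` (assuming `patchelf` is on `PATH`; Spack instead resolves it through bootstrapping):

import subprocess
from typing import List, Optional

def set_rpath_and_interp(binary: str, rpaths: List[str], interpreter: Optional[str] = None):
    args = ["patchelf", "--force-rpath", "--set-rpath", ":".join(rpaths)]
    if interpreter:
        args += ["--set-interpreter", interpreter]
    args.append(binary)
    return subprocess.run(args, capture_output=True, text=True)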

View File

@@ -3,16 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for bootstrap.yaml configuration file."""
from typing import Any, Dict
#: Schema of a single source
_source_schema = {
_source_schema: Dict[str, Any] = {
"type": "object",
"properties": {"name": {"type": "string"}, "metadata": {"type": "string"}},
"additionalProperties": False,
"required": ["name", "metadata"],
}
properties = {
properties: Dict[str, Any] = {
"bootstrap": {
"type": "object",
"properties": {

View File

@@ -6,27 +6,31 @@
"""Schema for a buildcache spec.yaml file
.. literalinclude:: _spack_root/lib/spack/spack/schema/buildcache_spec.py
:lines: 13-
:lines: 15-
"""
from typing import Any, Dict
import spack.schema.spec
properties: Dict[str, Any] = {
# `buildinfo` is no longer needed as of Spack 0.21
"buildinfo": {"type": "object"},
"spec": {
"type": "object",
"additionalProperties": True,
"items": spack.schema.spec.properties,
},
"binary_cache_checksum": {
"type": "object",
"properties": {"hash_algorithm": {"type": "string"}, "hash": {"type": "string"}},
},
"buildcache_layout_version": {"type": "number"},
}
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack buildcache specfile schema",
"type": "object",
"additionalProperties": False,
"properties": {
# `buildinfo` is no longer needed as of Spack 0.21
"buildinfo": {"type": "object"},
"spec": {
"type": "object",
"additionalProperties": True,
"items": spack.schema.spec.properties,
},
"binary_cache_checksum": {
"type": "object",
"properties": {"hash_algorithm": {"type": "string"}, "hash": {"type": "string"}},
},
"buildcache_layout_version": {"type": "number"},
},
"properties": properties,
}
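
The refactor applied across these schema files follows one pattern: hoist the body into a typed module-level `properties` dict and reference it from `schema`. A minimal sketch of the pattern, validated with `jsonschema` (assumed importable; Spack vendors a copy):

from typing import Any, Dict
import jsonschema

properties: Dict[str, Any] = {
    "example": {"type": "object", "properties": {"name": {"type": "string"}}}
}

schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Example schema following the refactored layout",
    "type": "object",
    "additionalProperties": False,
    "properties": properties,
}

jsonschema.validate({"example": {"name": "zlib"}}, schema)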

View File

@@ -2,16 +2,15 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for cdash.yaml configuration file.
.. literalinclude:: ../spack/schema/cdash.py
:lines: 13-
"""
from typing import Any, Dict
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"cdash": {
"type": "object",
"additionalProperties": False,

View File

@@ -2,12 +2,12 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for gitlab-ci.yaml configuration file.
.. literalinclude:: ../spack/schema/ci.py
:lines: 13-
:lines: 16-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
@@ -164,7 +164,7 @@
}
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"ci": {
"oneOf": [
# TODO: Replace with core-shared-properties in Spack 0.23

View File

@@ -2,16 +2,17 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for compilers.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/compilers.py
:lines: 13-
:lines: 15-
"""
from typing import Any, Dict
import spack.schema.environment
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"compilers": {
"type": "array",
"items": {

View File

@@ -2,14 +2,14 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for concretizer.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/concretizer.py
:lines: 13-
:lines: 12-
"""
from typing import Any, Dict
properties = {
properties: Dict[str, Any] = {
"concretizer": {
"type": "object",
"additionalProperties": False,

View File

@@ -5,15 +5,16 @@
"""Schema for config.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/config.py
:lines: 13-
:lines: 17-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema.projections
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"config": {
"type": "object",
"default": {},

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for the 'container' subsection of Spack environments."""
from typing import Any, Dict
_stages_from_dockerhub = {
"type": "object",
@@ -85,4 +86,4 @@
},
}
properties = {"container": container_schema}
properties: Dict[str, Any] = {"container": container_schema}

View File

@@ -11,112 +11,115 @@
This does not specify a configuration - it is an input format
that is consumed and transformed into Spack DB records.
"""
from typing import Any, Dict
properties: Dict[str, Any] = {
"_meta": {
"type": "object",
"additionalProperties": False,
"properties": {
"file-type": {"type": "string", "minLength": 1},
"cpe-version": {"type": "string", "minLength": 1},
"system-type": {"type": "string", "minLength": 1},
"schema-version": {"type": "string", "minLength": 1},
# Older schemas did not have "cpe-version", just the
# schema version; in that case it was just called "version"
"version": {"type": "string", "minLength": 1},
},
},
"compilers": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"prefix": {"type": "string", "minLength": 1},
"executables": {
"type": "object",
"additionalProperties": False,
"properties": {
"cc": {"type": "string", "minLength": 1},
"cxx": {"type": "string", "minLength": 1},
"fc": {"type": "string", "minLength": 1},
},
},
"arch": {
"type": "object",
"required": ["os", "target"],
"additionalProperties": False,
"properties": {
"os": {"type": "string", "minLength": 1},
"target": {"type": "string", "minLength": 1},
},
},
},
},
},
"specs": {
"type": "array",
"items": {
"type": "object",
"required": ["name", "version", "arch", "compiler", "prefix", "hash"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"arch": {
"type": "object",
"required": ["platform", "platform_os", "target"],
"additionalProperties": False,
"properties": {
"platform": {"type": "string", "minLength": 1},
"platform_os": {"type": "string", "minLength": 1},
"target": {
"type": "object",
"additionalProperties": False,
"required": ["name"],
"properties": {"name": {"type": "string", "minLength": 1}},
},
},
},
"compiler": {
"type": "object",
"required": ["name", "version"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
},
},
"dependencies": {
"type": "object",
"patternProperties": {
"\\w[\\w-]*": {
"type": "object",
"required": ["hash"],
"additionalProperties": False,
"properties": {
"hash": {"type": "string", "minLength": 1},
"type": {
"type": "array",
"items": {"type": "string", "minLength": 1},
},
},
}
},
},
"prefix": {"type": "string", "minLength": 1},
"rpm": {"type": "string", "minLength": 1},
"hash": {"type": "string", "minLength": 1},
"parameters": {"type": "object"},
},
},
},
}
schema = {
"$schema": "http://json-schema.org/schema#",
"title": "CPE manifest schema",
"type": "object",
"additionalProperties": False,
"properties": {
"_meta": {
"type": "object",
"additionalProperties": False,
"properties": {
"file-type": {"type": "string", "minLength": 1},
"cpe-version": {"type": "string", "minLength": 1},
"system-type": {"type": "string", "minLength": 1},
"schema-version": {"type": "string", "minLength": 1},
# Older schemas did not have "cpe-version", just the
# schema version; in that case it was just called "version"
"version": {"type": "string", "minLength": 1},
},
},
"compilers": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"prefix": {"type": "string", "minLength": 1},
"executables": {
"type": "object",
"additionalProperties": False,
"properties": {
"cc": {"type": "string", "minLength": 1},
"cxx": {"type": "string", "minLength": 1},
"fc": {"type": "string", "minLength": 1},
},
},
"arch": {
"type": "object",
"required": ["os", "target"],
"additionalProperties": False,
"properties": {
"os": {"type": "string", "minLength": 1},
"target": {"type": "string", "minLength": 1},
},
},
},
},
},
"specs": {
"type": "array",
"items": {
"type": "object",
"required": ["name", "version", "arch", "compiler", "prefix", "hash"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"arch": {
"type": "object",
"required": ["platform", "platform_os", "target"],
"additioanlProperties": False,
"properties": {
"platform": {"type": "string", "minLength": 1},
"platform_os": {"type": "string", "minLength": 1},
"target": {
"type": "object",
"additionalProperties": False,
"required": ["name"],
"properties": {"name": {"type": "string", "minLength": 1}},
},
},
},
"compiler": {
"type": "object",
"required": ["name", "version"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
},
},
"dependencies": {
"type": "object",
"patternProperties": {
"\\w[\\w-]*": {
"type": "object",
"required": ["hash"],
"additionalProperties": False,
"properties": {
"hash": {"type": "string", "minLength": 1},
"type": {
"type": "array",
"items": {"type": "string", "minLength": 1},
},
},
}
},
},
"prefix": {"type": "string", "minLength": 1},
"rpm": {"type": "string", "minLength": 1},
"hash": {"type": "string", "minLength": 1},
"parameters": {"type": "object"},
},
},
},
},
"properties": properties,
}

View File

@@ -6,12 +6,41 @@
"""Schema for database index.json file
.. literalinclude:: _spack_root/lib/spack/spack/schema/database_index.py
:lines: 36-
:lines: 17-
"""
from typing import Any, Dict
import spack.schema.spec
# spack.schema.spec.properties
properties: Dict[str, Any] = {
"database": {
"type": "object",
"required": ["installs", "version"],
"additionalProperties": False,
"properties": {
"installs": {
"type": "object",
"patternProperties": {
r"^[\w\d]{32}$": {
"type": "object",
"properties": {
"spec": spack.schema.spec.properties,
"path": {"oneOf": [{"type": "string"}, {"type": "null"}]},
"installed": {"type": "boolean"},
"ref_count": {"type": "integer", "minimum": 0},
"explicit": {"type": "boolean"},
"installation_time": {"type": "number"},
},
}
},
},
"version": {"type": "string"},
},
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
@@ -19,30 +48,5 @@
"type": "object",
"required": ["database"],
"additionalProperties": False,
"properties": {
"database": {
"type": "object",
"required": ["installs", "version"],
"additionalProperties": False,
"properties": {
"installs": {
"type": "object",
"patternProperties": {
r"^[\w\d]{32}$": {
"type": "object",
"properties": {
"spec": spack.schema.spec.properties,
"path": {"oneOf": [{"type": "string"}, {"type": "null"}]},
"installed": {"type": "boolean"},
"ref_count": {"type": "integer", "minimum": 0},
"explicit": {"type": "boolean"},
"installation_time": {"type": "number"},
},
}
},
},
"version": {"type": "string"},
},
}
},
"properties": properties,
}

View File

@@ -6,13 +6,14 @@
"""Schema for definitions
.. literalinclude:: _spack_root/lib/spack/spack/schema/definitions.py
:lines: 13-
:lines: 16-
"""
from typing import Any, Dict
import spack.schema
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"definitions": {
"type": "array",
"default": [],

View File

@@ -2,9 +2,9 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Any, Dict
properties = {
properties: Dict[str, Any] = {
"develop": {
"type": "object",
"default": {},

View File

@@ -6,8 +6,10 @@
"""Schema for env.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/env.py
:lines: 36-
:lines: 19-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema.gitlab_ci # DEPRECATED
@@ -19,61 +21,31 @@
projections_scheme = spack.schema.projections.properties["projections"]
properties: Dict[str, Any] = {
"spack": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": union_dicts(
# Include deprecated "gitlab-ci" section
spack.schema.gitlab_ci.properties,
# merged configuration scope schemas
spack.schema.merged.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
},
),
}
}
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack environment file schema",
"type": "object",
"additionalProperties": False,
"properties": {
"spack": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": union_dicts(
# Include deprecated "gitlab-ci" section
spack.schema.gitlab_ci.properties,
# merged configuration scope schemas
spack.schema.merged.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
"view": {
"anyOf": [
{"type": "boolean"},
{"type": "string"},
{
"type": "object",
"patternProperties": {
r"\w+": {
"required": ["root"],
"additionalProperties": False,
"properties": {
"root": {"type": "string"},
"link": {
"type": "string",
"pattern": "(roots|all|run)",
},
"link_type": {"type": "string"},
"select": {
"type": "array",
"items": {"type": "string"},
},
"exclude": {
"type": "array",
"items": {"type": "string"},
},
"projections": projections_scheme,
},
}
},
},
]
},
},
),
}
},
"properties": properties,
}

View File

@@ -6,6 +6,7 @@
schemas.
"""
import collections.abc
from typing import Any, Dict
array_of_strings_or_num = {
"type": "array",
@@ -18,7 +19,7 @@
"patternProperties": {r"\w[\w-]*": {"anyOf": [{"type": "string"}, {"type": "number"}]}},
}
definition = {
definition: Dict[str, Any] = {
"type": "object",
"default": {},
"additionalProperties": False,

View File

@@ -6,8 +6,9 @@
"""Schema for gitlab-ci.yaml configuration file.
.. literalinclude:: ../spack/schema/gitlab_ci.py
:lines: 13-
:lines: 15-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
@@ -35,7 +36,7 @@
runner_selector_schema = {
"type": "object",
"additionalProperties": False,
"additionalProperties": True,
"required": ["tags"],
"properties": runner_attributes_schema_items,
}
@@ -112,7 +113,7 @@
}
#: Properties for inclusion in other schemas
properties = {"gitlab-ci": gitlab_ci_properties}
properties: Dict[str, Any] = {"gitlab-ci": gitlab_ci_properties}
#: Full schema with metadata
schema = {

View File

@@ -6,8 +6,10 @@
"""Schema for configuration merged into one file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/merged.py
:lines: 39-
:lines: 32-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema.bootstrap
@@ -24,9 +26,10 @@
import spack.schema.packages
import spack.schema.repos
import spack.schema.upstreams
import spack.schema.view
#: Properties for inclusion in other schemas
properties = union_dicts(
properties: Dict[str, Any] = union_dicts(
spack.schema.bootstrap.properties,
spack.schema.cdash.properties,
spack.schema.compilers.properties,
@@ -41,6 +44,7 @@
spack.schema.packages.properties,
spack.schema.repos.properties,
spack.schema.upstreams.properties,
spack.schema.view.properties,
)

View File

@@ -6,8 +6,9 @@
"""Schema for mirrors.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/mirrors.py
:lines: 12-69
:lines: 13-
"""
from typing import Any, Dict
#: Common properties for connection specification
connection = {
@@ -50,7 +51,7 @@
}
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"mirrors": {
"type": "object",
"default": {},

View File

@@ -6,8 +6,10 @@
"""Schema for modules.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/modules.py
:lines: 13-
:lines: 16-
"""
from typing import Any, Dict
import spack.schema.environment
import spack.schema.projections
@@ -141,7 +143,7 @@
# Properties for inclusion into other schemas (requires definitions)
properties = {
properties: Dict[str, Any] = {
"modules": {
"type": "object",
"additionalProperties": False,

View File

@@ -5,8 +5,10 @@
"""Schema for packages.yaml configuration files.
.. literalinclude:: _spack_root/lib/spack/spack/schema/packages.py
:lines: 13-
:lines: 14-
"""
from typing import Any, Dict
import spack.schema.environment
permissions = {
@@ -91,7 +93,7 @@
REQUIREMENT_URL = "https://spack.readthedocs.io/en/latest/packages_yaml.html#package-requirements"
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"packages": {
"type": "object",
"default": {},

View File

@@ -6,12 +6,12 @@
"""Schema for projections.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/projections.py
:lines: 13-
:lines: 14-
"""
from typing import Any, Dict
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"projections": {"type": "object", "patternProperties": {r"all|\w[\w-]*": {"type": "string"}}}
}

View File

@@ -6,12 +6,14 @@
"""Schema for repos.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/repos.py
:lines: 13-
:lines: 14-
"""
from typing import Any, Dict
#: Properties for inclusion in other schemas
properties = {"repos": {"type": "array", "default": [], "items": {"type": "string"}}}
properties: Dict[str, Any] = {
"repos": {"type": "array", "default": [], "items": {"type": "string"}}
}
#: Full schema with metadata

View File

@@ -0,0 +1,46 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for spack environment
.. literalinclude:: _spack_root/lib/spack/spack/schema/spack.py
:lines: 20-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema
import spack.schema.gitlab_ci as ci_schema # DEPRECATED
import spack.schema.merged as merged_schema
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
"spack": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": union_dicts(
# Include deprecated "gitlab-ci" section
ci_schema.properties,
# merged configuration scope schemas
merged_schema.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
},
),
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack environment file schema",
"type": "object",
"additionalProperties": False,
"properties": properties,
}

View File

@@ -8,9 +8,9 @@
TODO: This needs to be updated? Especially the hashes under properties.
.. literalinclude:: _spack_root/lib/spack/spack/schema/spec.py
:lines: 13-
:lines: 15-
"""
from typing import Any, Dict
target = {
"oneOf": [
@@ -57,7 +57,7 @@
}
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"spec": {
"type": "object",
"additionalProperties": False,

View File

@@ -2,10 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Any, Dict
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"upstreams": {
"type": "object",
"default": {},

View File

@@ -0,0 +1,49 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for view
.. literalinclude:: _spack_root/lib/spack/spack/schema/view.py
:lines: 15-
"""
from typing import Any, Dict
import spack.schema
projections_scheme = spack.schema.projections.properties["projections"]
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
"view": {
"anyOf": [
{"type": "boolean"},
{"type": "string"},
{
"type": "object",
"patternProperties": {
r"\w+": {
"required": ["root"],
"additionalProperties": False,
"properties": {
"root": {"type": "string"},
"link": {"type": "string", "pattern": "(roots|all|run)"},
"link_type": {"type": "string"},
"select": {"type": "array", "items": {"type": "string"}},
"exclude": {"type": "array", "items": {"type": "string"}},
"projections": projections_scheme,
},
}
},
},
]
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack view configuration file schema",
"properties": properties,
}

View File

@@ -19,51 +19,6 @@
import archspec.cpu
import spack.config as sc
import spack.deptypes as dt
import spack.parser
import spack.paths as sp
import spack.util.path as sup
try:
import clingo # type: ignore[import]
# There may be a better way to detect this
clingo_cffi = hasattr(clingo.Symbol, "_rep")
except ImportError:
clingo = None # type: ignore
clingo_cffi = False
except AttributeError:
# Reaching this point indicates a broken clingo installation
# If Spack derived clingo, suggest user re-run bootstrap
# if non-spack, suggest user investigate installation
# assume Spack is not responsible for broken clingo
msg = (
f"Clingo installation at {clingo.__file__} is incomplete or invalid. "
"Please repair the installation or re-install. "
"Alternatively, consider installing clingo via Spack."
)
# check whether Spack is responsible
if (
pathlib.Path(
sup.canonicalize_path(sc.get("bootstrap:root", sp.default_user_bootstrap_path))
)
in pathlib.Path(clingo.__file__).parents
):
# Spack is responsible for the broken clingo
msg = (
"Spack bootstrapped copy of Clingo is broken, "
"please re-run the bootstrapping process via command `spack bootstrap now`."
" If this issue persists, please file a bug at: github.com/spack/spack"
)
raise RuntimeError(
"Clingo installation may be broken or incomplete, "
"please verify clingo has been installed correctly"
"\n\nClingo does not provide symbol clingo.Symbol"
f"{msg}"
)
import llnl.util.lang
import llnl.util.tty as tty
@@ -72,11 +27,14 @@
import spack.cmd
import spack.compilers
import spack.config
import spack.config as sc
import spack.deptypes as dt
import spack.directives
import spack.environment as ev
import spack.error
import spack.package_base
import spack.package_prefs
import spack.parser
import spack.platforms
import spack.repo
import spack.spec
@@ -89,13 +47,23 @@
import spack.version.git_ref_lookup
from spack import traverse
from .core import (
AspFunction,
NodeArgument,
ast_sym,
ast_type,
clingo,
clingo_cffi,
extract_args,
fn,
parse_files,
parse_term,
)
from .counter import FullDuplicatesCounter, MinimalDuplicatesCounter, NoDuplicatesCounter
GitOrStandardVersion = Union[spack.version.GitVersion, spack.version.StandardVersion]
# these are from clingo.ast and bootstrapped later
ASTType = None
parse_files = None
TransformFunction = Callable[["spack.spec.Spec", List[AspFunction]], List[AspFunction]]
#: Enable the addition of a runtime node
WITH_RUNTIME = sys.platform != "win32"
@@ -120,29 +88,13 @@
def default_clingo_control():
"""Return a control object with the default settings used in Spack"""
control = clingo.Control()
control = clingo().Control()
control.configuration.configuration = "tweety"
control.configuration.solver.heuristic = "Domain"
control.configuration.solver.opt_strategy = "usc,one"
return control
# backward compatibility functions for clingo ASTs
def ast_getter(*names):
def getter(node):
for name in names:
result = getattr(node, name, None)
if result:
return result
raise KeyError("node has no such keys: %s" % names)
return getter
ast_type = ast_getter("ast_type", "type")
ast_sym = ast_getter("symbol", "term")
class Provenance(enum.IntEnum):
"""Enumeration of the possible provenances of a version."""
@@ -301,86 +253,6 @@ def specify(spec):
return spack.spec.Spec(spec)
class AspObject:
"""Object representing a piece of ASP code."""
def _id(thing):
"""Quote string if needed for it to be a valid identifier."""
if isinstance(thing, AspObject):
return thing
elif isinstance(thing, bool):
return '"%s"' % str(thing)
elif isinstance(thing, int):
return str(thing)
else:
return '"%s"' % str(thing)
@llnl.util.lang.key_ordering
class AspFunction(AspObject):
__slots__ = ["name", "args"]
def __init__(self, name, args=None):
self.name = name
self.args = () if args is None else tuple(args)
def _cmp_key(self):
return self.name, self.args
def __call__(self, *args):
"""Return a new instance of this function with added arguments.
Note that calls are additive, so you can do things like::
>>> attr = AspFunction("attr")
attr()
>>> attr("version")
attr("version")
>>> attr("version")("foo")
attr("version", "foo")
>>> v = AspFunction("attr", "version")
attr("version")
>>> v("foo", "bar")
attr("version", "foo", "bar")
"""
return AspFunction(self.name, self.args + args)
def symbol(self, positive=True):
def argify(arg):
if isinstance(arg, bool):
return clingo.String(str(arg))
elif isinstance(arg, int):
return clingo.Number(arg)
elif isinstance(arg, AspFunction):
return clingo.Function(arg.name, [argify(x) for x in arg.args], positive=positive)
else:
return clingo.String(str(arg))
return clingo.Function(self.name, [argify(arg) for arg in self.args], positive=positive)
def __str__(self):
return "%s(%s)" % (self.name, ", ".join(str(_id(arg)) for arg in self.args))
def __repr__(self):
return str(self)
class AspFunctionBuilder:
def __getattr__(self, name):
return AspFunction(name)
fn = AspFunctionBuilder()
TransformFunction = Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]
def remove_node(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunction]:
"""Transformation that removes all "node" and "virtual_node" from the input list of facts."""
return list(filter(lambda x: x.args[0] not in ("node", "virtual_node"), facts))
@@ -663,72 +535,6 @@ def _spec_with_default_name(spec_str, name):
return spec
def bootstrap_clingo():
global clingo, ASTType, parse_files
if not clingo:
import spack.bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
import clingo
from clingo.ast import ASTType
try:
from clingo.ast import parse_files
except ImportError:
# older versions of clingo have this one namespace up
from clingo import parse_files
class NodeArgument(NamedTuple):
id: str
pkg: str
def intermediate_repr(sym):
"""Returns an intermediate representation of clingo models for Spack's spec builder.
Currently, transforms symbols from clingo models either to strings or to NodeArgument objects.
Returns:
This will turn a ``clingo.Symbol`` into a string or NodeArgument, or a sequence of
``clingo.Symbol`` objects into a tuple of those objects.
"""
# TODO: simplify this when we no longer have to support older clingo versions.
if isinstance(sym, (list, tuple)):
return tuple(intermediate_repr(a) for a in sym)
try:
if sym.name == "node":
return NodeArgument(
id=intermediate_repr(sym.arguments[0]), pkg=intermediate_repr(sym.arguments[1])
)
except RuntimeError:
# This happens when using clingo w/ CFFI and trying to access ".name" for symbols
# that are not functions
pass
if clingo_cffi:
# Clingo w/ CFFI will throw an exception on failure
try:
return sym.string
except RuntimeError:
return str(sym)
else:
return sym.string or str(sym)
def extract_args(model, predicate_name):
"""Extract the arguments to predicates with the provided name from a model.
Pull out all the predicates with name ``predicate_name`` from the model, and
return their intermediate representation.
"""
return [intermediate_repr(sym.arguments) for sym in model if sym.name == predicate_name]
class ErrorHandler:
def __init__(self, model):
self.model = model
@@ -830,7 +636,7 @@ def raise_if_errors(self):
if not initial_error_args:
return
error_causation = clingo.Control()
error_causation = clingo().Control()
parent_dir = pathlib.Path(__file__).parent
errors_lp = parent_dir / "error_messages.lp"
@@ -881,54 +687,9 @@ def __init__(self, cores=True):
cores (bool): whether to generate unsatisfiable cores for better
error reporting.
"""
bootstrap_clingo()
self.out = llnl.util.lang.Devnull()
self.cores = cores
# These attributes are part of the object, but will be reset
# at each call to solve
# This attribute will be reset at each call to solve
self.control = None
self.backend = None
self.assumptions = None
def title(self, name, char):
self.out.write("\n")
self.out.write("%" + (char * 76))
self.out.write("\n")
self.out.write("%% %s\n" % name)
self.out.write("%" + (char * 76))
self.out.write("\n")
def h1(self, name):
self.title(name, "=")
def h2(self, name):
self.title(name, "-")
def newline(self):
self.out.write("\n")
def fact(self, head):
"""ASP fact (a rule without a body).
Arguments:
head (AspFunction): ASP function to generate as fact
"""
symbol = head.symbol() if hasattr(head, "symbol") else head
# Guard on the stream type to avoid evaluating str(symbol) when we have no output stream
if not isinstance(self.out, llnl.util.lang.Devnull):
self.out.write(f"{str(symbol)}.\n")
atom = self.backend.add_atom(symbol)
# Only functions relevant for constructing bug reports for bad error messages
# are assumptions, and only when using cores.
choice = self.cores and symbol.name == "internal_error"
self.backend.add_rule([atom], [], choice=choice)
if choice:
self.assumptions.append(atom)
def solve(self, setup, specs, reuse=None, output=None, control=None, allow_deprecated=False):
"""Set up the input and solve for dependencies of ``specs``.
@@ -948,49 +709,24 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
solve, and the internal statistics from clingo.
"""
output = output or DEFAULT_OUTPUT_CONFIGURATION
# allow solve method to override the output stream
if output.out is not None:
self.out = output.out
timer = spack.util.timer.Timer()
# Initialize the control object for the solver
self.control = control or default_clingo_control()
# set up the problem -- this generates facts and rules
self.assumptions = []
timer.start("setup")
with self.control.backend() as backend:
self.backend = backend
setup.setup(self, specs, reuse=reuse, allow_deprecated=allow_deprecated)
asp_problem = setup.setup(specs, reuse=reuse, allow_deprecated=allow_deprecated)
if output.out is not None:
output.out.write(asp_problem)
if output.setup_only:
return Result(specs), None, None
timer.stop("setup")
timer.start("load")
# read in the main ASP program and display logic -- these are
# handwritten, not generated, so we load them as resources
parent_dir = os.path.dirname(__file__)
# extract error messages from concretize.lp by inspecting its AST
with self.backend:
def visit(node):
if ast_type(node) == ASTType.Rule:
for term in node.body:
if ast_type(term) == ASTType.Literal:
if ast_type(term.atom) == ASTType.SymbolicAtom:
name = ast_sym(term.atom).name
if name == "internal_error":
arg = ast_sym(ast_sym(term.atom).arguments[0])
self.fact(AspFunction(name)(arg.string))
self.h1("Error messages")
path = os.path.join(parent_dir, "concretize.lp")
parse_files([path], visit)
# If we're only doing setup, just return an empty solve result
if output.setup_only:
return Result(specs), None, None
# Add the problem instance
self.control.add("base", [], asp_problem)
# Load the file itself
parent_dir = os.path.dirname(__file__)
self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
@@ -1016,12 +752,12 @@ def on_model(model):
models.append((model.cost, model.symbols(shown=True, terms=True)))
solve_kwargs = {
"assumptions": self.assumptions,
"assumptions": setup.assumptions,
"on_model": on_model,
"on_core": cores.append,
}
if clingo_cffi:
if clingo_cffi():
solve_kwargs["on_unsat"] = cores.append
timer.start("solve")
@@ -1142,6 +878,7 @@ class SpackSolverSetup:
def __init__(self, tests=False):
self.gen = None # set by setup()
self.assumptions = []
self.declared_versions = collections.defaultdict(list)
self.possible_versions = collections.defaultdict(set)
self.deprecated_versions = collections.defaultdict(set)
@@ -1768,15 +1505,12 @@ def external_packages(self):
for local_idx, spec in enumerate(external_specs):
msg = "%s available as external when satisfying %s" % (spec.name, spec)
def external_imposition(input_spec, _):
return [fn.attr("external_conditions_hold", input_spec.name, local_idx)]
def external_imposition(input_spec, requirements):
return requirements + [
fn.attr("external_conditions_hold", input_spec.name, local_idx)
]
self.condition(
spec,
spack.spec.Spec(spec.name),
msg=msg,
transform_imposed=external_imposition,
)
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
self.possible_versions[spec.name].add(spec.version)
self.gen.newline()
@@ -1881,36 +1615,7 @@ def _spec_clauses(
"""
clauses = []
# TODO: do this with consistent suffixes.
class Head:
node = fn.attr("node")
virtual_node = fn.attr("virtual_node")
node_platform = fn.attr("node_platform_set")
node_os = fn.attr("node_os_set")
node_target = fn.attr("node_target_set")
variant_value = fn.attr("variant_set")
node_compiler = fn.attr("node_compiler_set")
node_compiler_version = fn.attr("node_compiler_version_set")
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
class Body:
node = fn.attr("node")
virtual_node = fn.attr("virtual_node")
node_platform = fn.attr("node_platform")
node_os = fn.attr("node_os")
node_target = fn.attr("node_target")
variant_value = fn.attr("variant_value")
node_compiler = fn.attr("node_compiler")
node_compiler_version = fn.attr("node_compiler_version")
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
f = Body if body else Head
f = _Body if body else _Head
if spec.name:
clauses.append(f.node(spec.name) if not spec.virtual else f.virtual_node(spec.name))
@@ -2506,12 +2211,11 @@ def define_concrete_input_specs(self, specs, possible):
def setup(
self,
driver: PyclingoDriver,
specs: Sequence[spack.spec.Spec],
*,
reuse: Optional[List[spack.spec.Spec]] = None,
allow_deprecated: bool = False,
):
) -> str:
"""Generate an ASP program with relevant constraints for specs.
This calls methods on the solve driver to set up the problem with
@@ -2519,7 +2223,6 @@ def setup(
specs, as well as constraints from the specs themselves.
Arguments:
driver: driver instance of this solve
specs: list of Specs to solve
reuse: list of concrete specs that can be reused
allow_deprecated: if True adds deprecated versions into the solve
@@ -2545,9 +2248,7 @@ def setup(
if node.namespace is not None:
self.explicitly_required_namespaces[node.name] = node.namespace
# driver is used by all the functions below to add facts and
# rules to generate an ASP program.
self.gen = driver
self.gen = ProblemInstanceBuilder()
if not allow_deprecated:
self.gen.fact(fn.deprecated_versions_not_allowed())
@@ -2651,6 +2352,29 @@ def setup(
self.gen.h1("Target Constraints")
self.define_target_constraints()
self.gen.h1("Internal errors")
self.internal_errors()
return self.gen.value()
def internal_errors(self):
parent_dir = os.path.dirname(__file__)
def visit(node):
if ast_type(node) == clingo().ast.ASTType.Rule:
for term in node.body:
if ast_type(term) == clingo().ast.ASTType.Literal:
if ast_type(term.atom) == clingo().ast.ASTType.SymbolicAtom:
name = ast_sym(term.atom).name
if name == "internal_error":
arg = ast_sym(ast_sym(term.atom).arguments[0])
symbol = AspFunction(name)(arg.string)
self.assumptions.append((parse_term(str(symbol)), True))
self.gen.asp_problem.append(f"{{ {symbol} }}.\n")
path = os.path.join(parent_dir, "concretize.lp")
parse_files([path], visit)
def define_runtime_constraints(self):
"""Define the constraints to be imposed on the runtimes"""
recorder = RuntimePropertyRecorder(self)
@@ -2781,6 +2505,83 @@ def pkg_class(self, pkg_name: str) -> typing.Type["spack.package_base.PackageBas
return spack.repo.PATH.get_pkg_class(request)
class _Head:
"""ASP functions used to express spec clauses in the HEAD of a rule"""
node = fn.attr("node")
virtual_node = fn.attr("virtual_node")
node_platform = fn.attr("node_platform_set")
node_os = fn.attr("node_os_set")
node_target = fn.attr("node_target_set")
variant_value = fn.attr("variant_set")
node_compiler = fn.attr("node_compiler_set")
node_compiler_version = fn.attr("node_compiler_version_set")
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
class _Body:
"""ASP functions used to express spec clauses in the BODY of a rule"""
node = fn.attr("node")
virtual_node = fn.attr("virtual_node")
node_platform = fn.attr("node_platform")
node_os = fn.attr("node_os")
node_target = fn.attr("node_target")
variant_value = fn.attr("variant_value")
node_compiler = fn.attr("node_compiler")
node_compiler_version = fn.attr("node_compiler_version")
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
class ProblemInstanceBuilder:
"""Provides an interface to construct a problem instance.
Once all the facts and rules have been added, the problem instance can be retrieved with:
>>> builder = ProblemInstanceBuilder()
>>> ...
>>> problem_instance = builder.value()
The problem instance can be added directly to the "control" structure of clingo.
"""
def __init__(self):
self.asp_problem = []
def fact(self, atom: AspFunction) -> None:
symbol = atom.symbol() if hasattr(atom, "symbol") else atom
self.asp_problem.append(f"{str(symbol)}.\n")
def append(self, rule: str) -> None:
self.asp_problem.append(rule)
def title(self, header: str, char: str) -> None:
self.asp_problem.append("\n")
self.asp_problem.append("%" + (char * 76))
self.asp_problem.append("\n")
self.asp_problem.append(f"% {header}\n")
self.asp_problem.append("%" + (char * 76))
self.asp_problem.append("\n")
def h1(self, header: str) -> None:
self.title(header, "=")
def h2(self, header: str) -> None:
self.title(header, "-")
def newline(self):
self.asp_problem.append("\n")
def value(self) -> str:
return "".join(self.asp_problem)
class RequirementParser:
"""Parses requirements from package.py files and configuration, and returns rules."""
@@ -3088,9 +2889,7 @@ def consume_facts(self):
self._setup.gen.h2("Runtimes: rules")
self._setup.gen.newline()
for rule in self.rules:
if not isinstance(self._setup.gen.out, llnl.util.lang.Devnull):
self._setup.gen.out.write(rule)
self._setup.gen.control.add("base", [], rule)
self._setup.gen.append(rule)
self._setup.gen.h2("Runtimes: conditions")
for runtime_pkg in spack.repo.PATH.packages_with_tags("runtime"):


@@ -698,6 +698,14 @@ requirement_group_satisfied(node(ID, Package), X) :-
activate_requirement(node(ID, Package), X),
requirement_group(Package, X).
% Do not impose requirements if the conditional requirement is not active
do_not_impose(EffectID, node(ID, Package)) :-
trigger_condition_holds(TriggerID, node(ID, Package)),
pkg_fact(Package, condition_trigger(ConditionID, TriggerID)),
pkg_fact(Package, condition_effect(ConditionID, EffectID)),
requirement_group_member(ConditionID, Package, RequirementID),
not activate_requirement(node(ID, Package), RequirementID).
% When we have a required provider, we need to ensure that the provider/2 facts respect
% the requirement. This is particularly important for packages that could provide multiple
% virtuals independently


@@ -0,0 +1,272 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Low-level wrappers around clingo API."""
import importlib
import pathlib
from types import ModuleType
from typing import Any, Callable, NamedTuple, Optional, Tuple, Union
from llnl.util import lang
def _ast_getter(*names: str) -> Callable[[Any], Any]:
"""Helper to retrieve AST attributes from different versions of the clingo API"""
def getter(node):
for name in names:
result = getattr(node, name, None)
if result:
return result
raise KeyError(f"node has no such keys: {names}")
return getter
ast_type = _ast_getter("ast_type", "type")
ast_sym = _ast_getter("symbol", "term")
class AspObject:
"""Object representing a piece of ASP code."""
def _id(thing: Any) -> Union[str, AspObject]:
"""Quote string if needed for it to be a valid identifier."""
if isinstance(thing, AspObject):
return thing
elif isinstance(thing, bool):
return f'"{str(thing)}"'
elif isinstance(thing, int):
return str(thing)
else:
return f'"{str(thing)}"'
@lang.key_ordering
class AspFunction(AspObject):
"""A term in the ASP logic program"""
__slots__ = ["name", "args"]
def __init__(self, name: str, args: Optional[Tuple[Any, ...]] = None) -> None:
self.name = name
self.args = () if args is None else tuple(args)
def _cmp_key(self) -> Tuple[str, Optional[Tuple[Any, ...]]]:
return self.name, self.args
def __call__(self, *args: Any) -> "AspFunction":
"""Return a new instance of this function with added arguments.
Note that calls are additive, so you can do things like::
>>> attr = AspFunction("attr")
attr()
>>> attr("version")
attr("version")
>>> attr("version")("foo")
attr("version", "foo")
>>> v = AspFunction("attr", "version")
attr("version")
>>> v("foo", "bar")
attr("version", "foo", "bar")
"""
return AspFunction(self.name, self.args + args)
def _argify(self, arg: Any) -> Any:
"""Turn the argument into an appropriate clingo symbol"""
if isinstance(arg, bool):
return clingo().String(str(arg))
elif isinstance(arg, int):
return clingo().Number(arg)
elif isinstance(arg, AspFunction):
return clingo().Function(arg.name, [self._argify(x) for x in arg.args], positive=True)
return clingo().String(str(arg))
def symbol(self):
"""Return a clingo symbol for this function"""
return clingo().Function(
self.name, [self._argify(arg) for arg in self.args], positive=True
)
def __str__(self) -> str:
return f"{self.name}({', '.join(str(_id(arg)) for arg in self.args)})"
def __repr__(self) -> str:
return str(self)
class _AspFunctionBuilder:
def __getattr__(self, name):
return AspFunction(name)
#: Global AspFunction builder
fn = _AspFunctionBuilder()
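# Illustration (not in the diff): attribute access on `fn` mints a new
# function term, and calls are additive, e.g.
#
#   str(fn.node(1, "zlib"))         # -> 'node(1, "zlib")'
#   str(fn.attr("version")("foo"))  # -> 'attr("version", "foo")'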
_CLINGO_MODULE: Optional[ModuleType] = None
def clingo() -> ModuleType:
"""Lazy imports the Python module for clingo, and returns it."""
if _CLINGO_MODULE is not None:
return _CLINGO_MODULE
try:
clingo_mod = importlib.import_module("clingo")
# Make sure we didn't import an empty module
_ensure_clingo_or_raise(clingo_mod)
except ImportError:
clingo_mod = None
if clingo_mod is not None:
return _set_clingo_module_cache(clingo_mod)
clingo_mod = _bootstrap_clingo()
return _set_clingo_module_cache(clingo_mod)
def _set_clingo_module_cache(clingo_mod: ModuleType) -> ModuleType:
"""Sets the global cache to the lazy imported clingo module"""
global _CLINGO_MODULE
importlib.import_module("clingo.ast")
_CLINGO_MODULE = clingo_mod
return clingo_mod
def _ensure_clingo_or_raise(clingo_mod: ModuleType) -> None:
"""Ensures the clingo module can access expected attributes, otherwise raises an error."""
# These are imports that may be problematic at top level (circular imports). They are used
# only to provide exhaustive details when erroring due to a broken clingo module.
import spack.config
import spack.paths as sp
import spack.util.path as sup
try:
clingo_mod.Symbol
except AttributeError:
assert clingo_mod.__file__ is not None, "clingo installation is incomplete or invalid"
# Reaching this point indicates a broken clingo installation
# If Spack bootstrapped this clingo, suggest the user re-run the bootstrap;
# if it came from elsewhere, suggest the user investigate the installation.
# By default, assume Spack is not responsible for the broken clingo.
msg = (
f"Clingo installation at {clingo_mod.__file__} is incomplete or invalid."
"Please repair installation or re-install. "
"Alternatively, consider installing clingo via Spack."
)
# check whether Spack is responsible
if (
pathlib.Path(
sup.canonicalize_path(
spack.config.CONFIG.get("bootstrap:root", sp.default_user_bootstrap_path)
)
)
in pathlib.Path(clingo_mod.__file__).parents
):
# Spack is responsible for the broken clingo
msg = (
"Spack bootstrapped copy of Clingo is broken, "
"please re-run the bootstrapping process via command `spack bootstrap now`."
" If this issue persists, please file a bug at: github.com/spack/spack"
)
raise RuntimeError(
"Clingo installation may be broken or incomplete, "
"please verify clingo has been installed correctly"
"\n\nClingo does not provide symbol clingo.Symbol"
f"{msg}"
)
def clingo_cffi() -> bool:
"""Returns True if clingo uses the CFFI interface"""
return hasattr(clingo().Symbol, "_rep")
def _bootstrap_clingo() -> ModuleType:
"""Bootstraps the clingo module and returns it"""
import spack.bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
clingo_mod = importlib.import_module("clingo")
return clingo_mod
def parse_files(*args, **kwargs):
"""Wrapper around clingo parse_files, that dispatches the function according
to clingo API version.
"""
clingo()
try:
return importlib.import_module("clingo.ast").parse_files(*args, **kwargs)
except (ImportError, AttributeError):
return clingo().parse_files(*args, **kwargs)
def parse_term(*args, **kwargs):
"""Wrapper around clingo parse_term, that dispatches the function according
to clingo API version.
"""
clingo()
try:
return importlib.import_module("clingo.symbol").parse_term(*args, **kwargs)
except (ImportError, AttributeError):
return clingo().parse_term(*args, **kwargs)
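# Usage sketch (illustrative): both wrappers force the lazy clingo import and
# then dispatch to whichever module layout the installed clingo provides.
#
#   sym = parse_term('attr("node", "zlib")')  # -> a clingo Symbol
#   parse_files(["concretize.lp"], print)     # invokes the callback per AST node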
class NodeArgument(NamedTuple):
"""Represents a node in the DAG"""
id: str
pkg: str
def intermediate_repr(sym):
"""Returns an intermediate representation of clingo models for Spack's spec builder.
Currently, transforms symbols from clingo models either to strings or to NodeArgument objects.
Returns:
This will turn a ``clingo.Symbol`` into a string or NodeArgument, or a sequence of
``clingo.Symbol`` objects into a tuple of those objects.
"""
# TODO: simplify this when we no longer have to support older clingo versions.
if isinstance(sym, (list, tuple)):
return tuple(intermediate_repr(a) for a in sym)
try:
if sym.name == "node":
return NodeArgument(
id=intermediate_repr(sym.arguments[0]), pkg=intermediate_repr(sym.arguments[1])
)
except RuntimeError:
# This happens when using clingo w/ CFFI and trying to access ".name" for symbols
# that are not functions
pass
if clingo_cffi():
# Clingo w/ CFFI will throw an exception on failure
try:
return sym.string
except RuntimeError:
return str(sym)
else:
return sym.string or str(sym)
def extract_args(model, predicate_name):
"""Extract the arguments to predicates with the provided name from a model.
Pull out all the predicates with name ``predicate_name`` from the model, and
return their intermediate representation.
"""
return [intermediate_repr(sym.arguments) for sym in model if sym.name == predicate_name]
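# Example with hypothetical model contents: pull every "attr" predicate out of
# a list of stable-model symbols.
#
#   model = [parse_term('attr("version", node("0", "zlib"), "1.3")')]
#   extract_args(model, "attr")
#   # -> [('version', NodeArgument(id='0', pkg='zlib'), '1.3')]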


@@ -186,11 +186,11 @@ class InstallStatus(enum.Enum):
Options are artificially disjoint for display purposes
"""
installed = "@g{[+]} "
upstream = "@g{[^]} "
external = "@g{[e]} "
absent = "@K{ - } "
missing = "@r{[-]} "
INSTALLED = "@g{[+]}"
UPSTREAM = "@g{[^]}"
EXTERNAL = "@g{[e]}"
ABSENT = "@K{ - }"
MISSING = "@r{[-]}"
def colorize_spec(spec):
@@ -1499,9 +1499,11 @@ def edge_attributes(self) -> str:
if not deptypes_str and not virtuals_str:
return ""
result = f"{deptypes_str} {virtuals_str}".strip()
return f"[{result}]"
return f"[{result}] "
def dependencies(self, name=None, deptype: Union[dt.DepTypes, dt.DepFlag] = dt.ALL):
def dependencies(
self, name=None, deptype: Union[dt.DepTypes, dt.DepFlag] = dt.ALL
) -> List["Spec"]:
"""Return a list of direct dependencies (nodes in the DAG).
Args:
@@ -1512,7 +1514,9 @@ def dependencies(self, name=None, deptype: Union[dt.DepTypes, dt.DepFlag] = dt.A
deptype = dt.canonicalize(deptype)
return [d.spec for d in self.edges_to_dependencies(name, depflag=deptype)]
def dependents(self, name=None, deptype: Union[dt.DepTypes, dt.DepFlag] = dt.ALL):
def dependents(
self, name=None, deptype: Union[dt.DepTypes, dt.DepFlag] = dt.ALL
) -> List["Spec"]:
"""Return a list of direct dependents (nodes in the DAG).
Args:
@@ -1636,23 +1640,23 @@ def _add_dependency(self, spec: "Spec", *, depflag: dt.DepFlag, virtuals: Tuple[
self.add_dependency_edge(spec, depflag=depflag, virtuals=virtuals)
return
# Keep the intersection of constraints when a dependency is added
# multiple times. Currently, we only allow identical edge types.
# Keep the intersection of constraints when a dependency is added multiple times.
# The only restriction, currently, is keeping the same dependency type
orig = self._dependencies[spec.name]
try:
dspec = next(dspec for dspec in orig if depflag == dspec.depflag)
except StopIteration:
current_deps = ", ".join(
dt.flag_to_chars(x.depflag) + " " + x.spec.short_spec for x in orig
)
edge_attrs = f"deptypes={dt.flag_to_chars(depflag).strip()}"
required_dep_str = f"^[{edge_attrs}] {str(spec)}"
raise DuplicateDependencyError(
f"{self.short_spec} cannot depend on '{spec.short_spec}' multiple times.\n"
f"\tRequired: {dt.flag_to_chars(depflag)}\n"
f"\tDependency: {current_deps}"
f"{spec.name} is a duplicate dependency, with conflicting dependency types\n"
f"\t'{str(self)}' cannot depend on '{required_dep_str}'"
)
try:
dspec.spec.constrain(spec)
dspec.update_virtuals(virtuals=virtuals)
except spack.error.UnsatisfiableSpecError:
raise DuplicateDependencyError(
f"Cannot depend on incompatible specs '{dspec.spec}' and '{spec}'"
@@ -4315,8 +4319,7 @@ def colorized(self):
return colorize_spec(self)
def format(self, format_string=DEFAULT_FORMAT, **kwargs):
r"""Prints out particular pieces of a spec, depending on what is
in the format string.
r"""Prints out particular pieces of a spec, depending on what is in the format string.
Using the ``{attribute}`` syntax, any field of the spec can be
selected. Those attributes can be recursive. For example,
@@ -4442,6 +4445,9 @@ def write_attribute(spec, attribute, color):
elif attribute == "spack_install":
write(morph(spec, spack.store.STORE.layout.root))
return
elif re.match(r"install_status", attribute):
write(self.install_status_symbol())
return
elif re.match(r"hash(:\d)?", attribute):
col = "#"
if ":" in attribute:
@@ -4536,8 +4542,18 @@ def write_attribute(spec, attribute, color):
"Format string terminated while reading attribute." "Missing terminating }."
)
# remove leading whitespace from directives that add it for internal formatting.
# Arch, compiler flags, and variants add spaces for spec format correctness, but
# we don't really want them in formatted string output. We do want to preserve
# whitespace from the format string.
formatted_spec = out.getvalue()
return formatted_spec.strip()
whitespace_attrs = [r"{arch=[^}]*}", r"{architecture}", r"{compiler_flags}", r"{variants}"]
if any(re.match(rx, format_string) for rx in whitespace_attrs):
formatted_spec = formatted_spec.lstrip()
if any(re.search(f"{rx}$", format_string) for rx in whitespace_attrs):
formatted_spec = formatted_spec.rstrip()
return formatted_spec
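# Sketch of the intended behavior, assuming a concrete spec `s` with variants:
# whitespace added internally by these attributes is trimmed at the string
# edges, while whitespace the caller wrote into the format string is kept.
#
#   s.format("{variants}")  # e.g. "cuda_arch=10" rather than " cuda_arch=10"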
def cformat(self, *args, **kwargs):
"""Same as format, but color defaults to auto instead of False."""
@@ -4587,7 +4603,7 @@ def __str__(self):
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
sorted_dependencies = [
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
d.format("{edge_attributes}" + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
@@ -4607,20 +4623,25 @@ def colored_str(self):
def install_status(self):
"""Helper for tree to print DB install status."""
if not self.concrete:
return InstallStatus.absent
return InstallStatus.ABSENT
if self.external:
return InstallStatus.external
return InstallStatus.EXTERNAL
upstream, record = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
if not record:
return InstallStatus.absent
return InstallStatus.ABSENT
elif upstream and record.installed:
return InstallStatus.upstream
return InstallStatus.UPSTREAM
elif record.installed:
return InstallStatus.installed
return InstallStatus.INSTALLED
else:
return InstallStatus.missing
return InstallStatus.MISSING
def install_status_symbol(self):
"""Get an install status symbol."""
status = self.install_status()
return clr.colorize(status.value)
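# Sketch of the new attribute in action (output is illustrative): any format
# string can now request the status symbol directly, instead of relying on
# Spec.tree().
#
#   Spec("zlib").concretized().format("{install_status} {name}")
#   # -> "[+] zlib" when installed; "[^]", "[e]", " - " or "[-]" otherwise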
def _installed_explicitly(self):
"""Helper for tree to print DB install status."""
@@ -4646,7 +4667,7 @@ def tree(
show_types: bool = False,
depth_first: bool = False,
recurse_dependencies: bool = True,
status_fn: Optional[Callable[["Spec"], InstallStatus]] = None,
install_status: bool = False,
prefix: Optional[Callable[["Spec"], str]] = None,
) -> str:
"""Prints out this spec and its dependencies, tree-formatted
@@ -4667,8 +4688,7 @@ def tree(
show_types: if True, show the (merged) dependency type of a node
depth_first: if True, traverse the DAG depth first when representing it as a tree
recurse_dependencies: if True, recurse on dependencies
status_fn: optional callable that takes a node as an argument and return its
installation status
install_status: if True, show installation status next to each spec
prefix: optional callable that takes a node as an argument and return its
installation prefix
"""
@@ -4682,6 +4702,9 @@ def tree(
):
node = dep_spec.spec
if install_status:
out += node.format("{install_status} ")
if prefix is not None:
out += prefix(node)
out += " " * indent
@@ -4689,15 +4712,6 @@ def tree(
if depth:
out += "%-4d" % d
if status_fn:
status = status_fn(node)
if status in list(InstallStatus):
out += clr.colorize(status.value, color=color)
elif status:
out += clr.colorize("@g{[+]} ", color=color)
else:
out += clr.colorize("@r{[-]} ", color=color)
if hashes:
out += clr.colorize("@K{%s} ", color=color) % node.dag_hash(hashlen)


@@ -199,9 +199,11 @@ def get_stage_root():
def _mirror_roots():
mirrors = spack.config.get("mirrors")
return [
sup.substitute_path_variables(root)
if root.endswith(os.sep)
else sup.substitute_path_variables(root) + os.sep
(
sup.substitute_path_variables(root)
if root.endswith(os.sep)
else sup.substitute_path_variables(root) + os.sep
)
for root in mirrors.values()
]


@@ -16,6 +16,7 @@
import spack
import spack.binary_distribution
import spack.ci as ci
import spack.cmd.ci
import spack.config
import spack.environment as ev
import spack.hash_types as ht
@@ -2028,6 +2029,43 @@ def fake_download_and_extract_artifacts(url, work_dir):
assert expect_out in rep_out
@pytest.mark.parametrize(
"url_in,url_out",
[
(
"https://example.com/api/v4/projects/1/jobs/2/artifacts",
"https://example.com/api/v4/projects/1/jobs/2/artifacts",
),
(
"https://example.com/spack/spack/-/jobs/123456/artifacts/download",
"https://example.com/spack/spack/-/jobs/123456/artifacts/download",
),
(
"https://example.com/spack/spack/-/jobs/123456",
"https://example.com/spack/spack/-/jobs/123456/artifacts/download",
),
(
"https://example.com/spack/spack/-/jobs/////123456////?x=y#z",
"https://example.com/spack/spack/-/jobs/123456/artifacts/download",
),
],
)
def test_reproduce_build_url_validation(url_in, url_out):
assert spack.cmd.ci._gitlab_artifacts_url(url_in) == url_out
def test_reproduce_build_url_validation_fails():
"""Wrong URLs should cause an exception"""
with pytest.raises(SystemExit):
ci_cmd("reproduce-build", "example.com/spack/spack/-/jobs/123456/artifacts/download")
with pytest.raises(SystemExit):
ci_cmd("reproduce-build", "https://example.com/spack/spack/-/issues")
with pytest.raises(SystemExit):
ci_cmd("reproduce-build", "https://example.com/spack/spack/-")
@pytest.mark.parametrize(
"subcmd", [(""), ("generate"), ("rebuild-index"), ("rebuild"), ("reproduce-build")]
)


@@ -215,6 +215,44 @@ def test_dev_build_env(tmpdir, install_mockery, mutable_mock_env_path):
assert f.read() == spec.package.replacement_string
def test_dev_build_env_with_vars(tmpdir, install_mockery, mutable_mock_env_path, monkeypatch):
"""Test Spack does dev builds for packages in develop section of env (path with variables)."""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
spec.concretize()
# store the build path in an environment variable that will be used in the environment
monkeypatch.setenv("CUSTOM_BUILD_PATH", build_dir)
with build_dir.as_cwd(), open(spec.package.filename, "w") as f:
f.write(spec.package.original_string)
# setup environment
envdir = tmpdir.mkdir("env")
with envdir.as_cwd():
with open("spack.yaml", "w") as f:
f.write(
"""\
spack:
specs:
- dev-build-test-install@0.0.0
develop:
dev-build-test-install:
spec: dev-build-test-install@0.0.0
path: $CUSTOM_BUILD_PATH
"""
)
env("create", "test", "./spack.yaml")
with ev.read("test"):
install()
assert spec.package.filename in os.listdir(spec.prefix)
with open(os.path.join(spec.prefix, spec.package.filename), "r") as f:
assert f.read() == spec.package.replacement_string
def test_dev_build_env_version_mismatch(tmpdir, install_mockery, mutable_mock_env_path):
"""Test Spack constraints concretization by develop specs."""
# setup dev-build-test-install package for dev build


@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import spack.config
import spack.environment as ev
import spack.spec
from spack.main import SpackCommand
@@ -21,7 +22,7 @@
@pytest.mark.usefixtures("mutable_mock_env_path", "mock_packages", "mock_fetch", "mutable_config")
class TestDevelop:
def check_develop(self, env, spec, path=None):
def check_develop(self, env, spec, path=None, build_dir=None):
path = path or spec.name
# check in memory representation
@@ -41,6 +42,12 @@ def check_develop(self, env, spec, path=None):
else:
assert yaml_entry["path"] == path
if build_dir is not None:
scope = env.scope_name
assert build_dir == spack.config.get(
"packages:{}:package_attributes:build_directory".format(spec.name), scope
)
def test_develop_no_path_no_clone(self):
env("create", "test")
with ev.read("test") as e:
@@ -72,6 +79,12 @@ def test_develop_no_args(self):
develop()
self.check_develop(e, spack.spec.Spec("mpich@=1.0"))
def test_develop_build_directory(self):
env("create", "test")
with ev.read("test") as e:
develop("-b", "test_build_dir", "mpich@1.0")
self.check_develop(e, spack.spec.Spec("mpich@=1.0"), None, "test_build_dir")
def test_develop_twice(self):
env("create", "test")
with ev.read("test") as e:


@@ -1471,8 +1471,8 @@ def test_env_view_fails_dir_file(tmpdir, mock_packages, mock_stage, mock_fetch,
view_dir = tmpdir.join("view")
env("create", "--with-view=%s" % view_dir, "test")
with ev.read("test"):
add("view-dir-file")
add("view-dir-dir")
add("view-file")
add("view-dir")
with pytest.raises(
llnl.util.link_tree.MergeConflictSummary, match=os.path.join("bin", "x")
):
@@ -1486,8 +1486,8 @@ def test_env_view_succeeds_symlinked_dir_file(
view_dir = tmpdir.join("view")
env("create", "--with-view=%s" % view_dir, "test")
with ev.read("test"):
add("view-dir-symlinked-dir")
add("view-dir-dir")
add("view-symlinked-dir")
add("view-dir")
install()
x_dir = os.path.join(str(view_dir), "bin", "x")
assert os.path.exists(os.path.join(x_dir, "file_in_dir"))
@@ -2537,58 +2537,88 @@ def test_stack_view_no_activate_without_default(
assert viewdir not in shell
@pytest.mark.parametrize("include_views", [True, False, "split"])
def test_stack_view_multiple_views(
tmpdir, mock_fetch, mock_packages, mock_archive, install_mockery
tmp_path,
mock_fetch,
mock_packages,
mock_archive,
install_mockery,
mutable_config,
include_views,
):
filename = str(tmpdir.join("spack.yaml"))
default_viewdir = str(tmpdir.join("default-view"))
combin_viewdir = str(tmpdir.join("combinatorial-view"))
with open(filename, "w") as f:
f.write(
"""\
spack:
"""Test multiple views as both included views (True), as both environment
views (False), or as one included and the other in the environment."""
# Write the view configuration and/or manifest file
view_filename = tmp_path / "view.yaml"
base_content = """\
definitions:
- packages: [mpileaks, cmake]
- compilers: ['%%gcc', '%%clang']
- compilers: ['%gcc', '%clang']
specs:
- matrix:
- [$packages]
- [$compilers]
"""
view:
default:
root: %s
select: ['%%gcc']
combinatorial:
root: %s
exclude: [callpath %%gcc]
projections:
'all': '{name}/{version}-{compiler.name}'"""
% (default_viewdir, combin_viewdir)
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
install()
include_content = f" include:\n - {view_filename}\n"
view_line = " view:\n"
shell = env("activate", "--sh", "test")
assert "PATH" in shell
assert os.path.join(default_viewdir, "bin") in shell
comb_dir = tmp_path / "combinatorial-view"
comb_view = """\
{0}combinatorial:
{0} root: {1}
{0} exclude: [callpath%gcc]
{0} projections:
"""
test = ev.read("test")
for spec in test._get_environment_specs():
projection = " 'all': '{name}/{version}-{compiler.name}'"
default_dir = tmp_path / "default-view"
default_view = """\
{0}default:
{0} root: {1}
{0} select: ['%gcc']
"""
content = "spack:\n"
indent = " "
if include_views is True:
# Include both the gcc and combinatorial views
view = "view:\n" + default_view.format(indent, str(default_dir))
view += comb_view.format(indent, str(comb_dir)) + indent + projection
view_filename.write_text(view)
content += include_content + base_content
elif include_views == "split":
# Include the gcc view and inline the combinatorial view
view = "view:\n" + default_view.format(indent, str(default_dir))
view_filename.write_text(view)
content += include_content + base_content + view_line
indent += " "
content += comb_view.format(indent, str(comb_dir)) + indent + projection
else:
# Inline both the gcc and combinatorial views in the environment.
indent += " "
content += base_content + view_line
content += default_view.format(indent, str(default_dir))
content += comb_view.format(indent, str(comb_dir)) + indent + projection
filename = tmp_path / ev.manifest_name
filename.write_text(content)
env("create", "test", str(filename))
with ev.read("test"):
install()
with ev.read("test") as e:
assert os.path.exists(str(default_dir / "bin"))
for spec in e._get_environment_specs():
spec_subdir = f"{spec.version}-{spec.compiler.name}"
comb_spec_dir = str(comb_dir / spec.name / spec_subdir)
if not spec.satisfies("callpath%gcc"):
assert os.path.exists(
os.path.join(
combin_viewdir, spec.name, "%s-%s" % (spec.version, spec.compiler.name)
)
)
assert os.path.exists(comb_spec_dir)
else:
assert not os.path.exists(
os.path.join(
combin_viewdir, spec.name, "%s-%s" % (spec.version, spec.compiler.name)
)
)
assert not os.path.exists(comb_spec_dir)
def test_env_activate_sh_prints_shell_output(tmpdir, mock_stage, mock_fetch, install_mockery):
@@ -3714,3 +3744,191 @@ def test_environment_created_from_lockfile_has_view(mock_packages, temporary_sto
# Make sure the view was created
with ev.Environment(env_b) as e:
assert os.path.isdir(e.view_path_default)
def test_env_view_disabled(tmp_path, mutable_mock_env_path):
"""Ensure an inlined view being disabled means not even the default view
is created (since the case doesn't appear to be covered in this module)."""
spack_yaml = tmp_path / ev.manifest_name
spack_yaml.write_text(
"""\
spack:
specs:
- mpileaks
view: false
"""
)
env("create", "disabled", str(spack_yaml))
with ev.read("disabled") as e:
e.concretize()
assert len(e.views) == 0
assert not os.path.exists(e.view_path_default)
@pytest.mark.parametrize("first", ["false", "true", "custom"])
def test_env_include_mixed_views(tmp_path, mutable_mock_env_path, mutable_config, first):
"""Ensure including path and boolean views in different combinations result
in the creation of only the first view if it is not disabled."""
false_yaml = tmp_path / "false-view.yaml"
false_yaml.write_text("view: false\n")
true_yaml = tmp_path / "true-view.yaml"
true_yaml.write_text("view: true\n")
custom_name = "my-test-view"
custom_view = tmp_path / custom_name
custom_yaml = tmp_path / "custom-view.yaml"
custom_yaml.write_text(
f"""
view:
{custom_name}:
root: {custom_view}
"""
)
if first == "false":
order = [false_yaml, true_yaml, custom_yaml]
elif first == "true":
order = [true_yaml, custom_yaml, false_yaml]
else:
order = [custom_yaml, false_yaml, true_yaml]
includes = [f" - {yaml}\n" for yaml in order]
spack_yaml = tmp_path / ev.manifest_name
spack_yaml.write_text(
f"""\
spack:
include:
{''.join(includes)}
specs:
- mpileaks
packages:
mpileaks:
compiler: [gcc]
"""
)
env("create", "test", str(spack_yaml))
with ev.read("test") as e:
concretize()
# Only the first included view should be created, provided views are not disabled by it
assert len(e.views) == 0 if first == "false" else 1
if first == "true":
assert os.path.exists(e.view_path_default)
else:
assert not os.path.exists(e.view_path_default)
if first == "custom":
assert os.path.exists(custom_view)
else:
assert not os.path.exists(custom_view)
def test_stack_view_multiple_views_same_name(
tmp_path, mock_fetch, mock_packages, mock_archive, install_mockery, mutable_config
):
"""Test multiple views with the same name combine settings with precedence
given to the options in spack.yaml."""
# Write the view configuration and/or manifest file
view_filename = tmp_path / "view.yaml"
default_dir = tmp_path / "default-view"
default_view = f"""\
view:
default:
root: {default_dir}
select: ['%gcc']
projections:
all: '{{name}}/{{version}}-{{compiler.name}}'
"""
view_filename.write_text(default_view)
view_dir = tmp_path / "view"
content = f"""\
spack:
include:
- {view_filename}
definitions:
- packages: [mpileaks, cmake]
- compilers: ['%gcc', '%clang']
specs:
- matrix:
- [$packages]
- [$compilers]
view:
default:
root: {view_dir}
exclude: ['cmake']
projections:
all: '{{name}}/{{compiler.name}}-{{version}}'
"""
filename = tmp_path / ev.manifest_name
filename.write_text(content)
env("create", "test", str(filename))
with ev.read("test"):
install()
with ev.read("test") as e:
# the view root in the included view should NOT exist
assert not os.path.exists(str(default_dir))
for spec in e._get_environment_specs():
# no specs will exist in the included view projection
included_spec_subdir = f"{spec.version}-{spec.compiler.name}"
included_spec_dir = str(view_dir / spec.name / included_spec_subdir)
assert not os.path.exists(included_spec_dir)
# only specs compiled with %gcc (selected in the included view) that
# are also not cmake (excluded in the environment view) should exist
env_spec_subdir = f"{spec.compiler.name}-{spec.version}"
env_spec_dir = str(view_dir / spec.name / env_spec_subdir)
if spec.satisfies("cmake") or spec.satisfies("%clang"):
assert not os.path.exists(env_spec_dir)
else:
assert os.path.exists(env_spec_dir)
def test_env_view_resolves_identical_file_conflicts(tmp_path, install_mockery, mock_fetch):
"""When files clash in a view, but refer to the same file on disk (for example, the dependent
symlinks to a file in the dependency at the same relative path), Spack links the first regular
file instead of symlinks. This is important for copy type views where we need the underlying
file to be copied instead of the symlink (when a symlink would be copied, it would become a
self-referencing symlink after relocation). The test uses a symlink type view though, since
that keeps track of the original file path."""
with ev.create("env", with_view=tmp_path / "view") as e:
add("view-resolve-conflict-top")
install()
top = e.matching_spec("view-resolve-conflict-top").prefix
bottom = e.matching_spec("view-file").prefix
# In this example we have `./bin/x` in 3 prefixes, two links, one regular file. We expect the
# regular file to be linked into the view. There are also 2 links at `./bin/y`, but no regular
# file, so we expect standard behavior: first entry is linked into the view.
# view-resolve-conflict-top/bin/
# x -> view-file/bin/x
# y -> view-resolve-conflict-middle/bin/y # expect this y to be linked
# view-resolve-conflict-middle/bin/
# x -> view-file/bin/x
# y -> view-file/bin/x
# view-file/bin/
# x # expect this x to be linked
assert os.readlink(tmp_path / "view" / "bin" / "x") == bottom.bin.x
assert os.readlink(tmp_path / "view" / "bin" / "y") == top.bin.y
def test_env_view_ignores_different_file_conflicts(tmp_path, install_mockery, mock_fetch):
"""Test that file-file conflicts for two unique files in environment views are ignored, and
that the dependent's file is linked into the view, not the dependency's file."""
with ev.create("env", with_view=tmp_path / "view") as e:
add("view-ignore-conflict")
install()
prefix_dependent = e.matching_spec("view-ignore-conflict").prefix
# The dependent's file is linked into the view
assert os.readlink(tmp_path / "view" / "bin" / "x") == prefix_dependent.bin.x


@@ -0,0 +1,119 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import gzip
import os
import sys
import tempfile
from contextlib import contextmanager
from io import BytesIO, TextIOWrapper
import pytest
import spack
from spack.main import SpackCommand
logs = SpackCommand("logs")
install = SpackCommand("install")
@contextmanager
def stdout_as_buffered_text_stream():
"""Attempt to simulate "typical" interface for stdout when user is
running Spack/Python from terminal. "spack log" should not be run
for all possible cases of what stdout might look like, in
particular some programmatic redirections of stdout like StringIO
are not meant to be supported by this command; more-generally,
mechanisms that depend on decoding binary output prior to write
are not supported for "spack log".
"""
original_stdout = sys.stdout
with tempfile.TemporaryFile(mode="w+b") as tf:
sys.stdout = TextIOWrapper(tf)
try:
yield tf
finally:
sys.stdout = original_stdout
def _rewind_collect_and_decode(rw_stream):
rw_stream.seek(0)
return rw_stream.read().decode("utf-8")
@pytest.fixture
def disable_capture(capfd):
with capfd.disabled():
yield
def test_logs_cmd_errors(install_mockery, mock_fetch, mock_archive, mock_packages):
spec = spack.spec.Spec("libelf").concretized()
assert not spec.installed
with pytest.raises(spack.main.SpackCommandError, match="is not installed or staged"):
logs("libelf")
with pytest.raises(spack.main.SpackCommandError, match="Too many specs"):
logs("libelf mpi")
install("libelf")
os.remove(spec.package.install_log_path)
with pytest.raises(spack.main.SpackCommandError, match="No logs are available"):
logs("libelf")
def _write_string_to_path(string, path):
"""Write a string to a file, preserving newline format in the string."""
with open(path, "wb") as f:
f.write(string.encode("utf-8"))
def test_dump_logs(install_mockery, mock_fetch, mock_archive, mock_packages, disable_capture):
"""Test that ``spack log`` can find (and print) the logs for partial
builds and completed installs.
Also make sure that compressed logs are automatically decompressed.
"""
cmdline_spec = spack.spec.Spec("libelf")
concrete_spec = cmdline_spec.concretized()
# Sanity check: the spec should not be installed to start with
assert not concrete_spec.installed
stage_log_content = "test_log stage output\nanother line"
installed_log_content = "test_log install output\nhere to test multiple lines"
with concrete_spec.package.stage:
_write_string_to_path(stage_log_content, concrete_spec.package.log_path)
with stdout_as_buffered_text_stream() as redirected_stdout:
spack.cmd.logs._logs(cmdline_spec, concrete_spec)
assert _rewind_collect_and_decode(redirected_stdout) == stage_log_content
install("libelf")
# Sanity check: make sure a path is recorded, regardless of whether
# it exists (if it does exist, we will overwrite it with content
# in this test)
assert concrete_spec.package.install_log_path
with gzip.open(concrete_spec.package.install_log_path, "wb") as compressed_file:
bstream = BytesIO(installed_log_content.encode("utf-8"))
compressed_file.writelines(bstream)
with stdout_as_buffered_text_stream() as redirected_stdout:
spack.cmd.logs._logs(cmdline_spec, concrete_spec)
assert _rewind_collect_and_decode(redirected_stdout) == installed_log_content
with concrete_spec.package.stage:
_write_string_to_path(stage_log_content, concrete_spec.package.log_path)
# We re-create the stage, but "spack log" should ignore that
# if the package is installed
with stdout_as_buffered_text_stream() as redirected_stdout:
spack.cmd.logs._logs(cmdline_spec, concrete_spec)
assert _rewind_collect_and_decode(redirected_stdout) == installed_log_content


@@ -98,13 +98,9 @@ def test_url_list(mock_packages):
def test_url_summary(mock_packages):
"""Test the URL summary command."""
# test url_summary, the internal function that does the work
(
total_urls,
correct_names,
correct_versions,
name_count_dict,
version_count_dict,
) = url_summary(None)
(total_urls, correct_names, correct_versions, name_count_dict, version_count_dict) = (
url_summary(None)
)
assert 0 < correct_names <= sum(name_count_dict.values()) <= total_urls
assert 0 < correct_versions <= sum(version_count_dict.values()) <= total_urls


@@ -191,7 +191,7 @@ def test_view_files_not_ignored(
pkg.do_install()
pkg.assert_installed(spec.prefix)
install("view-dir-file") # Arbitrary package to add noise
install("view-file") # Arbitrary package to add noise
viewpath = str(tmpdir.mkdir("view_{0}".format(cmd)))
@@ -205,7 +205,7 @@ def test_view_files_not_ignored(
prefix_in_view = viewpath
args = []
view(cmd, *(args + [viewpath, "view-not-ignored", "view-dir-file"]))
view(cmd, *(args + [viewpath, "view-not-ignored", "view-file"]))
pkg.assert_installed(prefix_in_view)
view("remove", viewpath, "view-not-ignored")


@@ -103,9 +103,9 @@ def hello_world_with_module_in_root(extension_creator):
@contextlib.contextmanager
def _hwwmir(extension_name=None):
with extension_creator(
extension_name
) if extension_name else extension_creator() as extension:
with (
extension_creator(extension_name) if extension_name else extension_creator()
) as extension:
# Note that the namespace of the extension is derived from the
# fixture.
extension.add_command(


@@ -11,7 +11,7 @@
import llnl.util.filesystem as fs
import spack.compiler
import spack.compilers as compilers
import spack.compilers
import spack.spec
import spack.util.environment
from spack.compiler import Compiler
@@ -25,12 +25,14 @@ class MockOs:
pass
compiler_name = "gcc"
compiler_cls = compilers.class_for_compiler_name(compiler_name)
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
monkeypatch.setattr(compiler_cls, "cc_version", lambda x: version)
compiler_id = compilers.CompilerID(os=MockOs, compiler_name=compiler_name, version=None)
variation = compilers.NameVariation(prefix="", suffix="")
return compilers.DetectVersionArgs(
compiler_id = spack.compilers.CompilerID(
os=MockOs, compiler_name=compiler_name, version=None
)
variation = spack.compilers.NameVariation(prefix="", suffix="")
return spack.compilers.DetectVersionArgs(
id=compiler_id, variation=variation, language="cc", path=path
)
@@ -56,7 +58,7 @@ def test_multiple_conflicting_compiler_definitions(mutable_config):
mutable_config.update_config("compilers", compiler_config)
arch_spec = spack.spec.ArchSpec(("test", "test", "test"))
cmp = compilers.compiler_for_spec("clang@=0.0.0", arch_spec)
cmp = spack.compilers.compiler_for_spec("clang@=0.0.0", arch_spec)
assert cmp.f77 == "f77"
@@ -64,7 +66,7 @@ def test_get_compiler_duplicates(config):
# In this case there is only one instance of the specified compiler in
# the test configuration (so it is not actually a duplicate), but the
# method behaves the same.
cfg_file_to_duplicates = compilers.get_compiler_duplicates(
cfg_file_to_duplicates = spack.compilers.get_compiler_duplicates(
"gcc@4.5.0", spack.spec.ArchSpec("cray-CNL-xeon")
)
@@ -74,7 +76,7 @@ def test_get_compiler_duplicates(config):
def test_all_compilers(config):
all_compilers = compilers.all_compilers()
all_compilers = spack.compilers.all_compilers()
filtered = [x for x in all_compilers if str(x.spec) == "clang@=3.3"]
filtered = [x for x in filtered if x.operating_system == "SuSE11"]
assert len(filtered) == 1
@@ -88,7 +90,7 @@ def test_version_detection_is_empty(
make_args_for_version, input_version, expected_version, expected_error
):
args = make_args_for_version(version=input_version)
result, error = compilers.detect_version(args)
result, error = spack.compilers.detect_version(args)
if not error:
assert result.id.version == expected_version
@@ -104,7 +106,7 @@ def test_compiler_flags_from_config_are_grouped():
"modules": None,
}
compiler = compilers.compiler_from_dict(compiler_entry)
compiler = spack.compilers.compiler_from_dict(compiler_entry)
assert any(x == "-foo-flag foo-val" for x in compiler.flags["cflags"])
@@ -288,7 +290,7 @@ def flag_value(flag, spec):
else:
compiler_entry = copy(default_compiler_entry)
compiler_entry["spec"] = spec
compiler = compilers.compiler_from_dict(compiler_entry)
compiler = spack.compilers.compiler_from_dict(compiler_entry)
return getattr(compiler, flag)
@@ -657,8 +659,8 @@ def test_xl_r_flags():
[("gcc@4.7.2", False), ("clang@3.3", False), ("clang@8.0.0", True)],
)
def test_detecting_mixed_toolchains(compiler_spec, expected_result, config):
compiler = compilers.compilers_for_spec(compiler_spec).pop()
assert compilers.is_mixed_toolchain(compiler) is expected_result
compiler = spack.compilers.compilers_for_spec(compiler_spec).pop()
assert spack.compilers.is_mixed_toolchain(compiler) is expected_result
@pytest.mark.regression("14798,13733")
@@ -737,6 +739,49 @@ def module(*args):
assert version == test_version
@pytest.mark.regression("42679")
def test_get_compilers(config):
"""Tests that we can select compilers whose versions differ only for a suffix."""
common = {
"flags": {},
"operating_system": "ubuntu23.10",
"target": "x86_64",
"modules": [],
"environment": {},
"extra_rpaths": [],
}
with_suffix = {
"spec": "gcc@13.2.0-suffix",
"paths": {
"cc": "/usr/bin/gcc-13.2.0-suffix",
"cxx": "/usr/bin/g++-13.2.0-suffix",
"f77": "/usr/bin/gfortran-13.2.0-suffix",
"fc": "/usr/bin/gfortran-13.2.0-suffix",
},
**common,
}
without_suffix = {
"spec": "gcc@13.2.0",
"paths": {
"cc": "/usr/bin/gcc-13.2.0",
"cxx": "/usr/bin/g++-13.2.0",
"f77": "/usr/bin/gfortran-13.2.0",
"fc": "/usr/bin/gfortran-13.2.0",
},
**common,
}
compilers = [{"compiler": without_suffix}, {"compiler": with_suffix}]
assert spack.compilers.get_compilers(
compilers, cspec=spack.spec.CompilerSpec("gcc@=13.2.0-suffix")
) == [spack.compilers._compiler_from_config_entry(with_suffix)]
assert spack.compilers.get_compilers(
compilers, cspec=spack.spec.CompilerSpec("gcc@=13.2.0")
) == [spack.compilers._compiler_from_config_entry(without_suffix)]
def test_compiler_get_real_version_fails(working_env, monkeypatch, tmpdir):
# Test variables
test_version = "2.2.2"


@@ -422,7 +422,7 @@ def test_xl_version_detection(version_str, expected_version):
("pgi", "19.1"),
("pgi", "19.1a"),
("intel", "9.0.0"),
("intel", "0.0.0-foobar")
("intel", "0.0.0-foobar"),
# ('oneapi', '2021.1'),
# ('oneapi', '2021.1-foobar')
],


@@ -1522,6 +1522,30 @@ def test_sticky_variant_in_package(self):
s = Spec("sticky-variant %clang").concretized()
assert s.satisfies("%clang") and s.satisfies("~allow-gcc")
@pytest.mark.regression("42172")
@pytest.mark.only_clingo("Original concretizer cannot use sticky variants")
@pytest.mark.parametrize(
"spec,allow_gcc",
[
("sticky-variant@1.0+allow-gcc", True),
("sticky-variant@1.0~allow-gcc", False),
("sticky-variant@1.0", False),
],
)
def test_sticky_variant_in_external(self, spec, allow_gcc):
# setup external for sticky-variant+allow-gcc
config = {"externals": [{"spec": spec, "prefix": "/fake/path"}], "buildable": False}
spack.config.set("packages:sticky-variant", config)
maybe = llnl.util.lang.nullcontext if allow_gcc else pytest.raises
with maybe(spack.error.SpackError):
s = Spec("sticky-variant-dependent%gcc").concretized()
if allow_gcc:
assert s.satisfies("%gcc")
assert s["sticky-variant"].satisfies("+allow-gcc")
assert s["sticky-variant"].external
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_do_not_invent_new_concrete_versions_unless_necessary(self):
# ensure we select a known satisfying version rather than creating


@@ -1133,3 +1133,46 @@ def test_conflict_packages_yaml(packages_yaml, spec_str, concretize_scope, mock_
update_packages_config(packages_yaml)
with pytest.raises(UnsatisfiableSpecError):
Spec(spec_str).concretized()
@pytest.mark.parametrize(
"spec_str,expected,not_expected",
[
(
"forward-multi-value +cuda cuda_arch=10 ^dependency-mv~cuda",
["cuda_arch=10", "^dependency-mv~cuda"],
["cuda_arch=11", "^dependency-mv cuda_arch=10", "^dependency-mv cuda_arch=11"],
),
(
"forward-multi-value +cuda cuda_arch=10 ^dependency-mv+cuda",
["cuda_arch=10", "^dependency-mv cuda_arch=10"],
["cuda_arch=11", "^dependency-mv cuda_arch=11"],
),
(
"forward-multi-value +cuda cuda_arch=11 ^dependency-mv+cuda",
["cuda_arch=11", "^dependency-mv cuda_arch=11"],
["cuda_arch=10", "^dependency-mv cuda_arch=10"],
),
(
"forward-multi-value +cuda cuda_arch=10,11 ^dependency-mv+cuda",
["cuda_arch=10,11", "^dependency-mv cuda_arch=10,11"],
[],
),
],
)
def test_forward_multi_valued_variant_using_requires(
spec_str, expected, not_expected, config, mock_packages
):
"""Tests that a package can forward multivalue variants to dependencies, using
`requires` directives of the form:
for _val in ("shared", "static"):
requires(f"^some-virtual-mv libs={_val}", when=f"libs={_val} ^some-virtual-mv")
"""
s = Spec(spec_str).concretized()
for constraint in expected:
assert s.satisfies(constraint)
for constraint in not_expected:
assert not s.satisfies(constraint)


@@ -1851,7 +1851,7 @@ def binary_with_rpaths(prefix_tmpdir):
paths are encoded with `$ORIGIN` prepended.
"""
def _factory(rpaths, message="Hello world!"):
def _factory(rpaths, message="Hello world!", dynamic_linker="/lib64/ld-linux.so.2"):
source = prefix_tmpdir.join("main.c")
source.write(
"""
@@ -1867,10 +1867,10 @@ def _factory(rpaths, message="Hello world!"):
executable = source.dirpath("main.x")
# Encode relative RPATHs using `$ORIGIN` as the root prefix
rpaths = [x if os.path.isabs(x) else os.path.join("$ORIGIN", x) for x in rpaths]
rpath_str = ":".join(rpaths)
opts = [
"-Wl,--disable-new-dtags",
"-Wl,-rpath={0}".format(rpath_str),
f"-Wl,-rpath={':'.join(rpaths)}",
f"-Wl,--dynamic-linker,{dynamic_linker}",
str(source),
"-o",
str(executable),


@@ -60,13 +60,9 @@ def test_spec_installed_upstream(
upstream_and_downstream_db, mock_custom_repository, config, monkeypatch
):
"""Test whether Spec.installed_upstream() works."""
(
upstream_write_db,
upstream_db,
upstream_layout,
downstream_db,
downstream_layout,
) = upstream_and_downstream_db
(upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout) = (
upstream_and_downstream_db
)
# a known installed spec should say that it's installed
with spack.repo.use_repositories(mock_custom_repository):
@@ -90,13 +86,9 @@ def test_spec_installed_upstream(
@pytest.mark.usefixtures("config")
def test_installed_upstream(upstream_and_downstream_db, tmpdir):
(
upstream_write_db,
upstream_db,
upstream_layout,
downstream_db,
downstream_layout,
) = upstream_and_downstream_db
(upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout) = (
upstream_and_downstream_db
)
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")
@@ -132,13 +124,9 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
@pytest.mark.usefixtures("config")
def test_removed_upstream_dep(upstream_and_downstream_db, tmpdir):
(
upstream_write_db,
upstream_db,
upstream_layout,
downstream_db,
downstream_layout,
) = upstream_and_downstream_db
(upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout) = (
upstream_and_downstream_db
)
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("z")
@@ -168,13 +156,9 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
DB. When a package is recorded as installed in both, the results should
refer to the downstream DB.
"""
(
upstream_write_db,
upstream_db,
upstream_layout,
downstream_db,
downstream_layout,
) = upstream_and_downstream_db
(upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout) = (
upstream_and_downstream_db
)
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")


@@ -363,3 +363,30 @@ def test_gitsubmodules_delete(
assert not os.path.isdir(file_path)
file_path = os.path.join(s.package.stage.source_path, "third_party/submodule1")
assert not os.path.isdir(file_path)
@pytest.mark.disable_clean_stage_check
def test_gitsubmodules_falsey(
mock_git_repository, default_mock_concretization, mutable_mock_repo, monkeypatch
):
"""
Test GitFetchStrategy behavior when the submodules callable returns a falsey value
"""
def submodules_callback(package):
return False
type_of_test = "tag-branch"
t = mock_git_repository.checks[type_of_test]
# Construct the package under test
s = default_mock_concretization("git-test")
args = copy.copy(t.args)
args["submodules"] = submodules_callback
monkeypatch.setitem(s.package.versions, Version("git"), args)
s.package.do_stage()
with working_dir(s.package.stage.source_path):
file_path = os.path.join(s.package.stage.source_path, "third_party/submodule0/r0_file_0")
assert not os.path.isfile(file_path)
file_path = os.path.join(s.package.stage.source_path, "third_party/submodule1/r0_file_1")
assert not os.path.isfile(file_path)


@@ -13,8 +13,7 @@
import pytest
import llnl.util.filesystem as fs
import llnl.util.symlink
from llnl.util.symlink import SymlinkError, _windows_can_symlink, islink, symlink
from llnl.util.symlink import islink, symlink
import spack.paths
@@ -754,93 +753,6 @@ def test_is_nonsymlink_exe_with_shebang(tmpdir):
assert not fs.is_nonsymlink_exe_with_shebang("symlink_to_executable_script")
@pytest.mark.skipif(sys.platform == "win32", reason="Unix-only test.")
def test_lexists_islink_isdir(tmpdir):
root = str(tmpdir)
# Create a directory and a file, and a bunch of symlinks.
dir = os.path.join(root, "dir")
file = os.path.join(root, "file")
nonexistent = os.path.join(root, "does_not_exist")
symlink_to_dir = os.path.join(root, "symlink_to_dir")
symlink_to_file = os.path.join(root, "symlink_to_file")
dangling_symlink = os.path.join(root, "dangling_symlink")
symlink_to_dangling_symlink = os.path.join(root, "symlink_to_dangling_symlink")
symlink_to_symlink_to_dir = os.path.join(root, "symlink_to_symlink_to_dir")
symlink_to_symlink_to_file = os.path.join(root, "symlink_to_symlink_to_file")
os.mkdir(dir)
with open(file, "wb") as f:
f.write(b"file")
symlink("dir", symlink_to_dir)
symlink("file", symlink_to_file)
symlink("does_not_exist", dangling_symlink)
symlink("dangling_symlink", symlink_to_dangling_symlink)
symlink("symlink_to_dir", symlink_to_symlink_to_dir)
symlink("symlink_to_file", symlink_to_symlink_to_file)
assert fs.lexists_islink_isdir(dir) == (True, False, True)
assert fs.lexists_islink_isdir(file) == (True, False, False)
assert fs.lexists_islink_isdir(nonexistent) == (False, False, False)
assert fs.lexists_islink_isdir(symlink_to_dir) == (True, True, True)
assert fs.lexists_islink_isdir(symlink_to_file) == (True, True, False)
assert fs.lexists_islink_isdir(symlink_to_dangling_symlink) == (True, True, False)
assert fs.lexists_islink_isdir(symlink_to_symlink_to_dir) == (True, True, True)
assert fs.lexists_islink_isdir(symlink_to_symlink_to_file) == (True, True, False)
@pytest.mark.skipif(sys.platform != "win32", reason="For Windows Only")
@pytest.mark.parametrize("win_can_symlink", [True, False])
def test_lexists_islink_isdir_windows(tmpdir, monkeypatch, win_can_symlink):
"""Run on windows without elevated privileges to test junctions and hard links which have
different results from the lexists_islink_isdir method.
"""
if win_can_symlink and not _windows_can_symlink():
pytest.skip("Cannot test dev mode behavior without dev mode enabled.")
with tmpdir.as_cwd():
monkeypatch.setattr(llnl.util.symlink, "_windows_can_symlink", lambda: win_can_symlink)
dir = str(tmpdir.join("dir"))
file = str(tmpdir.join("file"))
nonexistent = str(tmpdir.join("does_not_exist"))
symlink_to_dir = str(tmpdir.join("symlink_to_dir"))
symlink_to_file = str(tmpdir.join("symlink_to_file"))
dangling_symlink = str(tmpdir.join("dangling_symlink"))
symlink_to_dangling_symlink = str(tmpdir.join("symlink_to_dangling_symlink"))
symlink_to_symlink_to_dir = str(tmpdir.join("symlink_to_symlink_to_dir"))
symlink_to_symlink_to_file = str(tmpdir.join("symlink_to_symlink_to_file"))
os.mkdir(dir)
assert fs.lexists_islink_isdir(dir) == (True, False, True)
symlink("dir", symlink_to_dir)
assert fs.lexists_islink_isdir(dir) == (True, False, True)
assert fs.lexists_islink_isdir(symlink_to_dir) == (True, True, True)
with open(file, "wb") as f:
f.write(b"file")
assert fs.lexists_islink_isdir(file) == (True, False, False)
symlink("file", symlink_to_file)
if win_can_symlink:
assert fs.lexists_islink_isdir(file) == (True, False, False)
else:
assert fs.lexists_islink_isdir(file) == (True, True, False)
assert fs.lexists_islink_isdir(symlink_to_file) == (True, True, False)
with pytest.raises(SymlinkError):
symlink("does_not_exist", dangling_symlink)
symlink("dangling_symlink", symlink_to_dangling_symlink)
symlink("symlink_to_dir", symlink_to_symlink_to_dir)
symlink("symlink_to_file", symlink_to_symlink_to_file)
assert fs.lexists_islink_isdir(nonexistent) == (False, False, False)
assert fs.lexists_islink_isdir(symlink_to_dangling_symlink) == (False, False, False)
assert fs.lexists_islink_isdir(symlink_to_symlink_to_dir) == (True, True, True)
assert fs.lexists_islink_isdir(symlink_to_symlink_to_file) == (True, True, False)
class RegisterVisitor(fs.BaseDirectoryVisitor):
"""A directory visitor that keeps track of all visited paths"""

Some files were not shown because too many files have changed in this diff.