Compare commits

...

264 Commits

Author SHA1 Message Date
Todd Gamblin
12e3665df3 Bump version on develop to v0.23dev0 (#44137) 2024-05-11 18:01:50 +02:00
Todd Gamblin
fa4778b9fc changelog: add changes from 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:45:14 +02:00
Harmen Stoppels
66d297d420 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
2024-05-11 15:43:32 +02:00
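The three failure shapes listed above suggest a retry wrapper along these lines. This is only an illustrative Python sketch, not Spack's actual `default_retry` implementation, and the attempt count is an assumed parameter:

```python
import urllib.error
import urllib.request

def fetch_with_retries(request, attempts=3):
    # Retry a urllib request, normalizing the three exception shapes noted above.
    last_exc = None
    for _ in range(attempts):
        try:
            return urllib.request.urlopen(request)
        except urllib.error.HTTPError as e:   # 1. plain HTTP status errors
            last_exc = e
        except urllib.error.URLError as e:    # 2. wraps the real error in e.reason
            last_exc = e.reason
        except TimeoutError as e:             # 3. raised unwrapped by getresponse
            last_exc = e
    raise last_exc
```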
James Taliaferro
56251c11f3 qpdf: new package (#44066)
* New package: qpdf

* Update var/spack/repos/builtin/packages/qpdf/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix format of cmake_args

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-10 14:32:57 -06:00
James Smillie
40bf9a179b New package: py-nuitka (#44079) 2024-05-10 12:07:02 -07:00
John W. Parent
095aba0b9f Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure tarfiles generated from Spack prefixes do not contain symlinks pointing outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-05-10 13:00:40 -05:00
John W. Parent
4270136598 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running spack on windows, a path beginning with `C://...` is a valid path.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix typo in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* Review

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-05-10 13:00:13 -05:00
Juan Miguel Carceller
f73d7d2dce dd4hep: cleanup recipe, remove deprecated versions and patches (#44110)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:40:38 -07:00
Juan Miguel Carceller
567566da08 edm4hep: cleanup recipe, remove deprecated versions and patches (#44113)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:39:06 -07:00
Chris Marsh
30a9ab749d libtheora: Remove unneeded autoreconf section (#44063)
* Remove autoreconf section that was causing issues with libtool mismatch. Fixes issue #43498
2024-05-10 10:27:20 -04:00
Mikael Simberg
8160a96b27 dla-future: Add 0.4.1 (#44115)
* dla-future: Add 0.4.1

* Use ninja as generator in dla-future
2024-05-10 15:21:47 +02:00
Mikael Simberg
10414d3e6c pika: Add 0.25.0 (#44114) 2024-05-10 14:37:31 +02:00
Stephen Sachs
1d96c09094 libhugetlbfs: Fix the build with an update to 2.24 (#44059)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-05-10 10:50:36 +02:00
Harmen Stoppels
e7112fbc6a PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:47:37 +02:00
Tom Scogland
b79761b7eb flux-sched: set the version if the ver file is missing (#44068)
* flux-sched: set the version if the ver file is missing

problem: flux-sched needs a version, it normally gets this from a
release tarball or from git tags, but if using a source archive or a git
clone without tags the version is missing

solution: set the version through cmake based on the version spack sees
when the version file is missing

* Update var/spack/repos/builtin/packages/flux-sched/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-09 11:50:42 -07:00
Christopher Christofi
3381899c69 miniforge3: add new versions with mamba installation (#43995)
* miniforge3: add new version with mamba installation
* fix styling
* update maintainers
* Fix variant directive ordering
2024-05-09 12:48:40 -06:00
Massimiliano Culpo
c7cf5eabc1 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 18:50:15 +02:00
Juan Miguel Carceller
d88fa5cf8e pythia8: add a cxxstd variant (#44077)
* pythia8: add a cxxstd variant
* Add multi=False, fix regexp

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-09 09:32:38 -07:00
Richard Berger
2ed0e3d737 kokkos-nvcc-wrapper: add missing versions (#44089) 2024-05-09 09:23:16 -07:00
Julien Cortial
506a40cac1 mmg: Build & install shared libraries, add 5.7.2 & 5.7.3(#43749) 2024-05-09 16:59:53 +02:00
Rémi Lacroix
447739fcef Universal Ctags: Add version 6.1.20240505.0 (#44058) 2024-05-09 15:55:41 +02:00
Massimiliano Culpo
e60f6f4a6e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 14:28:17 +02:00
Vanessasaurus
7df35d0da0 package: libsodium update URL to GitHub (#44090)
Problem: the libsodium download endpoints are not reliable,
they fail multiple times a day for several of our automated
pipelines.
Solution: use the GitHub releases and branches.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2024-05-09 14:26:02 +02:00
Luc Berger
71b035ece1 Kokkos: adding new release 4.3.01 (#44086) 2024-05-09 03:33:35 -06:00
Ashwin Kumar Karnad
86a134235e Octopus 14.1 : Add new version hash (#44083)
* octopus: add hash for new version
* octopus: add --enable-silent-deprecation flag when @14.1:
2024-05-09 03:18:22 -06:00
Jon Rood
24cd0da7fb amr-wind: add missing variants (#44078)
* amr-wind: add missing variants

* Fix copy and paste.

* waves2amr is only available on amr-wind@2.0

* Style.

* Use satisfies.
2024-05-09 02:38:16 -06:00
alvaro-sch
762833a663 Orca (standalone) 5.0.4 (#44011)
* orca: added latest version 5.0.4
* Fixed openmpi versions
2024-05-08 15:36:27 -07:00
Ken Raffenetti
636d479e5f mpich: Add license (#43821) 2024-05-08 12:07:46 -07:00
Sreenivasa Murthy Kolam
f2184f26fa hipsparselt: new package (#44080)
* Initial commit for adding hipsparselt recipe
* correct the style errors
* remove master version
2024-05-08 12:03:21 -07:00
Harmen Stoppels
e1686eef7c gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:04 +02:00
Adam J. Stewart
314893982e JAX: add v0.4.27, NCCL variant (#44071) 2024-05-08 11:36:24 -07:00
Jake Koester
9ab6c30a3d add flag to turn off building tests and examples (#44075) 2024-05-08 11:33:49 -07:00
Adam J. Stewart
ddf94291d4 py-jsonargparse: add v4.28.0 (#44074) 2024-05-08 11:31:07 -07:00
Jon Rood
5d1038c512 hypre: add cublas and rocblas variants (#44038)
* Add cublas and roblas variants to hypre.

* Fix mistakes.

* Remove newline.

* Address suggestions.
2024-05-08 12:04:11 -06:00
renjithravindrankannath
2e40c88d50 Bump up the version for ROCm-6.1.0 (#43843)
* Bump up the version for ROCm-6.1.0
* Style check error correction and patch files
* Update for rocm-openmp-extras 6.1
* updating rocm-openmp-extras and math libraries with 6.1
* Style check error correction and patch files
* updating hipcub, hipfort & miopen-hip for 6.1
* Rocm-openmp-extras and some mathlib updates
* iAudit error correction and rocmlir update
* Updating dependency on suite-sparse and adding path
* Style check error correction
* hip-tensor 6.1.0 update
* rdc 6.1 needs grpc 1.59.1
* rvs 6.1 updates and patch
2024-05-08 10:26:30 -07:00
dependabot[bot]
2bcba57757 build(deps): bump actions/checkout from 4.1.4 to 4.1.5 (#44045)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.4 to 4.1.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](0ad4b8fada...44c2b7a8a4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-08 09:22:48 -07:00
shanedsnyder
37330e5e2b darshan-runtime,darshan-util,py-darshan: add darshan-3.4.5 packages (#43989)
* add darshan-3.4.5 packages, update URLs
* py-setuptools version switches for py-darshan
* more py-darshan test dependencies
* try a conditional importlib_resources dependency
2024-05-08 09:06:24 -07:00
snehring
b4411cf2db iq-tree: add new version delete duplicate package (#44043)
* iq-tree: add new version delete duplicate package
* Replace iqtree2 dependency
   orthofinder: replace iqtree2 with iq-tree@2
   py-phylophlan: replace iqtree2 with iq-tree@2
2024-05-08 09:02:06 -07:00
Harmen Stoppels
65d1ae083c gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:18:12 +02:00
andriish
0b8faa3918 cernlib: add 2023.08.14.0-free (#40211)
* Update CERNLIB

Update CERNLIB

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of andriish

* cernlib: merge crypto->crypt patches

* cernlib: depends_on xbae when `@2023:`

* cernlib: patch for Xbae and Xm link order (DSO)

* [@spackbot] updating style on behalf of andriish

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 06:58:07 -05:00
Alec Scott
f077c7e33b unmaintained pkg bump: 2024-05-05 (#44021)
* unmaintained pkgs: bump versions

* Changes following review feedback

* glm: update cmake dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 05:53:49 -06:00
Massimiliano Culpo
9d7410d22e gcc: add v14.1.0 (#44053) 2024-05-08 12:14:44 +02:00
Tamara Dahlgren
e295730d0e mold: Replace maintainer (#44047)
* Remove maintainer at his request

* Add msimberg as the maintainer
2024-05-08 09:29:43 +02:00
Todd Gamblin
868327ee14 r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 08:56:37 +02:00
Massimiliano Culpo
f5430b16bc Bump removal version in deprecation messages (#44064) 2024-05-08 08:49:14 +02:00
Tamara Dahlgren
2446695113 Remove dead environment creation code (#44065) 2024-05-07 22:49:06 -06:00
snehring
e0c6cca65c orca: switching to xz archives, removing old version (#44035) 2024-05-07 15:18:53 -07:00
Auriane R
84ed4cd331 Add transformer engine package (#43982)
* Add py-flash-attn@2.4.2
* Add py-transfomer-engine package

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-07 14:56:34 -07:00
Juan Miguel Carceller
f6d50f790e gaudi: Add version 38.1 (#44055)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-07 14:20:02 -07:00
Tim Haines
d3c3d23d1e Dyninst: update compiler requirements (#44033)
As of 13.0.0, Dyninst can now build with any Linux compiler.
2024-05-07 15:03:17 -06:00
Vicente Bolea
33752c2b55 fix(adios2): fix missing stdind include in 2.7 (#43786) 2024-05-07 15:52:20 -05:00
Harmen Stoppels
26759249ca gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-07 18:59:12 +02:00
dependabot[bot]
8b4cbbe7b3 build(deps): bump pygments from 2.17.2 to 2.18.0 in /lib/spack/docs (#44044)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.2...2.18.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-07 18:55:04 +02:00
Richarda Butler
be71f9fdc4 Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
   spack:
     specs: []
     concretizer:
         unify: true
     include_concrete:
     - /path/to/environment1
     - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected after the environment
is re-concretized from the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

Co-authored-by: Kayla Butler <butler59@llnl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-07 09:32:40 -07:00
Massimiliano Culpo
05c1e7ecc2 Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-07 17:56:29 +02:00
Massimiliano Culpo
f7afd67a26 Remove spurious ASP debug lines (#44051) 2024-05-07 11:28:38 +02:00
psakievich
d22bdc1c4e certs: fix interpolation and disallow relative paths (#44030) 2024-05-07 11:16:32 +02:00
Mikael Simberg
540f9eefb7 mold: Add 2.30.0 and 2.31.0 (#44034) 2024-05-07 10:41:13 +02:00
Massimiliano Culpo
2db5bca778 Warn users of the future removal of platform=cray (#43980) 2024-05-07 10:30:23 +02:00
Harmen Stoppels
bcd05407b8 llnl.util.tty.color._force_color: init in global scope (#44036)
Currently SPACK_COLOR=always is not respected in the build process on
macOS, because the global `_force_color` is re-evaluated in global scope
during module setup, where it is always `None`.

So, move global init bits from main.py to the module itself.
2024-05-07 09:49:46 +02:00
John W. Parent
b35ec605fe python-venv: use correct python name for which call (#44048) 2024-05-07 09:32:44 +02:00
Massimiliano Culpo
0a353abc42 Fix issues in packages with the new release of archspec (#44037) 2024-05-07 09:20:34 +02:00
Massimiliano Culpo
e178c58847 Respect requests when filtering reused specs (#44042)
Some specs which were excluded from reuse
are currently added back to the solve when
we traverse dependencies of other reusable
specs.

This fixes the issue by keeping track of what
we can explicitly reuse.
2024-05-07 09:06:51 +02:00
Massimiliano Culpo
d7297e67a5 Maintenance for the "bootstrap" workflow in CI (#44031)
Removed a lot of duplication.

Fixed an issue in containers, leading to false negatives
2024-05-07 08:49:53 +02:00
snehring
ee8addf04a neic-finitefault: adding new package and dependencies (#44032)
* neic-finitefault: adding new package
* py-obspy: adding new package
* py-okada-wrapper: adding new package
* py-pygmt: adding new package
* py-pyrocko: adding new package

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-06 15:45:32 -07:00
Jon Rood
fd3cd3a1c6 amr-wind: updates and fixes (#43993)
* Updates and fixes for AMR-Wind.

* Add versions and update openfast constraints.

* Style.

* Fix version.
2024-05-06 13:31:29 -06:00
Garth N. Wells
e585aeb883 (py)-fenics-dolfinx: version update (#44002)
* Update DOLFINx to v0.8
* Fix typo
2024-05-06 12:11:02 -07:00
James Taliaferro
1f43384db4 Add LSP client extension for Kakoune (#44000)
* new package: kakoune-lsp
* blacken
* add myself as a maintainer
2024-05-06 11:18:45 -07:00
Adam J. Stewart
814b328fca py-torchmetrics: add v1.4.0 (#44027) 2024-05-06 10:21:20 -07:00
Harmen Stoppels
125206d44d python: always use a venv (#40773)
This commit adds a layer of indirection to improve build isolation with 
and without external Python, as well as usability of environment views.

It adds `python-venv` as a dependency to all packages that `extends("python")`, 
which has the following advantages:

1. Build isolation: only `PYTHONPATH` is considered in builds, not 
   user / system packages
2. Stable install layout: fixes the problem on Debian, RHEL and Fedora where 
   external / system python produces `bin/local` subdirs in Spack install prefixes. 
3. Environment views are Python virtual environments (and if you add 
   `py-pip` things like `pip list` work)

Views work whether they're symlink, hardlink or copy type.

This commit additionally makes `spec["python"].command` return 
`spec["python-venv"].command`. The rationale is that packages in repos we do 
not own do not pass the underlying python to the build system, which could still 
result in incorrectly computed install layouts.

Other attributes like `libs`, `headers` should be on `python` anyways and need no change.
2024-05-06 16:17:35 +02:00
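A minimal sketch of what this looks like from a dependent package's perspective, assuming the standard Spack package API; the package fragment and pip invocation below are illustrative, not taken from the commit:

```python
# Fragment of a hypothetical package.py build phase; `spec` and `prefix` are
# provided by Spack's build framework, not defined here.
def install(self, spec, prefix):
    python = spec["python"].command  # after this change: the python-venv interpreter
    python("-m", "pip", "install", ".", f"--prefix={prefix}")
```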
John W. Parent
a081b875b4 proj: patch for modern cmake (#43780) 2024-05-06 16:01:10 +02:00
Harmen Stoppels
a16ee3348b Do not cache indices in Gitlab (#44029) 2024-05-06 15:54:21 +02:00
Massimiliano Culpo
d654d6b1f4 Remove Fedora 37 and 38, Ubuntu 18 from CI (#44006) 2024-05-06 15:51:45 +02:00
Harmen Stoppels
9b4ca0be40 clingo bootstrap: remove 3.12 patch and concretizer workarounds (#44028) 2024-05-06 15:00:41 +02:00
Harmen Stoppels
dc71dcfdc2 bootstrap: lazy bootstrapping of clingo and GnuPG (#44026)
Currently bootstrapping from source fails because clingo requires gnupg,
which in turn requires clingo.

This commit stops eager bootstrapping. We don't need `patchelf` nor `gnupg`
generally. They're bootstrapped when needed.
2024-05-06 14:02:39 +02:00
Greg Becker
1f31c3374c External package detection for compilers (#43464)
This creates shared infrastructure for compiler packages to implement the 
detailed search capabilities from the `spack compiler find` command for the 
`spack external find` command.

After this commit, `spack compiler find` can be replaced with 
`spack external find --tag compiler`, with the exception of mixed toolchains.
2024-05-06 10:33:33 +02:00
Massimiliano Culpo
27aeb6e293 Update vendored archspec to v0.2.4 (#44005) 2024-05-06 10:20:56 +02:00
Harmen Stoppels
715214c1a1 spack env create <env>: dir if dir-like (#44024)
A named env cannot contain `.` and `/`.

So when a user runs `spack env create ./here` do not error but treat it
as `spack env create -d ./here`.

Also fix help string of `spack env create`, which seems to have been
copied from `activate` incorrectly.
2024-05-06 02:00:23 -06:00
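A rough sketch of the dispatch rule described above; the function name and the exact set of path-like cases are assumptions, not the actual implementation:

```python
import os

def looks_like_directory_env(arg: str) -> bool:
    # Path-like arguments ("./here", "../env", "envs/foo") can never be valid
    # named environments, so treat them as directory environments (-d).
    return os.sep in arg or arg in (".", "..")
```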
dependabot[bot]
b471d62dbd build(deps): bump black from 24.4.0 to 24.4.2 in /lib/spack/docs (#43878)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-06 09:55:43 +02:00
Massimiliano Culpo
a5f62889ca archspec: add v0.2.4 (#44022) 2024-05-06 09:51:01 +02:00
Alec Scott
2a942d98e3 pkgconf: add v2.2.0 (#44008) 2024-05-06 09:31:41 +02:00
Alec Scott
4a4077d4ef rust: add v1.78.0 (#44012) 2024-05-06 09:30:34 +02:00
Alec Scott
c0fcccc232 go: add v1.22.2 (#44013) 2024-05-06 09:30:13 +02:00
Alec Scott
0b2cbfefce ncurses: add v6.5 (#44015) 2024-05-06 09:30:00 +02:00
Massimiliano Culpo
c499514322 libgpg-error: add v1.49 (#44023) 2024-05-06 09:28:31 +02:00
Alec Scott
ae392b5935 pcre2: add v10.43 (#44020) 2024-05-06 09:25:11 +02:00
Alec Scott
62e9bb5d51 libunistring: add v1.2 (#44018) 2024-05-06 09:24:12 +02:00
Alec Scott
6cd948184e libidn2: add v2.3.7 (#44017) 2024-05-06 09:23:50 +02:00
Alec Scott
44ff24f558 openssl: v3.3.0 (#44019) 2024-05-06 09:22:15 +02:00
Alec Scott
c657dfb768 gettext: v0.22.5 (#44014) 2024-05-05 06:57:42 -06:00
Pranav Sivaraman
f2e410d95a lazygit: add v0.41.0 (#43903) 2024-05-04 17:18:30 -06:00
dependabot[bot]
df443a38d6 build(deps): bump actions/checkout from 4.1.2 to 4.1.4 (#43831)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.2 to 4.1.4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](9bb56186c3...0ad4b8fada)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:49:26 -07:00
dependabot[bot]
74fe498cb8 build(deps): bump mypy from 1.9.0 to 1.10.0 in /lib/spack/docs (#43834)
Bumps [mypy](https://github.com/python/mypy) from 1.9.0 to 1.10.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/1.9.0...v1.10.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:48:49 -07:00
Carlos Bederián
5f13a48bf2 mesa: add 23.3.6 (#43736) 2024-05-04 15:46:39 -07:00
Jon Rood
c4824f7fd2 ccls: add new versions (#43923) 2024-05-04 15:42:07 -07:00
dependabot[bot]
49a8634584 build(deps): bump codecov/codecov-action from 4.3.0 to 4.3.1 (#43945)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](84508663e9...5ecb98a3c6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:39:42 -07:00
Seth Bromberger
eac5ea869f tmux: add v3.4 and missing yacc build dep (#43975) 2024-05-04 22:33:19 +02:00
Juan Miguel Carceller
f5946c4621 pythia8: Add a gzip variant and filter some compiler wrapper paths (#43807)
* Add a gzip variant and filter some compiler wrapper paths

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-04 09:33:44 -05:00
Harmen Stoppels
8564ab19c3 fix iconv from libc (#43996)
* fix iconv from libc

* fix args in glib
2024-05-04 10:30:43 +02:00
Scott Wittenburg
aae7a22d39 gitlab: release branch pipelines rebuild what changed (#43990) 2024-05-04 10:11:48 +02:00
eugeneswalker
09cea230b4 e4s-alc: add v1.0.2 (#44001) 2024-05-03 22:24:08 -06:00
wspear
a1f34ec58b Add a gmake dependency for TAU (#43870)
We discovered a case where the system gmake can get confused by spack's build environment. Let's use a spack-provided gmake for consistent builds.
2024-05-03 20:15:40 -07:00
Auriane R
4d7cd4c0bf Add py-flash-attn@2.4.2 (#43960) 2024-05-03 16:40:46 -07:00
Christopher Christofi
4adbfa3835 fix package permissions docs link for gaussian packages (#43998) 2024-05-03 17:35:39 -06:00
Adam J. Stewart
8a1b69c1d3 Modernize py-torch-geometric and dependencies (#43984)
* Modernize py-torch-geometric and dependencies
* Forgot one maintainer
2024-05-03 16:21:41 -07:00
Matthew Thompson
a1d69f8661 hdf package: fix building with apple-clang@15: (#43964) 2024-05-03 16:11:55 -07:00
Carlos Bederián
e05dbc529e globalarrays package: add missing dependencies for armci=ofi/openib (#43971) 2024-05-03 16:08:29 -07:00
jdomke
99d33bf1f2 hpl: linking against fujitsu ssl2 if available (#43978)
* linking against fujitsu ssl2 if available

---------

Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:08:12 -07:00
jdomke
bd1918cd71 adding clang compiler checks which behaves exactly like aocc (#43976)
Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:01:14 -07:00
snehring
2a967c7df4 graphicsmagick package: add version 1.3.43 (#43977) 2024-05-03 15:59:10 -07:00
downloadico
7596aac958 New package: py-pylith (#43987) 2024-05-03 15:56:21 -07:00
Frédéric Simonis
c73ded8ed6 preCICE: increase baseline versions for next ver (#43985) 2024-05-03 15:55:46 -07:00
Satish Balay
df1d783035 sowing: add @master (#43991) 2024-05-03 15:37:39 -07:00
Massimiliano Culpo
47051c3690 Add a default provider for armci (#43997) 2024-05-03 23:48:58 +02:00
Owen Solberg
3fd83c637d Add checksum for R v4.4.0 (#43973)
Co-authored-by: Owen Solberg <owen.solberg@sana.com>
2024-05-03 14:47:53 -07:00
Jon Rood
ef5afb66da openfast: updates to package (#43994) 2024-05-03 15:22:12 -06:00
snehring
ecc4336bf9 r-asreml: adding new package (#43970)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-03 12:20:58 -07:00
Harmen Stoppels
d2ed217796 concretizer args: --fresh-roots == --reuse-deps (#43988)
Since reuse is the default now, `--reuse-deps` can be confusing, as it
technically does not imply roots are fresh.

So add `--fresh-roots`, which is also easier to discover when running
`spack concretize --fre<tab>`
2024-05-03 12:12:36 -07:00
Dom Heinzeller
272c7c069a Add -fPIC to hdf-eos2 builds (#43794)
* Add -fPIC to hdf-eos2 builds
* Use self.compiler.cc_pic_flag instead of -fPIC
* Fix style in var/spack/repos/builtin/packages/hdf-eos2/package.py
2024-05-03 11:38:33 -07:00
Jeff Hammond
23f16041cd NWChem ARMCI-MPI variant and optional TCE builds (#43883)
* WIP NWChem with ARMCI-MPI and TCE optional
* rename armci-mpi to armcimpi
* add version for git
* add ARMCI-MPI support to NWChem package
   EXTERNAL_ARMCI_PATH needs to be set to the ARMCI-MPI package install prefix.
   I could not get it from spec["armcimpi"] but it worked to use the dependent build environment method to export a generic environment variable to use instead.
* I suppose I can maintain NWChem as well
* check rejects option that dozens of packages use
   I do not have time to fight with this nonsense
   Run . share/spack/setup-env.sh
   ==> Error: armcimpi version 'master' has extra arguments: 'branch'
   Valid arguments for a url fetcher are:
       'url', 'sha256', 'md5', 'sha1', 'sha224', 'sha384', 'sha512', and 'checksum'
   Error: Process completed with exit code 1.
* style
* ARMCI-MPI needs depends_on; the rest seems fixed
* fix ARMCI selection per review feedback from @zzzoom
   https://github.com/spack/spack/pull/43883#discussion_r1585014147
* address reviewer feedback from @yizeyi18
   https://github.com/spack/spack/pull/43883#discussion_r1587228084
   elaborate on what +extratce does, in terms of the NWChem build environment variables,
   some of which are documented on https://nwchemgit.github.io/TCE.html.
* style
* Update var/spack/repos/builtin/packages/nwchem/package.py

---------

Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Carlos Bederián <4043375+zzzoom@users.noreply.github.com>
2024-05-03 11:13:54 -07:00
Frank Willmore
e2329adac0 Update package.py for vacuumms 1.2.0 (#43948)
* updating vacuumms for new release
* formatting
* re-order versions and remove triple quote
2024-05-03 11:06:15 -07:00
jalcaraz
4ec788ca12 [TAU package] Update with +rocprofv2. Updated some tests. (#43937)
* [TAU package] Update with +rocprofv2. Updated some tests.

Updated with +rocprofv2 flag. Only works with rocm-core >= 6.0.0

Can be tested with (Omnia):
spack install tau@master +rocm+rocprofv2 %gcc@11
Needs the last commit in our local repository; this can be done by modifying the "git = " line, or by waiting until the public one is updated.


In the case that tests cause issues when building TAU, there is the flag:
disable_tests = False

The rocm test is disabled by default, as the PR regarding tests loading dependencies is not solved (PR#43682).

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-05-03 08:33:13 -07:00
Andrew W Elble
c1cea9d304 py-tensorflow: fix aarch64 build (#43852)
* py-tensorflow: fix aarch64 build

* [@spackbot] updating style on behalf of aweits

* patch format

* change patch strategy, actually get the logic correct in
the patch

* only patch 2.16.1 onwards for aarch64

* __CUDACC__ not __NVCC__

* !(defined(__NVCC__) && defined(__CUDACC__))

---------

Co-authored-by: aweits <aweits@users.noreply.github.com>
2024-05-03 15:46:13 +02:00
Massimiliano Culpo
5c96e67bb1 Make runtimes depend on libc only on linux (#43981) 2024-05-03 15:40:02 +02:00
Marc T. Henry de Frahan
7008bb6335 Add rosco package (#43822)
* Add rosco package

* format

* remove unused

* Add in other versions

* start adding some patches

* finish adding the patches
2024-05-03 01:58:35 -06:00
Adam J. Stewart
14561fafff py-torch: set TORCH_CUDA_ARCH_LIST globally for dependents (#43962) 2024-05-03 09:11:59 +02:00
Massimiliano Culpo
89bf1edb6e Spec.satisfies: fix a bug with concrete spec from JSON (#43968)
Fix a bug triggered by missing a virtual on some transitive edge, in a subdag of a pure build dependency.
2024-05-02 22:16:02 -04:00
Todd Gamblin
cc85dc05b7 docs: re-enable google analytics (#43974)
We recently switched to using the new ReadTheDocs with "addons". That includes its own
analytics, which is nice, but we also want to continue using our GA4 analytics.

Adding GA4 is no longer supported by RTD, so we have to add it manually.

- [x] re-add the gtag to all pages, manually

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-02 21:56:19 -04:00
Adam J. Stewart
ae171f8b83 py-torch: conflict with newer cuda (#43969) 2024-05-02 18:47:33 -07:00
snehring
578dd18b34 minimap2: adding version 2.28 (#43972)
* minimap2: adding version 2.28
* minimap2: adding maintainer

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-02 18:45:59 -07:00
Garth N. Wells
a7a51ee5cf py-fenics-ffcx: update to v0.8 (#43929)
* Update FFCx and UFCx to v0.8
* Syntax fix
2024-05-02 09:53:31 -07:00
Stephanie Labasan Brink
960cc90667 variorum: add branch dev and version 0.8.0 (#43829)
* variorum: add branch dev and version 0.8.0
* formatting
2024-05-02 09:50:05 -07:00
Alex Richert
dea44bad8b grads: fix libpng and g2c deps (#43909)
* grads: fix libpng and g2c deps
* Update package.py
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:42:15 -07:00
Richard Berger
e37870ff43 rdma-core: add versions 51.0, 50.0 and 49.1 (#43926) 2024-05-02 09:41:23 -07:00
Sajid Ali
3751642a27 mold: add new versions (#43931) 2024-05-02 09:39:09 -07:00
Nils Vu
0f386697c6 spectre: add versions, update constraints (#43936) 2024-05-02 09:38:06 -07:00
Alex Richert
67ce103b2c Update prod-util recipe (#43940)
* update prod-util recipe
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:24:39 -07:00
Alex Richert
a8c9fa0e45 g2tmpl: add v1.12.0 (#43941) 2024-05-02 09:23:42 -07:00
Alex Richert
b56a133fce update grib-util recipe (#43939) 2024-05-02 09:22:03 -07:00
Seth R. Johnson
f0b3d33145 Celeritas: new version 0.4.3 (#43949) 2024-05-02 09:04:30 -07:00
Adam J. Stewart
32564da9d0 rust: add conflict for older GCC (#43958) 2024-05-02 08:36:06 -07:00
Stephen Hudson
8f2faf65dc libEnsemble: add v1.3.0 (#43944) 2024-05-02 08:35:06 -07:00
eugeneswalker
1d59637051 e4s ci: add py-amrex (#43904) 2024-05-02 08:57:26 -06:00
jdomke
97dc353cb0 libc: detect ARM flavor (#43959)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-05-02 15:59:29 +02:00
wspear
beebe2c9d3 Accept later pythons for tau@2.33+ (#43911)
* Accept later pythons for tau@2.33+

* separate forward compat bounds
2024-05-02 07:13:21 -06:00
Harmen Stoppels
2eb7add8c4 gcc: write builtin specs before modifications (#43956) 2024-05-02 06:40:10 -06:00
Harmen Stoppels
a9fea9f611 nghttp2: add diffutils (#43954) 2024-05-02 14:22:53 +02:00
Harmen Stoppels
9b62a9c238 gitlab ci: cache user cache (#43952) 2024-05-02 12:06:30 +02:00
wspear
f7eb0ccfc9 Revert #43701 (#43935)
PR #43701 is broken, preventing scorep from being installed. As explained in #43700 the issue is internal to scorep and cannot be fixed in the package, at least by the method being attempted here.
2024-05-02 09:56:21 +02:00
Adrien Bernede
a0aa35667c Ignore external packages when pushing to buildcache automatically (--autopush) (#43930) 2024-05-02 09:50:10 +02:00
Luc Berger
b1d4fd14bc Nalu: adding support for Trilinos 14.2.0 for Nalu 1.6.0 (#43857) 2024-05-02 01:10:27 -06:00
John W. Parent
7e8415a3a6 Windows: auto-add WGL/SDK as externals (#43752)
Adds a pre-concretization check for the Windows SDK and WGL (Windows
GL) packages as non-buildable externals.

This is a redo of https://github.com/spack/spack/pull/43459, but makes
sure to modify the configuration scope outside of the bootstrap scope:
whichever is highest-precedence in the user's environment at the time
the concretization runs, which should either be an env scope or the
~ scope.

Adds pytest fixture mocking the check for WGL and WSDK as if they were
present.
2024-05-02 01:05:03 -06:00
Simon Pintarelli
7f4f42894d update sirius package.py (#43872) 2024-05-02 08:23:10 +02:00
Massimiliano Culpo
4e876b4014 Allow more control over which specs are reused (#42782)
This PR gives users finer control over which specs are reused during concretization.

The value of the `concretizer:reuse` config option now can take an object with the following properties:
- `roots`: true if reusing roots, false if reusing just dependencies
- `exclude`: list of constraints used to select reusable specs 
- `include`: list of constraints used to select reusable specs 
- `from`: allows to select the sources of reused specs

### Examples

#### Reuse only specs compiled with GCC
```yaml
concretizer:
  reuse:
    roots: true
    include:
    - "%gcc"
```

#### `openmpi` must be used from externals, and it must be the only external used
```yaml
concretizer:
  reuse:
    roots: true
    from:
    - type: local
      exclude:
      - "openmpi"
    - type: buildcache
      exclude:
      - "openmpi"
    - type: external
      include:
      - "openmpi"
```
2024-05-01 23:05:26 -04:00
Wouter Deconinck
77a8a4fe08 abseil-cpp: new version 20240116.2 (#43666) 2024-05-01 17:06:57 -07:00
Thomas Bouvier
597e5a4e5e Update py-nvidia-dali to v1.36.0 (#43928)
* py-nvidia-dali: add v1.36.0
* fix: do not expand downloaded wheel
2024-05-01 16:35:43 -07:00
MatthewLieber
3c31c32f62 Adding sha for 7.4 release of OSU Micro Benchmarks (#43899)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-05-01 16:27:52 -07:00
Weiqun Zhang
3a93a716e4 amrex: add v24.05 (#43938) 2024-05-01 16:17:05 -07:00
Filippo Spiga
82229a0784 Adding EXTRAE 4.1.2 (#43907) 2024-05-01 15:55:54 -07:00
Axel Huebl
5d846a69d1 ADIOS2: Campaign Variant (#43906)
With v2.10+, ADIOS added a campaign manager. This is auto-enabled
if SQLite3 is found.

Add explicit control for it now and disable it by default, to avoid
picking up system dependencies or bloating the ADIOS2 dependencies
by default. Also, it is not yet mature enough to be used by default:
https://github.com/ornladios/ADIOS2/issues/4148
2024-05-01 15:50:45 -07:00
Stephen Herbener
d21aa1cc12 Removed use of mpi wrappers for fms and mapl package.py scripts. These were causing (#43726)
builds to fail on macOS, and CMake appears to handle the build fine without the MPI wrappers.
2024-05-01 15:41:30 -07:00
Jake Koester
7896ff51f6 bumped up version of kokkos-kernels (#43920) 2024-05-01 12:20:51 -06:00
Chris Mc
5849a24a74 jwt-cpp: add v0.7.0 (#41894)
* jwt-cpp: add version 0.7.0

* Update var/spack/repos/builtin/packages/jwt-cpp/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-01 12:20:35 -06:00
Alex Leute
38c49d6b82 py-simpy: New package (#43497)
* pypi build of py-simpy

* [py-simpy] toml explicitly called out

* py-simpy: New version

* py-setuptools-scm: Added new version

* py-setuptools-scm: add url_for_version

Because versions @:7 have an underscore (setuptools_scm) in the URL, and
newer versions have a dash (setuptools-scm).

* py-setuptools-scm: fix flake8 line too long

* py-simpy: Fix hash

---------

Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
2024-05-01 12:06:52 -06:00
fpruvost
0d8900986d qrmumps: 3.1 (#43913) 2024-05-01 12:00:25 -06:00
dependabot[bot]
62554cebc4 build(deps): bump black in /.github/workflows/style (#43879)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:37:54 +02:00
Wouter Deconinck
067155cff5 containers: add ubuntu 24.04 (#43881)
* containers: add ubuntu 24.04

* containers: use python3-boto3 pkg instead of pip install
2024-05-01 13:37:13 +02:00
Wouter Deconinck
08e68d779f ci: update upload-artifact to v4 (in build-containers) (#43880) 2024-05-01 13:35:12 +02:00
Adam J. Stewart
05b04cd4c3 Update PyTorch ecosystem (#43823)
* Update PyTorch ecosystem

* Don't expose protobuf namespace to other packages, breaks torchtext build

* Revert "Don't expose protobuf namespace to other packages, breaks torchtext build"

This reverts commit 50a4b08f65.
2024-05-01 13:30:46 +02:00
dependabot[bot]
be48f762a9 build(deps): bump pytest from 8.1.1 to 8.2.0 in /lib/spack/docs (#43908)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.1.1 to 8.2.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.1.1...8.2.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:29:22 +02:00
Axel Huebl
de5b4840e9 Add quantiphy (#43894)
* Add quantiphy

Add https://github.com/KenKundert/quantiphy .

* Simplify Version Range
2024-04-30 15:26:35 -07:00
Harmen Stoppels
20f9884445 curl: perl is temporarily required (#43921) 2024-04-30 23:32:06 +02:00
Harmen Stoppels
deb78bcd93 re2c: depends on gmake (#43916)
regressed in 650a668a9d
2024-04-30 23:30:36 +02:00
Garth N. Wells
06239de0e9 py-fenics-ufl: add new version (#43848)
* Bump version to 2024.1.0
* Bump to .post0
* Dependency fix
* Format file
* Run black on package file
2024-04-30 14:10:55 -07:00
Garth N. Wells
1f904c38b3 Add version 1.9.2. (#43838) 2024-04-30 12:19:31 -07:00
Wouter Deconinck
f2d0ba8fcc PackageStillNeededError: add pkg that needs spec to exception msg (#43845)
* PackageStillNeededError: add pkg that needs spec to exception msg
* PackageStillNeededError: f-string with short fmt and hash
* PackageStillNeededError: split long string
2024-04-30 12:11:47 -07:00
Satish Balay
49d3eb1723 petsc, py-petsc4py: add v3.21.1 (#43887) 2024-04-30 10:07:33 -07:00
Dom Heinzeller
7c5439f48a Update crtm-fix and crtm from JCSDA fork (#43855) 2024-04-30 10:07:16 -07:00
Harmen Stoppels
7f2cedd31f hack: drop glibc and musl in old concretizer (#43914)
The old concretizer creates a cyclic graph when expanding virtuals for
`iconv`, which is a bug. This hack drops glibc and musl as possible
providers for `iconv` in the old concretizer to work around it.
2024-04-30 13:55:48 +02:00
Harmen Stoppels
d47951a1e3 glibc: provides iconv (#43897)
`iconv` is a bit of weird virtual because the only shared API between
`glibc` and `libiconv` is:

```
iconv
iconv_open
iconv_close
```

whereas `libiconv` has further symbols [iconvctl](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconvctl.3.html), [iconv_open_into](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconv_open_into.3.html), and an `iconv` executable and `libcharset.so`. Packages that need those will have to do `depends_on("[virtuals=iconv] libiconv")`.
2024-04-30 07:40:00 +02:00
Teague Sterling
f2bd0c5cf1 Fix duckdb version handling again (#43762)
* Added package to build Ollama
* Update package.py
   Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
* We can now use OVERRIDE_GIT_DESCRIBE to set the version in DuckDB
* Update duckdb/package.py
   - use f-string
   - fix version specs to address inconsistencies
* Update package.py
   Fix spec definitions further
* [@spackbot] updating style on behalf of teaguesterling

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-29 18:51:04 -07:00
one
4362382223 Update package.py (#43764) 2024-04-29 18:49:49 -07:00
Nathalie Furmento
ba4859b33d starpu: add release 1.4.5 and dependency for building starpupy (#43810) 2024-04-29 18:48:37 -07:00
Carlos Bederián
e8472714ef ucc: add 1.3.0 (#43828) 2024-04-29 18:45:00 -07:00
Mikael Simberg
ee6960e53e boost: Add 1.85.0 (#43788)
* boost: Add 1.85.0
* Add conflict for Boost 1.85.0 stacktrace change
2024-04-29 18:41:38 -07:00
Nathalie Furmento
dad266c955 starpu: add branch 1.4 (#43837) 2024-04-29 18:38:43 -07:00
Garth N. Wells
7a234ce00a (py-)fenics-basix: update for new version (#43841)
* Bump version to 2024.1.0
* Bump py-basix version
* Revert UFL change
* Python version update
2024-04-29 18:11:36 -07:00
Loris Ercole
a0c2ed97c8 Update scine-qcmaquis (#43817)
`scine-qcmaquis` is updated to version 3.1.4.
The option to build the OpenMolcas interface is added, and some
dependencies are clarified.
2024-04-29 16:38:29 -07:00
Rémi Lacroix
a3aa5b59cd dotnet-core-sdk: Fix environment setup (#43842)
The "DOTNET_CLI_TELEMETRY_OPTOUT" environment variable should be defined when using the product, not when installing it (the installation phase is just extract the files anyway).

Also use `~` instead of `-` to check for the variant and fix the second argument for `env.set`  which should also be a string.
2024-04-29 14:47:37 -07:00
Zach Jibben
f7dbb59d13 Update Petaca (#43849)
* Add recent petaca releases
* Add Petaca 24.04
2024-04-29 14:42:11 -07:00
Wouter Deconinck
0df27bc0f7 nopayloadclient: new package (#43853) 2024-04-29 14:16:57 -07:00
Massimiliano Culpo
877e09dcc1 Delete leftover file (#43869)
This file is not needed anymore, was introduced in #19688
2024-04-29 22:45:41 +02:00
Adam J. Stewart
c4439e86a2 Prettier: add new package (#43866) 2024-04-29 13:39:52 -07:00
Pramod Kumbhar
aa00dcac96 gprofng-gui: new package (#43873) 2024-04-29 13:38:48 -07:00
Adam J. Stewart
4c9a946b3b py-keras: add v3.3.3 (#43885) 2024-04-29 13:30:52 -07:00
Wouter Deconinck
0c6e6ad226 xbae: new package (#43889) 2024-04-29 13:05:56 -07:00
Adam J. Stewart
bf8f32443f py-einops: add v0.8.0 (#43890) 2024-04-29 12:54:33 -07:00
Wouter Deconinck
c2eef8bab2 fontconfig: depends_on gperf when 2.11.1: (#43892) 2024-04-29 11:56:59 -07:00
afzpatel
2df4b307d7 hipblaslt: new package (#43846)
* initial commit to add hipblaslt package
* remove master and update patch
* add docstring comment
2024-04-29 11:48:03 -07:00
Ryan Marcellino
3c57440c10 openjdk: add v21_35 (#40699)
* openjdk: add v21+35

* add provides java@21

* Update var/spack/repos/builtin/packages/openjdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-04-29 10:51:17 -05:00
Harmen Stoppels
3e6e9829da compiler.py: fix early return (#43898) 2024-04-29 08:53:27 -06:00
Gregory Becker
859745f1a9 Run audits on windows
Add debug log for external detection tests. The debug log
is used to print which test is being executed.

Skip version audit on Windows where appropriate
2024-04-29 14:13:10 +02:00
Massimiliano Culpo
ddabb8b12c Fix concretization when installing missing compilers (#43876)
Restore the previous behavior when config:install_missing_compilers
is True. The libc of the missing compiler is inferred from the
Python process.
2024-04-29 08:20:33 +02:00
Jeff Hammond
16bba32124 add ILP64 option for BLIS (#43882)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
2024-04-29 08:14:25 +02:00
Michael Kuhn
7d87369ead autoconf: fix typo in m4 dependencies (#43893)
m4 1.4.8 is actually required starting with autoconf 2.72 according to
the NEWS file.
2024-04-28 18:34:12 -05:00
Adam J. Stewart
7723bd28ed Revert "package/npm update (#43692)" (#43884)
This reverts commit 03a074ebe7.
2024-04-27 08:58:12 -06:00
Harmen Stoppels
43f3a35150 gcc: generate spec file and fix external libc default paths after install from cache (#43839)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-27 08:49:20 -06:00
Jonathon Anderson
ae9f2d4d40 containers: Add Fedora 40, 39 (#43847) 2024-04-26 20:02:04 -05:00
Huston Rogers
5a3814ff15 Miniconda3: Added Versions: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0 (#43868)
* Added Versions of miniconda3: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0

* fixed style

---------

Co-authored-by: James H. Rogers <jhrogers@spear.hpc.msstate.edu>
2024-04-26 18:58:58 -06:00
Sakib Rahman
946c539dbd osg-ca-certs: new version osg 1.119 and igtf 1.128 (#43871)
* Update package.py to osg 1.119 and igtf 1.128

* Remove unnecessary white space

* Add missing comma

* change url to use git and use commit hash from repository instead of sha256sum of tar.gz

* [@spackbot] updating style on behalf of rahmans1

---------

Co-authored-by: rahmans1 <rahmans1@users.noreply.github.com>
2024-04-26 18:39:12 -06:00
Robert Cohn
0037462f9e [SOS] update license (#43864)
Auto-generated license was wrong
2024-04-26 18:45:29 +02:00
Massimiliano Culpo
e5edac4d0c ASP-based solver: update os compatibility for macOS (#43862) 2024-04-26 18:34:46 +02:00
Jeff Hammond
3e1474dbbb ARMCI-MPI: add new package (#43813)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-26 16:35:33 +02:00
Abhishek Yenpure
0f502bb6c3 libcatalyst: add v2.0.0 (#43804) 2024-04-26 16:27:46 +02:00
Robert Cohn
1eecbd3208 [intel-oneapi-*] no redistribution (#43826) 2024-04-26 12:03:39 +00:00
Jack Morrison
6e92b9180c Add tests-sos package + Variants to sos package (#43830)
* Add initial tests-sos package
* Remove failing call of missing setup_compiler_environment from sos
  package
* Add several variants for sos package

* [@spackbot] updating style on behalf of jack-morrison
2024-04-26 07:26:21 -04:00
Massimiliano Culpo
ac9012da0c spack audit externals: allow selecting platforms and checking extra attributes (#43782) 2024-04-26 12:47:17 +02:00
Mosè Giordano
e3cb4f09f0 julia: add v1.10.2 (#41151)
* julia: add v1.10.2

* julia: add patch to remove suite-sparse cuda stub files

* julia: use permalinks for patches
2024-04-26 12:44:54 +02:00
Harmen Stoppels
2e8600bb71 gcc: simplify specs file, make binutils locatable (#43861) 2024-04-26 01:30:51 -06:00
Harmen Stoppels
d946c37cbb ldflags=* are compiler flags, not linker flags (#43820)
We run `extend spack_flags_list SPACK_LDFLAGS` for `$mode in ld|ccld`.

That's problematic, cause `ccld` needs `-Wl,--flag` whereas `ld` needs
`--flag` directly. Only `-L` and `-l` are common to compiler & linker.

In all build systems `LDFLAGS` is for the compiler not the linker, cause
any linker flag `-x` can be passed as a compiler flag `-Wl,-x`, and there
are many compiler flags that affect the linker invocation, like `-fopenmp`,
`-fuse-ld=`, `-fsanitize=` etc.

So don't pass `LDFLAGS` to the linker directly.

This way users can set `ldflags: -Wl,--allow-shlib-undefined` in compilers.yaml
to work around an issue where the linker tries to resolve the `libcuda.so.1`
stub lib which cannot be located by design in `cuda`.
2024-04-26 09:19:03 +02:00
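The rule can be summarized in a small sketch; this is illustrative Python, not the wrapper's real (shell) code, and the mode names are the ones quoted above:

```python
def ldflags_for_mode(mode: str, ldflags: list) -> list:
    # LDFLAGS are compiler-driver flags: "-Wl,--allow-shlib-undefined",
    # "-fuse-ld=lld", "-fopenmp" all only make sense to the compiler driver.
    # A direct `ld` invocation therefore gets none of them.
    return list(ldflags) if mode == "ccld" else []
```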
Harmen Stoppels
47a9f0bdf7 llvm: use --gcc_install_dir in config files (#43795) 2024-04-26 09:01:02 +02:00
Alex Richert
2bf900a893 Update package.py (#43836) 2024-04-25 16:00:43 -07:00
Andrey Perestoronin
99bba0b1ce Add intel-oneapi-mpi 2021.12.1 patch package (#43850)
* Add intel-oneapi-mpi package

* Fix style
2024-04-25 13:38:47 -06:00
Juan Miguel Carceller
a8506f9022 glew package: add ld flags when compiling with ^apple-gl (#43429) 2024-04-25 11:18:00 -07:00
Harmen Stoppels
4a40a76291 build_environment.py: expand SPACK_MANAGED_DIRS with realpath (#43844) 2024-04-25 13:33:50 +02:00
downloadico
fe9ddf22fc spatialdata: add spatialdata package to spack (#43500) 2024-04-25 04:18:08 -07:00
Adam J. Stewart
1cae1299eb CI: remove ML ROCm stack (#43825) 2024-04-25 12:56:59 +02:00
Adam J. Stewart
8b106416c0 py-lightning: add v2.2.3 (#43824) 2024-04-25 12:03:01 +02:00
jalcaraz
e2088b599e [TAU Package] Updates for rocm (#43790)
* Updates for rocm

Updated for rocm@6
Added conflict between rocprofiler and roctracer.
Request either +rocprofiler or +roctracer when +rocm. In this case, it automatically builds for one, instead of displaying the message. 
Request +rocm when using either +rocprofiler or +roctracer. In this case, it automatically builds with +rocm, instead of displaying the message.

Disabled the tests. Will update them with the new test method.

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-04-24 16:35:41 -07:00
Ken Raffenetti
56446685ca mpich: add 4.2.0 release (#42687) 2024-04-24 12:33:59 -07:00
Teo
47a8d875c8 update halide versions; sync llvm (#43793) 2024-04-24 12:27:42 -07:00
Adam J. Stewart
56b2d250c1 py-keras: add v3.3.1-2 (#43798) 2024-04-24 12:23:34 -07:00
David Boehme
abbd09b4b2 Add Caliper v2.11 (#43802) 2024-04-24 12:21:56 -07:00
Rémi Lacroix
9e5fdc6614 dotnet-core-sdk: Add versions 7.0.18 and 8.0.4 (#43814) 2024-04-24 12:19:33 -07:00
Harmen Stoppels
1224a3e8cf clang.py: detect flang-new (#43815)
If a flang-new exists, which is rather unlikely, it probably means the
user wants it as a fortran compiler.
2024-04-24 19:11:02 +02:00
Jake Koester
6c3218920f Trilinos: update kokkos dependency (#43785)
* fix so trilinos@master uses correct kokkos (@4.3.00)

* Update var/spack/repos/builtin/packages/trilinos/package.py
2024-04-24 10:42:08 -06:00
Peter Scheibel
02cc3ea005 Add new redistribute() directive (#20185)
Some packages can't be redistributed in source or binary form. We need an explicit way to say that in a package.

This adds a `redistribute()` directive so that package authors can write, e.g.:

```python
    redistribute(source=False, binary=False)
```

You can also do this conditionally with `when=`, as with other directives, e.g.:

```python
    # 12.0 and higher are proprietary
    redistribute(source=False, binary=False, when="@12.0:")

    # can't redistribute when we depend on some proprietary dependency
    redistribute(source=False, binary=False, when="^proprietary-dependency")
```


This prevents Spack from adding either their sources or binaries to public mirrors and build caches. You can still unconditionally add things *if* you run either:
* `spack mirror create --private`
* `spack buildcache push --private`

But the default behavior for build caches is not to include non-redistributable packages in either mirrors or build caches.  We have previously done this manually for our public buildcache, but with this we can start maintaining redistributability directly in packages.

Caveats: currently the default for `redistribute()` is `True` for both `source` and `binary`, and you can only set either of them to `False` via this directive.

- [x] add `redistribute()` directive
- [x] add `redistribute_source` and `redistribute_binary` class methods to `PackageBase`
- [x] add `--private` option to `spack mirror`
- [x] add `--private` option to `spack buildcache push`
- [x] test exclusion of packages from source mirror (both as a root and as a dependency)
- [x] test exclusion of packages from binary mirror (both as a root and as a dependency)
2024-04-24 09:41:03 -07:00
John W. Parent
641ab95a31 Revert "Windows: add win-sdk/wgl externals during bootstrapping (#43459)" (#43819)
This reverts commit 9e2558bd56.
2024-04-24 18:24:28 +02:00
Adam J. Stewart
e8b76c27e4 py-matplotlib: drop test dep (#43765)
test dependencies constrain build / link type deps, so avoid that
2024-04-24 18:03:13 +02:00
Massimiliano Culpo
0dbe4d54b6 Tune default compiler preferences (#43805)
Add `%oneapi`, remove compilers that have been discontinued upstream
2024-04-24 18:00:40 +02:00
Harmen Stoppels
1eb6977049 rust: use system libs (#43797) 2024-04-24 09:46:11 -06:00
Harmen Stoppels
3f1cfdb7d7 libc: from current python process (#43787)
If there's no compiler, we currently don't have any external libc for the solver.

This commit adds a fallback on libc from the current Python process, which works if it is dynamically linked.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-24 05:10:48 -06:00
Massimiliano Culpo
d438d7993d nf-*-cli: fix typo in conditional (#43806)
The default runner changed on GitHub for macOS, and that
revealed a bug in a package when running audits
2024-04-24 11:30:35 +02:00
Todd Gamblin
aa0825d642 Refactor to improve spec format speed (#43712)
When looking at where we spend our time in solver setup, I noticed a fair bit of time is spent
in `Spec.format()`, and `Spec.format()` is a pretty old, slow, convoluted method.

This PR does a number of things:
- [x] Consolidate most of what was being done manually with a character loop and several
      regexes into a single regex.
- [x] Precompile regexes where we keep them 
- [x] Remove the `transform=` argument to `Spec.format()` which was only used in one 
      place in the code (modules) to uppercase env var names, but added a lot of complexity
- [x] Avoid escaping and colorizing specs unless necessary
- [x] Refactor a lot of the colorization logic to avoid unnecessary object construction
- [x] Add type hints and remove some spots in the code where we were using nonexistent
      arguments to `format()`.
- [x] Add trivial cases to `__str__` in `VariantMap` and `VersionList` to avoid sorting
- [x] Avoid calling `isinstance()` in the main loop of `Spec.format()`
- [x] Don't bother constructing a `string` representation for the result of `_prev_version`
      as it is only used for comparisons.

In my timings (on all the specs formatted in a solve of `hdf5`), this is over 2.67x faster than the 
original `format()`, and it seems to reduce setup time by around a second (for `hdf5`).
2024-04-23 10:52:15 -07:00
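The single-regex idea can be illustrated with a toy sketch; the placeholder grammar and names below are simplified and are not Spack's actual `Spec.format` code:

```python
import re

# One precompiled regex finds every "{attribute}" placeholder; a substitution
# callback expands each one, replacing a character-by-character loop.
_FORMAT_RE = re.compile(r"\{([^}]*)\}")

def format_spec(attrs: dict, fmt: str) -> str:
    return _FORMAT_RE.sub(lambda m: str(attrs.get(m.group(1), "")), fmt)

print(format_spec({"name": "hdf5", "version": "1.14.3"}, "{name}@{version}"))
# -> hdf5@1.14.3
```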
Greg Becker
978c20f35a concretizer: update reuse: default to True (#41302) 2024-04-23 17:42:14 +02:00
Marc T. Henry de Frahan
d535124500 Update amr-wind package with versions (#43728) 2024-04-23 05:07:42 -06:00
Harmen Stoppels
01f61a2eba Remove import distro from packages and docs (#43772) 2024-04-23 12:47:33 +02:00
Massimiliano Culpo
7d5e27d5e8 Do not detect a compiler without a C compiler (#43778) 2024-04-23 12:20:33 +02:00
Harmen Stoppels
d210425eef nettle: remove openssl dep (#43770) 2024-04-23 07:44:15 +02:00
Nathalie Furmento
6be07da201 starpu: fix release 1.4.4 (#43730) 2024-04-22 15:56:23 -07:00
Filippo Spiga
02b38716bf Adding UCX 1.16.0 (#43743)
* Adding UCX 1.16.0
* Fixed hash
2024-04-22 15:52:20 -07:00
Adam J. Stewart
d7bc624c61 Tags: add more build tools (#43766)
* Tags: add more build tools
* py-pythran: add maintainer
2024-04-22 15:34:38 -07:00
Adam J. Stewart
b7cecc9726 py-scikit-image: add v0.23 (#43767) 2024-04-22 15:32:55 -07:00
Adam J. Stewart
393a2f562b py-keras: add v3.3.0 (#43783) 2024-04-22 15:04:06 -07:00
Massimiliano Culpo
682fcec0b2 zig: add v0.12.0 (#43774) 2024-04-22 15:02:45 -07:00
Harmen Stoppels
d6baae525f repo.py: drop deleted packages from provider cache (#43779)
The reverse provider lookup may have stale entries for deleted packages, which used to cause errors. It's hard to invalidate those cache entries, so this commit simply drops entries w/o invalidating the cache.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-22 19:03:44 +02:00
Kyle Knoepfel
e1f2612581 Adjust severity of irreversible operations (#43721) 2024-04-22 16:41:53 +02:00
Harmen Stoppels
080fc875eb compiler.py: reduce verbosity of implicit link dirs parsing (#43777) 2024-04-22 16:07:14 +02:00
Harmen Stoppels
69f417b26a view: dont warn about externals (#43771)
since this is already the status quo on Linux, now that libc is an external by default
2024-04-22 16:05:32 +02:00
Harmen Stoppels
80b5106611 bootstrap: no need to add dummy compilers (#43775) 2024-04-22 16:01:41 +02:00
Massimiliano Culpo
34146c197a Add libc dependency to compiled packages and runtime deps
This commit differentiates Linux from other platforms by
using libc compatibility as a criterion for deciding
which buildcaches / binaries can be reused. Other
platforms still use OS compatibility.

On Linux a libc is injected by all compilers as an implicit
external, and the compatibility criterion is that a libc is
compatible with all other libcs that have the same name and an
equal or lower version.

Some concretization unit tests use libc when run on Linux.
2024-04-22 15:18:06 +02:00
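A minimal sketch of the compatibility criterion described in the commit above, using a hypothetical `Libc` tuple instead of Spack's own spec types:

```python
from collections import namedtuple

# Hypothetical stand-in for a concrete libc spec such as glibc@2.31.
Libc = namedtuple("Libc", ["name", "version"])


def libc_compatible(binary_libc: Libc, host_libc: Libc) -> bool:
    """A binary's libc is compatible with the host libc if the names match and
    the binary was built against an equal or older version."""
    return binary_libc.name == host_libc.name and binary_libc.version <= host_libc.version


# Example: a buildcache entry built against glibc 2.28 is reusable on a glibc 2.35
# host, but not the other way around.
assert libc_compatible(Libc("glibc", (2, 28)), Libc("glibc", (2, 35)))
assert not libc_compatible(Libc("glibc", (2, 35)), Libc("glibc", (2, 28)))
```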
Harmen Stoppels
209a3bf302 Compiler.default_libc
Adds logic to detect which libc the C/C++ compilers use by default,
based on `-dynamic-linker`.

The function `compiler.default_libc()` returns a `Spec` of the form
`glibc@x.y` or `musl@x.y` with the `external_path` property set.

The idea is this can be injected as a dependency.

If we can't run the dynamic linker directly, fall back to `ldd` relative
to the prefix computed from `ld.so`.
2024-04-22 15:18:06 +02:00
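A rough illustration of the detection idea (a hypothetical helper, not Spack's code): ask the compiler for its verbose link line via `-###`, grab the `-dynamic-linker` argument, and query that linker for its version. The `ldd` fallback and musl handling mentioned above are omitted.

```python
import re
import subprocess


def guess_default_libc(compiler: str = "cc"):
    """Best-effort guess of the libc a compiler links against, as 'glibc@x.y'."""
    # With -### the compiler prints (to stderr) the commands it would run,
    # including the linker invocation and its -dynamic-linker argument.
    link_line = subprocess.run(
        [compiler, "-###", "-xc", "/dev/null", "-o", "/dev/null"],
        capture_output=True,
        text=True,
    ).stderr
    match = re.search(r'-dynamic-linker"?\s+"?([^"\s]+)', link_line)
    if not match:
        return None
    dynamic_linker = match.group(1)
    # glibc's ld.so prints something like "... stable release version 2.35."
    # when invoked with --version.
    out = subprocess.run([dynamic_linker, "--version"], capture_output=True, text=True).stdout
    version = re.search(r"version (\d+(?:\.\d+)+)", out)
    return f"glibc@{version.group(1)}" if version else None


if __name__ == "__main__":
    print(guess_default_libc())  # e.g. "glibc@2.35" on a typical Linux host
```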
Harmen Stoppels
e8c41cdbcb database.py: stream of json objects forward compat (#43598)
In the future we may transform the database from a single JSON object to
a stream of JSON objects.

This paves the way for constant time writes and constant time rereads
when only O(1) changes are made. Currently both are linear time.

This commit gives just enough forward compatibility for Spack to produce a
friendly error if we ever move to a stream of JSON objects, in which case a
database would start like this:

```json
{"database": {"version": "<something newer>"}}
```
2024-04-22 09:43:41 +02:00
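A sketch of what such a guard might look like (the function, error type, and version values are illustrative, not Spack's actual names):

```python
import json

SUPPORTED_DB_VERSIONS = {"7"}  # placeholder value, for illustration only


class InvalidDatabaseVersionError(RuntimeError):
    """Stand-in for the friendly error described above."""


def read_index(path: str) -> dict:
    with open(path, encoding="utf-8") as f:
        text = f.read().lstrip()
    # raw_decode reads only the first JSON object, so a future "stream of JSON
    # objects" file still yields its leading {"database": {"version": ...}} header.
    first, _ = json.JSONDecoder().raw_decode(text)
    version = str(first["database"]["version"])
    if version not in SUPPORTED_DB_VERSIONS:
        raise InvalidDatabaseVersionError(
            f"database index version {version} is too new for this version of Spack"
        )
    return first
```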
Massimiliano Culpo
a450dd31fa Fix a bug preventing to set platform= on externals (#43758)
closes #43406
2024-04-22 09:15:22 +02:00
618 changed files with 10661 additions and 4280 deletions


@@ -17,33 +17,51 @@ concurrency:
jobs:
# Run audits on all the packages in the built-in repository
package-audits:
runs-on: ${{ matrix.operating_system }}
runs-on: ${{ matrix.system.os }}
strategy:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
system:
- { os: windows-latest, shell: 'powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}' }
- { os: ubuntu-latest, shell: bash }
- { os: macos-latest, shell: bash }
defaults:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml]
- name: Setup for Windows run
if: runner.os == 'Windows'
run: |
python -m pip install --upgrade pywin32
- name: Package audits (with coverage)
if: ${{ inputs.with_coverage == 'true' }}
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) audit externals
coverage run $(which spack) -d audit externals
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' }}
if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits


@@ -1,7 +1,8 @@
#!/bin/bash
set -ex
set -e
source share/spack/setup-env.sh
$PYTHON bin/spack bootstrap disable github-actions-v0.4
$PYTHON bin/spack bootstrap disable spack-install
$PYTHON bin/spack -d solve zlib
$PYTHON bin/spack $SPACK_FLAGS solve zlib
tree $BOOTSTRAP/store
exit 0


@@ -13,118 +13,22 @@ concurrency:
cancel-in-progress: true
jobs:
fedora-clingo-sources:
distros-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
container: ${{ matrix.image }}
strategy:
matrix:
image: ["fedora:latest", "opensuse/leap:latest"]
steps:
- name: Install dependencies
- name: Setup Fedora
if: ${{ matrix.image == 'fedora:latest' }}
run: |
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
tree ~/.spack/bootstrap/store/
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
steps:
- name: Install dependencies
- name: Setup OpenSUSE
if: ${{ matrix.image == 'opensuse/leap:latest' }}
run: |
# Harden CI by applying the workaround described here: https://www.suse.com/support/kb/doc/?id=000019505
zypper update -y || zypper update -y
@@ -133,15 +37,9 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- name: Setup repo
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -151,77 +49,102 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-sources:
runs-on: macos-latest
clingo-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install cmake bison@2.7 tree
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: "3.12"
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
export PATH=/usr/local/opt/bison@2.7/bin:$PATH
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-binaries:
runs-on: ${{ matrix.macos-version }}
gnupg-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
macos-version: ['macos-11', 'macos-12']
runner: [ 'macos-13', 'macos-14', "ubuntu-latest" ]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap clingo
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
export PATH="$old_path"
fi
fi
# NOTE: test all pythons that exist, not all do on 12
done
ubuntu-clingo-binaries:
runs-on: ubuntu-20.04
steps:
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- name: Setup repo
- name: Bootstrap GnuPG
run: |
git --version
. .github/workflows/setup_git.sh
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
from-binaries:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: |
3.8
3.9
3.10
3.11
3.12
- name: Set bootstrap sources
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
- name: Bootstrap clingo
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
set -e
for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
echo "Testing $ver_dir"
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
@@ -236,122 +159,9 @@ jobs:
exit 1
fi
done
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-binaries:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-sources:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install gawk tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
# [1] Distros that have patched git to resolve CVE-2022-24765 (e.g. Ubuntu patching v2.25.1)
# introduce breaking behaviors, so we have to set `safe.directory` in gitconfig ourselves.
# See:
# - https://github.blog/2022-04-12-git-security-vulnerability-announced/
# - https://github.com/actions/checkout/issues/760
# - http://changelogs.ubuntu.com/changelogs/pool/main/g/git/git_2.25.1-1ubuntu3.3/changelog


@@ -45,17 +45,18 @@ jobs:
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:9'],
[rockylinux8, 'linux/amd64,linux/arm64', 'rockylinux:8'],
[rockylinux9, 'linux/amd64,linux/arm64', 'rockylinux:9'],
[fedora37, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:37'],
[fedora38, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:38']]
[fedora39, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:40']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -87,9 +88,9 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@a8a3f3ad30e3422c9c7b888a15615d19a852ae32
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
@@ -120,3 +121,14 @@ jobs:
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
merge-dockerfiles:
runs-on: ubuntu-latest
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
pattern: dockerfiles_*
delete-merged: true


@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0


@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d


@@ -1,4 +1,4 @@
black==24.4.0
black==24.4.2
clingo==5.7.1
flake8==7.0.0
isort==5.13.2


@@ -51,7 +51,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
@@ -100,7 +100,7 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -141,7 +141,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- name: Setup repo and non-root user
run: |
git --version
@@ -160,7 +160,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -195,10 +195,10 @@ jobs:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [macos-latest, macos-14]
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -223,7 +223,7 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}


@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -70,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- name: Setup repo and non-root user
run: |
git --version


@@ -15,7 +15,7 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -33,7 +33,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
@@ -41,7 +41,7 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -59,7 +59,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
@@ -67,7 +67,7 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d


@@ -1,3 +1,48 @@
# v0.21.2 (2024-03-01)
## Bugfixes
- Containerize: accommodate nested or pre-existing spack-env paths (#41558)
- Fix setup-env script, when going back and forth between instances (#40924)
- Fix using fully-qualified namespaces from root specs (#41957)
- Fix a bug when a required provider is requested for multiple virtuals (#42088)
- OCI buildcaches:
- only push in parallel when forking (#42143)
- use pickleable errors (#42160)
- Fix using sticky variants in externals (#42253)
- Fix a rare issue with conditional requirements and multi-valued variants (#42566)
## Package updates
- rust: add v1.75, rework a few variants (#41161,#41903)
- py-transformers: add v4.35.2 (#41266)
- mgard: fix OpenMP on AppleClang (#42933)
# v0.21.1 (2024-01-11)
## New features
- Add support for reading buildcaches created by Spack v0.22 (#41773)
## Bugfixes
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)
- Improve the error message for deprecated preferences (#41075)
- Fix MSVC preview version breaking clingo build on Windows (#41185)
- Fix multi-word aliases (#41126)
- Add a warning for unconfigured compiler (#41213)
- environment: fix an issue with deconcretization/reconcretization of specs (#41294)
- buildcache: don't error if a patch is missing, when installing from binaries (#41986)
- Multiple improvements to unit-tests (#41215,#41369,#41495,#41359,#41361,#41345,#41342,#41308,#41226)
## Package updates
- root: add a webgui patch to address security issue (#41404)
- BerkeleyGW: update source urls (#38218)
# v0.21.0 (2023-11-11)
`v0.21.0` is a major feature release.


@@ -15,7 +15,7 @@ concretizer:
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization. If `dependencies`, we'll only reuse dependencies but
# give you a fresh concretization for your root specs.
reuse: dependencies
reuse: true
# Options that tune which targets are considered for concretization. The
# concretization process is very sensitive to the number targets, and the time
# needed to reach a solution increases noticeably with the number of targets


@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]


@@ -19,7 +19,6 @@ packages:
- apple-clang
- clang
- gcc
- intel
providers:
elf: [libelf]
fuse: [macfuse]


@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]


@@ -15,9 +15,10 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, intel, pgi, clang, xl, nag, fj, aocc]
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
D: [ldc]
daal: [intel-oneapi-daal]
@@ -35,6 +36,7 @@ packages:
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libc: [glibc, musl]
libgfortran: [ gcc-runtime ]
libglx: [mesa+glx, mesa18+glx]
libifcore: [ intel-oneapi-runtime ]

lib/spack/docs/_templates/layout.html (vendored, new file, 12 lines)

@@ -0,0 +1,12 @@
{% extends "!layout.html" %}
{%- block extrahead %}
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-S0PQ7WV75K"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'G-S0PQ7WV75K');
</script>
{% endblock %}


@@ -865,7 +865,7 @@ There are several different ways to use Spack packages once you have
installed them. As you've seen, spack packages are installed into long
paths with hashes, and you need a way to get them into your path. The
easiest way is to use :ref:`spack load <cmd-spack-load>`, which is
described in the next section.
described in this section.
Some more advanced ways to use Spack packages include:
@@ -959,7 +959,86 @@ use ``spack find --loaded``.
You can also use ``spack load --list`` to get the same output, but it
does not have the full set of query options that ``spack find`` offers.
We'll learn more about Spack's spec syntax in the next section.
We'll learn more about Spack's spec syntax in :ref:`a later section <sec-specs>`.
.. _extensions:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Python packages and virtual environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack can install a large number of Python packages. Their names are
typically prefixed with ``py-``. Installing and using them is no
different from any other package:
.. code-block:: console
$ spack install py-numpy
$ spack load py-numpy
$ python3
>>> import numpy
The ``spack load`` command sets the ``PATH`` variable so that the right Python
executable is used, and makes sure that ``numpy`` and its dependencies can be
located in the ``PYTHONPATH``.
Spack is different from other Python package managers in that it installs
every package into its *own* prefix. This is in contrast to ``pip``, which
installs all packages into the same prefix, be it in a virtual environment
or not.
For many users, **virtual environments** are more convenient than repeated
``spack load`` commands, particularly when working with multiple Python
packages. Fortunately Spack supports environments itself, which together
with a view are no different from Python virtual environments.
The recommended way of working with Python extensions such as ``py-numpy``
is through :ref:`Environments <environments>`. The following example creates
a Spack environment with ``numpy`` in the current working directory. It also
puts a filesystem view in ``./view``, which is a more traditional combined
prefix for all packages in the environment.
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
Now you can activate the environment and start using the packages:
.. code-block:: console
$ spack env activate .
$ python3
>>> import numpy
The environment view is also a virtual environment, which is useful if you are
sharing the environment with others who are unfamiliar with Spack. They can
either use the Python executable directly:
.. code-block:: console
$ ./view/bin/python3
>>> import numpy
or use the activation script:
.. code-block:: console
$ source ./view/bin/activate
$ python3
>>> import numpy
In general, there should not be much difference between ``spack env activate``
and using the virtual environment. The main advantage of ``spack env activate``
is that it knows about more packages than just Python packages, and it may set
additional runtime variables that are not covered by the virtual environment
activation script.
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
.. _sec-specs:
@@ -1705,165 +1784,6 @@ check only local packages (as opposed to those used transparently from
``upstream`` spack instances) and the ``-j,--json`` option to output
machine-readable json data for any errors.
.. _extensions:
---------------------------
Extensions & Python support
---------------------------
Spack's installation model assumes that each package will live in its
own install prefix. However, certain packages are typically installed
*within* the directory hierarchy of other packages. For example,
`Python <https://www.python.org>`_ packages are typically installed in the
``$prefix/lib/python-2.7/site-packages`` directory.
In Spack, installation prefixes are immutable, so this type of installation
is not directly supported. However, it is possible to create views that
allow you to merge install prefixes of multiple packages into a single new prefix.
Views are a convenient way to get a more traditional filesystem structure.
Using *extensions*, you can ensure that Python packages always share the
same prefix in the view as Python itself. Suppose you have
Python installed like so:
.. code-block:: console
$ spack find python
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
python@2.7.8
.. _cmd-spack-extensions:
^^^^^^^^^^^^^^^^^^^^
``spack extensions``
^^^^^^^^^^^^^^^^^^^^
You can find extensions for your Python installation like this:
.. code-block:: console
$ spack extensions python
==> python@2.7.8%gcc@4.4.7 arch=linux-debian7-x86_64-703c7a96
==> 36 extensions:
geos py-ipython py-pexpect py-pyside py-sip
py-basemap py-libxml2 py-pil py-pytz py-six
py-biopython py-mako py-pmw py-rpy2 py-sympy
py-cython py-matplotlib py-pychecker py-scientificpython py-virtualenv
py-dateutil py-mpi4py py-pygments py-scikit-learn
py-epydoc py-mx py-pylint py-scipy
py-gnuplot py-nose py-pyparsing py-setuptools
py-h5py py-numpy py-pyqt py-shiboken
==> 12 installed:
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-dateutil@2.4.0 py-nose@1.3.4 py-pyside@1.2.2
py-dateutil@2.4.0 py-numpy@1.9.1 py-pytz@2014.10
py-ipython@2.3.1 py-pygments@2.0.1 py-setuptools@11.3.1
py-matplotlib@1.4.2 py-pyparsing@2.0.3 py-six@1.9.0
The extensions are a subset of what's returned by ``spack list``, and
they are packages like any other. They are installed into their own
prefixes, and you can see this with ``spack find --paths``:
.. code-block:: console
$ spack find --paths py-numpy
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-numpy@1.9.1 ~/spack/opt/linux-debian7-x86_64/gcc@4.4.7/py-numpy@1.9.1-66733244
However, even though this package is installed, you cannot use it
directly when you run ``python``:
.. code-block:: console
$ spack load python
$ python
Python 2.7.8 (default, Feb 17 2015, 01:35:25)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named numpy
>>>
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Extensions in Environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The recommended way of working with extensions such as ``py-numpy``
above is through :ref:`Environments <environments>`. For example,
the following creates an environment in the current working directory
with a filesystem view in the ``./view`` directory:
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
We recommend environments for two reasons. Firstly, environments
can be activated (requires :ref:`shell-support`):
.. code-block:: console
$ spack env activate .
which sets all the right environment variables such as ``PATH`` and
``PYTHONPATH``. This ensures that
.. code-block:: console
$ python
>>> import numpy
works. Secondly, even without shell support, the view ensures
that Python can locate its extensions:
.. code-block:: console
$ ./view/bin/python
>>> import numpy
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
^^^^^^^^^^^^^^^^^^^^
Using ``spack load``
^^^^^^^^^^^^^^^^^^^^
A more traditional way of using Spack and extensions is ``spack load``
(requires :ref:`shell-support`). This will add the extension to ``PYTHONPATH``
in your current shell, and Python itself will be available in the ``PATH``:
.. code-block:: console
$ spack load py-numpy
$ python
>>> import numpy
The loaded packages can be checked using ``spack find --loaded``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Loading Extensions via Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Apart from ``spack env activate`` and ``spack load``, you can load numpy
through your environment modules (using ``environment-modules`` or
``lmod``). This will also add the extension to the ``PYTHONPATH`` in
your current shell.
.. code-block:: console
$ module load <name of numpy module>
If you do not know the name of the specific numpy module you wish to
load, you can use the ``spack module tcl|lmod loads`` command to get
the name of the module from the Spack spec.
-----------------------
Filesystem requirements
-----------------------


@@ -21,23 +21,86 @@ is the following:
Reuse already installed packages
--------------------------------
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
The ``reuse`` attribute controls how aggressively Spack reuses binary packages during concretization. The
attribute can either be a single value, or an object for more complex configurations.
In the former case ("single value") it allows Spack to:
1. Reuse installed packages and buildcaches for all the specs to be concretized, when ``true``
2. Reuse installed packages and buildcaches only for the dependencies of the root specs, when ``dependencies``
3. Disregard reusing installed packages and buildcaches, when ``false``
If finer control over which specs are reused is needed, the value of this attribute can be
an object, with the following keys:
1. ``roots``: if ``true`` root specs are reused, if ``false`` only dependencies of root specs are reused
2. ``from``: list of sources from which reused specs are taken
Each source in ``from`` is itself an object:
.. list-table:: Attributes for a source of reusable specs
:header-rows: 1
* - Attribute name
- Description
* - type (mandatory, string)
- Can be ``local``, ``buildcache``, or ``external``
* - include (optional, list of specs)
- If present, reusable specs must match at least one of the constraints in the list
* - exclude (optional, list of specs)
- If present, reusable specs must not match any of the constraints in the list.
For instance, the following configuration:
.. code-block:: yaml
concretizer:
reuse:
roots: true
from:
- type: local
include:
- "%gcc"
- "%clang"
tells the concretizer to reuse all specs compiled with either ``gcc`` or ``clang`` that are installed
in the local store. Any spec from remote buildcaches is disregarded.
To reduce the boilerplate in configuration files, default values for the ``include`` and
``exclude`` options can be pushed up one level:
.. code-block:: yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
from:
- type: local
- type: buildcache
- type: local
include:
- "foo %oneapi"
In the example above we reuse all specs compiled with ``gcc`` from the local store
and remote buildcaches, and we also reuse ``foo %oneapi``. Note that the last source of
specs overrides the default ``include`` attribute.
For one-off concretizations, there are command-line arguments for each of the simple "single value"
configurations. This means a user can:
.. code-block:: console
% spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
to enable reuse for a single installation, or:
.. code-block:: console
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: dependencies`` is the default.
.. seealso::


@@ -718,23 +718,45 @@ command-line tool, or C/C++/Fortran program with optional Python
modules? The former should be prepended with ``py-``, while the
latter should not.
""""""""""""""""""""""
extends vs. depends_on
""""""""""""""""""""""
""""""""""""""""""""""""""""""
``extends`` vs. ``depends_on``
""""""""""""""""""""""""""""""
This is very similar to the naming dilemma above, with a slight twist.
As mentioned in the :ref:`Packaging Guide <packaging_extensions>`,
``extends`` and ``depends_on`` are very similar, but ``extends`` ensures
that the extension and extendee share the same prefix in views.
This allows the user to import a Python module without
having to add that module to ``PYTHONPATH``.
When deciding between ``extends`` and ``depends_on``, the best rule of
thumb is to check the installation prefix. If Python libraries are
installed to ``<prefix>/lib/pythonX.Y/site-packages``, then you
should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``<prefix>/bin``, then
don't use ``extends``.
Additionally, ``extends("python")`` adds a dependency on the package
``python-venv``. This improves isolation from the system, whether
it's during the build or at runtime: user and system site packages
cannot accidentally be used by any package that ``extends("python")``.
As a rule of thumb: if a package does not install any Python modules
of its own, and merely puts a Python script in the ``bin`` directory,
then there is no need for ``extends``. If the package installs modules
in the ``site-packages`` directory, it requires ``extends``.
"""""""""""""""""""""""""""""""""""""
Executing ``python`` during the build
"""""""""""""""""""""""""""""""""""""
Whenever you need to execute a Python command or pass the path of the
Python interpreter to the build system, it is best to use the global
variable ``python`` directly. For example:
.. code-block:: python
@run_before("install")
def recythonize(self):
python("setup.py", "clean") # use the `python` global
As mentioned in the previous section, ``extends("python")`` adds an
automatic dependency on ``python-venv``, which is a virtual environment
that guarantees build isolation. The ``python`` global always refers to
the correct Python interpreter, whether the package uses ``extends("python")``
or ``depends_on("python")``.
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack


@@ -150,7 +150,7 @@ this can expose you to attacks. Use at your own risk.
--------------------
Path to custom certificates for SSL verification. The value can be a
filesystem path, or an environment variable that expands to a file path.
filesystem path, or an environment variable that expands to an absolute file path.
The default value is set to the environment variable ``SSL_CERT_FILE``
to use the same syntax used by many other applications that automatically
detect custom certificates.
@@ -160,6 +160,9 @@ in the subprocess calling ``curl``.
If ``url_fetch_method:urllib``, then files and directories are supported, i.e.
``config:ssl_certs:$SSL_CERT_FILE`` or ``config:ssl_certs:$SSL_CERT_DIR``
will work.
In all cases the expanded path must be absolute for Spack to use the certificates.
Certificate paths relative to an environment can be specified by prefixing the path
with the Spack configuration variable ``$env``.
--------------------
``checksum``


@@ -194,15 +194,15 @@ The OS that are currently supported are summarized in the table below:
* - Operating System
- Base Image
- Spack Image
* - Ubuntu 18.04
- ``ubuntu:18.04``
- ``spack/ubuntu-bionic``
* - Ubuntu 20.04
- ``ubuntu:20.04``
- ``spack/ubuntu-focal``
* - Ubuntu 22.04
- ``ubuntu:22.04``
- ``spack/ubuntu-jammy``
* - Ubuntu 24.04
- ``ubuntu:24.04``
- ``spack/ubuntu-noble``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
@@ -227,12 +227,12 @@ The OS that are currently supported are summarized in the table below:
* - Rocky Linux 9
- ``rockylinux:9``
- ``spack/rockylinux9``
* - Fedora Linux 37
- ``fedora:37``
- ``spack/fedora37``
* - Fedora Linux 38
- ``fedora:38``
- ``spack/fedora38``
* - Fedora Linux 39
- ``fedora:39``
- ``spack/fedora39``
* - Fedora Linux 40
- ``fedora:40``
- ``spack/fedora40``


@@ -552,11 +552,11 @@ With either interpreter you can run a single command:
.. code-block:: console
$ spack python -c 'import distro; distro.linux_distribution()'
('Ubuntu', '18.04', 'Bionic Beaver')
$ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
...
$ spack python -i ipython -c 'import distro; distro.linux_distribution()'
Out[1]: ('Ubuntu', '18.04', 'Bionic Beaver')
$ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
Out[1]: ...
or a file:


@@ -142,12 +142,8 @@ user's prompt to begin with the environment name in brackets.
$ spack env activate -p myenv
[myenv] $ ...
The ``activate`` command can also be used to create a new environment, if it is
not already defined, by adding the ``--create`` flag. Managed and anonymous
environments, anonymous environments are explained in the next section,
can both be created using the same flags that `spack env create` accepts.
If an environment already exists then spack will simply activate it and ignore the
create specific flags.
The ``activate`` command can also be used to create a new environment if it does not already
exist.
.. code-block:: console
@@ -176,21 +172,36 @@ environment will remove the view from the user environment.
Anonymous Environments
^^^^^^^^^^^^^^^^^^^^^^
Any directory can be treated as an environment if it contains a file
``spack.yaml``. To load an anonymous environment, use:
Apart from managed environments, Spack also supports anonymous environments.
Anonymous environments can be placed in any directory of choice.
.. note::
When uninstalling packages, Spack asks the user to confirm the removal of packages
that are still used in a managed environment. This is not the case for anonymous
environments.
To create an anonymous environment, use one of the following commands:
.. code-block:: console
$ spack env activate -d /path/to/directory
$ spack env create --dir my_env
$ spack env create ./my_env
Anonymous specs can be created in place using the command:
As a shorthand, you can also create an anonymous environment upon activation if it does not
already exist:
.. code-block:: console
$ spack env create -d .
$ spack env activate --create ./my_env
For convenience, Spack can also place an anonymous environment in a temporary directory for you:
.. code-block:: console
$ spack env activate --temp
In this case Spack simply creates a ``spack.yaml`` file in the requested
directory.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Environment Sensitive Commands
@@ -449,6 +460,125 @@ Sourcing that file in Bash will make the environment available to the
user; and can be included in ``.bashrc`` files, etc. The ``loads``
file may also be copied out of the environment, renamed, etc.
.. _environment_include_concrete:
------------------------------
Included Concrete Environments
------------------------------
Spack can create an environment based on information from already established
environments; you can think of it as a combination of existing environments. It
gathers information from the existing environments' ``spack.lock`` files and uses
it when creating the included concrete environment. When an included concrete
environment is created, it generates a ``spack.lock`` file of its own.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Creating included environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To create a combined concrete environment, you must have at least one existing
concrete environment. You will use the command ``spack env create`` with the
argument ``--include-concrete`` followed by the name or path of the environment
you'd like to include. Here is an example of how to create a combined environment
from the command line.
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
You can also include an environment directly in the ``spack.yaml`` file. It
involves adding the ``include_concrete`` heading in the yaml followed by the
absolute path to the independent environments.
.. code-block:: yaml
spack:
specs: []
concretizer:
unify: true
include_concrete:
- /absolute/path/to/environment1
- /absolute/path/to/environment2
Once the ``spack.yaml`` has been updated you must concretize the environment to
get the concrete specs from the included environments.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Updating an included environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If changes were made to the base environment and you want that reflected in the
included environment you will need to reconcretize both the base environment and the
included environment for the change to be implemented. For example:
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
$ spack -e myenv find
==> In environment myenv
==> Root specs
python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
Here we see that ``included_env`` has access to the python package through
the ``myenv`` environment. But if we were to add another spec to ``myenv``,
``included_env`` would not see the new information.
.. code-block:: console
$ spack -e myenv add perl
$ spack -e myenv concretize
$ spack -e myenv find
==> In environment myenv
==> Root specs
perl python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
It isn't until you run the ``spack concretize`` command that the combined
environment will get the updated information from the reconcretized base environment.
.. code-block:: console
$ spack -e included_env concretize
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
perl python
==> 0 installed packages
.. _environment-configuration:
------------------------
@@ -800,6 +930,7 @@ For example, the following environment has three root packages:
This allows for a much-needed reduction in redundancy between packages
and constraints.
----------------
Filesystem Views
----------------
@@ -1033,7 +1164,7 @@ other targets to depend on the environment installation.
A typical workflow is as follows:
.. code:: console
.. code-block:: console
spack env create -d .
spack -e . add perl
@@ -1126,7 +1257,7 @@ its dependencies. This can be useful when certain flags should only apply to
dependencies. Below we show a use case where a spec is installed with verbose
output (``spack install --verbose``) while its dependencies are installed silently:
.. code:: console
.. code-block:: console
$ spack env depfile -o Makefile
@@ -1148,7 +1279,7 @@ This can be accomplished through the generated ``[<prefix>/]SPACK_PACKAGE_IDS``
variable. Assuming we have an active and concrete environment, we generate the
associated ``Makefile`` with a prefix ``example``:
.. code:: console
.. code-block:: console
$ spack env depfile -o env.mk --make-prefix example


@@ -478,6 +478,13 @@ prefix, you can add them to the ``extra_attributes`` field. Similarly,
all other fields from the compilers config can be added to the
``extra_attributes`` field for an external representing a compiler.
Note that the format for the ``paths`` field in the
``extra_attributes`` section is different than in the ``compilers``
config. For compilers configured as external packages, the section is
named ``compilers`` and the dictionary maps language names (``c``,
``cxx``, ``fortran``) to paths, rather than using the names ``cc``,
``fc``, and ``f77``.
.. code-block:: yaml
packages:
@@ -493,11 +500,10 @@ all other fields from the compilers config can be added to the
- spec: llvm+clang@15.0.0 arch=linux-rhel8-skylake
prefix: /usr
extra_attributes:
paths:
cc: /usr/bin/clang-with-suffix
compilers:
c: /usr/bin/clang-with-suffix
cxx: /usr/bin/clang++-with-extra-info
fc: /usr/bin/gfortran
f77: /usr/bin/gfortran
fortran: /usr/bin/gfortran
extra_rpaths:
- /usr/lib/llvm/


@@ -6435,9 +6435,12 @@ the ``paths`` attribute:
echo "Target: x86_64-pc-linux-gnu"
echo "Thread model: posix"
echo "InstalledDir: /usr/bin"
platforms: ["linux", "darwin"]
results:
- spec: 'llvm@3.9.1 +clang~lld~lldb'
If the ``platforms`` attribute is present, tests are run only if the current host
matches one of the listed platforms.
Each test is performed by first creating a temporary directory structure as
specified in the corresponding ``layout`` and by then running
package detection and checking that the outcome matches the expected
@@ -6471,6 +6474,10 @@ package detection and checking that the outcome matches the expected
- A spec that is expected from detection
- Any valid spec
- Yes
* - ``results:[0]:extra_attributes``
- Extra attributes expected on the associated Spec
- Nested dictionary with strings as keys, and regular expressions as leaf values
- No
"""""""""""""""""""""""""""""""
Reuse tests from other packages


@@ -4,10 +4,10 @@ sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.17.2
pygments==2.18.0
urllib3==2.2.1
pytest==8.1.1
pytest==8.2.0
isort==5.13.2
black==24.4.0
black==24.4.2
flake8==7.0.0
mypy==1.9.0
mypy==1.10.0

lib/spack/env/cc (vendored, 26 changed lines)

@@ -174,18 +174,6 @@ preextend() {
unset IFS
}
# eval this because SPACK_MANAGED_DIRS and SPACK_SYSTEM_DIRS are inputs we don't wanna loop over.
# moving the eval inside the function would eval it every call.
eval "\
path_order() {
case \"\$1\" in
$SPACK_MANAGED_DIRS) return 0 ;;
$SPACK_SYSTEM_DIRS) return 2 ;;
/*) return 1 ;;
esac
}
"
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
die "Compiler command line contains our separator ('${lsep}'). Cannot parse."
@@ -198,6 +186,18 @@ for param in $params; do
fi
done
# eval this because SPACK_MANAGED_DIRS and SPACK_SYSTEM_DIRS are inputs we don't wanna loop over.
# moving the eval inside the function would eval it every call.
eval "\
path_order() {
case \"\$1\" in
$SPACK_MANAGED_DIRS) return 0 ;;
$SPACK_SYSTEM_DIRS) return 2 ;;
/*) return 1 ;;
esac
}
"
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
@@ -755,7 +755,7 @@ esac
# Linker flags
case "$mode" in
ld|ccld)
ccld)
extend spack_flags_list SPACK_LDFLAGS
;;
esac


@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.3 (commit 7b8fe60b69e2861e7dac104bc1c183decfcd3daf)
* Version: 0.2.4 (commit 48b92512b9ce203ded0ebd1ac41b42593e931f7c)
astunparse
----------------


@@ -1,3 +1,3 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.3"
__version__ = "0.2.4"


@@ -5,9 +5,10 @@
"""The "cpu" package permits to query and compare different
CPU microarchitectures.
"""
from .detect import host
from .detect import brand_string, host
from .microarchitecture import (
TARGETS,
InvalidCompilerVersion,
Microarchitecture,
UnsupportedMicroarchitecture,
generic_microarchitecture,
@@ -15,10 +16,12 @@
)
__all__ = [
"brand_string",
"host",
"TARGETS",
"InvalidCompilerVersion",
"Microarchitecture",
"UnsupportedMicroarchitecture",
"TARGETS",
"generic_microarchitecture",
"host",
"version_components",
]


@@ -155,6 +155,31 @@ def _is_bit_set(self, register: int, bit: int) -> bool:
mask = 1 << bit
return register & mask > 0
def brand_string(self) -> Optional[str]:
"""Returns the brand string, if available."""
if self.highest_extension_support < 0x80000004:
return None
r1 = self.cpuid.registers_for(eax=0x80000002, ecx=0)
r2 = self.cpuid.registers_for(eax=0x80000003, ecx=0)
r3 = self.cpuid.registers_for(eax=0x80000004, ecx=0)
result = struct.pack(
"IIIIIIIIIIII",
r1.eax,
r1.ebx,
r1.ecx,
r1.edx,
r2.eax,
r2.ebx,
r2.ecx,
r2.edx,
r3.eax,
r3.ebx,
r3.ecx,
r3.edx,
).decode("utf-8")
return result.strip("\x00")
@detection(operating_system="Windows")
def cpuid_info():
@@ -174,8 +199,8 @@ def _check_output(args, env):
WINDOWS_MAPPING = {
"AMD64": "x86_64",
"ARM64": "aarch64",
"AMD64": X86_64,
"ARM64": AARCH64,
}
@@ -409,3 +434,16 @@ def compatibility_check_for_riscv64(info, target):
return (target == arch_root or arch_root in target.ancestors) and (
target.name == info.name or target.vendor == "generic"
)
def brand_string() -> Optional[str]:
"""Returns the brand string of the host, if detected, or None."""
if platform.system() == "Darwin":
return _check_output(
["sysctl", "-n", "machdep.cpu.brand_string"], env=_ensure_bin_usrbin_in_path()
).strip()
if host().family == X86_64:
return CpuidInfoCollector().brand_string()
return None
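As a quick orientation, a minimal usage sketch of the archspec API touched above (assuming the vendored archspec 0.2.4 is importable; printed values are illustrative):

import archspec.cpu

target = archspec.cpu.host()            # Microarchitecture of the current machine
print(target.name, target.vendor)       # e.g. icelake GenuineIntel
print(archspec.cpu.brand_string())      # CPUID/sysctl brand string, or None if unavailable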


@@ -208,6 +208,8 @@ def optimization_flags(self, compiler, version):
"""Returns a string containing the optimization flags that needs
to be used to produce code optimized for this micro-architecture.
The version is expected to be a string of dot separated digits.
If there is no information on the compiler passed as argument the
function returns an empty string. If it is known that the compiler
version we want to use does not support this architecture the function
@@ -216,6 +218,11 @@ def optimization_flags(self, compiler, version):
Args:
compiler (str): name of the compiler to be used
version (str): version of the compiler to be used
Raises:
UnsupportedMicroarchitecture: if the requested compiler does not support
this micro-architecture.
ValueError: if the version doesn't match the expected format
"""
# If we don't have information on compiler at all return an empty string
if compiler not in self.family.compilers:
@@ -232,6 +239,14 @@ def optimization_flags(self, compiler, version):
msg = msg.format(compiler, best_target, best_target.family)
raise UnsupportedMicroarchitecture(msg)
# Check that the version matches the expected format
if not re.match(r"^(?:\d+\.)*\d+$", version):
msg = (
"invalid format for the compiler version argument. "
"Only dot separated digits are allowed."
)
raise InvalidCompilerVersion(msg)
# If we have information on this compiler we need to check the
# version being used
compiler_info = self.compilers[compiler]
@@ -292,7 +307,7 @@ def generic_microarchitecture(name):
Args:
name (str): name of the micro-architecture
"""
return Microarchitecture(name, parents=[], vendor="generic", features=[], compilers={})
return Microarchitecture(name, parents=[], vendor="generic", features=set(), compilers={})
def version_components(version):
@@ -367,7 +382,15 @@ def fill_target_from_dict(name, data, targets):
TARGETS = LazyDictionary(_known_microarchitectures)
class UnsupportedMicroarchitecture(ValueError):
class ArchspecError(Exception):
"""Base class for errors within archspec"""
class UnsupportedMicroarchitecture(ArchspecError, ValueError):
"""Raised if a compiler version does not support optimization for a given
micro-architecture.
"""
class InvalidCompilerVersion(ArchspecError, ValueError):
"""Raised when an invalid format is used for compiler versions in archspec."""


@@ -2937,8 +2937,6 @@
"ilrcpc",
"flagm",
"ssbs",
"paca",
"pacg",
"dcpodp",
"svei8mm",
"svebf16",
@@ -3066,8 +3064,6 @@
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"sve2",
"sveaes",
@@ -3081,8 +3077,7 @@
"svebf16",
"i8mm",
"bf16",
"dgh",
"bti"
"dgh"
],
"compilers" : {
"gcc": [


@@ -98,3 +98,10 @@ def path_filter_caller(*args, **kwargs):
if _func:
return holder_func(_func)
return holder_func
def sanitize_win_longpath(path: str) -> str:
"""Strip Windows extended path prefix from strings
Returns sanitized string.
no-op if extended path prefix is not present"""
return path.lstrip("\\\\?\\")
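An illustrative call for the helper above (hypothetical paths): the ``\\?\`` extended-length prefix that Windows APIs may prepend is stripped, while ordinary paths pass through unchanged.

sanitize_win_longpath(r"\\?\C:\spack\opt\zlib")  # -> r"C:\spack\opt\zlib"
sanitize_win_longpath("/usr/lib")                # unchanged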


@@ -187,12 +187,18 @@ def polite_filename(filename: str) -> str:
return _polite_antipattern().sub("_", filename)
def getuid():
def getuid() -> Union[str, int]:
"""Returns os getuid on non Windows
On Windows returns 0 for admin users, login string otherwise
This is in line with behavior from get_owner_uid which
always returns the login string on Windows
"""
if sys.platform == "win32":
import ctypes
# If not admin, use the string name of the login as a unique ID
if ctypes.windll.shell32.IsUserAnAdmin() == 0:
return 1
return os.getlogin()
return 0
else:
return os.getuid()
@@ -213,6 +219,15 @@ def _win_rename(src, dst):
os.replace(src, dst)
@system_path_filter
def msdos_escape_parens(path):
"""MS-DOS interprets parens as grouping parameters even in a quoted string"""
if sys.platform == "win32":
return path.replace("(", "^(").replace(")", "^)")
else:
return path
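An illustrative call (hypothetical path); on non-Windows platforms the input is returned as-is.

msdos_escape_parens(r"C:\Program Files (x86)\CMake")  # -> r"C:\Program Files ^(x86^)\CMake" on Windows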
@system_path_filter
def rename(src, dst):
# On Windows, os.rename will fail if the destination file already exists
@@ -553,7 +568,13 @@ def exploding_archive_handler(tarball_container, stage):
@system_path_filter(arg_slice=slice(1))
def get_owner_uid(path, err_msg=None):
def get_owner_uid(path, err_msg=None) -> Union[str, int]:
"""Returns owner UID of path destination
On non Windows this is the value of st_uid
On Windows this is the login string associated with the
owning user.
"""
if not os.path.exists(path):
mkdirp(path, mode=stat.S_IRWXU)
@@ -2429,9 +2450,10 @@ def add_library_dependent(self, *dest):
"""
for pth in dest:
if os.path.isfile(pth):
self._additional_library_dependents.add(pathlib.Path(pth).parent)
new_pth = pathlib.Path(pth).parent
else:
self._additional_library_dependents.add(pathlib.Path(pth))
new_pth = pathlib.Path(pth)
self._additional_library_dependents.add(new_pth)
@property
def rpaths(self):


@@ -11,7 +11,7 @@
from llnl.util import lang, tty
from ..path import system_path_filter
from ..path import sanitize_win_longpath, system_path_filter
if sys.platform == "win32":
from win32file import CreateHardLink
@@ -247,9 +247,9 @@ def _windows_create_junction(source: str, link: str):
out, err = proc.communicate()
tty.debug(out.decode())
if proc.returncode != 0:
err = err.decode()
tty.error(err)
raise SymlinkError("Make junction command returned a non-zero return code.", err)
err_str = err.decode()
tty.error(err_str)
raise SymlinkError("Make junction command returned a non-zero return code.", err_str)
def _windows_create_hard_link(path: str, link: str):
@@ -269,14 +269,14 @@ def _windows_create_hard_link(path: str, link: str):
CreateHardLink(link, path)
def readlink(path: str):
def readlink(path: str, *, dir_fd=None):
"""Spack utility to override of os.readlink method to work cross platform"""
if _windows_is_hardlink(path):
return _windows_read_hard_link(path)
elif _windows_is_junction(path):
return _windows_read_junction(path)
else:
return os.readlink(path)
return sanitize_win_longpath(os.readlink(path, dir_fd=dir_fd))
def _windows_read_hard_link(link: str) -> str:


@@ -12,7 +12,7 @@
import traceback
from datetime import datetime
from sys import platform as _platform
from typing import NoReturn
from typing import Any, NoReturn
if _platform != "win32":
import fcntl
@@ -158,21 +158,22 @@ def get_timestamp(force=False):
return ""
def msg(message, *args, **kwargs):
def msg(message: Any, *args: Any, newline: bool = True) -> None:
if not msg_enabled():
return
if isinstance(message, Exception):
message = "%s: %s" % (message.__class__.__name__, str(message))
message = f"{message.__class__.__name__}: {message}"
else:
message = str(message)
newline = kwargs.get("newline", True)
st_text = ""
if _stacktrace:
st_text = process_stacktrace(2)
if newline:
cprint("@*b{%s==>} %s%s" % (st_text, get_timestamp(), cescape(_output_filter(message))))
else:
cwrite("@*b{%s==>} %s%s" % (st_text, get_timestamp(), cescape(_output_filter(message))))
nl = "\n" if newline else ""
cwrite(f"@*b{{{st_text}==>}} {get_timestamp()}{cescape(_output_filter(message))}{nl}")
for arg in args:
print(indent + _output_filter(str(arg)))
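Hypothetical call sites for the refactored ``msg`` above; ``newline`` is now an explicit keyword-only argument instead of being pulled out of ``**kwargs``:

import llnl.util.tty as tty

tty.msg("Installing zlib", "from binary cache")  # extra positional args are printed indented below
tty.msg("Fetching sources ...", newline=False)   # suppress the trailing newline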


@@ -237,7 +237,6 @@ def transpose():
def colified(
elts: List[Any],
cols: int = 0,
output: Optional[IO] = None,
indent: int = 0,
padding: int = 2,
tty: Optional[bool] = None,


@@ -59,9 +59,11 @@
To output an @, use '@@'. To output a } inside braces, use '}}'.
"""
import os
import re
import sys
from contextlib import contextmanager
from typing import Optional
class ColorParseError(Exception):
@@ -95,14 +97,34 @@ def __init__(self, message):
} # white
# Regex to be used for color formatting
color_re = r"@(?:@|\.|([*_])?([a-zA-Z])?(?:{((?:[^}]|}})*)})?)"
COLOR_RE = re.compile(r"@(?:(@)|(\.)|([*_])?([a-zA-Z])?(?:{((?:[^}]|}})*)})?)")
# Mapping from color arguments to values for tty.set_color
color_when_values = {"always": True, "auto": None, "never": False}
# Force color; None: Only color if stdout is a tty
# True: Always colorize output, False: Never colorize output
_force_color = None
def _color_when_value(when):
"""Raise a ValueError for an invalid color setting.
Valid values are 'always', 'never', and 'auto', or equivalently,
True, False, and None.
"""
if when in color_when_values:
return color_when_values[when]
elif when not in color_when_values.values():
raise ValueError("Invalid color setting: %s" % when)
return when
def _color_from_environ() -> Optional[bool]:
try:
return _color_when_value(os.environ.get("SPACK_COLOR", "auto"))
except ValueError:
return None
#: When `None` colorize when stdout is tty, when `True` or `False` always or never colorize resp.
_force_color = _color_from_environ()
def try_enable_terminal_color_on_windows():
@@ -163,19 +185,6 @@ def _err_check(result, func, args):
debug("Unable to support color on Windows terminal")
def _color_when_value(when):
"""Raise a ValueError for an invalid color setting.
Valid values are 'always', 'never', and 'auto', or equivalently,
True, False, and None.
"""
if when in color_when_values:
return color_when_values[when]
elif when not in color_when_values.values():
raise ValueError("Invalid color setting: %s" % when)
return when
def get_color_when():
"""Return whether commands should print color or not."""
if _force_color is not None:
@@ -203,77 +212,64 @@ def color_when(value):
set_color_when(old_value)
class match_to_ansi:
def __init__(self, color=True, enclose=False, zsh=False):
self.color = _color_when_value(color)
self.enclose = enclose
self.zsh = zsh
def escape(self, s):
"""Returns a TTY escape sequence for a color"""
if self.color:
if self.zsh:
result = rf"\e[0;{s}m"
else:
result = f"\033[{s}m"
if self.enclose:
result = rf"\[{result}\]"
return result
def _escape(s: str, color: bool, enclose: bool, zsh: bool) -> str:
"""Returns a TTY escape sequence for a color"""
if color:
if zsh:
result = rf"\e[0;{s}m"
else:
return ""
result = f"\033[{s}m"
def __call__(self, match):
"""Convert a match object generated by ``color_re`` into an ansi
color code. This can be used as a handler in ``re.sub``.
"""
style, color, text = match.groups()
m = match.group(0)
if enclose:
result = rf"\[{result}\]"
if m == "@@":
return "@"
elif m == "@.":
return self.escape(0)
elif m == "@":
raise ColorParseError("Incomplete color format: '%s' in %s" % (m, match.string))
string = styles[style]
if color:
if color not in colors:
raise ColorParseError(
"Invalid color specifier: '%s' in '%s'" % (color, match.string)
)
string += ";" + str(colors[color])
colored_text = ""
if text:
colored_text = text + self.escape(0)
return self.escape(string) + colored_text
return result
else:
return ""
def colorize(string, **kwargs):
def colorize(
string: str, color: Optional[bool] = None, enclose: bool = False, zsh: bool = False
) -> str:
"""Replace all color expressions in a string with ANSI control codes.
Args:
string (str): The string to replace
string: The string to replace
Returns:
str: The filtered string
The filtered string
Keyword Arguments:
color (bool): If False, output will be plain text without control
codes, for output to non-console devices.
enclose (bool): If True, enclose ansi color sequences with
color: If False, output will be plain text without control codes, for output to
non-console devices (default: automatically choose color or not)
enclose: If True, enclose ansi color sequences with
square brackets to prevent misestimation of terminal width.
zsh (bool): If True, use zsh ansi codes instead of bash ones (for variables like PS1)
zsh: If True, use zsh ansi codes instead of bash ones (for variables like PS1)
"""
color = _color_when_value(kwargs.get("color", get_color_when()))
zsh = kwargs.get("zsh", False)
string = re.sub(color_re, match_to_ansi(color, kwargs.get("enclose")), string, zsh)
string = string.replace("}}", "}")
return string
color = color if color is not None else get_color_when()
def match_to_ansi(match):
"""Convert a match object generated by ``COLOR_RE`` into an ansi
color code. This can be used as a handler in ``re.sub``.
"""
escaped_at, dot, style, color_code, text = match.groups()
if escaped_at:
return "@"
elif dot:
return _escape(0, color, enclose, zsh)
elif not (style or color_code):
raise ColorParseError(
f"Incomplete color format: '{match.group(0)}' in '{match.string}'"
)
ansi_code = _escape(f"{styles[style]};{colors.get(color_code, '')}", color, enclose, zsh)
if text:
return f"{ansi_code}{text}{_escape(0, color, enclose, zsh)}"
else:
return ansi_code
return COLOR_RE.sub(match_to_ansi, string).replace("}}", "}")
def clen(string):
@@ -305,7 +301,7 @@ def cprint(string, stream=None, color=None):
cwrite(string + "\n", stream, color)
def cescape(string):
def cescape(string: str) -> str:
"""Escapes special characters needed for color codes.
Replaces the following symbols with their equivalent literal forms:
@@ -321,10 +317,7 @@ def cescape(string):
Returns:
(str): the string with color codes escaped
"""
string = str(string)
string = string.replace("@", "@@")
string = string.replace("}", "}}")
return string
return string.replace("@", "@@").replace("}", "}}")
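Hypothetical calls against the refactored color API above, following the escaping rules from the module docstring (colors are only applied when output goes to a terminal or SPACK_COLOR forces them):

from llnl.util.tty.color import cescape, colorize

print(colorize("@*b{==>} installing @g{zlib}"))            # bold blue arrow, green package name
print(colorize("@r{error:} " + cescape("bad token {x}")))  # cescape doubles "@" and "}" so they print literally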
class ColorStream:


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.22.0.dev0"
__version__ = "0.23.0.dev0"
spack_version = __version__


@@ -254,8 +254,8 @@ def _search_duplicate_specs_in_externals(error_cls):
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.22)"""
# TODO (v0.22): remove this audit as the attributes will not be allowed in config
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.23)"""
# TODO (v0.23): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
@@ -1046,7 +1046,7 @@ def _extracts_errors(triggers, summary):
group="externals",
tag="PKG-EXTERNALS",
description="Sanity checks for external software detection",
kwargs=("pkgs",),
kwargs=("pkgs", "debug_log"),
)
@@ -1069,7 +1069,7 @@ def packages_with_detection_tests():
@external_detection
def _test_detection_by_executable(pkgs, error_cls):
def _test_detection_by_executable(pkgs, debug_log, error_cls):
"""Test drive external detection for packages"""
import spack.detection
@@ -1095,6 +1095,7 @@ def _test_detection_by_executable(pkgs, error_cls):
for idx, test_runner in enumerate(
spack.detection.detection_tests(pkg_name, spack.repo.PATH)
):
debug_log(f"[{__file__}]: running test {idx} for package {pkg_name}")
specs = test_runner.execute()
expected_specs = test_runner.expected_specs
@@ -1111,4 +1112,75 @@ def _test_detection_by_executable(pkgs, error_cls):
details = [msg.format(s, idx) for s in sorted(not_expected)]
errors.append(error_cls(summary=summary, details=details))
matched_detection = []
for candidate in expected_specs:
try:
idx = specs.index(candidate)
matched_detection.append((candidate, specs[idx]))
except (AttributeError, ValueError):
pass
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str):
try:
_regex = re.compile(_expected)
except re.error:
_summary = f'{pkg_name}: illegal regex in "{_spec}" extra attributes'
_details = [f"{_expected} is not a valid regex"]
return [error_cls(summary=_summary, details=_details)]
if not _regex.match(_detected):
_summary = (
f'{pkg_name}: error when trying to match "{_expected}" '
f"in extra attributes"
)
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
if isinstance(_expected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
_details = [
f'"{_expected}" was expected',
f'"{_detected}" was detected',
] + [f'attribute "{s}" was not detected' for s in sorted(_not_detected)]
result.append(error_cls(summary=_summary, details=_details))
_common = set(_expected.keys()) & set(_detected.keys())
for _key in _common:
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
return result
for expected, detected in matched_detection:
# We might not want to test all attributes, so avoid not_expected
not_detected = set(expected.extra_attributes) - set(detected.extra_attributes)
if not_detected:
summary = f"{pkg_name}: cannot detect some attributes for spec {expected}"
details = [
f'"{s}" was not detected [test_id={idx}]' for s in sorted(not_detected)
]
errors.append(error_cls(summary=summary, details=details))
common = set(expected.extra_attributes) & set(detected.extra_attributes)
for key in common:
errors.extend(
_compare_extra_attribute(
expected.extra_attributes[key],
detected.extra_attributes[key],
_spec=expected,
)
)
return errors
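The check above boils down to walking two nested dictionaries and regex-matching the leaves; a minimal sketch with hypothetical data:

import re

expected = {"compilers": {"c": r".*/clang-\d+$"}}     # regexes at the leaves, from the detection test
detected = {"compilers": {"c": "/usr/bin/clang-15"}}  # strings produced by external detection

assert re.match(expected["compilers"]["c"], detected["compilers"]["c"])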


@@ -29,6 +29,7 @@
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, visit_directory_tree
from llnl.util.symlink import readlink
import spack.caches
import spack.cmd
@@ -658,7 +659,7 @@ def get_buildfile_manifest(spec):
# 2. paths are used as strings.
for rel_path in visitor.symlinks:
abs_path = os.path.join(root, rel_path)
link = os.readlink(abs_path)
link = readlink(abs_path)
if os.path.isabs(link) and link.startswith(spack.store.STORE.layout.root):
data["link_to_relocate"].append(rel_path)
@@ -2001,6 +2002,7 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
with spack.util.path.filter_padding():
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, force)
spec.package.windows_establish_runtime_linkage()
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout)


@@ -5,7 +5,13 @@
"""Function and classes needed to bootstrap Spack itself."""
from .config import ensure_bootstrap_configuration, is_bootstrapping, store_path
from .core import all_core_root_specs, ensure_core_dependencies, ensure_patchelf_in_path_or_raise
from .core import (
all_core_root_specs,
ensure_clingo_importable_or_raise,
ensure_core_dependencies,
ensure_gpg_in_path_or_raise,
ensure_patchelf_in_path_or_raise,
)
from .environment import BootstrapEnvironment, ensure_environment_dependencies
from .status import status_message
@@ -13,6 +19,8 @@
"is_bootstrapping",
"ensure_bootstrap_configuration",
"ensure_core_dependencies",
"ensure_gpg_in_path_or_raise",
"ensure_clingo_importable_or_raise",
"ensure_patchelf_in_path_or_raise",
"all_core_root_specs",
"ensure_environment_dependencies",


@@ -54,10 +54,14 @@ def _try_import_from_store(
installed_specs = spack.store.STORE.db.query(query_spec, installed=True)
for candidate_spec in installed_specs:
pkg = candidate_spec["python"].package
# previously bootstrapped specs may not have a python-venv dependency.
if candidate_spec.dependencies("python-venv"):
python, *_ = candidate_spec.dependencies("python-venv")
else:
python, *_ = candidate_spec.dependencies("python")
module_paths = [
os.path.join(candidate_spec.prefix, pkg.purelib),
os.path.join(candidate_spec.prefix, pkg.platlib),
os.path.join(candidate_spec.prefix, python.package.purelib),
os.path.join(candidate_spec.prefix, python.package.platlib),
]
path_before = list(sys.path)


@@ -173,35 +173,14 @@ def _read_metadata(self, package_name: str) -> Any:
return data
def _install_by_hash(
self,
pkg_hash: str,
pkg_sha256: str,
index: List[spack.spec.Spec],
bincache_platform: spack.platforms.Platform,
self, pkg_hash: str, pkg_sha256: str, bincache_platform: spack.platforms.Platform
) -> None:
index_spec = next(x for x in index if x.dag_hash() == pkg_hash)
# Reconstruct the compiler that we need to use for bootstrapping
compiler_entry = {
"modules": [],
"operating_system": str(index_spec.os),
"paths": {
"cc": "/dev/null",
"cxx": "/dev/null",
"f77": "/dev/null",
"fc": "/dev/null",
},
"spec": str(index_spec.compiler),
"target": str(index_spec.target.family),
}
with spack.platforms.use_platform(bincache_platform):
with spack.config.override("compilers", [{"compiler": compiler_entry}]):
spec_str = "/" + pkg_hash
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
matches = spack.store.find([spec_str], multiple=False, query_fn=query)
for match in matches:
spack.binary_distribution.install_root_node(
match, unsigned=True, force=True, sha256=pkg_sha256
)
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
for match in spack.store.find([f"/{pkg_hash}"], multiple=False, query_fn=query):
spack.binary_distribution.install_root_node(
match, unsigned=True, force=True, sha256=pkg_sha256
)
def _install_and_test(
self,
@@ -232,7 +211,7 @@ def _install_and_test(
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, index, bincache_platform)
self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)
info: ConfigDictionary = {}
if test_fn(query_spec=abstract_spec, query_info=info):
@@ -291,10 +270,6 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
# This is needed to help the old concretizer taking the `setuptools` dependency
# only when bootstrapping from sources on Python 3.12
if spec_for_current_python() == "python@3.12":
concrete_spec.constrain("+force_setuptools")
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme
@@ -583,9 +558,9 @@ def ensure_winsdk_external_or_raise() -> None:
missing_packages_lst.append("win-sdk")
missing_packages = " & ".join(missing_packages_lst)
raise RuntimeError(
f"Unable to find the {missing_packages}, please install these packages\
via the Visual Studio installer\
before proceeding with Spack or provide the path to a non standard install via\
f"Unable to find the {missing_packages}, please install these packages \
via the Visual Studio installer \
before proceeding with Spack or provide the path to a non standard install with \
'spack external find --path'"
)
# wgl/sdk are not required for bootstrapping Spack, but
@@ -600,8 +575,6 @@ def ensure_core_dependencies() -> None:
ensure_patchelf_in_path_or_raise()
if not IS_WINDOWS:
ensure_gpg_in_path_or_raise()
else:
ensure_winsdk_external_or_raise()
ensure_clingo_importable_or_raise()


@@ -3,13 +3,11 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap non-core Spack dependencies from an environment."""
import glob
import hashlib
import os
import pathlib
import sys
import warnings
from typing import List
from typing import Iterable, List
import archspec.cpu
@@ -28,6 +26,16 @@
class BootstrapEnvironment(spack.environment.Environment):
"""Environment to install dependencies of Spack for a given interpreter and architecture"""
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
# Remove python package roots created before python-venv was introduced
for s in self.concrete_roots():
if "python" in s.package.extendees and not s.dependencies("python-venv"):
self.deconcretize(s)
@classmethod
def spack_dev_requirements(cls) -> List[str]:
"""Spack development requirements"""
@@ -59,31 +67,19 @@ def view_root(cls) -> pathlib.Path:
return cls.environment_root().joinpath("view")
@classmethod
def pythonpaths(cls) -> List[str]:
"""Paths to be added to sys.path or PYTHONPATH"""
python_dir_part = f"python{'.'.join(str(x) for x in sys.version_info[:2])}"
glob_expr = str(cls.view_root().joinpath("**", python_dir_part, "**"))
result = glob.glob(glob_expr)
if not result:
msg = f"Cannot find any Python path in {cls.view_root()}"
warnings.warn(msg)
return result
@classmethod
def bin_dirs(cls) -> List[pathlib.Path]:
def bin_dir(cls) -> pathlib.Path:
"""Paths to be added to PATH"""
return [cls.view_root().joinpath("bin")]
return cls.view_root().joinpath("bin")
def python_dirs(self) -> Iterable[pathlib.Path]:
python = next(s for s in self.all_specs_generator() if s.name == "python-venv").package
return {self.view_root().joinpath(p) for p in (python.platlib, python.purelib)}
@classmethod
def spack_yaml(cls) -> pathlib.Path:
"""Environment spack.yaml file"""
return cls.environment_root().joinpath("spack.yaml")
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
@@ -100,21 +96,13 @@ def update_installations(self) -> None:
self.install_all()
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
the environment view.
"""
# Do minimal modifications to sys.path and environment variables. In particular, pay
# attention to have the smallest PYTHONPATH / sys.path possible, since that may impact
# the performance of the current interpreter
sys.path.extend(self.pythonpaths())
os.environ["PATH"] = os.pathsep.join(
[str(x) for x in self.bin_dirs()] + os.environ.get("PATH", "").split(os.pathsep)
)
os.environ["PYTHONPATH"] = os.pathsep.join(
os.environ.get("PYTHONPATH", "").split(os.pathsep)
+ [str(x) for x in self.pythonpaths()]
)
def load(self) -> None:
"""Update PATH and sys.path."""
# Make executables available (shouldn't need PYTHONPATH)
os.environ["PATH"] = f"{self.bin_dir()}{os.pathsep}{os.environ.get('PATH', '')}"
# Spack itself imports pytest
sys.path.extend(str(p) for p in self.python_dirs())
def _write_spack_yaml_file(self) -> None:
tty.msg(
@@ -164,4 +152,4 @@ def ensure_environment_dependencies() -> None:
_add_externals_if_missing()
with BootstrapEnvironment() as env:
env.update_installations()
env.update_syspath_and_environ()
env.load()


@@ -43,7 +43,7 @@
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Tuple
from typing import List, Set, Tuple
import llnl.util.tty as tty
from llnl.string import plural
@@ -551,13 +551,16 @@ def update_compiler_args_for_dep(dep):
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
spack_managed_dirs: List[str] = [
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
spack_managed_dirs: Set[str] = {
spack.stage.get_stage_root(),
spack.store.STORE.db.root,
*(db.root for db in spack.store.STORE.db.upstream_dbs),
]
}
spack_managed_dirs.update([os.path.realpath(p) for p in spack_managed_dirs])
env.set(SPACK_MANAGED_DIRS, "|".join(f'"{p}/"*' for p in spack_managed_dirs))
env.set(SPACK_MANAGED_DIRS, "|".join(f'"{p}/"*' for p in sorted(spack_managed_dirs)))
is_spack_managed = lambda p: any(p.startswith(store) for store in spack_managed_dirs)
link_dirs_spack, link_dirs_system = stable_partition(link_dirs, is_spack_managed)
include_dirs_spack, include_dirs_system = stable_partition(include_dirs, is_spack_managed)
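A rough sketch of the value this produces (hypothetical roots, including a realpath alias such as /private/tmp on macOS); the resulting glob alternation is what the ``path_order`` case statement in the ``cc`` wrapper above matches against:

spack_managed_dirs = {"/opt/spack/opt/spack", "/private/tmp/spack-stage", "/tmp/spack-stage"}
print("|".join(f'"{p}/"*' for p in sorted(spack_managed_dirs)))
# -> "/opt/spack/opt/spack/"*|"/private/tmp/spack-stage/"*|"/tmp/spack-stage/"*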


@@ -39,16 +39,11 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
"""Set the PYTHON_EXECUTABLE, Python_EXECUTABLE, and Python3_EXECUTABLE CMake variables
if the package has Python as build or link dep and ``find_python_hints`` is set to True. See
``find_python_hints`` for context."""
if not getattr(pkg, "find_python_hints", False):
if not getattr(pkg, "find_python_hints", False) or not pkg.spec.dependencies(
"python", dt.BUILD | dt.LINK
):
return
pythons = pkg.spec.dependencies("python", dt.BUILD | dt.LINK)
if len(pythons) != 1:
return
try:
python_executable = pythons[0].package.command.path
except RuntimeError:
return
python_executable = pkg.spec["python"].command.path
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),


@@ -0,0 +1,144 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import os
import pathlib
import re
import sys
from typing import Dict, List, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty
import spack.compiler
import spack.package_base
# Local "type" for type hints
Path = Union[str, pathlib.Path]
class CompilerPackage(spack.package_base.PackageBase):
"""A Package mixin for all common logic for packages that implement compilers"""
# TODO: how do these play nicely with other tags
tags: Sequence[str] = ["compiler"]
#: Optional suffix regexes for searching for this type of compiler.
#: Suffixes are used by some frameworks, e.g. macports uses an '-mp-X.Y'
#: version suffix for gcc.
compiler_suffixes: List[str] = [r"-.*"]
#: Optional prefix regexes for searching for this compiler
compiler_prefixes: List[str] = []
#: Compiler argument(s) that produces version information
#: If multiple arguments, the earlier arguments must produce errors when invalid
compiler_version_argument: Union[str, Tuple[str]] = "-dumpversion"
#: Regex used to extract version from compiler's output
compiler_version_regex: str = "(.*)"
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
msg += f" supports: {self.supported_languages}, valid values: {self.compiler_languages}"
assert set(self.supported_languages) <= set(self.compiler_languages), msg
@property
def supported_languages(self) -> Sequence[str]:
"""Dynamic definition of languages supported by this package"""
return self.compiler_languages
@classproperty
def compiler_names(cls) -> Sequence[str]:
"""Construct list of compiler names from per-language names"""
names = []
for language in cls.compiler_languages:
names.extend(getattr(cls, f"{language}_names"))
return names
@classproperty
def executables(cls) -> Sequence[str]:
"""Construct executables for external detection from names, prefixes, and suffixes."""
regexp_fmt = r"^({0}){1}({2})$"
prefixes = [""] + cls.compiler_prefixes
suffixes = [""] + cls.compiler_suffixes
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
suffixes += [suf + ext for suf in suffixes]
return [
regexp_fmt.format(prefix, re.escape(name), suffix)
for prefix, name, suffix in itertools.product(prefixes, cls.compiler_names, suffixes)
]
@classmethod
def determine_version(cls, exe: Path):
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = spack.compiler.get_compiler_version_output(exe, va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
except spack.util.executable.ProcessError:
pass
except Exception as e:
tty.debug(
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
"""Overridable method for the location of the compiler bindir within the preifx"""
return os.path.join(prefix, "bin")
@classmethod
def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
"""Compute the paths to compiler executables associated with this package
This is a helper method for ``determine_variants`` to compute the ``extra_attributes``
to include with each spec object."""
# There are often at least two copies (not symlinks) of each compiler executable in the
# same directory: one with a canonical name, e.g. "gfortran", and another one with the
# target prefix, e.g. "x86_64-pc-linux-gnu-gfortran". There also might be a copy of "gcc"
# with the version suffix, e.g. "x86_64-pc-linux-gnu-gcc-6.3.0". To ensure the consistency
# of values in the "paths" dictionary (i.e. we prefer all of them to reference copies
# with canonical names if possible), we iterate over the executables in the reversed sorted
# order:
# First pass over languages identifies exes that are perfect matches for canonical names
# Second pass checks for names with prefix/suffix
# Second pass is sorted by language name length because longer named languages
# e.g. cxx can often contain the names of shorter named languages
# e.g. c (e.g. clang/clang++)
paths = {}
exes = sorted(exes, reverse=True)
languages = {
lang: getattr(cls, f"{lang}_names")
for lang in sorted(cls.compiler_languages, key=len, reverse=True)
}
for exe in exes:
for lang, names in languages.items():
if os.path.basename(exe) in names:
paths[lang] = exe
break
else:
for lang, names in languages.items():
if any(name in os.path.basename(exe) for name in names):
paths[lang] = exe
break
return paths
@classmethod
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}


@@ -145,7 +145,7 @@ def install(self, pkg, spec, prefix):
opts += self.nmake_install_args()
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
opts.append(self.define("PREFIX", prefix))
opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.install_targets, ignore_quotes=self.ignore_quotes


@@ -14,7 +14,7 @@
from llnl.util.link_tree import LinkTree
from spack.build_environment import dso_suffix
from spack.directives import conflicts, license, variant
from spack.directives import conflicts, license, redistribute, variant
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -30,7 +30,7 @@ class IntelOneApiPackage(Package):
# oneAPI license does not allow mirroring outside of the
# organization (e.g. University/Company).
redistribute_source = False
redistribute(source=False, binary=False)
for c in [
"target=ppc64:",


@@ -138,16 +138,21 @@ def view_file_conflicts(self, view, merge_map):
return conflicts
def add_files_to_view(self, view, merge_map, skip_if_exists=True):
# Patch up shebangs to the python linked in the view only if python is built by Spack.
if not self.extendee_spec or self.extendee_spec.external:
# Patch up shebangs if the package extends Python and we put a Python interpreter in the
# view.
if not self.extendee_spec:
return super().add_files_to_view(view, merge_map, skip_if_exists)
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
if python.external:
return super().add_files_to_view(view, merge_map, skip_if_exists)
# We only patch shebangs in the bin directory.
copied_files: Dict[Tuple[int, int], str] = {} # File identifier -> source
delayed_links: List[Tuple[str, str]] = [] # List of symlinks from merge map
bin_dir = self.spec.prefix.bin
python_prefix = self.extendee_spec.prefix
for src, dst in merge_map.items():
if skip_if_exists and os.path.lexists(dst):
continue
@@ -168,7 +173,7 @@ def add_files_to_view(self, view, merge_map, skip_if_exists=True):
copied_files[(s.st_dev, s.st_ino)] = dst
shutil.copy2(src, dst)
fs.filter_file(
python_prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
python.prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
)
else:
view.link(src, dst)
@@ -199,14 +204,13 @@ def remove_files_from_view(self, view, merge_map):
ignore_namespace = True
bin_dir = self.spec.prefix.bin
global_view = self.extendee_spec.prefix == view.get_projection_for_spec(self.spec)
to_remove = []
for src, dst in merge_map.items():
if ignore_namespace and namespace_init(dst):
continue
if global_view or not fs.path_contains_subdirectory(src, bin_dir):
if not fs.path_contains_subdirectory(src, bin_dir):
to_remove.append(dst)
else:
os.remove(dst)
@@ -362,6 +366,12 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return f"https://pypi.org/simple/{name}/"
return None
@property
def python_spec(self):
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
@property
def headers(self) -> HeaderList:
"""Discover header files in platlib."""
@@ -371,8 +381,9 @@ def headers(self) -> HeaderList:
# Headers should only be in include or platlib, but no harm in checking purelib too
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
@@ -391,8 +402,9 @@ def libs(self) -> LibraryList:
name = self.spec.name[3:]
# Libraries should only be in platlib, but no harm in checking purelib too
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
find_all_libraries = functools.partial(fs.find_all_libraries, recursive=True)
libs_list = map(find_all_libraries, [platlib, purelib])
@@ -504,6 +516,8 @@ def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
"""Install everything from build directory."""
pip = spec["python"].command
pip.add_default_arg("-m", "pip")
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]
@@ -519,14 +533,6 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
else:
args.append(".")
pip = spec["python"].command
# Hide user packages, since we don't have build isolation. This is
# necessary because pip / setuptools may run hooks from arbitrary
# packages during the build. There is no equivalent variable to hide
# system packages, so this is not reliable for external Python.
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
with fs.working_dir(self.build_directory):
pip(*args)


@@ -80,6 +80,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package_base import PackageBase
from spack.util.environment import EnvironmentModifications
class ROCmPackage(PackageBase):
@@ -156,30 +157,23 @@ def hip_flags(amdgpu_target):
archs = ",".join(amdgpu_target)
return "--amdgpu-target={0}".format(archs)
# ASAN
@staticmethod
def asan_on(env, llvm_path):
def asan_on(self, env: EnvironmentModifications):
llvm_path = self.spec["llvm-amdgpu"].prefix
env.set("CC", llvm_path + "/bin/clang")
env.set("CXX", llvm_path + "/bin/clang++")
env.set("ASAN_OPTIONS", "detect_leaks=0")
for root, dirs, files in os.walk(llvm_path):
for root, _, files in os.walk(llvm_path):
if "libclang_rt.asan-x86_64.so" in files:
asan_lib_path = root
env.prepend_path("LD_LIBRARY_PATH", asan_lib_path)
SET_DWARF_VERSION_4 = ""
try:
# This will throw an error if imported on a non-Linux platform.
import distro
distname = distro.id()
except ImportError:
distname = "unknown"
if "rhel" in distname or "sles" in distname:
if "rhel" in self.spec.os or "sles" in self.spec.os:
SET_DWARF_VERSION_4 = "-gdwarf-5"
else:
SET_DWARF_VERSION_4 = ""
env.set("CFLAGS", "-fsanitize=address -shared-libasan -g " + SET_DWARF_VERSION_4)
env.set("CXXFLAGS", "-fsanitize=address -shared-libasan -g " + SET_DWARF_VERSION_4)
env.set("CFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("CXXFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("LDFLAGS", "-Wl,--enable-new-dtags -fuse-ld=lld -fsanitize=address -g -Wl,")
# HIP version vs Architecture


@@ -1478,6 +1478,12 @@ def copy_test_logs_to_artifacts(test_stage, job_test_dir):
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
def win_quote(quote_str: str) -> str:
if IS_WINDOWS:
quote_str = f'"{quote_str}"'
return quote_str
def download_and_extract_artifacts(url, work_dir):
"""Look for gitlab artifacts.zip at the given url, and attempt to download
and extract the contents into the given work_dir
@@ -1942,9 +1948,9 @@ def compose_command_err_handling(args):
# but we need to handle EXEs (git, etc) ourselves
catch_exe_failure = (
"""
if ($LASTEXITCODE -ne 0){
throw "Command {} has failed"
}
if ($LASTEXITCODE -ne 0){{
throw 'Command {} has failed'
}}
"""
if IS_WINDOWS
else ""
@@ -2176,13 +2182,13 @@ def __init__(self, ci_cdash):
def args(self):
return [
"--cdash-upload-url",
self.upload_url,
win_quote(self.upload_url),
"--cdash-build",
self.build_name,
win_quote(self.build_name),
"--cdash-site",
self.site,
win_quote(self.site),
"--cdash-buildstamp",
self.build_stamp,
win_quote(self.build_stamp),
]
@property # type: ignore


@@ -84,7 +84,7 @@ def externals(parser, args):
return
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs, debug_log=tty.debug)
_process_reports(reports)


@@ -133,6 +133,11 @@ def setup_parser(subparser: argparse.ArgumentParser):
help="when pushing to an OCI registry, tag an image containing all root specs and their "
"runtime dependencies",
)
push.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(push, ["specs", "jobs"])
push.set_defaults(func=push_fn)
@@ -367,6 +372,25 @@ def _make_pool() -> MaybePool:
return NoPool()
def _skip_no_redistribute_for_public(specs):
remaining_specs = list()
removed_specs = list()
for spec in specs:
if spec.package.redistribute_binary:
remaining_specs.append(spec)
else:
removed_specs.append(spec)
if removed_specs:
colified_output = tty.colify.colified(list(s.name for s in removed_specs), indent=4)
tty.debug(
"The following specs will not be added to the binary cache"
" because they cannot be redistributed:\n"
f"{colified_output}\n"
"You can use `--private` to include them."
)
return remaining_specs
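For context, the specs dropped here are those whose package opted out of redistribution, for example with the directive added in the oneAPI recipe above; a hypothetical sketch (the usual ``from spack.package import *`` recipe preamble is assumed):

class VendorBlob(Package):                    # illustrative package
    redistribute(source=False, binary=False)  # keep it out of public mirrors and build caches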
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
@@ -417,6 +441,8 @@ def push_fn(args):
root="package" in args.things_to_install,
dependencies="dependencies" in args.things_to_install,
)
if not args.private:
specs = _skip_no_redistribute_for_public(specs)
# When pushing multiple specs, print the url once ahead of time, as well as how
# many specs are being pushed.


@@ -31,7 +31,6 @@
level = "long"
SPACK_COMMAND = "spack"
MAKE_COMMAND = "make"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
@@ -40,6 +39,12 @@ def deindent(desc):
return desc.replace(" ", "")
def unicode_escape(path: str) -> str:
"""Returns transformed path with any unicode
characters replaced with their corresponding escapes"""
return path.encode("unicode-escape").decode("utf-8")
def setup_parser(subparser):
setup_parser.parser = subparser
subparsers = subparser.add_subparsers(help="CI sub-commands")
@@ -551,75 +556,35 @@ def ci_rebuild(args):
# No hash match anywhere means we need to rebuild spec
# Start with spack arguments
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose"]
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose", "install"]
config = cfg.get("config")
if not config["verify_ssl"]:
spack_cmd.append("-k")
install_args = []
install_args = [f'--use-buildcache={spack_ci.win_quote("package:never,dependencies:only")}']
can_verify = spack_ci.can_verify_binaries()
verify_binaries = can_verify and spack_is_pr_pipeline is False
if not verify_binaries:
install_args.append("--no-check-signature")
slash_hash = "/{}".format(job_spec.dag_hash())
# Arguments when installing dependencies from cache
deps_install_args = install_args
slash_hash = spack_ci.win_quote("/" + job_spec.dag_hash())
# Arguments when installing the root from sources
root_install_args = install_args + [
"--keep-stage",
"--only=package",
"--use-buildcache=package:never,dependencies:only",
]
deps_install_args = install_args + ["--only=dependencies"]
root_install_args = install_args + ["--keep-stage", "--only=package"]
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
root_install_args.extend(cdash_handler.args())
root_install_args.append(slash_hash)
# ["x", "y"] -> "'x' 'y'"
args_to_string = lambda args: " ".join("'{}'".format(arg) for arg in args)
commands = [
# apparently there's a race when spack bootstraps? do it up front once
[SPACK_COMMAND, "-e", env.path, "bootstrap", "now"],
[
SPACK_COMMAND,
"-e",
env.path,
"env",
"depfile",
"-o",
"Makefile",
"--use-buildcache=package:never,dependencies:only",
slash_hash, # limit to spec we're building
],
[
# --output-sync requires GNU make 4.x.
# Old make errors when you pass it a flag it doesn't recognize,
# but it doesn't error or warn when you set unrecognized flags in
# this variable.
"export",
"GNUMAKEFLAGS=--output-sync=recurse",
],
[
MAKE_COMMAND,
"SPACK={}".format(args_to_string(spack_cmd)),
"SPACK_COLOR=always",
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
),
],
spack_cmd + ["install"] + root_install_args,
[SPACK_COMMAND, "-e", unicode_escape(env.path), "bootstrap", "now"],
spack_cmd + deps_install_args + [slash_hash],
spack_cmd + root_install_args + [slash_hash],
]
tty.debug("Installing {0} from source".format(job_spec.name))
install_exit_code = spack_ci.process_command("install", commands, repro_dir)


@@ -563,12 +563,13 @@ def add_concretizer_args(subparser):
help="reuse installed packages/buildcaches when possible",
)
subgroup.add_argument(
"--fresh-roots",
"--reuse-deps",
action=ConfigSetAction,
dest="concretizer:reuse",
const="dependencies",
default=None,
help="reuse installed dependencies only",
help="concretize with fresh roots and reused dependencies",
)
subgroup.add_argument(
"--deprecated",


@@ -10,13 +10,13 @@
import sys
import tempfile
from pathlib import Path
from typing import Optional
from typing import List, Optional
import llnl.string as string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
from llnl.util.tty.color import cescape, colorize
import spack.cmd
import spack.cmd.common
@@ -61,14 +61,7 @@
#
def env_create_setup_parser(subparser):
"""create a new environment"""
subparser.add_argument(
"env_name",
metavar="env",
help=(
"name of managed environment or directory of the anonymous env "
"(when using --dir/-d) to activate"
),
)
subparser.add_argument("env_name", metavar="env", help="name or directory of environment")
subparser.add_argument(
"-d", "--dir", action="store_true", help="create an environment in a specific directory"
)
@@ -94,6 +87,9 @@ def env_create_setup_parser(subparser):
default=None,
help="either a lockfile (must end with '.json' or '.lock') or a manifest file",
)
subparser.add_argument(
"--include-concrete", action="append", help="name of old environment to copy specs from"
)
def env_create(args):
@@ -111,19 +107,32 @@ def env_create(args):
# the environment should not include a view.
with_view = None
include_concrete = None
if hasattr(args, "include_concrete"):
include_concrete = args.include_concrete
env = _env_create(
args.env_name,
init_file=args.envfile,
dir=args.dir,
dir=args.dir or os.path.sep in args.env_name or args.env_name in (".", ".."),
with_view=with_view,
keep_relative=args.keep_relative,
include_concrete=include_concrete,
)
# Generate views, only really useful for environments created from spack.lock files.
env.regenerate_views()
def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
def _env_create(
name_or_path: str,
*,
init_file: Optional[str] = None,
dir: bool = False,
with_view: Optional[str] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
):
"""Create a new environment, with an optional yaml description.
Arguments:
@@ -135,22 +144,31 @@ def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep
keep_relative (bool): if True, develop paths are copied verbatim into
the new environment file, otherwise they may be made absolute if the
new environment is in a different location
include_concrete (list): list of the included concrete environments
"""
if not dir:
env = ev.create(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg("Created environment '%s' in %s" % (name_or_path, env.path))
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % (name_or_path))
return env
env = ev.create_in_dir(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
)
tty.msg("Created environment in %s" % env.path)
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % env.path)
tty.msg(
colorize(
f"Created environment @c{{{cescape(name_or_path)}}} in: @c{{{cescape(env.path)}}}"
)
)
else:
env = ev.create_in_dir(
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg(colorize(f"Created independent environment in: @c{{{cescape(env.path)}}}"))
tty.msg(f"Activate with: {colorize(f'@c{{spack env activate {cescape(name_or_path)}}}')}")
return env
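A hypothetical call mirroring the new keyword threaded through above (environment names are illustrative):

import spack.environment as ev

env = ev.create("analysis", include_concrete=["base-tools"])  # copy concrete specs from an existing env
env.regenerate_views()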
@@ -436,6 +454,12 @@ def env_remove_setup_parser(subparser):
"""remove an existing environment"""
subparser.add_argument("rm_env", metavar="env", nargs="+", help="environment(s) to remove")
arguments.add_common_arguments(subparser, ["yes_to_all"])
subparser.add_argument(
"-f",
"--force",
action="store_true",
help="remove the environment even if it is included in another environment",
)
def env_remove(args):
@@ -445,13 +469,35 @@ def env_remove(args):
and manifests embedded in repositories should be removed manually.
"""
read_envs = []
valid_envs = []
bad_envs = []
for env_name in args.rm_env:
invalid_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
read_envs.append(env)
valid_envs.append(env_name)
if env_name in args.rm_env:
read_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
bad_envs.append(env_name)
invalid_envs.append(env_name)
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if env is linked to another before trying to remove
for name in valid_envs:
# don't check if environment is included to itself
if name == env_name:
continue
environ = ev.Environment(ev.root(name))
if ev.root(env_name) in environ.included_concrete_envs:
msg = f'Environment "{env_name}" is being used by environment "{name}"'
if args.force:
tty.warn(msg)
else:
tty.die(msg)
if not args.yes_to_all:
environments = string.plural(len(args.rm_env), "environment", show_n=False)


@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import copy
import sys
import llnl.util.lang
@@ -271,6 +272,27 @@ def root_decorator(spec, string):
print()
if env.included_concrete_envs:
tty.msg("Included specs")
# Root specs cannot be displayed with prefixes, since those are not
# set for abstract specs. Same for hashes
root_args = copy.copy(args)
root_args.paths = False
# Roots are displayed with variants, etc. so that we can see
# specifically what the user asked for.
cmd.display_specs(
env.included_user_specs,
root_args,
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespace=True,
show_flags=True,
show_full_compiler=True,
variants=True,
)
print()
if args.show_concretized:
tty.msg("Concretized roots")
cmd.display_specs(env.specs_by_hash.values(), args, decorator=decorator)


@@ -263,8 +263,8 @@ def _fmt_name_and_default(variant):
return color.colorize(f"@c{{{variant.name}}} @C{{[{_fmt_value(variant.default)}]}}")
def _fmt_when(when, indent):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(when)}")
def _fmt_when(when: "spack.spec.Spec", indent: int):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(str(when))}")
def _fmt_variant_description(variant, width, indent):
@@ -441,7 +441,7 @@ def get_url(version):
return "No URL"
url = get_url(preferred) if pkg.has_code else ""
line = version(" {0}".format(pad(preferred))) + color.cescape(url)
line = version(" {0}".format(pad(preferred))) + color.cescape(str(url))
color.cwrite(line)
print()
@@ -464,7 +464,7 @@ def get_url(version):
continue
for v, url in vers:
line = version(" {0}".format(pad(v))) + color.cescape(url)
line = version(" {0}".format(pad(v))) + color.cescape(str(url))
color.cprint(line)
@@ -475,10 +475,7 @@ def print_virtuals(pkg, args):
color.cprint(section_title("Virtual Packages: "))
if pkg.provided:
for when, specs in reversed(sorted(pkg.provided.items())):
line = " %s provides %s" % (
when.colorized(),
", ".join(s.colorized() for s in specs),
)
line = " %s provides %s" % (when.cformat(), ", ".join(s.cformat() for s in specs))
print(line)
else:
@@ -497,7 +494,9 @@ def print_licenses(pkg, args):
pad = padder(pkg.licenses, 4)
for when_spec in pkg.licenses:
license_identifier = pkg.licenses[when_spec]
line = license(" {0}".format(pad(license_identifier))) + color.cescape(when_spec)
line = license(" {0}".format(pad(license_identifier))) + color.cescape(
str(when_spec)
)
color.cprint(line)


@@ -71,6 +71,11 @@ def setup_parser(subparser):
help="the number of versions to fetch for each spec, choose 'all' to"
" retrieve all versions of each package",
)
create_parser.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(create_parser, ["specs"])
arguments.add_concretizer_args(create_parser)
@@ -359,7 +364,6 @@ def concrete_specs_from_user(args):
specs = filter_externals(specs)
specs = list(set(specs))
specs.sort(key=lambda s: (s.name, s.version))
specs, _ = lang.stable_partition(specs, predicate_fn=not_excluded_fn(args))
return specs
@@ -404,36 +408,50 @@ def concrete_specs_from_cli_or_file(args):
return specs
def not_excluded_fn(args):
"""Return a predicate that evaluate to True if a spec was not explicitly
excluded by the user.
"""
exclude_specs = []
if args.exclude_file:
exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
class IncludeFilter:
def __init__(self, args):
self.exclude_specs = []
if args.exclude_file:
self.exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
self.exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
self.private = args.private
def not_excluded(x):
return not any(x.satisfies(y) for y in exclude_specs)
def __call__(self, x):
return all([self._not_license_excluded(x), self._not_cmdline_excluded(x)])
return not_excluded
def _not_license_excluded(self, x):
"""True if the mirror is private, or if the package does not explicitly
forbid redistributing its source."""
if self.private:
return True
elif x.package_class.redistribute_source(x):
return True
else:
tty.debug(
"Skip adding {0} to mirror: the package.py file"
" indicates that a public mirror should not contain"
" it.".format(x.name)
)
return False
def _not_cmdline_excluded(self, x):
"""True if a spec was not explicitly excluded by the user."""
return not any(x.satisfies(y) for y in self.exclude_specs)
def concrete_specs_from_environment(selection_fn):
def concrete_specs_from_environment():
env = ev.active_environment()
assert env, "an active environment is required"
mirror_specs = env.all_specs()
mirror_specs = filter_externals(mirror_specs)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
def all_specs_with_all_versions(selection_fn):
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
@@ -454,12 +472,6 @@ def versions_per_spec(args):
return num_versions
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def process_mirror_stats(present, mirrored, error):
p, m, e = len(present), len(mirrored), len(error)
tty.msg(
@@ -505,30 +517,28 @@ def mirror_create(args):
# When no directory is provided, the source dir is used
path = args.directory or spack.caches.fetch_cache_location()
mirror_specs, mirror_fn = _specs_and_action(args)
mirror_fn(mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions)
def _specs_and_action(args):
include_fn = IncludeFilter(args)
if args.all and not ev.active_environment():
create_mirror_for_all_specs(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = all_specs_with_all_versions()
mirror_fn = create_mirror_for_all_specs
elif args.all and ev.active_environment():
mirror_specs = concrete_specs_from_environment()
mirror_fn = create_mirror_for_individual_specs
else:
mirror_specs = concrete_specs_from_user(args)
mirror_fn = create_mirror_for_individual_specs
if args.all and ev.active_environment():
create_mirror_for_all_specs_inside_environment(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = concrete_specs_from_user(args)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions
)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=include_fn)
return mirror_specs, mirror_fn
def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
mirror_specs = all_specs_with_all_versions(selection_fn=selection_fn)
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
@@ -540,11 +550,10 @@ def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_all_specs_inside_environment(path, skip_unstable_versions, selection_fn):
mirror_specs = concrete_specs_from_environment(selection_fn=selection_fn)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=skip_unstable_versions
)
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def mirror_destroy(args):

View File

@@ -23,7 +23,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.21"
tutorial_branch = "releases/v0.22"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -214,8 +214,6 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
# Note: skip on windows here because for the moment,
# clingo is wholly unsupported from bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:

View File

@@ -20,8 +20,10 @@
import spack.compilers
import spack.error
import spack.schema.environment
import spack.spec
import spack.util.executable
import spack.util.libc
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
@@ -107,7 +109,6 @@ def _parse_link_paths(string):
"""
lib_search_paths = False
raw_link_dirs = []
tty.debug("parsing implicit link info")
for line in string.splitlines():
if lib_search_paths:
if line.startswith("\t"):
@@ -122,7 +123,7 @@ def _parse_link_paths(string):
continue
if _LINKER_LINE_IGNORE.match(line):
continue
tty.debug("linker line: %s" % line)
tty.debug(f"implicit link dirs: link line: {line}")
next_arg = False
for arg in line.split():
@@ -138,15 +139,12 @@ def _parse_link_paths(string):
link_dir_arg = _LINK_DIR_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("linkdir: %s" % link_dir)
raw_link_dirs.append(link_dir)
link_dir_arg = _LIBPATH_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("libpath: %s", link_dir)
raw_link_dirs.append(link_dir)
tty.debug("found raw link dirs: %s" % ", ".join(raw_link_dirs))
implicit_link_dirs = list()
visited = set()
@@ -156,7 +154,7 @@ def _parse_link_paths(string):
implicit_link_dirs.append(normalized_path)
visited.add(normalized_path)
tty.debug("found link dirs: %s" % ", ".join(implicit_link_dirs))
tty.debug(f"implicit link dirs: result: {', '.join(implicit_link_dirs)}")
return implicit_link_dirs
@@ -417,17 +415,35 @@ def real_version(self):
self._real_version = self.version
return self._real_version
def implicit_rpaths(self):
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
return []
# Put CXX first since it has the most linking issues
# And because it has flags that affect linking
link_dirs = self._get_compiler_link_paths()
output = self.compiler_verbose_output
if not output:
return []
link_dirs = _parse_non_system_link_dirs(output)
all_required_libs = list(self.required_libs) + Compiler._all_compiler_rpath_libraries
return list(paths_containing_libs(link_dirs, all_required_libs))
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
output = self.compiler_verbose_output
if not output:
return None
dynamic_linker = spack.util.libc.parse_dynamic_linker(output)
if not dynamic_linker:
return None
return spack.util.libc.libc_from_dynamic_linker(dynamic_linker)
@property
def required_libs(self):
"""For executables created with this compiler, the compiler libraries
@@ -436,17 +452,17 @@ def required_libs(self):
# By default every compiler returns the empty list
return []
def _get_compiler_link_paths(self):
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
if not hasattr(self, "_compile_c_source_output"):
self._compile_c_source_output = self._compile_dummy_c_source()
return self._compile_c_source_output
def _compile_dummy_c_source(self) -> Optional[str]:
cc = self.cc if self.cc else self.cxx
if not cc or not self.verbose_flag:
# Cannot determine implicit link paths without a compiler / verbose flag
return []
# What flag types apply to first_compiler, in what order
if cc == self.cc:
flags = ["cflags", "cppflags", "ldflags"]
else:
flags = ["cxxflags", "cppflags", "ldflags"]
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
@@ -458,20 +474,19 @@ def _get_compiler_link_paths(self):
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
cc_exe = spack.util.executable.Executable(cc)
for flag_type in flags:
for flag_type in ["cflags" if cc == self.cc else "cxxflags", "cppflags", "ldflags"]:
cc_exe.add_default_arg(*self.flags.get(flag_type, []))
with self.compiler_environment():
output = cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
return _parse_non_system_link_dirs(output)
return cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
except spack.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)
return []
return None
finally:
shutil.rmtree(tmpdir, ignore_errors=True)
@property
def verbose_flag(self):
def verbose_flag(self) -> Optional[str]:
"""
This property should be overridden in the compiler subclass if a
verbose flag is available.
@@ -669,8 +684,8 @@ def __str__(self):
@contextlib.contextmanager
def compiler_environment(self):
# yield immediately if no modules
if not self.modules:
# Avoid modifying os.environ if possible.
if not self.modules and not self.environment:
yield
return
@@ -687,13 +702,9 @@ def compiler_environment(self):
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
env = spack.util.environment.EnvironmentModifications()
env.extend(spack.schema.environment.parse(self.environment))
env.apply_modifications()
spack.schema.environment.parse(self.environment).apply_modifications()
yield
except BaseException:
raise
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()

View File

@@ -164,33 +164,56 @@ def _compiler_config_from_package_config(config):
def _compiler_config_from_external(config):
extra_attributes_key = "extra_attributes"
compilers_key = "compilers"
c_key, cxx_key, fortran_key = "c", "cxx", "fortran"
# Allow `@x.y.z` instead of `@=x.y.z`
spec = spack.spec.parse_with_version_concrete(config["spec"])
# use str(spec.versions) to allow `@x.y.z` instead of `@=x.y.z`
compiler_spec = spack.spec.CompilerSpec(
package_name_to_compiler_name.get(spec.name, spec.name), spec.version
)
extra_attributes = config.get("extra_attributes", {})
prefix = config.get("prefix", None)
err_header = f"The external spec '{spec}' cannot be used as a compiler"
compiler_class = class_for_compiler_name(compiler_spec.name)
paths = extra_attributes.get("paths", {})
compiler_langs = ["cc", "cxx", "fc", "f77"]
for lang in compiler_langs:
if paths.setdefault(lang, None):
continue
if not prefix:
continue
# Check for files that satisfy the naming scheme for this compiler
bindir = os.path.join(prefix, "bin")
for f, regex in itertools.product(os.listdir(bindir), compiler_class.search_regexps(lang)):
if regex.match(f):
paths[lang] = os.path.join(bindir, f)
if all(v is None for v in paths.values()):
# If 'extra_attributes' is missing, this entry may not be meant to be used as
# a compiler, so just leave a debug message rather than a loud warning.
if extra_attributes_key not in config:
tty.debug(f"[{__file__}] {err_header}: missing the '{extra_attributes_key}' key")
return None
extra_attributes = config[extra_attributes_key]
# If 'extra_attributes' is present, warn if 'compilers' is missing or lacks a C compiler
if compilers_key not in extra_attributes:
warnings.warn(
f"{err_header}: missing the '{compilers_key}' key under '{extra_attributes_key}'"
)
return None
attribute_compilers = extra_attributes[compilers_key]
if c_key not in attribute_compilers:
warnings.warn(
f"{err_header}: missing the C compiler path under "
f"'{extra_attributes_key}:{compilers_key}'"
)
return None
c_compiler = attribute_compilers[c_key]
# C++ and Fortran compilers are not mandatory, so let's just leave a debug trace
if cxx_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a C++ compiler")
if fortran_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a Fortran compiler")
# compilers format has cc/fc/f77, externals format has "c/fortran"
paths = {
"cc": c_compiler,
"cxx": attribute_compilers.get(cxx_key, None),
"fc": attribute_compilers.get(fortran_key, None),
"f77": attribute_compilers.get(fortran_key, None),
}
if not spec.architecture:
host_platform = spack.platforms.host()
@@ -967,10 +990,11 @@ def _default_make_compilers(cmp_id, paths):
make_mixed_toolchain(flat_compilers)
# Finally, create the compiler list
compilers = []
compilers: List["spack.compiler.Compiler"] = []
for compiler_id, _, compiler in flat_compilers:
make_compilers = getattr(compiler_id.os, "make_compilers", _default_make_compilers)
compilers.extend(make_compilers(compiler_id, compiler))
candidates = make_compilers(compiler_id, compiler)
compilers.extend(x for x in candidates if x.cc is not None)
return compilers

View File

@@ -38,10 +38,10 @@ class Clang(Compiler):
cxx_names = ["clang++"]
# Subclasses use possible names of Fortran 77 compiler
f77_names = ["flang"]
f77_names = ["flang-new", "flang"]
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["flang"]
fc_names = ["flang-new", "flang"]
version_argument = "--version"
@@ -171,10 +171,11 @@ def extract_version_from_output(cls, output):
match = re.search(
# Normal clang compiler versions are left as-is
r"clang version ([^ )\n]+)-svn[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
# Don't include hyphenated patch numbers in the version
# (see https://github.com/spack/spack/pull/14365 for details)
r"clang version ([^ )\n]+?)-[~.\w\d-]*|" r"clang version ([^ )\n]+)",
r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)",
output,
)
if match:

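A standalone sketch (not part of the diff) showing how the widened pattern accepts a flang-new version banner as well as a clang one:

import re

# Same alternation as in the hunk above: "-svn" suffixes, hyphenated patch
# numbers, and plain versions, for either "clang" or "flang-new".
pattern = re.compile(
    r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
    r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
    r"(?:clang|flang-new) version ([^ )\n]+)"
)

match = pattern.search("flang-new version 18.1.4")
print(next(group for group in match.groups() if group))  # -> 18.1.4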
View File

@@ -64,7 +64,7 @@ def verbose_flag(self):
#
# This way, we at least enable the implicit rpath detection, which is
# based on compilation of a C file (see method
# spack.compiler._get_compiler_link_paths): in the case of a mixed
# spack.compiler._compile_dummy_c_source): in the case of a mixed
# NAG/GCC toolchain, the flag will be passed to g++ (e.g.
# 'g++ -Wl,-v ./main.c'), otherwise, the flag will be passed to nagfor
# (e.g. 'nagfor -Wl,-v ./main.c' - note that nagfor recognizes '.c'

View File

@@ -74,6 +74,10 @@ class Concretizer:
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
#: Packages that the old concretizer cannot deal with correctly, and cannot build anyway.
#: Those will not be considered as providers for virtuals.
non_buildable_packages = {"glibc", "musl"}
def __init__(self, abstract_spec=None):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
@@ -113,7 +117,11 @@ def _valid_virtuals_and_externals(self, spec):
pref_key = lambda spec: 0 # no-op pref key
if spec.virtual:
candidates = spack.repo.PATH.providers_for(spec)
candidates = [
s
for s in spack.repo.PATH.providers_for(spec)
if s.name not in self.non_buildable_packages
]
if not candidates:
raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)

View File

@@ -12,26 +12,26 @@
},
"os_package_manager": "yum_amazon"
},
"fedora:38": {
"fedora:40": {
"bootstrap": {
"template": "container/fedora_38.dockerfile",
"image": "docker.io/fedora:38"
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:40"
},
"os_package_manager": "dnf",
"build": "spack/fedora38",
"build": "spack/fedora40",
"final": {
"image": "docker.io/fedora:38"
"image": "docker.io/fedora:40"
}
},
"fedora:37": {
"fedora:39": {
"bootstrap": {
"template": "container/fedora_37.dockerfile",
"image": "docker.io/fedora:37"
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:39"
},
"os_package_manager": "dnf",
"build": "spack/fedora37",
"build": "spack/fedora39",
"final": {
"image": "docker.io/fedora:37"
"image": "docker.io/fedora:39"
}
},
"rockylinux:9": {
@@ -116,6 +116,13 @@
},
"os_package_manager": "apt"
},
"ubuntu:24.04": {
"bootstrap": {
"template": "container/ubuntu_2404.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-noble"
},
"ubuntu:22.04": {
"bootstrap": {
"template": "container/ubuntu_2204.dockerfile"
@@ -129,13 +136,6 @@
},
"build": "spack/ubuntu-focal",
"os_package_manager": "apt"
},
"ubuntu:18.04": {
"bootstrap": {
"template": "container/ubuntu_1804.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-bionic"
}
},
"os_package_managers": {

View File

@@ -25,6 +25,7 @@
import socket
import sys
import time
from json import JSONDecoder
from typing import (
Any,
Callable,
@@ -818,7 +819,8 @@ def _read_from_file(self, filename):
"""
try:
with open(filename, "r") as f:
fdata = sjson.load(f)
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
raise CorruptDatabaseError("error parsing database:", str(e)) from e
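For context, JSONDecoder.raw_decode differs from sjson.load in that it parses the first JSON value in the string and also returns the index where parsing stopped, so trailing data (such as a second object in a future stream format) does not raise. A minimal standalone illustration:

from json import JSONDecoder

text = '{"database": {"version": "7"}} {"future": "stream entry"}'

# raw_decode returns the decoded first object plus the offset where it ended,
# instead of failing on the extra data that follows it.
data, end = JSONDecoder().raw_decode(text)
print(data)        # {'database': {'version': '7'}}
print(text[end:])  # ' {"future": "stream entry"}'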
@@ -833,27 +835,24 @@ def check(cond, msg):
# High-level file checks
db = fdata["database"]
check("installs" in db, "no 'installs' in JSON DB.")
check("version" in db, "no 'version' in JSON DB.")
installs = db["installs"]
# TODO: better version checking semantics.
version = vn.Version(db["version"])
if version > _DB_VERSION:
raise InvalidDatabaseVersionError(self, _DB_VERSION, version)
elif version < _DB_VERSION:
if not any(old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX):
tty.warn(
"Spack database version changed from %s to %s. Upgrading."
% (version, _DB_VERSION)
)
elif version < _DB_VERSION and not any(
old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX
):
tty.warn(f"Spack database version changed from {version} to {_DB_VERSION}. Upgrading.")
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields))
for k, v in self._data.items()
)
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields)) for k, v in self._data.items()
)
else:
check("installs" in db, "no 'installs' in JSON DB.")
installs = db["installs"]
spec_reader = reader(version)

View File

@@ -83,26 +83,15 @@ def executables_in_path(path_hints: List[str]) -> Dict[str, str]:
return path_to_dict(search_paths)
def get_elf_compat(path):
"""For ELF files, get a triplet (EI_CLASS, EI_DATA, e_machine) and see if
it is host-compatible."""
# On ELF platforms supporting, we try to be a bit smarter when it comes to shared
# libraries, by dropping those that are not host compatible.
with open(path, "rb") as f:
elf = elf_utils.parse_elf(f, only_header=True)
return (elf.is_64_bit, elf.is_little_endian, elf.elf_hdr.e_machine)
def accept_elf(path, host_compat):
"""Accept an ELF file if the header matches the given compat triplet,
obtained with :py:func:`get_elf_compat`. In case it's not an ELF (e.g.
static library, or some arbitrary file, fall back to is_readable_file)."""
"""Accept an ELF file if the header matches the given compat triplet. In case it's not an ELF
(e.g. a static library, or some arbitrary file), fall back to is_readable_file."""
# Fast path: assume libraries at least have .so in their basename.
# Note: don't replace with splitext, because of libsmth.so.1.2.3 file names.
if ".so" not in os.path.basename(path):
return llnl.util.filesystem.is_readable_file(path)
try:
return host_compat == get_elf_compat(path)
return host_compat == elf_utils.get_elf_compat(path)
except (OSError, elf_utils.ElfParsingError):
return llnl.util.filesystem.is_readable_file(path)
@@ -155,7 +144,7 @@ def libraries_in_ld_and_system_library_path(
search_paths = list(llnl.util.lang.dedupe(search_paths, key=file_identifier))
try:
host_compat = get_elf_compat(sys.executable)
host_compat = elf_utils.get_elf_compat(sys.executable)
accept = lambda path: accept_elf(path, host_compat)
except (OSError, elf_utils.ElfParsingError):
accept = llnl.util.filesystem.is_readable_file

View File

@@ -11,6 +11,7 @@
from llnl.util import filesystem
import spack.platforms
import spack.repo
import spack.spec
from spack.util import spack_yaml
@@ -32,6 +33,8 @@ class ExpectedTestResult(NamedTuple):
#: Spec to be detected
spec: str
#: Attributes expected in the external spec
extra_attributes: Dict[str, str]
class DetectionTest(NamedTuple):
@@ -100,7 +103,10 @@ def _create_executable_scripts(self, mock_executables: MockExecutables) -> List[
@property
def expected_specs(self) -> List[spack.spec.Spec]:
return [spack.spec.Spec(r.spec) for r in self.test.results]
return [
spack.spec.Spec.from_detection(item.spec, extra_attributes=item.extra_attributes)
for item in self.test.results
]
def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runner]:
@@ -117,9 +123,13 @@ def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runn
"""
result = []
detection_tests_content = read_detection_tests(pkg_name, repository)
current_platform = str(spack.platforms.host())
tests_by_path = detection_tests_content.get("paths", [])
for single_test_data in tests_by_path:
if current_platform not in single_test_data.get("platforms", [current_platform]):
continue
mock_executables = []
for layout in single_test_data["layout"]:
mock_executables.append(
@@ -127,7 +137,11 @@ def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runn
)
expected_results = []
for assertion in single_test_data["results"]:
expected_results.append(ExpectedTestResult(spec=assertion["spec"]))
expected_results.append(
ExpectedTestResult(
spec=assertion["spec"], extra_attributes=assertion.get("extra_attributes", {})
)
)
current_test = DetectionTest(
pkg_name=pkg_name, layout=mock_executables, results=expected_results

View File

@@ -27,6 +27,7 @@ class OpenMpi(Package):
* ``variant``
* ``version``
* ``requires``
* ``redistribute``
"""
import collections
@@ -63,6 +64,7 @@ class OpenMpi(Package):
__all__ = [
"DirectiveError",
"DirectiveMeta",
"DisableRedistribute",
"version",
"conflicts",
"depends_on",
@@ -75,6 +77,7 @@ class OpenMpi(Package):
"resource",
"build_system",
"requires",
"redistribute",
]
#: These are variant names used by Spack internally; packages can't use them
@@ -598,9 +601,68 @@ def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
return _execute_depends_on
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
By default, Packages allow source/binary distribution (i.e. in
mirrors). Because of this, and because overlapping enable/
disable specs are not allowed, this directive only allows users
to explicitly disable redistribution for specs.
"""
return lambda pkg: _execute_redistribute(pkg, source, binary, when)
def _execute_redistribute(
pkg: "spack.package_base.PackageBase", source=None, binary=None, when: WhenType = None
):
if source is None and binary is None:
return
elif (source is True) or (binary is True):
raise DirectiveError(
"Source/binary distribution are true by default, they can only "
"be explicitly disabled."
)
if source is None:
source = True
if binary is None:
binary = True
when_spec = _make_when_spec(when)
if not when_spec:
return
if source is False:
max_constraint = spack.spec.Spec(f"{pkg.name}@{when_spec.versions}")
if not max_constraint.satisfies(when_spec):
raise DirectiveError("Source distribution can only be disabled for versions")
if when_spec in pkg.disable_redistribute:
disable = pkg.disable_redistribute[when_spec]
if not source:
disable.source = True
if not binary:
disable.binary = True
else:
pkg.disable_redistribute[when_spec] = DisableRedistribute(
source=not source, binary=not binary
)
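A minimal sketch of how a package recipe might use this directive (the package name, versions, checksums, and variant are hypothetical, not taken from the diff):

from spack.package import *


class Mytool(Package):
    """Hypothetical package whose 1.x sources must not be mirrored publicly."""

    homepage = "https://example.com/mytool"
    url = "https://example.com/mytool-2.0.tar.gz"

    version("2.0", sha256="0" * 64)  # placeholder checksum
    version("1.0", sha256="1" * 64)  # placeholder checksum

    variant("blob", default=False, description="Bundle a proprietary blob")

    # Source redistribution may only be restricted by version constraints.
    redistribute(source=False, when="@1")

    # Binaries built with the proprietary blob cannot be redistributed at all.
    redistribute(binary=False, when="+blob")

With such declarations, redistribute_source() and redistribute_binary (see the RedistributionMixin further down) return False for matching specs, so a public "spack mirror create" run without --private would skip the 1.x sources.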
@directive(("extendees", "dependencies"))
def extends(spec, when=None, type=("build", "run"), patches=None):
"""Same as depends_on, but also adds this package to the extendee list.
In case of Python, also adds a dependency on python-venv.
keyword arguments can be passed to extends() so that extension
packages can pass parameters to the extendee's extension
@@ -616,6 +678,11 @@ def _execute_extends(pkg):
_depends_on(pkg, spec, when=when, type=type, patches=patches)
spec_obj = spack.spec.Spec(spec)
# When extending python, also add a dependency on python-venv. This is done so that
# Spack environment views are Python virtual environments.
if spec_obj.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, "python-venv", when=when, type=("build", "run"))
# TODO: the values of the extendees dictionary are not used. Remove in next refactor.
pkg.extendees[spec_obj.name] = (spec_obj, None)

View File

@@ -34,6 +34,9 @@
* ``spec``: a string representation of the abstract spec that was concretized
4. ``concrete_specs``: a dictionary containing the specs in the environment.
5. ``include_concrete`` (dictionary): an optional dictionary that includes the roots
and concrete specs from the included environments, keyed by the path to that
environment
Compatibility
-------------
@@ -50,26 +53,37 @@
- ``v2``
- ``v3``
- ``v4``
- ``v5``
* - ``v0.12:0.14``
- ✅
-
-
-
-
* - ``v0.15:0.16``
- ✅
- ✅
-
-
-
* - ``v0.17``
- ✅
- ✅
- ✅
-
-
* - ``v0.18:``
- ✅
- ✅
- ✅
- ✅
-
* - ``v0.22:``
- ✅
- ✅
- ✅
- ✅
- ✅
Version 1
---------
@@ -334,6 +348,118 @@
}
}
}
Version 5
---------
Version 5 doesn't change the top-level lockfile format, but it adds an optional
``include_concrete`` dictionary holding the ``roots`` and ``concrete_specs`` of the
included environments, keyed by the path to each environment. Because the entry is
optional, ``include_concrete`` is omitted from the lockfile when the environment has
no included environments.
.. code-block:: json
{
"_meta": {
"file-type": "spack-lockfile",
"lockfile-version": 5,
"specfile-version": 3
},
"roots": [
{
"hash": "<dag_hash 1>",
"spec": "<abstract spec 1>"
},
{
"hash": "<dag_hash 2>",
"spec": "<abstract spec 2>"
}
],
"concrete_specs": {
"<dag_hash 1>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_1",
"hash": "<dag_hash for depname_1>",
"type": ["build", "link"]
},
{
"name": "depname_2",
"hash": "<dag_hash for depname_2>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 1>",
},
"<daghash 2>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_3",
"hash": "<dag_hash for depname_3>",
"type": ["build", "link"]
},
{
"name": "depname_4",
"hash": "<dag_hash for depname_4>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 2>"
}
}
"include_concrete": {
"<path to environment>": {
"roots": [
{
"hash": "<dag_hash 1>",
"spec": "<abstract spec 1>"
},
{
"hash": "<dag_hash 2>",
"spec": "<abstract spec 2>"
}
],
"concrete_specs": {
"<dag_hash 1>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_1",
"hash": "<dag_hash for depname_1>",
"type": ["build", "link"]
},
{
"name": "depname_2",
"hash": "<dag_hash for depname_2>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 1>",
},
"<daghash 2>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_3",
"hash": "<dag_hash for depname_3>",
"type": ["build", "link"]
},
{
"name": "depname_4",
"hash": "<dag_hash for depname_4>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 2>"
}
}
}
}
}
"""
from .environment import (

View File

@@ -16,7 +16,7 @@
import urllib.parse
import urllib.request
import warnings
from typing import Dict, Iterable, List, Optional, Set, Tuple, Union
from typing import Any, Dict, Iterable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -159,6 +159,8 @@ def default_manifest_yaml():
default_view_name = "default"
# Default behavior to link all packages into views (vs. only root packages)
default_view_link = "all"
# The name for any included concrete specs
included_concrete_name = "include_concrete"
def installed_specs():
@@ -293,6 +295,7 @@ def create(
init_file: Optional[Union[str, pathlib.Path]] = None,
with_view: Optional[Union[str, pathlib.Path, bool]] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
) -> "Environment":
"""Create a managed environment in Spack and returns it.
@@ -309,10 +312,15 @@ def create(
string, it specifies the path to the view
keep_relative: if True, develop paths are copied verbatim into the new environment file,
otherwise they are made absolute
include_concrete: list of concrete environment names/paths to be included
"""
environment_dir = environment_dir_from_name(name, exists_ok=False)
return create_in_dir(
environment_dir, init_file=init_file, with_view=with_view, keep_relative=keep_relative
environment_dir,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
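A rough sketch of the Python API this enables (the environment name and paths are made up); the included environments must already be concretized, since it is their lockfiles that get copied in:

import spack.environment as ev

# Hypothetical: "combined" starts from the concrete specs recorded in the
# lockfiles of two existing environments.
env = ev.create(
    "combined",
    include_concrete=["/path/to/first/env", "/path/to/second/env"],
)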
@@ -321,6 +329,7 @@ def create_in_dir(
init_file: Optional[Union[str, pathlib.Path]] = None,
with_view: Optional[Union[str, pathlib.Path, bool]] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
) -> "Environment":
"""Create an environment in the directory passed as input and returns it.
@@ -334,6 +343,7 @@ def create_in_dir(
string, it specifies the path to the view
keep_relative: if True, develop paths are copied verbatim into the new environment file,
otherwise they are made absolute
include_concrete: concrete environment names/paths to be included
"""
initialize_environment_dir(root, envfile=init_file)
@@ -346,6 +356,12 @@ def create_in_dir(
if with_view is not None:
manifest.set_default_view(with_view)
if include_concrete is not None:
set_included_envs_to_env_paths(include_concrete)
validate_included_envs_exists(include_concrete)
validate_included_envs_concrete(include_concrete)
manifest.set_include_concrete(include_concrete)
manifest.flush()
except (spack.config.ConfigFormatError, SpackEnvironmentConfigError) as e:
@@ -419,6 +435,67 @@ def ensure_env_root_path_exists():
fs.mkdirp(env_root_path())
def set_included_envs_to_env_paths(include_concrete: List[str]) -> None:
"""Replace any included environment given by name with the path to that environment.
Args:
include_concrete: list of environment names or paths to environments"""
for i, env_name in enumerate(include_concrete):
if is_env_dir(env_name):
include_concrete[i] = env_name
elif exists(env_name):
include_concrete[i] = root(env_name)
def validate_included_envs_exists(include_concrete: List[str]) -> None:
"""Checks that all of the included environments exist
Args:
include_concrete: list of already existing concrete environments to include
Raises:
SpackEnvironmentError: if any of the included environments do not exist
"""
missing_envs = set()
for i, env_name in enumerate(include_concrete):
if not is_env_dir(env_name):
missing_envs.add(env_name)
if missing_envs:
msg = "The following environment(s) are missing: {0}".format(", ".join(missing_envs))
raise SpackEnvironmentError(msg)
def validate_included_envs_concrete(include_concrete: List[str]) -> None:
"""Checks that all of the included environments are concrete
Args:
include_concrete: list of already existing concrete environments to include
Raises:
SpackEnvironmentError: if any of the included environments are not concrete
"""
non_concrete_envs = set()
for env_path in include_concrete:
if not os.path.exists(Environment(env_path).lock_path):
non_concrete_envs.add(Environment(env_path).name)
if non_concrete_envs:
msg = "The following environment(s) are not concrete: {0}\n" "Please run:".format(
", ".join(non_concrete_envs)
)
for env in non_concrete_envs:
msg += f"\n\t`spack -e {env} concretize`"
raise SpackEnvironmentError(msg)
def all_environment_names():
"""List the names of environments that currently exist."""
# just return empty if the env path does not exist. A read-only
@@ -821,6 +898,18 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
self.specs_by_hash: Dict[str, Spec] = {}
#: Repository for this environment (memoized)
self._repo = None
#: Environment paths for concrete (lockfile) included environments
self.included_concrete_envs: List[str] = []
#: First-level included concretized spec data from/to the lockfile.
self.included_concrete_spec_data: Dict[str, Dict[str, List[str]]] = {}
#: User specs from included environments from the last concretization
self.included_concretized_user_specs: Dict[str, List[Spec]] = {}
#: Roots from included environments with the last concretization, in order
self.included_concretized_order: Dict[str, List[str]] = {}
#: Concretized specs by hash from the included environments
self.included_specs_by_hash: Dict[str, Dict[str, Spec]] = {}
#: Previously active environment
self._previous_active = None
self._dev_specs = None
@@ -858,7 +947,7 @@ def _read(self):
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
read_lock_version = self._read_lockfile(f)
read_lock_version = self._read_lockfile(f)["_meta"]["lockfile-version"]
if read_lock_version == 1:
tty.debug(f"Storing backup of {self.lock_path} at {self._lock_backup_v1_path}")
@@ -926,6 +1015,20 @@ def add_view(name, values):
if self.views == dict():
self.views[default_view_name] = ViewDescriptor(self.path, self.view_path_default)
def _process_concrete_includes(self):
"""Extract and load into memory included concrete spec data."""
self.included_concrete_envs = self.manifest[TOP_LEVEL_KEY].get(included_concrete_name, [])
if self.included_concrete_envs:
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
data = self._read_lockfile(f)
if included_concrete_name in data:
self.included_concrete_spec_data = data[included_concrete_name]
else:
self.include_concrete_envs()
def _construct_state_from_manifest(self):
"""Set up user specs and views from the manifest file."""
self.spec_lists = collections.OrderedDict()
@@ -942,6 +1045,31 @@ def _construct_state_from_manifest(self):
self.spec_lists[user_speclist_name] = user_specs
self._process_view(spack.config.get("view", True))
self._process_concrete_includes()
def all_concretized_user_specs(self) -> List[Spec]:
"""Returns all of the concretized user specs of the environment and
its included environment(s)."""
concretized_user_specs = self.concretized_user_specs[:]
for included_specs in self.included_concretized_user_specs.values():
for included in included_specs:
# Don't duplicate included spec(s)
if included not in concretized_user_specs:
concretized_user_specs.append(included)
return concretized_user_specs
def all_concretized_orders(self) -> List[str]:
"""Returns all of the concretized order of the environment and
its included environment(s)."""
concretized_order = self.concretized_order[:]
for included_concretized_order in self.included_concretized_order.values():
for included in included_concretized_order:
# Don't duplicate included spec(s)
if included not in concretized_order:
concretized_order.append(included)
return concretized_order
@property
def user_specs(self):
@@ -966,6 +1094,26 @@ def _read_dev_specs(self):
dev_specs[name] = local_entry
return dev_specs
@property
def included_user_specs(self) -> SpecList:
"""Included concrete user (or root) specs from last concretization."""
spec_list = SpecList()
if not self.included_concrete_envs:
return spec_list
def add_root_specs(included_concrete_specs):
# add specs from the include *and* any nested includes it may have
for env, info in included_concrete_specs.items():
for root_list in info["roots"]:
spec_list.add(root_list["spec"])
if "include_concrete" in info:
add_root_specs(info["include_concrete"])
add_root_specs(self.included_concrete_spec_data)
return spec_list
def clear(self, re_read=False):
"""Clear the contents of the environment
@@ -977,9 +1125,15 @@ def clear(self, re_read=False):
self.spec_lists[user_speclist_name] = SpecList()
self._dev_specs = {}
self.concretized_user_specs = [] # user specs from last concretize
self.concretized_order = [] # roots of last concretize, in order
self.concretized_user_specs = [] # user specs from last concretize
self.specs_by_hash = {} # concretized specs by hash
self.included_concrete_spec_data = {} # concretized specs from lockfile of included envs
self.included_concretized_order = {} # root specs of the included envs, keyed by env path
self.included_concretized_user_specs = {} # user specs from last concretize's included env
self.included_specs_by_hash = {} # concretized specs by hash from the included envs
self.invalidate_repository_cache()
self._previous_active = None # previously active environment
if not re_read:
@@ -1033,6 +1187,55 @@ def scope_name(self):
"""Name of the config scope of this environment's manifest file."""
return self.manifest.scope_name
def include_concrete_envs(self):
"""Copy and save the included envs' specs internally"""
lockfile_meta = None
root_hash_seen = set()
concrete_hash_seen = set()
self.included_concrete_spec_data = {}
for env_path in self.included_concrete_envs:
# Check that environment exists
if not is_env_dir(env_path):
raise SpackEnvironmentError(f"Unable to find env at {env_path}")
env = Environment(env_path)
with open(env.lock_path) as f:
lockfile_as_dict = env._read_lockfile(f)
# Lockfile_meta must match each env and use at least format version 5
if lockfile_meta is None:
lockfile_meta = lockfile_as_dict["_meta"]
elif lockfile_meta != lockfile_as_dict["_meta"]:
raise SpackEnvironmentError("All lockfile _meta values must match")
elif lockfile_meta["lockfile-version"] < 5:
raise SpackEnvironmentError("The lockfile format must be at version 5 or higher")
# Copy unique root specs from env
self.included_concrete_spec_data[env_path] = {"roots": []}
for root_dict in lockfile_as_dict["roots"]:
if root_dict["hash"] not in root_hash_seen:
self.included_concrete_spec_data[env_path]["roots"].append(root_dict)
root_hash_seen.add(root_dict["hash"])
# Copy unique concrete specs from env
for concrete_spec in lockfile_as_dict["concrete_specs"]:
if concrete_spec not in concrete_hash_seen:
self.included_concrete_spec_data[env_path].update(
{"concrete_specs": lockfile_as_dict["concrete_specs"]}
)
concrete_hash_seen.add(concrete_spec)
if "include_concrete" in lockfile_as_dict.keys():
self.included_concrete_spec_data[env_path]["include_concrete"] = lockfile_as_dict[
"include_concrete"
]
self._read_lockfile_dict(self._to_lockfile_dict())
self.write()
def destroy(self):
"""Remove this environment from Spack entirely."""
shutil.rmtree(self.path)
@@ -1232,6 +1435,10 @@ def concretize(self, force=False, tests=False):
for spec in set(self.concretized_user_specs) - set(self.user_specs):
self.deconcretize(spec, concrete=False)
# If a combined env, check updated spec is in the linked envs
if self.included_concrete_envs:
self.include_concrete_envs()
# Pick the right concretization strategy
if self.unify == "when_possible":
return self._concretize_together_where_possible(tests=tests)
@@ -1415,7 +1622,7 @@ def _concretize_separately(self, tests=False):
# Ensure we don't try to bootstrap clingo in parallel
if spack.config.get("config:concretizer", "clingo") == "clingo":
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -1704,8 +1911,14 @@ def _partition_roots_by_install_status(self):
of per spec."""
installed, uninstalled = [], []
with spack.store.STORE.db.read_transaction():
for concretized_hash in self.concretized_order:
spec = self.specs_by_hash[concretized_hash]
for concretized_hash in self.all_concretized_orders():
if concretized_hash in self.specs_by_hash:
spec = self.specs_by_hash[concretized_hash]
else:
for env_path in self.included_specs_by_hash.keys():
if concretized_hash in self.included_specs_by_hash[env_path]:
spec = self.included_specs_by_hash[env_path][concretized_hash]
break
if not spec.installed or (
spec.satisfies("dev_path=*") or spec.satisfies("^dev_path=*")
):
@@ -1785,8 +1998,14 @@ def added_specs(self):
def concretized_specs(self):
"""Tuples of (user spec, concrete spec) for all concrete specs."""
for s, h in zip(self.concretized_user_specs, self.concretized_order):
yield (s, self.specs_by_hash[h])
for s, h in zip(self.all_concretized_user_specs(), self.all_concretized_orders()):
if h in self.specs_by_hash:
yield (s, self.specs_by_hash[h])
else:
for env_path in self.included_specs_by_hash.keys():
if h in self.included_specs_by_hash[env_path]:
yield (s, self.included_specs_by_hash[env_path][h])
break
def concrete_roots(self):
"""Same as concretized_specs, except it returns the list of concrete
@@ -1915,8 +2134,7 @@ def _get_environment_specs(self, recurse_dependencies=True):
If these specs appear under different user_specs, only one copy
is added to the list returned.
"""
specs = [self.specs_by_hash[h] for h in self.concretized_order]
specs = [self.specs_by_hash[h] for h in self.all_concretized_orders()]
if recurse_dependencies:
specs.extend(
traverse.traverse_nodes(
@@ -1961,31 +2179,76 @@ def _to_lockfile_dict(self):
"concrete_specs": concrete_specs,
}
if self.included_concrete_envs:
data[included_concrete_name] = self.included_concrete_spec_data
return data
def _read_lockfile(self, file_or_json):
"""Read a lockfile from a file or from a raw string."""
lockfile_dict = sjson.load(file_or_json)
self._read_lockfile_dict(lockfile_dict)
return lockfile_dict["_meta"]["lockfile-version"]
return lockfile_dict
def set_included_concretized_user_specs(
self,
env_name: str,
env_info: Dict[str, Dict[str, Any]],
included_json_specs_by_hash: Dict[str, Dict[str, Any]],
) -> Dict[str, Dict[str, Any]]:
"""Sets the concretized user specs for an included environment, along with
those from any environments it includes in turn.
Args:
env_name: the name (technically the path) of the included environment
env_info: included concrete environment data
included_json_specs_by_hash: concrete spec data keyed by hash
Returns: updated specs_by_hash
"""
self.included_concretized_order[env_name] = []
self.included_concretized_user_specs[env_name] = []
def add_specs(name, info, specs_by_hash):
# Add specs from the environment as well as any of its nested
# environments.
for root_info in info["roots"]:
self.included_concretized_order[name].append(root_info["hash"])
self.included_concretized_user_specs[name].append(Spec(root_info["spec"]))
if "concrete_specs" in info:
specs_by_hash.update(info["concrete_specs"])
if included_concrete_name in info:
for included_name, included_info in info[included_concrete_name].items():
if included_name not in self.included_concretized_order:
self.included_concretized_order[included_name] = []
self.included_concretized_user_specs[included_name] = []
add_specs(included_name, included_info, specs_by_hash)
add_specs(env_name, env_info, included_json_specs_by_hash)
return included_json_specs_by_hash
def _read_lockfile_dict(self, d):
"""Read a lockfile dictionary into this environment."""
self.specs_by_hash = {}
self.included_specs_by_hash = {}
self.included_concretized_user_specs = {}
self.included_concretized_order = {}
roots = d["roots"]
self.concretized_user_specs = [Spec(r["spec"]) for r in roots]
self.concretized_order = [r["hash"] for r in roots]
json_specs_by_hash = d["concrete_specs"]
included_json_specs_by_hash = {}
# Track specs by their lockfile key. Currently spack uses the finest
# grained hash as the lockfile key, while older formats used the build
# hash or a previous incarnation of the DAG hash (one that did not
# include build deps or package hash).
specs_by_hash = {}
if included_concrete_name in d:
for env_name, env_info in d[included_concrete_name].items():
included_json_specs_by_hash.update(
self.set_included_concretized_user_specs(
env_name, env_info, included_json_specs_by_hash
)
)
# Track specs by their DAG hash, allows handling DAG hash collisions
first_seen = {}
current_lockfile_format = d["_meta"]["lockfile-version"]
try:
reader = READER_CLS[current_lockfile_format]
@@ -1998,6 +2261,39 @@ def _read_lockfile_dict(self, d):
msg += " You need to use a newer Spack version."
raise SpackEnvironmentError(msg)
first_seen, self.concretized_order = self.filter_specs(
reader, json_specs_by_hash, self.concretized_order
)
for spec_dag_hash in self.concretized_order:
self.specs_by_hash[spec_dag_hash] = first_seen[spec_dag_hash]
if any(self.included_concretized_order.values()):
first_seen = {}
for env_name, concretized_order in self.included_concretized_order.items():
filtered_spec, self.included_concretized_order[env_name] = self.filter_specs(
reader, included_json_specs_by_hash, concretized_order
)
first_seen.update(filtered_spec)
for env_path, spec_hashes in self.included_concretized_order.items():
self.included_specs_by_hash[env_path] = {}
for spec_dag_hash in spec_hashes:
self.included_specs_by_hash[env_path].update(
{spec_dag_hash: first_seen[spec_dag_hash]}
)
def filter_specs(self, reader, json_specs_by_hash, order_concretized):
# Track specs by their lockfile key. Currently spack uses the finest
# grained hash as the lockfile key, while older formats used the build
# hash or a previous incarnation of the DAG hash (one that did not
# include build deps or package hash).
specs_by_hash = {}
# Track specs by their DAG hash, allows handling DAG hash collisions
first_seen = {}
# First pass: Put each spec in the map ignoring dependencies
for lockfile_key, node_dict in json_specs_by_hash.items():
spec = reader.from_node_dict(node_dict)
@@ -2020,7 +2316,8 @@ def _read_lockfile_dict(self, d):
# keep. This is only required as long as we support older lockfile
# formats where the mapping from DAG hash to lockfile key is possibly
# one-to-many.
for lockfile_key in self.concretized_order:
for lockfile_key in order_concretized:
for s in specs_by_hash[lockfile_key].traverse():
if s.dag_hash() not in first_seen:
first_seen[s.dag_hash()] = s
@@ -2028,12 +2325,10 @@ def _read_lockfile_dict(self, d):
# Now make sure concretized_order and our internal specs dict
# contains the keys used by modern spack (i.e. the dag_hash
# that includes build deps and package hash).
self.concretized_order = [
specs_by_hash[h_key].dag_hash() for h_key in self.concretized_order
]
for spec_dag_hash in self.concretized_order:
self.specs_by_hash[spec_dag_hash] = first_seen[spec_dag_hash]
order_concretized = [specs_by_hash[h_key].dag_hash() for h_key in order_concretized]
return first_seen, order_concretized
def write(self, regenerate: bool = True) -> None:
"""Writes an in-memory environment to its location on disk.
@@ -2046,7 +2341,7 @@ def write(self, regenerate: bool = True) -> None:
regenerate: regenerate views and run post-write hooks as well as writing if True.
"""
self.manifest_uptodate_or_warn()
if self.specs_by_hash:
if self.specs_by_hash or self.included_concrete_envs:
self.ensure_env_directory_exists(dot_env=True)
self.update_environment_repository()
self.manifest.flush()
@@ -2545,6 +2840,19 @@ def override_user_spec(self, user_spec: str, idx: int) -> None:
raise SpackEnvironmentError(msg) from e
self.changed = True
def set_include_concrete(self, include_concrete: List[str]) -> None:
"""Sets the included concrete environments in the manifest to the value(s) passed as input.
Args:
include_concrete: list of already existing concrete environments to include
"""
self.pristine_configuration[included_concrete_name] = []
for env_path in include_concrete:
self.pristine_configuration[included_concrete_name].append(env_path)
self.changed = True
def add_definition(self, user_spec: str, list_name: str) -> None:
"""Appends a user spec to the first active definition matching the name passed as argument.
@@ -2728,54 +3036,56 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# If scheme is not valid, config_path is not a url
# of a type Spack is generally aware
if spack.util.url.validate_scheme(include_url.scheme):
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, str(self.config_stage_dir), skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, str(self.config_stage_dir), skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):

View File

@@ -662,9 +662,6 @@ def add_specs(self, *specs: spack.spec.Spec) -> None:
return
# Drop externals
for s in specs:
if s.external:
tty.warn("Skipping external package: " + s.short_spec)
specs = [s for s in specs if not s.external]
self._sanity_check_view_projection(specs)

View File

@@ -12,6 +12,10 @@
def post_install(spec, explicit):
# Push package to all buildcaches with autopush==True
# Do nothing if spec is an external package
if spec.external:
return
# Do nothing if package was not installed from source
pkg = spec.package
if pkg.installed_from_binary_cache:

View File

@@ -0,0 +1,8 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
def post_install(spec, explicit=None):
spec.package.windows_establish_runtime_linkage()

View File

@@ -488,6 +488,10 @@ def _process_binary_cache_tarball(
with timer.measure("install"), spack.util.path.filter_padding():
binary_distribution.extract_tarball(pkg.spec, download_result, force=False, timer=timer)
pkg.windows_establish_runtime_linkage()
if hasattr(pkg, "_post_buildcache_install_hook"):
pkg._post_buildcache_install_hook()
pkg.installed_from_binary_cache = True
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
@@ -1695,10 +1699,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
spack.package_base.PackageBase._verbose = spack.build_environment.start_build_process(
pkg, build_process, install_args
)
# Currently this is how RPATH-like behavior is achieved on Windows, after install
# establish runtime linkage via Windows Runtime link object
# Note: this is a no-op on non Windows platforms
pkg.windows_establish_runtime_linkage()
# Note: PARENT of the build process adds the new package to
# the database, so that we don't need to re-read from file.
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)

View File

@@ -427,7 +427,7 @@ def make_argument_parser(**kwargs):
parser.add_argument(
"--color",
action="store",
default=os.environ.get("SPACK_COLOR", "auto"),
default=None,
choices=("always", "never", "auto"),
help="when to colorize output (default: auto)",
)
@@ -622,7 +622,8 @@ def setup_main_options(args):
# with color
color.try_enable_terminal_color_on_windows()
# when to use color (takes always, auto, or never)
color.set_color_when(args.color)
if args.color is not None:
color.set_color_when(args.color)
def allows_unknown_args(command):

View File

@@ -83,6 +83,17 @@ def configuration(module_set_name):
)
_FORMAT_STRING_RE = re.compile(r"({[^}]*})")
def _format_env_var_name(spec, var_name_fmt):
"""Format the variable name, but uppercase any formatted fields."""
fmt_parts = _FORMAT_STRING_RE.split(var_name_fmt)
return "".join(
spec.format(part).upper() if _FORMAT_STRING_RE.match(part) else part for part in fmt_parts
)
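To see what this helper does without constructing a real Spec, here is a self-contained sketch with a stand-in object exposing only the format() method it relies on (the stand-in is purely illustrative):

import re

_FORMAT_STRING_RE = re.compile(r"({[^}]*})")


class FakeSpec:
    """Stand-in for spack.spec.Spec, exposing only format()."""

    def __init__(self, values):
        self.values = values

    def format(self, part):
        return self.values.get(part.strip("{}"), part)


def _format_env_var_name(spec, var_name_fmt):
    # Uppercase only the expanded {token} fields, leave literal text alone.
    fmt_parts = _FORMAT_STRING_RE.split(var_name_fmt)
    return "".join(
        spec.format(part).upper() if _FORMAT_STRING_RE.match(part) else part
        for part in fmt_parts
    )


print(_format_env_var_name(FakeSpec({"name": "zlib"}), "{name}_ROOT"))  # ZLIB_ROOT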
def _check_tokens_are_valid(format_string, message):
"""Checks that the tokens used in the format string are valid in
the context of module file and environment variable naming.
@@ -737,20 +748,12 @@ def environment_modifications(self):
exclude = self.conf.exclude_env_vars
# We may have tokens to substitute in environment commands
# Prepare a suitable transformation dictionary for the names
# of the environment variables. This means turn the valid
# tokens uppercase.
transform = {}
for token in _valid_tokens:
transform[token] = lambda s, string: str.upper(string)
for x in env:
# Ensure all the tokens are valid in this context
msg = "some tokens cannot be expanded in an environment variable name"
_check_tokens_are_valid(x.name, message=msg)
# Transform them
x.name = self.spec.format(x.name, transform=transform)
x.name = _format_env_var_name(self.spec, x.name)
if self.modification_needs_formatting(x):
try:
# Not every command has a value

View File

@@ -398,7 +398,7 @@ def create_opener():
opener = urllib.request.OpenerDirector()
for handler in [
urllib.request.UnknownHandler(),
urllib.request.HTTPSHandler(),
urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()),
spack.util.web.SpackHTTPDefaultErrorHandler(),
urllib.request.HTTPRedirectHandler(),
urllib.request.HTTPErrorProcessor(),
@@ -418,18 +418,27 @@ def ensure_status(request: urllib.request.Request, response: HTTPResponse, statu
)
def default_retry(f, retries: int = 3, sleep=None):
def default_retry(f, retries: int = 5, sleep=None):
sleep = sleep or time.sleep
def wrapper(*args, **kwargs):
for i in range(retries):
try:
return f(*args, **kwargs)
except urllib.error.HTTPError as e:
except (urllib.error.URLError, TimeoutError) as e:
# Retry on internal server errors, and rate limit errors
# Potentially this could take into account the Retry-After header
# if registries support it
if i + 1 != retries and (500 <= e.code < 600 or e.code == 429):
if i + 1 != retries and (
(
isinstance(e, urllib.error.HTTPError)
and (500 <= e.code < 600 or e.code == 429)
)
or (
isinstance(e, urllib.error.URLError) and isinstance(e.reason, TimeoutError)
)
or isinstance(e, TimeoutError)
):
# Exponential backoff
sleep(2**i)
continue

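A usage sketch, assuming default_retry as defined above is importable (the module path, registry URL, and timeout are guesses, not from the diff): transient 5xx/429 responses, URLErrors wrapping timeouts, and bare TimeoutErrors are retried with exponential backoff, while anything else propagates immediately.

import urllib.request

from spack.oci.opener import default_retry

# Each failed attempt sleeps 2**i seconds before retrying, up to 5 attempts.
fetch = default_retry(lambda url: urllib.request.urlopen(url, timeout=10))
response = fetch("https://registry.example.com/v2/")
print(response.status)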
View File

@@ -39,6 +39,7 @@
)
from spack.build_systems.cargo import CargoPackage
from spack.build_systems.cmake import CMakePackage, generator
from spack.build_systems.compiler import CompilerPackage
from spack.build_systems.cuda import CudaPackage
from spack.build_systems.generic import Package
from spack.build_systems.gnu import GNUMirrorPackage

View File

@@ -468,7 +468,41 @@ def _names(when_indexed_dictionary):
return sorted(all_names)
class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
class RedistributionMixin:
"""Logic for determining whether a Package is source/binary
redistributable.
"""
#: Store whether a given Spec source/binary should not be
#: redistributed.
disable_redistribute: Dict["spack.spec.Spec", "spack.directives.DisableRedistribute"]
# Source redistribution must be determined before concretization
# (because source mirrors work with un-concretized Specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror.
"""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an
installed instance of this package.
"""
for when_spec, disable_redistribute in self.__class__.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
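
A hedged sketch of how a mirror or buildcache workflow might consult these hooks; the candidate lists and helper names are hypothetical, only redistribute_source/redistribute_binary come from the mixin above.

import spack.repo

def redistributable_sources(candidate_specs):
    """Yield only the specs whose package allows adding its source to a mirror."""
    for spec in candidate_specs:
        pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
        if pkg_cls.redistribute_source(spec):
            yield spec

def redistributable_binaries(installed_specs):
    """Yield only the installed specs whose package allows pushing a binary."""
    for spec in installed_specs:
        if spec.package.redistribute_binary:
            yield spec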
class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass=PackageMeta):
"""This is the superclass for all spack packages.
***The Package class***
@@ -1206,7 +1240,7 @@ def install_test_root(self):
"""Return the install test root directory."""
tty.warn(
"The 'pkg.install_test_root' property is deprecated with removal "
"expected v0.22. Use 'install_test_root(pkg)' instead."
"expected v0.23. Use 'install_test_root(pkg)' instead."
)
return install_test_root(self)
@@ -1864,7 +1898,7 @@ def cache_extra_test_sources(self, srcs):
"""
msg = (
"'pkg.cache_extra_test_sources(srcs) is deprecated with removal "
"expected in v0.22. Use 'cache_extra_test_sources(pkg, srcs)' "
"expected in v0.23. Use 'cache_extra_test_sources(pkg, srcs)' "
"instead."
)
warnings.warn(msg)
@@ -2521,7 +2555,12 @@ class PackageStillNeededError(InstallError):
"""Raised when package is still needed by another on uninstall."""
def __init__(self, spec, dependents):
super().__init__("Cannot uninstall %s" % spec)
spec_fmt = spack.spec.DEFAULT_FORMAT + " /{hash:7}"
dep_fmt = "{name}{@versions} /{hash:7}"
super().__init__(
f"Cannot uninstall {spec.format(spec_fmt)}, "
f"needed by {[dep.format(dep_fmt) for dep in dependents]}"
)
self.spec = spec
self.dependents = dependents

View File

@@ -16,7 +16,7 @@
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
from llnl.util.symlink import symlink
from llnl.util.symlink import readlink, symlink
import spack.paths
import spack.platforms
@@ -25,6 +25,7 @@
import spack.store
import spack.util.elf as elf
import spack.util.executable as executable
import spack.util.path
from .relocate_text import BinaryFilePrefixReplacer, TextFilePrefixReplacer
@@ -613,7 +614,7 @@ def relocate_links(links, prefix_to_prefix):
"""Relocate links to a new install prefix."""
regex = re.compile("|".join(re.escape(p) for p in prefix_to_prefix.keys()))
for link in links:
old_target = os.readlink(link)
old_target = readlink(link)
match = regex.match(old_target)
# No match.

View File

@@ -726,14 +726,14 @@ def first_repo(self):
"""Get the first repo in precedence order."""
return self.repos[0] if self.repos else None
@llnl.util.lang.memoized
def _all_package_names_set(self, include_virtuals):
return {name for repo in self.repos for name in repo.all_package_names(include_virtuals)}
@llnl.util.lang.memoized
def _all_package_names(self, include_virtuals):
"""Return all unique package names in all repositories."""
all_pkgs = set()
for repo in self.repos:
for name in repo.all_package_names(include_virtuals):
all_pkgs.add(name)
return sorted(all_pkgs, key=lambda n: n.lower())
return sorted(self._all_package_names_set(include_virtuals), key=lambda n: n.lower())
def all_package_names(self, include_virtuals=False):
return self._all_package_names(include_virtuals)
@@ -794,7 +794,11 @@ def patch_index(self):
@autospec
def providers_for(self, vpkg_spec):
providers = self.provider_index.providers_for(vpkg_spec)
providers = [
spec
for spec in self.provider_index.providers_for(vpkg_spec)
if spec.name in self._all_package_names_set(include_virtuals=False)
]
if not providers:
raise UnknownPackageError(vpkg_spec.fullname)
return providers

View File

@@ -27,7 +27,7 @@
from spack.error import SpackError
from spack.util.crypto import checksum
from spack.util.log_parse import parse_log_events
from spack.util.web import urllib_ssl_cert_handler
from spack.util.web import ssl_create_default_context
from .base import Reporter
from .extract import extract_test_parts
@@ -428,7 +428,7 @@ def upload(self, filename):
# Compute md5 checksum for the contents of this file.
md5sum = checksum(hashlib.md5, filename, block_size=8192)
opener = build_opener(HTTPSHandler(context=urllib_ssl_cert_handler()))
opener = build_opener(HTTPSHandler(context=ssl_create_default_context()))
with open(filename, "rb") as f:
params_dict = {
"build": self.buildname,

View File

@@ -9,13 +9,40 @@
"""
from typing import Any, Dict
LIST_OF_SPECS = {"type": "array", "items": {"type": "string"}}
properties: Dict[str, Any] = {
"concretizer": {
"type": "object",
"additionalProperties": False,
"properties": {
"reuse": {
"oneOf": [{"type": "boolean"}, {"type": "string", "enum": ["dependencies"]}]
"oneOf": [
{"type": "boolean"},
{"type": "string", "enum": ["dependencies"]},
{
"type": "object",
"properties": {
"roots": {"type": "boolean"},
"include": LIST_OF_SPECS,
"exclude": LIST_OF_SPECS,
"from": {
"type": "array",
"items": {
"type": "object",
"properties": {
"type": {
"type": "string",
"enum": ["local", "buildcache", "external"],
},
"include": LIST_OF_SPECS,
"exclude": LIST_OF_SPECS,
},
},
},
},
},
]
},
"enable_node_namespace": {"type": "boolean"},
"targets": {

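Read as configuration, the extended schema admits entries along these lines: a hypothetical concretizer.yaml fragment, written here as the equivalent Python data the validator would accept (the package names are invented).

# Reuse installed specs and one buildcache, never reuse openssl,
# and only treat roots as reusable when explicitly requested.
concretizer_config = {
    "concretizer": {
        "reuse": {
            "roots": True,
            "exclude": ["openssl"],
            "from": [
                {"type": "local"},
                {"type": "buildcache", "include": ["gcc-runtime", "zlib-ng"]},
            ],
        }
    }
}
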
View File

@@ -35,6 +35,7 @@
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spec_list_schema,
"include_concrete": {"type": "array", "default": [], "items": {"type": "string"}},
},
),
}

View File

@@ -141,7 +141,7 @@
"deprecatedProperties": {
"properties": ["version"],
"message": "setting version preferences in the 'all' section of packages.yaml "
"is deprecated and will be removed in v0.22\n\n\tThese preferences "
"is deprecated and will be removed in v0.23\n\n\tThese preferences "
"will be ignored by Spack. You can set them only in package-specific sections "
"of the same file.\n",
"error": False,
@@ -197,7 +197,7 @@
"properties": ["target", "compiler", "providers"],
"message": "setting 'compiler:', 'target:' or 'provider:' preferences in "
"a package-specific section of packages.yaml is deprecated, and will be "
"removed in v0.22.\n\n\tThese preferences will be ignored by Spack, and "
"removed in v0.23.\n\n\tThese preferences will be ignored by Spack, and "
"can be set only in the 'all' section of the same file. "
"You can run:\n\n\t\t$ spack audit configs\n\n\tto get better diagnostics, "
"including files:lines where the deprecated attributes are used.\n\n"

View File

@@ -6,6 +6,7 @@
import collections.abc
import copy
import enum
import functools
import itertools
import os
import pathlib
@@ -41,6 +42,8 @@
import spack.spec
import spack.store
import spack.util.crypto
import spack.util.elf
import spack.util.libc
import spack.util.path
import spack.util.timer
import spack.variant
@@ -283,6 +286,34 @@ def all_compilers_in_config(configuration):
return spack.compilers.all_compilers_from(configuration)
def all_libcs() -> Set[spack.spec.Spec]:
"""Return a set of all libc specs targeted by any configured compiler. If none, fall back to
libc determined from the current Python process if dynamically linked."""
libcs = {
c.default_libc for c in all_compilers_in_config(spack.config.CONFIG) if c.default_libc
}
if libcs:
return libcs
libc = spack.util.libc.libc_from_current_python_process()
return {libc} if libc else set()
def libc_is_compatible(lhs: spack.spec.Spec, rhs: spack.spec.Spec) -> List[spack.spec.Spec]:
return (
lhs.name == rhs.name
and lhs.external_path == rhs.external_path
and lhs.version >= rhs.version
)
def using_libc_compatibility() -> bool:
"""Returns True if we are currently using libc compatibility"""
return spack.platforms.host().name == "linux"
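
A rough illustration of the compatibility rule, assuming the helpers above are in scope (the version numbers are made up and both specs keep the default, unset external path; in real solves the prefixes come from compiler detection): a host libc at least as new as the one a reused binary was built against is acceptable, but not the reverse.

import spack.spec

newer = spack.spec.Spec("glibc@=2.35")  # libc available on the host
older = spack.spec.Spec("glibc@=2.31")  # libc a reused binary was built against

# Same name, same (unset) prefix, and 2.35 >= 2.31 -> compatible
assert libc_is_compatible(newer, older)
# A host libc older than what the binary needs is rejected
assert not libc_is_compatible(older, newer)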
def extend_flag_list(flag_list, new_flags):
"""Extend a list of flags, preserving order and precedence.
@@ -566,6 +597,23 @@ def _spec_with_default_name(spec_str, name):
return spec
def _external_config_with_implicit_externals(configuration):
# Read packages.yaml and normalize it, so that it will not contain entries referring to
# virtual packages.
packages_yaml = _normalize_packages_yaml(configuration.get("packages"))
# Add externals for libc from compilers on Linux
if not using_libc_compatibility():
return packages_yaml
for compiler in all_compilers_in_config(configuration):
libc = compiler.default_libc
if libc:
entry = {"spec": f"{libc} %{compiler.spec}", "prefix": libc.external_path}
packages_yaml.setdefault(libc.name, {}).setdefault("externals", []).append(entry)
return packages_yaml
class ErrorHandler:
def __init__(self, model):
self.model = model
@@ -761,12 +809,22 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
A tuple of the solve result, the timer for the different phases of the
solve, and the internal statistics from clingo.
"""
# avoid circular import
import spack.bootstrap
output = output or DEFAULT_OUTPUT_CONFIGURATION
timer = spack.util.timer.Timer()
# Initialize the control object for the solver
self.control = control or default_clingo_control()
# ensure core deps are present on Windows
# needs to modify active config scope, so cannot be run within
# bootstrap config scope
if sys.platform == "win32":
tty.debug("Ensuring basic dependencies {win-sdk, wgl} available")
spack.bootstrap.core.ensure_winsdk_external_or_raise()
timer.start("setup")
asp_problem = setup.setup(specs, reuse=reuse, allow_deprecated=allow_deprecated)
if output.out is not None:
@@ -784,10 +842,16 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
self.control.load(os.path.join(parent_dir, "heuristic_separate.lp"))
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
# Binary compatibility is based on libc on Linux, and on the os tag elsewhere
if using_libc_compatibility():
self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
else:
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
timer.stop("load")
# Grounding is the first step in the solve -- it turns our facts
@@ -880,14 +944,26 @@ class ConcreteSpecsByHash(collections.abc.Mapping):
def __init__(self) -> None:
self.data: Dict[str, spack.spec.Spec] = {}
self.explicit: Set[str] = set()
def __getitem__(self, dag_hash: str) -> spack.spec.Spec:
return self.data[dag_hash]
def explicit_items(self) -> Iterator[Tuple[str, spack.spec.Spec]]:
"""Iterate on items that have been added explicitly, and not just as a dependency
of other nodes.
"""
for h, s in self.items():
# We need to make an exception for gcc-runtime, until we can splice it.
if h in self.explicit or s.name == "gcc-runtime":
yield h, s
def add(self, spec: spack.spec.Spec) -> bool:
"""Adds a new concrete spec to the mapping. Returns True if the spec was just added,
False if the spec was already in the mapping.
Calling this function marks the spec as added explicitly.
Args:
spec: spec to be added
@@ -902,6 +978,7 @@ def add(self, spec: spack.spec.Spec) -> bool:
raise ValueError(msg)
dag_hash = spec.dag_hash()
self.explicit.add(dag_hash)
if dag_hash in self.data:
return False
@@ -984,6 +1061,9 @@ def __init__(self, tests: bool = False):
self.pkgs: Set[str] = set()
self.explicitly_required_namespaces: Dict[str, str] = {}
# list of unique libc specs targeted by compilers (or an educated guess if no compiler)
self.libcs: List[spack.spec.Spec] = []
def pkg_version_rules(self, pkg):
"""Output declared versions of a package.
@@ -1165,6 +1245,9 @@ def pkg_rules(self, pkg, tests):
def trigger_rules(self):
"""Flushes all the trigger rules collected so far, and clears the cache."""
if not self._trigger_cache:
return
self.gen.h2("Trigger conditions")
for name in self._trigger_cache:
cache = self._trigger_cache[name]
@@ -1178,6 +1261,9 @@ def trigger_rules(self):
def effect_rules(self):
"""Flushes all the effect rules collected so far, and clears the cache."""
if not self._effect_cache:
return
self.gen.h2("Imposed requirements")
for name in self._effect_cache:
cache = self._effect_cache[name]
@@ -1340,7 +1426,6 @@ def condition(
raise ValueError(f"Must provide a name for anonymous condition: '{required_spec}'")
with spec_with_name(required_spec, name):
# Check if we can emit the requirements before updating the condition ID counter.
# In this way, if a condition can't be emitted but the exception is handled in the
# caller, we won't emit partial facts.
@@ -1554,14 +1639,35 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
requirement_weight += 1
def external_packages(self):
"""Facts on external packages, as read from packages.yaml"""
# Read packages.yaml and normalize it, so that it
# will not contain entries referring to virtual
# packages.
packages_yaml = spack.config.get("packages")
packages_yaml = _normalize_packages_yaml(packages_yaml)
"""Facts on external packages, from packages.yaml and implicit externals."""
packages_yaml = _external_config_with_implicit_externals(spack.config.CONFIG)
self.gen.h1("External packages")
spec_filters = []
concretizer_yaml = spack.config.get("concretizer")
reuse_yaml = concretizer_yaml.get("reuse")
if isinstance(reuse_yaml, typing.Mapping):
default_include = reuse_yaml.get("include", [])
default_exclude = reuse_yaml.get("exclude", [])
libc_externals = list(all_libcs())
for source in reuse_yaml.get("from", []):
if source["type"] != "external":
continue
include = source.get("include", default_include)
if include:
# Since libcs are implicit externals, we need to implicitly include them
include = include + libc_externals
exclude = source.get("exclude", default_exclude)
spec_filters.append(
SpecFilter(
factory=lambda: [],
is_usable=lambda x: True,
include=include,
exclude=exclude,
)
)
for pkg_name, data in packages_yaml.items():
if pkg_name == "all":
continue
@@ -1570,7 +1676,6 @@ def external_packages(self):
if pkg_name not in spack.repo.PATH:
continue
self.gen.h2("External package: {0}".format(pkg_name))
# Check if the external package is buildable. If it is
# not then "external(<pkg>)" is a fact, unless we can
# reuse an already installed spec.
@@ -1580,7 +1685,17 @@ def external_packages(self):
# Read a list of all the specs for this package
externals = data.get("externals", [])
external_specs = [spack.spec.parse_with_version_concrete(x["spec"]) for x in externals]
candidate_specs = [
spack.spec.parse_with_version_concrete(x["spec"]) for x in externals
]
external_specs = []
if spec_filters:
for current_filter in spec_filters:
current_filter.factory = lambda: candidate_specs
external_specs.extend(current_filter.selected_specs())
else:
external_specs.extend(candidate_specs)
# Order the external versions to prefer more recent versions
# even if specs in packages.yaml are not ordered that way
@@ -1610,6 +1725,7 @@ def external_imposition(input_spec, requirements):
self.gen.newline()
self.trigger_rules()
self.effect_rules()
def preferred_variants(self, pkg_name):
"""Facts on concretization preferences, as read from packages.yaml"""
@@ -1831,6 +1947,16 @@ def _spec_clauses(
if dep.name == "gcc-runtime":
continue
# libc is also solved again by clingo, but in this case the compatibility
# is not encoded in the parent node - so we need to emit explicit facts
if "libc" in dspec.virtuals:
for libc in self.libcs:
if libc_is_compatible(libc, dep):
clauses.append(
fn.attr("compatible_libc", spec.name, libc.name, libc.version)
)
continue
# We know dependencies are real for concrete specs. For abstract
# specs they just mean the dep is somehow in the DAG.
for dtype in dt.ALL_FLAGS:
@@ -1963,7 +2089,7 @@ def _supported_targets(self, compiler_name, compiler_version, targets):
try:
with warnings.catch_warnings():
warnings.simplefilter("ignore")
target.optimization_flags(compiler_name, compiler_version)
target.optimization_flags(compiler_name, str(compiler_version))
supported.append(target)
except archspec.cpu.UnsupportedMicroarchitecture:
continue
@@ -2240,7 +2366,7 @@ def register_concrete_spec(self, spec, possible):
def concrete_specs(self):
"""Emit facts for reusable specs"""
for h, spec in self.reusable_and_possible.items():
for h, spec in self.reusable_and_possible.explicit_items():
# this indicates that there is a spec like this installed
self.gen.fact(fn.installed_hash(spec.name, h))
# this describes what constraints it imposes on the solve
@@ -2286,6 +2412,7 @@ def setup(
node_counter = _create_counter(specs, tests=self.tests)
self.possible_virtuals = node_counter.possible_virtuals()
self.pkgs = node_counter.possible_dependencies()
self.libcs = sorted(all_libcs()) # type: ignore[type-var]
# Fail if we already know an unreachable node is requested
for spec in specs:
@@ -2295,13 +2422,17 @@ def setup(
if missing_deps:
raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
for node in spack.traverse.traverse_nodes(specs):
for node in traverse.traverse_nodes(specs):
if node.namespace is not None:
self.explicitly_required_namespaces[node.name] = node.namespace
self.gen = ProblemInstanceBuilder()
compiler_parser = CompilerParser(configuration=spack.config.CONFIG).with_input_specs(specs)
if using_libc_compatibility():
for libc in self.libcs:
self.gen.fact(fn.allowed_libc(libc.name, libc.version))
if not allow_deprecated:
self.gen.fact(fn.deprecated_versions_not_allowed())
@@ -2431,18 +2562,42 @@ def visit(node):
def define_runtime_constraints(self):
"""Define the constraints to be imposed on the runtimes"""
recorder = RuntimePropertyRecorder(self)
# TODO: Use only available compilers ?
for compiler in self.possible_compilers:
compiler_with_different_cls_names = {"oneapi": "intel-oneapi-compilers"}
compiler_with_different_cls_names = {
"oneapi": "intel-oneapi-compilers",
"clang": "llvm",
}
compiler_cls_name = compiler_with_different_cls_names.get(
compiler.spec.name, compiler.spec.name
)
try:
compiler_cls = spack.repo.PATH.get_pkg_class(compiler_cls_name)
if hasattr(compiler_cls, "runtime_constraints"):
compiler_cls.runtime_constraints(spec=compiler.spec, pkg=recorder)
except spack.repo.UnknownPackageError:
pass
# Inject libc from available compilers, on Linux
if not compiler.available:
continue
if hasattr(compiler_cls, "runtime_constraints"):
compiler_cls.runtime_constraints(spec=compiler.spec, pkg=recorder)
current_libc = compiler.compiler_obj.default_libc
# If this is a compiler yet to be built (config:install_missing_compilers:true)
# infer libc from the Python process
if not current_libc and compiler.compiler_obj.cc is None:
current_libc = spack.util.libc.libc_from_current_python_process()
if using_libc_compatibility() and current_libc:
recorder("*").depends_on(
"libc", when=f"%{compiler.spec}", type="link", description="Add libc"
)
recorder("*").depends_on(
str(current_libc),
when=f"%{compiler.spec}",
type="link",
description="Add libc",
)
recorder.consume_facts()
@@ -2820,6 +2975,13 @@ class CompilerParser:
def __init__(self, configuration) -> None:
self.compilers: Set[KnownCompiler] = set()
for c in all_compilers_in_config(configuration):
if using_libc_compatibility() and not c.default_libc:
warnings.warn(
f"cannot detect libc from {c.spec}. The compiler will not be used "
f"during concretization."
)
continue
target = c.target if c.target != "any" else None
candidate = KnownCompiler(
spec=c.spec, os=c.operating_system, target=target, available=True, compiler_obj=c
@@ -3088,13 +3250,16 @@ class SpecBuilder:
r"^.*_propagate$",
r"^.*_satisfies$",
r"^.*_set$",
r"^compatible_libc$",
r"^dependency_holds$",
r"^external_conditions_hold$",
r"^node_compiler$",
r"^package_hash$",
r"^root$",
r"^track_dependencies$",
r"^variant_default_value_from_cli$",
r"^virtual_node$",
r"^virtual_on_incoming_edges$",
r"^virtual_root$",
]
)
@@ -3184,12 +3349,8 @@ def no_flags(self, node, flag_type):
self._specs[node].compiler_flags[flag_type] = []
def external_spec_selected(self, node, idx):
"""This means that the external spec and index idx
has been selected for this package.
"""
packages_yaml = spack.config.get("packages")
packages_yaml = _normalize_packages_yaml(packages_yaml)
"""This means that the external spec and index idx has been selected for this package."""
packages_yaml = _external_config_with_implicit_externals(spack.config.CONFIG)
spec_info = packages_yaml[node.pkg]["externals"][int(idx)]
self._specs[node].external_path = spec_info.get("prefix", None)
self._specs[node].external_modules = spack.spec.Spec._format_module_list(
@@ -3469,25 +3630,165 @@ def _has_runtime_dependencies(spec: spack.spec.Spec) -> bool:
return True
class SpecFilter:
"""Given a method to produce a list of specs, this class can filter them according to
different criteria.
"""
def __init__(
self,
factory: Callable[[], List[spack.spec.Spec]],
is_usable: Callable[[spack.spec.Spec], bool],
include: List[str],
exclude: List[str],
) -> None:
"""
Args:
factory: factory to produce a list of specs
is_usable: predicate that takes a spec in input and returns False if the spec
should not be considered for this filter, True otherwise.
include: if present, a "good" spec must match at least one entry in the list
exclude: if present, a "good" spec must not match any entry in the list
"""
self.factory = factory
self.is_usable = is_usable
self.include = include
self.exclude = exclude
def is_selected(self, s: spack.spec.Spec) -> bool:
if not self.is_usable(s):
return False
if self.include and not any(s.satisfies(c) for c in self.include):
return False
if self.exclude and any(s.satisfies(c) for c in self.exclude):
return False
return True
def selected_specs(self) -> List[spack.spec.Spec]:
return [s for s in self.factory() if self.is_selected(s)]
@staticmethod
def from_store(configuration, include, exclude) -> "SpecFilter":
"""Constructs a filter that takes the specs from the current store."""
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=True)
factory = functools.partial(_specs_from_store, configuration=configuration)
return SpecFilter(factory=factory, is_usable=is_reusable, include=include, exclude=exclude)
@staticmethod
def from_buildcache(configuration, include, exclude) -> "SpecFilter":
"""Constructs a filter that takes the specs from the configured buildcaches."""
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=False)
return SpecFilter(
factory=_specs_from_mirror, is_usable=is_reusable, include=include, exclude=exclude
)
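
A small, self-contained sketch of the filter semantics (the spec strings are invented; real callers build filters through from_store/from_buildcache as above): an empty include list accepts anything, while exclude drops every spec satisfying one of its entries.

import spack.spec

candidates = [spack.spec.Spec(s) for s in ("zlib-ng@2.1.6", "openssl@3.1", "cmake@3.27")]

keep_open = SpecFilter(
    factory=lambda: candidates,  # where the specs come from
    is_usable=lambda s: True,    # no extra usability check for this sketch
    include=[],                  # empty include means "anything"
    exclude=["openssl"],         # drop anything satisfying openssl
)
assert [s.name for s in keep_open.selected_specs()] == ["zlib-ng", "cmake"]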
def _specs_from_store(configuration):
store = spack.store.create(configuration)
with store.db.read_transaction():
return store.db.query(installed=True)
def _specs_from_mirror():
try:
return spack.binary_distribution.update_cache_and_get_specs()
except (spack.binary_distribution.FetchCacheError, IndexError):
# this is raised when no mirrors had indices.
# TODO: update mirror configuration so it can indicate that the
# TODO: source cache (or any mirror really) doesn't have binaries.
return []
class ReuseStrategy(enum.Enum):
ROOTS = enum.auto()
DEPENDENCIES = enum.auto()
NONE = enum.auto()
class ReusableSpecsSelector:
"""Selects specs that can be reused during concretization."""
def __init__(self, configuration: spack.config.Configuration) -> None:
self.configuration = configuration
self.store = spack.store.create(configuration)
self.reuse_strategy = ReuseStrategy.ROOTS
reuse_yaml = self.configuration.get("concretizer:reuse", False)
self.reuse_sources = []
if not isinstance(reuse_yaml, typing.Mapping):
if reuse_yaml is False:
self.reuse_strategy = ReuseStrategy.NONE
if reuse_yaml == "dependencies":
self.reuse_strategy = ReuseStrategy.DEPENDENCIES
self.reuse_sources.extend(
[
SpecFilter.from_store(
configuration=self.configuration, include=[], exclude=[]
),
SpecFilter.from_buildcache(
configuration=self.configuration, include=[], exclude=[]
),
]
)
else:
roots = reuse_yaml.get("roots", True)
if roots is True:
self.reuse_strategy = ReuseStrategy.ROOTS
else:
self.reuse_strategy = ReuseStrategy.DEPENDENCIES
default_include = reuse_yaml.get("include", [])
default_exclude = reuse_yaml.get("exclude", [])
default_sources = [{"type": "local"}, {"type": "buildcache"}]
for source in reuse_yaml.get("from", default_sources):
include = source.get("include", default_include)
exclude = source.get("exclude", default_exclude)
if source["type"] == "local":
self.reuse_sources.append(
SpecFilter.from_store(self.configuration, include=include, exclude=exclude)
)
elif source["type"] == "buildcache":
self.reuse_sources.append(
SpecFilter.from_buildcache(
self.configuration, include=include, exclude=exclude
)
)
def reusable_specs(self, specs: List[spack.spec.Spec]) -> List[spack.spec.Spec]:
if self.reuse_strategy == ReuseStrategy.NONE:
return []
result = []
for reuse_source in self.reuse_sources:
result.extend(reuse_source.selected_specs())
# If we only want to reuse dependencies, remove the root specs
if self.reuse_strategy == ReuseStrategy.DEPENDENCIES:
result = [spec for spec in result if not any(root in spec for root in specs)]
return result
class Solver:
"""This is the main external interface class for solving.
It manages solver configuration and preferences in one place. It sets up the solve
and passes the setup method to the driver, as well.
Properties of interest:
``reuse (bool)``
Whether to try to reuse existing installs/binaries
"""
def __init__(self):
self.driver = PyclingoDriver()
# These properties are settable via spack configuration, and overridable
# by setting them directly as properties.
self.reuse = spack.config.get("concretizer:reuse", False)
self.selector = ReusableSpecsSelector(configuration=spack.config.CONFIG)
if spack.platforms.host().name == "cray":
msg = (
"The Cray platform, i.e. 'platform=cray', will be removed in Spack v0.23. "
"All Cray machines will be then detected as 'platform=linux'."
)
warnings.warn(msg)
@staticmethod
def _check_input_and_extract_concrete_specs(specs):
@@ -3501,39 +3802,6 @@ def _check_input_and_extract_concrete_specs(specs):
spack.spec.Spec.ensure_valid_variants(s)
return reusable
def _reusable_specs(self, specs):
reusable_specs = []
if self.reuse:
packages = spack.config.get("packages")
# Specs from the local Database
with spack.store.STORE.db.read_transaction():
reusable_specs.extend(
s
for s in spack.store.STORE.db.query(installed=True)
if _is_reusable(s, packages, local=True)
)
# Specs from buildcaches
try:
reusable_specs.extend(
s
for s in spack.binary_distribution.update_cache_and_get_specs()
if _is_reusable(s, packages, local=False)
)
except (spack.binary_distribution.FetchCacheError, IndexError):
# this is raised when no mirrors had indices.
# TODO: update mirror configuration so it can indicate that the
# TODO: source cache (or any mirror really) doesn't have binaries.
pass
# If we only want to reuse dependencies, remove the root specs
if self.reuse == "dependencies":
reusable_specs = [
spec for spec in reusable_specs if not any(root in spec for root in specs)
]
return reusable_specs
def solve(
self,
specs,
@@ -3559,7 +3827,7 @@ def solve(
# Check upfront that the variants are admissible
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs(specs))
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)
result, _, _ = self.driver.solve(
@@ -3588,7 +3856,7 @@ def solve_in_rounds(
"""
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs(specs))
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
# Tell clingo that we don't have to solve all the inputs at once

View File

@@ -993,14 +993,13 @@ pkg_fact(Package, variant_single_value("dev_path"))
% Platform semantics
%-----------------------------------------------------------------------------
% if no platform is set, fall back to the default
error(100, "platform '{0}' is not allowed on the current host", Platform)
:- attr("node_platform", _, Platform), not allowed_platform(Platform).
% NOTE: Currently we have a single allowed platform per DAG, therefore there is no
% need to have additional optimization criteria. If we ever add cross-platform dags,
% this needs to be changed.
:- 2 { allowed_platform(Platform) }, internal_error("More than one allowed platform detected").
attr("node_platform", PackageNode, Platform)
:- attr("node", PackageNode),
not attr("node_platform_set", PackageNode),
node_platform_default(Platform).
1 { attr("node_platform", PackageNode, Platform) : allowed_platform(Platform) } 1
:- attr("node", PackageNode).
% setting platform on a node is a hard constraint
attr("node_platform", PackageNode, Platform)
@@ -1024,14 +1023,6 @@ error(100, "Cannot select '{0} os={1}' (operating system '{1}' is not buildable)
attr("node_os", node(X, Package), OS),
not buildable_os(OS).
% can't have dependencies on incompatible OS's
error(100, "{0} and dependency {1} have incompatible operating systems 'os={2}' and 'os={3}'", Package, Dependency, PackageNodeOS, DependencyOS)
:- depends_on(node(X, Package), node(Y, Dependency)),
attr("node_os", node(X, Package), PackageNodeOS),
attr("node_os", node(Y, Dependency), DependencyOS),
not os_compatible(PackageNodeOS, DependencyOS),
build(node(X, Package)).
% give OS choice weights according to os declarations
node_os_weight(PackageNode, Weight)
:- attr("node", PackageNode),
@@ -1044,13 +1035,6 @@ os_compatible(OS, OS) :- os(OS).
% Transitive compatibility among operating systems
os_compatible(OS1, OS3) :- os_compatible(OS1, OS2), os_compatible(OS2, OS3).
% We can select only operating systems compatible with the ones
% for which we can build software. We need a cardinality constraint
% since we might have more than one "buildable_os(OS)" fact.
:- not 1 { os_compatible(CurrentOS, ReusedOS) : buildable_os(CurrentOS) },
attr("node_os", Package, ReusedOS),
internal_error("Reused OS incompatible with build OS").
% If an OS is set explicitly respect the value
attr("node_os", PackageNode, OS) :- attr("node_os_set", PackageNode, OS), attr("node", PackageNode).
@@ -1098,6 +1082,9 @@ error(100, "{0} compiler '{2}@{3}' incompatible with 'target={1}'", Package, Tar
compiler_version(CompilerID, Version),
build(node(X, Package)).
#defined compiler_supports_target/2.
#defined compiler_available/1.
% if a target is set explicitly, respect it
attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).
@@ -1437,6 +1424,7 @@ opt_criterion(73, "deprecated versions used").
#minimize{
1@73+Priority,PackageNode
: attr("deprecated", PackageNode, _),
not external(PackageNode),
build_priority(PackageNode, Priority)
}.

View File

@@ -195,7 +195,7 @@ def _bootstrap_clingo() -> ModuleType:
import spack.bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
clingo_mod = importlib.import_module("clingo")
return clingo_mod

View File

@@ -0,0 +1,37 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Libc compatibility rules for reusing solves.
%
% These rules are used on Linux
%=============================================================================
% A package cannot be reused if the libc is not compatible with it
:- provider(node(X, LibcPackage), node(0, "libc")),
attr("version", node(X, LibcPackage), LibcVersion),
attr("hash", node(R, ReusedPackage), Hash),
% Libc packages can be reused without the "compatible_libc" attribute
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% A libc is needed in the DAG
:- has_built_packages(), not provider(_, node(0, "libc")).
% The libc must be chosen among available ones
:- has_built_packages(),
provider(node(X, LibcPackage), node(0, "libc")),
attr("node", node(X, LibcPackage)),
attr("version", node(X, LibcPackage), LibcVersion),
not allowed_libc(LibcPackage, LibcVersion).
% A built node must depend on libc
:- build(PackageNode),
provider(LibcNode, node(0, "libc")),
not external(PackageNode),
not depends_on(PackageNode, LibcNode).

View File

@@ -7,21 +7,26 @@
% OS compatibility rules for reusing solves.
% os_compatible(RecentOS, OlderOS)
% OlderOS binaries can be used on RecentOS
%
% These rules are used on every platform but Linux
%=============================================================================
% macOS
os_compatible("sonoma", "ventura").
os_compatible("ventura", "monterey").
os_compatible("monterey", "bigsur").
os_compatible("bigsur", "catalina").
% Ubuntu
os_compatible("ubuntu22.04", "ubuntu21.10").
os_compatible("ubuntu21.10", "ubuntu21.04").
os_compatible("ubuntu21.04", "ubuntu20.10").
os_compatible("ubuntu20.10", "ubuntu20.04").
os_compatible("ubuntu20.04", "ubuntu19.10").
os_compatible("ubuntu19.10", "ubuntu19.04").
os_compatible("ubuntu19.04", "ubuntu18.10").
os_compatible("ubuntu18.10", "ubuntu18.04").
% can't have dependencies on incompatible OS's
error(100, "{0} and dependency {1} have incompatible operating systems 'os={2}' and 'os={3}'", Package, Dependency, PackageNodeOS, DependencyOS)
:- depends_on(node(X, Package), node(Y, Dependency)),
attr("node_os", node(X, Package), PackageNodeOS),
attr("node_os", node(Y, Dependency), DependencyOS),
not os_compatible(PackageNodeOS, DependencyOS),
build(node(X, Package)).
%EL8
os_compatible("rhel8", "rocky8").
% We can select only operating systems compatible with the ones
% for which we can build software. We need a cardinality constraint
% since we might have more than one "buildable_os(OS)" fact.
:- not 1 { os_compatible(CurrentOS, ReusedOS) : buildable_os(CurrentOS) },
attr("node_os", Package, ReusedOS).

View File

@@ -51,7 +51,6 @@
import collections
import collections.abc
import enum
import io
import itertools
import os
import pathlib
@@ -59,7 +58,7 @@
import re
import socket
import warnings
from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Union
from typing import Any, Callable, Dict, List, Match, Optional, Set, Tuple, Union
import llnl.path
import llnl.string
@@ -121,36 +120,44 @@
"SpecDeprecatedError",
]
SPEC_FORMAT_RE = re.compile(
r"(?:" # this is one big or, with matches ordered by priority
# OPTION 1: escaped character (needs to be first to catch opening \{)
# Note that an unterminated \ at the end of a string is left untouched
r"(?:\\(.))"
r"|" # or
# OPTION 2: an actual format string
r"{" # non-escaped open brace {
r"([%@/]|arch=)?" # optional sigil (to print sigil in color)
r"(?:\^([^}\.]+)\.)?" # optional ^depname. (to get attr from dependency)
# after the sigil or depname, we can have a hash expression or another attribute
r"(?:" # one of
r"(hash\b)(?:\:(\d+))?" # hash followed by :<optional length>
r"|" # or
r"([^}]*)" # another attribute to format
r")" # end one of
r"(})?" # finish format string with non-escaped close brace }, or missing if not present
r"|"
# OPTION 3: mismatched close brace (option 2 would consume a matched open brace)
r"(})" # brace
r")",
re.IGNORECASE,
)
#: Valid pattern for an identifier in Spack
IDENTIFIER_RE = r"\w[\w-]*"
# Coloring of specs when using color output. Fields are printed with
# different colors to enhance readability.
# See llnl.util.tty.color for descriptions of the color codes.
COMPILER_COLOR = "@g" #: color for highlighting compilers
VERSION_COLOR = "@c" #: color for highlighting versions
ARCHITECTURE_COLOR = "@m" #: color for highlighting architectures
ENABLED_VARIANT_COLOR = "@B" #: color for highlighting enabled variants
DISABLED_VARIANT_COLOR = "r" #: color for highlighting disabled variants
DEPENDENCY_COLOR = "@." #: color for highlighting dependencies
VARIANT_COLOR = "@B" #: color for highlighting variants
HASH_COLOR = "@K" #: color for highlighting package hashes
#: This map determines the coloring of specs when using color output.
#: We make the fields different colors to enhance readability.
#: See llnl.util.tty.color for descriptions of the color codes.
COLOR_FORMATS = {
"%": COMPILER_COLOR,
"@": VERSION_COLOR,
"=": ARCHITECTURE_COLOR,
"+": ENABLED_VARIANT_COLOR,
"~": DISABLED_VARIANT_COLOR,
"^": DEPENDENCY_COLOR,
"#": HASH_COLOR,
}
#: Regex used for splitting by spec field separators.
#: These need to be escaped to avoid metacharacters in
#: ``COLOR_FORMATS.keys()``.
_SEPARATORS = "[\\%s]" % "\\".join(COLOR_FORMATS.keys())
#: Default format for Spec.format(). This format can be round-tripped, so that:
#: Spec(Spec("string").format()) == Spec("string")
DEFAULT_FORMAT = (
@@ -193,26 +200,7 @@ class InstallStatus(enum.Enum):
missing = "@r{[-]} "
def colorize_spec(spec):
"""Returns a spec colorized according to the colors specified in
COLOR_FORMATS."""
class insert_color:
def __init__(self):
self.last = None
def __call__(self, match):
# ignore compiler versions (color same as compiler)
sep = match.group(0)
if self.last == "%" and sep == "@":
return clr.cescape(sep)
self.last = sep
return "%s%s" % (COLOR_FORMATS[sep], clr.cescape(sep))
return clr.colorize(re.sub(_SEPARATORS, insert_color(), str(spec)) + "@.")
# regexes used in spec formatting
OLD_STYLE_FMT_RE = re.compile(r"\${[A-Z]+}")
@@ -1042,16 +1030,13 @@ def clear(self):
self.edges.clear()
def _command_default_handler(descriptor, spec, cls):
def _command_default_handler(spec: "Spec"):
"""Default handler when looking for the 'command' attribute.
Tries to search for ``spec.name`` in the ``spec.home.bin`` directory.
Parameters:
descriptor (ForwardQueryToPackage): descriptor that triggered the call
spec (Spec): spec that is being queried
cls (type(spec)): type of spec, to match the signature of the
descriptor ``__get__`` method
spec: spec that is being queried
Returns:
Executable: An executable of the command
@@ -1064,22 +1049,17 @@ def _command_default_handler(descriptor, spec, cls):
if fs.is_exe(path):
return spack.util.executable.Executable(path)
else:
msg = "Unable to locate {0} command in {1}"
raise RuntimeError(msg.format(spec.name, home.bin))
raise RuntimeError(f"Unable to locate {spec.name} command in {home.bin}")
def _headers_default_handler(descriptor, spec, cls):
def _headers_default_handler(spec: "Spec"):
"""Default handler when looking for the 'headers' attribute.
Tries to search for ``*.h`` files recursively starting from
``spec.package.home.include``.
Parameters:
descriptor (ForwardQueryToPackage): descriptor that triggered the call
spec (Spec): spec that is being queried
cls (type(spec)): type of spec, to match the signature of the
descriptor ``__get__`` method
spec: spec that is being queried
Returns:
HeaderList: The headers in ``prefix.include``
@@ -1092,12 +1072,10 @@ def _headers_default_handler(descriptor, spec, cls):
if headers:
return headers
else:
msg = "Unable to locate {0} headers in {1}"
raise spack.error.NoHeadersError(msg.format(spec.name, home))
raise spack.error.NoHeadersError(f"Unable to locate {spec.name} headers in {home}")
def _libs_default_handler(descriptor, spec, cls):
def _libs_default_handler(spec: "Spec"):
"""Default handler when looking for the 'libs' attribute.
Tries to search for ``lib{spec.name}`` recursively starting from
@@ -1105,10 +1083,7 @@ def _libs_default_handler(descriptor, spec, cls):
``{spec.name}`` instead.
Parameters:
descriptor (ForwardQueryToPackage): descriptor that triggered the call
spec (Spec): spec that is being queried
cls (type(spec)): type of spec, to match the signature of the
descriptor ``__get__`` method
spec: spec that is being queried
Returns:
LibraryList: The libraries found
@@ -1147,27 +1122,33 @@ def _libs_default_handler(descriptor, spec, cls):
if libs:
return libs
msg = "Unable to recursively locate {0} libraries in {1}"
raise spack.error.NoLibrariesError(msg.format(spec.name, home))
raise spack.error.NoLibrariesError(
f"Unable to recursively locate {spec.name} libraries in {home}"
)
class ForwardQueryToPackage:
"""Descriptor used to forward queries from Spec to Package"""
def __init__(self, attribute_name, default_handler=None):
def __init__(
self,
attribute_name: str,
default_handler: Optional[Callable[["Spec"], Any]] = None,
_indirect: bool = False,
) -> None:
"""Create a new descriptor.
Parameters:
attribute_name (str): name of the attribute to be
searched for in the Package instance
default_handler (callable, optional): default function to be
called if the attribute was not found in the Package
instance
attribute_name: name of the attribute to be searched for in the Package instance
default_handler: default function to be called if the attribute was not found in the
Package instance
_indirect: temporarily added to redirect a query to another package.
"""
self.attribute_name = attribute_name
self.default = default_handler
self.indirect = _indirect
def __get__(self, instance, cls):
def __get__(self, instance: "SpecBuildInterface", cls):
"""Retrieves the property from Package using a well defined chain
of responsibility.
@@ -1189,13 +1170,18 @@ def __get__(self, instance, cls):
indicating a query failure, e.g. that library files were not found in a
'libs' query.
"""
pkg = instance.package
# TODO: this indirection exist solely for `spec["python"].command` to actually return
# spec["python-venv"].command. It should be removed when `python` is a virtual.
if self.indirect and instance.indirect_spec:
pkg = instance.indirect_spec.package
else:
pkg = instance.wrapped_obj.package
try:
query = instance.last_query
except AttributeError:
# There has been no query yet: this means
# a spec is trying to access its own attributes
_ = instance[instance.name] # NOQA: ignore=F841
_ = instance.wrapped_obj[instance.wrapped_obj.name] # NOQA: ignore=F841
query = instance.last_query
callbacks_chain = []
@@ -1207,7 +1193,8 @@ def __get__(self, instance, cls):
callbacks_chain.append(lambda: getattr(pkg, self.attribute_name))
# Final resort : default callback
if self.default is not None:
callbacks_chain.append(lambda: self.default(self, instance, cls))
_default = self.default # make mypy happy
callbacks_chain.append(lambda: _default(instance.wrapped_obj))
# Trigger the callbacks in order, the first one producing a
# value wins
@@ -1266,25 +1253,33 @@ def __set__(self, instance, value):
class SpecBuildInterface(lang.ObjectWrapper):
# home is available in the base Package so no default is needed
home = ForwardQueryToPackage("home", default_handler=None)
command = ForwardQueryToPackage("command", default_handler=_command_default_handler)
headers = ForwardQueryToPackage("headers", default_handler=_headers_default_handler)
libs = ForwardQueryToPackage("libs", default_handler=_libs_default_handler)
command = ForwardQueryToPackage(
"command", default_handler=_command_default_handler, _indirect=True
)
def __init__(self, spec, name, query_parameters):
def __init__(self, spec: "Spec", name: str, query_parameters: List[str], _parent: "Spec"):
super().__init__(spec)
# Adding new attributes goes after super() call since the ObjectWrapper
# resets __dict__ to behave like the passed object
original_spec = getattr(spec, "wrapped_obj", spec)
self.wrapped_obj = original_spec
self.token = original_spec, name, query_parameters
self.token = original_spec, name, query_parameters, _parent
is_virtual = spack.repo.PATH.is_virtual(name)
self.last_query = QueryState(
name=name, extra_parameters=query_parameters, isvirtual=is_virtual
)
# TODO: this ad-hoc logic makes `spec["python"].command` return
# `spec["python-venv"].command` and should be removed when `python` is a virtual.
self.indirect_spec = None
if spec.name == "python":
python_venvs = _parent.dependencies("python-venv")
if not python_venvs:
return
self.indirect_spec = python_venvs[0]
def __reduce__(self):
return SpecBuildInterface, self.token
@@ -3910,9 +3905,13 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
if not self._dependencies:
return False
# If we arrived here, then rhs is abstract. At the moment we don't care about the edge
# structure of an abstract DAG, so we check if any edge could satisfy the properties
# we ask for.
# If we arrived here, the lhs root node satisfies the rhs root node. Now we need to check
# all the edges that have an abstract parent, and verify that they match some edge in the
# lhs.
#
# It might happen that the rhs brings in concrete sub-DAGs. For those we don't need to
# verify the edge properties, cause everything is encoded in the hash of the nodes that
# will be verified later.
lhs_edges: Dict[str, Set[DependencySpec]] = collections.defaultdict(set)
for rhs_edge in other.traverse_edges(root=False, cover="edges"):
# If we are checking for ^mpi we need to verify if there is any edge
@@ -3922,6 +3921,10 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
if not rhs_edge.virtuals:
continue
# Skip edges from a concrete sub-DAG
if rhs_edge.parent.concrete:
continue
if not lhs_edges:
# Construct a map of the link/run subDAG + direct "build" edges,
# keyed by dependency name
@@ -4141,7 +4144,7 @@ def version(self):
raise spack.error.SpecError("Spec version is not concrete: " + str(self))
return self.versions[0]
def __getitem__(self, name):
def __getitem__(self, name: str):
"""Get a dependency from the spec by its name. This call implicitly
sets a query state in the package being retrieved. The behavior of
packages may be influenced by additional query parameters that are
@@ -4150,7 +4153,7 @@ def __getitem__(self, name):
Note that if a virtual package is queried a copy of the Spec is
returned while for non-virtual a reference is returned.
"""
query_parameters = name.split(":")
query_parameters: List[str] = name.split(":")
if len(query_parameters) > 2:
raise KeyError("key has more than one ':' symbol. At most one is admitted.")
@@ -4173,7 +4176,7 @@ def __getitem__(self, name):
)
try:
value = next(
child: Spec = next(
itertools.chain(
# Regular specs
(x for x in order() if x.name == name),
@@ -4190,9 +4193,9 @@ def __getitem__(self, name):
raise KeyError(f"No spec with name {name} in {self}")
if self._concrete:
return SpecBuildInterface(value, name, query_parameters)
return SpecBuildInterface(child, name, query_parameters, _parent=self)
return value
return child
def __contains__(self, spec):
"""True if this spec or some dependency satisfies the spec.
@@ -4295,10 +4298,7 @@ def deps():
yield deps
def colorized(self):
return colorize_spec(self)
def format(self, format_string=DEFAULT_FORMAT, **kwargs):
def format(self, format_string: str = DEFAULT_FORMAT, color: Optional[bool] = False) -> str:
r"""Prints out particular pieces of a spec, depending on what is
in the format string.
@@ -4361,79 +4361,65 @@ def format(self, format_string=DEFAULT_FORMAT, **kwargs):
literal ``\`` character.
Args:
format_string (str): string containing the format to be expanded
Keyword Args:
color (bool): True if returned string is colored
transform (dict): maps full-string formats to a callable \
that accepts a string and returns another one
format_string: string containing the format to be expanded
color: True for colorized result; False for no color; None for auto color.
"""
ensure_modern_format_string(format_string)
color = kwargs.get("color", False)
transform = kwargs.get("transform", {})
out = io.StringIO()
def safe_color(sigil: str, string: str, color_fmt: Optional[str]) -> str:
# avoid colorizing if there is no color or the string is empty
if (color is False) or not color_fmt or not string:
return sigil + string
# escape and add the sigil here to avoid multiple concatenations
if sigil == "@":
sigil = "@@"
return clr.colorize(f"{color_fmt}{sigil}{clr.cescape(string)}@.", color=color)
def write(s, c=None):
f = clr.cescape(s)
if c is not None:
f = COLOR_FORMATS[c] + f + "@."
clr.cwrite(f, stream=out, color=color)
def format_attribute(match_object: Match) -> str:
(esc, sig, dep, hash, hash_len, attribute, close_brace, unmatched_close_brace) = (
match_object.groups()
)
if esc:
return esc
elif unmatched_close_brace:
raise SpecFormatStringError(f"Unmatched close brace: '{format_string}'")
elif not close_brace:
raise SpecFormatStringError(f"Missing close brace: '{format_string}'")
def write_attribute(spec, attribute, color):
attribute = attribute.lower()
current = self if dep is None else self[dep]
sig = ""
if attribute.startswith(("@", "%", "/")):
# color sigils that are inside braces
sig = attribute[0]
attribute = attribute[1:]
elif attribute.startswith("arch="):
sig = " arch=" # include space as separator
attribute = attribute[5:]
current = spec
if attribute.startswith("^"):
attribute = attribute[1:]
dep, attribute = attribute.split(".", 1)
current = self[dep]
# Hash attributes can return early.
# NOTE: we currently treat abstract_hash like an attribute and ignore
# any length associated with it. We may want to change that.
if hash:
if sig and sig != "/":
raise SpecFormatSigilError(sig, "DAG hashes", hash)
try:
length = int(hash_len) if hash_len else None
except ValueError:
raise SpecFormatStringError(f"Invalid hash length: '{hash_len}'")
return safe_color(sig or "", current.dag_hash(length), HASH_COLOR)
if attribute == "":
raise SpecFormatStringError("Format string attributes must be non-empty")
attribute = attribute.lower()
parts = attribute.split(".")
assert parts
# check that the sigil is valid for the attribute.
if sig == "@" and parts[-1] not in ("versions", "version"):
if not sig:
sig = ""
elif sig == "@" and parts[-1] not in ("versions", "version"):
raise SpecFormatSigilError(sig, "versions", attribute)
elif sig == "%" and attribute not in ("compiler", "compiler.name"):
raise SpecFormatSigilError(sig, "compilers", attribute)
elif sig == "/" and not re.match(r"(abstract_)?hash(:\d+)?$", attribute):
elif sig == "/" and attribute != "abstract_hash":
raise SpecFormatSigilError(sig, "DAG hashes", attribute)
elif sig == " arch=" and attribute not in ("architecture", "arch"):
raise SpecFormatSigilError(sig, "the architecture", attribute)
# find the morph function for our attribute
morph = transform.get(attribute, lambda s, x: x)
# Special cases for non-spec attributes and hashes.
# These must be the only non-dep component of the format attribute
if attribute == "spack_root":
write(morph(spec, spack.paths.spack_root))
return
elif attribute == "spack_install":
write(morph(spec, spack.store.STORE.layout.root))
return
elif re.match(r"hash(:\d)?", attribute):
col = "#"
if ":" in attribute:
_, length = attribute.split(":")
write(sig + morph(spec, current.dag_hash(int(length))), col)
else:
write(sig + morph(spec, current.dag_hash()), col)
return
elif sig == "arch=":
if attribute not in ("architecture", "arch"):
raise SpecFormatSigilError(sig, "the architecture", attribute)
sig = " arch=" # include space as separator
# Iterate over components using getattr to get next element
for idx, part in enumerate(parts):
@@ -4442,7 +4428,7 @@ def write_attribute(spec, attribute, color):
if part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute")
else:
if isinstance(current, vt.VariantMap):
if part == "variants" and isinstance(current, vt.VariantMap):
# subscript instead of getattr for variant names
current = current[part]
else:
@@ -4466,62 +4452,31 @@ def write_attribute(spec, attribute, color):
raise SpecFormatStringError(m)
if isinstance(current, vn.VersionList):
if current == vn.any_version:
# We don't print empty version lists
return
# don't print empty version lists
return ""
if callable(current):
raise SpecFormatStringError("Attempted to format callable object")
if current is None:
# We're not printing anything
return
# not printing anything
return ""
# Set color codes for various attributes
col = None
color = None
if "variants" in parts:
col = "+"
color = VARIANT_COLOR
elif "architecture" in parts:
col = "="
color = ARCHITECTURE_COLOR
elif "compiler" in parts or "compiler_flags" in parts:
col = "%"
color = COMPILER_COLOR
elif "version" in parts or "versions" in parts:
col = "@"
color = VERSION_COLOR
# Finally, write the output
write(sig + morph(spec, str(current)), col)
# return colored output
return safe_color(sig, str(current), color)
attribute = ""
in_attribute = False
escape = False
for c in format_string:
if escape:
out.write(c)
escape = False
elif c == "\\":
escape = True
elif in_attribute:
if c == "}":
write_attribute(self, attribute, color)
attribute = ""
in_attribute = False
else:
attribute += c
else:
if c == "}":
raise SpecFormatStringError(
"Encountered closing } before opening { in %s" % format_string
)
elif c == "{":
in_attribute = True
else:
out.write(c)
if in_attribute:
raise SpecFormatStringError(
"Format string terminated while reading attribute." "Missing terminating }."
)
formatted_spec = out.getvalue()
return formatted_spec.strip()
return SPEC_FORMAT_RE.sub(format_attribute, format_string).strip()
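
The regex-driven substitution is behaviorally equivalent to the old character-by-character parser; a few hedged examples of what it produces for an abstract, made-up spec:

import spack.spec

s = spack.spec.Spec("zlib+shared")

s.format("{name}")              # -> "zlib"
s.format("{name} {variants}")   # -> "zlib +shared"
s.format(r"\{{name}\}")         # -> "{zlib}" (backslash-escaped braces stay literal)
# A missing or unmatched brace, e.g. format("{name"), raises SpecFormatStringError,
# and color=True wraps each formatted attribute in the color codes defined above.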
def cformat(self, *args, **kwargs):
"""Same as format, but color defaults to auto instead of False."""
@@ -4529,6 +4484,16 @@ def cformat(self, *args, **kwargs):
kwargs.setdefault("color", None)
return self.format(*args, **kwargs)
@property
def spack_root(self):
"""Special field for using ``{spack_root}`` in Spec.format()."""
return spack.paths.spack_root
@property
def spack_install(self):
"""Special field for using ``{spack_install}`` in Spec.format()."""
return spack.store.STORE.layout.root
def format_path(
# self, format_string: str, _path_ctor: Optional[pathlib.PurePath] = None
self,
@@ -4554,14 +4519,21 @@ def format_path(
path_ctor = _path_ctor or pathlib.PurePath
format_string_as_path = path_ctor(format_string)
if format_string_as_path.is_absolute():
if format_string_as_path.is_absolute() or (
# Paths that begin with a single "\" on windows are relative, but we still
# want to preserve the initial "\\" to be consistent with PureWindowsPath.
# Ensure that this '\' is not passed to polite_filename() so it's not converted to '_'
(os.name == "nt" or path_ctor == pathlib.PureWindowsPath)
and format_string_as_path.parts[0] == "\\"
):
output_path_components = [format_string_as_path.parts[0]]
input_path_components = list(format_string_as_path.parts[1:])
else:
output_path_components = []
input_path_components = list(format_string_as_path.parts)
output_path_components += [
fs.polite_filename(self.format(x)) for x in input_path_components
fs.polite_filename(self.format(part)) for part in input_path_components
]
return str(path_ctor(*output_path_components))
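
A hedged example of the path variant (the projection string is invented): each component is formatted separately and then sanitized, and the hunk above additionally keeps a leading "\" intact for Windows-style projections instead of letting polite_filename() rewrite it.

import spack.spec

s = spack.spec.Spec("zlib")

s.format_path("opt/{name}")  # -> "opt/zlib" on POSIX, "opt\\zlib" with PureWindowsPath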

View File

@@ -390,11 +390,11 @@ def test_built_spec_cache(mirror_dir):
assert any([r["spec"] == s for r in results])
def fake_dag_hash(spec):
def fake_dag_hash(spec, length=None):
# Generate an arbitrary hash that is intended to be different than
# whatever a Spec reported before (to test actions that trigger when
# the hash changes)
return "tal4c7h4z0gqmixb1eqa92mjoybxn5l6"
return "tal4c7h4z0gqmixb1eqa92mjoybxn5l6"[:length]
@pytest.mark.usefixtures(

View File

@@ -127,7 +127,7 @@
spack_cflags = ["-Wall"]
spack_cxxflags = ["-Werror"]
spack_fflags = ["-w"]
spack_ldflags = ["-L", "foo"]
spack_ldflags = ["-Wl,--gc-sections", "-L", "foo"]
spack_ldlibs = ["-lfoo"]
lheaderpad = ["-Wl,-headerpad_max_install_names"]
@@ -279,7 +279,6 @@ def test_ld_flags(wrapper_environment, wrapper_flags):
test_args,
["ld"]
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ test_library_paths
+ ["--disable-new-dtags"]
+ test_rpaths
@@ -307,13 +306,14 @@ def test_cc_flags(wrapper_environment, wrapper_flags):
[real_cc]
+ target_args
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ ["-Lfoo"]
+ test_library_paths
+ ["-Wl,--disable-new-dtags"]
+ test_wl_rpaths
+ test_args_without_paths
+ spack_cppflags
+ spack_cflags
+ ["-Wl,--gc-sections"]
+ spack_ldlibs,
)
@@ -325,12 +325,13 @@ def test_cxx_flags(wrapper_environment, wrapper_flags):
[real_cc]
+ target_args
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ ["-Lfoo"]
+ test_library_paths
+ ["-Wl,--disable-new-dtags"]
+ test_wl_rpaths
+ test_args_without_paths
+ spack_cppflags
+ ["-Wl,--gc-sections"]
+ spack_ldlibs,
)
@@ -342,13 +343,14 @@ def test_fc_flags(wrapper_environment, wrapper_flags):
[real_cc]
+ target_args
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ ["-Lfoo"]
+ test_library_paths
+ ["-Wl,--disable-new-dtags"]
+ test_wl_rpaths
+ test_args_without_paths
+ spack_fflags
+ spack_cppflags
+ ["-Wl,--gc-sections"]
+ spack_ldlibs,
)

Some files were not shown because too many files have changed in this diff.