Compare commits

597 Commits

Author SHA1 Message Date
Miranda Mundt
1ac2ee8043 Add Pyomo 6.7.2 (#44097) 2024-05-18 10:31:57 -05:00
Teague Sterling
36af1c1c73 perl-xml-libxml: add new versions and conflicts (fixes #44253) (#44254)
* Address #44253 by adding new versions and declaring conflicts for perl-xml-libxml

* [@spackbot] updating style on behalf of teaguesterling

* Update var/spack/repos/builtin/packages/perl-xml-libxml/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-18 09:28:32 -06:00
Carlos Bederián
e2fa087002 namd: add 3.0b7 (#44198) 2024-05-18 10:15:55 -05:00
Teague Sterling
df02bfbad2 Adding new spark versions (#44250)
* Adding new spark versions (in preparation of HAIL package)

* Adding myself as potential maintainer
2024-05-18 10:06:12 -05:00
Teague Sterling
fecb63843e yq: add new package (#44249) 2024-05-18 06:38:26 -06:00
Scott Wittenburg
b33e2d09d3 oci buildcache: handle pagination of tags (#43136)
This fixes an issue where ghcr, gitlab, and possibly other container registries paginate tags by default. That violates the OCI spec v1.0 but is common practice (the spec itself was broken). After this commit, you can create build cache indices of more than 100 specs on ghcr.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-18 11:57:53 +02:00
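
A minimal sketch of the pagination handling this implies, assuming a registry that returns an RFC 5988 `Link: <...>; rel="next"` header on `/v2/<name>/tags/list` (function name and defaults are illustrative, not Spack's actual implementation):

```python
import json
import urllib.request


def list_all_tags(registry, repository, token=None, page_size=100):
    """Collect tags across paginated /v2/<name>/tags/list responses."""
    tags = []
    url = f"https://{registry}/v2/{repository}/tags/list?n={page_size}"
    while url:
        request = urllib.request.Request(url)
        if token:
            request.add_header("Authorization", f"Bearer {token}")
        with urllib.request.urlopen(request) as response:
            tags.extend(json.load(response).get("tags", []))
            link = response.headers.get("Link")
        url = None
        if link and 'rel="next"' in link:
            # Header looks like: </v2/<name>/tags/list?last=...&n=100>; rel="next"
            next_path = link.split(";")[0].strip().strip("<>")
            url = f"https://{registry}{next_path}"
    return tags
```
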
Chris Green
f8054aa21a git: bump v2.39 to 2.45; deprecate unsafe versions (#44248) 2024-05-18 03:32:06 -06:00
Valentin Volkl
8f3a2acc54 whizard: add gosam variant (#43595)
* whizard: add gosam variant

* address comments, fix compiler wrapper issue
2024-05-18 10:11:26 +02:00
Kevin Huck
d1a20908b8 ZeroSum: add new package (#44228) 2024-05-18 09:23:45 +02:00
Derek Ryan Strong
dd781f7368 perl: add v5.36.3, v5.38.2; deprecate unsafe versions (#44186) 2024-05-18 09:21:03 +02:00
dmagdavector
9bcc43c4c1 protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
* protobuf: update hash for patch needed when="@3.4:3.21"

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-17 17:50:29 -06:00
Todd Gamblin
77c83af17d docs: remove warning about repositories and package extension (#44247)
Local package repositories are very well supported and we test them extensively, so this
warning from 8 years ago can be removed from the docs.
2024-05-17 22:03:57 +00:00
Michael Kuhn
574bd2db99 netlib-scalapack: fix build with gcc@14 (#44120) 2024-05-17 22:38:46 +02:00
James Taliaferro
a76f37da96 kakoune: add v2024.05.09 (#44124) 2024-05-17 22:36:58 +02:00
Adam J. Stewart
9e75f3ec0a GDAL: add v3.9.0 (#44128) 2024-05-17 22:35:33 +02:00
Derek Ryan Strong
4d42d45897 Add latest Python versions (#44130) 2024-05-17 22:35:06 +02:00
SXS Bot
a4b4bfda73 spectre: add v2024.05.11 (#44139)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-05-17 22:28:25 +02:00
Michael Kuhn
1bcdd3a57e py-cython: add 3.0.10 (#44140) 2024-05-17 22:27:42 +02:00
Stephen Sachs
297a3a1bc9 Add mpas-model and mpich to pcluster neoverse stack (#44151)
Should build now since https://github.com/spack/spack/pull/43547 has been merged.
2024-05-17 22:00:17 +02:00
Adam J. Stewart
8d01e8c978 JAX: add v0.4.28 (#44112) 2024-05-17 21:58:44 +02:00
Wouter Deconinck
6be28aa303 pythia8: patch latest 8.311 for upstream bug (#43803)
* pythia8: prefer 8.310

* [@spackbot] updating style on behalf of wdconinc

* pythia8: filter_file to remove sed n

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit e2a3decaffbd3f464d1bd992025e1812df49f088.

* Revert "pythia8: prefer 8.310"

This reverts commit 568cb056b87129085e245d9dbef1732ee1c6c0aa.

* [@spackbot] updating style on behalf of wdconinc

* pythia8: comment for fix

* pythia8: fix style

* pythia8: filter_file with raw string because of escaped pipe

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-17 21:31:05 +02:00
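
For context on the final bullet: Spack's `llnl.util.filesystem.filter_file` takes a regular expression, so an escaped pipe must survive Python string processing. A hedged illustration (the pattern and file name below are made up, not pythia8's actual ones):

```python
from llnl.util.filesystem import filter_file

# With a raw string the backslash reaches the regex engine, so "\|"
# matches a literal pipe; a plain string literal would need "\\|".
filter_file(r"grep foo \| head", "grep foo", "Makefile.inc")
```
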
Jon Rood
5e38310515 nalu-wind: fix trilinos rocm dependency (#44233) 2024-05-17 13:00:24 -06:00
wspear
ddfed65485 tau: fix lib/include paths with oneapi (#44170) 2024-05-17 18:55:27 +02:00
dependabot[bot]
2a16d8bfa8 build(deps): bump codecov/codecov-action from 4.3.1 to 4.4.0 (#44195)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.1 to 4.4.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5ecb98a3c6...6d798873df)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 18:46:17 +02:00
Jonathon Anderson
6a40a50a29 hpcviewer: Update URLs to use GitLab release assets (#44129) 2024-05-17 18:43:13 +02:00
Carlos Bederián
b2924f68c0 blis: add v1.0 (#44199) 2024-05-17 18:39:42 +02:00
Rocco Meli
41ffe36636 spglib: add v2.4.0 (#44202) 2024-05-17 18:33:40 +02:00
Chris Marsh
24edc72252 docs: show phase signature for builders (#44067) 2024-05-17 18:16:31 +02:00
Raffaele Solcà
83b38a26a0 New versions nvpl-blas and nvpl-lapack (#44244) 2024-05-17 17:46:25 +02:00
Nathalie Furmento
914d785e3b starpu: add v1.4.6 (#44203) 2024-05-17 08:17:01 -06:00
Garth N. Wells
f99f642fa8 fenicsx: remove deprecated versions (#44223) 2024-05-17 07:35:21 -06:00
Seth R. Johnson
e0bf3667e3 cli11: new version and enable library (#44204) 2024-05-17 15:08:28 +02:00
Andrew-Dunning-NNL
a24ca50fed Fix broken link in docs (#44217) 2024-05-17 12:58:11 +00:00
Fabien Bruneval
51e9f37252 MOLGW: add v3.3 (#44241)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 06:56:05 -06:00
Rémi Lacroix
453900c884 libxc: Fix compilation after distribution changes. (#44206)
The release tarballs are not available anymore, which means autoconf, automake and libtool are always needed.

The NVHPC-specific patches no longer make sense, since the patched files are not distributed in the new tar files.
2024-05-17 14:52:03 +02:00
Alberto Invernizzi
4696459d2d libcatalyst: add missing python dependencies (#44224) 2024-05-17 14:45:26 +02:00
Fabien Bruneval
ad1e3231e5 libcint: add v6.1.2, v5.5.0 (#44239)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 05:45:26 -06:00
Richard Berger
2ef7eb1826 exodusii: only use MPI fortran compiler if +fortran (#44211) 2024-05-17 13:01:09 +02:00
Dom Heinzeller
fe86019f9a ecflow: versions up to 5.11.4 require boost 1.84 or earlier (#44181) 2024-05-17 12:53:48 +02:00
Harmen Stoppels
9dbb18219f build_environment.py: deal with rpathing identical packages (#44219)
When multiple gcc-runtime packages exist in the same link sub-dag, only rpath
the latest.
2024-05-17 12:29:56 +02:00
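
A sketch of the selection rule described, under the assumption that duplicate runtimes are compared by version and only the newest is kept for rpath purposes (data shapes are illustrative):

```python
def newest_runtime(specs):
    """Keep only the highest-versioned gcc-runtime among duplicates."""
    best = None
    for name, version in specs:
        if name == "gcc-runtime" and (best is None or version > best):
            best = version
    return best


# Two gcc-runtime packages in one link sub-dag: only 13.2.0 gets rpathed.
print(newest_runtime([("gcc-runtime", (12, 3, 0)), ("gcc-runtime", (13, 2, 0))]))
```
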
Mikael Simberg
451a977de0 hpx: change default of max_cpu_count variant to auto (#44220) 2024-05-17 11:54:48 +02:00
Jack S. Hale
e604929a4c FEniCS: add more maintainers (#44240) 2024-05-17 03:03:21 -06:00
dependabot[bot]
9d591f9f7c build(deps): bump actions/checkout from 4.1.5 to 4.1.6 (#44234)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.5 to 4.1.6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](44c2b7a8a4...a5ac7e51b4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 10:57:08 +02:00
Matt Thompson
f8ad915100 esmf: add v8.6.1 (#44229) 2024-05-17 10:49:47 +02:00
Garth N. Wells
cbbabe6920 fenics-dolfinx: add spdlog dependency (#44237) 2024-05-17 10:48:55 +02:00
John W. Parent
81fe460194 Gitlab CI: Windows Configs (#43967)
Add support for Gitlab CI on Windows

This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows 
infrastructure).

* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2024-05-16 17:00:02 -06:00
Paul Kuberry
b894f996c0 trilinos: catch kokkos inconsistency with trilinos (#44209)
* trilinos: catch kokkos inconsistency with trilinos

* trilinos: update kokkos version range
2024-05-16 13:36:11 -06:00
John W. Parent
1ce09847d9 Prefer llnl.util.symlink.readlink to os.readlink (#44126)
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).

Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
2024-05-16 10:56:04 -07:00
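
A sketch of the prefix-stripping behavior attributed to `llnl.util.symlink.readlink` above (the constant and logic here are an assumption, not the exact implementation):

```python
import os

_WIN_LONGPATH_PREFIX = "\\\\?\\"  # renders as \\?\


def readlink(path):
    """Behave like os.readlink, but drop the Windows long-path prefix
    so downstream string parsing/matching keeps working."""
    target = os.readlink(path)
    if target.startswith(_WIN_LONGPATH_PREFIX):
        target = target[len(_WIN_LONGPATH_PREFIX):]
    return target
```
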
Thomas Madlener
722d401394 gaudi: Don't apply the patch if it has already landed upstream (#44180) 2024-05-16 17:15:39 +02:00
Howard Pritchard
e6f04d5ef9 py-matplotlib: qualify when to do a post install (#44191)
* py-matplotlib: qualify when to do a post install

Older versions of py-matplotlib don't seem to have some of the
files that the post install step is trying to install.
Looks like the files first appeared in 3.6.0 and later.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>

* Change install paths for older matplotlib

---------

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-16 16:18:44 +02:00
Mosè Giordano
b8e3ecbf00 suite-sparse: improve setting of the libs property (#44214)
on some distros it is in lib64/
2024-05-16 11:07:18 +02:00
Todd Gamblin
d189387c24 bugfix: add arg to write_line_break() in spack_yaml (#42727)
`ruamel`'s `Emitter.write_line_break()` method takes an extra argument that we forgot to
implement in our custom emitter.
2024-05-15 19:25:06 -07:00
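
The fix amounts to matching `ruamel`'s method signature in the subclass. A minimal sketch (the class name is illustrative):

```python
from ruamel.yaml.emitter import Emitter


class SpackStyleEmitter(Emitter):
    def write_line_break(self, data=None):
        # ruamel calls this with an explicit data argument in places, so a
        # custom emitter that omits the parameter raises TypeError.
        super().write_line_break(data)
```
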
Andrew Lister
9e96ddc5ae CoinHSL: Support the Meson build system and add new release (#43610)
* Add maintainer and fix linting
* allow for fewer deps in archive
* use meson for archive packages
* Fix version spec and f-string
* fix blas dependency links
* Add new release to spack
* Fix checksums for latest release
2024-05-15 16:48:23 -06:00
dmagdavector
543bd189af iperf3: updated versions from 3.6 to 3.16 (#44152)
* Update version of iperf3 from 3.6 to 3.16

Spack currently only explicitly has version 3.6 of the iPerf3 package
(out of ESnet / LBNL). This makes the default the latest version of 3.16,
and adds some other versions (found in some Linux distros, for possible
compatibility purposes).

* iperf3: update to 3.17; update 3.6 hash for new url

* protobuf: update hash for patch needed when="@3.4:3.21"

* Revert "protobuf: update hash for patch needed when="@3.4:3.21""

This reverts commit 4d168d0b27.
2024-05-15 15:48:15 -06:00
John W. Parent
43291aa723 Cdash reporting timeout (#44213)
* Add timeout to cdash reporter PUT request

Add cdash timeout everywhere
Correct mock responder api

* Style

* brief doc
2024-05-15 15:41:51 -04:00
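
A hedged sketch of giving the reporter's PUT request a timeout with the standard library (the URL, payload, and 60-second value are illustrative):

```python
import urllib.request

report = urllib.request.Request(
    "https://cdash.example.org/submit.php?project=Spack",
    data=b"<?xml version='1.0'?><Site/>",
    method="PUT",
)
# Without a timeout, a hung server stalls the reporter indefinitely.
with urllib.request.urlopen(report, timeout=60) as response:
    print(response.status)
```
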
Alec Scott
d0589285f7 unmaintained pkgs: bump versions 2024-05-10 (#44131)
* unmaintained pkgs: bump versions 2024-05-10

* openblas: fix satisfies syntax

* pixman: add autotools dependencies

* [@spackbot] updating style on behalf of alecbcs

* pixman: revert

* chapel: revert changes in favor of other PR

* openblas: revert due to failing tests

* Address review feedback for flint, biobam2, and pango

* pango: add version comment about v2.0

* numactl: revert changes due to ppc4le bug

* flint: remove duplicate configure arg

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* openvkl, rkcommon: remove commented maintainers template

* flint: fix style

---------

Co-authored-by: alecbcs <alecbcs@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-15 11:04:52 -07:00
afzpatel
d079aaa083 enable tensorflow-2.14 and 2.16 for spack built ROCm (#44095)
* initial commit to enable tensorflow-2.14 for spack built ROCm

* fix style errors

* modify hipcc patch

* updates for rocm 6.1

* updates for tf-rocm-enhanced 2.16

* fix styling

* fix styling

* add patch for 2.16

* add patch for 2.16

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* update rocm enhanced version names

* changes for rocm-enhanced version name change

* fix styling

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-15 17:06:35 +02:00
Paolo
6c65977e0d Fix gromacs installation with SVE. Issue 44062 (#44183)
* Fix gromacs installation with SVE. Issue 44062

* [@spackbot] updating style on behalf of paolotricerri

* Remove `neoverse_n2` target

We have removed the neoverse_n2 target as its detection is more involved
compared to neoverse_v*.
2024-05-15 07:22:47 -06:00
Carlos Bederián
1b5d786cf5 gromacs: add 2024.2, 2023.5 (#44197) 2024-05-15 07:21:55 -06:00
Ben Morgan
4cf00645bd VecGeom: new version 1.2.8 (#44179) 2024-05-15 07:50:48 -04:00
Jon Rood
e9149cfc3c nalu-wind: fix mistake (#44188) 2024-05-14 23:13:13 -06:00
Jon Rood
a5c8111076 exawind: updates to package to allow mixed device (#44159)
* exawind: updates to package to allow mixed device

* Style.

* Remove ninja variants.

* Add conflict for amr-wind+hypre with mixed device.

* Relax amr-wind~hypre requirement.

* Move runtime variables to nalu-wind.

* Update suggestions.

* Remove umpire.
2024-05-14 12:26:07 -06:00
Alec Scott
c3576f712d pkg-config: support apple-clang@15: (#44007) 2024-05-14 17:24:36 +02:00
Alec Scott
410e6a59b7 rust: fix v1.78.0 instructions (#44127) 2024-05-14 08:14:34 -07:00
Carlos Bederián
bd2b2fb75a python: add 3.10.14, 3.9.19, 3.8.19 (#44162) 2024-05-14 17:03:45 +02:00
Zachary Newell
7ae318efd0 Added NCCL version 2.21.5-1 (#44158) 2024-05-14 03:34:21 -06:00
Derek Ryan Strong
73e9d56647 Update bash 5.2 patches (#44172) 2024-05-14 09:08:39 +02:00
Derek Ryan Strong
f87a752b63 Add zsh 5.8.1 and 5.9 (#44173) 2024-05-14 09:08:08 +02:00
Derek Ryan Strong
ae2fec30c3 Add newer fish versions (#44174) 2024-05-14 09:07:49 +02:00
Derek Ryan Strong
1af5564cbe Add file 5.45 (#44175) 2024-05-14 09:07:25 +02:00
Derek Ryan Strong
a8f057a701 Add man-db 2.12.0 and 2.12.1 (#44176) 2024-05-14 09:07:10 +02:00
Michael B Kuhn
7f3dd38ccc amr-wind: add latest versions and correct waves2amr details (#44099)
* add latest versions and correct waves2amr details

* update commit associated with v2.1.0
2024-05-13 14:21:25 -06:00
Adam J. Stewart
8e9adefcd5 ML CI: update image (#43751)
* ML CI: update image

* Use main branch

* Use tagged version
2024-05-13 21:43:52 +02:00
jdomke
d276f9700f fujitsu-mpi package: help CMake find wrappers (#43979)
Set MPI_C_COMPILER etc. env vars when building with fujitsu-mpi, which
override CMake logic. Without using these variables to explicitly
request the Fujitsu MPI wrappers, the builtin CMake logic is unwilling
to use these wrappers unless the Fujitsu compiler is used, but they
should be used for %clang as well.

This avoids patching CMake module files like in #16864, primarily to
avoid the possibility of altering behavior for specs that do not use
%fj or ^fujitsu-mpi.
2024-05-13 11:15:45 -07:00
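
A sketch of the hook this describes, using Spack's `setup_dependent_build_environment` API (the wrapper paths use the usual Fujitsu names; treat the fragment as illustrative rather than the merged code):

```python
from spack.package import *


class FujitsuMpi(Package):
    """Fragment: steer CMake's FindMPI toward the Fujitsu wrappers."""

    def setup_dependent_build_environment(self, env, dependent_spec):
        # FindMPI honors MPI_<lang>_COMPILER, so dependents pick the
        # wrappers even when compiling with %clang.
        env.set("MPI_C_COMPILER", self.prefix.bin.mpifcc)
        env.set("MPI_CXX_COMPILER", self.prefix.bin.mpiFCC)
        env.set("MPI_Fortran_COMPILER", self.prefix.bin.mpifrt)
```
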
Harmen Stoppels
4f111659ec glibc: detect from "Free Software Foundation" not "gnu" (#44154)
which should be more generic
2024-05-13 20:11:27 +02:00
Dave Keeshan
eaf330f2a8 yosys: add v0.41 (#44153) 2024-05-13 09:40:42 -07:00
Julien Cortial
cdaeb74dc7 cdt: Add version 1.4.1 (#44155) 2024-05-13 09:38:25 -07:00
Alberto Sartori
fbaac46604 justbuild: add version 1.3.0 (#44148) 2024-05-13 09:35:29 -07:00
Jon Rood
7f6210ee90 nalu-wind: updates (#44046) 2024-05-13 10:28:24 -06:00
Wouter Deconinck
63f6e6079a gaudi: upstream patch when @38.1 for missing #include <list> (#44121)
* gaudi: upstream patch when @38.1 for missing #include <list>

* gaudi: apply list patch for all versions
2024-05-13 07:37:31 -06:00
Greg Becker
d4fd6caae0 spack uninstall: improve error message for dependent environment (#44149) 2024-05-13 14:58:13 +02:00
Mikael Simberg
fd3c18b6fd whip: Add 0.3.0 (#44146)
Co-authored-by: Mikael Simberg <simbergm@cscs.ch>
2024-05-13 03:09:32 -06:00
Adam J. Stewart
725f427f25 spack checksum: do not add expand=False to wheels (#44118) 2024-05-13 10:01:47 +02:00
Paul R. C. Kent
32b3e91ef7 llvm: add 18.1.5, 18.1.4 (#44123) 2024-05-12 16:28:32 -07:00
bk
b7e4602268 R: common package updates for 4.4.0 (#44088)
* r-ggplot2: v3.5.1, r-haven: v2.5.4, r-jsonlite: v1.8.8, r-pkgload: v1.3.4, r-vctrs: v0.6.5

* r-scales: v1.3.0
2024-05-12 10:49:38 -07:00
Stephen Sachs
4a98d4db93 Add applications to aws-pcluster-* stacks (#43901)
* Add openfoam to aws-pcluster-neoverse_v1 stack

* Add more apps to aws-pcluster-x86_64_v4 stack

* Remove WRF while hdf5 cannot build in buildcache at the moment

* Update comment

* Add more apps for aws-pcluster-neoverse_v1 stack

* Remove apps that currently do not build

* Disable those packages that won't build

* Modify syntax such that correct cflags are used

* Changing syntax again to what works with other packages

* Fix overriding packages.yaml entry for gettext

* Use new `prefer` and `require:when` clauses to clarify intent

* Use newer spack version to install intel compiler

This removes the need for patches and makes sure the `prefer` directives in
`package.yaml` are understood.

* `prefer` not strong enough, need to set compilers

* Revert "Use newer spack version to install intel compiler"

This reverts commit ecb25a192c.

Cannot update the spack version to install intel compiler as this changes the
compiler hash but not the version. This leads to incompatible compiler paths. If
we update this spack version in the future make sure the compiler version also updates.

Tested-by: Stephen Sachs <stesachs@amazon.com>

* Remove `prefer` clause as it is not strong enough for our needs

This way we can safely go back to installing the intel compiler with an older
spack version.

* Prefer gcc or oneapi to build gettext

* Pin gettext version compatible with system glibc-headers

* relax gettext version requirement to enable later versions

* oneapi cannot build older gettext version
2024-05-12 10:48:02 -07:00
Juan Miguel Carceller
9d6bf373be whizard: Add version 3.1.4 (#43045)
* whizard: Add a patch to fix builds with pythia8 >= 8.310

* Add whizard 3.1.4 and update accordingly

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:44:56 -07:00
Alex Richert
cff35c4987 octave: use pcre2 for @8: (#42636)
* octave: use pcre2 for @8:

* Add 'pcre2' variant to octave to control pcre vs. pcre2

* Update var/spack/repos/builtin/packages/octave/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alex Richert <alexander.richert@noaa.gov>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-12 10:44:31 -07:00
Juan Miguel Carceller
d594f84b8f fastjet: Add a cxxstd variant (#44072)
* fastjet: Add a cxxstd variant

* Use f-strings

* Add multi=False

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:42:56 -07:00
Wouter Deconinck
f8f01c336c clang: support cxx20_flag and cxx23_flag (#43438)
* clang: support cxx20_flag and cxx23_flag

* clang: coverage test cxx{}_flag and c{}_flag additions
2024-05-12 07:45:59 -07:00
Todd Gamblin
12e3665df3 Bump version on develop to v0.23dev0 (#44137) 2024-05-11 18:01:50 +02:00
Todd Gamblin
fa4778b9fc changelog: add changes from 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:45:14 +02:00
Harmen Stoppels
66d297d420 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
2024-05-11 15:43:32 +02:00
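
A sketch of a retry predicate covering those three shapes (the set of retryable status codes is an assumption):

```python
import socket
import urllib.error


def is_retryable(exc: Exception) -> bool:
    # HTTPError subclasses URLError, so it must be checked first.
    if isinstance(exc, urllib.error.HTTPError):          # case 1
        return exc.code in (429, 500, 502, 503, 504)
    if isinstance(exc, urllib.error.URLError):           # case 2
        return isinstance(exc.reason, (socket.timeout, ConnectionError))
    return isinstance(exc, TimeoutError)                 # case 3, unwrapped
```
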
James Taliaferro
56251c11f3 qpdf: new package (#44066)
* New package: qpdf

* Update var/spack/repos/builtin/packages/qpdf/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix format of cmake_args

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-10 14:32:57 -06:00
James Smillie
40bf9a179b New package: py-nuitka (#44079) 2024-05-10 12:07:02 -07:00
John W. Parent
095aba0b9f Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure all links written into tarfiles generated from Spack prefixes do not contain symlinks pointing outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-05-10 13:00:40 -05:00
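
A sketch of the containment check implied by the first two bullets (pure Python, path handling simplified):

```python
import os


def escapes_prefix(prefix: str, link_path: str, target: str) -> bool:
    """True if a symlink's target resolves outside the install prefix."""
    if os.path.isabs(target):
        resolved = os.path.normpath(target)
    else:
        resolved = os.path.normpath(
            os.path.join(os.path.dirname(link_path), target)
        )
    return not resolved.startswith(os.path.normpath(prefix) + os.sep)


# A relative link that climbs out of the prefix should be rejected.
print(escapes_prefix("/opt/spack/pkg", "/opt/spack/pkg/lib/x.so", "../../../etc/passwd"))
```
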
John W. Parent
4270136598 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running spack on windows, a path beginning with `C://...` is a valid path.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix tyop in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* REview

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-05-10 13:00:13 -05:00
Juan Miguel Carceller
f73d7d2dce dd4hep: cleanup recipe, remove deprecated versions and patches (#44110)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:40:38 -07:00
Juan Miguel Carceller
567566da08 edm4hep: cleanup recipe, remove deprecated versions and patches (#44113)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:39:06 -07:00
Chris Marsh
30a9ab749d libtheora: Remove unneeded autoreconf section (#44063)
* Remove autoreconf section that was causing issues with libtool mismatch. Fixes issue #43498
2024-05-10 10:27:20 -04:00
Mikael Simberg
8160a96b27 dla-future: Add 0.4.1 (#44115)
* dla-future: Add 0.4.1

* Use ninja as generator in dla-future
2024-05-10 15:21:47 +02:00
Mikael Simberg
10414d3e6c pika: Add 0.25.0 (#44114) 2024-05-10 14:37:31 +02:00
Stephen Sachs
1d96c09094 libhugetlbfs: Fix the build with an update to 2.24 (#44059)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-05-10 10:50:36 +02:00
Harmen Stoppels
e7112fbc6a PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:47:37 +02:00
Tom Scogland
b79761b7eb flux-sched: set the version if the ver file is missing (#44068)
* flux-sched: set the version if the ver file is missing

problem: flux-sched needs a version, it normally gets this from a
release tarball or from git tags, but if using a source archive or a git
clone without tags the version is missing

solution: set the version through cmake based on the version spack sees
when the version file is missing

* Update var/spack/repos/builtin/packages/flux-sched/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-09 11:50:42 -07:00
Christopher Christofi
3381899c69 miniforge3: add new versions with mamba installation (#43995)
* miniforge3: add new version with mamba installation
* fix styling
* update maintainers
* Fix variant directive ordering
2024-05-09 12:48:40 -06:00
Massimiliano Culpo
c7cf5eabc1 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 18:50:15 +02:00
Juan Miguel Carceller
d88fa5cf8e pythia8: add a cxxstd variant (#44077)
* pythia8: add a cxxstd variant
* Add multi=False, fix regexp

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-09 09:32:38 -07:00
Richard Berger
2ed0e3d737 kokkos-nvcc-wrapper: add missing versions (#44089) 2024-05-09 09:23:16 -07:00
Julien Cortial
506a40cac1 mmg: Build & install shared libraries, add 5.7.2 & 5.7.3(#43749) 2024-05-09 16:59:53 +02:00
Rémi Lacroix
447739fcef Universal Ctags: Add version 6.1.20240505.0 (#44058) 2024-05-09 15:55:41 +02:00
Massimiliano Culpo
e60f6f4a6e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 14:28:17 +02:00
Vanessasaurus
7df35d0da0 package: libsodium update URL to GitHub (#44090)
Problem: the libsodium download endpoints are not reliable,
they fail multiple times a day for several of our automated
pipelines.
Solution: use the GitHub releases and branches.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2024-05-09 14:26:02 +02:00
Luc Berger
71b035ece1 Kokkos: adding new release 4.3.01 (#44086) 2024-05-09 03:33:35 -06:00
Ashwin Kumar Karnad
86a134235e Octopus 14.1 : Add new version hash (#44083)
* octopus: add hash for new version
* octopus: add --enable-silent-deprecation flag when @14.1:
2024-05-09 03:18:22 -06:00
Jon Rood
24cd0da7fb amr-wind: add missing variants (#44078)
* amr-wind: add missing variants

* Fix copy and paste.

* waves2amr is only available on amr-wind@2.0

* Style.

* Use satisfies.
2024-05-09 02:38:16 -06:00
alvaro-sch
762833a663 Orca (standalone) 5.0.4 (#44011)
* orca: added latest version 5.0.4
* Fixed openmpi versions
2024-05-08 15:36:27 -07:00
Ken Raffenetti
636d479e5f mpich: Add license (#43821) 2024-05-08 12:07:46 -07:00
Sreenivasa Murthy Kolam
f2184f26fa hipsparselt: new package (#44080)
* Initial commit for adding hipsparselt recipe
* correct the style errors
* remove master version
2024-05-08 12:03:21 -07:00
Harmen Stoppels
e1686eef7c gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:04 +02:00
Adam J. Stewart
314893982e JAX: add v0.4.27, NCCL variant (#44071) 2024-05-08 11:36:24 -07:00
Jake Koester
9ab6c30a3d add flag to turn off building tests and examples (#44075) 2024-05-08 11:33:49 -07:00
Adam J. Stewart
ddf94291d4 py-jsonargparse: add v4.28.0 (#44074) 2024-05-08 11:31:07 -07:00
Jon Rood
5d1038c512 hypre: add cublas and rocblas variants (#44038)
* Add cublas and roblas variants to hypre.

* Fix mistakes.

* Remove newline.

* Address suggestions.
2024-05-08 12:04:11 -06:00
renjithravindrankannath
2e40c88d50 Bump up the version for ROCm-6.1.0 (#43843)
* Bump up the version for ROCm-6.1.0
* Style check error correction and patch files
* Update for rocm-openmp-extras 6.1
* updating rocm-openmp-extras and math libraries with 6.1
* Style check error correcion
* updating hipcub, hipfort & miopen-hip for 6.1
* Rocm-openmp-extras and some mathlib updates
* iAudit error correction and rocmlir update
* Updating dependency on suite-sparse and adding path
* Style check error corection
* hip-tensor 6.1.0 update
* rdc 6.1 needs grpc 1.59.1
* rvs 6.1 updates and patch
2024-05-08 10:26:30 -07:00
dependabot[bot]
2bcba57757 build(deps): bump actions/checkout from 4.1.4 to 4.1.5 (#44045)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.4 to 4.1.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](0ad4b8fada...44c2b7a8a4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-08 09:22:48 -07:00
shanedsnyder
37330e5e2b darshan-runtime,darshan-util,py-darshan: add darshan-3.4.5 packages (#43989)
* add darshan-3.4.5 packages, update URLs
* py-setuptools version switches for py-darshan
* more py-darshan test dependencies
* try a conditional importlib_resources dependency
2024-05-08 09:06:24 -07:00
snehring
b4411cf2db iq-tree: add new version, delete duplicate package (#44043)
* iq-tree: add new version, delete duplicate package
* Replace iqtree2 dependency
   orthofinder: replace iqtree2 with iq-tree@2
   py-phylophlan: replace iqtree2 with iq-tree@2
2024-05-08 09:02:06 -07:00
Harmen Stoppels
65d1ae083c gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:18:12 +02:00
andriish
0b8faa3918 cernlib: add 2023.08.14.0-free (#40211)
* Update CERNLIB

Update CERNLIB

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of andriish

* cernlib: merge crypto->crypt patches

* cernlib: depends_on xbae when `@2023:`

* cernlib: patch for Xbae and Xm link order (DSO)

* [@spackbot] updating style on behalf of andriish

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 06:58:07 -05:00
Alec Scott
f077c7e33b unmaintained pkg bump: 2024-05-05 (#44021)
* unmaintained pkgs: bump versions

* Changes following review feedback

* glm: update cmake dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 05:53:49 -06:00
Massimiliano Culpo
9d7410d22e gcc: add v14.1.0 (#44053) 2024-05-08 12:14:44 +02:00
Tamara Dahlgren
e295730d0e mold: Replace maintainer (#44047)
* Remove maintainer at his request

* Add msimberg as the maintainer
2024-05-08 09:29:43 +02:00
Todd Gamblin
868327ee14 r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 08:56:37 +02:00
Massimiliano Culpo
f5430b16bc Bump removal version in deprecation messages (#44064) 2024-05-08 08:49:14 +02:00
Tamara Dahlgren
2446695113 Remove dead environment creation code (#44065) 2024-05-07 22:49:06 -06:00
snehring
e0c6cca65c orca: switching to xz archives, removing old version (#44035) 2024-05-07 15:18:53 -07:00
Auriane R
84ed4cd331 Add transformer engine package (#43982)
* Add py-flash-attn@2.4.2
* Add py-transfomer-engine package

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-07 14:56:34 -07:00
Juan Miguel Carceller
f6d50f790e gaudi: Add version 38.1 (#44055)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-07 14:20:02 -07:00
Tim Haines
d3c3d23d1e Dyninst: update compiler requirements (#44033)
As of 13.0.0, Dyninst can now build with any Linux compiler.
2024-05-07 15:03:17 -06:00
Vicente Bolea
33752c2b55 fix(adios2): fix missing stdint include in 2.7 (#43786) 2024-05-07 15:52:20 -05:00
Harmen Stoppels
26759249ca gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-07 18:59:12 +02:00
dependabot[bot]
8b4cbbe7b3 build(deps): bump pygments from 2.17.2 to 2.18.0 in /lib/spack/docs (#44044)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.2...2.18.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-07 18:55:04 +02:00
Richarda Butler
be71f9fdc4 Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
   spack:
     specs: []
     concretizer:
         unify: true
     include_concrete:
     - /path/to/environment1
     - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected after the environment
is re-concretized from the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

    Co-authored-by: Kayla Butler <butler59@llnl.gov>
    Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
    Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
    Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-07 09:32:40 -07:00
Massimiliano Culpo
05c1e7ecc2 Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-07 17:56:29 +02:00
Massimiliano Culpo
f7afd67a26 Remove spurious ASP debug lines (#44051) 2024-05-07 11:28:38 +02:00
psakievich
d22bdc1c4e certs: fix interpolation and disallow relative paths (#44030) 2024-05-07 11:16:32 +02:00
Mikael Simberg
540f9eefb7 mold: Add 2.30.0 and 2.31.0 (#44034) 2024-05-07 10:41:13 +02:00
Massimiliano Culpo
2db5bca778 Warn users of the future removal of platform=cray (#43980) 2024-05-07 10:30:23 +02:00
Harmen Stoppels
bcd05407b8 llnl.util.tty.color._force_color: init in global scope (#44036)
Currently SPACK_COLOR=always is not respected in the build process on
macOS, because the global `_force_color` is re-evaluated in global scope
during module setup, where it is always `None`.

So, move global init bits from main.py to the module itself.
2024-05-07 09:49:46 +02:00
John W. Parent
b35ec605fe python-venv: use correct python name for which call (#44048) 2024-05-07 09:32:44 +02:00
Massimiliano Culpo
0a353abc42 Fix issues in packages with the new release of archspec (#44037) 2024-05-07 09:20:34 +02:00
Massimiliano Culpo
e178c58847 Respect requests when filtering reused specs (#44042)
Some specs which were excluded from reuse,
are currently added back to the solve when
we traverse dependencies of other reusable
specs.

This fixes the issue by keeping track of what
we can explicitly reuse.
2024-05-07 09:06:51 +02:00
Massimiliano Culpo
d7297e67a5 Maintenance for the "bootstrap" workflow in CI (#44031)
Removed a lot of duplication.

Fixed an issue in containers that led to false negatives
2024-05-07 08:49:53 +02:00
snehring
ee8addf04a neic-finitefault: adding new package and dependencies (#44032)
* neic-finitefault: adding new package
* py-obspy: adding new package
* py-okada-wrapper: adding new package
* py-pygmt: adding new package
* py-pyrocko: adding new package

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-06 15:45:32 -07:00
Jon Rood
fd3cd3a1c6 amr-wind: updates and fixes (#43993)
* Updates and fixes for AMR-Wind.

* Add versions and update openfast constraints.

* Style.

* Fix version.
2024-05-06 13:31:29 -06:00
Garth N. Wells
e585aeb883 (py)-fenics-dolfinx: version update (#44002)
* Update DOLFINx to v0.8
* Fix typo
2024-05-06 12:11:02 -07:00
James Taliaferro
1f43384db4 Add LSP client extension for Kakoune (#44000)
* new package: kakoune-lsp
* blacken
* add myself as a maintainer
2024-05-06 11:18:45 -07:00
Adam J. Stewart
814b328fca py-torchmetrics: add v1.4.0 (#44027) 2024-05-06 10:21:20 -07:00
Harmen Stoppels
125206d44d python: always use a venv (#40773)
This commit adds a layer of indirection to improve build isolation with 
and without external Python, as well as usability of environment views.

It adds `python-venv` as a dependency to all packages that `extends("python")`, 
which has the following advantages:

1. Build isolation: only `PYTHONPATH` is considered in builds, not 
   user / system packages
2. Stable install layout: fixes the problem on Debian, RHEL and Fedora where 
   external / system python produces `bin/local` subdirs in Spack install prefixes. 
3. Environment views are Python virtual environments (and if you add 
   `py-pip` things like `pip list` work)

Views work whether they're symlink, hardlink or copy type.

This commit additionally makes `spec["python"].command` return 
`spec["python-venv"].command`. The rationale is that packages in repos we do 
not own do not pass the underlying python to the build system, which could still 
result in incorrectly computed install layouts.

Other attributes like `libs`, `headers` should be on `python` anyways and need no change.
2024-05-06 16:17:35 +02:00
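
A standard-library illustration of the isolation property this relies on: a venv's interpreter resolves `sys.prefix` to the venv, not to the external python (a demonstration only, not Spack's `python-venv` package):

```python
import subprocess
import tempfile
import venv

with tempfile.TemporaryDirectory() as env_dir:
    venv.EnvBuilder(with_pip=False, clear=True, symlinks=True).create(env_dir)
    # On Windows the interpreter lives under Scripts\python.exe instead.
    subprocess.run(
        [f"{env_dir}/bin/python", "-c", "import sys; print(sys.prefix != sys.base_prefix)"]
    )  # prints: True
```
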
John W. Parent
a081b875b4 proj: patch for modern cmake (#43780) 2024-05-06 16:01:10 +02:00
Harmen Stoppels
a16ee3348b Do not cache indices in Gitlab (#44029) 2024-05-06 15:54:21 +02:00
Massimiliano Culpo
d654d6b1f4 Remove Fedora 37 and 38, Ubuntu 18 from CI (#44006) 2024-05-06 15:51:45 +02:00
Harmen Stoppels
9b4ca0be40 clingo bootstrap: remove 3.12 patch and concretizer workarounds (#44028) 2024-05-06 15:00:41 +02:00
Harmen Stoppels
dc71dcfdc2 bootstrap: lazy bootstrapping of clingo and GnuPG (#44026)
Currently bootstrapping from source fails because clingo requires gnupg
requires clingo.

This commit stops eager bootstrapping. We don't need `patchelf` nor `gnupg`
generally. They're bootstrapped when needed.
2024-05-06 14:02:39 +02:00
Greg Becker
1f31c3374c External package detection for compilers (#43464)
This creates shared infrastructure for compiler packages to implement the 
detailed search capabilities from the `spack compiler find` command for the 
`spack external find` command.

After this commit, `spack compiler find` can be replaced with 
`spack external find --tag compiler`, with the exception of mixed toolchains.
2024-05-06 10:33:33 +02:00
Massimiliano Culpo
27aeb6e293 Update vendored archspec to v0.2.4 (#44005) 2024-05-06 10:20:56 +02:00
Harmen Stoppels
715214c1a1 spack env create <env>: dir if dir-like (#44024)
A named env cannot contain `.` or `/`.

So when a user runs `spack env create ./here` do not error but treat it
as `spack env create -d ./here`.

Also fix help string of `spack env create`, which seems to have been
copied from `activate` incorrectly.
2024-05-06 02:00:23 -06:00
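
A sketch of the dispatch rule, following directly from the constraint quoted above (the helper name is made up):

```python
def looks_like_directory_env(arg: str) -> bool:
    # Named environments may contain neither "." nor "/", so anything
    # containing either must be a directory path.
    return "/" in arg or "." in arg


assert looks_like_directory_env("./here")
assert not looks_like_directory_env("myenv")
```
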
dependabot[bot]
b471d62dbd build(deps): bump black from 24.4.0 to 24.4.2 in /lib/spack/docs (#43878)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-06 09:55:43 +02:00
Massimiliano Culpo
a5f62889ca archspec: add v0.2.4 (#44022) 2024-05-06 09:51:01 +02:00
Alec Scott
2a942d98e3 pkgconf: add v2.2.0 (#44008) 2024-05-06 09:31:41 +02:00
Alec Scott
4a4077d4ef rust: add v1.78.0 (#44012) 2024-05-06 09:30:34 +02:00
Alec Scott
c0fcccc232 go: add v1.22.2 (#44013) 2024-05-06 09:30:13 +02:00
Alec Scott
0b2cbfefce ncurses: add v6.5 (#44015) 2024-05-06 09:30:00 +02:00
Massimiliano Culpo
c499514322 libgpg-error: add v1.49 (#44023) 2024-05-06 09:28:31 +02:00
Alec Scott
ae392b5935 pcre2: add v10.43 (#44020) 2024-05-06 09:25:11 +02:00
Alec Scott
62e9bb5d51 libunistring: add v1.2 (#44018) 2024-05-06 09:24:12 +02:00
Alec Scott
6cd948184e libidn2: add v2.3.7 (#44017) 2024-05-06 09:23:50 +02:00
Alec Scott
44ff24f558 openssl: v3.3.0 (#44019) 2024-05-06 09:22:15 +02:00
Alec Scott
c657dfb768 gettext: v0.22.5 (#44014) 2024-05-05 06:57:42 -06:00
Pranav Sivaraman
f2e410d95a lazygit: add v0.41.0 (#43903) 2024-05-04 17:18:30 -06:00
dependabot[bot]
df443a38d6 build(deps): bump actions/checkout from 4.1.2 to 4.1.4 (#43831)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.2 to 4.1.4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](9bb56186c3...0ad4b8fada)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:49:26 -07:00
dependabot[bot]
74fe498cb8 build(deps): bump mypy from 1.9.0 to 1.10.0 in /lib/spack/docs (#43834)
Bumps [mypy](https://github.com/python/mypy) from 1.9.0 to 1.10.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/1.9.0...v1.10.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:48:49 -07:00
Carlos Bederián
5f13a48bf2 mesa: add 23.3.6 (#43736) 2024-05-04 15:46:39 -07:00
Jon Rood
c4824f7fd2 ccls: add new versions (#43923) 2024-05-04 15:42:07 -07:00
dependabot[bot]
49a8634584 build(deps): bump codecov/codecov-action from 4.3.0 to 4.3.1 (#43945)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](84508663e9...5ecb98a3c6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:39:42 -07:00
Seth Bromberger
eac5ea869f tmux: add v3.4 and missing yacc build dep (#43975) 2024-05-04 22:33:19 +02:00
Juan Miguel Carceller
f5946c4621 pythia8: Add a gzip variant and filter some compiler wrapper paths (#43807)
* Add a gzip variant and filter some compiler wrapper paths

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-04 09:33:44 -05:00
Harmen Stoppels
8564ab19c3 fix iconv from libc (#43996)
* fix iconv from libc

* fix args in glib
2024-05-04 10:30:43 +02:00
Scott Wittenburg
aae7a22d39 gitlab: release branch pipelines rebuild what changed (#43990) 2024-05-04 10:11:48 +02:00
eugeneswalker
09cea230b4 e4s-alc: add v1.0.2 (#44001) 2024-05-03 22:24:08 -06:00
wspear
a1f34ec58b Add a gmake dependency for TAU (#43870)
We discovered a case where the system gmake can get confused by spack's build environment. Let's use a spack-provided gmake for consistent builds.
2024-05-03 20:15:40 -07:00
Auriane R
4d7cd4c0bf Add py-flash-attn@2.4.2 (#43960) 2024-05-03 16:40:46 -07:00
Christopher Christofi
4adbfa3835 fix package permissions docs link for gaussian packages (#43998) 2024-05-03 17:35:39 -06:00
Adam J. Stewart
8a1b69c1d3 Modernize py-torch-geometric and dependencies (#43984)
* Modernize py-torch-geometric and dependencies
* Forgot one maintainer
2024-05-03 16:21:41 -07:00
Matthew Thompson
a1d69f8661 hdf package: fix building with apple-clang@15: (#43964) 2024-05-03 16:11:55 -07:00
Carlos Bederián
e05dbc529e globalarrays package: add missing dependencies for armci=ofi/openib (#43971) 2024-05-03 16:08:29 -07:00
jdomke
99d33bf1f2 hpl: linking against fujitsu ssl2 if available (#43978)
* linking against fujitsu ssl2 if available

---------

Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:08:12 -07:00
jdomke
bd1918cd71 adding clang compiler checks which behaves exactly like aocc (#43976)
Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:01:14 -07:00
snehring
2a967c7df4 graphicsmagick package: add version 1.3.43 (#43977) 2024-05-03 15:59:10 -07:00
downloadico
7596aac958 New package: py-pylith (#43987) 2024-05-03 15:56:21 -07:00
Frédéric Simonis
c73ded8ed6 preCICE: increase baseline versions for next ver (#43985) 2024-05-03 15:55:46 -07:00
Satish Balay
df1d783035 sowing: add @master (#43991) 2024-05-03 15:37:39 -07:00
Massimiliano Culpo
47051c3690 Add a default provider for armci (#43997) 2024-05-03 23:48:58 +02:00
Owen Solberg
3fd83c637d Add checksum for R v4.4.0 (#43973)
Co-authored-by: Owen Solberg <owen.solberg@sana.com>
2024-05-03 14:47:53 -07:00
Jon Rood
ef5afb66da openfast: updates to package (#43994) 2024-05-03 15:22:12 -06:00
snehring
ecc4336bf9 r-asreml: adding new package (#43970)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-03 12:20:58 -07:00
Harmen Stoppels
d2ed217796 concretizer args: --fresh-roots == --reuse-deps (#43988)
Since reuse is the default now, `--reuse-deps` can be confusing, as it
technically does not imply roots are fresh.

So add `--fresh-roots`, which is also easier to discover when running
`spack concretize --fre<tab>`
2024-05-03 12:12:36 -07:00
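
A sketch of how two flag spellings can share one destination with `argparse` (the names mirror the commit; the real CLI wiring may differ):

```python
import argparse

parser = argparse.ArgumentParser(prog="spack concretize")
parser.add_argument(
    "--fresh-roots",
    "--reuse-deps",
    dest="reuse_deps",
    action="store_true",
    help="reuse installed dependencies, but concretize roots anew",
)

print(parser.parse_args(["--fresh-roots"]).reuse_deps)  # True
```
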
Dom Heinzeller
272c7c069a Add -fPIC to hdf-eos2 builds (#43794)
* Add -fPIC to hdf-eos2 builds
* Use self.compiler.cc_pic_flag instead of -fPIC
* Fix style in var/spack/repos/builtin/packages/hdf-eos2/package.py
2024-05-03 11:38:33 -07:00
Jeff Hammond
23f16041cd NWChem ARMCI-MPI variant and optional TCE builds (#43883)
* WIP NWChem with ARMCI-MPI and TCE optional
* rename armci-mpi to armcimpi
* add version for git
* add ARMCI-MPI support to NWChem package
   EXTERNAL_ARMCI_PATH needs to be set to the ARMCI-MPI package install prefix.
   I could not get it from spec["armcimpi"], but it worked to use the dependent build environment method to export a generic environment variable to use instead.
* i suppose i can maintain NWChem as well
* check rejects option that dozens of packages use
   i do not have time to fight with this nonsense
   Run . share/spack/setup-env.sh
   ==> Error: armcimpi version 'master' has extra arguments: 'branch'
   Valid arguments for a url fetcher are:
       'url', 'sha256', 'md5', 'sha1', 'sha224', 'sha384', 'sha512', and 'checksum'
   Error: Process completed with exit code 1.
* style
* ARMCI-MPI needs depends_on; the rest seems fixed
* fix ARMCI selection per review feedback from @zzzoom
   https://github.com/spack/spack/pull/43883#discussion_r1585014147
* address reviewer feedback from @yizeyi18
   https://github.com/spack/spack/pull/43883#discussion_r1587228084
   elaborate on what +extratce does, in terms of the NWChem build environment variables,
   some of which are documented on https://nwchemgit.github.io/TCE.html.
* style
* Update var/spack/repos/builtin/packages/nwchem/package.py

---------

Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Carlos Bederián <4043375+zzzoom@users.noreply.github.com>
2024-05-03 11:13:54 -07:00
Frank Willmore
e2329adac0 Update package.py for vacuumms 1.2.0 (#43948)
* updating vacuumms for new release
* formatting
* re-order versions and remove triple quote
2024-05-03 11:06:15 -07:00
jalcaraz
4ec788ca12 [TAU package] Update with +rocprofv2. Updated some tests. (#43937)
* [TAU package] Update with +rocprofv2. Updated some tests.

Updated with the +rocprofv2 flag. Only works with rocm-core >= 6.0.0.

Can be tested with (Omnia):
spack install tau@master +rocm+rocprofv2 %gcc@11
It needs the last commit in our local repository, which can be used by modifying the "git = " line, or by waiting until the public one is updated.


If tests cause issues when building TAU, there is the flag:
disable_tests = False

The rocm test is disabled by default, as the PR regarding tests loading dependencies is not solved (PR#43682).

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-05-03 08:33:13 -07:00
Andrew W Elble
c1cea9d304 py-tensorflow: fix aarch64 build (#43852)
* py-tensorflow: fix aarch64 build

* [@spackbot] updating style on behalf of aweits

* patch format

* change patch strategy, actually get the logic correct in
the patch

* only patch 2.16.1 onwards for aarch64

* __CUDACC__ not __NVCC__

* !(defined(__NVCC__) && defined(__CUDACC__))

---------

Co-authored-by: aweits <aweits@users.noreply.github.com>
2024-05-03 15:46:13 +02:00
Massimiliano Culpo
5c96e67bb1 Make runtimes depend on libc only on linux (#43981) 2024-05-03 15:40:02 +02:00
Marc T. Henry de Frahan
7008bb6335 Add rosco package (#43822)
* Add rosco package

* format

* remove unused

* Add in other versions

* start adding some patches

* finish adding the patches
2024-05-03 01:58:35 -06:00
Adam J. Stewart
14561fafff py-torch: set TORCH_CUDA_ARCH_LIST globally for dependents (#43962) 2024-05-03 09:11:59 +02:00
Massimiliano Culpo
89bf1edb6e Spec.satisfies: fix a bug with concrete spec from JSON (#43968)
Fix a bug triggered by a missing virtual on some transitive edge, in a subdag of a pure build dependency.
2024-05-02 22:16:02 -04:00
Todd Gamblin
cc85dc05b7 docs: re-enable google analytics (#43974)
We recently switched to using the new ReadTheDocs with "addons". That includes its own
analytics, which is nice, but we also want to continue using our GA4 analytics.

Adding GA4 is no longer supported by RTD, so we have to add it manually.

- [x] re-add the gtag to all pages, manually

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-02 21:56:19 -04:00
Adam J. Stewart
ae171f8b83 py-torch: conflict with newer cuda (#43969) 2024-05-02 18:47:33 -07:00
snehring
578dd18b34 minimap2: adding version 2.28 (#43972)
* minimap2: adding version 2.28
* minimap2: adding maintainer

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-02 18:45:59 -07:00
Garth N. Wells
a7a51ee5cf py-fenics-ffcx: update to v0.8 (#43929)
* Update FFCx and UFCx to v0.8
* Syntax fix
2024-05-02 09:53:31 -07:00
Stephanie Labasan Brink
960cc90667 variorum: add branch dev and version 0.8.0 (#43829)
* variorum: add branch dev and version 0.8.0
* formatting
2024-05-02 09:50:05 -07:00
Alex Richert
dea44bad8b grads: fix libpng and g2c deps (#43909)
* grads: fix libpng and g2c deps
* Update package.py
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:42:15 -07:00
Richard Berger
e37870ff43 rdma-core: add versions 51.0, 50.0 and 49.1 (#43926) 2024-05-02 09:41:23 -07:00
Sajid Ali
3751642a27 mold: add new versions (#43931) 2024-05-02 09:39:09 -07:00
Nils Vu
0f386697c6 spectre: add versions, update constraints (#43936) 2024-05-02 09:38:06 -07:00
Alex Richert
67ce103b2c Update prod-util recipe (#43940)
* update prod-util recipe
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:24:39 -07:00
Alex Richert
a8c9fa0e45 g2tmpl: add v1.12.0 (#43941) 2024-05-02 09:23:42 -07:00
Alex Richert
b56a133fce update grib-util recipe (#43939) 2024-05-02 09:22:03 -07:00
Seth R. Johnson
f0b3d33145 Celeritas: new version 0.4.3 (#43949) 2024-05-02 09:04:30 -07:00
Adam J. Stewart
32564da9d0 rust: add conflict for older GCC (#43958) 2024-05-02 08:36:06 -07:00
Stephen Hudson
8f2faf65dc libEnsemble: add v1.3.0 (#43944) 2024-05-02 08:35:06 -07:00
eugeneswalker
1d59637051 e4s ci: add py-amrex (#43904) 2024-05-02 08:57:26 -06:00
jdomke
97dc353cb0 libc: detect ARM flavor (#43959)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-05-02 15:59:29 +02:00
wspear
beebe2c9d3 Accept later pythons for tau@2.33+ (#43911)
* Accept later pythons for tau@2.33+

* separate forward compat bounds
2024-05-02 07:13:21 -06:00
Harmen Stoppels
2eb7add8c4 gcc: write builtin specs before modifications (#43956) 2024-05-02 06:40:10 -06:00
Harmen Stoppels
a9fea9f611 nghttp2: add diffutils (#43954) 2024-05-02 14:22:53 +02:00
Harmen Stoppels
9b62a9c238 gitlab ci: cache user cache (#43952) 2024-05-02 12:06:30 +02:00
wspear
f7eb0ccfc9 Revert #43701 (#43935)
PR #43701 is broken, preventing scorep from being installed. As explained in #43700, the issue is internal to scorep and cannot be fixed in the package, at least by the method being attempted here.
2024-05-02 09:56:21 +02:00
Adrien Bernede
a0aa35667c Ignore external packages when pushing to buildcache automatically (--autopush) (#43930) 2024-05-02 09:50:10 +02:00
Luc Berger
b1d4fd14bc Nalu: adding support for Trilinos 14.2.0 for Nalu 1.6.0 (#43857) 2024-05-02 01:10:27 -06:00
John W. Parent
7e8415a3a6 Windows: auto-add WGL/SDK as externals (#43752)
Adds a pre-concretization check for the Windows SDK and WGL (Windows
GL) packages as non-buildable externals.

This is a redo of https://github.com/spack/spack/pull/43459, but makes
sure to modify the configuration scope outside of the bootstrap scope:
whichever is highest-precedence in the user's environment at the time
the concretization runs, which should either be an env scope or the
~ scope.

Adds pytest fixture mocking the check for WGL and WSDK as if they were
present.
2024-05-02 01:05:03 -06:00
Simon Pintarelli
7f4f42894d update sirius package.py (#43872) 2024-05-02 08:23:10 +02:00
Massimiliano Culpo
4e876b4014 Allow more control over which specs are reused (#42782)
This PR gives users finer control over which specs are reused during concretization.

The value of the `concretizer:reuse` config option now can take an object with the following properties:
- `roots`: true if reusing roots, false if reusing just dependencies
- `exclude`: list of constraints used to select reusable specs 
- `include`: list of constraints used to select reusable specs 
- `from`: allows to select the sources of reused specs

### Examples

#### Reuse only specs compiled with GCC
```yaml
concretizer:
  reuse:
    roots: true
    include:
    - "%gcc"
```

#### `openmpi` must be used from externals, and it must be the only external used
```yaml
concretizer:
  reuse:
    roots: true
    from:
    - type: local
      exclude:
      - "openmpi"
    - type: buildcache
      exclude:
      - "openmpi"
    - type: external
      include:
      - "openmpi"
```
2024-05-01 23:05:26 -04:00
Wouter Deconinck
77a8a4fe08 abseil-cpp: new version 20240116.2 (#43666) 2024-05-01 17:06:57 -07:00
Thomas Bouvier
597e5a4e5e Update py-nvidia-dali to v1.36.0 (#43928)
* py-nvidia-dali: add v1.36.0
* fix: do not expand downloaded wheel
2024-05-01 16:35:43 -07:00
MatthewLieber
3c31c32f62 Adding sha for 7.4 release of OSU Micro Benchmarks (#43899)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-05-01 16:27:52 -07:00
Weiqun Zhang
3a93a716e4 amrex: add v24.05 (#43938) 2024-05-01 16:17:05 -07:00
Filippo Spiga
82229a0784 Adding EXTRAE 4.1.2 (#43907) 2024-05-01 15:55:54 -07:00
Axel Huebl
5d846a69d1 ADIOS2: Campaign Variant (#43906)
With v2.10+, ADIOS added a campaign manager. This is auto-enabled
if SQLite3 is found.

Add explicit control for it now and disables it by default, to avoid
picking up system dependencies or bloating by default the ADIOS2
dependencies. Also, not yet fully mature to be used by default:
https://github.com/ornladios/ADIOS2/issues/4148
2024-05-01 15:50:45 -07:00
Stephen Herbener
d21aa1cc12 Removed use of mpi wrappers for fms and mapl package.py scripts. These were causing (#43726)
builds to fail on MacOS and CMake appears to handle the build fine without the mpi wrappers.
2024-05-01 15:41:30 -07:00
Jake Koester
7896ff51f6 bumped up version of kokkos-kernels (#43920) 2024-05-01 12:20:51 -06:00
Chris Mc
5849a24a74 jwt-cpp: add v0.7.0 (#41894)
* jwt-cpp: add version 0.7.0

* Update var/spack/repos/builtin/packages/jwt-cpp/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-01 12:20:35 -06:00
Alex Leute
38c49d6b82 py-simpy: New package (#43497)
* pypi build of py-simpy

* [py-simpy] toml explicitly called out

* py-simpy: New version

* py-setuptools-scm: Added new version

* py-setuptools-scm: add url_for_version

Because versions @:7 have an underscore (setuptools_scm) in the URL, and
newer versions have a dash (setuptools-scm).

* py-setuptools-scm: fix flake8 line too long

* py-simpy: Fix hash

---------

Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
2024-05-01 12:06:52 -06:00
fpruvost
0d8900986d qrmumps: 3.1 (#43913) 2024-05-01 12:00:25 -06:00
dependabot[bot]
62554cebc4 build(deps): bump black in /.github/workflows/style (#43879)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:37:54 +02:00
Wouter Deconinck
067155cff5 containers: add ubuntu 24.04 (#43881)
* containers: add ubuntu 24.04

* containers: use python3-boto3 pkg instead of pip install
2024-05-01 13:37:13 +02:00
Wouter Deconinck
08e68d779f ci: update upload-artifact to v4 (in build-containers) (#43880) 2024-05-01 13:35:12 +02:00
Adam J. Stewart
05b04cd4c3 Update PyTorch ecosystem (#43823)
* Update PyTorch ecosystem

* Don't expose protobuf namespace to other packages, breaks torchtext build

* Revert "Don't expose protobuf namespace to other packages, breaks torchtext build"

This reverts commit 50a4b08f65.
2024-05-01 13:30:46 +02:00
dependabot[bot]
be48f762a9 build(deps): bump pytest from 8.1.1 to 8.2.0 in /lib/spack/docs (#43908)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.1.1 to 8.2.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.1.1...8.2.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:29:22 +02:00
Axel Huebl
de5b4840e9 Add quantiphy (#43894)
* Add quantiphy

Add https://github.com/KenKundert/quantiphy .

* Simplify Version Range
2024-04-30 15:26:35 -07:00
Harmen Stoppels
20f9884445 curl: perl is temporarily required (#43921) 2024-04-30 23:32:06 +02:00
Harmen Stoppels
deb78bcd93 re2c: depends on gmake (#43916)
regressed in 650a668a9d
2024-04-30 23:30:36 +02:00
Garth N. Wells
06239de0e9 py-fenics-ufl: add new version (#43848)
* Bump version to 2024.1.0
* Bump to .post0
* Dependency fix
* Format file
* Run black on package file
2024-04-30 14:10:55 -07:00
Garth N. Wells
1f904c38b3 Add version 1.9.2. (#43838) 2024-04-30 12:19:31 -07:00
Wouter Deconinck
f2d0ba8fcc PackageStillNeededError: add pkg that needs spec to exception msg (#43845)
* PackageStillNeededError: add pkg that needs spec to exception msg
* PackageStillNeededError: f-string with short fmt and hash
* PackageStillNeededError: split long string
2024-04-30 12:11:47 -07:00
Satish Balay
49d3eb1723 petsc, py-petsc4py: add v3.21.1 (#43887) 2024-04-30 10:07:33 -07:00
Dom Heinzeller
7c5439f48a Update crtm-fix and crtm from JCSDA fork (#43855) 2024-04-30 10:07:16 -07:00
Harmen Stoppels
7f2cedd31f hack: drop glibc and musl in old concretizer (#43914)
The old concretizer creates a cyclic graph when expanding virtuals for
`iconv`, which is a bug. This hack drops glibc and musl as possible
providers for `iconv` in the old concretizer to work around it.
2024-04-30 13:55:48 +02:00
Harmen Stoppels
d47951a1e3 glibc: provides iconv (#43897)
`iconv` is a bit of a weird virtual because the only shared API between
`glibc` and `libiconv` is:

```
iconv
iconv_open
iconv_close
```

whereas `libiconv` has further symbols [iconvctl](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconvctl.3.html), [iconv_open_into](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconv_open_into.3.html), and an `iconv` executable and `libcharset.so`. Packages that need those will have to do `depends_on("[virtuals=iconv] libiconv")`.
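
For illustration, a hypothetical package that needs the libiconv-specific API
and therefore pins the provider (the package itself is made up):

```python
from spack.package import *

class Mytool(AutotoolsPackage):
    """Hypothetical package that calls iconvctl(), which glibc lacks."""

    homepage = "https://example.com/mytool"
    url = "https://example.com/mytool-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum

    # Force libiconv (rather than glibc) as the iconv provider.
    depends_on("[virtuals=iconv] libiconv")
```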
2024-04-30 07:40:00 +02:00
Teague Sterling
f2bd0c5cf1 Fix duckdb version handling again (#43762)
* Added package to build Ollama
* Update package.py
   Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
* We can now use OVERRIDE_GIT_DESCRIBE to set the version in DuckDB
* Update duckdb/package.py
   - use f-string
   - fix version specs to address inconsistencies
* Update package.py
   Fix spec definitions further
* [@spackbot] updating style on behalf of teaguesterling

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-29 18:51:04 -07:00
one
4362382223 Update package.py (#43764) 2024-04-29 18:49:49 -07:00
Nathalie Furmento
ba4859b33d starpu: add release 1.4.5 and dependency for building starpupy (#43810) 2024-04-29 18:48:37 -07:00
Carlos Bederián
e8472714ef ucc: add 1.3.0 (#43828) 2024-04-29 18:45:00 -07:00
Mikael Simberg
ee6960e53e boost: Add 1.85.0 (#43788)
* boost: Add 1.85.0
* Add conflict for Boost 1.85.0 stacktrace change
2024-04-29 18:41:38 -07:00
Nathalie Furmento
dad266c955 starpu: add branch 1.4 (#43837) 2024-04-29 18:38:43 -07:00
Garth N. Wells
7a234ce00a (py-)fenics-basix: update for new version (#43841)
* Bump version to 2024.1.0
* Bump py-basix version
* Revert UFL change
* Python version update
2024-04-29 18:11:36 -07:00
Loris Ercole
a0c2ed97c8 Update scine-qcmaquis (#43817)
`scine-qcmaquis` is updated to version 3.1.4.
The option to build the OpenMolcas interface is added, and some
dependencies are clarified.
2024-04-29 16:38:29 -07:00
Rémi Lacroix
a3aa5b59cd dotnet-core-sdk: Fix environment setup (#43842)
The "DOTNET_CLI_TELEMETRY_OPTOUT" environment variable should be defined when using the product, not when installing it (the installation phase is just extract the files anyway).

Also use `~` instead of `-` to check for the variant and fix the second argument for `env.set`  which should also be a string.
2024-04-29 14:47:37 -07:00
Zach Jibben
f7dbb59d13 Update Petaca (#43849)
* Add recent petaca releases
* Add Petaca 24.04
2024-04-29 14:42:11 -07:00
Wouter Deconinck
0df27bc0f7 nopayloadclient: new package (#43853) 2024-04-29 14:16:57 -07:00
Massimiliano Culpo
877e09dcc1 Delete leftover file (#43869)
This file is not needed anymore; it was introduced in #19688
2024-04-29 22:45:41 +02:00
Adam J. Stewart
c4439e86a2 Prettier: add new package (#43866) 2024-04-29 13:39:52 -07:00
Pramod Kumbhar
aa00dcac96 gprofng-gui: new package (#43873) 2024-04-29 13:38:48 -07:00
Adam J. Stewart
4c9a946b3b py-keras: add v3.3.3 (#43885) 2024-04-29 13:30:52 -07:00
Wouter Deconinck
0c6e6ad226 xbae: new package (#43889) 2024-04-29 13:05:56 -07:00
Adam J. Stewart
bf8f32443f py-einops: add v0.8.0 (#43890) 2024-04-29 12:54:33 -07:00
Wouter Deconinck
c2eef8bab2 fontconfig: depends_on gperf when 2.11.1: (#43892) 2024-04-29 11:56:59 -07:00
afzpatel
2df4b307d7 hipblaslt: new package (#43846)
* initial commit to add hipblaslt package
* remove master and update patch
* add docstring comment
2024-04-29 11:48:03 -07:00
Ryan Marcellino
3c57440c10 openjdk: add v21_35 (#40699)
* openjdk: add v21+35

* add provides java@21

* Update var/spack/repos/builtin/packages/openjdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-04-29 10:51:17 -05:00
Harmen Stoppels
3e6e9829da compiler.py: fix early return (#43898) 2024-04-29 08:53:27 -06:00
Gregory Becker
859745f1a9 Run audits on Windows
Add debug log for external detection tests. The debug log
is used to print which test is being executed.

Skip version audit on Windows where appropriate
2024-04-29 14:13:10 +02:00
Massimiliano Culpo
ddabb8b12c Fix concretization when installing missing compilers (#43876)
Restore the previous behavior when config:install_missing_compilers
is True. The libc of the missing compiler is inferred from the
Python process.
2024-04-29 08:20:33 +02:00
Jeff Hammond
16bba32124 add ILP64 option for BLIS (#43882)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
2024-04-29 08:14:25 +02:00
Michael Kuhn
7d87369ead autoconf: fix typo in m4 dependencies (#43893)
m4 1.4.8 is actually required starting with autoconf 2.72 according to
the NEWS file.
2024-04-28 18:34:12 -05:00
Adam J. Stewart
7723bd28ed Revert "package/npm update (#43692)" (#43884)
This reverts commit 03a074ebe7.
2024-04-27 08:58:12 -06:00
Harmen Stoppels
43f3a35150 gcc: generate spec file and fix external libc default paths after install from cache (#43839)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-27 08:49:20 -06:00
Jonathon Anderson
ae9f2d4d40 containers: Add Fedora 40, 39 (#43847) 2024-04-26 20:02:04 -05:00
Huston Rogers
5a3814ff15 Miniconda3: Added Versions: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0 (#43868)
* Added Versions of miniconda3: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0

* fixed style

---------

Co-authored-by: James H. Rogers <jhrogers@spear.hpc.msstate.edu>
2024-04-26 18:58:58 -06:00
Sakib Rahman
946c539dbd osg-ca-certs: new version osg 1.119 and igtf 1.128 (#43871)
* Update package.py to osg 1.119 and igtf 1.128

* Remove unnecessary white space

* Add missing comma

* change url to use git and use commit hash from repository instead of sha256sum of tar.gz

* [@spackbot] updating style on behalf of rahmans1

---------

Co-authored-by: rahmans1 <rahmans1@users.noreply.github.com>
2024-04-26 18:39:12 -06:00
Robert Cohn
0037462f9e [SOS] update license (#43864)
Auto-generated license was wrong
2024-04-26 18:45:29 +02:00
Massimiliano Culpo
e5edac4d0c ASP-based solver: update os compatibility for macOS (#43862) 2024-04-26 18:34:46 +02:00
Jeff Hammond
3e1474dbbb ARMCI-MPI: add new package (#43813)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-26 16:35:33 +02:00
Abhishek Yenpure
0f502bb6c3 libcatalyst: add v2.0.0 (#43804) 2024-04-26 16:27:46 +02:00
Robert Cohn
1eecbd3208 [intel-oneapi-*] no redistribution (#43826) 2024-04-26 12:03:39 +00:00
Jack Morrison
6e92b9180c Add tests-sos package + Variants to sos package (#43830)
* * Add initial tests-sos package
* Remove failing call of missing setup_compiler_environment from sos
  package
* Add several variants for sos package

* [@spackbot] updating style on behalf of jack-morrison
2024-04-26 07:26:21 -04:00
Massimiliano Culpo
ac9012da0c spack audit externals: allow selecting platforms and checking extra attributes (#43782) 2024-04-26 12:47:17 +02:00
Mosè Giordano
e3cb4f09f0 julia: add v1.10.2 (#41151)
* julia: add v1.10.2

* julia: add patch to remove suite-sparse cuda stub files

* julia: use permalinks for patches
2024-04-26 12:44:54 +02:00
Harmen Stoppels
2e8600bb71 gcc: simplify specs file, make binutils locatable (#43861) 2024-04-26 01:30:51 -06:00
Harmen Stoppels
d946c37cbb ldflags=* are compiler flags, not linker flags (#43820)
We run `extend spack_flags_list SPACK_LDFLAGS` for `$mode in ld|ccld`.

That's problematic, because `ccld` needs `-Wl,--flag` whereas `ld` needs
`--flag` directly. Only `-L` and `-l` are common to compiler & linker.

In all build systems `LDFLAGS` is for the compiler, not the linker, because
any linker flag `-x` can be passed as a compiler flag `-Wl,-x`, and there
are many compiler flags that affect the linker invocation, like `-fopenmp`,
`-fuse-ld=`, `-fsanitize=`, etc.

So don't pass `LDFLAGS` to the linker directly.

This way users can set `ldflags: -Wl,--allow-shlib-undefined` in compilers.yaml
to work around an issue where the linker tries to resolve the `libcuda.so.1`
stub lib which cannot be located by design in `cuda`.
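
As a rough sketch, such a compilers.yaml entry might look like this (the
compiler spec and paths are hypothetical):

```yaml
compilers:
- compiler:
    spec: gcc@12.3.0
    operating_system: ubuntu22.04
    paths:
      cc: /usr/bin/gcc
      cxx: /usr/bin/g++
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    flags:
      ldflags: -Wl,--allow-shlib-undefined
    modules: []
```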
2024-04-26 09:19:03 +02:00
Harmen Stoppels
47a9f0bdf7 llvm: use --gcc_install_dir in config files (#43795) 2024-04-26 09:01:02 +02:00
Alex Richert
2bf900a893 Update package.py (#43836) 2024-04-25 16:00:43 -07:00
Andrey Perestoronin
99bba0b1ce Add intel-oneapi-mpi 2021.12.1 patch package (#43850)
* Add intel-oneapi-mpi package

* Fix style
2024-04-25 13:38:47 -06:00
Juan Miguel Carceller
a8506f9022 glew package: add ld flags when compiling with ^apple-gl (#43429) 2024-04-25 11:18:00 -07:00
Harmen Stoppels
4a40a76291 build_environment.py: expand SPACK_MANAGED_DIRS with realpath (#43844) 2024-04-25 13:33:50 +02:00
downloadico
fe9ddf22fc spatialdata: add spatialdata package to spack (#43500) 2024-04-25 04:18:08 -07:00
Adam J. Stewart
1cae1299eb CI: remove ML ROCm stack (#43825) 2024-04-25 12:56:59 +02:00
Adam J. Stewart
8b106416c0 py-lightning: add v2.2.3 (#43824) 2024-04-25 12:03:01 +02:00
jalcaraz
e2088b599e [TAU Package] Updates for rocm (#43790)
* Updates for rocm

Updated for rocm@6
Added a conflict between rocprofiler and roctracer.
When +rocm is requested, either +rocprofiler or +roctracer is required; the package now automatically builds with one of them instead of just displaying a message.
Conversely, requesting +rocprofiler or +roctracer now automatically builds with +rocm instead of displaying a message.

Disabled the tests. Will update them with the new test method.

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-04-24 16:35:41 -07:00
Ken Raffenetti
56446685ca mpich: add 4.2.0 release (#42687) 2024-04-24 12:33:59 -07:00
Teo
47a8d875c8 update halide versions; sync llvm (#43793) 2024-04-24 12:27:42 -07:00
Adam J. Stewart
56b2d250c1 py-keras: add v3.3.1-2 (#43798) 2024-04-24 12:23:34 -07:00
David Boehme
abbd09b4b2 Add Caliper v2.11 (#43802) 2024-04-24 12:21:56 -07:00
Rémi Lacroix
9e5fdc6614 dotnet-core-sdk: Add versions 7.0.18 and 8.0.4 (#43814) 2024-04-24 12:19:33 -07:00
Harmen Stoppels
1224a3e8cf clang.py: detect flang-new (#43815)
If a flang-new exists, which is rather unlikely, it probably means the
user wants it as a Fortran compiler.
2024-04-24 19:11:02 +02:00
Jake Koester
6c3218920f Trilinos: update kokkos dependency (#43785)
* fix so trilinos@master uses correct kokkos (@4.3.00)

* Update var/spack/repos/builtin/packages/trilinos/package.py
2024-04-24 10:42:08 -06:00
Peter Scheibel
02cc3ea005 Add new redistribute() directive (#20185)
Some packages can't be redistributed in source or binary form. We need an explicit way to say that in a package.

This adds a `redistribute()` directive so that package authors can write, e.g.:

```python
    redistribute(source=False, binary=False)
```

You can also do this conditionally with `when=`, as with other directives, e.g.:

```python
    # 12.0 and higher are proprietary
    redistribute(source=False, binary=False, when="@12.0:")

    # can't redistribute when we depend on some proprietary dependency
    redistribute(source=False, binary=False, when="^proprietary-dependency")
```


This prevents Spack from adding either their sources or binaries to public mirrors and build caches. You can still unconditionally add things *if* you run either:
* `spack mirror create --private`
* `spack buildcache push --private`

But the default behavior for build caches is not to include non-redistributable packages in either mirrors or build caches.  We have previously done this manually for our public buildcache, but with this we can start maintaining redistributability directly in packages.

Caveats: currently the default for `redistribute()` is `True` for both `source` and `binary`, and you can only set either of them to `False` via this directive.

- [x] add `redistribute()` directive
- [x] add `redistribute_source` and `redistribute_binary` class methods to `PackageBase`
- [x] add `--private` option to `spack mirror`
- [x] add `--private` option to `spack buildcache push`
- [x] test exclusion of packages from source mirror (both as a root and as a dependency)
- [x] test exclusion of packages from binary mirror (both as a root and as a dependency)
2024-04-24 09:41:03 -07:00
John W. Parent
641ab95a31 Revert "Windows: add win-sdk/wgl externals during bootstrapping (#43459)" (#43819)
This reverts commit 9e2558bd56.
2024-04-24 18:24:28 +02:00
Adam J. Stewart
e8b76c27e4 py-matplotlib: drop test dep (#43765)
test dependencies constrain build/link-type deps, so avoid that
2024-04-24 18:03:13 +02:00
Massimiliano Culpo
0dbe4d54b6 Tune default compiler preferences (#43805)
Add `%oneapi`, remove compilers that have been discontinued upstream
2024-04-24 18:00:40 +02:00
Harmen Stoppels
1eb6977049 rust: use system libs (#43797) 2024-04-24 09:46:11 -06:00
Harmen Stoppels
3f1cfdb7d7 libc: from current python process (#43787)
If there's no compiler we currently don't have any external libc for the solver.

This commit adds a fallback on libc from the current Python process, which works if it is dynamically linked.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-24 05:10:48 -06:00
Massimiliano Culpo
d438d7993d nf-*-cli: fix typo in conditional (#43806)
The default runner changed on GitHub for macOS, and that
revealed a bug in a package when running audits
2024-04-24 11:30:35 +02:00
Todd Gamblin
aa0825d642 Refactor to improve spec format speed (#43712)
When looking at where we spend our time in solver setup, I noticed a fair bit of time is spent
in `Spec.format()`, and `Spec.format()` is a pretty old, slow, convoluted method.

This PR does a number of things:
- [x] Consolidate most of what was being done manually with a character loop and several
      regexes into a single regex.
- [x] Precompile regexes where we keep them 
- [x] Remove the `transform=` argument to `Spec.format()` which was only used in one 
      place in the code (modules) to uppercase env var names, but added a lot of complexity
- [x] Avoid escaping and colorizing specs unless necessary
- [x] Refactor a lot of the colorization logic to avoid unnecessary object construction
- [x] Add type hints and remove some spots in the code where we were using nonexistent
      arguments to `format()`.
- [x] Add trivial cases to `__str__` in `VariantMap` and `VersionList` to avoid sorting
- [x] Avoid calling `isinstance()` in the main loop of `Spec.format()`
- [x] Don't bother constructing a `string` representation for the result of `_prev_version`
      as it is only used for comparisons.

In my timings (on all the specs formatted in a solve of `hdf5`), this is over 2.67x faster than the 
original `format()`, and it seems to reduce setup time by around a second (for `hdf5`).
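
As a purely illustrative Python sketch of the precompiled-regex idea (not
Spack's actual pattern or code), compare compiling once at module scope to
recompiling on every call:

```python
import re

# Compiled once when the module loads, instead of on every format() call.
_FORMAT_RE = re.compile(r"\{([^{}]*)\}")  # hypothetical placeholder syntax

def format_spec(template: str, attrs: dict) -> str:
    # A single regex pass replaces a character loop plus several ad-hoc
    # regexes; unknown placeholders are left untouched.
    return _FORMAT_RE.sub(lambda m: str(attrs.get(m.group(1), m.group(0))), template)

print(format_spec("{name}@{version}", {"name": "hdf5", "version": "1.14.3"}))
# -> hdf5@1.14.3
```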
2024-04-23 10:52:15 -07:00
Greg Becker
978c20f35a concretizer: update reuse: default to True (#41302) 2024-04-23 17:42:14 +02:00
Marc T. Henry de Frahan
d535124500 Update amr-wind package with versions (#43728) 2024-04-23 05:07:42 -06:00
Harmen Stoppels
01f61a2eba Remove import distro from packages and docs (#43772) 2024-04-23 12:47:33 +02:00
Massimiliano Culpo
7d5e27d5e8 Do not detect a compiler without a C compiler (#43778) 2024-04-23 12:20:33 +02:00
Harmen Stoppels
d210425eef nettle: remove openssl dep (#43770) 2024-04-23 07:44:15 +02:00
Nathalie Furmento
6be07da201 starpu: fix release 1.4.4 (#43730) 2024-04-22 15:56:23 -07:00
Filippo Spiga
02b38716bf Adding UCX 1.16.0 (#43743)
* Adding UCX 1.16.0
* Fixed hash
2024-04-22 15:52:20 -07:00
Adam J. Stewart
d7bc624c61 Tags: add more build tools (#43766)
* Tags: add more build tools
* py-pythran: add maintainer
2024-04-22 15:34:38 -07:00
Adam J. Stewart
b7cecc9726 py-scikit-image: add v0.23 (#43767) 2024-04-22 15:32:55 -07:00
Adam J. Stewart
393a2f562b py-keras: add v3.3.0 (#43783) 2024-04-22 15:04:06 -07:00
Massimiliano Culpo
682fcec0b2 zig: add v0.12.0 (#43774) 2024-04-22 15:02:45 -07:00
Harmen Stoppels
d6baae525f repo.py: drop deleted packages from provider cache (#43779)
The reverse provider lookup may have stale entries for deleted packages, which used to cause errors. It's hard to invalidate those cache entries, so this commit simply drops entries w/o invalidating the cache.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-22 19:03:44 +02:00
Kyle Knoepfel
e1f2612581 Adjust severity of irreversible operations (#43721) 2024-04-22 16:41:53 +02:00
Harmen Stoppels
080fc875eb compiler.py: reduce verbosity of implicit link dirs parsing (#43777) 2024-04-22 16:07:14 +02:00
Harmen Stoppels
69f417b26a view: dont warn about externals (#43771)
since this is the status quo on Linux now that libc is an external by default
2024-04-22 16:05:32 +02:00
Harmen Stoppels
80b5106611 bootstrap: no need to add dummy compilers (#43775) 2024-04-22 16:01:41 +02:00
Massimiliano Culpo
34146c197a Add libc dependency to compiled packages and runtime deps
This commit differentiates Linux from other platforms by
using libc compatibility as a criterion for deciding
which buildcaches / binaries can be reused. Other
platforms still use OS compatibility.

On Linux a libc is injected by all compilers as an implicit
external, and the compatibility criterion is that a libc is
compatible with all other libcs with the same name and a
version that is less than or equal.

Some concretization unit tests use libc when run on Linux.
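
A minimal sketch of that criterion, using a simple stand-in for a spec:

```python
from collections import namedtuple

Libc = namedtuple("Libc", ["name", "version"])  # illustrative stand-in

def libc_compatible(host: Libc, binary: Libc) -> bool:
    # A binary built against some libc can be reused on a host whose libc
    # has the same name and a greater-or-equal version.
    return binary.name == host.name and binary.version <= host.version

assert libc_compatible(Libc("glibc", (2, 35)), Libc("glibc", (2, 31)))
assert not libc_compatible(Libc("glibc", (2, 31)), Libc("musl", (1, 2)))
```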
2024-04-22 15:18:06 +02:00
Harmen Stoppels
209a3bf302 Compiler.default_libc
Some logic to detect what libc the C / C++ compilers use by default,
based on `-dynamic-linker`.

The function `compiler.default_libc()` returns a `Spec` of the form
`glibc@x.y` or `musl@x.y` with the `external_path` property set.

The idea is this can be injected as a dependency.

If we can't run the dynamic linker directly, fall back to `ldd` relative
to the prefix computed from `ld.so`.
2024-04-22 15:18:06 +02:00
Harmen Stoppels
e8c41cdbcb database.py: stream of json objects forward compat (#43598)
In the future we may transform the database from a single JSON object to
a stream of JSON objects.

This paves the way for constant time writes and constant time rereads
when only O(1) changes are made. Currently both are linear time.

This commit gives just enough forward compat for Spack to produce a
friendly error when we would move to a stream of json objects, and a db
would look like this:

```json
{"database": {"version": "<something newer>"}}
```
2024-04-22 09:43:41 +02:00
Massimiliano Culpo
a450dd31fa Fix a bug preventing to set platform= on externals (#43758)
closes #43406
2024-04-22 09:15:22 +02:00
Alex Richert
7c1a309453 perl: remove mkdirp from setup_dependent_package (#43733) 2024-04-20 21:34:07 +02:00
Massimiliano Culpo
78b6fa96e5 ci.py: visit all edges (#43761) 2024-04-20 21:29:32 +02:00
Jordan Ogas
1b315a9ede mpich: add v4.2.1 (#43753) 2024-04-20 19:45:25 +02:00
Harmen Stoppels
82df0e549d compiler wrapper: prioritize spack store paths in -L, -I, -rpath (#43593)
* compiler wrapper: prioritize spack managed paths in search order

This commit partitions search paths of -L, -I (and -rpath) into three
groups, from highest priority to lowest:

1. Spack managed directories: these include absolute paths such as
   stores and the stage dir, as well as all relative paths since they
   are relative to a Spack owned dir
2. Non-system dirs: these are for externals that live in non-system
   locations
3. System dirs: your typical `/usr/lib` etc.

It's very easy for Spack to know the prefixes it owns; it's much more
difficult to tell system dirs from non-system dirs. Before this commit
Spack tried to distinguish only system and non-system dirs, and failed
for very trivial cases like `/usr/lib/x/..` which comes up often, since
build systems sometimes copy search paths from `gcc -print-search-dirs`.

Potentially this implementation is even faster than the current state of
things, since a loop over paths is replaced with an eval'ed `case ...`.
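
A rough Python rendering of that three-way partition (the wrapper itself
does this in shell via the generated `case` statement; the prefix lists
below are hypothetical):

```python
import os

SPACK_OWNED = ("/opt/spack/store", "/tmp/spack-stage")       # hypothetical
SYSTEM_DIRS = ("/usr/lib", "/usr/lib64", "/lib", "/usr/local/lib")

def prioritize(search_paths):
    spack, other, system = [], [], []
    for p in search_paths:
        norm = os.path.normpath(p)  # so /usr/lib/x/.. classifies as /usr/lib
        if not os.path.isabs(norm) or norm.startswith(SPACK_OWNED):
            spack.append(p)   # 1. Spack-managed (and relative) paths
        elif norm.startswith(SYSTEM_DIRS):
            system.append(p)  # 3. system dirs, lowest priority
        else:
            other.append(p)   # 2. non-system externals
    return spack + other + system
```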

* Trigger a pipeline

* Revert "Trigger a pipeline"

This reverts commit 5d7fa863de.

* remove redundant return statement
2024-04-20 13:23:37 -04:00
Harmen Stoppels
f5591f9068 ci.py: simplify, and dont warn excessively about externals (#43759) 2024-04-20 15:09:54 +02:00
FrederickDeny
98c08d277d e4s-alc: add new package (#43750)
* Added e4s-cl@1.0.3

* add e4s-alc package

* removed trailing whitespace
2024-04-19 15:54:23 -06:00
Jordan Galby
facca4e2c8 ccache: 4.9.1 and 4.8.3 (#43748) 2024-04-19 23:46:38 +02:00
Adam J. Stewart
764029bcd1 py-ruff: add v0.4.0 (#43740) 2024-04-19 15:44:19 -06:00
Harmen Stoppels
44cb4eca93 environment.py: fix excessive re-reads (#43746) 2024-04-19 13:39:34 -06:00
Jack Morrison
39888d4df6 libfabric: Add version 1.21.0 (#43735) 2024-04-19 11:09:59 -07:00
Vincent Michaud-Rioux
f68ea49e54 Update py-pennylane + Lightning plugins + few deps (#43706)
* Update PennyLane packages to v0.32.
* Reformat.
* Couple small fixes.
* Fix Lightning cmake_args.
* Couple dep fixes in py-pennylane + plugins.
* Fix scipy condition.
* Add comment on conflicting requirement.
* Update py-pl versions
* Update lightning versions.
* Fix copyright.
* Fix license.
* Update pl-kokkos versions
* run black
* Fix L-Kokkos build and update autoray.
* build step only required for older versions. update autograd
* Fix LK@0.31 kokkos compat issue. Introduce url_for_version.
* Fix few more version bounds.
2024-04-19 11:05:09 -07:00
Olivier Cessenat
78b5e4cdfa perl-fth: new version 0.529 (#43727) 2024-04-19 10:58:34 -07:00
Micael Oliveira
26515b8871 fms: add two variants (#43734)
* fms: add two variants supporting existing build options.
* Style fixes.
2024-04-19 10:51:58 -07:00
Harmen Stoppels
74640987c7 ruamel yaml: fix quadratic complexity bug (#43745) 2024-04-19 14:33:42 +02:00
Harmen Stoppels
d6154645c7 chai / raja / umpire: compile entire project with hipcc again (#43738) 2024-04-19 11:18:12 +02:00
James Smillie
faed43704b py-numpy package: enable build on Windows (#43686)
* Add conflicts for some blas implementations that don't build on
  Windows (or with %msvc)
* Need to enclose CC/CXX variables in quotes in case those paths
  have spaces, otherwise Meson runs into errors
* On Windows, Python dependencies now add <prefix>/Scripts to the
  PATH (this is established as a standard in PEP 370)
2024-04-18 16:38:08 -06:00
John W. Parent
6fba31ce34 Windows: Update MSVC + oneAPI detection and integration (#43646)
* Later versions of oneAPI have moved, so update detection to find it
  in both old and new location
* Remove reliance on ONEAPI_ROOT env variable when determining Fortran
  compiler version for %msvc
* When finding a Fortran compiler for MSVC, there was logic enforcing
  a maximum MSVC version for a given oneAPI Fortran version. This
  mapping was out of date and excluding valid combinations, so has
  been removed (the logic now just picks the latest available
  oneAPI Fortran compiler for any given MSVC version).
2024-04-18 21:53:56 +00:00
Olivier Cessenat
112cead00b keepassxc: new version 2.7.7 (#43729) 2024-04-18 14:54:08 -06:00
John W. Parent
9e2558bd56 Windows: add win-sdk/wgl externals during bootstrapping (#43459)
On Windows, bootstrapping logic now searches for and adds the win-sdk
and wgl packages to the user's top scope as externals if they are not
present.

These packages are generally required to install most packages with
Spack on Windows, and are only available as externals, so it is
assumed that doing this automatically would be useful and avoid
a mandatory manual step for each new Spack instance.

Note this is the first case of bootstrapping logic modifying
configuration other than the bootstrap configuration.
2024-04-18 10:20:04 -07:00
eugeneswalker
019058226f py-netcdf4 %oneapi: cflags append -Wno-error=int-conversion (#43629) 2024-04-18 10:11:24 -07:00
George Young
ac0040f67d spaceranger: new manual download package @2.1.1 (#42391)
* spaceranger: new manual download package @2.1.1
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:11:01 -07:00
wspear
38f341f12d Added tau@2.33.2 (#43682) 2024-04-18 10:10:18 -07:00
George Young
26ad22743f cellranger: new manual download package @7.1.0 (#38486)
* cellranger: new manual download package @7.1.0
* cellranger: updating to @7.2.0
* updating website
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:09:00 -07:00
George Young
46c2b8a565 xeniumranger: new manual download package @1.7.1 (#42389)
* xeniumranger: new manual download package @1.7.1
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:08:04 -07:00
James Taliaferro
5cbb59f2b8 New package: editorconfig (#43670)
* New package: editorconfig
* remove FIXMEs
* add description for editorconfig
2024-04-18 10:05:15 -07:00
Adam J. Stewart
f29fa1cfdf CI: remove MXNet (#43704) 2024-04-18 10:04:03 -07:00
Olivier Cessenat
c69951d6e1 gxsview: compiles against system qt and vtk on rhel8 (#43722)
* gxsview: compiles against system qt and vtk on rhel8
* Update gxsview/package.py for blanks around operator
* Update gxsview/package.py import blank line
* Update gxsview/package.py for style
* Update gxsview/package.py checking vtk version
2024-04-18 10:18:05 -06:00
Adam J. Stewart
f406f27d9c ML CI: remove extra xgboost (#43709) 2024-04-18 09:08:25 -07:00
Tamara Dahlgren
36ea208e12 Twitter->X: Reflect the name (only) change (#43690) 2024-04-18 08:52:54 -07:00
Kyle Knoepfel
17e0774189 Make sure variable is None if exception is raised. (#43707) 2024-04-18 08:50:15 -07:00
Sam Gillingham
3162c2459d update py-python-fmask to version 0.5.9 (#43698)
* update py-python-fmask to version 0.5.9
* add gillins and neilflood as maintainers
* remove spaces
* remove blank line
* put maintainers higher
2024-04-18 08:48:15 -07:00
Massimiliano Culpo
7cad6c62a3 Associate condition sets from cli to root node (#43710)
This PR prevents a condition_set from having nodes that are not associated with the corresponding root node through some (transitive) dependencies.
2024-04-18 17:27:12 +02:00
Harmen Stoppels
eb2ddf6fa2 asp.py: do not copy 2024-04-18 15:39:26 +02:00
Harmen Stoppels
2bc2902fed spec.py: early return in __str__ 2024-04-18 15:39:26 +02:00
Mikael Simberg
b362362291 cvise package: add version 2.10.0 and ncurses constraint (#43319) 2024-04-17 13:41:33 -07:00
Rocco Meli
32bb5c7523 mkl interface (#43673) 2024-04-17 16:19:35 -04:00
FrederickDeny
a2b76c68a0 Added e4s-cl@1.0.3 (#43693) 2024-04-17 13:45:36 -06:00
Wouter Deconinck
62132919e1 xrootd: new version 5.6.9 (#43684) 2024-04-17 12:26:52 -07:00
Wouter Deconinck
b06929f6df xerces-c: new version 3.2.5 (#43687) 2024-04-17 12:24:27 -07:00
Wouter Deconinck
0f33de157b assimp: new version 5.4.0 (#43689) 2024-04-17 10:21:15 -07:00
Sinan
03a074ebe7 package/npm update (#43692)
* package/npm update
* add conflicts to exclude certain version intervals

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-04-17 10:06:09 -07:00
Adam J. Stewart
4d12b6a4fd py-shapely: add v2.0.4 (#43702) 2024-04-17 18:51:48 +02:00
Adam J. Stewart
26bb15e1fb py-sphinx: add v7.3 (#43703) 2024-04-17 18:51:31 +02:00
Bill Williams
1bf92c7881 [Score-P] Make local with-or-without not use "yes" (#43701)
Score-P does not accept "--with-foo=yes", but only "--with-foo" or "--with-foo=some-valid-specific-choice-or-path". This keeps Spack from generating config flags that will cause Score-P to barf.
2024-04-17 09:32:47 -07:00
Todd Gamblin
eefe0b2eec Improve spack find output in environments (#42334)
This adds some improvements to `spack find` output when in environments, based
on some thoughts about what users want to know when they're in an env.

If you're working in an environment, you mostly care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized

So, this PR adds a couple tweaks to display that information more clearly:

- [x] We now display install status next to every root. You can easily see
      which are installed and which aren't.

- [x] When you run `spack find -l` in an env, the roots now show their concrete
      hash (if they've been concretized). They previously would show `-------`
      (b/c the root spec itself is abstract), but showing the concretized root's
      hash is a lot more useful.

- [x] Newly added/unconcretized specs still show `-------`, which now makes more
      sense, b/c they are not concretized.

- [x] There is a new option, `-r` / `--only-roots` to *only* show env roots if
      you don't want to look at all the installed specs.

- [x] Roots in the installed spec list are now highlighted as bold. This is
      actually an old feature from the first env implementation, but various
      refactors had disabled it inadvertently.
2024-04-17 16:22:05 +00:00
Wouter Deconinck
de6c6f0cd9 py-pyparsing: new version 3.1.2 (#43579) 2024-04-17 07:53:03 -06:00
James Smillie
309d3aa1ec Python package: fix install of static libs on Windows (#43564) 2024-04-16 16:40:15 -06:00
Andrey Perestoronin
feff11f914 intel-oneapi-dnn-2024.1.0: add DNN package version (#43679)
* add onednn package

* fix style

* Update var/spack/repos/builtin/packages/intel-oneapi-dnn/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-16 15:26:14 -04:00
John W. Parent
de3b324983 Windows filesystem utilities (bugfix): improve SFN usage (#43645)
Reduce incidence of spurious errors by:
* Ensuring we're passing the buffer by reference
* Get the correct short string size from the Windows API instead of computing it ourselves
* Ensure sufficient space for null terminator character

Add test for `windows_sfn`
2024-04-16 11:02:02 -07:00
kwryankrattiger
747cd374df Run after_script aggregator with spack python (#43669) 2024-04-16 19:03:44 +02:00
Wouter Deconinck
8b3ac40436 acts: new version 34.0.0 (#43680) 2024-04-16 11:03:31 -06:00
Wouter Deconinck
28e9be443c (py-)onnx: new version 1.16.0 (#43675) 2024-04-16 09:58:43 -07:00
John W. Parent
1381bede80 zstd: 1.5.6 does not build on Windows (#43677)
Conflict until a fix has been merged upstream
2024-04-16 09:36:27 -07:00
Wouter Deconinck
6502785908 podio: +rntuple requires root +root7 (#43672) 2024-04-16 09:27:07 -07:00
Erik Heeren
53257408a3 py-bluepyopt: 1.14.11 (#43678) 2024-04-16 09:21:24 -07:00
Wouter Deconinck
28d02dff60 pythia8: new version 8.311 (#43667) 2024-04-16 09:19:06 -07:00
Wouter Deconinck
9d60b42a97 jwt-cpp: new version 0.7.0, scitokens-cpp: new versions to 1.1.1 (#43657)
* jwt-cpp: new version 0.7.0, depends_on nlohmann-json
* scitokens-cpp: new versions to 1.1.1
* scitokens-cpp: conflicts ^jwt-cpp@0.7: when @:1.1
2024-04-16 09:17:18 -07:00
Harmen Stoppels
9ff5a30574 concretize.lp: fix issue with reuse of conditional variants (#43676)
Currently if you request pkg +example where example is a conditional
variant, and you have a pkg in the database for which the condition
did not hold (so no +example nor ~example), the solver would reuse it
regardless, not imposing +example.

The change rules out exactly one thing: variant_set without variant_value,
which in practice could only happen when not node_has_variant (i.e. when
under the current package.py rules the variant's when condition did not
trigger).
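
For reference, a hedged sketch of a hypothetical package.py with a conditional
variant of the kind described above; a 1.0 install in the database would carry
neither +example nor ~example:

```python
from spack.package import *

class Pkg(Package):
    """Hypothetical package with a conditional variant."""

    homepage = "https://example.com"
    url = "https://example.com/pkg-2.0.tar.gz"

    version("2.0", sha256="0" * 64)  # placeholder checksum
    version("1.0", sha256="1" * 64)  # placeholder checksum

    # The variant only exists for @2:, so older installs carry no value
    # for it; exactly the case the fix addresses.
    variant("example", default=False, when="@2:", description="Example feature")
```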
2024-04-16 16:09:32 +00:00
Wouter Deconinck
9a6c013365 wayland: +doc requires graphviz +expat for HTML tables (#43668) 2024-04-16 01:28:47 -06:00
Andrey Prokopenko
9f62a3e819 arborx: add v1.6 (#43623) 2024-04-16 09:27:14 +02:00
Maciej Wójcik
e380e9a0ab gromacs: prevent version conflict after enabling plumed (#43449) 2024-04-16 09:07:50 +02:00
Adam J. Stewart
8415ea9ada py-black: switch maintainer (#43652) 2024-04-16 08:51:12 +02:00
Josh Bowden
6960766e0c Damaris: add v1.10.0 (#43664)
Co-authored-by: Joshua Bowden <joshua-chales.bowden@inria.fr>
2024-04-15 23:43:39 -06:00
Michael Kuhn
0c2ca8c841 octave: add 8.4.0 and 9.1.0 (#43518) 2024-04-15 13:32:20 -07:00
Jen Herting
273960fdbb [py-pybedtools] added version 0.10.0 (#43625) 2024-04-15 13:27:22 -07:00
eugeneswalker
0cd2a1102c crtm: add noaa versions and package mods (#43635)
* crtm: add noaa versions and package mods
* crtm@v2.4.1-jedi: add missing depends_on netcdf-fortran, ecbuild from jcsda spack fork
2024-04-15 13:24:07 -07:00
Teague Sterling
e40676e901 Add ollama package (#43655)
* Added package to build Ollama
* Update package.py
  Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
2024-04-15 13:13:07 -07:00
Tristan Carel
4ddb07e94f py-morphio: add version 3.3.7, update license (#43420) 2024-04-15 14:06:48 -06:00
eugeneswalker
50585d55c5 hdf: %oneapi: -Wno-error=implicit-function-declaration (#43631)
* hdf: %oneapi: -Wno-error=implicit-function-declaration

* [@spackbot] updating style on behalf of eugeneswalker
2024-04-15 12:17:07 -07:00
Adam J. Stewart
5d6b5f3f6f GDAL: fix MDB build (#43665) 2024-04-15 10:38:56 -07:00
Juan Miguel Carceller
2351c19489 glew: remove a few unused options (#43399)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-04-15 18:49:51 +02:00
snehring
08d49361f0 iq-tree: add v2.3.1 (#43442) 2024-04-15 18:48:58 +02:00
Ken Raffenetti
c3c63e5ca4 mpich: Update PMI configure options (#43551)
Add a "default" option that passes no option to configure. Existing
options changed in the MPICH 4.2.0 release, so update the package to
reflect those changes.
2024-04-15 18:40:37 +02:00
Richard Berger
e72d4075bd LAMMPS: add v20240207.1 (#43538)
Add workaround for undefined HIP_PATH in older versions
2024-04-15 16:34:30 +00:00
Todd Gamblin
f9f97bf22b tests: Spec tests shouldn't fetch remote git repositories. (#43656)
Currently, some of the tests in `spec_format` and `spec_semantics` fetch
the actual zlib repository when run, because they call `str()` on specs
like `zlib@foo/bar`, which at least currently requires a remote git clone
to resolve.

This doesn't change the behavior of git versions, but it uses our mock git
repo infrastructure and clones the `git-test` package instead of the *real*
URL from the mock `zlib` package.

This should speed up tests.  We could probably refactor more so that the git
tests *all* use such a fixture, but the `checks` field that unfortunately
tightly couples the mock git repository and the `git_fetch` tests complicates
this. We could also consider *not* making `str()` resolve git versions, but
I did not dig into that here.

- [x] add a mock_git_test_package fixture that sets up a mock git repo *and*
      monkeypatches the `git-test` package (like our git test packages do)
- [x] use fixture in `test_spec_format_path`
- [x] use fixture in `test_spec_format_path_posix`
- [x] use fixture in `test_spec_format_path_windows`
- [x] use fixture in `test_parse_single_spec`
2024-04-15 09:20:23 -07:00
Massimiliano Culpo
8033455d5f hdf5: require mpich+fortran when hdf5+fortran (#43591) 2024-04-15 18:17:28 +02:00
Eric Berquist
50a5a6fea4 tree-sitter: add versions up to 0.22.2 (#43608) 2024-04-15 09:10:19 -07:00
eugeneswalker
0de8a0e3f3 wgrib2: oneapi -> comp_sys="intel_linux" (#43632) 2024-04-15 18:04:41 +02:00
SXS Bot
0a26e74cc8 spectre: add v2024.04.12 (#43641)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-04-15 18:03:51 +02:00
dependabot[bot]
9dfd91efbb build(deps): bump docker/setup-buildx-action from 3.2.0 to 3.3.0 (#43542)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](2b51285047...d70bba72b1)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 18:00:47 +02:00
dependabot[bot]
1a7baadbff build(deps): bump python-levenshtein in /lib/spack/docs (#43543)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.25.0 to 0.25.1.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.25.0...v0.25.1)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 18:00:24 +02:00
Marc Perache
afcfd56ae5 range-v3: add v0.12.0 (#40103) 2024-04-15 17:43:14 +02:00
dependabot[bot]
7eb2e704b6 build(deps): bump codecov/codecov-action from 4.1.1 to 4.3.0 (#43562)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.1.1 to 4.3.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](c16abc29c9...84508663e9)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 17:34:31 +02:00
Josh Milthorpe
564b4fa263 hipcub: depend on matching version of hip+cuda when +cuda (#42970) 2024-04-15 17:33:26 +02:00
Adam J. Stewart
0a941b43ca PyTorch: build with external cpuinfo (#40758) 2024-04-15 17:26:52 +02:00
one
35ff24ddea googletest: fix reversed pthreads variant logic (#43649) 2024-04-15 11:18:27 -04:00
Rocco Meli
7019e4e3cb openbabel: add CMake patch for 3.1.1 (#43612) 2024-04-15 17:07:54 +02:00
Weston Ortiz
cb16b8a047 goma: new version 7.6.1 (#43617) 2024-04-15 17:04:34 +02:00
Adam J. Stewart
381acb3726 Build systems: fix docstrings (#43618) 2024-04-15 17:01:52 +02:00
Adam J. Stewart
d87ea0b256 py-maturin: add v1.5.1 (#43619) 2024-04-15 16:55:17 +02:00
Adam J. Stewart
1a757e7f70 py-lightning: add v2.2.2 (#43621) 2024-04-15 16:54:34 +02:00
eugeneswalker
704e2c53a8 py-eccodes: add v1.4.2 (#43633) 2024-04-15 16:44:57 +02:00
renjithravindrankannath
478d8a668c rocm-opencl: add dependency on aqlprofile (#43637) 2024-04-15 16:44:16 +02:00
dependabot[bot]
7903f9fcfd build(deps): bump black from 24.3.0 to 24.4.0 in /lib/spack/docs (#43642)
Bumps [black](https://github.com/psf/black) from 24.3.0 to 24.4.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.3.0...24.4.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 16:40:37 +02:00
dependabot[bot]
670d3d3fdc build(deps): bump black in /.github/workflows/style (#43643)
Bumps [black](https://github.com/psf/black) from 24.3.0 to 24.4.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.3.0...24.4.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 16:40:14 +02:00
Vanessasaurus
e8aab6b31c flux-core: add v0.61.2 (#43648)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-15 16:34:58 +02:00
Adam J. Stewart
1ce408ecc5 py-ruff: add v0.3.7 (#43620) 2024-04-15 16:33:11 +02:00
Adam J. Stewart
dc81a2dcdb py-rasterio: add v1.3.10 (#43653) 2024-04-15 16:32:34 +02:00
Rocco Meli
b10f51f020 charmpp: add archs including Cray shasta with ARM (#43191)
Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-04-15 16:28:31 +02:00
Hariharan Devarajan
4f4e3f5607 dlio-profiler: add releases up yo v0.0.5 (#43530) 2024-04-15 16:21:52 +02:00
Wouter Deconinck
00fb80e766 util-linux: new version 2.40 (#43661) 2024-04-15 16:16:52 +02:00
Michael Kuhn
057603cad8 Fix pkgconfig dependencies (#43651)
pkgconfig is the virtual package, pkg-config and pkgconf are
implementations.
2024-04-15 16:05:11 +02:00
kjrstory
5b8b6e492d su2: add v8.0.0, v8.0.1 (#43662) 2024-04-15 15:25:16 +02:00
Wouter Deconinck
763279cd61 spdlog: new version 1.13.0 (#43658) 2024-04-15 12:36:10 +02:00
Wouter Deconinck
e4237b9153 zlib: new version 1.3.1 (#43660) 2024-04-15 10:59:06 +02:00
Wouter Deconinck
d288658cf0 zstd: new version 1.5.6 (#43659) 2024-04-15 09:04:42 +02:00
Kacper Kornet
2c22ae0576 intel-oneapi-compiler: Fix generation of config files (#43654)
Commit 330a9a7c9a aimed at preventing the
generation of .cfg files when a given compiler does not exist
in the particular release. However, the check did not
use full paths, so it always failed, resulting in empty
.cfg files. This commit fixes that.
2024-04-13 19:34:58 -04:00
eugeneswalker
fc3fc94689 gsibec: properly reference self.spec (#43627) 2024-04-13 14:21:03 -07:00
eugeneswalker
b5013c1372 dealii@9.5 +cgal requires ^cgal@5: (#43639) 2024-04-13 14:20:40 -07:00
Juan Miguel Carceller
e220674c4d sherpa: remove paths to compiler wrappers and use the provided libtool (#43611)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-04-13 15:24:26 -05:00
Harmen Stoppels
7f13518225 gettext: unvendor libxml (#43622) 2024-04-13 12:01:38 +02:00
Wouter Deconinck
96a13a97e6 freetype: new version 2.12 and 2.13 (#43571) 2024-04-13 11:48:48 +02:00
Chris White
6d244b3f67 remove hardcoded hipcc (#43644) 2024-04-12 19:38:15 -06:00
Chris White
6bc66db141 Axom: add new versions and bring in new changes (#43590)
* add new axom releases, sync changes between repos

* add new version of axom and add fallback depends for umpire/raja

* remove now-redundant flags; add CUDA flags to the flags output by CachedCMakePackage

Co-authored-by: white238 <white238@users.noreply.github.com>
2024-04-12 17:34:32 -06:00
James Shen
acfb2b9270 Geant4: extend patch to fix compile for 10.0-10.4 as well (#43640) 2024-04-12 18:52:22 -04:00
Frédéric Simonis
d92a2c31fb precice: Add version 3.1.1 (#43616) 2024-04-12 15:24:28 -07:00
eugeneswalker
e32561aff6 odc: add v1.4.6 (#43634) 2024-04-12 15:19:32 -07:00
Auriane R
4b0479159f pika: add pika 0.24.0 (#43624) 2024-04-12 15:18:27 -07:00
eugeneswalker
03bfd36926 w3nco %oneapi cflags: append -Wno-error=implicit-function-declaration (#43628)
* w3nco %oneapi cflags: append -Wno-error=implicit-function-declaration

* update flag for %apple-clang, %clang

* [@spackbot] updating style on behalf of eugeneswalker
2024-04-12 20:54:18 +02:00
eugeneswalker
4d30c8dce4 gsibec: add v1.2.1 (#43630) 2024-04-12 12:42:46 -06:00
Eric Berquist
49d4104f22 emacs: add 29.3 (#43626) 2024-04-12 11:10:37 -07:00
Massimiliano Culpo
07fb83b493 gcc: add more detection tests (#43613) 2024-04-12 04:03:11 -06:00
Massimiliano Culpo
263007ba81 solver: add an integrity constraint for virtual nodes (#43582)
Upon close inspection of clingo answer sets, in some cases we have "equivalent" (i.e. same hash for the concrete spec) duplicates that differ only because of virtual nodes that are added to the answer set, without any edge using them.
2024-04-12 09:31:44 +02:00
Sajid Ali
3b6e99381f add py-h5py@3.11 (#43605)
* add py-h5py@3.11
* incorporate reviewer feedback
* incorporate reviewer feedback
2024-04-11 23:23:20 -06:00
Howard Pritchard
a30af1ac54 OpenMPI: add version 5.0.3 (#43609)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-04-11 22:48:15 -06:00
Matthew Thompson
294742ab7b openmpi: add MPIFC environment variable (#36669) 2024-04-11 21:50:59 -06:00
eugeneswalker
6391559fb6 e4s ci: add: netcdf-fortran, fpm, e4s-cl (#43601) 2024-04-11 21:01:38 -06:00
Dave Keeshan
d4d4f813a9 verible: Add version v0.0-3624-gd256d779 (#43604) 2024-04-11 20:56:00 -06:00
Dave Keeshan
4667163dc4 Add version 5.024 (#43603) 2024-04-11 20:45:06 -06:00
Dave Keeshan
439f105285 Add versions 0.39 and 0.40 (#43600) 2024-04-11 20:44:51 -06:00
eugeneswalker
f65b1fd7b6 dealii +cuda: conflicts with ^cuda@12: (#43599) 2024-04-11 19:43:32 -06:00
Radim Janalík
d23e06c27e Allow packages to be pushed to build cache after install from source (#42423)
This commit adds a property `autopush` to mirrors. When true, every source build is immediately followed by a push to the build cache. This is useful in ephemeral environments such as CI / containers.

To enable autopush on existing build caches, use `spack mirror set --autopush <name>`. The same flag can be used in `spack mirror add`.
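
A sketch of the resulting mirrors.yaml entry (mirror name and URL are
hypothetical, and the exact schema may differ):

```yaml
mirrors:
  my-cache:
    url: oci://ghcr.io/example/buildcache
    autopush: true
```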
2024-04-11 19:43:13 -06:00
Alberto Sartori
b76e9a887b justbuild: bump version 1.2.5 (#43592) 2024-04-11 18:51:03 -06:00
Filippo Spiga
55ffd439ce JUBE: add v2.6.x (#41272)
* Adding JUBE 2.6.0
* They quickly released a bugfix package
* Correct the version sha256 for v2.6.1

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

---------

Co-authored-by: fspiga <fspiga@fc01-gg01.cm.cluster>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2024-04-11 16:01:43 -06:00
Thomas Green
d8a7b88e7b Create Castep package (#41230)
* Create package.py
* Update package.py
  Post review fixes.
* Style fixes.
2024-04-11 14:49:03 -07:00
Paul Kuberry
aaa1bb1d98 suite-sparse: refactor cmake args (#43386)
* Simplify config command and add BLAS/LAPACK location
* Use BLAS_ROOT and LAPACK_ROOT and disable use of system
  package registry
* Adds location of BLAS_LIBRARIES and LAPACK_LIBRARIES to CMake
* Adds CMake variables to prevent picking up system installations
  of BLAS/LAPACK. Fixes previous PR #43328 that was picking up
  incorrect installations
* Adds versions 7.2.1
2024-04-11 14:06:57 -07:00
Martin Aumüller
0d94b8044b libzip: add up to v1.10.1 (#43560)
* libzip: add up to v1.10.1
  - update homepage and change download url to GitHub
  - change build system to CMake for releases starting with 1.4
* [@spackbot] updating style on behalf of aumuell
* libzip: fix urls
* [@spackbot] updating style on behalf of aumuell
* libzip: do not add versions from libzip.org
  these are old, and urllib refuses to fetch them
* libzip: deprecate versions from libzip.org
  urllib refuses to fetch them, only curl would work

---------

Co-authored-by: aumuell <aumuell@users.noreply.github.com>
2024-04-11 14:51:02 -06:00
Wouter Deconinck
5a52780f7c acts: new version 33.1.0 (#43581)
* acts: new version 33.1.0
* actsvg: new version 0.4.41
* geomodel: new package
* [@spackbot] updating style on behalf of wdconinc
* acts: plugin_cmake_variant(geomodel)
* geomodel: with when(+visualization)

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-04-11 13:32:10 -06:00
Tristan Carel
dd0a8452ee py-simpleeval: add version 0.9.13 (#43568) 2024-04-11 12:19:30 -07:00
jalcaraz
c467bba73e TAU package: Include recent change for Ubuntu (#43572)
* Include recent change for Ubuntu
  Select option -disable-no-pie-on-ubuntu for some Ubuntu systems
  823971df01
* Added conflict for new variant
* Updated conflict version
* Added mention of Ubuntu to variant description
2024-04-11 12:06:34 -07:00
James Taliaferro
d680a0cb99 New package: timew (#43585)
* add timewarrior
* fix style checks
* fix style checks
2024-04-11 11:58:08 -07:00
Luc Berger
efadee26ef Kokkos Ecosystem: 4.3.00 (#43607)
* Kokkos Kernels: adding missing TPLs and pre-conditions
  Adding variants and dependencies for rocBLAS and rocSPARSE.
  Also adding a "when=" close to the TPL variants that prevents
  enabling the TPLs in versions of the library when it was not
  yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: adding cusolver and rocsolver for at version 4.3.00
* Kokkos Ecosystem: updating packages for release 4.3.00
* Kokkos: adding arch for SG2042
* Removing sg2042 from spack_micro_arch_map
  Removing it here; will work to add it in the proper generic Spack location, likely `spack/lib/spack/external/archspec/json/cpu/microarchitectures.json`?
2024-04-11 12:41:28 -06:00
Greg Becker
2077b3a006 invalid compiler: warn instead of error (#43491) 2024-04-11 20:39:27 +02:00
Adam J. Stewart
8e0c659b51 py-cartopy: add v0.23.0 (#43583)
* py-cartopy: add v0.23.0
* numpy 2 support added
2024-04-11 11:11:43 -07:00
Derek Ryan Strong
863ab5a597 opus: update package (#43587) 2024-04-11 10:58:27 -07:00
Derek Ryan Strong
db4e76ab27 cpio: add 2.15 (#43589) 2024-04-11 10:55:08 -07:00
Adam J. Stewart
6728a46a84 py-keras: add v3.2.1 (#43594) 2024-04-11 10:37:48 -07:00
Adam J. Stewart
5a09459dd5 py-pandas: add v2.2.2 (#43596) 2024-04-11 10:29:43 -07:00
James Taliaferro
7e14ff806a add taskwarrior 3.0.0 (#43580)
* add taskwarrior 3.0.0

* blacken
2024-04-11 04:59:25 -06:00
Wouter Deconinck
7e88cf795c wayland-protocols: new versions 1.34 (#43577) 2024-04-11 04:29:41 -06:00
Wouter Deconinck
1536e3d422 qt-base: depends_on cmake@3.21: when ~shared or platform=darwin (#43576) 2024-04-11 04:29:15 -06:00
Massimiliano Culpo
1fe8e63481 Reuse specs built with compilers not in config (#43539)
Allow reuse of specs that were built with compilers not in the current configuration. This means that specs from build caches don't need to have a matching compiler locally to be reused; the same applies when updating a distro. If a node needs to be built, only available compilers will be considered as candidates.
2024-04-11 09:13:24 +02:00
YI Zeping
dfca2c285e Packages: llvm cxx flag remove, new versions, and binutils build issue fix (#43567)
* add c++11 header to gold for compilers not defaulting to c++11

* glibc: add 2.39

* llvm: add 18.1.3

* fix #42314, remove cxx11 flag for llvm; should be controlled by cmake.

* modify patch

* llvm version

* add gmake version request
2024-04-10 23:04:58 -04:00
Martin Aumüller
2686f778fa qt-*: add v6.6.2, v6.6.3, and v6.7.0 (#43559)
* qt-base: add v6.6.2

* qt-declarative: add v6.6.2

* qt-quick3d: add v6.6.2

* qt-quicktimeline: add v6.6.2

* qt-shadertools: add v6.6.2

* qt-svg: add v6.6.2

* qt-base: add v6.7.0

* qt-svg: add v6.7.0

* qt-declarative: add v6.7.0

* qt-quick3d: add v6.7.0

* qt-quicktimeline: add v6.7.0

* qt-shadertools: add v6.7.0

* qt-*: add v6.6.3

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-04-10 16:30:09 -05:00
Wouter Deconinck
925e9c73b1 libpthread-stubs: new version 0.5 (#43574) 2024-04-10 13:29:45 -06:00
one
aba447e885 Add new versions for toml11 (#43469)
* Add new versions for toml11
  Added 3.8.0 and 3.8.1
* Update package.py, add cxx_std
* [@spackbot] updating style on behalf of alephpiece

---------

Co-authored-by: alephpiece <alephpiece@users.noreply.github.com>
2024-04-10 12:29:11 -07:00
Richard Berger
1113de0dad flecsi+legion: add cr versions FleCSI depended on in past releases (#43499)
* flecsi+legion: add cr versions FleCSI depended on in past releases
* flecsi: deprecate develop version
2024-04-10 12:26:17 -07:00
Wouter Deconinck
4110225166 libdrm: new version 2.4.120 (#43573) 2024-04-10 13:24:45 -06:00
Adam J. Stewart
24c839c837 py-scikit-learn: add v1.4.2 (#43557) 2024-04-10 12:20:18 -07:00
Martin Aumüller
42c6a6b189 ospray: add v3.1.0 and dependencies (#43558)
* rkcommon: add v1.13.0
* openvkl: add v2.0.1
* openimagedenoise: add v2.2.2
* ospray: add v3.1.0
2024-04-10 12:13:23 -07:00
Martin Aumüller
b0ea1c6f24 botan: add v3.4.0 (#43561) 2024-04-10 11:55:52 -07:00
Vanessasaurus
735102eb2b Automated deployment to update package flux-sched 2024-04-10 (#43566)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-10 11:54:49 -07:00
Vanessasaurus
2e3cdb349b Automated deployment to update package flux-core 2024-04-10 (#43565)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-10 11:54:20 -07:00
Seth R. Johnson
05c8030119 swig: update symlink alias to appease cmake (#43271) 2024-04-10 11:50:16 -07:00
Adam J. Stewart
bbcd4224fa py-matplotlib: add v3.8.4 (#43487)
* py-matplotlib: add v3.8.4

* matplotlib requires exact version of freetype for tests to pass

* Add version constraints to fontconfig deps

* Fix freetype build

* Freetype cmake build is useless

* Typo

* Fix download

* Fix build of older freetype

* cmake is useless
2024-04-10 18:20:42 +02:00
Adam J. Stewart
4c0cdb99b3 py-scipy: add v1.12 and v1.13 (#42213) 2024-04-10 17:56:21 +02:00
Axel Huebl
f22d009c6d pyAMReX: No CCache (#43570)
This interferes with Spack compiler wrappers.
2024-04-10 09:02:37 -06:00
Axel Huebl
c5a3e36ad0 Update pyAMReX: 24.03, 24.04 (#42858)
Latest version of pyAMReX.
2024-04-10 06:33:32 -06:00
Hariharan Devarajan
1c76ba1c3e Release for brahma 0.0.3 (#43525)
* Release for brahma 0.0.3
* switch the version directive order
2024-04-09 12:33:29 -06:00
Hariharan Devarajan
b969f739bd cpp-logger: add v0.0.3 (#43524)
* added cpp-logger 0.0.3
* switched version directive order
2024-04-09 12:27:41 -06:00
one
4788c4774c Add a new version for cxxopts (#43470)
Added 3.2.0
2024-04-09 10:36:05 -07:00
Wouter Deconinck
34de028dbc cppzmq: new version 4.10.0 (#43526) 2024-04-09 10:14:13 -07:00
Hariharan Devarajan
a69254fd79 release for gotcha 1.0.6 (#43531) 2024-04-09 10:10:32 -07:00
Adam J. Stewart
af5f205759 py-keras: add v3.2.0 (#43540) 2024-04-09 10:09:15 -07:00
HELICS-bot
77f9100a59 helics: Add version 3.5.2 (#43541)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-04-09 10:08:19 -07:00
Derek Ryan Strong
386bb71392 flac: update versions (#43544) 2024-04-09 10:07:15 -07:00
Wouter Deconinck
0676d6457f yoda: new version 2.0.0 (#43534)
* yoda: new versions 1.9.9, 1.9.10, and 2.0.0
* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-04-09 10:04:20 -07:00
Martin Lang
0b80e36867 [libxc] new homepage (#43546)
The old page on tddft.org is no longer reachable.
2024-04-09 09:59:47 -07:00
Stephen Sachs
4c9816f10c [mpas-model] Only add options for double precision when requested (#43547)
As in the original makefile "FFLAGS_PROMOTION = -fdefault-real-8
-fdefault-double-8" is only used when `precision=double`. This is the default
for the Spack package, so no change if `precision` is left unset.
2024-04-09 09:56:48 -07:00
psakievich
fb6741cf85 Trilinos: More accurate stk boost dependency (#43550)
Boost is no longer required as of `@13.4.0`
2024-04-09 06:49:06 -06:00
Kensuke WATANABE
3f2fa256fc LLVM: avoid Fujitsu compiler build fail in llvm17-18 (#43387)
* Avoid Fujitsu compiler Clang Mode options when building LLVM

* LLVM: avoid Fujitsu compiler build fail in llvm17-18

* address review comments
2024-04-08 19:56:30 -04:00
John W. Parent
d5c8864942 Windows bugfix: safe rename if renaming file onto itself (#43456)
* Generally use os.replace on Windows and Linux
* Windows behavior for os.replace differs when the destination exists
  and is a symlink to a directory: on Linux the dst is replaced and
  on Windows this fails - this PR makes Windows behave like Linux
  (by deleting the dst before doing the rename unless src and dst
  are the same)
2024-04-08 14:10:02 -07:00
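As a rough illustration of the behavior this commit describes (a minimal sketch, not Spack's actual code; the helper name is hypothetical):

import os

def safe_replace(src: str, dst: str) -> None:
    # On Windows, os.replace fails when dst exists and is a symlink to a
    # directory; emulate the Linux semantics by removing dst first,
    # unless src and dst already refer to the same file.
    if os.name == "nt" and os.path.islink(dst) and os.path.isdir(dst):
        if not (os.path.exists(src) and os.path.samefile(src, dst)):
            os.unlink(dst)
    os.replace(src, dst)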
Luc Berger
b3cef1072d Nalu: updating Trilinos recipe a bit (#43471)
* Nalu: updating Trilinos recipe a bit

Basic changes to build/install nalu properly using Spack.
Some more changes would be nice, for instance adding an
option to build against Trilinos master or develop. This also
adds a dependency on googletest to avoid the annoying build
failures in the unit-tests.

* Nalu: adding release 1.6.0

Nalu v1.6.0 can build cleanly against Trilinos 14.0.0 with the
proposed changes. The only other combo is master / master, but
that one is "floating" as these branches evolve over time. When a
new Nalu comes out we might want to add another fixed version to
keep this recipe up to date!
2024-04-08 10:39:51 -06:00
Wouter Deconinck
e8ae9a403c acts: depends_on py-onnxruntime when +onnx for @23.3: (#43529) 2024-04-08 14:13:17 +02:00
Wouter Deconinck
1a8ef161c8 fastjet: new multi-valued variant plugins (#43523)
* fastjet: new multi-valued variant `plugins`

* rivet: depends_on fastjet plugins=cxx
2024-04-08 14:12:12 +02:00
Harmen Stoppels
d3913938bc py-tatsu: add upperbound on python (#43510) 2024-04-08 11:26:46 +02:00
Harmen Stoppels
4179880fe6 py-pymatgen: add forward compat bound for cython (#43511) 2024-04-08 11:26:09 +02:00
Harmen Stoppels
125dd0368e py-triton: add zlib (#43512) 2024-04-08 11:25:33 +02:00
Harmen Stoppels
fd68f8916c gperftools: add cmake build system (#43506)
The autotools build system does something funky which causes a link line
where gcc's default link dirs are explicitly added and end up before the
-L from Spack's libunwind, so that it ultimately links against the system
libunwind.

The cmake build system does better.
2024-04-08 10:00:05 +02:00
Jonas Eschle
93e6f5fa4e Update jax & jaxlib versions (#42863)
* upgrade new versions

* style fix

* update jaxlib deps (not cuda and bazel yet)

* update jaxlib cuda versions

* update jaxlib cuda versions

* update jaxlib cuda versions

* chore: style fix

* Update package.py

* Update package.py

* fix:  typo

* docs: add source for cuda version

* py-jaxlib 0.4.14 also doesn't build on ppc64le

* Add 0.4.26

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-04-07 12:04:23 +02:00
Robert Cohn
54acda3f11 oneapi licenses (#43451) 2024-04-06 08:04:04 -04:00
kwryankrattiger
663e20fcc4 ParaView: add v5.12.0 (#42943)
* ParaView: Update version 5.12.0

Add 5.12.0 release
Update default to 5.12.0

* Add patch for building ParaView 5.12 with kits

* Drop VTKm from neoverse
2024-04-06 04:12:48 +00:00
eugeneswalker
6428132ebb e4s ci: enable lammps variants from presets/most.cmake (#43522) 2024-04-05 20:56:18 -07:00
eugeneswalker
171958cf09 py-deephyper: add v0.6.0 (#43492)
* py-deephyper: add latest version: v0.6.0

* e4s: add py-deephyper

* v0.6.0: depend on python@3.7:3.11

* add py-packaging constraint so arm64 builds work

* [@spackbot] updating style on behalf of eugeneswalker
2024-04-06 00:28:37 +00:00
Frédéric Simonis
0d0f7ab030 Add release 3.1.0 (#43508) 2024-04-05 16:58:57 -07:00
eugeneswalker
35f8b43a54 e4s ci: add nekbone (#43515)
* e4s ci: add nekbone, nek5000

* remove nek5000
2024-04-05 16:36:13 -07:00
one
6f7eb3750c Add new versions of tinyxml2 (#43467)
* Add new versions of tinyxml2
  Added 7.0.0 to 10.0.0
* Add the variant "shared"
2024-04-05 13:36:45 -07:00
renjithravindrankannath
2121eb31ba Patch to set PARAMETERS_MIN_ALIGNMENT to the native alignment for rocm-opencl (#43444)
* For avx builds, the start address of the values_ buffer in KernelParameters
   is not correct, as it is computed based on 16-byte alignment.
* Style check error fix
2024-04-05 12:02:32 -07:00
kwryankrattiger
c68d739825 CI: Add debug to the log aggregation script (#42562)
* CI: Add debug to the log aggregation script
2024-04-05 14:00:27 -05:00
John W. Parent
c468697b35 Use correct method "append" instead of extend (#43514) 2024-04-05 18:46:47 +00:00
G-Ragghianti
c4094cf051 slate: Adding comgr as dependency (#43448)
* Adding comgr as dependency

* adding more smoke test deps
2024-04-05 11:32:16 -07:00
LinaMuryanto
9ff9ca61e6 py-amq, py-celery, py-kombu: New versions, fix build (#43295)
* updating package.py for py-celery, py-kombu, py-amq
* added more py-kombu package versions
* fix copyrights and stype on py-kombu/package.py
* removed extra spaces
* added py-billiard 4.2.0 and added back the license('BSD-3-Clause')
* removed extra spaces in py-celery/package.py
* fixed py-amqp 2.4.0 sha; fixed py-celery's dependency of py-click (when version restrictions)
* more clean up on specifying version bounds
2024-04-05 11:14:18 -07:00
Massimiliano Culpo
826e0c0405 Improve hit-rate on buildcaches (#43272)
* Relax compiler and target mismatches

The mismatch occurs on an edge. Previously it was assigned
the parent priority, now it is assigned the child priority.

This should make reuse from buildcaches or store more likely,
since most mismatches will be counted with "reused" priority.

* Optimize version badness for runtimes at very low priority

We don't want to e.g. switch other attributes because we
cannot reuse an old installed runtime.

* Optimize runtime attributes at very low priority

This is such that the version of the runtime would
not influence whether we should reuse a spec.

Compiler mismatches are considered for runtimes,
to avoid situations where compiling foo%gcc@9
brings in gcc-runtime%gcc@13 if gcc@13 is among
the available compilers

* Exclude specs without runtimes from reuse

This should ensure that we do not reuse specs that
could be broken, as they expect the compiler to be
installed in a specific place.
2024-04-05 20:10:28 +02:00
Andrey Perestoronin
1b86a842ea add itac and inspector packages (#43507) 2024-04-05 09:30:36 -04:00
Harmen Stoppels
558a28bf52 bazel: conflict with gcc 13 (#43504) 2024-04-05 14:24:06 +02:00
Harmen Stoppels
411576e1fa Do not acquire a write lock on the env post install if no views (#43505) 2024-04-05 12:31:21 +02:00
eugeneswalker
cab4f92960 datatransferkit: needs trilinos@14.2: for @3.1.0: (#43496) 2024-04-05 03:03:15 -06:00
Adam J. Stewart
c6c13f6782 GDAL: add v3.8.5 (#43493) 2024-04-04 21:29:28 -06:00
Daniele Cesarini
cf11fab5ad Added Libfort library (#43490) 2024-04-04 21:14:27 -06:00
Harmen Stoppels
1d8b35c840 installer.py: compute package_id from spec (#43485)
The installer runs `get_dependent_ids`, which follows edges outside the
subdag that's being installed, so it returns a superset of the actual
dependents.

That's generally fine, except that it calls `s.package` on every
dependent, which triggers a package class to be instantiated, which is a
lot of work.

Instead, compute the package id from the spec, since that's all that's
used anyway, and doing so does not trigger *lots* of slow and redundant
instantiations of package objects.
2024-04-04 20:39:30 -06:00
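A minimal sketch of the idea; the exact identifier fields (name, version, DAG hash) are assumptions, but the point is that everything comes from the spec, so no package class is instantiated:

def package_id(spec) -> str:
    # Derive the identifier from data already on the spec instead of
    # touching spec.package, which would construct a package object.
    return f"{spec.name}-{spec.version}-{spec.dag_hash()}"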
Weiqun Zhang
5dc46a976d amrex: add v24.04 (#43443) 2024-04-04 19:00:44 -07:00
Alex Richert
05f5596cdd Update grib-util recipe (#43484) 2024-04-04 15:07:05 -07:00
psakievich
6942c7f35b Update exawind packages (#40793) 2024-04-04 12:54:20 -06:00
Alex Richert
18f0ac0f94 Add g2@3.4.9 (#43481) 2024-04-04 11:50:08 -07:00
Vicente Bolea
d9196ee3f8 adios2: bump version 2.10.0 (#43479) 2024-04-04 13:46:40 -05:00
John W. Parent
ef0bb6fe6b Msvc: Determine OneAPI_ROOT from fc compiler path (#43131)
If ONEAPI_ROOT is not set as an environment variable, the current approach will raise an error.
Instead we can compute the OneAPI_ROOT from the compiler paths like we do with vcvarsall.
2024-04-04 11:14:44 -07:00
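A hypothetical sketch of deriving the root from the Fortran compiler path; the directory layout and the setvars.bat marker are assumptions, not the commit's exact logic:

from pathlib import Path

def oneapi_root_from_fc(fc_path: str) -> Path:
    # A typical layout is <root>/compiler/<version>/bin/ifx.exe, so walk
    # up the parents and treat the directory holding setvars.bat as root.
    for parent in Path(fc_path).resolve().parents:
        if (parent / "setvars.bat").exists():
            return parent
    raise RuntimeError(f"could not locate oneAPI root above {fc_path}")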
Alex Richert
3fed320013 Add MPI and arch bugfixes to SCOTCH (#39264)
* Add MPI and arch bugfixes to SCOTCH

* Update scotch/package.py
2024-04-04 12:48:39 -05:00
Chris Marsh
1aa77e695d Trilinos: add threadsafe variant (#43480)
* Fixes #43454 by exposing a threadsafe variant

* [@spackbot] updating style on behalf of Chrismarsh

* fix style

---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-04-04 10:00:53 -07:00
Alex Richert
3a0efeecf1 add g2c@1.9.0 (#43482) 2024-04-04 09:56:19 -07:00
Alex Richert
5ffb5657c9 update g2tmpl recipe (#43483) 2024-04-04 09:56:09 -07:00
Alex Richert
2b3e7fd10a Add shared variant for fftw to allow static-only builds (#37897)
Co-authored-by: alexrichert <alexrichert@gmail.com>
2024-04-04 03:47:46 -06:00
James Smillie
cb315e18f0 py-pip package: enable install on Windows (#43203)
The installation mechanism used on Linux to install py-pip (using pip
from the downloaded wheel to install the wheel) does not work on Windows.

This updates the installation of py-pip on Windows to download and
use a zipapp of a specific pip version in order to install the wheel
pip version that is requested.
2024-04-03 23:03:56 -06:00
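Roughly, the zipapp approach looks like this; a sketch only, where the download URL is pip's standard bootstrap location and the wheel file name is hypothetical:

import subprocess
import urllib.request

# Fetch a standalone pip zipapp and use it to install the requested pip
# wheel, so no working pip installation is needed beforehand.
urllib.request.urlretrieve("https://bootstrap.pypa.io/pip/pip.pyz", "pip.pyz")
subprocess.check_call(
    ["python", "pip.pyz", "install", "--no-deps", "pip-24.0-py3-none-any.whl"]
)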
Jonas Eschle
10c637aca0 py-zfit: add v0.18.* (#43200)
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-04-04 02:55:00 +02:00
Greg Becker
fb4e1cad45 remove dpcpp compiler and package (#43418)
`dpcpp` is deprecated by intel and has been superseded by `oneapi` compilers for a very long time.

---------

Co-authored-by: becker33 <becker33@users.noreply.github.com>
2024-04-03 15:34:23 -07:00
Adam J. Stewart
3054b71e2e py-rarfile: add v4.2 (#43477) 2024-04-03 15:08:36 -07:00
downloadico
47163f7435 py-cig-pythia: add missing py-setuptools dependency (#43473) 2024-04-03 15:02:32 -07:00
Dom Heinzeller
e322a8382f py-torch: Add variant 'internal-protobuf' to build with the internal protobuf (#43056) 2024-04-03 15:57:54 -06:00
Greg Sjaardema
53fb4795ca Seacas exodusii 04 2024 (#43468)
* SEACAS: Update package.py to handle new SEACAS project name
  The base project name for the SEACAS project has changed from
  "SEACASProj" to "SEACAS" as of @2022-10-14, so the package
  needed to be updated to use the new project name when needed.

  The refactor also changes several:
      "-DSome_CMAKE_Option:BOOL=ON"
  to
     define("Some_CMAKE_Option", True)
* SEACAS, EXODUSII: New version; deprecate older versions; better variant descriptions
* [@spackbot] updating style on behalf of gsjaardema
* Fix long lines reported by flake8

---------

Co-authored-by: gsjaardema <gsjaardema@users.noreply.github.com>
2024-04-03 15:46:57 -06:00
eugeneswalker
4517c7fa9b ginkgo@1.7 %oneapi@2024.1: icpx 2024.1 no longer accepts sycl::ext::intel::ctz (#43476) 2024-04-03 15:46:42 -06:00
Wouter Deconinck
efaed17f91 root: new version 6.30.06 (#43472) 2024-04-03 15:41:41 -06:00
Thomas Madlener
2c17cd365d Make it possible to build whizard from a git checkout (#43447) 2024-04-03 21:55:57 +02:00
psakievich
dfe537f688 Convert curl env mod method to a side effect (#43474) 2024-04-03 12:02:48 -07:00
G-Ragghianti
be0002b460 slate: Removing scalapack as test dependency, adding python (#43452)
* removing scalapack as test dependency, adding python
* fixing style
* style
2024-04-03 11:20:03 -07:00
Daryl W. Grunau
743ee5f3de eospac: expose versions 6.5.8 and 6.5.9 (#43450)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2024-04-03 11:17:15 -07:00
Cameron Smith
b6caf0156f simmetrix-simmodsuite: support RHEL8, fix module paths (#43455) 2024-04-03 11:07:43 -07:00
Christoph Junghans
ec00ffc244 byfl: fix llvm dep (#43460) 2024-04-03 10:39:06 -07:00
G-Ragghianti
f020256b9f magma add version 2.8.0 (#43417)
* Add release 2.8.0
* Changing C compiler to hipcc
* Final release version
* Adding new cmake definition for rocm support
* Enabling rocm version support
* Update sha256
* Updating website URL
* Removing unnecessary C compiler spec
* Adding rocm-core dependency
* fixing rocm-core version
* fixing rocm-core version
* fixing style
* bugfix
2024-04-03 11:25:46 -06:00
Peter Brady
04377e39e0 New package: Parthenon (#43426) 2024-04-03 10:05:02 -07:00
Vanessasaurus
ba2703fea6 flux-core: add v0.61.0 (#43465)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-03 12:07:32 +02:00
Adrien Bernede
92b1c8f763 RADIUSS packages update (Starting over #39613) (#41375) 2024-04-02 15:03:07 -07:00
Adam J. Stewart
2b29ecd9b6 py-pillow: add v10+ (#43441) 2024-04-02 14:46:21 -07:00
Adam J. Stewart
5b43bf1b58 py-scikit-image: add v0.21 and v0.22 (#43440)
* py-scikit-image: add v0.21 and v0.22
* Add maintainer and license
* Style fix
2024-04-02 14:41:29 -07:00
Juan Miguel Carceller
37d9770e02 gdb: add a dependency on pkgconfig (#43439)
* gdb: add a dependency on pkgconfig
* Apply dependency for 13.1 and onwards

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-04-02 14:39:03 -07:00
1033 changed files with 18424 additions and 7128 deletions

View File

@@ -17,33 +17,51 @@ concurrency:
jobs:
# Run audits on all the packages in the built-in repository
package-audits:
runs-on: ${{ matrix.operating_system }}
runs-on: ${{ matrix.system.os }}
strategy:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
system:
- { os: windows-latest, shell: 'powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}' }
- { os: ubuntu-latest, shell: bash }
- { os: macos-latest, shell: bash }
defaults:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml]
- name: Setup for Windows run
if: runner.os == 'Windows'
run: |
python -m pip install --upgrade pywin32
- name: Package audits (with coverage)
if: ${{ inputs.with_coverage == 'true' }}
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) audit externals
coverage run $(which spack) -d audit externals
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' }}
if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits

View File

@@ -1,7 +1,8 @@
#!/bin/bash
set -ex
set -e
source share/spack/setup-env.sh
$PYTHON bin/spack bootstrap disable github-actions-v0.4
$PYTHON bin/spack bootstrap disable spack-install
$PYTHON bin/spack -d solve zlib
$PYTHON bin/spack $SPACK_FLAGS solve zlib
tree $BOOTSTRAP/store
exit 0

View File

@@ -13,118 +13,22 @@ concurrency:
cancel-in-progress: true
jobs:
fedora-clingo-sources:
distros-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
container: ${{ matrix.image }}
strategy:
matrix:
image: ["fedora:latest", "opensuse/leap:latest"]
steps:
- name: Install dependencies
- name: Setup Fedora
if: ${{ matrix.image == 'fedora:latest' }}
run: |
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
tree ~/.spack/bootstrap/store/
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
steps:
- name: Install dependencies
- name: Setup OpenSUSE
if: ${{ matrix.image == 'opensuse/leap:latest' }}
run: |
# Harden CI by applying the workaround described here: https://www.suse.com/support/kb/doc/?id=000019505
zypper update -y || zypper update -y
@@ -133,15 +37,9 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Setup repo
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -151,77 +49,102 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-sources:
runs-on: macos-latest
clingo-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install cmake bison@2.7 tree
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: "3.12"
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
export PATH=/usr/local/opt/bison@2.7/bin:$PATH
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-binaries:
runs-on: ${{ matrix.macos-version }}
gnupg-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
macos-version: ['macos-11', 'macos-12']
runner: [ 'macos-13', 'macos-14', "ubuntu-latest" ]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap clingo
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
export PATH="$old_path"
fi
fi
# NOTE: test all pythons that exist, not all do on 12
done
ubuntu-clingo-binaries:
runs-on: ubuntu-20.04
steps:
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Setup repo
- name: Bootstrap GnuPG
run: |
git --version
. .github/workflows/setup_git.sh
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
from-binaries:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: |
3.8
3.9
3.10
3.11
3.12
- name: Set bootstrap sources
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
- name: Bootstrap clingo
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
set -e
for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
echo "Testing $ver_dir"
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
@@ -236,122 +159,9 @@ jobs:
exit 1
fi
done
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-binaries:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-sources:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install gawk tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
# [1] Distros that have patched git to resolve CVE-2022-24765 (e.g. Ubuntu patching v2.25.1)
# introduce breaking behavior, so we have to set `safe.directory` in gitconfig ourselves.
# See:
# - https://github.blog/2022-04-12-git-security-vulnerability-announced/
# - https://github.com/actions/checkout/issues/760
# - http://changelogs.ubuntu.com/changelogs/pool/main/g/git/git_2.25.1-1ubuntu3.3/changelog

View File

@@ -45,17 +45,18 @@ jobs:
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:9'],
[rockylinux8, 'linux/amd64,linux/arm64', 'rockylinux:8'],
[rockylinux9, 'linux/amd64,linux/arm64', 'rockylinux:9'],
[fedora37, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:37'],
[fedora38, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:38']]
[fedora39, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:40']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -87,16 +88,16 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@a8a3f3ad30e3422c9c7b888a15615d19a852ae32
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@2b51285047da1547ffb1b2203d8be4c0af6b1f20
uses: docker/setup-buildx-action@d70bba72b1f3fd22344832f00baa16ece964efeb
- name: Log in to GitHub Container Registry
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
@@ -120,3 +121,14 @@ jobs:
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
merge-dockerfiles:
runs-on: ubuntu-latest
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
pattern: dockerfiles_*
delete-merged: true

View File

@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0

View File

@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

View File

@@ -1,4 +1,4 @@
black==24.3.0
black==24.4.2
clingo==5.7.1
flake8==7.0.0
isort==5.13.2

View File

@@ -51,7 +51,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
@@ -100,7 +100,7 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -141,7 +141,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version
@@ -160,7 +160,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -195,10 +195,10 @@ jobs:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [macos-latest, macos-14]
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -223,7 +223,7 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -70,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -15,7 +15,7 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -33,7 +33,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
@@ -41,7 +41,7 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -59,7 +59,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
@@ -67,7 +67,7 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

View File

@@ -1,3 +1,48 @@
# v0.21.2 (2024-03-01)
## Bugfixes
- Containerize: accommodate nested or pre-existing spack-env paths (#41558)
- Fix setup-env script, when going back and forth between instances (#40924)
- Fix using fully-qualified namespaces from root specs (#41957)
- Fix a bug when a required provider is requested for multiple virtuals (#42088)
- OCI buildcaches:
- only push in parallel when forking (#42143)
- use pickleable errors (#42160)
- Fix using sticky variants in externals (#42253)
- Fix a rare issue with conditional requirements and multi-valued variants (#42566)
## Package updates
- rust: add v1.75, rework a few variants (#41161,#41903)
- py-transformers: add v4.35.2 (#41266)
- mgard: fix OpenMP on AppleClang (#42933)
# v0.21.1 (2024-01-11)
## New features
- Add support for reading buildcaches created by Spack v0.22 (#41773)
## Bugfixes
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)
- Improve the error message for deprecated preferences (#41075)
- Fix MSVC preview version breaking clingo build on Windows (#41185)
- Fix multi-word aliases (#41126)
- Add a warning for unconfigured compiler (#41213)
- environment: fix an issue with deconcretization/reconcretization of specs (#41294)
- buildcache: don't error if a patch is missing, when installing from binaries (#41986)
- Multiple improvements to unit-tests (#41215,#41369,#41495,#41359,#41361,#41345,#41342,#41308,#41226)
## Package updates
- root: add a webgui patch to address security issue (#41404)
- BerkeleyGW: update source urls (#38218)
# v0.21.0 (2023-11-11)
`v0.21.0` is a major feature release.

View File

@@ -88,7 +88,7 @@ Resources:
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
for Q&A and discussions. Note the pinned discussions for announcements.
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
* **X**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
only for announcements. Please use other venues for discussions.

View File

@@ -144,3 +144,5 @@ switch($SpackSubCommand)
"unload" {Invoke-SpackLoad}
default {python "$Env:SPACK_ROOT/bin/spack" $SpackCMD_params $SpackSubCommand $SpackSubCommandArgs}
}
exit $LASTEXITCODE

View File

@@ -15,7 +15,7 @@ concretizer:
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization. If `dependencies`, we'll only reuse dependencies but
# give you a fresh concretization for your root specs.
reuse: dependencies
reuse: true
# Options that tune which targets are considered for concretization. The
# concretization process is very sensitive to the number targets, and the time
# needed to reach a solution increases noticeably with the number of targets

View File

@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]

View File

@@ -19,7 +19,6 @@ packages:
- apple-clang
- clang
- gcc
- intel
providers:
elf: [libelf]
fuse: [macfuse]

View File

@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]

View File

@@ -15,9 +15,10 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, intel, pgi, clang, xl, nag, fj, aocc]
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
D: [ldc]
daal: [intel-oneapi-daal]
@@ -35,6 +36,7 @@ packages:
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libc: [glibc, musl]
libgfortran: [ gcc-runtime ]
libglx: [mesa+glx, mesa18+glx]
libifcore: [ intel-oneapi-runtime ]

12
lib/spack/docs/_templates/layout.html vendored Normal file
View File

@@ -0,0 +1,12 @@
{% extends "!layout.html" %}
{%- block extrahead %}
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-S0PQ7WV75K"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'G-S0PQ7WV75K');
</script>
{% endblock %}

View File

@@ -865,7 +865,7 @@ There are several different ways to use Spack packages once you have
installed them. As you've seen, spack packages are installed into long
paths with hashes, and you need a way to get them into your path. The
easiest way is to use :ref:`spack load <cmd-spack-load>`, which is
described in the next section.
described in this section.
Some more advanced ways to use Spack packages include:
@@ -959,7 +959,86 @@ use ``spack find --loaded``.
You can also use ``spack load --list`` to get the same output, but it
does not have the full set of query options that ``spack find`` offers.
We'll learn more about Spack's spec syntax in the next section.
We'll learn more about Spack's spec syntax in :ref:`a later section <sec-specs>`.
.. _extensions:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Python packages and virtual environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack can install a large number of Python packages. Their names are
typically prefixed with ``py-``. Installing and using them is no
different from any other package:
.. code-block:: console
$ spack install py-numpy
$ spack load py-numpy
$ python3
>>> import numpy
The ``spack load`` command sets the ``PATH`` variable so that the right Python
executable is used, and makes sure that ``numpy`` and its dependencies can be
located in the ``PYTHONPATH``.
Spack is different from other Python package managers in that it installs
every package into its *own* prefix. This is in contrast to ``pip``, which
installs all packages into the same prefix, be it in a virtual environment
or not.
For many users, **virtual environments** are more convenient than repeated
``spack load`` commands, particularly when working with multiple Python
packages. Fortunately, Spack itself supports environments, which together
with a view are no different from Python virtual environments.
The recommended way of working with Python extensions such as ``py-numpy``
is through :ref:`Environments <environments>`. The following example creates
a Spack environment with ``numpy`` in the current working directory. It also
puts a filesystem view in ``./view``, which is a more traditional combined
prefix for all packages in the environment.
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
Now you can activate the environment and start using the packages:
.. code-block:: console
$ spack env activate .
$ python3
>>> import numpy
The environment view is also a virtual environment, which is useful if you are
sharing the environment with others who are unfamiliar with Spack. They can
either use the Python executable directly:
.. code-block:: console
$ ./view/bin/python3
>>> import numpy
or use the activation script:
.. code-block:: console
$ source ./view/bin/activate
$ python3
>>> import numpy
In general, there should not be much difference between ``spack env activate``
and using the virtual environment. The main advantage of ``spack env activate``
is that it knows about more packages than just Python packages, and it may set
additional runtime variables that are not covered by the virtual environment
activation script.
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
.. _sec-specs:
@@ -1705,165 +1784,6 @@ check only local packages (as opposed to those used transparently from
``upstream`` spack instances) and the ``-j,--json`` option to output
machine-readable json data for any errors.
.. _extensions:
---------------------------
Extensions & Python support
---------------------------
Spack's installation model assumes that each package will live in its
own install prefix. However, certain packages are typically installed
*within* the directory hierarchy of other packages. For example,
`Python <https://www.python.org>`_ packages are typically installed in the
``$prefix/lib/python-2.7/site-packages`` directory.
In Spack, installation prefixes are immutable, so this type of installation
is not directly supported. However, it is possible to create views that
allow you to merge install prefixes of multiple packages into a single new prefix.
Views are a convenient way to get a more traditional filesystem structure.
Using *extensions*, you can ensure that Python packages always share the
same prefix in the view as Python itself. Suppose you have
Python installed like so:
.. code-block:: console
$ spack find python
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
python@2.7.8
.. _cmd-spack-extensions:
^^^^^^^^^^^^^^^^^^^^
``spack extensions``
^^^^^^^^^^^^^^^^^^^^
You can find extensions for your Python installation like this:
.. code-block:: console
$ spack extensions python
==> python@2.7.8%gcc@4.4.7 arch=linux-debian7-x86_64-703c7a96
==> 36 extensions:
geos py-ipython py-pexpect py-pyside py-sip
py-basemap py-libxml2 py-pil py-pytz py-six
py-biopython py-mako py-pmw py-rpy2 py-sympy
py-cython py-matplotlib py-pychecker py-scientificpython py-virtualenv
py-dateutil py-mpi4py py-pygments py-scikit-learn
py-epydoc py-mx py-pylint py-scipy
py-gnuplot py-nose py-pyparsing py-setuptools
py-h5py py-numpy py-pyqt py-shiboken
==> 12 installed:
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-dateutil@2.4.0 py-nose@1.3.4 py-pyside@1.2.2
py-dateutil@2.4.0 py-numpy@1.9.1 py-pytz@2014.10
py-ipython@2.3.1 py-pygments@2.0.1 py-setuptools@11.3.1
py-matplotlib@1.4.2 py-pyparsing@2.0.3 py-six@1.9.0
The extensions are a subset of what's returned by ``spack list``, and
they are packages like any other. They are installed into their own
prefixes, and you can see this with ``spack find --paths``:
.. code-block:: console
$ spack find --paths py-numpy
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-numpy@1.9.1 ~/spack/opt/linux-debian7-x86_64/gcc@4.4.7/py-numpy@1.9.1-66733244
However, even though this package is installed, you cannot use it
directly when you run ``python``:
.. code-block:: console
$ spack load python
$ python
Python 2.7.8 (default, Feb 17 2015, 01:35:25)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named numpy
>>>
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Extensions in Environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The recommended way of working with extensions such as ``py-numpy``
above is through :ref:`Environments <environments>`. For example,
the following creates an environment in the current working directory
with a filesystem view in the ``./view`` directory:
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
We recommend environments for two reasons. Firstly, environments
can be activated (requires :ref:`shell-support`):
.. code-block:: console
$ spack env activate .
which sets all the right environment variables such as ``PATH`` and
``PYTHONPATH``. This ensures that
.. code-block:: console
$ python
>>> import numpy
works. Secondly, even without shell support, the view ensures
that Python can locate its extensions:
.. code-block:: console
$ ./view/bin/python
>>> import numpy
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
^^^^^^^^^^^^^^^^^^^^
Using ``spack load``
^^^^^^^^^^^^^^^^^^^^
A more traditional way of using Spack and extensions is ``spack load``
(requires :ref:`shell-support`). This will add the extension to ``PYTHONPATH``
in your current shell, and Python itself will be available in the ``PATH``:
.. code-block:: console
$ spack load py-numpy
$ python
>>> import numpy
The loaded packages can be checked using ``spack find --loaded``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Loading Extensions via Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Apart from ``spack env activate`` and ``spack load``, you can load numpy
through your environment modules (using ``environment-modules`` or
``lmod``). This will also add the extension to the ``PYTHONPATH`` in
your current shell.
.. code-block:: console
$ module load <name of numpy module>
If you do not know the name of the specific numpy module you wish to
load, you can use the ``spack module tcl|lmod loads`` command to get
the name of the module from the Spack spec.
-----------------------
Filesystem requirements
-----------------------

View File

@@ -220,6 +220,40 @@ section of the configuration:
.. _binary_caches_oci:
---------------------------------
Automatic push to a build cache
---------------------------------
Sometimes it is convenient to push packages to a build cache as soon as they are installed. Spack can do this by setting the autopush flag when adding a mirror:
.. code-block:: console
$ spack mirror add --autopush <name> <url or path>
Or the autopush flag can be set for an existing mirror:
.. code-block:: console
$ spack mirror set --autopush <name> # enable automatic push for an existing mirror
$ spack mirror set --no-autopush <name> # disable automatic push for an existing mirror
Then after installing a package it is automatically pushed to all mirrors with ``autopush: true``. The command
.. code-block:: console
$ spack install <package>
will have the same effect as
.. code-block:: console
$ spack install <package>
$ spack buildcache push <cache> <package> # for all caches with autopush: true
.. note::
Packages are automatically pushed to a build cache only if they are built from source.
-----------------------------------------
OCI / Docker V2 registries as build cache
-----------------------------------------

View File

@@ -21,23 +21,86 @@ is the following:
Reuse already installed packages
--------------------------------
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
The ``reuse`` attribute controls how aggressively Spack reuses binary packages during concretization. The
attribute can either be a single value, or an object for more complex configurations.
In the former case ("single value") it allows Spack to:
1. Reuse installed packages and buildcaches for all the specs to be concretized, when ``true``
2. Reuse installed packages and buildcaches only for the dependencies of the root specs, when ``dependencies``
3. Disregard reusing installed packages and buildcaches, when ``false``
If finer control over which specs are reused is needed, the value of this attribute can be
an object with the following keys:
1. ``roots``: if ``true`` root specs are reused, if ``false`` only dependencies of root specs are reused
2. ``from``: list of sources from which reused specs are taken
Each source in ``from`` is itself an object:
.. list-table:: Attributes for a source or reusable specs
:header-rows: 1
* - Attribute name
- Description
* - type (mandatory, string)
- Can be ``local``, ``buildcache``, or ``external``
* - include (optional, list of specs)
- If present, reusable specs must match at least one of the constraints in the list
* - exclude (optional, list of specs)
- If present, reusable specs must not match any of the constraints in the list.
For instance, the following configuration:
.. code-block:: yaml
concretizer:
reuse:
roots: true
from:
- type: local
include:
- "%gcc"
- "%clang"
tells the concretizer to reuse all specs compiled with either ``gcc`` or ``clang`` that are installed
in the local store. Any spec from remote buildcaches is disregarded.
To reduce the boilerplate in configuration files, default values for the ``include`` and
``exclude`` options can be pushed up one level:
.. code-block:: yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
from:
- type: local
- type: buildcache
- type: local
include:
- "foo %oneapi"
In the example above we reuse all specs compiled with ``gcc`` from the local store
and remote buildcaches, and we also reuse ``foo %oneapi``. Note that the last source of
specs overrides the default ``include`` attribute.
For one-off concretizations, there are command-line arguments for each of the simple "single value"
configurations. This means a user can:
.. code-block:: console
% spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
to enable reuse for a single installation, or:
.. code-block:: console
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: dependencies`` is the default.
.. seealso::

View File

@@ -147,6 +147,15 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
def autoreconf(self, spec, prefix):
which("bash")("autogen.sh")
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
class AutotoolsBuilder(AutotoolsBuilder):
def autoreconf(self, pkg, spec, prefix):
which("bash")("autogen.sh")
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""

View File

@@ -25,7 +25,7 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI


@@ -718,23 +718,45 @@ command-line tool, or C/C++/Fortran program with optional Python
modules? The former should be prepended with ``py-``, while the
latter should not.
""""""""""""""""""""""
extends vs. depends_on
""""""""""""""""""""""
""""""""""""""""""""""""""""""
``extends`` vs. ``depends_on``
""""""""""""""""""""""""""""""
This is very similar to the naming dilemma above, with a slight twist.
As mentioned in the :ref:`Packaging Guide <packaging_extensions>`,
``extends`` and ``depends_on`` are very similar, but ``extends`` ensures
that the extension and extendee share the same prefix in views.
This allows the user to import a Python module without
having to add that module to ``PYTHONPATH``.
Additionally, ``extends("python")`` adds a dependency on the package
``python-venv``. This improves isolation from the system, whether
it's during the build or at runtime: user and system site packages
cannot accidentally be used by any package that ``extends("python")``.
As a rule of thumb: if a package does not install any Python modules
of its own, and merely puts a Python script in the ``bin`` directory,
then there is no need for ``extends``. If the package installs modules
in the ``site-packages`` directory, it requires ``extends``.
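As an illustration, a hypothetical ``package.py`` fragment for each case (``PyExample`` and ``ExampleTool`` are made-up packages, not real ones):

.. code-block:: python

   class PyExample(PythonPackage):
       # installs modules into site-packages, so it must share a prefix with python
       extends("python")

   class ExampleTool(Package):
       # only installs a script into bin/, so python is an ordinary dependency
       depends_on("python", type=("build", "run"))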
"""""""""""""""""""""""""""""""""""""
Executing ``python`` during the build
"""""""""""""""""""""""""""""""""""""
Whenever you need to execute a Python command or pass the path of the
Python interpreter to the build system, it is best to use the global
variable ``python`` directly. For example:
.. code-block:: python
@run_before("install")
def recythonize(self):
python("setup.py", "clean") # use the `python` global
As mentioned in the previous section, ``extends("python")`` adds an
automatic dependency on ``python-venv``, which is a virtual environment
that guarantees build isolation. The ``python`` global always refers to
the correct Python interpreter, whether the package uses ``extends("python")``
or ``depends_on("python")``.
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack


@@ -150,7 +150,7 @@ this can expose you to attacks. Use at your own risk.
--------------------
Path to custom certificates for SSL verification. The value can be a
filesystem path, or an environment variable that expands to an absolute file path.
The default value is set to the environment variable ``SSL_CERT_FILE``
to use the same syntax used by many other applications that automatically
detect custom certificates.
@@ -160,6 +160,9 @@ in the subprocess calling ``curl``.
If ``url_fetch_method:urllib`` is set, then files and directories are supported, i.e.
``config:ssl_certs:$SSL_CERT_FILE`` or ``config:ssl_certs:$SSL_CERT_DIR``
will work.
In all cases the expanded path must be absolute for Spack to use the certificates.
Certificates relative to an environment can be specified by prepending the path
with the Spack configuration variable ``$env``.
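For instance, a sketch of an environment-relative certificate (the file name is illustrative):

.. code-block:: yaml

   config:
     ssl_certs: $env/certs/my-ca.pem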
--------------------
``checksum``


@@ -194,15 +194,15 @@ The OS that are currently supported are summarized in the table below:
* - Operating System
- Base Image
- Spack Image
* - Ubuntu 20.04
- ``ubuntu:20.04``
- ``spack/ubuntu-focal``
* - Ubuntu 22.04
- ``ubuntu:22.04``
- ``spack/ubuntu-jammy``
* - Ubuntu 24.04
- ``ubuntu:24.04``
- ``spack/ubuntu-noble``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
@@ -227,12 +227,12 @@ The OS that are currently supported are summarized in the table below:
* - Rocky Linux 9
- ``rockylinux:9``
- ``spack/rockylinux9``
* - Fedora Linux 38
- ``fedora:38``
- ``spack/fedora38``
* - Fedora Linux 39
- ``fedora:39``
- ``spack/fedora39``
* - Fedora Linux 40
- ``fedora:40``
- ``spack/fedora40``
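A minimal sketch of selecting one of these images in a ``spack.yaml`` used for container generation (the spec is illustrative):

.. code-block:: yaml

   spack:
     specs:
     - zlib
     container:
       images:
         os: ubuntu:24.04
         spack: develop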


@@ -552,11 +552,11 @@ With either interpreter you can run a single command:
.. code-block:: console
$ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
...
$ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
Out[1]: ...
or a file:
@@ -1071,9 +1071,9 @@ Announcing a release
We announce releases in all of the major Spack communication channels.
Publishing the release takes care of GitHub. The remaining channels are
X, Slack, and the mailing list. Here are the steps:
#. Announce the release on X.
* Compose the tweet on the ``@spackpm`` account per the
``spack-twitter`` slack channel.


@@ -142,12 +142,8 @@ user's prompt to begin with the environment name in brackets.
$ spack env activate -p myenv
[myenv] $ ...
The ``activate`` command can also be used to create a new environment if it does not already
exist.
.. code-block:: console
@@ -176,21 +172,36 @@ environment will remove the view from the user environment.
Anonymous Environments
^^^^^^^^^^^^^^^^^^^^^^
Apart from managed environments, Spack also supports anonymous environments.
Anonymous environments can be placed in any directory of choice.
.. note::
When uninstalling packages, Spack asks the user to confirm the removal of packages
that are still used in a managed environment. This is not the case for anonymous
environments.
To create an anonymous environment, use one of the following commands:
.. code-block:: console
$ spack env create --dir my_env
$ spack env create ./my_env
As a shorthand, you can also create an anonymous environment upon activation if it does not
already exist:
.. code-block:: console
$ spack env activate --create ./my_env
For convenience, Spack can also place an anonymous environment in a temporary directory for you:
.. code-block:: console
$ spack env activate --temp
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Environment Sensitive Commands
@@ -449,6 +460,125 @@ Sourcing that file in Bash will make the environment available to the
user; and can be included in ``.bashrc`` files, etc. The ``loads``
file may also be copied out of the environment, renamed, etc.
.. _environment_include_concrete:
------------------------------
Included Concrete Environments
------------------------------
Spack environments can be created based on the information in already
established environments. You can think of the result as a combination of existing
environments: Spack gathers the concrete specs from each included environment's
``spack.lock`` and uses them during the creation of the new, included concrete
environment. When an included concrete environment is created, it also generates
its own ``spack.lock`` file.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Creating included environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To create a combined concrete environment, you must have at least one existing
concrete environment. You will use the command ``spack env create`` with the
argument ``--include-concrete`` followed by the name or path of the environment
you'd like to include. Here is an example of how to create a combined environment
from the command line.
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
You can also include an environment directly in the ``spack.yaml`` file. It
involves adding the ``include_concrete`` heading to the YAML, followed by the
absolute paths to the environments you want to include.
.. code-block:: yaml
spack:
specs: []
concretizer:
unify: true
include_concrete:
- /absolute/path/to/environment1
- /absolute/path/to/environment2
Once the ``spack.yaml`` has been updated, you must concretize the environment to
get the concrete specs from the included environments.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Updating an included environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If changes are made to the base environment and you want them reflected in the
included environment, you need to reconcretize both the base environment and the
included environment. For example:
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
$ spack -e myenv find
==> In environment myenv
==> Root specs
python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
Here we see that ``included_env`` has access to the ``python`` package through
the ``myenv`` environment. But if we add another spec to ``myenv``,
``included_env`` does not automatically see the new information.
.. code-block:: console
$ spack -e myenv add perl
$ spack -e myenv concretize
$ spack -e myenv find
==> In environment myenv
==> Root specs
perl python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
It isn't until you run ``spack concretize`` in the combined environment that it
picks up the updated information from the reconcretized base environment.
.. code-block:: console
$ spack -e included_env concretize
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
perl python
==> 0 installed packages
.. _environment-configuration:
------------------------
@@ -800,6 +930,7 @@ For example, the following environment has three root packages:
This allows for a much-needed reduction in redundancy between packages
and constraints.
----------------
Filesystem Views
----------------
@@ -1033,7 +1164,7 @@ other targets to depend on the environment installation.
A typical workflow is as follows:
.. code-block:: console
spack env create -d .
spack -e . add perl
@@ -1126,7 +1257,7 @@ its dependencies. This can be useful when certain flags should only apply to
dependencies. Below we show a use case where a spec is installed with verbose
output (``spack install --verbose``) while its dependencies are installed silently:
.. code-block:: console
$ spack env depfile -o Makefile
@@ -1148,7 +1279,7 @@ This can be accomplished through the generated ``[<prefix>/]SPACK_PACKAGE_IDS``
variable. Assuming we have an active and concrete environment, we generate the
associated ``Makefile`` with a prefix ``example``:
.. code-block:: console
$ spack env depfile -o env.mk --make-prefix example
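As a sketch of how the prefixed variable can be consumed, a hypothetical ``push`` target that runs once per installed package:

.. code-block:: Makefile

   example/push/%: example/install/%
   	@echo "pushed $(notdir $@)"

   push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS))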


@@ -478,6 +478,13 @@ prefix, you can add them to the ``extra_attributes`` field. Similarly,
all other fields from the compilers config can be added to the
``extra_attributes`` field for an external representing a compiler.
Note that the format for the ``paths`` field in the
``extra_attributes`` section is different from the one in the ``compilers``
config. For compilers configured as external packages, the section is
named ``compilers`` and the dictionary maps language names (``c``,
``cxx``, ``fortran``) to paths, rather than using the names ``cc``,
``fc``, and ``f77``.
.. code-block:: yaml
packages:
@@ -493,11 +500,10 @@ all other fields from the compilers config can be added to the
- spec: llvm+clang@15.0.0 arch=linux-rhel8-skylake
prefix: /usr
extra_attributes:
compilers:
c: /usr/bin/clang-with-suffix
cxx: /usr/bin/clang++-with-extra-info
fortran: /usr/bin/gfortran
extra_rpaths:
- /usr/lib/llvm/
@@ -1572,6 +1578,8 @@ Microsoft Visual Studio
"""""""""""""""""""""""
Microsoft Visual Studio provides the only Windows C/C++ compiler that is currently supported by Spack.
Spack additionally requires the Windows SDK (including WGL) to be installed as part of your
Visual Studio installation, as it is required to build many packages from source.
We require several specific components to be included in the Visual Studio installation.
One is the C/C++ toolset, which can be selected as "Desktop development with C++" or "C++ build tools,"
@@ -1579,6 +1587,7 @@ depending on installation type (Professional, Build Tools, etc.) The other requ
"C++ CMake tools for Windows," which can be selected from among the optional packages.
This provides CMake and Ninja for use during Spack configuration.
If you already have Visual Studio installed, you can make sure these components are installed by
rerunning the installer. Next to your installation, select "Modify" and look at the
"Installation details" pane on the right.


@@ -6435,9 +6435,12 @@ the ``paths`` attribute:
echo "Target: x86_64-pc-linux-gnu"
echo "Thread model: posix"
echo "InstalledDir: /usr/bin"
platforms: ["linux", "darwin"]
results:
- spec: 'llvm@3.9.1 +clang~lld~lldb'
If the ``platforms`` attribute is present, tests are run only if the current host
matches one of the listed platforms.
Each test is performed by first creating a temporary directory structure as
specified in the corresponding ``layout`` and by then running
package detection and checking that the outcome matches the expected
@@ -6471,6 +6474,10 @@ package detection and checking that the outcome matches the expected
- A spec that is expected from detection
- Any valid spec
- Yes
* - ``results:[0]:extra_attributes``
- Extra attributes expected on the associated Spec
- Nested dictionary with strings as keys, and regular expressions as leaf values
- No
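For instance, a sketch of a detection test entry that checks ``extra_attributes`` with regular expressions (paths are illustrative):

.. code-block:: yaml

   results:
   - spec: 'llvm@15.0.0 +clang'
     extra_attributes:
       compilers:
         c: '.*/clang-15$'
         cxx: '.*/clang\+\+-15$'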
"""""""""""""""""""""""""""""""
Reuse tests from other packages


@@ -476,9 +476,3 @@ implemented using Python's built-in `sys.path
:py:mod:`spack.repo` module implements a custom `Python importer
<https://docs.python.org/2/library/imp.html>`_.


@@ -2,12 +2,12 @@ sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.18.0
urllib3==2.2.1
pytest==8.2.0
isort==5.13.2
black==24.4.2
flake8==7.0.0
mypy==1.10.0

lib/spack/env/cc vendored

@@ -47,7 +47,8 @@ SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS"
SPACK_SYSTEM_DIRS
SPACK_MANAGED_DIRS"
# Optional parameters that aren't required to be set
@@ -173,22 +174,6 @@ preextend() {
unset IFS
}
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
die "Compiler command line contains our separator ('${lsep}'). Cannot parse."
@@ -201,6 +186,18 @@ for param in $params; do
fi
done
# eval this because SPACK_MANAGED_DIRS and SPACK_SYSTEM_DIRS are inputs we don't want to loop over.
# moving the eval inside the function would eval it on every call.
eval "\
path_order() {
case \"\$1\" in
$SPACK_MANAGED_DIRS) return 0 ;;
$SPACK_SYSTEM_DIRS) return 2 ;;
/*) return 1 ;;
esac
}
"
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
@@ -248,7 +245,7 @@ case "$command" in
lang_flags=C
debug_flags="-g"
;;
c++|CC|g++|clang++|armclang++|icpc|icpx|pgc++|nvc++|xlc++|xlc++_r|FCC|amdclang++|crayCC)
command="$SPACK_CXX"
language="C++"
comp="CXX"
@@ -420,11 +417,12 @@ input_command="$*"
parse_Wl() {
while [ $# -ne 0 ]; do
if [ "$wl_expect_rpath" = yes ]; then
if system_dir "$1"; then
append return_system_rpath_dirs_list "$1"
else
append return_rpath_dirs_list "$1"
fi
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
wl_expect_rpath=no
else
case "$1" in
@@ -432,21 +430,25 @@ parse_Wl() {
arg="${1#-rpath=}"
if [ -z "$arg" ]; then
shift; continue
elif system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
--rpath=*)
arg="${1#--rpath=}"
if [ -z "$arg" ]; then
shift; continue
elif system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
-rpath|--rpath)
wl_expect_rpath=yes
@@ -473,12 +475,20 @@ categorize_arguments() {
return_other_args_list=""
return_isystem_was_used=""
return_isystem_spack_store_include_dirs_list=""
return_isystem_system_include_dirs_list=""
return_isystem_include_dirs_list=""
return_spack_store_include_dirs_list=""
return_system_include_dirs_list=""
return_include_dirs_list=""
return_spack_store_lib_dirs_list=""
return_system_lib_dirs_list=""
return_lib_dirs_list=""
return_spack_store_rpath_dirs_list=""
return_system_rpath_dirs_list=""
return_rpath_dirs_list=""
@@ -526,7 +536,7 @@ categorize_arguments() {
continue
fi
replaced="$after$stripped"
# it matched, remove it
shift
@@ -546,29 +556,32 @@ categorize_arguments() {
arg="${1#-isystem}"
return_isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_isystem_system_include_dirs_list "$arg"
else
append return_isystem_include_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_isystem_spack_store_include_dirs_list "$arg" ;;
1) append return_isystem_include_dirs_list "$arg" ;;
2) append return_isystem_system_include_dirs_list "$arg" ;;
esac
;;
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_system_include_dirs_list "$arg"
else
append return_include_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_include_dirs_list "$arg" ;;
1) append return_include_dirs_list "$arg" ;;
2) append return_system_include_dirs_list "$arg" ;;
esac
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_system_lib_dirs_list "$arg"
else
append return_lib_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_lib_dirs_list "$arg" ;;
1) append return_lib_dirs_list "$arg" ;;
2) append return_system_lib_dirs_list "$arg" ;;
esac
;;
-l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69,
@@ -601,29 +614,32 @@ categorize_arguments() {
break
elif [ "$xlinker_expect_rpath" = yes ]; then
# Register the path of -Xlinker -rpath <other args> -Xlinker <path>
if system_dir "$1"; then
append return_system_rpath_dirs_list "$1"
else
append return_rpath_dirs_list "$1"
fi
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
xlinker_expect_rpath=no
else
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
if system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
--rpath=*)
arg="${1#--rpath=}"
if system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
-rpath|--rpath)
xlinker_expect_rpath=yes
@@ -661,16 +677,25 @@ categorize_arguments() {
}
categorize_arguments "$@"
spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
system_include_dirs_list="$return_system_include_dirs_list"
include_dirs_list="$return_include_dirs_list"
spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
isystem_was_used="$return_isystem_was_used"
other_args_list="$return_other_args_list"
#
# Add flags from Spack's cppflags, cflags, cxxflags, fcflags, fflags, and
@@ -730,7 +755,7 @@ esac
# Linker flags
case "$mode" in
ccld)
extend spack_flags_list SPACK_LDFLAGS
;;
esac
@@ -738,16 +763,25 @@ esac
IFS="$lsep"
categorize_arguments $spack_flags_list
unset IFS
spack_flags_isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
spack_flags_isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
spack_flags_isystem_include_dirs_list="$return_isystem_include_dirs_list"
spack_flags_spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
spack_flags_system_include_dirs_list="$return_system_include_dirs_list"
spack_flags_include_dirs_list="$return_include_dirs_list"
spack_flags_spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
spack_flags_system_lib_dirs_list="$return_system_lib_dirs_list"
spack_flags_lib_dirs_list="$return_lib_dirs_list"
spack_flags_spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
spack_flags_system_rpath_dirs_list="$return_system_rpath_dirs_list"
spack_flags_rpath_dirs_list="$return_rpath_dirs_list"
spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_other_args_list="$return_other_args_list"
# On macOS insert headerpad_max_install_names linker flag
@@ -767,11 +801,13 @@ if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
# Append RPATH directories. Note that in the case of the
# top-level package these directories may not exist yet. For dependencies
# it is assumed that paths have already been confirmed.
extend spack_store_rpath_dirs_list SPACK_STORE_RPATH_DIRS
extend rpath_dirs_list SPACK_RPATH_DIRS
fi
fi
if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
extend spack_store_lib_dirs_list SPACK_STORE_LINK_DIRS
extend lib_dirs_list SPACK_LINK_DIRS
fi
@@ -798,38 +834,50 @@ case "$mode" in
;;
esac
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ] || [ "$isystem_was_used" = "true" ]; then
extend isystem_spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend isystem_include_dirs_list SPACK_INCLUDE_DIRS
else
extend spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend include_dirs_list SPACK_INCLUDE_DIRS
fi
;;
esac
#
# Finally, reassemble the command line.
#
args_list="$flags_list"
# Include search paths partitioned by (in store, non-system, system)
# NOTE: adding ${lsep} to the prefix here turns every added element into two
extend args_list spack_flags_include_dirs_list "-I"
extend args_list include_dirs_list "-I"
extend args_list spack_flags_spack_store_include_dirs_list -I
extend args_list spack_store_include_dirs_list -I
extend args_list spack_flags_include_dirs_list -I
extend args_list include_dirs_list -I
extend args_list spack_flags_isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list isystem_include_dirs_list "-isystem${lsep}"
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
elif [ "$isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
else
extend args_list SPACK_INCLUDE_DIRS "-I"
fi
;;
esac
extend args_list spack_flags_system_include_dirs_list -I
extend args_list system_include_dirs_list -I
extend args_list spack_flags_isystem_system_include_dirs_list "-isystem${lsep}"
extend args_list isystem_system_include_dirs_list "-isystem${lsep}"
# Library search paths partitioned by (in store, non-system, system)
extend args_list spack_flags_spack_store_lib_dirs_list "-L"
extend args_list spack_store_lib_dirs_list "-L"
extend args_list spack_flags_lib_dirs_list "-L"
extend args_list lib_dirs_list "-L"
extend args_list spack_flags_system_lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
@@ -839,8 +887,12 @@ case "$mode" in
if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "$rpath"
extend args_list spack_store_rpath_dirs_list "$rpath"
extend args_list spack_flags_rpath_dirs_list "$rpath"
extend args_list rpath_dirs_list "$rpath"
extend args_list spack_flags_system_rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
;;
@@ -848,8 +900,12 @@ case "$mode" in
if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_rpath_dirs_list "-rpath${lsep}"
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_system_rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
;;
@@ -913,4 +969,3 @@ fi
# Execute the full command, preserving spaces with IFS set
# to the alarm bell separator.
IFS="$lsep"; exec $full_command_list


@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.4 (commit 48b92512b9ce203ded0ebd1ac41b42593e931f7c)
astunparse
----------------


@@ -497,7 +497,7 @@ def copy_attributes(self, t, memo=None):
Tag.attrib, merge_attrib]:
if hasattr(self, a):
if memo is not None:
setattr(t, a, copy.deepcopy(getattr(self, a), memo))
else:
setattr(t, a, getattr(self, a))
# fmt: on


@@ -1,3 +1,3 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.3"
__version__ = "0.2.4"


@@ -5,9 +5,10 @@
"""The "cpu" package permits to query and compare different
CPU microarchitectures.
"""
from .detect import brand_string, host
from .microarchitecture import (
TARGETS,
InvalidCompilerVersion,
Microarchitecture,
UnsupportedMicroarchitecture,
generic_microarchitecture,
@@ -15,10 +16,12 @@
)
__all__ = [
"brand_string",
"host",
"TARGETS",
"InvalidCompilerVersion",
"Microarchitecture",
"UnsupportedMicroarchitecture",
"TARGETS",
"generic_microarchitecture",
"host",
"version_components",
]


@@ -155,6 +155,31 @@ def _is_bit_set(self, register: int, bit: int) -> bool:
mask = 1 << bit
return register & mask > 0
def brand_string(self) -> Optional[str]:
"""Returns the brand string, if available."""
if self.highest_extension_support < 0x80000004:
return None
r1 = self.cpuid.registers_for(eax=0x80000002, ecx=0)
r2 = self.cpuid.registers_for(eax=0x80000003, ecx=0)
r3 = self.cpuid.registers_for(eax=0x80000004, ecx=0)
result = struct.pack(
"IIIIIIIIIIII",
r1.eax,
r1.ebx,
r1.ecx,
r1.edx,
r2.eax,
r2.ebx,
r2.ecx,
r2.edx,
r3.eax,
r3.ebx,
r3.ecx,
r3.edx,
).decode("utf-8")
return result.strip("\x00")
@detection(operating_system="Windows")
def cpuid_info():
@@ -174,8 +199,8 @@ def _check_output(args, env):
WINDOWS_MAPPING = {
"AMD64": "x86_64",
"ARM64": "aarch64",
"AMD64": X86_64,
"ARM64": AARCH64,
}
@@ -409,3 +434,16 @@ def compatibility_check_for_riscv64(info, target):
return (target == arch_root or arch_root in target.ancestors) and (
target.name == info.name or target.vendor == "generic"
)
def brand_string() -> Optional[str]:
"""Returns the brand string of the host, if detected, or None."""
if platform.system() == "Darwin":
return _check_output(
["sysctl", "-n", "machdep.cpu.brand_string"], env=_ensure_bin_usrbin_in_path()
).strip()
if host().family == X86_64:
return CpuidInfoCollector().brand_string()
return None
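# A usage sketch for the new API (the printed value depends on the host CPU):
#
#   import archspec.cpu
#   print(archspec.cpu.brand_string())  # e.g. "AMD EPYC 7763 64-Core Processor", or None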


@@ -208,6 +208,8 @@ def optimization_flags(self, compiler, version):
"""Returns a string containing the optimization flags that needs
to be used to produce code optimized for this micro-architecture.
The version is expected to be a string of dot separated digits.
If there is no information on the compiler passed as argument the
function returns an empty string. If it is known that the compiler
version we want to use does not support this architecture the function
@@ -216,6 +218,11 @@ def optimization_flags(self, compiler, version):
Args:
compiler (str): name of the compiler to be used
version (str): version of the compiler to be used
Raises:
UnsupportedMicroarchitecture: if the requested compiler does not support
this micro-architecture.
ValueError: if the version doesn't match the expected format
"""
# If we don't have information on compiler at all return an empty string
if compiler not in self.family.compilers:
@@ -232,6 +239,14 @@ def optimization_flags(self, compiler, version):
msg = msg.format(compiler, best_target, best_target.family)
raise UnsupportedMicroarchitecture(msg)
# Check that the version matches the expected format
if not re.match(r"^(?:\d+\.)*\d+$", version):
msg = (
"invalid format for the compiler version argument. "
"Only dot separated digits are allowed."
)
raise InvalidCompilerVersion(msg)
# If we have information on this compiler we need to check the
# version being used
compiler_info = self.compilers[compiler]
@@ -292,7 +307,7 @@ def generic_microarchitecture(name):
Args:
name (str): name of the micro-architecture
"""
return Microarchitecture(name, parents=[], vendor="generic", features=[], compilers={})
return Microarchitecture(name, parents=[], vendor="generic", features=set(), compilers={})
def version_components(version):
@@ -367,7 +382,15 @@ def fill_target_from_dict(name, data, targets):
TARGETS = LazyDictionary(_known_microarchitectures)
class ArchspecError(Exception):
"""Base class for errors within archspec"""
class UnsupportedMicroarchitecture(ArchspecError, ValueError):
"""Raised if a compiler version does not support optimization for a given
micro-architecture.
"""
class InvalidCompilerVersion(ArchspecError, ValueError):
"""Raised when an invalid format is used for compiler versions in archspec."""


@@ -2937,8 +2937,6 @@
"ilrcpc",
"flagm",
"ssbs",
"paca",
"pacg",
"dcpodp",
"svei8mm",
"svebf16",
@@ -3066,8 +3064,6 @@
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"sve2",
"sveaes",
@@ -3081,8 +3077,7 @@
"svebf16",
"i8mm",
"bf16",
"dgh",
"bti"
"dgh"
],
"compilers" : {
"gcc": [


@@ -0,0 +1,13 @@
diff --git a/lib/spack/external/_vendoring/ruamel/yaml/comments.py b/lib/spack/external/_vendoring/ruamel/yaml/comments.py
index 1badeda585..892c868af3 100644
--- a/lib/spack/external/_vendoring/ruamel/yaml/comments.py
+++ b/lib/spack/external/_vendoring/ruamel/yaml/comments.py
@@ -497,7 +497,7 @@ def copy_attributes(self, t, memo=None):
Tag.attrib, merge_attrib]:
if hasattr(self, a):
if memo is not None:
- setattr(t, a, copy.deepcopy(getattr(self, a, memo)))
+ setattr(t, a, copy.deepcopy(getattr(self, a), memo))
else:
setattr(t, a, getattr(self, a))
# fmt: on


@@ -98,3 +98,10 @@ def path_filter_caller(*args, **kwargs):
if _func:
return holder_func(_func)
return holder_func
def sanitize_win_longpath(path: str) -> str:
"""Strip Windows extended path prefix from strings
Returns sanitized string.
no-op if extended path prefix is not present"""
return path.lstrip("\\\\?\\")
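# A sketch of the intended behavior (paths are illustrative):
#
#   sanitize_win_longpath("\\\\?\\C:\\spack")  -> "C:\\spack"
#   sanitize_win_longpath("C:\\spack")         -> "C:\\spack" (unchanged)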


@@ -187,26 +187,58 @@ def polite_filename(filename: str) -> str:
return _polite_antipattern().sub("_", filename)
def getuid() -> Union[str, int]:
"""Returns os getuid on non Windows
On Windows returns 0 for admin users, login string otherwise
This is in line with behavior from get_owner_uid which
always returns the login string on Windows
"""
if sys.platform == "win32":
import ctypes
# If not admin, use the string name of the login as a unique ID
if ctypes.windll.shell32.IsUserAnAdmin() == 0:
return os.getlogin()
return 0
else:
return os.getuid()
def _win_rename(src, dst):
# os.replace will still fail on Windows (but not POSIX) if the dst
# is a symlink to a directory (all other cases have parity Windows <-> Posix)
if os.path.islink(dst) and os.path.isdir(os.path.realpath(dst)):
if os.path.samefile(src, dst):
# src and dst are the same
# do nothing and exit early
return
# If dst exists and is a symlink to a directory
# we need to remove dst and then perform rename/replace
# this is safe to do as there's no chance src == dst now
os.remove(dst)
os.replace(src, dst)
@system_path_filter
def msdos_escape_parens(path):
"""MS-DOS interprets parens as grouping parameters even in a quoted string"""
if sys.platform == "win32":
return path.replace("(", "^(").replace(")", "^)")
else:
return path
@system_path_filter
def rename(src, dst):
# os.replace is the same as os.rename on POSIX and is MoveFileExW w/
# the MOVEFILE_REPLACE_EXISTING flag on Windows
# Windows invocation is abstracted behind additional logic handling
# remaining cases of divergent behavior across platforms
if sys.platform == "win32":
_win_rename(src, dst)
else:
os.replace(src, dst)
@system_path_filter
@@ -536,7 +568,13 @@ def exploding_archive_handler(tarball_container, stage):
@system_path_filter(arg_slice=slice(1))
def get_owner_uid(path, err_msg=None) -> Union[str, int]:
"""Returns owner UID of path destination
On non Windows this is the value of st_uid
On Windows this is the login string associated with the
owning user.
"""
if not os.path.exists(path):
mkdirp(path, mode=stat.S_IRWXU)
@@ -805,7 +843,7 @@ def copy_tree(
if islink(s):
link_target = resolve_link_target_relative_to_the_link(s)
if symlinks:
target = readlink(s)
if os.path.isabs(target):
def escaped_path(path):
@@ -1217,10 +1255,12 @@ def windows_sfn(path: os.PathLike):
import ctypes
k32 = ctypes.WinDLL("kernel32", use_last_error=True)
# Method with null values returns size of short path name
sz = k32.GetShortPathNameW(path, None, 0)
# stub Windows types TCHAR[LENGTH]
TCHAR_arr = ctypes.c_wchar * sz
ret_str = TCHAR_arr()
k32.GetShortPathNameW(path, ctypes.byref(ret_str), sz)
return ret_str.value
@@ -2410,9 +2450,10 @@ def add_library_dependent(self, *dest):
"""
for pth in dest:
if os.path.isfile(pth):
new_pth = pathlib.Path(pth).parent
else:
new_pth = pathlib.Path(pth)
self._additional_library_dependents.add(new_pth)
@property
def rpaths(self):
@@ -2490,8 +2531,14 @@ def establish_link(self):
# for each binary install dir in self.pkg (i.e. pkg.prefix.bin, pkg.prefix.lib)
# install a symlink to each dependent library
# do not rpath for system libraries included in the dag
# we should not be modifying libraries managed by the Windows system
# as this will negatively impact linker behavior and can result in permission
# errors if those system libs are not modifiable by Spack
if "windows-system" not in getattr(self.pkg, "tags", []):
for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
self._link(library, lib_dir)
@system_path_filter


@@ -11,7 +11,7 @@
from llnl.util import lang, tty
from ..path import sanitize_win_longpath, system_path_filter
if sys.platform == "win32":
from win32file import CreateHardLink
@@ -247,9 +247,9 @@ def _windows_create_junction(source: str, link: str):
out, err = proc.communicate()
tty.debug(out.decode())
if proc.returncode != 0:
err_str = err.decode()
tty.error(err_str)
raise SymlinkError("Make junction command returned a non-zero return code.", err_str)
def _windows_create_hard_link(path: str, link: str):
@@ -269,14 +269,14 @@ def _windows_create_hard_link(path: str, link: str):
CreateHardLink(link, path)
def readlink(path: str, *, dir_fd=None):
"""Spack utility to override of os.readlink method to work cross platform"""
if _windows_is_hardlink(path):
return _windows_read_hard_link(path)
elif _windows_is_junction(path):
return _windows_read_junction(path)
else:
return sanitize_win_longpath(os.readlink(path, dir_fd=dir_fd))
def _windows_read_hard_link(link: str) -> str:


@@ -12,7 +12,7 @@
import traceback
from datetime import datetime
from sys import platform as _platform
from typing import Any, NoReturn
if _platform != "win32":
import fcntl
@@ -158,21 +158,22 @@ def get_timestamp(force=False):
return ""
def msg(message: Any, *args: Any, newline: bool = True) -> None:
if not msg_enabled():
return
if isinstance(message, Exception):
message = "%s: %s" % (message.__class__.__name__, str(message))
message = f"{message.__class__.__name__}: {message}"
else:
message = str(message)
st_text = ""
if _stacktrace:
st_text = process_stacktrace(2)
nl = "\n" if newline else ""
cwrite(f"@*b{{{st_text}==>}} {get_timestamp()}{cescape(_output_filter(message))}{nl}")
for arg in args:
print(indent + _output_filter(str(arg)))


@@ -237,7 +237,6 @@ def transpose():
def colified(
elts: List[Any],
cols: int = 0,
indent: int = 0,
padding: int = 2,
tty: Optional[bool] = None,


@@ -59,9 +59,11 @@
To output an @, use '@@'. To output a } inside braces, use '}}'.
"""
import os
import re
import sys
from contextlib import contextmanager
from typing import Optional
class ColorParseError(Exception):
@@ -95,14 +97,34 @@ def __init__(self, message):
} # white
# Regex to be used for color formatting
COLOR_RE = re.compile(r"@(?:(@)|(\.)|([*_])?([a-zA-Z])?(?:{((?:[^}]|}})*)})?)")
# Mapping from color arguments to values for tty.set_color
color_when_values = {"always": True, "auto": None, "never": False}
def _color_when_value(when):
"""Raise a ValueError for an invalid color setting.
Valid values are 'always', 'never', and 'auto', or equivalently,
True, False, and None.
"""
if when in color_when_values:
return color_when_values[when]
elif when not in color_when_values.values():
raise ValueError("Invalid color setting: %s" % when)
return when
def _color_from_environ() -> Optional[bool]:
try:
return _color_when_value(os.environ.get("SPACK_COLOR", "auto"))
except ValueError:
return None
#: When `None` colorize when stdout is tty, when `True` or `False` always or never colorize resp.
_force_color = _color_from_environ()
def try_enable_terminal_color_on_windows():
@@ -163,19 +185,6 @@ def _err_check(result, func, args):
debug("Unable to support color on Windows terminal")
def get_color_when():
"""Return whether commands should print color or not."""
if _force_color is not None:
@@ -203,77 +212,64 @@ def color_when(value):
set_color_when(old_value)
def _escape(s: str, color: bool, enclose: bool, zsh: bool) -> str:
    """Returns a TTY escape sequence for a color"""
    if color:
        if zsh:
            result = rf"\e[0;{s}m"
        else:
            result = f"\033[{s}m"

        if enclose:
            result = rf"\[{result}\]"

        return result
    else:
        return ""


def colorize(
    string: str, color: Optional[bool] = None, enclose: bool = False, zsh: bool = False
) -> str:
    """Replace all color expressions in a string with ANSI control codes.

    Args:
        string: The string to replace

    Returns:
        The filtered string

    Keyword Arguments:
        color: If False, output will be plain text without control codes, for output to
            non-console devices (default: automatically choose color or not)
        enclose: If True, enclose ansi color sequences with
            square brackets to prevent misestimation of terminal width.
        zsh: If True, use zsh ansi codes instead of bash ones (for variables like PS1)
    """
    color = color if color is not None else get_color_when()

    def match_to_ansi(match):
        """Convert a match object generated by ``COLOR_RE`` into an ansi
        color code. This can be used as a handler in ``re.sub``.
        """
        escaped_at, dot, style, color_code, text = match.groups()
        if escaped_at:
            return "@"
        elif dot:
            return _escape(0, color, enclose, zsh)
        elif not (style or color_code):
            raise ColorParseError(
                f"Incomplete color format: '{match.group(0)}' in '{match.string}'"
            )

        ansi_code = _escape(f"{styles[style]};{colors.get(color_code, '')}", color, enclose, zsh)
        if text:
            return f"{ansi_code}{text}{_escape(0, color, enclose, zsh)}"
        else:
            return ansi_code

    return COLOR_RE.sub(match_to_ansi, string).replace("}}", "}")
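# A usage sketch (assuming stdout is a tty and color is enabled):
#
#   print(colorize("@*g{OK}"))               # bold green "OK"
#   print(colorize("@*g{OK}", color=False))  # plain "OK", no escape codes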
def clen(string):
@@ -305,7 +301,7 @@ def cprint(string, stream=None, color=None):
cwrite(string + "\n", stream, color)
def cescape(string: str) -> str:
"""Escapes special characters needed for color codes.
Replaces the following symbols with their equivalent literal forms:
@@ -321,10 +317,7 @@ def cescape(string):
Returns:
(str): the string with color codes escaped
"""
return string.replace("@", "@@").replace("}", "}}")
class ColorStream:


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.22.0.dev0"
__version__ = "0.23.0.dev0"
spack_version = __version__


@@ -254,8 +254,8 @@ def _search_duplicate_specs_in_externals(error_cls):
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.22)"""
# TODO (v0.22): remove this audit as the attributes will not be allowed in config
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.23)"""
# TODO (v0.23): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
@@ -1046,7 +1046,7 @@ def _extracts_errors(triggers, summary):
group="externals",
tag="PKG-EXTERNALS",
description="Sanity checks for external software detection",
kwargs=("pkgs",),
kwargs=("pkgs", "debug_log"),
)
@@ -1069,7 +1069,7 @@ def packages_with_detection_tests():
@external_detection
def _test_detection_by_executable(pkgs, debug_log, error_cls):
"""Test drive external detection for packages"""
import spack.detection
@@ -1095,6 +1095,7 @@ def _test_detection_by_executable(pkgs, error_cls):
for idx, test_runner in enumerate(
spack.detection.detection_tests(pkg_name, spack.repo.PATH)
):
debug_log(f"[{__file__}]: running test {idx} for package {pkg_name}")
specs = test_runner.execute()
expected_specs = test_runner.expected_specs
@@ -1111,4 +1112,75 @@ def _test_detection_by_executable(pkgs, error_cls):
details = [msg.format(s, idx) for s in sorted(not_expected)]
errors.append(error_cls(summary=summary, details=details))
matched_detection = []
for candidate in expected_specs:
try:
idx = specs.index(candidate)
matched_detection.append((candidate, specs[idx]))
except (AttributeError, ValueError):
pass
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str):
try:
_regex = re.compile(_expected)
except re.error:
_summary = f'{pkg_name}: illegal regex in "{_spec}" extra attributes'
_details = [f"{_expected} is not a valid regex"]
return [error_cls(summary=_summary, details=_details)]
if not _regex.match(_detected):
_summary = (
f'{pkg_name}: error when trying to match "{_expected}" '
f"in extra attributes"
)
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
if isinstance(_expected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
_details = [
f'"{_expected}" was expected',
f'"{_detected}" was detected',
] + [f'attribute "{s}" was not detected' for s in sorted(_not_detected)]
result.append(error_cls(summary=_summary, details=_details))
_common = set(_expected.keys()) & set(_detected.keys())
for _key in _common:
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
return result
for expected, detected in matched_detection:
# We might not want to test all attributes, so avoid not_expected
not_detected = set(expected.extra_attributes) - set(detected.extra_attributes)
if not_detected:
summary = f"{pkg_name}: cannot detect some attributes for spec {expected}"
details = [
f'"{s}" was not detected [test_id={idx}]' for s in sorted(not_detected)
]
errors.append(error_cls(summary=summary, details=details))
common = set(expected.extra_attributes) & set(detected.extra_attributes)
for key in common:
errors.extend(
_compare_extra_attribute(
expected.extra_attributes[key],
detected.extra_attributes[key],
_spec=expected,
)
)
return errors


@@ -29,6 +29,7 @@
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, visit_directory_tree
from llnl.util.symlink import readlink
import spack.caches
import spack.cmd
@@ -658,7 +659,7 @@ def get_buildfile_manifest(spec):
# 2. paths are used as strings.
for rel_path in visitor.symlinks:
abs_path = os.path.join(root, rel_path)
link = readlink(abs_path)
if os.path.isabs(link) and link.startswith(spack.store.STORE.layout.root):
data["link_to_relocate"].append(rel_path)
@@ -2001,6 +2002,7 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
with spack.util.path.filter_padding():
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, force)
spec.package.windows_establish_runtime_linkage()
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout)


@@ -5,7 +5,13 @@
"""Function and classes needed to bootstrap Spack itself."""
from .config import ensure_bootstrap_configuration, is_bootstrapping, store_path
from .core import (
all_core_root_specs,
ensure_clingo_importable_or_raise,
ensure_core_dependencies,
ensure_gpg_in_path_or_raise,
ensure_patchelf_in_path_or_raise,
)
from .environment import BootstrapEnvironment, ensure_environment_dependencies
from .status import status_message
@@ -13,6 +19,8 @@
"is_bootstrapping",
"ensure_bootstrap_configuration",
"ensure_core_dependencies",
"ensure_gpg_in_path_or_raise",
"ensure_clingo_importable_or_raise",
"ensure_patchelf_in_path_or_raise",
"all_core_root_specs",
"ensure_environment_dependencies",


@@ -54,10 +54,14 @@ def _try_import_from_store(
installed_specs = spack.store.STORE.db.query(query_spec, installed=True)
for candidate_spec in installed_specs:
pkg = candidate_spec["python"].package
# previously bootstrapped specs may not have a python-venv dependency.
if candidate_spec.dependencies("python-venv"):
python, *_ = candidate_spec.dependencies("python-venv")
else:
python, *_ = candidate_spec.dependencies("python")
module_paths = [
os.path.join(candidate_spec.prefix, python.package.purelib),
os.path.join(candidate_spec.prefix, python.package.platlib),
]
path_before = list(sys.path)


@@ -173,35 +173,14 @@ def _read_metadata(self, package_name: str) -> Any:
return data
def _install_by_hash(
self, pkg_hash: str, pkg_sha256: str, bincache_platform: spack.platforms.Platform
) -> None:
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
for match in spack.store.find([f"/{pkg_hash}"], multiple=False, query_fn=query):
spack.binary_distribution.install_root_node(
match, unsigned=True, force=True, sha256=pkg_sha256
)
def _install_and_test(
self,
@@ -232,7 +211,7 @@ def _install_and_test(
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)
info: ConfigDictionary = {}
if test_fn(query_spec=abstract_spec, query_info=info):
@@ -291,10 +270,6 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme
@@ -559,6 +534,41 @@ def ensure_patchelf_in_path_or_raise() -> spack.util.executable.Executable:
)
def ensure_winsdk_external_or_raise() -> None:
"""Ensure the Windows SDK + WGL are available on system
If both of these packages are found, the Spack user or bootstrap
configuration (depending on where Spack is running)
will be updated to include all versions and variants detected.
If either the win-sdk or wgl package is not found, this method will raise
a RuntimeError.
**NOTE:** This modifies the Spack config in the current scope,
either user or environment depending on the calling context.
This is different from all other current bootstrap dependency
checks.
"""
if set(["win-sdk", "wgl"]).issubset(spack.config.get("packages").keys()):
return
externals = spack.detection.by_path(["win-sdk", "wgl"])
if not set(["win-sdk", "wgl"]) == externals.keys():
missing_packages_lst = []
if "wgl" not in externals:
missing_packages_lst.append("wgl")
if "win-sdk" not in externals:
missing_packages_lst.append("win-sdk")
missing_packages = " & ".join(missing_packages_lst)
raise RuntimeError(
f"Unable to find the {missing_packages}, please install these packages \
via the Visual Studio installer \
before proceeding with Spack or provide the path to a non standard install with \
'spack external find --path'"
)
# wgl/sdk are not required for bootstrapping Spack, but
# are required for building anything non-trivial;
# add to user config so they can be used by subsequent Spack ops
spack.detection.update_configuration(externals, buildable=False)
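For illustration, the detection gate above boils down to a set-subset check. A minimal, self-contained sketch follows, where `externals` stands in for the dictionary returned by spack.detection.by_path() (the detection result shown is hypothetical):

    externals = {"win-sdk": ["<detected spec>"]}  # hypothetical: wgl was not found
    required = {"win-sdk", "wgl"}
    missing = sorted(required - externals.keys())  # dict views support set operations
    if missing:
        print("missing:", " & ".join(missing))  # -> missing: wgl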
def ensure_core_dependencies() -> None:
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":

View File

@@ -3,13 +3,11 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap non-core Spack dependencies from an environment."""
import glob
import hashlib
import os
import pathlib
import sys
import warnings
from typing import List
from typing import Iterable, List
import archspec.cpu
@@ -28,6 +26,16 @@
class BootstrapEnvironment(spack.environment.Environment):
"""Environment to install dependencies of Spack for a given interpreter and architecture"""
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
# Remove python package roots created before python-venv was introduced
for s in self.concrete_roots():
if "python" in s.package.extendees and not s.dependencies("python-venv"):
self.deconcretize(s)
@classmethod
def spack_dev_requirements(cls) -> List[str]:
"""Spack development requirements"""
@@ -59,31 +67,19 @@ def view_root(cls) -> pathlib.Path:
return cls.environment_root().joinpath("view")
@classmethod
def pythonpaths(cls) -> List[str]:
"""Paths to be added to sys.path or PYTHONPATH"""
python_dir_part = f"python{'.'.join(str(x) for x in sys.version_info[:2])}"
glob_expr = str(cls.view_root().joinpath("**", python_dir_part, "**"))
result = glob.glob(glob_expr)
if not result:
msg = f"Cannot find any Python path in {cls.view_root()}"
warnings.warn(msg)
return result
@classmethod
def bin_dirs(cls) -> List[pathlib.Path]:
def bin_dir(cls) -> pathlib.Path:
"""Paths to be added to PATH"""
return [cls.view_root().joinpath("bin")]
return cls.view_root().joinpath("bin")
def python_dirs(self) -> Iterable[pathlib.Path]:
python = next(s for s in self.all_specs_generator() if s.name == "python-venv").package
return {self.view_root().joinpath(p) for p in (python.platlib, python.purelib)}
@classmethod
def spack_yaml(cls) -> pathlib.Path:
"""Environment spack.yaml file"""
return cls.environment_root().joinpath("spack.yaml")
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
@@ -100,21 +96,13 @@ def update_installations(self) -> None:
self.install_all()
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
the environment view.
"""
# Do minimal modifications to sys.path and environment variables. In particular, pay
# attention to have the smallest PYTHONPATH / sys.path possible, since that may impact
# the performance of the current interpreter
sys.path.extend(self.pythonpaths())
os.environ["PATH"] = os.pathsep.join(
[str(x) for x in self.bin_dirs()] + os.environ.get("PATH", "").split(os.pathsep)
)
os.environ["PYTHONPATH"] = os.pathsep.join(
os.environ.get("PYTHONPATH", "").split(os.pathsep)
+ [str(x) for x in self.pythonpaths()]
)
def load(self) -> None:
"""Update PATH and sys.path."""
# Make executables available (shouldn't need PYTHONPATH)
os.environ["PATH"] = f"{self.bin_dir()}{os.pathsep}{os.environ.get('PATH', '')}"
# Spack itself imports pytest
sys.path.extend(str(p) for p in self.python_dirs())
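A hedged, self-contained sketch of the minimal mutation load() performs (the view paths are hypothetical):

    import os
    import sys

    def load_view(bin_dir: str, python_dirs: list) -> None:
        # Prepend the view's bin directory to PATH and extend sys.path with the
        # interpreter's site-packages directories; nothing else is touched.
        os.environ["PATH"] = f"{bin_dir}{os.pathsep}{os.environ.get('PATH', '')}"
        sys.path.extend(str(p) for p in python_dirs)

    load_view("/opt/bootstrap/view/bin",
              ["/opt/bootstrap/view/lib/python3.11/site-packages"])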
def _write_spack_yaml_file(self) -> None:
tty.msg(
@@ -164,4 +152,4 @@ def ensure_environment_dependencies() -> None:
_add_externals_if_missing()
with BootstrapEnvironment() as env:
env.update_installations()
env.update_syspath_and_environ()
env.load()

View File

@@ -43,7 +43,7 @@
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Tuple
from typing import Dict, List, Set, Tuple
import llnl.util.tty as tty
from llnl.string import plural
@@ -57,8 +57,10 @@
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.compilers
import spack.config
import spack.deptypes as dt
import spack.error
import spack.main
import spack.package_base
import spack.paths
@@ -66,6 +68,7 @@
import spack.repo
import spack.schema.environment
import spack.spec
import spack.stage
import spack.store
import spack.subprocess_context
import spack.user_environment
@@ -78,7 +81,7 @@
from spack.installer import InstallError
from spack.util.cpus import determine_number_of_jobs
from spack.util.environment import (
SYSTEM_DIRS,
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
env_flag,
filter_system_paths,
@@ -101,9 +104,13 @@
# Spack's compiler wrappers.
#
SPACK_ENV_PATH = "SPACK_ENV_PATH"
SPACK_MANAGED_DIRS = "SPACK_MANAGED_DIRS"
SPACK_INCLUDE_DIRS = "SPACK_INCLUDE_DIRS"
SPACK_LINK_DIRS = "SPACK_LINK_DIRS"
SPACK_RPATH_DIRS = "SPACK_RPATH_DIRS"
SPACK_STORE_INCLUDE_DIRS = "SPACK_STORE_INCLUDE_DIRS"
SPACK_STORE_LINK_DIRS = "SPACK_STORE_LINK_DIRS"
SPACK_STORE_RPATH_DIRS = "SPACK_STORE_RPATH_DIRS"
SPACK_RPATH_DEPS = "SPACK_RPATH_DEPS"
SPACK_LINK_DEPS = "SPACK_LINK_DEPS"
SPACK_PREFIX = "SPACK_PREFIX"
@@ -416,7 +423,7 @@ def set_compiler_environment_variables(pkg, env):
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", ":".join(SYSTEM_DIRS))
env.set("SPACK_SYSTEM_DIRS", SYSTEM_DIR_CASE_ENTRY)
compiler.setup_custom_environment(pkg, env)
@@ -544,9 +551,26 @@ def update_compiler_args_for_dep(dep):
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
env.set(SPACK_LINK_DIRS, ":".join(link_dirs))
env.set(SPACK_INCLUDE_DIRS, ":".join(include_dirs))
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs))
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
spack_managed_dirs: Set[str] = {
spack.stage.get_stage_root(),
spack.store.STORE.db.root,
*(db.root for db in spack.store.STORE.db.upstream_dbs),
}
spack_managed_dirs.update([os.path.realpath(p) for p in spack_managed_dirs])
env.set(SPACK_MANAGED_DIRS, "|".join(f'"{p}/"*' for p in sorted(spack_managed_dirs)))
is_spack_managed = lambda p: any(p.startswith(store) for store in spack_managed_dirs)
link_dirs_spack, link_dirs_system = stable_partition(link_dirs, is_spack_managed)
include_dirs_spack, include_dirs_system = stable_partition(include_dirs, is_spack_managed)
rpath_dirs_spack, rpath_dirs_system = stable_partition(rpath_dirs, is_spack_managed)
env.set(SPACK_LINK_DIRS, ":".join(link_dirs_system))
env.set(SPACK_INCLUDE_DIRS, ":".join(include_dirs_system))
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs_system))
env.set(SPACK_STORE_LINK_DIRS, ":".join(link_dirs_spack))
env.set(SPACK_STORE_INCLUDE_DIRS, ":".join(include_dirs_spack))
env.set(SPACK_STORE_RPATH_DIRS, ":".join(rpath_dirs_spack))
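To make the partitioning above concrete, here is a minimal sketch with hypothetical paths; stable_partition is re-implemented locally as a stand-in for the llnl.util.lang helper the real code uses:

    spack_managed_dirs = {"/opt/spack/store", "/tmp/spack-stage"}  # hypothetical roots

    def stable_partition(items, predicate):
        # Split items into (matching, non-matching) while preserving order.
        true_items, false_items = [], []
        for item in items:
            (true_items if predicate(item) else false_items).append(item)
        return true_items, false_items

    link_dirs = ["/opt/spack/store/gcc/lib", "/usr/lib", "/tmp/spack-stage/foo/lib"]
    is_spack_managed = lambda p: any(p.startswith(store) for store in spack_managed_dirs)
    spack_dirs, system_dirs = stable_partition(link_dirs, is_spack_managed)
    # spack_dirs  -> ['/opt/spack/store/gcc/lib', '/tmp/spack-stage/foo/lib']
    # system_dirs -> ['/usr/lib']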
def set_package_py_globals(pkg, context: Context = Context.BUILD):
@@ -583,10 +607,22 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc, so don't put it context == build)
link_dir = spack.paths.build_env_path
module.spack_cc = os.path.join(link_dir, pkg.compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg.compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg.compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg.compiler.link_paths["fc"])
pkg_compiler = None
try:
pkg_compiler = pkg.compiler
except spack.compilers.NoCompilerForSpecError as e:
tty.debug(f"cannot set 'spack_cc': {str(e)}")
if pkg_compiler is not None:
module.spack_cc = os.path.join(link_dir, pkg_compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg_compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg_compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg_compiler.link_paths["fc"])
else:
module.spack_cc = None
module.spack_cxx = None
module.spack_f77 = None
module.spack_fc = None
# Useful directories within the prefix are encapsulated in
# a Prefix object.
@@ -694,12 +730,28 @@ def _static_to_shared_library(arch, compiler, static_lib, shared_lib=None, **kwa
return compiler(*compiler_args, output=compiler_output)
def get_rpath_deps(pkg):
"""Return immediate or transitive RPATHs depending on the package."""
if pkg.transitive_rpaths:
return [d for d in pkg.spec.traverse(root=False, deptype=("link"))]
else:
return pkg.spec.dependencies(deptype="link")
def _get_rpath_deps_from_spec(
spec: spack.spec.Spec, transitive_rpaths: bool
) -> List[spack.spec.Spec]:
if not transitive_rpaths:
return spec.dependencies(deptype=dt.LINK)
by_name: Dict[str, spack.spec.Spec] = {}
for dep in spec.traverse(root=False, deptype=dt.LINK):
lookup = by_name.get(dep.name)
if lookup is None:
by_name[dep.name] = dep
elif lookup.version < dep.version:
by_name[dep.name] = dep
return list(by_name.values())
def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]:
"""Return immediate or transitive dependencies (depending on the package) that need to be
rpath'ed. If a package occurs multiple times, the newest version is kept."""
return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)
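Illustrative only: the newest-version deduplication performed by _get_rpath_deps_from_spec, sketched with plain named tuples instead of real Spec objects:

    from typing import Dict, List, NamedTuple

    class FakeDep(NamedTuple):
        name: str
        version: tuple

    def newest_by_name(deps: List[FakeDep]) -> List[FakeDep]:
        # Keep one entry per name, preferring the higher version.
        by_name: Dict[str, FakeDep] = {}
        for dep in deps:
            lookup = by_name.get(dep.name)
            if lookup is None or lookup.version < dep.version:
                by_name[dep.name] = dep
        return list(by_name.values())

    deps = [FakeDep("zlib", (1, 2)), FakeDep("zlib", (1, 3)), FakeDep("bzip2", (1, 0))]
    print(newest_by_name(deps))  # keeps zlib (1, 3) and bzip2 (1, 0) only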
def get_rpaths(pkg):

View File

@@ -434,11 +434,6 @@ def _do_patch_libtool(self):
r"crtendS\.o",
]:
x.filter(regex=(rehead + o), repl="")
elif self.pkg.compiler.name == "dpcpp":
# Hack to filter out spurious predep_objects when building with Intel dpcpp
# (see https://github.com/spack/spack/issues/32863):
x.filter(regex=r"^(predep_objects=.*)/tmp/conftest-[0-9A-Fa-f]+\.o", repl=r"\1")
x.filter(regex=r"^(predep_objects=.*)/tmp/a-[0-9A-Fa-f]+\.o", repl=r"\1")
elif self.pkg.compiler.name == "nag":
for tag in ["fc", "f77"]:
marker = markers[tag]

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
import os
import re
from typing import Tuple
import llnl.util.filesystem as fs
@@ -15,6 +16,12 @@
from .cmake import CMakeBuilder, CMakePackage
def spec_uses_toolchain(spec):
gcc_toolchain_regex = re.compile(".*gcc-toolchain.*")
using_toolchain = list(filter(gcc_toolchain_regex.match, spec.compiler_flags["cxxflags"]))
return using_toolchain
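A small sketch of the flag check above, with a hypothetical cxxflags list in place of spec.compiler_flags["cxxflags"]:

    import re

    gcc_toolchain_regex = re.compile(".*gcc-toolchain.*")
    cxxflags = ["-O2", "--gcc-toolchain=/opt/gcc-12"]  # hypothetical flags
    using_toolchain = list(filter(gcc_toolchain_regex.match, cxxflags))
    print(using_toolchain)  # -> ['--gcc-toolchain=/opt/gcc-12']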
def cmake_cache_path(name, value, comment="", force=False):
"""Generate a string for a cmake cache variable"""
force_str = " FORCE" if force else ""
@@ -213,7 +220,7 @@ def initconfig_mpi_entries(self):
else:
# starting with cmake 3.10, FindMPI expects MPIEXEC_EXECUTABLE
# vs the older versions which expect MPIEXEC
if self.pkg.spec["cmake"].satisfies("@3.10:"):
if spec["cmake"].satisfies("@3.10:"):
entries.append(cmake_cache_path("MPIEXEC_EXECUTABLE", mpiexec))
else:
entries.append(cmake_cache_path("MPIEXEC", mpiexec))
@@ -248,12 +255,17 @@ def initconfig_hardware_entries(self):
# Include the deprecated CUDA_TOOLKIT_ROOT_DIR for supporting BLT packages
entries.append(cmake_cache_path("CUDA_TOOLKIT_ROOT_DIR", cudatoolkitdir))
archs = spec.variants["cuda_arch"].value
if archs[0] != "none":
arch_str = ";".join(archs)
entries.append(
cmake_cache_string("CMAKE_CUDA_ARCHITECTURES", "{0}".format(arch_str))
)
# CUDA_FLAGS
cuda_flags = []
if not spec.satisfies("cuda_arch=none"):
cuda_archs = ";".join(spec.variants["cuda_arch"].value)
entries.append(cmake_cache_string("CMAKE_CUDA_ARCHITECTURES", cuda_archs))
if spec_uses_toolchain(spec):
cuda_flags.append("-Xcompiler {}".format(spec_uses_toolchain(spec)[0]))
entries.append(cmake_cache_string("CMAKE_CUDA_FLAGS", " ".join(cuda_flags)))
if "+rocm" in spec:
entries.append("#------------------{0}".format("-" * 30))
@@ -262,9 +274,6 @@ def initconfig_hardware_entries(self):
# Explicitly setting HIP_ROOT_DIR may be a patch that is no longer necessary
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
entries.append(
cmake_cache_path("HIP_CXX_COMPILER", "{0}".format(self.spec["hip"].hipcc))
)
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
@@ -277,11 +286,9 @@ def initconfig_hardware_entries(self):
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)
entries.append(
cmake_cache_string("CMAKE_HIP_ARCHITECTURES", "{0}".format(arch_str))
)
entries.append(cmake_cache_string("AMDGPU_TARGETS", "{0}".format(arch_str)))
entries.append(cmake_cache_string("GPU_TARGETS", "{0}".format(arch_str)))
entries.append(cmake_cache_string("CMAKE_HIP_ARCHITECTURES", arch_str))
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
return entries

View File

@@ -16,7 +16,7 @@
class CargoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
"""Specialized class for packages built using cargo."""
#: This attribute is used in UI queries that need to know the build
#: system base class

View File

@@ -39,16 +39,11 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
"""Set the PYTHON_EXECUTABLE, Python_EXECUTABLE, and Python3_EXECUTABLE CMake variables
if the package has Python as build or link dep and ``find_python_hints`` is set to True. See
``find_python_hints`` for context."""
if not getattr(pkg, "find_python_hints", False):
if not getattr(pkg, "find_python_hints", False) or not pkg.spec.dependencies(
"python", dt.BUILD | dt.LINK
):
return
pythons = pkg.spec.dependencies("python", dt.BUILD | dt.LINK)
if len(pythons) != 1:
return
try:
python_executable = pythons[0].package.command.path
except RuntimeError:
return
python_executable = pkg.spec["python"].command.path
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),

View File

@@ -0,0 +1,144 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import os
import pathlib
import re
import sys
from typing import Dict, List, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty
import spack.compiler
import spack.package_base
# Local "type" for type hints
Path = Union[str, pathlib.Path]
class CompilerPackage(spack.package_base.PackageBase):
"""A Package mixin for all common logic for packages that implement compilers"""
# TODO: how do these play nicely with other tags
tags: Sequence[str] = ["compiler"]
#: Optional suffix regexes for searching for this type of compiler.
#: Suffixes are used by some frameworks, e.g. macports uses an '-mp-X.Y'
#: version suffix for gcc.
compiler_suffixes: List[str] = [r"-.*"]
#: Optional prefix regexes for searching for this compiler
compiler_prefixes: List[str] = []
#: Compiler argument(s) that produces version information
#: If multiple arguments, the earlier arguments must produce errors when invalid
compiler_version_argument: Union[str, Tuple[str]] = "-dumpversion"
#: Regex used to extract version from compiler's output
compiler_version_regex: str = "(.*)"
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
msg += f" supports: {self.supported_languages}, valid values: {self.compiler_languages}"
assert set(self.supported_languages) <= set(self.compiler_languages), msg
@property
def supported_languages(self) -> Sequence[str]:
"""Dynamic definition of languages supported by this package"""
return self.compiler_languages
@classproperty
def compiler_names(cls) -> Sequence[str]:
"""Construct list of compiler names from per-language names"""
names = []
for language in cls.compiler_languages:
names.extend(getattr(cls, f"{language}_names"))
return names
@classproperty
def executables(cls) -> Sequence[str]:
"""Construct executables for external detection from names, prefixes, and suffixes."""
regexp_fmt = r"^({0}){1}({2})$"
prefixes = [""] + cls.compiler_prefixes
suffixes = [""] + cls.compiler_suffixes
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
suffixes += [suf + ext for suf in suffixes]
return [
regexp_fmt.format(prefix, re.escape(name), suffix)
for prefix, name, suffix in itertools.product(prefixes, cls.compiler_names, suffixes)
]
@classmethod
def determine_version(cls, exe: Path):
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = spack.compiler.get_compiler_version_output(exe, va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
except spack.util.executable.ProcessError:
pass
except Exception as e:
tty.debug(
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
"""Overridable method for the location of the compiler bindir within the preifx"""
return os.path.join(prefix, "bin")
@classmethod
def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
"""Compute the paths to compiler executables associated with this package
This is a helper method for ``determine_variants`` to compute the ``extra_attributes``
to include with each spec object."""
# There are often at least two copies (not symlinks) of each compiler executable in the
# same directory: one with a canonical name, e.g. "gfortran", and another one with the
# target prefix, e.g. "x86_64-pc-linux-gnu-gfortran". There also might be a copy of "gcc"
# with the version suffix, e.g. "x86_64-pc-linux-gnu-gcc-6.3.0". To ensure the consistency
# of values in the "paths" dictionary (i.e. we prefer all of them to reference copies
# with canonical names if possible), we iterate over the executables in the reversed sorted
# order:
# The first pass over languages identifies exes that exactly match canonical names.
# The second pass checks for names with a prefix/suffix; it iterates languages sorted
# by name length (longest first), because longer-named languages (e.g. cxx) often
# contain the names of shorter-named ones (e.g. c, as in clang vs. clang++).
paths = {}
exes = sorted(exes, reverse=True)
languages = {
lang: getattr(cls, f"{lang}_names")
for lang in sorted(cls.compiler_languages, key=len, reverse=True)
}
for exe in exes:
for lang, names in languages.items():
if os.path.basename(exe) in names:
paths[lang] = exe
break
else:
for lang, names in languages.items():
if any(name in os.path.basename(exe) for name in names):
paths[lang] = exe
break
return paths
@classmethod
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}

View File

@@ -21,7 +21,7 @@
class MakefilePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
"""Specialized class for packages built using Makefiles."""
#: This attribute is used in UI queries that need to know the build
#: system base class

View File

@@ -145,7 +145,7 @@ def install(self, pkg, spec, prefix):
opts += self.nmake_install_args()
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
opts.append(self.define("PREFIX", prefix))
opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.install_targets, ignore_quotes=self.ignore_quotes

View File

@@ -14,7 +14,7 @@
from llnl.util.link_tree import LinkTree
from spack.build_environment import dso_suffix
from spack.directives import conflicts, variant
from spack.directives import conflicts, license, redistribute, variant
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -26,10 +26,11 @@ class IntelOneApiPackage(Package):
"""Base class for Intel oneAPI packages."""
homepage = "https://software.intel.com/oneapi"
license("https://intel.ly/393CijO")
# oneAPI license does not allow mirroring outside of the
# organization (e.g. University/Company).
redistribute_source = False
redistribute(source=False, binary=False)
for c in [
"target=ppc64:",

View File

@@ -138,16 +138,21 @@ def view_file_conflicts(self, view, merge_map):
return conflicts
def add_files_to_view(self, view, merge_map, skip_if_exists=True):
# Patch up shebangs to the python linked in the view only if python is built by Spack.
if not self.extendee_spec or self.extendee_spec.external:
# Patch up shebangs if the package extends Python and we put a Python interpreter in the
# view.
if not self.extendee_spec:
return super().add_files_to_view(view, merge_map, skip_if_exists)
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
if python.external:
return super().add_files_to_view(view, merge_map, skip_if_exists)
# We only patch shebangs in the bin directory.
copied_files: Dict[Tuple[int, int], str] = {} # File identifier -> source
delayed_links: List[Tuple[str, str]] = [] # List of symlinks from merge map
bin_dir = self.spec.prefix.bin
python_prefix = self.extendee_spec.prefix
for src, dst in merge_map.items():
if skip_if_exists and os.path.lexists(dst):
continue
@@ -168,7 +173,7 @@ def add_files_to_view(self, view, merge_map, skip_if_exists=True):
copied_files[(s.st_dev, s.st_ino)] = dst
shutil.copy2(src, dst)
fs.filter_file(
python_prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
python.prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
)
else:
view.link(src, dst)
@@ -199,14 +204,13 @@ def remove_files_from_view(self, view, merge_map):
ignore_namespace = True
bin_dir = self.spec.prefix.bin
global_view = self.extendee_spec.prefix == view.get_projection_for_spec(self.spec)
to_remove = []
for src, dst in merge_map.items():
if ignore_namespace and namespace_init(dst):
continue
if global_view or not fs.path_contains_subdirectory(src, bin_dir):
if not fs.path_contains_subdirectory(src, bin_dir):
to_remove.append(dst)
else:
os.remove(dst)
@@ -362,6 +366,12 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return f"https://pypi.org/simple/{name}/"
return None
@property
def python_spec(self):
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
@property
def headers(self) -> HeaderList:
"""Discover header files in platlib."""
@@ -371,8 +381,9 @@ def headers(self) -> HeaderList:
# Headers should only be in include or platlib, but no harm in checking purelib too
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
@@ -391,8 +402,9 @@ def libs(self) -> LibraryList:
name = self.spec.name[3:]
# Libraries should only be in platlib, but no harm in checking purelib too
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
find_all_libraries = functools.partial(fs.find_all_libraries, recursive=True)
libs_list = map(find_all_libraries, [platlib, purelib])
@@ -504,6 +516,8 @@ def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
"""Install everything from build directory."""
pip = spec["python"].command
pip.add_default_arg("-m", "pip")
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]
@@ -519,14 +533,6 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
else:
args.append(".")
pip = spec["python"].command
# Hide user packages, since we don't have build isolation. This is
# necessary because pip / setuptools may run hooks from arbitrary
# packages during the build. There is no equivalent variable to hide
# system packages, so this is not reliable for external Python.
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
with fs.working_dir(self.build_directory):
pip(*args)

View File

@@ -80,6 +80,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package_base import PackageBase
from spack.util.environment import EnvironmentModifications
class ROCmPackage(PackageBase):
@@ -156,30 +157,23 @@ def hip_flags(amdgpu_target):
archs = ",".join(amdgpu_target)
return "--amdgpu-target={0}".format(archs)
# ASAN
@staticmethod
def asan_on(env, llvm_path):
def asan_on(self, env: EnvironmentModifications):
llvm_path = self.spec["llvm-amdgpu"].prefix
env.set("CC", llvm_path + "/bin/clang")
env.set("CXX", llvm_path + "/bin/clang++")
env.set("ASAN_OPTIONS", "detect_leaks=0")
for root, dirs, files in os.walk(llvm_path):
for root, _, files in os.walk(llvm_path):
if "libclang_rt.asan-x86_64.so" in files:
asan_lib_path = root
env.prepend_path("LD_LIBRARY_PATH", asan_lib_path)
SET_DWARF_VERSION_4 = ""
try:
# This will throw an error if imported on a non-Linux platform.
import distro
distname = distro.id()
except ImportError:
distname = "unknown"
if "rhel" in distname or "sles" in distname:
if "rhel" in self.spec.os or "sles" in self.spec.os:
SET_DWARF_VERSION_4 = "-gdwarf-5"
else:
SET_DWARF_VERSION_4 = ""
env.set("CFLAGS", "-fsanitize=address -shared-libasan -g " + SET_DWARF_VERSION_4)
env.set("CXXFLAGS", "-fsanitize=address -shared-libasan -g " + SET_DWARF_VERSION_4)
env.set("CFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("CXXFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("LDFLAGS", "-Wl,--enable-new-dtags -fuse-ld=lld -fsanitize=address -g -Wl,")
# HIP version vs Architecture

View File

@@ -16,8 +16,8 @@
import tempfile
import time
import zipfile
from collections import namedtuple
from typing import List, Optional
from collections import defaultdict, namedtuple
from typing import Dict, List, Optional, Set, Tuple
from urllib.error import HTTPError, URLError
from urllib.parse import urlencode
from urllib.request import HTTPHandler, Request, build_opener
@@ -44,6 +44,7 @@
from spack import traverse
from spack.error import SpackError
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.reporters.cdash import build_stamp as cdash_build_stamp
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
@@ -113,54 +114,24 @@ def _remove_reserved_tags(tags):
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
def _spec_deps_key(s):
def _spec_ci_label(s):
return f"{s.name}/{s.dag_hash(7)}"
def _add_dependency(spec_label, dep_label, deps):
if spec_label == dep_label:
return
if spec_label not in deps:
deps[spec_label] = set()
deps[spec_label].add(dep_label)
PlainNodes = Dict[str, spack.spec.Spec]
PlainEdges = Dict[str, Set[str]]
def _get_spec_dependencies(specs, deps, spec_labels):
spec_deps_obj = _compute_spec_deps(specs)
if spec_deps_obj:
dependencies = spec_deps_obj["dependencies"]
specs = spec_deps_obj["specs"]
for entry in specs:
spec_labels[entry["label"]] = entry["spec"]
for entry in dependencies:
_add_dependency(entry["spec"], entry["depends"], deps)
def stage_spec_jobs(specs):
"""Take a set of release specs and generate a list of "stages", where the
jobs in any stage are dependent only on jobs in previous stages. This
allows us to maximize build parallelism within the gitlab-ci framework.
Arguments:
specs (Iterable): Specs to build
Returns: A tuple of information objects describing the specs, dependencies
and stages:
spec_labels: A dictionary mapping the spec labels (which are formatted
as pkg-name/hash-prefix) to concrete specs.
deps: A dictionary where the keys should also have appeared as keys in
the spec_labels dictionary, and the values are the set of
dependencies for that spec.
stages: An ordered list of sets, each of which contains all the jobs to
be built in that stage. The jobs are expressed in the same format as
the keys in the spec_labels and deps objects.
"""
def stage_spec_jobs(specs: List[spack.spec.Spec]) -> Tuple[PlainNodes, PlainEdges, List[Set[str]]]:
"""Turn a DAG into a list of stages (sets of nodes); the list is ordered topologically, so that
each node in a stage has dependencies only in previous stages.
Arguments:
specs: Specs to build
Returns: A tuple (nodes, edges, stages) where ``nodes`` maps labels to specs, ``edges`` maps
labels to a set of labels of dependencies, and ``stages`` is a topologically ordered list
of sets of labels.
"""
# The convenience method below, "_remove_satisfied_deps()", does not modify
@@ -177,17 +148,12 @@ def _remove_satisfied_deps(deps, satisfied_list):
return new_deps
deps = {}
spec_labels = {}
nodes, edges = _extract_dag(specs)
_get_spec_dependencies(specs, deps, spec_labels)
# Save the original deps, as we need to return them at the end of the
# function. In the while loop below, the "dependencies" variable is
# overwritten rather than being modified each time through the loop,
# thus preserving the original value of "deps" saved here.
dependencies = deps
unstaged = set(spec_labels.keys())
# Save the original edges, as we need to return them at the end of the function. In the loop
# below, the "dependencies" variable is rebound rather than mutated, so "edges" is not mutated.
dependencies = edges
unstaged = set(nodes.keys())
stages = []
while dependencies:
@@ -203,7 +169,7 @@ def _remove_satisfied_deps(deps, satisfied_list):
if unstaged:
stages.append(unstaged.copy())
return spec_labels, deps, stages
return nodes, edges, stages
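For intuition, a minimal sketch of the staging loop with a hypothetical two-node DAG: repeatedly peel off the nodes whose dependencies are already satisfied, one stage at a time:

    edges = {"app/abc1234": {"lib/def5678"}, "lib/def5678": set()}  # hypothetical labels
    unstaged = set(edges)
    stages, satisfied = [], set()
    while unstaged:
        ready = {n for n in unstaged if edges[n] <= satisfied}  # all deps satisfied
        stages.append(ready)
        satisfied |= ready
        unstaged -= ready
    print(stages)  # -> [{'lib/def5678'}, {'app/abc1234'}]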
def _print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisions):
@@ -235,87 +201,22 @@ def _print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisi
tty.msg(msg)
def _compute_spec_deps(spec_list):
"""
Computes all the dependencies for the spec(s) and generates a JSON
object which provides both a list of unique spec names as well as a
comprehensive list of all the edges in the dependency graph. For
example, given a single spec like 'readline@7.0', this function
generates the following JSON object:
.. code-block:: JSON
{
"dependencies": [
{
"depends": "readline/ip6aiun",
"spec": "readline/ip6aiun"
},
{
"depends": "ncurses/y43rifz",
"spec": "readline/ip6aiun"
},
{
"depends": "ncurses/y43rifz",
"spec": "readline/ip6aiun"
},
{
"depends": "pkgconf/eg355zb",
"spec": "ncurses/y43rifz"
},
{
"depends": "pkgconf/eg355zb",
"spec": "readline/ip6aiun"
}
],
"specs": [
{
"spec": "readline@7.0%apple-clang@9.1.0 arch=darwin-highs...",
"label": "readline/ip6aiun"
},
{
"spec": "ncurses@6.1%apple-clang@9.1.0 arch=darwin-highsi...",
"label": "ncurses/y43rifz"
},
{
"spec": "pkgconf@1.5.4%apple-clang@9.1.0 arch=darwin-high...",
"label": "pkgconf/eg355zb"
}
]
}
"""
spec_labels = {}
specs = []
dependencies = []
def append_dep(s, d):
dependencies.append({"spec": s, "depends": d})
for spec in spec_list:
for s in spec.traverse(deptype="all"):
if s.external:
tty.msg(f"Will not stage external pkg: {s}")
continue
skey = _spec_deps_key(s)
spec_labels[skey] = s
for d in s.dependencies(deptype="all"):
dkey = _spec_deps_key(d)
if d.external:
tty.msg(f"Will not stage external dep: {d}")
continue
append_dep(skey, dkey)
for spec_label, concrete_spec in spec_labels.items():
specs.append({"label": spec_label, "spec": concrete_spec})
deps_json_obj = {"specs": specs, "dependencies": dependencies}
return deps_json_obj
def _extract_dag(specs: List[spack.spec.Spec]) -> Tuple[PlainNodes, PlainEdges]:
"""Extract a sub-DAG as plain old Python objects with external nodes removed."""
nodes: PlainNodes = {}
edges: PlainEdges = defaultdict(set)
for edge in traverse.traverse_edges(specs, cover="edges"):
if (edge.parent and edge.parent.external) or edge.spec.external:
continue
child_id = _spec_ci_label(edge.spec)
nodes[child_id] = edge.spec
if edge.parent:
parent_id = _spec_ci_label(edge.parent)
nodes[parent_id] = edge.parent
edges[parent_id].add(child_id)
return nodes, edges
def _spec_matches(spec, match_string):
@@ -327,7 +228,7 @@ def _format_job_needs(
):
needs_list = []
for dep_job in dep_jobs:
dep_spec_key = _spec_deps_key(dep_job)
dep_spec_key = _spec_ci_label(dep_job)
rebuild = rebuild_decisions[dep_spec_key].rebuild
if not prune_dag or rebuild:
@@ -783,6 +684,22 @@ def generate_gitlab_ci_yaml(
"instead.",
)
def ensure_expected_target_path(path):
"""Returns passed paths with all Windows path separators exchanged
for posix separators only if copy_only_pipeline is enabled
This is required as copy_only_pipelines are a unique scenario where
the generate job and child pipelines are run on different platforms.
To make this compatible w/ Windows, we cannot write Windows style path separators
that will be consumed on by the Posix copy job runner.
TODO (johnwparent): Refactor config + cli read/write to deal only in posix
style paths
"""
if copy_only_pipeline and path:
path = path.replace("\\", "/")
return path
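A one-line illustration of the normalization, with a hypothetical Windows-style include path:

    path = "concrete_envs\\ci\\spack.yaml"  # hypothetical
    print(path.replace("\\", "/"))  # -> concrete_envs/ci/spack.yaml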
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
deprecated_mirror_config = False
buildcache_destination = None
@@ -906,7 +823,7 @@ def generate_gitlab_ci_yaml(
if scope not in include_scopes and scope not in env_includes:
include_scopes.insert(0, scope)
env_includes.extend(include_scopes)
env_yaml_root["spack"]["include"] = env_includes
env_yaml_root["spack"]["include"] = [ensure_expected_target_path(i) for i in env_includes]
if "gitlab-ci" in env_yaml_root["spack"] and "ci" not in env_yaml_root["spack"]:
env_yaml_root["spack"]["ci"] = env_yaml_root["spack"].pop("gitlab-ci")
@@ -1327,6 +1244,9 @@ def main_script_replacements(cmd):
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
"SPACK_REQUIRE_SIGNING": os.environ.get("SPACK_REQUIRE_SIGNING", "False"),
}
output_vars = output_object["variables"]
for item, val in output_vars.items():
output_vars[item] = ensure_expected_target_path(val)
# TODO: Remove this block in Spack 0.23
if deprecated_mirror_config and remote_mirror_override:
@@ -1383,7 +1303,6 @@ def main_script_replacements(cmd):
sorted_output = {}
for output_key, output_value in sorted(output_object.items()):
sorted_output[output_key] = output_value
if known_broken_specs_encountered:
tty.error("This pipeline generated hashes known to be broken on develop:")
display_broken_spec_messages(broken_specs_url, known_broken_specs_encountered)
@@ -1578,6 +1497,12 @@ def copy_test_logs_to_artifacts(test_stage, job_test_dir):
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
def win_quote(quote_str: str) -> str:
if IS_WINDOWS:
quote_str = f'"{quote_str}"'
return quote_str
def download_and_extract_artifacts(url, work_dir):
"""Look for gitlab artifacts.zip at the given url, and attempt to download
and extract the contents into the given work_dir
@@ -1600,7 +1525,7 @@ def download_and_extract_artifacts(url, work_dir):
request = Request(url, headers=headers)
request.get_method = lambda: "GET"
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:
@@ -1947,9 +1872,9 @@ def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
entrypoint_script.append(["echo", f"Re-run install script using:\n\t{install_mechanism}"])
# Allow interactive
if IS_WINDOWS:
entrypoint_script.extend(["&", "($args -Join ' ')", "-NoExit"])
entrypoint_script.append(["&", "($args -Join ' ')", "-NoExit"])
else:
entrypoint_script.extend(["exec", "$@"])
entrypoint_script.append(["exec", "$@"])
process_command(
"entrypoint", entrypoint_script, work_dir, run=False, exit_on_failure=False
@@ -2042,9 +1967,9 @@ def compose_command_err_handling(args):
# but we need to handle EXEs (git, etc) ourselves
catch_exe_failure = (
"""
if ($LASTEXITCODE -ne 0){
throw "Command {} has failed"
}
if ($LASTEXITCODE -ne 0){{
throw 'Command {} has failed'
}}
"""
if IS_WINDOWS
else ""
@@ -2276,13 +2201,13 @@ def __init__(self, ci_cdash):
def args(self):
return [
"--cdash-upload-url",
self.upload_url,
win_quote(self.upload_url),
"--cdash-build",
self.build_name,
win_quote(self.build_name),
"--cdash-site",
self.site,
win_quote(self.site),
"--cdash-buildstamp",
self.build_stamp,
win_quote(self.build_stamp),
]
@property # type: ignore
@@ -2348,7 +2273,7 @@ def create_buildgroup(self, opener, headers, url, group_name, group_type):
request = Request(url, data=enc_data, headers=headers)
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code not in [200, 201]:
@@ -2394,7 +2319,7 @@ def populate_buildgroup(self, job_names):
request = Request(url, data=enc_data, headers=headers)
request.get_method = lambda: "PUT"
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:

View File

@@ -334,8 +334,7 @@ def display_specs(specs, args=None, **kwargs):
variants (bool): Show variants with specs
indent (int): indent each line this much
groups (bool): display specs grouped by arch/compiler (default True)
decorators (dict): dictionary mappng specs to decorators
header_callback (typing.Callable): called at start of arch/compiler groups
decorator (typing.Callable): function to call to decorate specs
all_headers (bool): show headers even when arch/compiler aren't defined
output (typing.IO): A file object to write to. Default is ``sys.stdout``
@@ -384,15 +383,13 @@ def get_arg(name, default=None):
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + ffmt + vfmt
transform = {"package": decorator, "fullpackage": decorator}
def fmt(s, depth=0):
"""Formatter function for all output specs"""
string = ""
if hashes:
string += gray_hash(s, hlen) + " "
string += depth * " "
string += s.cformat(format_string, transform=transform)
string += decorator(s, s.cformat(format_string))
return string
def format_list(specs):
@@ -451,7 +448,7 @@ def filter_loaded_specs(specs):
return [x for x in specs if x.dag_hash() in hashes]
def print_how_many_pkgs(specs, pkg_type=""):
def print_how_many_pkgs(specs, pkg_type="", suffix=""):
"""Given a list of specs, this will print a message about how many
specs are in that list.
@@ -462,7 +459,7 @@ def print_how_many_pkgs(specs, pkg_type=""):
category, e.g. if pkg_type is "installed" then the message
would be "3 installed packages"
"""
tty.msg("%s" % llnl.string.plural(len(specs), pkg_type + " package"))
tty.msg("%s" % llnl.string.plural(len(specs), pkg_type + " package") + suffix)
def spack_is_git_repo():

View File

@@ -84,7 +84,7 @@ def externals(parser, args):
return
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs, debug_log=tty.debug)
_process_reports(reports)

View File

@@ -13,7 +13,6 @@
import shutil
import sys
import tempfile
import urllib.request
from typing import Dict, List, Optional, Tuple, Union
import llnl.util.tty as tty
@@ -54,6 +53,7 @@
from spack.oci.oci import (
copy_missing_layers_with_retry,
get_manifest_and_config_with_retry,
list_tags,
upload_blob_with_retry,
upload_manifest_with_retry,
)
@@ -133,6 +133,11 @@ def setup_parser(subparser: argparse.ArgumentParser):
help="when pushing to an OCI registry, tag an image containing all root specs and their "
"runtime dependencies",
)
push.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(push, ["specs", "jobs"])
push.set_defaults(func=push_fn)
@@ -367,6 +372,25 @@ def _make_pool() -> MaybePool:
return NoPool()
def _skip_no_redistribute_for_public(specs):
remaining_specs = list()
removed_specs = list()
for spec in specs:
if spec.package.redistribute_binary:
remaining_specs.append(spec)
else:
removed_specs.append(spec)
if removed_specs:
colified_output = tty.colify.colified(list(s.name for s in removed_specs), indent=4)
tty.debug(
"The following specs will not be added to the binary cache"
" because they cannot be redistributed:\n"
f"{colified_output}\n"
"You can use `--private` to include them."
)
return remaining_specs
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
@@ -417,6 +441,8 @@ def push_fn(args):
root="package" in args.things_to_install,
dependencies="dependencies" in args.things_to_install,
)
if not args.private:
specs = _skip_no_redistribute_for_public(specs)
# When pushing multiple specs, print the url once ahead of time, as well as how
# many specs are being pushed.
@@ -830,10 +856,7 @@ def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
def _update_index_oci(image_ref: ImageReference, tmpdir: str, pool: MaybePool) -> None:
request = urllib.request.Request(url=image_ref.tags_url())
response = spack.oci.opener.urlopen(request)
spack.oci.opener.ensure_status(request, response, 200)
tags = json.load(response)["tags"]
tags = list_tags(image_ref)
# Fetch all image config files in parallel
spec_dicts = pool.starmap(

View File

@@ -31,7 +31,6 @@
level = "long"
SPACK_COMMAND = "spack"
MAKE_COMMAND = "make"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
@@ -40,6 +39,12 @@ def deindent(desc):
return desc.replace(" ", "")
def unicode_escape(path: str) -> str:
"""Returns transformed path with any unicode
characters replaced with their corresponding escapes"""
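For illustration, what unicode-escape does to a hypothetical path containing a non-ASCII character:

    path = "C:\\Users\\müller\\spack"  # hypothetical
    print(path.encode("unicode-escape").decode("utf-8"))  # -> C:\\Users\\m\xfcller\\spack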
return path.encode("unicode-escape").decode("utf-8")
def setup_parser(subparser):
setup_parser.parser = subparser
subparsers = subparser.add_subparsers(help="CI sub-commands")
@@ -551,75 +556,35 @@ def ci_rebuild(args):
# No hash match anywhere means we need to rebuild spec
# Start with spack arguments
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose"]
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose", "install"]
config = cfg.get("config")
if not config["verify_ssl"]:
spack_cmd.append("-k")
install_args = []
install_args = [f'--use-buildcache={spack_ci.win_quote("package:never,dependencies:only")}']
can_verify = spack_ci.can_verify_binaries()
verify_binaries = can_verify and spack_is_pr_pipeline is False
if not verify_binaries:
install_args.append("--no-check-signature")
slash_hash = "/{}".format(job_spec.dag_hash())
# Arguments when installing dependencies from cache
deps_install_args = install_args
slash_hash = spack_ci.win_quote("/" + job_spec.dag_hash())
# Arguments when installing the root from sources
root_install_args = install_args + [
"--keep-stage",
"--only=package",
"--use-buildcache=package:never,dependencies:only",
]
deps_install_args = install_args + ["--only=dependencies"]
root_install_args = install_args + ["--keep-stage", "--only=package"]
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
root_install_args.extend(cdash_handler.args())
root_install_args.append(slash_hash)
# ["x", "y"] -> "'x' 'y'"
args_to_string = lambda args: " ".join("'{}'".format(arg) for arg in args)
commands = [
# apparently there's a race when spack bootstraps? do it up front once
[SPACK_COMMAND, "-e", env.path, "bootstrap", "now"],
[
SPACK_COMMAND,
"-e",
env.path,
"env",
"depfile",
"-o",
"Makefile",
"--use-buildcache=package:never,dependencies:only",
slash_hash, # limit to spec we're building
],
[
# --output-sync requires GNU make 4.x.
# Old make errors when you pass it a flag it doesn't recognize,
# but it doesn't error or warn when you set unrecognized flags in
# this variable.
"export",
"GNUMAKEFLAGS=--output-sync=recurse",
],
[
MAKE_COMMAND,
"SPACK={}".format(args_to_string(spack_cmd)),
"SPACK_COLOR=always",
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
),
],
spack_cmd + ["install"] + root_install_args,
[SPACK_COMMAND, "-e", unicode_escape(env.path), "bootstrap", "now"],
spack_cmd + deps_install_args + [slash_hash],
spack_cmd + root_install_args + [slash_hash],
]
tty.debug("Installing {0} from source".format(job_spec.name))
install_exit_code = spack_ci.process_command("install", commands, repro_dir)

View File

@@ -563,12 +563,13 @@ def add_concretizer_args(subparser):
help="reuse installed packages/buildcaches when possible",
)
subgroup.add_argument(
"--fresh-roots",
"--reuse-deps",
action=ConfigSetAction,
dest="concretizer:reuse",
const="dependencies",
default=None,
help="reuse installed dependencies only",
help="concretize with fresh roots and reused dependencies",
)
subgroup.add_argument(
"--deprecated",

View File

@@ -10,13 +10,13 @@
import sys
import tempfile
from pathlib import Path
from typing import Optional
from typing import List, Optional
import llnl.string as string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
from llnl.util.tty.color import cescape, colorize
import spack.cmd
import spack.cmd.common
@@ -61,14 +61,7 @@
#
def env_create_setup_parser(subparser):
"""create a new environment"""
subparser.add_argument(
"env_name",
metavar="env",
help=(
"name of managed environment or directory of the anonymous env "
"(when using --dir/-d) to activate"
),
)
subparser.add_argument("env_name", metavar="env", help="name or directory of environment")
subparser.add_argument(
"-d", "--dir", action="store_true", help="create an environment in a specific directory"
)
@@ -94,6 +87,9 @@ def env_create_setup_parser(subparser):
default=None,
help="either a lockfile (must end with '.json' or '.lock') or a manifest file",
)
subparser.add_argument(
"--include-concrete", action="append", help="name of old environment to copy specs from"
)
def env_create(args):
@@ -111,19 +107,32 @@ def env_create(args):
# the environment should not include a view.
with_view = None
include_concrete = None
if hasattr(args, "include_concrete"):
include_concrete = args.include_concrete
env = _env_create(
args.env_name,
init_file=args.envfile,
dir=args.dir,
dir=args.dir or os.path.sep in args.env_name or args.env_name in (".", ".."),
with_view=with_view,
keep_relative=args.keep_relative,
include_concrete=include_concrete,
)
# Generate views, only really useful for environments created from spack.lock files.
env.regenerate_views()
def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
def _env_create(
name_or_path: str,
*,
init_file: Optional[str] = None,
dir: bool = False,
with_view: Optional[str] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
):
"""Create a new environment, with an optional yaml description.
Arguments:
@@ -135,22 +144,31 @@ def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep
keep_relative (bool): if True, develop paths are copied verbatim into
the new environment file, otherwise they may be made absolute if the
new environment is in a different location
include_concrete (list): list of the included concrete environments
"""
if not dir:
env = ev.create(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
)
tty.msg("Created environment '%s' in %s" % (name_or_path, env.path))
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % (name_or_path))
return env
env = ev.create_in_dir(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
)
tty.msg("Created environment in %s" % env.path)
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % env.path)
return env
if not dir:
env = ev.create(
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg(
colorize(
f"Created environment @c{{{cescape(name_or_path)}}} in: @c{{{cescape(env.path)}}}"
)
)
else:
env = ev.create_in_dir(
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg(colorize(f"Created independent environment in: @c{{{cescape(env.path)}}}"))
tty.msg(f"Activate with: {colorize(f'@c{{spack env activate {cescape(name_or_path)}}}')}")
return env
@@ -436,6 +454,12 @@ def env_remove_setup_parser(subparser):
"""remove an existing environment"""
subparser.add_argument("rm_env", metavar="env", nargs="+", help="environment(s) to remove")
arguments.add_common_arguments(subparser, ["yes_to_all"])
subparser.add_argument(
"-f",
"--force",
action="store_true",
help="remove the environment even if it is included in another environment",
)
def env_remove(args):
@@ -445,13 +469,35 @@ def env_remove(args):
and manifests embedded in repositories should be removed manually.
"""
read_envs = []
bad_envs = []
for env_name in args.rm_env:
try:
env = ev.read(env_name)
read_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
bad_envs.append(env_name)
read_envs = []
valid_envs = []
bad_envs = []
invalid_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
valid_envs.append(env_name)
if env_name in args.rm_env:
read_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
invalid_envs.append(env_name)
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if env is linked to another before trying to remove
for name in valid_envs:
# don't check if environment is included to itself
if name == env_name:
continue
environ = ev.Environment(ev.root(name))
if ev.root(env_name) in environ.included_concrete_envs:
msg = f'Environment "{env_name}" is being used by environment "{name}"'
if args.force:
tty.warn(msg)
else:
tty.die(msg)
if not args.yes_to_all:
environments = string.plural(len(args.rm_env), "environment", show_n=False)

View File

@@ -14,6 +14,7 @@
import spack.cmd as cmd
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
@@ -69,6 +70,12 @@ def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["long", "very_long", "tags", "namespaces"])
subparser.add_argument(
"-r",
"--only-roots",
action="store_true",
help="don't show full list of installed specs in an environment",
)
subparser.add_argument(
"-c",
"--show-concretized",
@@ -189,26 +196,22 @@ def query_arguments(args):
return q_args
def setup_env(env):
def make_env_decorator(env):
"""Create a function for decorating specs when in an environment."""
def strip_build(seq):
return set(s.copy(deps=("link", "run")) for s in seq)
added = set(strip_build(env.added_specs()))
roots = set(strip_build(env.roots()))
removed = set(strip_build(env.removed_specs()))
roots = set(env.roots())
removed = set(env.removed_specs())
def decorator(spec, fmt):
# add +/-/* to show added/removed/root specs
if any(spec.dag_hash() == r.dag_hash() for r in roots):
return color.colorize("@*{%s}" % fmt)
return color.colorize(f"@*{{{fmt}}}")
elif spec in removed:
return color.colorize("@K{%s}" % fmt)
return color.colorize(f"@K{{{fmt}}}")
else:
return "%s" % fmt
return fmt
return decorator, added, roots, removed
return decorator
def display_env(env, args, decorator, results):
@@ -223,10 +226,54 @@ def display_env(env, args, decorator, results):
"""
tty.msg("In environment %s" % env.name)
if not env.user_specs:
tty.msg("No root specs")
else:
tty.msg("Root specs")
num_roots = len(env.user_specs) or "No"
tty.msg(f"{num_roots} root specs")
concrete_specs = {
root: concrete_root
for root, concrete_root in zip(env.concretized_user_specs, env.concrete_roots())
}
def root_decorator(spec, string):
"""Decorate root specs with their install status if needed"""
concrete = concrete_specs.get(spec)
if concrete:
status = color.colorize(concrete.install_status().value)
hash = concrete.dag_hash()
else:
status = color.colorize(spack.spec.InstallStatus.absent.value)
hash = "-" * 32
# TODO: status has two extra spaces on the end of it, but fixing this and other spec
# TODO: space format idiosyncrasies is complicated. Fix this eventually
status = status[:-2]
if args.long or args.very_long:
hash = color.colorize(f"@K{{{hash[: 7 if args.long else None]}}}")
return f"{status} {hash} {string}"
else:
return f"{status} {string}"
with spack.store.STORE.db.read_transaction():
cmd.display_specs(
env.user_specs,
args,
# these are overrides of CLI args
paths=False,
long=False,
very_long=False,
# these enforce details in the root specs to show what the user asked for
namespaces=True,
show_flags=True,
show_full_compiler=True,
decorator=root_decorator,
variants=True,
)
print()
if env.included_concrete_envs:
tty.msg("Included specs")
# Root specs cannot be displayed with prefixes, since those are not
# set for abstract specs. Same for hashes
@@ -236,10 +283,10 @@ def display_env(env, args, decorator, results):
# Roots are displayed with variants, etc. so that we can see
# specifically what the user asked for.
cmd.display_specs(
env.user_specs,
env.included_user_specs,
root_args,
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespaces=True,
namespace=True,
show_flags=True,
show_full_compiler=True,
variants=True,
@@ -254,7 +301,7 @@ def display_env(env, args, decorator, results):
# Display a header for the installed packages section IF there are installed
# packages. If there aren't any, we'll just end up printing "0 installed packages"
# later.
if results:
if results and not args.only_roots:
tty.msg("Installed packages")
@@ -263,9 +310,10 @@ def find(parser, args):
results = args.specs(**q_args)
env = ev.active_environment()
decorator = lambda s, f: f
if env:
decorator, _, roots, _ = setup_env(env)
if not env and args.only_roots:
tty.die("-r / --only-roots requires an active environment")
decorator = make_env_decorator(env) if env else lambda s, f: f
# use groups by default except with format.
if args.groups is None:
@@ -292,9 +340,12 @@ def find(parser, args):
if env:
display_env(env, args, decorator, results)
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = " (not shown)"
if not args.only_roots:
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = ""
# print number of installed packages last (as the list may be long)
if sys.stdout.isatty() and args.groups:
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(results, pkg_type)
spack.cmd.print_how_many_pkgs(results, pkg_type, suffix=count_suffix)

View File

@@ -263,8 +263,8 @@ def _fmt_name_and_default(variant):
return color.colorize(f"@c{{{variant.name}}} @C{{[{_fmt_value(variant.default)}]}}")
def _fmt_when(when, indent):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(when)}")
def _fmt_when(when: "spack.spec.Spec", indent: int):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(str(when))}")
def _fmt_variant_description(variant, width, indent):
@@ -441,7 +441,7 @@ def get_url(version):
return "No URL"
url = get_url(preferred) if pkg.has_code else ""
line = version(" {0}".format(pad(preferred))) + color.cescape(url)
line = version(" {0}".format(pad(preferred))) + color.cescape(str(url))
color.cwrite(line)
print()
@@ -464,7 +464,7 @@ def get_url(version):
continue
for v, url in vers:
line = version(" {0}".format(pad(v))) + color.cescape(url)
line = version(" {0}".format(pad(v))) + color.cescape(str(url))
color.cprint(line)
@@ -475,10 +475,7 @@ def print_virtuals(pkg, args):
color.cprint(section_title("Virtual Packages: "))
if pkg.provided:
for when, specs in reversed(sorted(pkg.provided.items())):
line = " %s provides %s" % (
when.colorized(),
", ".join(s.colorized() for s in specs),
)
line = " %s provides %s" % (when.cformat(), ", ".join(s.cformat() for s in specs))
print(line)
else:
@@ -497,7 +494,9 @@ def print_licenses(pkg, args):
pad = padder(pkg.licenses, 4)
for when_spec in pkg.licenses:
license_identifier = pkg.licenses[when_spec]
line = license(" {0}".format(pad(license_identifier))) + color.cescape(when_spec)
line = license(" {0}".format(pad(license_identifier))) + color.cescape(
str(when_spec)
)
color.cprint(line)
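
Several hunks above add str(...) casts before color.cescape. A minimal sketch of why, assuming a simplified cescape that escapes color trigger characters in plain strings (the real one lives in llnl.util.tty.color):

    def cescape(string: str) -> str:
        # Simplified stand-in: escape '@' and '}' so arbitrary text cannot
        # inject color markup.
        return string.replace("@", "@@").replace("}", "}}")

    class When:  # hypothetical stand-in for a Spec-like object
        def __str__(self) -> str:
            return "@1.2: +mpi"

    # cescape(When())            # AttributeError: no .replace on a non-string
    print(cescape(str(When())))  # "@@1.2: +mpi", safe to colorize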


@@ -420,10 +420,9 @@ def install_with_active_env(env: ev.Environment, args, install_kwargs, reporter_
with reporter_factory(specs_to_install):
env.install_specs(specs_to_install, **install_kwargs)
finally:
# TODO: this is doing way too much to trigger
# views and modules to be generated.
with env.write_transaction():
env.write(regenerate=True)
if env.views:
with env.write_transaction():
env.write(regenerate=True)
def concrete_specs_from_cli(args, install_kwargs):


@@ -71,6 +71,11 @@ def setup_parser(subparser):
help="the number of versions to fetch for each spec, choose 'all' to"
" retrieve all versions of each package",
)
create_parser.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(create_parser, ["specs"])
arguments.add_concretizer_args(create_parser)
@@ -108,6 +113,11 @@ def setup_parser(subparser):
"and source use `--type binary --type source` (default)"
),
)
add_parser.add_argument(
"--autopush",
action="store_true",
help=("set mirror to push automatically after installation"),
)
add_parser_signed = add_parser.add_mutually_exclusive_group(required=False)
add_parser_signed.add_argument(
"--unsigned",
@@ -175,6 +185,21 @@ def setup_parser(subparser):
),
)
set_parser.add_argument("--url", help="url of mirror directory from 'spack mirror create'")
set_parser_autopush = set_parser.add_mutually_exclusive_group(required=False)
set_parser_autopush.add_argument(
"--autopush",
help="set mirror to push automatically after installation",
action="store_true",
default=None,
dest="autopush",
)
set_parser_autopush.add_argument(
"--no-autopush",
help="set mirror to not push automatically after installation",
action="store_false",
default=None,
dest="autopush",
)
set_parser_unsigned = set_parser.add_mutually_exclusive_group(required=False)
set_parser_unsigned.add_argument(
"--unsigned",
@@ -218,6 +243,7 @@ def mirror_add(args):
or args.type
or args.oci_username
or args.oci_password
or args.autopush
or args.signed is not None
):
connection = {"url": args.url}
@@ -234,6 +260,8 @@ def mirror_add(args):
if args.type:
connection["binary"] = "binary" in args.type
connection["source"] = "source" in args.type
if args.autopush:
connection["autopush"] = args.autopush
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirror.Mirror(connection, name=args.name)
@@ -270,6 +298,8 @@ def _configure_mirror(args):
changes["access_pair"] = [args.oci_username, args.oci_password]
if getattr(args, "signed", None) is not None:
changes["signed"] = args.signed
if getattr(args, "autopush", None) is not None:
changes["autopush"] = args.autopush
# argparse cannot distinguish between --binary and --no-binary when same dest :(
# notice that set-url does not have these args, so getattr
@@ -334,7 +364,6 @@ def concrete_specs_from_user(args):
specs = filter_externals(specs)
specs = list(set(specs))
specs.sort(key=lambda s: (s.name, s.version))
specs, _ = lang.stable_partition(specs, predicate_fn=not_excluded_fn(args))
return specs
@@ -379,36 +408,50 @@ def concrete_specs_from_cli_or_file(args):
return specs
def not_excluded_fn(args):
"""Return a predicate that evaluate to True if a spec was not explicitly
excluded by the user.
"""
exclude_specs = []
if args.exclude_file:
exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
class IncludeFilter:
def __init__(self, args):
self.exclude_specs = []
if args.exclude_file:
self.exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
self.exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
self.private = args.private
def not_excluded(x):
return not any(x.satisfies(y) for y in exclude_specs)
def __call__(self, x):
return all([self._not_license_excluded(x), self._not_cmdline_excluded(x)])
return not_excluded
def _not_license_excluded(self, x):
"""True if the spec is for a private mirror, or as long as the
package does not explicitly forbid redistributing source."""
if self.private:
return True
elif x.package_class.redistribute_source(x):
return True
else:
tty.debug(
"Skip adding {0} to mirror: the package.py file"
" indicates that a public mirror should not contain"
" it.".format(x.name)
)
return False
def _not_cmdline_excluded(self, x):
"""True if a spec was not explicitly excluded by the user."""
return not any(x.satisfies(y) for y in self.exclude_specs)
def concrete_specs_from_environment(selection_fn):
def concrete_specs_from_environment():
env = ev.active_environment()
assert env, "an active environment is required"
mirror_specs = env.all_specs()
mirror_specs = filter_externals(mirror_specs)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
def all_specs_with_all_versions(selection_fn):
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
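
The closure-based not_excluded_fn becomes a callable class so the command-line exclusions and the new license check compose in one predicate, which lang.stable_partition then applies. A toy analogue with a simplified stable_partition (the real one is in llnl.util.lang):

    from typing import Callable, Iterable, List, Tuple

    def stable_partition(items: Iterable, predicate_fn: Callable) -> Tuple[List, List]:
        # Split into (matching, non-matching) while preserving input order.
        true_items, false_items = [], []
        for item in items:
            (true_items if predicate_fn(item) else false_items).append(item)
        return true_items, false_items

    class ExcludeByName:
        """Toy analogue of IncludeFilter: keep items not explicitly excluded."""

        def __init__(self, excluded):
            self.excluded = set(excluded)

        def __call__(self, name):
            return name not in self.excluded

    kept, dropped = stable_partition(["zlib", "openssl", "cmake"], ExcludeByName({"openssl"}))
    assert kept == ["zlib", "cmake"] and dropped == ["openssl"]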
@@ -429,12 +472,6 @@ def versions_per_spec(args):
return num_versions
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def process_mirror_stats(present, mirrored, error):
p, m, e = len(present), len(mirrored), len(error)
tty.msg(
@@ -480,30 +517,28 @@ def mirror_create(args):
# When no directory is provided, the source dir is used
path = args.directory or spack.caches.fetch_cache_location()
mirror_specs, mirror_fn = _specs_and_action(args)
mirror_fn(mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions)
def _specs_and_action(args):
include_fn = IncludeFilter(args)
if args.all and not ev.active_environment():
create_mirror_for_all_specs(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = all_specs_with_all_versions()
mirror_fn = create_mirror_for_all_specs
elif args.all and ev.active_environment():
mirror_specs = concrete_specs_from_environment()
mirror_fn = create_mirror_for_individual_specs
else:
mirror_specs = concrete_specs_from_user(args)
mirror_fn = create_mirror_for_individual_specs
if args.all and ev.active_environment():
create_mirror_for_all_specs_inside_environment(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = concrete_specs_from_user(args)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions
)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=include_fn)
return mirror_specs, mirror_fn
def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
mirror_specs = all_specs_with_all_versions(selection_fn=selection_fn)
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
@@ -515,11 +550,10 @@ def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_all_specs_inside_environment(path, skip_unstable_versions, selection_fn):
mirror_specs = concrete_specs_from_environment(selection_fn=selection_fn)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=skip_unstable_versions
)
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def mirror_destroy(args):


@@ -23,7 +23,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.21"
tutorial_branch = "releases/v0.22"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")


@@ -239,6 +239,8 @@ def get_uninstall_list(args, specs: List[spack.spec.Spec], env: Optional[ev.Envi
print()
tty.info("The following environments still reference these specs:")
colify([e.name for e in other_dependent_envs.keys()], indent=4)
if env:
msgs.append("use `spack remove` to remove the spec from the current environment")
msgs.append("use `spack env remove` to remove environments")
msgs.append("use `spack uninstall --force` to override")
print()


@@ -214,8 +214,6 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
# Note: skip on windows here because for the moment,
# clingo is wholly unsupported from bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:


@@ -20,8 +20,10 @@
import spack.compilers
import spack.error
import spack.schema.environment
import spack.spec
import spack.util.executable
import spack.util.libc
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
@@ -107,7 +109,6 @@ def _parse_link_paths(string):
"""
lib_search_paths = False
raw_link_dirs = []
tty.debug("parsing implicit link info")
for line in string.splitlines():
if lib_search_paths:
if line.startswith("\t"):
@@ -122,7 +123,7 @@ def _parse_link_paths(string):
continue
if _LINKER_LINE_IGNORE.match(line):
continue
tty.debug("linker line: %s" % line)
tty.debug(f"implicit link dirs: link line: {line}")
next_arg = False
for arg in line.split():
@@ -138,15 +139,12 @@ def _parse_link_paths(string):
link_dir_arg = _LINK_DIR_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("linkdir: %s" % link_dir)
raw_link_dirs.append(link_dir)
link_dir_arg = _LIBPATH_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("libpath: %s", link_dir)
raw_link_dirs.append(link_dir)
tty.debug("found raw link dirs: %s" % ", ".join(raw_link_dirs))
implicit_link_dirs = list()
visited = set()
@@ -156,7 +154,7 @@ def _parse_link_paths(string):
implicit_link_dirs.append(normalized_path)
visited.add(normalized_path)
tty.debug("found link dirs: %s" % ", ".join(implicit_link_dirs))
tty.debug(f"implicit link dirs: result: {', '.join(implicit_link_dirs)}")
return implicit_link_dirs
@@ -417,17 +415,35 @@ def real_version(self):
self._real_version = self.version
return self._real_version
def implicit_rpaths(self):
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
return []
# Put CXX first since it has the most linking issues
# And because it has flags that affect linking
link_dirs = self._get_compiler_link_paths()
output = self.compiler_verbose_output
if not output:
return []
link_dirs = _parse_non_system_link_dirs(output)
all_required_libs = list(self.required_libs) + Compiler._all_compiler_rpath_libraries
return list(paths_containing_libs(link_dirs, all_required_libs))
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
output = self.compiler_verbose_output
if not output:
return None
dynamic_linker = spack.util.libc.parse_dynamic_linker(output)
if not dynamic_linker:
return None
return spack.util.libc.libc_from_dynamic_linker(dynamic_linker)
@property
def required_libs(self):
"""For executables created with this compiler, the compiler libraries
@@ -436,17 +452,17 @@ def required_libs(self):
# By default every compiler returns the empty list
return []
def _get_compiler_link_paths(self):
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
if not hasattr(self, "_compile_c_source_output"):
self._compile_c_source_output = self._compile_dummy_c_source()
return self._compile_c_source_output
def _compile_dummy_c_source(self) -> Optional[str]:
cc = self.cc if self.cc else self.cxx
if not cc or not self.verbose_flag:
# Cannot determine implicit link paths without a compiler / verbose flag
return []
# What flag types apply to first_compiler, in what order
if cc == self.cc:
flags = ["cflags", "cppflags", "ldflags"]
else:
flags = ["cxxflags", "cppflags", "ldflags"]
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
@@ -458,20 +474,19 @@ def _get_compiler_link_paths(self):
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
cc_exe = spack.util.executable.Executable(cc)
for flag_type in flags:
for flag_type in ["cflags" if cc == self.cc else "cxxflags", "cppflags", "ldflags"]:
cc_exe.add_default_arg(*self.flags.get(flag_type, []))
with self.compiler_environment():
output = cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
return _parse_non_system_link_dirs(output)
return cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
except spack.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)
return []
return None
finally:
shutil.rmtree(tmpdir, ignore_errors=True)
@property
def verbose_flag(self):
def verbose_flag(self) -> Optional[str]:
"""
This property should be overridden in the compiler subclass if a
verbose flag is available.
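
compiler_verbose_output caches the result of the dummy compilation, including a None on failure, so the probe runs at most once per compiler instance. A minimal sketch of the pattern, with _probe standing in for the real compile step:

    from typing import Optional

    class Probed:
        @property
        def verbose_output(self) -> Optional[str]:
            # hasattr-based memoization: even a None result is cached.
            if not hasattr(self, "_verbose_output"):
                self._verbose_output = self._probe()
            return self._verbose_output

        def _probe(self) -> Optional[str]:
            print("probing...")  # the real code compiles a dummy C file here
            return "collect2 ... -L/usr/lib ..."

    p = Probed()
    p.verbose_output  # prints "probing..."
    p.verbose_output  # cached; no second probe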
@@ -669,8 +684,8 @@ def __str__(self):
@contextlib.contextmanager
def compiler_environment(self):
# yield immediately if no modules
if not self.modules:
# Avoid modifying os.environ if possible.
if not self.modules and not self.environment:
yield
return
@@ -687,13 +702,9 @@ def compiler_environment(self):
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
env = spack.util.environment.EnvironmentModifications()
env.extend(spack.schema.environment.parse(self.environment))
env.apply_modifications()
spack.schema.environment.parse(self.environment).apply_modifications()
yield
except BaseException:
raise
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()


@@ -10,6 +10,7 @@
import itertools
import multiprocessing.pool
import os
import warnings
from typing import Dict, List, Optional, Tuple
import archspec.cpu
@@ -109,27 +110,33 @@ def _to_dict(compiler):
return {"compiler": d}
def get_compiler_config(scope=None, init_config=False):
def get_compiler_config(
configuration: "spack.config.Configuration",
*,
scope: Optional[str] = None,
init_config: bool = False,
) -> List[Dict]:
"""Return the compiler configuration for the specified architecture."""
config = spack.config.CONFIG.get("compilers", scope=scope) or []
config = configuration.get("compilers", scope=scope) or []
if config or not init_config:
return config
merged_config = spack.config.CONFIG.get("compilers")
merged_config = configuration.get("compilers")
if merged_config:
# Config is empty for this scope
# Do not init config because there is a non-empty scope
return config
_init_compiler_config(scope=scope)
config = spack.config.CONFIG.get("compilers", scope=scope)
_init_compiler_config(configuration, scope=scope)
config = configuration.get("compilers", scope=scope)
return config
def get_compiler_config_from_packages(scope=None):
def get_compiler_config_from_packages(
configuration: "spack.config.Configuration", *, scope: Optional[str] = None
) -> List[Dict]:
"""Return the compiler configuration from packages.yaml"""
config = spack.config.get("packages", scope=scope)
config = configuration.get("packages", scope=scope)
if not config:
return []
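
After this refactor the configuration object is an explicit, keyword-only parameter instead of an implicit read of the global. Callers that previously relied on the global now pass it explicitly, e.g.:

    import spack.compilers
    import spack.config

    # Same behavior as before the refactor, but the dependency is visible
    # and tests can substitute a different Configuration object.
    compilers = spack.compilers.get_compiler_config(
        spack.config.CONFIG, scope=None, init_config=False
    )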
@@ -157,33 +164,56 @@ def _compiler_config_from_package_config(config):
def _compiler_config_from_external(config):
extra_attributes_key = "extra_attributes"
compilers_key = "compilers"
c_key, cxx_key, fortran_key = "c", "cxx", "fortran"
# Allow `@x.y.z` instead of `@=x.y.z`
spec = spack.spec.parse_with_version_concrete(config["spec"])
# use str(spec.versions) to allow `@x.y.z` instead of `@=x.y.z`
compiler_spec = spack.spec.CompilerSpec(
package_name_to_compiler_name.get(spec.name, spec.name), spec.version
)
extra_attributes = config.get("extra_attributes", {})
prefix = config.get("prefix", None)
err_header = f"The external spec '{spec}' cannot be used as a compiler"
compiler_class = class_for_compiler_name(compiler_spec.name)
paths = extra_attributes.get("paths", {})
compiler_langs = ["cc", "cxx", "fc", "f77"]
for lang in compiler_langs:
if paths.setdefault(lang, None):
continue
if not prefix:
continue
# Check for files that satisfy the naming scheme for this compiler
bindir = os.path.join(prefix, "bin")
for f, regex in itertools.product(os.listdir(bindir), compiler_class.search_regexps(lang)):
if regex.match(f):
paths[lang] = os.path.join(bindir, f)
if all(v is None for v in paths.values()):
# If 'extra_attributes' is missing, this entry may not be meant for use as
# a compiler, so just leave a debug message rather than a loud warning.
if extra_attributes_key not in config:
tty.debug(f"[{__file__}] {err_header}: missing the '{extra_attributes_key}' key")
return None
extra_attributes = config[extra_attributes_key]
# If I have 'extra_attributes' warn if 'compilers' is missing, or we don't have a C compiler
if compilers_key not in extra_attributes:
warnings.warn(
f"{err_header}: missing the '{compilers_key}' key under '{extra_attributes_key}'"
)
return None
attribute_compilers = extra_attributes[compilers_key]
if c_key not in attribute_compilers:
warnings.warn(
f"{err_header}: missing the C compiler path under "
f"'{extra_attributes_key}:{compilers_key}'"
)
return None
c_compiler = attribute_compilers[c_key]
# C++ and Fortran compilers are not mandatory, so let's just leave a debug trace
if cxx_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a C++ compiler")
if fortran_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a Fortran compiler")
# compilers format has cc/fc/f77, externals format has "c/fortran"
paths = {
"cc": c_compiler,
"cxx": attribute_compilers.get(cxx_key, None),
"fc": attribute_compilers.get(fortran_key, None),
"f77": attribute_compilers.get(fortran_key, None),
}
if not spec.architecture:
host_platform = spack.platforms.host()
@@ -216,13 +246,15 @@ def _compiler_config_from_external(config):
return compiler_entry
def _init_compiler_config(*, scope):
def _init_compiler_config(
configuration: "spack.config.Configuration", *, scope: Optional[str]
) -> None:
"""Compiler search used when Spack has no compilers."""
compilers = find_compilers()
compilers_dict = []
for compiler in compilers:
compilers_dict.append(_to_dict(compiler))
spack.config.set("compilers", compilers_dict, scope=scope)
configuration.set("compilers", compilers_dict, scope=scope)
def compiler_config_files():
@@ -233,7 +265,7 @@ def compiler_config_files():
compiler_config = config.get("compilers", scope=name)
if compiler_config:
config_files.append(config.get_config_filename(name, "compilers"))
compiler_config_from_packages = get_compiler_config_from_packages(scope=name)
compiler_config_from_packages = get_compiler_config_from_packages(config, scope=name)
if compiler_config_from_packages:
config_files.append(config.get_config_filename(name, "packages"))
return config_files
@@ -246,7 +278,9 @@ def add_compilers_to_config(compilers, scope=None):
compilers: a list of Compiler objects.
scope: configuration scope to modify.
"""
compiler_config = get_compiler_config(scope, init_config=False)
compiler_config = get_compiler_config(
configuration=spack.config.CONFIG, scope=scope, init_config=False
)
for compiler in compilers:
if not compiler.cc:
tty.debug(f"{compiler.spec} does not have a C compiler")
@@ -295,7 +329,9 @@ def _remove_compiler_from_scope(compiler_spec, scope):
True if one or more compiler entries were actually removed, False otherwise
"""
assert scope is not None, "a specific scope is needed when calling this function"
compiler_config = get_compiler_config(scope, init_config=False)
compiler_config = get_compiler_config(
configuration=spack.config.CONFIG, scope=scope, init_config=False
)
filtered_compiler_config = [
compiler_entry
for compiler_entry in compiler_config
@@ -310,21 +346,28 @@ def _remove_compiler_from_scope(compiler_spec, scope):
# We need to preserve the YAML type for comments, hence we are copying the
# items in the list that has just been retrieved
compiler_config[:] = filtered_compiler_config
spack.config.set("compilers", compiler_config, scope=scope)
spack.config.CONFIG.set("compilers", compiler_config, scope=scope)
return True
def all_compilers_config(scope=None, init_config=True):
def all_compilers_config(
configuration: "spack.config.Configuration",
*,
scope: Optional[str] = None,
init_config: bool = True,
) -> List["spack.compiler.Compiler"]:
"""Return a set of specs for all the compiler versions currently
available to build with. These are instances of CompilerSpec.
"""
from_packages_yaml = get_compiler_config_from_packages(scope)
from_packages_yaml = get_compiler_config_from_packages(configuration, scope=scope)
if from_packages_yaml:
init_config = False
from_compilers_yaml = get_compiler_config(scope, init_config)
from_compilers_yaml = get_compiler_config(configuration, scope=scope, init_config=init_config)
result = from_compilers_yaml + from_packages_yaml
key = lambda c: _compiler_from_config_entry(c["compiler"])
# Dedupe entries by the compiler they represent
# If the entry is invalid, treat it as unique for deduplication
key = lambda c: _compiler_from_config_entry(c["compiler"]) or id(c)
return list(llnl.util.lang.dedupe(result, key=key))
@@ -332,7 +375,7 @@ def all_compiler_specs(scope=None, init_config=True):
# Return compiler specs from the merged config.
return [
spack.spec.parse_with_version_concrete(s["compiler"]["spec"], compiler=True)
for s in all_compilers_config(scope, init_config)
for s in all_compilers_config(spack.config.CONFIG, scope=scope, init_config=init_config)
]
@@ -492,11 +535,20 @@ def find_specs_by_arch(compiler_spec, arch_spec, scope=None, init_config=True):
def all_compilers(scope=None, init_config=True):
config = all_compilers_config(scope, init_config=init_config)
compilers = list()
for items in config:
return all_compilers_from(
configuration=spack.config.CONFIG, scope=scope, init_config=init_config
)
def all_compilers_from(configuration, scope=None, init_config=True):
compilers = []
for items in all_compilers_config(
configuration=configuration, scope=scope, init_config=init_config
):
items = items["compiler"]
compilers.append(_compiler_from_config_entry(items))
compiler = _compiler_from_config_entry(items) # can be None in error case
if compiler:
compilers.append(compiler)
return compilers
@@ -507,7 +559,7 @@ def compilers_for_spec(
"""This gets all compilers that satisfy the supplied CompilerSpec.
Returns an empty list if none are found.
"""
config = all_compilers_config(scope, init_config)
config = all_compilers_config(spack.config.CONFIG, scope=scope, init_config=init_config)
matches = set(find(compiler_spec, scope, init_config))
compilers = []
@@ -517,7 +569,7 @@ def compilers_for_spec(
def compilers_for_arch(arch_spec, scope=None):
config = all_compilers_config(scope)
config = all_compilers_config(spack.config.CONFIG, scope=scope)
return list(get_compilers(config, arch_spec=arch_spec))
@@ -603,7 +655,10 @@ def _compiler_from_config_entry(items):
compiler = _compiler_cache.get(config_id, None)
if compiler is None:
compiler = compiler_from_dict(items)
try:
compiler = compiler_from_dict(items)
except UnknownCompilerError as e:
warnings.warn(e.message)
_compiler_cache[config_id] = compiler
return compiler
@@ -656,7 +711,9 @@ def get_compilers(config, cspec=None, arch_spec=None):
raise ValueError(msg)
continue
compilers.append(_compiler_from_config_entry(items))
compiler = _compiler_from_config_entry(items)
if compiler:
compilers.append(compiler)
return compilers
@@ -933,10 +990,11 @@ def _default_make_compilers(cmp_id, paths):
make_mixed_toolchain(flat_compilers)
# Finally, create the compiler list
compilers = []
compilers: List["spack.compiler.Compiler"] = []
for compiler_id, _, compiler in flat_compilers:
make_compilers = getattr(compiler_id.os, "make_compilers", _default_make_compilers)
compilers.extend(make_compilers(compiler_id, compiler))
candidates = make_compilers(compiler_id, compiler)
compilers.extend(x for x in candidates if x.cc is not None)
return compilers


@@ -38,10 +38,10 @@ class Clang(Compiler):
cxx_names = ["clang++"]
# Subclasses use possible names of Fortran 77 compiler
f77_names = ["flang"]
f77_names = ["flang-new", "flang"]
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["flang"]
fc_names = ["flang-new", "flang"]
version_argument = "--version"
@@ -96,6 +96,8 @@ def verbose_flag(self):
openmp_flag = "-fopenmp"
# C++ flags based on CMake Modules/Compiler/Clang.cmake
@property
def cxx11_flag(self):
if self.real_version < Version("3.3"):
@@ -120,6 +122,24 @@ def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("5.0"):
raise UnsupportedCompilerFlag(self, "the C++20 standard", "cxx20_flag", "< 5.0")
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("12.0"):
raise UnsupportedCompilerFlag(self, "the C++23 standard", "cxx23_flag", "< 12.0")
elif self.real_version < Version("17.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
return "-std=c99"
@@ -142,7 +162,10 @@ def c17_flag(self):
def c23_flag(self):
if self.real_version < Version("9.0"):
raise UnsupportedCompilerFlag(self, "the C23 standard", "c23_flag", "< 9.0")
return "-std=c2x"
elif self.real_version < Version("18.0"):
return "-std=c2x"
else:
return "-std=c23"
@property
def cc_pic_flag(self):
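
The new cxx20_flag/cxx23_flag/c23_flag properties all follow the same gating pattern: too-old compilers raise, intermediate releases get the draft-standard spelling, and recent ones the final name. A standalone sketch of the same logic:

    def cxx23_flag(version: tuple) -> str:
        if version < (12, 0):
            raise RuntimeError("clang < 12.0 does not support the C++23 standard")
        elif version < (17, 0):
            return "-std=c++2b"  # draft-era spelling
        else:
            return "-std=c++23"  # final standard name

    assert cxx23_flag((16, 0)) == "-std=c++2b"
    assert cxx23_flag((18, 1)) == "-std=c++23"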
@@ -171,10 +194,11 @@ def extract_version_from_output(cls, output):
match = re.search(
# Normal clang compiler versions are left as-is
r"clang version ([^ )\n]+)-svn[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
# Don't include hyphenated patch numbers in the version
# (see https://github.com/spack/spack/pull/14365 for details)
r"clang version ([^ )\n]+?)-[~.\w\d-]*|" r"clang version ([^ )\n]+)",
r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)",
output,
)
if match:


@@ -1,34 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import spack.compilers.oneapi
class Dpcpp(spack.compilers.oneapi.Oneapi):
"""This is the same as the oneAPI compiler but uses dpcpp instead of
icpx (for DPC++ source files). It explicitly refers to dpcpp, so that
CMake test files which check the compiler name (e.g. CMAKE_CXX_COMPILER)
detect it as dpcpp.
Ideally we could switch out icpx for dpcpp where needed in the oneAPI
compiler definition, but two things are needed for that: (a) a way to
tell the compiler that it should be using dpcpp and (b) a way to
customize the link_paths
See also: https://www.intel.com/content/www/us/en/develop/documentation/oneapi-dpcpp-cpp-compiler-dev-guide-and-reference/top/compiler-setup/using-the-command-line/invoking-the-compiler.html
"""
# Subclasses use possible names of C++ compiler
cxx_names = ["dpcpp"]
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("oneapi", "icx"),
"cxx": os.path.join("oneapi", "dpcpp"),
"f77": os.path.join("oneapi", "ifx"),
"fc": os.path.join("oneapi", "ifx"),
}


@@ -8,7 +8,7 @@
import subprocess
import sys
import tempfile
from typing import Dict, List, Set
from typing import Dict, List
import archspec.cpu
@@ -20,15 +20,7 @@
from spack.error import SpackError
from spack.version import Version, VersionRange
avail_fc_version: Set[str] = set()
fc_path: Dict[str, str] = dict()
fortran_mapping = {
"2021.3.0": "19.29.30133",
"2021.2.1": "19.28.29913",
"2021.2.0": "19.28.29334",
"2021.1.0": "19.28.29333",
}
FC_PATH: Dict[str, str] = dict()
class CmdCall:
@@ -115,15 +107,13 @@ def command_str(self):
return f"{script} {self.arch} {self.sdk_ver} {self.vcvars_ver}"
def get_valid_fortran_pth(comp_ver):
cl_ver = str(comp_ver)
def get_valid_fortran_pth():
"""Assign maximum available fortran compiler version"""
# TODO (johnwparent): validate compatibility w/ try compiler
# functionality when added
sort_fn = lambda fc_ver: Version(fc_ver)
sort_fc_ver = sorted(list(avail_fc_version), key=sort_fn)
for ver in sort_fc_ver:
if ver in fortran_mapping:
if Version(cl_ver) <= Version(fortran_mapping[ver]):
return fc_path[ver]
return None
sort_fc_ver = sorted(list(FC_PATH.keys()), key=sort_fn)
return FC_PATH[sort_fc_ver[-1]] if sort_fc_ver else None
class Msvc(Compiler):
@@ -167,11 +157,9 @@ def __init__(self, *args, **kwargs):
# This positional argument "paths" is later parsed and process by the base class
# via the call to `super` later in this method
paths = args[3]
# This positional argument "cspec" is also parsed and handled by the base class
# constructor
cspec = args[0]
new_pth = [pth if pth else get_valid_fortran_pth(cspec.version) for pth in paths]
paths[:] = new_pth
latest_fc = get_valid_fortran_pth()
new_pth = [pth if pth else latest_fc for pth in paths[2:]]
paths[2:] = new_pth
# Initialize, deferring to base class but then adding the vcvarsallfile
# file based on compiler executable path.
super().__init__(*args, **kwargs)
@@ -183,7 +171,7 @@ def __init__(self, *args, **kwargs):
# and stores their path, but their respective VCVARS
# file must be invoked before usage.
env_cmds = []
compiler_root = os.path.join(self.cc, "../../../../../../..")
compiler_root = os.path.join(os.path.dirname(self.cc), "../../../../../..")
vcvars_script_path = os.path.join(compiler_root, "Auxiliary", "Build", "vcvars64.bat")
# get current platform architecture and format for vcvars argument
arch = spack.platforms.real_host().default.lower()
@@ -198,11 +186,34 @@ def __init__(self, *args, **kwargs):
# paths[2] refers to the fc path and is a generic check
# for a fortran compiler
if paths[2]:
def get_oneapi_root(pth: str):
"""From within a prefix known to be a oneAPI path
determine the oneAPI root path from arbitrary point
under root
Args:
pth: path prefixed within oneAPI root
"""
if not pth:
return ""
while os.path.basename(pth) and os.path.basename(pth) != "oneAPI":
pth = os.path.dirname(pth)
return pth
# If this found, it sets all the vars
oneapi_root = os.getenv("ONEAPI_ROOT")
oneapi_root = get_oneapi_root(self.fc)
if not oneapi_root:
raise RuntimeError(f"Non-oneAPI Fortran compiler {self.fc} assigned to MSVC")
oneapi_root_setvars = os.path.join(oneapi_root, "setvars.bat")
# some oneAPI exes return a version more precise than their
# install paths specify, so we determine path from
# the install path rather than the fc executable itself
numver = r"\d+\.\d+(?:\.\d+)?"
pattern = f"((?:{numver})|(?:latest))"
version_from_path = re.search(pattern, self.fc).group(1)
oneapi_version_setvars = os.path.join(
oneapi_root, "compiler", str(self.ifx_version), "env", "vars.bat"
oneapi_root, "compiler", version_from_path, "env", "vars.bat"
)
# order matters here, the specific version env must be invoked first,
# otherwise it will be ignored if the root setvars sets up the oneapi
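
get_oneapi_root walks up the directory tree until it reaches the oneAPI component; a self-contained version of the same walk (example path hypothetical):

    import os

    def find_named_root(path: str, name: str = "oneAPI") -> str:
        # Ascend until the basename matches, or until the filesystem root
        # (where basename is empty) is reached.
        while os.path.basename(path) and os.path.basename(path) != name:
            path = os.path.dirname(path)
        return path

    assert find_named_root("/opt/intel/oneAPI/compiler/latest/bin/ifx") == "/opt/intel/oneAPI"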
@@ -314,23 +325,19 @@ def setup_custom_environment(self, pkg, env):
@classmethod
def fc_version(cls, fc):
# We're using intel for the Fortran compilers, which exist if
# ONEAPI_ROOT is a meaningful variable
if not sys.platform == "win32":
return "unknown"
fc_ver = cls.default_version(fc)
avail_fc_version.add(fc_ver)
fc_path[fc_ver] = fc
if os.getenv("ONEAPI_ROOT"):
try:
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError("Windows compiler search paths not established")
clp = spack.util.executable.which_string("cl", path=sps)
ver = cls.default_version(clp)
else:
ver = fc_ver
return ver
FC_PATH[fc_ver] = fc
try:
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError(
"Windows compiler search paths not established, "
"please report this behavior to github.com/spack/spack"
)
clp = spack.util.executable.which_string("cl", path=sps)
return cls.default_version(clp) if clp else fc_ver
@classmethod
def f77_version(cls, f77):


@@ -64,7 +64,7 @@ def verbose_flag(self):
#
# This way, we at least enable the implicit rpath detection, which is
# based on compilation of a C file (see method
# spack.compiler._get_compiler_link_paths): in the case of a mixed
# spack.compiler._compile_dummy_c_source): in the case of a mixed
# NAG/GCC toolchain, the flag will be passed to g++ (e.g.
# 'g++ -Wl,-v ./main.c'), otherwise, the flag will be passed to nagfor
# (e.g. 'nagfor -Wl,-v ./main.c' - note that nagfor recognizes '.c'


@@ -74,6 +74,10 @@ class Concretizer:
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
#: Packages that the old concretizer cannot deal with correctly, and cannot build anyway.
#: Those will not be considered as providers for virtuals.
non_buildable_packages = {"glibc", "musl"}
def __init__(self, abstract_spec=None):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
@@ -113,7 +117,11 @@ def _valid_virtuals_and_externals(self, spec):
pref_key = lambda spec: 0 # no-op pref key
if spec.virtual:
candidates = spack.repo.PATH.providers_for(spec)
candidates = [
s
for s in spack.repo.PATH.providers_for(spec)
if s.name not in self.non_buildable_packages
]
if not candidates:
raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)
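
A tiny sketch of the filtering added here: libc providers are dropped before the old concretizer ever considers them (provider names hypothetical).

    non_buildable = {"glibc", "musl"}
    providers = ["glibc", "musl", "mpich"]
    candidates = [p for p in providers if p not in non_buildable]
    assert candidates == ["mpich"]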


@@ -1562,8 +1562,9 @@ def ensure_latest_format_fn(section: str) -> Callable[[YamlConfigDict], bool]:
def use_configuration(
*scopes_or_paths: Union[ConfigScope, str]
) -> Generator[Configuration, None, None]:
"""Use the configuration scopes passed as arguments within the
context manager.
"""Use the configuration scopes passed as arguments within the context manager.
This function invalidates caches, and is therefore very slow.
Args:
*scopes_or_paths: scope objects or paths to be used


@@ -12,26 +12,26 @@
},
"os_package_manager": "yum_amazon"
},
"fedora:38": {
"fedora:40": {
"bootstrap": {
"template": "container/fedora_38.dockerfile",
"image": "docker.io/fedora:38"
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:40"
},
"os_package_manager": "dnf",
"build": "spack/fedora38",
"build": "spack/fedora40",
"final": {
"image": "docker.io/fedora:38"
"image": "docker.io/fedora:40"
}
},
"fedora:37": {
"fedora:39": {
"bootstrap": {
"template": "container/fedora_37.dockerfile",
"image": "docker.io/fedora:37"
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:39"
},
"os_package_manager": "dnf",
"build": "spack/fedora37",
"build": "spack/fedora39",
"final": {
"image": "docker.io/fedora:37"
"image": "docker.io/fedora:39"
}
},
"rockylinux:9": {
@@ -116,6 +116,13 @@
},
"os_package_manager": "apt"
},
"ubuntu:24.04": {
"bootstrap": {
"template": "container/ubuntu_2404.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-noble"
},
"ubuntu:22.04": {
"bootstrap": {
"template": "container/ubuntu_2204.dockerfile"
@@ -129,13 +136,6 @@
},
"build": "spack/ubuntu-focal",
"os_package_manager": "apt"
},
"ubuntu:18.04": {
"bootstrap": {
"template": "container/ubuntu_1804.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-bionic"
}
},
"os_package_managers": {


@@ -25,6 +25,7 @@
import socket
import sys
import time
from json import JSONDecoder
from typing import (
Any,
Callable,
@@ -818,7 +819,8 @@ def _read_from_file(self, filename):
"""
try:
with open(filename, "r") as f:
fdata = sjson.load(f)
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
raise CorruptDatabaseError("error parsing database:", str(e)) from e
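
JSONDecoder.raw_decode parses one JSON document and reports where it stopped, so trailing bytes (such as a future stream of objects) no longer raise. A quick illustration:

    import json

    text = '{"database": {"version": "7"}}\n{"future": "stream entry"}'
    fdata, end = json.JSONDecoder().raw_decode(text)
    assert fdata == {"database": {"version": "7"}}
    # text[end:] still holds the unparsed remainder for later consumers.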
@@ -833,27 +835,24 @@ def check(cond, msg):
# High-level file checks
db = fdata["database"]
check("installs" in db, "no 'installs' in JSON DB.")
check("version" in db, "no 'version' in JSON DB.")
installs = db["installs"]
# TODO: better version checking semantics.
version = vn.Version(db["version"])
if version > _DB_VERSION:
raise InvalidDatabaseVersionError(self, _DB_VERSION, version)
elif version < _DB_VERSION:
if not any(old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX):
tty.warn(
"Spack database version changed from %s to %s. Upgrading."
% (version, _DB_VERSION)
)
elif version < _DB_VERSION and not any(
old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX
):
tty.warn(f"Spack database version changed from {version} to {_DB_VERSION}. Upgrading.")
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields))
for k, v in self._data.items()
)
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields)) for k, v in self._data.items()
)
else:
check("installs" in db, "no 'installs' in JSON DB.")
installs = db["installs"]
spec_reader = reader(version)


@@ -83,26 +83,15 @@ def executables_in_path(path_hints: List[str]) -> Dict[str, str]:
return path_to_dict(search_paths)
def get_elf_compat(path):
"""For ELF files, get a triplet (EI_CLASS, EI_DATA, e_machine) and see if
it is host-compatible."""
# On supported ELF platforms we try to be a bit smarter about shared
# libraries, dropping those that are not host compatible.
with open(path, "rb") as f:
elf = elf_utils.parse_elf(f, only_header=True)
return (elf.is_64_bit, elf.is_little_endian, elf.elf_hdr.e_machine)
def accept_elf(path, host_compat):
"""Accept an ELF file if the header matches the given compat triplet,
obtained with :py:func:`get_elf_compat`. In case it's not an ELF (e.g.
static library, or some arbitrary file, fall back to is_readable_file)."""
"""Accept an ELF file if the header matches the given compat triplet. In case it's not an ELF
(e.g. static library, or some arbitrary file, fall back to is_readable_file)."""
# Fast path: assume libraries at least have .so in their basename.
# Note: don't replace with splitext, because of libsmth.so.1.2.3 file names.
if ".so" not in os.path.basename(path):
return llnl.util.filesystem.is_readable_file(path)
try:
return host_compat == get_elf_compat(path)
return host_compat == elf_utils.get_elf_compat(path)
except (OSError, elf_utils.ElfParsingError):
return llnl.util.filesystem.is_readable_file(path)
@@ -155,7 +144,7 @@ def libraries_in_ld_and_system_library_path(
search_paths = list(llnl.util.lang.dedupe(search_paths, key=file_identifier))
try:
host_compat = get_elf_compat(sys.executable)
host_compat = elf_utils.get_elf_compat(sys.executable)
accept = lambda path: accept_elf(path, host_compat)
except (OSError, elf_utils.ElfParsingError):
accept = llnl.util.filesystem.is_readable_file


@@ -11,6 +11,7 @@
from llnl.util import filesystem
import spack.platforms
import spack.repo
import spack.spec
from spack.util import spack_yaml
@@ -32,6 +33,8 @@ class ExpectedTestResult(NamedTuple):
#: Spec to be detected
spec: str
#: Attributes expected in the external spec
extra_attributes: Dict[str, str]
class DetectionTest(NamedTuple):
@@ -100,7 +103,10 @@ def _create_executable_scripts(self, mock_executables: MockExecutables) -> List[
@property
def expected_specs(self) -> List[spack.spec.Spec]:
return [spack.spec.Spec(r.spec) for r in self.test.results]
return [
spack.spec.Spec.from_detection(item.spec, extra_attributes=item.extra_attributes)
for item in self.test.results
]
def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runner]:
@@ -117,9 +123,13 @@ def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runn
"""
result = []
detection_tests_content = read_detection_tests(pkg_name, repository)
current_platform = str(spack.platforms.host())
tests_by_path = detection_tests_content.get("paths", [])
for single_test_data in tests_by_path:
if current_platform not in single_test_data.get("platforms", [current_platform]):
continue
mock_executables = []
for layout in single_test_data["layout"]:
mock_executables.append(
@@ -127,7 +137,11 @@ def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runn
)
expected_results = []
for assertion in single_test_data["results"]:
expected_results.append(ExpectedTestResult(spec=assertion["spec"]))
expected_results.append(
ExpectedTestResult(
spec=assertion["spec"], extra_attributes=assertion.get("extra_attributes", {})
)
)
current_test = DetectionTest(
pkg_name=pkg_name, layout=mock_executables, results=expected_results


@@ -27,6 +27,7 @@ class OpenMpi(Package):
* ``variant``
* ``version``
* ``requires``
* ``redistribute``
"""
import collections
@@ -63,6 +64,7 @@ class OpenMpi(Package):
__all__ = [
"DirectiveError",
"DirectiveMeta",
"DisableRedistribute",
"version",
"conflicts",
"depends_on",
@@ -75,6 +77,7 @@ class OpenMpi(Package):
"resource",
"build_system",
"requires",
"redistribute",
]
#: These are variant names used by Spack internally; packages can't use them
@@ -598,9 +601,68 @@ def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
return _execute_depends_on
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
By default, Packages allow source/binary distribution (i.e. in
mirrors). Because of this, and because overlapping enable/
disable specs are not allowed, this directive only allows users
to explicitly disable redistribution for specs.
"""
return lambda pkg: _execute_redistribute(pkg, source, binary, when)
def _execute_redistribute(
pkg: "spack.package_base.PackageBase", source=None, binary=None, when: WhenType = None
):
if source is None and binary is None:
return
elif (source is True) or (binary is True):
raise DirectiveError(
"Source/binary distribution are true by default, they can only "
"be explicitly disabled."
)
if source is None:
source = True
if binary is None:
binary = True
when_spec = _make_when_spec(when)
if not when_spec:
return
if source is False:
max_constraint = spack.spec.Spec(f"{pkg.name}@{when_spec.versions}")
if not max_constraint.satisfies(when_spec):
raise DirectiveError("Source distribution can only be disabled for versions")
if when_spec in pkg.disable_redistribute:
disable = pkg.disable_redistribute[when_spec]
if not source:
disable.source = True
if not binary:
disable.binary = True
else:
pkg.disable_redistribute[when_spec] = DisableRedistribute(
source=not source, binary=not binary
)
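
A hedged example of the directive in a hypothetical package.py (names, URLs, and checksum are placeholders):

    from spack.package import *

    class MyRestricted(Package):
        """Hypothetical package whose 1.x sources may not be mirrored publicly."""

        homepage = "https://example.com"
        url = "https://example.com/myrestricted-1.0.tar.gz"

        version("1.0", sha256="0000000000000000000000000000000000000000000000000000000000000000")

        # Neither sources nor binaries of the 1.x series may be redistributed.
        redistribute(source=False, binary=False, when="@1")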
@directive(("extendees", "dependencies"))
def extends(spec, when=None, type=("build", "run"), patches=None):
"""Same as depends_on, but also adds this package to the extendee list.
In case of Python, also adds a dependency on python-venv.
keyword arguments can be passed to extends() so that extension
packages can pass parameters to the extendee's extension
@@ -616,6 +678,11 @@ def _execute_extends(pkg):
_depends_on(pkg, spec, when=when, type=type, patches=patches)
spec_obj = spack.spec.Spec(spec)
# When extending python, also add a dependency on python-venv. This is done so that
# Spack environment views are Python virtual environments.
if spec_obj.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, "python-venv", when=when, type=("build", "run"))
# TODO: the values of the extendees dictionary are not used. Remove in next refactor.
pkg.extendees[spec_obj.name] = (spec_obj, None)


@@ -15,6 +15,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.config
import spack.hash_types as ht
@@ -181,7 +182,7 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
base_dir = (
self.path_for_spec(deprecator_spec)
if deprecator_spec
else os.readlink(deprecated_spec.prefix)
else readlink(deprecated_spec.prefix)
)
yaml_path = os.path.join(


@@ -34,6 +34,9 @@
* ``spec``: a string representation of the abstract spec that was concretized
4. ``concrete_specs``: a dictionary containing the specs in the environment.
5. ``include_concrete`` (dictionary): an optional dictionary that includes the roots
and concrete specs from the included environments, keyed by the path to that
environment
Compatibility
-------------
@@ -50,26 +53,37 @@
- ``v2``
- ``v3``
- ``v4``
- ``v5``
   * - ``v0.12:0.14``
     - ✅
     -
     -
     -
     -
   * - ``v0.15:0.16``
     - ✅
     - ✅
     -
     -
     -
   * - ``v0.17``
     - ✅
     - ✅
     - ✅
     -
     -
   * - ``v0.18:``
     - ✅
     - ✅
     - ✅
     - ✅
     -
   * - ``v0.22:``
     - ✅
     - ✅
     - ✅
     - ✅
     - ✅
Version 1
---------
@@ -334,6 +348,118 @@
}
}
}
Version 5
---------
Version 5 does not change the top-level lockfile format, but adds an optional
``include_concrete`` dictionary holding the ``roots`` and ``concrete_specs`` of the
included environments, keyed by the path to each environment. Since it is optional,
``include_concrete`` is omitted from the lockfile when the environment has no
included environments.
.. code-block:: json
{
"_meta": {
"file-type": "spack-lockfile",
"lockfile-version": 5,
"specfile-version": 3
},
"roots": [
{
"hash": "<dag_hash 1>",
"spec": "<abstract spec 1>"
},
{
"hash": "<dag_hash 2>",
"spec": "<abstract spec 2>"
}
],
"concrete_specs": {
"<dag_hash 1>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_1",
"hash": "<dag_hash for depname_1>",
"type": ["build", "link"]
},
{
"name": "depname_2",
"hash": "<dag_hash for depname_2>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 1>",
},
"<daghash 2>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_3",
"hash": "<dag_hash for depname_3>",
"type": ["build", "link"]
},
{
"name": "depname_4",
"hash": "<dag_hash for depname_4>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 2>"
}
},
"include_concrete": {
"<path to environment>": {
"roots": [
{
"hash": "<dag_hash 1>",
"spec": "<abstract spec 1>"
},
{
"hash": "<dag_hash 2>",
"spec": "<abstract spec 2>"
}
],
"concrete_specs": {
"<dag_hash 1>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_1",
"hash": "<dag_hash for depname_1>",
"type": ["build", "link"]
},
{
"name": "depname_2",
"hash": "<dag_hash for depname_2>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 1>",
},
"<daghash 2>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_3",
"hash": "<dag_hash for depname_3>",
"type": ["build", "link"]
},
{
"name": "depname_4",
"hash": "<dag_hash for depname_4>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 2>"
}
}
}
}
}
"""
from .environment import (


@@ -16,13 +16,13 @@
import urllib.parse
import urllib.request
import warnings
from typing import Dict, Iterable, List, Optional, Set, Tuple, Union
from typing import Any, Dict, Iterable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import symlink
from llnl.util.symlink import readlink, symlink
import spack.compilers
import spack.concretize
@@ -106,17 +106,16 @@ def environment_name(path: Union[str, pathlib.Path]) -> str:
return path_str
def check_disallowed_env_config_mods(scopes):
def ensure_no_disallowed_env_config_mods(scopes: List[spack.config.ConfigScope]) -> None:
for scope in scopes:
with spack.config.use_configuration(scope):
if spack.config.get("config:environments_root"):
raise SpackEnvironmentError(
"Spack environments are prohibited from modifying 'config:environments_root' "
"because it can make the definition of the environment ill-posed. Please "
"remove from your environment and place it in a permanent scope such as "
"defaults, system, site, etc."
)
return scopes
config = scope.get_section("config")
if config and "environments_root" in config["config"]:
raise SpackEnvironmentError(
"Spack environments are prohibited from modifying 'config:environments_root' "
"because it can make the definition of the environment ill-posed. Please "
"remove from your environment and place it in a permanent scope such as "
"defaults, system, site, etc."
)
def default_manifest_yaml():
@@ -160,6 +159,8 @@ def default_manifest_yaml():
default_view_name = "default"
# Default behavior to link all packages into views (vs. only root packages)
default_view_link = "all"
# The name for any included concrete specs
included_concrete_name = "include_concrete"
def installed_specs():
@@ -294,6 +295,7 @@ def create(
init_file: Optional[Union[str, pathlib.Path]] = None,
with_view: Optional[Union[str, pathlib.Path, bool]] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
) -> "Environment":
"""Create a managed environment in Spack and returns it.
@@ -310,10 +312,15 @@ def create(
string, it specifies the path to the view
keep_relative: if True, develop paths are copied verbatim into the new environment file,
otherwise they are made absolute
include_concrete: list of concrete environment names/paths to be included
"""
environment_dir = environment_dir_from_name(name, exists_ok=False)
return create_in_dir(
environment_dir, init_file=init_file, with_view=with_view, keep_relative=keep_relative
environment_dir,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
@@ -322,6 +329,7 @@ def create_in_dir(
init_file: Optional[Union[str, pathlib.Path]] = None,
with_view: Optional[Union[str, pathlib.Path, bool]] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
) -> "Environment":
"""Create an environment in the directory passed as input and returns it.
@@ -335,6 +343,7 @@ def create_in_dir(
string, it specifies the path to the view
keep_relative: if True, develop paths are copied verbatim into the new environment file,
otherwise they are made absolute
include_concrete: concrete environment names/paths to be included
"""
initialize_environment_dir(root, envfile=init_file)
@@ -347,6 +356,12 @@ def create_in_dir(
if with_view is not None:
manifest.set_default_view(with_view)
if include_concrete is not None:
set_included_envs_to_env_paths(include_concrete)
validate_included_envs_exists(include_concrete)
validate_included_envs_concrete(include_concrete)
manifest.set_include_concrete(include_concrete)
manifest.flush()
except (spack.config.ConfigFormatError, SpackEnvironmentConfigError) as e:
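
A hedged usage sketch of the new parameter, assuming an environment named base-env already exists and has been concretized (so its lockfile is present):

    import spack.environment as ev

    # Create a managed environment whose lockfile embeds the concrete specs
    # of an existing environment, given by name or by path.
    ev.create("combined", include_concrete=["base-env"])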
@@ -420,6 +435,67 @@ def ensure_env_root_path_exists():
fs.mkdirp(env_root_path())
def set_included_envs_to_env_paths(include_concrete: List[str]) -> None:
"""If the included environment(s) is the environment name
it is replaced by the path to the environment
Args:
include_concrete: list of env name or path to env"""
for i, env_name in enumerate(include_concrete):
if is_env_dir(env_name):
include_concrete[i] = env_name
elif exists(env_name):
include_concrete[i] = root(env_name)
def validate_included_envs_exists(include_concrete: List[str]) -> None:
"""Checks that all of the included environments exist
Args:
include_concrete: list of already existing concrete environments to include
Raises:
SpackEnvironmentError: if any of the included environments do not exist
"""
missing_envs = set()
for i, env_name in enumerate(include_concrete):
if not is_env_dir(env_name):
missing_envs.add(env_name)
if missing_envs:
msg = "The following environment(s) are missing: {0}".format(", ".join(missing_envs))
raise SpackEnvironmentError(msg)
def validate_included_envs_concrete(include_concrete: List[str]) -> None:
"""Checks that all of the included environments are concrete
Args:
include_concrete: list of already existing concrete environments to include
Raises:
SpackEnvironmentError: if any of the included environments are not concrete
"""
non_concrete_envs = set()
for env_path in include_concrete:
if not os.path.exists(Environment(env_path).lock_path):
non_concrete_envs.add(Environment(env_path).name)
if non_concrete_envs:
msg = "The following environment(s) are not concrete: {0}\n" "Please run:".format(
", ".join(non_concrete_envs)
)
for env in non_concrete_envs:
msg += f"\n\t`spack -e {env} concretize`"
raise SpackEnvironmentError(msg)
def all_environment_names():
"""List the names of environments that currently exist."""
# just return empty if the env path does not exist. A read-only
@@ -586,7 +662,7 @@ def _current_root(self):
if not os.path.islink(self.root):
return None
root = os.readlink(self.root)
root = readlink(self.root)
if os.path.isabs(root):
return root
@@ -822,6 +898,18 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
self.specs_by_hash: Dict[str, Spec] = {}
#: Repository for this environment (memoized)
self._repo = None
#: Environment paths for concrete (lockfile) included environments
self.included_concrete_envs: List[str] = []
#: First-level included concretized spec data from/to the lockfile.
self.included_concrete_spec_data: Dict[str, Dict[str, List[str]]] = {}
#: User specs from included environments from the last concretization
self.included_concretized_user_specs: Dict[str, List[Spec]] = {}
#: Roots from included environments with the last concretization, in order
self.included_concretized_order: Dict[str, List[str]] = {}
#: Concretized specs by hash from the included environments
self.included_specs_by_hash: Dict[str, Dict[str, Spec]] = {}
#: Previously active environment
self._previous_active = None
self._dev_specs = None
@@ -859,7 +947,7 @@ def _read(self):
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
read_lock_version = self._read_lockfile(f)
read_lock_version = self._read_lockfile(f)["_meta"]["lockfile-version"]
if read_lock_version == 1:
tty.debug(f"Storing backup of {self.lock_path} at {self._lock_backup_v1_path}")
@@ -927,6 +1015,20 @@ def add_view(name, values):
if self.views == dict():
self.views[default_view_name] = ViewDescriptor(self.path, self.view_path_default)
def _process_concrete_includes(self):
"""Extract and load into memory included concrete spec data."""
self.included_concrete_envs = self.manifest[TOP_LEVEL_KEY].get(included_concrete_name, [])
if self.included_concrete_envs:
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
data = self._read_lockfile(f)
if included_concrete_name in data:
self.included_concrete_spec_data = data[included_concrete_name]
else:
self.include_concrete_envs()
def _construct_state_from_manifest(self):
"""Set up user specs and views from the manifest file."""
self.spec_lists = collections.OrderedDict()
@@ -943,6 +1045,31 @@ def _construct_state_from_manifest(self):
self.spec_lists[user_speclist_name] = user_specs
self._process_view(spack.config.get("view", True))
self._process_concrete_includes()
def all_concretized_user_specs(self) -> List[Spec]:
"""Returns all of the concretized user specs of the environment and
its included environment(s)."""
concretized_user_specs = self.concretized_user_specs[:]
for included_specs in self.included_concretized_user_specs.values():
for included in included_specs:
# Don't duplicate included spec(s)
if included not in concretized_user_specs:
concretized_user_specs.append(included)
return concretized_user_specs
def all_concretized_orders(self) -> List[str]:
"""Returns all of the concretized order of the environment and
its included environment(s)."""
concretized_order = self.concretized_order[:]
for included_concretized_order in self.included_concretized_order.values():
for included in included_concretized_order:
# Don't duplicate included spec(s)
if included not in concretized_order:
concretized_order.append(included)
return concretized_order
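As a minimal sketch (hypothetical hashes), the merge above appends included hashes after the environment's own while dropping duplicates and preserving order:

own_order = ["aaa111", "bbb222"]
included_orders = {"/envs/base": ["bbb222", "ccc333"]}

merged = own_order[:]
for order in included_orders.values():
    for h in order:
        if h not in merged:
            merged.append(h)

assert merged == ["aaa111", "bbb222", "ccc333"]  # "bbb222" is not duplicated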
@property
def user_specs(self):
@@ -967,6 +1094,26 @@ def _read_dev_specs(self):
dev_specs[name] = local_entry
return dev_specs
@property
def included_user_specs(self) -> SpecList:
"""Included concrete user (or root) specs from last concretization."""
spec_list = SpecList()
if not self.included_concrete_envs:
return spec_list
def add_root_specs(included_concrete_specs):
# add specs from the include *and* any nested includes it may have
for env, info in included_concrete_specs.items():
for root_info in info["roots"]:
    spec_list.add(root_info["spec"])
if "include_concrete" in info:
add_root_specs(info["include_concrete"])
add_root_specs(self.included_concrete_spec_data)
return spec_list
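To see the recursion in add_root_specs at work, here is a self-contained toy (hypothetical paths, hashes, and specs) shaped like included_concrete_spec_data, where /envs/outer itself includes /envs/inner:

data = {
    "/envs/outer": {
        "roots": [{"hash": "aaa111", "spec": "zlib"}],
        "include_concrete": {
            "/envs/inner": {"roots": [{"hash": "bbb222", "spec": "bzip2"}]}
        },
    }
}

roots = []

def add_root_specs(included_concrete_specs):
    # walk the include *and* any nested includes it may have
    for env, info in included_concrete_specs.items():
        for root_info in info["roots"]:
            roots.append(root_info["spec"])
        if "include_concrete" in info:
            add_root_specs(info["include_concrete"])

add_root_specs(data)
assert roots == ["zlib", "bzip2"]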
def clear(self, re_read=False):
"""Clear the contents of the environment
@@ -978,9 +1125,15 @@ def clear(self, re_read=False):
self.spec_lists[user_speclist_name] = SpecList()
self._dev_specs = {}
self.concretized_user_specs = [] # user specs from last concretize
self.concretized_order = [] # roots of last concretize, in order
self.concretized_user_specs = [] # user specs from last concretize
self.specs_by_hash = {} # concretized specs by hash
self.included_concrete_spec_data = {} # concretized specs from lockfile of included envs
self.included_concretized_order = {} # root specs of the included envs, keyed by env path
self.included_concretized_user_specs = {} # user specs from last concretize's included env
self.included_specs_by_hash = {} # concretized specs by hash from the included envs
self.invalidate_repository_cache()
self._previous_active = None # previously active environment
if not re_read:
@@ -1034,6 +1187,55 @@ def scope_name(self):
"""Name of the config scope of this environment's manifest file."""
return self.manifest.scope_name
def include_concrete_envs(self):
"""Copy and save the included envs' specs internally"""
lockfile_meta = None
root_hash_seen = set()
concrete_hash_seen = set()
self.included_concrete_spec_data = {}
for env_path in self.included_concrete_envs:
# Check that environment exists
if not is_env_dir(env_path):
raise SpackEnvironmentError(f"Unable to find env at {env_path}")
env = Environment(env_path)
with open(env.lock_path) as f:
lockfile_as_dict = env._read_lockfile(f)
# The lockfile metadata (_meta) must match across all included envs
# and use at least lockfile format version 5
if lockfile_meta is None:
    lockfile_meta = lockfile_as_dict["_meta"]
elif lockfile_meta != lockfile_as_dict["_meta"]:
    raise SpackEnvironmentError("All lockfile _meta values must match")

if lockfile_meta["lockfile-version"] < 5:
    raise SpackEnvironmentError("The lockfile format must be at version 5 or higher")
# Copy unique root specs from env
self.included_concrete_spec_data[env_path] = {"roots": []}
for root_dict in lockfile_as_dict["roots"]:
if root_dict["hash"] not in root_hash_seen:
self.included_concrete_spec_data[env_path]["roots"].append(root_dict)
root_hash_seen.add(root_dict["hash"])
# Copy unique concrete specs from env
for spec_hash, node_dict in lockfile_as_dict["concrete_specs"].items():
    if spec_hash not in concrete_hash_seen:
        self.included_concrete_spec_data[env_path].setdefault(
            "concrete_specs", {}
        )[spec_hash] = node_dict
        concrete_hash_seen.add(spec_hash)
if "include_concrete" in lockfile_as_dict.keys():
self.included_concrete_spec_data[env_path]["include_concrete"] = lockfile_as_dict[
"include_concrete"
]
self._read_lockfile_dict(self._to_lockfile_dict())
self.write()
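For context, the manifest side of this feature is sketched below; the paths are hypothetical, but include_concrete is the same key this method and _process_concrete_includes read:

# spack.yaml of the combining environment (hypothetical paths):
#
#   spack:
#     specs: [zlib]
#     include_concrete:
#       - /envs/base
#       - /envs/tools
#
# On concretization, the roots and concrete_specs from each included
# environment's lockfile are copied under "include_concrete" in this
# environment's own lockfile, as implemented above.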
def destroy(self):
"""Remove this environment from Spack entirely."""
shutil.rmtree(self.path)
@@ -1233,6 +1435,10 @@ def concretize(self, force=False, tests=False):
for spec in set(self.concretized_user_specs) - set(self.user_specs):
self.deconcretize(spec, concrete=False)
# If this is a combined environment, refresh the included concrete envs so updated specs are picked up
if self.included_concrete_envs:
self.include_concrete_envs()
# Pick the right concretization strategy
if self.unify == "when_possible":
return self._concretize_together_where_possible(tests=tests)
@@ -1416,7 +1622,7 @@ def _concretize_separately(self, tests=False):
# Ensure we don't try to bootstrap clingo in parallel
if spack.config.get("config:concretizer", "clingo") == "clingo":
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -1427,7 +1633,7 @@ def _concretize_separately(self, tests=False):
# Ensure we have compilers in compilers.yaml to avoid that
# processes try to write the config file in parallel
_ = spack.compilers.get_compiler_config(init_config=True)
_ = spack.compilers.get_compiler_config(spack.config.CONFIG, init_config=True)
# Early return if there is nothing to do
if len(args) == 0:
@@ -1705,8 +1911,14 @@ def _partition_roots_by_install_status(self):
of per spec."""
installed, uninstalled = [], []
with spack.store.STORE.db.read_transaction():
for concretized_hash in self.concretized_order:
spec = self.specs_by_hash[concretized_hash]
for concretized_hash in self.all_concretized_orders():
if concretized_hash in self.specs_by_hash:
spec = self.specs_by_hash[concretized_hash]
else:
for spec_map in self.included_specs_by_hash.values():
    if concretized_hash in spec_map:
        spec = spec_map[concretized_hash]
        break
if not spec.installed or (
spec.satisfies("dev_path=*") or spec.satisfies("^dev_path=*")
):
@@ -1786,8 +1998,14 @@ def added_specs(self):
def concretized_specs(self):
"""Tuples of (user spec, concrete spec) for all concrete specs."""
for s, h in zip(self.concretized_user_specs, self.concretized_order):
yield (s, self.specs_by_hash[h])
for s, h in zip(self.all_concretized_user_specs(), self.all_concretized_orders()):
if h in self.specs_by_hash:
yield (s, self.specs_by_hash[h])
else:
for spec_map in self.included_specs_by_hash.values():
    if h in spec_map:
        yield (s, spec_map[h])
        break
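A short usage sketch of the generator above, assuming env is an already concretized Environment; each yielded pair is a user spec plus its concrete counterpart, whether the concrete spec lives in this environment or an included one:

for user_spec, concrete in env.concretized_specs():
    print(f"{user_spec} -> {concrete.dag_hash()[:7]}")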
def concrete_roots(self):
"""Same as concretized_specs, except it returns the list of concrete
@@ -1916,8 +2134,7 @@ def _get_environment_specs(self, recurse_dependencies=True):
If these specs appear under different user_specs, only one copy
is added to the list returned.
"""
specs = [self.specs_by_hash[h] for h in self.concretized_order]
specs = [self.specs_by_hash[h] for h in self.all_concretized_orders()]
if recurse_dependencies:
specs.extend(
traverse.traverse_nodes(
@@ -1962,31 +2179,76 @@ def _to_lockfile_dict(self):
"concrete_specs": concrete_specs,
}
if self.included_concrete_envs:
data[included_concrete_name] = self.included_concrete_spec_data
return data
def _read_lockfile(self, file_or_json):
"""Read a lockfile from a file or from a raw string."""
lockfile_dict = sjson.load(file_or_json)
self._read_lockfile_dict(lockfile_dict)
return lockfile_dict["_meta"]["lockfile-version"]
return lockfile_dict
def set_included_concretized_user_specs(
self,
env_name: str,
env_info: Dict[str, Dict[str, Any]],
included_json_specs_by_hash: Dict[str, Dict[str, Any]],
) -> Dict[str, Dict[str, Any]]:
"""Sets all of the concretized user specs from included environments
to include those from nested included environments.
Args:
env_name: the name (technically the path) of the included environment
env_info: included concrete environment data
included_json_specs_by_hash: concrete spec data keyed by hash
Returns: updated specs_by_hash
"""
self.included_concretized_order[env_name] = []
self.included_concretized_user_specs[env_name] = []
def add_specs(name, info, specs_by_hash):
# Add specs from the environment as well as any of its nested
# environments.
for root_info in info["roots"]:
self.included_concretized_order[name].append(root_info["hash"])
self.included_concretized_user_specs[name].append(Spec(root_info["spec"]))
if "concrete_specs" in info:
specs_by_hash.update(info["concrete_specs"])
if included_concrete_name in info:
for included_name, included_info in info[included_concrete_name].items():
if included_name not in self.included_concretized_order:
self.included_concretized_order[included_name] = []
self.included_concretized_user_specs[included_name] = []
add_specs(included_name, included_info, specs_by_hash)
add_specs(env_name, env_info, included_json_specs_by_hash)
return included_json_specs_by_hash
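The bookkeeping is easiest to see on the toy data from the included_user_specs example; calling this method for the outer environment populates entries for the nested one as well (hypothetical values):

# After set_included_concretized_user_specs("/envs/outer", env_info, {}):
#
#   self.included_concretized_order == {
#       "/envs/outer": ["aaa111"],
#       "/envs/inner": ["bbb222"],
#   }
#   self.included_concretized_user_specs == {
#       "/envs/outer": [Spec("zlib")],
#       "/envs/inner": [Spec("bzip2")],
#   }
#
# The returned dict accumulates every "concrete_specs" entry encountered.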
def _read_lockfile_dict(self, d):
"""Read a lockfile dictionary into this environment."""
self.specs_by_hash = {}
self.included_specs_by_hash = {}
self.included_concretized_user_specs = {}
self.included_concretized_order = {}
roots = d["roots"]
self.concretized_user_specs = [Spec(r["spec"]) for r in roots]
self.concretized_order = [r["hash"] for r in roots]
json_specs_by_hash = d["concrete_specs"]
included_json_specs_by_hash = {}
# Track specs by their lockfile key. Currently spack uses the finest
# grained hash as the lockfile key, while older formats used the build
# hash or a previous incarnation of the DAG hash (one that did not
# include build deps or package hash).
specs_by_hash = {}
if included_concrete_name in d:
for env_name, env_info in d[included_concrete_name].items():
included_json_specs_by_hash.update(
self.set_included_concretized_user_specs(
env_name, env_info, included_json_specs_by_hash
)
)
# Track specs by their DAG hash, allows handling DAG hash collisions
first_seen = {}
current_lockfile_format = d["_meta"]["lockfile-version"]
try:
reader = READER_CLS[current_lockfile_format]
@@ -1999,6 +2261,39 @@ def _read_lockfile_dict(self, d):
msg += " You need to use a newer Spack version."
raise SpackEnvironmentError(msg)
first_seen, self.concretized_order = self.filter_specs(
reader, json_specs_by_hash, self.concretized_order
)
for spec_dag_hash in self.concretized_order:
self.specs_by_hash[spec_dag_hash] = first_seen[spec_dag_hash]
if any(self.included_concretized_order.values()):
first_seen = {}
for env_name, concretized_order in self.included_concretized_order.items():
filtered_spec, self.included_concretized_order[env_name] = self.filter_specs(
reader, included_json_specs_by_hash, concretized_order
)
first_seen.update(filtered_spec)
for env_path, spec_hashes in self.included_concretized_order.items():
self.included_specs_by_hash[env_path] = {}
for spec_dag_hash in spec_hashes:
    self.included_specs_by_hash[env_path][spec_dag_hash] = first_seen[spec_dag_hash]
def filter_specs(self, reader, json_specs_by_hash, order_concretized):
    """Map lockfile keys to concrete Specs and normalize the given order
    to modern DAG hashes. Returns (first_seen, normalized order)."""
# Track specs by their lockfile key. Currently spack uses the finest
# grained hash as the lockfile key, while older formats used the build
# hash or a previous incarnation of the DAG hash (one that did not
# include build deps or package hash).
specs_by_hash = {}
# Track specs by their DAG hash, allows handling DAG hash collisions
first_seen = {}
# First pass: Put each spec in the map ignoring dependencies
for lockfile_key, node_dict in json_specs_by_hash.items():
spec = reader.from_node_dict(node_dict)
@@ -2021,7 +2316,8 @@ def _read_lockfile_dict(self, d):
# keep. This is only required as long as we support older lockfile
# formats where the mapping from DAG hash to lockfile key is possibly
# one-to-many.
for lockfile_key in self.concretized_order:
for lockfile_key in order_concretized:
for s in specs_by_hash[lockfile_key].traverse():
if s.dag_hash() not in first_seen:
first_seen[s.dag_hash()] = s
@@ -2029,12 +2325,10 @@ def _read_lockfile_dict(self, d):
# Now make sure concretized_order and our internal specs dict
# contains the keys used by modern spack (i.e. the dag_hash
# that includes build deps and package hash).
self.concretized_order = [
specs_by_hash[h_key].dag_hash() for h_key in self.concretized_order
]
for spec_dag_hash in self.concretized_order:
self.specs_by_hash[spec_dag_hash] = first_seen[spec_dag_hash]
order_concretized = [specs_by_hash[h_key].dag_hash() for h_key in order_concretized]
return first_seen, order_concretized
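Sketched with a hypothetical old-format lockfile, the two passes normalize keys like this:

# json_specs_by_hash = {"<build-hash>": <node dict for zlib>}   # old key format
# order_concretized  = ["<build-hash>"]
#
# After filter_specs(reader, json_specs_by_hash, order_concretized):
#   order_concretized == [<zlib dag_hash>]
#   first_seen maps that DAG hash (and those of all traversed dependencies)
#   to the corresponding Spec, so callers always see modern DAG hashes
#   regardless of the lockfile's original keys.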
def write(self, regenerate: bool = True) -> None:
"""Writes an in-memory environment to its location on disk.
@@ -2047,7 +2341,7 @@ def write(self, regenerate: bool = True) -> None:
regenerate: regenerate views and run post-write hooks as well as writing if True.
"""
self.manifest_uptodate_or_warn()
if self.specs_by_hash:
if self.specs_by_hash or self.included_concrete_envs:
self.ensure_env_directory_exists(dot_env=True)
self.update_environment_repository()
self.manifest.flush()
@@ -2463,6 +2757,10 @@ def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
self.scope_name = f"env:{environment_name(self.manifest_dir)}"
self.config_stage_dir = os.path.join(env_subdir_path(manifest_dir), "config")
#: Configuration scopes associated with this environment. Note that these are not
#: invalidated by a re-read of the manifest file.
self._config_scopes: Optional[List[spack.config.ConfigScope]] = None
if not self.manifest_file.exists():
msg = f"cannot find '{manifest_name}' in {self.manifest_dir}"
raise SpackEnvironmentError(msg)
@@ -2542,6 +2840,19 @@ def override_user_spec(self, user_spec: str, idx: int) -> None:
raise SpackEnvironmentError(msg) from e
self.changed = True
def set_include_concrete(self, include_concrete: List[str]) -> None:
"""Sets the included concrete environments in the manifest to the value(s) passed as input.
Args:
include_concrete: list of already existing concrete environments to include
"""
self.pristine_configuration[included_concrete_name] = list(include_concrete)
self.changed = True
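A minimal usage sketch with hypothetical paths; flush() is the same persistence step used elsewhere in this diff:

manifest.set_include_concrete(["/envs/base", "/envs/tools"])
manifest.flush()  # write the updated spack.yaml back to disk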
def add_definition(self, user_spec: str, list_name: str) -> None:
"""Appends a user spec to the first active definition matching the name passed as argument.
@@ -2725,54 +3036,56 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
    config_path = urllib.request.url2pathname(include_url.path)

# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
    # Stage any remote configuration file(s)
    staged_configs = (
        os.listdir(self.config_stage_dir)
        if os.path.exists(self.config_stage_dir)
        else []
    )
    remote_path = urllib.request.url2pathname(include_url.path)
    basename = os.path.basename(remote_path)
    if basename in staged_configs:
        # Do NOT re-stage configuration files over existing
        # ones with the same name since there is a risk of
        # losing changes (e.g., from 'spack config update').
        tty.warn(
            "Will not re-stage configuration from {0} to avoid "
            "losing changes to the already staged file of the "
            "same name.".format(remote_path)
        )

        # Recognize the configuration stage directory
        # is flattened to ensure a single copy of each
        # configuration file.
        config_path = self.config_stage_dir
        if basename.endswith(".yaml"):
            config_path = os.path.join(config_path, basename)
    else:
        staged_path = spack.config.fetch_remote_configs(
            config_path, str(self.config_stage_dir), skip_existing=True
        )
        if not staged_path:
            raise SpackEnvironmentError(
                "Unable to fetch remote configuration {0}".format(config_path)
            )
        config_path = staged_path

elif include_url.scheme:
    raise ValueError(
        f"Unsupported URL scheme ({include_url.scheme}) for "
        f"environment include: {config_path}"
    )

# If scheme is not valid, config_path is not a url
# of a type Spack is generally aware
if spack.util.url.validate_scheme(include_url.scheme):
    # Transform file:// URLs to direct includes.
    if include_url.scheme == "file":
        config_path = urllib.request.url2pathname(include_url.path)

    # Any other URL should be fetched.
    elif include_url.scheme in ("http", "https", "ftp"):
        # Stage any remote configuration file(s)
        staged_configs = (
            os.listdir(self.config_stage_dir)
            if os.path.exists(self.config_stage_dir)
            else []
        )
        remote_path = urllib.request.url2pathname(include_url.path)
        basename = os.path.basename(remote_path)
        if basename in staged_configs:
            # Do NOT re-stage configuration files over existing
            # ones with the same name since there is a risk of
            # losing changes (e.g., from 'spack config update').
            tty.warn(
                "Will not re-stage configuration from {0} to avoid "
                "losing changes to the already staged file of the "
                "same name.".format(remote_path)
            )

            # Recognize the configuration stage directory
            # is flattened to ensure a single copy of each
            # configuration file.
            config_path = self.config_stage_dir
            if basename.endswith(".yaml"):
                config_path = os.path.join(config_path, basename)
        else:
            staged_path = spack.config.fetch_remote_configs(
                config_path, str(self.config_stage_dir), skip_existing=True
            )
            if not staged_path:
                raise SpackEnvironmentError(
                    "Unable to fetch remote configuration {0}".format(config_path)
                )
            config_path = staged_path

    elif include_url.scheme:
        raise ValueError(
            f"Unsupported URL scheme ({include_url.scheme}) for "
            f"environment include: {config_path}"
        )
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):
@@ -2808,16 +3121,19 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
@property
def env_config_scopes(self) -> List[spack.config.ConfigScope]:
"""A list of all configuration scopes for the environment manifest.
Returns: All configuration scopes associated with the environment
"""
config_name = self.scope_name
env_scope = spack.config.SingleFileScope(
config_name, str(self.manifest_file), spack.schema.env.schema, [TOP_LEVEL_KEY]
)
return check_disallowed_env_config_mods(self.included_config_scopes + [env_scope])
"""A list of all configuration scopes for the environment manifest. On the first call this
instantiates all the scopes, on subsequent calls it returns the cached list."""
if self._config_scopes is not None:
return self._config_scopes
scopes: List[spack.config.ConfigScope] = [
*self.included_config_scopes,
spack.config.SingleFileScope(
self.scope_name, str(self.manifest_file), spack.schema.env.schema, [TOP_LEVEL_KEY]
),
]
ensure_no_disallowed_env_config_mods(scopes)
self._config_scopes = scopes
return scopes
def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""

View File

@@ -662,9 +662,6 @@ def add_specs(self, *specs: spack.spec.Spec) -> None:
return
# Drop externals
for s in specs:
if s.external:
tty.warn("Skipping external package: " + s.short_spec)
specs = [s for s in specs if not s.external]
self._sanity_check_view_projection(specs)

View File

@@ -0,0 +1,31 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.tty as tty
import spack.binary_distribution as bindist
import spack.mirror
def post_install(spec, explicit):
# Push package to all buildcaches with autopush==True
# Do nothing if spec is an external package
if spec.external:
return
# Do nothing if package was not installed from source
pkg = spec.package
if pkg.installed_from_binary_cache:
return
# Push the package to all autopush mirrors
for mirror in spack.mirror.MirrorCollection(binary=True, autopush=True).values():
bindist.push_or_raise(
spec,
mirror.push_url,
bindist.PushOptions(force=True, regenerate_index=False, unsigned=not mirror.signed),
)
tty.msg(f"{spec.name}: Pushed to build cache: '{mirror.name}'")
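For context, a mirror opts in roughly as sketched below (the exact YAML is an assumption); the hook only iterates mirrors for which MirrorCollection(binary=True, autopush=True) returns an entry:

# mirrors.yaml (hypothetical values):
#
#   mirrors:
#     local-cache:
#       url: file:///path/to/cache
#       autopush: true
#
# With this in place, every package built from source is pushed to
# 'local-cache' right after install, while installs that came from a
# binary cache or are external are skipped by the guards above.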

View File

@@ -0,0 +1,8 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
def post_install(spec, explicit=None):
spec.package.windows_establish_runtime_linkage()

View File

@@ -119,7 +119,7 @@ def __init__(self, pkg_count: int):
self.pkg_ids: Set[str] = set()
def next_pkg(self, pkg: "spack.package_base.PackageBase"):
pkg_id = package_id(pkg)
pkg_id = package_id(pkg.spec)
if pkg_id not in self.pkg_ids:
self.pkg_num += 1
@@ -221,12 +221,12 @@ def _handle_external_and_upstream(pkg: "spack.package_base.PackageBase", explici
# consists in module file generation and registration in the DB.
if pkg.spec.external:
_process_external_package(pkg, explicit)
_print_installed_pkg(f"{pkg.prefix} (external {package_id(pkg)})")
_print_installed_pkg(f"{pkg.prefix} (external {package_id(pkg.spec)})")
return True
if pkg.spec.installed_upstream:
tty.verbose(
f"{package_id(pkg)} is installed in an upstream Spack instance at "
f"{package_id(pkg.spec)} is installed in an upstream Spack instance at "
f"{pkg.spec.prefix}"
)
_print_installed_pkg(pkg.prefix)
@@ -403,7 +403,7 @@ def _install_from_cache(
return False
t.stop()
pkg_id = package_id(pkg)
pkg_id = package_id(pkg.spec)
tty.debug(f"Successfully extracted {pkg_id} from binary cache")
_write_timer_json(pkg, t, True)
@@ -484,10 +484,14 @@ def _process_binary_cache_tarball(
if download_result is None:
return False
tty.msg(f"Extracting {package_id(pkg)} from binary cache")
tty.msg(f"Extracting {package_id(pkg.spec)} from binary cache")
with timer.measure("install"), spack.util.path.filter_padding():
binary_distribution.extract_tarball(pkg.spec, download_result, force=False, timer=timer)
pkg.windows_establish_runtime_linkage()
if hasattr(pkg, "_post_buildcache_install_hook"):
pkg._post_buildcache_install_hook()
pkg.installed_from_binary_cache = True
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
@@ -513,7 +517,7 @@ def _try_install_from_binary_cache(
if not spack.mirror.MirrorCollection(binary=True):
return False
tty.debug(f"Searching for binary cache of {package_id(pkg)}")
tty.debug(f"Searching for binary cache of {package_id(pkg.spec)}")
with timer.measure("search"):
matches = binary_distribution.get_mirrors_for_spec(pkg.spec, index_only=True)
@@ -610,7 +614,7 @@ def get_dependent_ids(spec: "spack.spec.Spec") -> List[str]:
Returns: list of package ids
"""
return [package_id(d.package) for d in spec.dependents()]
return [package_id(d) for d in spec.dependents()]
def install_msg(name: str, pid: int, install_status: InstallStatus) -> str:
@@ -720,7 +724,7 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
dump_packages(pkg.spec, packages_dir)
def package_id(pkg: "spack.package_base.PackageBase") -> str:
def package_id(spec: "spack.spec.Spec") -> str:
"""A "unique" package identifier for installation purposes
The identifier is used to track build tasks, locks, install, and
@@ -732,10 +736,10 @@ def package_id(pkg: "spack.package_base.PackageBase") -> str:
Args:
pkg: the package from which the identifier is derived
"""
if not pkg.spec.concrete:
if not spec.concrete:
raise ValueError("Cannot provide a unique, readable id when the spec is not concretized.")
return f"{pkg.name}-{pkg.version}-{pkg.spec.dag_hash()}"
return f"{spec.name}-{spec.version}-{spec.dag_hash()}"
class BuildRequest:
@@ -765,7 +769,7 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
self.pkg.last_phase = install_args.pop("stop_at", None) # type: ignore[attr-defined]
# Cache the package id for convenience
self.pkg_id = package_id(pkg)
self.pkg_id = package_id(pkg.spec)
# Save off the original install arguments plus standard defaults
# since they apply to the requested package *and* dependencies.
@@ -780,9 +784,9 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
# are not able to return full dependents for all packages across
# environment specs.
self.dependencies = set(
package_id(d.package)
package_id(d)
for d in self.pkg.spec.dependencies(deptype=self.get_depflags(self.pkg))
if package_id(d.package) != self.pkg_id
if package_id(d) != self.pkg_id
)
def __repr__(self) -> str:
@@ -832,7 +836,7 @@ def get_depflags(self, pkg: "spack.package_base.PackageBase") -> int:
depflag = dt.LINK | dt.RUN
include_build_deps = self.install_args.get("include_build_deps")
if self.pkg_id == package_id(pkg):
if self.pkg_id == package_id(pkg.spec):
cache_only = self.install_args.get("package_cache_only")
else:
cache_only = self.install_args.get("dependencies_cache_only")
@@ -927,7 +931,7 @@ def __init__(
raise ValueError(f"{self.pkg.name} must have a concrete spec")
# The "unique" identifier for the task's package
self.pkg_id = package_id(self.pkg)
self.pkg_id = package_id(self.pkg.spec)
# The explicit build request associated with the package
if not isinstance(request, BuildRequest):
@@ -965,9 +969,9 @@ def __init__(
# if use traverse for transitive dependencies, then must remove
# transitive dependents on failure.
self.dependencies = set(
package_id(d.package)
package_id(d)
for d in self.pkg.spec.dependencies(deptype=self.request.get_depflags(self.pkg))
if package_id(d.package) != self.pkg_id
if package_id(d) != self.pkg_id
)
# Handle bootstrapped compiler
@@ -976,14 +980,18 @@ def __init__(
# a dependency of the build task. Here we add it to self.dependencies
compiler_spec = self.pkg.spec.compiler
arch_spec = self.pkg.spec.architecture
if not spack.compilers.compilers_for_spec(compiler_spec, arch_spec=arch_spec):
strict = spack.concretize.Concretizer().check_for_compiler_existence
if (
not spack.compilers.compilers_for_spec(compiler_spec, arch_spec=arch_spec)
and not strict
):
# The compiler is in the queue, identify it as dependency
dep = spack.compilers.pkg_spec_for_compiler(compiler_spec)
dep.constrain(f"platform={str(arch_spec.platform)}")
dep.constrain(f"os={str(arch_spec.os)}")
dep.constrain(f"target={arch_spec.target.microarchitecture.family.name}:")
dep.concretize()
dep_id = package_id(dep.package)
dep_id = package_id(dep)
self.dependencies.add(dep_id)
# List of uninstalled dependencies, which is used to establish
@@ -1194,7 +1202,7 @@ def _add_bootstrap_compilers(
"""
packages = _packages_needed_to_bootstrap_compiler(compiler, architecture, pkgs)
for comp_pkg, is_compiler in packages:
pkgid = package_id(comp_pkg)
pkgid = package_id(comp_pkg.spec)
if pkgid not in self.build_tasks:
self._add_init_task(comp_pkg, request, is_compiler, all_deps)
elif is_compiler:
@@ -1241,7 +1249,7 @@ def _add_init_task(
"""
task = BuildTask(pkg, request, is_compiler, 0, 0, STATUS_ADDED, self.installed)
for dep_id in task.dependencies:
all_deps[dep_id].add(package_id(pkg))
all_deps[dep_id].add(package_id(pkg.spec))
self._push_task(task)
@@ -1276,7 +1284,7 @@ def _check_deps_status(self, request: BuildRequest) -> None:
err = "Cannot proceed with {0}: {1}"
for dep in request.traverse_dependencies():
dep_pkg = dep.package
dep_id = package_id(dep_pkg)
dep_id = package_id(dep)
# Check for failure since a prefix lock is not required
if spack.store.STORE.failure_tracker.has_failed(dep):
@@ -1409,7 +1417,7 @@ def _cleanup_task(self, pkg: "spack.package_base.PackageBase") -> None:
Args:
pkg: the package being installed
"""
self._remove_task(package_id(pkg))
self._remove_task(package_id(pkg.spec))
# Ensure we have a read lock to prevent others from uninstalling the
# spec during our installation.
@@ -1423,7 +1431,7 @@ def _ensure_install_ready(self, pkg: "spack.package_base.PackageBase") -> None:
Args:
pkg: the package being locally installed
"""
pkg_id = package_id(pkg)
pkg_id = package_id(pkg.spec)
pre = f"{pkg_id} cannot be installed locally:"
# External packages cannot be installed locally.
@@ -1465,7 +1473,7 @@ def _ensure_locked(
"write",
], f'"{lock_type}" is not a supported package management lock type'
pkg_id = package_id(pkg)
pkg_id = package_id(pkg.spec)
ltype, lock = self.locks.get(pkg_id, (lock_type, None))
if lock and ltype == lock_type:
return ltype, lock
@@ -1601,7 +1609,7 @@ def _add_tasks(self, request: BuildRequest, all_deps):
for dep in request.traverse_dependencies():
dep_pkg = dep.package
dep_id = package_id(dep_pkg)
dep_id = package_id(dep)
if dep_id not in self.build_tasks:
self._add_init_task(dep_pkg, request, False, all_deps)
@@ -1691,10 +1699,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
spack.package_base.PackageBase._verbose = spack.build_environment.start_build_process(
pkg, build_process, install_args
)
# Currently this is how RPATH-like behavior is achieved on Windows, after install
# establish runtime linkage via Windows Runtime link object
# Note: this is a no-op on non Windows platforms
pkg.windows_establish_runtime_linkage()
# Note: PARENT of the build process adds the new package to
# the database, so that we don't need to re-read from file.
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
@@ -1913,7 +1917,7 @@ def _flag_installed(
dependent_ids: set of the package's dependent ids, or None if the dependent ids are
limited to those maintained in the package (dependency DAG)
"""
pkg_id = package_id(pkg)
pkg_id = package_id(pkg.spec)
if pkg_id in self.installed:
# Already determined the package has been installed
@@ -2309,7 +2313,7 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
# info/debug information
self.pre = _log_prefix(pkg.name)
self.pkg_id = package_id(pkg)
self.pkg_id = package_id(pkg.spec)
def run(self) -> bool:
"""Main entry point from ``build_process`` to kick off install in child."""

Some files were not shown because too many files have changed in this diff.