Compare commits


202 Commits

Author SHA1 Message Date
Peter Scheibel
c899dcac5b style fix 2024-05-23 22:25:25 -07:00
Peter Scheibel
94dc25ecfa add test 2024-05-23 17:14:00 -07:00
Peter Scheibel
aded859856 Merge branch 'develop' into bugfix/invalid-compiler-warning 2024-05-23 16:30:54 -07:00
Alex Richert
f7b9c30456 Add develop version to ufs-weather-model (major updates) (#39265)
* Add develop version to ufs-weather-model (major updates)

* Update ufs-weather-model maintainers

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update ufs-weather-model defaults and fms dep

* Update package.py

* Update package.py

* Update package.py
2024-05-22 11:33:17 -06:00
Harmen Stoppels
884620a38a gcc: use -rpath {rpath_dir} not -rpath={rpath_dir} (#44315)
to make macOS's linker happy.
2024-05-22 17:36:42 +02:00
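For illustration, a minimal Python sketch of the flag-spelling difference (the helper name is hypothetical, not Spack's code): GNU ld accepts both the joined `-rpath=<dir>` form and the two-token `-rpath <dir>` form, while macOS's ld64 only understands the latter.

```python
def rpath_args(rpath_dirs):
    """Hypothetical helper: emit rpath flags in the spelling both linkers accept."""
    args = []
    for rpath_dir in rpath_dirs:
        # Two separate tokens: GNU ld and macOS's ld64 both accept this;
        # the joined "-rpath={dir}" spelling is rejected by ld64.
        args.extend(["-rpath", rpath_dir])
    return args

print(rpath_args(["/opt/spack/lib"]))  # ['-rpath', '/opt/spack/lib']
```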
pauleonix
7503a41773 cuda: add v12.4.1 (#43488) 2024-05-22 10:50:56 +02:00
Matt Thompson
9a5fc6b4a3 mapl: add v2.46 (#44230) 2024-05-22 10:22:57 +02:00
Cyrus Harrison
a31aeed167 conduit: add v0.9.2 (#44308) 2024-05-22 10:05:40 +02:00
Tamara Dahlgren
71f542a951 kcov: convert to new stand-alone test process (tested with latest version) (#44309) 2024-05-22 10:04:27 +02:00
Carlos Bederián
322bd48788 gcc: add v13.3.0 (#44306) 2024-05-22 09:44:54 +02:00
Wouter Deconinck
b752fa59d4 qt: add version 5.15.13 (#44310) 2024-05-22 09:44:35 +02:00
Wouter Deconinck
d53e4cc426 rsync: add v3.3.0 (#44311) 2024-05-22 09:36:16 +02:00
Tom Scogland
ee4b7fa3a1 flux-sched: add docs variant to match core, fix ver (#44277) 2024-05-22 00:55:57 -06:00
Harmen Stoppels
d6f02c86d9 grpc: add 1.61 to 1.64 (#44297) 2024-05-22 08:04:35 +02:00
John W. Parent
62efde8e3c cmake/add-3.29 (#43349) 2024-05-21 23:35:54 -06:00
Harmen Stoppels
bda1d94d49 bison: add missing diffutils build dep (#44296) 2024-05-21 19:33:29 -06:00
Massimiliano Culpo
3f472039c5 Take a lock before querying installed_dependents (#44301)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-21 13:16:04 -06:00
Robert Underwood
912ef34206 get cupy compiling with latest cuda (#44266)
* get cupy compiling with latest cuda
* fix other cupy-deps
* fix version bounds

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-05-21 10:44:18 -07:00
Massimiliano Culpo
9c88a48a73 Remove mesa18 and libosmesa (#44264)
* Remove mesa18 and libosmesa

mesa18 was introduced in #19528 as a way to maintain the old
autotools build of mesa separate from the new meson build.

We could add a second build system to mesa, but since mesa18 has
been deprecated for a long time, we'll just remove it.

libosmesa was used to multiplex the gl provider between mesa18
and mesa, and is thus unnecessary. Remove it to reduce complexity
in the graphical stack.

* Remove references to mesa18 and libosmesa

* vtk: rework dependency on gl and osmesa

* memsurfer: rework dependency on vtk

* visit: minimal fix to avoid having both osmesa and glx
2024-05-21 10:18:14 -05:00
Dan Lipsa
4bf5cc9a9a Paraview on Windows: Use hdf5::hdf5 (#44161)
* Use hdf5::hdf5 on Windows from Paraview CMake
   This patch is already applied on VTK 9 or greater.
* Add comments stating that vtk and paraview patches are the same and should be modified in concert.
2024-05-21 11:05:32 -04:00
Chris Marsh
08834e2b03 Add homebrew gcc@14.1.0 patch for aarch64 darwin (#44280) 2024-05-21 15:31:36 +02:00
Massimiliano Culpo
8020a111df Demote a warning to debug message, if C compiler is not there (#44182) 2024-05-21 14:09:29 +02:00
dependabot[bot]
86fb547f7c bump pytest from 8.2.0 to 8.2.1 (#44282)
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 13:42:22 +02:00
Robert Underwood
b9556c7c44 Additional packages for devtools-manylinux (#44273)
Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-05-21 13:37:04 +02:00
dependabot[bot]
7bdb106b1b bump codecov/codecov-action (#44281)
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 13:15:05 +02:00
kenche-linaro
2b191cd7f4 linaro-forge: added 24.0 version (#44292) 2024-05-21 05:13:42 -06:00
Federico Ficarelli
774f0a4e60 grpc: remove a maintainer (#44294) 2024-05-21 12:58:26 +02:00
Keita Iwabuchi
faf11efa72 Metall: add v0.26, v0.27, and v0.28 (#44284)
Co-authored-by: Keita Iwabuchi <iwabuchi1@lln.gov>
2024-05-21 12:50:44 +02:00
Massimiliano Culpo
5a99142b41 Cleanup of Apple OpenGL shim packages (#44269)
- Remove duplicated code
- Use BundlePackage as a base class
2024-05-21 12:49:18 +02:00
Harmen Stoppels
a3aca0242a grpc: forward compat bound abseil (#44290) 2024-05-21 11:48:59 +02:00
Massimiliano Culpo
72f276fab3 ASP-based solver: fix version optimization for roots (#44272)
This fixes a bug occurring when two root specs need to select
old versions, and these versions have the same penalty in the
optimization. This sometimes caused an older version to be
preferred to a more recent one.

The issue was the omission of `PackageNode` in the optimization
tuple.
2024-05-21 08:41:09 +02:00
Philipp Edelmann
21139945df rayleigh: new package (#38338)
* rayleigh: new package

* Update var/spack/repos/builtin/packages/rayleigh/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* split edit into three methods

* add comments to clarify use of configure

* rayleigh: copyright year

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-20 19:23:02 -06:00
George Young
900bd2f477 seqkit: add 2.4.0, switching to 'go build' install mechanism (#38192)
* seqkit: add 2.4.0

* seqkit: add 2.4.0, switching to 'go build' install mechanism

* seqkit: add 2.4.0, switching to 'go build' install mechanism

* seqkit: update to @2.6.1, drop deprecated version and build system

* seqkit: add @2.7.0, convert to GoPackage

* tidying

* tidying

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-05-20 18:08:57 -07:00
Scott Wittenburg
29d4a5af44 gitlab ci: fix untouched spec pruning on windows (#44279)
Use correct path separator in get_all_package_diffs for all platforms.
Ensures correct package change computation on Windows when pruning unchanged specs in Gitlab CI
2024-05-21 00:56:48 +00:00
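A minimal sketch of the idea (the helper is hypothetical, not the actual get_all_package_diffs): git emits forward-slash paths on every platform, so package names must be extracted by splitting on "/" rather than os.sep, which is "\\" on Windows.

```python
def changed_packages(diff_paths):
    """Hypothetical helper: map git-diff paths to Spack package names."""
    pkgs = set()
    for path in diff_paths:
        parts = path.split("/")  # git paths always use "/"; os.sep is "\\" on Windows
        if "packages" in parts[:-1]:
            pkgs.add(parts[parts.index("packages") + 1])
    return pkgs

print(changed_packages(["var/spack/repos/builtin/packages/zlib/package.py"]))  # {'zlib'}
```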
dependabot[bot]
dd9b7ed6a7 build(deps): bump types-six in /.github/workflows/style (#44167)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.9 to 1.16.21.20240513.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-20 17:41:42 -07:00
Wouter Deconinck
09ff74be62 libsigsegv: fix patch filename extension 2024-05-21 02:12:24 +02:00
Robert Underwood
a94ebfea11 libpressio update (#44076)
* libpressio update
* fix typos in libpressio packages
* Addressed review feedback from @tldahlgren
* fix ci issues
* add missing package for SZx
* simplify variant logic and fix GPU deps
* Update var/spack/repos/builtin/packages/py-langsmith/package.py

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-20 18:12:00 -06:00
Paul R. C. Kent
8f5fe1d123 Add llvm v18.1.6 (#44271) 2024-05-20 16:39:43 -07:00
Loris Ercole
d4fb58efa3 Update, fix serenity & serenity-libint (#43816)
* Update, fix serenity & serenity-libint
   Version 1.6.1 of `serenity` is added, some dependencies are made more
   explicit, some options are improved or fixed.
   The url of `serenity-libint` is fixed. The old url is not reachable
   anymore.
* Use upstream patch, modify cmake patch
* Update var/spack/repos/builtin/packages/serenity/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-20 16:07:25 -07:00
Tony Weaver
ce900346cc heimdall: Astronomy software package (#38328)
* heimdall: Astronomy software package.  Requires dedisp and psrdada (included as part of this commit)

* Updated packages to align with Spack's style

Minor updates based on wdconinc's comments regarding Spack's style guide

* [@spackbot] updating style on behalf of aweaver1fandm

* Minor edits to fix copyright year and dedisp install

Fixed copyright year to be 2024 instead of 2023

Removed the overridden version of install and created a preinstall function to create the missing lib and inc directories, therefore allowing the default install to run

Here is the output from the spack install of heimdall showing a successful build with CUDA. If necessary, I can do a spack clean and a fresh install.
./spack install heimdall +cuda cuda_arch=86
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libiconv-1.17-enpmbhsi3kztebwmpclpub2afhlbr3gy
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/xz-5.4.1-pte76kujkezxb3laqse3o4sctlbygsaw
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/zlib-1.2.13-utlfo5ltxz5v5bckirn5v3amtbxjdvwh
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/bzip2-1.0.8-x6navz7ucgfnb5xq7aelqmgd4zxsz5bs
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libmd-1.0.4-ncomhrodpdul4dm64o6b7426fhmc2u64
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/ncurses-6.4-c77h34rooycbzapxjvc27sg5td5jiwyb
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/zstd-1.5.5-eporpybumydxveg5rwtfzysrsu4eqzcv
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libffi-3.4.4-vskntokaojclfqxjfzkbyirkeogddbpx
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libxcrypt-4.4.33-gtpn32p6mxztul3c3dxzqj7gvcyh555j
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/util-linux-uuid-2.38.1-g322a5peqjaad6gl5q64cdu4qo7kvw6o
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libxml2-2.10.3-pza3kz2mtbncbbeim6rejfqkgftnf4rz
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/openssl-1.1.1t-adbquvgg4qpc3vq6jynf44qzq3gfwrv5
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/pigz-2.7-e7mxj4ya2u4a6zb4hu64g7docujmkxeb
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libbsd-0.11.7-wh4xleivbe7wndiqt5nsehzlfrccnjcg
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/readline-8.2-hhb647bwmbcj7iwpmtetbylninfm5rxf
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/cuda-11.7.1-osue2sx5rv7dgzhsmaemydpwhyribxng
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/tar-1.34-35f5gki2ycxmy5zd7zs5tsvp3xoszxum
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/expat-2.5.0-4gizyhhqklciuyrbyinq2tdggt73gds4
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/gdbm-1.23-w3uzihtubj2iwv6es55fis6nt2q5zwlr
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/sqlite-3.40.1-m4ntzvuupsnbtkdfbz7oqpbjdlaffp2a
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/dedisp-1.0-pq6r3jnyxq6jzoygz3fp2e6jc2ojpvap
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/gettext-0.21.1-hglzdeadmgkzjb76bmemt6dnulfkrpha
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/python-3.10.10-khu36qq4p2te7jf475ewr2h7egidekfl
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/psrdada-master-by4w6mrpcfnoihlhos2jcfo2roiyagaz
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/heimdall-1.0-ohtdnltuhhejysshcert25h6nmuvluqp

* [@spackbot] updating style on behalf of aweaver1fandm
2024-05-20 13:39:23 -06:00
Adam J. Stewart
7cb64e465f py-pytest: add v8.2.1 (#44267)
* py-pytest: add v8.2.1
* py-pluggy: add v1.5.0
2024-05-20 11:28:59 -07:00
Pramod Kumbhar
eb70c9f5b9 caliper: add new variant to support Intel Vtune (#44147) 2024-05-20 18:27:42 +02:00
Teague Sterling
a28405700e awscli-v2: add v2.15.53, and other updates (#44258)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-05-20 18:21:09 +02:00
Adam J. Stewart
f8f4d94d7a Deprecate py-cmake and py-ninja (#44257) 2024-05-20 18:05:06 +02:00
Matt Thompson
32dfb522d6 gh: add v2.49.2 (#44231) 2024-05-20 13:51:46 +02:00
Mark W. Krentel
c61c707aa5 hpctoolkit: restrict one patch to :2022 (#44268)
Restrict the hpcrun-fmt.txt patch to :2022. It's fixed in the code
after that, and in recent develop some code paths have moved, causing
this patch to fail.
2024-05-20 01:44:28 -06:00
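In recipe terms, the change amounts to adding a `when=` constraint to the patch directive, roughly as follows (the patch filename is illustrative):

```python
# Before: applied unconditionally, which fails on recent develop
# patch("hpcrun-fmt.patch")
# After: restricted to releases up to 2022.x, where the bug still exists
patch("hpcrun-fmt.patch", when="@:2022")
```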
Harmen Stoppels
60d10848c8 gettext: no link dep on tar (#44256) 2024-05-20 09:23:46 +02:00
Adam J. Stewart
dcd6b530f9 py-matplotlib: add v3.9.0 (#44225) 2024-05-20 09:22:30 +02:00
Teague Sterling
419f0742a0 htslib, STAR: add zlib-ng conflict (#44261) 2024-05-20 09:20:19 +02:00
Jacob King
c99174798b nimrod-aai: add version 24.2 and fix url (#42464)
SHAs changed due to reorganization of the GitLab.com repo
into an open subgroup.
2024-05-20 09:13:33 +02:00
Adam J. Stewart
8df2a4b511 py-gdown: add new package (#44265) 2024-05-20 09:09:35 +02:00
Benjamin Meyers
c174cf6830 Add openjdk@15.0.2 (#35936) 2024-05-19 10:27:26 -06:00
Wouter Deconinck
5eebd65366 audit: disallow github.com/org/repo/pull/n/commits/hash.patch?full_index=1 (#44212)
* audit: disallow github.com/org/repo/pull/n/commits/hash.patch?full_index=1

* [@spackbot] updating style on behalf of wdconinc

* audit: fix style

* audit: github.com/o/r/pull/n/commits/sha.patch -> sha.patch

* [@spackbot] updating style on behalf of wdconinc

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit 2ecec99238.

* Revert "audit: github.com/o/r/pull/n/commits/sha.patch -> sha.patch"

This reverts commit 5bd7da2cad.

* fix: modify audit message with suggested fix

* audit: github.com/o/r/pull/n/commits/sha.patch -> /o/r/commit/sha.patch?full_index=1

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-19 09:30:19 -05:00
Michael Kuhn
625f5323c0 py-pyqt5-sip: add 12.13.0 and fix build with gcc@14 (#44133) 2024-05-19 08:05:18 +02:00
Wouter Deconinck
e05a32cead gaudi: don't apply patch for 38.2 (#44252) 2024-05-18 23:13:46 -06:00
Juan Miguel Carceller
c69af5d1e5 podio: cleanup recipe, remove deprecated versions and patches (#44111)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-18 15:17:54 -06:00
Miranda Mundt
1ac2ee8043 Add Pyomo 6.7.2 (#44097) 2024-05-18 10:31:57 -05:00
Teague Sterling
36af1c1c73 perl-xml-libxml: add new versions and conflicts (fixes #44253) (#44254)
* Address #44253 by adding new versions and declaring conflicts for perl-xml-libxml

* [@spackbot] updating style on behalf of teaguesterling

* Update var/spack/repos/builtin/packages/perl-xml-libxml/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-18 09:28:32 -06:00
Carlos Bederián
e2fa087002 namd: add 3.0b7 (#44198) 2024-05-18 10:15:55 -05:00
Teague Sterling
df02bfbad2 Adding new spark versions (#44250)
* Adding new spark versions (in preparation for HAIL package)

* Adding myself as potential maintainer
2024-05-18 10:06:12 -05:00
Teague Sterling
fecb63843e yq: add new package (#44249) 2024-05-18 06:38:26 -06:00
Scott Wittenburg
b33e2d09d3 oci buildcache: handle pagination of tags (#43136)
This fixes an issue where ghcr, gitlab and possibly other container registries paginate tags by default, which violates the OCI spec v1.0, but is common practice (the spec was broken itself). After this commit, you can create build cache indices of > 100 specs on ghcr.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-18 11:57:53 +02:00
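A rough sketch of handling tag pagination (not Spack's implementation; authentication is omitted): keep requesting the /v2/&lt;name&gt;/tags/list endpoint and follow the RFC 5988 `Link` header that registries such as ghcr return.

```python
import json
import re
import urllib.request

def list_all_tags(registry: str, name: str):
    """Hypothetical helper: collect every tag, following `Link: <...>; rel="next"`."""
    tags, url = [], f"https://{registry}/v2/{name}/tags/list?n=100"
    while url:
        with urllib.request.urlopen(url) as resp:
            tags.extend(json.loads(resp.read())["tags"])
            next_page = re.search(r'<([^>]+)>;\s*rel="next"', resp.headers.get("Link", ""))
            url = f"https://{registry}{next_page.group(1)}" if next_page else None
    return tags
```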
Chris Green
f8054aa21a git: bump v2.39 to 2.45; deprecate unsafe versions (#44248) 2024-05-18 03:32:06 -06:00
Valentin Volkl
8f3a2acc54 whizard: add gosam variant (#43595)
* whizard: add gosam variant

* address comments, fix compiler wrapper issue
2024-05-18 10:11:26 +02:00
Kevin Huck
d1a20908b8 ZeroSum: add new package (#44228) 2024-05-18 09:23:45 +02:00
Derek Ryan Strong
dd781f7368 perl: add v5.36.3, v5.38.2; deprecate unsafe versions (#44186) 2024-05-18 09:21:03 +02:00
dmagdavector
9bcc43c4c1 protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
* protobuf: update hash for patch needed when="@3.4:3.21"

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-17 17:50:29 -06:00
Todd Gamblin
77c83af17d docs: remove warning about repositories and package extension (#44247)
Local package repositories are very well supported and we test them extensively, so this
warning from 8 years ago can be removed from the docs.
2024-05-17 22:03:57 +00:00
Michael Kuhn
574bd2db99 netlib-scalapack: fix build with gcc@14 (#44120) 2024-05-17 22:38:46 +02:00
James Taliaferro
a76f37da96 kakoune: add v2024.05.09 (#44124) 2024-05-17 22:36:58 +02:00
Adam J. Stewart
9e75f3ec0a GDAL: add v3.9.0 (#44128) 2024-05-17 22:35:33 +02:00
Derek Ryan Strong
4d42d45897 Add latest Python versions (#44130) 2024-05-17 22:35:06 +02:00
SXS Bot
a4b4bfda73 spectre: add v2024.05.11 (#44139)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-05-17 22:28:25 +02:00
Michael Kuhn
1bcdd3a57e py-cython: add 3.0.10 (#44140) 2024-05-17 22:27:42 +02:00
Stephen Sachs
297a3a1bc9 Add mpas-model and mpich to pcluster neoverse stack (#44151)
Should build now since https://github.com/spack/spack/pull/43547 has been merged.
2024-05-17 22:00:17 +02:00
Adam J. Stewart
8d01e8c978 JAX: add v0.4.28 (#44112) 2024-05-17 21:58:44 +02:00
Wouter Deconinck
6be28aa303 pythia8: patch latest 8.311 for upstream bug (#43803)
* pythia8: prefer 8.310

* [@spackbot] updating style on behalf of wdconinc

* pythia8: filter_file to remove sed n

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit e2a3decaffbd3f464d1bd992025e1812df49f088.

* Revert "pythia8: prefer 8.310"

This reverts commit 568cb056b87129085e245d9dbef1732ee1c6c0aa.

* [@spackbot] updating style on behalf of wdconinc

* pythia8: comment for fix

* pythia8: fix style

* pythia8: filter_file with raw string because of escaped pipe

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-17 21:31:05 +02:00
Jon Rood
5e38310515 nalu-wind: fix trilinos rocm dependency (#44233) 2024-05-17 13:00:24 -06:00
wspear
ddfed65485 tau: fix lib/include paths with oneapi (#44170) 2024-05-17 18:55:27 +02:00
dependabot[bot]
2a16d8bfa8 build(deps): bump codecov/codecov-action from 4.3.1 to 4.4.0 (#44195)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.1 to 4.4.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5ecb98a3c6...6d798873df)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 18:46:17 +02:00
Jonathon Anderson
6a40a50a29 hpcviewer: Update URLs to use GitLab release assets (#44129) 2024-05-17 18:43:13 +02:00
Carlos Bederián
b2924f68c0 blis: add v1.0 (#44199) 2024-05-17 18:39:42 +02:00
Rocco Meli
41ffe36636 spglib: add v2.4.0 (#44202) 2024-05-17 18:33:40 +02:00
Chris Marsh
24edc72252 docs: show phase signature for builders (#44067) 2024-05-17 18:16:31 +02:00
Raffaele Solcà
83b38a26a0 New versions nvpl-blas and nvpl-lapack (#44244) 2024-05-17 17:46:25 +02:00
Nathalie Furmento
914d785e3b starpu: add v1.4.6 (#44203) 2024-05-17 08:17:01 -06:00
Garth N. Wells
f99f642fa8 fenicsx: remove deprecated versions (#44223) 2024-05-17 07:35:21 -06:00
Seth R. Johnson
e0bf3667e3 cli11: new version and enable library (#44204) 2024-05-17 15:08:28 +02:00
Andrew-Dunning-NNL
a24ca50fed Fix broken link in docs (#44217) 2024-05-17 12:58:11 +00:00
Fabien Bruneval
51e9f37252 MOLGW: add v3.3 (#44241)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 06:56:05 -06:00
Rémi Lacroix
453900c884 libxc: Fix compilation after distribution changes. (#44206)
The release tarballs are not available anymore, which means autoconf, automake, and libtool are always needed.

The NVHPC-specific patches no longer apply, since the patched files are not distributed in the new tarballs.
2024-05-17 14:52:03 +02:00
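The recipe-level consequence is roughly this (a sketch of the kind of change, not the PR's exact diff): the autotools chain becomes an unconditional build dependency because configure must now always be regenerated.

```python
# Previously only needed for development checkouts; now the archives ship
# no pre-generated configure script, so these are always required.
depends_on("autoconf", type="build")
depends_on("automake", type="build")
depends_on("libtool", type="build")
```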
Alberto Invernizzi
4696459d2d libcatalyst: add missing python dependencies (#44224) 2024-05-17 14:45:26 +02:00
Fabien Bruneval
ad1e3231e5 libcint: add v6.1.2, v5.5.0 (#44239)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 05:45:26 -06:00
Richard Berger
2ef7eb1826 exodusii: only use MPI fortran compiler if +fortran (#44211) 2024-05-17 13:01:09 +02:00
Dom Heinzeller
fe86019f9a ecflow: versions up to 5.11.4 require boost 1.84 or earlier (#44181) 2024-05-17 12:53:48 +02:00
Harmen Stoppels
9dbb18219f build_environment.py: deal with rpathing identical packages (#44219)
When multiple gcc-runtime packages exist in the same link sub-dag, only rpath
the latest.
2024-05-17 12:29:56 +02:00
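A sketch of the policy using Spack's spec-traversal API (the helper name is hypothetical): among link-type dependencies, keep only the newest gcc-runtime when collecting rpath candidates.

```python
def rpath_deps(spec):
    """Hypothetical helper: drop all but the newest gcc-runtime from rpath candidates."""
    deps = list(spec.traverse(deptype="link", root=False))
    runtimes = [d for d in deps if d.name == "gcc-runtime"]
    newest = max(runtimes, key=lambda d: d.version) if runtimes else None
    return [d for d in deps if d.name != "gcc-runtime" or d is newest]
```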
Mikael Simberg
451a977de0 hpx: change default of max_cpu_count variant to auto (#44220) 2024-05-17 11:54:48 +02:00
Jack S. Hale
e604929a4c FEniCS: add more maintainers (#44240) 2024-05-17 03:03:21 -06:00
dependabot[bot]
9d591f9f7c build(deps): bump actions/checkout from 4.1.5 to 4.1.6 (#44234)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.5 to 4.1.6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](44c2b7a8a4...a5ac7e51b4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 10:57:08 +02:00
Matt Thompson
f8ad915100 esmf: add v8.6.1 (#44229) 2024-05-17 10:49:47 +02:00
Garth N. Wells
cbbabe6920 fenics-dolfinx: add spdlog dependency (#44237) 2024-05-17 10:48:55 +02:00
John W. Parent
81fe460194 Gitlab CI: Windows Configs (#43967)
Add support for Gitlab CI on Windows

This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows 
infrastructure).

* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2024-05-16 17:00:02 -06:00
Paul Kuberry
b894f996c0 trilinos: catch kokkos inconsistency with trilinos (#44209)
* trilinos: catch kokkos inconsistency with trilinos

* trilinos: update kokkos version range
2024-05-16 13:36:11 -06:00
John W. Parent
1ce09847d9 Prefer llnl.util.symlink.readlink to os.readlink (#44126)
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).

Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
2024-05-16 10:56:04 -07:00
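The core idea, as a minimal sketch (the real llnl.util.symlink.readlink handles more cases):

```python
import os

def readlink(path: str) -> str:
    """Behave like os.readlink, but strip the Windows long-path prefix."""
    target = os.readlink(path)
    if target.startswith("\\\\?\\"):  # the \\?\ prefix breaks string-based matching
        target = target[4:]
    return target
```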
Thomas Madlener
722d401394 gaudi: Don't apply the patch if it has already landed upstream (#44180) 2024-05-16 17:15:39 +02:00
Howard Pritchard
e6f04d5ef9 py-matplotlib: qualify when to do a post install (#44191)
* py-matplotlib: qualify when to do a post install

Older versions of py-matplotlib don't seem to have some of the
files that the post install step is trying to install.
Looks like the files first appeared in 3.6.0 and later.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>

* Change install paths for older matplotlib

---------

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-16 16:18:44 +02:00
Mosè Giordano
b8e3ecbf00 suite-sparse: improve setting of the libs property (#44214)
on some distros it is in lib64/
2024-05-16 11:07:18 +02:00
Todd Gamblin
d189387c24 bugfix: add arg to write_line_break() in spack_yaml (#42727)
`ruamel`'s `Emitter.write_line_break()` method takes an extra argument that we forgot to
implement in our custom emitter.
2024-05-15 19:25:06 -07:00
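For context, a minimal sketch of the fix (the emitter class name is illustrative): ruamel's `Emitter.write_line_break(self, data=None)` takes an optional break character, so an override must accept and forward it.

```python
from ruamel.yaml.emitter import Emitter

class SpackStyleEmitter(Emitter):  # illustrative name
    def write_line_break(self, data=None):
        # The bug: overriding as `def write_line_break(self):` raises a
        # TypeError when ruamel passes the optional break character.
        super().write_line_break(data)
```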
Andrew Lister
9e96ddc5ae CoinHSL: Support the Meson build system and add new release (#43610)
* Add maintainer and fix linting
* allow for fewer deps in archive
* use meson for archive packages
* Fix version spec and f-string
* fix blas dependency links
* Add new release to spack
* Fix checksums for latest release
2024-05-15 16:48:23 -06:00
dmagdavector
543bd189af iperf3: updated versions from 3.6 to 3.16 (#44152)
* Update version of iperf3 from 3.6 to 3.16

Spack currently only explicitly has version 3.6 of the iPerf3 package
(out of ESnet / LBNL). This makes the latest version, 3.16, the default,
and adds some other versions (found in some Linux distros, for possible
compatibility purposes).

* iperf3: update to 3.17; update 3.6 hash for new url

* protobuf: update hash for patch needed when="@3.4:3.21"

* Revert "protobuf: update hash for patch needed when="@3.4:3.21""

This reverts commit 4d168d0b27.
2024-05-15 15:48:15 -06:00
John W. Parent
43291aa723 Cdash reporting timeout (#44213)
* Add timeout to cdash reporter PUT request

Add cdash timeout everywhere
Correct mock responder api

* Style

* brief doc
2024-05-15 15:41:51 -04:00
Alec Scott
d0589285f7 unmaintained pkgs: bump versions 2024-05-10 (#44131)
* unmaintained pkgs: bump versions 2024-05-10

* openblas: fix satisfies syntax

* pixman: add autotools dependencies

* [@spackbot] updating style on behalf of alecbcs

* pixman: revert

* chapel: revert changes in favor of other PR

* openblas: revert due to failing tests

* Address review feedback for flint, biobam2, and pango

* pango: add version comment about v2.0

* numactl: revert changes due to ppc4le bug

* flint: remove duplicate configure arg

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* openvkl, rkcommon: remove commented maintainers template

* flint: fix style

---------

Co-authored-by: alecbcs <alecbcs@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-15 11:04:52 -07:00
afzpatel
d079aaa083 enable tensorflow-2.14 and 2.16 for spack built ROCm (#44095)
* initial commit to enable tensorflow-2.14 for spack built ROCm

* fix style errors

* modify hipcc patch

* updates for rocm 6.1

* updates for tf-rocm-enhanced 2.16

* fix styling

* fix styling

* add patch for 2.16

* add patch for 2.16

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* update rocm enhanced version names

* changes for rocm-enhanced version name change

* fix styling

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-15 17:06:35 +02:00
Paolo
6c65977e0d Fix gromacs installation with SVE. Issue 44062 (#44183)
* Fix gromacs installation with SVE. Issue 44062

* [@spackbot] updating style on behalf of paolotricerri

* Remove `neoverse_n2` target

We have removed the neoverse_n2 target as its detection is more involved
compared to neoverse_v*.
2024-05-15 07:22:47 -06:00
Carlos Bederián
1b5d786cf5 gromacs: add 2024.2, 2023.5 (#44197) 2024-05-15 07:21:55 -06:00
Ben Morgan
4cf00645bd VecGeom: new version 1.2.8 (#44179) 2024-05-15 07:50:48 -04:00
Jon Rood
e9149cfc3c nalu-wind: fix mistake (#44188) 2024-05-14 23:13:13 -06:00
Jon Rood
a5c8111076 exawind: updates to package to allow mixed device (#44159)
* exawind: updates to package to allow mixed device

* Style.

* Remove ninja variants.

* Add conflict for amr-wind+hypre with mixed device.

* Relax amr-wind~hypre requirement.

* Move runtime variables to nalu-wind.

* Update suggestions.

* Remove umpire.
2024-05-14 12:26:07 -06:00
Alec Scott
c3576f712d pkg-config: support apple-clang@15: (#44007) 2024-05-14 17:24:36 +02:00
Alec Scott
410e6a59b7 rust: fix v1.78.0 instructions (#44127) 2024-05-14 08:14:34 -07:00
Carlos Bederián
bd2b2fb75a python: add 3.10.14, 3.9.19, 3.8.19 (#44162) 2024-05-14 17:03:45 +02:00
Zachary Newell
7ae318efd0 Added NCCL version 2.21.5-1 (#44158) 2024-05-14 03:34:21 -06:00
Derek Ryan Strong
73e9d56647 Update bash 5.2 patches (#44172) 2024-05-14 09:08:39 +02:00
Derek Ryan Strong
f87a752b63 Add zsh 5.8.1 and 5.9 (#44173) 2024-05-14 09:08:08 +02:00
Derek Ryan Strong
ae2fec30c3 Add newer fish versions (#44174) 2024-05-14 09:07:49 +02:00
Derek Ryan Strong
1af5564cbe Add file 5.45 (#44175) 2024-05-14 09:07:25 +02:00
Derek Ryan Strong
a8f057a701 Add man-db 2.12.0 and 2.12.1 (#44176) 2024-05-14 09:07:10 +02:00
Michael B Kuhn
7f3dd38ccc amr-wind: add latest versions and correct waves2amr details (#44099)
* add latest versions and correct waves2amr details

* update commit associated with v2.1.0
2024-05-13 14:21:25 -06:00
Adam J. Stewart
8e9adefcd5 ML CI: update image (#43751)
* ML CI: update image

* Use main branch

* Use tagged version
2024-05-13 21:43:52 +02:00
jdomke
d276f9700f fujitsu-mpi package: help CMake find wrappers (#43979)
Set MPI_C_COMPILER etc. env vars when building with fujitsu-mpi; these
override CMake's detection logic. Without these variables explicitly
requesting the Fujitsu MPI wrappers, CMake's builtin logic will only
use the wrappers when the Fujitsu compiler itself is in use, but they
should be used for %clang as well.

This avoids patching CMake module files like in #16864, primarily to
avoid the possibility of altering behavior for specs that do not use
%fj or ^fujitsu-mpi.
2024-05-13 11:15:45 -07:00
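A sketch of the mechanism (the exact variable set may differ from the PR): Spack's setup_dependent_build_environment hook can export the variables CMake's FindMPI honors, pointing at the standard Fujitsu wrapper names.

```python
def setup_dependent_build_environment(self, env, dependent_spec):
    # FindMPI consults MPI_<lang>_COMPILER before its own heuristics, so
    # setting these makes %clang builds pick up the Fujitsu wrappers too.
    env.set("MPI_C_COMPILER", self.prefix.bin.mpifcc)
    env.set("MPI_CXX_COMPILER", self.prefix.bin.mpiFCC)
    env.set("MPI_Fortran_COMPILER", self.prefix.bin.mpifrt)
```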
Harmen Stoppels
4f111659ec glibc: detect from "Free Software Foundation" not "gnu" (#44154)
which should be more generic
2024-05-13 20:11:27 +02:00
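A sketch of the detection idea (not Spack's exact code): glibc's version banner, e.g. from running libc.so.6 directly, always credits the Free Software Foundation, which is a more reliable marker than matching on "gnu".

```python
import re

def glibc_version(banner: str):
    """Hypothetical parser for output like:
    GNU C Library (Ubuntu GLIBC 2.35-0ubuntu3) stable release version 2.35.
    Copyright (C) 2022 Free Software Foundation, Inc."""
    if "Free Software Foundation" not in banner:
        return None
    match = re.search(r"release version (\d+(?:\.\d+)+)", banner)
    return match.group(1) if match else None
```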
Dave Keeshan
eaf330f2a8 yosys: add v0.41 (#44153) 2024-05-13 09:40:42 -07:00
Julien Cortial
cdaeb74dc7 cdt: Add version 1.4.1 (#44155) 2024-05-13 09:38:25 -07:00
Alberto Sartori
fbaac46604 justbuild: add version 1.3.0 (#44148) 2024-05-13 09:35:29 -07:00
Jon Rood
7f6210ee90 nalu-wind: updates (#44046) 2024-05-13 10:28:24 -06:00
Wouter Deconinck
63f6e6079a gaudi: upstream patch when @38.1 for missing #include <list> (#44121)
* gaudi: upstream patch when @38.1 for missing #include <list>

* gaudi: apply list patch for all versions
2024-05-13 07:37:31 -06:00
Greg Becker
d4fd6caae0 spack uninstall: improve error message for dependent environment (#44149) 2024-05-13 14:58:13 +02:00
Mikael Simberg
fd3c18b6fd whip: Add 0.3.0 (#44146)
Co-authored-by: Mikael Simberg <simbergm@cscs.ch>
2024-05-13 03:09:32 -06:00
Adam J. Stewart
725f427f25 spack checksum: do not add expand=False to wheels (#44118) 2024-05-13 10:01:47 +02:00
Paul R. C. Kent
32b3e91ef7 llvm: add 18.1.5, 18.1.4 (#44123) 2024-05-12 16:28:32 -07:00
bk
b7e4602268 R: common package updates for 4.4.0 (#44088)
* r-ggplot2: v3.5.1, r-haven: v2.5.4, r-jsonlite: v1.8.8, r-pkgload: v1.3.4, r-vctrs: v0.6.5

* r-scales: v1.3.0
2024-05-12 10:49:38 -07:00
Stephen Sachs
4a98d4db93 Add applications to aws-pcluster-* stacks (#43901)
* Add openfoam to aws-pcluster-neoverse_v1 stack

* Add more apps to aws-pcluster-x86_64_v4 stack

* Remove WRF while hdf5 cannot build in buildcache at the moment

* Update comment

* Add more apps for aws-pcluster-neoverse_v1 stack

* Remove apps that currently do not build

* Disable those packages that won't build

* Modify syntax such that correct cflags are used

* Changing syntax again to what works with other packages

* Fix overriding packages.yaml entry for gettext

* Use new `prefer` and `require:when` clauses to clarify intent

* Use newer spack version to install intel compiler

This removes the need for patches and makes sure the `prefer` directives in
`packages.yaml` are understood.

* `prefer` not strong enough, need to set compilers

* Revert "Use newer spack version to install intel compiler"

This reverts commit ecb25a192c.

Cannot update the spack version to install intel compiler as this changes the
compiler hash but not the version. This leads to incompatible compiler paths. If
we update this spack version in the future, make sure the compiler version also updates.

Tested-by: Stephen Sachs <stesachs@amazon.com>

* Remove `prefer` clause as it is not strong enough for our needs

This way we can safely go back to installing the intel compiler with an older
spack version.

* Prefer gcc or oneapi to build gettext

* Pin gettext version compatible with system glibc-headers

* relax gettext version requirement to enable later versions

* oneapi cannot build older gettext version
2024-05-12 10:48:02 -07:00
Juan Miguel Carceller
9d6bf373be whizard: Add version 3.1.4 (#43045)
* whizard: Add a patch to fix builds with pythia8 >= 8.310

* Add whizard 3.1.4 and update accordingly

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:44:56 -07:00
Alex Richert
cff35c4987 octave: use pcre2 for @8: (#42636)
* octave: use pcre2 for @8:

* Add 'pcre2' variant to octave to control pcre vs. pcre2

* Update var/spack/repos/builtin/packages/octave/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alex Richert <alexander.richert@noaa.gov>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-12 10:44:31 -07:00
Juan Miguel Carceller
d594f84b8f fastjet: Add a cxxstd variant (#44072)
* fastjet: Add a cxxstd variant

* Use f-strings

* Add multi=False

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:42:56 -07:00
Wouter Deconinck
f8f01c336c clang: support cxx20_flag and cxx23_flag (#43438)
* clang: support cxx20_flag and cxx23_flag

* clang: coverage test cxx{}_flag and c{}_flag additions
2024-05-12 07:45:59 -07:00
Todd Gamblin
12e3665df3 Bump version on develop to v0.23dev0 (#44137) 2024-05-11 18:01:50 +02:00
Todd Gamblin
fa4778b9fc changelog: add changes from 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:45:14 +02:00
Harmen Stoppels
66d297d420 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
2024-05-11 15:43:32 +02:00
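A sketch of a retry wrapper covering all three failure shapes (not Spack's actual oci code); note that HTTPError subclasses URLError, so it must be caught first.

```python
import urllib.error
import urllib.request

def urlopen_with_retry(url: str, attempts: int = 3):
    """Hypothetical wrapper handling the three failure modes listed above."""
    last = None
    for _ in range(attempts):
        try:
            return urllib.request.urlopen(url, timeout=10)
        except urllib.error.HTTPError as e:  # 1. has a status code; subclasses URLError
            last = e
        except urllib.error.URLError as e:   # 2. the real cause hides in e.reason
            last = e.reason if isinstance(e.reason, BaseException) else e
        except TimeoutError as e:            # 3. raised unwrapped from getresponse
            last = e
    raise last
```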
James Taliaferro
56251c11f3 qpdf: new package (#44066)
* New package: qpdf

* Update var/spack/repos/builtin/packages/qpdf/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix format of cmake_args

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-10 14:32:57 -06:00
James Smillie
40bf9a179b New package: py-nuitka (#44079) 2024-05-10 12:07:02 -07:00
John W. Parent
095aba0b9f Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure no link written into tarfiles generated from Spack prefixes points outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-05-10 13:00:40 -05:00
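The invariant being enforced looks roughly like this (a sketch, not the PR's code): resolve each tar member's link target relative to its own directory and reject anything that escapes the install prefix.

```python
import os
import tarfile

def link_escapes_prefix(prefix: str, member: tarfile.TarInfo) -> bool:
    """Hypothetical check: True if a symlink member points outside the prefix."""
    if not member.issym():
        return False
    target = os.path.normpath(
        os.path.join(prefix, os.path.dirname(member.name), member.linkname)
    )
    return not (target == prefix or target.startswith(prefix + os.sep))
```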
John W. Parent
4270136598 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running spack on windows, a path beginning with `C://...` is a valid path.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix typo in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* Review

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-05-10 13:00:13 -05:00
Juan Miguel Carceller
f73d7d2dce dd4hep: cleanup recipe, remove deprecated versions and patches (#44110)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:40:38 -07:00
Juan Miguel Carceller
567566da08 edm4hep: cleanup recipe, remove deprecated versions and patches (#44113)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:39:06 -07:00
Chris Marsh
30a9ab749d libtheora: Remove unneeded autoreconf section (#44063)
* Remove autoreconf section that was causing issues with a libtool mismatch. Fixes issue #43498
2024-05-10 10:27:20 -04:00
Mikael Simberg
8160a96b27 dla-future: Add 0.4.1 (#44115)
* dla-future: Add 0.4.1

* Use ninja as generator in dla-future
2024-05-10 15:21:47 +02:00
Mikael Simberg
10414d3e6c pika: Add 0.25.0 (#44114) 2024-05-10 14:37:31 +02:00
Stephen Sachs
1d96c09094 libhugetlbfs: Fix the build with an update to 2.24 (#44059)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-05-10 10:50:36 +02:00
Harmen Stoppels
e7112fbc6a PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:47:37 +02:00
Tom Scogland
b79761b7eb flux-sched: set the version if the ver file is missing (#44068)
* flux-sched: set the version if the ver file is missing

problem: flux-sched needs a version, it normally gets this from a
release tarball or from git tags, but if using a source archive or a git
clone without tags the version is missing

solution: set the version through cmake based on the version spack sees
when the version file is missing

* Update var/spack/repos/builtin/packages/flux-sched/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-09 11:50:42 -07:00
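A sketch of the fallback (the cache-variable and file names are hypothetical; the real ones live in flux-sched's CMake):

```python
import os

def cmake_args(self):
    args = []
    version_file = os.path.join(self.stage.source_path, "version")  # hypothetical name
    if not os.path.exists(version_file):
        # no release tarball and no git tags: tell CMake what Spack already knows
        args.append(self.define("FLUX_SCHED_VER", str(self.spec.version)))
    return args
```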
Christopher Christofi
3381899c69 miniforge3: add new versions with mamba installation (#43995)
* miniforge3: add new version with mamba installation
* fix styling
* update maintainers
* Fix variant directive ordering
2024-05-09 12:48:40 -06:00
Massimiliano Culpo
c7cf5eabc1 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 18:50:15 +02:00
Juan Miguel Carceller
d88fa5cf8e pythia8: add a cxxstd variant (#44077)
* pythia8: add a cxxstd variant
* Add multi=False, fix regexp

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-09 09:32:38 -07:00
Richard Berger
2ed0e3d737 kokkos-nvcc-wrapper: add missing versions (#44089) 2024-05-09 09:23:16 -07:00
Julien Cortial
506a40cac1 mmg: Build & install shared libraries, add 5.7.2 & 5.7.3 (#43749) 2024-05-09 16:59:53 +02:00
Rémi Lacroix
447739fcef Universal Ctags: Add version 6.1.20240505.0 (#44058) 2024-05-09 15:55:41 +02:00
Massimiliano Culpo
e60f6f4a6e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 14:28:17 +02:00
Vanessasaurus
7df35d0da0 package: libsodium update URL to GitHub (#44090)
Problem: the libsodium download endpoints are not reliable,
they fail multiple times a day for several of our automated
pipelines.
Solution: use the GitHub releases and branches.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2024-05-09 14:26:02 +02:00
Luc Berger
71b035ece1 Kokkos: adding new release 4.3.01 (#44086) 2024-05-09 03:33:35 -06:00
Ashwin Kumar Karnad
86a134235e Octopus 14.1: Add new version hash (#44083)
* octopus: add hash for new version
* octopus: add --enable-silent-deprecation flag when @14.1:
2024-05-09 03:18:22 -06:00
Jon Rood
24cd0da7fb amr-wind: add missing variants (#44078)
* amr-wind: add missing variants

* Fix copy and paste.

* waves2amr is only available on amr-wind@2.0

* Style.

* Use satisfies.
2024-05-09 02:38:16 -06:00
alvaro-sch
762833a663 Orca (standalone) 5.0.4 (#44011)
* orca: added latest version 5.0.4
* Fixed openmpi versions
2024-05-08 15:36:27 -07:00
Ken Raffenetti
636d479e5f mpich: Add license (#43821) 2024-05-08 12:07:46 -07:00
Sreenivasa Murthy Kolam
f2184f26fa hipsparselt: new package (#44080)
* Initial commit for adding hipsparselt recipe
* correct the style errors
* remove master version
2024-05-08 12:03:21 -07:00
Harmen Stoppels
e1686eef7c gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:04 +02:00
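Paraphrasing the search-order problem as flag lists (directory path illustrative):

```python
# -isystem puts the libc directory too early in the search path, so
# cstdlib's `#include_next <stdlib.h>` does not work as intended:
before = ["-isystem", "/spack/glibc-2.39/include"]
# -idirafter appends the directory after the default search path, letting
# GCC's own wrapper headers come first and include_next fall through to libc's:
after = ["-idirafter", "/spack/glibc-2.39/include"]
```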
Adam J. Stewart
314893982e JAX: add v0.4.27, NCCL variant (#44071) 2024-05-08 11:36:24 -07:00
Jake Koester
9ab6c30a3d add flag to turn off building tests and examples (#44075) 2024-05-08 11:33:49 -07:00
Adam J. Stewart
ddf94291d4 py-jsonargparse: add v4.28.0 (#44074) 2024-05-08 11:31:07 -07:00
Jon Rood
5d1038c512 hypre: add cublas and rocblas variants (#44038)
* Add cublas and roblas variants to hypre.

* Fix mistakes.

* Remove newline.

* Address suggestions.
2024-05-08 12:04:11 -06:00
renjithravindrankannath
2e40c88d50 Bump up the version for ROCm-6.1.0 (#43843)
* Bump up the version for ROCm-6.1.0
* Style check error correction and patch files
* Update for rocm-openmp-extras 6.1
* updating rocm-openmp-extras and math libraries with 6.1
* Style check error correction
* updating hipcub, hipfort & miopen-hip for 6.1
* Rocm-openmp-extras and some mathlib updates
* Audit error correction and rocmlir update
* Updating dependency on suite-sparse and adding path
* Style check error correction
* hip-tensor 6.1.0 update
* rdc 6.1 needs grpc 1.59.1
* rvs 6.1 updates and patch
2024-05-08 10:26:30 -07:00
dependabot[bot]
2bcba57757 build(deps): bump actions/checkout from 4.1.4 to 4.1.5 (#44045)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.4 to 4.1.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](0ad4b8fada...44c2b7a8a4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-08 09:22:48 -07:00
shanedsnyder
37330e5e2b darshan-runtime,darshan-util,py-darshan: add darshan-3.4.5 packages (#43989)
* add darshan-3.4.5 packages, update URLs
* py-setuptools version switches for py-darshan
* more py-darshan test dependencies
* try a conditional importlib_resources dependency
2024-05-08 09:06:24 -07:00
snehring
b4411cf2db iq-tree: add new version, delete duplicate package (#44043)
* iq-tree: add new version, delete duplicate package
* Replace iqtree2 dependency
   orthofinder: replace iqtree2 with iq-tree@2
   py-phylophlan: replace iqtree2 with iq-tree@2
2024-05-08 09:02:06 -07:00
Harmen Stoppels
65d1ae083c gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:18:12 +02:00
andriish
0b8faa3918 cernlib: add 2023.08.14.0-free (#40211)
* Update CERNLIB

Update CERNLIB

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of andriish

* cernlib: merge crypto->crypt patches

* cernlib: depends_on xbae when `@2023:`

* cernlib: patch for Xbae and Xm link order (DSO)

* [@spackbot] updating style on behalf of andriish

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 06:58:07 -05:00
Alec Scott
f077c7e33b unmaintained pkg bump: 2024-05-05 (#44021)
* unmaintained pkgs: bump versions

* Changes following review feedback

* glm: update cmake dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 05:53:49 -06:00
Massimiliano Culpo
9d7410d22e gcc: add v14.1.0 (#44053) 2024-05-08 12:14:44 +02:00
Tamara Dahlgren
e295730d0e mold: Replace maintainer (#44047)
* Remove maintainer at his request

* Add msimberg as the maintainer
2024-05-08 09:29:43 +02:00
Todd Gamblin
868327ee14 r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 08:56:37 +02:00
Massimiliano Culpo
f5430b16bc Bump removal version in deprecation messages (#44064) 2024-05-08 08:49:14 +02:00
Tamara Dahlgren
2446695113 Remove dead environment creation code (#44065) 2024-05-07 22:49:06 -06:00
snehring
e0c6cca65c orca: switching to xz archives, removing old version (#44035) 2024-05-07 15:18:53 -07:00
Auriane R
84ed4cd331 Add transformer engine package (#43982)
* Add py-flash-attn@2.4.2
* Add py-transformer-engine package

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-07 14:56:34 -07:00
Juan Miguel Carceller
f6d50f790e gaudi: Add version 38.1 (#44055)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-07 14:20:02 -07:00
Tim Haines
d3c3d23d1e Dyninst: update compiler requirements (#44033)
As of 13.0.0, Dyninst can now build with any Linux compiler.
2024-05-07 15:03:17 -06:00
Vicente Bolea
33752c2b55 fix(adios2): fix missing stdint include in 2.7 (#43786) 2024-05-07 15:52:20 -05:00
Harmen Stoppels
26759249ca gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-07 18:59:12 +02:00
dependabot[bot]
8b4cbbe7b3 build(deps): bump pygments from 2.17.2 to 2.18.0 in /lib/spack/docs (#44044)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.2...2.18.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-07 18:55:04 +02:00
Richarda Butler
be71f9fdc4 Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
   spack:
     specs: []
     concretizer:
         unify: true
     include_concrete:
     - /path/to/environment1
     - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected after the environment
is re-concretized from the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

    Co-authored-by: Kayla Butler <butler59@llnl.gov>
    Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
    Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
    Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-07 09:32:40 -07:00
Massimiliano Culpo
05c1e7ecc2 Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-07 17:56:29 +02:00
Gregory Becker
2e3fc288ae warn and continue on failure to parse compiler from external 2024-04-18 08:46:05 -07:00
640 changed files with 8118 additions and 6231 deletions


@@ -28,7 +28,7 @@ jobs:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{inputs.python_version}}
@@ -61,7 +61,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits


@@ -37,7 +37,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Bootstrap clingo
@@ -60,7 +60,7 @@ jobs:
run: |
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -83,14 +83,16 @@ jobs:
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: brew install tree gawk
- name: Remove system executables
run: |
while [ -n "$(command -v gpg gpg2 patchelf)" ]; do
sudo rm $(command -v gpg gpg2 patchelf)
done
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Bootstrap GnuPG
@@ -110,14 +112,16 @@ jobs:
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: brew install tree
- name: Remove system executables
run: |
while [ -n "$(command -v gpg gpg2 patchelf)" ]; do
sudo rm $(command -v gpg gpg2 patchelf)
done
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d


@@ -56,7 +56,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta


@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -77,8 +77,13 @@ jobs:
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
secrets: inherit
windows:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks ]
uses: ./.github/workflows/windows_python.yml
secrets: inherit
all:
needs: [ unit-tests, bootstrap ]
needs: [ windows, unit-tests, bootstrap ]
runs-on: ubuntu-latest
steps:
- name: Success


@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d


@@ -3,5 +3,5 @@ clingo==5.7.1
flake8==7.0.0
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.9
types-six==1.16.21.20240513
vermin==1.6.0


@@ -14,14 +14,14 @@ jobs:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-22.04]
os: [ubuntu-latest]
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
concretizer: ['clingo']
on_develop:
- ${{ github.ref == 'refs/heads/develop' }}
include:
- python-version: '3.11'
os: ubuntu-20.04
os: ubuntu-latest
concretizer: original
on_develop: ${{ github.ref == 'refs/heads/develop' }}
- python-version: '3.6'
@@ -30,28 +30,28 @@ jobs:
on_develop: ${{ github.ref == 'refs/heads/develop' }}
exclude:
- python-version: '3.7'
os: ubuntu-latest
concretizer: 'clingo'
os: ubuntu-22.04
on_develop: false
- python-version: '3.8'
os: ubuntu-latest
concretizer: 'clingo'
os: ubuntu-22.04
on_develop: false
- python-version: '3.9'
os: ubuntu-latest
concretizer: 'clingo'
os: ubuntu-22.04
on_develop: false
- python-version: '3.10'
os: ubuntu-latest
concretizer: 'clingo'
os: ubuntu-22.04
on_develop: false
- python-version: '3.11'
os: ubuntu-latest
concretizer: 'clingo'
os: ubuntu-22.04
on_develop: false
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -91,16 +91,16 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Test shell integration
shell:
runs-on: ubuntu-22.04
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -141,7 +141,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version
@@ -158,9 +158,9 @@ jobs:
spack unit-test -k 'not cvs and not svn and not hg' -x --verbose
# Test for the clingo based solver (using clingo-cffi)
clingo-cffi:
runs-on: ubuntu-22.04
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -198,7 +198,7 @@ jobs:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -223,39 +223,8 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Run unit tests on Windows
windows:
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -70,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version

.github/workflows/windows_python.yml vendored Normal file (83 lines added)
View File

@@ -0,0 +1,83 @@
name: windows
on:
workflow_call:
concurrency:
group: windows-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Command Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage
- name: Build Test
run: |
spack compiler find
spack -d external find cmake ninja
spack -d install abseil-cpp

View File

@@ -14,26 +14,3 @@ sphinx:
python:
install:
- requirements: lib/spack/docs/requirements.txt
search:
ranking:
spack.html: -10
spack.*.html: -10
llnl.html: -10
llnl.*.html: -10
_modules/*: -10
command_index.html: -9
basic_usage.html: 5
configuration.html: 5
config_yaml.html: 5
packages_yaml.html: 5
build_settings.html: 5
environments.html: 5
containers.html: 5
mirrors.html: 5
module_file_support.html: 5
repositories.html: 5
binary_caches.html: 5
chain.html: 5
pipelines.html: 5
packaging_guide.html: 5

View File

@@ -1,424 +1,3 @@
# v0.22.5 (2025-02-21)
## Bugfixes
- Continue to mark non-roots as implicitly installed on partial env installs (#47183)
## Notes
- v0.22.4 was skipped due to an error in the release process
# v0.22.3 (2024-11-18)
## Bugfixes
- Forward compatibility with Python 3.13 (#46775, #46983, #47035, #47175)
- `archspec` was updated to v0.2.5 (#46503, #46958)
- Fix path to Spack in `spack env depfile` makefile (#46966)
- Fix `glibc` detection in Chinese locales (#47434)
- Fix pickle round-trip of specs propagating variants (#47351)
- Fix a bug where concurrent spack install commands would not always update explicits correctly
(#47358)
- Fix a bug where autopush would run before all post install hooks modifying the install prefix
had run (#47329)
- Fix `spack find -u` (#47102)
- Fix a bug where sometimes the wrong Python interpreter was used for build dependencies such as
`py-setuptools` (#46980)
- Fix default config errors found by `spack audit externals` (#47308)
- Fix duplicate printing of external roots in installer (#44917)
- Fix modules schema in `compilers.yaml` (#47197)
- Reduce the size of generated YAML for Gitlab CI (#44995)
- Handle missing metadata file gracefully in bootstrap (#47278)
- Show underlying errors on fetch failure (#45714)
- Recognize `.` and `..` as paths instead of names in buildcache commands (#47105)
- Documentation and style (#46991, #47107, #47110, #47111, #47346, #47307, #47309, #47328, #47160,
#47402, #47557, #46709, #47080)
- Tests and CI fixes (#47165, #46711)
## Package updates
- ffmpeg: fix hash of patch (#45574)
# v0.22.2 (2024-09-21)
## Bugfixes
- Forward compatibility with Spack 0.23 packages with language dependencies (#45205, #45191)
- Forward compatibility with `urllib` from Python 3.12.6+ (#46453, #46483)
- Bump vendored `archspec` for better aarch64 support (#45721, #46445)
- Support macOS Sequoia (#45018, #45127)
- Fix regression in `{variants.X}` and `{variants.X.value}` format strings (#46206)
- Ensure shell escaping of environment variable values in load and activate commands (#42780)
- Fix an issue where `spec[pkg]` considers specs outside the current DAG (#45090)
- Do not halt concretization on unknown variants in externals (#45326)
- Improve validation of `develop` config section (#46485)
- Explicitly disable `ccache` if turned off in config, to avoid cache pollution (#45275)
- Improve backwards compatibility in `include_concrete` (#45766)
- Fix issue where package tags were sometimes repeated (#45160)
- Make `setup-env.sh` "sourced only" by dropping execution bits (#45641)
- Make certain source/binary fetch errors recoverable instead of a hard error (#45683)
- Remove debug statements in package hash computation (#45235)
- Remove redundant clingo warnings (#45269)
- Remove hard-coded layout version (#45645)
- Do not initialize previous store state in `use_store` (#45268)
- Docs improvements (#46475)
## Package updates
- `chapel` major update (#42197, #44931, #45304)
# v0.22.1 (2024-07-04)
## Bugfixes
- Fix reuse of externals on Linux (#44316)
- Ensure parent gcc-runtime version >= child (#44834, #44870)
- Ensure the latest gcc-runtime is rpath'ed when multiple exist among link deps (#44219)
- Improve version detection of glibc (#44154)
- Improve heuristics for solver (#44893, #44976, #45023)
- Make strong preferences override reuse (#44373)
- Reduce verbosity when C compiler is missing (#44182)
- Make missing ccache executable an error when required (#44740)
- Make every environment view containing `python` a `venv` (#44382)
- Fix external detection for compilers with os but no target (#44156)
- Fix version optimization for roots (#44272)
- Handle common implementations of pagination of tags in OCI build caches (#43136)
- Apply fetched patches to develop specs (#44950)
- Avoid Windows wrappers for filesystem utilities on non-Windows (#44126)
- Fix issue with long filenames in build caches on Windows (#43851)
- Fix formatting issue in `spack audit` (#45045)
- CI fixes (#44582, #43965, #43967, #44279, #44213)
## Package updates
- protobuf: fix 3.4:3.21 patch checksum (#44443)
- protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
- git: bump v2.39 to 2.45; deprecate unsafe versions (#44248)
- gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
- Remove mesa18 and libosmesa (#44264)
- Enforce consistency of `gl` providers (#44307)
- Require libiconv for iconv (#44335, #45026).
Notice that glibc/musl also provide iconv, but are not guaranteed to be
complete. Set `packages:iconv:require:[glibc]` to restore the old behavior.
- py-matplotlib: qualify when to do a post install (#44191)
- rust: fix v1.78.0 instructions (#44127)
- suite-sparse: improve setting of the `libs` property (#44214)
- netlib-lapack: provide blas and lapack together (#44981)
# v0.22.0 (2024-05-12)
`v0.22.0` is a major feature release.
## Features in this release
1. **Compiler dependencies**
We are in the process of making compilers proper dependencies in Spack, and a number
of changes in `v0.22` support that effort. You may notice nodes in your dependency
graphs for compiler runtime libraries like `gcc-runtime` or `libgfortran`, and you
may notice that Spack graphs now include `libc`. We've also begun moving compiler
configuration from `compilers.yaml` to `packages.yaml` to make it consistent with
other externals. We are trying to do this with the least disruption possible, so
your existing `compilers.yaml` files should still work. We expect to be done with
this transition by the `v0.23` release in November.
* #41104: Packages compiled with `%gcc` on Linux, macOS and FreeBSD now depend on a
new package `gcc-runtime`, which contains a copy of the shared compiler runtime
libraries. This enables gcc runtime libraries to be installed and relocated when
using a build cache. When building minimal Spack-generated container images it is
no longer necessary to install libgfortran, libgomp etc. using the system package
manager.
* #42062: Packages compiled with `%oneapi` now depend on a new package
`intel-oneapi-runtime`. This is similar to `gcc-runtime`, and the runtimes can
provide virtuals and compilers can inject dependencies on virtuals into compiled
packages. This allows us to model library soname compatibility and allows
compilers like `%oneapi` to provide virtuals like `sycl` (which can also be
provided by standalone libraries). Note that until we have an agreement in place
with intel, Intel packages are marked `redistribute(source=False, binary=False)`
and must be downloaded outside of Spack.
* #43272: changes to the optimization criteria of the solver improve the hit-rate of
buildcaches by a fair amount. The solver applies more relaxed compatibility rules and will
not try to strictly match compilers or targets of reused specs. Users can still
enforce the previous strict behavior with `require:` sections in `packages.yaml`.
Note that to enforce correct linking, Spack will *not* reuse old `%gcc` and
`%oneapi` specs that do not have the runtime libraries as a dependency.
* #43539: Spack will reuse specs built with compilers that are *not* explicitly
configured in `compilers.yaml`. Because we can now keep runtime libraries in build
cache, we do not require you to also have a local configured compiler to *use* the
runtime libraries. This improves reuse in buildcaches and avoids conflicts with OS
updates that happen underneath Spack.
* #43190: binary compatibility on `linux` is now based on the `libc` version,
instead of on the `os` tag. Spack builds now detect the host `libc` (`glibc` or
`musl`) and add it as an implicit external node in the dependency graph. Binaries
with a `libc` with the same name and a version less than or equal to that of the
detected `libc` can be reused. This is only on `linux`, not `macos` or `Windows` (a toy sketch of this rule follows this list).
* #43464: each package that can provide a compiler is now detectable using `spack
external find`. External packages defining compiler paths are effectively used as
compilers, and `spack external find -t compiler` can be used as a substitute for
`spack compiler find`. More details on this transition are in
[the docs](https://spack.readthedocs.io/en/latest/getting_started.html#manual-compiler-configuration)
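A toy sketch of the `libc` reuse rule above (not Spack's implementation; the `Libc` type and `libc_compatible` helper are made up for illustration):
```python
# Toy illustration of the rule from #43190: a binary can be reused when its
# libc has the same name as the host libc and a version that is less than
# or equal to the host's.
from typing import NamedTuple, Tuple

class Libc(NamedTuple):
    name: str
    version: Tuple[int, int]

def libc_compatible(host: Libc, binary_libc: Libc) -> bool:
    """True if a binary built against `binary_libc` can run against `host`."""
    return binary_libc.name == host.name and binary_libc.version <= host.version

host = Libc("glibc", (2, 35))
print(libc_compatible(host, Libc("glibc", (2, 31))))  # True: older glibc is fine
print(libc_compatible(host, Libc("glibc", (2, 38))))  # False: newer than the host
print(libc_compatible(host, Libc("musl", (1, 2))))    # False: different libc
```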
2. **Improved `spack find` UI for Environments**
If you're working in an environment, you likely care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized
We've tweaked `spack find` in environments to show this information much more
clearly. Installation status is shown next to each root, so you can see what is
installed. Roots are also shown in bold in the list of installed packages. There is
also a new option for `spack find -r` / `--only-roots` that will only show env
roots, if you don't want to look at all the installed specs.
More details in #42334.
3. **Improved command-line string quoting**
We are making some breaking changes to how Spack parses specs on the CLI in order to
respect shell quoting instead of trying to fight it. If you (sadly) had to write
something like this on the command line:
```
spack install zlib cflags=\"-O2 -g\"
```
That will now result in an error, but you can now write what you probably expected
to work in the first place:
```
spack install zlib cflags="-O2 -g"
```
Quoted values can now also include special characters, so you can supply flags like:
```
spack install zlib ldflags='-Wl,-rpath=$ORIGIN/_libs'
```
To reduce ambiguity in parsing, we now require that you *not* put spaces around `=`
and `==` when specifying flags or variants. This would not have broken before but will now
result in an error:
```
spack install zlib cflags = "-O2 -g"
```
More details and discussion in #30634.
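For intuition, here is how shell-style tokenization keeps the quoted flags together; this uses Python's standard `shlex` as a stand-in, not Spack's actual parser:
```python
# Illustrative only: shlex mimics POSIX shell tokenization. Spack now
# respects the tokens the shell hands it, so the quoted cflags value
# arrives as a single argument.
import shlex

print(shlex.split('spack install zlib cflags="-O2 -g"'))
# ['spack', 'install', 'zlib', 'cflags=-O2 -g']
```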
4. **Revert default `spack install` behavior to `--reuse`**
We changed the default concretizer behavior from `--reuse` to `--reuse-deps` in
#30990 (in `v0.20`), which meant that *every* `spack install` invocation would
attempt to build a new version of the requested package / any environment roots.
While this is a common ask for *upgrading* and for *developer* workflows, we don't
think it should be the default for a package manager.
We are going to try to stick to this policy:
1. Prioritize reuse and build as little as possible by default.
2. Only upgrade or install duplicates if they are explicitly asked for, or if there
is a known security issue that necessitates an upgrade.
With the install command you now have three options:
* `--reuse` (default): reuse as many existing installations as possible.
* `--reuse-deps` / `--fresh-roots`: upgrade (freshen) roots but reuse dependencies if possible.
* `--fresh`: install fresh versions of requested packages (roots) and their dependencies.
We've also introduced `--fresh-roots` as an alias for `--reuse-deps` to make it more clear
that it may give you fresh versions. More details in #41302 and #43988.
5. **More control over reused specs**
You can now control which packages to reuse and how. There is a new
`concretizer:reuse` config option, which accepts the following properties:
- `roots`: `true` to reuse roots, `false` to reuse just dependencies
- `exclude`: list of constraints used to select which specs *not* to reuse
- `include`: list of constraints used to select which specs *to* reuse
- `from`: list of sources for reused specs (some combination of `local`,
`buildcache`, or `external`)
For example, to reuse only specs compiled with GCC, you could write:
```yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
```
Or, if `openmpi` must be used from externals, and it must be the only external used:
```yaml
concretizer:
reuse:
roots: true
from:
- type: local
exclude: ["openmpi"]
- type: buildcache
exclude: ["openmpi"]
- type: external
include: ["openmpi"]
```
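As a toy sketch of how these per-source lists behave (illustrative only, not Spack's solver code; `select_reusable` is a hypothetical helper), each `from:` entry acts as a filter over the specs coming from that source:
```python
# Toy model: keep a spec if some rule matches its source, it is not in that
# rule's `exclude` list, and it passes the rule's `include` list (if any).
def select_reusable(specs, rules):
    selected = []
    for name, source in specs:
        for rule in rules:
            if rule["type"] != source:
                continue
            if name in rule.get("exclude", []):
                continue
            include = rule.get("include")
            if include is not None and name not in include:
                continue
            selected.append((name, source))
            break
    return selected

specs = [("openmpi", "local"), ("openmpi", "external"), ("zlib", "buildcache")]
rules = [
    {"type": "local", "exclude": ["openmpi"]},
    {"type": "buildcache", "exclude": ["openmpi"]},
    {"type": "external", "include": ["openmpi"]},
]
print(select_reusable(specs, rules))
# [('openmpi', 'external'), ('zlib', 'buildcache')]
```
This mirrors the YAML above: `openmpi` is only admitted from externals, while everything else may come from the local store or the build cache.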
6. **New `redistribute()` directive**
Some packages can't be redistributed in source or binary form. We need an explicit
way to say that in a package.
Now there is a `redistribute()` directive so that package authors can write:
```python
class MyPackage(Package):
redistribute(source=False, binary=False)
```
Like other directives, this works with `when=`:
```python
class MyPackage(Package):
# 12.0 and higher are proprietary
redistribute(source=False, binary=False, when="@12.0:")
# can't redistribute when we depend on some proprietary dependency
redistribute(source=False, binary=False, when="^proprietary-dependency")
```
More in #20185.
7. **New `conflict:` and `prefer:` syntax for package preferences**
Previously, you could express conflicts and preferences in `packages.yaml` through
some contortions with `require:`:
```yaml
packages:
zlib-ng:
require:
- one_of: ["%clang", "@:"] # conflict on %clang
- any_of: ["+shared", "@:"] # strong preference for +shared
```
You can now use `require:` and `prefer:` for a much more readable configuration:
```yaml
packages:
zlib-ng:
conflict:
- "%clang"
prefer:
- "+shared"
```
See [the documentation](https://spack.readthedocs.io/en/latest/packages_yaml.html#conflicts-and-strong-preferences)
and #41832 for more details.
8. **`include_concrete` in environments**
You may want to build on the *concrete* contents of another environment without
changing that environment. You can now include the concrete specs from another
environment's `spack.lock` with `include_concrete`:
```yaml
spack:
specs: []
concretizer:
unify: true
include_concrete:
- /path/to/environment1
- /path/to/environment2
```
Now, when *this* environment is concretized, it will bring in the already concrete
specs from `environment1` and `environment2`, and build on top of them without
changing them. This is useful if you have phased deployments, where old deployments
should not be modified but you want to use as many of them as possible. More details
in #33768.
9. **`python-venv` isolation**
Spack has unique requirements for Python because it:
1. installs every package in its own independent directory, and
2. allows users to register *external* python installations.
External installations may contain their own installed packages that can interfere
with Spack installations, and some distributions (Debian and Ubuntu) even change the
`sysconfig` in ways that alter the installation layout of installed Python packages
(e.g., with the addition of a `/local` prefix on Debian or Ubuntu). To isolate Spack
from these and other issues, we now insert a small `python-venv` package in between
`python` and packages that need to install Python code. This isolates Spack's build
environment, isolates Spack from any issues with an external python, and resolves a
large number of issues we've had with Python installations.
See #40773 for further details.
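A toy model of the insertion (not Spack's internals; `insert_venv` is a hypothetical helper written for illustration):
```python
# Toy model: packages that install Python code get a `python-venv` node
# between themselves and `python`, so their installs never depend on the
# (possibly external) interpreter's site layout directly.
def insert_venv(dag):
    """dag maps a package name to its list of dependencies."""
    patched = {}
    for pkg, deps in dag.items():
        if pkg != "python-venv" and "python" in deps:
            deps = ["python-venv" if d == "python" else d for d in deps]
        patched[pkg] = deps
    patched.setdefault("python-venv", ["python"])
    return patched

print(insert_venv({"py-numpy": ["python"], "python": []}))
# {'py-numpy': ['python-venv'], 'python': [], 'python-venv': ['python']}
```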
## New commands, options, and directives
* Allow packages to be pushed to build cache after install from source (#42423)
* `spack develop`: stage build artifacts in same root as non-dev builds #41373
* Don't delete `spack develop` build artifacts after install (#43424)
* `spack find`: add options for local/upstream only (#42999)
* `spack logs`: print log files for packages (either partially built or installed) (#42202)
* `patch`: support reversing patches (#43040)
* `develop`: Add -b/--build-directory option to set build_directory package attribute (#39606)
* `spack list`: add `--namespace` / `--repo` option (#41948)
* directives: add `checked_by` field to `license()`, add some license checks
* `spack gc`: add options for environments and build dependencies (#41731)
* Add `--create` to `spack env activate` (#40896)
## Performance improvements
* environment.py: fix excessive re-reads (#43746)
* ruamel yaml: fix quadratic complexity bug (#43745)
* Refactor to improve `spec format` speed (#43712)
* Do not acquire a write lock on the env post install if no views (#43505)
* asp.py: fewer calls to `spec.copy()` (#43715)
* spec.py: early return in `__str__`
* avoid `jinja2` import at startup unless needed (#43237)
## Other new features of note
* `archspec`: update to `v0.2.4`: support for Windows, bugfixes for `neoverse-v1` and
`neoverse-v2` detection.
* `spack config get`/`blame`: with no args, show entire config
* `spack env create <env>`: dir if dir-like (#44024)
* ASP-based solver: update os compatibility for macOS (#43862)
* Add handling of custom ssl certs in urllib ops (#42953)
* Add ability to rename environments (#43296)
* Add config option and compiler support to reuse across OS's (#42693)
* Support for prereleases (#43140)
* Only reuse externals when configured (#41707)
* Environments: Add support for including views (#42250)
## Binary caches
* Build cache: make signed/unsigned a mirror property (#41507)
* tools stack
## Removals, deprecations, and syntax changes
* remove `dpcpp` compiler and package (#43418)
* spack load: remove --only argument (#42120)
## Notable Bugfixes
* repo.py: drop deleted packages from provider cache (#43779)
* Allow `+` in module file names (#41999)
* `cmd/python`: use runpy to allow multiprocessing in scripts (#41789)
* Show extension commands with spack -h (#41726)
* Support environment variable expansion inside module projections (#42917)
* Alert user to failed concretizations (#42655)
* shell: fix zsh color formatting for PS1 in environments (#39497)
* spack mirror create --all: include patches (#41579)
## Spack community stats
* 7,994 total packages; 525 since `v0.21.0`
* 178 new Python packages, 5 new R packages
* 358 people contributed to this release
* 344 committers to packages
* 45 committers to core
# v0.21.2 (2024-03-01)
## Bugfixes
@@ -448,7 +27,7 @@
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)

View File

@@ -42,8 +42,8 @@ concretizer:
# "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: minimal
# Option to specify compatibility between operating systems for reuse of compilers and packages
# Specified as a key: [list] where the key is the os that is being targeted, and the list contains the OS's
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
# Option to specify compatiblity between operating systems for reuse of compilers and packages
# Specified as a key: [list] where the key is the os that is being targeted, and the list contains the OS's
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
# requires two entries i.e. os_compatible: {sonoma: [monterey], monterey: [sonoma]}
os_compatible: {}

View File

@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]

View File

@@ -1,3 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
iconv:
require: [libiconv]
all:
providers:
iconv: [glibc, musl, libiconv]

View File

@@ -37,9 +37,9 @@ packages:
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libc: [glibc, musl]
libgfortran: [gcc-runtime]
libgfortran: [ gcc-runtime ]
libglx: [mesa+glx]
libifcore: [intel-oneapi-runtime]
libifcore: [ intel-oneapi-runtime ]
libllvm: [llvm]
lua-lang: [lua, lua-luajit-openresty, lua-luajit]
luajit: [lua-luajit-openresty, lua-luajit]

View File

@@ -147,6 +147,15 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
def autoreconf(self, spec, prefix):
which("bash")("autogen.sh")
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
class AutotoolsBuilder(AutotoolsBuilder):
def autoreconf(self, pkg, spec, prefix):
which("bash")("autogen.sh")
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""

View File

@@ -25,7 +25,7 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI

View File

@@ -5,9 +5,9 @@
.. chain:
=============================================
Chaining Spack Installations (upstreams.yaml)
=============================================
============================
Chaining Spack Installations
============================
You can point your Spack installation to another installation to use any
packages that are installed there. To register the other Spack instance,

View File

@@ -184,7 +184,7 @@ Style Tests
Spack uses `Flake8 <http://flake8.pycqa.org/en/latest/>`_ to test for
`PEP 8 <https://www.python.org/dev/peps/pep-0008/>`_ conformance and
`mypy <https://mypy.readthedocs.io/en/stable/>`_ for type checking. PEP 8 is
`mypy <https://mypy.readthedocs.io/en/stable/>` for type checking. PEP 8 is
a series of style guides for Python that provide suggestions for everything
from variable naming to indentation. In order to limit the number of PRs that
were mostly style changes, we decided to enforce PEP 8 conformance. Your PR

View File

@@ -716,27 +716,27 @@ Release branches
^^^^^^^^^^^^^^^^
There are currently two types of Spack releases: :ref:`major releases
<major-releases>` (``0.21.0``, ``0.22.0``, etc.) and :ref:`patch releases
<patch-releases>` (``0.22.1``, ``0.22.2``, ``0.22.3``, etc.). Here is a
<major-releases>` (``0.17.0``, ``0.18.0``, etc.) and :ref:`point releases
<point-releases>` (``0.17.1``, ``0.17.2``, ``0.17.3``, etc.). Here is a
diagram of how Spack release branches work::
o branch: develop (latest version, v0.23.0.dev0)
o branch: develop (latest version, v0.19.0.dev0)
|
o
| o branch: releases/v0.22, tag: v0.22.1
| o branch: releases/v0.18, tag: v0.18.1
o |
| o tag: v0.22.0
| o tag: v0.18.0
o |
| o
|/
o
|
o
| o branch: releases/v0.21, tag: v0.21.2
| o branch: releases/v0.17, tag: v0.17.2
o |
| o tag: v0.21.1
| o tag: v0.17.1
o |
| o tag: v0.21.0
| o tag: v0.17.0
o |
| o
|/
@@ -747,8 +747,8 @@ requests target ``develop``. The ``develop`` branch will report that its
version is that of the next **major** release with a ``.dev0`` suffix.
Each Spack release series also has a corresponding branch, e.g.
``releases/v0.22`` has ``v0.22.x`` versions of Spack, and
``releases/v0.21`` has ``v0.21.x`` versions. A major release is the first
``releases/v0.18`` has ``0.18.x`` versions of Spack, and
``releases/v0.17`` has ``0.17.x`` versions. A major release is the first
tagged version on a release branch. Minor releases are back-ported from
develop onto release branches. This is typically done by cherry-picking
bugfix commits off of ``develop``.
@@ -778,40 +778,27 @@ for more details.
Scheduling work for releases
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
We schedule work for **major releases** through `milestones
<https://github.com/spack/spack/milestones>`_ and `GitHub Projects
<https://github.com/spack/spack/projects>`_, while **patch releases** use `labels
<https://github.com/spack/spack/labels>`_.
We schedule work for releases by creating `GitHub projects
<https://github.com/spack/spack/projects>`_. At any time, there may be
several open release projects. For example, below are two releases (from
some past version of the page linked above):
There is only one milestone open at a time. Its name corresponds to the next major version, for
example ``v0.23``. Important issues and pull requests should be assigned to this milestone by
core developers, so that they are not forgotten at the time of release. The milestone is closed
when the release is made, and a new milestone is created for the next major release.
.. image:: images/projects.png
Bug reports in GitHub issues are automatically labelled ``bug`` and ``triage``. Spack developers
assign one of the labels ``impact-low``, ``impact-medium`` or ``impact-high``. This will make the
issue appear in the `Triaged bugs <https://github.com/orgs/spack/projects/6>`_ project board.
Important issues should be assigned to the next milestone as well, so they appear at the top of
the project board.
This image shows one release in progress for ``0.15.1`` and another for
``0.16.0``. Each of these releases has a project board containing issues
and pull requests. GitHub shows a status bar with completed work in
green, work in progress in purple, and work not started yet in gray, so
it's fairly easy to see progress.
Spack's milestones are not firm commitments so we move work between releases frequently. If we
need to make a release and some tasks are not yet done, we will simply move them to the next major
release milestone, rather than delaying the release to complete them.
Spack's project boards are not firm commitments so we move work between
releases frequently. If we need to make a release and some tasks are not
yet done, we will simply move them to the next minor or major release, rather
than delaying the release to complete them.
^^^^^^^^^^^^^^^^^^^^^
Backporting bug fixes
^^^^^^^^^^^^^^^^^^^^^
For more on using GitHub project boards, see `GitHub's documentation
<https://docs.github.com/en/github/managing-your-work-on-github/about-project-boards>`_.
When a bug is fixed in the ``develop`` branch, it is often necessary to backport the fix to one
(or more) of the ``release/vX.Y`` branches. Only the release manager is responsible for doing
backports, but Spack maintainers are responsible for labelling pull requests (and issues if no bug
fix is available yet) with ``vX.Y.Z`` labels. The label should correspond to the next patch version
that the bug fix should be backported to.
Backports are done publicly by the release manager using a pull request named ``Backports vX.Y.Z``.
This pull request is opened from the ``backports/vX.Y.Z`` branch, targets the ``releases/vX.Y``
branch and contains a (growing) list of cherry-picked commits from the ``develop`` branch.
Typically there are one or two backport pull requests open at any given time.
.. _major-releases:
@@ -819,21 +806,25 @@ Typically there are one or two backport pull requests open at any given time.
Making major releases
^^^^^^^^^^^^^^^^^^^^^
Assuming all required work from the milestone is completed, the steps to make the major release
are:
Assuming a project board has already been created and all required work
completed, the steps to make the major release are:
#. `Create a new milestone <https://github.com/spack/spack/milestones>`_ for the next major
release.
#. Create two new project boards:
#. `Create a new label <https://github.com/spack/spack/labels>`_ for the next patch release.
* One for the next major release
* One for the next point release
#. Move any optional tasks that are not done to the next milestone.
#. Move any optional tasks that are not done to one of the new project boards.
In general, small bugfixes should go to the next point release. Major
features, refactors, and changes that could affect concretization should
go in the next major release.
#. Create a branch for the release, based on ``develop``:
.. code-block:: console
$ git checkout -b releases/v0.23 develop
$ git checkout -b releases/v0.15 develop
For a version ``vX.Y.Z``, the branch's name should be
``releases/vX.Y``. That is, you should create a ``releases/vX.Y``
@@ -869,8 +860,8 @@ are:
Create a pull request targeting the ``develop`` branch, bumping the major
version in ``lib/spack/spack/__init__.py`` with a ``dev0`` release segment.
For instance when you have just released ``v0.23.0``, set the version
to ``(0, 24, 0, 'dev0')`` on ``develop``.
For instance when you have just released ``v0.15.0``, set the version
to ``(0, 16, 0, 'dev0')`` on ``develop``.
#. Follow the steps in :ref:`publishing-releases`.
@@ -879,52 +870,82 @@ are:
#. Follow the steps in :ref:`announcing-releases`.
.. _patch-releases:
.. _point-releases:
^^^^^^^^^^^^^^^^^^^^^
Making patch releases
Making point releases
^^^^^^^^^^^^^^^^^^^^^
To make the patch release process both efficient and transparent, we use a *backports pull request*
which contains cherry-picked commits from the ``develop`` branch. The majority of the work is to
cherry-pick the bug fixes, which ideally should be done as soon as they land on ``develop``:
this ensures cherry-picking happens in order, and makes conflicts easier to resolve since the
changes are fresh in the mind of the developer.
Assuming a project board has already been created and all required work
completed, the steps to make the point release are:
The backports pull request is always titled ``Backports vX.Y.Z`` and is labelled ``backports``. It
is opened from a branch named ``backports/vX.Y.Z`` and targets the ``releases/vX.Y`` branch.
#. Create a new project board for the next point release.
Whenever a pull request labelled ``vX.Y.Z`` is merged, cherry-pick the associated squashed commit
on ``develop`` to the ``backports/vX.Y.Z`` branch. For pull requests that were rebased (or not
squashed), cherry-pick each associated commit individually. Never force push to the
``backports/vX.Y.Z`` branch.
#. Move any optional tasks that are not done to the next project board.
.. warning::
#. Check out the release branch (it should already exist).
Sometimes you may **still** get merge conflicts even if you have
cherry-picked all the commits in order. This generally means there
is some other intervening pull request that the one you're trying
to pick depends on. In these cases, you'll need to make a judgment
call regarding those pull requests. Consider the number of affected
files and/or the resulting differences.
For the ``X.Y.Z`` release, the release branch is called ``releases/vX.Y``.
For ``v0.15.1``, you would check out ``releases/v0.15``:
1. If the changes are small, you might just cherry-pick it.
.. code-block:: console
2. If the changes are large, then you may decide that this fix is not
worth including in a patch release, in which case you should remove
the label from the pull request. Remember that large, manual backports
are seldom the right choice for a patch release.
$ git checkout releases/v0.15
When all commits are cherry-picked in the ``backports/vX.Y.Z`` branch, make the patch
release as follows:
#. If a pull request to the release branch named ``Backports vX.Y.Z`` is not already
in the project, create it. This pull request ought to be created as early as
possible when working on a release project, so that we can build the release
commits incrementally, and identify potential conflicts at an early stage.
#. `Create a new label <https://github.com/spack/spack/labels>`_ ``vX.Y.{Z+1}`` for the next patch
release.
#. Cherry-pick each pull request in the ``Done`` column of the release
project board onto the ``Backports vX.Y.Z`` pull request.
#. Replace the label ``vX.Y.Z`` with ``vX.Y.{Z+1}`` for all PRs and issues that are not done.
This is **usually** fairly simple since we squash the commits from the
vast majority of pull requests. That means there is only one commit
per pull request to cherry-pick. For example, `this pull request
<https://github.com/spack/spack/pull/15777>`_ has three commits, but
they were squashed into a single commit on merge. You can see the
commit that was created here:
#. Manually push a single commit with commit message ``Set version to vX.Y.Z`` to the
``backports/vX.Y.Z`` branch that both bumps the Spack version number and updates the changelog:
.. image:: images/pr-commit.png
You can easily cherry pick it like this (assuming you already have the
release branch checked out):
.. code-block:: console
$ git cherry-pick 7e46da7
For pull requests that were rebased (or not squashed), you'll need to
cherry-pick each associated commit individually.
.. warning::
It is important to cherry-pick commits in the order they happened,
otherwise you can get conflicts while cherry-picking. When
cherry-picking look at the merge date,
**not** the number of the pull request or the date it was opened.
Sometimes you may **still** get merge conflicts even if you have
cherry-picked all the commits in order. This generally means there
is some other intervening pull request that the one you're trying
to pick depends on. In these cases, you'll need to make a judgment
call regarding those pull requests. Consider the number of affected
files and or the resulting differences.
1. If the dependency changes are small, you might just cherry-pick it,
too. If you do this, add the task to the release board.
2. If the changes are large, then you may decide that this fix is not
worth including in a point release, in which case you should remove
the task from the release project.
3. You can always decide to manually back-port the fix to the release
branch if neither of the above options makes sense, but this can
require a lot of work. It's seldom the right choice.
#. When all the commits from the project board are cherry-picked into
the ``Backports vX.Y.Z`` pull request, you can push a commit to:
1. Bump the version in ``lib/spack/spack/__init__.py``.
2. Update ``CHANGELOG.md`` with a list of the changes.
@@ -933,22 +954,20 @@ release as follows:
release branch. See `the changelog from 0.14.1
<https://github.com/spack/spack/commit/ff0abb9838121522321df2a054d18e54b566b44a>`_.
#. Make sure CI passes on the **backports pull request**, including:
#. Merge the ``Backports vX.Y.Z`` PR with the **Rebase and merge** strategy. This
is needed to keep track in the release branch of all the commits that were
cherry-picked.
#. Make sure CI passes on the release branch, including:
* Regular unit tests
* Build tests
* The E4S pipeline at `gitlab.spack.io <https://gitlab.spack.io>`_
#. Merge the ``Backports vX.Y.Z`` PR with the **Rebase and merge** strategy. This
is needed to keep track in the release branch of all the commits that were
cherry-picked.
#. Make sure CI passes on the last commit of the **release branch**.
#. In the rare case you need to include additional commits in the patch release after the backports
PR is merged, it is best to delete the last commit ``Set version to vX.Y.Z`` from the release
branch with a single force push, open a new backports PR named ``Backports vX.Y.Z (2)``, and
repeat the process. Avoid repeated force pushes to the release branch.
If CI does not pass, you'll need to figure out why, and make changes
to the release branch until it does. You can make more commits, modify
or remove cherry-picked commits, or cherry-pick **more** from
``develop`` to make this happen.
#. Follow the steps in :ref:`publishing-releases`.
@@ -1023,31 +1042,25 @@ Updating `releases/latest`
If the new release is the **highest** Spack release yet, you should
also tag it as ``releases/latest``. For example, suppose the highest
release is currently ``0.22.3``:
release is currently ``0.15.3``:
* If you are releasing ``0.22.4`` or ``0.23.0``, then you should tag
it with ``releases/latest``, as these are higher than ``0.22.3``.
* If you are releasing ``0.15.4`` or ``0.16.0``, then you should tag
it with ``releases/latest``, as these are higher than ``0.15.3``.
* If you are making a new release of an **older** major version of
Spack, e.g. ``0.21.4``, then you should not tag it as
Spack, e.g. ``0.14.4``, then you should not tag it as
``releases/latest`` (as there are newer major versions).
To do so, first fetch the latest tag created on GitHub, since you may not have it locally:
To tag ``releases/latest``, do this:
.. code-block:: console
$ git fetch --force git@github.com:spack/spack vX.Y.Z
$ git checkout releases/vX.Y # vX.Y is the new release's branch
$ git tag --force releases/latest
$ git push --force --tags
Then tag ``vX.Y.Z`` as ``releases/latest`` and push the individual tag to GitHub.
.. code-block:: console
$ git tag --force releases/latest vX.Y.Z
$ git push --force git@github.com:spack/spack releases/latest
The ``--force`` argument to ``git tag`` makes ``git`` overwrite the existing ``releases/latest``
tag with the new one. Do **not** use the ``--tags`` flag when pushing, since this will push *all*
local tags.
The ``--force`` argument to ``git tag`` makes ``git`` overwrite the existing
``releases/latest`` tag with the new one.
.. _announcing-releases:

View File

@@ -35,7 +35,7 @@ A build matrix showing which packages are working on which systems is shown belo
.. code-block:: console
apt update
apt install bzip2 ca-certificates file g++ gcc gfortran git gzip lsb-release patch python3 tar unzip xz-utils zstd
apt install build-essential ca-certificates coreutils curl environment-modules gfortran git gpg lsb-release python3 python3-distutils python3-venv unzip zip
.. tab-item:: RHEL
@@ -43,14 +43,14 @@ A build matrix showing which packages are working on which systems is shown belo
dnf install epel-release
dnf group install "Development Tools"
dnf install gcc-gfortran redhat-lsb-core python3 unzip
dnf install curl findutils gcc-gfortran gnupg2 hostname iproute redhat-lsb-core python3 python3-pip python3-setuptools unzip python3-boto3
.. tab-item:: macOS Brew
.. code-block:: console
brew update
brew install gcc git zip
brew install curl gcc git gnupg zip
------------
Installation

Binary image file not shown (after: 44 KiB).

Binary image file not shown (after: 68 KiB).

View File

@@ -12,6 +12,10 @@
Spack
===================
.. epigraph::
`These are docs for the Spack package manager. For sphere packing, see` `pyspack <https://pyspack.readthedocs.io>`_.
Spack is a package management tool designed to support multiple
versions and configurations of software on a wide variety of platforms
and environments. It was designed for large supercomputing centers,

View File

@@ -2442,14 +2442,15 @@ with. For example, suppose that in the ``libdwarf`` package you write:
depends_on("libelf@0.8")
Now ``libdwarf`` will require ``libelf`` in the range ``0.8``, which
includes patch versions ``0.8.1``, ``0.8.2``, etc. Apart from version
restrictions, you can also specify variants if this package requires
optional features of the dependency.
Now ``libdwarf`` will require ``libelf`` at *exactly* version ``0.8``.
You can also specify a requirement for a particular variant or for
specific compiler flags:
.. code-block:: python
depends_on("libelf@0.8 +parser +pic")
depends_on("libelf@0.8+debug")
depends_on("libelf debug=True")
depends_on("libelf cppflags='-fPIC'")
Both users *and* package authors can use the same spec syntax to refer
to different package configurations. Users use the spec syntax on the
@@ -2457,82 +2458,46 @@ command line to find installed packages or to install packages with
particular constraints, and package authors can use specs to describe
relationships between packages.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Specifying backward and forward compatibility
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^
Version ranges
^^^^^^^^^^^^^^
Packages are often compatible with a range of versions of their
dependencies. This is typically referred to as backward and forward
compatibility. Spack allows you to specify this in the ``depends_on``
directive using version ranges.
**Backwards compatibility** means that the package requires at least a
certain version of its dependency:
Although some packages require a specific version for their dependencies,
most can be built with a range of versions. For example, if you are
writing a package for a legacy Python module that only works with Python
2.4 through 2.6, this would look like:
.. code-block:: python
depends_on("python@3.10:")
depends_on("python@2.4:2.6")
In this case, the package requires Python 3.10 or newer.
Commonly, packages drop support for older versions of a dependency as
they release new versions. In Spack you can conveniently add every
backward compatibility rule as a separate line:
Version ranges in Spack are *inclusive*, so ``2.4:2.6`` means any version
greater than or equal to ``2.4`` and up to and including any ``2.6.x``. If
you want to specify that a package works with any version of Python 3 (or
higher), this would look like:
.. code-block:: python
# backward compatibility with Python
depends_on("python@3.8:")
depends_on("python@3.9:", when="@1.2:")
depends_on("python@3.10:", when="@1.4:")
depends_on("python@3:")
This means that in general we need Python 3.8 or newer; from version
1.2 onwards we need Python 3.9 or newer; from version 1.4 onwards we
need Python 3.10 or newer. Notice that it's fine to have overlapping
ranges in the ``when`` clauses.
**Forward compatibility** means that the package requires at most a
certain version of its dependency. Forward compatibility rules are
necessary when there are breaking changes in the dependency that the
package cannot handle. In Spack we often add forward compatibility
bounds only at the time a new, breaking version of a dependency is
released. As with backward compatibility, it is typical to see a list
of forward compatibility bounds in a package file as separate lines:
Here we leave out the upper bound. If you want to say that a package
requires Python 2, you can similarly leave out the lower bound:
.. code-block:: python
# forward compatibility with Python
depends_on("python@:3.12", when="@:1.10")
depends_on("python@:3.13", when="@:1.12")
depends_on("python@:2")
Notice how the ``:`` now appears before the version number both in the
dependency and in the ``when`` clause. This tells Spack that in general
we need Python 3.13 or older up to version ``1.12.x``, and up to version
``1.10.x`` we need Python 3.12 or older. Said differently, forward compatibility
with Python 3.13 was added in version 1.11, while version 1.13 added forward
compatibility with Python 3.14.
Notice that we didn't use ``@:3``. Version ranges are *inclusive*, so
``@:3`` means "up to and including any 3.x version".
Notice that a version range ``@:3.12`` includes *any* patch version
number ``3.12.x``, which is often useful when specifying forward compatibility
bounds.
So far we have seen open-ended version ranges, which is by far the most
common use case. It is also possible to specify both a lower and an upper bound
on the version of a dependency, like this:
You can also simply write
.. code-block:: python
depends_on("python@3.10:3.12")
depends_on("python@2.7")
There is short syntax to specify that a package is compatible with say any
``3.x`` version:
.. code-block:: python
depends_on("python@3")
The above is equivalent to ``depends_on("python@3:3")``, which means at least
Python version 3 and at most any version ``3.x.y``.
to tell Spack that the package needs Python 2.7.x. This is equivalent to
``@2.7:2.7``.
In very rare cases, you may need to specify an exact version, for example
if you need to distinguish between ``3.2`` and ``3.2.1``:

View File

@@ -476,9 +476,3 @@ implemented using Python's built-in `sys.path
:py:mod:`spack.repo` module implements a custom `Python importer
<https://docs.python.org/2/library/imp.html>`_.
.. warning::
The mechanism for extending packages is not yet extensively tested,
and extending packages across repositories imposes inter-repo
dependencies, which may be hard to manage. Use this feature at your
own risk, but let us know if you have a use case for it.

View File

@@ -4,9 +4,9 @@ sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.17.2
pygments==2.18.0
urllib3==2.2.1
pytest==8.2.0
pytest==8.2.1
isort==5.13.2
black==24.4.2
flake8==7.0.0

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.5 (commit 38ce485258ffc4fc6dd6688f8dc90cb269478c47)
* Version: 0.2.4 (commit 48b92512b9ce203ded0ebd1ac41b42593e931f7c)
astunparse
----------------

View File

@@ -47,11 +47,7 @@ def decorator(factory):
def partial_uarch(
name: str = "",
vendor: str = "",
features: Optional[Set[str]] = None,
generation: int = 0,
cpu_part: str = "",
name: str = "", vendor: str = "", features: Optional[Set[str]] = None, generation: int = 0
) -> Microarchitecture:
"""Construct a partial microarchitecture, from information gathered during system scan."""
return Microarchitecture(
@@ -61,7 +57,6 @@ def partial_uarch(
features=features or set(),
compilers={},
generation=generation,
cpu_part=cpu_part,
)
@@ -95,7 +90,6 @@ def proc_cpuinfo() -> Microarchitecture:
return partial_uarch(
vendor=_canonicalize_aarch64_vendor(data),
features=_feature_set(data, key="Features"),
cpu_part=data.get("CPU part", ""),
)
if architecture in (PPC64LE, PPC64):
@@ -351,10 +345,6 @@ def sorting_fn(item):
generic_candidates = [c for c in candidates if c.vendor == "generic"]
best_generic = max(generic_candidates, key=sorting_fn)
# Relevant for AArch64. Filter on "cpu_part" if we have any match
if info.cpu_part != "" and any(c for c in candidates if info.cpu_part == c.cpu_part):
candidates = [c for c in candidates if info.cpu_part == c.cpu_part]
# Filter the candidates to be descendant of the best generic candidate.
# This is to avoid that the lack of a niche feature that can be disabled
# from e.g. BIOS prevents detection of a reasonably performant architecture

View File

@@ -2,7 +2,9 @@
# Archspec Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Types and functions to manage information on CPU microarchitectures."""
"""Types and functions to manage information
on CPU microarchitectures.
"""
import functools
import platform
import re
@@ -63,31 +65,23 @@ class Microarchitecture:
passed in as argument above.
* versions: versions that support this micro-architecture.
generation (int): generation of the micro-architecture, if relevant.
cpu_part (str): cpu part of the architecture, if relevant.
generation (int): generation of the micro-architecture, if
relevant.
"""
# pylint: disable=too-many-arguments,too-many-instance-attributes
# pylint: disable=too-many-arguments
#: Aliases for micro-architecture's features
feature_aliases = FEATURE_ALIASES
def __init__(self, name, parents, vendor, features, compilers, generation=0, cpu_part=""):
def __init__(self, name, parents, vendor, features, compilers, generation=0):
self.name = name
self.parents = parents
self.vendor = vendor
self.features = features
self.compilers = compilers
# Only relevant for PowerPC
self.generation = generation
# Only relevant for AArch64
self.cpu_part = cpu_part
# Cache the "ancestor" computation
# Cache the ancestor computation
self._ancestors = None
# Cache the "generic" computation
self._generic = None
# Cache the "family" computation
self._family = None
@property
def ancestors(self):
@@ -117,12 +111,8 @@ def __eq__(self, other):
and self.parents == other.parents # avoid ancestors here
and self.compilers == other.compilers
and self.generation == other.generation
and self.cpu_part == other.cpu_part
)
def __hash__(self):
return hash(self.name)
@coerce_target_names
def __ne__(self, other):
return not self == other
@@ -153,8 +143,7 @@ def __repr__(self):
cls_name = self.__class__.__name__
fmt = (
cls_name + "({0.name!r}, {0.parents!r}, {0.vendor!r}, "
"{0.features!r}, {0.compilers!r}, generation={0.generation!r}, "
"cpu_part={0.cpu_part!r})"
"{0.features!r}, {0.compilers!r}, {0.generation!r})"
)
return fmt.format(self)
@@ -179,22 +168,18 @@ def __contains__(self, feature):
@property
def family(self):
"""Returns the architecture family a given target belongs to"""
if self._family is None:
roots = [x for x in [self] + self.ancestors if not x.ancestors]
msg = "a target is expected to belong to just one architecture family"
msg += f"[found {', '.join(str(x) for x in roots)}]"
assert len(roots) == 1, msg
self._family = roots.pop()
roots = [x for x in [self] + self.ancestors if not x.ancestors]
msg = "a target is expected to belong to just one architecture family"
msg += f"[found {', '.join(str(x) for x in roots)}]"
assert len(roots) == 1, msg
return self._family
return roots.pop()
@property
def generic(self):
"""Returns the best generic architecture that is compatible with self"""
if self._generic is None:
generics = [x for x in [self] + self.ancestors if x.vendor == "generic"]
self._generic = max(generics, key=lambda x: len(x.ancestors))
return self._generic
generics = [x for x in [self] + self.ancestors if x.vendor == "generic"]
return max(generics, key=lambda x: len(x.ancestors))
def to_dict(self):
"""Returns a dictionary representation of this object."""
@@ -205,7 +190,6 @@ def to_dict(self):
"generation": self.generation,
"parents": [str(x) for x in self.parents],
"compilers": self.compilers,
"cpupart": self.cpu_part,
}
@staticmethod
@@ -218,7 +202,6 @@ def from_dict(data) -> "Microarchitecture":
features=set(data["features"]),
compilers=data.get("compilers", {}),
generation=data.get("generation", 0),
cpu_part=data.get("cpupart", ""),
)
def optimization_flags(self, compiler, version):
@@ -377,11 +360,8 @@ def fill_target_from_dict(name, data, targets):
features = set(values["features"])
compilers = values.get("compilers", {})
generation = values.get("generation", 0)
cpu_part = values.get("cpupart", "")
targets[name] = Microarchitecture(
name, parents, vendor, features, compilers, generation=generation, cpu_part=cpu_part
)
targets[name] = Microarchitecture(name, parents, vendor, features, compilers, generation)
known_targets = {}
data = archspec.cpu.schema.TARGETS_JSON["microarchitectures"]

View File

@@ -1482,6 +1482,7 @@
"cldemote",
"movdir64b",
"movdiri",
"pdcm",
"serialize",
"waitpkg"
],
@@ -2224,96 +2225,14 @@
],
"nvhpc": [
{
"versions": "21.11:23.8",
"versions": "21.11:",
"name": "zen3",
"flags": "-tp {name}",
"warnings": "zen4 is not fully supported by nvhpc versions < 23.9, falling back to zen3"
},
{
"versions": "23.9:",
"flags": "-tp {name}"
"warnings": "zen4 is not fully supported by nvhpc yet, falling back to zen3"
}
]
}
},
"zen5": {
"from": ["zen4"],
"vendor": "AuthenticAMD",
"features": [
"abm",
"aes",
"avx",
"avx2",
"avx512_bf16",
"avx512_bitalg",
"avx512bw",
"avx512cd",
"avx512dq",
"avx512f",
"avx512ifma",
"avx512vbmi",
"avx512_vbmi2",
"avx512vl",
"avx512_vnni",
"avx512_vp2intersect",
"avx512_vpopcntdq",
"avx_vnni",
"bmi1",
"bmi2",
"clflushopt",
"clwb",
"clzero",
"cppc",
"cx16",
"f16c",
"flush_l1d",
"fma",
"fsgsbase",
"gfni",
"ibrs_enhanced",
"mmx",
"movbe",
"movdir64b",
"movdiri",
"pclmulqdq",
"popcnt",
"rdseed",
"sse",
"sse2",
"sse4_1",
"sse4_2",
"sse4a",
"ssse3",
"tsc_adjust",
"vaes",
"vpclmulqdq",
"xsavec",
"xsaveopt"
],
"compilers": {
"gcc": [
{
"versions": "14.1:",
"name": "znver5",
"flags": "-march={name} -mtune={name}"
}
],
"aocc": [
{
"versions": "5.0:",
"name": "znver5",
"flags": "-march={name} -mtune={name}"
}
],
"clang": [
{
"versions": "19.1:",
"name": "znver5",
"flags": "-march={name} -mtune={name}"
}
]
}
},
"ppc64": {
"from": [],
"vendor": "generic",
@@ -2792,8 +2711,7 @@
"flags": "-mcpu=thunderx2t99"
}
]
},
"cpupart": "0x0af"
}
},
"a64fx": {
"from": ["armv8.2a"],
@@ -2861,8 +2779,7 @@
"flags": "-march=armv8.2-a+crc+crypto+fp16+sve"
}
]
},
"cpupart": "0x001"
}
},
"cortex_a72": {
"from": ["aarch64"],
@@ -2899,8 +2816,7 @@
"flags" : "-mcpu=cortex-a72"
}
]
},
"cpupart": "0xd08"
}
},
"neoverse_n1": {
"from": ["cortex_a72", "armv8.2a"],
@@ -2921,7 +2837,8 @@
"asimdrdm",
"lrcpc",
"dcpop",
"asimddp"
"asimddp",
"ssbs"
],
"compilers" : {
"gcc": [
@@ -2985,8 +2902,7 @@
"flags": "-tp {name}"
}
]
},
"cpupart": "0xd0c"
}
},
"neoverse_v1": {
"from": ["neoverse_n1", "armv8.4a"],
@@ -3010,6 +2926,8 @@
"lrcpc",
"dcpop",
"sha3",
"sm3",
"sm4",
"asimddp",
"sha512",
"sve",
@@ -3018,6 +2936,7 @@
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"dcpodp",
"svei8mm",
"svebf16",
@@ -3085,7 +3004,7 @@
},
{
"versions": "11:",
"flags" : "-march=armv8.4-a+sve+fp16+bf16+crypto+i8mm+rng"
"flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
},
{
"versions": "12:",
@@ -3109,8 +3028,7 @@
"flags": "-tp {name}"
}
]
},
"cpupart": "0xd40"
}
},
"neoverse_v2": {
"from": ["neoverse_n1", "armv9.0a"],
@@ -3134,22 +3052,32 @@
"lrcpc",
"dcpop",
"sha3",
"sm3",
"sm4",
"asimddp",
"sha512",
"sve",
"asimdfhm",
"dit",
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"dcpodp",
"sve2",
"sveaes",
"svepmull",
"svebitperm",
"svesha3",
"svesm4",
"flagm2",
"frint",
"svei8mm",
"svebf16",
"i8mm",
"bf16"
"bf16",
"dgh"
],
"compilers" : {
"gcc": [
@@ -3174,19 +3102,15 @@
"flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
},
{
"versions": "10.0:11.3.99",
"versions": "10.0:11.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
},
{
"versions": "11.4:11.99",
"flags" : "-mcpu=neoverse-v2"
},
{
"versions": "12.0:12.2.99",
"versions": "12.0:12.99",
"flags" : "-march=armv9-a+i8mm+bf16 -mtune=cortex-a710"
},
{
"versions": "12.3:",
"versions": "13.0:",
"flags" : "-mcpu=neoverse-v2"
}
],
@@ -3221,112 +3145,7 @@
"flags": "-tp {name}"
}
]
},
"cpupart": "0xd4f"
},
"neoverse_n2": {
"from": ["neoverse_n1", "armv9.0a"],
"vendor": "ARM",
"features": [
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
"crc32",
"atomics",
"fphp",
"asimdhp",
"cpuid",
"asimdrdm",
"jscvt",
"fcma",
"lrcpc",
"dcpop",
"sha3",
"asimddp",
"sha512",
"sve",
"asimdfhm",
"uscat",
"ilrcpc",
"flagm",
"sb",
"dcpodp",
"sve2",
"flagm2",
"frint",
"svei8mm",
"svebf16",
"i8mm",
"bf16"
],
"compilers" : {
"gcc": [
{
"versions": "4.8:5.99",
"flags": "-march=armv8-a"
},
{
"versions": "6:6.99",
"flags" : "-march=armv8.1-a"
},
{
"versions": "7.0:7.99",
"flags" : "-march=armv8.2-a -mtune=cortex-a72"
},
{
"versions": "8.0:8.99",
"flags" : "-march=armv8.4-a+sve -mtune=cortex-a72"
},
{
"versions": "9.0:9.99",
"flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
},
{
"versions": "10.0:10.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
},
{
"versions": "11.0:",
"flags" : "-mcpu=neoverse-n2"
}
],
"clang" : [
{
"versions": "9.0:10.99",
"flags" : "-march=armv8.5-a+sve"
},
{
"versions": "11.0:13.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16"
},
{
"versions": "14.0:15.99",
"flags" : "-march=armv9-a+i8mm+bf16"
},
{
"versions": "16.0:",
"flags" : "-mcpu=neoverse-n2"
}
],
"arm" : [
{
"versions": "23.04.0:",
"flags" : "-mcpu=neoverse-n2"
}
],
"nvhpc" : [
{
"versions": "23.3:",
"name": "neoverse-n1",
"flags": "-tp {name}"
}
]
},
"cpupart": "0xd49"
}
},
"m1": {
"from": ["armv8.4a"],
@@ -3392,8 +3211,7 @@
"flags" : "-mcpu=apple-m1"
}
]
},
"cpupart": "0x022"
}
},
"m2": {
"from": ["m1", "armv8.5a"],
@@ -3471,8 +3289,7 @@
"flags" : "-mcpu=apple-m2"
}
]
},
"cpupart": "0x032"
}
},
"arm": {
"from": [],

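The compiler entries in this JSON drive Microarchitecture.optimization_flags, which picks the first entry whose version range matches and substitutes {name} into the flags template. A hedged sketch of the nvhpc/zen4 gating above (version strings illustrative):

    import archspec.cpu

    zen4 = archspec.cpu.TARGETS["zen4"]
    # nvhpc in 21.11:23.8 matches the fallback entry: "-tp zen3", plus a warning
    print(zen4.optimization_flags("nvhpc", "23.3"))
    # nvhpc 23.9: matches the direct entry: "-tp zen4"
    print(zen4.optimization_flags("nvhpc", "23.9"))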
View File

@@ -52,9 +52,6 @@
}
}
}
},
"cpupart": {
"type": "string"
}
},
"required": [
@@ -110,4 +107,4 @@
"additionalProperties": false
}
}
}
}

View File

@@ -18,10 +18,9 @@
import threading
import traceback
from contextlib import contextmanager
from multiprocessing.connection import Connection
from threading import Thread
from types import ModuleType
from typing import Callable, Optional
from typing import Optional
import llnl.util.tty as tty
@@ -330,6 +329,49 @@ def close(self):
self.file.close()
class MultiProcessFd:
"""Return an object which stores a file descriptor and can be passed as an
argument to a function run with ``multiprocessing.Process``, such that
the file descriptor is available in the subprocess."""
def __init__(self, fd):
self._connection = None
self._fd = None
if sys.version_info >= (3, 8):
self._connection = multiprocessing.connection.Connection(fd)
else:
self._fd = fd
@property
def fd(self):
if self._connection:
return self._connection._handle
else:
return self._fd
def close(self):
if self._connection:
self._connection.close()
else:
os.close(self._fd)
def close_connection_and_file(multiprocess_fd, file):
# MultiProcessFd is intended to transmit a FD to a child process; the FD
# is then opened to a Python file object (using fdopen). On Python >= 3.8,
# MultiProcessFd encapsulates a multiprocessing.connection.Connection,
# which closes the FD when it is deleted and prints a warning about
# duplicate closure if it is not explicitly closed. On Python < 3.8,
# MultiProcessFd encapsulates a plain FD, and closing that FD here would
# conflict with closing the file object. Therefore this function must
# choose whether to close the file or the Connection.
if sys.version_info >= (3, 8):
multiprocess_fd.close()
else:
file.close()
@contextmanager
def replace_environment(env):
"""Replace the current environment (`os.environ`) with `env`.
@@ -487,20 +529,22 @@ def __enter__(self):
# forcing debug output.
self._saved_debug = tty._debug
# Pipe for redirecting output to logger
read_fd, self.write_fd = multiprocessing.Pipe(duplex=False)
# OS-level pipe for redirecting output to logger
read_fd, write_fd = os.pipe()
# Pipe for communication back from the daemon
read_multiprocess_fd = MultiProcessFd(read_fd)
# Multiprocessing pipe for communication back from the daemon
# Currently only used to save echo value between uses
self.parent_pipe, child_pipe = multiprocessing.Pipe(duplex=False)
self.parent_pipe, child_pipe = multiprocessing.Pipe()
# Sets a daemon that writes to file what it reads from a pipe
try:
# need to pass this b/c multiprocessing closes stdin in child.
input_fd = None
input_multiprocess_fd = None
try:
if sys.stdin.isatty():
input_fd = Connection(os.dup(sys.stdin.fileno()))
input_multiprocess_fd = MultiProcessFd(os.dup(sys.stdin.fileno()))
except BaseException:
# just don't forward input if this fails
pass
@@ -509,9 +553,9 @@ def __enter__(self):
self.process = multiprocessing.Process(
target=_writer_daemon,
args=(
input_fd,
read_fd,
self.write_fd,
input_multiprocess_fd,
read_multiprocess_fd,
write_fd,
self.echo,
self.log_file,
child_pipe,
@@ -522,9 +566,9 @@ def __enter__(self):
self.process.start()
finally:
if input_fd:
input_fd.close()
read_fd.close()
if input_multiprocess_fd:
input_multiprocess_fd.close()
read_multiprocess_fd.close()
# Flush immediately before redirecting so that anything buffered
# goes to the original stream
@@ -542,9 +586,9 @@ def __enter__(self):
self._saved_stderr = os.dup(sys.stderr.fileno())
# redirect to the pipe we created above
os.dup2(self.write_fd.fileno(), sys.stdout.fileno())
os.dup2(self.write_fd.fileno(), sys.stderr.fileno())
self.write_fd.close()
os.dup2(write_fd, sys.stdout.fileno())
os.dup2(write_fd, sys.stderr.fileno())
os.close(write_fd)
else:
# Handle I/O the Python way. This won't redirect lower-level
@@ -557,7 +601,7 @@ def __enter__(self):
self._saved_stderr = sys.stderr
# create a file object for the pipe; redirect to it.
pipe_fd_out = os.fdopen(self.write_fd.fileno(), "w", closefd=False)
pipe_fd_out = os.fdopen(write_fd, "w")
sys.stdout = pipe_fd_out
sys.stderr = pipe_fd_out
@@ -593,7 +637,6 @@ def __exit__(self, exc_type, exc_val, exc_tb):
else:
sys.stdout = self._saved_stdout
sys.stderr = self._saved_stderr
self.write_fd.close()
# print log contents in parent if needed.
if self.log_file.write_in_parent:
@@ -807,14 +850,14 @@ def force_echo(self):
def _writer_daemon(
stdin_fd: Optional[Connection],
read_fd: Connection,
write_fd: Connection,
echo: bool,
log_file_wrapper: FileWrapper,
control_fd: Connection,
filter_fn: Optional[Callable[[str], str]],
) -> None:
stdin_multiprocess_fd,
read_multiprocess_fd,
write_fd,
echo,
log_file_wrapper,
control_pipe,
filter_fn,
):
"""Daemon used by ``log_output`` to write to a log file and to ``stdout``.
The daemon receives output from the parent process and writes it both
@@ -851,37 +894,43 @@ def _writer_daemon(
``StringIO`` in the parent. This is mainly for testing.
Arguments:
stdin_fd: optional input from the terminal
read_fd: pipe for reading from parent's redirected stdout
echo: initial echo setting -- controlled by user and preserved across multiple writer
daemons
log_file_wrapper: file to log all output
control_pipe: multiprocessing pipe on which to send control information to the parent
filter_fn: optional function to filter each line of output
stdin_multiprocess_fd (MultiProcessFd): optional input from the terminal
read_multiprocess_fd (MultiProcessFd): pipe for reading from parent's
redirected stdout
echo (bool): initial echo setting -- controlled by user and
preserved across multiple writer daemons
log_file_wrapper (FileWrapper): file to log all output
control_pipe (Pipe): multiprocessing pipe on which to send control
information to the parent
filter_fn (callable, optional): function to filter each line of output
"""
# This process depends on closing all instances of write_pipe to terminate the reading loop
write_fd.close()
# If this process was forked, then it inherited file descriptors from
# the parent process. This process depends on closing all instances of
# write_fd to terminate the reading loop, so we close the file descriptor
# here. Fork is the process start method everywhere except on Windows
# and on macOS with Python >= 3.8, which use "spawn".
if sys.version_info < (3, 8) or sys.platform != "darwin":
os.close(write_fd)
# 1. Use line buffering (3rd param = 1) since Python 3 has a bug
# that prevents unbuffered text I/O.
# 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
# 3. closefd=False because Connection has "ownership"
read_file = os.fdopen(read_fd.fileno(), "r", 1, encoding="utf-8", closefd=False)
in_pipe = os.fdopen(read_multiprocess_fd.fd, "r", 1, encoding="utf-8")
if stdin_fd:
stdin_file = os.fdopen(stdin_fd.fileno(), closefd=False)
if stdin_multiprocess_fd:
stdin = os.fdopen(stdin_multiprocess_fd.fd)
else:
stdin_file = None
stdin = None
# list of streams to select from
istreams = [read_file, stdin_file] if stdin_file else [read_file]
istreams = [in_pipe, stdin] if stdin else [in_pipe]
force_echo = False # parent can force echo for certain output
log_file = log_file_wrapper.unwrap()
try:
with keyboard_input(stdin_file) as kb:
with keyboard_input(stdin) as kb:
while True:
# fix the terminal settings if we recently came to
# the foreground
@@ -894,12 +943,12 @@ def _writer_daemon(
# Allow user to toggle echo with 'v' key.
# Currently ignores other chars.
# only read stdin if we're in the foreground
if stdin_file and stdin_file in rlist and not _is_background_tty(stdin_file):
if stdin in rlist and not _is_background_tty(stdin):
# it's possible to be backgrounded between the above
# check and the read, so we ignore SIGTTIN here.
with ignore_signal(signal.SIGTTIN):
try:
if stdin_file.read(1) == "v":
if stdin.read(1) == "v":
echo = not echo
except IOError as e:
# If SIGTTIN is ignored, the system gives EIO
@@ -908,13 +957,13 @@ def _writer_daemon(
if e.errno != errno.EIO:
raise
if read_file in rlist:
if in_pipe in rlist:
line_count = 0
try:
while line_count < 100:
# Handle output from the calling process.
try:
line = _retry(read_file.readline)()
line = _retry(in_pipe.readline)()
except UnicodeDecodeError:
# installs like --test=root gpgme produce non-UTF8 logs
line = "<line lost: output was not encoded as UTF-8>\n"
@@ -943,7 +992,7 @@ def _writer_daemon(
if xoff in controls:
force_echo = False
if not _input_available(read_file):
if not _input_available(in_pipe):
break
finally:
if line_count > 0:
@@ -958,14 +1007,14 @@ def _writer_daemon(
finally:
# send written data back to parent if we used a StringIO
if isinstance(log_file, io.StringIO):
control_fd.send(log_file.getvalue())
control_pipe.send(log_file.getvalue())
log_file_wrapper.close()
read_fd.close()
if stdin_fd:
stdin_fd.close()
close_connection_and_file(read_multiprocess_fd, in_pipe)
if stdin_multiprocess_fd:
close_connection_and_file(stdin_multiprocess_fd, stdin)
# send echo value back to the parent so it can be preserved.
control_fd.send(echo)
control_pipe.send(echo)
def _retry(function):

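A minimal sketch of how MultiProcessFd is meant to be used (assuming the class definition above; POSIX only, since Windows handles are not raw file descriptors). The wrapper is handed to multiprocessing.Process so the descriptor survives both the fork and spawn start methods:

    import multiprocessing
    import os

    def child(mp_fd):
        # reopen the transmitted descriptor as a text stream and echo one line
        with os.fdopen(mp_fd.fd, "r") as f:
            print(f.readline().rstrip())

    if __name__ == "__main__":
        read_fd, write_fd = os.pipe()
        mp_fd = MultiProcessFd(read_fd)   # wraps a Connection on Python >= 3.8
        p = multiprocessing.Process(target=child, args=(mp_fd,))
        p.start()
        os.write(write_fd, b"hello\n")
        os.close(write_fd)
        p.join()
        mp_fd.close()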
View File

@@ -4,21 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.22.6.dev0"
__version__ = "0.23.0.dev0"
spack_version = __version__
#: The current Package API version implemented by this version of Spack. The Package API defines
#: the Python interface for packages as well as the layout of package repositories. The minor
#: version is incremented when the package API is extended in a backwards-compatible way. The major
#: version is incremented upon breaking changes. This version is changed independently from the
#: Spack version.
package_api_version = (1, 0)
#: The minimum Package API version that this version of Spack is compatible with. This should
#: always be a tuple of the form ``(major, 0)``, since compatibility with vX.Y implies
#: compatibility with vX.0.
min_package_api_version = (1, 0)
def __try_int(v):
try:
@@ -31,4 +19,4 @@ def __try_int(v):
spack_version_info = tuple([__try_int(v) for v in __version__.split(".")])
__all__ = ["spack_version_info", "spack_version", "package_api_version", "min_package_api_version"]
__all__ = ["spack_version_info", "spack_version"]

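Given the semantics spelled out above (minor bumps are backwards compatible, major bumps are breaking), a hypothetical compatibility check for a repository declaring the Package API version it needs might look like this; supports_package_api is not a Spack function, just an illustration:

    def supports_package_api(required):
        # a repo requiring (major, minor) is usable if the major version is in
        # range and the requested minor is not newer than what we implement
        major, minor = required
        if not (min_package_api_version[0] <= major <= package_api_version[0]):
            return False
        return major < package_api_version[0] or minor <= package_api_version[1]

    assert supports_package_api((1, 0))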
View File

@@ -421,6 +421,10 @@ def _check_patch_urls(pkgs, error_cls):
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)"
)
github_pull_commits_re = (
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/pull/\d+/commits/[a-fA-F0-9]+\.(?:patch|diff)"
)
# Only .diff URLs have stable/full hashes:
# https://forum.gitlab.com/t/patches-with-full-index/29313
gitlab_patch_url_re = (
@@ -436,14 +440,24 @@ def _check_patch_urls(pkgs, error_cls):
if not isinstance(patch, spack.patch.UrlPatch):
continue
if re.match(github_patch_url_re, patch.url):
if re.match(github_pull_commits_re, patch.url):
url = re.sub(r"/pull/\d+/commits/", r"/commit/", patch.url)
url = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", url)
errors.append(
error_cls(
f"patch URL in package {pkg_cls.name} "
+ "must not be a pull request commit; "
+ f"instead use {url}",
[patch.url],
)
)
elif re.match(github_patch_url_re, patch.url):
full_index_arg = "?full_index=1"
if not patch.url.endswith(full_index_arg):
errors.append(
error_cls(
"patch URL in package {0} must end with {1}".format(
pkg_cls.name, full_index_arg
),
f"patch URL in package {pkg_cls.name} "
+ f"must end with {full_index_arg}",
[patch.url],
)
)
@@ -451,9 +465,7 @@ def _check_patch_urls(pkgs, error_cls):
if not patch.url.endswith(".diff"):
errors.append(
error_cls(
"patch URL in package {0} must end with .diff".format(
pkg_cls.name
),
f"patch URL in package {pkg_cls.name} must end with .diff",
[patch.url],
)
)
@@ -779,7 +791,7 @@ def check_virtual_with_variants(spec, msg):
return
error = error_cls(
f"{pkg_name}: {msg}",
[f"remove variants from '{spec}' in depends_on directive in {filename}"],
f"remove variants from '{spec}' in depends_on directive in {filename}",
)
errors.append(error)

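The new audit rewrites pull-request-commit patch URLs into stable commit URLs with a full index. Applied by hand to a hypothetical URL, the two substitutions from the hunk above do the following:

    import re

    url = "https://github.com/org/repo/pull/123/commits/abcdef0123.patch"  # hypothetical
    url = re.sub(r"/pull/\d+/commits/", r"/commit/", url)
    url = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", url)
    print(url)  # https://github.com/org/repo/commit/abcdef0123.patch?full_index=1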
View File

@@ -23,6 +23,7 @@
import warnings
from contextlib import closing
from typing import Dict, Iterable, List, NamedTuple, Optional, Set, Tuple
from urllib.error import HTTPError, URLError
import llnl.util.filesystem as fsys
import llnl.util.lang
@@ -898,8 +899,9 @@ def url_read_method(url):
try:
_, _, spec_file = web_util.read_from_url(url)
contents = codecs.getreader("utf-8")(spec_file).read()
except web_util.SpackWebError as e:
tty.error(f"Error reading specfile: {url}: {e}")
except (URLError, web_util.SpackWebError) as url_err:
tty.error("Error reading specfile: {0}".format(url))
tty.error(url_err)
return contents
try:
@@ -2039,17 +2041,21 @@ def try_direct_fetch(spec, mirrors=None):
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_signed_json)
specfile_is_signed = True
except web_util.SpackWebError as e1:
except (URLError, web_util.SpackWebError, HTTPError) as url_err:
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
except web_util.SpackWebError as e2:
except (URLError, web_util.SpackWebError, HTTPError) as url_err_x:
tty.debug(
f"Did not find {specfile_name} on {buildcache_fetch_url_signed_json}",
e1,
"Did not find {0} on {1}".format(
specfile_name, buildcache_fetch_url_signed_json
),
url_err,
level=2,
)
tty.debug(
f"Did not find {specfile_name} on {buildcache_fetch_url_json}", e2, level=2
"Did not find {0} on {1}".format(specfile_name, buildcache_fetch_url_json),
url_err_x,
level=2,
)
continue
specfile_contents = codecs.getreader("utf-8")(fs).read()
@@ -2134,9 +2140,6 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
for mirror in mirror_collection.values():
fetch_url = mirror.fetch_url
# TODO: oci:// does not support signing.
if fetch_url.startswith("oci://"):
continue
keys_url = url_util.join(
fetch_url, BUILD_CACHE_RELATIVE_PATH, BUILD_CACHE_KEYS_RELATIVE_PATH
)
@@ -2147,12 +2150,19 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
try:
_, _, json_file = web_util.read_from_url(keys_index)
json_index = sjson.load(codecs.getreader("utf-8")(json_file))
except web_util.SpackWebError as url_err:
except (URLError, web_util.SpackWebError) as url_err:
if web_util.url_exists(keys_index):
err_msg = [
"Unable to find public keys in {0},",
" caught exception attempting to read from {1}.",
]
tty.error(
f"Unable to find public keys in {url_util.format(fetch_url)},"
f" caught exception attempting to read from {url_util.format(keys_index)}."
"".join(err_msg).format(
url_util.format(fetch_url), url_util.format(keys_index)
)
)
tty.debug(url_err)
continue
@@ -2432,7 +2442,7 @@ def get_remote_hash(self):
url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json.hash")
try:
response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
except (TimeoutError, urllib.error.URLError):
except urllib.error.URLError:
return None
# Validate the hash
@@ -2454,7 +2464,7 @@ def conditional_fetch(self) -> FetchIndexResult:
try:
response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
except (TimeoutError, urllib.error.URLError) as e:
except urllib.error.URLError as e:
raise FetchIndexError("Could not fetch index from {}".format(url_index), e) from e
try:
@@ -2495,7 +2505,10 @@ def __init__(self, url, etag, urlopen=web_util.urlopen):
def conditional_fetch(self) -> FetchIndexResult:
# Just do a conditional fetch immediately
url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
headers = {"User-Agent": web_util.SPACK_USER_AGENT, "If-None-Match": f'"{self.etag}"'}
headers = {
"User-Agent": web_util.SPACK_USER_AGENT,
"If-None-Match": '"{}"'.format(self.etag),
}
try:
response = self.urlopen(urllib.request.Request(url, headers=headers))
@@ -2503,14 +2516,14 @@ def conditional_fetch(self) -> FetchIndexResult:
if e.getcode() == 304:
# Not modified; that means fresh.
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
raise FetchIndexError(f"Could not fetch index {url}", e) from e
except (TimeoutError, urllib.error.URLError) as e:
raise FetchIndexError(f"Could not fetch index {url}", e) from e
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
except urllib.error.URLError as e:
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
raise FetchIndexError(f"Remote index {url} is invalid", e) from e
raise FetchIndexError("Remote index {} is invalid".format(url), e) from e
headers = response.headers
etag_header_value = headers.get("Etag", None) or headers.get("etag", None)
@@ -2541,19 +2554,21 @@ def conditional_fetch(self) -> FetchIndexResult:
headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
)
)
except (TimeoutError, urllib.error.URLError) as e:
raise FetchIndexError(f"Could not fetch manifest from {url_manifest}", e) from e
except urllib.error.URLError as e:
raise FetchIndexError(
"Could not fetch manifest from {}".format(url_manifest), e
) from e
try:
manifest = json.loads(response.read())
except Exception as e:
raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
raise FetchIndexError("Remote index {} is invalid".format(url_manifest), e) from e
# Get first blob hash, which should be the index.json
try:
index_digest = spack.oci.image.Digest.from_string(manifest["layers"][0]["digest"])
except Exception as e:
raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
raise FetchIndexError("Remote index {} is invalid".format(url_manifest), e) from e
# Fresh?
if index_digest.digest == self.local_hash:

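The etag-based fetcher above boils down to a standard HTTP conditional GET. A stripped-down sketch with plain urllib (the real code goes through web_util.urlopen and Spack's own error types; URL hypothetical):

    import urllib.error
    import urllib.request

    def conditional_fetch(url, etag):
        headers = {"If-None-Match": f'"{etag}"'}
        try:
            response = urllib.request.urlopen(urllib.request.Request(url, headers=headers))
        except urllib.error.HTTPError as e:
            if e.getcode() == 304:
                return None   # not modified: the cached index is still fresh
            raise
        return response.read()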
View File

@@ -597,10 +597,7 @@ def bootstrapping_sources(scope: Optional[str] = None):
current = copy.copy(entry)
metadata_dir = spack.util.path.canonicalize_path(entry["metadata"])
metadata_yaml = os.path.join(metadata_dir, METADATA_YAML_FILENAME)
try:
with open(metadata_yaml, encoding="utf-8") as stream:
current.update(spack.util.spack_yaml.load(stream))
list_of_sources.append(current)
except OSError:
pass
with open(metadata_yaml, encoding="utf-8") as stream:
current.update(spack.util.spack_yaml.load(stream))
list_of_sources.append(current)
return list_of_sources

View File

@@ -43,8 +43,7 @@
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from multiprocessing.connection import Connection
from typing import Callable, Dict, List, Optional, Set, Tuple
from typing import Dict, List, Set, Tuple
import llnl.util.tty as tty
from llnl.string import plural
@@ -52,6 +51,7 @@
from llnl.util.lang import dedupe, stable_partition
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd
import spack.build_systems.cmake
import spack.build_systems.meson
@@ -72,7 +72,6 @@
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.executable
import spack.util.path
import spack.util.pattern
from spack import traverse
@@ -480,12 +479,12 @@ def set_wrapper_variables(pkg, env):
env.set(SPACK_DEBUG_LOG_ID, pkg.spec.format("{name}-{hash:7}"))
env.set(SPACK_DEBUG_LOG_DIR, spack.main.spack_working_dir)
# Find ccache binary and hand it to build environment
if spack.config.get("config:ccache"):
# Enable ccache in the compiler wrapper
env.set(SPACK_CCACHE_BINARY, spack.util.executable.which_string("ccache", required=True))
else:
# Avoid cache pollution if a build system forces `ccache <compiler wrapper invocation>`.
env.set("CCACHE_DISABLE", "1")
ccache = Executable("ccache")
if not ccache:
raise RuntimeError("No ccache binary found in PATH")
env.set(SPACK_CCACHE_BINARY, ccache)
# Gather information about various types of dependencies
link_deps = set(pkg.spec.traverse(root=False, deptype=("link")))
@@ -1145,60 +1144,18 @@ def get_cmake_prefix_path(pkg):
def _setup_pkg_and_run(
serialized_pkg: "spack.subprocess_context.PackageInstallContext",
function: Callable,
kwargs: Dict,
write_pipe: Connection,
input_pipe: Optional[Connection],
jsfd1: Optional[Connection],
jsfd2: Optional[Connection],
serialized_pkg, function, kwargs, write_pipe, input_multiprocess_fd, jsfd1, jsfd2
):
"""Main entry point in the child process for Spack builds.
``_setup_pkg_and_run`` is called by the child process created in
``start_build_process()``, and its main job is to run ``function()`` on behalf of
some Spack installation (see :ref:`spack.installer.PackageInstaller._install_task`).
The child process is passed a ``write_pipe``, on which it's expected to send one of
the following:
* ``StopPhase``: error raised by a build process indicating it's stopping at a
particular build phase.
* ``BaseException``: any exception raised by a child build process, which will be
wrapped in ``ChildError`` (which adds a bunch of debug info and log context) and
raised in the parent.
* The return value of ``function()``, which can be anything (except an exception).
This is returned to the caller.
Note: ``jsfd1`` and ``jsfd2`` are passed solely to ensure that the child process
does not close these file descriptors. Some ``multiprocessing`` backends will close
them automatically in the child if they are not passed at process creation time.
Arguments:
serialized_pkg: Spack package install context object (serialized form of the
package that we'll build in the child process).
function: function to call in the child process; serialized_pkg is passed to
this as the first argument.
kwargs: additional keyword arguments to pass to ``function()``.
write_pipe: multiprocessing ``Connection`` to the parent process, to which the
child *must* send a result (or an error) back to parent on.
input_multiprocess_fd: stdin from the parent (not passed currently on Windows)
jsfd1: gmake Jobserver file descriptor 1.
jsfd2: gmake Jobserver file descriptor 2.
"""
context: str = kwargs.get("context", "build")
try:
# We are in the child process. Python sets sys.stdin to open(os.devnull) to prevent our
# process and its parent from simultaneously reading from the original stdin. But, we
# assume that the parent process is not going to read from it till we are done with the
# child, so we undo Python's precaution. closefd=False since Connection has ownership.
if input_pipe is not None:
sys.stdin = os.fdopen(input_pipe.fileno(), closefd=False)
# We are in the child process. Python sets sys.stdin to
# open(os.devnull) to prevent our process and its parent from
# simultaneously reading from the original stdin. But, we assume
# that the parent process is not going to read from it till we
# are done with the child, so we undo Python's precaution.
if input_multiprocess_fd is not None:
sys.stdin = os.fdopen(input_multiprocess_fd.fd)
pkg = serialized_pkg.restore()
@@ -1214,14 +1171,13 @@ def _setup_pkg_and_run(
# Do not create a full ChildError from this, it's not an error
# it's a control statement.
write_pipe.send(e)
except BaseException as e:
except BaseException:
# catch ANYTHING that goes wrong in the child process
exc_type, exc, tb = sys.exc_info()
# Need to unwind the traceback in the child because traceback
# objects can't be sent to the parent.
exc_type = type(e)
tb = e.__traceback__
tb_string = "".join(traceback.format_exception(exc_type, e, tb))
tb_string = traceback.format_exc()
# build up some context from the offending package so we can
# show that, too.
@@ -1238,8 +1194,8 @@ def _setup_pkg_and_run(
elif context == "test":
logfile = os.path.join(pkg.test_suite.stage, pkg.test_suite.test_log_name(pkg.spec))
error_msg = str(e)
if isinstance(e, (spack.multimethod.NoSuchMethodError, AttributeError)):
error_msg = str(exc)
if isinstance(exc, (spack.multimethod.NoSuchMethodError, AttributeError)):
process = "test the installation" if context == "test" else "build from sources"
error_msg = (
"The '{}' package cannot find an attribute while trying to {}. "
@@ -1249,7 +1205,7 @@ def _setup_pkg_and_run(
"More information at https://spack.readthedocs.io/en/latest/packaging_guide.html#installation-procedure"
).format(pkg.name, process, context)
error_msg = colorize("@*R{{{}}}".format(error_msg))
error_msg = "{}\n\n{}".format(str(e), error_msg)
error_msg = "{}\n\n{}".format(str(exc), error_msg)
# make a pickleable exception to send to parent.
msg = "%s: %s" % (exc_type.__name__, error_msg)
@@ -1267,8 +1223,8 @@ def _setup_pkg_and_run(
finally:
write_pipe.close()
if input_pipe is not None:
input_pipe.close()
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
def start_build_process(pkg, function, kwargs):
@@ -1295,9 +1251,23 @@ def child_fun():
If something goes wrong, the child process catches the error and
passes it to the parent wrapped in a ChildError. The parent is
expected to handle (or re-raise) the ChildError.
This uses `multiprocessing.Process` to create the child process. The
mechanism used to create the process differs on different operating
systems and for different versions of Python. In some cases "fork"
is used (i.e. the "fork" system call) and some cases it starts an
entirely new Python interpreter process (in the docs this is referred
to as the "spawn" start method). Breaking it down by OS:
- Linux always uses fork.
- Mac OS uses fork before Python 3.8 and "spawn" for 3.8 and after.
- Windows always uses the "spawn" start method.
For more information on `multiprocessing` child process creation
mechanisms, see https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods
"""
read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
input_fd = None
input_multiprocess_fd = None
jobserver_fd1 = None
jobserver_fd2 = None
@@ -1306,13 +1276,14 @@ def child_fun():
try:
# Forward sys.stdin when appropriate, to allow toggling verbosity
if sys.platform != "win32" and sys.stdin.isatty() and hasattr(sys.stdin, "fileno"):
input_fd = Connection(os.dup(sys.stdin.fileno()))
input_fd = os.dup(sys.stdin.fileno())
input_multiprocess_fd = MultiProcessFd(input_fd)
mflags = os.environ.get("MAKEFLAGS", False)
if mflags:
m = re.search(r"--jobserver-[^=]*=(\d),(\d)", mflags)
if m:
jobserver_fd1 = Connection(int(m.group(1)))
jobserver_fd2 = Connection(int(m.group(2)))
jobserver_fd1 = MultiProcessFd(int(m.group(1)))
jobserver_fd2 = MultiProcessFd(int(m.group(2)))
p = multiprocessing.Process(
target=_setup_pkg_and_run,
@@ -1321,7 +1292,7 @@ def child_fun():
function,
kwargs,
write_pipe,
input_fd,
input_multiprocess_fd,
jobserver_fd1,
jobserver_fd2,
),
@@ -1341,8 +1312,8 @@ def child_fun():
finally:
# Close the input stream in the parent process
if input_fd is not None:
input_fd.close()
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
def exitcode_msg(p):
typ = "exit" if p.exitcode >= 0 else "signal"

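The contract described in the _setup_pkg_and_run docstring -- the child sends either a result or a pickleable error back over write_pipe -- reduces to a small self-contained sketch (names illustrative, not Spack's):

    import multiprocessing
    import traceback

    def work():
        return 42

    def _child(fn, write_pipe):
        try:
            write_pipe.send(fn())   # success: the return value goes back to the parent
        except BaseException as e:
            # traceback objects cannot be pickled, so format them in the child
            tb_string = "".join(traceback.format_exception(type(e), e, e.__traceback__))
            write_pipe.send(RuntimeError(tb_string))
        finally:
            write_pipe.close()

    if __name__ == "__main__":
        read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
        p = multiprocessing.Process(target=_child, args=(work, write_pipe))
        p.start()
        result = read_pipe.recv()
        p.join()
        if isinstance(result, BaseException):
            raise result
        print(result)   # 42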
View File

@@ -137,11 +137,14 @@ def cuda_flags(arch_list):
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.4")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.3")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.4")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -22,8 +22,6 @@
from urllib.parse import urlencode
from urllib.request import HTTPHandler, Request, build_opener
import ruamel.yaml
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -1113,7 +1111,7 @@ def main_script_replacements(cmd):
if cdash_handler and cdash_handler.auth_token:
try:
cdash_handler.populate_buildgroup(all_job_names)
except (SpackError, HTTPError, URLError, TimeoutError) as err:
except (SpackError, HTTPError, URLError) as err:
tty.warn(f"Problem populating buildgroup: {err}")
else:
tty.warn("Unable to populate buildgroup without CDash credentials")
@@ -1312,11 +1310,8 @@ def main_script_replacements(cmd):
if not rebuild_everything:
sys.exit(1)
# Minimize yaml output size through use of anchors
syaml.anchorify(sorted_output)
with open(output_file, "w") as f:
ruamel.yaml.YAML().dump(sorted_output, f)
with open(output_file, "w") as outf:
outf.write(syaml.dump(sorted_output, default_flow_style=True))
def _url_encode_string(input_string):
@@ -2100,7 +2095,7 @@ def read_broken_spec(broken_spec_url):
"""
try:
_, _, fs = web_util.read_from_url(broken_spec_url)
except web_util.SpackWebError:
except (URLError, web_util.SpackWebError, HTTPError):
tty.warn(f"Unable to read broken spec from {broken_spec_url}")
return None

View File

@@ -813,7 +813,7 @@ def _push_oci(
def extra_config(spec: Spec):
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_dict["buildcache_layout_version"] = bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
spec_dict["buildcache_layout_version"] = 1
spec_dict["binary_cache_checksum"] = {
"hash_algorithm": "sha256",
"hash": checksums[spec.dag_hash()].compressed_digest.digest,

View File

@@ -11,6 +11,7 @@
from argparse import ArgumentParser, Namespace
from typing import IO, Any, Callable, Dict, Iterable, List, Optional, Sequence, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.argparsewriter import ArgparseRstWriter, ArgparseWriter, Command
from llnl.util.tty.colify import colify
@@ -866,6 +867,9 @@ def _commands(parser: ArgumentParser, args: Namespace) -> None:
prepend_header(args, f)
formatter(args, f)
if args.update_completion:
fs.set_executable(args.update)
else:
prepend_header(args, sys.stdout)
formatter(args, sys.stdout)

View File

@@ -661,32 +661,34 @@ def mirror_name_or_url(m):
# accidentally to a dir in the current working directory.
# If there's a \ or / in the name, it's interpreted as a path or url.
if "/" in m or "\\" in m or m in (".", ".."):
if "/" in m or "\\" in m:
return spack.mirror.Mirror(m)
# Otherwise, the named mirror is required to exist.
try:
return spack.mirror.require_mirror_name(m)
except ValueError as e:
raise argparse.ArgumentTypeError(f"{e}. Did you mean {os.path.join('.', m)}?") from e
raise argparse.ArgumentTypeError(
str(e) + ". Did you mean {}?".format(os.path.join(".", m))
)
def mirror_url(url):
try:
return spack.mirror.Mirror.from_url(url)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
raise argparse.ArgumentTypeError(str(e))
def mirror_directory(path):
try:
return spack.mirror.Mirror.from_local_path(path)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
raise argparse.ArgumentTypeError(str(e))
def mirror_name(name):
try:
return spack.mirror.require_mirror_name(name)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
raise argparse.ArgumentTypeError(str(e))

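The effect of the mirror_name_or_url change is easiest to see on concrete inputs. A sketch assuming a session where the converter above is importable and some mirrors are configured:

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("mirror", type=mirror_name_or_url)

    parser.parse_args(["./cache"])   # contains "/": treated as a path or URL
    parser.parse_args(["."])         # "." and ".." now also count as paths
    # A bare name must match a configured mirror; otherwise argparse reports
    # the ValueError with the new hint, e.g. "Did you mean ./my-mirror?"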
View File

@@ -468,30 +468,32 @@ def env_remove(args):
This removes an environment managed by Spack. Directory environments
and manifests embedded in repositories should be removed manually.
"""
remove_envs = []
read_envs = []
valid_envs = []
bad_envs = []
invalid_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
valid_envs.append(env)
valid_envs.append(env_name)
if env_name in args.rm_env:
remove_envs.append(env)
read_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
invalid_envs.append(env_name)
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if remove_env is included from another env before trying to remove
for env in valid_envs:
for remove_env in remove_envs:
# Check if env is linked to another before trying to remove
for name in valid_envs:
# don't check whether an environment is included in itself
if env.name == remove_env.name:
if name == env_name:
continue
if remove_env.path in env.included_concrete_envs:
msg = f'Environment "{remove_env.name}" is being used by environment "{env.name}"'
environ = ev.Environment(ev.root(name))
if ev.root(env_name) in environ.included_concrete_envs:
msg = f'Environment "{env_name}" is being used by environment "{name}"'
if args.force:
tty.warn(msg)
else:
@@ -504,7 +506,7 @@ def env_remove(args):
if not answer:
tty.die("Will not remove any environments")
for env in remove_envs:
for env in read_envs:
name = env.name
if env.active:
tty.die(f"Environment {name} can't be removed while activated.")

View File

@@ -169,9 +169,9 @@ def query_arguments(args):
if (args.missing or args.only_missing) and not args.only_deprecated:
installed.append(InstallStatuses.MISSING)
predicate_fn = None
known = any
if args.unknown:
predicate_fn = lambda x: not spack.repo.PATH.exists(x.spec.name)
known = False
explicit = any
if args.explicit:
@@ -179,7 +179,7 @@ def query_arguments(args):
if args.implicit:
explicit = False
q_args = {"installed": installed, "predicate_fn": predicate_fn, "explicit": explicit}
q_args = {"installed": installed, "known": known, "explicit": explicit}
install_tree = args.install_tree
upstreams = spack.config.get("upstreams", {})

View File

@@ -61,6 +61,7 @@ def install_kwargs_from_args(args):
"dependencies_use_cache": cache_opt(args.use_cache, dep_use_bc),
"dependencies_cache_only": cache_opt(args.cache_only, dep_use_bc),
"include_build_deps": args.include_build_deps,
"explicit": True, # Use true as a default for install command
"stop_at": args.until,
"unsigned": args.unsigned,
"install_deps": ("dependencies" in args.things_to_install),
@@ -472,7 +473,6 @@ def install_without_active_env(args, install_kwargs, reporter_factory):
require_user_confirmation_for_overwrite(concrete_specs, args)
install_kwargs["overwrite"] = [spec.dag_hash() for spec in concrete_specs]
installs = [s.package for s in concrete_specs]
install_kwargs["explicit"] = [s.dag_hash() for s in concrete_specs]
builder = PackageInstaller(installs, install_kwargs)
installs = [(s.package, install_kwargs) for s in concrete_specs]
builder = PackageInstaller(installs)
builder.install()

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import datetime
import os
import re
from collections import defaultdict
@@ -96,7 +97,7 @@ def list_files(args):
OLD_LICENSE, SPDX_MISMATCH, GENERAL_MISMATCH = range(1, 4)
#: Latest year that copyright applies. UPDATE THIS when bumping copyright.
latest_year = 2024 # year of 0.22 release
latest_year = datetime.date.today().year
strict_date = r"Copyright 2013-%s" % latest_year
#: regexes for valid license lines at tops of files

View File

@@ -101,9 +101,8 @@ def do_mark(specs, explicit):
specs (list): list of specs to be marked
explicit (bool): whether to mark specs as explicitly installed
"""
with spack.store.STORE.db.write_transaction():
for spec in specs:
spack.store.STORE.db.mark(spec, "explicit", explicit)
for spec in specs:
spack.store.STORE.db.update_explicit(spec, explicit)
def mark_specs(args, specs):

View File

@@ -377,10 +377,7 @@ def refresh(module_type, specs, args):
def modules_cmd(parser, args, module_type, callbacks=callbacks):
# Qualifiers to be used when querying the db for specs
constraint_qualifiers = {
"refresh": {
"installed": True,
"predicate_fn": lambda x: spack.repo.PATH.exists(x.spec.name),
}
"refresh": {"installed": True, "known": lambda x: not spack.repo.PATH.exists(x)}
}
query_args = constraint_qualifiers.get(args.subparser_name, {})

View File

@@ -151,7 +151,8 @@ def is_installed(spec):
key=lambda s: s.dag_hash(),
)
return [spec for spec in specs if is_installed(spec)]
with spack.store.STORE.db.read_transaction():
return [spec for spec in specs if is_installed(spec)]
def dependent_environments(
@@ -239,6 +240,8 @@ def get_uninstall_list(args, specs: List[spack.spec.Spec], env: Optional[ev.Envi
print()
tty.info("The following environments still reference these specs:")
colify([e.name for e in other_dependent_envs.keys()], indent=4)
if env:
msgs.append("use `spack remove` to remove the spec from the current environment")
msgs.append("use `spack env remove` to remove environments")
msgs.append("use `spack uninstall --force` to override")
print()

View File

@@ -38,10 +38,10 @@
import spack.cmd
import spack.environment as ev
import spack.filesystem_view as fsv
import spack.schema.projections
import spack.store
from spack.config import validate
from spack.filesystem_view import YamlFilesystemView, view_func_parser
from spack.util import spack_yaml as s_yaml
description = "project packages to a compact naming scheme on the filesystem"
@@ -193,13 +193,17 @@ def view(parser, args):
ordered_projections = {}
# What method are we using for this view
link_type = args.action if args.action in actions_link else "symlink"
view = fsv.YamlFilesystemView(
if args.action in actions_link:
link_fn = view_func_parser(args.action)
else:
link_fn = view_func_parser("symlink")
view = YamlFilesystemView(
path,
spack.store.STORE.layout,
projections=ordered_projections,
ignore_conflicts=getattr(args, "ignore_conflicts", False),
link_type=link_type,
link=link_fn,
verbose=args.verbose,
)

View File

@@ -290,7 +290,7 @@ def __init__(
operating_system,
target,
paths,
modules: Optional[List[str]] = None,
modules=None,
alias=None,
environment=None,
extra_rpaths=None,

View File

@@ -156,7 +156,15 @@ def get_compiler_config_from_packages(
def _compiler_config_from_package_config(config):
compilers = []
for entry in config:
compiler = _compiler_config_from_external(entry)
try:
compiler = _compiler_config_from_external(entry)
except Exception as e:
msg = "Reading compiler from packages config section failed\n"
msg += f" Compiler: {entry.get('spec', None)}\n"
msg += f" Prefix: {entry.get('prefix', None)}\n"
msg += f" Failure: {e}"
warnings.warn(msg)
compiler = None
if compiler:
compilers.append(compiler)
@@ -220,10 +228,10 @@ def _compiler_config_from_external(config):
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target").microarchitecture
else:
target = spec.architecture.target
target = spec.target
if not target:
target = spack.platforms.host().target("default_target")
target = target.microarchitecture
host_platform = spack.platforms.host()
target = host_platform.target("default_target").microarchitecture
operating_system = spec.os
if not operating_system:

View File

@@ -96,6 +96,8 @@ def verbose_flag(self):
openmp_flag = "-fopenmp"
# C++ flags based on CMake Modules/Compiler/Clang.cmake
@property
def cxx11_flag(self):
if self.real_version < Version("3.3"):
@@ -120,6 +122,24 @@ def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("5.0"):
raise UnsupportedCompilerFlag(self, "the C++20 standard", "cxx20_flag", "< 5.0")
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("12.0"):
raise UnsupportedCompilerFlag(self, "the C++23 standard", "cxx23_flag", "< 12.0")
elif self.real_version < Version("17.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
return "-std=c99"
@@ -142,7 +162,10 @@ def c17_flag(self):
def c23_flag(self):
if self.real_version < Version("9.0"):
raise UnsupportedCompilerFlag(self, "the C23 standard", "c23_flag", "< 9.0")
return "-std=c2x"
elif self.real_version < Version("18.0"):
return "-std=c2x"
else:
return "-std=c23"
@property
def cc_pic_flag(self):

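A condensed restatement of the version gating added above, pulled out of the compiler class (spack.version.Version for comparisons, as in the diff):

    from spack.version import Version

    def cxx23_flag(real_version):
        # mirrors the property above: -std=c++2b until clang 17, -std=c++23 after
        if real_version < Version("12.0"):
            raise ValueError("the C++23 standard requires clang >= 12.0")
        elif real_version < Version("17.0"):
            return "-std=c++2b"
        return "-std=c++23"

    assert cxx23_flag(Version("15.0")) == "-std=c++2b"
    assert cxx23_flag(Version("18.1")) == "-std=c++23"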
View File

@@ -283,9 +283,12 @@ def __reduce__(self):
database. If it is a spec, we'll evaluate
``spec.satisfies(query_spec)``
predicate_fn: optional predicate taking an InstallRecord as argument, and returning
whether that record is selected for the query. It can be used to craft criteria
that need some data for selection not provided by the Database itself.
known (bool or None): Specs that are "known" are those
for which Spack can locate a ``package.py`` file -- i.e.,
Spack "knows" how to install them. Specs that are unknown may
represent packages that existed in a previous version of
Spack, but have since either changed their name or
been removed
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed
@@ -585,9 +588,6 @@ def _path(self, spec: "spack.spec.Spec") -> pathlib.Path:
return self.dir / f"{spec.name}-{spec.dag_hash()}"
SelectType = Callable[[InstallRecord], bool]
class Database:
#: Fields written for each install record
record_fields: Tuple[str, ...] = DEFAULT_INSTALL_RECORD_FIELDS
@@ -1367,7 +1367,7 @@ def _deprecate(self, spec, deprecator):
self._data[spec_key] = spec_rec
@_autospec
def mark(self, spec: "spack.spec.Spec", key: str, value: Any) -> None:
def mark(self, spec, key, value):
"""Mark an arbitrary record on a spec."""
with self.write_transaction():
return self._mark(spec, key, value)
@@ -1516,7 +1516,7 @@ def get_by_hash(self, dag_hash, default=None, installed=any):
def _query(
self,
query_spec=any,
predicate_fn: Optional[SelectType] = None,
known=any,
installed=True,
explicit=any,
start_date=None,
@@ -1524,7 +1524,7 @@ def _query(
hashes=None,
in_buildcache=any,
origin=None,
) -> List["spack.spec.Spec"]:
):
"""Run a query on the database."""
# TODO: Specs are a lot like queries. Should there be a
@@ -1570,7 +1570,7 @@ def _query(
if explicit is not any and rec.explicit != explicit:
continue
if predicate_fn is not None and not predicate_fn(rec):
if known is not any and known(rec.spec.name):
continue
if start_date or end_date:
@@ -1655,14 +1655,14 @@ def query(self, *args, **kwargs):
query.__doc__ = ""
query.__doc__ += _QUERY_DOCSTRING
def query_one(self, query_spec, predicate_fn=None, installed=True):
def query_one(self, query_spec, known=any, installed=True):
"""Query for exactly one spec that matches the query spec.
Raises an assertion error if more than one spec matches the
query. Returns None if no installed package matches.
"""
concrete_specs = self.query(query_spec, predicate_fn=predicate_fn, installed=installed)
concrete_specs = self.query(query_spec, known=known, installed=installed)
assert len(concrete_specs) <= 1
return concrete_specs[0] if concrete_specs else None
@@ -1709,6 +1709,24 @@ def root(key, record):
if id(rec.spec) not in needed and rec.installed
]
def update_explicit(self, spec, explicit):
"""
Update the spec's explicit state in the database.
Args:
spec (spack.spec.Spec): the spec whose install record is being updated
explicit (bool): ``True`` if the package was requested explicitly
by the user, ``False`` if it was pulled in as a dependency of
an explicit package.
"""
rec = self.get_record(spec)
if explicit != rec.explicit:
with self.write_transaction():
message = "{s.name}@{s.version} : marking the package {0}"
status = "explicit" if explicit else "implicit"
tty.debug(message.format(status, s=spec))
rec.explicit = explicit
class UpstreamDatabaseLockingError(SpackError):
"""Raised when an operation would need to lock an upstream database"""

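The predicate_fn form of the query API replaces the old known callable; the cmd/find hunk earlier builds exactly this kind of filter. A sketch against the store database:

    import spack.repo
    import spack.store

    db = spack.store.STORE.db
    # records whose package no longer exists in any repository ("unknown" specs)
    unknown = db.query(predicate_fn=lambda rec: not spack.repo.PATH.exists(rec.spec.name))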
View File

@@ -97,7 +97,7 @@ class OpenMpi(Package):
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
SUPPORTED_LANGUAGES = ("fortran", "cxx")
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:

View File

@@ -9,13 +9,11 @@
import os
import re
import shlex
from enum import Enum
from typing import List, Optional
import spack.deptypes as dt
import spack.environment.environment as ev
import spack.paths
import spack.spec
import spack.traverse as traverse
@@ -228,7 +226,6 @@ def to_dict(self):
"install_deps_target": self._target("install-deps"),
"any_hash_target": self._target("%"),
"jobserver_support": self.jobserver_support,
"spack_script": shlex.quote(spack.paths.spack_script),
"adjacency_list": self.make_adjacency_list,
"phony_convenience_targets": " ".join(self.phony_convenience_targets),
"pkg_ids_variable": self.pkg_identifier_variable,

View File

@@ -30,7 +30,6 @@
import spack.deptypes as dt
import spack.error
import spack.fetch_strategy
import spack.filesystem_view as fsv
import spack.hash_types as ht
import spack.hooks
import spack.main
@@ -53,6 +52,7 @@
import spack.util.url
import spack.version
from spack import traverse
from spack.filesystem_view import SimpleFilesystemView, inverse_view_func_parser, view_func_parser
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
from spack.spec import Spec
@@ -606,7 +606,7 @@ def __init__(
self.projections = projections
self.select = select
self.exclude = exclude
self.link_type = fsv.canonicalize_link_type(link_type)
self.link_type = view_func_parser(link_type)
self.link = link
def select_fn(self, spec):
@@ -640,7 +640,7 @@ def to_dict(self):
if self.exclude:
ret["exclude"] = self.exclude
if self.link_type:
ret["link_type"] = self.link_type
ret["link_type"] = inverse_view_func_parser(self.link_type)
if self.link != default_view_link:
ret["link"] = self.link
return ret
@@ -690,7 +690,7 @@ def get_projection_for_spec(self, spec):
to exist on the filesystem."""
return self._view(self.root).get_projection_for_spec(spec)
def view(self, new: Optional[str] = None) -> fsv.SimpleFilesystemView:
def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
"""
Returns a view object for the *underlying* view directory. This means that the
self.root symlink is followed, and that the view has to exist on the filesystem
@@ -710,14 +710,14 @@ def view(self, new: Optional[str] = None) -> fsv.SimpleFilesystemView:
)
return self._view(path)
def _view(self, root: str) -> fsv.SimpleFilesystemView:
def _view(self, root: str) -> SimpleFilesystemView:
"""Returns a view object for a given root dir."""
return fsv.SimpleFilesystemView(
return SimpleFilesystemView(
root,
spack.store.STORE.layout,
ignore_conflicts=True,
projections=self.projections,
link_type=self.link_type,
link=self.link_type,
)
def __contains__(self, spec):
@@ -1190,6 +1190,7 @@ def scope_name(self):
def include_concrete_envs(self):
"""Copy and save the included envs' specs internally"""
lockfile_meta = None
root_hash_seen = set()
concrete_hash_seen = set()
self.included_concrete_spec_data = {}
@@ -1200,26 +1201,37 @@ def include_concrete_envs(self):
raise SpackEnvironmentError(f"Unable to find env at {env_path}")
env = Environment(env_path)
self.included_concrete_spec_data[env_path] = {"roots": [], "concrete_specs": {}}
with open(env.lock_path) as f:
lockfile_as_dict = env._read_lockfile(f)
# Lockfile_meta must match each env and use at least format version 5
if lockfile_meta is None:
lockfile_meta = lockfile_as_dict["_meta"]
elif lockfile_meta != lockfile_as_dict["_meta"]:
raise SpackEnvironmentError("All lockfile _meta values must match")
elif lockfile_meta["lockfile-version"] < 5:
raise SpackEnvironmentError("The lockfile format must be at version 5 or higher")
# Copy unique root specs from env
for root_dict in env._concrete_roots_dict():
self.included_concrete_spec_data[env_path] = {"roots": []}
for root_dict in lockfile_as_dict["roots"]:
if root_dict["hash"] not in root_hash_seen:
self.included_concrete_spec_data[env_path]["roots"].append(root_dict)
root_hash_seen.add(root_dict["hash"])
# Copy unique concrete specs from env
for dag_hash, spec_details in env._concrete_specs_dict().items():
if dag_hash not in concrete_hash_seen:
self.included_concrete_spec_data[env_path]["concrete_specs"].update(
{dag_hash: spec_details}
for concrete_spec in lockfile_as_dict["concrete_specs"]:
if concrete_spec not in concrete_hash_seen:
self.included_concrete_spec_data[env_path].update(
{"concrete_specs": lockfile_as_dict["concrete_specs"]}
)
concrete_hash_seen.add(dag_hash)
concrete_hash_seen.add(concrete_spec)
# Copy transitive include data
transitive = env.included_concrete_spec_data
if transitive:
self.included_concrete_spec_data[env_path]["include_concrete"] = transitive
if "include_concrete" in lockfile_as_dict.keys():
self.included_concrete_spec_data[env_path]["include_concrete"] = lockfile_as_dict[
"include_concrete"
]
self._read_lockfile_dict(self._to_lockfile_dict())
self.write()
@@ -1936,18 +1948,13 @@ def install_specs(self, specs: Optional[List[Spec]] = None, **install_args):
specs = specs if specs is not None else roots
# Extend the set of specs to overwrite with modified dev specs and their parents
install_args["overwrite"] = {
*install_args.get("overwrite", ()),
*self._dev_specs_that_need_overwrite(),
}
install_args["overwrite"] = (
install_args.get("overwrite", []) + self._dev_specs_that_need_overwrite()
)
# Only environment roots are marked explicit
install_args["explicit"] = {
*install_args.get("explicit", ()),
*(s.dag_hash() for s in roots),
}
installs = [(spec.package, {**install_args, "explicit": spec in roots}) for spec in specs]
PackageInstaller([spec.package for spec in specs], install_args).install()
PackageInstaller(installs).install()
def all_specs_generator(self) -> Iterable[Spec]:
"""Returns a generator for all concrete specs"""
@@ -2137,23 +2144,16 @@ def _get_environment_specs(self, recurse_dependencies=True):
return specs
def _concrete_specs_dict(self):
def _to_lockfile_dict(self):
"""Create a dictionary to store a lockfile for this environment."""
concrete_specs = {}
for s in traverse.traverse_nodes(self.specs_by_hash.values(), key=traverse.by_dag_hash):
spec_dict = s.node_dict_with_hashes(hash=ht.dag_hash)
# Assumes no legacy formats, since this was just created.
spec_dict[ht.dag_hash.name] = s.dag_hash()
concrete_specs[s.dag_hash()] = spec_dict
return concrete_specs
def _concrete_roots_dict(self):
hash_spec_list = zip(self.concretized_order, self.concretized_user_specs)
return [{"hash": h, "spec": str(s)} for h, s in hash_spec_list]
def _to_lockfile_dict(self):
"""Create a dictionary to store a lockfile for this environment."""
concrete_specs = self._concrete_specs_dict()
root_specs = self._concrete_roots_dict()
spack_dict = {"version": spack.spack_version}
spack_commit = spack.main.get_spack_commit()
@@ -2174,7 +2174,7 @@ def _to_lockfile_dict(self):
# spack version information
"spack": spack_dict,
# users specs + hashes are the 'roots' of the environment
"roots": root_specs,
"roots": [{"hash": h, "spec": str(s)} for h, s in hash_spec_list],
# Concrete specs by hash, including dependencies
"concrete_specs": concrete_specs,
}

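include_concrete_envs above requires every included lockfile to use format version 5 or newer and to share the same _meta block. A hedged sketch of the data shape it reads (file name hypothetical, keys as in the diff):

    import json

    with open("spack.lock") as f:     # hypothetical included environment lockfile
        lockfile = json.load(f)

    meta = lockfile["_meta"]
    assert meta["lockfile-version"] >= 5, "include_concrete needs lockfile v5+"

    roots = [r["hash"] for r in lockfile["roots"]]    # root hashes, deduplicated above
    concrete = lockfile["concrete_specs"]             # dag_hash -> node dictionary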
View File

@@ -30,7 +30,6 @@
import shutil
import urllib.error
import urllib.parse
import urllib.request
from pathlib import PurePath
from typing import List, Optional
@@ -274,7 +273,10 @@ def __init__(self, url=None, checksum=None, **kwargs):
@property
def curl(self):
if not self._curl:
self._curl = web_util.require_curl()
try:
self._curl = which("curl", required=True)
except CommandNotFoundError as exc:
tty.error(str(exc))
return self._curl
def source_id(self):
@@ -295,23 +297,27 @@ def candidate_urls(self):
@_needs_stage
def fetch(self):
if self.archive_file:
tty.debug(f"Already downloaded {self.archive_file}")
tty.debug("Already downloaded {0}".format(self.archive_file))
return
errors: List[Exception] = []
url = None
errors = []
for url in self.candidate_urls:
if not web_util.url_exists(url):
tty.debug("URL does not exist: " + url)
continue
try:
self._fetch_from_url(url)
break
except FailedDownloadError as e:
errors.extend(e.exceptions)
else:
raise FailedDownloadError(*errors)
errors.append(str(e))
for msg in errors:
tty.debug(msg)
if not self.archive_file:
raise FailedDownloadError(
RuntimeError(f"Missing archive {self.archive_file} after fetching")
)
raise FailedDownloadError(url)
def _fetch_from_url(self, url):
if spack.config.get("config:url_fetch_method") == "curl":
@@ -330,20 +336,19 @@ def _check_headers(self, headers):
@_needs_stage
def _fetch_urllib(self, url):
save_file = self.stage.save_filename
tty.msg("Fetching {0}".format(url))
request = urllib.request.Request(url, headers={"User-Agent": web_util.SPACK_USER_AGENT})
# Run urllib but grab the mime type from the http headers
try:
response = web_util.urlopen(request)
except (TimeoutError, urllib.error.URLError) as e:
url, headers, response = web_util.read_from_url(url)
except web_util.SpackWebError as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if os.path.lexists(save_file):
os.remove(save_file)
raise FailedDownloadError(e) from e
tty.msg(f"Fetching {url}")
msg = "urllib failed to fetch with error {0}".format(e)
raise FailedDownloadError(url, msg)
if os.path.lexists(save_file):
os.remove(save_file)
@@ -351,7 +356,7 @@ def _fetch_urllib(self, url):
with open(save_file, "wb") as _open_file:
shutil.copyfileobj(response, _open_file)
self._check_headers(str(response.headers))
self._check_headers(str(headers))
@_needs_stage
def _fetch_curl(self, url):
@@ -360,7 +365,7 @@ def _fetch_curl(self, url):
if self.stage.save_filename:
save_file = self.stage.save_filename
partial_file = self.stage.save_filename + ".part"
tty.msg(f"Fetching {url}")
tty.msg("Fetching {0}".format(url))
if partial_file:
save_args = [
"-C",
@@ -400,8 +405,8 @@ def _fetch_curl(self, url):
try:
web_util.check_curl_code(curl.returncode)
except spack.error.FetchError as e:
raise FailedDownloadError(e) from e
except spack.error.FetchError as err:
raise spack.fetch_strategy.FailedDownloadError(url, str(err))
self._check_headers(headers)
@@ -549,13 +554,13 @@ def fetch(self):
try:
response = self._urlopen(self.url)
except (TimeoutError, urllib.error.URLError) as e:
except urllib.error.URLError as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if os.path.lexists(file):
os.remove(file)
raise FailedDownloadError(e) from e
raise FailedDownloadError(self.url, f"Failed to fetch {self.url}: {e}") from e
if os.path.lexists(file):
os.remove(file)
@@ -1307,41 +1312,35 @@ def __init__(self, *args, **kwargs):
@_needs_stage
def fetch(self):
if self.archive_file:
tty.debug(f"Already downloaded {self.archive_file}")
tty.debug("Already downloaded {0}".format(self.archive_file))
return
parsed_url = urllib.parse.urlparse(self.url)
if parsed_url.scheme != "s3":
raise spack.error.FetchError("S3FetchStrategy can only fetch from s3:// urls.")
tty.debug("Fetching {0}".format(self.url))
basename = os.path.basename(parsed_url.path)
request = urllib.request.Request(
self.url, headers={"User-Agent": web_util.SPACK_USER_AGENT}
)
with working_dir(self.stage.path):
try:
response = web_util.urlopen(request)
except (TimeoutError, urllib.error.URLError) as e:
raise FailedDownloadError(e) from e
tty.debug(f"Fetching {self.url}")
_, headers, stream = web_util.read_from_url(self.url)
with open(basename, "wb") as f:
shutil.copyfileobj(response, f)
shutil.copyfileobj(stream, f)
content_type = web_util.get_header(response.headers, "Content-type")
content_type = web_util.get_header(headers, "Content-type")
if content_type == "text/html":
warn_content_type_mismatch(self.archive_file or "the archive")
if self.stage.save_filename:
fs.rename(os.path.join(self.stage.path, basename), self.stage.save_filename)
llnl.util.filesystem.rename(
os.path.join(self.stage.path, basename), self.stage.save_filename
)
if not self.archive_file:
raise FailedDownloadError(
RuntimeError(f"Missing archive {self.archive_file} after fetching")
)
raise FailedDownloadError(self.url)
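
The new S3 path above builds a urllib request with an explicit User-Agent header instead of going through read_from_url. A standalone sketch of that request shape; the URL and agent string are placeholders, not values from this diff:

import urllib.request

request = urllib.request.Request(
    "https://example.com/pkg.tar.gz",
    headers={"User-Agent": "spack/placeholder"},
)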
@fetcher
@@ -1367,23 +1366,17 @@ def fetch(self):
if parsed_url.scheme != "gs":
raise spack.error.FetchError("GCSFetchStrategy can only fetch from gs:// urls.")
tty.debug("Fetching {0}".format(self.url))
basename = os.path.basename(parsed_url.path)
request = urllib.request.Request(
self.url, headers={"User-Agent": web_util.SPACK_USER_AGENT}
)
with working_dir(self.stage.path):
try:
response = web_util.urlopen(request)
except (TimeoutError, urllib.error.URLError) as e:
raise FailedDownloadError(e) from e
tty.debug(f"Fetching {self.url}")
_, headers, stream = web_util.read_from_url(self.url)
with open(basename, "wb") as f:
shutil.copyfileobj(response, f)
shutil.copyfileobj(stream, f)
content_type = web_util.get_header(response.headers, "Content-type")
content_type = web_util.get_header(headers, "Content-type")
if content_type == "text/html":
warn_content_type_mismatch(self.archive_file or "the archive")
@@ -1392,9 +1385,7 @@ def fetch(self):
os.rename(os.path.join(self.stage.path, basename), self.stage.save_filename)
if not self.archive_file:
raise FailedDownloadError(
RuntimeError(f"Missing archive {self.archive_file} after fetching")
)
raise FailedDownloadError(self.url)
@fetcher
@@ -1731,9 +1722,9 @@ class NoCacheError(spack.error.FetchError):
class FailedDownloadError(spack.error.FetchError):
"""Raised when a download fails."""
def __init__(self, *exceptions: Exception):
super().__init__("Failed to download")
self.exceptions = exceptions
def __init__(self, url, msg=""):
super().__init__("Failed to fetch file from URL: %s" % url, msg)
self.url = url
class NoArchiveFileError(spack.error.FetchError):


@@ -10,9 +10,8 @@
import shutil
import stat
import sys
from typing import Callable, Dict, Optional
from typing import Optional
from llnl.string import comma_or
from llnl.util import tty
from llnl.util.filesystem import (
mkdirp,
@@ -33,7 +32,6 @@
from llnl.util.tty.color import colorize
import spack.config
import spack.directory_layout
import spack.paths
import spack.projections
import spack.relocate
@@ -51,20 +49,19 @@
_projections_path = ".spack/projections.yaml"
LinkCallbackType = Callable[[str, str, "FilesystemView", Optional[spack.spec.Spec]], None]
def view_symlink(src: str, dst: str, *args, **kwargs) -> None:
def view_symlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
symlink(src, dst)
def view_hardlink(src: str, dst: str, *args, **kwargs) -> None:
def view_hardlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
os.link(src, dst)
def view_copy(
src: str, dst: str, view: "FilesystemView", spec: Optional[spack.spec.Spec] = None
) -> None:
def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
"""
Copy a file from src to dst.
@@ -107,40 +104,27 @@ def view_copy(
tty.debug(f"Can't change the permissions for {dst}")
#: supported string values for `link_type` in an env, mapped to canonical values
_LINK_TYPES = {
"hardlink": "hardlink",
"hard": "hardlink",
"copy": "copy",
"relocate": "copy",
"add": "symlink",
"symlink": "symlink",
"soft": "symlink",
}
_VALID_LINK_TYPES = sorted(set(_LINK_TYPES.values()))
def canonicalize_link_type(link_type: str) -> str:
"""Return canonical"""
canonical = _LINK_TYPES.get(link_type)
if not canonical:
raise ValueError(
f"Invalid link type: '{link_type}. Must be one of {comma_or(_VALID_LINK_TYPES)}'"
)
return canonical
def function_for_link_type(link_type: str) -> LinkCallbackType:
link_type = canonicalize_link_type(link_type)
if link_type == "hardlink":
def view_func_parser(parsed_name):
# What method are we using for this view
if parsed_name in ("hardlink", "hard"):
return view_hardlink
elif link_type == "symlink":
return view_symlink
elif link_type == "copy":
elif parsed_name in ("copy", "relocate"):
return view_copy
elif parsed_name in ("add", "symlink", "soft"):
return view_symlink
else:
raise ValueError(f"invalid link type for view: '{parsed_name}'")
assert False, "invalid link type" # need mypy Literal values
def inverse_view_func_parser(view_type):
# get string based on view type
if view_type is view_hardlink:
link_name = "hardlink"
elif view_type is view_copy:
link_name = "copy"
else:
link_name = "symlink"
return link_name
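
A short usage sketch of the new helpers above, assuming they are importable; it checks that aliases canonicalize and that the returned callable accepts the (src, dst, view, spec) shape used by FilesystemView.link:

import os
import tempfile

assert canonicalize_link_type("hard") == "hardlink"
assert canonicalize_link_type("soft") == "symlink"
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "src")
    open(src, "w").close()
    # view and spec are unused by the symlink callback, so None is fine here
    function_for_link_type("soft")(src, os.path.join(tmp, "dst"), None, None)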
class FilesystemView:
@@ -156,16 +140,7 @@ class FilesystemView:
directory structure.
"""
def __init__(
self,
root: str,
layout: spack.directory_layout.DirectoryLayout,
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
def __init__(self, root, layout, **kwargs):
"""
Initialize a filesystem view under the given `root` directory with
corresponding directory `layout`.
@@ -174,17 +149,15 @@ def __init__(
"""
self._root = root
self.layout = layout
self.projections = {} if projections is None else projections
self.ignore_conflicts = ignore_conflicts
self.verbose = verbose
self.projections = kwargs.get("projections", {})
self.ignore_conflicts = kwargs.get("ignore_conflicts", False)
self.verbose = kwargs.get("verbose", False)
# Setup link function to include view
self.link_type = link_type
self._link = function_for_link_type(link_type)
def link(self, src: str, dst: str, spec: Optional[spack.spec.Spec] = None) -> None:
self._link(src, dst, self, spec)
link_func = kwargs.get("link", view_symlink)
self.link = ft.partial(link_func, view=self)
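
To contrast the two constructor shapes above (only one exists on each side of this diff), a sketch with a stand-in layout object; real callers pass a spack.directory_layout.DirectoryLayout:

layout = object()  # stand-in, for illustration only

view = FilesystemView(                      # new keyword-only signature
    "/tmp/view",
    layout,
    projections={"all": "{name}-{version}"},
    ignore_conflicts=True,
    link_type="hardlink",
)
legacy_view = FilesystemView("/tmp/view", layout, link=view_hardlink)  # old kwargs form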
def add_specs(self, *specs, **kwargs):
"""
@@ -282,24 +255,8 @@ class YamlFilesystemView(FilesystemView):
Filesystem view to work with a yaml based directory layout.
"""
def __init__(
self,
root: str,
layout: spack.directory_layout.DirectoryLayout,
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
super().__init__(
root,
layout,
projections=projections,
ignore_conflicts=ignore_conflicts,
verbose=verbose,
link_type=link_type,
)
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
# The superclass gets projections from the kwargs
# YAML-specific: also read projections from the YAML file
@@ -681,6 +638,9 @@ class SimpleFilesystemView(FilesystemView):
"""A simple and partial implementation of FilesystemView focused on performance and immutable
views, where specs cannot be removed after they were added."""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def _sanity_check_view_projection(self, specs):
"""A very common issue is that we end up with two specs of the same package, that project
to the same prefix. We want to catch that as early as possible and give a sensible error to


@@ -41,9 +41,8 @@ def _populate_hooks(cls):
relative_names = list(list_modules(spack.paths.hooks_path))
# write_install_manifest should come after any mutation of the install prefix, and
# autopush should include the install manifest.
ensure_last(relative_names, "absolutify_elf_sonames", "write_install_manifest", "autopush")
# Ensure that write_install_manifest comes last
ensure_last(relative_names, "absolutify_elf_sonames", "write_install_manifest")
for name in relative_names:
module_name = __name__ + "." + name
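
ensure_last comes from llnl.util.lang; a toy reimplementation (an assumption, for illustration only) shows the ordering contract the hook loader relies on above:

def ensure_last(lst, *last_items):
    # move each named item, in order, to the end of the list
    for item in last_items:
        if item in lst:
            lst.append(lst.pop(lst.index(item)))

names = ["autopush", "module_files", "write_install_manifest"]
ensure_last(names, "write_install_manifest", "autopush")
assert names == ["module_files", "write_install_manifest", "autopush"]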


@@ -440,7 +440,7 @@ def _process_external_package(pkg: "spack.package_base.PackageBase", explicit: b
tty.debug(f"{pre} already registered in DB")
record = spack.store.STORE.db.get_record(spec)
if explicit and not record.explicit:
spack.store.STORE.db.mark(spec, "explicit", True)
spack.store.STORE.db.update_explicit(spec, explicit)
except KeyError:
# If not, register it and generate the module file.
@@ -761,8 +761,12 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
if not self.pkg.spec.concrete:
raise ValueError(f"{self.pkg.name} must have a concrete spec")
self.pkg.stop_before_phase = install_args.get("stop_before") # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.get("stop_at") # type: ignore[attr-defined]
# Cache the package phase options with the explicit package,
# popping the options to ensure installation of associated
# dependencies is NOT affected by these options.
self.pkg.stop_before_phase = install_args.pop("stop_before", None) # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.pop("stop_at", None) # type: ignore[attr-defined]
# Cache the package id for convenience
self.pkg_id = package_id(pkg.spec)
@@ -1072,17 +1076,19 @@ def flag_installed(self, installed: List[str]) -> None:
@property
def explicit(self) -> bool:
return self.pkg.spec.dag_hash() in self.request.install_args.get("explicit", [])
"""The package was explicitly requested by the user."""
return self.is_root and self.request.install_args.get("explicit", True)
@property
def is_build_request(self) -> bool:
"""The package was requested directly"""
def is_root(self) -> bool:
"""The package was requested directly, but may or may not be explicit
in an environment."""
return self.pkg == self.request.pkg
@property
def use_cache(self) -> bool:
_use_cache = True
if self.is_build_request:
if self.is_root:
return self.request.install_args.get("package_use_cache", _use_cache)
else:
return self.request.install_args.get("dependencies_use_cache", _use_cache)
@@ -1090,7 +1096,7 @@ def use_cache(self) -> bool:
@property
def cache_only(self) -> bool:
_cache_only = False
if self.is_build_request:
if self.is_root:
return self.request.install_args.get("package_cache_only", _cache_only)
else:
return self.request.install_args.get("dependencies_cache_only", _cache_only)
@@ -1116,17 +1122,24 @@ def priority(self):
class PackageInstaller:
"""
Class for managing the install process for a Spack instance based on a bottom-up DAG approach.
Class for managing the install process for a Spack instance based on a
bottom-up DAG approach.
This installer can coordinate concurrent batch and interactive, local and distributed (on a
shared file system) builds for the same Spack instance.
This installer can coordinate concurrent batch and interactive, local
and distributed (on a shared file system) builds for the same Spack
instance.
"""
def __init__(
self, packages: List["spack.package_base.PackageBase"], install_args: dict
) -> None:
def __init__(self, installs: List[Tuple["spack.package_base.PackageBase", dict]] = []) -> None:
"""Initialize the installer.
Args:
installs (list): list of tuples, where each
tuple consists of a package (PackageBase) and its associated
install arguments (dict)
"""
# List of build requests
self.build_requests = [BuildRequest(pkg, install_args) for pkg in packages]
self.build_requests = [BuildRequest(pkg, install_args) for pkg, install_args in installs]
# Priority queue of build tasks
self.build_pq: List[Tuple[Tuple[int, int], BuildTask]] = []
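
A sketch contrasting the two constructor shapes above; pkg_a and pkg_b stand for concrete PackageBase instances, and only one shape exists on each side of this diff:

PackageInstaller([pkg_a, pkg_b], {"explicit": {pkg_a.spec.dag_hash()}}).install()  # new
PackageInstaller([(pkg_a, {"explicit": True}), (pkg_b, {})]).install()             # old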
@@ -1363,8 +1376,8 @@ def _prepare_for_install(self, task: BuildTask) -> None:
self._update_installed(task)
# Only update the explicit entry once for the explicit package
if task.explicit and not rec.explicit:
spack.store.STORE.db.mark(task.pkg.spec, "explicit", True)
if task.explicit:
spack.store.STORE.db.update_explicit(task.pkg.spec, True)
def _cleanup_all_tasks(self) -> None:
"""Cleanup all build tasks to include releasing their locks."""
@@ -1544,6 +1557,17 @@ def _add_tasks(self, request: BuildRequest, all_deps):
tty.warn(f"Installation request refused: {str(err)}")
return
# Skip out early if the spec is not being installed locally (i.e., if
# external or upstream).
#
# External and upstream packages need to get flagged as installed to
# ensure proper status tracking for environment build.
explicit = request.install_args.get("explicit", True)
not_local = _handle_external_and_upstream(request.pkg, explicit)
if not_local:
self._flag_installed(request.pkg)
return
install_compilers = spack.config.get("config:install_missing_compilers", False)
install_deps = request.install_args.get("install_deps")
@@ -1659,6 +1683,10 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
if not pkg.unit_test_check():
return
# Inject information about whether this installation request is the root one,
# so BuildProcessInstaller can determine whether the installation is explicit
install_args["is_root"] = task.is_root
try:
self._setup_install_dir(pkg)
@@ -1970,8 +1998,8 @@ def install(self) -> None:
self._init_queue()
fail_fast_err = "Terminating after first install failure"
single_requested_spec = len(self.build_requests) == 1
failed_build_requests = []
single_explicit_spec = len(self.build_requests) == 1
failed_explicits = []
install_status = InstallStatus(len(self.build_pq))
@@ -2020,10 +2048,11 @@ def install(self) -> None:
# Skip the installation if the spec is not being installed locally
# (i.e., if external or upstream) BUT flag it as installed since
# some package likely depends on it.
if _handle_external_and_upstream(pkg, task.explicit):
term_status.clear()
self._flag_installed(pkg, task.dependents)
continue
if not task.explicit:
if _handle_external_and_upstream(pkg, False):
term_status.clear()
self._flag_installed(pkg, task.dependents)
continue
# Flag a failed spec. Do not need an (install) prefix lock since
# assume using a separate (failed) prefix lock file.
@@ -2168,11 +2197,14 @@ def install(self) -> None:
if self.fail_fast:
raise InstallError(f"{fail_fast_err}: {str(exc)}", pkg=pkg)
# Terminate when a single build request has failed, or summarize errors later.
if task.is_build_request:
if single_requested_spec:
raise
failed_build_requests.append((pkg, pkg_id, str(exc)))
# Terminate at this point if the single explicit spec has
# failed to install.
if single_explicit_spec and task.explicit:
raise
# Track explicit spec id and error to summarize when done
if task.explicit:
failed_explicits.append((pkg, pkg_id, str(exc)))
finally:
# Remove the install prefix if anything went wrong during
@@ -2195,16 +2227,16 @@ def install(self) -> None:
if request.install_args.get("install_package") and request.pkg_id not in self.installed
]
if failed_build_requests or missing:
for _, pkg_id, err in failed_build_requests:
if failed_explicits or missing:
for _, pkg_id, err in failed_explicits:
tty.error(f"{pkg_id}: {err}")
for _, pkg_id in missing:
tty.error(f"{pkg_id}: Package was not installed")
if len(failed_build_requests) > 0:
pkg = failed_build_requests[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_build_requests]
if len(failed_explicits) > 0:
pkg = failed_explicits[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_explicits]
tty.debug(
"Associating installation failure with first failed "
f"explicit package ({ids[0]}) from {', '.join(ids)}"
@@ -2263,7 +2295,7 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
self.verbose = bool(install_args.get("verbose", False))
# whether installation was explicitly requested by the user
self.explicit = pkg.spec.dag_hash() in install_args.get("explicit", [])
self.explicit = install_args.get("is_root", False) and install_args.get("explicit", True)
# env before starting installation
self.unmodified_env = install_args.get("unmodified_env", {})


@@ -87,8 +87,9 @@ def from_url(url: str):
"""Create an anonymous mirror by URL. This method validates the URL."""
if not urllib.parse.urlparse(url).scheme in supported_url_schemes:
raise ValueError(
f'"{url}" is not a valid mirror URL. '
f"Scheme must be one of {supported_url_schemes}."
'"{}" is not a valid mirror URL. Scheme must be once of {}.'.format(
url, ", ".join(supported_url_schemes)
)
)
return Mirror(url)
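
A hedged usage sketch of from_url; the exact contents of supported_url_schemes are not shown in this diff, but https is presumably among them:

mirror = Mirror.from_url("https://mirror.example.com/spack")
try:
    Mirror.from_url("carrier-pigeon://aviary")
except ValueError as err:
    print(err)  # '"carrier-pigeon://aviary" is not a valid mirror URL. ...'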
@@ -733,7 +734,7 @@ def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = spack.mirror.MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
raise ValueError('no mirror named "{0}"'.format(mirror_name))
return mirror


@@ -143,7 +143,6 @@ def __init__(self):
"12": "monterey",
"13": "ventura",
"14": "sonoma",
"15": "sequoia",
}
version = macos_version()


@@ -199,10 +199,10 @@ def __init__(cls, name, bases, attr_dict):
# assumed to be detectable
if hasattr(cls, "executables") or hasattr(cls, "libraries"):
# Append a tag to each detectable package, so that finding them is faster
if not hasattr(cls, "tags"):
if hasattr(cls, "tags"):
getattr(cls, "tags").append(DetectablePackageMeta.TAG)
else:
setattr(cls, "tags", [DetectablePackageMeta.TAG])
elif DetectablePackageMeta.TAG not in cls.tags:
cls.tags.append(DetectablePackageMeta.TAG)
@classmethod
def platform_executables(cls):
@@ -1119,9 +1119,10 @@ def _make_stage(self):
if not link_format:
link_format = "build-{arch}-{hash:7}"
stage_link = self.spec.format_path(link_format)
source_stage = DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
else:
source_stage = self._make_root_stage(self.fetcher)
return DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
# To fetch the current version
source_stage = self._make_root_stage(self.fetcher)
# all_stages is source + resources + patches
all_stages = StageComposite()
@@ -1450,8 +1451,10 @@ def do_fetch(self, mirror_only=False):
return
checksum = spack.config.get("config:checksum")
fetch = self.stage.needs_fetching
if (
checksum
and fetch
and (self.version not in self.versions)
and (not isinstance(self.version, GitVersion))
):
@@ -1558,11 +1561,13 @@ def do_patch(self):
tty.debug("Patching failed last time. Restaging.")
self.stage.restage()
else:
# develop specs may have patch failures but should never be restaged
tty.warn(
f"A patch failure was detected in {self.name}."
" Build errors may occur due to this."
# develop specs / DIYStages may have patch failures but
# should never be restaged
msg = (
"A patch failure was detected in %s." % self.name
+ " Build errors may occur due to this."
)
tty.warn(msg)
return
# If this file exists, then we already applied all the patches.
@@ -1876,10 +1881,7 @@ def do_install(self, **kwargs):
verbose (bool): Display verbose build output (by default,
suppresses it)
"""
explicit = kwargs.get("explicit", True)
if isinstance(explicit, bool):
kwargs["explicit"] = {self.spec.dag_hash()} if explicit else set()
PackageInstaller([self], kwargs).install()
PackageInstaller([(self, kwargs)]).install()
# TODO (post-34236): Update tests and all packages that use this as a
# TODO (post-34236): package method to the routine made available to


@@ -33,7 +33,6 @@
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
import spack
import spack.caches
import spack.config
import spack.error
@@ -50,8 +49,6 @@
#: Package modules are imported as spack.pkg.<repo-namespace>.<pkg-name>
ROOT_PYTHON_NAMESPACE = "spack.pkg"
_API_REGEX = re.compile(r"^v(\d+)\.(\d+)$")
def python_package_for_repo(namespace):
"""Returns the full namespace of a repository, given its relative one
@@ -912,49 +909,17 @@ def __contains__(self, pkg_name):
return self.exists(pkg_name)
def _parse_package_api_version(
config: Dict[str, Any],
min_api: Tuple[int, int] = spack.min_package_api_version,
max_api: Tuple[int, int] = spack.package_api_version,
) -> Tuple[int, int]:
api = config.get("api")
if api is None:
package_api = (1, 0)
else:
if not isinstance(api, str):
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
api_match = _API_REGEX.match(api)
if api_match is None:
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
package_api = (int(api_match.group(1)), int(api_match.group(2)))
if min_api <= package_api <= max_api:
return package_api
min_str = ".".join(str(i) for i in min_api)
max_str = ".".join(str(i) for i in max_api)
curr_str = ".".join(str(i) for i in package_api)
raise BadRepoError(
f"Package API v{curr_str} is not supported by this version of Spack ("
f"must be between v{min_str} and v{max_str})"
)
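
A few illustrative calls to the parser above, assuming the module-level bounds admit v1.2:

assert _parse_package_api_version({}) == (1, 0)               # "api" key absent
assert _parse_package_api_version({"api": "v1.2"}) == (1, 2)  # well-formed, in range
# _parse_package_api_version({"api": "1.2"})   # BadRepoError: missing leading "v"
# _parse_package_api_version({"api": "v9.9"})  # BadRepoError: outside supported range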
class Repo:
"""Class representing a package repository in the filesystem.
Each package repository must have a top-level configuration file called `repo.yaml`.
Each package repository must have a top-level configuration file
called `repo.yaml`.
It contains the following keys:
Currently, `repo.yaml` must define:
`namespace`:
A Python namespace where the repository's packages should live.
`api`:
A string of the form vX.Y that indicates the Package API version. The default is "v1.0".
For the repo to be compatible with the current version of Spack, the version must be
greater than or equal to :py:data:`spack.min_package_api_version` and less than or equal to
:py:data:`spack.package_api_version`.
"""
def __init__(self, root, cache=None):
@@ -983,7 +948,7 @@ def check(condition, msg):
"%s must define a namespace." % os.path.join(root, repo_config_name),
)
self.namespace: str = config["namespace"]
self.namespace = config["namespace"]
check(
re.match(r"[a-zA-Z][a-zA-Z0-9_.]+", self.namespace),
("Invalid namespace '%s' in repo '%s'. " % (self.namespace, self.root))
@@ -996,15 +961,13 @@ def check(condition, msg):
# Keep name components around for checking prefixes.
self._names = self.full_namespace.split(".")
packages_dir: str = config.get("subdirectory", packages_dir_name)
packages_dir = config.get("subdirectory", packages_dir_name)
self.packages_path = os.path.join(self.root, packages_dir)
check(
os.path.isdir(self.packages_path),
"No directory '%s' found in '%s'" % (packages_dir, root),
)
self.package_api = _parse_package_api_version(config)
# These are internal cache variables.
self._modules = {}
self._classes = {}
@@ -1047,7 +1010,7 @@ def is_prefix(self, fullname):
parts = fullname.split(".")
return self._names[: len(parts)] == parts
def _read_config(self) -> Dict[str, Any]:
def _read_config(self):
"""Check for a YAML config file in this db's root directory."""
try:
with open(self.config_file) as reponame_file:


@@ -52,10 +52,7 @@
"target": {"type": "string"},
"alias": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"modules": {
"anyOf": [
{"type": "null"},
{"type": "array", "items": {"type": "string"}},
]
"anyOf": [{"type": "string"}, {"type": "null"}, {"type": "array"}]
},
"implicit_rpaths": {
"anyOf": [


@@ -13,7 +13,6 @@
r"\w[\w-]*": {
"type": "object",
"additionalProperties": False,
"required": ["spec"],
"properties": {"spec": {"type": "string"}, "path": {"type": "string"}},
}
},


@@ -844,6 +844,8 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
parent_dir = os.path.dirname(__file__)
self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
self.control.load(os.path.join(parent_dir, "heuristic_separate.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
@@ -1433,14 +1435,16 @@ def condition(
# caller, we won't emit partial facts.
condition_id = next(self._id_counter)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
trigger_id = self._get_condition_id(
required_spec, cache=self._trigger_cache, body=True, transform=transform_required
)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_trigger(condition_id, trigger_id))
)
if not imposed_spec:
return condition_id
@@ -1689,43 +1693,19 @@ def external_packages(self):
spack.spec.parse_with_version_concrete(x["spec"]) for x in externals
]
selected_externals = set()
external_specs = []
if spec_filters:
for current_filter in spec_filters:
current_filter.factory = lambda: candidate_specs
selected_externals.update(current_filter.selected_specs())
# Emit facts for externals specs. Note that "local_idx" is the index of the spec
# in packages:<pkg_name>:externals. This means:
#
# packages:<pkg_name>:externals[local_idx].spec == spec
external_versions = []
for local_idx, spec in enumerate(candidate_specs):
msg = f"{spec.name} available as external when satisfying {spec}"
if spec_filters and spec not in selected_externals:
continue
if not spec.versions.concrete:
warnings.warn(f"cannot use the external spec {spec}: needs a concrete version")
continue
def external_imposition(input_spec, requirements):
return requirements + [
fn.attr("external_conditions_hold", input_spec.name, local_idx)
]
try:
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
except (spack.error.SpecError, RuntimeError) as e:
warnings.warn(f"while setting up external spec {spec}: {e}")
continue
external_versions.append((spec.version, local_idx))
self.possible_versions[spec.name].add(spec.version)
self.gen.newline()
external_specs.extend(current_filter.selected_specs())
else:
external_specs.extend(candidate_specs)
# Order the external versions to prefer more recent versions
# even if specs in packages.yaml are not ordered that way
external_versions = [
(x.version, external_id) for external_id, x in enumerate(external_specs)
]
external_versions = [
(v, idx, external_id)
for idx, (v, external_id) in enumerate(sorted(external_versions, reverse=True))
@@ -1735,6 +1715,19 @@ def external_imposition(input_spec, requirements):
DeclaredVersion(version=version, idx=idx, origin=Provenance.EXTERNAL)
)
# Declare external conditions with a local index into packages.yaml
for local_idx, spec in enumerate(external_specs):
msg = "%s available as external when satisfying %s" % (spec.name, spec)
def external_imposition(input_spec, requirements):
return requirements + [
fn.attr("external_conditions_hold", input_spec.name, local_idx)
]
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
self.possible_versions[spec.name].add(spec.version)
self.gen.newline()
self.trigger_rules()
self.effect_rules()
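
The "local_idx" comment above refers to the position of each entry under packages:<pkg_name>:externals. A sketch of that configuration as Python data, with placeholder spec and prefix values:

packages_yaml = {
    "packages": {
        "cmake": {
            "externals": [
                {"spec": "cmake@3.27.0", "prefix": "/usr"},        # local_idx 0
                {"spec": "cmake@3.24.2", "prefix": "/opt/cmake"},  # local_idx 1
            ]
        }
    }
}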
@@ -1887,8 +1880,11 @@ def _spec_clauses(
)
clauses.append(f.variant_value(spec.name, vname, value))
if variant.propagate:
clauses.append(f.propagate(spec.name, fn.variant_value(vname, value)))
clauses.append(
f.variant_propagation_candidate(spec.name, vname, value, spec.name)
)
# Tell the concretizer that this is a possible value for the
# variant, to account for things like int/str values where we
@@ -1943,11 +1939,6 @@ def _spec_clauses(
for virtual in virtuals:
clauses.append(fn.attr("virtual_on_incoming_edges", spec.name, virtual))
# If the spec is external and concrete, we allow all the libcs on the system
if spec.external and spec.concrete and using_libc_compatibility():
for libc in self.libcs:
clauses.append(fn.attr("compatible_libc", spec.name, libc.name, libc.version))
# add all clauses from dependencies
if transitive:
# TODO: Eventually distinguish 2 deps on the same pkg (build and link)
@@ -2743,7 +2734,7 @@ class _Head:
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
propagate = fn.attr("propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
class _Body:
@@ -2760,7 +2751,7 @@ class _Body:
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
propagate = fn.attr("propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
class ProblemInstanceBuilder:
@@ -3234,39 +3225,6 @@ def requires(self, impose: str, *, when: str):
self.runtime_conditions.add((imposed_spec, when_spec))
self.reset()
def propagate(self, constraint_str: str, *, when: str):
msg = "the 'propagate' method can be called only with pkg('*')"
assert self.current_package == "*", msg
when_spec = spack.spec.Spec(when)
assert when_spec.name is None, "only anonymous when specs are accepted"
placeholder = "XXX"
node_variable = "node(ID, Package)"
when_spec.name = placeholder
body_clauses = self._setup.spec_clauses(when_spec, body=True)
body_str = (
f" {f',{os.linesep} '.join(str(x) for x in body_clauses)},\n"
f" not external({node_variable}),\n"
f" not runtime(Package)"
).replace(f'"{placeholder}"', f"{node_variable}")
constraint_spec = spack.spec.Spec(constraint_str)
assert constraint_spec.name is None, "only anonymous constraint specs are accepted"
constraint_spec.name = placeholder
constraint_clauses = self._setup.spec_clauses(constraint_spec, body=False)
for clause in constraint_clauses:
if clause.args[0] == "node_compiler_version_satisfies":
self._setup.compiler_version_constraints.add(constraint_spec.compiler)
args = f'"{constraint_spec.compiler.name}", "{constraint_spec.compiler.versions}"'
head_str = f"propagate({node_variable}, node_compiler_version_satisfies({args}))"
rule = f"{head_str} :-\n{body_str}.\n\n"
self.rules.append(rule)
self.reset()
def consume_facts(self):
"""Consume the facts collected by this object, and emits rules and
facts for the runtimes.

View File

@@ -811,6 +811,37 @@ node_has_variant(node(ID, Package), Variant) :-
pkg_fact(Package, variant(Variant)),
attr("node", node(ID, Package)).
% Variant propagation is forwarded to dependencies
attr("variant_propagation_candidate", PackageNode, Variant, Value, Source) :-
attr("node", PackageNode),
depends_on(ParentNode, PackageNode),
attr("variant_value", node(_, Source), Variant, Value),
attr("variant_propagation_candidate", ParentNode, Variant, _, Source).
% If the node is a candidate, and it has the variant and value,
% then those variant and value should be propagated
attr("variant_propagate", node(ID, Package), Variant, Value, Source) :-
attr("variant_propagation_candidate", node(ID, Package), Variant, Value, Source),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% Propagate the value, if there is the corresponding attribute
attr("variant_value", PackageNode, Variant, Value) :- attr("variant_propagate", PackageNode, Variant, Value, _).
% If a variant is propagated, we cannot have extraneous values (this is for multi valued variants)
variant_is_propagated(PackageNode, Variant) :- attr("variant_propagate", PackageNode, Variant, _, _).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not attr("variant_propagate", PackageNode, Variant, Value, _).
% Cannot receive different values from different sources on the same variant
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, Package, Value1, Value2) :-
attr("variant_propagate", node(X, Package), Variant, Value1, Source1),
attr("variant_propagate", node(X, Package), Variant, Value2, Source2),
node_has_variant(node(X, Package), Variant),
Value1 < Value2, Source1 < Source2.
% a variant cannot be set if it is not a variant on the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
:- attr("variant_set", node(X, Package), Variant),
@@ -888,7 +919,7 @@ variant_not_default(node(ID, Package), Variant, Value)
% variants set explicitly on the CLI don't count as non-default
not attr("variant_set", node(ID, Package), Variant, Value),
% variant values forced by propagation don't count as non-default
not propagate(node(ID, Package), variant_value(Variant, Value)),
not attr("variant_propagate", node(ID, Package), Variant, Value, _),
% variants set on externals that we could use don't count as non-default
% this makes spack prefer to use an external over rebuilding with the
% default configuration
@@ -901,7 +932,7 @@ variant_default_not_used(node(ID, Package), Variant, Value)
:- variant_default_value(Package, Variant, Value),
node_has_variant(node(ID, Package), Variant),
not attr("variant_value", node(ID, Package), Variant, Value),
not propagate(node(ID, Package), variant_value(Variant, _)),
not attr("variant_propagate", node(ID, Package), Variant, _, _),
attr("node", node(ID, Package)).
% The variant is set in an external spec
@@ -958,67 +989,6 @@ pkg_fact(Package, variant_single_value("dev_path"))
#defined variant_default_value/3.
#defined variant_default_value_from_packages_yaml/3.
%-----------------------------------------------------------------------------
% Propagation semantics
%-----------------------------------------------------------------------------
% Propagation roots have a corresponding attr("propagate", ...)
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute).
% Propagate an attribute along edges to child nodes
propagate(ChildNode, PropagatedAttribute) :-
propagate(ParentNode, PropagatedAttribute),
depends_on(ParentNode, ChildNode).
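
Read procedurally, the two propagate/2 rules above compute a downward closure over depends_on edges. A rough Python equivalent, with invented names, iterating to fixpoint:

def propagate_attrs(roots, edges):
    # roots: {node: set(attrs)} seeded from attr("propagate", ...) facts
    # edges: {parent: [child, ...]} from depends_on/2
    result = {node: set(attrs) for node, attrs in roots.items()}
    changed = True
    while changed:
        changed = False
        for parent, children in edges.items():
            for child in children:
                acc = result.setdefault(child, set())
                new = result.get(parent, set()) - acc
                if new:
                    acc |= new
                    changed = True
    return result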
%-----------------------------------------------------------------------------
% Activation of propagated values
%-----------------------------------------------------------------------------
%----
% Variants
%----
% If a variant is propagated, and can be accepted, set its value
attr("variant_value", node(ID, Package), Variant, Value) :-
propagate(node(ID, Package), variant_value(Variant, Value)),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% If a variant is propagated, we cannot have extraneous values
variant_is_propagated(PackageNode, Variant) :-
attr("variant_value", PackageNode, Variant, Value),
propagate(PackageNode, variant_value(Variant, Value)),
not attr("variant_set", PackageNode, Variant).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not propagate(PackageNode, variant_value(Variant, Value)).
%----
% Compiler constraints
%----
attr("node_compiler_version_satisfies", node(ID, Package), Compiler, Version) :-
propagate(node(ID, Package), node_compiler_version_satisfies(Compiler, Version)),
node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
not runtime(Package),
not external(Package).
%-----------------------------------------------------------------------------
% Runtimes
%-----------------------------------------------------------------------------
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% If we build packages, the runtime nodes must use an available compiler
1 { node_compiler(PackageNode, CompilerID) : build(PackageNode), not external(PackageNode) } :-
has_built_packages(),
runtime(RuntimePackage),
node_compiler(node(_, RuntimePackage), CompilerID).
%-----------------------------------------------------------------------------
% Platform semantics
%-----------------------------------------------------------------------------
@@ -1120,18 +1090,10 @@ attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).
% each node has the weight of its assigned target
target_weight(Target, 0)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
attr("node_target_set", PackageNode, Target).
node_target_weight(PackageNode, MinWeight)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
target(Target),
MinWeight = #min { Weight : target_weight(Target, Weight) }.
:- attr("node_target", PackageNode, Target), not node_target_weight(PackageNode, _).
node_target_weight(node(ID, Package), Weight)
:- attr("node", node(ID, Package)),
attr("node_target", node(ID, Package), Target),
target_weight(Target, Weight).
% compatibility rules for targets among nodes
node_target_match(ParentNode, DependencyNode)
@@ -1193,12 +1155,12 @@ error(10, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, _).
not compiler_version_satisfies(Compiler, Constraint, _).
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, ID),
@@ -1383,10 +1345,8 @@ build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode
% topmost-priority criterion to reuse what is installed.
%
% The priority ranges are:
% 1000+ Optimizations for concretization errors
% 300 - 1000 Highest priority optimizations for valid solutions
% 200 - 299 Shifted priorities for build nodes; correspond to priorities 0 - 99.
% 100 - 199 Unshifted priorities. Currently only includes minimizing #builds and minimizing dupes.
% 200+ Shifted priorities for build nodes; correspond to priorities 0 - 99.
% 100 - 199 Unshifted priorities. Currently only includes minimizing #builds.
% 0 - 99 Priorities for non-built nodes.
build_priority(PackageNode, 200) :- build(PackageNode), attr("node", PackageNode).
build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", PackageNode).
@@ -1434,16 +1394,6 @@ build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", Package
% 2. a `#minimize{ 0@2 : #true }.` statement that ensures the criterion
% is displayed (clingo doesn't display sums over empty sets by default)
% A condition group specifies one or more specs that must be satisfied.
% Specs declared first are preferred, so we assign increasing weights and
% minimize the weights.
opt_criterion(310, "requirement weight").
#minimize{ 0@310: #true }.
#minimize {
Weight@310,PackageNode,Group
: requirement_weight(PackageNode, Group, Weight)
}.
% Try hard to reuse installed packages (i.e., minimize the number built)
opt_criterion(110, "number of packages to build (vs. reuse)").
#minimize { 0@110: #true }.
@@ -1455,6 +1405,18 @@ opt_criterion(100, "number of nodes from the same package").
#minimize { ID@100,Package : attr("virtual_node", node(ID, Package)) }.
#defined optimize_for_reuse/0.
% A condition group specifies one or more specs that must be satisfied.
% Specs declared first are preferred, so we assign increasing weights and
% minimize the weights.
opt_criterion(75, "requirement weight").
#minimize{ 0@275: #true }.
#minimize{ 0@75: #true }.
#minimize {
Weight@75+Priority,PackageNode,Group
: requirement_weight(PackageNode, Group, Weight),
build_priority(PackageNode, Priority)
}.
% Minimize the number of deprecated versions being used
opt_criterion(73, "deprecated versions used").
#minimize{ 0@273: #true }.
@@ -1534,7 +1496,7 @@ opt_criterion(45, "preferred providers (non-roots)").
}.
% Try to minimize the number of compiler mismatches in the DAG.
opt_criterion(40, "compiler mismatches that are not required").
opt_criterion(40, "compiler mismatches that are not from CLI").
#minimize{ 0@240: #true }.
#minimize{ 0@40: #true }.
#minimize{
@@ -1544,7 +1506,7 @@ opt_criterion(40, "compiler mismatches that are not required").
not runtime(Dependency)
}.
opt_criterion(39, "compiler mismatches that are required").
opt_criterion(39, "compiler mismatches that are not from CLI").
#minimize{ 0@239: #true }.
#minimize{ 0@39: #true }.
#minimize{


@@ -4,35 +4,21 @@
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed-up solves
% Heuristic to speed-up solves (node with ID 0)
%=============================================================================
% No duplicates by default (most of them will be true)
#heuristic attr("node", node(PackageID, Package)). [100, init]
#heuristic attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("virtual_node", node(VirtualID, Virtual)). [100, init]
#heuristic attr("node", node(1..X-1, Package)) : max_dupes(Package, X), not virtual(Package), X > 1. [-1, sign]
#heuristic attr("virtual_node", node(1..X-1, Package)) : max_dupes(Package, X), virtual(Package) , X > 1. [-1, sign]
% Pick preferred version
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)). [40, init]
#heuristic version_weight(node(PackageID, Package), 0) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
%-----------------
% Domain heuristic
%-----------------
% Use default variants
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, true]
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : not variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, false]
% Root node
#heuristic attr("version", node(0, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic version_weight(node(0, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic attr("variant_value", node(0, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("root", node(0, Package)). [35, true]
#heuristic attr("node_target", node(0, Package), Target) : target_weight(Target, 0), attr("root", node(0, Package)). [35, true]
#heuristic node_target_weight(node(0, Package), 0) : attr("root", node(0, Package)). [35, true]
#heuristic node_compiler(node(0, Package), CompilerID) : compiler_weight(ID, 0), compiler_id(ID), attr("root", node(0, Package)). [35, true]
% Use default operating system and platform
#heuristic attr("node_os", node(PackageID, Package), OS) : os(OS, 0), attr("root", node(PackageID, Package)). [40, true]
#heuristic attr("node_platform", node(PackageID, Package), Platform) : allowed_platform(Platform), attr("root", node(PackageID, Package)). [40, true]
% Use default targets
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [30, init]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, 0), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
% Use the default compilers
#heuristic node_compiler(node(PackageID, Package), ID) : compiler_weight(ID, 0), compiler_id(ID), attr("node", node(PackageID, Package)). [30, init]
% Providers
#heuristic attr("node", node(0, Package)) : default_provider_preference(Virtual, Package, 0), possible_in_link_run(Package). [30, true]


@@ -0,0 +1,24 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed-up solves (node with ID > 0)
%=============================================================================
% node(ID, _)
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
% node(ID, _), split build dependencies
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]


@@ -10,13 +10,15 @@
%=============================================================================
% A package cannot be reused if the libc is not compatible with it
error(100, "Cannot reuse {0} since we cannot determine libc compatibility", ReusedPackage)
:- provider(node(X, LibcPackage), node(0, "libc")),
attr("version", node(X, LibcPackage), LibcVersion),
attr("hash", node(R, ReusedPackage), Hash),
% Libc packages can be reused without the "compatible_libc" attribute
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
:- provider(node(X, LibcPackage), node(0, "libc")),
attr("version", node(X, LibcPackage), LibcVersion),
attr("hash", node(R, ReusedPackage), Hash),
% Libc packages can be reused without the "compatible_libc" attribute
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% A libc is needed in the DAG
:- has_built_packages(), not provider(_, node(0, "libc")).


@@ -12,7 +12,6 @@
%=============================================================================
% macOS
os_compatible("sequoia", "sonoma").
os_compatible("sonoma", "ventura").
os_compatible("ventura", "monterey").
os_compatible("monterey", "bigsur").


@@ -2045,18 +2045,6 @@ def to_node_dict(self, hash=ht.dag_hash):
if params:
d["parameters"] = params
if params and not self.concrete:
flag_names = [
name
for name, flags in self.compiler_flags.items()
if any(x.propagate for x in flags)
]
d["propagate"] = sorted(
itertools.chain(
[v.name for v in self.variants.values() if v.propagate], flag_names
)
)
if self.external:
d["external"] = syaml.syaml_dict(
[
@@ -2229,10 +2217,16 @@ def node_dict_with_hashes(self, hash=ht.dag_hash):
spec is concrete, the full hash is added as well. If 'build' is in
the hash_type, the build hash is also added."""
node = self.to_node_dict(hash)
# All specs have at least a DAG hash
node[ht.dag_hash.name] = self.dag_hash()
if not self.concrete:
# dag_hash is lazily computed -- but if we write a spec out, we want it
# to be included. This is effectively the last chance we get to compute
# it accurately.
if self.concrete:
# all specs have at least a DAG hash
node[ht.dag_hash.name] = self.dag_hash()
else:
node["concrete"] = False
# we can also give them other hash types if we want
@@ -4170,21 +4164,29 @@ def __getitem__(self, name: str):
csv = query_parameters.pop().strip()
query_parameters = re.split(r"\s*,\s*", csv)
# In some cases a package appears multiple times in the same DAG for *distinct*
# specs. For example, a build-type dependency may itself depend on a package
# the current spec depends on, but their specs may differ. Therefore we iterate
# in an order here that prioritizes the build, test and runtime dependencies;
# only when we don't find the package do we consider the full DAG.
order = lambda: itertools.chain(
self.traverse_edges(deptype=dt.LINK, order="breadth", cover="edges"),
self.edges_to_dependencies(depflag=dt.BUILD | dt.RUN | dt.TEST),
self.traverse_edges(deptype=dt.ALL, order="breadth", cover="edges"),
self.traverse(deptype="link"),
self.dependencies(deptype=dt.BUILD | dt.RUN | dt.TEST),
self.traverse(), # fall back to a full search
)
# Consider runtime dependencies and direct build/test deps before transitive dependencies,
# and prefer matches closest to the root.
try:
child: Spec = next(
e.spec
for e in itertools.chain(
(e for e in order() if e.spec.name == name or name in e.virtuals),
# for historical reasons
(e for e in order() if e.spec.concrete and e.spec.package.provides(name)),
itertools.chain(
# Regular specs
(x for x in order() if x.name == name),
(
x
for x in order()
if (not x.virtual)
and any(name in edge.virtuals for edge in x.edges_from_dependents())
),
(x for x in order() if (not x.virtual) and x.package.provides(name)),
)
)
except StopIteration:
@@ -4426,12 +4428,9 @@ def format_attribute(match_object: Match) -> str:
if part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute")
else:
if isinstance(current, vt.VariantMap):
if part == "variants" and isinstance(current, vt.VariantMap):
# subscript instead of getattr for variant names
try:
current = current[part]
except KeyError:
raise SpecFormatStringError(f"Variant '{part}' does not exist")
current = current[part]
else:
# aliases
if part == "arch":
@@ -5005,17 +5004,13 @@ def from_node_dict(cls, node):
else:
spec.compiler = None
propagated_names = node.get("propagate", [])
for name, values in node.get("parameters", {}).items():
propagate = name in propagated_names
if name in _valid_compiler_flags:
spec.compiler_flags[name] = []
for val in values:
spec.compiler_flags.add_flag(name, val, propagate)
spec.compiler_flags.add_flag(name, val, False)
else:
spec.variants[name] = vt.MultiValuedVariant.from_node_dict(
name, values, propagate=propagate
)
spec.variants[name] = vt.MultiValuedVariant.from_node_dict(name, values)
spec.external_path = None
spec.external_modules = None


@@ -13,7 +13,7 @@
import stat
import sys
import tempfile
from typing import Callable, Dict, Iterable, List, Optional, Set
from typing import Callable, Dict, Iterable, Optional, Set
import llnl.string
import llnl.util.lang
@@ -40,7 +40,6 @@
import spack.resource
import spack.spec
import spack.stage
import spack.util.crypto
import spack.util.lock
import spack.util.path as sup
import spack.util.pattern as pattern
@@ -347,6 +346,8 @@ class Stage(LockableStagingDir):
similar, and are intended to persist for only one run of spack.
"""
#: Most staging is managed by Spack. DIYStage is one exception.
needs_fetching = True
requires_patch_success = True
def __init__(
@@ -535,29 +536,32 @@ def generate_fetchers():
for fetcher in dynamic_fetchers:
yield fetcher
errors: List[str] = []
def print_errors(errors):
for msg in errors:
tty.debug(msg)
errors = []
for fetcher in generate_fetchers():
try:
fetcher.stage = self
self.fetcher = fetcher
self.fetcher.fetch()
break
except fs.NoCacheError:
except spack.fetch_strategy.NoCacheError:
# Don't bother reporting when something is not cached.
continue
except fs.FailedDownloadError as f:
errors.extend(f"{fetcher}: {e.__class__.__name__}: {e}" for e in f.exceptions)
continue
except spack.error.SpackError as e:
errors.append(f"{fetcher}: {e.__class__.__name__}: {e}")
errors.append("Fetching from {0} failed.".format(fetcher))
tty.debug(e)
continue
else:
print_errors(errors)
self.fetcher = self.default_fetcher
if err_msg:
raise spack.error.FetchError(err_msg)
raise spack.error.FetchError(
f"All fetchers failed for {self.name}", "\n".join(f" {e}" for e in errors)
)
default_msg = "All fetchers failed for {0}".format(self.name)
raise spack.error.FetchError(err_msg or default_msg, None)
print_errors(errors)
def steal_source(self, dest):
"""Copy the source_path directory in its entirety to directory dest
@@ -768,6 +772,8 @@ def __init__(self):
"cache_mirror",
"steal_source",
"disable_mirrors",
"needs_fetching",
"requires_patch_success",
]
)
@@ -806,10 +812,6 @@ def path(self):
def archive_file(self):
return self[0].archive_file
@property
def requires_patch_success(self):
return self[0].requires_patch_success
@property
def keep(self):
return self[0].keep
@@ -820,7 +822,64 @@ def keep(self, value):
item.keep = value
class DIYStage:
"""
Simple class that allows any directory to be a spack stage. Consequently,
it does not expect or require that the source path adhere to the standard
directory naming convention.
"""
needs_fetching = False
requires_patch_success = False
def __init__(self, path):
if path is None:
raise ValueError("Cannot construct DIYStage without a path.")
elif not os.path.isdir(path):
raise StagePathError("The stage path directory does not exist:", path)
self.archive_file = None
self.path = path
self.source_path = path
self.created = True
# DIY stages do nothing as context managers.
def __enter__(self):
pass
def __exit__(self, exc_type, exc_val, exc_tb):
pass
def fetch(self, *args, **kwargs):
tty.debug("No need to fetch for DIY.")
def check(self):
tty.debug("No checksum needed for DIY.")
def expand_archive(self):
tty.debug("Using source directory: {0}".format(self.source_path))
@property
def expanded(self):
"""Returns True since the source_path must exist."""
return True
def restage(self):
raise RestageError("Cannot restage a DIY stage.")
def create(self):
self.created = True
def destroy(self):
# No need to destroy DIY stage.
pass
def cache_local(self):
tty.debug("Sources for DIY stages are not cached")
class DevelopStage(LockableStagingDir):
needs_fetching = False
requires_patch_success = False
def __init__(self, name, dev_path, reference_link):
@@ -1186,7 +1245,7 @@ def _fetch_and_checksum(url, options, keep_stage, action_fn=None):
# Checksum the archive and add it to the list
checksum = spack.util.crypto.checksum(hashlib.sha256, stage.archive_file)
return checksum, None
except fs.FailedDownloadError:
except FailedDownloadError:
return None, f"[WORKER] Failed to fetch {url}"
except Exception as e:
return None, f"[WORKER] Something failed on {url}, skipping. ({e})"
@@ -1206,3 +1265,7 @@ class RestageError(StageError):
class VersionFetchError(StageError):
"""Raised when we can't determine a URL to fetch a package."""
# Keep this in namespace for convenience
FailedDownloadError = fs.FailedDownloadError


@@ -371,6 +371,7 @@ def use_store(
data.update(extra_data)
# Swap the store with the one just constructed and return it
ensure_singleton_created()
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope(name=scope_name, data={"config": {"install_tree": data}})
)
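
The use_store hunk above pushes an InternalConfigScope so that config:install_tree reflects the new root for the duration of the context; the regression test a few files below checks exactly this. A generic sketch of the push-a-scope, pop-on-exit pattern, with ConfigStack as a stand-in rather than Spack's real config API:

    # Stand-in for the scope-stacking pattern behind use_store; this is
    # an illustration, not spack.config's actual interface.
    from contextlib import contextmanager

    class ConfigStack:
        def __init__(self):
            self._scopes = []

        def push_scope(self, data):
            self._scopes.append(data)

        def pop_scope(self):
            return self._scopes.pop()

        def get(self, key, default=None):
            for scope in reversed(self._scopes):  # innermost scope wins
                if key in scope:
                    return scope[key]
            return default

    CONFIG = ConfigStack()

    @contextmanager
    def use_store(root):
        CONFIG.push_scope({"config:install_tree:root": str(root)})
        try:
            yield
        finally:
            CONFIG.pop_scope()  # restore the previous configuration
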


@@ -218,12 +218,10 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
spec = Spec(
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
spec = Spec(f"a %gcc@10 foobar=bar target={root_target_range} ^b target={dep_target_range}")
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
assert spec.target == spec["pkg-b"].target == result
assert spec.target == spec["b"].target == result
@pytest.mark.parametrize(


@@ -19,6 +19,8 @@
(["missing-dependency"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# The package use a non existing variant in a depends_on directive
(["wrong-variant-in-depends-on"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a GitHub pull request commit patch URL
(["invalid-github-pull-commits-patch-url"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a GitHub patch URL without full_index=1
(["invalid-github-patch-url"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has invalid GitLab patch URLs


@@ -228,25 +228,3 @@ def test_source_is_disabled(mutable_config):
spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
@pytest.mark.regression("45247")
def test_use_store_does_not_try_writing_outside_root(tmp_path, monkeypatch, mutable_config):
"""Tests that when we use the 'use_store' context manager, there is no attempt at creating
a Store outside the given root.
"""
initial_store = mutable_config.get("config:install_tree:root")
user_store = tmp_path / "store"
fn = spack.store.Store.__init__
def _checked_init(self, root, *args, **kwargs):
fn(self, root, *args, **kwargs)
assert self.root == str(user_store)
monkeypatch.setattr(spack.store.Store, "__init__", _checked_init)
spack.store.reinitialize()
with spack.store.use_store(user_store):
assert spack.config.CONFIG.get("config:install_tree:root") == str(user_store)
assert spack.config.CONFIG.get("config:install_tree:root") == initial_store


@@ -457,14 +457,14 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
# a foobar=bar (parallel = False)
# |
# b (parallel =True)
s = default_mock_concretization("pkg-a foobar=bar")
s = default_mock_concretization("a foobar=bar")
spack.build_environment.set_package_py_globals(s.package, context=Context.BUILD)
assert s["pkg-a"].package.module.make_jobs == 1
assert s["a"].package.module.make_jobs == 1
spack.build_environment.set_package_py_globals(s["pkg-b"].package, context=Context.BUILD)
assert s["pkg-b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["pkg-b"].package.parallel
spack.build_environment.set_package_py_globals(s["b"].package, context=Context.BUILD)
assert s["b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["b"].package.parallel
)
@@ -560,7 +560,7 @@ def test_dirty_disable_module_unload(config, mock_packages, working_env, mock_mo
"""Test that on CRAY platform 'module unload' is not called if the 'dirty'
option is on.
"""
s = spack.spec.Spec("pkg-a").concretized()
s = spack.spec.Spec("a").concretized()
# If called with "dirty" we don't unload modules, so no calls to the
# `module` function on Cray


@@ -97,7 +97,7 @@ def test_negative_ninja_check(self, input_dir, test_dir, concretize_and_setup):
@pytest.mark.usefixtures("config", "mock_packages")
class TestAutotoolsPackage:
def test_with_or_without(self, default_mock_concretization):
s = default_mock_concretization("pkg-a")
s = default_mock_concretization("a")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature
@@ -129,7 +129,7 @@ def activate(value):
assert "--without-lorem-ipsum" in options
def test_none_is_allowed(self, default_mock_concretization):
s = default_mock_concretization("pkg-a foo=none")
s = default_mock_concretization("a foo=none")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature


@@ -12,21 +12,21 @@
def test_build_task_errors(install_mockery):
with pytest.raises(ValueError, match="must be a package"):
inst.BuildTask("abc", None, False, 0, 0, 0, set())
inst.BuildTask("abc", None, False, 0, 0, 0, [])
spec = spack.spec.Spec("trivial-install-test-package")
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
with pytest.raises(ValueError, match="must have a concrete spec"):
inst.BuildTask(pkg_cls(spec), None, False, 0, 0, 0, set())
inst.BuildTask(pkg_cls(spec), None, False, 0, 0, 0, [])
spec.concretize()
assert spec.concrete
with pytest.raises(ValueError, match="must have a build request"):
inst.BuildTask(spec.package, None, False, 0, 0, 0, set())
inst.BuildTask(spec.package, None, False, 0, 0, 0, [])
request = inst.BuildRequest(spec.package, {})
with pytest.raises(inst.InstallError, match="Cannot create a build task"):
inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_REMOVED, set())
inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_REMOVED, [])
def test_build_task_basics(install_mockery):
@@ -36,8 +36,8 @@ def test_build_task_basics(install_mockery):
# Ensure key properties match expectations
request = inst.BuildRequest(spec.package, {})
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, set())
assert not task.explicit
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, [])
assert task.explicit # package was "explicitly" requested
assert task.priority == len(task.uninstalled_deps)
assert task.key == (task.priority, task.sequence)
@@ -58,7 +58,7 @@ def test_build_task_strings(install_mockery):
# Ensure key properties match expectations
request = inst.BuildRequest(spec.package, {})
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, set())
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, [])
# Cover __repr__
irep = task.__repr__()


@@ -106,24 +106,24 @@ def test_specs_staging(config, tmpdir):
"""
builder = repo.MockRepositoryBuilder(tmpdir)
builder.add_package("pkg-g")
builder.add_package("pkg-f")
builder.add_package("pkg-e")
builder.add_package("pkg-d", dependencies=[("pkg-f", None, None), ("pkg-g", None, None)])
builder.add_package("pkg-c")
builder.add_package("pkg-b", dependencies=[("pkg-d", None, None), ("pkg-e", None, None)])
builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)])
builder.add_package("g")
builder.add_package("f")
builder.add_package("e")
builder.add_package("d", dependencies=[("f", None, None), ("g", None, None)])
builder.add_package("c")
builder.add_package("b", dependencies=[("d", None, None), ("e", None, None)])
builder.add_package("a", dependencies=[("b", None, None), ("c", None, None)])
with repo.use_repositories(builder.root):
spec_a = Spec("pkg-a").concretized()
spec_a = Spec("a").concretized()
spec_a_label = ci._spec_ci_label(spec_a)
spec_b_label = ci._spec_ci_label(spec_a["pkg-b"])
spec_c_label = ci._spec_ci_label(spec_a["pkg-c"])
spec_d_label = ci._spec_ci_label(spec_a["pkg-d"])
spec_e_label = ci._spec_ci_label(spec_a["pkg-e"])
spec_f_label = ci._spec_ci_label(spec_a["pkg-f"])
spec_g_label = ci._spec_ci_label(spec_a["pkg-g"])
spec_b_label = ci._spec_ci_label(spec_a["b"])
spec_c_label = ci._spec_ci_label(spec_a["c"])
spec_d_label = ci._spec_ci_label(spec_a["d"])
spec_e_label = ci._spec_ci_label(spec_a["e"])
spec_f_label = ci._spec_ci_label(spec_a["f"])
spec_g_label = ci._spec_ci_label(spec_a["g"])
spec_labels, dependencies, stages = ci.stage_spec_jobs([spec_a])
@@ -1290,7 +1290,7 @@ def test_ci_generate_override_runner_attrs(
spack:
specs:
- flatten-deps
- pkg-a
- a
mirrors:
some-mirror: https://my.fake.mirror
ci:
@@ -1307,12 +1307,12 @@ def test_ci_generate_override_runner_attrs(
- match:
- dependency-install
- match:
- pkg-a
- a
build-job:
tags:
- specific-a-2
- match:
- pkg-a
- a
build-job-remove:
tags:
- toplevel2
@@ -1372,8 +1372,8 @@ def test_ci_generate_override_runner_attrs(
assert global_vars["SPACK_CHECKOUT_VERSION"] == git_version or "v0.20.0.test0"
for ci_key in yaml_contents.keys():
if ci_key.startswith("pkg-a"):
# Make sure pkg-a's attributes override variables, and all the
if ci_key.startswith("a"):
# Make sure a's attributes override variables, and all the
# scripts. Also, make sure the 'toplevel' tag doesn't
# appear twice, but that a's specific extra tag does appear
the_elt = yaml_contents[ci_key]
@@ -1830,7 +1830,7 @@ def test_ci_generate_read_broken_specs_url(
tmpdir, mutable_mock_env_path, install_mockery, mock_packages, monkeypatch, ci_base_environment
):
"""Verify that `broken-specs-url` works as intended"""
spec_a = Spec("pkg-a")
spec_a = Spec("a")
spec_a.concretize()
a_dag_hash = spec_a.dag_hash()
@@ -1856,7 +1856,7 @@ def test_ci_generate_read_broken_specs_url(
spack:
specs:
- flatten-deps
- pkg-a
- a
mirrors:
some-mirror: https://my.fake.mirror
ci:
@@ -1864,9 +1864,9 @@ def test_ci_generate_read_broken_specs_url(
pipeline-gen:
- submapping:
- match:
- pkg-a
- a
- flatten-deps
- pkg-b
- b
- dependency-install
build-job:
tags:


@@ -81,14 +81,14 @@ def test_match_spec_env(mock_packages, mutable_mock_env_path):
"""
# Initial sanity check: we are planning on choosing a non-default
# value, so make sure that is in fact not the default.
check_defaults = spack.cmd.parse_specs(["pkg-a"], concretize=True)[0]
check_defaults = spack.cmd.parse_specs(["a"], concretize=True)[0]
assert not check_defaults.satisfies("foobar=baz")
e = ev.create("test")
e.add("pkg-a foobar=baz")
e.add("a foobar=baz")
e.concretize()
with e:
env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0])
env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
assert env_spec.satisfies("foobar=baz")
assert env_spec.concrete
@@ -96,12 +96,12 @@ def test_match_spec_env(mock_packages, mutable_mock_env_path):
@pytest.mark.usefixtures("config")
def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path):
e = ev.create("test")
e.add("pkg-a foobar=baz")
e.add("pkg-a foobar=fee")
e.add("a foobar=baz")
e.add("a foobar=fee")
e.concretize()
with e:
with pytest.raises(ev.SpackEnvironmentError) as exc_info:
spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0])
spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
assert "matches multiple specs" in exc_info.value.message
@@ -109,16 +109,16 @@ def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path):
@pytest.mark.usefixtures("config")
def test_root_and_dep_match_returns_root(mock_packages, mutable_mock_env_path):
e = ev.create("test")
e.add("pkg-b@0.9")
e.add("pkg-a foobar=bar") # Depends on b, should choose b@1.0
e.add("b@0.9")
e.add("a foobar=bar") # Depends on b, should choose b@1.0
e.concretize()
with e:
# This query matches the root b and b as a dependency of a. In that
# case the root instance should be preferred.
env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b"])[0])
env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b"])[0])
assert env_spec1.satisfies("@0.9")
env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b@1.0"])[0])
env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b@1.0"])[0])
assert env_spec2


@@ -51,8 +51,8 @@ def test_concretize_root_test_dependencies_are_concretized(unify, mutable_mock_e
with ev.read("test") as e:
e.unify = unify
add("pkg-a")
add("pkg-b")
add("a")
add("b")
concretize("--test", "root")
assert e.matching_spec("test-dependency")


@@ -15,26 +15,26 @@
def test_env(mutable_mock_env_path, config, mock_packages):
ev.create("test")
with ev.read("test") as e:
e.add("pkg-a@2.0 foobar=bar ^pkg-b@1.0")
e.add("pkg-a@1.0 foobar=bar ^pkg-b@0.9")
e.add("a@2.0 foobar=bar ^b@1.0")
e.add("a@1.0 foobar=bar ^b@0.9")
e.concretize()
e.write()
def test_deconcretize_dep(test_env):
with ev.read("test") as e:
deconcretize("-y", "pkg-b@1.0")
deconcretize("-y", "b@1.0")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 1
assert specs[0].satisfies("pkg-a@1.0")
assert specs[0].satisfies("a@1.0")
def test_deconcretize_all_dep(test_env):
with ev.read("test") as e:
with pytest.raises(SpackCommandError):
deconcretize("-y", "pkg-b")
deconcretize("-y", "--all", "pkg-b")
deconcretize("-y", "b")
deconcretize("-y", "--all", "b")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 0
@@ -42,27 +42,27 @@ def test_deconcretize_all_dep(test_env):
def test_deconcretize_root(test_env):
with ev.read("test") as e:
output = deconcretize("-y", "--root", "pkg-b@1.0")
output = deconcretize("-y", "--root", "b@1.0")
assert "No matching specs to deconcretize" in output
assert len(e.concretized_order) == 2
deconcretize("-y", "--root", "pkg-a@2.0")
deconcretize("-y", "--root", "a@2.0")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 1
assert specs[0].satisfies("pkg-a@1.0")
assert specs[0].satisfies("a@1.0")
def test_deconcretize_all_root(test_env):
with ev.read("test") as e:
with pytest.raises(SpackCommandError):
deconcretize("-y", "--root", "pkg-a")
deconcretize("-y", "--root", "a")
output = deconcretize("-y", "--root", "--all", "pkg-b")
output = deconcretize("-y", "--root", "--all", "b")
assert "No matching specs to deconcretize" in output
assert len(e.concretized_order) == 2
deconcretize("-y", "--root", "--all", "pkg-a")
deconcretize("-y", "--root", "--all", "a")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 0


@@ -28,9 +28,7 @@
import spack.package_base
import spack.paths
import spack.repo
import spack.store
import spack.util.spack_json as sjson
import spack.util.spack_yaml
from spack.cmd.env import _env_create
from spack.main import SpackCommand, SpackCommandError
from spack.spec import Spec
@@ -503,7 +501,7 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
"""\
spack:
specs:
- pkg-a
- a
- depb
"""
)
@@ -522,8 +520,8 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
depb = spack.store.STORE.db.query_one("depb", installed=True)
assert depb, "Expected depb to be installed"
a = spack.store.STORE.db.query_one("pkg-a", installed=True)
assert a, "Expected pkg-a to be installed"
a = spack.store.STORE.db.query_one("a", installed=True)
assert a, "Expected a to be installed"
def test_remove_after_concretize():
@@ -827,7 +825,7 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
"""\
spack:
specs:
- pkg-a
- a
view: true
"""
)
@@ -835,9 +833,9 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
external_config = io.StringIO(
"""\
packages:
pkg-a:
a:
externals:
- spec: pkg-a@2.0
- spec: a@2.0
prefix: {a_prefix}
buildable: false
""".format(
@@ -1739,17 +1737,6 @@ def test_env_include_concrete_env_yaml(env_name):
assert test.path in combined_yaml["include_concrete"]
@pytest.mark.regression("45766")
@pytest.mark.parametrize("format", ["v1", "v2", "v3"])
def test_env_include_concrete_old_env(format, tmpdir):
lockfile = os.path.join(spack.paths.test_path, "data", "legacy_env", f"{format}.lock")
# create an env from old .lock file -- this does not update the format
env("create", "old-env", lockfile)
env("create", "--include-concrete", "old-env", "test")
assert ev.read("old-env").all_specs() == ev.read("test").all_specs()
def test_env_bad_include_concrete_env():
with pytest.raises(ev.SpackEnvironmentError):
env("create", "--include-concrete", "nonexistant_env", "combined_env")
@@ -3987,7 +3974,7 @@ def test_environment_depfile_makefile(depfile_flags, expected_installs, tmpdir,
)
# Do make dry run.
out = make("-n", "-f", makefile, "SPACK=spack", output=str)
out = make("-n", "-f", makefile, output=str)
specs_that_make_would_install = _parse_dry_run_package_installs(out)
@@ -4025,7 +4012,7 @@ def test_depfile_works_with_gitversions(tmpdir, mock_packages, monkeypatch):
env("depfile", "-o", makefile, "--make-disable-jobserver", "--make-prefix=prefix")
# Do a dry run on the generated depfile
out = make("-n", "-f", makefile, "SPACK=spack", output=str)
out = make("-n", "-f", makefile, output=str)
# Check that all specs are there (without duplicates)
specs_that_make_would_install = _parse_dry_run_package_installs(out)
@@ -4087,12 +4074,7 @@ def test_depfile_phony_convenience_targets(
# Phony install/* target should install picked package and all its deps
specs_that_make_would_install = _parse_dry_run_package_installs(
make(
"-n",
picked_spec.format("install/{name}-{version}-{hash}"),
"SPACK=spack",
output=str,
)
make("-n", picked_spec.format("install/{name}-{version}-{hash}"), output=str)
)
assert set(specs_that_make_would_install) == set(expected_installs)
@@ -4100,12 +4082,7 @@ def test_depfile_phony_convenience_targets(
# Phony install-deps/* target shouldn't install picked package
specs_that_make_would_install = _parse_dry_run_package_installs(
make(
"-n",
picked_spec.format("install-deps/{name}-{version}-{hash}"),
"SPACK=spack",
output=str,
)
make("-n", picked_spec.format("install-deps/{name}-{version}-{hash}"), output=str)
)
assert set(specs_that_make_would_install) == set(expected_installs) - {picked_package}
@@ -4165,7 +4142,7 @@ def test_spack_package_ids_variable(tmpdir, mock_packages):
make = Executable("make")
# Do dry run.
out = make("-n", "-C", str(tmpdir), "SPACK=spack", output=str)
out = make("-n", "-C", str(tmpdir), output=str)
# post-install: <hash> should've been executed
with ev.read("test") as test:


@@ -69,10 +69,10 @@ def test_query_arguments():
q_args = query_arguments(args)
assert "installed" in q_args
assert "predicate_fn" in q_args
assert "known" in q_args
assert "explicit" in q_args
assert q_args["installed"] == ["installed"]
assert q_args["predicate_fn"] is None
assert q_args["known"] is any
assert q_args["explicit"] is any
assert "start_date" in q_args
assert "end_date" not in q_args


@@ -89,7 +89,7 @@ def check(pkg):
assert pkg.run_tests
monkeypatch.setattr(spack.package_base.PackageBase, "unit_test_check", check)
install("--test=all", "pkg-a")
install("--test=all", "a")
def test_install_package_already_installed(
@@ -570,58 +570,61 @@ def test_cdash_upload_build_error(tmpdir, mock_fetch, install_mockery, capfd):
@pytest.mark.disable_clean_stage_check
def test_cdash_upload_clean_build(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capturing of e.g., Build.xml output
with capfd.disabled(), tmpdir.as_cwd():
install("--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "</Build>" in content
assert "<Text>" not in content
with capfd.disabled():
with tmpdir.as_cwd():
install("--log-file=cdash_reports", "--log-format=cdash", "a")
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "</Build>" in content
assert "<Text>" not in content
@pytest.mark.disable_clean_stage_check
def test_cdash_upload_extra_params(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capture of e.g., Build.xml output
with capfd.disabled(), tmpdir.as_cwd():
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"pkg-a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert 'Site BuildName="my_custom_build - pkg-a"' in content
assert 'Name="my_custom_site"' in content
assert "-my_custom_track" in content
with capfd.disabled():
with tmpdir.as_cwd():
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert 'Site BuildName="my_custom_build - a"' in content
assert 'Name="my_custom_site"' in content
assert "-my_custom_track" in content
@pytest.mark.disable_clean_stage_check
def test_cdash_buildstamp_param(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capture of e.g., Build.xml output
with capfd.disabled(), tmpdir.as_cwd():
cdash_track = "some_mocked_track"
buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-buildstamp={0}".format(buildstamp),
"pkg-a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert buildstamp in content
with capfd.disabled():
with tmpdir.as_cwd():
cdash_track = "some_mocked_track"
buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-buildstamp={0}".format(buildstamp),
"a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert buildstamp in content
@pytest.mark.disable_clean_stage_check
@@ -629,37 +632,38 @@ def test_cdash_install_from_spec_json(
tmpdir, mock_fetch, install_mockery, capfd, mock_packages, mock_archive, config
):
# capfd interferes with Spack's capturing
with capfd.disabled(), tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
with capfd.disabled():
with tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
pkg_spec = Spec("pkg-a")
pkg_spec.concretize()
pkg_spec = Spec("a")
pkg_spec.concretize()
with open(spec_json_path, "w") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
with open(spec_json_path, "w") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
install(
"--log-format=cdash",
"--log-file=cdash_reports",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"-f",
spec_json_path,
)
install(
"--log-format=cdash",
"--log-file=cdash_reports",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"-f",
spec_json_path,
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
install_command_regex = re.compile(
r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
)
m = install_command_regex.search(content)
assert m
install_command = m.group(1)
assert "pkg-a@" in install_command
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
install_command_regex = re.compile(
r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
)
m = install_command_regex.search(content)
assert m
install_command = m.group(1)
assert "a@" in install_command
@pytest.mark.disable_clean_stage_check
@@ -791,15 +795,15 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# ^libdwarf
# ^mpich
# libelf@0.8.10
# pkg-a~bvv
# ^pkg-b
# pkg-a
# ^pkg-b
# a~bvv
# ^b
# a
# ^b
e = ev.create("test", with_view=False)
e.add("mpileaks")
e.add("libelf@0.8.10") # so env has both root and dep libelf specs
e.add("pkg-a")
e.add("pkg-a ~bvv")
e.add("a")
e.add("a ~bvv")
e.concretize()
e.write()
env_specs = e.all_specs()
@@ -810,9 +814,9 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# First find and remember some target concrete specs in the environment
for e_spec in env_specs:
if e_spec.satisfies(Spec("pkg-a ~bvv")):
if e_spec.satisfies(Spec("a ~bvv")):
a_spec = e_spec
elif e_spec.name == "pkg-b":
elif e_spec.name == "b":
b_spec = e_spec
elif e_spec.satisfies(Spec("mpi")):
mpi_spec = e_spec
@@ -835,8 +839,8 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
assert "You can add specs to the environment with 'spack add " in inst_out
# Without --add, ensure that two packages "a" get installed
inst_out = install("pkg-a", output=str)
assert len([x for x in e.all_specs() if x.installed and x.name == "pkg-a"]) == 2
inst_out = install("a", output=str)
assert len([x for x in e.all_specs() if x.installed and x.name == "a"]) == 2
# Install an unambiguous dependency spec (that already exists as a dep
# in the environment) and make sure it gets installed (w/ deps),
@@ -869,7 +873,7 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# root of the environment as well as installed.
assert b_spec not in e.roots()
install("--add", "pkg-b")
install("--add", "b")
assert b_spec in e.roots()
assert b_spec not in e.uninstalled_specs()
@@ -904,7 +908,7 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
# capfd interferes with Spack's capturing
with tmpdir.as_cwd(), capfd.disabled():
monkeypatch.setenv("SPACK_CDASH_AUTH_TOKEN", "asdf")
out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "a")
assert "Using CDash auth token from environment" in out
@@ -912,25 +916,26 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
@pytest.mark.disable_clean_stage_check
def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capturing of e.g., Build.xml output
with capfd.disabled(), tmpdir.as_cwd():
# Test would fail if install raised an error.
with capfd.disabled():
with tmpdir.as_cwd():
# Test would fail if install raised an error.
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = Spec("configure-warning").concretized()
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w") as f:
f.write(spec.to_json())
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = spack.spec.Spec("configure-warning").concretized()
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w") as f:
f.write(spec.to_json())
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "foo: No such file or directory" in content
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "foo: No such file or directory" in content
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@@ -947,7 +952,7 @@ def test_compiler_bootstrap(
assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()
# Test succeeds if it does not raise an error
install("pkg-a%gcc@=12.0")
install("a%gcc@=12.0")
@pytest.mark.not_on_windows("Binary mirrors not supported on windows")
@@ -987,8 +992,8 @@ def test_compiler_bootstrap_from_binary_mirror(
# Now make sure that when the compiler is installed from binary mirror,
# it also gets configured as a compiler. Test succeeds if it does not
# raise an error
install("--no-check-signature", "--cache-only", "--only", "dependencies", "pkg-b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "pkg-b%gcc@10.2.0")
install("--no-check-signature", "--cache-only", "--only", "dependencies", "b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "b%gcc@10.2.0")
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@@ -1008,7 +1013,7 @@ def test_compiler_bootstrap_already_installed(
# Test succeeds if it does not raise an error
install("gcc@=12.0")
install("pkg-a%gcc@=12.0")
install("a%gcc@=12.0")
def test_install_fails_no_args(tmpdir):
@@ -1190,7 +1195,7 @@ def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
parser = argparse.ArgumentParser()
spack.cmd.install.setup_parser(parser)
args = parser.parse_args(
["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "pkg-a"]
["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "a"]
)
specs = spack.cmd.install.concrete_specs_from_cli(args, {})
filename = spack.cmd.install.report_filename(args, specs)

View File

@@ -121,7 +121,7 @@ def test_maintainers_list_packages(mock_packages, capfd):
def test_maintainers_list_fails(mock_packages, capfd):
out = maintainers("pkg-a", fail_on_error=False)
out = maintainers("a", fail_on_error=False)
assert not out
assert maintainers.returncode == 1


@@ -11,7 +11,6 @@
import spack.config
import spack.main
import spack.modules
import spack.spec
import spack.store
module = spack.main.SpackCommand("module")
@@ -179,8 +178,8 @@ def test_setdefault_command(mutable_database, mutable_config):
}
}
spack.config.set("modules", data)
# Install two different versions of pkg-a
other_spec, preferred = "pkg-a@1.0", "pkg-a@2.0"
# Install two different versions of a package
other_spec, preferred = "a@1.0", "a@2.0"
spack.spec.Spec(other_spec).concretized().package.do_install(fake=True)
spack.spec.Spec(preferred).concretized().package.do_install(fake=True)


@@ -28,8 +28,8 @@ def install(self, spec, prefix):
pass
"""
abc = {"mockpkg-a", "mockpkg-b", "mockpkg-c"}
abd = {"mockpkg-a", "mockpkg-b", "mockpkg-d"}
abc = set(("pkg-a", "pkg-b", "pkg-c"))
abd = set(("pkg-a", "pkg-b", "pkg-d"))
# Force all tests to use a git repository *in* the mock packages repo.
@@ -53,33 +53,27 @@ def mock_pkg_git_repo(git, tmpdir_factory):
git("config", "user.name", "Spack Testing")
git("-c", "commit.gpgsign=false", "commit", "-m", "initial mock repo commit")
# add commit with mockpkg-a, mockpkg-b, mockpkg-c packages
mkdirp("mockpkg-a", "mockpkg-b", "mockpkg-c")
with open("mockpkg-a/package.py", "w") as f:
# add commit with pkg-a, pkg-b, pkg-c packages
mkdirp("pkg-a", "pkg-b", "pkg-c")
with open("pkg-a/package.py", "w") as f:
f.write(pkg_template.format(name="PkgA"))
with open("mockpkg-b/package.py", "w") as f:
with open("pkg-b/package.py", "w") as f:
f.write(pkg_template.format(name="PkgB"))
with open("mockpkg-c/package.py", "w") as f:
with open("pkg-c/package.py", "w") as f:
f.write(pkg_template.format(name="PkgC"))
git("add", "mockpkg-a", "mockpkg-b", "mockpkg-c")
git("-c", "commit.gpgsign=false", "commit", "-m", "add mockpkg-a, mockpkg-b, mockpkg-c")
git("add", "pkg-a", "pkg-b", "pkg-c")
git("-c", "commit.gpgsign=false", "commit", "-m", "add pkg-a, pkg-b, pkg-c")
# remove mockpkg-c, add mockpkg-d
with open("mockpkg-b/package.py", "a") as f:
f.write("\n# change mockpkg-b")
git("add", "mockpkg-b")
mkdirp("mockpkg-d")
with open("mockpkg-d/package.py", "w") as f:
# remove pkg-c, add pkg-d
with open("pkg-b/package.py", "a") as f:
f.write("\n# change pkg-b")
git("add", "pkg-b")
mkdirp("pkg-d")
with open("pkg-d/package.py", "w") as f:
f.write(pkg_template.format(name="PkgD"))
git("add", "mockpkg-d")
git("rm", "-rf", "mockpkg-c")
git(
"-c",
"commit.gpgsign=false",
"commit",
"-m",
"change mockpkg-b, remove mockpkg-c, add mockpkg-d",
)
git("add", "pkg-d")
git("rm", "-rf", "pkg-c")
git("-c", "commit.gpgsign=false", "commit", "-m", "change pkg-b, remove pkg-c, add pkg-d")
with spack.repo.use_repositories(str(repo_path)):
yield mock_repo_packages
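
The fixture above drives git through a callable that takes one argument per argv element and returns captured output when called with output=str, in the style of Spack's Executable wrapper. A rough stand-in, offered as an assumption about the wrapper's behavior rather than its real implementation:

    # Rough stand-in for the git(...) callable used by the fixture above;
    # an assumption for readability, not Spack's Executable class.
    import subprocess

    def git(*args, output=None):
        result = subprocess.run(
            ["git", *args], capture_output=True, text=True, check=True
        )
        return result.stdout if output is str else None
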
@@ -92,11 +86,12 @@ def mock_pkg_names():
# Be sure to include virtual packages since packages with stand-alone
# tests may inherit additional tests from the virtuals they provide,
# such as packages that implement `mpi`.
return {
names = set(
name
for name in repo.all_package_names(include_virtuals=True)
if not name.startswith("mockpkg-")
}
if not name.startswith("pkg-")
)
return names
def split(output):
@@ -118,17 +113,17 @@ def test_mock_packages_path(mock_packages):
def test_pkg_add(git, mock_pkg_git_repo):
with working_dir(mock_pkg_git_repo):
mkdirp("mockpkg-e")
with open("mockpkg-e/package.py", "w") as f:
mkdirp("pkg-e")
with open("pkg-e/package.py", "w") as f:
f.write(pkg_template.format(name="PkgE"))
pkg("add", "mockpkg-e")
pkg("add", "pkg-e")
with working_dir(mock_pkg_git_repo):
try:
assert "A mockpkg-e/package.py" in git("status", "--short", output=str)
assert "A pkg-e/package.py" in git("status", "--short", output=str)
finally:
shutil.rmtree("mockpkg-e")
shutil.rmtree("pkg-e")
# Removing a package mid-run disrupts Spack's caching
if spack.repo.PATH.repos[0]._fast_package_checker:
spack.repo.PATH.repos[0]._fast_package_checker.invalidate()
@@ -143,10 +138,10 @@ def test_pkg_list(mock_pkg_git_repo, mock_pkg_names):
assert sorted(mock_pkg_names) == sorted(out)
out = split(pkg("list", "HEAD^"))
assert sorted(mock_pkg_names.union(["mockpkg-a", "mockpkg-b", "mockpkg-c"])) == sorted(out)
assert sorted(mock_pkg_names.union(["pkg-a", "pkg-b", "pkg-c"])) == sorted(out)
out = split(pkg("list", "HEAD"))
assert sorted(mock_pkg_names.union(["mockpkg-a", "mockpkg-b", "mockpkg-d"])) == sorted(out)
assert sorted(mock_pkg_names.union(["pkg-a", "pkg-b", "pkg-d"])) == sorted(out)
# test with three dots to make sure pkg calls `git merge-base`
out = split(pkg("list", "HEAD^^..."))
@@ -156,25 +151,25 @@ def test_pkg_list(mock_pkg_git_repo, mock_pkg_names):
@pytest.mark.not_on_windows("stdout format conflict")
def test_pkg_diff(mock_pkg_git_repo, mock_pkg_names):
out = split(pkg("diff", "HEAD^^", "HEAD^"))
assert out == ["HEAD^:", "mockpkg-a", "mockpkg-b", "mockpkg-c"]
assert out == ["HEAD^:", "pkg-a", "pkg-b", "pkg-c"]
out = split(pkg("diff", "HEAD^^", "HEAD"))
assert out == ["HEAD:", "mockpkg-a", "mockpkg-b", "mockpkg-d"]
assert out == ["HEAD:", "pkg-a", "pkg-b", "pkg-d"]
out = split(pkg("diff", "HEAD^", "HEAD"))
assert out == ["HEAD^:", "mockpkg-c", "HEAD:", "mockpkg-d"]
assert out == ["HEAD^:", "pkg-c", "HEAD:", "pkg-d"]
@pytest.mark.not_on_windows("stdout format conflict")
def test_pkg_added(mock_pkg_git_repo):
out = split(pkg("added", "HEAD^^", "HEAD^"))
assert ["mockpkg-a", "mockpkg-b", "mockpkg-c"] == out
assert ["pkg-a", "pkg-b", "pkg-c"] == out
out = split(pkg("added", "HEAD^^", "HEAD"))
assert ["mockpkg-a", "mockpkg-b", "mockpkg-d"] == out
assert ["pkg-a", "pkg-b", "pkg-d"] == out
out = split(pkg("added", "HEAD^", "HEAD"))
assert ["mockpkg-d"] == out
assert ["pkg-d"] == out
out = split(pkg("added", "HEAD", "HEAD"))
assert out == []
@@ -189,7 +184,7 @@ def test_pkg_removed(mock_pkg_git_repo):
assert out == []
out = split(pkg("removed", "HEAD^", "HEAD"))
assert out == ["mockpkg-c"]
assert out == ["pkg-c"]
@pytest.mark.not_on_windows("stdout format conflict")
@@ -201,34 +196,34 @@ def test_pkg_changed(mock_pkg_git_repo):
assert out == []
out = split(pkg("changed", "--type", "a", "HEAD^^", "HEAD^"))
assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
assert out == ["pkg-a", "pkg-b", "pkg-c"]
out = split(pkg("changed", "--type", "r", "HEAD^^", "HEAD^"))
assert out == []
out = split(pkg("changed", "--type", "ar", "HEAD^^", "HEAD^"))
assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
assert out == ["pkg-a", "pkg-b", "pkg-c"]
out = split(pkg("changed", "--type", "arc", "HEAD^^", "HEAD^"))
assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
assert out == ["pkg-a", "pkg-b", "pkg-c"]
out = split(pkg("changed", "HEAD^", "HEAD"))
assert out == ["mockpkg-b"]
assert out == ["pkg-b"]
out = split(pkg("changed", "--type", "c", "HEAD^", "HEAD"))
assert out == ["mockpkg-b"]
assert out == ["pkg-b"]
out = split(pkg("changed", "--type", "a", "HEAD^", "HEAD"))
assert out == ["mockpkg-d"]
assert out == ["pkg-d"]
out = split(pkg("changed", "--type", "r", "HEAD^", "HEAD"))
assert out == ["mockpkg-c"]
assert out == ["pkg-c"]
out = split(pkg("changed", "--type", "ar", "HEAD^", "HEAD"))
assert out == ["mockpkg-c", "mockpkg-d"]
assert out == ["pkg-c", "pkg-d"]
out = split(pkg("changed", "--type", "arc", "HEAD^", "HEAD"))
assert out == ["mockpkg-b", "mockpkg-c", "mockpkg-d"]
assert out == ["pkg-b", "pkg-c", "pkg-d"]
# invalid type argument
with pytest.raises(spack.main.SpackCommandError):
@@ -294,7 +289,7 @@ def test_pkg_canonical_source(mock_packages):
def test_pkg_hash(mock_packages):
output = pkg("hash", "pkg-a", "pkg-b").strip().split()
output = pkg("hash", "a", "b").strip().split()
assert len(output) == 2 and all(len(elt) == 32 for elt in output)
output = pkg("hash", "multimethod").strip().split()


@@ -58,7 +58,7 @@ def test_spec_concretizer_args(mutable_config, mutable_database, do_not_check_ru
def test_spec_parse_dependency_variant_value():
"""Verify that we can provide multiple key=value variants to multiple separate
packages within a spec string."""
output = spec("multivalue-variant fee=barbaz ^ pkg-a foobar=baz")
output = spec("multivalue-variant fee=barbaz ^ a foobar=baz")
assert "fee=barbaz" in output
assert "foobar=baz" in output


@@ -10,14 +10,10 @@
from llnl.util.filesystem import copy_tree
import spack.cmd.common.arguments
import spack.cmd.install
import spack.cmd.test
import spack.config
import spack.install_test
import spack.package_base
import spack.paths
import spack.spec
import spack.store
from spack.install_test import TestStatus
from spack.main import SpackCommand


@@ -384,9 +384,18 @@ def test_clang_flags():
unsupported_flag_test("cxx17_flag", "clang@3.4")
supported_flag_test("cxx17_flag", "-std=c++1z", "clang@3.5")
supported_flag_test("cxx17_flag", "-std=c++17", "clang@5.0")
unsupported_flag_test("cxx20_flag", "clang@4.0")
supported_flag_test("cxx20_flag", "-std=c++2a", "clang@5.0")
supported_flag_test("cxx20_flag", "-std=c++20", "clang@11.0")
unsupported_flag_test("cxx23_flag", "clang@11.0")
supported_flag_test("cxx23_flag", "-std=c++2b", "clang@12.0")
supported_flag_test("cxx23_flag", "-std=c++23", "clang@17.0")
supported_flag_test("c99_flag", "-std=c99", "clang@3.3")
unsupported_flag_test("c11_flag", "clang@2.0")
supported_flag_test("c11_flag", "-std=c11", "clang@6.1.0")
unsupported_flag_test("c23_flag", "clang@8.0")
supported_flag_test("c23_flag", "-std=c2x", "clang@9.0")
supported_flag_test("c23_flag", "-std=c23", "clang@18.0")
supported_flag_test("cc_pic_flag", "-fPIC", "clang@3.3")
supported_flag_test("cxx_pic_flag", "-fPIC", "clang@3.3")
supported_flag_test("f77_pic_flag", "-fPIC", "clang@3.3")


@@ -24,8 +24,6 @@
import spack.platforms
import spack.repo
import spack.solver.asp
import spack.store
import spack.util.file_cache
import spack.util.libc
import spack.variant as vt
from spack.concretize import find_spec
@@ -406,7 +404,7 @@ def test_compiler_flags_from_compiler_and_dependent(self):
def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12_with_flags):
mutable_config.set("compilers", [clang12_with_flags])
# Correct arch to use test compiler that has flags
spec = Spec("pkg-a %clang@12.2.0 platform=test os=fe target=fe")
spec = Spec("a %clang@12.2.0 platform=test os=fe target=fe")
# Get the compiler that matches the spec (
compiler = spack.compilers.compiler_for_spec("clang@=12.2.0", spec.architecture)
@@ -475,7 +473,7 @@ def test_architecture_deep_inheritance(self, mock_targets, compiler_factory):
assert s.architecture.target == spec.architecture.target
def test_compiler_flags_from_user_are_grouped(self):
spec = Spec('pkg-a%gcc cflags="-O -foo-flag foo-val" platform=test')
spec = Spec('a%gcc cflags="-O -foo-flag foo-val" platform=test')
spec.concretize()
cflags = spec.compiler_flags["cflags"]
assert any(x == "-foo-flag foo-val" for x in cflags)
@@ -583,20 +581,20 @@ def test_concretize_propagate_multivalue_variant(self):
spec = Spec("multivalue-variant foo==baz,fee")
spec.concretize()
assert spec.satisfies("^pkg-a foo=baz,fee")
assert spec.satisfies("^pkg-b foo=baz,fee")
assert not spec.satisfies("^pkg-a foo=bar")
assert not spec.satisfies("^pkg-b foo=bar")
assert spec.satisfies("^a foo=baz,fee")
assert spec.satisfies("^b foo=baz,fee")
assert not spec.satisfies("^a foo=bar")
assert not spec.satisfies("^b foo=bar")
def test_no_matching_compiler_specs(self, mock_low_high_config):
# only relevant when not building compilers as needed
with spack.concretize.enable_compiler_existence_check():
s = Spec("pkg-a %gcc@=0.0.0")
s = Spec("a %gcc@=0.0.0")
with pytest.raises(spack.concretize.UnavailableCompilerVersionError):
s.concretize()
def test_no_compilers_for_arch(self):
s = Spec("pkg-a arch=linux-rhel0-x86_64")
s = Spec("a arch=linux-rhel0-x86_64")
with pytest.raises(spack.error.SpackError):
s.concretize()
@@ -805,7 +803,7 @@ def test_regression_issue_7941(self):
# The string representation of a spec containing
# an explicit multi-valued variant and a dependency
# might be parsed differently than the originating spec
s = Spec("pkg-a foobar=bar ^pkg-b")
s = Spec("a foobar=bar ^b")
t = Spec(str(s))
s.concretize()
@@ -1185,14 +1183,14 @@ def test_conditional_provides_or_depends_on(self):
[
# Check that True is treated correctly and attaches test deps
# to all nodes in the DAG
("pkg-a", True, ["pkg-a"], []),
("pkg-a foobar=bar", True, ["pkg-a", "pkg-b"], []),
("a", True, ["a"], []),
("a foobar=bar", True, ["a", "b"], []),
# Check that a list of names activates the dependency only for
# packages in that list
("pkg-a foobar=bar", ["pkg-a"], ["pkg-a"], ["pkg-b"]),
("pkg-a foobar=bar", ["pkg-b"], ["pkg-b"], ["pkg-a"]),
("a foobar=bar", ["a"], ["a"], ["b"]),
("a foobar=bar", ["b"], ["b"], ["a"]),
# Check that False disregard test dependencies
("pkg-a foobar=bar", False, [], ["pkg-a", "pkg-b"]),
("a foobar=bar", False, [], ["a", "b"]),
],
)
def test_activating_test_dependencies(self, spec_str, tests_arg, with_dep, without_dep):
@@ -1251,7 +1249,7 @@ def test_custom_compiler_version(self, mutable_config, compiler_factory, monkeyp
"compilers", [compiler_factory(spec="gcc@10foo", operating_system="redhat6")]
)
monkeypatch.setattr(spack.compiler.Compiler, "real_version", "10.2.1")
s = Spec("pkg-a %gcc@10foo os=redhat6").concretized()
s = Spec("a %gcc@10foo os=redhat6").concretized()
assert "%gcc@10foo" in s
def test_all_patches_applied(self):
@@ -1395,10 +1393,10 @@ def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, m
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_reuse_with_flags(self, mutable_database, mutable_config):
spack.config.set("concretizer:reuse", True)
spec = Spec("pkg-a cflags=-g cxxflags=-g").concretized()
spec = Spec("a cflags=-g cxxflags=-g").concretized()
spack.store.STORE.db.add(spec, None)
testspec = Spec("pkg-a cflags=-g")
testspec = Spec("a cflags=-g")
testspec.concretize()
assert testspec == spec
@@ -1741,49 +1739,49 @@ def test_reuse_with_unknown_namespace_dont_raise(
self, temporary_store, mock_custom_repository
):
with spack.repo.use_repositories(mock_custom_repository, override=False):
s = Spec("pkg-c").concretized()
s = Spec("c").concretized()
assert s.namespace != "builtin.mock"
s.package.do_install(fake=True, explicit=True)
with spack.config.override("concretizer:reuse", True):
s = Spec("pkg-c").concretized()
s = Spec("c").concretized()
assert s.namespace == "builtin.mock"
@pytest.mark.regression("28259")
def test_reuse_with_unknown_package_dont_raise(self, tmpdir, temporary_store, monkeypatch):
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"), namespace="myrepo")
builder.add_package("pkg-c")
builder.add_package("c")
with spack.repo.use_repositories(builder.root, override=False):
s = Spec("pkg-c").concretized()
s = Spec("c").concretized()
assert s.namespace == "myrepo"
s.package.do_install(fake=True, explicit=True)
del sys.modules["spack.pkg.myrepo.pkg-c"]
del sys.modules["spack.pkg.myrepo.c"]
del sys.modules["spack.pkg.myrepo"]
builder.remove("pkg-c")
builder.remove("c")
with spack.repo.use_repositories(builder.root, override=False) as repos:
# TODO (INJECT CONFIGURATION): unclear why the cache needs to be invalidated explicitly
repos.repos[0]._pkg_checker.invalidate()
with spack.config.override("concretizer:reuse", True):
s = Spec("pkg-c").concretized()
s = Spec("c").concretized()
assert s.namespace == "builtin.mock"
@pytest.mark.parametrize(
"specs,expected,libc_offset",
"specs,expected",
[
(["libelf", "libelf@0.8.10"], 1, 1),
(["libdwarf%gcc", "libelf%clang"], 2, 1),
(["libdwarf%gcc", "libdwarf%clang"], 3, 2),
(["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4, 1),
(["hdf5", "zmpi"], 3, 1),
(["hdf5", "mpich"], 2, 1),
(["hdf5^zmpi", "mpich"], 4, 1),
(["mpi", "zmpi"], 2, 1),
(["mpi", "mpich"], 1, 1),
(["libelf", "libelf@0.8.10"], 1),
(["libdwarf%gcc", "libelf%clang"], 2),
(["libdwarf%gcc", "libdwarf%clang"], 3),
(["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4),
(["hdf5", "zmpi"], 3),
(["hdf5", "mpich"], 2),
(["hdf5^zmpi", "mpich"], 4),
(["mpi", "zmpi"], 2),
(["mpi", "mpich"], 1),
],
)
@pytest.mark.only_clingo("Original concretizer cannot concretize in rounds")
def test_best_effort_coconcretize(self, specs, expected, libc_offset):
def test_best_effort_coconcretize(self, specs, expected):
specs = [Spec(s) for s in specs]
solver = spack.solver.asp.Solver()
solver.reuse = False
@@ -1792,9 +1790,7 @@ def test_best_effort_coconcretize(self, specs, expected, libc_offset):
for s in result.specs:
concrete_specs.update(s.traverse())
if not spack.solver.asp.using_libc_compatibility():
libc_offset = 0
libc_offset = 1 if spack.solver.asp.using_libc_compatibility() else 0
assert len(concrete_specs) == expected + libc_offset
@pytest.mark.parametrize(
@@ -1896,20 +1892,20 @@ def test_misleading_error_message_on_version(self, mutable_database):
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_version_weight_and_provenance(self):
"""Test package preferences during coconcretization."""
reusable_specs = [Spec(spec_str).concretized() for spec_str in ("pkg-b@0.9", "pkg-b@1.0")]
root_spec = Spec("pkg-a foobar=bar")
reusable_specs = [Spec(spec_str).concretized() for spec_str in ("b@0.9", "b@1.0")]
root_spec = Spec("a foobar=bar")
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, [root_spec], reuse=reusable_specs)
# The result here should have a single spec to build ('pkg-a')
# and it should be using pkg-b@1.0 with a version badness of 2
# The result here should have a single spec to build ('a')
# and it should be using b@1.0 with a version badness of 2
# The provenance is:
# version_declared("pkg-b","1.0",0,"package_py").
# version_declared("pkg-b","0.9",1,"package_py").
# version_declared("pkg-b","1.0",2,"installed").
# version_declared("pkg-b","0.9",3,"installed").
# version_declared("b","1.0",0,"package_py").
# version_declared("b","0.9",1,"package_py").
# version_declared("b","1.0",2,"installed").
# version_declared("b","0.9",3,"installed").
#
# Depending on the target, it may also use gnuconfig
result_spec = result.specs[0]
@@ -1923,11 +1919,11 @@ def test_version_weight_and_provenance(self):
for criterion in criteria:
assert criterion in result.criteria, criterion
assert result_spec.satisfies("^pkg-b@1.0")
assert result_spec.satisfies("^b@1.0")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_reuse_succeeds_with_config_compatible_os(self):
root_spec = Spec("pkg-b")
root_spec = Spec("b")
s = root_spec.concretized()
other_os = s.copy()
mock_os = "ubuntu2204"
@@ -2191,7 +2187,7 @@ def test_external_python_extension_find_unified_python(self):
"specs",
[
["mpileaks^ callpath ^dyninst@8.1.1:8 ^mpich2@1.3:1"],
["multivalue-variant ^pkg-a@2:2"],
["multivalue-variant ^a@2:2"],
["v1-consumer ^conditional-provider@1:1 +disable-v1"],
],
)
@@ -2230,9 +2226,9 @@ def test_unsolved_specs_raises_error(self, monkeypatch, mock_packages, config):
def test_clear_error_when_unknown_compiler_requested(self, mock_packages, config):
"""Tests that the solver can report a case where the compiler cannot be set"""
with pytest.raises(
spack.error.UnsatisfiableSpecError, match="Cannot set the required compiler: pkg-a%foo"
spack.error.UnsatisfiableSpecError, match="Cannot set the required compiler: a%foo"
):
Spec("pkg-a %foo").concretized()
Spec("a %foo").concretized()
@pytest.mark.regression("36339")
def test_compiler_match_constraints_when_selected(self):
@@ -2268,7 +2264,7 @@ def test_compiler_match_constraints_when_selected(self):
},
]
spack.config.set("compilers", compiler_configuration)
s = Spec("pkg-a %gcc@:11").concretized()
s = Spec("a %gcc@:11").concretized()
assert s.compiler.version == ver("=11.1.0"), s
@pytest.mark.regression("36339")
@@ -2289,7 +2285,7 @@ def test_compiler_with_custom_non_numeric_version(self, mock_executable):
}
]
spack.config.set("compilers", compiler_configuration)
s = Spec("pkg-a %gcc@foo").concretized()
s = Spec("a %gcc@foo").concretized()
assert s.compiler.version == ver("=foo")
@pytest.mark.regression("36628")
@@ -2315,7 +2311,7 @@ def test_concretization_with_compilers_supporting_target_any(self):
]
with spack.config.override("compilers", compiler_configuration):
s = Spec("pkg-a").concretized()
s = spack.spec.Spec("a").concretized()
assert s.satisfies("%gcc@12.1.0")
@pytest.mark.parametrize("spec_str", ["mpileaks", "mpileaks ^mpich"])
@@ -2350,7 +2346,7 @@ def test_dont_define_new_version_from_input_if_checksum_required(self, working_e
with pytest.raises(spack.error.UnsatisfiableSpecError):
# normally spack concretizes to @=3.0 if it's not defined in package.py, except
# when checksums are required
Spec("pkg-a@=3.0").concretized()
Spec("a@=3.0").concretized()
@pytest.mark.regression("39570")
@pytest.mark.db
@@ -2450,7 +2446,7 @@ def _default_libc(self):
spack.util.libc, "libc_from_current_python_process", lambda: Spec("glibc@=2.28")
)
mutable_config.set("config:install_missing_compilers", True)
s = Spec("pkg-a %gcc@=13.2.0").concretized()
s = Spec("a %gcc@=13.2.0").concretized()
assert s.satisfies("%gcc@13.2.0")
@pytest.mark.regression("43267")
@@ -2550,79 +2546,6 @@ def test_include_specs_from_externals_and_libcs(
assert result["deprecated-versions"].satisfies("@1.0.0")
@pytest.mark.regression("44085")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_can_reuse_concrete_externals_for_dependents(self, mutable_config, tmp_path):
"""Test that external specs that are in the DB can be reused. This means they are
preferred to concretizing another external from packages.yaml
"""
packages_yaml = {
"externaltool": {"externals": [{"spec": "externaltool@2.0", "prefix": "/fake/path"}]}
}
mutable_config.set("packages", packages_yaml)
# Concretize with gcc@9 to get a suboptimal spec, since we have gcc@10 available
external_spec = Spec("externaltool@2 %gcc@9").concretized()
assert external_spec.external
root_specs = [Spec("sombrero")]
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, root_specs, reuse=[external_spec])
assert len(result.specs) == 1
sombrero = result.specs[0]
assert sombrero["externaltool"].dag_hash() == external_spec.dag_hash()
@pytest.mark.regression("45321")
@pytest.mark.parametrize(
"corrupted_str",
[
"cmake@3.4.3 foo=bar", # cmake has no variant "foo"
"mvdefaults@1.0 foo=a,d", # variant "foo" has no value "d"
"cmake %gcc", # spec has no version
],
)
def test_corrupted_external_does_not_halt_concretization(self, corrupted_str, mutable_config):
"""Tests that having a wrong variant in an external spec doesn't stop concretization"""
corrupted_spec = Spec(corrupted_str)
packages_yaml = {
f"{corrupted_spec.name}": {
"externals": [{"spec": corrupted_str, "prefix": "/dev/null"}]
}
}
mutable_config.set("packages", packages_yaml)
# Assert we don't raise due to the corrupted external entry above
s = Spec("pkg-a").concretized()
assert s.concrete
@pytest.mark.regression("44828")
@pytest.mark.not_on_windows("Tests use linux paths")
def test_correct_external_is_selected_from_packages_yaml(self, mutable_config):
"""Tests that when filtering external specs, the correct external is selected to
reconstruct the prefix, and other external attributes.
"""
packages_yaml = {
"cmake": {
"externals": [
{"spec": "cmake@3.23.1 %gcc", "prefix": "/tmp/prefix1"},
{"spec": "cmake@3.23.1 %clang", "prefix": "/tmp/prefix2"},
]
}
}
concretizer_yaml = {
"reuse": {"roots": True, "from": [{"type": "external", "exclude": ["%gcc"]}]}
}
mutable_config.set("packages", packages_yaml)
mutable_config.set("concretizer", concretizer_yaml)
s = Spec("cmake").concretized()
# Check that we got the properties from the right external
assert s.external
assert s.satisfies("%clang")
assert s.prefix == "/tmp/prefix2"
@pytest.fixture()
def duplicates_test_repository():
@@ -2824,9 +2747,7 @@ def test_drop_moving_targets(v_str, v_opts, checksummed):
class TestConcreteSpecsByHash:
"""Tests the container of concrete specs"""
@pytest.mark.parametrize(
"input_specs", [["pkg-a"], ["pkg-a foobar=bar", "pkg-b"], ["pkg-a foobar=baz", "pkg-b"]]
)
@pytest.mark.parametrize("input_specs", [["a"], ["a foobar=bar", "b"], ["a foobar=baz", "b"]])
def test_adding_specs(self, input_specs, default_mock_concretization):
"""Tests that concrete specs in the container are equivalent, but stored as different
objects in memory.


@@ -9,7 +9,6 @@
import archspec.cpu
import spack.config
import spack.paths
import spack.repo
import spack.solver.asp
@@ -48,8 +47,8 @@ def enable_runtimes():
def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
s = spack.spec.Spec("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0").concretized()
a, b = s["pkg-a"], s["pkg-b"]
s = spack.spec.Spec("a%gcc@10.2.1 ^b%gcc@9.4.0").concretized()
a, b = s["a"], s["b"]
# Both a and b should depend on the same gcc-runtime directly
assert a.dependencies("gcc-runtime") == b.dependencies("gcc-runtime")
@@ -62,16 +61,16 @@ def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_path):
"""Tests that external nodes don't have runtime dependencies."""
packages_yaml = {"pkg-b": {"externals": [{"spec": "pkg-b@1.0", "prefix": f"{str(tmp_path)}"}]}}
packages_yaml = {"b": {"externals": [{"spec": "b@1.0", "prefix": f"{str(tmp_path)}"}]}}
spack.config.set("packages", packages_yaml)
s = spack.spec.Spec("pkg-a%gcc@10.2.1").concretized()
s = spack.spec.Spec("a%gcc@10.2.1").concretized()
a, b = s["pkg-a"], s["pkg-b"]
a, b = s["a"], s["b"]
# Since b is an external, it doesn't depend on gcc-runtime
assert a.dependencies("gcc-runtime")
assert a.dependencies("pkg-b")
assert a.dependencies("b")
assert not b.dependencies("gcc-runtime")
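A hedged helper sketch generalizing the assertion above to a whole DAG (the helper name is invented here; node.external and node.dependencies are the same attributes the test uses):

def assert_externals_have_no_runtimes(root):
    # external nodes were not built by Spack's gcc, so they must never
    # acquire an injected gcc-runtime dependency
    for node in root.traverse():
        if node.external:
            assert not node.dependencies("gcc-runtime")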
@@ -79,36 +78,23 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_path):
"root_str,reused_str,expected,nruntime",
[
# The reused runtime is older than we need, thus we'll add a more recent one for a
(
"pkg-a%gcc@10.2.1",
"pkg-b%gcc@9.4.0",
{"pkg-a": "gcc-runtime@10.2.1", "pkg-b": "gcc-runtime@9.4.0"},
2,
),
# The root is compiled with an older compiler, thus we'll NOT reuse the runtime from b
(
"pkg-a%gcc@9.4.0",
"pkg-b%gcc@10.2.1",
{"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
1,
),
("a%gcc@10.2.1", "b%gcc@9.4.0", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@9.4.0"}, 2),
# The root is compiled with an older compiler, thus we'll reuse the runtime from b
("a%gcc@9.4.0", "b%gcc@10.2.1", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@10.2.1"}, 1),
# Same as before, but tests that we can reuse from a more generic target
pytest.param(
"pkg-a%gcc@9.4.0",
"pkg-b%gcc@10.2.1 target=x86_64",
{"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
"a%gcc@9.4.0",
"b%gcc@10.2.1 target=x86_64",
{"a": "gcc-runtime@10.2.1 target=x86_64", "b": "gcc-runtime@10.2.1 target=x86_64"},
1,
marks=pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
),
),
pytest.param(
"pkg-a%gcc@10.2.1",
"pkg-b%gcc@9.4.0 target=x86_64",
{
"pkg-a": "gcc-runtime@10.2.1 target=x86_64",
"pkg-b": "gcc-runtime@9.4.0 target=x86_64",
},
"a%gcc@10.2.1",
"b%gcc@9.4.0 target=x86_64",
{"a": "gcc-runtime@10.2.1 target=x86_64", "b": "gcc-runtime@9.4.0 target=x86_64"},
2,
marks=pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
@@ -116,19 +102,17 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_path):
),
],
)
@pytest.mark.regression("44444")
def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime, runtime_repo):
"""Tests that we can reuse specs with a "gcc-runtime" leaf node. In particular, checks
that the semantic for gcc-runtimes versions accounts for reused packages too.
Reusable runtime versions should be lower, or equal, to that of parent nodes.
"""
root, reused_spec = _concretize_with_reuse(root_str=root_str, reused_str=reused_str)
assert f"{expected['b']}" in reused_spec
runtime_a = root.dependencies("gcc-runtime")[0]
assert runtime_a.satisfies(expected["pkg-a"])
runtime_b = root["pkg-b"].dependencies("gcc-runtime")[0]
assert runtime_b.satisfies(expected["pkg-b"])
assert runtime_a.satisfies(expected["a"])
runtime_b = root["b"].dependencies("gcc-runtime")[0]
assert runtime_b.satisfies(expected["b"])
runtimes = [x for x in root.traverse() if x.name == "gcc-runtime"]
assert len(runtimes) == nruntime
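The rule the parametrized cases encode can be restated as a toy computation (an illustration of the expected outcomes, not the solver's actual logic): a newer gcc-runtime is backward compatible, so the root links a runtime at least as new as its own gcc and as any runtime carried in by reused dependencies.

from spack.version import Version

def root_runtime(root_gcc: str, dep_runtime: str) -> Version:
    # the root's runtime must cover both its own compiler and the runtime
    # version already present in the reused dependency
    return max(Version(root_gcc), Version(dep_runtime))

assert str(root_runtime("9.4.0", "10.2.1")) == "10.2.1"  # share b's newer runtime
assert str(root_runtime("10.2.1", "9.4.0")) == "10.2.1"  # add a newer runtime for the root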
@@ -139,7 +123,8 @@ def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime, runtime_repo):
[
# Ensure that, whether we have multiple runtimes in the DAG or not,
# we always link only the latest version
("pkg-a%gcc@10.2.1", "pkg-b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"])
("a%gcc@10.2.1", "b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
("a%gcc@9.4.0", "b%gcc@10.2.1", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
],
)
def test_views_can_handle_duplicate_runtime_nodes(


@@ -512,5 +512,5 @@ def test_default_preference_variant_different_type_does_not_error(self):
packages.yaml doesn't fail with an error.
"""
with spack.config.override("packages:all", {"variants": "+foo"}):
s = Spec("pkg-a").concretized()
s = Spec("a").concretized()
assert s.satisfies("foo=bar")


@@ -103,6 +103,23 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
yield mock_repo_path
class MakeStage:
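"""Callable that always returns the same pre-built stage, regardless of call arguments."""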
def __init__(self, stage):
self.stage = stage
def __call__(self, *args, **kwargs):
return self.stage
@pytest.fixture
def fake_installs(monkeypatch, tmpdir):
stage_path = str(tmpdir.ensure("fake-stage", dir=True))
universal_unused_stage = spack.stage.DIYStage(stage_path)
monkeypatch.setattr(
spack.build_systems.generic.Package, "_make_stage", MakeStage(universal_unused_stage)
)
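The new fake_installs fixture swaps every generic Package's stage for a single throwaway DIYStage, so requirement tests can populate the database without fetching sources. A hedged usage sketch (the package and version mirror the test repo's y@2.5; fake=True installs a placeholder rather than building, the same mechanism as `spack install --fake`):

def test_example(concretize_scope, _create_test_repo, mutable_database, fake_installs):
    # with staging short-circuited, a fake install is enough to make the
    # spec available for reuse from the database
    Spec("y@2.5").concretized().package.do_install(fake=True)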
def test_one_package_multiple_reqs(concretize_scope, test_repo):
conf_str = """\
packages:
@@ -497,7 +514,7 @@ def test_oneof_ordering(concretize_scope, test_repo):
assert s2.satisfies("@2.5")
def test_reuse_oneof(concretize_scope, _create_test_repo, mutable_database, mock_fetch):
def test_reuse_oneof(concretize_scope, _create_test_repo, mutable_database, fake_installs):
conf_str = """\
packages:
y:
@@ -927,9 +944,9 @@ def test_default_requirements_semantic(packages_yaml, concretize_scope, mock_pac
Spec("zlib ~shared").concretized()
# A spec without the shared variant still concretizes
s = Spec("pkg-a").concretized()
assert not s.satisfies("pkg-a +shared")
assert not s.satisfies("pkg-a ~shared")
s = Spec("a").concretized()
assert not s.satisfies("a +shared")
assert not s.satisfies("a ~shared")
@pytest.mark.parametrize(
@@ -1159,46 +1176,3 @@ def test_forward_multi_valued_variant_using_requires(
for constraint in not_expected:
assert not s.satisfies(constraint)
def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_packages):
"""Tests that strong preferences have a higher priority than reusing specs."""
reused_spec = Spec("adios2~bzip2").concretized()
reuse_nodes = list(reused_spec.traverse())
root_specs = [Spec("ascent+adios2")]
# Check that without further configuration adios2 is reused
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, root_specs, reuse=reuse_nodes)
ascent = result.specs[0]
assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent
# If we stick a preference, adios2 is not reused
update_packages_config(
"""
packages:
adios2:
prefer:
- "+bzip2"
"""
)
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, root_specs, reuse=reuse_nodes)
ascent = result.specs[0]
assert ascent["adios2"].dag_hash() != reused_spec.dag_hash()
assert ascent["adios2"].satisfies("+bzip2")
# A preference is still a preference, so we can override it from the input spec
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(
setup, [Spec("ascent+adios2 ^adios2~bzip2")], reuse=reuse_nodes
)
ascent = result.specs[0]
assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent
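Read together, the three solves pin down a precedence order (highest first), sketched here as comments since the ordering is inferred from the assertions rather than stated anywhere:

# 1. explicit constraints on the input spec  ("ascent+adios2 ^adios2~bzip2" wins back reuse)
# 2. strong preferences in packages.yaml     ("prefer: [+bzip2]" beats plain reuse)
# 3. reuse of existing concrete specs        (applies only when nothing above objects)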

Some files were not shown because too many files have changed in this diff.