Compare commits

...

859 Commits

Author SHA1 Message Date
Peter Scheibel
c899dcac5b style fix 2024-05-23 22:25:25 -07:00
Peter Scheibel
94dc25ecfa add test 2024-05-23 17:14:00 -07:00
Peter Scheibel
aded859856 Merge branch 'develop' into bugfix/invalid-compiler-warning 2024-05-23 16:30:54 -07:00
Alex Richert
f7b9c30456 Add develop version to ufs-weather-model (major updates) (#39265)
* Add develop version to ufs-weather-model (major updates)

* Update ufs-weather-model maintainers

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update ufs-weather-model defaults and fms dep

* Update package.py

* Update package.py

* Update package.py
2024-05-22 11:33:17 -06:00
Harmen Stoppels
884620a38a gcc: use -rpath {rpath_dir} not -rpath={rpath_dir} (#44315)
to make macOS's linker happy.
2024-05-22 17:36:42 +02:00
pauleonix
7503a41773 cuda: add v12.4.1 (#43488) 2024-05-22 10:50:56 +02:00
Matt Thompson
9a5fc6b4a3 mapl: add v2.46 (#44230) 2024-05-22 10:22:57 +02:00
Cyrus Harrison
a31aeed167 conduit: add v0.9.2 (#44308) 2024-05-22 10:05:40 +02:00
Tamara Dahlgren
71f542a951 kcov: convert to new stand-alone test process (tested with latest version) (#44309) 2024-05-22 10:04:27 +02:00
Carlos Bederián
322bd48788 gcc: add v13.3.0 (#44306) 2024-05-22 09:44:54 +02:00
Wouter Deconinck
b752fa59d4 qt: add version 5.15.13 (#44310) 2024-05-22 09:44:35 +02:00
Wouter Deconinck
d53e4cc426 rsync: add v3.3.0 (#44311) 2024-05-22 09:36:16 +02:00
Tom Scogland
ee4b7fa3a1 flux-sched: add docs variant to match core, fix ver (#44277) 2024-05-22 00:55:57 -06:00
Harmen Stoppels
d6f02c86d9 grpc: add 1.61 to 1.64 (#44297) 2024-05-22 08:04:35 +02:00
John W. Parent
62efde8e3c cmake/add-3.29 (#43349) 2024-05-21 23:35:54 -06:00
Harmen Stoppels
bda1d94d49 bison: add missing diffutils build dep (#44296) 2024-05-21 19:33:29 -06:00
Massimiliano Culpo
3f472039c5 Take a lock before querying installed_dependents (#44301)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-21 13:16:04 -06:00
Robert Underwood
912ef34206 get cupy compiling with latest cuda (#44266)
* get cupy compiling with latest cuda
* fix other cupy-deps
* fix version bounds

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-05-21 10:44:18 -07:00
Massimiliano Culpo
9c88a48a73 Remove mesa18 and libosmesa (#44264)
* Remove mesa18 and libosmesa

mesa18 was introduced in #19528 as a way to maintain the old
autotools build of mesa separate from the new meson build.

We could add a second build system to mesa, but since mesa18 has
been deprecated for a long time, we'll just remove it.

libosmesa was used to multiplex the gl provider between mesa18
and mesa, and is thus unnecessary. Remove it to reduce complexity
in the graphical stack.

* Remove references to mesa18 and libosmesa

* vtk: rework dependency on gl and osmesa

* memsurfer: rework dependency on vtk

* visit: minimal fix to avoid having both osmesa and glx
2024-05-21 10:18:14 -05:00
Dan Lipsa
4bf5cc9a9a Paraview on Windows: Use hdf5::hdf5 (#44161)
* Use hdf5::hdf5 on Windows from Paraview CMake
   This patch is already applied on VTK 9 or greater.
* Add comments stating that vtk and paraview patches are the same and should be modified in concert.
2024-05-21 11:05:32 -04:00
Chris Marsh
08834e2b03 Add homebrew gcc@14.1.0 patch for aarch64 darwin (#44280) 2024-05-21 15:31:36 +02:00
Massimiliano Culpo
8020a111df Demote a warning to debug message, if C compiler is not there (#44182) 2024-05-21 14:09:29 +02:00
dependabot[bot]
86fb547f7c bump pytest from 8.2.0 to 8.2.1 (#44282)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 13:42:22 +02:00
Robert Underwood
b9556c7c44 Additional packages for devtools-manylinux (#44273)
Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-05-21 13:37:04 +02:00
dependabot[bot]
7bdb106b1b --- (#44281)
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 13:15:05 +02:00
kenche-linaro
2b191cd7f4 linaro-forge: added 24.0 version (#44292) 2024-05-21 05:13:42 -06:00
Federico Ficarelli
774f0a4e60 grpc: remove a maintainer (#44294) 2024-05-21 12:58:26 +02:00
Keita Iwabuchi
faf11efa72 Metall: add v0.26, v0.27, and v0.28 (#44284)
Co-authored-by: Keita Iwabuchi <iwabuchi1@lln.gov>
2024-05-21 12:50:44 +02:00
Massimiliano Culpo
5a99142b41 Cleanup of Apple OpenGL shim packages (#44269)
- Remove duplicated code
- Use BundlePackage as a base class
2024-05-21 12:49:18 +02:00
Harmen Stoppels
a3aca0242a grpc: forward compat bound abseil (#44290) 2024-05-21 11:48:59 +02:00
Massimiliano Culpo
72f276fab3 ASP-based solver: fix version optimization for roots (#44272)
This fixes a bug occurring when two root specs need to select
old versions, and these versions have the same penalty in the
optimization. This sometimes caused an older version to be
preferred to a more recent one.

The issue was the omission of `PackageNode` in the optimization
tuple.
2024-05-21 08:41:09 +02:00
Philipp Edelmann
21139945df rayleigh: new package (#38338)
* rayleigh: new package

* Update var/spack/repos/builtin/packages/rayleigh/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* split edit into three methods

* add comments to clarify use of configure

* rayleigh: copyright year

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-20 19:23:02 -06:00
George Young
900bd2f477 seqkit: add 2.4.0, switching to 'go build' install mechanic (#38192)
* seqkit: add 2.4.0

* seqkit: add 2.4.0, switching to 'go build' install mechanic

* seqkit: add 2.4.0, switching to 'go build' install mechanic

* seqkit: update to @2.6.1, drop deprecated version and build system

* seqkit: add @2.7.0, convert to GoPackage

* tidying

* tidying

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-05-20 18:08:57 -07:00
Scott Wittenburg
29d4a5af44 gitlab ci: fix untouched spec pruning on windows (#44279)
Use correct path separator in get_all_package_diffs for all platforms.
Ensures correct package change computation on Windows when pruning unchanged specs in Gitlab CI
2024-05-21 00:56:48 +00:00
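The core of the fix above lends itself to a small illustration: a hedged sketch (not Spack's actual `get_all_package_diffs`; the helper name and repo layout handling are illustrative) of extracting package names from `git diff --name-only` output by splitting on git's fixed forward-slash separator instead of `os.sep`.

```python
def changed_packages(diff_lines):
    """Collect package names from git diff paths; git always reports "/"."""
    pkgs = set()
    for path in diff_lines:
        parts = path.split("/")  # not os.sep, which is "\\" on Windows
        if "packages" in parts and len(parts) > parts.index("packages") + 1:
            pkgs.add(parts[parts.index("packages") + 1])
    return pkgs


print(changed_packages(["var/spack/repos/builtin/packages/zlib-ng/package.py"]))
# -> {'zlib-ng'}
```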
dependabot[bot]
dd9b7ed6a7 build(deps): bump types-six in /.github/workflows/style (#44167)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.9 to 1.16.21.20240513.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-20 17:41:42 -07:00
Wouter Deconinck
09ff74be62 libsigsegv: fix patch filename extension 2024-05-21 02:12:24 +02:00
Robert Underwood
a94ebfea11 libpressio update (#44076)
* libpressio update
* fix typos in libpressio packages
* Addressed review feedback from @tldahlgren
* fix ci issues
* add missing package for SZx
* simplify variant logic and fix GPU deps
* Update var/spack/repos/builtin/packages/py-langsmith/package.py

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-20 18:12:00 -06:00
Paul R. C. Kent
8f5fe1d123 Add llvm v18.1.6 (#44271) 2024-05-20 16:39:43 -07:00
Loris Ercole
d4fb58efa3 Update, fix serenity & serenity-libint (#43816)
* Update, fix serenity & serenity-libint
   Version 1.6.1 of `serenity` is added, some dependencies are made more
   explicit, some options are improved or fixed.
   The url of `serenity-libint` is fixed. The old url is not reachable
   anymore.
* Use upstream patch, modify cmake patch
* Update var/spack/repos/builtin/packages/serenity/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-20 16:07:25 -07:00
Tony Weaver
ce900346cc heimdall: Astronomy software package (#38328)
* heimdall: Astronomy software package.  Requires dedisp and psrdada (included as part of this commit)

* Updated packages to align with Spack's style

Minor updates based on wdconinc's comments regarding Spack's style guide

* [@spackbot] updating style on behalf of aweaver1fandm

* Minor edits to fix copyright year and dedisp install

Fixed copyright year to be 2024 instead of 2023

Removed the overridden version of install and created a preinstall function to create the missing lib and inc directories, therefore allowing the default install to run

Here is output from the spack install of heimdall showing successful build with cuda.  If necessary I do a spack clean and freshly install it
./spack install heimdall +cuda cuda_arch=86
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libiconv-1.17-enpmbhsi3kztebwmpclpub2afhlbr3gy
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/xz-5.4.1-pte76kujkezxb3laqse3o4sctlbygsaw
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/zlib-1.2.13-utlfo5ltxz5v5bckirn5v3amtbxjdvwh
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/bzip2-1.0.8-x6navz7ucgfnb5xq7aelqmgd4zxsz5bs
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libmd-1.0.4-ncomhrodpdul4dm64o6b7426fhmc2u64
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/ncurses-6.4-c77h34rooycbzapxjvc27sg5td5jiwyb
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/zstd-1.5.5-eporpybumydxveg5rwtfzysrsu4eqzcv
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libffi-3.4.4-vskntokaojclfqxjfzkbyirkeogddbpx
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libxcrypt-4.4.33-gtpn32p6mxztul3c3dxzqj7gvcyh555j
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/util-linux-uuid-2.38.1-g322a5peqjaad6gl5q64cdu4qo7kvw6o
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libxml2-2.10.3-pza3kz2mtbncbbeim6rejfqkgftnf4rz
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/openssl-1.1.1t-adbquvgg4qpc3vq6jynf44qzq3gfwrv5
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/pigz-2.7-e7mxj4ya2u4a6zb4hu64g7docujmkxeb
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libbsd-0.11.7-wh4xleivbe7wndiqt5nsehzlfrccnjcg
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/readline-8.2-hhb647bwmbcj7iwpmtetbylninfm5rxf
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/cuda-11.7.1-osue2sx5rv7dgzhsmaemydpwhyribxng
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/tar-1.34-35f5gki2ycxmy5zd7zs5tsvp3xoszxum
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/expat-2.5.0-4gizyhhqklciuyrbyinq2tdggt73gds4
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/gdbm-1.23-w3uzihtubj2iwv6es55fis6nt2q5zwlr
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/sqlite-3.40.1-m4ntzvuupsnbtkdfbz7oqpbjdlaffp2a
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/dedisp-1.0-pq6r3jnyxq6jzoygz3fp2e6jc2ojpvap
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/gettext-0.21.1-hglzdeadmgkzjb76bmemt6dnulfkrpha
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/python-3.10.10-khu36qq4p2te7jf475ewr2h7egidekfl
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/psrdada-master-by4w6mrpcfnoihlhos2jcfo2roiyagaz
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/heimdall-1.0-ohtdnltuhhejysshcert25h6nmuvluqp

* [@spackbot] updating style on behalf of aweaver1fandm
2024-05-20 13:39:23 -06:00
Adam J. Stewart
7cb64e465f py-pytest: add v8.2.1 (#44267)
* py-pytest: add v8.2.1
* py-pluggy: add v1.5.0
2024-05-20 11:28:59 -07:00
Pramod Kumbhar
eb70c9f5b9 caliper: add new variant to support Intel Vtune (#44147) 2024-05-20 18:27:42 +02:00
Teague Sterling
a28405700e awscli-v2: add v2.15.53, and other updates (#44258)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-05-20 18:21:09 +02:00
Adam J. Stewart
f8f4d94d7a Deprecate py-cmake and py-ninja (#44257) 2024-05-20 18:05:06 +02:00
Matt Thompson
32dfb522d6 gh: add v2.49.2 (#44231) 2024-05-20 13:51:46 +02:00
Mark W. Krentel
c61c707aa5 hpctoolkit: restrict one patch to :2022 (#44268)
Restrict the hpcrun-fmt.txt patch to :2022.  It's fixed in the code
after that, and in recent develop, some code paths have moved, causing
this patch to fail.
2024-05-20 01:44:28 -06:00
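For context, restricting a patch to a version range uses the `when` argument of the `patch` directive; a schematic recipe fragment (class and patch file names are placeholders, not the actual hpctoolkit recipe):

```python
from spack.package import *


class MyPackage(AutotoolsPackage):
    # Apply only through the 2022 releases; newer code already has the fix
    # and the paths this patch touches have moved.
    patch("hpcrun-fmt.patch", when="@:2022")
```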
Harmen Stoppels
60d10848c8 gettext: no link dep on tar (#44256) 2024-05-20 09:23:46 +02:00
Adam J. Stewart
dcd6b530f9 py-matplotlib: add v3.9.0 (#44225) 2024-05-20 09:22:30 +02:00
Teague Sterling
419f0742a0 htslib, STAR: add zlib-ng conflict (#44261) 2024-05-20 09:20:19 +02:00
Jacob King
c99174798b nimrod-aai: add version 24.2 and fix url (#42464)
SHAs changed due to reorganization of the GitLab.com repo
into an open subgroup.
2024-05-20 09:13:33 +02:00
Adam J. Stewart
8df2a4b511 py-gdown: add new package (#44265) 2024-05-20 09:09:35 +02:00
Benjamin Meyers
c174cf6830 Add openjdk@15.0.2 (#35936) 2024-05-19 10:27:26 -06:00
Wouter Deconinck
5eebd65366 audit: disallow github.com/org/repo/pull/n/commits/hash.patch?full_index=1 (#44212)
* audit: disallow github.com/org/repo/pull/n/commits/hash.patch?full_index=1

* [@spackbot] updating style on behalf of wdconinc

* audit: fix style

* audit: github.com/o/r/pull/n/commits/sha.patch -> sha.patch

* [@spackbot] updating style on behalf of wdconinc

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit 2ecec99238.

* Revert "audit: github.com/o/r/pull/n/commits/sha.patch -> sha.patch"

This reverts commit 5bd7da2cad.

* fix: modify audit message with suggested fix

* audit: github.com/o/r/pull/n/commits/sha.patch -> /o/r/commit/sha.patch?full_index=1

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-19 09:30:19 -05:00
Michael Kuhn
625f5323c0 py-pyqt5-sip: add 12.13.0 and fix build with gcc@14 (#44133) 2024-05-19 08:05:18 +02:00
Wouter Deconinck
e05a32cead gaudi: don't apply patch for 38.2 (#44252) 2024-05-18 23:13:46 -06:00
Juan Miguel Carceller
c69af5d1e5 podio: cleanup recipe, remove deprecated versions and patches (#44111)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-18 15:17:54 -06:00
Miranda Mundt
1ac2ee8043 Add Pyomo 6.7.2 (#44097) 2024-05-18 10:31:57 -05:00
Teague Sterling
36af1c1c73 perl-xml-libxml: add new versions and conflicts (fixes #44253) (#44254)
* Address #44253 by adding new versions and declaring conflicts for perl-xml-libxml

* [@spackbot] updating style on behalf of teaguesterling

* Update var/spack/repos/builtin/packages/perl-xml-libxml/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-18 09:28:32 -06:00
Carlos Bederián
e2fa087002 namd: add 3.0b7 (#44198) 2024-05-18 10:15:55 -05:00
Teague Sterling
df02bfbad2 Adding new spark versions (#44250)
* Adding new spark versions (in preparation of HAIL package)

* Adding myself as potential maintainer
2024-05-18 10:06:12 -05:00
Teague Sterling
fecb63843e yq: add new package (#44249) 2024-05-18 06:38:26 -06:00
Scott Wittenburg
b33e2d09d3 oci buildcache: handle pagination of tags (#43136)
This fixes an issue where ghcr, gitlab and possibly other container registries paginate tags by default, which violates the OCI spec v1.0, but is common practice (the spec was broken itself). After this commit, you can create build cache indices of > 100 specs on ghcr.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-18 11:57:53 +02:00
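For readers unfamiliar with the mechanism, a minimal sketch of Link-header pagination against the OCI `tags/list` endpoint (assuming a pre-obtained bearer token; this is not Spack's implementation):

```python
import json
import re
import urllib.request


def list_all_tags(registry, repository, token=None):
    """Follow `Link: <...>; rel="next"` headers until every tag page is read."""
    url = f"https://{registry}/v2/{repository}/tags/list?n=100"
    tags = []
    while url:
        request = urllib.request.Request(url)
        if token:
            request.add_header("Authorization", f"Bearer {token}")
        with urllib.request.urlopen(request) as response:
            tags.extend(json.loads(response.read()).get("tags") or [])
            link = response.headers.get("Link", "")
            match = re.search(r'<([^>]+)>;\s*rel="next"', link)
            url = f"https://{registry}{match.group(1)}" if match else None
    return tags
```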
Chris Green
f8054aa21a git: bump v2.39 to 2.45; deprecate unsafe versions (#44248) 2024-05-18 03:32:06 -06:00
Valentin Volkl
8f3a2acc54 whizard: add gosam variant (#43595)
* whizard: add gosam variant

* adress comments, fix compiler wrapper issue
2024-05-18 10:11:26 +02:00
Kevin Huck
d1a20908b8 ZeroSum: add new package (#44228) 2024-05-18 09:23:45 +02:00
Derek Ryan Strong
dd781f7368 perl: add v5.36.3, v5.38.2; deprecate unsafe versions (#44186) 2024-05-18 09:21:03 +02:00
dmagdavector
9bcc43c4c1 protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
* protobuf: update hash for patch needed when="@3.4:3.21"

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-17 17:50:29 -06:00
Todd Gamblin
77c83af17d docs: remove warning about repositories and package extension (#44247)
Local package repositories are very well supported and we test them extensively, so this
warning from 8 years ago can be removed from the docs.
2024-05-17 22:03:57 +00:00
Michael Kuhn
574bd2db99 netlib-scalapack: fix build with gcc@14 (#44120) 2024-05-17 22:38:46 +02:00
James Taliaferro
a76f37da96 kakoune: add v2024.05.09 (#44124) 2024-05-17 22:36:58 +02:00
Adam J. Stewart
9e75f3ec0a GDAL: add v3.9.0 (#44128) 2024-05-17 22:35:33 +02:00
Derek Ryan Strong
4d42d45897 Add latest Python versions (#44130) 2024-05-17 22:35:06 +02:00
SXS Bot
a4b4bfda73 spectre: add v2024.05.11 (#44139)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-05-17 22:28:25 +02:00
Michael Kuhn
1bcdd3a57e py-cython: add 3.0.10 (#44140) 2024-05-17 22:27:42 +02:00
Stephen Sachs
297a3a1bc9 Add mpas-model and mpich to pcluster neoverse stack (#44151)
Should build now since https://github.com/spack/spack/pull/43547 has been merged.
2024-05-17 22:00:17 +02:00
Adam J. Stewart
8d01e8c978 JAX: add v0.4.28 (#44112) 2024-05-17 21:58:44 +02:00
Wouter Deconinck
6be28aa303 pythia8: patch latest 8.311 for upstream bug (#43803)
* pythia8: prefer 8.310

* [@spackbot] updating style on behalf of wdconinc

* pythia8: filter_file to remove sed n

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit e2a3decaffbd3f464d1bd992025e1812df49f088.

* Revert "pythia8: prefer 8.310"

This reverts commit 568cb056b87129085e245d9dbef1732ee1c6c0aa.

* [@spackbot] updating style on behalf of wdconinc

* pythia8: comment for fix

* pythia8: fix style

* pythia8: filter_file with raw string because of escaped pipe

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-17 21:31:05 +02:00
Jon Rood
5e38310515 nalu-wind: fix trilinos rocm dependency (#44233) 2024-05-17 13:00:24 -06:00
wspear
ddfed65485 tau: fix lib/include paths with oneapi (#44170) 2024-05-17 18:55:27 +02:00
dependabot[bot]
2a16d8bfa8 build(deps): bump codecov/codecov-action from 4.3.1 to 4.4.0 (#44195)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.1 to 4.4.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5ecb98a3c6...6d798873df)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 18:46:17 +02:00
Jonathon Anderson
6a40a50a29 hpcviewer: Update URLs to use GitLab release assets (#44129) 2024-05-17 18:43:13 +02:00
Carlos Bederián
b2924f68c0 blis: add v1.0 (#44199) 2024-05-17 18:39:42 +02:00
Rocco Meli
41ffe36636 spglib: add v2.4.0 (#44202) 2024-05-17 18:33:40 +02:00
Chris Marsh
24edc72252 docs: show phase signature for builders (#44067) 2024-05-17 18:16:31 +02:00
Raffaele Solcà
83b38a26a0 New versions nvpl-blas and nvpl-lapack (#44244) 2024-05-17 17:46:25 +02:00
Nathalie Furmento
914d785e3b starpu: add v1.4.6 (#44203) 2024-05-17 08:17:01 -06:00
Garth N. Wells
f99f642fa8 fenicsx: remove deprecated versions (#44223) 2024-05-17 07:35:21 -06:00
Seth R. Johnson
e0bf3667e3 cli11: new version and enable library (#44204) 2024-05-17 15:08:28 +02:00
Andrew-Dunning-NNL
a24ca50fed Fix broken link in docs (#44217) 2024-05-17 12:58:11 +00:00
Fabien Bruneval
51e9f37252 MOLGW: add v3.3 (#44241)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 06:56:05 -06:00
Rémi Lacroix
453900c884 libxc: Fix compilation after distribution changes. (#44206)
The release tarballs are not available anymore, which means autoconf, automake and libtool are always needed.

The NVHPC-specific patches don't make sense anymore since the patched files are not distributed in the new tar files.
2024-05-17 14:52:03 +02:00
Alberto Invernizzi
4696459d2d libcatalyst: add missing python dependencies (#44224) 2024-05-17 14:45:26 +02:00
Fabien Bruneval
ad1e3231e5 libcint: add v6.1.2, v5.5.0 (#44239)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 05:45:26 -06:00
Richard Berger
2ef7eb1826 exodusii: only use MPI fortran compiler if +fortran (#44211) 2024-05-17 13:01:09 +02:00
Dom Heinzeller
fe86019f9a ecflow: versions up to 5.11.4 require boost 1.84 or earlier (#44181) 2024-05-17 12:53:48 +02:00
Harmen Stoppels
9dbb18219f build_environment.py: deal with rpathing identical packages (#44219)
When multiple gcc-runtime packages exist in the same link sub-dag, only rpath
the latest.
2024-05-17 12:29:56 +02:00
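The idea reduces to keeping only the highest version per package name before computing rpaths. A simplified sketch (the real code operates on Spack specs in a link sub-dag, not name/version tuples):

```python
def newest_per_name(link_deps):
    """Keep only the newest entry per package name, e.g. one gcc-runtime."""
    newest = {}
    for name, version in link_deps:
        if name not in newest or version > newest[name]:
            newest[name] = version
    return sorted(newest.items())


deps = [("gcc-runtime", (13, 2, 0)), ("gcc-runtime", (14, 1, 0)), ("zlib-ng", (2, 1, 6))]
print(newest_per_name(deps))  # only gcc-runtime 14.1.0 gets rpath'd
```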
Mikael Simberg
451a977de0 hpx: change default of max_cpu_count variant to auto (#44220) 2024-05-17 11:54:48 +02:00
Jack S. Hale
e604929a4c FEniCS: add more maintainers (#44240) 2024-05-17 03:03:21 -06:00
dependabot[bot]
9d591f9f7c build(deps): bump actions/checkout from 4.1.5 to 4.1.6 (#44234)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.5 to 4.1.6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](44c2b7a8a4...a5ac7e51b4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 10:57:08 +02:00
Matt Thompson
f8ad915100 esmf: add v8.6.1 (#44229) 2024-05-17 10:49:47 +02:00
Garth N. Wells
cbbabe6920 fenics-dolfinx: add spdlog dependency (#44237) 2024-05-17 10:48:55 +02:00
John W. Parent
81fe460194 Gitlab CI: Windows Configs (#43967)
Add support for Gitlab CI on Windows

This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows 
infrastructure).

* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2024-05-16 17:00:02 -06:00
Paul Kuberry
b894f996c0 trilinos: catch kokkos inconsistency with trilinos (#44209)
* trilinos: catch kokkos inconsistency with trilinos

* trilinos: update kokkos version range
2024-05-16 13:36:11 -06:00
John W. Parent
1ce09847d9 Prefer llnl.util.symlink.readlink to os.readlink (#44126)
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).

Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
2024-05-16 10:56:04 -07:00
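A simplified stand-in for what such a wrapper does (the real helper is `llnl.util.symlink.readlink`; this sketch only shows the long-path prefix handling):

```python
import os

_WIN_LONGPATH_PREFIX = "\\\\?\\"  # the literal characters \\?\


def readlink(path):
    """os.readlink, with the Windows long-path prefix stripped if present."""
    target = os.readlink(path)
    if target.startswith(_WIN_LONGPATH_PREFIX):
        target = target[len(_WIN_LONGPATH_PREFIX):]
    return target
```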
Thomas Madlener
722d401394 gaudi: Don't apply the patch if it has already landed upstream (#44180) 2024-05-16 17:15:39 +02:00
Howard Pritchard
e6f04d5ef9 py-matplotlib: qualify when to do a post install (#44191)
* py-matplotlib: qualify when to do a post install

Older versions of py-matplotlib don't seem to have some of the
files that the post install step is trying to install.
Looks like the files first appeared in 3.6.0 and later.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>

* Change install paths for older matplotlib

---------

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-16 16:18:44 +02:00
Mosè Giordano
b8e3ecbf00 suite-sparse: improve setting of the libs property (#44214)
on some distros it is in lib64/
2024-05-16 11:07:18 +02:00
Todd Gamblin
d189387c24 bugfix: add arg to write_line_break() in spack_yaml (#42727)
`ruamel`'s `Emitter.write_line_break()` method takes an extra argument that we forgot to
implement in our custom emitter.
2024-05-15 19:25:06 -07:00
Andrew Lister
9e96ddc5ae CoinHSL: Support the Meson build system and add new release (#43610)
* Add maintainer and fix linting
* allow for fewer deps in archive
* use meson for archive packages
* Fix version spec and f-string
* fix blas dependency links
* Add new release to spack
* Fix checksums for latest release
2024-05-15 16:48:23 -06:00
dmagdavector
543bd189af iperf3: updated versions from 3.6 to 3.16 (#44152)
* Update version of iperf3 from 3.6 to 3.16

Spack currently only explicitly has version 3.6 of the iPerf3 package
(out of ESnet / LBNL). This makes the latest version, 3.16, the default
and adds some other versions (found in some Linux distros, for possible
compatibility purposes).

* iperf3: update to 3.17; update 3.6 hash for new url

* protobuf: update hash for patch needed when="@3.4:3.21"

* Revert "protobuf: update hash for patch needed when="@3.4:3.21""

This reverts commit 4d168d0b27.
2024-05-15 15:48:15 -06:00
John W. Parent
43291aa723 Cdash reporting timeout (#44213)
* Add timeout to cdash reporter PUT request

Add cdash timeout everywhere
Correct mock responder api

* Style

* brief doc
2024-05-15 15:41:51 -04:00
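A hedged sketch of the change's core idea (the header and the 45-second value are placeholders, not necessarily what Spack's reporter uses):

```python
import urllib.request

TIMEOUT = 45  # seconds; without it a hung CDash server stalls the pipeline


def put_report(url, payload: bytes):
    request = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "text/xml"}, method="PUT"
    )
    return urllib.request.urlopen(request, timeout=TIMEOUT)
```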
Alec Scott
d0589285f7 unmaintained pkgs: bump versions 2024-05-10 (#44131)
* unmaintained pkgs: bump versions 2024-05-10

* openblas: fix satisfies syntax

* pixman: add autotools dependencies

* [@spackbot] updating style on behalf of alecbcs

* pixman: revert

* chapel: revert changes in favor of other PR

* openblas: revert due to failing tests

* Address review feedback for flint, biobam2, and pango

* pango: add version comment about v2.0

* numactl: revert changes due to ppc4le bug

* flint: remove duplicate configure arg

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* openvkl, rkcommon: remove commented maintainers template

* flint: fix style

---------

Co-authored-by: alecbcs <alecbcs@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-15 11:04:52 -07:00
afzpatel
d079aaa083 enable tensorflow-2.14 and 2.16 for spack built ROCm (#44095)
* initial commit to enable tensorflow-2.14 for spack built ROCm

* fix style errors

* modify hipcc patch

* updates for rocm 6.1

* updates for tf-rocm-enhanced 2.16

* fix styling

* fix styling

* add patch for 2.16

* add patch for 2.16

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* update rocm enhanced version names

* changes for rocm-enhanced version name change

* fix styling

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-15 17:06:35 +02:00
Paolo
6c65977e0d Fix gromacs installation with SVE. Issue 44062 (#44183)
* Fix gromacs installation with SVE. Issue 44062

* [@spackbot] updating style on behalf of paolotricerri

* Remove `neoverse_n2` target

We have removed the neoverse_n2 target as its detection is more involved
compared to neoverse_v*.
2024-05-15 07:22:47 -06:00
Carlos Bederián
1b5d786cf5 gromacs: add 2024.2, 2023.5 (#44197) 2024-05-15 07:21:55 -06:00
Ben Morgan
4cf00645bd VecGeom: new version 1.2.8 (#44179) 2024-05-15 07:50:48 -04:00
Jon Rood
e9149cfc3c nalu-wind: fix mistake (#44188) 2024-05-14 23:13:13 -06:00
Jon Rood
a5c8111076 exawind: updates to package to allow mixed device (#44159)
* exawind: updates to package to allow mixed device

* Style.

* Remove ninja variants.

* Add conflict for amr-wind+hypre with mixed device.

* Relax amr-wind~hypre requirement.

* Move runtime variables to nalu-wind.

* Update suggestions.

* Remove umpire.
2024-05-14 12:26:07 -06:00
Alec Scott
c3576f712d pkg-config: support apple-clang@15: (#44007) 2024-05-14 17:24:36 +02:00
Alec Scott
410e6a59b7 rust: fix v1.78.0 instructions (#44127) 2024-05-14 08:14:34 -07:00
Carlos Bederián
bd2b2fb75a python: add 3.10.14, 3.9.19, 3.8.19 (#44162) 2024-05-14 17:03:45 +02:00
Zachary Newell
7ae318efd0 Added NCCL version 2.21.5-1 (#44158) 2024-05-14 03:34:21 -06:00
Derek Ryan Strong
73e9d56647 Update bash 5.2 patches (#44172) 2024-05-14 09:08:39 +02:00
Derek Ryan Strong
f87a752b63 Add zsh 5.8.1 and 5.9 (#44173) 2024-05-14 09:08:08 +02:00
Derek Ryan Strong
ae2fec30c3 Add newer fish versions (#44174) 2024-05-14 09:07:49 +02:00
Derek Ryan Strong
1af5564cbe Add file 5.45 (#44175) 2024-05-14 09:07:25 +02:00
Derek Ryan Strong
a8f057a701 Add man-db 2.12.0 and 2.12.1 (#44176) 2024-05-14 09:07:10 +02:00
Michael B Kuhn
7f3dd38ccc amr-wind: add latest versions and correct waves2amr details (#44099)
* add latest versions and correct waves2amr details

* update commit associated with v2.1.0
2024-05-13 14:21:25 -06:00
Adam J. Stewart
8e9adefcd5 ML CI: update image (#43751)
* ML CI: update image

* Use main branch

* Use tagged version
2024-05-13 21:43:52 +02:00
jdomke
d276f9700f fujitsu-mpi package: help CMake find wrappers (#43979)
Set MPI_C_COMPILER etc. env vars when building with fujitsu-mpi, which
override CMake logic. Without using these variables to explicitly
request the Fujitsu MPI wrappers, the builtin CMake logic is unwilling
to use these wrappers unless the Fujitsu compiler is used, but they
should be used for %clang as well.

This avoids patching CMake module files like in #16864, primarily to
avoid the possibility of altering behavior for specs that do not use
%fj or ^fujitsu-mpi.
2024-05-13 11:15:45 -07:00
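Schematically, the change amounts to exporting the wrapper paths in the package's dependent build environment; a sketch under the assumption that the hook body looks roughly like this (the exact recipe code may differ):

```python
from spack.package import *


class FujitsuMpi(Package):
    # ...

    def setup_dependent_build_environment(self, env, dependent_spec):
        # Point CMake's FindMPI at the Fujitsu wrappers explicitly, so they
        # are used even when building with %clang rather than %fj.
        env.set("MPI_C_COMPILER", self.prefix.bin.mpifcc)
        env.set("MPI_CXX_COMPILER", self.prefix.bin.mpiFCC)
        env.set("MPI_Fortran_COMPILER", self.prefix.bin.mpifrt)
```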
Harmen Stoppels
4f111659ec glibc: detect from "Free Software Foundation" not "gnu" (#44154)
which should be more generic
2024-05-13 20:11:27 +02:00
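As a hedged illustration of what that detection looks like (class body and regex are simplified stand-ins, not the actual glibc recipe):

```python
import re

from spack.package import *


class Glibc(AutotoolsPackage):
    executables = [r"^ldd$"]

    @classmethod
    def determine_version(cls, exe):
        output = Executable(exe)("--version", output=str, error=str)
        # Match the vendor string upstream ldd actually prints, rather than
        # a lowercase "gnu" that not every distro build includes.
        if "Free Software Foundation" not in output:
            return None
        match = re.search(r"\(.*?\) (\d+(\.\d+)+)", output)
        return match.group(1) if match else None
```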
Dave Keeshan
eaf330f2a8 yosys: add v0.41 (#44153) 2024-05-13 09:40:42 -07:00
Julien Cortial
cdaeb74dc7 cdt: Add version 1.4.1 (#44155) 2024-05-13 09:38:25 -07:00
Alberto Sartori
fbaac46604 justbuild: add version 1.3.0 (#44148) 2024-05-13 09:35:29 -07:00
Jon Rood
7f6210ee90 nalu-wind: updates (#44046) 2024-05-13 10:28:24 -06:00
Wouter Deconinck
63f6e6079a gaudi: upstream patch when @38.1 for missing #include <list> (#44121)
* gaudi: upstream patch when @38.1 for missing #include <list>

* gaudi: apply list patch for all versions
2024-05-13 07:37:31 -06:00
Greg Becker
d4fd6caae0 spack uninstall: improve error message for dependent environment (#44149) 2024-05-13 14:58:13 +02:00
Mikael Simberg
fd3c18b6fd whip: Add 0.3.0 (#44146)
Co-authored-by: Mikael Simberg <simbergm@cscs.ch>
2024-05-13 03:09:32 -06:00
Adam J. Stewart
725f427f25 spack checksum: do not add expand=False to wheels (#44118) 2024-05-13 10:01:47 +02:00
Paul R. C. Kent
32b3e91ef7 llvm: add 18.1.5, 18.1.4 (#44123) 2024-05-12 16:28:32 -07:00
bk
b7e4602268 R: common package updates for 4.4.0 (#44088)
* r-ggplot2: v3.5.1, r-haven: v2.5.4, r-jsonlite: v1.8.8, r-pkgload: v1.3.4, r-vctrs: v0.6.5

* r-scales: v1.3.0
2024-05-12 10:49:38 -07:00
Stephen Sachs
4a98d4db93 Add applications to aws-pcluster-* stacks (#43901)
* Add openfoam to aws-pcluster-neoverse_v1 stack

* Add more apps to aws-pcluster-x86_64_v4 stack

* Remove WRF while hdf5 cannot build in buildcache at the moment

* Update comment

* Add more apps for aws-pcluster-neoverse_v1 stack

* Remove apps that currently do not build

* Disable those packages that won't build

* Modify syntax such that correct cflags are used

* Changing syntax again to what works with other packages

* Fix overriding packages.yaml entry for gettext

* Use new `prefer` and `require:when` clauses to clarify intent

* Use newer spack version to install intel compiler

This removes the need for patches and makes sure the `prefer` directives in
`packages.yaml` are understood.

* `prefer` not strong enough, need to set compilers

* Revert "Use newer spack version to install intel compiler"

This reverts commit ecb25a192c.

Cannot update the spack version to install intel compiler as this changes the
compiler hash but not the version. This leads to incompatible compiler paths. If
we update this spack version in the future make sure the compiler version also updates.

Tested-by: Stephen Sachs <stesachs@amazon.com>

* Remove `prefer` clause as it is not strong enough for our needs

This way we can safely go back to installing the intel compiler with an older
spack version.

* Prefer gcc or oneapi to build gettext

* Pin gettext version compatible with system glibc-headers

* relax gettext version requirement to enable later versions

* oneapi cannot build older gettext version
2024-05-12 10:48:02 -07:00
Juan Miguel Carceller
9d6bf373be whizard: Add version 3.1.4 (#43045)
* whizard: Add a patch to fix builds with pythia8 >= 8.310

* Add whizard 3.1.4 and update accordingly

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:44:56 -07:00
Alex Richert
cff35c4987 octave: use pcre2 for @8: (#42636)
* octave: use pcre2 for @8:

* Add 'pcre2' variant to octave to control pcre vs. pcre2

* Update var/spack/repos/builtin/packages/octave/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alex Richert <alexander.richert@noaa.gov>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-12 10:44:31 -07:00
Juan Miguel Carceller
d594f84b8f fastjet: Add a cxxstd variant (#44072)
* fastjet: Add a cxxstd variant

* Use f-strings

* Add multi=False

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:42:56 -07:00
Wouter Deconinck
f8f01c336c clang: support cxx20_flag and cxx23_flag (#43438)
* clang: support cxx20_flag and cxx23_flag

* clang: coverage test cxx{}_flag and c{}_flag additions
2024-05-12 07:45:59 -07:00
Todd Gamblin
12e3665df3 Bump version on develop to v0.23dev0 (#44137) 2024-05-11 18:01:50 +02:00
Todd Gamblin
fa4778b9fc changelog: add changes form 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:45:14 +02:00
Harmen Stoppels
66d297d420 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
2024-05-11 15:43:32 +02:00
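A minimal sketch of a retry helper covering the three cases listed above (retry count and backoff are illustrative, not the values Spack uses):

```python
import time
import urllib.error


def default_retry(fn, retries=3, sleep_sec=1.0):
    def wrapper(*args, **kwargs):
        for attempt in range(1, retries + 1):
            try:
                return fn(*args, **kwargs)
            except urllib.error.HTTPError as e:
                if e.code < 500 or attempt == retries:
                    raise  # only retry server-side errors
            except (urllib.error.URLError, TimeoutError):
                # URLError wraps the real cause in e.reason; TimeoutError can
                # escape getresponse() unwrapped.
                if attempt == retries:
                    raise
            time.sleep(sleep_sec * attempt)

    return wrapper
```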
James Taliaferro
56251c11f3 qpdf: new package (#44066)
* New package: qpdf

* Update var/spack/repos/builtin/packages/qpdf/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix format of cmake_args

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-10 14:32:57 -06:00
James Smillie
40bf9a179b New package: py-nuitka (#44079) 2024-05-10 12:07:02 -07:00
John W. Parent
095aba0b9f Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure all links written into tarfiles generated from Spack prefixes do not contain symlinks pointing outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-05-10 13:00:40 -05:00
John W. Parent
4270136598 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running spack on windows, a path beginning with `C://...` is a valid path.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix typo in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* Review

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-05-10 13:00:13 -05:00
Juan Miguel Carceller
f73d7d2dce dd4hep: cleanup recipe, remove deprecated versions and patches (#44110)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:40:38 -07:00
Juan Miguel Carceller
567566da08 edm4hep: cleanup recipe, remove deprecated versions and patches (#44113)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:39:06 -07:00
Chris Marsh
30a9ab749d libtheora: Remove unneeded autoreconf section (#44063)
* Remove autoreconf section that was causing issues with libtool mismatch. Fixes issue #43498
2024-05-10 10:27:20 -04:00
Mikael Simberg
8160a96b27 dla-future: Add 0.4.1 (#44115)
* dla-future: Add 0.4.1

* Use ninja as generator in dla-future
2024-05-10 15:21:47 +02:00
Mikael Simberg
10414d3e6c pika: Add 0.25.0 (#44114) 2024-05-10 14:37:31 +02:00
Stephen Sachs
1d96c09094 libhugetlbfs: Fix the build with an update to 2.24 (#44059)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-05-10 10:50:36 +02:00
Harmen Stoppels
e7112fbc6a PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:47:37 +02:00
Tom Scogland
b79761b7eb flux-sched: set the version if the ver file is missing (#44068)
* flux-sched: set the version if the ver file is missing

problem: flux-sched needs a version; it normally gets this from a
release tarball or from git tags, but if using a source archive or a git
clone without tags the version is missing

solution: set the version through cmake based on the version spack sees
when the version file is missing

* Update var/spack/repos/builtin/packages/flux-sched/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-09 11:50:42 -07:00
Christopher Christofi
3381899c69 miniforge3: add new versions with mamba installation (#43995)
* miniforge3: add new version with mamba installation
* fix styling
* update maintainers
* Fix variant directive ordering
2024-05-09 12:48:40 -06:00
Massimiliano Culpo
c7cf5eabc1 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 18:50:15 +02:00
Juan Miguel Carceller
d88fa5cf8e pythia8: add a cxxstd variant (#44077)
* pythia8: add a cxxstd variant
* Add multi=False, fix regexp

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-09 09:32:38 -07:00
Richard Berger
2ed0e3d737 kokkos-nvcc-wrapper: add missing versions (#44089) 2024-05-09 09:23:16 -07:00
Julien Cortial
506a40cac1 mmg: Build & install shared libraries, add 5.7.2 & 5.7.3(#43749) 2024-05-09 16:59:53 +02:00
Rémi Lacroix
447739fcef Universal Ctags: Add version 6.1.20240505.0 (#44058) 2024-05-09 15:55:41 +02:00
Massimiliano Culpo
e60f6f4a6e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 14:28:17 +02:00
Vanessasaurus
7df35d0da0 package: libsodium update URL to GitHub (#44090)
Problem: the libsodium download endpoints are not reliable;
they fail multiple times a day for several of our automated
pipelines.
Solution: use the GitHub releases and branches.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2024-05-09 14:26:02 +02:00
Luc Berger
71b035ece1 Kokkos: adding new release 4.3.01 (#44086) 2024-05-09 03:33:35 -06:00
Ashwin Kumar Karnad
86a134235e Octopus 14.1 : Add new version hash (#44083)
* octopus: add hash for new version
* octopus: add --enable-silent-deprecation flag when @14.1:
2024-05-09 03:18:22 -06:00
Jon Rood
24cd0da7fb amr-wind: add missing variants (#44078)
* amr-wind: add missing variants

* Fix copy and paste.

* waves2amr is only available on amr-wind@2.0

* Style.

* Use satisfies.
2024-05-09 02:38:16 -06:00
alvaro-sch
762833a663 Orca (standalone) 5.0.4 (#44011)
* orca: added latest version 5.0.4
* Fixed openmpi versions
2024-05-08 15:36:27 -07:00
Ken Raffenetti
636d479e5f mpich: Add license (#43821) 2024-05-08 12:07:46 -07:00
Sreenivasa Murthy Kolam
f2184f26fa hipsparselt: new package (#44080)
* Initial commit for adding hipsparselt recipe
* correct the style errors
* remove master version
2024-05-08 12:03:21 -07:00
Harmen Stoppels
e1686eef7c gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:04 +02:00
Adam J. Stewart
314893982e JAX: add v0.4.27, NCCL variant (#44071) 2024-05-08 11:36:24 -07:00
Jake Koester
9ab6c30a3d add flag to turn off building tests and examples (#44075) 2024-05-08 11:33:49 -07:00
Adam J. Stewart
ddf94291d4 py-jsonargparse: add v4.28.0 (#44074) 2024-05-08 11:31:07 -07:00
Jon Rood
5d1038c512 hypre: add cublas and rocblas variants (#44038)
* Add cublas and roblas variants to hypre.

* Fix mistakes.

* Remove newline.

* Address suggestions.
2024-05-08 12:04:11 -06:00
renjithravindrankannath
2e40c88d50 Bump up the version for ROCm-6.1.0 (#43843)
* Bump up the version for ROCm-6.1.0
* Style check error correction and patch files
* Update for rocm-openmp-extras 6.1
* updating rocm-openmp-extras and math libraries with 6.1
* Style check error correcion
* updating hipcub, hipfort & miopen-hip for 6.1
* Rocm-openmp-extras and some mathlib updates
* iAudit error correction and rocmlir update
* Updating dependency on suite-sparse and adding path
* Style check error corection
* hip-tensor 6.1.0 update
* rdc 6.1 needs grpc 1.59.1
* rvs 6.1 updates and patch
2024-05-08 10:26:30 -07:00
dependabot[bot]
2bcba57757 build(deps): bump actions/checkout from 4.1.4 to 4.1.5 (#44045)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.4 to 4.1.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](0ad4b8fada...44c2b7a8a4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-08 09:22:48 -07:00
shanedsnyder
37330e5e2b darshan-runtime,darshan-util,py-darshan: add darshan-3.4.5 packages (#43989)
* add darshan-3.4.5 packages, update URLs
* py-setuptools version switches for py-darshan
* more py-darshan test dependencies
* try a conditional importlib_resources dependency
2024-05-08 09:06:24 -07:00
snehring
b4411cf2db iq-tree: add new version delete duplicate package (#44043)
* iq-tree: add new version delete duplicate package
* Replace iqtree2 dependency
   orthofinder: replace iqtree2 with iq-tree@2
   py-phylophlan: replace iqtree2 with iq-tree@2
2024-05-08 09:02:06 -07:00
Harmen Stoppels
65d1ae083c gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:18:12 +02:00
andriish
0b8faa3918 cernlib: add 2023.08.14.0-free (#40211)
* Update CERNLIB

Update CERNLIB

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of andriish

* cernlib: merge crypto->crypt patches

* cernlib: depends_on xbae when `@2023:`

* cernlib: patch for Xbae and Xm link order (DSO)

* [@spackbot] updating style on behalf of andriish

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 06:58:07 -05:00
Alec Scott
f077c7e33b unmaintained pkg bump: 2024-05-05 (#44021)
* unmaintained pkgs: bump versions

* Changes following review feedback

* glm: update cmake dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 05:53:49 -06:00
Massimiliano Culpo
9d7410d22e gcc: add v14.1.0 (#44053) 2024-05-08 12:14:44 +02:00
Tamara Dahlgren
e295730d0e mold: Replace maintainer (#44047)
* Remove maintainer at his request

* Add msimberg as the maintainer
2024-05-08 09:29:43 +02:00
Todd Gamblin
868327ee14 r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 08:56:37 +02:00
Massimiliano Culpo
f5430b16bc Bump removal version in deprecation messages (#44064) 2024-05-08 08:49:14 +02:00
Tamara Dahlgren
2446695113 Remove dead environment creation code (#44065) 2024-05-07 22:49:06 -06:00
snehring
e0c6cca65c orca: switching to xz archives, removing old version (#44035) 2024-05-07 15:18:53 -07:00
Auriane R
84ed4cd331 Add transformer engine package (#43982)
* Add py-flash-attn@2.4.2
* Add py-transfomer-engine package

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-07 14:56:34 -07:00
Juan Miguel Carceller
f6d50f790e gaudi: Add version 38.1 (#44055)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-07 14:20:02 -07:00
Tim Haines
d3c3d23d1e Dyninst: update compiler requirements (#44033)
As of 13.0.0, Dyninst can now build with any Linux compiler.
2024-05-07 15:03:17 -06:00
Vicente Bolea
33752c2b55 fix(adios2): fix missing stdind include in 2.7 (#43786) 2024-05-07 15:52:20 -05:00
Harmen Stoppels
26759249ca gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-07 18:59:12 +02:00
dependabot[bot]
8b4cbbe7b3 build(deps): bump pygments from 2.17.2 to 2.18.0 in /lib/spack/docs (#44044)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.2...2.18.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-07 18:55:04 +02:00
Richarda Butler
be71f9fdc4 Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
   spack:
     specs: []
     concretizer:
         unify: true
     include_concrete:
     - /path/to/environment1
     - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected after the environment
is re-concretized from the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

    Co-authored-by: Kayla Butler <<butler59@llnl.gov>
    Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
    Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
    Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-07 09:32:40 -07:00
Massimiliano Culpo
05c1e7ecc2 Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-07 17:56:29 +02:00
Massimiliano Culpo
f7afd67a26 Remove spurious ASP debug lines (#44051) 2024-05-07 11:28:38 +02:00
psakievich
d22bdc1c4e certs: fix interpolation and disallow relative paths (#44030) 2024-05-07 11:16:32 +02:00
Mikael Simberg
540f9eefb7 mold: Add 2.30.0 and 2.31.0 (#44034) 2024-05-07 10:41:13 +02:00
Massimiliano Culpo
2db5bca778 Warn users of the future removal of platform=cray (#43980) 2024-05-07 10:30:23 +02:00
Harmen Stoppels
bcd05407b8 llnl.util.tty.color._force_color: init in global scope (#44036)
Currently SPACK_COLOR=always is not respected in the build process on
macOS, because the global `_force_color` is re-evaluated in global scope
during module setup, where it is always `None`.

So, move global init bits from main.py to the module itself.
2024-05-07 09:49:46 +02:00
John W. Parent
b35ec605fe python-venv: use correct python name for which call (#44048) 2024-05-07 09:32:44 +02:00
Massimiliano Culpo
0a353abc42 Fix issues in packages with the new release of archspec (#44037) 2024-05-07 09:20:34 +02:00
Massimiliano Culpo
e178c58847 Respect requests when filtering reused specs (#44042)
Some specs which were excluded from reuse,
are currently added back to the solve when
we traverse dependencies of other reusable
specs.

This fixes the issue by keeping track of what
we can explicitly reuse.
2024-05-07 09:06:51 +02:00
Massimiliano Culpo
d7297e67a5 Maintenance for the "bootstrap" workflow in CI (#44031)
Removed a lot of duplication.

Fixed an issue in containers that led to false negatives
2024-05-07 08:49:53 +02:00
snehring
ee8addf04a neic-finitefault: adding new package and dependencies (#44032)
* neic-finitefault: adding new package
* py-obspy: adding new package
* py-okada-wrapper: adding new package
* py-pygmt: adding new package
* py-pyrocko: adding new package

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-06 15:45:32 -07:00
Jon Rood
fd3cd3a1c6 amr-wind: updates and fixes (#43993)
* Updates and fixes for AMR-Wind.

* Add versions and update openfast constraints.

* Style.

* Fix version.
2024-05-06 13:31:29 -06:00
Garth N. Wells
e585aeb883 (py)-fenics-dolfinx: version update (#44002)
* Update DOLFINx to v0.8
* Fix typo
2024-05-06 12:11:02 -07:00
James Taliaferro
1f43384db4 Add LSP client extension for Kakoune (#44000)
* new package: kakoune-lsp
* blacken
* add myself as a maintainer
2024-05-06 11:18:45 -07:00
Adam J. Stewart
814b328fca py-torchmetrics: add v1.4.0 (#44027) 2024-05-06 10:21:20 -07:00
Harmen Stoppels
125206d44d python: always use a venv (#40773)
This commit adds a layer of indirection to improve build isolation with 
and without external Python, as well as usability of environment views.

It adds `python-venv` as a dependency to all packages that `extends("python")`, 
which has the following advantages:

1. Build isolation: only `PYTHONPATH` is considered in builds, not 
   user / system packages
2. Stable install layout: fixes the problem on Debian, RHEL and Fedora where 
   external / system python produces `bin/local` subdirs in Spack install prefixes. 
3. Environment views are Python virtual environments (and if you add 
   `py-pip` things like `pip list` work)

Views work whether they're symlink, hardlink or copy type.

This commit additionally makes `spec["python"].command` return 
`spec["python-venv"].command`. The rationale is that packages in repos we do 
not own do not pass the underlying python to the build system, which could still 
result in incorrectly computed install layouts.

Other attributes like `libs`, `headers` should be on `python` anyways and need no change.
2024-05-06 16:17:35 +02:00
John W. Parent
a081b875b4 proj: patch for modern cmake (#43780) 2024-05-06 16:01:10 +02:00
Harmen Stoppels
a16ee3348b Do not cache indices in Gitlab (#44029) 2024-05-06 15:54:21 +02:00
Massimiliano Culpo
d654d6b1f4 Remove Fedora 37 and 38, Ubuntu 18 from CI (#44006) 2024-05-06 15:51:45 +02:00
Harmen Stoppels
9b4ca0be40 clingo bootstrap: remove 3.12 patch and concretizer workarounds (#44028) 2024-05-06 15:00:41 +02:00
Harmen Stoppels
dc71dcfdc2 bootstrap: lazy bootstrapping of clingo and GnuPG (#44026)
Currently bootstrapping from source fails because clingo requires gnupg
requires clingo.

This commit stops eager bootstrapping. We generally don't need `patchelf` or
`gnupg`; they're bootstrapped when needed.
2024-05-06 14:02:39 +02:00
Greg Becker
1f31c3374c External package detection for compilers (#43464)
This creates shared infrastructure for compiler packages to implement the 
detailed search capabilities from the `spack compiler find` command for the 
`spack external find` command.

After this commit, `spack compiler find` can be replaced with 
`spack external find --tag compiler`, with the exception of mixed toolchains.
2024-05-06 10:33:33 +02:00
Massimiliano Culpo
27aeb6e293 Update vendored archspec to v0.2.4 (#44005) 2024-05-06 10:20:56 +02:00
Harmen Stoppels
715214c1a1 spack env create <env>: dir if dir-like (#44024)
A named env cannot contain `.` and `/`.

So when a user runs `spack env create ./here` do not error but treat it
as `spack env create -d ./here`.

Also fix help string of `spack env create`, which seems to have been
copied from `activate` incorrectly.
2024-05-06 02:00:23 -06:00
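A toy sketch of that dispatch (helper behavior is mocked with strings here; the real command creates environments):

```python
import os


def env_create(name_or_path):
    # Anything containing "." or a path separator cannot be a named
    # environment, so treat it like `spack env create -d <dir>`.
    if "." in name_or_path or "/" in name_or_path or os.sep in name_or_path:
        return f"directory environment: {os.path.abspath(name_or_path)}"
    return f"named environment: {name_or_path}"


print(env_create("./here"))   # directory environment
print(env_create("myenv"))    # named environment
```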
dependabot[bot]
b471d62dbd build(deps): bump black from 24.4.0 to 24.4.2 in /lib/spack/docs (#43878)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-06 09:55:43 +02:00
Massimiliano Culpo
a5f62889ca archspec: add v0.2.4 (#44022) 2024-05-06 09:51:01 +02:00
Alec Scott
2a942d98e3 pkgconf: add v2.2.0 (#44008) 2024-05-06 09:31:41 +02:00
Alec Scott
4a4077d4ef rust: add v1.78.0 (#44012) 2024-05-06 09:30:34 +02:00
Alec Scott
c0fcccc232 go: add v1.22.2 (#44013) 2024-05-06 09:30:13 +02:00
Alec Scott
0b2cbfefce ncurses: add v6.5 (#44015) 2024-05-06 09:30:00 +02:00
Massimiliano Culpo
c499514322 libgpg-error: add v1.49 (#44023) 2024-05-06 09:28:31 +02:00
Alec Scott
ae392b5935 pcre2: add v10.43 (#44020) 2024-05-06 09:25:11 +02:00
Alec Scott
62e9bb5d51 libunistring: add v1.2 (#44018) 2024-05-06 09:24:12 +02:00
Alec Scott
6cd948184e libidn2: add v2.3.7 (#44017) 2024-05-06 09:23:50 +02:00
Alec Scott
44ff24f558 openssl: v3.3.0 (#44019) 2024-05-06 09:22:15 +02:00
Alec Scott
c657dfb768 gettext: v0.22.5 (#44014) 2024-05-05 06:57:42 -06:00
Pranav Sivaraman
f2e410d95a lazygit: add v0.41.0 (#43903) 2024-05-04 17:18:30 -06:00
dependabot[bot]
df443a38d6 build(deps): bump actions/checkout from 4.1.2 to 4.1.4 (#43831)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.2 to 4.1.4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](9bb56186c3...0ad4b8fada)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:49:26 -07:00
dependabot[bot]
74fe498cb8 build(deps): bump mypy from 1.9.0 to 1.10.0 in /lib/spack/docs (#43834)
Bumps [mypy](https://github.com/python/mypy) from 1.9.0 to 1.10.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/1.9.0...v1.10.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:48:49 -07:00
Carlos Bederián
5f13a48bf2 mesa: add 23.3.6 (#43736) 2024-05-04 15:46:39 -07:00
Jon Rood
c4824f7fd2 ccls: add new versions (#43923) 2024-05-04 15:42:07 -07:00
dependabot[bot]
49a8634584 build(deps): bump codecov/codecov-action from 4.3.0 to 4.3.1 (#43945)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](84508663e9...5ecb98a3c6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:39:42 -07:00
Seth Bromberger
eac5ea869f tmux: add v3.4 and missing yacc build dep (#43975) 2024-05-04 22:33:19 +02:00
Juan Miguel Carceller
f5946c4621 pythia8: Add a gzip variant and filter some compiler wrapper paths (#43807)
* Add a gzip variant and filter some compiler wrapper paths

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-04 09:33:44 -05:00
Harmen Stoppels
8564ab19c3 fix iconv from libc (#43996)
* fix iconv from libc

* fix args in glib
2024-05-04 10:30:43 +02:00
Scott Wittenburg
aae7a22d39 gitlab: release branch pipelines rebuild what changed (#43990) 2024-05-04 10:11:48 +02:00
eugeneswalker
09cea230b4 e4s-alc: add v1.0.2 (#44001) 2024-05-03 22:24:08 -06:00
wspear
a1f34ec58b Add a gmake dependency for TAU (#43870)
We discovered a case where the system gmake can get confused by spack's build environment. Let's use a spack-provided gmake for consistent builds.
2024-05-03 20:15:40 -07:00
Auriane R
4d7cd4c0bf Add py-flash-attn@2.4.2 (#43960) 2024-05-03 16:40:46 -07:00
Christopher Christofi
4adbfa3835 fix package permissions docs link for gaussian packages (#43998) 2024-05-03 17:35:39 -06:00
Adam J. Stewart
8a1b69c1d3 Modernize py-torch-geometric and dependencies (#43984)
* Modernize py-torch-geometric and dependencies
* Forgot one maintainer
2024-05-03 16:21:41 -07:00
Matthew Thompson
a1d69f8661 hdf package: fix building with apple-clang@15: (#43964) 2024-05-03 16:11:55 -07:00
Carlos Bederián
e05dbc529e globalarrays package: add missing dependencies for armci=ofi/openib (#43971) 2024-05-03 16:08:29 -07:00
jdomke
99d33bf1f2 hpl: linking against fujitsu ssl2 if available (#43978)
* linking against fujitsu ssl2 if available

---------

Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:08:12 -07:00
jdomke
bd1918cd71 adding clang compiler checks which behaves exactly like aocc (#43976)
Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:01:14 -07:00
snehring
2a967c7df4 graphicsmagick package: add version 1.3.43 (#43977) 2024-05-03 15:59:10 -07:00
downloadico
7596aac958 New package: py-pylith (#43987) 2024-05-03 15:56:21 -07:00
Frédéric Simonis
c73ded8ed6 preCICE: increase baseline versions for next ver (#43985) 2024-05-03 15:55:46 -07:00
Satish Balay
df1d783035 sowing: add @master (#43991) 2024-05-03 15:37:39 -07:00
Massimiliano Culpo
47051c3690 Add a default provider for armci (#43997) 2024-05-03 23:48:58 +02:00
Owen Solberg
3fd83c637d Add checksum for R v4.4.0 (#43973)
Co-authored-by: Owen Solberg <owen.solberg@sana.com>
2024-05-03 14:47:53 -07:00
Jon Rood
ef5afb66da openfast: updates to package (#43994) 2024-05-03 15:22:12 -06:00
snehring
ecc4336bf9 r-asreml: adding new package (#43970)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-03 12:20:58 -07:00
Harmen Stoppels
d2ed217796 concretizer args: --fresh-roots == --reuse-deps (#43988)
Since reuse is the default now, `--reuse-deps` can be confusing, as it
technically does not imply roots are fresh.

So add `--fresh-roots`, which is also easier to discover when running
`spack concretize --fre<tab>`
2024-05-03 12:12:36 -07:00
Dom Heinzeller
272c7c069a Add -fPIC to hdf-eos2 builds (#43794)
* Add -fPIC to hdf-eos2 builds
* Use self.compiler.cc_pic_flag instead of -fPIC (see the sketch after this list)
* Fix style in var/spack/repos/builtin/packages/hdf-eos2/package.py
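
A hedged sketch of the portable-PIC approach mentioned above, assuming an Autotools-style package; illustrative only, not the actual package change:

```python
from spack.package import *


class HdfEos2(AutotoolsPackage):
    # Sketch: ask the compiler for its PIC flag instead of hard-coding -fPIC.
    def flag_handler(self, name, flags):
        if name == "cflags":
            flags.append(self.compiler.cc_pic_flag)
        # inject the flags through the compiler wrapper
        return (flags, None, None)
```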
2024-05-03 11:38:33 -07:00
Jeff Hammond
23f16041cd NWChem ARMCI-MPI variant and optional TCE builds (#43883)
* WIP NWChem with ARMCI-MPI and TCE optional
* rename armci-mpi to armcimpi
* add version for git
* add ARMCI-MPI support to NWChem package
   EXTERNAL_ARMCI_PATH needs to be set to the ARMCI-MPI package install prefix.
   I could not get it from spec["armcimpi"], but it worked to use the dependent build environment method to export a generic environment variable instead (see the sketch at the end of this list).
* I suppose I can maintain NWChem as well
* check rejects an option that dozens of packages use
   I do not have time to fight with this nonsense
   Run . share/spack/setup-env.sh
   ==> Error: armcimpi version 'master' has extra arguments: 'branch'
   Valid arguments for a url fetcher are:
       'url', 'sha256', 'md5', 'sha1', 'sha224', 'sha384', 'sha512', and 'checksum'
   Error: Process completed with exit code 1.
* style
* ARMCI-MPI needs depends_on; the rest seems fixed
* fix ARMCI selection per review feedback from @zzzoom
   https://github.com/spack/spack/pull/43883#discussion_r1585014147
* address reviewer feedback from @yizeyi18
   https://github.com/spack/spack/pull/43883#discussion_r1587228084
   elaborate on what +extratce does, in terms of the NWChem build environment variables,
   some of which are documented on https://nwchemgit.github.io/TCE.html.
* style
* Update var/spack/repos/builtin/packages/nwchem/package.py
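
A hedged sketch of the dependent-build-environment approach mentioned in the list above; only the EXTERNAL_ARMCI_PATH variable name comes from this message, the rest is illustrative:

```python
from spack.package import *


class Armcimpi(AutotoolsPackage):
    # Sketch: export the install prefix so a dependent (e.g. NWChem)
    # can pick it up from its build environment.
    def setup_dependent_build_environment(self, env, dependent_spec):
        env.set("EXTERNAL_ARMCI_PATH", self.prefix)
```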

---------

Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Carlos Bederián <4043375+zzzoom@users.noreply.github.com>
2024-05-03 11:13:54 -07:00
Frank Willmore
e2329adac0 Update package.py for vacuumms 1.2.0 (#43948)
* updating vacuumms for new release
* formatting
* re-order versions and remove triple quote
2024-05-03 11:06:15 -07:00
jalcaraz
4ec788ca12 [TAU package] Update with +rocprofv2. Updated some tests. (#43937)
* [TAU package] Update with +rocprofv2. Updated some tests.

Updated with the +rocprofv2 flag. Only works with rocm-core >= 6.0.0.

Can be tested (on Omnia) with:
spack install tau@master +rocm+rocprofv2 %gcc@11
This needs the last commit in our local repository, which can be obtained by modifying the "git = " line, or by waiting until the public repository is updated.


If the tests cause issues when building TAU, there is the flag:
disable_tests = False

The rocm test is disabled by default, as the PR regarding tests loading dependencies has not been resolved yet (PR #43682).

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-05-03 08:33:13 -07:00
Andrew W Elble
c1cea9d304 py-tensorflow: fix aarch64 build (#43852)
* py-tensorflow: fix aarch64 build

* [@spackbot] updating style on behalf of aweits

* patch format

* change patch strategy, actually get the logic correct in
the patch

* only patch 2.16.1 onwards for aarch64

* __CUDACC__ not __NVCC__

* !(defined(__NVCC__) && defined(__CUDACC__))

---------

Co-authored-by: aweits <aweits@users.noreply.github.com>
2024-05-03 15:46:13 +02:00
Massimiliano Culpo
5c96e67bb1 Make runtimes depend on libc only on linux (#43981) 2024-05-03 15:40:02 +02:00
Marc T. Henry de Frahan
7008bb6335 Add rosco package (#43822)
* Add rosco package

* format

* remove unused

* Add in other versions

* start adding some patches

* finish adding the patches
2024-05-03 01:58:35 -06:00
Adam J. Stewart
14561fafff py-torch: set TORCH_CUDA_ARCH_LIST globally for dependents (#43962) 2024-05-03 09:11:59 +02:00
Massimiliano Culpo
89bf1edb6e Spec.satisfies: fix a bug with concrete spec from JSON (#43968)
Fix a bug triggered by missing a virtual on some transitive edge, in a subdag of a pure build dependency.
2024-05-02 22:16:02 -04:00
Todd Gamblin
cc85dc05b7 docs: re-enable google analytics (#43974)
We recently switched to using the new ReadTheDocs with "addons". That includes its own
analytics, which is nice, but we also want to continue using our GA4 analytics.

Adding GA4 is no longer supported by RTD, so we have to add it manually.

- [x] re-add the gtag to all pages, manually

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-02 21:56:19 -04:00
Adam J. Stewart
ae171f8b83 py-torch: conflict with newer cuda (#43969) 2024-05-02 18:47:33 -07:00
snehring
578dd18b34 minimap2: adding version 2.28 (#43972)
* minimap2: adding version 2.28
* minimap2: adding maintainer

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-02 18:45:59 -07:00
Garth N. Wells
a7a51ee5cf py-fenics-ffcx: update to v0.8 (#43929)
* Update FFCx and UFCx to v0.8
* Syntax fix
2024-05-02 09:53:31 -07:00
Stephanie Labasan Brink
960cc90667 variorum: add branch dev and version 0.8.0 (#43829)
* variorum: add branch dev and version 0.8.0
* formatting
2024-05-02 09:50:05 -07:00
Alex Richert
dea44bad8b grads: fix libpng and g2c deps (#43909)
* grads: fix libpng and g2c deps
* Update package.py
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:42:15 -07:00
Richard Berger
e37870ff43 rdma-core: add versions 51.0, 50.0 and 49.1 (#43926) 2024-05-02 09:41:23 -07:00
Sajid Ali
3751642a27 mold: add new versions (#43931) 2024-05-02 09:39:09 -07:00
Nils Vu
0f386697c6 spectre: add versions, update constraints (#43936) 2024-05-02 09:38:06 -07:00
Alex Richert
67ce103b2c Update prod-util recipe (#43940)
* update prod-util recipe
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:24:39 -07:00
Alex Richert
a8c9fa0e45 g2tmpl: add v1.12.0 (#43941) 2024-05-02 09:23:42 -07:00
Alex Richert
b56a133fce update grib-util recipe (#43939) 2024-05-02 09:22:03 -07:00
Seth R. Johnson
f0b3d33145 Celeritas: new version 0.4.3 (#43949) 2024-05-02 09:04:30 -07:00
Adam J. Stewart
32564da9d0 rust: add conflict for older GCC (#43958) 2024-05-02 08:36:06 -07:00
Stephen Hudson
8f2faf65dc libEnsemble: add v1.3.0 (#43944) 2024-05-02 08:35:06 -07:00
eugeneswalker
1d59637051 e4s ci: add py-amrex (#43904) 2024-05-02 08:57:26 -06:00
jdomke
97dc353cb0 libc: detect ARM flavor (#43959)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-05-02 15:59:29 +02:00
wspear
beebe2c9d3 Accept later pythons for tau@2.33+ (#43911)
* Accept later pythons for tau@2.33+

* separate forward compat bounds
2024-05-02 07:13:21 -06:00
Harmen Stoppels
2eb7add8c4 gcc: write builtin specs before modifications (#43956) 2024-05-02 06:40:10 -06:00
Harmen Stoppels
a9fea9f611 nghttp2: add diffutils (#43954) 2024-05-02 14:22:53 +02:00
Harmen Stoppels
9b62a9c238 gitlab ci: cache user cache (#43952) 2024-05-02 12:06:30 +02:00
wspear
f7eb0ccfc9 Revert #43701 (#43935)
PR #43701 is broken, preventing scorep from being installed. As explained in #43700, the issue is internal to scorep and cannot be fixed in the package, at least not by the method attempted here.
2024-05-02 09:56:21 +02:00
Adrien Bernede
a0aa35667c Ignore external packages when pushing to buildcache automatically (--autopush) (#43930) 2024-05-02 09:50:10 +02:00
Luc Berger
b1d4fd14bc Nalu: adding support for Trilinos 14.2.0 for Nalu 1.6.0 (#43857) 2024-05-02 01:10:27 -06:00
John W. Parent
7e8415a3a6 Windows: auto-add WGL/SDK as externals (#43752)
Adds a pre-concretization check for the Windows SDK and WGL (Windows
GL) packages as non-buildable externals.

This is a redo of https://github.com/spack/spack/pull/43459, but makes
sure to modify the configuration scope outside of the bootstrap scope:
whichever is highest-precedence in the user's environment at the time
the concretization runs, which should either be an env scope or the
~ scope.

Adds pytest fixture mocking the check for WGL and WSDK as if they were
present.
2024-05-02 01:05:03 -06:00
Simon Pintarelli
7f4f42894d update sirius package.py (#43872) 2024-05-02 08:23:10 +02:00
Massimiliano Culpo
4e876b4014 Allow more control over which specs are reused (#42782)
This PR gives users finer control over which specs are reused during concretization.

The value of the `concretizer:reuse` config option now can take an object with the following properties:
- `roots`: true if reusing roots, false if reusing just dependencies
- `exclude`: list of constraints used to select reusable specs 
- `include`: list of constraints used to select reusable specs 
- `from`: allows to select the sources of reused specs

### Examples

#### Reuse only specs compiled with GCC
```yaml
concretizer:
  reuse:
    roots: true
    include:
    - "%gcc"
```

#### `openmpi` must be used from externals, and it must be the only external used
```yaml
concretizer:
  reuse:
    roots: true
    from:
    - type: local
      exclude:
      - "openmpi"
    - type: buildcache
      exclude:
      - "openmpi"
    - type: external
      include:
      - "openmpi"
```
2024-05-01 23:05:26 -04:00
Wouter Deconinck
77a8a4fe08 abseil-cpp: new version 20240116.2 (#43666) 2024-05-01 17:06:57 -07:00
Thomas Bouvier
597e5a4e5e Update py-nvidia-dali to v1.36.0 (#43928)
* py-nvidia-dali: add v1.36.0
* fix: do not expand downloaded wheel
2024-05-01 16:35:43 -07:00
MatthewLieber
3c31c32f62 Adding sha for 7.4 release of OSU Micro Benchmarks (#43899)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-05-01 16:27:52 -07:00
Weiqun Zhang
3a93a716e4 amrex: add v24.05 (#43938) 2024-05-01 16:17:05 -07:00
Filippo Spiga
82229a0784 Adding EXTRAE 4.1.2 (#43907) 2024-05-01 15:55:54 -07:00
Axel Huebl
5d846a69d1 ADIOS2: Campaign Variant (#43906)
With v2.10+, ADIOS added a campaign manager. This is auto-enabled
if SQLite3 is found.

Add explicit control for it now and disable it by default, to avoid
picking up system dependencies or bloating the default ADIOS2
dependencies. It is also not yet mature enough to be used by default:
https://github.com/ornladios/ADIOS2/issues/4148
2024-05-01 15:50:45 -07:00
Stephen Herbener
d21aa1cc12 Removed use of mpi wrappers for fms and mapl package.py scripts. These were causing (#43726)
builds to fail on macOS, and CMake appears to handle the build fine without the mpi wrappers.
2024-05-01 15:41:30 -07:00
Jake Koester
7896ff51f6 bumped up version of kokkos-kernels (#43920) 2024-05-01 12:20:51 -06:00
Chris Mc
5849a24a74 jwt-cpp: add v0.7.0 (#41894)
* jwt-cpp: add version 0.7.0

* Update var/spack/repos/builtin/packages/jwt-cpp/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-01 12:20:35 -06:00
Alex Leute
38c49d6b82 py-simpy: New package (#43497)
* pypi build of py-simpy

* [py-simpy] toml explicitly called out

* py-simpy: New version

* py-setuptools-scm: Added new version

* py-setuptools-scm: add url_for_version

Versions @:7 have an underscore (setuptools_scm) in the URL, while newer
versions have a dash (setuptools-scm); see the sketch after this list.

* py-setuptools-scm: fix flake8 line too long

* py-simpy: Fix hash
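
A hedged sketch of such a url_for_version override; the URL layout and version cutoff below are illustrative assumptions, not copied from the real package:

```python
from spack.package import *


class PySetuptoolsScm(PythonPackage):
    # Sketch: pick underscore vs dash in the sdist name based on the version.
    def url_for_version(self, version):
        name = "setuptools_scm" if version < Version("8") else "setuptools-scm"
        return f"https://pypi.io/packages/source/s/setuptools-scm/{name}-{version}.tar.gz"
```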

---------

Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
2024-05-01 12:06:52 -06:00
fpruvost
0d8900986d qrmumps: 3.1 (#43913) 2024-05-01 12:00:25 -06:00
dependabot[bot]
62554cebc4 build(deps): bump black in /.github/workflows/style (#43879)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:37:54 +02:00
Wouter Deconinck
067155cff5 containers: add ubuntu 24.04 (#43881)
* containers: add ubuntu 24.04

* containers: use python3-boto3 pkg instead of pip install
2024-05-01 13:37:13 +02:00
Wouter Deconinck
08e68d779f ci: update upload-artifact to v4 (in build-containers) (#43880) 2024-05-01 13:35:12 +02:00
Adam J. Stewart
05b04cd4c3 Update PyTorch ecosystem (#43823)
* Update PyTorch ecosystem

* Don't expose protobuf namespace to other packages, breaks torchtext build

* Revert "Don't expose protobuf namespace to other packages, breaks torchtext build"

This reverts commit 50a4b08f65.
2024-05-01 13:30:46 +02:00
dependabot[bot]
be48f762a9 build(deps): bump pytest from 8.1.1 to 8.2.0 in /lib/spack/docs (#43908)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.1.1 to 8.2.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.1.1...8.2.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:29:22 +02:00
Axel Huebl
de5b4840e9 Add quantiphy (#43894)
* Add quantiphy

Add https://github.com/KenKundert/quantiphy .

* Simplify Version Range
2024-04-30 15:26:35 -07:00
Harmen Stoppels
20f9884445 curl: perl is temporarily required (#43921) 2024-04-30 23:32:06 +02:00
Harmen Stoppels
deb78bcd93 re2c: depends on gmake (#43916)
regressed in 650a668a9d
2024-04-30 23:30:36 +02:00
Garth N. Wells
06239de0e9 py-fenics-ufl: add new version (#43848)
* Bump version to 2024.1.0
* Bump to .post0
* Dependency fix
* Format file
* Run black on package file
2024-04-30 14:10:55 -07:00
Garth N. Wells
1f904c38b3 Add version 1.9.2. (#43838) 2024-04-30 12:19:31 -07:00
Wouter Deconinck
f2d0ba8fcc PackageStillNeededError: add pkg that needs spec to exception msg (#43845)
* PackageStillNeededError: add pkg that needs spec to exception msg
* PackageStillNeededError: f-string with short fmt and hash
* PackageStillNeededError: split long string
2024-04-30 12:11:47 -07:00
Satish Balay
49d3eb1723 petsc, py-petsc4py: add v3.21.1 (#43887) 2024-04-30 10:07:33 -07:00
Dom Heinzeller
7c5439f48a Update crtm-fix and crtm from JCSDA fork (#43855) 2024-04-30 10:07:16 -07:00
Harmen Stoppels
7f2cedd31f hack: drop glibc and musl in old concretizer (#43914)
The old concretizer creates a cyclic graph when expanding virtuals for
`iconv`, which is a bug. This hack drops glibc and musl as possible
providers for `iconv` in the old concretizer to work around it.
2024-04-30 13:55:48 +02:00
Harmen Stoppels
d47951a1e3 glibc: provides iconv (#43897)
`iconv` is a bit of a weird virtual because the only shared API between
`glibc` and `libiconv` is:

```
iconv
iconv_open
iconv_close
```

whereas `libiconv` has further symbols [iconvctl](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconvctl.3.html), [iconv_open_into](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconv_open_into.3.html), and an `iconv` executable and `libcharset.so`. Packages that need those will have to do `depends_on("[virtuals=iconv] libiconv")`.
2024-04-30 07:40:00 +02:00
Teague Sterling
f2bd0c5cf1 Fix duckdb version handling again (#43762)
* Added package to build Ollama
* Update package.py
   Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
* We can now use OVERRIDE_GIT_DESCRIBE to set the version in DuckDB
* Update duckdb/package.py
   - use f-string
   - fix version specs to address inconsistencies
* Update package.py
   Fix spec definitions further
* [@spackbot] updating style on behalf of teaguesterling

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-29 18:51:04 -07:00
one
4362382223 Update package.py (#43764) 2024-04-29 18:49:49 -07:00
Nathalie Furmento
ba4859b33d starpu: add release 1.4.5 and dependency for building starpupy (#43810) 2024-04-29 18:48:37 -07:00
Carlos Bederián
e8472714ef ucc: add 1.3.0 (#43828) 2024-04-29 18:45:00 -07:00
Mikael Simberg
ee6960e53e boost: Add 1.85.0 (#43788)
* boost: Add 1.85.0
* Add conflict for Boost 1.85.0 stacktrace change
2024-04-29 18:41:38 -07:00
Nathalie Furmento
dad266c955 starpu: add branch 1.4 (#43837) 2024-04-29 18:38:43 -07:00
Garth N. Wells
7a234ce00a (py-)fenics-basix: update for new version (#43841)
* Bump version to 2024.1.0
* Bump py-basix version
* Revert UFL change
* Python version update
2024-04-29 18:11:36 -07:00
Loris Ercole
a0c2ed97c8 Update scine-qcmaquis (#43817)
`scine-qcmaquis` is updated to version 3.1.4.
The option to build the OpenMolcas interface is added, and some
dependencies are clarified.
2024-04-29 16:38:29 -07:00
Rémi Lacroix
a3aa5b59cd dotnet-core-sdk: Fix environment setup (#43842)
The "DOTNET_CLI_TELEMETRY_OPTOUT" environment variable should be defined when using the product, not when installing it (the installation phase is just extract the files anyway).

Also use `~` instead of `-` to check for the variant and fix the second argument for `env.set`  which should also be a string.
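
A hedged sketch of what the fixed setup could look like; the variant name `telemetry` is an assumption for illustration:

```python
from spack.package import *


class DotnetCoreSdk(Package):
    # Sketch only: the variant name is assumed.
    variant("telemetry", default=False, description="Allow the .NET CLI to send telemetry")

    def setup_run_environment(self, env):
        if self.spec.satisfies("~telemetry"):
            # set at use time, not install time, and pass a string value
            env.set("DOTNET_CLI_TELEMETRY_OPTOUT", "1")
```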
2024-04-29 14:47:37 -07:00
Zach Jibben
f7dbb59d13 Update Petaca (#43849)
* Add recent petaca releases
* Add Petaca 24.04
2024-04-29 14:42:11 -07:00
Wouter Deconinck
0df27bc0f7 nopayloadclient: new package (#43853) 2024-04-29 14:16:57 -07:00
Massimiliano Culpo
877e09dcc1 Delete leftover file (#43869)
This file is not needed anymore; it was introduced in #19688.
2024-04-29 22:45:41 +02:00
Adam J. Stewart
c4439e86a2 Prettier: add new package (#43866) 2024-04-29 13:39:52 -07:00
Pramod Kumbhar
aa00dcac96 gprofng-gui: new package (#43873) 2024-04-29 13:38:48 -07:00
Adam J. Stewart
4c9a946b3b py-keras: add v3.3.3 (#43885) 2024-04-29 13:30:52 -07:00
Wouter Deconinck
0c6e6ad226 xbae: new package (#43889) 2024-04-29 13:05:56 -07:00
Adam J. Stewart
bf8f32443f py-einops: add v0.8.0 (#43890) 2024-04-29 12:54:33 -07:00
Wouter Deconinck
c2eef8bab2 fontconfig: depends_on gperf when 2.11.1: (#43892) 2024-04-29 11:56:59 -07:00
afzpatel
2df4b307d7 hipblaslt: new package (#43846)
* initial commit to add hipblaslt package
* remove master and update patch
* add docstring comment
2024-04-29 11:48:03 -07:00
Ryan Marcellino
3c57440c10 openjdk: add v21_35 (#40699)
* openjdk: add v21+35

* add provides java@21

* Update var/spack/repos/builtin/packages/openjdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-04-29 10:51:17 -05:00
Harmen Stoppels
3e6e9829da compiler.py: fix early return (#43898) 2024-04-29 08:53:27 -06:00
Gregory Becker
859745f1a9 Run audits on windows
Add debug log for external detection tests. The debug log
is used to print which test is being executed.

Skip version audit on Windows where appropriate
2024-04-29 14:13:10 +02:00
Massimiliano Culpo
ddabb8b12c Fix concretization when installing missing compilers (#43876)
Restore the previous behavior when config:install_missing_compilers
is True. The libc of the missing compiler is inferred from the
Python process.
2024-04-29 08:20:33 +02:00
Jeff Hammond
16bba32124 add ILP64 option for BLIS (#43882)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
2024-04-29 08:14:25 +02:00
Michael Kuhn
7d87369ead autoconf: fix typo in m4 dependencies (#43893)
m4 1.4.8 is actually required starting with autoconf 2.72 according to
the NEWS file.
2024-04-28 18:34:12 -05:00
Adam J. Stewart
7723bd28ed Revert "package/npm update (#43692)" (#43884)
This reverts commit 03a074ebe7.
2024-04-27 08:58:12 -06:00
Harmen Stoppels
43f3a35150 gcc: generate spec file and fix external libc default paths after install from cache (#43839)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-27 08:49:20 -06:00
Jonathon Anderson
ae9f2d4d40 containers: Add Fedora 40, 39 (#43847) 2024-04-26 20:02:04 -05:00
Huston Rogers
5a3814ff15 Miniconda3: Added Versions: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0 (#43868)
* Added Versions of miniconda3: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0

* fixed style

---------

Co-authored-by: James H. Rogers <jhrogers@spear.hpc.msstate.edu>
2024-04-26 18:58:58 -06:00
Sakib Rahman
946c539dbd osg-ca-certs: new version osg 1.119 and igtf 1.128 (#43871)
* Update package.py to osg 1.119 and igtf 1.128

* Remove unnecessary white space

* Add missing comma

* change url to use git and use the commit hash from the repository instead of the sha256sum of the tar.gz

* [@spackbot] updating style on behalf of rahmans1

---------

Co-authored-by: rahmans1 <rahmans1@users.noreply.github.com>
2024-04-26 18:39:12 -06:00
Robert Cohn
0037462f9e [SOS] update license (#43864)
Auto-generated license was wrong
2024-04-26 18:45:29 +02:00
Massimiliano Culpo
e5edac4d0c ASP-based solver: update os compatibility for macOS (#43862) 2024-04-26 18:34:46 +02:00
Jeff Hammond
3e1474dbbb ARMCI-MPI: add new package (#43813)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-26 16:35:33 +02:00
Abhishek Yenpure
0f502bb6c3 libcatalyst: add v2.0.0 (#43804) 2024-04-26 16:27:46 +02:00
Robert Cohn
1eecbd3208 [intel-oneap-*] no redistribution (#43826) 2024-04-26 12:03:39 +00:00
Jack Morrison
6e92b9180c Add tests-sos package + Variants to sos package (#43830)
* * Add initial tests-sos package
* Remove failing call of missing setup_compiler_environment from sos
  package
* Add several variants for sos package

* [@spackbot] updating style on behalf of jack-morrison
2024-04-26 07:26:21 -04:00
Massimiliano Culpo
ac9012da0c spack audit externals: allow selecting platforms and checking extra attributes (#43782) 2024-04-26 12:47:17 +02:00
Mosè Giordano
e3cb4f09f0 julia: add v1.10.2 (#41151)
* julia: add v1.10.2

* julia: add patch to remove suite-sparse cuda stub files

* julia: use permalinks for patches
2024-04-26 12:44:54 +02:00
Harmen Stoppels
2e8600bb71 gcc: simplify specs file, make binutils locatable (#43861) 2024-04-26 01:30:51 -06:00
Harmen Stoppels
d946c37cbb ldflags=* are compiler flags, not linker flags (#43820)
We run `extend spack_flags_list SPACK_LDFLAGS` for `$mode in ld|ccld`.

That's problematic, because `ccld` needs `-Wl,--flag` whereas `ld` needs
`--flag` directly. Only `-L` and `-l` are common to the compiler & linker.

In all build systems `LDFLAGS` is for the compiler, not the linker, because
any linker flag `-x` can be passed as a compiler flag `-Wl,-x`, and there
are many compiler flags that affect the linker invocation, like `-fopenmp`,
`-fuse-ld=`, `-fsanitize=`, etc.

So don't pass `LDFLAGS` to the linker directly.

This way users can set `ldflags: -Wl,--allow-shlib-undefined` in compilers.yaml
to work around an issue where the linker tries to resolve the `libcuda.so.1`
stub lib which cannot be located by design in `cuda`.
2024-04-26 09:19:03 +02:00
Harmen Stoppels
47a9f0bdf7 llvm: use --gcc_install_dir in config files (#43795) 2024-04-26 09:01:02 +02:00
Alex Richert
2bf900a893 Update package.py (#43836) 2024-04-25 16:00:43 -07:00
Andrey Perestoronin
99bba0b1ce Add intel-oneapi-mpi 2021.12.1 patch package (#43850)
* Add intel-oneapi-mpi package

* Fix style
2024-04-25 13:38:47 -06:00
Juan Miguel Carceller
a8506f9022 glew package: add ld flags when compiling with ^apple-gl (#43429) 2024-04-25 11:18:00 -07:00
Harmen Stoppels
4a40a76291 build_environment.py: expand SPACK_MANAGED_DIRS with realpath (#43844) 2024-04-25 13:33:50 +02:00
downloadico
fe9ddf22fc spatialdata: add spatialdata package to spack (#43500) 2024-04-25 04:18:08 -07:00
Adam J. Stewart
1cae1299eb CI: remove ML ROCm stack (#43825) 2024-04-25 12:56:59 +02:00
Adam J. Stewart
8b106416c0 py-lightning: add v2.2.3 (#43824) 2024-04-25 12:03:01 +02:00
jalcaraz
e2088b599e [TAU Package] Updates for rocm (#43790)
* Updates for rocm

Updated for rocm@6
Added conflict between rocprofiler and roctracer.
Either +rocprofiler or +roctracer is required when +rocm; in that case, it now automatically builds with one of them instead of displaying a message.
+rocm is required when using either +rocprofiler or +roctracer; in that case, it now automatically builds with +rocm instead of displaying a message.

Disabled the tests. Will update them with the new test method.

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-04-24 16:35:41 -07:00
Ken Raffenetti
56446685ca mpich: add 4.2.0 release (#42687) 2024-04-24 12:33:59 -07:00
Teo
47a8d875c8 update halide versions; sync llvm (#43793) 2024-04-24 12:27:42 -07:00
Adam J. Stewart
56b2d250c1 py-keras: add v3.3.1-2 (#43798) 2024-04-24 12:23:34 -07:00
David Boehme
abbd09b4b2 Add Caliper v2.11 (#43802) 2024-04-24 12:21:56 -07:00
Rémi Lacroix
9e5fdc6614 dotnet-core-sdk: Add versions 7.0.18 and 8.0.4 (#43814) 2024-04-24 12:19:33 -07:00
Harmen Stoppels
1224a3e8cf clang.py: detect flang-new (#43815)
If a flang-new exists, which is rather unlikely, it probably means the
user wants it as a fortran compiler.
2024-04-24 19:11:02 +02:00
Jake Koester
6c3218920f Trilinos: update kokkos dependency (#43785)
* fix so trilinos@master uses correct kokkos (@4.3.00)

* Update var/spack/repos/builtin/packages/trilinos/package.py
2024-04-24 10:42:08 -06:00
Peter Scheibel
02cc3ea005 Add new redistribute() directive (#20185)
Some packages can't be redistributed in source or binary form. We need an explicit way to say that in a package.

This adds a `redistribute()` directive so that package authors can write, e.g.:

```python
    redistribute(source=False, binary=False)
```

You can also do this conditionally with `when=`, as with other directives, e.g.:

```python
    # 12.0 and higher are proprietary
    redistribute(source=False, binary=False, when="@12.0:")

    # can't redistribute when we depend on some proprietary dependency
    redistribute(source=False, binary=False, when="^proprietary-dependency")
```


This prevents Spack from adding either their sources or binaries to public mirrors and build caches. You can still unconditionally add things *if* you run either:
* `spack mirror create --private`
* `spack buildcache push --private`

But the default behavior for build caches is not to include non-redistributable packages in either mirrors or build caches.  We have previously done this manually for our public buildcache, but with this we can start maintaining redistributability directly in packages.

Caveats: currently the default for `redistribute()` is `True` for both `source` and `binary`, and you can only set either of them to `False` via this directive.

- [x] add `redistribute()` directive
- [x] add `redistribute_source` and `redistribute_binary` class methods to `PackageBase`
- [x] add `--private` option to `spack mirror`
- [x] add `--private` option to `spack buildcache push`
- [x] test exclusion of packages from source mirror (both as a root and as a dependency)
- [x] test exclusion of packages from binary mirror (both as a root and as a dependency)
2024-04-24 09:41:03 -07:00
John W. Parent
641ab95a31 Revert "Windows: add win-sdk/wgl externals during bootstrapping (#43459)" (#43819)
This reverts commit 9e2558bd56.
2024-04-24 18:24:28 +02:00
Adam J. Stewart
e8b76c27e4 py-matplotlib: drop test dep (#43765)
test dependencies constrain build / link type deps, so avoid that
2024-04-24 18:03:13 +02:00
Massimiliano Culpo
0dbe4d54b6 Tune default compiler preferences (#43805)
Add `%oneapi`, remove compilers that have been discontinued upstream
2024-04-24 18:00:40 +02:00
Harmen Stoppels
1eb6977049 rust: use system libs (#43797) 2024-04-24 09:46:11 -06:00
Harmen Stoppels
3f1cfdb7d7 libc: from current python process (#43787)
If there's no compiler we currently don't have any external libc for the solver.

This commit adds a fallback on libc from the current Python process, which works if it is dynamically linked.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-24 05:10:48 -06:00
Massimiliano Culpo
d438d7993d nf-*-cli: fix typo in conditional (#43806)
The default runner changed on GitHub for macOS, and that
revealed a bug in a package when running audits
2024-04-24 11:30:35 +02:00
Todd Gamblin
aa0825d642 Refactor to improve spec format speed (#43712)
When looking at where we spend our time in solver setup, I noticed a fair bit of time is spent
in `Spec.format()`, and `Spec.format()` is a pretty old, slow, convoluted method.

This PR does a number of things:
- [x] Consolidate most of what was being done manually with a character loop and several
      regexes into a single regex.
- [x] Precompile regexes where we keep them 
- [x] Remove the `transform=` argument to `Spec.format()` which was only used in one 
      place in the code (modules) to uppercase env var names, but added a lot of complexity
- [x] Avoid escaping and colorizing specs unless necessary
- [x] Refactor a lot of the colorization logic to avoid unnecessary object construction
- [x] Add type hints and remove some spots in the code where we were using nonexistent
      arguments to `format()`.
- [x] Add trivial cases to `__str__` in `VariantMap` and `VersionList` to avoid sorting
- [x] Avoid calling `isinstance()` in the main loop of `Spec.format()`
- [x] Don't bother constructing a `string` representation for the result of `_prev_version`
      as it is only used for comparisons.

In my timings (on all the specs formatted in a solve of `hdf5`), this is over 2.67x faster than the 
original `format()`, and it seems to reduce setup time by around a second (for `hdf5`).
2024-04-23 10:52:15 -07:00
Greg Becker
978c20f35a concretizer: update reuse: default to True (#41302) 2024-04-23 17:42:14 +02:00
Marc T. Henry de Frahan
d535124500 Update amr-wind package with versions (#43728) 2024-04-23 05:07:42 -06:00
Harmen Stoppels
01f61a2eba Remove import distro from packages and docs (#43772) 2024-04-23 12:47:33 +02:00
Massimiliano Culpo
7d5e27d5e8 Do not detect a compiler without a C compiler (#43778) 2024-04-23 12:20:33 +02:00
Harmen Stoppels
d210425eef nettle: remove openssl dep (#43770) 2024-04-23 07:44:15 +02:00
Nathalie Furmento
6be07da201 starpu: fix release 1.4.4 (#43730) 2024-04-22 15:56:23 -07:00
Filippo Spiga
02b38716bf Adding UCX 1.16.0 (#43743)
* Adding UCX 1.16.0
* Fixed hash
2024-04-22 15:52:20 -07:00
Adam J. Stewart
d7bc624c61 Tags: add more build tools (#43766)
* Tags: add more build tools
* py-pythran: add maintainer
2024-04-22 15:34:38 -07:00
Adam J. Stewart
b7cecc9726 py-scikit-image: add v0.23 (#43767) 2024-04-22 15:32:55 -07:00
Adam J. Stewart
393a2f562b py-keras: add v3.3.0 (#43783) 2024-04-22 15:04:06 -07:00
Massimiliano Culpo
682fcec0b2 zig: add v0.12.0 (#43774) 2024-04-22 15:02:45 -07:00
Harmen Stoppels
d6baae525f repo.py: drop deleted packages from provider cache (#43779)
The reverse provider lookup may have stale entries for deleted packages, which used to cause errors. It's hard to invalidate those cache entries, so this commit simply drops entries w/o invalidating the cache.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-22 19:03:44 +02:00
Kyle Knoepfel
e1f2612581 Adjust severity of irreversible operations (#43721) 2024-04-22 16:41:53 +02:00
Harmen Stoppels
080fc875eb compiler.py: reduce verbosity of implicit link dirs parsing (#43777) 2024-04-22 16:07:14 +02:00
Harmen Stoppels
69f417b26a view: dont warn about externals (#43771)
since this is the status quo on Linux now that libc is an external by default
2024-04-22 16:05:32 +02:00
Harmen Stoppels
80b5106611 bootstrap: no need to add dummy compilers (#43775) 2024-04-22 16:01:41 +02:00
Massimiliano Culpo
34146c197a Add libc dependency to compiled packages and runtime deps
This commit differentiates Linux from other platforms by
using libc compatibility as a criterion for deciding
which buildcaches / binaries can be reused. Other
platforms still use OS compatibility.

On linux a libc is injected by all compilers as an implicit
external, and the compatibility criterion is that a libc is
compatible with all other libcs with the same name and a
version that is lesser or equal.

Some concretization unit tests use libc when run on linux.
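
A toy illustration of this reuse criterion (same libc name, binary's libc version less than or equal to the host's):

```python
# Toy sketch; real Spack compares proper Version objects on spec nodes.
def libc_compatible(binary_libc, host_libc) -> bool:
    (name_b, version_b), (name_h, version_h) = binary_libc, host_libc
    return name_b == name_h and version_b <= version_h

assert libc_compatible(("glibc", (2, 31)), ("glibc", (2, 35)))      # older libc -> reusable
assert not libc_compatible(("glibc", (2, 38)), ("glibc", (2, 35)))  # newer libc required -> not reusable
assert not libc_compatible(("musl", (1, 2)), ("glibc", (2, 35)))    # different libc -> not reusable
```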
2024-04-22 15:18:06 +02:00
Harmen Stoppels
209a3bf302 Compiler.default_libc
Some logic to detect what libc the c / cxx compilers use by default,
based on `-dynamic-linker`.

The function `compiler.default_libc()` returns a `Spec` of the form
`glibc@x.y` or `musl@x.y` with the `external_path` property set.

The idea is this can be injected as a dependency.

If we can't run the dynamic linker directly, fall back to `ldd` relative
to the prefix computed from `ld.so.`
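
A rough sketch of the first step, guessing the libc family from the dynamic linker path; version detection via the linker or `ldd` is left out:

```python
import os

# Sketch only: relies on common dynamic-linker naming conventions.
def guess_libc_family(dynamic_linker: str) -> str:
    name = os.path.basename(dynamic_linker)
    if name.startswith("ld-musl"):
        return "musl"
    if name.startswith("ld-linux") or name == "ld.so":
        return "glibc"
    return "unknown"

assert guess_libc_family("/lib/ld-musl-x86_64.so.1") == "musl"
assert guess_libc_family("/lib64/ld-linux-x86-64.so.2") == "glibc"
```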
2024-04-22 15:18:06 +02:00
Harmen Stoppels
e8c41cdbcb database.py: stream of json objects forward compat (#43598)
In the future we may transform the database from a single JSON object to
a stream of JSON objects.

This paves the way for constant time writes and constant time rereads
when only O(1) changes are made. Currently both are linear time.

This commit gives just enough forward compat for Spack to produce a
friendly error when we would move to a stream of json objects, and a db
would look like this:

```json
{"database": {"version": "<something newer>"}}
```
2024-04-22 09:43:41 +02:00
Massimiliano Culpo
a450dd31fa Fix a bug preventing to set platform= on externals (#43758)
closes #43406
2024-04-22 09:15:22 +02:00
Alex Richert
7c1a309453 perl: remove mkdirp from setup_dependent_package (#43733) 2024-04-20 21:34:07 +02:00
Massimiliano Culpo
78b6fa96e5 ci.py: visit all edges (#43761) 2024-04-20 21:29:32 +02:00
Jordan Ogas
1b315a9ede mpich: add v4.2.1 (#43753) 2024-04-20 19:45:25 +02:00
Harmen Stoppels
82df0e549d compiler wrapper: prioritize spack store paths in -L, -I, -rpath (#43593)
* compiler wrapper: prioritize spack managed paths in search order

This commit partitions search paths of -L, -I (and -rpath) into three
groups, from highest priority to lowest:

1. Spack managed directories: these include absolute paths such as
   stores and the stage dir, as well as all relative paths since they
   are relative to a Spack owned dir
2. Non-system dirs: these are for externals that live in non-system
   locations
3. System dirs: your typical `/usr/lib` etc.

It's very easy for Spack to know the prefixes it owns, but much more
difficult to tell system dirs from non-system dirs. Before this commit
Spack tried to distinguish only system and non-system dirs, and failed
for very trivial cases like `/usr/lib/x/..`, which comes up often since
build systems sometimes copy search paths from `gcc -print-search-dirs`
(a toy classification sketch follows at the end of this message).

Potentially this implementation is even faster than the current state of
things, since a loop over paths is replaced with an eval'ed `case ...`.

* Trigger a pipeline

* Revert "Trigger a pipeline"

This reverts commit 5d7fa863de.

* remove redundant return statement
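
A toy classification sketch of the three groups above; the prefixes are made-up examples, and the real wrapper does this in shell with a `case` statement:

```python
import os

# Example prefixes are assumptions for illustration only.
SPACK_MANAGED = ("/opt/spack/opt/spack", "/tmp/spack-stage")
SYSTEM_DIRS = ("/lib", "/lib64", "/usr/lib", "/usr/lib64", "/usr/local/lib")


def _under(path, prefix):
    return path == prefix or path.startswith(prefix + "/")


def priority(path: str) -> int:
    real = os.path.normpath(path)
    if not os.path.isabs(real) or any(_under(real, p) for p in SPACK_MANAGED):
        return 0  # group 1: Spack-managed (or relative) paths come first
    if any(_under(real, p) for p in SYSTEM_DIRS):
        return 2  # group 3: system dirs come last
    return 1      # group 2: non-system external prefixes sit in between


# "/usr/lib/x/.." normalizes to "/usr/lib", so it is ranked as a system dir.
assert priority("/usr/lib/x/..") == 2
```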
2024-04-20 13:23:37 -04:00
Harmen Stoppels
f5591f9068 ci.py: simplify, and dont warn excessively about externals (#43759) 2024-04-20 15:09:54 +02:00
FrederickDeny
98c08d277d e4s-alc: add new package (#43750)
* Added e4s-cl@1.0.3

* add e4s-alc package

* removed trailing whitespace
2024-04-19 15:54:23 -06:00
Jordan Galby
facca4e2c8 ccache: 4.9.1 and 4.8.3 (#43748) 2024-04-19 23:46:38 +02:00
Adam J. Stewart
764029bcd1 py-ruff: add v0.4.0 (#43740) 2024-04-19 15:44:19 -06:00
Harmen Stoppels
44cb4eca93 environment.py: fix excessive re-reads (#43746) 2024-04-19 13:39:34 -06:00
Jack Morrison
39888d4df6 libfabric: Add version 1.21.0 (#43735) 2024-04-19 11:09:59 -07:00
Vincent Michaud-Rioux
f68ea49e54 Update py-pennylane + Lightning plugins + few deps (#43706)
* Update PennyLane packages to v0.32.
* Reformat.
* Couple small fixes.
* Fix Lightning cmake_args.
* Couple dep fixes in py-pennylane + plugins.
* Fix scipy condition.
* Add comment on conflicting requirement.
* Update py-pl versions
* Update lightning versions.
* Fix copyright.
* Fix license.
* Update pl-kokkos versions
* run black
* Fix L-Kokkos build and update autoray.
* build step only required for older versions. update autograd
* Fix LK@0.31 kokkos compat issue. Introduce url_for_version.
* Fix few more version bounds.
2024-04-19 11:05:09 -07:00
Olivier Cessenat
78b5e4cdfa perl-fth: new version 0.529 (#43727) 2024-04-19 10:58:34 -07:00
Micael Oliveira
26515b8871 fms: add two variants (#43734)
* fms: add two variants supporting existing build options.
* Style fixes.
2024-04-19 10:51:58 -07:00
Harmen Stoppels
74640987c7 ruamel yaml: fix quadratic complexity bug (#43745) 2024-04-19 14:33:42 +02:00
Harmen Stoppels
d6154645c7 chai / raja / umpire: compile entire project with hipcc again (#43738) 2024-04-19 11:18:12 +02:00
James Smillie
faed43704b py-numpy package: enable build on Windows (#43686)
* Add conflicts for some blas implementations that don't build on
  Windows (or with %msvc)
* Need to enclose CC/CXX variables in quotes in case those paths
  have spaces, otherwise Meson runs into errors
* On Windows, Python dependencies now add <prefix>/Scripts to the
  PATH (this is established as a standard in PEP 370)
2024-04-18 16:38:08 -06:00
John W. Parent
6fba31ce34 Windows: Update MSVC + oneAPI detection and integration (#43646)
* Later versions of oneAPI have moved, so update detection to find it
  in both old and new location
* Remove reliance on ONEAPI_ROOT env variable when determining Fortran
  compiler version for %msvc
* When finding a Fortran compiler for MSVC, there was logic enforcing
  a maximum MSVC version for a given oneAPI Fortran version. This
  mapping was out of date and excluding valid combinations, so has
  been removed (the logic now just picks the latest available
  oneAPI Fortran compiler for any given MSVC version).
2024-04-18 21:53:56 +00:00
Olivier Cessenat
112cead00b keepassxc: new version 2.7.7 (#43729) 2024-04-18 14:54:08 -06:00
John W. Parent
9e2558bd56 Windows: add win-sdk/wgl externals during bootstrapping (#43459)
On Windows, bootstrapping logic now searches for and adds the win-sdk
and wgl packages to the user's top scope as externals if they are not
present.

These packages are generally required to install most packages with
Spack on Windows, and are only available as externals, so it is
assumed that doing this automatically would be useful and avoid
a mandatory manual step for each new Spack instance.

Note this is the first case of bootstrapping logic modifying
configuration other than the bootstrap configuration.
2024-04-18 10:20:04 -07:00
eugeneswalker
019058226f py-netcdf4 %oneapi: cflags append -Wno-error=int-conversion (#43629) 2024-04-18 10:11:24 -07:00
George Young
ac0040f67d spaceranger: new manual download package @2.1.1 (#42391)
* spaceranger: new manual download package @2.1.1
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:11:01 -07:00
wspear
38f341f12d Added tau@2.33.2 (#43682) 2024-04-18 10:10:18 -07:00
George Young
26ad22743f cellranger: new manual download package @7.1.0 (#38486)
* cellranger: new manual download package @7.1.0
* cellranger: updating to @7.2.0
* updating website
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:09:00 -07:00
George Young
46c2b8a565 xeniumranger: new manual download package @1.7.1 (#42389)
* xeniumranger: new manual download package @1.7.1
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:08:04 -07:00
James Taliaferro
5cbb59f2b8 New package: editorconfig (#43670)
* New package: editorconfig
* remove FIXMEs
* add description for editorconfig
2024-04-18 10:05:15 -07:00
Adam J. Stewart
f29fa1cfdf CI: remove MXNet (#43704) 2024-04-18 10:04:03 -07:00
Olivier Cessenat
c69951d6e1 gxsview: compiles against system qt and vtk on rhel8 (#43722)
* gxsview: compiles against system qt and vtk on rhel8
* Update gxsview/package.py for blanks around operator
* Update gxsview/package.py import blank line
* Update gxsview/package.py for style
* Update gxsview/package.py checking vtk version
2024-04-18 10:18:05 -06:00
Adam J. Stewart
f406f27d9c ML CI: remove extra xgboost (#43709) 2024-04-18 09:08:25 -07:00
Tamara Dahlgren
36ea208e12 Twitter->X: Reflect the name (only) change (#43690) 2024-04-18 08:52:54 -07:00
Kyle Knoepfel
17e0774189 Make sure variable is None if exception is raised. (#43707) 2024-04-18 08:50:15 -07:00
Sam Gillingham
3162c2459d update py-python-fmask to version 0.5.9 (#43698)
* update py-python-fmask to version 0.5.9
* add gillins and neilflood as maintainers
* remove spaces
* remove blank line
* put maintainers higher
2024-04-18 08:48:15 -07:00
Gregory Becker
2e3fc288ae warn and continue on failure to parse compiler from external 2024-04-18 08:46:05 -07:00
Massimiliano Culpo
7cad6c62a3 Associate condition sets from cli to root node (#43710)
This PR prevents a condition_set from having nodes that are not associated with the corresponding root node through some (transitive) dependencies.
2024-04-18 17:27:12 +02:00
Harmen Stoppels
eb2ddf6fa2 asp.py: do not copy 2024-04-18 15:39:26 +02:00
Harmen Stoppels
2bc2902fed spec.py: early return in __str__ 2024-04-18 15:39:26 +02:00
Mikael Simberg
b362362291 cvise package: add version 2.10.0 and ncurses constraint (#43319) 2024-04-17 13:41:33 -07:00
Rocco Meli
32bb5c7523 mkl interface (#43673) 2024-04-17 16:19:35 -04:00
FrederickDeny
a2b76c68a0 Added e4s-cl@1.0.3 (#43693) 2024-04-17 13:45:36 -06:00
Wouter Deconinck
62132919e1 xrootd: new version 5.6.9 (#43684) 2024-04-17 12:26:52 -07:00
Wouter Deconinck
b06929f6df xerces-c: new version 3.2.5 (#43687) 2024-04-17 12:24:27 -07:00
Wouter Deconinck
0f33de157b assimp: new version 5.4.0 (#43689) 2024-04-17 10:21:15 -07:00
Sinan
03a074ebe7 package/npm update (#43692)
* package/npm update
* add conflicts to exclude certain version intervals

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-04-17 10:06:09 -07:00
Adam J. Stewart
4d12b6a4fd py-shapely: add v2.0.4 (#43702) 2024-04-17 18:51:48 +02:00
Adam J. Stewart
26bb15e1fb py-sphinx: add v7.3 (#43703) 2024-04-17 18:51:31 +02:00
Bill Williams
1bf92c7881 [Score-P] Make local with-or-without not use "yes" (#43701)
Score-P does not accept "--with-foo=yes", but only "--with-foo" or "--with-foo=some-valid-specific-choice-or-path". This keeps Spack from generating config flags that will cause Score-P to barf.
2024-04-17 09:32:47 -07:00
Todd Gamblin
eefe0b2eec Improve spack find output in environments (#42334)
This adds some improvements to `spack find` output when in environments, based
on some thoughts about what users want to know when they're in an env.

If you're working in an environment, you mostly care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized

So, this PR adds a couple tweaks to display that information more clearly:

- [x] We now display install status next to every root. You can easily see
      which are installed and which aren't.

- [x] When you run `spack find -l` in an env, the roots now show their concrete
      hash (if they've been concretized). They previously would show `-------`
      (b/c the root spec itself is abstract), but showing the concretized root's
      hash is a lot more useful.

- [x] Newly added/unconcretized specs still show `-------`, which now makes more
      sense, b/c they are not concretized.

- [x] There is a new option, `-r` / `--only-roots` to *only* show env roots if
      you don't want to look at all the installed specs.

- [x] Roots in the installed spec list are now highlighted as bold. This is
      actually an old feature from the first env implementation, but various
      refactors had disabled it inadvertently.
2024-04-17 16:22:05 +00:00
Wouter Deconinck
de6c6f0cd9 py-pyparsing: new version 3.1.2 (#43579) 2024-04-17 07:53:03 -06:00
James Smillie
309d3aa1ec Python package: fix install of static libs on Windows (#43564) 2024-04-16 16:40:15 -06:00
Andrey Perestoronin
feff11f914 intel-oneapi-dnn-2024.1.0: add DNN package version (#43679)
* add onednn package

* fix style

* Update var/spack/repos/builtin/packages/intel-oneapi-dnn/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-16 15:26:14 -04:00
John W. Parent
de3b324983 Windows filesystem utilities (bugfix): improve SFN usage (#43645)
Reduce incidence of spurious errors by:
* Ensuring we're passing the buffer by reference
* Get the correct short string size from Windows API instead of computing ourselves
* Ensure sufficient space for null terminator character

Add test for `windows_sfn`
2024-04-16 11:02:02 -07:00
kwryankrattiger
747cd374df Run after_script aggregator with spack python (#43669) 2024-04-16 19:03:44 +02:00
Wouter Deconinck
8b3ac40436 acts: new version 34.0.0 (#43680) 2024-04-16 11:03:31 -06:00
Wouter Deconinck
28e9be443c (py-)onnx: new version 1.16.0 (#43675) 2024-04-16 09:58:43 -07:00
John W. Parent
1381bede80 zstd: 1.5.6 does not build on Windows (#43677)
Conflict until a fix has been merged upstream
2024-04-16 09:36:27 -07:00
Wouter Deconinck
6502785908 podio: +rntuple requires root +root7 (#43672) 2024-04-16 09:27:07 -07:00
Erik Heeren
53257408a3 py-bluepyopt: 1.14.11 (#43678) 2024-04-16 09:21:24 -07:00
Wouter Deconinck
28d02dff60 pythia8: new version 8.311 (#43667) 2024-04-16 09:19:06 -07:00
Wouter Deconinck
9d60b42a97 jwt-cpp: new version 0.7.0, scitokens-cpp: new versions to 1.1.1 (#43657)
* jwt-cpp: new version 0.7.0, depends_on nlohmann-json
* scitokens-cpp: new versions to 1.1.1
* scitokens-cpp: conflicts ^jwt-cpp@0.7: when @:1.1
2024-04-16 09:17:18 -07:00
Harmen Stoppels
9ff5a30574 concretize.lp: fix issue with reuse of conditional variants (#43676)
Currently if you request pkg +example where example is a conditional
variant, and you have a pkg in the database for which the condition
did not hold (so no +example nor ~example), the solver would reuse it
regardless, not imposing +example.

The change rules out exactly one thing: variant_set without variant_value,
which in practice could only happen when not node_has_variant (i.e. when
under the current package.py rules the variant's when condition did not
trigger).
2024-04-16 16:09:32 +00:00
Wouter Deconinck
9a6c013365 wayland: +doc requires graphviz +expat for HTML tables (#43668) 2024-04-16 01:28:47 -06:00
Andrey Prokopenko
9f62a3e819 arborx: add v1.6 (#43623) 2024-04-16 09:27:14 +02:00
Maciej Wójcik
e380e9a0ab gromacs: prevent version conflict after enabling plumed (#43449) 2024-04-16 09:07:50 +02:00
Adam J. Stewart
8415ea9ada py-black: switch maintainer (#43652) 2024-04-16 08:51:12 +02:00
Josh Bowden
6960766e0c Damaris: add v1.10.0 (#43664)
Co-authored-by: Joshua Bowden <joshua-chales.bowden@inria.fr>
2024-04-15 23:43:39 -06:00
Michael Kuhn
0c2ca8c841 octave: add 8.4.0 and 9.1.0 (#43518) 2024-04-15 13:32:20 -07:00
Jen Herting
273960fdbb [py-pybedtools] added version 0.10.0 (#43625) 2024-04-15 13:27:22 -07:00
eugeneswalker
0cd2a1102c crtm: add noaa versions and package mods (#43635)
* crtm: add noaa versions and package mods
* crtm@v2.4.1-jedi: add missing depends_on netcdf-fortran, ecbuild from jcsda spack fork
2024-04-15 13:24:07 -07:00
Teague Sterling
e40676e901 Add ollama package (#43655)
* Added package to build Ollama
* Update package.py
  Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
2024-04-15 13:13:07 -07:00
Tristan Carel
4ddb07e94f py-morphio: add version 3.3.7, update license (#43420) 2024-04-15 14:06:48 -06:00
eugeneswalker
50585d55c5 hdf: %oneapi: -Wno-error=implicit-function-declaration (#43631)
* hdf: %oneapi: -Wno-error=implicit-function-declaration

* [@spackbot] updating style on behalf of eugeneswalker
2024-04-15 12:17:07 -07:00
Adam J. Stewart
5d6b5f3f6f GDAL: fix MDB build (#43665) 2024-04-15 10:38:56 -07:00
Juan Miguel Carceller
2351c19489 glew: remove a few unused options (#43399)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-04-15 18:49:51 +02:00
snehring
08d49361f0 iq-tree: add v2.3.1 (#43442) 2024-04-15 18:48:58 +02:00
Ken Raffenetti
c3c63e5ca4 mpich: Update PMI configure options (#43551)
Add a "default" option that passes no option to configure. Existing
options changed in the MPICH 4.2.0 release, so update the package to
reflect those changes.
2024-04-15 18:40:37 +02:00
Richard Berger
e72d4075bd LAMMPS: add v20240207.1 (#43538)
Add workaround for undefined HIP_PATH in older versions
2024-04-15 16:34:30 +00:00
Todd Gamblin
f9f97bf22b tests: Spec tests shouldn't fetch remote git repositories. (#43656)
Currently, some of the tests in `spec_format` and `spec_semantics` fetch
the actual zlib repository when run, because they call `str()` on specs
like `zlib@foo/bar`, which at least currently requires a remote git clone
to resolve.

This doesn't change the behavior of git versions, but it uses our mock git
repo infrastructure and clones the `git-test` package instead of the *real*
URL from the mock `zlib` package.

This should speed up tests.  We could probably refactor more so that the git
tests *all* use such a fixture, but the `checks` field that unfortunately
tightly couples the mock git repository and the `git_fetch` tests complicates
this. We could also consider *not* making `str()` resolve git versions, but
I did not dig into that here.

- [x] add a mock_git_test_package fixture that sets up a mock git repo *and*
      monkeypatches the `git-test` package (like our git test packages do)
- [x] use fixture in `test_spec_format_path`
- [x] use fixture in `test_spec_format_path_posix`
- [x] use fixture in `test_spec_format_path_windows`
- [x] use fixture in `test_parse_single_spec`
2024-04-15 09:20:23 -07:00
Massimiliano Culpo
8033455d5f hdf5: require mpich+fortran when hdf5+fortran (#43591) 2024-04-15 18:17:28 +02:00
Eric Berquist
50a5a6fea4 tree-sitter: add versions up to 0.22.2 (#43608) 2024-04-15 09:10:19 -07:00
eugeneswalker
0de8a0e3f3 wgrib2: oneapi -> comp_sys="intel_linux" (#43632) 2024-04-15 18:04:41 +02:00
SXS Bot
0a26e74cc8 spectre: add v2024.04.12 (#43641)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-04-15 18:03:51 +02:00
dependabot[bot]
9dfd91efbb build(deps): bump docker/setup-buildx-action from 3.2.0 to 3.3.0 (#43542)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](2b51285047...d70bba72b1)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 18:00:47 +02:00
dependabot[bot]
1a7baadbff build(deps): bump python-levenshtein in /lib/spack/docs (#43543)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.25.0 to 0.25.1.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.25.0...v0.25.1)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 18:00:24 +02:00
Marc Perache
afcfd56ae5 range-v3: add v0.12.0 (#40103) 2024-04-15 17:43:14 +02:00
dependabot[bot]
7eb2e704b6 build(deps): bump codecov/codecov-action from 4.1.1 to 4.3.0 (#43562)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.1.1 to 4.3.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](c16abc29c9...84508663e9)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 17:34:31 +02:00
Josh Milthorpe
564b4fa263 hipcub: depend on matching version of hip+cuda when +cuda (#42970) 2024-04-15 17:33:26 +02:00
Adam J. Stewart
0a941b43ca PyTorch: build with external cpuinfo (#40758) 2024-04-15 17:26:52 +02:00
one
35ff24ddea googletest: fix reversed pthreads variant logic (#43649) 2024-04-15 11:18:27 -04:00
Rocco Meli
7019e4e3cb openbabel: add CMake patch for 3.1.1 (#43612) 2024-04-15 17:07:54 +02:00
Weston Ortiz
cb16b8a047 goma: new version 7.6.1 (#43617) 2024-04-15 17:04:34 +02:00
Adam J. Stewart
381acb3726 Build systems: fix docstrings (#43618) 2024-04-15 17:01:52 +02:00
Adam J. Stewart
d87ea0b256 py-maturin: add v1.5.1 (#43619) 2024-04-15 16:55:17 +02:00
Adam J. Stewart
1a757e7f70 py-lightning: add v2.2.2 (#43621) 2024-04-15 16:54:34 +02:00
eugeneswalker
704e2c53a8 py-eccodes: add v1.4.2 (#43633) 2024-04-15 16:44:57 +02:00
renjithravindrankannath
478d8a668c rocm-opencl: add dependency on aqlprofile (#43637) 2024-04-15 16:44:16 +02:00
dependabot[bot]
7903f9fcfd build(deps): bump black from 24.3.0 to 24.4.0 in /lib/spack/docs (#43642)
Bumps [black](https://github.com/psf/black) from 24.3.0 to 24.4.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.3.0...24.4.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 16:40:37 +02:00
dependabot[bot]
670d3d3fdc build(deps): bump black in /.github/workflows/style (#43643)
Bumps [black](https://github.com/psf/black) from 24.3.0 to 24.4.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.3.0...24.4.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 16:40:14 +02:00
Vanessasaurus
e8aab6b31c flux-core: add v0.61.2 (#43648)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-15 16:34:58 +02:00
Adam J. Stewart
1ce408ecc5 py-ruff: add v0.3.7 (#43620) 2024-04-15 16:33:11 +02:00
Adam J. Stewart
dc81a2dcdb py-rasterio: add v1.3.10 (#43653) 2024-04-15 16:32:34 +02:00
Rocco Meli
b10f51f020 charmpp: add archs including Cray shasta with ARM (#43191)
Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-04-15 16:28:31 +02:00
Hariharan Devarajan
4f4e3f5607 dlio-profiler: add releases up to v0.0.5 (#43530) 2024-04-15 16:21:52 +02:00
Wouter Deconinck
00fb80e766 util-linux: new version 2.40 (#43661) 2024-04-15 16:16:52 +02:00
Michael Kuhn
057603cad8 Fix pkgconfig dependencies (#43651)
pkgconfig is the virtual package, pkg-config and pkgconf are
implementations.
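
In package recipes this means depending on the virtual, not on a concrete
provider. A minimal illustration (a hypothetical recipe snippet, not taken
from this PR):

```python
from spack.package import *


class Example(AutotoolsPackage):
    """Hypothetical package illustrating a build dependency on the virtual."""

    # Correct: any provider (pkg-config or pkgconf) can satisfy this.
    depends_on("pkgconfig", type="build")

    # Incorrect: this would pin one specific implementation.
    # depends_on("pkg-config", type="build")
```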
2024-04-15 16:05:11 +02:00
kjrstory
5b8b6e492d su2: add v8.0.0, v8.0.1 (#43662) 2024-04-15 15:25:16 +02:00
Wouter Deconinck
763279cd61 spdlog: new version 1.13.0 (#43658) 2024-04-15 12:36:10 +02:00
Wouter Deconinck
e4237b9153 zlib: new version 1.3.1 (#43660) 2024-04-15 10:59:06 +02:00
Wouter Deconinck
d288658cf0 zstd: new version 1.5.6 (#43659) 2024-04-15 09:04:42 +02:00
Kacper Kornet
2c22ae0576 intel-oneapi-compiler: Fix generation of config files (#43654)
Commit 330a9a7c9a aimed at preventing
generation of .cfg files when a given compiler does not exist
in the particular release. However, the check does not contain
the full paths, so it always fails, resulting in empty
.cfg files. This commit fixes it.
2024-04-13 19:34:58 -04:00
eugeneswalker
fc3fc94689 gsibec: properly reference self.spec (#43627) 2024-04-13 14:21:03 -07:00
eugeneswalker
b5013c1372 dealii@9.5 +cgal requires ^cgal@5: (#43639) 2024-04-13 14:20:40 -07:00
Juan Miguel Carceller
e220674c4d sherpa: remove paths to compiler wrappers and use the provided libtool (#43611)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-04-13 15:24:26 -05:00
Harmen Stoppels
7f13518225 gettext: unvendor libxml (#43622) 2024-04-13 12:01:38 +02:00
Wouter Deconinck
96a13a97e6 freetype: new version 2.12 and 2.13 (#43571) 2024-04-13 11:48:48 +02:00
Chris White
6d244b3f67 remove hardcoded hipcc (#43644) 2024-04-12 19:38:15 -06:00
Chris White
6bc66db141 Axom: add new versions and bring in new changes (#43590)
* add new axom releases, sync changes between repos

* add new version of axom and add fallback depends for umpire/raja

* remove now-redundant flags; add cuda flags to the flags output by CachedCMakePackage

Co-authored-by: white238 <white238@users.noreply.github.com>
2024-04-12 17:34:32 -06:00
James Shen
acfb2b9270 Geant4: extend patch to fix compile for 10.0-10.4 as well (#43640) 2024-04-12 18:52:22 -04:00
Frédéric Simonis
d92a2c31fb precice: Add version 3.1.1 (#43616) 2024-04-12 15:24:28 -07:00
eugeneswalker
e32561aff6 odc: add v1.4.6 (#43634) 2024-04-12 15:19:32 -07:00
Auriane R
4b0479159f pika: add pika 0.24.0 (#43624) 2024-04-12 15:18:27 -07:00
eugeneswalker
03bfd36926 w3nco %oneapi cflags: append -Wno-error=implicit-function-declaration (#43628)
* w3nco %oneapi cflags: append -Wno-error=implicit-function-declaration

* update flag for %apple-clang, %clang

* [@spackbot] updating style on behalf of eugeneswalker
2024-04-12 20:54:18 +02:00
eugeneswalker
4d30c8dce4 gsibec: add v1.2.1 (#43630) 2024-04-12 12:42:46 -06:00
Eric Berquist
49d4104f22 emacs: add 29.3 (#43626) 2024-04-12 11:10:37 -07:00
Massimiliano Culpo
07fb83b493 gcc: add more detection tests (#43613) 2024-04-12 04:03:11 -06:00
Massimiliano Culpo
263007ba81 solver: add an integrity constraint for virtual nodes (#43582)
Upon close inspection of clingo answer sets, in some cases we have "equivalent" (i.e. same hash for the concrete spec) duplicates that differ only because of virtual nodes that are added to the answer set, without any edge using them.
2024-04-12 09:31:44 +02:00
Sajid Ali
3b6e99381f add py-h5py@3.11 (#43605)
* add py-h5py@3.11
* incorporate reviewer feedback
* incorporate reviewer feedback
2024-04-11 23:23:20 -06:00
Howard Pritchard
a30af1ac54 OpenMPI: add version 5.0.3 (#43609)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-04-11 22:48:15 -06:00
Matthew Thompson
294742ab7b openmpi: add MPIFC environment variable (#36669) 2024-04-11 21:50:59 -06:00
eugeneswalker
6391559fb6 e4s ci: add: netcdf-fortran, fpm, e4s-cl (#43601) 2024-04-11 21:01:38 -06:00
Dave Keeshan
d4d4f813a9 verible: Add version v0.0-3624-gd256d779 (#43604) 2024-04-11 20:56:00 -06:00
Dave Keeshan
4667163dc4 Add version 5.024 (#43603) 2024-04-11 20:45:06 -06:00
Dave Keeshan
439f105285 Add versions 0.39 and 0.40 (#43600) 2024-04-11 20:44:51 -06:00
eugeneswalker
f65b1fd7b6 dealii +cuda: conflicts with ^cuda@12: (#43599) 2024-04-11 19:43:32 -06:00
Radim Janalík
d23e06c27e Allow packages to be pushed to build cache after install from source (#42423)
This commit adds a property `autopush` to mirrors. When true, every source build is immediately followed by a push to the build cache. This is useful in ephemeral environments such as CI / containers.

To enable autopush on existing build caches, use `spack mirror set --autopush <name>`. The same flag can be used in `spack mirror add`.
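
For example (a sketch of the commands described above, with a hypothetical mirror name and path):

```bash
# add a new build-cache mirror that is pushed to after every source build
spack mirror add --autopush mycache /path/to/buildcache

# or enable autopush on an existing mirror
spack mirror set --autopush mycache
```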
2024-04-11 19:43:13 -06:00
Alberto Sartori
b76e9a887b justbuild: bump version 1.2.5 (#43592) 2024-04-11 18:51:03 -06:00
Filippo Spiga
55ffd439ce JUBE: add v2.6.x (#41272)
* Adding JUBE 2.6.0
* They quickly released a bugfix package
* Correct the version sha256 for v2.6.1

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

---------

Co-authored-by: fspiga <fspiga@fc01-gg01.cm.cluster>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2024-04-11 16:01:43 -06:00
Thomas Green
d8a7b88e7b Create Castep package (#41230)
* Create package.py
* Update package.py
  Post review fixes.
* Style fixes.
2024-04-11 14:49:03 -07:00
Paul Kuberry
aaa1bb1d98 suite-sparse: refactor cmake args (#43386)
* Simplify config command and add BLAS/LAPACK location
* Use BLAS_ROOT and LAPACK_ROOT and disable use of system
  package registry
* Adds location of BLAS_LIBRARIES and LAPACK_LIBRARIES to CMake
* Adds CMake variables to prevent picking up system installations
  of BLAS/LAPACK. Fixes previous PR #43328 that was picking up
  incorrect installations
* Adds versions 7.2.1
2024-04-11 14:06:57 -07:00
Martin Aumüller
0d94b8044b libzip: add up to v1.10.1 (#43560)
* libzip: add up to v1.10.1
  - update homepage and change download url to GitHub
  - change build system to CMake for releases starting with 1.4
* [@spackbot] updating style on behalf of aumuell
* libzip: fix urls
* [@spackbot] updating style on behalf of aumuell
* libzip: do not add versions from libzip.org
  these are old, and urllib refuses to fetch them
* libzip: deprecate versions from libzip.org
  urllib refuses to fetch them, only curl would work

---------

Co-authored-by: aumuell <aumuell@users.noreply.github.com>
2024-04-11 14:51:02 -06:00
Wouter Deconinck
5a52780f7c acts: new version 33.1.0 (#43581)
* acts: new version 33.1.0
* actsvg: new version 0.4.41
* geomodel: new package
* [@spackbot] updating style on behalf of wdconinc
* acts: plugin_cmake_variant(geomodel)
* geomodel: with when(+visualization)

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-04-11 13:32:10 -06:00
Tristan Carel
dd0a8452ee py-simpleeval: add version 0.9.13 (#43568) 2024-04-11 12:19:30 -07:00
jalcaraz
c467bba73e TAU package: Include recent change for Ubuntu (#43572)
* Include recent change for Ubuntu
  Select option -disable-no-pie-on-ubuntu for some Ubuntu systems
  823971df01
* Added conflict for new variant
* Updated conflict version
* Added mention of Ubuntu to variant description
2024-04-11 12:06:34 -07:00
James Taliaferro
d680a0cb99 New package: timew (#43585)
* add timewarrior
* fix style checks
* fix style checks
2024-04-11 11:58:08 -07:00
Luc Berger
efadee26ef Kokkos Ecosystem: 4.3.00 (#43607)
* Kokkos Kernels: adding missing TPLs and pre-conditions
  Adding variants and dependencies for rocBLAS and rocSPARSE.
  Also adding a "when=" close to the TPL variants that prevents
  enabling the TPLs in versions of the library when it was not
  yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: adding cusolver and rocsolver at version 4.3.00
* Kokkos Ecosystem: updating packages for release 4.3.00
* Kokkos: adding arch for SG2042
* Removing sg2042 from spack_micro_arch_map
  Removing it here and will work to add it in the proper generic spack location, likely:   `spack/lib/spack/external/archspec/json/cpu/microarchitectures.json` ?
2024-04-11 12:41:28 -06:00
Greg Becker
2077b3a006 invalid compiler: warn instead of error (#43491) 2024-04-11 20:39:27 +02:00
Adam J. Stewart
8e0c659b51 py-cartopy: add v0.23.0 (#43583)
* py-cartopy: add v0.23.0
* numpy 2 support added
2024-04-11 11:11:43 -07:00
Derek Ryan Strong
863ab5a597 opus: update package (#43587) 2024-04-11 10:58:27 -07:00
Derek Ryan Strong
db4e76ab27 cpio: add 2.15 (#43589) 2024-04-11 10:55:08 -07:00
Adam J. Stewart
6728a46a84 py-keras: add v3.2.1 (#43594) 2024-04-11 10:37:48 -07:00
Adam J. Stewart
5a09459dd5 py-pandas: add v2.2.2 (#43596) 2024-04-11 10:29:43 -07:00
James Taliaferro
7e14ff806a add taskwarrior 3.0.0 (#43580)
* add taskwarrior 3.0.0

* blacken
2024-04-11 04:59:25 -06:00
Wouter Deconinck
7e88cf795c wayland-protocols: new versions 1.34 (#43577) 2024-04-11 04:29:41 -06:00
Wouter Deconinck
1536e3d422 qt-base: depends_on cmake@3.21: when ~shared or platform=darwin (#43576) 2024-04-11 04:29:15 -06:00
Massimiliano Culpo
1fe8e63481 Reuse specs built with compilers not in config (#43539)
Allow reuse of specs that were built with compilers not in the current configuration. This means that specs from build caches don't need to have a matching compiler locally to be reused. Similarly when updating a distro. If a node needs to be built, only available compilers will be considered as candidates.
2024-04-11 09:13:24 +02:00
YI Zeping
dfca2c285e Packages: llvm cxx flag remove, new versions, and binutils build issue fix (#43567)
* add c++11 header to gold for compiler not defaulting to c++11

* glibc: add 2.39

* llvm: add 18.1.3

* fix #42314, remove cxx11 flag for llvm; should be controlled by cmake.

* modify patch

* llvm version

* add gmake version request
2024-04-10 23:04:58 -04:00
Martin Aumüller
2686f778fa qt-*: add v6.6.2, v6.6.3, and v6.7.0 (#43559)
* qt-base: add v6.6.2

* qt-declarative: add v6.6.2

* qt-quick3d: add v6.6.2

* qt-quicktimeline: add v6.6.2

* qt-shadertools: add v6.6.2

* qt-svg: add v6.6.2

* qt-base: add v6.7.0

* qt-svg: add v6.7.0

* qt-declarative: add v6.7.0

* qt-quick3d: add v6.7.0

* qt-quicktimeline: add v6.7.0

* qt-shadertools: add v6.7.0

* qt-*: add v6.6.3

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-04-10 16:30:09 -05:00
Wouter Deconinck
925e9c73b1 libpthread-stubs: new version 0.5 (#43574) 2024-04-10 13:29:45 -06:00
one
aba447e885 Add new versions for toml11 (#43469)
* Add new versions for toml11
  Added 3.8.0 and 3.8.1
* Update package.py, add cxx_std
* [@spackbot] updating style on behalf of alephpiece

---------

Co-authored-by: alephpiece <alephpiece@users.noreply.github.com>
2024-04-10 12:29:11 -07:00
Richard Berger
1113de0dad flecsi+legion: add cr versions FleCSI depended on in past releases (#43499)
* flecsi+legion: add cr versions FleCSI depended on in past releases
* flecsi: deprecate develop version
2024-04-10 12:26:17 -07:00
Wouter Deconinck
4110225166 libdrm: new version 2.4.120 (#43573) 2024-04-10 13:24:45 -06:00
Adam J. Stewart
24c839c837 py-scikit-learn: add v1.4.2 (#43557) 2024-04-10 12:20:18 -07:00
Martin Aumüller
42c6a6b189 ospray: add v3.1.0 and dependencies (#43558)
* rkcommon: add v1.13.0
* openvkl: add v2.0.1
* openimagedenoise: add v2.2.2
* ospray: add v3.1.0
2024-04-10 12:13:23 -07:00
Martin Aumüller
b0ea1c6f24 botan: add v3.4.0 (#43561) 2024-04-10 11:55:52 -07:00
Vanessasaurus
735102eb2b Automated deployment to update package flux-sched 2024-04-10 (#43566)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-10 11:54:49 -07:00
Vanessasaurus
2e3cdb349b Automated deployment to update package flux-core 2024-04-10 (#43565)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-10 11:54:20 -07:00
Seth R. Johnson
05c8030119 swig: update symlink alias to appease cmake (#43271) 2024-04-10 11:50:16 -07:00
Adam J. Stewart
bbcd4224fa py-matplotlib: add v3.8.4 (#43487)
* py-matplotlib: add v3.8.4

* matplotlib requires exact version of freetype for tests to pass

* Add version constraints to fontconfig deps

* Fix freetype build

* Freetype cmake build is useless

* Typo

* Fix download

* Fix build of older freetype

* cmake is useless
2024-04-10 18:20:42 +02:00
Adam J. Stewart
4c0cdb99b3 py-scipy: add v1.12 and v1.13 (#42213) 2024-04-10 17:56:21 +02:00
Axel Huebl
f22d009c6d pyAMReX: No CCache (#43570)
This interferes with Spack compiler wrappers.
2024-04-10 09:02:37 -06:00
Axel Huebl
c5a3e36ad0 Update pyAMReX: 24.03, 24.04 (#42858)
Latest version of pyAMReX.
2024-04-10 06:33:32 -06:00
Hariharan Devarajan
1c76ba1c3e Release for brahma 0.0.3 (#43525)
* Release for brahma 0.0.3
* switch the version directive order
2024-04-09 12:33:29 -06:00
Hariharan Devarajan
b969f739bd cpp-logger: add v0.0.3 (#43524)
* added cpp-logger 0.0.3
* switched version directive order
2024-04-09 12:27:41 -06:00
one
4788c4774c Add a new version for cxxopts (#43470)
Added 3.2.0
2024-04-09 10:36:05 -07:00
Wouter Deconinck
34de028dbc cppzmq: new version 4.10.0 (#43526) 2024-04-09 10:14:13 -07:00
Hariharan Devarajan
a69254fd79 release for gotcha 1.0.6 (#43531) 2024-04-09 10:10:32 -07:00
Adam J. Stewart
af5f205759 py-keras: add v3.2.0 (#43540) 2024-04-09 10:09:15 -07:00
HELICS-bot
77f9100a59 helics: Add version 3.5.2 (#43541)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-04-09 10:08:19 -07:00
Derek Ryan Strong
386bb71392 flac: update versions (#43544) 2024-04-09 10:07:15 -07:00
Wouter Deconinck
0676d6457f yoda: new version 2.0.0 (#43534)
* yoda: new versions 1.9.9, 1.9.10, and 2.0.0
* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-04-09 10:04:20 -07:00
Martin Lang
0b80e36867 [libxc] new homepage (#43546)
The old page on tddft.org is no longer reachable.
2024-04-09 09:59:47 -07:00
Stephen Sachs
4c9816f10c [mpas-model] Only add options for double precision when requested (#43547)
As in the original makefile "FFLAGS_PROMOTION = -fdefault-real-8
-fdefault-double-8" is only used when `precision=double`. This is the default
for the Spack package, so no change if `precision` is left unset.
2024-04-09 09:56:48 -07:00
psakievich
fb6741cf85 Trilinos: More accurate stk boost dependency (#43550)
Boost was not required as of `@13.4.0`
2024-04-09 06:49:06 -06:00
Kensuke WATANABE
3f2fa256fc LLVM: avoid Fujitsu compiler build fail in llvm17-18 (#43387)
* Avoid Fujitsu compiler Clang Mode options when building LLVM

* LLVM: avoid Fujitsu compiler build fail in llvm17-18

* address review comments
2024-04-08 19:56:30 -04:00
John W. Parent
d5c8864942 Windows bugfix: safe rename if renaming file onto itself (#43456)
* Generally use os.replace on Windows and Linux
* Windows behavior for os.replace differs when the destination exists
  and is a symlink to a directory: on Linux the dst is replaced and
  on Windows this fails - this PR makes Windows behave like Linux
  (by deleting the dst before doing the rename unless src and dst
  are the same)
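
A rough sketch of the idea (not the actual Spack helper; assumes Python 3.8+,
where `os.remove` can delete a directory symlink on Windows):

```python
import os


def rename(src: str, dst: str) -> None:
    # Sketch only: emulate the Linux behavior of os.replace on Windows when
    # dst is a symlink to a directory, which os.replace would refuse to replace.
    if os.name == "nt" and os.path.islink(dst) and os.path.isdir(dst):
        if os.path.abspath(src) != os.path.abspath(dst):
            os.remove(dst)  # removes the symlink itself, not its target
    os.replace(src, dst)
```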
2024-04-08 14:10:02 -07:00
Luc Berger
b3cef1072d Nalu: updating Trilinos recipe a bit (#43471)
* Nalu: updating Trilinos recipe a bit

Basic changes to build/install nalu properly using Spack.
Some more changes would be nice, for instance adding an
option to build against Trilinos master or develop. Adding
a dependency on googletest to avoid the annoying build
failures in the unit tests.

* Nalu: adding release 1.6.0

Nalu v1.6.0 can build cleanly against Trilinos 14.0.0 with the
proposed changes. The only other combo is master / master, but
that one is "floating" as these branches evolve over time. When a
new Nalu comes out we might want to add another fixed version to
keep this recipe up to date!
2024-04-08 10:39:51 -06:00
Wouter Deconinck
e8ae9a403c acts: depends_on py-onnxruntime when +onnx for @23.3: (#43529) 2024-04-08 14:13:17 +02:00
Wouter Deconinck
1a8ef161c8 fastjet: new multi-valued variant plugins (#43523)
* fastjet: new multi-valued variant `plugins`

* rivet: depends_on fastjet plugins=cxx
2024-04-08 14:12:12 +02:00
Harmen Stoppels
d3913938bc py-tatsu: add upperbound on python (#43510) 2024-04-08 11:26:46 +02:00
Harmen Stoppels
4179880fe6 py-pymatgen: add forward compat bound for cython (#43511) 2024-04-08 11:26:09 +02:00
Harmen Stoppels
125dd0368e py-triton: add zlib (#43512) 2024-04-08 11:25:33 +02:00
Harmen Stoppels
fd68f8916c gperftools: add cmake build system (#43506)
the autotools build system does something funky which causes a link line
where gcc's default link dirs are explicitly added and end up before the
-L from Spack's libunwind, so that ultimately it links against the system
libunwind.

the cmake build system does better.
2024-04-08 10:00:05 +02:00
Jonas Eschle
93e6f5fa4e Update jax & jaxlib versions (#42863)
* upgrade new versions

* style fix

* update jaxlib deps (not cuda and bazel yet)

* update jaxlib cuda versions

* update jaxlib cuda versions

* update jaxlib cuda versions

* chore: style fix

* Update package.py

* Update package.py

* fix:  typo

* docs: add source for cuda version

* py-jaxlib 0.4.14 also doesn't build on ppc64le

* Add 0.4.26

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-04-07 12:04:23 +02:00
Robert Cohn
54acda3f11 oneapi licenses (#43451) 2024-04-06 08:04:04 -04:00
kwryankrattiger
663e20fcc4 ParaView: add v5.12.0 (#42943)
* ParaView: Update version 5.12.0

Add 5.12.0 release
Update default to 5.12.0

* Add patch for building ParaView 5.12 with kits

* Drop VTKm from neoverse
2024-04-06 04:12:48 +00:00
eugeneswalker
6428132ebb e4s ci: enable lammps variants from presets/most.cmake (#43522) 2024-04-05 20:56:18 -07:00
eugeneswalker
171958cf09 py-deephyper: add v0.6.0 (#43492)
* py-deephyper: add latest version: v0.6.0

* e4s: add py-deephyper

* v0.6.0: depend on python@3.7:3.11

* add py-packaging constraint so arm64 builds work

* [@spackbot] updating style on behalf of eugeneswalker
2024-04-06 00:28:37 +00:00
Frédéric Simonis
0d0f7ab030 Add release 3.1.0 (#43508) 2024-04-05 16:58:57 -07:00
eugeneswalker
35f8b43a54 e4s ci: add nekbone (#43515)
* e4s ci: add nekbone, nek5000

* remove nek5000
2024-04-05 16:36:13 -07:00
one
6f7eb3750c Add new versions of tinyxml2 (#43467)
* Add new versions of tinyxml2
  Added 7.0.0 to 10.0.0
* Add the variant "shared"
2024-04-05 13:36:45 -07:00
renjithravindrankannath
2121eb31ba Patch to set PARAMETERS_MIN_ALIGNMENT to the native alignment for rocm-opencl (#43444)
* For avx builds, the start address of the values_ buffer in KernelParameters
   is not correct as it is computed based on 16-byte alignment.
* Style check error fix
2024-04-05 12:02:32 -07:00
kwryankrattiger
c68d739825 CI: Add debug to the log aggregation script (#42562)
* CI: Add debug to the log aggregation script
2024-04-05 14:00:27 -05:00
John W. Parent
c468697b35 Use correct method "append" instead of extend (#43514) 2024-04-05 18:46:47 +00:00
G-Ragghianti
c4094cf051 slate: Adding comgr as dependency (#43448)
* Adding comgr as dependency

* adding more smoke test deps
2024-04-05 11:32:16 -07:00
LinaMuryanto
9ff9ca61e6 py-amq, py-celery, py-kombu: New versions, fix build (#43295)
* updating package.py for py-celery, py-kombu, py-amq
* added more py-kombu package versions
* fix copyrights and style on py-kombu/package.py
* removed extra spaces
* added py-billiard 4.2.0 and added back the license('BSD-3-Clause')
* removed extra spaces in py-celery/package.py
* fixed py-amqp 2.4.0 sha; fixed py-celery's dependency of py-click (when version restrictions)
* more clean up on specifying version bounds
2024-04-05 11:14:18 -07:00
Massimiliano Culpo
826e0c0405 Improve hit-rate on buildcaches (#43272)
* Relax compiler and target mismatches

The mismatch occurs on an edge. Previously it was assigned
the parent priority, now it is assigned the child priority.

This should make reuse from buildcaches or store more likely,
since most mismatches will be counted with "reused" priority.

* Optimize version badness for runtimes at very low priority

We don't want to e.g. switch other attributes because we
cannot reuse an old installed runtime.

* Optimize runtime attributes at very low priority

This is such that the version of the runtime would
not influence whether we should reuse a spec.

Compiler mismatches are considered for runtimes,
to avoid situations where compiling foo%gcc@9
brings in gcc-runtime%gcc@13 if gcc@13 is among
the available compilers

* Exclude specs without runtimes from reuse

This should ensure that we do not reuse specs that
could be broken, as they expect the compiler to be
installed in a specific place.
2024-04-05 20:10:28 +02:00
Andrey Perestoronin
1b86a842ea add itac and inspector packages (#43507) 2024-04-05 09:30:36 -04:00
Harmen Stoppels
558a28bf52 bazel: conflict with gcc 13 (#43504) 2024-04-05 14:24:06 +02:00
Harmen Stoppels
411576e1fa Do not acquire a write lock on the env post install if no views (#43505) 2024-04-05 12:31:21 +02:00
eugeneswalker
cab4f92960 datatransferkit: needs trilinos@14.2: for @3.1.0: (#43496) 2024-04-05 03:03:15 -06:00
Adam J. Stewart
c6c13f6782 GDAL: add v3.8.5 (#43493) 2024-04-04 21:29:28 -06:00
Daniele Cesarini
cf11fab5ad Added Libfort library (#43490) 2024-04-04 21:14:27 -06:00
Harmen Stoppels
1d8b35c840 installer.py: compute package_id from spec (#43485)
The installer runs `get_dependent_ids`, which follows edges outside the
subdag that's being installed, so it returns a superset of the actual
dependents.

That's generally fine, except that it calls `s.package` on every
dependent, which triggers a package class to be instantiated, which is a
lot of work.

Instead, compute the package id from the spec, since that's all that's
used anyways and does not trigger *lots* of slow and redundant
instantiations of package objects.
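
A hypothetical illustration of the idea (names and format are illustrative,
not the exact Spack internals):

```python
def package_id(spec):
    # Build the identifier from spec attributes directly instead of going
    # through spec.package, which would instantiate the expensive package class.
    return f"{spec.name}-{spec.version}-{spec.dag_hash()}"
```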
2024-04-04 20:39:30 -06:00
Weiqun Zhang
5dc46a976d amrex: add v24.04 (#43443) 2024-04-04 19:00:44 -07:00
Alex Richert
05f5596cdd Update grib-util recipe (#43484) 2024-04-04 15:07:05 -07:00
psakievich
6942c7f35b Update exawind packages (#40793) 2024-04-04 12:54:20 -06:00
Alex Richert
18f0ac0f94 Add g2@3.4.9 (#43481) 2024-04-04 11:50:08 -07:00
Vicente Bolea
d9196ee3f8 adios2: bump version 2.10.0 (#43479) 2024-04-04 13:46:40 -05:00
John W. Parent
ef0bb6fe6b Msvc: Determine OneAPI_ROOT from fc compiler path (#43131)
If ONEAPI_ROOT is not set as an environment variable, the current approach will raise an error.
Instead we can compute the OneAPI_ROOT from the compiler paths like we do with vcvarsall.
2024-04-04 11:14:44 -07:00
Alex Richert
3fed320013 Add MPI and arch bugfixes to SCOTCH (#39264)
* Add MPI and arch bugfixes to SCOTCH

* Update scotch/package.py
2024-04-04 12:48:39 -05:00
Chris Marsh
1aa77e695d Trilinos: add threadsafe variant (#43480)
* Fixes #43454 by exposing a threadsafe variant

* [@spackbot] updating style on behalf of Chrismarsh

* fix style

---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-04-04 10:00:53 -07:00
Alex Richert
3a0efeecf1 add g2c@1.9.0 (#43482) 2024-04-04 09:56:19 -07:00
Alex Richert
5ffb5657c9 update g2tmpl recipe (#43483) 2024-04-04 09:56:09 -07:00
Alex Richert
2b3e7fd10a Add shared variant for fftw to allow static-only builds (#37897)
Co-authored-by: alexrichert <alexrichert@gmail.com>
2024-04-04 03:47:46 -06:00
James Smillie
cb315e18f0 py-pip package: enable install on Windows (#43203)
The installation mechanism used on Linux to install py-pip (using pip
from the downloaded wheel to install the wheel) does not work on Windows.

This updates the installation of py-pip on Windows to download and
use a zipapp of a specific pip version in order to install the wheel
pip version that is requested.
2024-04-03 23:03:56 -06:00
Jonas Eschle
10c637aca0 py-zfit: add v0.18.* (#43200)
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-04-04 02:55:00 +02:00
Greg Becker
fb4e1cad45 remove dpcpp compiler and package (#43418)
`dpcpp` is deprecated by intel and has been superseded by `oneapi` compilers for a very long time.

---------

Co-authored-by: becker33 <becker33@users.noreply.github.com>
2024-04-03 15:34:23 -07:00
Adam J. Stewart
3054b71e2e py-rarfile: add v4.2 (#43477) 2024-04-03 15:08:36 -07:00
downloadico
47163f7435 py-cig-pythia: add missing py-setuptools dependency (#43473) 2024-04-03 15:02:32 -07:00
Dom Heinzeller
e322a8382f py-torch: Add variant 'internal-protobuf' to build with the internal protobuf (#43056) 2024-04-03 15:57:54 -06:00
Greg Sjaardema
53fb4795ca Seacas exodusii 04 2024 (#43468)
* SEACAS: Update package.py to handle new SEACAS project name
  The base project name for the SEACAS project has changed from
  "SEACASProj" to "SEACAS" as of @2022-10-14, so the package
  needed to be updated to use the new project name when needed.

  The refactor also changes several:
      "-DSome_CMAKE_Option:BOOL=ON"
  to
     define("Some_CMAKE_Option", True)
* SEACAS, EXODUSII: New version; deprecate older versions; better variant descriptions
* [@spackbot] updating style on behalf of gsjaardema
* Fix long lines reported by flake8

---------

Co-authored-by: gsjaardema <gsjaardema@users.noreply.github.com>
2024-04-03 15:46:57 -06:00
eugeneswalker
4517c7fa9b ginkgo@1.7 %oneapi@2024.1: icpx 2024.1 no longer accepts sycl::ext::intel::ctz (#43476) 2024-04-03 15:46:42 -06:00
Wouter Deconinck
efaed17f91 root: new version 6.30.06 (#43472) 2024-04-03 15:41:41 -06:00
Thomas Madlener
2c17cd365d Make it possible to build whizard from a git checkout (#43447) 2024-04-03 21:55:57 +02:00
psakievich
dfe537f688 Convert curl env mod method to a side effect (#43474) 2024-04-03 12:02:48 -07:00
G-Ragghianti
be0002b460 slate: Removing scalapack as test dependency, adding python (#43452)
* removing scalapack as test dependency, adding python
* fixing style
* style
2024-04-03 11:20:03 -07:00
Daryl W. Grunau
743ee5f3de eospac: expose versions 6.5.8 and 6.5.9 (#43450)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2024-04-03 11:17:15 -07:00
Cameron Smith
b6caf0156f simmetrix-simmodsuite: support RHEL8, fix module paths (#43455) 2024-04-03 11:07:43 -07:00
Christoph Junghans
ec00ffc244 byfl: fix llvm dep (#43460) 2024-04-03 10:39:06 -07:00
G-Ragghianti
f020256b9f magma add version 2.8.0 (#43417)
* Add release 2.8.0
* Changing C compiler to hipcc
* Final release version
* Adding new cmake definition for rocm support
* Enabling rocm version support
* Update sha256
* Updating website URL
* Removing unnecessary C compiler spec
* Adding rocm-core dependency
* fixing rocm-core version
* fixing rocm-core version
* fixing style
* bugfix
2024-04-03 11:25:46 -06:00
Peter Brady
04377e39e0 New package: Parthenon (#43426) 2024-04-03 10:05:02 -07:00
Vanessasaurus
ba2703fea6 flux-core: add v0.61.0 (#43465)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-03 12:07:32 +02:00
Adrien Bernede
92b1c8f763 RADIUSS packages update (Starting over #39613) (#41375) 2024-04-02 15:03:07 -07:00
Adam J. Stewart
2b29ecd9b6 py-pillow: add v10+ (#43441) 2024-04-02 14:46:21 -07:00
Adam J. Stewart
5b43bf1b58 py-scikit-image: add v0.21 and v0.22 (#43440)
* py-scikit-image: add v0.21 and v0.22
* Add maintainer and license
* Style fix
2024-04-02 14:41:29 -07:00
Juan Miguel Carceller
37d9770e02 gdb: add a dependency on pkgconfig (#43439)
* gdb: add a dependency on pkgconfig
* Apply dependency for 13.1 and onwards

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-04-02 14:39:03 -07:00
Brad Geltz
0e016ba6f5 geopm-runtime: New package (#42737)
* Add systemd

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* gobject-introspection: Correct glib versions

- The meson.build requirement that the glib version
  is >= the gobject-introspection version is not in place
  until v1.76.1.
- Prior to that, the requirement was glib >= 2.58.0.
- Bug introduced in acbf0d99c4, PR #42222.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* util-linux: add v2.39.3

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* py-natsort: add new versions

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* geopm-service: default systemd support to true

- Make the dependency sticky to force a failure
  if systemd compilation fails, or force
  the user to disable the option.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* geopm-service: Add initial multi-architecture support

- Restrict arch conflicts to 3.0.1
- Disable cpuid at configure time on non-x86_64 platforms.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* geopm-service: update docstrings

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add py-geopmdpy

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add geopm-runtime recipe

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

---------

Signed-off-by: Brad Geltz <brad.geltz@intel.com>
2024-04-02 17:27:36 +02:00
psakievich
7afa949da1 Add handling of custom ssl certs in urllib ops (#42953)
This PR allows the user to specify a path to a custom cert file (or directory) in
Spack's config:

```yaml
  # This is where custom certs for proxy/firewall are stored.
  # It can be a path or environment variable. To match ssl env configuration
  # the default is the environment variable SSL_CERT_FILE
  ssl_certs: $SSL_CERT_FILE
```

`config:ssl_certs` can be a path to a file or a directory, or it can be an environment
variable that resolves to one of those. When it points to something valid, Spack will
update the ssl context to include custom certs, and fetching via `urllib` and `curl`
will trust the provided certs.

This should resolve many issues with fetching behind corporate firewalls.
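
For example, one way to set this (a sketch; the certificate path is hypothetical):

```bash
# point Spack at a corporate CA bundle
spack config add "config:ssl_certs:/etc/pki/tls/certs/corporate-ca.pem"

# or keep the default and rely on the standard environment variable
export SSL_CERT_FILE=/etc/pki/tls/certs/corporate-ca.pem
```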


---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>
Co-authored-by: Alec Scott <alec@bcs.sh>
2024-04-01 11:11:13 -07:00
Jose E. Roman
b81d7d0aac slepc, py-slepc4py, petsc, py-petsc4py add v3.21.0 (#43435)
* New release SLEPc 3.21

* petsc, py-petsc4py  add v3.21.0

* [@spackbot] updating style on behalf of joseeroman

---------

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
Co-authored-by: joseeroman <joseeroman@users.noreply.github.com>
2024-03-31 08:57:44 -07:00
Peter Scheibel
e78484f501 Concretize when_possible: add failure detection and explicit message (#43202)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-31 14:02:09 +02:00
Adam J. Stewart
6fd43b4e75 PyTorch: update ecosystem (#43423) 2024-03-30 06:47:22 -06:00
G-Ragghianti
14edb55288 SLATE package: fix smoke test (#43425) 2024-03-29 17:17:48 -07:00
Juan Miguel Carceller
f062f1c5b3 ROOT package: add patch for builds with libAfterImage for MacOS (#43428) 2024-03-29 16:48:59 -07:00
eugeneswalker
7756c8f4fc ci devtools manylinux2014: update ci image with compatible gpg (#43421) 2024-03-29 16:12:55 -06:00
eugeneswalker
69c8a9e4ba veloc@1.7: depend on er@0.4: (#43433) 2024-03-29 12:54:27 -07:00
Todd Gamblin
47c0736952 xz: add comment to avoid 5.6 pending CVE resolution (#43432)
XZ is compromised; add a note for maintainers to avoid updating until we
have a release without the CVE.
2024-03-29 12:03:13 -06:00
kwryankrattiger
8b89287084 CI Reproducer on Metal (#43411)
* MacOS image remove requires override syntax

* Metal reproducer auto start and cross-platform
2024-03-29 12:32:54 -05:00
Julien Cortial
8bd6283b52 med: add v4.1.1, and v5.0.0, update package recipe (#43409) 2024-03-29 18:27:33 +01:00
Peter Scheibel
179e4f3ad1 Don't delete "spack develop" build artifacts after install (#43424)
After #41373, where we stopped considering the source directory to be the stage for develop builds,
we resumed *deleting* the stage even after a successful build.

We don't want this for develop builds because developers need to iterate; we should keep the artifacts
unless they explicitly run `spack clean`.  

Now:
- [x] Build artifacts for develop packages are not removed after a successful install
- [x] They are also not removed before an install starts, i.e. develop packages always 
      reuse prior artifacts, if available.
- [x] They can be deleted in any other context, e.g. by running  `spack clean --stage`
2024-03-29 09:36:31 -07:00
kwryankrattiger
e97787691b force oneapi compiler unless specified otherwise (#43419) 2024-03-29 09:20:26 -07:00
kjrstory
5932ee901c openradioss: add DEXEC_NAME to cmake variable and change style of cmake_args (#43365) 2024-03-29 10:45:54 +01:00
afzpatel
3bdebeba3c UCX: Add patch to set HIP_PLATFORM_AMD (#43403) 2024-03-29 10:41:45 +01:00
Massimiliano Culpo
d390ee1902 spack load: remove --only argument (#42120)
The argument was deprecated in v0.21 and slated
for removal in v0.22.
2024-03-29 10:19:10 +01:00
AMD Toolchain Support
4f9fe6f9bf Adding 'logging' and 'tracing' variants to enable AOCL DTL trace and logging capabilities (#43414) 2024-03-29 00:58:31 -06:00
Richard Berger
df6d6d9b5c flecsi: depend on legion@24.03.0: moving forward (#43410)
Prior FleCSI releases relied on commits on the control-replication branch (cr)
of Legion. That branch has now been merged into master and is part of the
24.03.0 Legion release.
2024-03-29 00:53:36 -06:00
Luc Berger
e57d33b29f Trilinos: update SuperLU dependency (#43346) 2024-03-29 00:48:27 -06:00
Wouter Deconinck
85c6d6dbab (py-)onnx: new version 1.14.{0,1}, 1.15.0 (#41877)
* (py-)onnx: new version 1.14.{0,1}, 1.15.0

Notes on `onnx`:
- The C++ standard was changed to 14 in 1.15, so no more filter_file is needed. (The C++ standard has since changed to 17 in master.)

Notes on `py-onnx`:
- `py-pybind11` was an unlisted requirement in CMakeLists.txt since 1.3 or so (before earliest spack package).

* py-onnx: depends_on pybind11 with type link, not run

* py-onnx: depends_on py-setuptools@64:
2024-03-29 00:33:27 -06:00
Kyle Knoepfel
5f9228746e Add ability to rename environments (#43296) 2024-03-28 15:15:04 -06:00
Adam J. Stewart
9f2451ddff py-jaxlib: ppc64le support has been fixed (#43422) 2024-03-28 21:15:17 +01:00
Tristan Carel
a05eb11b7b steps: add version 5.0.1 (#43360) 2024-03-28 17:43:10 +01:00
kwryankrattiger
ae2d0ff1cd CI: fail the rebuild command if buildcache push failed (#40045) 2024-03-28 17:02:41 +01:00
Greg Becker
7e906ced75 spack find: add options for local/upstream only (#42999)
Users requested an option to filter between local/upstream results in `spack find` output.

```
# default behavior, same as without --install-tree argument
$ spack find --install-tree all

# show only local results
$ spack find --install-tree local  

# show results from all upstreams
$ spack find --install-tree upstream 

# show results from a particular upstream or the local install_tree
$ spack find --install-tree /path/to/install/tree/root
```

---------

Co-authored-by: becker33 <becker33@users.noreply.github.com>
2024-03-28 10:00:55 -05:00
Andrey Perestoronin
647e89f6bc intel-oneapi 2024.1.0: added new version to packages (#43375)
* added new package versions

* add itac and inspector

* Remove ITAC and Inspector

* Set older version for MKL as a workaround to pass CI issue
2024-03-28 08:09:37 -06:00
Anderson Chauphan
3239c29fb0 trilinos: add v15.1.1 (#42996) 2024-03-28 10:08:28 -04:00
Martin Aumüller
abced0e87d openscenegraph: patch for compatibility with ffmpeg@5: (#43051)
Co-authored-by: aumuell <aumuell@users.noreply.github.com>
2024-03-28 11:21:47 +01:00
liam-o-marsh
300fc2ee42 scipy: register conflict with too-recent openblas (#43301) 2024-03-28 11:03:45 +01:00
Christopher Christofi
13c4258e54 py-folium: add new package with version 0.16.0 (#43352) 2024-03-28 10:50:37 +01:00
Sergey Kosukhin
f29cb7f953 zlib-ng: %nvhpc: Fix build with %nvhpc (similar to zlib) (#43095)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-03-28 10:41:10 +01:00
Ryan Honeyager
826b8f25c5 hdf5: fixes floating point exceptions generated when fpe trapping is enabled (#42880) 2024-03-28 10:34:10 +01:00
Cody Balos
ebaeea7820 sundials: add new version (#43008)
* sundials: add new version
* note previous default
* update when clause for removed options

---------

Co-authored-by: David J. Gardner <gardner48@llnl.gov>
2024-03-28 10:12:49 +01:00
Kensuke WATANABE
f76eb993aa pixman: avoid assembler macros error with Fujitsu compiler (#43362) 2024-03-28 10:10:59 +01:00
Tristan Carel
0b2c370a83 py-pytest-mpi: new package starting at 0.6 (#43368) 2024-03-28 10:01:01 +01:00
Tristan Carel
6a9ee480bf py-vascpy: new package starting at 0.1.1 (#43370) 2024-03-28 09:58:43 +01:00
potter-s
cc80d52b62 aws-cli-2: restrict supported Python versions to @:3.11 (#43390)
Co-authored-by: Simon Potter <sp39@sanger.ac.uk>
2024-03-28 08:39:51 +01:00
Paul R. C. Kent
b9c7d3b89b llvm: add 18.1.2 (#43401) 2024-03-28 08:24:05 +01:00
AMD Toolchain Support
c1be6a5483 QE: backport 7.3.1 elpa build fix to 7.3 (#43394)
Co-authored-by: Ning Li <ning.li@amd.com>
2024-03-28 08:12:23 +01:00
Juan Miguel Carceller
42550208c3 gaudi: add version 38 and a gaudialg variant (#42856)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-28 08:10:00 +01:00
Ken Raffenetti
be231face6 mpich: fixup removal of pmi=off option (#43377) 2024-03-28 08:08:37 +01:00
Chris White
89ac747a76 add lua 5.4.6 (#43407) 2024-03-27 22:37:29 -06:00
Richard Berger
5d8f36d667 Legion: add Rust-based profiler to install (#43408)
* legion: add missing license
* legion: add rust-based profiler to install
2024-03-27 22:25:54 -06:00
Sreenivasa Murthy Kolam
6c3fed351f Fix build failure when run-tests is enabled for rocfft (#42424)
* initial commit to fix the error when run-tests is enabled for rocfft

* apply patch from 5.7.0 onwards

* add description for the patch that is added
2024-03-27 21:09:29 -05:00
afzpatel
b9cbd15674 hipcc: new package for HIP (#43245)
* initial commit to add hipcc package
* restore setup_dependent_package
2024-03-27 21:05:50 -05:00
Alex Richert
b8f633246a Add aocc support to qt (#43400) 2024-03-27 17:23:45 -06:00
Massimiliano Culpo
a2f3e98ab9 pinentry: add v1.3.0 (#43395) 2024-03-27 22:28:34 +01:00
Paul R. C. Kent
acffe37313 libffi: add 3.4.6 and 3.4.5 (#43404) 2024-03-27 22:27:19 +01:00
Greg Sjaardema
249e5415e8 Update exodusii package (#43379)
* SEACAS: Update package.py to handle new SEACAS project name
  The base project name for the SEACAS project has changed from
  "SEACASProj" to "SEACAS" as of @2022-10-14, so the package
  needed to be updated to use the new project name when needed.
  The refactor also changes several:
      "-DSome_CMAKE_Option:BOOL=ON"
  to
     define("Some_CMAKE_Option", True)
* exodusii -- refactor and bring up-to-date
* Add missed patch file
* [@spackbot] updating style on behalf of gsjaardema
* Apply seacas windows patch here also.
* Update url so old checksums valid; redo new checksums

---------

Co-authored-by: gsjaardema <gsjaardema@users.noreply.github.com>
2024-03-27 14:14:49 -07:00
Elliott Slaughter
e2a942d07e legion: Add 24.03.0, update HIP dependency (#43398)
* legion: Add 24.03.0, update HIP dependency.
* legion: Remove CUDA upper bound, update HIP bounds, use f-strings.
* legion: Fix format.
* legion: Fix format again.
2024-03-27 14:38:28 -06:00
afzpatel
32deca2a4c initial commit to add check function to rocthrust (#43405) 2024-03-27 13:06:55 -07:00
Yanfei Guo
e4c64865f1 argobots: update to v1.2 (#43381)
* argobots: update to v1.2
* Replace prior maintainer with current one.

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-03-27 13:58:37 -06:00
Adam J. Stewart
1175f37577 py-kornia: add v0.7.2 (#43184)
* py-kornia: add v0.7.2
* Add maintainers to recipe
* https not supported
* Debug build failure
* More detailed debug info
* Try rust+dev
* Fix pyo3 build
* Fix turbojpeg-sys build
* Fix dlpack-rs build
* Get rid of debug env vars
2024-03-27 10:58:06 -07:00
James Edgeley
faa183331f Update Nektar++ package (#43397)
* Update Nektar++ package
* shorten line
* correct to 5.5.0
* use cmake helpers
* style

---------

Co-authored-by: JamesEdgeley <JamesEdgeley@users.noreply.github.com>
2024-03-27 11:55:14 -06:00
afzpatel
bbac33871c bump rocprofiler-dev to 6.0 and add aqlprofile (#42459)
* Initial commit to bump rocprofiler-dev to 6.0 and add aqlprofile recipe
* bump to 6.0.2 and extracting binaries from deb pkg
* fixes for hpctoolkit build errors
* add yum and zyp aqlprofile packages
* fix style issues
2024-03-27 10:36:10 -07:00
afzpatel
6d4dd33c46 Enable ASAN in ROCm packages (#42704)
* Initial commit to enable ASAN
* fix styling
* fix styling
* add asan option for hip-tensor and roctracer-dev
2024-03-27 09:40:21 -07:00
Adrien Cotte
579bad05a8 Add py-modules-gui (mogui) (#43396) 2024-03-27 09:35:17 -07:00
psakievich
27a8eb0f68 Add config option and compiler support to reuse across OS's (#42693)
* Allow compilers to function across compatible OS's
* Add documentation in the default yaml

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
2024-03-27 15:39:07 +00:00
Tristan Carel
4cd993070f py-libsonata: new package starting at 0.1.25 (#43372)
* py-libsonata: new package starting at 0.1.25
* align spack version with the git branch
* Fix min required versions / use spack packages instead of submodules
2024-03-27 08:20:13 -07:00
renjithravindrankannath
4c55c6a268 rvs binary path updated for 6.0 (#43359) 2024-03-27 07:30:30 -07:00
Wouter Deconinck
a4a27fb1e4 root: new version 6.30.04 (#43378)
New version of ROOT, no changes to build system that aren't already accounted for (e.g. require http for webgui), https://github.com/root-project/root/compare/v6-30-02...v6-30-04
2024-03-27 07:28:34 -07:00
Massimiliano Culpo
66345e7185 Improve fixup macos rpath unit test (#43392)
Starting from Xcode version 15, the linker ignores
duplicate rpaths, so the libraries don't need fixing
in those cases.
2024-03-27 12:01:12 +01:00
dependabot[bot]
8f76f1b0d8 build(deps): bump actions/setup-python from 5.0.0 to 5.1.0 (#43384) 2024-03-27 10:49:45 +01:00
dependabot[bot]
4cab6f3af5 build(deps): bump codecov/codecov-action from 4.1.0 to 4.1.1 (#43385) 2024-03-27 10:49:33 +01:00
Harmen Stoppels
0d4665583b ci: inherit secrets in local workflows (#43391) 2024-03-27 09:48:14 +01:00
Harmen Stoppels
5d0ef9e4f4 ci: remove outdated version comments (#43389) 2024-03-27 09:26:12 +01:00
Harmen Stoppels
e145baf619 codecov: verbose: true (#43388) 2024-03-27 09:15:34 +01:00
Wouter Deconinck
6c912b30a2 Codespaces support for rapid PR evaluation (#41901)
* Create devcontainer.json

* Ensure codespace can be setup for current branch

* fix: find compilers in site scope

* fix: use cloud_pipelines ubuntu20.04 image

* fix: spack config --scope site add

* fix: use develop, not develop-root mirror
2024-03-26 21:13:32 -05:00
Jordan Ogas
f4da453f6b Charliecloud package: add 0.36 and 0.37; update dependencies. (#42590)
This adds a dependency on pkg-config which in turn builds pkg-config
on pipelines using %oneapi/%cce: update the pkg-config build to disable
specific warnings-as-errors from these compilers.

Co-authored-by: Reid Priedhorsky <1682574+reidpr@users.noreply.github.com>
2024-03-26 16:13:29 -07:00
Harmen Stoppels
7e9caed8c2 ci: fix codecov upload (#43382) 2024-03-26 22:49:26 +01:00
Matthew Thompson
69509a6d9a pflogger: add version 1.14.0 (#43373) 2024-03-26 14:08:24 -06:00
Massimiliano Culpo
0841050d20 Add macos-14 as a runner (Apple M1) (#42728)
* Add macos-14 as a runner (Apple M1)

* Mark a test xfail

We need to check later if this test needs modifications
on Apple Silicon chips.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: alalazo <alalazo@users.noreply.github.com>
2024-03-26 12:36:21 -07:00
Cody Balos
321ffd732b axom package: add tests (#43312) 2024-03-26 11:21:28 -07:00
Simon Frasch
22922323e3 spfft package: add version 1.1.0 (#43318) 2024-03-26 10:19:06 -07:00
eugeneswalker
0b5b192c18 glvis: fix spack issue #42839 (#43369)
* glvis: fix spack issue #42839

* add issue link
2024-03-26 11:13:54 -06:00
Andrey Alekseenko
1275c57d88 gromacs: add new versions 2024, 2024.1; fix license (#43366) 2024-03-26 10:19:58 -06:00
Juan Miguel Carceller
29a39ac6a0 xrootd: add a dependency on pkgconfig when building with +davix (#43339)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-26 09:01:47 -07:00
kwryankrattiger
ae9c86a930 buildcache sync: manifest-glob with arbitrary destination (#41284)
* buildcache sync: manifest-glob with arbitrary destination

The current implementation of --manifest-glob is a bit restrictive,
requiring the destination to be known by the generation stage of CI.
This allows specifying an arbitrary destination mirror URL.

* Add unit test for buildcache sync with manifest

* Fix test and arguments for manifest-glob with override destination

* Add testing path for unused mirror argument
2024-03-26 08:47:45 -07:00
Massimiliano Culpo
83199a981d Allow unit test to work on Apple M1/M2 (#43363)
* Remove a few compilers from static test data

These compilers were used only in a bunch of tests, so
they are added only there.

* Remove clang@3.3 from unit test configuration

* Parametrize compilers.yaml

* Remove specially named gcc from static data

The compilers are used in two tests

* Remove apple-clang and macOS compilers from static data

The compiler was used only in multimethod tests

* Remove clang@3.5 (compiler seems to be unused)

* Remove gcc@4.4.0 (compiler seems to be unused)

* Exclude x86_64 tests on other architectures

* Mark two tests as for clingo only

* Update version syntax in compilers.yaml

* Parametrize tcl tests on architectures

* Parametrize lmod tests on architectures

* Substitute gcc@4.5.0 with gcc@4.8.0 so it can be used on aarch64

* Fix a few issues with aarch64 and unit-tests
2024-03-26 16:20:42 +01:00
eugeneswalker
ed40c3210e ci: add developer-tools-manylinux2014 stack (#43128)
* ci: add developer-tools-manylinux2014 stack

* add libtree, patchelf
2024-03-26 08:02:16 -07:00
Christopher Christofi
be96460ab2 py-nibabel: add new version 5.2.1 (#43335) 2024-03-26 12:46:48 +01:00
pauleonix
95caf55fe7 cuda: add 12.4.0, 12.3.2, 12.3.1 and 12.2.2 (#42748)
Add conflicts to ginkgo and petsc to avoid build failures with cuda@12.4

Co-authored-by: pauleonix <pauleonix@users.noreply.github.com>
2024-03-26 07:10:48 +01:00
Mark W. Krentel
960af24270 hpctoolkit: add version 2024.01.1 (#43353)
Add version 2024.01.1. Adjust the dependencies for develop, which no
longer uses libmonitor.
2024-03-25 17:57:45 -06:00
Christopher Christofi
899bef2aa8 py-nilearn: add new version (#43332)
* py-nilearn: add new version
* add maintainers and license
2024-03-25 16:45:13 -07:00
Adam J. Stewart
f0f092d9f1 py-keras: add v3.1.1 (#43283) 2024-03-26 00:38:01 +01:00
Christopher Christofi
6eaac2270d py-trx-python: add new package with version 0.2.9 (#43333) 2024-03-25 16:11:28 -07:00
John W. Parent
a9f3f6c007 seacas: fix linking on Windows (#43356) 2024-03-25 16:28:03 -06:00
Rocco Meli
08a04ebd46 spglib: add version 2.3.1 (#43345) 2024-03-25 14:27:32 -06:00
Matthew Thompson
d8e642ecb7 Update gftl, pflogger to match GFE 1.14 (#43291) 2024-03-25 13:22:50 -07:00
Christopher Christofi
669ed69d8e py-mne: add new version (#43334) 2024-03-25 13:21:07 -07:00
potter-s
7ebb21a0da Updated version (#43343)
Co-authored-by: Simon Potter <sp39@sanger.ac.uk>
2024-03-25 13:12:58 -07:00
Alec Scott
93ffa9ba5d go: add v1.22.1 (#43337) 2024-03-25 09:32:58 -07:00
liam-o-marsh
e5fdb90496 pyscf: new dependency bounds for pyscf, new version (#43300) 2024-03-25 10:24:01 -05:00
Danny McClanahan
303a0b3653 add command_line scope to help metavar (#42890)
It's now possible to add config on the command line with `spack -c <CONFIG_VARS> ...`, but the new `command_line` scope isn't reflected in the help output for `--scope`:

```bash
> spack help config
...
  --scope {defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT
                        configuration scope to read/modify
...
```
2024-03-25 07:13:43 -07:00
Juan Miguel Carceller
9f07544bde r-rcpp: add version 1.0.11 and 1.0.12 (#43330) 2024-03-25 13:53:12 +01:00
Harmen Stoppels
9b046a39a8 strip url: fix whl suffix, remove exe (#43344) 2024-03-25 13:46:09 +01:00
Massimiliano Culpo
0c9a53ba3a Add intel-oneapi-runtime, allow injecting virtual dependencies (#42062)
This PR adds:
- A new runtime for `%oneapi` compilers, called `intel-oneapi-runtime`
- Information to both `gcc-runtime`  and `intel-oneapi-runtime`, to ensure
  that we don't mix compilers using different soname for either `libgfortran`
  or `libifcore`

To do so, the following internal mechanisms have been implemented:
- Possibility to inject virtual dependencies from the `runtime_constraints`
  callback on packages

Information has been added to `gcc-runtime` to provide the correct soname
under different conditions on its `%gcc`.

Rules injected into the solver look like:

```prolog
% Add a dependency on 'gfortran@5' for nodes compiled with gcc@=13.2.0 and using the 'fortran' language
attr("dependency_holds", node(ID, Package), "gfortran", "link") :-
  attr("node", node(ID, Package)),
  attr("node_compiler", node(ID, Package), "gcc"),
  attr("node_compiler_version", node(ID, Package), "gcc", "13.2.0"),
  not external(node(ID, Package)),
  not runtime(Package),
  attr("language", node(ID, Package), "fortran").

attr("virtual_node", node(RuntimeID, "gfortran")) :-
  attr("depends_on", node(ID, Package), ProviderNode, "link"),
  provider(ProviderNode, node(RuntimeID, "gfortran")),
  attr("node", node(ID, Package)),
  attr("node_compiler", node(ID, Package), "gcc"),
  attr("node_compiler_version", node(ID, Package), "gcc", "13.2.0"),
  not external(node(ID, Package)),
  not runtime(Package),
  attr("language", node(ID, Package), "fortran").

attr("node_version_satisfies", node(RuntimeID, "gfortran"), "5") :-
  attr("depends_on", node(ID, Package), ProviderNode, "link"),
  provider(ProviderNode, node(RuntimeID, "gfortran")),
  attr("node", node(ID, Package)),
  attr("node_compiler", node(ID, Package), "gcc"),
  attr("node_compiler_version", node(ID, Package), "gcc", "13.2.0"),
  not external(node(ID, Package)),
  not runtime(Package),
  attr("language", node(ID, Package), "fortran").
```
2024-03-24 22:59:21 -07:00
Loic Hausammann
1fd4353289 openmpi: Add new variant: romio-filesystem=string (#43265)
Co-authored-by: loikki <loic.hausammann@id.ethz.ch>
2024-03-24 01:00:38 +01:00
potter-s
fcb8ed6409 py-plotly: Add versions up to 5.20 (#43284) 2024-03-24 00:55:42 +01:00
Marie Houillon
2f11862832 openCARP: Add v15.0 packages (#43299)
Co-authored-by: openCARP consortium <info@opencarp.org>
2024-03-24 00:35:32 +01:00
Gavin John
bff11ce8e7 py-kaleido: Add MacOS build, fix checksums (#43309)
* py-kaleido: Fix completely borked package.py

* Readd stuff I forgot to add

* And one last missing thing

* Remove python restriction

* [@spackbot] updating style on behalf of Pandapip1

* Add MacOS build

* Fix checksum

* Handle all supported OSes

* Split imports

* Remove extra version stuff
2024-03-24 00:31:38 +01:00
fgava90
218693431c paraview: fix range of exodusII-netcdf4.9.0.patch (#42926)
Co-authored-by: Gava, Francesco <francesco.gava@mclaren.com>
2024-03-23 20:33:20 +01:00
Alex Richert
e036cd9ef6 zlib-ng: New variants: +shared and +pic (#42796) 2024-03-23 19:32:42 +01:00
Sinan
cd5bef6780 py-line-profiler: Add 4.1.2 and 3.5.1 with their deps (#43156)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-03-23 11:16:48 -06:00
Davide
159e9a20d1 suite-sparse: Add version 7.3.1 (#43328) 2024-03-23 10:00:28 -06:00
Auriane R
99bb288db7 py-torch-nvidia-apex: @3.11: Add config_settings(PEP517), add missing py-packaging (#43306) 2024-03-23 16:04:10 +01:00
Chris White
99744a766b BLT: add new version 0.6.2 (#43257) 2024-03-23 15:25:29 +01:00
Juan Miguel Carceller
ddd8be51a0 re2: use the same C++ std used by abseil-cpp (#43288)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-23 14:48:38 +01:00
吴坎
bba66b1063 py-vl-convert-python: Add 1.3.0 (#43297) 2024-03-23 14:44:23 +01:00
Howard Pritchard
1c3c21d9c7 mpich: add variant +xpmem to specify use of xpmem (#43293)
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2024-03-23 14:36:06 +01:00
Christopher Christofi
cbe9b3d01c bgen: add new package with version 1.1.7 (#43327) 2024-03-23 13:49:41 +01:00
Richard Berger
0abf5ba43c hip: don't set HIP_PATH in ROCm 5.5+ (#42882) 2024-03-23 13:22:06 +01:00
Jonas Eschle
9ab3c1332b py-tensorflow-probability: Re-activate +py-jax variant for @0.20:, Add 0.23.0 (#43249) 2024-03-23 13:15:26 +01:00
Christopher Christofi
b6425da50f New package: qctool (#43326) 2024-03-23 05:46:03 -06:00
YI Zeping
937a4dbf69 NWChem package: expand fftw patch applicability range to 7.2.2 (#43321)
* expand fftw patch applicability range to include 7.2.2
* fix formatting bug
2024-03-23 05:41:34 -06:00
Gavin John
cd779ee54d octave package: correction to jdk variant description (#43325) 2024-03-23 05:41:11 -06:00
Gavin John
7ddcb13325 openjdk package: use correct homepage (#43324) 2024-03-23 05:40:58 -06:00
Gavin John
7666046ce3 icedtea: change jdk dependency to java (#43323) 2024-03-23 05:40:43 -06:00
Miguel Dias Costa
8e89e61402 BerkeleyGW package: add version 4.0 (#43316) 2024-03-23 04:02:44 -06:00
Auriane R
d0dbfaa5d6 aws-ofi-nccl package: add versions including 1.8.1 (#43305)
The default url couldn't be the one with v0.0.0-aws, since Spack was
replacing v0.0.0-aws with v<version_number>, thereby deleting the
-aws suffix. I used the url_for_version method to preserve this suffix.
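
A sketch of the approach (the URL pattern shown is illustrative, not necessarily
the exact one used in the recipe):

```python
def url_for_version(self, version):
    # Hypothetical: re-append the "-aws" suffix that Spack's default URL
    # substitution would otherwise drop from the tag name.
    return f"https://github.com/aws/aws-ofi-nccl/archive/refs/tags/v{version}-aws.tar.gz"
```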
2024-03-22 17:49:00 -07:00
Gavin John
26f562b5a7 New package: py-kneaddata (#43310) 2024-03-22 17:20:52 -07:00
Chris Marsh
2967804da1 netcdf-cxx4: fix reference error (#43322) 2024-03-22 16:05:17 -07:00
Harmen Stoppels
c3eaf4d6cf Support for prereleases (#43140)
This adds support for prereleases. Alpha, beta and release candidate
suffixes are ordered in the intuitive way:

```
1.2.0-alpha < 1.2.0-alpha.1 < 1.2.0-beta.2 < 1.2.0-rc.3 < 1.2.0 < 1.2.0-xyz
```

Alpha, beta and rc prereleases are defined as follows: split the version
string into components like before (on delimiters and string boundaries).
If there's a string component `alpha`, `beta` or `rc` followed by an optional
numeric component at the end, then the version is prerelease.

So `1.2.0-alpha.1 == 1.2.0alpha1 == 1.2.0.alpha1` are all the same, as usual.

The strings `alpha`, `beta` and `rc` are chosen because they match semver,
they are sufficiently long to be unambiguous, and all contain at least
one non-hex character to distinguish them from shasum/digest type suffixes.

The comparison key is now stored as `(release_tuple, prerelease_tuple)`, so in
the above example:

```
((1,2,0),(ALPHA,)) < ((1,2,0),(ALPHA,1)) < ((1,2,0),(BETA,2)) < ((1,2,0),(RC,3)) < ((1,2,0),(FINAL,)) < ((1,2,0,"xyz"), (FINAL,))
```

The version ranges `@1.2.0:` and `@:1.1` do *not* include prereleases of
`1.2.0`.

So for packaging, if the `1.2.0alpha` and `1.2.0` versions have the same constraints on
dependencies, it's best to write

```python
depends_on("x@1:", when="@1.2.0alpha:")
```

However, `@1.2:` does include `1.2.0alpha`. This is because Spack considers
`1.2 < 1.2.0` as distinct versions, with `1.2 < 1.2.0alpha < 1.2.0` as a consequence.

Alternatively, the above `depends_on` statement can thus be written

```python
depends_on("x@1:", when="@1.2:")
```

which can be useful too. It is a short-hand to include prereleases, but you
can still be explicit and exclude the prerelease by specifying the patch version
number.

### Concretization

Concretization uses a different version order than `<`. Prereleases are ordered
between final releases and develop versions. That way, users should not
have to set `preferred=True` on every final release if they add just one
prerelease to a package. The concretizer is unlikely to pick a prerelease when
final releases are possible.

### Limitations

1. You can't express a range that includes all alpha releases but excludes all beta
   releases. The only alternative is good old repeated nines: `@:1.2.0alpha99`.

2. The Python ecosystem defaults to `a`, `b`, `rc` strings, so translation of Python versions to
   Spack versions requires expansion to `alpha`, `beta`, `rc`. It's mildly annoying, because
   this means we may need to compute URLs differently (not done in this commit).

### Hash

Care is taken not to break hashes of versions that do not have a prerelease
suffix.
2024-03-22 23:30:32 +01:00
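A small usage sketch, run under `spack python`, assuming the ordering above is exposed through `spack.version.Version`; the comparisons simply restate the examples from this message:

```python
# Sketch: restate the prerelease ordering described above.
from spack.version import Version

assert Version("1.2.0-alpha") < Version("1.2.0-alpha.1") < Version("1.2.0-beta.2")
assert Version("1.2.0-rc.3") < Version("1.2.0") < Version("1.2.0-xyz")
# delimiters are irrelevant: these all denote the same prerelease
assert Version("1.2.0-alpha.1") == Version("1.2.0alpha1") == Version("1.2.0.alpha1")
```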
John W. Parent
397334a4be Spack CI: Refactor process_command for Cross Platform support (#39739)
Generate CI scripts as powershell on Windows. This is intended to
output exactly the same bash scripts as before on Linux.

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2024-03-22 14:06:29 -07:00
Harmen Stoppels
434836be81 python wheels: do not "expand" (#43317) 2024-03-22 16:57:46 +01:00
Thomas-Ulrich
7b9b976f40 openssh: add 9.7p1 and 9.6p1; update workaround for clang (#40857)
Signed-off-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-03-22 08:24:32 -06:00
Mikael Simberg
4746e8a048 apex: Set APEX_WITH_KOKKOS CMake option in apex package (#43243)
* Make sure APEX_WITH_KOKKOS CMake option is set in apex

* Add conflict for apex with ~kokkos
2024-03-22 08:50:29 +01:00
Rocco Meli
69c684fef9 ELPA: enable GPU streams and update deprecated variables (#43145)
* elpa streams

* [@spackbot] updating style on behalf of RMeli

* Apply suggestions from @albestro

* Update var/spack/repos/builtin/packages/elpa/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2024-03-22 08:42:24 +01:00
Valentin Volkl
2314aeb884 py-cmake: only run test suite when run_tests (#43246)
As the cmake build is triggered by scikit-build, the usual Spack option
for enabling tests had no effect and the heavy test suite ran every time.

Used https://github.com/scikit-build/cmake-python-distributions/issues/172#issuecomment-890322263
to figure out how to pass options to the actual `cmake` build.

I also excluded some tests that failed for me on alma9 (gcc 11.4.1),
so the rest of the test suite can be run.
2024-03-22 04:34:43 +01:00
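A hedged sketch of the pattern; the environment variable and CMake flag here are assumptions for illustration, not necessarily the hook the package ended up using:

```python
# Sketch: only run the heavy test suite when `spack install --test=root` is used.
def setup_build_environment(self, env):
    if not self.run_tests:
        # Hypothetical variable/flag: forward options to the cmake invocation
        # that scikit-build performs under the hood.
        env.set("SKBUILD_CONFIGURE_OPTIONS", "-DBUILD_TESTING=OFF")
```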
Martin Aumüller
d33e10a695 ffmpeg: add v6.1.1 and older patch release updates (#43050) 2024-03-22 03:39:53 +01:00
Henning Glawe
7668a0889a ncurses: Add terminfo for rxvt-unicode{,-256color} (#42721)
taken from the debian ncurses source package.
2024-03-22 02:50:24 +01:00
Thomas-Ulrich
d7a74bde9f easi: add v1.3.0, python bindings and master (#42784) 2024-03-22 02:44:20 +01:00
AMD Toolchain Support
fedf8128ae openblas: Add variant dynamic_dispatch: select best kernel at runtime (#42746)
Enable OpenBLAS's built-in CPU capability detection and kernel selection.

This allows run-time selection of the "best" kernels for the running CPU, rather
than what is specified at build time. For example, it allows OpenBLAS to use
AVX512 kernels when running on ZEN4, even when built targeting the "ZEN" architecture.

Co-authored-by: Branden Moore <branden.moore@amd.com>
2024-03-22 02:27:21 +01:00
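A minimal sketch of what the variant could translate to, assuming OpenBLAS's make-based build where `DYNAMIC_ARCH=1` enables runtime kernel dispatch; the helper name is illustrative:

```python
# Sketch of the variant wiring; the real package assembles its make args differently.
variant(
    "dynamic_dispatch",
    default=True,
    description="Enable runtime CPU detection and kernel selection",
)

def _make_defs(self, spec):
    defs = []
    if spec.satisfies("+dynamic_dispatch"):
        defs.append("DYNAMIC_ARCH=1")  # OpenBLAS switch for runtime kernel dispatch
    return defs
```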
Richard Berger
f70af2cc57 libristra: depends_on() for lua: allow newer lua versions (#42810) 2024-03-22 01:53:33 +01:00
Alex Leute
50562e6a0e py-neptune-client and missing deps: new package (#43059)
Co-authored-by: Cecilia Lau <chlits@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
2024-03-22 01:51:13 +01:00
Sergey Kosukhin
4ac51b2127 libiconv: fix building with nvhpc (#43033) 2024-03-22 01:43:30 +01:00
HELICS-bot
81c9e346dc helics: Add version 3.5.1 (#43314)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-03-21 18:08:36 -06:00
Christopher Christofi
73e16a7881 py-mrcfile: add new version (#43125) 2024-03-22 00:00:15 +01:00
Alex Richert
af8868fa47 xv (image viewer): Add missing depends_on(libxt) (#43277) 2024-03-21 23:24:29 +01:00
Rémi Lacroix
cfd4e356f8 x264: Tag a recent commit: 20240314 (#43304)
x264 does not publish releases so tag a newer commit in Spack in order to be able to use an up-to-date version.
2024-03-21 23:20:00 +01:00
Adam J. Stewart
fc87dcad4c py-torchmetrics: add v1.3.2 (#43244) 2024-03-21 23:00:32 +01:00
snehring
65472159c7 dorado: adding version 0.5.3 (#43313) 2024-03-21 22:59:20 +01:00
Stephen Hudson
d1f9d8f06d libEnsemble: add v1.2.2 (#43308) 2024-03-21 22:57:19 +01:00
Rocco Meli
67ac9c46a8 namd: disable parallel build for 3.0b3 (#43215) 2024-03-21 22:26:36 +01:00
Stephen Sachs
aa39465188 Re enable aws pcluster buildcache stack (#38931)
* Changes to re-enable aws-pcluster pipelines

- Use compilers from pre-installed spack store such that compiler path relocation works when downloading from buildcache.
- Install gcc from hash so there is no risk of building gcc from source in the pipeline.
- `packages.yaml` files are now part of the pipelines.
- No more external `postinstall.sh`. The necessary steps are in `setup-pcluster.sh` and will be version controlled within this repo.
- Re-enable pipelines.

* Add  and

* Debugging output & mv skylake -> skylake_avx512

* Explicitly check for packages

* Handle case with no intel compiler

* compatibility when using setup-pcluster.sh on a pre-installed cluster.

* Disable palace as parser cannot read require clause at the moment

* ifort cannot build superlu in buildcache

`ifort` is unable to handle such long file names as used when cmake compiles
test programs inside build cache.

* Fix spack commit for intel compiler installation

* Need to fetch other commits before using them

* fix style

* Add TODO

* Update packages.yaml to not use 'compiler:', 'target:' or 'provider:'

Synchronize with changes in https://github.com/spack/spack-configs/blob/main/AWS/parallelcluster/

* Use Intel compiler from later version (orig commit no longer found)

* Use envsubst to deal with quoted newlines

This is cleaner than the `eval` command used.

* Need to fetch tags for checkout on version number

* Intel compiler needs to be from version that has compatible DB

* Install intel compiler with commit that has DB ver 7

* Decouple the intel compiler installation from current commit

- Use a completely different spack installation such that this current pipeline
commit remains untouched.
- Make the script succeed even if the compiler installation fails (e.g. because
the Database version has been updated)
- Make the install targets fall back to gcc in case the compiler did not install
correctly.

* Use generic target for x86_64_vX

There is no way to provision a skylake/icelake/zen runner. They are all in the
same pools under x86_64_v3 and x86_64_v4.

* Find the intel compiler in the current spack installation

* Remove SPACK_TARGET_ARCH

* Fix virtual package index & use package.yaml for intel compiler

* Use only one stack & pipeline per generic architecture

* Fix yaml format

* Cleanup typos

* Include fix for ifx.cfg to get the right gcc toolchain when linking

* [removeme] Adding timeout to debug hang in make (palace)

* Revert "[removeme] Adding timeout to debug hang in make (palace)"

This reverts commit fee8a01580489a4ea364368459e9353b46d0d7e2.

* palace x86_64_v4 gets stuck when compiling; try newer oneapi

* Update comment

* Use the latest container image

* Update gcc_hashes to match new container

* Use only one tag providing tags per extends call

Also removed an unnecessary tag.

* Move generic setup script out of individual stack

* Cleanup from last commit

* Enable checking signature for packages available on the container

* Remove commented packages / Add comment for palace

* Enable openmpi@5 which needs pmix>3

* don't look for intel compiler on aarch64
2024-03-21 14:45:05 -05:00
downloadico
09810a5e7c py-cig-pythia: add py-cig-pythia package to spack (#43294) 2024-03-21 11:11:14 -07:00
Alec Scott
446c0f2325 Disable interactive editor when --batch if passed to checksum (#43102) 2024-03-21 18:15:09 +01:00
Auriane R
c4ce51c9be Add flash-attn package (#42939) 2024-03-21 07:23:07 -06:00
Harmen Stoppels
1f63a764ac jdk: new versions (#43264) 2024-03-21 12:49:29 +01:00
Alec Scott
384e198304 py-python-lsp-server: add v1.10.0 (#42799)
* py-python-lsp-server: add v1.10.0

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Remove py-wheel from package

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-21 10:36:15 +01:00
Auriane R
2303332415 Update aws-ofi-nccl to use the hwloc option (#43287) 2024-03-21 09:38:40 +01:00
Tom Scogland
0eb1957999 cmd/python: use runpy to allow multiprocessing in scripts (#41789)
Running a `spack-python` script like this:

```python

import spack
import multiprocessing

def echo(args):
    print(args)

if __name__ == "__main__":
    pool = multiprocessing.Pool(2)
    pool.map(echo, range(10))
```

will fail in `develop` with an error like this:

```console
_pickle.PicklingError: Can't pickle <function echo at 0x104865820>: attribute lookup echo on __main__ failed
```

Python expects to be able to look up the method `echo` in `sys.path["__main__"]` in
subprocesses spawned by `multiprocessing`, but because we use `InteractiveConsole` to
run `spack python`, the executed file isn't considered to be the `__main__` module, and
lookups in subprocesses fail. We tried to fake this by setting `__name__` to `__main__`
in the `spack python` command, but that doesn't fix the fact that no `__main__` module
exists.

Another annoyance with `InteractiveConsole` is that `__file__` is not defined in the
main script scope, so you can't use it in your scripts.

We can use the [runpy.run_path()](https://docs.python.org/3/library/runpy.html#runpy.run_path) function,
which has been around since Python 3.2, to fix this.

- [x] Use `runpy` module to launch non-interactive `spack python` invocations
- [x] Only use `InteractiveConsole` for interactive `spack python`
2024-03-21 01:32:28 -07:00
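A rough sketch of the non-interactive path, assuming the script path arrives as the first argument; the function name is illustrative, not the actual `spack python` code:

```python
# Sketch: execute the user script as a real __main__ module so that
# multiprocessing can pickle and re-import its top-level functions.
import runpy
import sys


def run_script(argv):
    sys.argv = argv  # make argv look like a normal "python script.py ..." call
    runpy.run_path(argv[0], run_name="__main__")
```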
Greg Becker
de1f9593c6 cray: return false more readily in detection logic (#43150)
Often in containers, the files we use to detect whether a cray system supports new features are not available.

Given that the cray containers only support the newer versions, and that these versions have been
around for a while at this point and few sites don't support them, this PR changes the logic for
detecting cray systems so that:

1. Don't even consider whether something is the `cray` platform if `opt/cray` is not in `MODULEPATH`
2. Only use the `cray` platform if we can read files in /opt/cray/pe and positively detect an older version
3. Otherwise, assume we're *not* on a cray (includes newer Cray PE's, which we treat as Linux)
2024-03-20 15:43:45 -07:00
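A plain-Python sketch of that decision order; the file path and version cutoff are illustrative, not the actual detection code:

```python
import os


def looks_like_legacy_cray():
    # 1. cheap gate: bail out unless opt/cray shows up in MODULEPATH
    if "opt/cray" not in os.environ.get("MODULEPATH", ""):
        return False
    try:
        # 2. positive detection of an older PE (illustrative path and cutoff)
        with open("/opt/cray/pe/some-release-file") as f:
            return f.read().strip() < "21.10"
    except OSError:
        # 3. unreadable (e.g. inside a container) => treat as plain Linux
        return False
```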
Christopher Christofi
65fa71c1b4 py-pycm: new package (#43251)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Signed-off-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-03-20 19:02:36 +01:00
Martin Lang
9802649716 berkeleygw: update FCPP flags for gcc (#42848)
Compilation with the old flags fails on PowerPC (power8le) due to syntax
errors in the output from the preprocessor. Compilation with the
extended set of flags works both on PowerPC and x86_64.

The correct set of flags was suggested from the berkeleygw developers:
https://groups.google.com/a/berkeleygw.org/g/help/c/ewi3RZgOyeE/m/jSIoe45PAgAJ

Uses spack-provided compiler prefix to call cpp when compiling berkeleygw with gcc.
2024-03-20 18:55:54 +01:00
Alex Leute
8d9d721f07 py-postcactus: new package (#42907)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2024-03-20 18:50:52 +01:00
Greg Becker
ecef72c471 Target.optimization_flags converts non-numeric versions to numeric (#43179) 2024-03-20 09:39:26 -07:00
Robert Cohn
485b6e2170 Update Intel download URLs (#43286) 2024-03-20 12:34:03 -04:00
mvlopri
ba02c6b70f seacas: update the variants and tpls (#43195)
Adds variants to turn off tests
Add variants for some missing TPL options
Add the variables required to build in ~shared

* Add pamgen to Trilinos as a variant to support SEACAS

This adds the ability to turn off and on pamgen as needed
through the variant interface for the Trilinos package.py.
Add changes for seacas package.py to build the appropriate
Trilinos variants.
Add zlib-api as depends_on instead of zlib directly for SEACAS
package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-20 10:14:24 -06:00
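A hedged sketch of the variant plumbing between the two packages; the CMake option name, defaults, and conditions are assumptions:

```python
# trilinos/package.py (sketch)
variant("pamgen", default=False, description="Enable the Pamgen mesh generator")
# in cmake_args(): args.append(self.define_from_variant("Trilinos_ENABLE_Pamgen", "pamgen"))

# seacas/package.py (sketch)
depends_on("trilinos+pamgen")  # pull in only the Trilinos pieces SEACAS needs
depends_on("zlib-api")         # zlib-api instead of depending on zlib directly
```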
Juan Miguel Carceller
7028669d50 fastjet: add version 3.4.2 (#43285)
* fastjet: add version 3.4.2

* Change the range of the ATLAS patch

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-20 16:46:09 +01:00
Dom Heinzeller
2f0a73f7ef Define variant typescript for py-jupyter-server with explicit dependency on npm (#43279)
* Add variant typescript for py-jupyter-server@:1, which then requires npm/node. Patch the build system for ~typescript so that it doesn't find any npm/node installations and attempt to build the typescript extension even though it shouldn't

* Fix formatting in var/spack/repos/builtin/packages/py-jupyter-server/package.py

* Constrain typescript variant to py-jupyter-server versions 1.10.2:1

* with when not needed if variant doesn't exist for other versions
2024-03-20 08:03:18 -06:00
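A minimal sketch of the directives this describes; the version bound comes from the bullets above, while the node/npm package names are assumptions:

```python
# Sketch of the conditional variant and its build-only dependencies.
variant(
    "typescript",
    default=False,
    when="@1.10.2:1",
    description="Build the typescript extension",
)
depends_on("npm", when="+typescript", type="build")
depends_on("node-js", when="+typescript", type="build")
```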
Massimiliano Culpo
7cb0dbf77a Remove optimization criterion on OS mismatches (#43282) 2024-03-20 04:57:56 -06:00
G-Ragghianti
ac8800ffc7 heffte: Update MKL dependency to intel-oneapi-mkl (#43273) 2024-03-20 03:28:21 -06:00
Richard Berger
eb11fa7d18 lua-sol2: merge duplicate sol2 package into it (#43155) 2024-03-20 03:22:58 -06:00
Mikael Simberg
4d8381a775 fmt: Add master branch as version (#43239) 2024-03-20 03:18:06 -06:00
jmlapre
de5e20fc21 py-snoop: new package (#42945) 2024-03-20 01:18:24 +01:00
Adam J. Stewart
c33af49ed5 py-keras: add v3.1.0 (#43268) 2024-03-20 01:10:14 +01:00
Martin Aumüller
3addda6c4d mgard: disable C++11 warning also for apple-clang@15 (#43170) 2024-03-19 17:27:42 -06:00
AMD Toolchain Support
33f6f55d6b aocl-sparse: fix inconsistency in dependency logic (#43259) 2024-03-19 16:45:39 -06:00
AMD Toolchain Support
41d20d3731 amdlibm: add support for parallel build (#43258) 2024-03-19 16:45:09 -06:00
Bill Williams
dde8fa5561 scorep: add conflict for ROCm6 (#43240)
Co-authored-by: William Williams <william.williams@tu-dresden.de>
Co-authored-by: wrwilliams <wrwilliams@users.noreply.github.com>
2024-03-19 16:40:08 -06:00
Mikael Simberg
588a94bc8c apex: add v2.6.5 (#43242) 2024-03-19 16:02:58 -06:00
Alec Scott
06392f2c01 gnutls: add v3.8.3 (#43229)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-03-19 22:23:19 +01:00
Tom Payerle
f16e29559e n2p2: add --no-print-directory flag to calls to "make" (#43196)
This should fix issue #43192

Basically, there was an issue where a make variable was set to the output
of a shell function that included cd commands, and the value
of that variable was then used as a makefile target.

The cd commands in the shell function caused assorted informational
messages (e.g. "Entering directory ...") which got included in the
return of the shell function, corrupting the value of the variable.
The presence of colons in the corrupted value caused make to issue
"multiple target" erros.

This fix adds --no-print-directory flags to the calls to the
make function in the package's build method, which resolves the
issue above.

It is admittedly a crude fix, and will remove *all* informational
messages re directory changes, thereby potentially making it more
difficult to diagnose/debug future issues building this package.
However, I do not see a way to turn off these messages in a
more surgical manner.
2024-03-19 20:04:15 +01:00
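A short sketch of the fix inside a package build method; the directory and target are illustrative:

```python
# Sketch: silence "Entering directory ..." chatter so that shell-function
# output captured into make variables stays clean.
def build(self, spec, prefix):
    with working_dir("src"):  # illustrative directory
        make("--no-print-directory", "all")
```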
Rémi Lacroix
ea96403157 ffmpeg: Fix patch hash (#43269) 2024-03-19 19:46:02 +01:00
G-Ragghianti
b659eac453 slate: add v2023.11.05 (#42913)
* Updated version of slate

* Added rocm version conflict

* Added patch to fix openMP problem
2024-03-19 08:58:34 -07:00
John W. Parent
ab590cc03a WGL: Update libs for new archspec on Win (#43253) 2024-03-19 09:43:29 -06:00
Harmen Stoppels
1a007a842b cmake: deprecate old patch releases and add missing gmake dep (#43261) 2024-03-19 15:41:50 +01:00
Martin Aumüller
9756354998 mgard: don't restrict protobuf version more than necessary (#43172)
* mgard: don't restrict protobuf version more than necessary

successfully built:
mgard@2022-11-18 ^protobuf@3.{4,21,25}
mgard@2023-01-10 ^protobuf@3.{4,25}
mgard@2023-03-31 ^protobuf@3.{4,25}

compile failures:
mgard@2022-11-18 ^protobuf@3.3
mgard@2023-01-10 ^protobuf@3.3
mgard@2023-03-31 ^protobuf@3.3

* mgard: add conflicts to address CI errors

* mgard: conflict between cuda and abseil@20240116.1

compiling mgard+cuda with gcc@12.3.0 and nvcc from cuda@12.3.0 against
protobuf pulling in abseil-cpp@20240116.1 results in the errors reported
here: https://github.com/abseil/abseil-cpp/issues/1629
2024-03-19 09:29:44 +01:00
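A hedged sketch of the resulting directive for the CUDA/abseil incompatibility described above:

```python
# Sketch: nvcc cannot compile abseil-cpp 20240116.1 headers (abseil issue #1629),
# so forbid that combination when CUDA support is enabled.
conflicts("^abseil-cpp@20240116.1", when="+cuda")
```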
Sinan
3984dd750c package/py-setuptools_add_new_versions (#43180)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-03-18 12:33:39 -06:00
Gregor Daiß
d5c1e16e43 sgpp: update dependency versions (#43178)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-18 15:52:16 +01:00
Sinan
56ace9a087 py-wand: add v0.6.13 (#42972) 2024-03-18 15:13:50 +01:00
jdomke
6e0bab1706 fujitsu-mpi: add gcc and clang support (#43053)
Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-03-18 15:12:44 +01:00
John W. Parent
193386f6ac netcdfc: consider static build in pkgconf filtering (#43084) 2024-03-18 14:40:32 +01:00
Christopher Christofi
755131fcdf py-optax: add new version (#43169) 2024-03-18 14:19:53 +01:00
dependabot[bot]
9a71733adb build(deps): bump docker/setup-buildx-action from 3.1.0 to 3.2.0 (#43204)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](0d103c3126...2b51285047)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-18 14:10:11 +01:00
dependabot[bot]
cd919d51ea build(deps): bump docker/build-push-action from 5.2.0 to 5.3.0 (#43205)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.2.0 to 5.3.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](af5a7ed5ba...2cdde995de)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-18 14:09:29 +01:00
George Young
12adf66d07 telocal: add new package (#43241)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-03-18 07:08:32 -06:00
afzpatel
c02f58da8f llvm-amdgpu: add rpath to HIP rt (#42876) 2024-03-18 13:37:27 +01:00
Harmen Stoppels
9662d181a0 use directives in some packages (#43238) 2024-03-18 12:53:53 +01:00
Alec Scott
282df7aecc fzf: add v0.48.0 (#43230) 2024-03-18 12:53:37 +01:00
wspear
b4c0e6f03b scorep: add v8.4 (#43225) 2024-03-18 11:23:28 +01:00
Wileam Y. Phan
4cd8488139 intel-gtpin: add version 4.0 (#43216) 2024-03-18 11:09:29 +01:00
George Young
69a052841c py-cutadapt: updating to @4.7 (#43214)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-03-18 11:08:14 +01:00
Matthias Wolf
a3f39890c2 py-shacl: new version, update dependencies (#42905)
* py-shacl: new version, update dependencies

Also updates the dependencies py-prettytable and py-rdflib.

* review comments

* Update var/spack/repos/builtin/packages/py-pyshacl/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-poetry-core: add required 1.8.1

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-18 10:56:40 +01:00
Christopher Christofi
02d126ce2b py-geopandas: add new version 0.14.3 (#43235)
* py-geopandas: add new version

* add when specification on setuptools requirement
2024-03-18 10:52:19 +01:00
Christopher Christofi
339a63370f py-pyshp: add new version (#43234) 2024-03-18 10:23:11 +01:00
Christopher Christofi
fef6aed627 py-branca: add new package (#43236) 2024-03-18 10:16:58 +01:00
afzpatel
3445da807e rocm-smi-lib: remove standalone test and add build time test (#43129) 2024-03-18 10:13:08 +01:00
Pieter P
429c3598af Fix CMake generator documentation (#43232) 2024-03-18 10:02:33 +01:00
Todd Gamblin
3d8136493a performance: avoid jinja2 import at startup unless needed (#43237)
`jinja2` can be a costly import, and right now it happens at startup every time we run
Spack. This slows down `spack --print-shell-vars` a bit, which is needed by `setup-env.*sh`.
2024-03-18 10:00:37 +01:00
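A generic sketch of the deferred-import pattern this relies on; names are illustrative, not the actual Spack code:

```python
# Sketch: keep the costly import out of module import time and pay for it
# only when a template is actually rendered.
def render_template(template_text, **context):
    import jinja2  # deferred: startup paths like `spack --print-shell-vars` never hit this

    return jinja2.Template(template_text).render(**context)
```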
Sergey Kosukhin
8cd160db85 zlib-ng: add variant new_strategies (#43219) 2024-03-18 09:42:43 +01:00
Simon Pintarelli
a7dd756b34 gcc 12.3 ICE patch for aarch64 (#43093)
* gcc12.3 patch for ICE on aarch64

* aarch64 ICE patch for gcc@13.2
2024-03-18 09:12:49 +01:00
Thomas Padioleau
53be280681 Remove bundled fmt (#43210) 2024-03-17 21:43:40 -07:00
1332 changed files with 27079 additions and 11200 deletions

View File

@@ -0,0 +1,4 @@
{
"image": "ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4:2023.08.01",
"postCreateCommand": "./.devcontainer/postCreateCommand.sh"
}

View File

@@ -0,0 +1,20 @@
#!/bin/bash
# Load spack environment at terminal startup
cat <<EOF >> /root/.bashrc
. /workspaces/spack/share/spack/setup-env.sh
EOF
# Load spack environment in this script
. /workspaces/spack/share/spack/setup-env.sh
# Ensure generic targets for maximum matching with buildcaches
spack config --scope site add "packages:all:require:[target=x86_64_v3]"
spack config --scope site add "concretizer:targets:granularity:generic"
# Find compiler and install gcc-runtime
spack compiler find --scope site
# Setup buildcaches
spack mirror add --scope site develop https://binaries.spack.io/develop
spack buildcache keys --install --trust

View File

@@ -17,33 +17,53 @@ concurrency:
jobs:
# Run audits on all the packages in the built-in repository
package-audits:
runs-on: ${{ matrix.operating_system }}
runs-on: ${{ matrix.system.os }}
strategy:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
system:
- { os: windows-latest, shell: 'powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}' }
- { os: ubuntu-latest, shell: bash }
- { os: macos-latest, shell: bash }
defaults:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml]
- name: Setup for Windows run
if: runner.os == 'Windows'
run: |
python -m pip install --upgrade pywin32
- name: Package audits (with coverage)
if: ${{ inputs.with_coverage == 'true' }}
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) audit externals
coverage run $(which spack) -d audit externals
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' }}
if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true

View File

@@ -1,7 +1,8 @@
#!/bin/bash
set -ex
set -e
source share/spack/setup-env.sh
$PYTHON bin/spack bootstrap disable github-actions-v0.4
$PYTHON bin/spack bootstrap disable spack-install
$PYTHON bin/spack -d solve zlib
$PYTHON bin/spack $SPACK_FLAGS solve zlib
tree $BOOTSTRAP/store
exit 0

View File

@@ -13,118 +13,22 @@ concurrency:
cancel-in-progress: true
jobs:
fedora-clingo-sources:
distros-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
container: ${{ matrix.image }}
strategy:
matrix:
image: ["fedora:latest", "opensuse/leap:latest"]
steps:
- name: Install dependencies
- name: Setup Fedora
if: ${{ matrix.image == 'fedora:latest' }}
run: |
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
tree ~/.spack/bootstrap/store/
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
steps:
- name: Install dependencies
- name: Setup OpenSUSE
if: ${{ matrix.image == 'opensuse/leap:latest' }}
run: |
# Harden CI by applying the workaround described here: https://www.suse.com/support/kb/doc/?id=000019505
zypper update -y || zypper update -y
@@ -133,15 +37,9 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Setup repo
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -151,77 +49,102 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-sources:
runs-on: macos-latest
clingo-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install cmake bison@2.7 tree
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: "3.12"
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
export PATH=/usr/local/opt/bison@2.7/bin:$PATH
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-binaries:
runs-on: ${{ matrix.macos-version }}
gnupg-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
macos-version: ['macos-11', 'macos-12']
runner: [ 'macos-13', 'macos-14', "ubuntu-latest" ]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap clingo
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
export PATH="$old_path"
fi
fi
# NOTE: test all pythons that exist, not all do on 12
done
ubuntu-clingo-binaries:
runs-on: ubuntu-20.04
steps:
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Setup repo
- name: Bootstrap GnuPG
run: |
git --version
. .github/workflows/setup_git.sh
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
from-binaries:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: |
3.8
3.9
3.10
3.11
3.12
- name: Set bootstrap sources
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
- name: Bootstrap clingo
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
set -e
for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
echo "Testing $ver_dir"
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
@@ -236,122 +159,9 @@ jobs:
exit 1
fi
done
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-binaries:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-sources:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install gawk tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
# [1] Distros that have patched git to resolve CVE-2022-24765 (e.g. Ubuntu patching v2.25.1)
# introduce breaking behavior, so we have to set `safe.directory` in gitconfig ourselves.
# See:
# - https://github.blog/2022-04-12-git-security-vulnerability-announced/
# - https://github.com/actions/checkout/issues/760
# - http://changelogs.ubuntu.com/changelogs/pool/main/g/git/git_2.25.1-1ubuntu3.3/changelog

View File

@@ -45,17 +45,18 @@ jobs:
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:9'],
[rockylinux8, 'linux/amd64,linux/arm64', 'rockylinux:8'],
[rockylinux9, 'linux/amd64,linux/arm64', 'rockylinux:9'],
[fedora37, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:37'],
[fedora38, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:38']]
[fedora39, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:40']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -87,16 +88,16 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@a8a3f3ad30e3422c9c7b888a15615d19a852ae32
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@0d103c3126aa41d772a8362f6aa67afac040f80c
uses: docker/setup-buildx-action@d70bba72b1f3fd22344832f00baa16ece964efeb
- name: Log in to GitHub Container Registry
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
@@ -113,10 +114,21 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@af5a7ed5ba88268d5278f7203fb52cd833f66d6e
uses: docker/build-push-action@2cdde995de11925a030ce8070c3d77a52ffcf1c0
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
merge-dockerfiles:
runs-on: ubuntu-latest
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
pattern: dockerfiles_*
delete-merged: true

View File

@@ -18,6 +18,7 @@ jobs:
prechecks:
needs: [ changes ]
uses: ./.github/workflows/valid-style.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
all-prechecks:
@@ -35,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -70,14 +71,17 @@ jobs:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.bootstrap == 'true' }}
needs: [ prechecks, changes ]
uses: ./.github/workflows/bootstrap.yml
secrets: inherit
unit-tests:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
secrets: inherit
windows:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks ]
uses: ./.github/workflows/windows_python.yml
secrets: inherit
all:
needs: [ windows, unit-tests, bootstrap ]
runs-on: ubuntu-latest

View File

@@ -14,10 +14,10 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,7 +1,7 @@
black==24.3.0
black==24.4.2
clingo==5.7.1
flake8==7.0.0
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.9
types-six==1.16.21.20240513
vermin==1.6.0

View File

@@ -51,10 +51,10 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -91,17 +91,19 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Test shell integration
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
- name: Install System packages
@@ -122,9 +124,11 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Test RHEL8 UBI with platform Python. This job is run
# only on PRs modifying core Spack
@@ -137,7 +141,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version
@@ -156,10 +160,10 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
- name: Install System packages
@@ -181,20 +185,23 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Run unit tests on MacOS
macos:
runs-on: macos-latest
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -216,6 +223,8 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true

View File

@@ -18,8 +18,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
cache: 'pip'
@@ -35,10 +35,10 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
cache: 'pip'
@@ -56,6 +56,7 @@ jobs:
share/spack/qa/run-style-tests
audit:
uses: ./.github/workflows/audit.yaml
secrets: inherit
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.11'
@@ -69,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -15,10 +15,10 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
@@ -33,16 +33,18 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
@@ -57,16 +59,18 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,3 +1,48 @@
# v0.21.2 (2024-03-01)
## Bugfixes
- Containerize: accommodate nested or pre-existing spack-env paths (#41558)
- Fix setup-env script, when going back and forth between instances (#40924)
- Fix using fully-qualified namespaces from root specs (#41957)
- Fix a bug when a required provider is requested for multiple virtuals (#42088)
- OCI buildcaches:
- only push in parallel when forking (#42143)
- use pickleable errors (#42160)
- Fix using sticky variants in externals (#42253)
- Fix a rare issue with conditional requirements and multi-valued variants (#42566)
## Package updates
- rust: add v1.75, rework a few variants (#41161,#41903)
- py-transformers: add v4.35.2 (#41266)
- mgard: fix OpenMP on AppleClang (#42933)
# v0.21.1 (2024-01-11)
## New features
- Add support for reading buildcaches created by Spack v0.22 (#41773)
## Bugfixes
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)
- Improve the error message for deprecated preferences (#41075)
- Fix MSVC preview version breaking clingo build on Windows (#41185)
- Fix multi-word aliases (#41126)
- Add a warning for unconfigured compiler (#41213)
- environment: fix an issue with deconcretization/reconcretization of specs (#41294)
- buildcache: don't error if a patch is missing, when installing from binaries (#41986)
- Multiple improvements to unit-tests (#41215,#41369,#41495,#41359,#41361,#41345,#41342,#41308,#41226)
## Package updates
- root: add a webgui patch to address security issue (#41404)
- BerkeleyGW: update source urls (#38218)
# v0.21.0 (2023-11-11)
`v0.21.0` is a major feature release.

View File

@@ -88,7 +88,7 @@ Resources:
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
for Q&A and discussions. Note the pinned discussions for announcements.
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
* **X**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
only for announcements. Please use other venues for discussions.

View File

@@ -144,3 +144,5 @@ switch($SpackSubCommand)
"unload" {Invoke-SpackLoad}
default {python "$Env:SPACK_ROOT/bin/spack" $SpackCMD_params $SpackSubCommand $SpackSubCommandArgs}
}
exit $LASTEXITCODE

View File

@@ -15,7 +15,7 @@ concretizer:
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization. If `dependencies`, we'll only reuse dependencies but
# give you a fresh concretization for your root specs.
reuse: dependencies
reuse: true
# Options that tune which targets are considered for concretization. The
# concretization process is very sensitive to the number targets, and the time
# needed to reach a solution increases noticeably with the number of targets
@@ -42,3 +42,8 @@ concretizer:
# "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: minimal
# Option to specify compatibility between operating systems for reuse of compilers and packages
# Specified as a key: [list] where the key is the os that is being targeted, and the list contains the OS's
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
# requires two entries i.e. os_compatible: {sonoma: [monterey], monterey: [sonoma]}
os_compatible: {}

View File

@@ -101,6 +101,12 @@ config:
verify_ssl: true
# This is where custom certs for proxy/firewall are stored.
# It can be a path or environment variable. To match ssl env configuration
# the default is the environment variable SSL_CERT_FILE
ssl_certs: $SSL_CERT_FILE
# Suppress gpg warnings from binary package verification
# Only suppresses warnings, gpg failure will still fail the install
# Potential rationale to set True: users have already explicitly trusted the

View File

@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]

View File

@@ -19,7 +19,6 @@ packages:
- apple-clang
- clang
- gcc
- intel
providers:
elf: [libelf]
fuse: [macfuse]

View File

@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]

View File

@@ -15,15 +15,17 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, intel, pgi, clang, xl, nag, fj, aocc]
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
D: [ldc]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse]
gl: [glx, osmesa]
glu: [mesa-glu, openglu]
@@ -34,9 +36,11 @@ packages:
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libglx: [mesa+glx, mesa18+glx]
libc: [glibc, musl]
libgfortran: [ gcc-runtime ]
libglx: [mesa+glx]
libifcore: [ intel-oneapi-runtime ]
libllvm: [llvm]
libosmesa: [mesa+osmesa, mesa18+osmesa]
lua-lang: [lua, lua-luajit-openresty, lua-luajit]
luajit: [lua-luajit-openresty, lua-luajit]
mariadb-client: [mariadb-c-client, mariadb]

lib/spack/docs/_templates/layout.html vendored Normal file
View File

@@ -0,0 +1,12 @@
{% extends "!layout.html" %}
{%- block extrahead %}
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-S0PQ7WV75K"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'G-S0PQ7WV75K');
</script>
{% endblock %}

View File

@@ -865,7 +865,7 @@ There are several different ways to use Spack packages once you have
installed them. As you've seen, spack packages are installed into long
paths with hashes, and you need a way to get them into your path. The
easiest way is to use :ref:`spack load <cmd-spack-load>`, which is
described in the next section.
described in this section.
Some more advanced ways to use Spack packages include:
@@ -959,7 +959,86 @@ use ``spack find --loaded``.
You can also use ``spack load --list`` to get the same output, but it
does not have the full set of query options that ``spack find`` offers.
We'll learn more about Spack's spec syntax in the next section.
We'll learn more about Spack's spec syntax in :ref:`a later section <sec-specs>`.
.. _extensions:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Python packages and virtual environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack can install a large number of Python packages. Their names are
typically prefixed with ``py-``. Installing and using them is no
different from any other package:
.. code-block:: console
$ spack install py-numpy
$ spack load py-numpy
$ python3
>>> import numpy
The ``spack load`` command sets the ``PATH`` variable so that the right Python
executable is used, and makes sure that ``numpy`` and its dependencies can be
located in the ``PYTHONPATH``.
Spack is different from other Python package managers in that it installs
every package into its *own* prefix. This is in contrast to ``pip``, which
installs all packages into the same prefix, be it in a virtual environment
or not.
For many users, **virtual environments** are more convenient than repeated
``spack load`` commands, particularly when working with multiple Python
packages. Fortunately Spack supports environments itself, which together
with a view are no different from Python virtual environments.
The recommended way of working with Python extensions such as ``py-numpy``
is through :ref:`Environments <environments>`. The following example creates
a Spack environment with ``numpy`` in the current working directory. It also
puts a filesystem view in ``./view``, which is a more traditional combined
prefix for all packages in the environment.
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
Now you can activate the environment and start using the packages:
.. code-block:: console
$ spack env activate .
$ python3
>>> import numpy
The environment view is also a virtual environment, which is useful if you are
sharing the environment with others who are unfamiliar with Spack. They can
either use the Python executable directly:
.. code-block:: console
$ ./view/bin/python3
>>> import numpy
or use the activation script:
.. code-block:: console
$ source ./view/bin/activate
$ python3
>>> import numpy
In general, there should not be much difference between ``spack env activate``
and using the virtual environment. The main advantage of ``spack env activate``
is that it knows about more packages than just Python packages, and it may set
additional runtime variables that are not covered by the virtual environment
activation script.
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
.. _sec-specs:
@@ -1119,6 +1198,9 @@ and ``3.4.2``. Similarly, ``@4.2:`` means any version above and including
``4.2``. As a short-hand, ``@3`` is equivalent to the range ``@3:3`` and
includes any version with major version ``3``.
Versions are ordered lexicographically by their components. For more details
on the order, see :ref:`the packaging guide <version-comparison>`.
Notice that you can distinguish between the specific version ``@=3.2`` and
the range ``@3.2``. This is useful for packages that follow a versioning
scheme that omits the zero patch version number: ``3.2``, ``3.2.1``,
@@ -1702,165 +1784,6 @@ check only local packages (as opposed to those used transparently from
``upstream`` spack instances) and the ``-j,--json`` option to output
machine-readable json data for any errors.
.. _extensions:
---------------------------
Extensions & Python support
---------------------------
Spack's installation model assumes that each package will live in its
own install prefix. However, certain packages are typically installed
*within* the directory hierarchy of other packages. For example,
`Python <https://www.python.org>`_ packages are typically installed in the
``$prefix/lib/python-2.7/site-packages`` directory.
In Spack, installation prefixes are immutable, so this type of installation
is not directly supported. However, it is possible to create views that
allow you to merge install prefixes of multiple packages into a single new prefix.
Views are a convenient way to get a more traditional filesystem structure.
Using *extensions*, you can ensure that Python packages always share the
same prefix in the view as Python itself. Suppose you have
Python installed like so:
.. code-block:: console
$ spack find python
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
python@2.7.8
.. _cmd-spack-extensions:
^^^^^^^^^^^^^^^^^^^^
``spack extensions``
^^^^^^^^^^^^^^^^^^^^
You can find extensions for your Python installation like this:
.. code-block:: console
$ spack extensions python
==> python@2.7.8%gcc@4.4.7 arch=linux-debian7-x86_64-703c7a96
==> 36 extensions:
geos py-ipython py-pexpect py-pyside py-sip
py-basemap py-libxml2 py-pil py-pytz py-six
py-biopython py-mako py-pmw py-rpy2 py-sympy
py-cython py-matplotlib py-pychecker py-scientificpython py-virtualenv
py-dateutil py-mpi4py py-pygments py-scikit-learn
py-epydoc py-mx py-pylint py-scipy
py-gnuplot py-nose py-pyparsing py-setuptools
py-h5py py-numpy py-pyqt py-shiboken
==> 12 installed:
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-dateutil@2.4.0 py-nose@1.3.4 py-pyside@1.2.2
py-dateutil@2.4.0 py-numpy@1.9.1 py-pytz@2014.10
py-ipython@2.3.1 py-pygments@2.0.1 py-setuptools@11.3.1
py-matplotlib@1.4.2 py-pyparsing@2.0.3 py-six@1.9.0
The extensions are a subset of what's returned by ``spack list``, and
they are packages like any other. They are installed into their own
prefixes, and you can see this with ``spack find --paths``:
.. code-block:: console
$ spack find --paths py-numpy
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-numpy@1.9.1 ~/spack/opt/linux-debian7-x86_64/gcc@4.4.7/py-numpy@1.9.1-66733244
However, even though this package is installed, you cannot use it
directly when you run ``python``:
.. code-block:: console
$ spack load python
$ python
Python 2.7.8 (default, Feb 17 2015, 01:35:25)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named numpy
>>>
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Extensions in Environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The recommended way of working with extensions such as ``py-numpy``
above is through :ref:`Environments <environments>`. For example,
the following creates an environment in the current working directory
with a filesystem view in the ``./view`` directory:
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
We recommend environments for two reasons. Firstly, environments
can be activated (requires :ref:`shell-support`):
.. code-block:: console
$ spack env activate .
which sets all the right environment variables such as ``PATH`` and
``PYTHONPATH``. This ensures that
.. code-block:: console
$ python
>>> import numpy
works. Secondly, even without shell support, the view ensures
that Python can locate its extensions:
.. code-block:: console
$ ./view/bin/python
>>> import numpy
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
^^^^^^^^^^^^^^^^^^^^
Using ``spack load``
^^^^^^^^^^^^^^^^^^^^
A more traditional way of using Spack and extensions is ``spack load``
(requires :ref:`shell-support`). This will add the extension to ``PYTHONPATH``
in your current shell, and Python itself will be available in the ``PATH``:
.. code-block:: console
$ spack load py-numpy
$ python
>>> import numpy
The loaded packages can be checked using ``spack find --loaded``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Loading Extensions via Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Apart from ``spack env activate`` and ``spack load``, you can load numpy
through your environment modules (using ``environment-modules`` or
``lmod``). This will also add the extension to the ``PYTHONPATH`` in
your current shell.
.. code-block:: console
$ module load <name of numpy module>
If you do not know the name of the specific numpy module you wish to
load, you can use the ``spack module tcl|lmod loads`` command to get
the name of the module from the Spack spec.
-----------------------
Filesystem requirements
-----------------------

View File

@@ -220,6 +220,40 @@ section of the configuration:
.. _binary_caches_oci:
---------------------------------
Automatic push to a build cache
---------------------------------
Sometimes it is convenient to push packages to a build cache as soon as they are installed. Spack can do this by setting the autopush flag when adding a mirror:
.. code-block:: console
$ spack mirror add --autopush <name> <url or path>
Alternatively, the autopush flag can be set for an existing mirror:
.. code-block:: console
$ spack mirror set --autopush <name> # enable automatic push for an existing mirror
$ spack mirror set --no-autopush <name> # disable automatic push for an existing mirror
Then, after installing a package, it is automatically pushed to all mirrors with ``autopush: true``. The command
.. code-block:: console
$ spack install <package>
will have the same effect as
.. code-block:: console
$ spack install <package>
$ spack buildcache push <cache> <package> # for all caches with autopush: true
.. note::
Packages are automatically pushed to a build cache only if they are built from source.
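The ``--autopush`` flag is stored in the mirror configuration, so the same behavior
can also be expressed directly in ``mirrors.yaml``. A minimal sketch, assuming a
mirror named ``mycache`` (the name and URL are placeholders):

.. code-block:: yaml

   mirrors:
     mycache:
       url: file:///path/to/cache
       autopush: true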
-----------------------------------------
OCI / Docker V2 registries as build cache
-----------------------------------------

View File

@@ -21,23 +21,86 @@ is the following:
Reuse already installed packages
--------------------------------
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
The ``reuse`` attribute controls how aggressively Spack reuses binary packages during concretization. The
attribute can either be a single value, or an object for more complex configurations.
In the former case ("single value") it allows Spack to:
1. Reuse installed packages and buildcaches for all the specs to be concretized, when ``true``
2. Reuse installed packages and buildcaches only for the dependencies of the root specs, when ``dependencies``
3. Disregard reusing installed packages and buildcaches, when ``false``
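For example, the default behavior corresponds to the following single-value
configuration, shown here as a minimal ``concretizer.yaml`` sketch:

.. code-block:: yaml

   concretizer:
     reuse: dependencies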
If finer control over which specs are reused is needed, the value of this attribute can be
an object with the following keys:
1. ``roots``: if ``true`` root specs are reused, if ``false`` only dependencies of root specs are reused
2. ``from``: list of sources from which reused specs are taken
Each source in ``from`` is itself an object:
.. list-table:: Attributes for a source of reusable specs
:header-rows: 1
* - Attribute name
- Description
* - type (mandatory, string)
- Can be ``local``, ``buildcache``, or ``external``
* - include (optional, list of specs)
- If present, reusable specs must match at least one of the constraints in the list
* - exclude (optional, list of specs)
- If present, reusable specs must not match any of the constraints in the list.
For instance, the following configuration:
.. code-block:: yaml
concretizer:
reuse:
roots: true
from:
- type: local
include:
- "%gcc"
- "%clang"
tells the concretizer to reuse all specs compiled with either ``gcc`` or ``clang`` that are installed
in the local store. Any spec from remote buildcaches is disregarded.
To reduce the boilerplate in configuration files, default values for the ``include`` and
``exclude`` options can be pushed up one level:
.. code-block:: yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
from:
- type: local
- type: buildcache
- type: local
include:
- "foo %oneapi"
In the example above we reuse all specs compiled with ``gcc`` from the local store
and remote buildcaches, and we also reuse ``foo %oneapi``. Note that the last source of
specs overrides the default ``include`` attribute.
For one-off concretizations, there are command-line arguments for each of the simple "single value"
configurations. This means a user can run:
.. code-block:: console
% spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
to enable reuse for a single installation, or:
.. code-block:: console
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: dependencies`` is the default.
.. seealso::

View File

@@ -147,6 +147,15 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
def autoreconf(self, spec, prefix):
which("bash")("autogen.sh")
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
class AutotoolsBuilder(AutotoolsBuilder):
def autoreconf(self, pkg, spec, prefix):
which("bash")("autogen.sh")
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""

View File

@@ -250,7 +250,7 @@ generator is Ninja. To switch to the Ninja generator, simply add:
.. code-block:: python
generator = "Ninja"
generator("ninja")
``CMakePackage`` defaults to "Unix Makefiles". If you switch to the

View File

@@ -25,7 +25,7 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI

View File

@@ -718,23 +718,45 @@ command-line tool, or C/C++/Fortran program with optional Python
modules? The former should be prepended with ``py-``, while the
latter should not.
""""""""""""""""""""""
extends vs. depends_on
""""""""""""""""""""""
""""""""""""""""""""""""""""""
``extends`` vs. ``depends_on``
""""""""""""""""""""""""""""""
This is very similar to the naming dilemma above, with a slight twist.
As mentioned in the :ref:`Packaging Guide <packaging_extensions>`,
``extends`` and ``depends_on`` are very similar, but ``extends`` ensures
that the extension and extendee share the same prefix in views.
This allows the user to import a Python module without
having to add that module to ``PYTHONPATH``.
When deciding between ``extends`` and ``depends_on``, the best rule of
thumb is to check the installation prefix. If Python libraries are
installed to ``<prefix>/lib/pythonX.Y/site-packages``, then you
should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``<prefix>/bin``, then
don't use ``extends``.
Additionally, ``extends("python")`` adds a dependency on the package
``python-venv``. This improves isolation from the system, whether
it's during the build or at runtime: user and system site packages
cannot accidentally be used by any package that ``extends("python")``.
As a rule of thumb: if a package does not install any Python modules
of its own, and merely puts a Python script in the ``bin`` directory,
then there is no need for ``extends``. If the package installs modules
in the ``site-packages`` directory, it requires ``extends``.
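The following minimal sketch illustrates that rule of thumb; the package names are
hypothetical, and only the directives matter:

.. code-block:: python

   from spack.package import *


   class PyFoo(PythonPackage):
       """Hypothetical package installing modules into site-packages."""

       # Shares the install prefix with python in views and pulls in python-venv
       extends("python")
       depends_on("py-setuptools", type="build")


   class Bar(Package):
       """Hypothetical package that only installs a Python script into bin/."""

       # No extends: nothing is installed into site-packages
       depends_on("python", type=("build", "run"))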
"""""""""""""""""""""""""""""""""""""
Executing ``python`` during the build
"""""""""""""""""""""""""""""""""""""
Whenever you need to execute a Python command or pass the path of the
Python interpreter to the build system, it is best to use the global
variable ``python`` directly. For example:
.. code-block:: python
@run_before("install")
def recythonize(self):
python("setup.py", "clean") # use the `python` global
As mentioned in the previous section, ``extends("python")`` adds an
automatic dependency on ``python-venv``, which is a virtual environment
that guarantees build isolation. The ``python`` global always refers to
the correct Python interpreter, whether the package uses ``extends("python")``
or ``depends_on("python")``.
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack

View File

@@ -145,6 +145,25 @@ hosts when making ``ssl`` connections. Set to ``false`` to disable, and
tools like ``curl`` will use their ``--insecure`` options. Disabling
this can expose you to attacks. Use at your own risk.
--------------------
``ssl_certs``
--------------------
Path to custom certificates for SSL verification. The value can be a
filesystem path, or an environment variable that expands to an absolute file path.
The default value is set to the environment variable ``SSL_CERT_FILE``
to use the same syntax used by many other applications that automatically
detect custom certificates.
When ``url_fetch_method:curl``, ``config:ssl_certs`` should resolve to
a single file. Spack will then set the environment variable ``CURL_CA_BUNDLE``
in the subprocess calling ``curl``.
If ``url_fetch_method:urllib``, then both files and directories are supported, i.e.
``config:ssl_certs:$SSL_CERT_FILE`` or ``config:ssl_certs:$SSL_CERT_DIR``
will work.
In all cases the expanded path must be absolute for Spack to use the certificates.
Certificates relative to an environment can be created by prepending the path variable
with the Spack configuration variable ``$env``.
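For example, the default corresponds to the following ``config.yaml`` entry,
shown here as a minimal sketch:

.. code-block:: yaml

   config:
     ssl_certs: $SSL_CERT_FILE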
--------------------
``checksum``
--------------------

View File

@@ -194,15 +194,15 @@ The OS that are currently supported are summarized in the table below:
* - Operating System
- Base Image
- Spack Image
* - Ubuntu 18.04
- ``ubuntu:18.04``
- ``spack/ubuntu-bionic``
* - Ubuntu 20.04
- ``ubuntu:20.04``
- ``spack/ubuntu-focal``
* - Ubuntu 22.04
- ``ubuntu:22.04``
- ``spack/ubuntu-jammy``
* - Ubuntu 24.04
- ``ubuntu:24.04``
- ``spack/ubuntu-noble``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
@@ -227,12 +227,12 @@ The OS that are currently supported are summarized in the table below:
* - Rocky Linux 9
- ``rockylinux:9``
- ``spack/rockylinux9``
* - Fedora Linux 37
- ``fedora:37``
- ``spack/fedora37``
* - Fedora Linux 38
- ``fedora:38``
- ``spack/fedora38``
* - Fedora Linux 39
- ``fedora:39``
- ``spack/fedora39``
* - Fedora Linux 40
- ``fedora:40``
- ``spack/fedora40``

View File

@@ -552,11 +552,11 @@ With either interpreter you can run a single command:
.. code-block:: console
$ spack python -c 'import distro; distro.linux_distribution()'
('Ubuntu', '18.04', 'Bionic Beaver')
$ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
...
$ spack python -i ipython -c 'import distro; distro.linux_distribution()'
Out[1]: ('Ubuntu', '18.04', 'Bionic Beaver')
$ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
Out[1]: ...
or a file:
@@ -1071,9 +1071,9 @@ Announcing a release
We announce releases in all of the major Spack communication channels.
Publishing the release takes care of GitHub. The remaining channels are
Twitter, Slack, and the mailing list. Here are the steps:
X, Slack, and the mailing list. Here are the steps:
#. Announce the release on Twitter.
#. Announce the release on X.
* Compose the tweet on the ``@spackpm`` account per the
``spack-twitter`` slack channel.

View File

@@ -142,12 +142,8 @@ user's prompt to begin with the environment name in brackets.
$ spack env activate -p myenv
[myenv] $ ...
The ``activate`` command can also be used to create a new environment, if it is
not already defined, by adding the ``--create`` flag. Managed and anonymous
environments, anonymous environments are explained in the next section,
can both be created using the same flags that `spack env create` accepts.
If an environment already exists then spack will simply activate it and ignore the
create specific flags.
The ``activate`` command can also be used to create a new environment if it does not already
exist.
.. code-block:: console
@@ -176,21 +172,36 @@ environment will remove the view from the user environment.
Anonymous Environments
^^^^^^^^^^^^^^^^^^^^^^
Any directory can be treated as an environment if it contains a file
``spack.yaml``. To load an anonymous environment, use:
Apart from managed environments, Spack also supports anonymous environments.
Anonymous environments can be placed in any directory of choice.
.. note::
When uninstalling packages, Spack asks the user to confirm the removal of packages
that are still used in a managed environment. This is not the case for anonymous
environments.
To create an anonymous environment, use one of the following commands:
.. code-block:: console
$ spack env activate -d /path/to/directory
$ spack env create --dir my_env
$ spack env create ./my_env
Anonymous specs can be created in place using the command:
As a shorthand, you can also create an anonymous environment upon activation if it does not
already exist:
.. code-block:: console
$ spack env create -d .
$ spack env activate --create ./my_env
For convenience, Spack can also place an anonymous environment in a temporary directory for you:
.. code-block:: console
$ spack env activate --temp
In this case Spack simply creates a ``spack.yaml`` file in the requested
directory.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Environment Sensitive Commands
@@ -449,6 +460,125 @@ Sourcing that file in Bash will make the environment available to the
user; and can be included in ``.bashrc`` files, etc. The ``loads``
file may also be copied out of the environment, renamed, etc.
.. _environment_include_concrete:
------------------------------
Included Concrete Environments
------------------------------
Spack can create an environment based on information from already established
environments. You can think of it as a combination of existing environments:
the information in each existing environment's ``spack.lock`` is gathered and
used during the creation of the included concrete environment. When an included
concrete environment is created, it generates a ``spack.lock`` file of its own.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Creating included environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To create a combined concrete environment, you must have at least one existing
concrete environment. You will use the command ``spack env create`` with the
argument ``--include-concrete`` followed by the name or path of the environment
you'd like to include. Here is an example of how to create a combined environment
from the command line.
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
You can also include an environment directly in the ``spack.yaml`` file, by
adding the ``include_concrete`` heading to the yaml followed by the
absolute paths of the environments to include.
.. code-block:: yaml
spack:
specs: []
concretizer:
unify: true
include_concrete:
- /absolute/path/to/environment1
- /absolute/path/to/environment2
Once the ``spack.yaml`` has been updated you must concretize the environment to
get the concrete specs from the included environments.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Updating an included environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If changes are made to the base environment and you want them reflected in the
included environment, you will need to reconcretize both the base environment and the
included environment for the change to take effect. For example:
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
$ spack -e myenv find
==> In environment myenv
==> Root specs
python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
Here we see that ``included_env`` has access to the python package through
the ``myenv`` environment. But if we add another spec to ``myenv``,
``included_env`` will not be able to access the new information.
.. code-block:: console
$ spack -e myenv add perl
$ spack -e myenv concretize
$ spack -e myenv find
==> In environment myenv
==> Root specs
perl python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
It isn't until you run the ``spack concretize`` command that the combined
environment will get the updated information from the reconcretized base environment.
.. code-block:: console
$ spack -e included_env concretize
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
perl python
==> 0 installed packages
.. _environment-configuration:
------------------------
@@ -800,6 +930,7 @@ For example, the following environment has three root packages:
This allows for a much-needed reduction in redundancy between packages
and constraints.
----------------
Filesystem Views
----------------
@@ -1033,7 +1164,7 @@ other targets to depend on the environment installation.
A typical workflow is as follows:
.. code:: console
.. code-block:: console
spack env create -d .
spack -e . add perl
@@ -1126,7 +1257,7 @@ its dependencies. This can be useful when certain flags should only apply to
dependencies. Below we show a use case where a spec is installed with verbose
output (``spack install --verbose``) while its dependencies are installed silently:
.. code:: console
.. code-block:: console
$ spack env depfile -o Makefile
@@ -1148,7 +1279,7 @@ This can be accomplished through the generated ``[<prefix>/]SPACK_PACKAGE_IDS``
variable. Assuming we have an active and concrete environment, we generate the
associated ``Makefile`` with a prefix ``example``:
.. code:: console
.. code-block:: console
$ spack env depfile -o env.mk --make-prefix example

View File

@@ -478,6 +478,13 @@ prefix, you can add them to the ``extra_attributes`` field. Similarly,
all other fields from the compilers config can be added to the
``extra_attributes`` field for an external representing a compiler.
Note that the format for the ``paths`` field in the
``extra_attributes`` section is different from that in the ``compilers``
config. For compilers configured as external packages, the section is
named ``compilers`` and the dictionary maps language names (``c``,
``cxx``, ``fortran``) to paths, rather than using the names ``cc``,
``fc``, and ``f77``.
.. code-block:: yaml
packages:
@@ -493,11 +500,10 @@ all other fields from the compilers config can be added to the
- spec: llvm+clang@15.0.0 arch=linux-rhel8-skylake
prefix: /usr
extra_attributes:
paths:
cc: /usr/bin/clang-with-suffix
compilers:
c: /usr/bin/clang-with-suffix
cxx: /usr/bin/clang++-with-extra-info
fc: /usr/bin/gfortran
f77: /usr/bin/gfortran
fortran: /usr/bin/gfortran
extra_rpaths:
- /usr/lib/llvm/
@@ -1572,6 +1578,8 @@ Microsoft Visual Studio
"""""""""""""""""""""""
Microsoft Visual Studio provides the only Windows C/C++ compiler that is currently supported by Spack.
Spack additionally requires the Windows SDK (including WGL) to be installed as part of your
Visual Studio installation, as it is needed to build many packages from source.
We require several specific components to be included in the Visual Studio installation.
One is the C/C++ toolset, which can be selected as "Desktop development with C++" or "C++ build tools,"
@@ -1579,6 +1587,7 @@ depending on installation type (Professional, Build Tools, etc.) The other requ
"C++ CMake tools for Windows," which can be selected from among the optional packages.
This provides CMake and Ninja for use during Spack configuration.
If you already have Visual Studio installed, you can make sure these components are installed by
rerunning the installer. Next to your installation, select "Modify" and look at the
"Installation details" pane on the right.

View File

@@ -893,26 +893,50 @@ as an option to the ``version()`` directive. Example situations would be a
"snapshot"-like Version Control System (VCS) tag, a VCS branch such as
``v6-16-00-patches``, or a URL specifying a regularly updated snapshot tarball.
.. _version-comparison:
^^^^^^^^^^^^^^^^^^
Version comparison
^^^^^^^^^^^^^^^^^^
Spack imposes a generic total ordering on the set of versions,
independently of the package they are associated with.
Most Spack versions are numeric, a tuple of integers; for example,
``0.1``, ``6.96`` or ``1.2.3.1``. Spack knows how to compare and sort
numeric versions.
``0.1``, ``6.96`` or ``1.2.3.1``. In this very basic case, version
comparison is lexicographical on the numeric components:
``1.2 < 1.2.1 < 1.2.2 < 1.10``.
Some Spack versions involve slight extensions of numeric syntax; for
example, ``py-sphinx-rtd-theme@=0.1.10a0``. In this case, numbers are
always considered to be "newer" than letters. This is for consistency
with `RPM <https://bugzilla.redhat.com/show_bug.cgi?id=50977>`_.
Spack also supports string components such as ``1.1.1a`` and
``1.y.0``. String components are considered less than numeric
components, so ``1.y.0 < 1.0``. This is for consistency with
`RPM <https://bugzilla.redhat.com/show_bug.cgi?id=50977>`_. String
components do not have to be separated by dots or any other delimiter.
So, the contrived version ``1y0`` is identical to ``1.y.0``.
Spack versions may also be arbitrary non-numeric strings, for example
``develop``, ``master``, ``local``.
Pre-release suffixes also contain string parts, but they are handled
in a special way. For example ``1.2.3alpha1`` is parsed as a pre-release
of the version ``1.2.3``. This allows Spack to order it before the
actual release: ``1.2.3alpha1 < 1.2.3``. Spack supports alpha, beta and
release candidate suffixes: ``1.2alpha1 < 1.2beta1 < 1.2rc1 < 1.2``. Any
suffix not recognized as a pre-release is treated as an ordinary
string component, so ``1.2 < 1.2-mysuffix``.
The order on versions is defined as follows. A version string is split
into a list of components based on delimiters such as ``.``, ``-`` etc.
Lists are then ordered lexicographically, where components are ordered
as follows:
Finally, there are a few special string components that are considered
"infinity versions". They include ``develop``, ``main``, ``master``,
``head``, ``trunk``, and ``stable``. For example: ``1.2 < develop``.
These are useful for specifying the most recent development version of
a package (often a moving target like a git branch), without assigning
a specific version number. Infinity versions are not automatically used when determining the latest version of a package unless explicitly required by another package or user.
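A quick way to explore the ordering interactively is through ``spack python``
(a sketch; the comparisons below simply restate the rules above):

.. code-block:: console

   $ spack python
   >>> from spack.version import Version
   >>> Version("1.2.3alpha1") < Version("1.2.3")
   True
   >>> Version("1.y.0") < Version("1.0")
   True
   >>> Version("1.2") < Version("develop")
   True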
More formally, the order on versions is defined as follows. A version
string is split into a list of components based on delimiters such as
``.`` and ``-`` and string boundaries. The components are split into
the **release** and a possible **pre-release** (if the last component
is numeric and the second to last is a string ``alpha``, ``beta`` or ``rc``).
The release components are ordered lexicographically, with comparison
between different types of components as follows:
#. The following special strings are considered larger than any other
numeric or non-numeric version component, and satisfy the following
@@ -925,6 +949,9 @@ as follows:
#. All other non-numeric components are less than numeric components,
and are ordered alphabetically.
Finally, if the release components are equal, the pre-release components
are used to break the tie, in the obvious way.
The logic behind this sort order is two-fold:
#. Non-numeric versions are usually used for special cases while
@@ -6408,9 +6435,12 @@ the ``paths`` attribute:
echo "Target: x86_64-pc-linux-gnu"
echo "Thread model: posix"
echo "InstalledDir: /usr/bin"
platforms: ["linux", "darwin"]
results:
- spec: 'llvm@3.9.1 +clang~lld~lldb'
If the ``platforms`` attribute is present, tests are run only if the current host
matches one of the listed platforms.
Each test is performed by first creating a temporary directory structure as
specified in the corresponding ``layout`` and by then running
package detection and checking that the outcome matches the expected
@@ -6444,6 +6474,10 @@ package detection and checking that the outcome matches the expected
- A spec that is expected from detection
- Any valid spec
- Yes
* - ``results:[0]:extra_attributes``
- Extra attributes expected on the associated Spec
- Nested dictionary with strings as keys and regular expressions as leaf values
- No
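For instance, a ``results`` entry could look roughly like the following sketch;
the nested keys under ``extra_attributes`` are illustrative, mirroring the
external-compiler format shown earlier:

.. code-block:: yaml

   results:
   - spec: 'llvm@3.9.1 +clang~lld~lldb'
     extra_attributes:
       compilers:
         c: '.*/clang-3\.9$'
         cxx: '.*/clang\+\+-3\.9$'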
"""""""""""""""""""""""""""""""
Reuse tests from other packages

View File

@@ -476,9 +476,3 @@ implemented using Python's built-in `sys.path
:py:mod:`spack.repo` module implements a custom `Python importer
<https://docs.python.org/2/library/imp.html>`_.
.. warning::
The mechanism for extending packages is not yet extensively tested,
and extending packages across repositories imposes inter-repo
dependencies, which may be hard to manage. Use this feature at your
own risk, but let us know if you have a use case for it.

View File

@@ -2,12 +2,12 @@ sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.17.2
pygments==2.18.0
urllib3==2.2.1
pytest==8.1.1
pytest==8.2.1
isort==5.13.2
black==24.3.0
black==24.4.2
flake8==7.0.0
mypy==1.9.0
mypy==1.10.0

255
lib/spack/env/cc vendored
View File

@@ -47,7 +47,8 @@ SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS"
SPACK_SYSTEM_DIRS
SPACK_MANAGED_DIRS"
# Optional parameters that aren't required to be set
@@ -173,22 +174,6 @@ preextend() {
unset IFS
}
# system_dir PATH
# test whether a path is a system directory
system_dir() {
IFS=':' # SPACK_SYSTEM_DIRS is colon-separated
path="$1"
for sd in $SPACK_SYSTEM_DIRS; do
if [ "${path}" = "${sd}" ] || [ "${path}" = "${sd}/" ]; then
# success if path starts with a system prefix
unset IFS
return 0
fi
done
unset IFS
return 1 # fail if path starts no system prefix
}
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
die "Compiler command line contains our separator ('${lsep}'). Cannot parse."
@@ -201,6 +186,18 @@ for param in $params; do
fi
done
# eval this because SPACK_MANAGED_DIRS and SPACK_SYSTEM_DIRS are inputs we don't wanna loop over.
# moving the eval inside the function would eval it every call.
eval "\
path_order() {
case \"\$1\" in
$SPACK_MANAGED_DIRS) return 0 ;;
$SPACK_SYSTEM_DIRS) return 2 ;;
/*) return 1 ;;
esac
}
"
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
@@ -248,7 +245,7 @@ case "$command" in
lang_flags=C
debug_flags="-g"
;;
c++|CC|g++|clang++|armclang++|icpc|icpx|dpcpp|pgc++|nvc++|xlc++|xlc++_r|FCC|amdclang++|crayCC)
c++|CC|g++|clang++|armclang++|icpc|icpx|pgc++|nvc++|xlc++|xlc++_r|FCC|amdclang++|crayCC)
command="$SPACK_CXX"
language="C++"
comp="CXX"
@@ -420,11 +417,12 @@ input_command="$*"
parse_Wl() {
while [ $# -ne 0 ]; do
if [ "$wl_expect_rpath" = yes ]; then
if system_dir "$1"; then
append return_system_rpath_dirs_list "$1"
else
append return_rpath_dirs_list "$1"
fi
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
wl_expect_rpath=no
else
case "$1" in
@@ -432,21 +430,25 @@ parse_Wl() {
arg="${1#-rpath=}"
if [ -z "$arg" ]; then
shift; continue
elif system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
--rpath=*)
arg="${1#--rpath=}"
if [ -z "$arg" ]; then
shift; continue
elif system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
-rpath|--rpath)
wl_expect_rpath=yes
@@ -473,12 +475,20 @@ categorize_arguments() {
return_other_args_list=""
return_isystem_was_used=""
return_isystem_spack_store_include_dirs_list=""
return_isystem_system_include_dirs_list=""
return_isystem_include_dirs_list=""
return_spack_store_include_dirs_list=""
return_system_include_dirs_list=""
return_include_dirs_list=""
return_spack_store_lib_dirs_list=""
return_system_lib_dirs_list=""
return_lib_dirs_list=""
return_spack_store_rpath_dirs_list=""
return_system_rpath_dirs_list=""
return_rpath_dirs_list=""
@@ -526,7 +536,7 @@ categorize_arguments() {
continue
fi
replaced="$after$stripped"
replaced="$after$stripped"
# it matched, remove it
shift
@@ -546,29 +556,32 @@ categorize_arguments() {
arg="${1#-isystem}"
return_isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_isystem_system_include_dirs_list "$arg"
else
append return_isystem_include_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_isystem_spack_store_include_dirs_list "$arg" ;;
1) append return_isystem_include_dirs_list "$arg" ;;
2) append return_isystem_system_include_dirs_list "$arg" ;;
esac
;;
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_system_include_dirs_list "$arg"
else
append return_include_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_include_dirs_list "$arg" ;;
1) append return_include_dirs_list "$arg" ;;
2) append return_system_include_dirs_list "$arg" ;;
esac
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_system_lib_dirs_list "$arg"
else
append return_lib_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_lib_dirs_list "$arg" ;;
1) append return_lib_dirs_list "$arg" ;;
2) append return_system_lib_dirs_list "$arg" ;;
esac
;;
-l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69,
@@ -601,29 +614,32 @@ categorize_arguments() {
break
elif [ "$xlinker_expect_rpath" = yes ]; then
# Register the path of -Xlinker -rpath <other args> -Xlinker <path>
if system_dir "$1"; then
append return_system_rpath_dirs_list "$1"
else
append return_rpath_dirs_list "$1"
fi
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
xlinker_expect_rpath=no
else
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
if system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
--rpath=*)
arg="${1#--rpath=}"
if system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
-rpath|--rpath)
xlinker_expect_rpath=yes
@@ -661,16 +677,25 @@ categorize_arguments() {
}
categorize_arguments "$@"
include_dirs_list="$return_include_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
system_include_dirs_list="$return_system_include_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
isystem_was_used="$return_isystem_was_used"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
other_args_list="$return_other_args_list"
spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
system_include_dirs_list="$return_system_include_dirs_list"
include_dirs_list="$return_include_dirs_list"
spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
isystem_was_used="$return_isystem_was_used"
other_args_list="$return_other_args_list"
#
# Add flags from Spack's cppflags, cflags, cxxflags, fcflags, fflags, and
@@ -730,7 +755,7 @@ esac
# Linker flags
case "$mode" in
ld|ccld)
ccld)
extend spack_flags_list SPACK_LDFLAGS
;;
esac
@@ -738,16 +763,25 @@ esac
IFS="$lsep"
categorize_arguments $spack_flags_list
unset IFS
spack_flags_include_dirs_list="$return_include_dirs_list"
spack_flags_lib_dirs_list="$return_lib_dirs_list"
spack_flags_rpath_dirs_list="$return_rpath_dirs_list"
spack_flags_system_include_dirs_list="$return_system_include_dirs_list"
spack_flags_system_lib_dirs_list="$return_system_lib_dirs_list"
spack_flags_system_rpath_dirs_list="$return_system_rpath_dirs_list"
spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
spack_flags_isystem_include_dirs_list="$return_isystem_include_dirs_list"
spack_flags_other_args_list="$return_other_args_list"
spack_flags_isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
spack_flags_isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
spack_flags_isystem_include_dirs_list="$return_isystem_include_dirs_list"
spack_flags_spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
spack_flags_system_include_dirs_list="$return_system_include_dirs_list"
spack_flags_include_dirs_list="$return_include_dirs_list"
spack_flags_spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
spack_flags_system_lib_dirs_list="$return_system_lib_dirs_list"
spack_flags_lib_dirs_list="$return_lib_dirs_list"
spack_flags_spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
spack_flags_system_rpath_dirs_list="$return_system_rpath_dirs_list"
spack_flags_rpath_dirs_list="$return_rpath_dirs_list"
spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_other_args_list="$return_other_args_list"
# On macOS insert headerpad_max_install_names linker flag
@@ -767,11 +801,13 @@ if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
# Append RPATH directories. Note that in the case of the
# top-level package these directories may not exist yet. For dependencies
# it is assumed that paths have already been confirmed.
extend spack_store_rpath_dirs_list SPACK_STORE_RPATH_DIRS
extend rpath_dirs_list SPACK_RPATH_DIRS
fi
fi
if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
extend spack_store_lib_dirs_list SPACK_STORE_LINK_DIRS
extend lib_dirs_list SPACK_LINK_DIRS
fi
@@ -798,38 +834,50 @@ case "$mode" in
;;
esac
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ] || [ "$isystem_was_used" = "true" ]; then
extend isystem_spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend isystem_include_dirs_list SPACK_INCLUDE_DIRS
else
extend spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend include_dirs_list SPACK_INCLUDE_DIRS
fi
;;
esac
#
# Finally, reassemble the command line.
#
args_list="$flags_list"
# Insert include directories just prior to any system include directories
# Include search paths partitioned by (in store, non-system, system)
# NOTE: adding ${lsep} to the prefix here turns every added element into two
extend args_list spack_flags_include_dirs_list "-I"
extend args_list include_dirs_list "-I"
extend args_list spack_flags_spack_store_include_dirs_list -I
extend args_list spack_store_include_dirs_list -I
extend args_list spack_flags_include_dirs_list -I
extend args_list include_dirs_list -I
extend args_list spack_flags_isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list isystem_include_dirs_list "-isystem${lsep}"
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
elif [ "$isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
else
extend args_list SPACK_INCLUDE_DIRS "-I"
fi
;;
esac
extend args_list spack_flags_system_include_dirs_list -I
extend args_list system_include_dirs_list -I
extend args_list spack_flags_isystem_system_include_dirs_list "-isystem${lsep}"
extend args_list isystem_system_include_dirs_list "-isystem${lsep}"
# Library search paths
# Library search paths partitioned by (in store, non-system, system)
extend args_list spack_flags_spack_store_lib_dirs_list "-L"
extend args_list spack_store_lib_dirs_list "-L"
extend args_list spack_flags_lib_dirs_list "-L"
extend args_list lib_dirs_list "-L"
extend args_list spack_flags_system_lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
@@ -839,8 +887,12 @@ case "$mode" in
if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "$rpath"
extend args_list spack_store_rpath_dirs_list "$rpath"
extend args_list spack_flags_rpath_dirs_list "$rpath"
extend args_list rpath_dirs_list "$rpath"
extend args_list spack_flags_system_rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
;;
@@ -848,8 +900,12 @@ case "$mode" in
if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_rpath_dirs_list "-rpath${lsep}"
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_system_rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
;;
@@ -913,4 +969,3 @@ fi
# Execute the full command, preserving spaces with IFS set
# to the alarm bell separator.
IFS="$lsep"; exec $full_command_list

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.3 (commit 7b8fe60b69e2861e7dac104bc1c183decfcd3daf)
* Version: 0.2.4 (commit 48b92512b9ce203ded0ebd1ac41b42593e931f7c)
astunparse
----------------

View File

@@ -497,7 +497,7 @@ def copy_attributes(self, t, memo=None):
Tag.attrib, merge_attrib]:
if hasattr(self, a):
if memo is not None:
setattr(t, a, copy.deepcopy(getattr(self, a, memo)))
setattr(t, a, copy.deepcopy(getattr(self, a), memo))
else:
setattr(t, a, getattr(self, a))
# fmt: on

View File

@@ -1,3 +1,3 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.3"
__version__ = "0.2.4"

View File

@@ -5,9 +5,10 @@
"""The "cpu" package permits to query and compare different
CPU microarchitectures.
"""
from .detect import host
from .detect import brand_string, host
from .microarchitecture import (
TARGETS,
InvalidCompilerVersion,
Microarchitecture,
UnsupportedMicroarchitecture,
generic_microarchitecture,
@@ -15,10 +16,12 @@
)
__all__ = [
"brand_string",
"host",
"TARGETS",
"InvalidCompilerVersion",
"Microarchitecture",
"UnsupportedMicroarchitecture",
"TARGETS",
"generic_microarchitecture",
"host",
"version_components",
]

View File

@@ -155,6 +155,31 @@ def _is_bit_set(self, register: int, bit: int) -> bool:
mask = 1 << bit
return register & mask > 0
def brand_string(self) -> Optional[str]:
"""Returns the brand string, if available."""
if self.highest_extension_support < 0x80000004:
return None
r1 = self.cpuid.registers_for(eax=0x80000002, ecx=0)
r2 = self.cpuid.registers_for(eax=0x80000003, ecx=0)
r3 = self.cpuid.registers_for(eax=0x80000004, ecx=0)
result = struct.pack(
"IIIIIIIIIIII",
r1.eax,
r1.ebx,
r1.ecx,
r1.edx,
r2.eax,
r2.ebx,
r2.ecx,
r2.edx,
r3.eax,
r3.ebx,
r3.ecx,
r3.edx,
).decode("utf-8")
return result.strip("\x00")
@detection(operating_system="Windows")
def cpuid_info():
@@ -174,8 +199,8 @@ def _check_output(args, env):
WINDOWS_MAPPING = {
"AMD64": "x86_64",
"ARM64": "aarch64",
"AMD64": X86_64,
"ARM64": AARCH64,
}
@@ -409,3 +434,16 @@ def compatibility_check_for_riscv64(info, target):
return (target == arch_root or arch_root in target.ancestors) and (
target.name == info.name or target.vendor == "generic"
)
def brand_string() -> Optional[str]:
"""Returns the brand string of the host, if detected, or None."""
if platform.system() == "Darwin":
return _check_output(
["sysctl", "-n", "machdep.cpu.brand_string"], env=_ensure_bin_usrbin_in_path()
).strip()
if host().family == X86_64:
return CpuidInfoCollector().brand_string()
return None

View File

@@ -208,6 +208,8 @@ def optimization_flags(self, compiler, version):
"""Returns a string containing the optimization flags that needs
to be used to produce code optimized for this micro-architecture.
The version is expected to be a string of dot separated digits.
If there is no information on the compiler passed as argument the
function returns an empty string. If it is known that the compiler
version we want to use does not support this architecture the function
@@ -216,6 +218,11 @@ def optimization_flags(self, compiler, version):
Args:
compiler (str): name of the compiler to be used
version (str): version of the compiler to be used
Raises:
UnsupportedMicroarchitecture: if the requested compiler does not support
this micro-architecture.
ValueError: if the version doesn't match the expected format
"""
# If we don't have information on compiler at all return an empty string
if compiler not in self.family.compilers:
@@ -232,6 +239,14 @@ def optimization_flags(self, compiler, version):
msg = msg.format(compiler, best_target, best_target.family)
raise UnsupportedMicroarchitecture(msg)
# Check that the version matches the expected format
if not re.match(r"^(?:\d+\.)*\d+$", version):
msg = (
"invalid format for the compiler version argument. "
"Only dot separated digits are allowed."
)
raise InvalidCompilerVersion(msg)
# If we have information on this compiler we need to check the
# version being used
compiler_info = self.compilers[compiler]
@@ -292,7 +307,7 @@ def generic_microarchitecture(name):
Args:
name (str): name of the micro-architecture
"""
return Microarchitecture(name, parents=[], vendor="generic", features=[], compilers={})
return Microarchitecture(name, parents=[], vendor="generic", features=set(), compilers={})
def version_components(version):
@@ -367,7 +382,15 @@ def fill_target_from_dict(name, data, targets):
TARGETS = LazyDictionary(_known_microarchitectures)
class UnsupportedMicroarchitecture(ValueError):
class ArchspecError(Exception):
"""Base class for errors within archspec"""
class UnsupportedMicroarchitecture(ArchspecError, ValueError):
"""Raised if a compiler version does not support optimization for a given
micro-architecture.
"""
class InvalidCompilerVersion(ArchspecError, ValueError):
"""Raised when an invalid format is used for compiler versions in archspec."""

View File

@@ -2937,8 +2937,6 @@
"ilrcpc",
"flagm",
"ssbs",
"paca",
"pacg",
"dcpodp",
"svei8mm",
"svebf16",
@@ -3066,8 +3064,6 @@
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"sve2",
"sveaes",
@@ -3081,8 +3077,7 @@
"svebf16",
"i8mm",
"bf16",
"dgh",
"bti"
"dgh"
],
"compilers" : {
"gcc": [

View File

@@ -0,0 +1,13 @@
diff --git a/lib/spack/external/_vendoring/ruamel/yaml/comments.py b/lib/spack/external/_vendoring/ruamel/yaml/comments.py
index 1badeda585..892c868af3 100644
--- a/lib/spack/external/_vendoring/ruamel/yaml/comments.py
+++ b/lib/spack/external/_vendoring/ruamel/yaml/comments.py
@@ -497,7 +497,7 @@ def copy_attributes(self, t, memo=None):
Tag.attrib, merge_attrib]:
if hasattr(self, a):
if memo is not None:
- setattr(t, a, copy.deepcopy(getattr(self, a, memo)))
+ setattr(t, a, copy.deepcopy(getattr(self, a), memo))
else:
setattr(t, a, getattr(self, a))
# fmt: on

View File

@@ -98,3 +98,10 @@ def path_filter_caller(*args, **kwargs):
if _func:
return holder_func(_func)
return holder_func
def sanitize_win_longpath(path: str) -> str:
"""Strip Windows extended path prefix from strings
Returns sanitized string.
no-op if extended path prefix is not present"""
return path.lstrip("\\\\?\\")

View File

@@ -12,7 +12,7 @@
# Archive extensions allowed in Spack
PREFIX_EXTENSIONS = ("tar", "TAR")
EXTENSIONS = ("gz", "bz2", "xz", "Z")
NO_TAR_EXTENSIONS = ("zip", "tgz", "tbz2", "tbz", "txz")
NO_TAR_EXTENSIONS = ("zip", "tgz", "tbz2", "tbz", "txz", "whl")
# Add PREFIX_EXTENSIONS and EXTENSIONS last so that .tar.gz is matched *before* .tar or .gz
ALLOWED_ARCHIVE_TYPES = (
@@ -357,10 +357,8 @@ def strip_version_suffixes(path_or_url: str) -> str:
r"i[36]86",
r"ppc64(le)?",
r"armv?(7l|6l|64)?",
# PyPI
r"[._-]py[23].*\.whl",
r"[._-]cp[23].*\.whl",
r"[._-]win.*\.exe",
# PyPI wheels
r"-(?:py|cp)[23].*",
]
for regex in suffix_regexes:
@@ -403,7 +401,7 @@ def expand_contracted_extension_in_path(
def compression_ext_from_compressed_archive(extension: str) -> Optional[str]:
"""Returns compression extension for a compressed archive"""
extension = expand_contracted_extension(extension)
for ext in [*EXTENSIONS]:
for ext in EXTENSIONS:
if ext in extension:
return ext
return None

View File

@@ -187,26 +187,58 @@ def polite_filename(filename: str) -> str:
return _polite_antipattern().sub("_", filename)
def getuid():
def getuid() -> Union[str, int]:
"""Returns os getuid on non Windows
On Windows returns 0 for admin users, login string otherwise
This is in line with behavior from get_owner_uid which
always returns the login string on Windows
"""
if sys.platform == "win32":
import ctypes
# If not admin, use the string name of the login as a unique ID
if ctypes.windll.shell32.IsUserAnAdmin() == 0:
return 1
return os.getlogin()
return 0
else:
return os.getuid()
def _win_rename(src, dst):
# os.replace will still fail if on Windows (but not POSIX) if the dst
# is a symlink to a directory (all other cases have parity Windows <-> Posix)
if os.path.islink(dst) and os.path.isdir(os.path.realpath(dst)):
if os.path.samefile(src, dst):
# src and dst are the same
# do nothing and exit early
return
# If dst exists and is a symlink to a directory
# we need to remove dst and then perform rename/replace
# this is safe to do as there's no chance src == dst now
os.remove(dst)
os.replace(src, dst)
@system_path_filter
def msdos_escape_parens(path):
"""MS-DOS interprets parens as grouping parameters even in a quoted string"""
if sys.platform == "win32":
return path.replace("(", "^(").replace(")", "^)")
else:
return path
@system_path_filter
def rename(src, dst):
# On Windows, os.rename will fail if the destination file already exists
# os.replace is the same as os.rename on POSIX and is MoveFileExW w/
# the MOVEFILE_REPLACE_EXISTING flag on Windows
# Windows invocation is abstracted behind additional logic handling
# remaining cases of divergent behavior across platforms
if sys.platform == "win32":
# Windows path existence checks will sometimes fail on junctions/links/symlinks
# so check for that case
if os.path.exists(dst) or islink(dst):
os.remove(dst)
os.rename(src, dst)
_win_rename(src, dst)
else:
os.replace(src, dst)
@system_path_filter
@@ -536,7 +568,13 @@ def exploding_archive_handler(tarball_container, stage):
@system_path_filter(arg_slice=slice(1))
def get_owner_uid(path, err_msg=None):
def get_owner_uid(path, err_msg=None) -> Union[str, int]:
"""Returns owner UID of path destination
On non Windows this is the value of st_uid
On Windows this is the login string associated with the
owning user.
"""
if not os.path.exists(path):
mkdirp(path, mode=stat.S_IRWXU)
@@ -805,7 +843,7 @@ def copy_tree(
if islink(s):
link_target = resolve_link_target_relative_to_the_link(s)
if symlinks:
target = os.readlink(s)
target = readlink(s)
if os.path.isabs(target):
def escaped_path(path):
@@ -1217,10 +1255,12 @@ def windows_sfn(path: os.PathLike):
import ctypes
k32 = ctypes.WinDLL("kernel32", use_last_error=True)
# Method with null values returns size of short path name
sz = k32.GetShortPathNameW(path, None, 0)
# stub Windows types TCHAR[LENGTH]
TCHAR_arr = ctypes.c_wchar * len(path)
TCHAR_arr = ctypes.c_wchar * sz
ret_str = TCHAR_arr()
k32.GetShortPathNameW(path, ret_str, len(path))
k32.GetShortPathNameW(path, ctypes.byref(ret_str), sz)
return ret_str.value
@@ -2410,9 +2450,10 @@ def add_library_dependent(self, *dest):
"""
for pth in dest:
if os.path.isfile(pth):
self._additional_library_dependents.add(pathlib.Path(pth).parent)
new_pth = pathlib.Path(pth).parent
else:
self._additional_library_dependents.add(pathlib.Path(pth))
new_pth = pathlib.Path(pth)
self._additional_library_dependents.add(new_pth)
@property
def rpaths(self):
@@ -2490,8 +2531,14 @@ def establish_link(self):
# for each binary install dir in self.pkg (i.e. pkg.prefix.bin, pkg.prefix.lib)
# install a symlink to each dependent library
for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
self._link(library, lib_dir)
# do not rpath for system libraries included in the dag
# we should not be modifying libraries managed by the Windows system
# as this will negatively impact linker behavior and can result in permission
# errors if those system libs are not modifiable by Spack
if "windows-system" not in getattr(self.pkg, "tags", []):
for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
self._link(library, lib_dir)
@system_path_filter

View File

@@ -11,7 +11,7 @@
from llnl.util import lang, tty
from ..path import system_path_filter
from ..path import sanitize_win_longpath, system_path_filter
if sys.platform == "win32":
from win32file import CreateHardLink
@@ -247,9 +247,9 @@ def _windows_create_junction(source: str, link: str):
out, err = proc.communicate()
tty.debug(out.decode())
if proc.returncode != 0:
err = err.decode()
tty.error(err)
raise SymlinkError("Make junction command returned a non-zero return code.", err)
err_str = err.decode()
tty.error(err_str)
raise SymlinkError("Make junction command returned a non-zero return code.", err_str)
def _windows_create_hard_link(path: str, link: str):
@@ -269,14 +269,14 @@ def _windows_create_hard_link(path: str, link: str):
CreateHardLink(link, path)
def readlink(path: str):
def readlink(path: str, *, dir_fd=None):
"""Spack utility to override of os.readlink method to work cross platform"""
if _windows_is_hardlink(path):
return _windows_read_hard_link(path)
elif _windows_is_junction(path):
return _windows_read_junction(path)
else:
return os.readlink(path)
return sanitize_win_longpath(os.readlink(path, dir_fd=dir_fd))
def _windows_read_hard_link(link: str) -> str:

View File

@@ -12,7 +12,7 @@
import traceback
from datetime import datetime
from sys import platform as _platform
from typing import NoReturn
from typing import Any, NoReturn
if _platform != "win32":
import fcntl
@@ -158,21 +158,22 @@ def get_timestamp(force=False):
return ""
def msg(message, *args, **kwargs):
def msg(message: Any, *args: Any, newline: bool = True) -> None:
if not msg_enabled():
return
if isinstance(message, Exception):
message = "%s: %s" % (message.__class__.__name__, str(message))
message = f"{message.__class__.__name__}: {message}"
else:
message = str(message)
newline = kwargs.get("newline", True)
st_text = ""
if _stacktrace:
st_text = process_stacktrace(2)
if newline:
cprint("@*b{%s==>} %s%s" % (st_text, get_timestamp(), cescape(_output_filter(message))))
else:
cwrite("@*b{%s==>} %s%s" % (st_text, get_timestamp(), cescape(_output_filter(message))))
nl = "\n" if newline else ""
cwrite(f"@*b{{{st_text}==>}} {get_timestamp()}{cescape(_output_filter(message))}{nl}")
for arg in args:
print(indent + _output_filter(str(arg)))

View File

@@ -237,7 +237,6 @@ def transpose():
def colified(
elts: List[Any],
cols: int = 0,
output: Optional[IO] = None,
indent: int = 0,
padding: int = 2,
tty: Optional[bool] = None,

View File

@@ -59,9 +59,11 @@
To output an @, use '@@'. To output a } inside braces, use '}}'.
"""
import os
import re
import sys
from contextlib import contextmanager
from typing import Optional
class ColorParseError(Exception):
@@ -95,14 +97,34 @@ def __init__(self, message):
} # white
# Regex to be used for color formatting
color_re = r"@(?:@|\.|([*_])?([a-zA-Z])?(?:{((?:[^}]|}})*)})?)"
COLOR_RE = re.compile(r"@(?:(@)|(\.)|([*_])?([a-zA-Z])?(?:{((?:[^}]|}})*)})?)")
# Mapping from color arguments to values for tty.set_color
color_when_values = {"always": True, "auto": None, "never": False}
# Force color; None: Only color if stdout is a tty
# True: Always colorize output, False: Never colorize output
_force_color = None
def _color_when_value(when):
"""Raise a ValueError for an invalid color setting.
Valid values are 'always', 'never', and 'auto', or equivalently,
True, False, and None.
"""
if when in color_when_values:
return color_when_values[when]
elif when not in color_when_values.values():
raise ValueError("Invalid color setting: %s" % when)
return when
def _color_from_environ() -> Optional[bool]:
try:
return _color_when_value(os.environ.get("SPACK_COLOR", "auto"))
except ValueError:
return None
#: When `None` colorize when stdout is tty, when `True` or `False` always or never colorize resp.
_force_color = _color_from_environ()
def try_enable_terminal_color_on_windows():
@@ -163,19 +185,6 @@ def _err_check(result, func, args):
debug("Unable to support color on Windows terminal")
def _color_when_value(when):
"""Raise a ValueError for an invalid color setting.
Valid values are 'always', 'never', and 'auto', or equivalently,
True, False, and None.
"""
if when in color_when_values:
return color_when_values[when]
elif when not in color_when_values.values():
raise ValueError("Invalid color setting: %s" % when)
return when
def get_color_when():
"""Return whether commands should print color or not."""
if _force_color is not None:
@@ -203,77 +212,64 @@ def color_when(value):
set_color_when(old_value)
class match_to_ansi:
def __init__(self, color=True, enclose=False, zsh=False):
self.color = _color_when_value(color)
self.enclose = enclose
self.zsh = zsh
def escape(self, s):
"""Returns a TTY escape sequence for a color"""
if self.color:
if self.zsh:
result = rf"\e[0;{s}m"
else:
result = f"\033[{s}m"
if self.enclose:
result = rf"\[{result}\]"
return result
def _escape(s: str, color: bool, enclose: bool, zsh: bool) -> str:
"""Returns a TTY escape sequence for a color"""
if color:
if zsh:
result = rf"\e[0;{s}m"
else:
return ""
result = f"\033[{s}m"
def __call__(self, match):
"""Convert a match object generated by ``color_re`` into an ansi
color code. This can be used as a handler in ``re.sub``.
"""
style, color, text = match.groups()
m = match.group(0)
if enclose:
result = rf"\[{result}\]"
if m == "@@":
return "@"
elif m == "@.":
return self.escape(0)
elif m == "@":
raise ColorParseError("Incomplete color format: '%s' in %s" % (m, match.string))
string = styles[style]
if color:
if color not in colors:
raise ColorParseError(
"Invalid color specifier: '%s' in '%s'" % (color, match.string)
)
string += ";" + str(colors[color])
colored_text = ""
if text:
colored_text = text + self.escape(0)
return self.escape(string) + colored_text
return result
else:
return ""
def colorize(string, **kwargs):
def colorize(
string: str, color: Optional[bool] = None, enclose: bool = False, zsh: bool = False
) -> str:
"""Replace all color expressions in a string with ANSI control codes.
Args:
string (str): The string to replace
string: The string to replace
Returns:
str: The filtered string
The filtered string
Keyword Arguments:
color (bool): If False, output will be plain text without control
codes, for output to non-console devices.
enclose (bool): If True, enclose ansi color sequences with
color: If False, output will be plain text without control codes, for output to
non-console devices (default: automatically choose color or not)
enclose: If True, enclose ansi color sequences with
square brackets to prevent misestimation of terminal width.
zsh (bool): If True, use zsh ansi codes instead of bash ones (for variables like PS1)
zsh: If True, use zsh ansi codes instead of bash ones (for variables like PS1)
"""
color = _color_when_value(kwargs.get("color", get_color_when()))
zsh = kwargs.get("zsh", False)
string = re.sub(color_re, match_to_ansi(color, kwargs.get("enclose")), string, zsh)
string = string.replace("}}", "}")
return string
color = color if color is not None else get_color_when()
def match_to_ansi(match):
"""Convert a match object generated by ``COLOR_RE`` into an ansi
color code. This can be used as a handler in ``re.sub``.
"""
escaped_at, dot, style, color_code, text = match.groups()
if escaped_at:
return "@"
elif dot:
return _escape(0, color, enclose, zsh)
elif not (style or color_code):
raise ColorParseError(
f"Incomplete color format: '{match.group(0)}' in '{match.string}'"
)
ansi_code = _escape(f"{styles[style]};{colors.get(color_code, '')}", color, enclose, zsh)
if text:
return f"{ansi_code}{text}{_escape(0, color, enclose, zsh)}"
else:
return ansi_code
return COLOR_RE.sub(match_to_ansi, string).replace("}}", "}")
def clen(string):
@@ -305,7 +301,7 @@ def cprint(string, stream=None, color=None):
cwrite(string + "\n", stream, color)
def cescape(string):
def cescape(string: str) -> str:
"""Escapes special characters needed for color codes.
Replaces the following symbols with their equivalent literal forms:
@@ -321,10 +317,7 @@ def cescape(string):
Returns:
(str): the string with color codes escaped
"""
string = str(string)
string = string.replace("@", "@@")
string = string.replace("}", "}}")
return string
return string.replace("@", "@@").replace("}", "}}")
class ColorStream:
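A brief usage sketch of the rewritten colorize/cescape pair (escape values follow the style/color tables above; the spec text is invented):

from llnl.util.tty.color import cescape, colorize

line = colorize("@*g{[+]} installed " + cescape("zlib@1.3 cflags={-O2}"), color=True)
# '@*g{[+]}' renders a bold-green '[+]' followed by a reset; cescape doubles '@' and '}'
# so the literal spec text passes through the color parser unchanged.
print(line)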

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.22.0.dev0"
__version__ = "0.23.0.dev0"
spack_version = __version__

View File

@@ -254,8 +254,8 @@ def _search_duplicate_specs_in_externals(error_cls):
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.22)"""
# TODO (v0.22): remove this audit as the attributes will not be allowed in config
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.23)"""
# TODO (v0.23): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
@@ -421,6 +421,10 @@ def _check_patch_urls(pkgs, error_cls):
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)"
)
github_pull_commits_re = (
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/pull/\d+/commits/[a-fA-F0-9]+\.(?:patch|diff)"
)
# Only .diff URLs have stable/full hashes:
# https://forum.gitlab.com/t/patches-with-full-index/29313
gitlab_patch_url_re = (
@@ -436,14 +440,24 @@ def _check_patch_urls(pkgs, error_cls):
if not isinstance(patch, spack.patch.UrlPatch):
continue
if re.match(github_patch_url_re, patch.url):
if re.match(github_pull_commits_re, patch.url):
url = re.sub(r"/pull/\d+/commits/", r"/commit/", patch.url)
url = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", url)
errors.append(
error_cls(
f"patch URL in package {pkg_cls.name} "
+ "must not be a pull request commit; "
+ f"instead use {url}",
[patch.url],
)
)
elif re.match(github_patch_url_re, patch.url):
full_index_arg = "?full_index=1"
if not patch.url.endswith(full_index_arg):
errors.append(
error_cls(
"patch URL in package {0} must end with {1}".format(
pkg_cls.name, full_index_arg
),
f"patch URL in package {pkg_cls.name} "
+ f"must end with {full_index_arg}",
[patch.url],
)
)
@@ -451,9 +465,7 @@ def _check_patch_urls(pkgs, error_cls):
if not patch.url.endswith(".diff"):
errors.append(
error_cls(
"patch URL in package {0} must end with .diff".format(
pkg_cls.name
),
f"patch URL in package {pkg_cls.name} must end with .diff",
[patch.url],
)
)
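A worked example of the new pull-request-commit rewrite above (repository and hash are hypothetical):

import re

url = "https://github.com/example-org/example/pull/123/commits/0123456789abcdef0123456789abcdef01234567.patch"
url = re.sub(r"/pull/\d+/commits/", r"/commit/", url)
url = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", url)
# -> https://github.com/example-org/example/commit/0123456789abcdef0123456789abcdef01234567.patch?full_index=1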
@@ -1046,7 +1058,7 @@ def _extracts_errors(triggers, summary):
group="externals",
tag="PKG-EXTERNALS",
description="Sanity checks for external software detection",
kwargs=("pkgs",),
kwargs=("pkgs", "debug_log"),
)
@@ -1069,7 +1081,7 @@ def packages_with_detection_tests():
@external_detection
def _test_detection_by_executable(pkgs, error_cls):
def _test_detection_by_executable(pkgs, debug_log, error_cls):
"""Test drive external detection for packages"""
import spack.detection
@@ -1095,6 +1107,7 @@ def _test_detection_by_executable(pkgs, error_cls):
for idx, test_runner in enumerate(
spack.detection.detection_tests(pkg_name, spack.repo.PATH)
):
debug_log(f"[{__file__}]: running test {idx} for package {pkg_name}")
specs = test_runner.execute()
expected_specs = test_runner.expected_specs
@@ -1111,4 +1124,75 @@ def _test_detection_by_executable(pkgs, error_cls):
details = [msg.format(s, idx) for s in sorted(not_expected)]
errors.append(error_cls(summary=summary, details=details))
matched_detection = []
for candidate in expected_specs:
try:
idx = specs.index(candidate)
matched_detection.append((candidate, specs[idx]))
except (AttributeError, ValueError):
pass
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are strings, the expected value is a regex
if isinstance(_expected, str):
try:
_regex = re.compile(_expected)
except re.error:
_summary = f'{pkg_name}: illegal regex in "{_spec}" extra attributes'
_details = [f"{_expected} is not a valid regex"]
return [error_cls(summary=_summary, details=_details)]
if not _regex.match(_detected):
_summary = (
f'{pkg_name}: error when trying to match "{_expected}" '
f"in extra attributes"
)
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
if isinstance(_expected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
_details = [
f'"{_expected}" was expected',
f'"{_detected}" was detected',
] + [f'attribute "{s}" was not detected' for s in sorted(_not_detected)]
result.append(error_cls(summary=_summary, details=_details))
_common = set(_expected.keys()) & set(_detected.keys())
for _key in _common:
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
return result
for expected, detected in matched_detection:
# We might not want to test all attributes, so avoid not_expected
not_detected = set(expected.extra_attributes) - set(detected.extra_attributes)
if not_detected:
summary = f"{pkg_name}: cannot detect some attributes for spec {expected}"
details = [
f'"{s}" was not detected [test_id={idx}]' for s in sorted(not_detected)
]
errors.append(error_cls(summary=summary, details=details))
common = set(expected.extra_attributes) & set(detected.extra_attributes)
for key in common:
errors.extend(
_compare_extra_attribute(
expected.extra_attributes[key],
detected.extra_attributes[key],
_spec=expected,
)
)
return errors
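A condensed sketch of the comparison the new audit performs on extra_attributes (the values are invented): expected strings are treated as regexes, expected dicts are checked key by key, and keys that were expected but not detected are reported.

import re

expected = {"compilers": {"c": r".*/gcc-\d+$", "cxx": r".*/g\+\+-\d+$"}}
detected = {"compilers": {"c": "/usr/bin/gcc-12"}}

missing = set(expected["compilers"]) - set(detected["compilers"])   # {'cxx'} -> reported as not detected
for key in set(expected["compilers"]) & set(detected["compilers"]):
    assert re.compile(expected["compilers"][key]).match(detected["compilers"][key])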

View File

@@ -17,7 +17,6 @@
import tarfile
import tempfile
import time
import traceback
import urllib.error
import urllib.parse
import urllib.request
@@ -30,6 +29,7 @@
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, visit_directory_tree
from llnl.util.symlink import readlink
import spack.caches
import spack.cmd
@@ -111,10 +111,6 @@ def __init__(self, errors):
super().__init__(self.message)
class ListMirrorSpecsError(spack.error.SpackError):
"""Raised when unable to retrieve list of specs from the mirror"""
class BinaryCacheIndex:
"""
The BinaryCacheIndex tracks what specs are available on (usually remote)
@@ -541,83 +537,6 @@ def binary_index_location():
BINARY_INDEX: BinaryCacheIndex = llnl.util.lang.Singleton(BinaryCacheIndex) # type: ignore
class NoOverwriteException(spack.error.SpackError):
"""Raised when a file would be overwritten"""
def __init__(self, file_path):
super().__init__(f"Refusing to overwrite the following file: {file_path}")
class NoGpgException(spack.error.SpackError):
"""
Raised when gpg2 is not in PATH
"""
def __init__(self, msg):
super().__init__(msg)
class NoKeyException(spack.error.SpackError):
"""
Raised when gpg has no default key added.
"""
def __init__(self, msg):
super().__init__(msg)
class PickKeyException(spack.error.SpackError):
"""
Raised when multiple keys can be used to sign.
"""
def __init__(self, keys):
err_msg = "Multiple keys available for signing\n%s\n" % keys
err_msg += "Use spack buildcache create -k <key hash> to pick a key."
super().__init__(err_msg)
class NoVerifyException(spack.error.SpackError):
"""
Raised if file fails signature verification.
"""
pass
class NoChecksumException(spack.error.SpackError):
"""
Raised if file fails checksum verification.
"""
def __init__(self, path, size, contents, algorithm, expected, computed):
super().__init__(
f"{algorithm} checksum failed for {path}",
f"Expected {expected} but got {computed}. "
f"File size = {size} bytes. Contents = {contents!r}",
)
class NewLayoutException(spack.error.SpackError):
"""
Raised if directory layout is different from buildcache.
"""
def __init__(self, msg):
super().__init__(msg)
class InvalidMetadataFile(spack.error.SpackError):
pass
class UnsignedPackageException(spack.error.SpackError):
"""
Raised if installation of unsigned package is attempted without
the use of ``--no-check-signature``.
"""
def compute_hash(data):
if isinstance(data, str):
data = data.encode("utf-8")
@@ -740,7 +659,7 @@ def get_buildfile_manifest(spec):
# 2. paths are used as strings.
for rel_path in visitor.symlinks:
abs_path = os.path.join(root, rel_path)
link = os.readlink(abs_path)
link = readlink(abs_path)
if os.path.isabs(link) and link.startswith(spack.store.STORE.layout.root):
data["link_to_relocate"].append(rel_path)
@@ -992,15 +911,10 @@ def url_read_method(url):
if entry.endswith("spec.json") or entry.endswith("spec.json.sig")
]
read_fn = url_read_method
except KeyError as inst:
msg = "No packages at {0}: {1}".format(cache_prefix, inst)
tty.warn(msg)
except Exception as err:
# If we got some kind of S3 (access denied or other connection
# error), the first non boto-specific class in the exception
# hierarchy is Exception. Just print a warning and return
msg = "Encountered problem listing packages at {0}: {1}".format(cache_prefix, err)
tty.warn(msg)
# If we got some kind of S3 (access denied or other connection error), the first non
# boto-specific class in the exception is Exception. Just print a warning and return
tty.warn(f"Encountered problem listing packages at {cache_prefix}: {err}")
return file_list, read_fn
@@ -1047,11 +961,10 @@ def generate_package_index(cache_prefix, concurrency=32):
"""
try:
file_list, read_fn = _spec_files_from_cache(cache_prefix)
except ListMirrorSpecsError as err:
tty.error("Unable to generate package index, {0}".format(err))
return
except ListMirrorSpecsError as e:
raise GenerateIndexError(f"Unable to generate package index: {e}") from e
tty.debug("Retrieving spec descriptor files from {0} to build index".format(cache_prefix))
tty.debug(f"Retrieving spec descriptor files from {cache_prefix} to build index")
tmpdir = tempfile.mkdtemp()
@@ -1061,27 +974,22 @@ def generate_package_index(cache_prefix, concurrency=32):
try:
_read_specs_and_push_index(file_list, read_fn, cache_prefix, db, db_root_dir, concurrency)
except Exception as err:
msg = "Encountered problem pushing package index to {0}: {1}".format(cache_prefix, err)
tty.warn(msg)
tty.debug("\n" + traceback.format_exc())
except Exception as e:
raise GenerateIndexError(
f"Encountered problem pushing package index to {cache_prefix}: {e}"
) from e
finally:
shutil.rmtree(tmpdir)
shutil.rmtree(tmpdir, ignore_errors=True)
def generate_key_index(key_prefix, tmpdir=None):
"""Create the key index page.
Creates (or replaces) the "index.json" page at the location given in
key_prefix. This page contains an entry for each key (.pub) under
key_prefix.
Creates (or replaces) the "index.json" page at the location given in key_prefix. This page
contains an entry for each key (.pub) under key_prefix.
"""
tty.debug(
" ".join(
("Retrieving key.pub files from", url_util.format(key_prefix), "to build key index")
)
)
tty.debug(f"Retrieving key.pub files from {url_util.format(key_prefix)} to build key index")
try:
fingerprints = (
@@ -1089,17 +997,8 @@ def generate_key_index(key_prefix, tmpdir=None):
for entry in web_util.list_url(key_prefix, recursive=False)
if entry.endswith(".pub")
)
except KeyError as inst:
msg = "No keys at {0}: {1}".format(key_prefix, inst)
tty.warn(msg)
return
except Exception as err:
# If we got some kind of S3 (access denied or other connection
# error), the first non boto-specific class in the exception
# hierarchy is Exception. Just print a warning and return
msg = "Encountered problem listing keys at {0}: {1}".format(key_prefix, err)
tty.warn(msg)
return
except Exception as e:
raise CannotListKeys(f"Encountered problem listing keys at {key_prefix}: {e}") from e
remove_tmpdir = False
@@ -1124,12 +1023,13 @@ def generate_key_index(key_prefix, tmpdir=None):
keep_original=False,
extra_args={"ContentType": "application/json"},
)
except Exception as err:
msg = "Encountered problem pushing key index to {0}: {1}".format(key_prefix, err)
tty.warn(msg)
except Exception as e:
raise GenerateIndexError(
f"Encountered problem pushing key index to {key_prefix}: {e}"
) from e
finally:
if remove_tmpdir:
shutil.rmtree(tmpdir)
shutil.rmtree(tmpdir, ignore_errors=True)
def tarfile_of_spec_prefix(tar: tarfile.TarFile, prefix: str) -> None:
@@ -1200,7 +1100,8 @@ def push_or_raise(spec: Spec, out_url: str, options: PushOptions):
used at the mirror (following <tarball_directory_name>).
This method raises :py:class:`NoOverwriteException` when ``force=False`` and the tarball or
spec.json file already exist in the buildcache.
spec.json file already exist in the buildcache. It raises :py:class:`PushToBuildCacheError`
when the tarball or spec.json file cannot be pushed to the buildcache.
"""
if not spec.concrete:
raise ValueError("spec must be concrete to build tarball")
@@ -1278,13 +1179,18 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
key = select_signing_key(options.key)
sign_specfile(key, options.force, specfile_path)
# push tarball and signed spec json to remote mirror
web_util.push_to_url(spackfile_path, remote_spackfile_path, keep_original=False)
web_util.push_to_url(
signed_specfile_path if not options.unsigned else specfile_path,
remote_signed_specfile_path if not options.unsigned else remote_specfile_path,
keep_original=False,
)
try:
# push tarball and signed spec json to remote mirror
web_util.push_to_url(spackfile_path, remote_spackfile_path, keep_original=False)
web_util.push_to_url(
signed_specfile_path if not options.unsigned else specfile_path,
remote_signed_specfile_path if not options.unsigned else remote_specfile_path,
keep_original=False,
)
except Exception as e:
raise PushToBuildCacheError(
f"Encountered problem pushing binary {remote_spackfile_path}: {e}"
) from e
# push the key to the build cache's _pgp directory so it can be
# imported
@@ -1296,8 +1202,6 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
if options.regenerate_index:
generate_package_index(url_util.join(out_url, os.path.relpath(cache_prefix, stage_dir)))
return None
class NotInstalledError(spack.error.SpackError):
"""Raised when a spec is not installed but picked to be packaged."""
@@ -1352,28 +1256,6 @@ def specs_to_be_packaged(
return [s for s in itertools.chain(roots, deps) if not s.external]
def push(spec: Spec, mirror_url: str, options: PushOptions):
"""Create and push binary package for a single spec to the specified
mirror url.
Args:
spec: Spec to package and push
mirror_url: Desired destination url for binary package
options:
Returns:
True if package was pushed, False otherwise.
"""
try:
push_or_raise(spec, mirror_url, options)
except NoOverwriteException as e:
warnings.warn(str(e))
return False
return True
def try_verify(specfile_path):
"""Utility function to attempt to verify a local file. Assumes the
file is a clearsigned signature file.
@@ -2120,6 +2002,7 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
with spack.util.path.filter_padding():
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, force)
spec.package.windows_establish_runtime_linkage()
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout)
@@ -2706,3 +2589,96 @@ def conditional_fetch(self) -> FetchIndexResult:
raise FetchIndexError(f"Remote index {url_manifest} is invalid")
return FetchIndexResult(etag=None, hash=index_digest.digest, data=result, fresh=False)
class NoOverwriteException(spack.error.SpackError):
"""Raised when a file would be overwritten"""
def __init__(self, file_path):
super().__init__(f"Refusing to overwrite the following file: {file_path}")
class NoGpgException(spack.error.SpackError):
"""
Raised when gpg2 is not in PATH
"""
def __init__(self, msg):
super().__init__(msg)
class NoKeyException(spack.error.SpackError):
"""
Raised when gpg has no default key added.
"""
def __init__(self, msg):
super().__init__(msg)
class PickKeyException(spack.error.SpackError):
"""
Raised when multiple keys can be used to sign.
"""
def __init__(self, keys):
err_msg = "Multiple keys available for signing\n%s\n" % keys
err_msg += "Use spack buildcache create -k <key hash> to pick a key."
super().__init__(err_msg)
class NoVerifyException(spack.error.SpackError):
"""
Raised if file fails signature verification.
"""
pass
class NoChecksumException(spack.error.SpackError):
"""
Raised if file fails checksum verification.
"""
def __init__(self, path, size, contents, algorithm, expected, computed):
super().__init__(
f"{algorithm} checksum failed for {path}",
f"Expected {expected} but got {computed}. "
f"File size = {size} bytes. Contents = {contents!r}",
)
class NewLayoutException(spack.error.SpackError):
"""
Raised if directory layout is different from buildcache.
"""
def __init__(self, msg):
super().__init__(msg)
class InvalidMetadataFile(spack.error.SpackError):
pass
class UnsignedPackageException(spack.error.SpackError):
"""
Raised if installation of unsigned package is attempted without
the use of ``--no-check-signature``.
"""
class ListMirrorSpecsError(spack.error.SpackError):
"""Raised when unable to retrieve list of specs from the mirror"""
class GenerateIndexError(spack.error.SpackError):
"""Raised when unable to generate key or package index for mirror"""
class CannotListKeys(GenerateIndexError):
"""Raised when unable to list keys when generating key index"""
class PushToBuildCacheError(spack.error.SpackError):
"""Raised when unable to push objects to binary mirror"""

View File

@@ -5,7 +5,13 @@
"""Function and classes needed to bootstrap Spack itself."""
from .config import ensure_bootstrap_configuration, is_bootstrapping, store_path
from .core import all_core_root_specs, ensure_core_dependencies, ensure_patchelf_in_path_or_raise
from .core import (
all_core_root_specs,
ensure_clingo_importable_or_raise,
ensure_core_dependencies,
ensure_gpg_in_path_or_raise,
ensure_patchelf_in_path_or_raise,
)
from .environment import BootstrapEnvironment, ensure_environment_dependencies
from .status import status_message
@@ -13,6 +19,8 @@
"is_bootstrapping",
"ensure_bootstrap_configuration",
"ensure_core_dependencies",
"ensure_gpg_in_path_or_raise",
"ensure_clingo_importable_or_raise",
"ensure_patchelf_in_path_or_raise",
"all_core_root_specs",
"ensure_environment_dependencies",

View File

@@ -54,10 +54,14 @@ def _try_import_from_store(
installed_specs = spack.store.STORE.db.query(query_spec, installed=True)
for candidate_spec in installed_specs:
pkg = candidate_spec["python"].package
# previously bootstrapped specs may not have a python-venv dependency.
if candidate_spec.dependencies("python-venv"):
python, *_ = candidate_spec.dependencies("python-venv")
else:
python, *_ = candidate_spec.dependencies("python")
module_paths = [
os.path.join(candidate_spec.prefix, pkg.purelib),
os.path.join(candidate_spec.prefix, pkg.platlib),
os.path.join(candidate_spec.prefix, python.package.purelib),
os.path.join(candidate_spec.prefix, python.package.platlib),
]
path_before = list(sys.path)

View File

@@ -173,35 +173,14 @@ def _read_metadata(self, package_name: str) -> Any:
return data
def _install_by_hash(
self,
pkg_hash: str,
pkg_sha256: str,
index: List[spack.spec.Spec],
bincache_platform: spack.platforms.Platform,
self, pkg_hash: str, pkg_sha256: str, bincache_platform: spack.platforms.Platform
) -> None:
index_spec = next(x for x in index if x.dag_hash() == pkg_hash)
# Reconstruct the compiler that we need to use for bootstrapping
compiler_entry = {
"modules": [],
"operating_system": str(index_spec.os),
"paths": {
"cc": "/dev/null",
"cxx": "/dev/null",
"f77": "/dev/null",
"fc": "/dev/null",
},
"spec": str(index_spec.compiler),
"target": str(index_spec.target.family),
}
with spack.platforms.use_platform(bincache_platform):
with spack.config.override("compilers", [{"compiler": compiler_entry}]):
spec_str = "/" + pkg_hash
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
matches = spack.store.find([spec_str], multiple=False, query_fn=query)
for match in matches:
spack.binary_distribution.install_root_node(
match, unsigned=True, force=True, sha256=pkg_sha256
)
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
for match in spack.store.find([f"/{pkg_hash}"], multiple=False, query_fn=query):
spack.binary_distribution.install_root_node(
match, unsigned=True, force=True, sha256=pkg_sha256
)
def _install_and_test(
self,
@@ -232,7 +211,7 @@ def _install_and_test(
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, index, bincache_platform)
self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)
info: ConfigDictionary = {}
if test_fn(query_spec=abstract_spec, query_info=info):
@@ -291,10 +270,6 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
# This is needed to help the old concretizer taking the `setuptools` dependency
# only when bootstrapping from sources on Python 3.12
if spec_for_current_python() == "python@3.12":
concrete_spec.constrain("+force_setuptools")
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme
@@ -559,6 +534,41 @@ def ensure_patchelf_in_path_or_raise() -> spack.util.executable.Executable:
)
def ensure_winsdk_external_or_raise() -> None:
"""Ensure the Windows SDK + WGL are available on system
If both of these packages are found, the Spack user or bootstrap
configuration (depending on where Spack is running)
will be updated to include all versions and variants detected.
If either the WDK or WSDK are not found, this method will raise
a RuntimeError.
**NOTE:** This modifies the Spack config in the current scope,
either user or environment depending on the calling context.
This is different from all other current bootstrap dependency
checks.
"""
if set(["win-sdk", "wgl"]).issubset(spack.config.get("packages").keys()):
return
externals = spack.detection.by_path(["win-sdk", "wgl"])
if not set(["win-sdk", "wgl"]) == externals.keys():
missing_packages_lst = []
if "wgl" not in externals:
missing_packages_lst.append("wgl")
if "win-sdk" not in externals:
missing_packages_lst.append("win-sdk")
missing_packages = " & ".join(missing_packages_lst)
raise RuntimeError(
f"Unable to find the {missing_packages}, please install these packages \
via the Visual Studio installer \
before proceeding with Spack or provide the path to a non standard install with \
'spack external find --path'"
)
# wgl/sdk are not required for bootstrapping Spack, but
# are required for building anything non trivial
# add to user config so they can be used by subsequent Spack ops
spack.detection.update_configuration(externals, buildable=False)
def ensure_core_dependencies() -> None:
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":

View File

@@ -3,13 +3,11 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap non-core Spack dependencies from an environment."""
import glob
import hashlib
import os
import pathlib
import sys
import warnings
from typing import List
from typing import Iterable, List
import archspec.cpu
@@ -28,6 +26,16 @@
class BootstrapEnvironment(spack.environment.Environment):
"""Environment to install dependencies of Spack for a given interpreter and architecture"""
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
# Remove python package roots created before python-venv was introduced
for s in self.concrete_roots():
if "python" in s.package.extendees and not s.dependencies("python-venv"):
self.deconcretize(s)
@classmethod
def spack_dev_requirements(cls) -> List[str]:
"""Spack development requirements"""
@@ -59,31 +67,19 @@ def view_root(cls) -> pathlib.Path:
return cls.environment_root().joinpath("view")
@classmethod
def pythonpaths(cls) -> List[str]:
"""Paths to be added to sys.path or PYTHONPATH"""
python_dir_part = f"python{'.'.join(str(x) for x in sys.version_info[:2])}"
glob_expr = str(cls.view_root().joinpath("**", python_dir_part, "**"))
result = glob.glob(glob_expr)
if not result:
msg = f"Cannot find any Python path in {cls.view_root()}"
warnings.warn(msg)
return result
@classmethod
def bin_dirs(cls) -> List[pathlib.Path]:
def bin_dir(cls) -> pathlib.Path:
"""Paths to be added to PATH"""
return [cls.view_root().joinpath("bin")]
return cls.view_root().joinpath("bin")
def python_dirs(self) -> Iterable[pathlib.Path]:
python = next(s for s in self.all_specs_generator() if s.name == "python-venv").package
return {self.view_root().joinpath(p) for p in (python.platlib, python.purelib)}
@classmethod
def spack_yaml(cls) -> pathlib.Path:
"""Environment spack.yaml file"""
return cls.environment_root().joinpath("spack.yaml")
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
@@ -100,21 +96,13 @@ def update_installations(self) -> None:
self.install_all()
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
the environment view.
"""
# Do minimal modifications to sys.path and environment variables. In particular, pay
# attention to have the smallest PYTHONPATH / sys.path possible, since that may impact
# the performance of the current interpreter
sys.path.extend(self.pythonpaths())
os.environ["PATH"] = os.pathsep.join(
[str(x) for x in self.bin_dirs()] + os.environ.get("PATH", "").split(os.pathsep)
)
os.environ["PYTHONPATH"] = os.pathsep.join(
os.environ.get("PYTHONPATH", "").split(os.pathsep)
+ [str(x) for x in self.pythonpaths()]
)
def load(self) -> None:
"""Update PATH and sys.path."""
# Make executables available (shouldn't need PYTHONPATH)
os.environ["PATH"] = f"{self.bin_dir()}{os.pathsep}{os.environ.get('PATH', '')}"
# Spack itself imports pytest
sys.path.extend(str(p) for p in self.python_dirs())
def _write_spack_yaml_file(self) -> None:
tty.msg(
@@ -164,4 +152,4 @@ def ensure_environment_dependencies() -> None:
_add_externals_if_missing()
with BootstrapEnvironment() as env:
env.update_installations()
env.update_syspath_and_environ()
env.load()

View File

@@ -43,7 +43,7 @@
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Tuple
from typing import Dict, List, Set, Tuple
import llnl.util.tty as tty
from llnl.string import plural
@@ -57,8 +57,10 @@
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.compilers
import spack.config
import spack.deptypes as dt
import spack.error
import spack.main
import spack.package_base
import spack.paths
@@ -66,6 +68,7 @@
import spack.repo
import spack.schema.environment
import spack.spec
import spack.stage
import spack.store
import spack.subprocess_context
import spack.user_environment
@@ -78,7 +81,7 @@
from spack.installer import InstallError
from spack.util.cpus import determine_number_of_jobs
from spack.util.environment import (
SYSTEM_DIRS,
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
env_flag,
filter_system_paths,
@@ -101,9 +104,13 @@
# Spack's compiler wrappers.
#
SPACK_ENV_PATH = "SPACK_ENV_PATH"
SPACK_MANAGED_DIRS = "SPACK_MANAGED_DIRS"
SPACK_INCLUDE_DIRS = "SPACK_INCLUDE_DIRS"
SPACK_LINK_DIRS = "SPACK_LINK_DIRS"
SPACK_RPATH_DIRS = "SPACK_RPATH_DIRS"
SPACK_STORE_INCLUDE_DIRS = "SPACK_STORE_INCLUDE_DIRS"
SPACK_STORE_LINK_DIRS = "SPACK_STORE_LINK_DIRS"
SPACK_STORE_RPATH_DIRS = "SPACK_STORE_RPATH_DIRS"
SPACK_RPATH_DEPS = "SPACK_RPATH_DEPS"
SPACK_LINK_DEPS = "SPACK_LINK_DEPS"
SPACK_PREFIX = "SPACK_PREFIX"
@@ -416,7 +423,7 @@ def set_compiler_environment_variables(pkg, env):
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", ":".join(SYSTEM_DIRS))
env.set("SPACK_SYSTEM_DIRS", SYSTEM_DIR_CASE_ENTRY)
compiler.setup_custom_environment(pkg, env)
@@ -544,9 +551,26 @@ def update_compiler_args_for_dep(dep):
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
env.set(SPACK_LINK_DIRS, ":".join(link_dirs))
env.set(SPACK_INCLUDE_DIRS, ":".join(include_dirs))
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs))
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
spack_managed_dirs: Set[str] = {
spack.stage.get_stage_root(),
spack.store.STORE.db.root,
*(db.root for db in spack.store.STORE.db.upstream_dbs),
}
spack_managed_dirs.update([os.path.realpath(p) for p in spack_managed_dirs])
env.set(SPACK_MANAGED_DIRS, "|".join(f'"{p}/"*' for p in sorted(spack_managed_dirs)))
is_spack_managed = lambda p: any(p.startswith(store) for store in spack_managed_dirs)
link_dirs_spack, link_dirs_system = stable_partition(link_dirs, is_spack_managed)
include_dirs_spack, include_dirs_system = stable_partition(include_dirs, is_spack_managed)
rpath_dirs_spack, rpath_dirs_system = stable_partition(rpath_dirs, is_spack_managed)
env.set(SPACK_LINK_DIRS, ":".join(link_dirs_system))
env.set(SPACK_INCLUDE_DIRS, ":".join(include_dirs_system))
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs_system))
env.set(SPACK_STORE_LINK_DIRS, ":".join(link_dirs_spack))
env.set(SPACK_STORE_INCLUDE_DIRS, ":".join(include_dirs_spack))
env.set(SPACK_STORE_RPATH_DIRS, ":".join(rpath_dirs_spack))
def set_package_py_globals(pkg, context: Context = Context.BUILD):
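To illustrate the partition above, a small sketch (store and search paths are invented) of how compiler-wrapper search paths split into Spack-managed and system-provided groups, mirroring what stable_partition does:

spack_managed_dirs = {"/opt/spack/opt/spack", "/tmp/spack-stage"}   # store root and stage root, hypothetical

def is_spack_managed(p):
    return any(p.startswith(store) for store in spack_managed_dirs)

rpath_dirs = ["/opt/spack/opt/spack/linux-x86_64/zlib-1.3/lib", "/usr/lib64"]
rpath_dirs_spack = [d for d in rpath_dirs if is_spack_managed(d)]       # -> SPACK_STORE_RPATH_DIRS
rpath_dirs_system = [d for d in rpath_dirs if not is_spack_managed(d)]  # -> SPACK_RPATH_DIRS
# SPACK_MANAGED_DIRS itself is exported as "|"-joined shell case patterns, e.g. "/opt/spack/opt/spack/"*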
@@ -583,10 +607,22 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc, so don't put it context == build)
link_dir = spack.paths.build_env_path
module.spack_cc = os.path.join(link_dir, pkg.compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg.compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg.compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg.compiler.link_paths["fc"])
pkg_compiler = None
try:
pkg_compiler = pkg.compiler
except spack.compilers.NoCompilerForSpecError as e:
tty.debug(f"cannot set 'spack_cc': {str(e)}")
if pkg_compiler is not None:
module.spack_cc = os.path.join(link_dir, pkg_compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg_compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg_compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg_compiler.link_paths["fc"])
else:
module.spack_cc = None
module.spack_cxx = None
module.spack_f77 = None
module.spack_fc = None
# Useful directories within the prefix are encapsulated in
# a Prefix object.
@@ -694,12 +730,28 @@ def _static_to_shared_library(arch, compiler, static_lib, shared_lib=None, **kwa
return compiler(*compiler_args, output=compiler_output)
def get_rpath_deps(pkg):
"""Return immediate or transitive RPATHs depending on the package."""
if pkg.transitive_rpaths:
return [d for d in pkg.spec.traverse(root=False, deptype=("link"))]
else:
return pkg.spec.dependencies(deptype="link")
def _get_rpath_deps_from_spec(
spec: spack.spec.Spec, transitive_rpaths: bool
) -> List[spack.spec.Spec]:
if not transitive_rpaths:
return spec.dependencies(deptype=dt.LINK)
by_name: Dict[str, spack.spec.Spec] = {}
for dep in spec.traverse(root=False, deptype=dt.LINK):
lookup = by_name.get(dep.name)
if lookup is None:
by_name[dep.name] = dep
elif lookup.version < dep.version:
by_name[dep.name] = dep
return list(by_name.values())
def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]:
"""Return immediate or transitive dependencies (depending on the package) that need to be
rpath'ed. If a package occurs multiple times, the newest version is kept."""
return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)
def get_rpaths(pkg):
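A toy sketch of the deduplication rule above: when the same link dependency appears more than once in the traversal, the newest version wins (names and versions are made up).

deps = [("zlib", (1, 2, 13)), ("zlib", (1, 3, 0)), ("bzip2", (1, 0, 8))]  # (name, version) stand-ins
by_name = {}
for name, version in deps:
    if name not in by_name or by_name[name] < version:
        by_name[name] = version
sorted(by_name.items())   # [('bzip2', (1, 0, 8)), ('zlib', (1, 3, 0))]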
@@ -789,7 +841,7 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
for mod in ["cray-mpich", "cray-libsci"]:
module("unload", mod)
if target.module_name:
if target and target.module_name:
load_module(target.module_name)
load_external_modules(pkg)

View File

@@ -434,11 +434,6 @@ def _do_patch_libtool(self):
r"crtendS\.o",
]:
x.filter(regex=(rehead + o), repl="")
elif self.pkg.compiler.name == "dpcpp":
# Hack to filter out spurious predep_objects when building with Intel dpcpp
# (see https://github.com/spack/spack/issues/32863):
x.filter(regex=r"^(predep_objects=.*)/tmp/conftest-[0-9A-Fa-f]+\.o", repl=r"\1")
x.filter(regex=r"^(predep_objects=.*)/tmp/a-[0-9A-Fa-f]+\.o", repl=r"\1")
elif self.pkg.compiler.name == "nag":
for tag in ["fc", "f77"]:
marker = markers[tag]

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
import os
import re
from typing import Tuple
import llnl.util.filesystem as fs
@@ -15,6 +16,12 @@
from .cmake import CMakeBuilder, CMakePackage
def spec_uses_toolchain(spec):
gcc_toolchain_regex = re.compile(".*gcc-toolchain.*")
using_toolchain = list(filter(gcc_toolchain_regex.match, spec.compiler_flags["cxxflags"]))
return using_toolchain
def cmake_cache_path(name, value, comment="", force=False):
"""Generate a string for a cmake cache variable"""
force_str = " FORCE" if force else ""
@@ -213,7 +220,7 @@ def initconfig_mpi_entries(self):
else:
# starting with cmake 3.10, FindMPI expects MPIEXEC_EXECUTABLE
# vs the older versions which expect MPIEXEC
if self.pkg.spec["cmake"].satisfies("@3.10:"):
if spec["cmake"].satisfies("@3.10:"):
entries.append(cmake_cache_path("MPIEXEC_EXECUTABLE", mpiexec))
else:
entries.append(cmake_cache_path("MPIEXEC", mpiexec))
@@ -248,12 +255,17 @@ def initconfig_hardware_entries(self):
# Include the deprecated CUDA_TOOLKIT_ROOT_DIR for supporting BLT packages
entries.append(cmake_cache_path("CUDA_TOOLKIT_ROOT_DIR", cudatoolkitdir))
archs = spec.variants["cuda_arch"].value
if archs[0] != "none":
arch_str = ";".join(archs)
entries.append(
cmake_cache_string("CMAKE_CUDA_ARCHITECTURES", "{0}".format(arch_str))
)
# CUDA_FLAGS
cuda_flags = []
if not spec.satisfies("cuda_arch=none"):
cuda_archs = ";".join(spec.variants["cuda_arch"].value)
entries.append(cmake_cache_string("CMAKE_CUDA_ARCHITECTURES", cuda_archs))
if spec_uses_toolchain(spec):
cuda_flags.append("-Xcompiler {}".format(spec_uses_toolchain(spec)[0]))
entries.append(cmake_cache_string("CMAKE_CUDA_FLAGS", " ".join(cuda_flags)))
if "+rocm" in spec:
entries.append("#------------------{0}".format("-" * 30))
@@ -262,9 +274,6 @@ def initconfig_hardware_entries(self):
# Explicitly setting HIP_ROOT_DIR may be a patch that is no longer necessary
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
entries.append(
cmake_cache_path("HIP_CXX_COMPILER", "{0}".format(self.spec["hip"].hipcc))
)
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
@@ -277,11 +286,9 @@ def initconfig_hardware_entries(self):
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)
entries.append(
cmake_cache_string("CMAKE_HIP_ARCHITECTURES", "{0}".format(arch_str))
)
entries.append(cmake_cache_string("AMDGPU_TARGETS", "{0}".format(arch_str)))
entries.append(cmake_cache_string("GPU_TARGETS", "{0}".format(arch_str)))
entries.append(cmake_cache_string("CMAKE_HIP_ARCHITECTURES", arch_str))
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
return entries

View File

@@ -16,7 +16,7 @@
class CargoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
"""Specialized class for packages built using cargo."""
#: This attribute is used in UI queries that need to know the build
#: system base class

View File

@@ -39,16 +39,11 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
"""Set the PYTHON_EXECUTABLE, Python_EXECUTABLE, and Python3_EXECUTABLE CMake variables
if the package has Python as build or link dep and ``find_python_hints`` is set to True. See
``find_python_hints`` for context."""
if not getattr(pkg, "find_python_hints", False):
if not getattr(pkg, "find_python_hints", False) or not pkg.spec.dependencies(
"python", dt.BUILD | dt.LINK
):
return
pythons = pkg.spec.dependencies("python", dt.BUILD | dt.LINK)
if len(pythons) != 1:
return
try:
python_executable = pythons[0].package.command.path
except RuntimeError:
return
python_executable = pkg.spec["python"].command.path
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),

View File

@@ -0,0 +1,144 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import os
import pathlib
import re
import sys
from typing import Dict, List, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty
import spack.compiler
import spack.package_base
# Local "type" for type hints
Path = Union[str, pathlib.Path]
class CompilerPackage(spack.package_base.PackageBase):
"""A Package mixin for all common logic for packages that implement compilers"""
# TODO: how do these play nicely with other tags
tags: Sequence[str] = ["compiler"]
#: Optional suffix regexes for searching for this type of compiler.
#: Suffixes are used by some frameworks, e.g. macports uses an '-mp-X.Y'
#: version suffix for gcc.
compiler_suffixes: List[str] = [r"-.*"]
#: Optional prefix regexes for searching for this compiler
compiler_prefixes: List[str] = []
#: Compiler argument(s) that produces version information
#: If multiple arguments, the earlier arguments must produce errors when invalid
compiler_version_argument: Union[str, Tuple[str]] = "-dumpversion"
#: Regex used to extract version from compiler's output
compiler_version_regex: str = "(.*)"
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
msg += f" supports: {self.supported_languages}, valid values: {self.compiler_languages}"
assert set(self.supported_languages) <= set(self.compiler_languages), msg
@property
def supported_languages(self) -> Sequence[str]:
"""Dynamic definition of languages supported by this package"""
return self.compiler_languages
@classproperty
def compiler_names(cls) -> Sequence[str]:
"""Construct list of compiler names from per-language names"""
names = []
for language in cls.compiler_languages:
names.extend(getattr(cls, f"{language}_names"))
return names
@classproperty
def executables(cls) -> Sequence[str]:
"""Construct executables for external detection from names, prefixes, and suffixes."""
regexp_fmt = r"^({0}){1}({2})$"
prefixes = [""] + cls.compiler_prefixes
suffixes = [""] + cls.compiler_suffixes
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
suffixes += [suf + ext for suf in suffixes]
return [
regexp_fmt.format(prefix, re.escape(name), suffix)
for prefix, name, suffix in itertools.product(prefixes, cls.compiler_names, suffixes)
]
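For example, a compiler package whose c_names include "gcc" (an assumption for this sketch) would generate patterns such as the one below from the empty prefix and the "-.*" suffix:

import re

pattern = r"^()gcc(-.*)$"                 # regexp_fmt.format("", "gcc", "-.*")
bool(re.match(pattern, "gcc-mp-13"))      # True  (macports-style version suffix)
bool(re.match(pattern, "gcc"))            # False (the plain name is matched by the "" suffix pattern instead)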
@classmethod
def determine_version(cls, exe: Path):
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = spack.compiler.get_compiler_version_output(exe, va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
except spack.util.executable.ProcessError:
pass
except Exception as e:
tty.debug(
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
"""Overridable method for the location of the compiler bindir within the preifx"""
return os.path.join(prefix, "bin")
@classmethod
def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
"""Compute the paths to compiler executables associated with this package
This is a helper method for ``determine_variants`` to compute the ``extra_attributes``
to include with each spec object."""
# There are often at least two copies (not symlinks) of each compiler executable in the
# same directory: one with a canonical name, e.g. "gfortran", and another one with the
# target prefix, e.g. "x86_64-pc-linux-gnu-gfortran". There also might be a copy of "gcc"
# with the version suffix, e.g. "x86_64-pc-linux-gnu-gcc-6.3.0". To ensure the consistency
# of values in the "paths" dictionary (i.e. we prefer all of them to reference copies
# with canonical names if possible), we iterate over the executables in the reversed sorted
# order:
# First pass over languages identifies exes that are perfect matches for canonical names
# Second pass checks for names with prefix/suffix
# Second pass is sorted by language name length because longer named languages
# e.g. cxx can often contain the names of shorter named languages
# e.g. c (e.g. clang/clang++)
paths = {}
exes = sorted(exes, reverse=True)
languages = {
lang: getattr(cls, f"{lang}_names")
for lang in sorted(cls.compiler_languages, key=len, reverse=True)
}
for exe in exes:
for lang, names in languages.items():
if os.path.basename(exe) in names:
paths[lang] = exe
break
else:
for lang, names in languages.items():
if any(name in os.path.basename(exe) for name in names):
paths[lang] = exe
break
return paths
@classmethod
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}

View File

@@ -137,11 +137,14 @@ def cuda_flags(arch_list):
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.4")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.3")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.4")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -21,7 +21,7 @@
class MakefilePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
"""Specialized class for packages built using Makefiles."""
#: This attribute is used in UI queries that need to know the build
#: system base class

View File

@@ -145,7 +145,7 @@ def install(self, pkg, spec, prefix):
opts += self.nmake_install_args()
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
opts.append(self.define("PREFIX", prefix))
opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.install_targets, ignore_quotes=self.ignore_quotes

View File

@@ -14,7 +14,7 @@
from llnl.util.link_tree import LinkTree
from spack.build_environment import dso_suffix
from spack.directives import conflicts, variant
from spack.directives import conflicts, license, redistribute, variant
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -26,10 +26,11 @@ class IntelOneApiPackage(Package):
"""Base class for Intel oneAPI packages."""
homepage = "https://software.intel.com/oneapi"
license("https://intel.ly/393CijO")
# oneAPI license does not allow mirroring outside of the
# organization (e.g. University/Company).
redistribute_source = False
redistribute(source=False, binary=False)
for c in [
"target=ppc64:",

View File

@@ -138,16 +138,21 @@ def view_file_conflicts(self, view, merge_map):
return conflicts
def add_files_to_view(self, view, merge_map, skip_if_exists=True):
# Patch up shebangs to the python linked in the view only if python is built by Spack.
if not self.extendee_spec or self.extendee_spec.external:
# Patch up shebangs if the package extends Python and we put a Python interpreter in the
# view.
if not self.extendee_spec:
return super().add_files_to_view(view, merge_map, skip_if_exists)
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
if python.external:
return super().add_files_to_view(view, merge_map, skip_if_exists)
# We only patch shebangs in the bin directory.
copied_files: Dict[Tuple[int, int], str] = {} # File identifier -> source
delayed_links: List[Tuple[str, str]] = [] # List of symlinks from merge map
bin_dir = self.spec.prefix.bin
python_prefix = self.extendee_spec.prefix
for src, dst in merge_map.items():
if skip_if_exists and os.path.lexists(dst):
continue
@@ -168,7 +173,7 @@ def add_files_to_view(self, view, merge_map, skip_if_exists=True):
copied_files[(s.st_dev, s.st_ino)] = dst
shutil.copy2(src, dst)
fs.filter_file(
python_prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
python.prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
)
else:
view.link(src, dst)
@@ -199,14 +204,13 @@ def remove_files_from_view(self, view, merge_map):
ignore_namespace = True
bin_dir = self.spec.prefix.bin
global_view = self.extendee_spec.prefix == view.get_projection_for_spec(self.spec)
to_remove = []
for src, dst in merge_map.items():
if ignore_namespace and namespace_init(dst):
continue
if global_view or not fs.path_contains_subdirectory(src, bin_dir):
if not fs.path_contains_subdirectory(src, bin_dir):
to_remove.append(dst)
else:
os.remove(dst)
@@ -362,6 +366,12 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return f"https://pypi.org/simple/{name}/"
return None
@property
def python_spec(self):
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
@property
def headers(self) -> HeaderList:
"""Discover header files in platlib."""
@@ -371,8 +381,9 @@ def headers(self) -> HeaderList:
# Headers should only be in include or platlib, but no harm in checking purelib too
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
@@ -391,8 +402,9 @@ def libs(self) -> LibraryList:
name = self.spec.name[3:]
# Libraries should only be in platlib, but no harm in checking purelib too
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
find_all_libraries = functools.partial(fs.find_all_libraries, recursive=True)
libs_list = map(find_all_libraries, [platlib, purelib])
@@ -504,6 +516,8 @@ def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
"""Install everything from build directory."""
pip = spec["python"].command
pip.add_default_arg("-m", "pip")
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]
@@ -519,14 +533,6 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
else:
args.append(".")
pip = spec["python"].command
# Hide user packages, since we don't have build isolation. This is
# necessary because pip / setuptools may run hooks from arbitrary
# packages during the build. There is no equivalent variable to hide
# system packages, so this is not reliable for external Python.
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
with fs.working_dir(self.build_directory):
pip(*args)
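A minimal, self-contained sketch of the shebang-repointing idea above, assuming a plain-text script; the names are hypothetical and the real builder uses fs.filter_file on the files it copies into the view.
def repoint_shebang(script_path, old_prefix, new_prefix):
    # Rewrite only the first line, and only if it is a shebang mentioning old_prefix.
    with open(script_path, "r", encoding="utf-8", errors="surrogateescape") as f:
        text = f.read()
    first, sep, rest = text.partition("\n")
    if first.startswith("#!") and old_prefix in first:
        with open(script_path, "w", encoding="utf-8", errors="surrogateescape") as f:
            f.write(first.replace(old_prefix, new_prefix) + sep + rest)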

View File

@@ -75,9 +75,12 @@
# does not like its directory structure.
#
import os
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package_base import PackageBase
from spack.util.environment import EnvironmentModifications
class ROCmPackage(PackageBase):
@@ -154,6 +157,25 @@ def hip_flags(amdgpu_target):
archs = ",".join(amdgpu_target)
return "--amdgpu-target={0}".format(archs)
def asan_on(self, env: EnvironmentModifications):
llvm_path = self.spec["llvm-amdgpu"].prefix
env.set("CC", llvm_path + "/bin/clang")
env.set("CXX", llvm_path + "/bin/clang++")
env.set("ASAN_OPTIONS", "detect_leaks=0")
for root, _, files in os.walk(llvm_path):
if "libclang_rt.asan-x86_64.so" in files:
asan_lib_path = root
env.prepend_path("LD_LIBRARY_PATH", asan_lib_path)
if "rhel" in self.spec.os or "sles" in self.spec.os:
SET_DWARF_VERSION_4 = "-gdwarf-5"
else:
SET_DWARF_VERSION_4 = ""
env.set("CFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("CXXFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("LDFLAGS", "-Wl,--enable-new-dtags -fuse-ld=lld -fsanitize=address -g -Wl,")
# HIP version vs Architecture
# TODO: add a bunch of lines like:
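A standalone sketch of the lookup asan_on performs above: walk a compiler prefix until the ASan runtime shows up, then use that directory for LD_LIBRARY_PATH. The prefix and library name here are just examples.
import os
def find_library_dir(prefix, libname="libclang_rt.asan-x86_64.so"):
    # Return the first directory under prefix containing libname, or None.
    for root, _, files in os.walk(prefix):
        if libname in files:
            return root
    return None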

File diff suppressed because it is too large

View File

@@ -334,8 +334,7 @@ def display_specs(specs, args=None, **kwargs):
variants (bool): Show variants with specs
indent (int): indent each line this much
groups (bool): display specs grouped by arch/compiler (default True)
decorators (dict): dictionary mapping specs to decorators
header_callback (typing.Callable): called at start of arch/compiler groups
decorator (typing.Callable): function to call to decorate specs
all_headers (bool): show headers even when arch/compiler aren't defined
output (typing.IO): A file object to write to. Default is ``sys.stdout``
@@ -384,15 +383,13 @@ def get_arg(name, default=None):
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + ffmt + vfmt
transform = {"package": decorator, "fullpackage": decorator}
def fmt(s, depth=0):
"""Formatter function for all output specs"""
string = ""
if hashes:
string += gray_hash(s, hlen) + " "
string += depth * " "
string += s.cformat(format_string, transform=transform)
string += decorator(s, s.cformat(format_string))
return string
def format_list(specs):
@@ -451,7 +448,7 @@ def filter_loaded_specs(specs):
return [x for x in specs if x.dag_hash() in hashes]
def print_how_many_pkgs(specs, pkg_type=""):
def print_how_many_pkgs(specs, pkg_type="", suffix=""):
"""Given a list of specs, this will print a message about how many
specs are in that list.
@@ -462,7 +459,7 @@ def print_how_many_pkgs(specs, pkg_type=""):
category, e.g. if pkg_type is "installed" then the message
would be "3 installed packages"
"""
tty.msg("%s" % llnl.string.plural(len(specs), pkg_type + " package"))
tty.msg("%s" % llnl.string.plural(len(specs), pkg_type + " package") + suffix)
def spack_is_git_repo():

View File

@@ -84,7 +84,7 @@ def externals(parser, args):
return
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs, debug_log=tty.debug)
_process_reports(reports)

View File

@@ -13,7 +13,6 @@
import shutil
import sys
import tempfile
import urllib.request
from typing import Dict, List, Optional, Tuple, Union
import llnl.util.tty as tty
@@ -54,6 +53,7 @@
from spack.oci.oci import (
copy_missing_layers_with_retry,
get_manifest_and_config_with_retry,
list_tags,
upload_blob_with_retry,
upload_manifest_with_retry,
)
@@ -133,6 +133,11 @@ def setup_parser(subparser: argparse.ArgumentParser):
help="when pushing to an OCI registry, tag an image containing all root specs and their "
"runtime dependencies",
)
push.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(push, ["specs", "jobs"])
push.set_defaults(func=push_fn)
@@ -275,23 +280,37 @@ def setup_parser(subparser: argparse.ArgumentParser):
# Sync buildcache entries from one mirror to another
sync = subparsers.add_parser("sync", help=sync_fn.__doc__)
sync.add_argument(
"--manifest-glob", help="a quoted glob pattern identifying copy manifest files"
sync_manifest_source = sync.add_argument_group(
"Manifest Source",
"Specify a list of build cache objects to sync using manifest file(s)."
'This option takes the place of the "source mirror" for synchronization'
'and optionally takes a "destination mirror" ',
)
sync.add_argument(
sync_manifest_source.add_argument(
"--manifest-glob", help="a quoted glob pattern identifying CI rebuild manifest files"
)
sync_source_mirror = sync.add_argument_group(
"Named Source",
"Specify a single registered source mirror to synchronize from. This option requires"
"the specification of a destination mirror.",
)
sync_source_mirror.add_argument(
"src_mirror",
metavar="source mirror",
type=arguments.mirror_name_or_url,
nargs="?",
type=arguments.mirror_name_or_url,
help="source mirror name, path, or URL",
)
sync.add_argument(
"dest_mirror",
metavar="destination mirror",
type=arguments.mirror_name_or_url,
nargs="?",
type=arguments.mirror_name_or_url,
help="destination mirror name, path, or URL",
)
sync.set_defaults(func=sync_fn)
# Update buildcache index without copying any additional packages
@@ -353,6 +372,25 @@ def _make_pool() -> MaybePool:
return NoPool()
def _skip_no_redistribute_for_public(specs):
remaining_specs = list()
removed_specs = list()
for spec in specs:
if spec.package.redistribute_binary:
remaining_specs.append(spec)
else:
removed_specs.append(spec)
if removed_specs:
colified_output = tty.colify.colified(list(s.name for s in removed_specs), indent=4)
tty.debug(
"The following specs will not be added to the binary cache"
" because they cannot be redistributed:\n"
f"{colified_output}\n"
"You can use `--private` to include them."
)
return remaining_specs
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
@@ -403,6 +441,8 @@ def push_fn(args):
root="package" in args.things_to_install,
dependencies="dependencies" in args.things_to_install,
)
if not args.private:
specs = _skip_no_redistribute_for_public(specs)
# When pushing multiple specs, print the url once ahead of time, as well as how
# many specs are being pushed.
@@ -816,10 +856,7 @@ def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
def _update_index_oci(image_ref: ImageReference, tmpdir: str, pool: MaybePool) -> None:
request = urllib.request.Request(url=image_ref.tags_url())
response = spack.oci.opener.urlopen(request)
spack.oci.opener.ensure_status(request, response, 200)
tags = json.load(response)["tags"]
tags = list_tags(image_ref)
# Fetch all image config files in parallel
spec_dicts = pool.starmap(
@@ -1070,7 +1107,17 @@ def sync_fn(args):
requires an active environment in order to know which specs to sync
"""
if args.manifest_glob:
manifest_copy(glob.glob(args.manifest_glob))
# Passing the args.src_mirror here because it is not possible to
# have the destination be required when specifying a named source
# mirror and optional for the --manifest-glob argument. In the case
# of manifest glob sync, the source mirror positional argument is the
# destination mirror if it is specified. If there are two mirrors
# specified, the second is ignored and the first is the override
# destination.
if args.dest_mirror:
tty.warn(f"Ignoring unused arguemnt: {args.dest_mirror.name}")
manifest_copy(glob.glob(args.manifest_glob), args.src_mirror)
return 0
if args.src_mirror is None or args.dest_mirror is None:
@@ -1121,7 +1168,7 @@ def sync_fn(args):
shutil.rmtree(tmpdir)
def manifest_copy(manifest_file_list):
def manifest_copy(manifest_file_list, dest_mirror=None):
"""Read manifest files containing information about specific specs to copy
from source to destination, remove duplicates since any binary package for
a given hash should be the same as any other, and copy all files specified
@@ -1135,10 +1182,17 @@ def manifest_copy(manifest_file_list):
# Last duplicate hash wins
deduped_manifest[spec_hash] = copy_list
build_cache_dir = bindist.build_cache_relative_path()
for spec_hash, copy_list in deduped_manifest.items():
for copy_file in copy_list:
tty.debug("copying {0} to {1}".format(copy_file["src"], copy_file["dest"]))
copy_buildcache_file(copy_file["src"], copy_file["dest"])
dest = copy_file["dest"]
if dest_mirror:
src_relative_path = os.path.join(
build_cache_dir, copy_file["src"].rsplit(build_cache_dir, 1)[1].lstrip("/")
)
dest = url_util.join(dest_mirror.push_url, src_relative_path)
tty.debug("copying {0} to {1}".format(copy_file["src"], dest))
copy_buildcache_file(copy_file["src"], dest)
def update_index(mirror: spack.mirror.Mirror, update_keys=False):
@@ -1165,14 +1219,18 @@ def update_index(mirror: spack.mirror.Mirror, update_keys=False):
url, bindist.build_cache_relative_path(), bindist.build_cache_keys_relative_path()
)
bindist.generate_key_index(keys_url)
try:
bindist.generate_key_index(keys_url)
except bindist.CannotListKeys as e:
# Do not error out if listing keys went wrong. This usually means that the _gpg path
# does not exist. TODO: distinguish between this and other errors.
tty.warn(f"did not update the key index: {e}")
def update_index_fn(args):
"""update a buildcache index"""
update_index(args.mirror, update_keys=args.keys)
return update_index(args.mirror, update_keys=args.keys)
def buildcache(parser, args):
if args.func:
args.func(args)
return args.func(args)
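A sketch of the path arithmetic manifest_copy now does when a destination mirror is given: keep everything from build_cache onward and graft it onto the mirror's push URL. Values are hypothetical; the real code uses url_util.join.
import posixpath
def rebase_on_mirror(src, build_cache_dir, push_url):
    relative = posixpath.join(build_cache_dir, src.rsplit(build_cache_dir, 1)[1].lstrip("/"))
    return posixpath.join(push_url, relative)
# rebase_on_mirror("/old/build_cache/linux/pkg.spack", "build_cache", "s3://new-mirror")
# -> "s3://new-mirror/build_cache/linux/pkg.spack"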

View File

@@ -183,7 +183,7 @@ def checksum(parser, args):
print()
if args.add_to_package:
add_versions_to_package(pkg, version_lines)
add_versions_to_package(pkg, version_lines, args.batch)
def print_checksum_status(pkg: PackageBase, version_hashes: dict):
@@ -229,7 +229,7 @@ def print_checksum_status(pkg: PackageBase, version_hashes: dict):
tty.die("Invalid checksums found.")
def add_versions_to_package(pkg: PackageBase, version_lines: str):
def add_versions_to_package(pkg: PackageBase, version_lines: str, is_batch: bool):
"""
Add checksummed versions to a package's instructions and open a user's
editor so they may double check the work of the function.
@@ -282,5 +282,5 @@ def add_versions_to_package(pkg: PackageBase, version_lines: str):
tty.msg(f"Added {num_versions_added} new versions to {pkg.name}")
tty.msg(f"Open {filename} to review the additions.")
if sys.stdout.isatty():
if sys.stdout.isatty() and not is_batch:
editor(filename)

View File

@@ -14,6 +14,7 @@
import spack.binary_distribution as bindist
import spack.ci as spack_ci
import spack.cmd
import spack.cmd.buildcache as buildcache
import spack.config as cfg
import spack.environment as ev
@@ -30,14 +31,20 @@
level = "long"
SPACK_COMMAND = "spack"
MAKE_COMMAND = "make"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
def deindent(desc):
return desc.replace("    ", "")
def unicode_escape(path: str) -> str:
"""Returns transformed path with any unicode
characters replaced with their corresponding escapes"""
return path.encode("unicode-escape").decode("utf-8")
def setup_parser(subparser):
setup_parser.parser = subparser
subparsers = subparser.add_subparsers(help="CI sub-commands")
@@ -549,75 +556,35 @@ def ci_rebuild(args):
# No hash match anywhere means we need to rebuild spec
# Start with spack arguments
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose"]
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose", "install"]
config = cfg.get("config")
if not config["verify_ssl"]:
spack_cmd.append("-k")
install_args = []
install_args = [f'--use-buildcache={spack_ci.win_quote("package:never,dependencies:only")}']
can_verify = spack_ci.can_verify_binaries()
verify_binaries = can_verify and spack_is_pr_pipeline is False
if not verify_binaries:
install_args.append("--no-check-signature")
slash_hash = "/{}".format(job_spec.dag_hash())
# Arguments when installing dependencies from cache
deps_install_args = install_args
slash_hash = spack_ci.win_quote("/" + job_spec.dag_hash())
# Arguments when installing the root from sources
root_install_args = install_args + [
"--keep-stage",
"--only=package",
"--use-buildcache=package:never,dependencies:only",
]
deps_install_args = install_args + ["--only=dependencies"]
root_install_args = install_args + ["--keep-stage", "--only=package"]
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
root_install_args.extend(cdash_handler.args())
root_install_args.append(slash_hash)
# ["x", "y"] -> "'x' 'y'"
args_to_string = lambda args: " ".join("'{}'".format(arg) for arg in args)
commands = [
# apparently there's a race when spack bootstraps? do it up front once
[SPACK_COMMAND, "-e", env.path, "bootstrap", "now"],
[
SPACK_COMMAND,
"-e",
env.path,
"env",
"depfile",
"-o",
"Makefile",
"--use-buildcache=package:never,dependencies:only",
slash_hash, # limit to spec we're building
],
[
# --output-sync requires GNU make 4.x.
# Old make errors when you pass it a flag it doesn't recognize,
# but it doesn't error or warn when you set unrecognized flags in
# this variable.
"export",
"GNUMAKEFLAGS=--output-sync=recurse",
],
[
MAKE_COMMAND,
"SPACK={}".format(args_to_string(spack_cmd)),
"SPACK_COLOR=always",
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
),
],
spack_cmd + ["install"] + root_install_args,
[SPACK_COMMAND, "-e", unicode_escape(env.path), "bootstrap", "now"],
spack_cmd + deps_install_args + [slash_hash],
spack_cmd + root_install_args + [slash_hash],
]
tty.debug("Installing {0} from source".format(job_spec.name))
install_exit_code = spack_ci.process_command("install", commands, repro_dir)
@@ -705,11 +672,9 @@ def ci_rebuild(args):
cdash_handler.report_skipped(job_spec, reports_dir, reason=msg)
cdash_handler.copy_test_results(reports_dir, job_test_dir)
# If the install succeeded, create a buildcache entry for this job spec
# and push it to one or more mirrors. If the install did not succeed,
# print out some instructions on how to reproduce this build failure
# outside of the pipeline environment.
if install_exit_code == 0:
# If the install succeeded, push it to one or more mirrors. Failure to push to any mirror
# will result in a non-zero exit code. Pushing is best-effort.
mirror_urls = [buildcache_mirror_url]
# TODO: Remove this block in Spack 0.23
@@ -721,13 +686,12 @@ def ci_rebuild(args):
destination_mirror_urls=mirror_urls,
sign_binaries=spack_ci.can_sign_binaries(),
):
msg = tty.msg if result.success else tty.warn
msg(
"{} {} to {}".format(
"Pushed" if result.success else "Failed to push",
job_spec.format("{name}{@version}{/hash:7}", color=clr.get_color_when()),
result.url,
)
if not result.success:
install_exit_code = FAILED_CREATE_BUILDCACHE_CODE
(tty.msg if result.success else tty.error)(
f'{"Pushed" if result.success else "Failed to push"} '
f'{job_spec.format("{name}{@version}{/hash:7}", color=clr.get_color_when())} '
f"to {result.url}"
)
# If this is a develop pipeline, check if the spec that we just built is
@@ -748,22 +712,22 @@ def ci_rebuild(args):
tty.warn(msg.format(broken_spec_path, err))
else:
# If the install did not succeed, print out some instructions on how to reproduce this
# build failure outside of the pipeline environment.
tty.debug("spack install exited non-zero, will not create buildcache")
api_root_url = os.environ.get("CI_API_V4_URL")
ci_project_id = os.environ.get("CI_PROJECT_ID")
ci_job_id = os.environ.get("CI_JOB_ID")
repro_job_url = "{0}/projects/{1}/jobs/{2}/artifacts".format(
api_root_url, ci_project_id, ci_job_id
)
repro_job_url = f"{api_root_url}/projects/{ci_project_id}/jobs/{ci_job_id}/artifacts"
# Control characters cause this to be printed in blue so it stands out
reproduce_msg = """
print(
f"""
\033[34mTo reproduce this build locally, run:
spack ci reproduce-build {0} [--working-dir <dir>] [--autostart]
spack ci reproduce-build {repro_job_url} [--working-dir <dir>] [--autostart]
If this project does not have public pipelines, you will need to first:
@@ -771,12 +735,9 @@ def ci_rebuild(args):
... then follow the printed instructions.\033[0;0m
""".format(
repro_job_url
"""
)
print(reproduce_msg)
rebuild_timer.stop()
try:
with open("install_timers.json", "w") as timelog:

View File

@@ -563,12 +563,13 @@ def add_concretizer_args(subparser):
help="reuse installed packages/buildcaches when possible",
)
subgroup.add_argument(
"--fresh-roots",
"--reuse-deps",
action=ConfigSetAction,
dest="concretizer:reuse",
const="dependencies",
default=None,
help="reuse installed dependencies only",
help="concretize with fresh roots and reused dependencies",
)
subgroup.add_argument(
"--deprecated",

View File

@@ -9,13 +9,14 @@
import shutil
import sys
import tempfile
from typing import Optional
from pathlib import Path
from typing import List, Optional
import llnl.string as string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
from llnl.util.tty.color import cescape, colorize
import spack.cmd
import spack.cmd.common
@@ -44,6 +45,7 @@
"deactivate",
"create",
["remove", "rm"],
["rename", "mv"],
["list", "ls"],
["status", "st"],
"loads",
@@ -59,14 +61,7 @@
#
def env_create_setup_parser(subparser):
"""create a new environment"""
subparser.add_argument(
"env_name",
metavar="env",
help=(
"name of managed environment or directory of the anonymous env "
"(when using --dir/-d) to activate"
),
)
subparser.add_argument("env_name", metavar="env", help="name or directory of environment")
subparser.add_argument(
"-d", "--dir", action="store_true", help="create an environment in a specific directory"
)
@@ -92,6 +87,9 @@ def env_create_setup_parser(subparser):
default=None,
help="either a lockfile (must end with '.json' or '.lock') or a manifest file",
)
subparser.add_argument(
"--include-concrete", action="append", help="name of old environment to copy specs from"
)
def env_create(args):
@@ -109,19 +107,32 @@ def env_create(args):
# the environment should not include a view.
with_view = None
include_concrete = None
if hasattr(args, "include_concrete"):
include_concrete = args.include_concrete
env = _env_create(
args.env_name,
init_file=args.envfile,
dir=args.dir,
dir=args.dir or os.path.sep in args.env_name or args.env_name in (".", ".."),
with_view=with_view,
keep_relative=args.keep_relative,
include_concrete=include_concrete,
)
# Generate views, only really useful for environments created from spack.lock files.
env.regenerate_views()
def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
def _env_create(
name_or_path: str,
*,
init_file: Optional[str] = None,
dir: bool = False,
with_view: Optional[str] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
):
"""Create a new environment, with an optional yaml description.
Arguments:
@@ -133,22 +144,31 @@ def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep
keep_relative (bool): if True, develop paths are copied verbatim into
the new environment file, otherwise they may be made absolute if the
new environment is in a different location
include_concrete (list): list of the included concrete environments
"""
if not dir:
env = ev.create(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg("Created environment '%s' in %s" % (name_or_path, env.path))
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % (name_or_path))
return env
env = ev.create_in_dir(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
)
tty.msg("Created environment in %s" % env.path)
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % env.path)
tty.msg(
colorize(
f"Created environment @c{{{cescape(name_or_path)}}} in: @c{{{cescape(env.path)}}}"
)
)
else:
env = ev.create_in_dir(
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg(colorize(f"Created independent environment in: @c{{{cescape(env.path)}}}"))
tty.msg(f"Activate with: {colorize(f'@c{{spack env activate {cescape(name_or_path)}}}')}")
return env
@@ -434,6 +454,12 @@ def env_remove_setup_parser(subparser):
"""remove an existing environment"""
subparser.add_argument("rm_env", metavar="env", nargs="+", help="environment(s) to remove")
arguments.add_common_arguments(subparser, ["yes_to_all"])
subparser.add_argument(
"-f",
"--force",
action="store_true",
help="remove the environment even if it is included in another environment",
)
def env_remove(args):
@@ -443,13 +469,35 @@ def env_remove(args):
and manifests embedded in repositories should be removed manually.
"""
read_envs = []
valid_envs = []
bad_envs = []
for env_name in args.rm_env:
invalid_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
read_envs.append(env)
valid_envs.append(env_name)
if env_name in args.rm_env:
read_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
bad_envs.append(env_name)
invalid_envs.append(env_name)
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if env is linked to another before trying to remove
for name in valid_envs:
# don't check if environment is included to itself
if name == env_name:
continue
environ = ev.Environment(ev.root(name))
if ev.root(env_name) in environ.included_concrete_envs:
msg = f'Environment "{env_name}" is being used by environment "{name}"'
if args.force:
tty.warn(msg)
else:
tty.die(msg)
if not args.yes_to_all:
environments = string.plural(len(args.rm_env), "environment", show_n=False)
@@ -472,11 +520,82 @@ def env_remove(args):
tty.msg(f"Successfully removed environment '{bad_env_name}'")
#
# env rename
#
def env_rename_setup_parser(subparser):
"""rename an existing environment"""
subparser.add_argument(
"mv_from", metavar="from", help="name (or path) of existing environment"
)
subparser.add_argument(
"mv_to", metavar="to", help="new name (or path) for existing environment"
)
subparser.add_argument(
"-d",
"--dir",
action="store_true",
help="the specified arguments correspond to directory paths",
)
subparser.add_argument(
"-f", "--force", action="store_true", help="allow overwriting of an existing environment"
)
def env_rename(args):
"""Rename an environment.
This renames a managed environment or moves an anonymous environment.
"""
# Directory option has been specified
if args.dir:
if not ev.is_env_dir(args.mv_from):
tty.die("The specified path does not correspond to a valid spack environment")
from_path = Path(args.mv_from)
if not args.force:
if ev.is_env_dir(args.mv_to):
tty.die(
"The new path corresponds to an existing environment;"
" specify the --force flag to overwrite it."
)
if Path(args.mv_to).exists():
tty.die("The new path already exists; specify the --force flag to overwrite it.")
to_path = Path(args.mv_to)
# Name option being used
elif ev.exists(args.mv_from):
from_path = ev.environment.environment_dir_from_name(args.mv_from)
if not args.force and ev.exists(args.mv_to):
tty.die(
"The new name corresponds to an existing environment;"
" specify the --force flag to overwrite it."
)
to_path = ev.environment.root(args.mv_to)
# Neither
else:
tty.die("The specified name does not correspond to a managed spack environment")
# Guard against renaming from or to an active environment
active_env = ev.active_environment()
if active_env:
from_env = ev.Environment(from_path)
if from_env.path == active_env.path:
tty.die("Cannot rename active environment")
if to_path == active_env.path:
tty.die(f"{args.mv_to} is an active environment")
shutil.rmtree(to_path, ignore_errors=True)
fs.rename(from_path, to_path)
tty.msg(f"Successfully renamed environment {args.mv_from} to {args.mv_to}")
#
# env list
#
def env_list_setup_parser(subparser):
"""list available environments"""
"""list managed environments"""
def env_list(args):

View File

@@ -14,6 +14,7 @@
import spack.cmd as cmd
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
@@ -69,6 +70,12 @@ def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["long", "very_long", "tags", "namespaces"])
subparser.add_argument(
"-r",
"--only-roots",
action="store_true",
help="don't show full list of installed specs in an environment",
)
subparser.add_argument(
"-c",
"--show-concretized",
@@ -140,6 +147,12 @@ def setup_parser(subparser):
subparser.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--install-tree",
action="store",
default="all",
help="Install trees to query: 'all' (default), 'local', 'upstream', upstream name or path",
)
subparser.add_argument("--start-date", help="earliest date of installation [YYYY-MM-DD]")
subparser.add_argument("--end-date", help="latest date of installation [YYYY-MM-DD]")
@@ -168,6 +181,12 @@ def query_arguments(args):
q_args = {"installed": installed, "known": known, "explicit": explicit}
install_tree = args.install_tree
upstreams = spack.config.get("upstreams", {})
if install_tree in upstreams.keys():
install_tree = upstreams[install_tree]["install_tree"]
q_args["install_tree"] = install_tree
# Time window of installation
for attribute in ("start_date", "end_date"):
date = getattr(args, attribute)
@@ -177,26 +196,22 @@ def query_arguments(args):
return q_args
def setup_env(env):
def make_env_decorator(env):
"""Create a function for decorating specs when in an environment."""
def strip_build(seq):
return set(s.copy(deps=("link", "run")) for s in seq)
added = set(strip_build(env.added_specs()))
roots = set(strip_build(env.roots()))
removed = set(strip_build(env.removed_specs()))
roots = set(env.roots())
removed = set(env.removed_specs())
def decorator(spec, fmt):
# add +/-/* to show added/removed/root specs
if any(spec.dag_hash() == r.dag_hash() for r in roots):
return color.colorize("@*{%s}" % fmt)
return color.colorize(f"@*{{{fmt}}}")
elif spec in removed:
return color.colorize("@K{%s}" % fmt)
return color.colorize(f"@K{{{fmt}}}")
else:
return "%s" % fmt
return fmt
return decorator, added, roots, removed
return decorator
def display_env(env, args, decorator, results):
@@ -211,10 +226,54 @@ def display_env(env, args, decorator, results):
"""
tty.msg("In environment %s" % env.name)
if not env.user_specs:
tty.msg("No root specs")
else:
tty.msg("Root specs")
num_roots = len(env.user_specs) or "No"
tty.msg(f"{num_roots} root specs")
concrete_specs = {
root: concrete_root
for root, concrete_root in zip(env.concretized_user_specs, env.concrete_roots())
}
def root_decorator(spec, string):
"""Decorate root specs with their install status if needed"""
concrete = concrete_specs.get(spec)
if concrete:
status = color.colorize(concrete.install_status().value)
hash = concrete.dag_hash()
else:
status = color.colorize(spack.spec.InstallStatus.absent.value)
hash = "-" * 32
# TODO: status has two extra spaces on the end of it, but fixing this and other spec
# TODO: space format idiosyncrasies is complicated. Fix this eventually
status = status[:-2]
if args.long or args.very_long:
hash = color.colorize(f"@K{{{hash[: 7 if args.long else None]}}}")
return f"{status} {hash} {string}"
else:
return f"{status} {string}"
with spack.store.STORE.db.read_transaction():
cmd.display_specs(
env.user_specs,
args,
# these are overrides of CLI args
paths=False,
long=False,
very_long=False,
# these enforce details in the root specs to show what the user asked for
namespaces=True,
show_flags=True,
show_full_compiler=True,
decorator=root_decorator,
variants=True,
)
print()
if env.included_concrete_envs:
tty.msg("Included specs")
# Root specs cannot be displayed with prefixes, since those are not
# set for abstract specs. Same for hashes
@@ -224,10 +283,10 @@ def display_env(env, args, decorator, results):
# Roots are displayed with variants, etc. so that we can see
# specifically what the user asked for.
cmd.display_specs(
env.user_specs,
env.included_user_specs,
root_args,
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespaces=True,
namespace=True,
show_flags=True,
show_full_compiler=True,
variants=True,
@@ -242,7 +301,7 @@ def display_env(env, args, decorator, results):
# Display a header for the installed packages section IF there are installed
# packages. If there aren't any, we'll just end up printing "0 installed packages"
# later.
if results:
if results and not args.only_roots:
tty.msg("Installed packages")
@@ -251,9 +310,10 @@ def find(parser, args):
results = args.specs(**q_args)
env = ev.active_environment()
decorator = lambda s, f: f
if env:
decorator, _, roots, _ = setup_env(env)
if not env and args.only_roots:
tty.die("-r / --only-roots requires an active environment")
decorator = make_env_decorator(env) if env else lambda s, f: f
# use groups by default except with format.
if args.groups is None:
@@ -280,9 +340,12 @@ def find(parser, args):
if env:
display_env(env, args, decorator, results)
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = " (not shown)"
if not args.only_roots:
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = ""
# print number of installed packages last (as the list may be long)
if sys.stdout.isatty() and args.groups:
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(results, pkg_type)
spack.cmd.print_how_many_pkgs(results, pkg_type, suffix=count_suffix)
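A small sketch of the decorator-factory pattern behind make_env_decorator: capture the root/removed sets once, then return a cheap per-spec formatter. The data and markers are stand-ins for the real color markup.
def make_decorator(roots, removed):
    def decorate(name, text):
        if name in roots:
            return f"*{text}"   # stand-in for the bold @*{...} markup
        if name in removed:
            return f"-{text}"   # stand-in for the grey @K{...} markup
        return text
    return decorate
decorate = make_decorator(roots={"hdf5"}, removed={"zlib"})
# decorate("hdf5", "hdf5@1.14.3") -> "*hdf5@1.14.3"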

View File

@@ -263,8 +263,8 @@ def _fmt_name_and_default(variant):
return color.colorize(f"@c{{{variant.name}}} @C{{[{_fmt_value(variant.default)}]}}")
def _fmt_when(when, indent):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(when)}")
def _fmt_when(when: "spack.spec.Spec", indent: int):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(str(when))}")
def _fmt_variant_description(variant, width, indent):
@@ -441,7 +441,7 @@ def get_url(version):
return "No URL"
url = get_url(preferred) if pkg.has_code else ""
line = version(" {0}".format(pad(preferred))) + color.cescape(url)
line = version(" {0}".format(pad(preferred))) + color.cescape(str(url))
color.cwrite(line)
print()
@@ -464,7 +464,7 @@ def get_url(version):
continue
for v, url in vers:
line = version(" {0}".format(pad(v))) + color.cescape(url)
line = version(" {0}".format(pad(v))) + color.cescape(str(url))
color.cprint(line)
@@ -475,10 +475,7 @@ def print_virtuals(pkg, args):
color.cprint(section_title("Virtual Packages: "))
if pkg.provided:
for when, specs in reversed(sorted(pkg.provided.items())):
line = " %s provides %s" % (
when.colorized(),
", ".join(s.colorized() for s in specs),
)
line = " %s provides %s" % (when.cformat(), ", ".join(s.cformat() for s in specs))
print(line)
else:
@@ -497,7 +494,9 @@ def print_licenses(pkg, args):
pad = padder(pkg.licenses, 4)
for when_spec in pkg.licenses:
license_identifier = pkg.licenses[when_spec]
line = license(" {0}".format(pad(license_identifier))) + color.cescape(when_spec)
line = license(" {0}".format(pad(license_identifier))) + color.cescape(
str(when_spec)
)
color.cprint(line)

View File

@@ -420,10 +420,9 @@ def install_with_active_env(env: ev.Environment, args, install_kwargs, reporter_
with reporter_factory(specs_to_install):
env.install_specs(specs_to_install, **install_kwargs)
finally:
# TODO: this is doing way too much to trigger
# views and modules to be generated.
with env.write_transaction():
env.write(regenerate=True)
if env.views:
with env.write_transaction():
env.write(regenerate=True)
def concrete_specs_from_cli(args, install_kwargs):

View File

@@ -5,8 +5,6 @@
import sys
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.find
import spack.environment as ev
@@ -70,16 +68,6 @@ def setup_parser(subparser):
help="load the first match if multiple packages match the spec",
)
subparser.add_argument(
"--only",
default="package,dependencies",
dest="things_to_load",
choices=["package", "dependencies"],
help="select whether to load the package and its dependencies\n\n"
"the default is to load the package and all dependencies. alternatively, "
"one can decide to load only the package or only the dependencies",
)
subparser.add_argument(
"--list",
action="store_true",
@@ -110,11 +98,6 @@ def load(parser, args):
)
return 1
if args.things_to_load != "package,dependencies":
tty.warn(
"The `--only` flag in spack load is deprecated and will be removed in Spack v0.22"
)
with spack.store.STORE.db.read_transaction():
env_mod = uenv.environment_modifications_for_specs(*specs)
for spec in specs:

View File

@@ -71,6 +71,11 @@ def setup_parser(subparser):
help="the number of versions to fetch for each spec, choose 'all' to"
" retrieve all versions of each package",
)
create_parser.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(create_parser, ["specs"])
arguments.add_concretizer_args(create_parser)
@@ -108,6 +113,11 @@ def setup_parser(subparser):
"and source use `--type binary --type source` (default)"
),
)
add_parser.add_argument(
"--autopush",
action="store_true",
help=("set mirror to push automatically after installation"),
)
add_parser_signed = add_parser.add_mutually_exclusive_group(required=False)
add_parser_signed.add_argument(
"--unsigned",
@@ -175,6 +185,21 @@ def setup_parser(subparser):
),
)
set_parser.add_argument("--url", help="url of mirror directory from 'spack mirror create'")
set_parser_autopush = set_parser.add_mutually_exclusive_group(required=False)
set_parser_autopush.add_argument(
"--autopush",
help="set mirror to push automatically after installation",
action="store_true",
default=None,
dest="autopush",
)
set_parser_autopush.add_argument(
"--no-autopush",
help="set mirror to not push automatically after installation",
action="store_false",
default=None,
dest="autopush",
)
set_parser_unsigned = set_parser.add_mutually_exclusive_group(required=False)
set_parser_unsigned.add_argument(
"--unsigned",
@@ -218,6 +243,7 @@ def mirror_add(args):
or args.type
or args.oci_username
or args.oci_password
or args.autopush
or args.signed is not None
):
connection = {"url": args.url}
@@ -234,6 +260,8 @@ def mirror_add(args):
if args.type:
connection["binary"] = "binary" in args.type
connection["source"] = "source" in args.type
if args.autopush:
connection["autopush"] = args.autopush
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirror.Mirror(connection, name=args.name)
@@ -270,6 +298,8 @@ def _configure_mirror(args):
changes["access_pair"] = [args.oci_username, args.oci_password]
if getattr(args, "signed", None) is not None:
changes["signed"] = args.signed
if getattr(args, "autopush", None) is not None:
changes["autopush"] = args.autopush
# argparse cannot distinguish between --binary and --no-binary when same dest :(
# notice that set-url does not have these args, so getattr
@@ -334,7 +364,6 @@ def concrete_specs_from_user(args):
specs = filter_externals(specs)
specs = list(set(specs))
specs.sort(key=lambda s: (s.name, s.version))
specs, _ = lang.stable_partition(specs, predicate_fn=not_excluded_fn(args))
return specs
@@ -379,36 +408,50 @@ def concrete_specs_from_cli_or_file(args):
return specs
def not_excluded_fn(args):
"""Return a predicate that evaluate to True if a spec was not explicitly
excluded by the user.
"""
exclude_specs = []
if args.exclude_file:
exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
class IncludeFilter:
def __init__(self, args):
self.exclude_specs = []
if args.exclude_file:
self.exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
self.exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
self.private = args.private
def not_excluded(x):
return not any(x.satisfies(y) for y in exclude_specs)
def __call__(self, x):
return all([self._not_license_excluded(x), self._not_cmdline_excluded(x)])
return not_excluded
def _not_license_excluded(self, x):
"""True if the spec is for a private mirror, or as long as the
package does not explicitly forbid redistributing source."""
if self.private:
return True
elif x.package_class.redistribute_source(x):
return True
else:
tty.debug(
"Skip adding {0} to mirror: the package.py file"
" indicates that a public mirror should not contain"
" it.".format(x.name)
)
return False
def _not_cmdline_excluded(self, x):
"""True if a spec was not explicitly excluded by the user."""
return not any(x.satisfies(y) for y in self.exclude_specs)
def concrete_specs_from_environment(selection_fn):
def concrete_specs_from_environment():
env = ev.active_environment()
assert env, "an active environment is required"
mirror_specs = env.all_specs()
mirror_specs = filter_externals(mirror_specs)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
def all_specs_with_all_versions(selection_fn):
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
@@ -429,12 +472,6 @@ def versions_per_spec(args):
return num_versions
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def process_mirror_stats(present, mirrored, error):
p, m, e = len(present), len(mirrored), len(error)
tty.msg(
@@ -480,30 +517,28 @@ def mirror_create(args):
# When no directory is provided, the source dir is used
path = args.directory or spack.caches.fetch_cache_location()
mirror_specs, mirror_fn = _specs_and_action(args)
mirror_fn(mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions)
def _specs_and_action(args):
include_fn = IncludeFilter(args)
if args.all and not ev.active_environment():
create_mirror_for_all_specs(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = all_specs_with_all_versions()
mirror_fn = create_mirror_for_all_specs
elif args.all and ev.active_environment():
mirror_specs = concrete_specs_from_environment()
mirror_fn = create_mirror_for_individual_specs
else:
mirror_specs = concrete_specs_from_user(args)
mirror_fn = create_mirror_for_individual_specs
if args.all and ev.active_environment():
create_mirror_for_all_specs_inside_environment(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = concrete_specs_from_user(args)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions
)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=include_fn)
return mirror_specs, mirror_fn
def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
mirror_specs = all_specs_with_all_versions(selection_fn=selection_fn)
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
@@ -515,11 +550,10 @@ def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_all_specs_inside_environment(path, skip_unstable_versions, selection_fn):
mirror_specs = concrete_specs_from_environment(selection_fn=selection_fn)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=skip_unstable_versions
)
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def mirror_destroy(args):
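A standalone sketch of the callable-filter idea behind IncludeFilter: combine several predicates and keep only the items that pass all of them. The predicates and sample data are made up.
class AllOf:
    def __init__(self, *predicates):
        self.predicates = predicates
    def __call__(self, item):
        return all(p(item) for p in self.predicates)
keep = AllOf(lambda s: not s.startswith("cuda"), lambda s: "@develop" not in s)
# [s for s in ["zlib@1.3", "cuda@12.4", "gcc@develop"] if keep(s)] -> ["zlib@1.3"]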

View File

@@ -116,39 +116,38 @@ def ipython_interpreter(args):
def python_interpreter(args):
"""A python interpreter is the default interpreter"""
# Fake a main python shell by setting __name__ to __main__.
console = code.InteractiveConsole({"__name__": "__main__", "spack": spack})
if "PYTHONSTARTUP" in os.environ:
startup_file = os.environ["PYTHONSTARTUP"]
if os.path.isfile(startup_file):
with open(startup_file) as startup:
console.runsource(startup.read(), startup_file, "exec")
if args.python_command:
propagate_exceptions_from(console)
console.runsource(args.python_command)
elif args.python_args:
propagate_exceptions_from(console)
if args.python_args and not args.python_command:
sys.argv = args.python_args
with open(args.python_args[0]) as file:
console.runsource(file.read(), args.python_args[0], "exec")
runpy.run_path(args.python_args[0], run_name="__main__")
else:
# Provides readline support, allowing user to use arrow keys
console.push("import readline")
# Provide tabcompletion
console.push("from rlcompleter import Completer")
console.push("readline.set_completer(Completer(locals()).complete)")
console.push('readline.parse_and_bind("tab: complete")')
# Fake a main python shell by setting __name__ to __main__.
console = code.InteractiveConsole({"__name__": "__main__", "spack": spack})
if "PYTHONSTARTUP" in os.environ:
startup_file = os.environ["PYTHONSTARTUP"]
if os.path.isfile(startup_file):
with open(startup_file) as startup:
console.runsource(startup.read(), startup_file, "exec")
if args.python_command:
propagate_exceptions_from(console)
console.runsource(args.python_command)
else:
# Provides readline support, allowing user to use arrow keys
console.push("import readline")
# Provide tabcompletion
console.push("from rlcompleter import Completer")
console.push("readline.set_completer(Completer(locals()).complete)")
console.push('readline.parse_and_bind("tab: complete")')
console.interact(
"Spack version %s\nPython %s, %s %s"
% (
spack.spack_version,
platform.python_version(),
platform.system(),
platform.machine(),
console.interact(
"Spack version %s\nPython %s, %s %s"
% (
spack.spack_version,
platform.python_version(),
platform.system(),
platform.machine(),
)
)
)
def propagate_exceptions_from(console):
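For reference, a minimal sketch of what the runpy path above does: run a script file as __main__ with the remaining arguments visible in sys.argv. The script name is hypothetical.
import runpy
import sys
sys.argv = ["example_script.py", "--flag"]   # hypothetical script and argument
runpy.run_path(sys.argv[0], run_name="__main__")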

View File

@@ -91,7 +91,6 @@ def setup_parser(subparser):
def _process_result(result, show, required_format, kwargs):
result.raise_if_unsat()
opt, _, _ = min(result.answers)
if ("opt" in show) and (not required_format):
tty.msg("Best of %d considered solutions." % result.nmodels)

View File

@@ -23,7 +23,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.21"
tutorial_branch = "releases/v0.22"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -151,7 +151,8 @@ def is_installed(spec):
key=lambda s: s.dag_hash(),
)
return [spec for spec in specs if is_installed(spec)]
with spack.store.STORE.db.read_transaction():
return [spec for spec in specs if is_installed(spec)]
def dependent_environments(
@@ -239,6 +240,8 @@ def get_uninstall_list(args, specs: List[spack.spec.Spec], env: Optional[ev.Envi
print()
tty.info("The following environments still reference these specs:")
colify([e.name for e in other_dependent_envs.keys()], indent=4)
if env:
msgs.append("use `spack remove` to remove the spec from the current environment")
msgs.append("use `spack env remove` to remove environments")
msgs.append("use `spack uninstall --force` to override")
print()

View File

@@ -34,6 +34,13 @@ def setup_parser(subparser):
default=False,
help="show full pytest help, with advanced options",
)
subparser.add_argument(
"-n",
"--numprocesses",
type=int,
default=1,
help="run tests in parallel up to this wide, default 1 for sequential",
)
# extra spack arguments to list tests
list_group = subparser.add_argument_group("listing tests")
@@ -207,8 +214,6 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
# Note: skip on windows here because for the moment,
# clingo is wholly unsupported from bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:
@@ -229,6 +234,16 @@ def unit_test(parser, args, unknown_args):
if args.extension:
pytest_root = spack.extensions.load_extension(args.extension)
if args.numprocesses is not None and args.numprocesses > 1:
pytest_args.extend(
[
"--dist",
"loadfile",
"--tx",
f"{args.numprocesses}*popen//python=spack-tmpconfig spack python",
]
)
# pytest.ini lives in the root of the spack repository.
with llnl.util.filesystem.working_dir(pytest_root):
if args.list:

View File

@@ -20,8 +20,10 @@
import spack.compilers
import spack.error
import spack.schema.environment
import spack.spec
import spack.util.executable
import spack.util.libc
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
@@ -107,7 +109,6 @@ def _parse_link_paths(string):
"""
lib_search_paths = False
raw_link_dirs = []
tty.debug("parsing implicit link info")
for line in string.splitlines():
if lib_search_paths:
if line.startswith("\t"):
@@ -122,7 +123,7 @@ def _parse_link_paths(string):
continue
if _LINKER_LINE_IGNORE.match(line):
continue
tty.debug("linker line: %s" % line)
tty.debug(f"implicit link dirs: link line: {line}")
next_arg = False
for arg in line.split():
@@ -138,15 +139,12 @@ def _parse_link_paths(string):
link_dir_arg = _LINK_DIR_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("linkdir: %s" % link_dir)
raw_link_dirs.append(link_dir)
link_dir_arg = _LIBPATH_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("libpath: %s", link_dir)
raw_link_dirs.append(link_dir)
tty.debug("found raw link dirs: %s" % ", ".join(raw_link_dirs))
implicit_link_dirs = list()
visited = set()
@@ -156,7 +154,7 @@ def _parse_link_paths(string):
implicit_link_dirs.append(normalized_path)
visited.add(normalized_path)
tty.debug("found link dirs: %s" % ", ".join(implicit_link_dirs))
tty.debug(f"implicit link dirs: result: {', '.join(implicit_link_dirs)}")
return implicit_link_dirs
@@ -417,17 +415,35 @@ def real_version(self):
self._real_version = self.version
return self._real_version
def implicit_rpaths(self):
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
return []
# Put CXX first since it has the most linking issues
# And because it has flags that affect linking
link_dirs = self._get_compiler_link_paths()
output = self.compiler_verbose_output
if not output:
return []
link_dirs = _parse_non_system_link_dirs(output)
all_required_libs = list(self.required_libs) + Compiler._all_compiler_rpath_libraries
return list(paths_containing_libs(link_dirs, all_required_libs))
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
output = self.compiler_verbose_output
if not output:
return None
dynamic_linker = spack.util.libc.parse_dynamic_linker(output)
if not dynamic_linker:
return None
return spack.util.libc.libc_from_dynamic_linker(dynamic_linker)
@property
def required_libs(self):
"""For executables created with this compiler, the compiler libraries
@@ -436,17 +452,17 @@ def required_libs(self):
# By default every compiler returns the empty list
return []
def _get_compiler_link_paths(self):
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
if not hasattr(self, "_compile_c_source_output"):
self._compile_c_source_output = self._compile_dummy_c_source()
return self._compile_c_source_output
def _compile_dummy_c_source(self) -> Optional[str]:
cc = self.cc if self.cc else self.cxx
if not cc or not self.verbose_flag:
# Cannot determine implicit link paths without a compiler / verbose flag
return []
# What flag types apply to first_compiler, in what order
if cc == self.cc:
flags = ["cflags", "cppflags", "ldflags"]
else:
flags = ["cxxflags", "cppflags", "ldflags"]
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
@@ -458,20 +474,19 @@ def _get_compiler_link_paths(self):
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
cc_exe = spack.util.executable.Executable(cc)
for flag_type in flags:
for flag_type in ["cflags" if cc == self.cc else "cxxflags", "cppflags", "ldflags"]:
cc_exe.add_default_arg(*self.flags.get(flag_type, []))
with self.compiler_environment():
output = cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
return _parse_non_system_link_dirs(output)
return cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
except spack.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)
return []
return None
finally:
shutil.rmtree(tmpdir, ignore_errors=True)
@property
def verbose_flag(self):
def verbose_flag(self) -> Optional[str]:
"""
This property should be overridden in the compiler subclass if a
verbose flag is available.
@@ -669,8 +684,8 @@ def __str__(self):
@contextlib.contextmanager
def compiler_environment(self):
# yield immediately if no modules
if not self.modules:
# Avoid modifying os.environ if possible.
if not self.modules and not self.environment:
yield
return
@@ -687,13 +702,9 @@ def compiler_environment(self):
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
env = spack.util.environment.EnvironmentModifications()
env.extend(spack.schema.environment.parse(self.environment))
env.apply_modifications()
spack.schema.environment.parse(self.environment).apply_modifications()
yield
except BaseException:
raise
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()
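A toy sketch of the cache-on-first-use property pattern compiler_verbose_output uses above; the class and probe are hypothetical, while the real code caches the verbose output of compiling a dummy C file.
class Probe:
    def _expensive_probe(self):
        return "probed"
    @property
    def output(self):
        # Compute once, then reuse the stored value.
        if not hasattr(self, "_output"):
            self._output = self._expensive_probe()
        return self._output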

View File

@@ -10,6 +10,7 @@
import itertools
import multiprocessing.pool
import os
import warnings
from typing import Dict, List, Optional, Tuple
import archspec.cpu
@@ -109,27 +110,33 @@ def _to_dict(compiler):
return {"compiler": d}
def get_compiler_config(scope=None, init_config=False):
def get_compiler_config(
configuration: "spack.config.Configuration",
*,
scope: Optional[str] = None,
init_config: bool = False,
) -> List[Dict]:
"""Return the compiler configuration for the specified architecture."""
config = spack.config.get("compilers", scope=scope) or []
config = configuration.get("compilers", scope=scope) or []
if config or not init_config:
return config
merged_config = spack.config.get("compilers")
merged_config = configuration.get("compilers")
if merged_config:
# Config is empty for this scope
# Do not init config because there is a non-empty scope
return config
_init_compiler_config(scope=scope)
config = spack.config.get("compilers", scope=scope)
_init_compiler_config(configuration, scope=scope)
config = configuration.get("compilers", scope=scope)
return config
def get_compiler_config_from_packages(scope=None):
def get_compiler_config_from_packages(
configuration: "spack.config.Configuration", *, scope: Optional[str] = None
) -> List[Dict]:
"""Return the compiler configuration from packages.yaml"""
config = spack.config.get("packages", scope=scope)
config = configuration.get("packages", scope=scope)
if not config:
return []
@@ -149,7 +156,15 @@ def get_compiler_config_from_packages(scope=None):
def _compiler_config_from_package_config(config):
compilers = []
for entry in config:
compiler = _compiler_config_from_external(entry)
try:
compiler = _compiler_config_from_external(entry)
except Exception as e:
msg = "Reading compiler from packages config section failed\n"
msg += f" Compiler: {entry.get('spec', None)}\n"
msg += f" Prefix: {entry.get('prefix', None)}\n"
msg += f" Failure: {e}"
warnings.warn(msg)
compiler = None
if compiler:
compilers.append(compiler)
@@ -157,33 +172,56 @@ def _compiler_config_from_package_config(config):
def _compiler_config_from_external(config):
extra_attributes_key = "extra_attributes"
compilers_key = "compilers"
c_key, cxx_key, fortran_key = "c", "cxx", "fortran"
# Allow `@x.y.z` instead of `@=x.y.z`
spec = spack.spec.parse_with_version_concrete(config["spec"])
# use str(spec.versions) to allow `@x.y.z` instead of `@=x.y.z`
compiler_spec = spack.spec.CompilerSpec(
package_name_to_compiler_name.get(spec.name, spec.name), spec.version
)
extra_attributes = config.get("extra_attributes", {})
prefix = config.get("prefix", None)
err_header = f"The external spec '{spec}' cannot be used as a compiler"
compiler_class = class_for_compiler_name(compiler_spec.name)
paths = extra_attributes.get("paths", {})
compiler_langs = ["cc", "cxx", "fc", "f77"]
for lang in compiler_langs:
if paths.setdefault(lang, None):
continue
if not prefix:
continue
# Check for files that satisfy the naming scheme for this compiler
bindir = os.path.join(prefix, "bin")
for f, regex in itertools.product(os.listdir(bindir), compiler_class.search_regexps(lang)):
if regex.match(f):
paths[lang] = os.path.join(bindir, f)
if all(v is None for v in paths.values()):
# If extra_attributes is not there I might not want to use this entry as a compiler,
# therefore just leave a debug message, but don't be loud with a warning.
if extra_attributes_key not in config:
tty.debug(f"[{__file__}] {err_header}: missing the '{extra_attributes_key}' key")
return None
extra_attributes = config[extra_attributes_key]
# If I have 'extra_attributes' warn if 'compilers' is missing, or we don't have a C compiler
if compilers_key not in extra_attributes:
warnings.warn(
f"{err_header}: missing the '{compilers_key}' key under '{extra_attributes_key}'"
)
return None
attribute_compilers = extra_attributes[compilers_key]
if c_key not in attribute_compilers:
warnings.warn(
f"{err_header}: missing the C compiler path under "
f"'{extra_attributes_key}:{compilers_key}'"
)
return None
c_compiler = attribute_compilers[c_key]
# C++ and Fortran compilers are not mandatory, so let's just leave a debug trace
if cxx_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a C++ compiler")
if fortran_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a Fortran compiler")
# compilers format has cc/fc/f77, externals format has "c/fortran"
paths = {
"cc": c_compiler,
"cxx": attribute_compilers.get(cxx_key, None),
"fc": attribute_compilers.get(fortran_key, None),
"f77": attribute_compilers.get(fortran_key, None),
}
if not spec.architecture:
host_platform = spack.platforms.host()
@@ -216,13 +254,15 @@ def _compiler_config_from_external(config):
return compiler_entry
def _init_compiler_config(*, scope):
def _init_compiler_config(
configuration: "spack.config.Configuration", *, scope: Optional[str]
) -> None:
"""Compiler search used when Spack has no compilers."""
compilers = find_compilers()
compilers_dict = []
for compiler in compilers:
compilers_dict.append(_to_dict(compiler))
spack.config.set("compilers", compilers_dict, scope=scope)
configuration.set("compilers", compilers_dict, scope=scope)
def compiler_config_files():
@@ -233,7 +273,7 @@ def compiler_config_files():
compiler_config = config.get("compilers", scope=name)
if compiler_config:
config_files.append(config.get_config_filename(name, "compilers"))
compiler_config_from_packages = get_compiler_config_from_packages(scope=name)
compiler_config_from_packages = get_compiler_config_from_packages(config, scope=name)
if compiler_config_from_packages:
config_files.append(config.get_config_filename(name, "packages"))
return config_files
@@ -246,7 +286,9 @@ def add_compilers_to_config(compilers, scope=None):
compilers: a list of Compiler objects.
scope: configuration scope to modify.
"""
compiler_config = get_compiler_config(scope, init_config=False)
compiler_config = get_compiler_config(
configuration=spack.config.CONFIG, scope=scope, init_config=False
)
for compiler in compilers:
if not compiler.cc:
tty.debug(f"{compiler.spec} does not have a C compiler")
@@ -295,7 +337,9 @@ def _remove_compiler_from_scope(compiler_spec, scope):
True if one or more compiler entries were actually removed, False otherwise
"""
assert scope is not None, "a specific scope is needed when calling this function"
compiler_config = get_compiler_config(scope, init_config=False)
compiler_config = get_compiler_config(
configuration=spack.config.CONFIG, scope=scope, init_config=False
)
filtered_compiler_config = [
compiler_entry
for compiler_entry in compiler_config
@@ -310,21 +354,28 @@ def _remove_compiler_from_scope(compiler_spec, scope):
# We need to preserve the YAML type for comments, hence we are copying the
# items in the list that has just been retrieved
compiler_config[:] = filtered_compiler_config
spack.config.set("compilers", compiler_config, scope=scope)
spack.config.CONFIG.set("compilers", compiler_config, scope=scope)
return True
def all_compilers_config(scope=None, init_config=True):
def all_compilers_config(
configuration: "spack.config.Configuration",
*,
scope: Optional[str] = None,
init_config: bool = True,
) -> List["spack.compiler.Compiler"]:
"""Return a set of specs for all the compiler versions currently
available to build with. These are instances of CompilerSpec.
"""
from_packages_yaml = get_compiler_config_from_packages(scope)
from_packages_yaml = get_compiler_config_from_packages(configuration, scope=scope)
if from_packages_yaml:
init_config = False
from_compilers_yaml = get_compiler_config(scope, init_config)
from_compilers_yaml = get_compiler_config(configuration, scope=scope, init_config=init_config)
result = from_compilers_yaml + from_packages_yaml
key = lambda c: _compiler_from_config_entry(c["compiler"])
# Dedupe entries by the compiler they represent
# If the entry is invalid, treat it as unique for deduplication
key = lambda c: _compiler_from_config_entry(c["compiler"] or id(c))
return list(llnl.util.lang.dedupe(result, key=key))
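The deduplication relied on here is order-preserving and keyed; a plain-Python sketch of those semantics (not the llnl.util.lang implementation) with made-up entries:
def dedupe(sequence, key=None):
    # Keep the first occurrence of each key, preserving order.
    seen = set()
    for item in sequence:
        k = item if key is None else key(item)
        if k not in seen:
            seen.add(k)
            yield item

entries = [{"compiler": "gcc@13"}, {"compiler": "gcc@13"}, {"compiler": "clang@18"}]
print(list(dedupe(entries, key=lambda c: c["compiler"])))
# -> [{'compiler': 'gcc@13'}, {'compiler': 'clang@18'}]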
@@ -332,7 +383,7 @@ def all_compiler_specs(scope=None, init_config=True):
# Return compiler specs from the merged config.
return [
spack.spec.parse_with_version_concrete(s["compiler"]["spec"], compiler=True)
for s in all_compilers_config(scope, init_config)
for s in all_compilers_config(spack.config.CONFIG, scope=scope, init_config=init_config)
]
@@ -492,11 +543,20 @@ def find_specs_by_arch(compiler_spec, arch_spec, scope=None, init_config=True):
def all_compilers(scope=None, init_config=True):
config = all_compilers_config(scope, init_config=init_config)
compilers = list()
for items in config:
return all_compilers_from(
configuration=spack.config.CONFIG, scope=scope, init_config=init_config
)
def all_compilers_from(configuration, scope=None, init_config=True):
compilers = []
for items in all_compilers_config(
configuration=configuration, scope=scope, init_config=init_config
):
items = items["compiler"]
compilers.append(_compiler_from_config_entry(items))
compiler = _compiler_from_config_entry(items) # can be None in error case
if compiler:
compilers.append(compiler)
return compilers
@@ -507,7 +567,7 @@ def compilers_for_spec(
"""This gets all compilers that satisfy the supplied CompilerSpec.
Returns an empty list if none are found.
"""
config = all_compilers_config(scope, init_config)
config = all_compilers_config(spack.config.CONFIG, scope=scope, init_config=init_config)
matches = set(find(compiler_spec, scope, init_config))
compilers = []
@@ -517,7 +577,7 @@ def compilers_for_spec(
def compilers_for_arch(arch_spec, scope=None):
config = all_compilers_config(scope)
config = all_compilers_config(spack.config.CONFIG, scope=scope)
return list(get_compilers(config, arch_spec=arch_spec))
@@ -603,7 +663,10 @@ def _compiler_from_config_entry(items):
compiler = _compiler_cache.get(config_id, None)
if compiler is None:
compiler = compiler_from_dict(items)
try:
compiler = compiler_from_dict(items)
except UnknownCompilerError as e:
warnings.warn(e.message)
_compiler_cache[config_id] = compiler
return compiler
@@ -656,7 +719,9 @@ def get_compilers(config, cspec=None, arch_spec=None):
raise ValueError(msg)
continue
compilers.append(_compiler_from_config_entry(items))
compiler = _compiler_from_config_entry(items)
if compiler:
compilers.append(compiler)
return compilers
@@ -933,10 +998,11 @@ def _default_make_compilers(cmp_id, paths):
make_mixed_toolchain(flat_compilers)
# Finally, create the compiler list
compilers = []
compilers: List["spack.compiler.Compiler"] = []
for compiler_id, _, compiler in flat_compilers:
make_compilers = getattr(compiler_id.os, "make_compilers", _default_make_compilers)
compilers.extend(make_compilers(compiler_id, compiler))
candidates = make_compilers(compiler_id, compiler)
compilers.extend(x for x in candidates if x.cc is not None)
return compilers

View File

@@ -38,10 +38,10 @@ class Clang(Compiler):
cxx_names = ["clang++"]
# Subclasses use possible names of Fortran 77 compiler
f77_names = ["flang"]
f77_names = ["flang-new", "flang"]
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["flang"]
fc_names = ["flang-new", "flang"]
version_argument = "--version"
@@ -96,6 +96,8 @@ def verbose_flag(self):
openmp_flag = "-fopenmp"
# C++ flags based on CMake Modules/Compiler/Clang.cmake
@property
def cxx11_flag(self):
if self.real_version < Version("3.3"):
@@ -120,6 +122,24 @@ def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("5.0"):
raise UnsupportedCompilerFlag(self, "the C++20 standard", "cxx20_flag", "< 5.0")
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("12.0"):
raise UnsupportedCompilerFlag(self, "the C++23 standard", "cxx23_flag", "< 12.0")
elif self.real_version < Version("17.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
return "-std=c99"
@@ -142,7 +162,10 @@ def c17_flag(self):
def c23_flag(self):
if self.real_version < Version("9.0"):
raise UnsupportedCompilerFlag(self, "the C23 standard", "c23_flag", "< 9.0")
return "-std=c2x"
elif self.real_version < Version("18.0"):
return "-std=c2x"
else:
return "-std=c23"
@property
def cc_pic_flag(self):
@@ -171,10 +194,11 @@ def extract_version_from_output(cls, output):
match = re.search(
# Normal clang compiler versions are left as-is
r"clang version ([^ )\n]+)-svn[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
# Don't include hyphenated patch numbers in the version
# (see https://github.com/spack/spack/pull/14365 for details)
r"clang version ([^ )\n]+?)-[~.\w\d-]*|" r"clang version ([^ )\n]+)",
r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)",
output,
)
if match:
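To illustrate the widened pattern, a small sketch applying it to made-up version outputs for clang and flang-new:
import re

pattern = (
    r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
    r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
    r"(?:clang|flang-new) version ([^ )\n]+)"
)
for output in ("clang version 17.0.6 (Fedora 17.0.6-2.fc39)",
               "flang-new version 18.1.8"):
    match = re.search(pattern, output)
    # take whichever capture group fired for the matching alternative
    print(next(g for g in match.groups() if g))
# -> 17.0.6
# -> 18.1.8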

View File

@@ -1,34 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import spack.compilers.oneapi
class Dpcpp(spack.compilers.oneapi.Oneapi):
"""This is the same as the oneAPI compiler but uses dpcpp instead of
icpx (for DPC++ source files). It explicitly refers to dpcpp, so that
CMake test files which check the compiler name (e.g. CMAKE_CXX_COMPILER)
detect it as dpcpp.
Ideally we could switch out icpx for dpcpp where needed in the oneAPI
compiler definition, but two things are needed for that: (a) a way to
tell the compiler that it should be using dpcpp and (b) a way to
customize the link_paths
See also: https://www.intel.com/content/www/us/en/develop/documentation/oneapi-dpcpp-cpp-compiler-dev-guide-and-reference/top/compiler-setup/using-the-command-line/invoking-the-compiler.html
"""
# Subclasses use possible names of C++ compiler
cxx_names = ["dpcpp"]
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("oneapi", "icx"),
"cxx": os.path.join("oneapi", "dpcpp"),
"f77": os.path.join("oneapi", "ifx"),
"fc": os.path.join("oneapi", "ifx"),
}

View File

@@ -8,7 +8,7 @@
import subprocess
import sys
import tempfile
from typing import Dict, List, Set
from typing import Dict, List
import archspec.cpu
@@ -20,15 +20,7 @@
from spack.error import SpackError
from spack.version import Version, VersionRange
avail_fc_version: Set[str] = set()
fc_path: Dict[str, str] = dict()
fortran_mapping = {
"2021.3.0": "19.29.30133",
"2021.2.1": "19.28.29913",
"2021.2.0": "19.28.29334",
"2021.1.0": "19.28.29333",
}
FC_PATH: Dict[str, str] = dict()
class CmdCall:
@@ -115,15 +107,13 @@ def command_str(self):
return f"{script} {self.arch} {self.sdk_ver} {self.vcvars_ver}"
def get_valid_fortran_pth(comp_ver):
cl_ver = str(comp_ver)
def get_valid_fortran_pth():
"""Assign maximum available fortran compiler version"""
# TODO (johnwparent): validate compatibility w/ try compiler
# functionality when added
sort_fn = lambda fc_ver: Version(fc_ver)
sort_fc_ver = sorted(list(avail_fc_version), key=sort_fn)
for ver in sort_fc_ver:
if ver in fortran_mapping:
if Version(cl_ver) <= Version(fortran_mapping[ver]):
return fc_path[ver]
return None
sort_fc_ver = sorted(list(FC_PATH.keys()), key=sort_fn)
return FC_PATH[sort_fc_ver[-1]] if sort_fc_ver else None
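A small sketch of that selection with a made-up FC_PATH mapping (the paths are illustrative only):
from spack.version import Version

FC_PATH = {
    "2021.4.0": r"C:\oneAPI\compiler\2021.4.0\bin\ifx.exe",
    "2024.1.0": r"C:\oneAPI\compiler\2024.1.0\bin\ifx.exe",
}
sort_fc_ver = sorted(FC_PATH.keys(), key=Version)
latest_fc = FC_PATH[sort_fc_ver[-1]] if sort_fc_ver else None
print(latest_fc)  # -> the 2024.1.0 ifx path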
class Msvc(Compiler):
@@ -167,11 +157,9 @@ def __init__(self, *args, **kwargs):
# This positional argument "paths" is later parsed and process by the base class
# via the call to `super` later in this method
paths = args[3]
# This positional argument "cspec" is also parsed and handled by the base class
# constructor
cspec = args[0]
new_pth = [pth if pth else get_valid_fortran_pth(cspec.version) for pth in paths]
paths[:] = new_pth
latest_fc = get_valid_fortran_pth()
new_pth = [pth if pth else latest_fc for pth in paths[2:]]
paths[2:] = new_pth
# Initialize, deferring to base class but then adding the vcvarsallfile
# file based on compiler executable path.
super().__init__(*args, **kwargs)
@@ -183,7 +171,7 @@ def __init__(self, *args, **kwargs):
# and stores their path, but their respective VCVARS
# file must be invoked before usage.
env_cmds = []
compiler_root = os.path.join(self.cc, "../../../../../../..")
compiler_root = os.path.join(os.path.dirname(self.cc), "../../../../../..")
vcvars_script_path = os.path.join(compiler_root, "Auxiliary", "Build", "vcvars64.bat")
# get current platform architecture and format for vcvars argument
arch = spack.platforms.real_host().default.lower()
@@ -198,11 +186,34 @@ def __init__(self, *args, **kwargs):
# paths[2] refers to the fc path and is a generic check
# for a fortran compiler
if paths[2]:
def get_oneapi_root(pth: str):
"""From within a prefix known to be a oneAPI path
determine the oneAPI root path from arbitrary point
under root
Args:
pth: path prefixed within oneAPI root
"""
if not pth:
return ""
while os.path.basename(pth) and os.path.basename(pth) != "oneAPI":
pth = os.path.dirname(pth)
return pth
# If this is found, it sets all the vars
oneapi_root = os.getenv("ONEAPI_ROOT")
oneapi_root = get_oneapi_root(self.fc)
if not oneapi_root:
raise RuntimeError(f"Non-oneAPI Fortran compiler {self.fc} assigned to MSVC")
oneapi_root_setvars = os.path.join(oneapi_root, "setvars.bat")
# some oneAPI exes report a version more precise than their
# install paths specify, so we determine the version from
# the install path rather than from the fc executable itself
numver = r"\d+\.\d+(?:\.\d+)?"
pattern = f"((?:{numver})|(?:latest))"
version_from_path = re.search(pattern, self.fc).group(1)
oneapi_version_setvars = os.path.join(
oneapi_root, "compiler", str(self.ifx_version), "env", "vars.bat"
oneapi_root, "compiler", version_from_path, "env", "vars.bat"
)
# order matters here, the specific version env must be invoked first,
# otherwise it will be ignored if the root setvars sets up the oneapi
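A hypothetical walk-through of the get_oneapi_root helper defined above, assuming a POSIX-style install prefix for readability (on Windows the separators would differ):
import os

def get_oneapi_root(pth: str):
    # copy of the helper above, repeated here only for illustration
    if not pth:
        return ""
    while os.path.basename(pth) and os.path.basename(pth) != "oneAPI":
        pth = os.path.dirname(pth)
    return pth

print(get_oneapi_root("/opt/intel/oneAPI/compiler/2024.1/bin/ifx"))
# -> /opt/intel/oneAPI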
@@ -314,23 +325,19 @@ def setup_custom_environment(self, pkg, env):
@classmethod
def fc_version(cls, fc):
# We're using intel for the Fortran compilers, which exist if
# ONEAPI_ROOT is a meaningful variable
if not sys.platform == "win32":
return "unknown"
fc_ver = cls.default_version(fc)
avail_fc_version.add(fc_ver)
fc_path[fc_ver] = fc
if os.getenv("ONEAPI_ROOT"):
try:
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError("Windows compiler search paths not established")
clp = spack.util.executable.which_string("cl", path=sps)
ver = cls.default_version(clp)
else:
ver = fc_ver
return ver
FC_PATH[fc_ver] = fc
try:
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError(
"Windows compiler search paths not established, "
"please report this behavior to github.com/spack/spack"
)
clp = spack.util.executable.which_string("cl", path=sps)
return cls.default_version(clp) if clp else fc_ver
@classmethod
def f77_version(cls, f77):

View File

@@ -64,7 +64,7 @@ def verbose_flag(self):
#
# This way, we at least enable the implicit rpath detection, which is
# based on compilation of a C file (see method
# spack.compiler._get_compiler_link_paths): in the case of a mixed
# spack.compiler._compile_dummy_c_source): in the case of a mixed
# NAG/GCC toolchain, the flag will be passed to g++ (e.g.
# 'g++ -Wl,-v ./main.c'), otherwise, the flag will be passed to nagfor
# (e.g. 'nagfor -Wl,-v ./main.c' - note that nagfor recognizes '.c'

View File

@@ -74,6 +74,10 @@ class Concretizer:
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
#: Packages that the old concretizer cannot deal with correctly, and cannot build anyway.
#: Those will not be considered as providers for virtuals.
non_buildable_packages = {"glibc", "musl"}
def __init__(self, abstract_spec=None):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
@@ -113,7 +117,11 @@ def _valid_virtuals_and_externals(self, spec):
pref_key = lambda spec: 0 # no-op pref key
if spec.virtual:
candidates = spack.repo.PATH.providers_for(spec)
candidates = [
s
for s in spack.repo.PATH.providers_for(spec)
if s.name not in self.non_buildable_packages
]
if not candidates:
raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)
@@ -749,7 +757,6 @@ def _concretize_specs_together_new(*abstract_specs, **kwargs):
result = solver.solve(
abstract_specs, tests=kwargs.get("tests", False), allow_deprecated=allow_deprecated
)
result.raise_if_unsat()
return [s.copy() for s in result.specs]

View File

@@ -107,7 +107,7 @@
#: metavar to use for commands that accept scopes
#: this is shorter and more readable than listing all choices
SCOPES_METAVAR = "{defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT"
SCOPES_METAVAR = "{defaults,system,site,user,command_line}[/PLATFORM] or env:ENVIRONMENT"
#: Base name for the (internal) overrides scope.
_OVERRIDES_BASE_NAME = "overrides-"
@@ -1562,8 +1562,9 @@ def ensure_latest_format_fn(section: str) -> Callable[[YamlConfigDict], bool]:
def use_configuration(
*scopes_or_paths: Union[ConfigScope, str]
) -> Generator[Configuration, None, None]:
"""Use the configuration scopes passed as arguments within the
context manager.
"""Use the configuration scopes passed as arguments within the context manager.
This function invalidates caches, and is therefore very slow.
Args:
*scopes_or_paths: scope objects or paths to be used
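A minimal usage sketch of the context manager, assuming a throwaway scope directory that exists only for illustration:
import spack.config

with spack.config.use_configuration("/tmp/my-extra-scope") as cfg:
    # cfg is a Configuration built from the given scopes; caches are
    # invalidated around this block, which is why it is noted as slow.
    compilers = cfg.get("compilers", default=[])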

View File

@@ -12,26 +12,26 @@
},
"os_package_manager": "yum_amazon"
},
"fedora:38": {
"fedora:40": {
"bootstrap": {
"template": "container/fedora_38.dockerfile",
"image": "docker.io/fedora:38"
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:40"
},
"os_package_manager": "dnf",
"build": "spack/fedora38",
"build": "spack/fedora40",
"final": {
"image": "docker.io/fedora:38"
"image": "docker.io/fedora:40"
}
},
"fedora:37": {
"fedora:39": {
"bootstrap": {
"template": "container/fedora_37.dockerfile",
"image": "docker.io/fedora:37"
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:39"
},
"os_package_manager": "dnf",
"build": "spack/fedora37",
"build": "spack/fedora39",
"final": {
"image": "docker.io/fedora:37"
"image": "docker.io/fedora:39"
}
},
"rockylinux:9": {
@@ -116,6 +116,13 @@
},
"os_package_manager": "apt"
},
"ubuntu:24.04": {
"bootstrap": {
"template": "container/ubuntu_2404.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-noble"
},
"ubuntu:22.04": {
"bootstrap": {
"template": "container/ubuntu_2204.dockerfile"
@@ -129,13 +136,6 @@
},
"build": "spack/ubuntu-focal",
"os_package_manager": "apt"
},
"ubuntu:18.04": {
"bootstrap": {
"template": "container/ubuntu_1804.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-bionic"
}
},
"os_package_managers": {

View File

@@ -25,6 +25,7 @@
import socket
import sys
import time
from json import JSONDecoder
from typing import (
Any,
Callable,
@@ -818,7 +819,8 @@ def _read_from_file(self, filename):
"""
try:
with open(filename, "r") as f:
fdata = sjson.load(f)
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
raise CorruptDatabaseError("error parsing database:", str(e)) from e
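For reference, json.JSONDecoder.raw_decode parses one JSON value and also returns the index where parsing stopped, which is what would let a future stream-of-objects format be read; a minimal sketch with made-up data:
from json import JSONDecoder

text = '{"database": {"version": "7"}}\n{"next": "record"}'
first, end = JSONDecoder().raw_decode(text)
print(first)       # {'database': {'version': '7'}}
print(text[end:])  # trailing data is left untouched, unlike json.loads()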
@@ -833,27 +835,24 @@ def check(cond, msg):
# High-level file checks
db = fdata["database"]
check("installs" in db, "no 'installs' in JSON DB.")
check("version" in db, "no 'version' in JSON DB.")
installs = db["installs"]
# TODO: better version checking semantics.
version = vn.Version(db["version"])
if version > _DB_VERSION:
raise InvalidDatabaseVersionError(self, _DB_VERSION, version)
elif version < _DB_VERSION:
if not any(old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX):
tty.warn(
"Spack database version changed from %s to %s. Upgrading."
% (version, _DB_VERSION)
)
elif version < _DB_VERSION and not any(
old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX
):
tty.warn(f"Spack database version changed from {version} to {_DB_VERSION}. Upgrading.")
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields))
for k, v in self._data.items()
)
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields)) for k, v in self._data.items()
)
else:
check("installs" in db, "no 'installs' in JSON DB.")
installs = db["installs"]
spec_reader = reader(version)
@@ -1621,15 +1620,32 @@ def query_local(self, *args, **kwargs):
query_local.__doc__ += _QUERY_DOCSTRING
def query(self, *args, **kwargs):
"""Query the Spack database including all upstream databases."""
"""Query the Spack database including all upstream databases.
Additional Arguments:
install_tree (str): query 'all' (default), 'local', 'upstream', or upstream path
"""
install_tree = kwargs.pop("install_tree", "all")
valid_trees = ["all", "upstream", "local", self.root] + [u.root for u in self.upstream_dbs]
if install_tree not in valid_trees:
msg = "Invalid install_tree argument to Database.query()\n"
msg += f"Try one of {', '.join(valid_trees)}"
tty.error(msg)
return []
upstream_results = []
for upstream_db in self.upstream_dbs:
upstreams = self.upstream_dbs
if install_tree not in ("all", "upstream"):
upstreams = [u for u in self.upstream_dbs if u.root == install_tree]
for upstream_db in upstreams:
# queries for upstream DBs need to *not* lock - we may not
# have permissions to do this and the upstream DBs won't know about
# us anyway (so e.g. they should never uninstall specs)
upstream_results.extend(upstream_db._query(*args, **kwargs) or [])
local_results = set(self.query_local(*args, **kwargs))
local_results = []
if install_tree in ("all", "local") or self.root == install_tree:
local_results = set(self.query_local(*args, **kwargs))
results = list(local_results) + list(x for x in upstream_results if x not in local_results)
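A hypothetical use of the new install_tree argument (the spec string and upstream path are examples only):
import spack.store

db = spack.store.STORE.db
everything = db.query("zlib")                              # install_tree="all" is the default
local_only = db.query("zlib", install_tree="local")
upstream_only = db.query("zlib", install_tree="upstream")
one_upstream = db.query("zlib", install_tree="/path/to/upstream/opt/spack")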

Some files were not shown because too many files have changed in this diff.