Compare commits

611 Commits

Author SHA1 Message Date
Todd Gamblin
2d464f8c89 spack-python: remove superfluous /usr/bin/env (#44724)
Not sure why I had this here, as `/bin/sh` will find the first spack in `PATH` just like
`env`.

- [x] remove `/usr/bin/env` and avoid an extra process launch.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-06-15 08:11:19 -07:00
AcriusWinter
456a8e3553 omega-h: reformatted from old test method to new test method (#44712)
* omega-h: reformatted from old test method to new test method
* omega-h: added proper output checking

--------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-14 20:03:39 -06:00
AcriusWinter
db94696cf0 libstdcompat: removed duplicate build-time test method (#44725)
* libstdcompat: removed stand-alone test method
2024-06-14 18:40:03 -06:00
kwryankrattiger
72bb656b9e Intel MKL express requirements as requires (#44727)
The `+cluster` variant requires that an MPI family be known in the
spec. When using externals it is easy to misconfigure this requirement,
leading to an undesirable runtime exception. This converts the
exception to a package rule.
2024-06-14 19:07:37 -04:00
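
A minimal sketch of the kind of package rule the commit above describes, using Spack's `requires` directive. The class, variant, MPI list, and message are illustrative assumptions, not the actual intel-oneapi-mkl recipe.

```python
# Illustrative package.py fragment: turn a hidden runtime assumption into a
# concretization-time rule with the `requires` directive.
from spack.package import *


class ExampleMklLike(Package):
    """Hypothetical package whose +cluster variant needs a known MPI family."""

    variant("cluster", default=False, description="Enable cluster (MPI-based) libraries")

    # The solver now rejects +cluster specs without one of these MPI families,
    # instead of the package raising an exception at install/run time.
    requires(
        "^openmpi",
        "^mpich",
        "^intel-oneapi-mpi",
        policy="one_of",
        when="+cluster",
        msg="+cluster requires a known MPI implementation in the spec",
    )
```
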
James Smillie
e092026eb8 muparser: refactor to use new multi-build-system logic (#44552) 2024-06-14 13:43:20 -07:00
psakievich
e5f5749d67 HDF5: Fix typo in symlink paths (#44711) 2024-06-14 12:48:32 -06:00
renjithravindrankannath
6e4f8ea7e4 hip: Adding hipother for cuda support starting from 6.0 (#44453)
* Adding hipother for cuda support starting from 6.0
* Updating the version check as per current spec
2024-06-14 08:26:07 -07:00
Massimiliano Culpo
5e8eff24d2 Build developer-tools pipeline only on manylinux (#43811) 2024-06-14 10:13:06 +02:00
Alec Scott
36f1801eb8 rclone: add v1.65.2 (#44657) 2024-06-13 21:28:14 -05:00
Alec Scott
e3deee57ba go: add v1.22.4 (#44656) 2024-06-13 21:28:06 -05:00
Alec Scott
404deb99f4 hugo: add v0.127.0 (#44626) 2024-06-13 21:27:56 -05:00
Alec Scott
f594bc7aea fd: add v10.1.0 (#44623) 2024-06-13 21:27:46 -05:00
Harshula Jayasuriya
9ed948523a openmpi: disable remark 10441 for intel classic 2021.7.0 or newer (#44614)
* Compilation of openmpi fails when intel classic compiler 2021.7.0
  or newer is used.
2024-06-13 17:58:50 -07:00
Wouter Deconinck
72d7c2d558 xorg pkgs: update versions and homepage cgit -> gitlab (#44420)
* xorg libs: new versions
* xorg pkgs: update homepage from cgit to gitlab
* xorg pkgs: fix homepage since, yeah, I did edit those 122 files by hand...
* libxmu: don't need the .0 patch version here

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-13 17:42:17 -07:00
Matt Thompson
d4a7582955 gh: add 2.50.0 (#44670) 2024-06-13 16:46:52 -07:00
dependabot[bot]
42ac1f0cb2 build(deps): bump codecov/codecov-action from 4.4.1 to 4.5.0 (#44710)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.4.1 to 4.5.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](125fc84a9a...e28ff129e5)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-13 16:45:24 -07:00
Mathew Cleveland
4af61d432f add draco-7_18_0 to package.py (#44673)
Co-authored-by: Cleveland <cleveland@lanl.gov>
2024-06-13 16:43:00 -07:00
Teague Sterling
52bdaa7bf5 perl-dbd-mysql: add v4.052, v5.005 (#44503)
* Fixed some issues with installation breaking due to mysql_client

* Remove debugging details

* Adding client_only to deps

* Adding client_only to variant and deps

* Undoing client_only to variant and deps
2024-06-13 16:42:23 -07:00
Diego Alvarez S
96b42238c5 openjdk: update default version to 17.x (#44647)
* Set default version of OpenJDK to 17.x

* Fix OpenJDK newer versions in Apple Silicon

* Specify java@11 for astral and stc
2024-06-13 16:40:50 -07:00
Teague Sterling
c7bd259739 Adding pkgconfig dependency to startup-notification (#44709)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-13 17:37:51 -06:00
Jonathon Anderson
0dfc360b1e hpctoolkit: conflicts with elfutils @0.191 (#44696)
See https://gitlab.com/hpctoolkit/hpctoolkit/-/issues/831
2024-06-13 12:25:37 -07:00
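
For reference, an incompatibility like the one above is normally declared with a one-line `conflicts` directive; the sketch below is a generic illustration, not the actual hpctoolkit recipe.

```python
# Illustrative package.py fragment: declare a dependency version as unusable
# so the concretizer avoids it instead of producing a broken build.
from spack.package import *


class ExampleTool(AutotoolsPackage):
    depends_on("elfutils")

    # Concretization fails fast with a clear message for the bad version.
    conflicts("^elfutils@0.191", msg="elfutils 0.191 breaks the build; see the upstream issue")
```
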
AcriusWinter
5e578e2e4e Converted warpx from the old format to the new format for stand-alone testing (#44677)
* Converted warpx from the old format to the new format for stand-alone testing

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-13 11:34:17 -07:00
AcriusWinter
93111d495b Changed sz3 stand alone tests from old to new API (#44691) 2024-06-13 10:45:16 -07:00
Howard Pritchard
b1bbe240d7 py-xtb: fix problem with meson file (#44595)
Using meson 1.3.2 the py-xtb package fails
during the install process.  Error signature shown in

    https://github.com/grimme-lab/xtb-python/pull/114

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-06-13 10:18:42 -07:00
Jean Luca Bez
bbb58ff4c6 update PDC version (#44702) 2024-06-13 11:15:33 -06:00
Sam Gillingham
7a710bee17 update py-tuiview to 1.2.14 (#44692)
* update py-tuiview to 1.2.14
* remove whitespace
2024-06-13 10:07:36 -07:00
renjithravindrankannath
bc3903d0e0 New patch to add half include path for tests in 6.1 (#44697)
* Add half include path for tests in 6.1
* removing unwanted when="@:6.1"
* Correcting condition to apply patch
* Style check error fix and updating rpp dependency
* Audit check error fix
* Correcting rpp dependency and adding hsa-rocr-dev lib path to LD_LIBRARY_PATH
2024-06-13 09:42:56 -07:00
Simon Pintarelli
61c8326180 cleanup q-e-sirius recipe (#44698) 2024-06-13 09:39:06 -07:00
Adrien Bernede
c5caa4b838 Add latest releases to camp, raja and umpire (#44699)
* Add latest releases to camp, raja and umpire
2024-06-13 09:31:38 -07:00
Joe Schoonover
ebdff73c8c Add new versions of feq-parse (#44701) 2024-06-13 09:25:54 -07:00
eugeneswalker
f78beb71f7 tau@2.33.2 +rocm: patch to disable llvm plugin (#44690) 2024-06-13 09:13:41 -07:00
Simon Frasch
74210c7f46 sirius: fix spla+openmp requirement (#44685) 2024-06-13 06:53:28 +02:00
Harmen Stoppels
eff4451cdd Revert "linux-pam: fixes #44637" (#44682)
* Revert "linux-pam: fixes #44637 (#44638)"

This reverts commit 9151fc1653.

* linux-pam: drop redundant deps, add missing ones, don't build docs or nls
2024-06-12 13:13:11 -07:00
Teague Sterling
8d0fc3e639 gettext: fix condition (#44680) 2024-06-12 13:24:42 -06:00
Taillefumier Mathieu
3736da3f89 Update cp2k old recipe (#44650)
The old build system has issues with the library order. This commit is an attempt
to fix it.
2024-06-12 10:11:02 +02:00
Melven Roehrig-Zoellner
221e464df3 py-pyopengl: new package (#32357)
* py-pyopengl: new simple package

* py-pyopengl: Fix typo in comment

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-pyopengl: add variants and dependencies

* py-pyopengl: build from source and improve variants

* py-pyopengl: use corrected freeglut libs

* py-pyopengl: update copyright

* py-pyopengl: remove duplicate conflict clause

* py-pyopengl: change dependency to link

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-06-12 09:52:49 +02:00
Gavin John
bac5253169 py-biopython: Require python <= 3.9 for certain versions (#44668) 2024-06-12 08:57:56 +02:00
AMD Toolchain Support
fc2ee5cae8 WRF 4.5.2 support is added for AOCC compilers (#44584)
Co-authored-by: Raviteja K <rkuppili@amd.com>
2024-06-11 18:45:08 -07:00
psakievich
e11f83f34b Nalu-Wind Tioga Dependency (#44675)
* Nalu-Wind Tioga Dependency
   We created the tioga@1.0.0 tag for reproducing the exawind 1.0 release
* Update tioga tags
2024-06-11 18:44:37 -07:00
Veselin Dobrev
6e7fb9a308 MFEM: add new version v4.7 (#44010)
* Core change: logic for extracting RPATHs from modules may return
  `None`: filter this out of the set of RPATHs that is auto-generated
* Core change: `CachedCMakePackage` no longer adds ldflags to
  `CMAKE_STATIC_LINKER_FLAGS`: generally these flags are not appropriate
  for static linking (e.g. invocation of `ar`)
* [mfem] Add version 4.7
* [mfem] Add variant for precision (single/double). Enforce consistency
  for precision amongst mfem and hypre/petsc/mumps dependencies
* [mfem] Add cxxstd (and related constraints preventing use of
  old cxxstd values for newer versions of some dependencies)
* [hypre] In line with prior point, added support for specifying
  precision
* [petsc] Add config option to avoid error when building against
  `superlu-dist+rocm`
* [hiop] add proper `raja`/`umpire`/`camp` version constraints for
  `hiop` versions 0.3.99-0.4.x; require `+raja` for `+rocm`, and
  add dependency on `hiprand` for `+rocm`
* [butterflypack, mfem, strumpack, suite-sparse] Require
  `CRAYLIBS_{target-family}` env var to be defined
* [suite-sparse] versions `@7.4:` changed install location of headers:
  add symlink from old location to new location
* [zlib-ng] Fix error where shared libs were not successfully built for
  `%cce@17` (the build did not fail, but the finished `zlib-ng%cce@17`
  install did not have shared libs)
2024-06-11 16:39:22 -07:00
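
A hedged sketch of the precision-consistency pattern mentioned in the entry above: a single-valued `precision` variant whose value is propagated to a dependency. Names and variants here are simplified assumptions, not the real mfem/hypre recipes.

```python
# Illustrative fragment: a single-valued "precision" variant kept consistent
# with a dependency that exposes the same choice.
from spack.package import *


class ExampleFemLib(Package):
    variant(
        "precision",
        default="double",
        values=("single", "double"),
        multi=False,
        description="Floating-point precision used throughout the library",
    )

    # Require the dependency to be built with the same precision we were.
    depends_on("hypre precision=single", when="precision=single")
    depends_on("hypre precision=double", when="precision=double")
```
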
renjithravindrankannath
b156a62a44 making suite-sparse library path generic in hipsolver (#44473)
* suite-sparse library path can be lib or lib64
* fixing style check error
* Adding the missing import os
* Style check error fix
* Updating patch for 6.1.0 with correct lib path
2024-06-11 14:41:42 -07:00
Elliott Slaughter
990e77c55f legion: Requires C++17 minimum after 24.03.0. (#44596) 2024-06-11 14:10:04 -07:00
Matt Thompson
c2fb529819 fms: add llvm-openmp for apple-clang +openmp (#44667)
* fms: add llvm-openmp for apple-clang
2024-06-11 14:08:31 -07:00
psakievich
337bf3b944 Enable and constrain reuse for GitVersion installations (#43859)
* Preserve higher weight for CLI git ref versions

Currently the concretizer fails if you reuse a git ref version
that has already been installed but modify the spec at all.

See #38484 for further diagnosis

The issue here is that since there is no established provenance for
these versions the highest weight they are currently assigned is
that of prior install. Re-use checks then fail because the weight of
the version is identical to the solver.

Ironically, these versions are given the highest weights possible when
specified on the CLI for the first time.  They should only appear in a
DAG if they are an exact match or if the user specifies them at the CLI.
Therefore it makes sense to preserve their higher ordering.

Getting this right is critical to moving all branch based versions to a pinned
git-ref in the future.

* [@spackbot] updating style on behalf of psakievich

* Update lib/spack/spack/solver/asp.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* Add provenance specific to git ref installs

* Sensitivity to name that I could not track down

* Add regression test

* Adjust test

* Add prefer standard unit-test

* Style

* Add required mock

* Format and mark

* Make unit-test case reproduce CLI investigation

* Remove unnecessary mock package

* [@spackbot] updating style on behalf of psakievich

* Use already developed fixture

* Add zlib-ng to mocks again

* Remove accidental adds

* Remove maintainer

* [@spackbot] updating style on behalf of psakievich

* Rename test file

* [@spackbot] updating style on behalf of psakievich

* Remove unused imports

* Update tests

* [@spackbot] updating style on behalf of psakievich

* Style

* Update lib/spack/spack/test/concretize.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* Update solver rule

* Duplicate installed rules for installed_git_version

* Revert "Duplicate installed rules for installed_git_version"

This reverts commit 17223fc8d1.

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-06-11 14:57:09 -06:00
Nuno Nobre
a49b2f4f16 Use https, not ssh, for AMS git clones (#44669) 2024-06-11 13:53:27 -07:00
AcriusWinter
ddcf1a4b2e Renamed build-time test method (#44572)
* libdistributed: renamed test to check_test
* libpressio-rmetric: renamed test method
* libpressio-tools: renamed test method
* Renamed build-time test methods for: libpressio-opt, libpressio-tthresh, opened

---------

Co-authored-by: Jackson Peter Lawrence <lawrence31@borax5.llnl.gov>
2024-06-11 13:22:50 -06:00
Simon Frasch
82a932c078 spla: add v1.6.0 (#44609) 2024-06-11 13:06:50 +02:00
Harmen Stoppels
0781615117 openssl: remove deprecated versions (#44629) 2024-06-11 10:13:13 +02:00
Teague Sterling
9151fc1653 linux-pam: fixes #44637 (#44638)
* package/linux-pam: dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* package/linux-pam: Fixes #44637

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update var/spack/repos/builtin/packages/linux-pam/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Update var/spack/repos/builtin/packages/linux-pam/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* [@spackbot] updating style on behalf of teaguesterling

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-06-10 23:41:11 -06:00
Robert Underwood
3a83b21ce1 improvements for dlib;highway;libjxl (#44348)
* improvements for dlib;highway;libjxl

* Update var/spack/repos/builtin/packages/dlib/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* fix style

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-06-10 16:50:36 -07:00
William Hart
cfc042d901 cppad: add v20180000.0, v20240000.4, master (#44299)
* Updating cppad package

* Disabling doc generation

* Adding whart222 as maintainer

* Style fixes

* Pushing changes based on Tamara's feedback.

* Updating sha256 hash for version 20180000.0

* Reorg of declarations

* Reworking cmake args

* Reformatting
2024-06-10 16:48:19 -07:00
Teague Sterling
211ad9e7d9 gdk-pixbuf: adding variant version constraint (#44645)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-10 16:40:21 -07:00
Teague Sterling
437b259829 perl-http-tiny: new package (#44502)
* Adding perl-http-tiny package

* Update package.py
2024-06-10 17:23:03 -06:00
Loïc Pottier
f524bba869 flux-sched: fixed missing modules in PATH when lib64 and lib are both used (#44658)
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
2024-06-10 17:03:00 -06:00
Kelly (KT) Thompson
2f31fb5f17 lcov: add master (#42498)
* Provide instructions for building a developmental version of lcov.

* style fix

* style fix

* Promote single version git URL to the package level.

* Formatting fix.
2024-06-10 16:34:09 -06:00
Dave Keeshan
c3567cb199 verilator: Some environment variables are no longer required (#44655)
* verilator: Some environment variables are no longer required

* Revert depends_on for flex back to the default case

* Use the `when` decorator for setup_run_environment instead of an if inside the function
2024-06-10 16:14:26 -06:00
Melven Roehrig-Zoellner
ae4c1d11f7 julia and libcxxwrap-julia: add new versions (#44635)
* julia: new version (1.10.4)

* libcxxwrap-julia: new version (0.12.5)
2024-06-10 14:29:46 -07:00
Thomas Madlener
cbab451c1a dd4hep: Add version 1.29 (#44652) 2024-06-10 12:54:18 -07:00
Thomas Madlener
9cec17ca26 lcio: Add latest tag v02-22 (#44606)
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-06-10 11:56:31 -06:00
Jon Rood
d9c5d91b6f hdf5: solve cmake findhdf5 issue with hl components (#44423)
* hdf5: solve cmake findhdf5 issue with hl components

* Fix mistake.

* Update var/spack/repos/builtin/packages/hdf5/package.py

Co-authored-by: psakievich <psakiev@sandia.gov>

* Style.

* Style.

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2024-06-10 10:11:49 -06:00
Chris Green
6e194c6ffe root: new (old) versions and fixes for 6.32.00 (#44611)
* Older versions and backported patches
   * Add 6.28.12 and 6.24.04.
   * Backported patches for segfault in weighted likelihood fits.
   * Minor style/signature improvements.
* ROOT 6.32.00 fixes omitted from https://github.com/spack/spack/pull/44550
2024-06-10 17:59:33 +02:00
Mikael Simberg
8f0c28037b mold: add 2.32.0 (#44648) 2024-06-10 03:57:52 -06:00
Rocco Meli
31aabcabf7 elsi: add v2.10.1, master (#44579) 2024-06-10 09:13:06 +02:00
Melven Roehrig-Zoellner
ca9531d205 qt: new minor version 5.15.14 (#44632) 2024-06-09 17:03:15 -04:00
Harmen Stoppels
794c5eb6a0 git: remove deprecated versions (#44631) 2024-06-08 06:46:04 -07:00
Massimiliano Culpo
c6cc125e22 Remove failing unit-test on Windows (#44627) 2024-06-08 09:36:22 +02:00
Bernhard Kaindl
528c1ed9ba binutils: detect the "gold" and "headers" variants (#40214)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-06-07 18:08:55 -06:00
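
Variant detection for externals hinges on the package implementing the external-detection hooks. Below is a hedged, generic sketch of those hooks, not the actual binutils code; the regexes, version parsing, and variant logic are assumptions for illustration.

```python
# Hedged sketch of Spack's external-detection hooks for versions and variants.
import re

from spack.package import *


class ExampleBinutilsLike(AutotoolsPackage):
    executables = [r"^ld$", r"^ld\.gold$"]

    variant("gold", default=False, description="Build the gold linker")

    @classmethod
    def determine_version(cls, exe):
        output = Executable(exe)("--version", output=str, error=str)
        match = re.search(r"(\d+\.\d+(?:\.\d+)?)", output)
        return match.group(1) if match else None

    @classmethod
    def determine_variants(cls, exes, version_str):
        # Attach +gold to the detected external if an ld.gold binary was found.
        has_gold = any(exe.endswith("ld.gold") for exe in exes)
        return "+gold" if has_gold else "~gold"
```
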
Kyle Gerheiser
52cc603245 fabtests: add v1.19.1 -> v1.21.0 (#44358) 2024-06-08 00:07:18 +02:00
Wouter Deconinck
5e55af2dce py-scikit-build-core: add v0.7.1, v0.8.2, v0.9.5 (#44492) 2024-06-08 00:00:45 +02:00
Wouter Deconinck
24ee7c8928 py-partd: add v1.4.1, v1.4.2 (#44497) 2024-06-07 23:59:07 +02:00
Vanessasaurus
605df09ae1 flux-sched: add v0.35.0 (#44602)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-06-07 23:56:39 +02:00
eugeneswalker
4aebef900c sundials+rocm: patch hip platform for rocm 5.7-6 (#44617) 2024-06-07 11:37:52 -06:00
eugeneswalker
59c5bef165 magma: add rocm-core versions >=6 (#44598) 2024-06-07 08:17:18 -07:00
eugeneswalker
a18adf74bf camp@0.2.3 +rocm: patch for rocm 6 from LLNL/camp PR#143 (#44597) 2024-06-07 08:16:59 -07:00
Rocco Meli
6426ab1b7e dftbplus: add v24.1 (#44613) 2024-06-07 08:54:51 -06:00
Nils Vu
7d1de58378 blaze: add v3.8.2, v3.8.1 (#44556) 2024-06-07 08:54:14 -06:00
Mikael Simberg
82a54378d8 pika: Add valgrind variant (#44558) 2024-06-07 07:33:08 -07:00
Adam J. Stewart
e6e8fada8b GEOS: add new versions (#44586) 2024-06-07 13:47:04 +02:00
Adam J. Stewart
7b541ac322 PyTorch: update ecosystem (#44585) 2024-06-07 13:46:50 +02:00
kwryankrattiger
b0a2ea3970 Generate jobs should use x86_64_v3 runners only (#44582) 2024-06-07 13:20:02 +02:00
Sreenivasa Murthy Kolam
cb439a09dd Deprecate older rocm releases- 5.2 till 5.4 (#43181) 2024-06-07 13:19:34 +02:00
Rocco Meli
f87ee334c2 CP2K: use dla-future-fortran depencency (#44603)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option

* cp2k: add dla-future-fortran dependency

* Update var/spack/repos/builtin/packages/cp2k/package.py
2024-06-07 13:16:53 +02:00
Paul R. C. Kent
e8f8cf8543 Add llvm v18.1.7 (#44581) 2024-06-06 22:28:34 -06:00
Teague Sterling
8c93fb747b py-jproperties: add v1.0.1, v2.0.0, v2.1.0, v2.1.1 (#44519)
* py-jproperties: add v1.0.1, v2.0.0, v2.1.0, v2.1.1
* Incorporating changes from review: https://github.com/spack/spack/pull/44519#discussion_r1625170596
2024-06-06 14:40:07 -07:00
Alex Leute
1701e929bc py-pyseer and deps: new package (#44543)
* py-pyseer: Added package py-pyseer and dependency py-glmnet-python
* py-glmnet-python: Updated homepage, and added comments
* Update copyrights

---------

Co-authored-by: Jen Herting <jen@herting.cc>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-06 14:34:11 -07:00
Melven Roehrig-Zoellner
1bb3e04263 t8code: new versions (#44570) 2024-06-06 14:30:28 -07:00
AMD Toolchain Support
e91ae19ec4 amdlibm: fix build environment (#44054)
Co-authored-by: Branden Moore <branden.moore@amd.com>
2024-06-06 14:59:48 -06:00
Melven Roehrig-Zoellner
818ae08c61 py-h5py: fix py-cython version constraints (#44569)
* py-h5py: fix py-cython version constraints
* py-mpi4py: new version and updated dependencies
2024-06-06 13:22:30 -06:00
jmuddnv
15f32f2ca1 Update package.py (#44568)
New tarfiles for HPC SDK 24.5.  Release is 24.5-2.
2024-06-06 12:04:48 -06:00
Melven Roehrig-Zoellner
59aa62ea5c gdb: add version 14.2 (#44564) 2024-06-06 11:42:03 -06:00
SXS Bot
b4c292ddd0 spectre: add v2024.06.05 (#44566)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-06-06 11:01:55 -06:00
Pariksheet Nanda
a25655446a libtiff: add v4.6.0 and default disable opengl (Fixes #44545) (#44546)
* libtiff: add v4.6.0 and default disable opengl (#44545)

* libtiff: Fix typo in CMake key

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* libtiff: Broader description of OpenGL variant

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* libtiff: reformat using spack style black recommendation

* libtiff: couple opengl flag with autotools

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-06-06 18:05:29 +02:00
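
A hedged sketch of the pattern in the entry above: an off-by-default variant forwarded to CMake. The class, dependencies, and the cache-variable name are hypothetical, not libtiff's real key.

```python
# Illustrative CMake package fragment: an off-by-default variant forwarded to CMake.
from spack.package import *


class ExampleImageLib(CMakePackage):
    variant("opengl", default=False, description="Build the OpenGL/GLUT-based tools")

    depends_on("gl", when="+opengl")
    depends_on("freeglut", when="+opengl")

    def cmake_args(self):
        # Maps +opengl/~opengl to -DENABLE_OPENGL=ON/OFF (hypothetical key name).
        return [self.define_from_variant("ENABLE_OPENGL", "opengl")]
```
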
Carlos Bederián
cf3d59bb2e libarchive: add versions 3.7.4, 3.7.3, 3.7.2 (#44363) 2024-06-06 08:03:22 -06:00
dependabot[bot]
f80287166e build(deps): bump sphinx-design from 0.5.0 to 0.6.0 in /lib/spack/docs (#44360)
Bumps [sphinx-design](https://github.com/executablebooks/sphinx-design) from 0.5.0 to 0.6.0.
- [Release notes](https://github.com/executablebooks/sphinx-design/releases)
- [Changelog](https://github.com/executablebooks/sphinx-design/blob/main/CHANGELOG.md)
- [Commits](https://github.com/executablebooks/sphinx-design/compare/v0.5.0...v0.6.0)

---
updated-dependencies:
- dependency-name: sphinx-design
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-06 14:34:59 +02:00
Wouter Deconinck
329dc40b98 py-editables: add v0.4, v0.5 (switch to flit-core) (#44380) 2024-06-06 14:32:28 +02:00
Kyle Gerheiser
8328c34a3e libfabric: add v1.19.1 and v1.20.2 (#44357) 2024-06-06 14:25:07 +02:00
Melven Roehrig-Zoellner
6c309d3bc9 cgns: patch for gcc14 (#44562) 2024-06-06 06:16:11 -06:00
Rocco Meli
24b49eee83 py-gsd: add v3.2.1 (#44474) 2024-06-06 14:13:58 +02:00
Wouter Deconinck
715fab340f (py-)onnx: add v1.16.1 (#44485) 2024-06-06 14:11:23 +02:00
Wouter Deconinck
8231e84d15 py-ipykernel: add latest bugfixes from v6.23.3 through v6.29.4 (#44498) 2024-06-06 14:07:56 +02:00
Wouter Deconinck
8dc91a7a5c py-ipython: add latest bugfixes from v8.15.0 through v8.25.0 (#44499) 2024-06-06 14:07:16 +02:00
Teague Sterling
4c195b1a06 py-azure-mgmt-storage: add v20.0.0, v20.1.0 (#44511) 2024-06-06 14:05:08 +02:00
Teague Sterling
c6fb6bf5f8 py-nest-asyncio: v1.5.8, v1.5.9, v1.6.0 (#44512) 2024-06-06 14:03:28 +02:00
Teague Sterling
ed6161b80c py-rich: add v12.6.0,13.7.1 (#44515) 2024-06-06 14:02:50 +02:00
Derek Ryan Strong
93424e4565 hwloc: add v2.9.2, v2.9.3, v2.10.0 (#44577) 2024-06-06 04:00:46 -06:00
Teague Sterling
ca00e42f1d py-aiohttp: add 3.9.0, 3.9.4, 3.9.5 (#44510) 2024-06-06 11:22:31 +02:00
Teague Sterling
357ee1c632 py-uvloop: add v0.17.0,v0.18.0,v0.19.0 (#44513) 2024-06-06 11:15:22 +02:00
Derek Ryan Strong
cb0a3eaade neovim: add v0.10.0, v0.9.5 (#44365) 2024-06-06 11:11:45 +02:00
dependabot[bot]
a20d34b8aa build(deps): bump pytest from 8.2.1 to 8.2.2 in /lib/spack/docs (#44553)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.2.1 to 8.2.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.2.1...8.2.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-06 08:38:54 +02:00
Derek Ryan Strong
329910a620 ffmpeg: add v7.0, master (#44107)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-06-06 07:57:29 +02:00
Derek Ryan Strong
c374d04b0d wget: add v1.21.4 and v1.24.5 (#44574) 2024-06-06 07:56:22 +02:00
Derek Ryan Strong
b68cf16c85 Add gmake build dependency for unibilium (#44575) 2024-06-06 07:54:46 +02:00
Alex Richert
391c4cf099 netcdf-c: add logging variant (#43380)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
2024-06-05 17:48:27 +02:00
Michael Kuhn
8260599e98 py-netcdf4: fix build with gcc@14 (#44134) 2024-06-05 17:38:54 +02:00
Brian Vanderwende
3433c8b8a5 ncl: consolidate patch methods (#44333) 2024-06-05 17:37:25 +02:00
Harmen Stoppels
e53bc780e4 libuuid: deprecate entirely (#44525)
There hasn't been a release in almost a decade, and build failures with
GCC 14 were reported. I don't think it makes sense to patch it, since
the project moved over to `util-linux`, and Spack's default provider is
that package. Better to just get rid of it in the next Spack release.
2024-06-05 17:36:09 +02:00
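
Deprecating a package in a Spack recipe is typically a flag on each `version` directive; the sketch below shows the general pattern with placeholder versions and checksums, not the actual libuuid entries.

```python
# Illustrative fragment: flag every version as deprecated so `spack install`
# warns and users are steered toward the replacement provider (util-linux-uuid).
from spack.package import *


class ExampleLegacyUuid(AutotoolsPackage):
    version("1.0.3", sha256="0" * 64, deprecated=True)  # placeholder checksum
    version("1.0.2", sha256="1" * 64, deprecated=True)  # placeholder checksum
```
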
Teague Sterling
53346dbaa6 libuuid: address #44479 (#44481)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-06-05 08:14:20 -06:00
Chris Green
99994ea245 [root] New version 6.32.00 (#44550) 2024-06-05 08:27:13 -05:00
Harmen Stoppels
3ffe02a2fe spack edit: allow edit multiple files at once (#44416) 2024-06-05 14:19:11 +02:00
Carlos Bederián
9b77502360 openblas: add v0.3.27 (#44312) 2024-06-05 13:18:32 +02:00
Vanessasaurus
96a97328cf Automated deployment to update package flux-core 2024-06-05 (#44554)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-06-05 12:44:31 +02:00
Mikael Simberg
1a400383c0 valgrind: Add 3.21.0, 3.22.0, and 3.23.0 (#44557) 2024-06-05 12:33:00 +02:00
Harmen Stoppels
4f8ab19355 mold: unvendor (#44539) 2024-06-05 12:12:41 +02:00
Chris Marsh
8919677faf Work around the linker incompatibility that exists with fortran and apple-clang (#44547) 2024-06-05 02:16:52 -07:00
Chris Marsh
858b185a0e Armadillo: fix for linker error with apple-clang 15 (#44551)
* Armadillo needs to use -ld_classic with apple-clang 15
2024-06-05 01:02:43 -07:00
Tony Weaver
bc738cea32 py-your: new package (#44448)
* py-your: new package

Spack package recipe for YOUR, Your Unified Reader.  YOUR processes pulsar data in different formats.

Output below from spack install py-your
spack install py-your
[+] /usr (external glibc-2.28-oj2wjfl2ao5inhfz4qehw6hlck2hizvp)
[+] /opt/apps/spack/gcc-runtime-8.5.0-5k6kvi5
[+] /opt/apps/spack/libxcrypt-4.4.35-zigqpjo
[+] /opt/apps/spack/ncurses-6.4-xbvwv2w
[+] /opt/apps/spack/erfa-2.0.0-4qkta2n
[+] /opt/apps/spack/zlib-ng-2.1.6-ccn5qny
[+] /opt/apps/spack/pcre-8.45-33dfhul
[+] /opt/apps/spack/libpciaccess-0.17-2qdxdjo
[+] /opt/apps/spack/libmd-1.0.4-zbdiprt
[+] /opt/apps/spack/qhull-2020.2-klc7ewb
[+] /opt/apps/spack/bzip2-1.0.8-t65bq3t
[+] /opt/apps/spack/xz-5.4.6-axoznvt
[+] /opt/apps/spack/alsa-lib-1.2.3.2-a7icjdy
[+] /opt/apps/spack/zstd-1.5.6-nyk6gt6
[+] /opt/apps/spack/libffi-3.4.6-ibucrfe
[+] /opt/apps/spack/yasm-1.3.0-v4etmmm
[+] /opt/apps/spack/openblas-0.3.26-pfyk2vi
[+] /opt/apps/spack/pkgconf-1.9.5-ckjdqjm
[+] /opt/apps/spack/wcslib-7.3-zvcqq7o
[+] /opt/apps/spack/libiconv-1.17-jskazis
[+] /opt/apps/spack/unzip-6.0-mqftjtp
[+] /opt/apps/spack/libjpeg-turbo-3.0.0-vjvivme
[+] /opt/apps/spack/readline-8.2-2ys6ede
[+] /opt/apps/spack/openssl-3.2.1-4lqdgni
[+] /opt/apps/spack/pigz-2.8-rx263bp
[+] /opt/apps/spack/libpng-1.6.39-kiuku4y
[+] /opt/apps/spack/libbsd-0.12.1-njt5grs
[+] /opt/apps/spack/swig-4.0.2-fortran-z3sbnze
[+] /opt/apps/spack/binutils-2.42-vkjcvfr
[+] /opt/apps/spack/util-linux-uuid-2.38.1-w3kgjq3
[+] /opt/apps/spack/hdf5-1.14.3-sbuiw6q
[+] /opt/apps/spack/libedit-3.1-20230828-676jwbd
[+] /opt/apps/spack/nghttp2-1.57.0-u72gxms
[+] /opt/apps/spack/ffmpeg-6.1.1-2vhrbda
[+] /opt/apps/spack/libxml2-2.10.3-37klvxv
[+] /opt/apps/spack/gdbm-1.23-cylmqwx
[+] /opt/apps/spack/sqlite-3.43.2-axuxulg
[+] /opt/apps/spack/tar-1.34-wjzs4wj
[+] /opt/apps/spack/freetype-2.13.2-in4taxi
[+] /opt/apps/spack/expat-2.6.2-7kfe3hb
[+] /opt/apps/spack/curl-8.6.0-gpzsr3p
[+] /opt/apps/spack/hwloc-2.9.1-fhoz6al
[+] /opt/apps/spack/gettext-0.22.4-zjsp346
[+] /opt/apps/spack/lua-5.3.6-47356ia
[+] /opt/apps/spack/cfitsio-3.49-mmy3dbr
[+] /opt/apps/spack/python-3.10.13-fz7fymx
[+] /opt/apps/spack/elfutils-0.190-uswzaiw
[+] /opt/apps/spack/py-pytz-2023.3-ojdlhrd
[+] /opt/apps/spack/py-cycler-0.11.0-b7mjvv7
[+] /opt/apps/spack/py-pip-23.0-lxkcvby
[+] /opt/apps/spack/python-venv-1.0-2cz5c3s
[+] /opt/apps/spack/py-numpy-1.26.4-t5acjcv
[+] /opt/apps/spack/llvm-14.0.6-3nosumn
[+] /opt/apps/spack/py-packaging-23.1-wkeyuk6
[+] /opt/apps/spack/py-six-1.16.0-iv6iv3q
[+] /opt/apps/spack/py-pybind11-2.12.0-5rvupjv
[+] /opt/apps/spack/py-setuptools-69.2.0-3do76jw
[+] /opt/apps/spack/py-pyparsing-3.1.2-fq6imlt
[+] /opt/apps/spack/py-wheel-0.41.2-brm3k3h
[+] /opt/apps/spack/py-llvmlite-0.41.1-qom3l5h
[+] /opt/apps/spack/py-astropy-4.0.1.post1-xbojixg
[+] /opt/apps/spack/py-python-dateutil-2.8.2-kzdfskc
[+] /opt/apps/spack/py-numexpr-2.8.4-fc5xc5s
[+] /opt/apps/spack/py-h5py-3.11.0-y6drk3j
[+] /opt/apps/spack/py-tifffile-2023.8.30-oof4try
[+] /opt/apps/spack/py-pygments-2.16.1-stgrccl
[+] /opt/apps/spack/py-mdurl-0.1.2-nqk43ry
[+] /opt/apps/spack/py-scipy-1.13.0-vxjgfov
[+] /opt/apps/spack/py-contourpy-1.0.7-jg5lhss
[+] /opt/apps/spack/py-pillow-10.3.0-ijh2cju
[+] /opt/apps/spack/py-kiwisolver-1.4.5-vdh5sq5
[+] /opt/apps/spack/py-fonttools-4.39.4-x5ydbox
[+] /opt/apps/spack/py-bottleneck-1.3.7-ztsm4lg
[+] /opt/apps/spack/py-lazy-loader-0.4-k7hgvka
[+] /opt/apps/spack/py-numba-0.58.1-hzvrjwj
[+] /opt/apps/spack/py-markdown-it-py-3.0.0-l4p4qv5
[+] /opt/apps/spack/py-imageio-2.34.0-z5hu4yc
[+] /opt/apps/spack/py-matplotlib-3.8.4-azq2fzm
[+] /opt/apps/spack/py-pandas-1.5.3-p3gnh6t
[+] /opt/apps/spack/py-rich-13.4.2-okhgwwz
[+] /opt/apps/spack/py-networkx-3.1-b54br7r
[+] /opt/apps/spack/py-scikit-image-0.23.2-cvyzh3t
==> Installing py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x [83/83]
==> No binary for py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x found: installing from source
==> Fetching https://github.com/thepetabyteproject/your/archive/refs/tags/0.6.7.tar.gz
==> No patches needed for py-your
==> py-your: Executing phase: 'install'
==> py-your: Successfully installed py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x
  Stage: 1.43s.  Install: 0.99s.  Post-install: 0.12s.  Total: 3.12s

* Removed setup_run_environment

After some testing, both spack load and module load for the package will include the bin directory generated by py-your as well as the path to the version of python the package was built with, without the need for the setup_run_environment function.

I removed that function (although, like Tamara, I thought it would be necessary based on other package setups I used as a basis for this package).

Note: I also updated the required version of py-astropy from py-astropy@4.0: to py-astropy@6.1.0:. In my test builds, the install was picking up version py-astropy@4.0.1.post1 and numpy 1.26. However, when I tried to run some of the code I was getting errors about py-astropy making numpy calls that have since been removed. The newer version of py-astropy corrects these. Ideally this would be handled in the py-astropy package to make sure numpy isn't too new.
2024-06-04 21:38:38 -05:00
Teague Sterling
c2196f7d3a py-humanize: add v1.0.0, v1.1.0, v2.6.0, v3.14.0, v4.8.0, v4.9.0 (#44516)
* py-humanize: add v1.0.0, v1.1.0, v2.6.0, v3.14.0, v4.8.0, v4.9.0
2024-06-04 17:34:12 -07:00
Kyle Knoepfel
d45c27fdbd [intel-tbb] Speed up build and add versions (#44549)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-06-04 15:33:53 -06:00
Teague Sterling
173084de19 py-python-json-logger: add v2.0.2 (#44517)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies
* py-python-json-logger: add v2.0.2
2024-06-04 11:20:47 -07:00
Teague Sterling
fd2c5fa247 py-maturin: fix rust version dependencies to build (#44518)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies
* rollback
* py-maturin: fix rust version dependencies to build
2024-06-04 11:09:37 -07:00
kenche-linaro
73e0e9bdff linaro-forge: added 24.0.1 version (#44538) 2024-06-04 11:00:54 -07:00
snehring
4442414d74 kallisto: add version 0.50.1 (#44544)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-04 10:21:54 -07:00
Chris Marsh
8dbf9005f0 proj@7: support the new tiff interface in cmake@3.19 (#44535)
Add patch for proj@7 to support the new tiff interface shipped in cmake@3.19. This complements the existing patch for @8 in #43780
2024-06-04 13:09:49 -04:00
Scott Wittenburg
0c3da1b498 gitlab ci: Remove protected publish job (#44304) 2024-06-04 11:54:05 -05:00
Todd Gamblin
278f5818b7 python: make every view a venv (#44382)
#40773 introduced python-venv, which improved build isolation and avoids issues with,
e.g., `ubuntu`'s system python modifying `sysconfig` to include a (very unwanted)
`local` directory within the default install layout.

This addresses a few cases where #40773 removed functionality, without harming the
default cases where we use `python-venv`.

Traditionally, *every* view with `python` in it was essentially a virtual environment,
because we would copy the `python` interpreter and `os.py` into every view when linking.
We now rely on `python-venv` to do that, but only when it's used (i.e. new builds) and
only for packages that have an `extends("python")` directive.

This again makes every view with `python` in it a virtual environment, but only
if we're not already using a package like `python-venv`. This uses a different
mechanism from before -- instead of using the `virtualenv` trick of copying `python`
into the prefix, we instead create a `pyvenv.cfg` like `venv` (the more modern way
to do it).

This fixes two things:
1. If you already had an environment before Spack `v0.22` that worked, it would
   stop working without a reconcretize and rebuild in `v0.22`, because we no longer
   copy the python interpreter on link. Adding `pyvenv.cfg` fixes this in a more
   modern way, so old views will keep working.

2. If you have an env that only includes python packages that use `depends_on("python")`
   instead of `extends("python")`, those packages will now be importable as before,
   though they won't have the same level of build isolation you'd get with `extends`
   and `python-venv`.

* views: avoid making client code deal with link functions

Users of views and ViewDescriptors shouldn't have to deal with link functions -- they
should just say what type of linking they want.

- [x] views take a link_type, not a link function
- [x] views work out the link function from the link type
- [x] view descriptors and commands now just tell the view what they want.

* python: simplify logic for avoiding pyvenv.cfg in copy views

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-06-04 09:52:21 -06:00
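
For context on the mechanism described above: a `pyvenv.cfg` at the root of a view is what makes CPython treat the view's `python` symlink as a virtual environment. A minimal hand-rolled illustration follows; the paths and Python version are placeholders, and this is not Spack's actual view-linking code.

```python
# Minimal illustration: write a pyvenv.cfg that points a view's `python`
# symlink back at the real interpreter, making the view behave like a venv.
from pathlib import Path

view_root = Path("/tmp/example-view")   # placeholder view prefix
real_python_home = "/usr/bin"           # placeholder: directory of the real interpreter

view_root.mkdir(parents=True, exist_ok=True)
(view_root / "pyvenv.cfg").write_text(
    f"home = {real_python_home}\n"
    "include-system-site-packages = false\n"
    "version = 3.11.6\n"                # placeholder Python version
)
```
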
John W. Parent
c2e85202c7 CMake: add 3.28.6 && 3.29.4 (#44532)
* Add CMake 3.28.6

* Add 3.29.4
2024-06-04 09:08:09 -06:00
Mikael Simberg
b021b12043 gcc: add mold variant to use mold by default (#44117) 2024-06-04 16:46:01 +02:00
Todd Gamblin
89a0c9f4b3 nvhpc: Do not use -Wno-error with nvhpc (#44142)
In #30882, we made Spack ignore `-Werror` calls so that it could more easily build
projects that inject `-Werror` into their builds. We did this by translating them to
`-Wno-error` in the compiler wrapper. However, some compilers (like `nvhpc`) do not
support `-Wno-error`. We need to exclude them from this feature until they do.

- [x] make a property on `PackageBase` for `keep_werror` that knows not to use it for
      `nvhpc`.

- [x] update property so that it keeps only the specific `-Werror=...` args for newer nvhpc's,
      which support `-Wno-error` but not `-Wno-error=...`

---------

Co-authored-by: William Mou <william.mou1024@gmail.com>
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-06-04 03:46:35 -07:00
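
A heavily hedged sketch of the idea in the commit above: a per-package property tells the compiler wrapper whether to leave `-Werror` flags alone. The return values and the nvhpc version cutoff shown here are assumptions for illustration, not the exact Spack core code.

```python
# Hedged sketch: a package-level property consulted by the compiler wrapper.
from spack.package import *


class ExamplePackage(Package):
    @property
    def keep_werror(self):
        if self.spec.satisfies("%nvhpc@:23.3"):   # hypothetical cutoff version
            return "all"       # compiler lacks -Wno-error entirely: keep every -Werror
        if self.spec.satisfies("%nvhpc"):
            return "specific"  # keep only -Werror=<diag>; plain -Werror may be rewritten
        return None            # default: wrapper may translate -Werror to -Wno-error
```
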
Weiqun Zhang
259629c300 amrex: add v24.06 (#44495) 2024-06-03 17:52:18 -07:00
Teague Sterling
1ce2baf7a2 duckdb: add v1.0.0, v0.10.3 (#44531)
* duckdb: add v1.0.0, v0.10.3
* Adding issue reference
2024-06-04 02:33:36 +02:00
Garth N. Wells
4576a42a0f py-fenics-dolfinx: dependency update (#44524)
* Update nanobind dependency
* Update py-nanobind dependency
2024-06-03 17:24:40 -07:00
Nils Vu
4fba351b92 charmpp: enable darwin arm build (#44103) 2024-06-03 17:44:29 -06:00
Garth N. Wells
706737245a py-nanobind: add v2.0.0 (#44371)
* Add nanobind 2.0.0
* Add dependency
* Fix dependency name
* Change "_" -> "-"
2024-06-03 15:25:03 -07:00
吴坎
0dbdf49075 Update py-vl-convert-python (#44527)
* Update py-vl-convert-python:
  1. set version to 1.4.0
  2. mark version 1.3.0 deprecated since its Rust dependency curve@4.1.1 fails to compile
2024-06-03 15:07:16 -07:00
Owen Solberg
641075539c hugo: add v0.126.3 (#44530)
Co-authored-by: Owen Solberg <owen.solberg@sana.com>
2024-06-03 23:37:51 +02:00
Diego Alvarez S
9428d99991 seqfu: new package (#44445)
* Add seqfu

---------

Co-authored-by: dialvarezs <dialvarezs@users.noreply.github.com>
2024-06-03 14:30:00 -07:00
eugeneswalker
f3cf2e94c4 hip@6.1: fix reference to hsa-rocr-dev so it works when externally defined (#44528) 2024-06-03 14:11:47 -07:00
Todd Gamblin
85f13442d2 Consolidate concretization output for environments (#44489)
When Spack concretizes environments, it prints every (newly concretized) root spec
individually with all of its dependencies. For most reasonably sized environments, this
is too much output. This is true for three commands:

* `spack concretize` when concretizing an environment with newly added specs
* `spack install` when installing an environment with newly added specs
* `spack spec` with no arguments in an environment

The output dates back to before we had unified environments or nicer spec traversal
routines, and we can improve it.

This PR makes environment concretization output analogous to what we do for regular
specs. Just like `spack spec` for a single spec, we show all root specs with no
indentation, so you can easily see the specs you explicitly requested. Dependencies are
shown:

1. With indentation according to their depth in a breadth-first traversal starting at
   the roots;
2. Only once if they appear on paths from multiple roots

So, the default is now consistent with `spack spec` for one spec--it's `--cover=nodes`.
i.e., if there are 100 specs in your environment, you'll get 100 lines of output.

If you want to see more details, you can do that with `spack spec` using the arguments
you're already familiar with. For example, if you wanted to see dependency types and
*all* dependencies, you could use `spack spec -l --cover=edges`. Or you could add
deptypes and namespaces with, e.g. `spack spec -ltN`.

With no arguments in an environment, `spack spec` concretizes (if necessary) and shows
the concretized environment. If you run `spack concretize` *first*, inspecting the
environment repeatedly with `spack spec` will be fast, as everything is already in the
`spack.lock` file.

- [x] factor most logic of `Spec.tree()` out of `Spec` class into `spack.spec.tree()`,
      which can take multiple specs as roots.
- [x] make `Spec.tree()` call `spack.spec.tree()`
- [x] `spack.environment.display_specs()` now uses `spack.spec.tree()`
- [x] Update `spack concretize`
- [x] Update `spack install`
- [x] Update `spack spec` to call `spack.spec.tree()` for environments.
- [x] Continue to output specs individually for `spack spec` when using
      `--yaml` or `--json`
2024-06-03 13:29:14 -07:00
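
A hedged sketch of the consolidated output described above, run via `spack python`. The call to `spack.spec.tree()` follows the commit's description (multiple roots, shared dependencies shown once); the keyword shown is an assumption, not a verified API.

```python
# Hedged sketch: render multiple concrete roots as one deduplicated tree.
import spack.spec
from spack.spec import Spec

# Two concretized roots that share most of their dependency DAG.
roots = [Spec("zlib-ng").concretized(), Spec("pkgconf").concretized()]

# cover="nodes": every dependency appears at most once across both trees.
print(spack.spec.tree(roots, cover="nodes"))
```
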
James Taliaferro
f478a65635 nb: new package (#44456)
* new package: nb
* only one filter_file, install completions
* completions now implicit, merged by the view
2024-06-03 12:57:47 -07:00
Rocco Meli
eca44600c5 rdkit (#44476) 2024-06-03 12:53:08 -07:00
Matt Thompson
d7a4652554 pflogger: add version 1.15.0 (#44467) 2024-06-03 12:41:51 -07:00
G-Ragghianti
d85cdd1946 Slate version 2024 05 31 (#44529)
* updating version for slate, blaspp, and lapackpp
* verified new hashes
2024-06-03 12:39:50 -07:00
Chris Marsh
ce3aae1501 seacas: protect against known mixed-toolchain problem (#44378)
* Protected against a known problem with mixed gcc/apple-clang toolchains. Fixes #44330

---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-06-03 12:27:53 -06:00
Chris Marsh
90c4f9d463 Apply patch to allow vtk to compile with %gcc 13 and 14. (#44332)
* Apply patch from upstream to allow vtk compilation with %gcc 12 and 14.

* Fixes #44331

* fix spec usage

* fix compiler version range

* Finalize version range, switch to .diff file


---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-06-03 14:21:09 -04:00
Dan Bonachea
93ccd81d86 upcxx: update to latest gasnet and fix some bitrot (#44488)
* Now need to explicitly depend on libfabric for Cray EX
* Ensure build uses the selected CUDA and ROCm versions
* Correct spelling on `configure --with-ldflags`
* Patch a defect regarding `configure --with-ldflags`
* Default to Spack's copy of GASNet-EX, which is newer than embedded
2024-06-02 16:59:13 +02:00
Dan Bonachea
9c5a70ab6c gasnet: add v2024.5.0 (#44478)
* Add GASNet v 2024.5.0

* cosmetic fix to info output

* Add a missing dependency

* Ensure GASNet detects the provided ROCm/CUDA dependency

* [@spackbot] updating style on behalf of bonachea

---------

Co-authored-by: bonachea <bonachea@users.noreply.github.com>
2024-06-01 16:04:45 -07:00
Rocco Meli
5ef58144cb py-fortls: add v3.1.0 (#44477)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option

* update

* [@spackbot] updating style on behalf of RMeli

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-01 15:01:47 -05:00
Filippo Spiga
41ddbdfd90 gmsh: add v4.11.1, master (#41320)
* Adding gmsh 4.11.1

* Becoming a maintainer

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-01 11:30:25 -06:00
Rocco Meli
66924c85a3 py-griddataformats: add v1.0.2 (#44475)
* update

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-01 10:52:17 -05:00
snehring
a4c3fc138c apptainer: adding version 1.3.1 (#44104)
e2fsprogs: adding version 1.47.0 and fuse2fs variant
fuse-overlayfs: adding version 1.13
squashfuse: adding version 0.5.1 and 0.5.2
e2fsprogs: fixing audit errors
apptainer: expanding deps to cover more versions, fix binary_path logic.
apptainer: add go version deps, fix e2fsprogs patch sums, rework dep paths
fuse-overlayfs: tightening up the libfuse dep

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-01 01:03:35 -06:00
Matt Thompson
62ed2a07a7 mapl: add 2.46.2, 2.40.5 (#44466) 2024-05-31 22:21:12 -06:00
James Taliaferro
e7aec9e872 pass: new package (#44454)
* New package: password-store

* add bash completion as variant

* also patch the cygwin platform snippet

* update description and maintainers

* make completions implicit and don't overwrite the completions package

* remove completion option

* formatting

* clean up file filter syntax

* remove reference to completion variant

* remove dependency on bash-completion

* clarify comments

* bashcompdir is already the default

* add optional dependency on xclip

* fix formatting
2024-05-31 18:05:06 -06:00
snehring
b065c3e11e ruby: adding version 3.3.2 (#44447)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-31 17:39:09 -06:00
Lucas Frérot
88b357c453 tamaas: adding new versions and python install fix (#44469)
* tamaas: install python extension with explicit pip call
* tamaas: patching compilation issues with recent compilers
* tamaas: added versions 2.7.0 and 2.7.1
2024-05-31 16:02:24 -07:00
Ryan Mulhall
bb7299c04a add in the latest versions of FMS (#44471)
Co-authored-by: rem1776 <rem1776@github.com>
2024-05-31 16:00:39 -07:00
Raffaele Solcà
7a5bddfd15 dla-future: Add 0.5.0 (#44463)
* add dla-future@0.5.0

* [@spackbot] updating style on behalf of rasolca

* fix typo

* review suggestions

---------

Co-authored-by: rasolca <rasolca@users.noreply.github.com>
2024-05-31 13:53:37 -06:00
potter-s
50fe769f82 Updated version (#44461)
Co-authored-by: Simon Potter <sp39@sanger.ac.uk>
2024-05-31 13:44:00 -06:00
Wouter Deconinck
29d39d1adf util-macros: ensure url_for_version works for older versions (#44421)
* util-macros: ensure url_for_version only used for older versions
* util-macros: use url.substitute_version after xz -> bz2
* util-macros: mv url_for_version down the file
2024-05-31 12:36:44 -07:00
Lydéric Debusschère
8dde7f3975 py-junit2html: new package, version '31.0.2' (#44399)
* py-junit2html: new package, version '31.0.2'
* py-junit2html: install from sources instead of from wheel
2024-05-31 12:36:02 -07:00
renjithravindrankannath
0cd038273e Bump-up rocm-opencl with 6.1.0 & 6.1.1 and adding hsa library path in LD_LIBRARY_PATH (#44171)
* Adding hsa library path in LD_LIBRARY_PATH
* Prepending instead of setting LD_LIBRARY_PATH to hsa-rocr-dev/lib
* adding rocm-opencl 6.1.0 & 6.1.1 updates
2024-05-31 12:30:55 -07:00
Matt Schramm
1f5bfe80ed Updates meep to latest release 1.29.0 (#44468) 2024-05-31 12:10:30 -07:00
Hartmut Kaiser
4d2611ad8a Adding HPX v1.10.0 to package (#44470) 2024-05-31 12:08:52 -07:00
jmuddnv
21a97dad31 Changes for NVIDIA HPC SDK 24.5 (#44354) 2024-05-31 10:32:39 -07:00
jdomke
338a01ca6d fujitsu-mpi: point MPI compiler wrappers to Spack compiler wrappers (#44457)
Using OMPI_ environment variables, like openmpi does.
2024-05-31 09:39:51 -06:00
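
A hedged sketch of the general mechanism (the same pattern openmpi uses): the MPI package points its wrapper environment variables at Spack's compiler wrappers in `setup_dependent_build_environment`. The class is illustrative, not the actual fujitsu-mpi recipe.

```python
# Illustrative fragment: make the MPI compiler wrappers call Spack's wrappers.
from spack.package import *


class ExampleMpi(Package):
    def setup_dependent_build_environment(self, env, dependent_spec):
        # OMPI_* variables override the compilers the MPI wrapper scripts invoke.
        env.set("OMPI_CC", spack_cc)
        env.set("OMPI_CXX", spack_cxx)
        env.set("OMPI_FC", spack_fc)
        env.set("OMPI_F77", spack_f77)
```
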
Harmen Stoppels
392396ded4 traverse: pass key correctly (#44460)
Fixes a bug where custom keys to identify nodes were not passed
correctly.
2024-05-31 08:26:38 -07:00
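
For readers unfamiliar with the traversal API: the custom `key` function controls how visited nodes are deduplicated. A hedged illustration of that kind of call follows (assuming `spack.traverse.traverse_nodes` and its `key` keyword); this is not the code touched by the fix.

```python
# Hedged illustration: traverse a DAG, deduplicating nodes by a custom key.
import spack.traverse as traverse
from spack.spec import Spec

roots = [Spec("zlib-ng").concretized()]

# Deduplicate visited nodes by DAG hash instead of the default identity key.
for node in traverse.traverse_nodes(roots, key=lambda s: s.dag_hash()):
    print(node.name)
```
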
Wouter Deconinck
a336e0edb7 py-numexpr: add v2.8.8, v2.9.0 (#44451) 2024-05-31 08:34:19 -06:00
Satish Balay
9426fefa00 petsc4py: fix typo with version (#44452) 2024-05-31 08:24:16 -06:00
Tom Bradford
812192eef5 protobuf: fix 3.4:3.21 patch checksum (#44443) 2024-05-31 13:59:49 +02:00
Rocco Meli
b8c8e80965 dla-future-fortran: fix +test option (#44458)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option
2024-05-31 11:18:52 +02:00
Alberto Invernizzi
77fd5d8414 paraview: update cuda_arch management (#44243)
* refactor logic to switch to cmake for cuda management
2024-05-30 23:33:38 -05:00
Tom Bradford
82050ed371 corrected the vmd 1.9.3 tarball checksum (#44433) 2024-05-30 17:02:55 -07:00
John W. Parent
a7381a9413 Bootstrapping: don't use Mac OS binaries on Windows (#44193)
`BuildcacheBootstrapper` uses `Spec.intersects` to match specs needed
for bootstrapping against the binary cache. The specs were not
sufficiently-detailed to prevent matching e.g. cached binaries for
Mac OS on Windows; this commit adds the platform to each requested
bootstrap spec to prevent that.
2024-05-30 17:10:29 -06:00
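
To illustrate the matching described above: `Spec.intersects` only rules out a candidate when the two specs carry contradictory constraints, so adding `platform=...` to the requested bootstrap spec is what lets a Mac OS binary be rejected on Windows. The spec strings below are illustrative.

```python
# Small hedged example of how a platform constraint changes intersection.
from spack.spec import Spec

# Without a platform constraint, the abstract request still matches a darwin binary.
print(Spec("gnupg").intersects(Spec("gnupg platform=darwin")))                    # True

# Once the request pins the platform, the darwin binary no longer intersects it.
print(Spec("gnupg platform=windows").intersects(Spec("gnupg platform=darwin")))   # False
```
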
Alec Scott
b932783d4d unmaintained pkgs: bump versions 2024-05-20 (#44274)
* unmaintained pkgs: bump versions 2024-05-20

* seqkit: revert changes

* msgpack-c: add dependency on boost

* msgpack-c: revert to test neovim

* revert: parallel, openblas, and re2c due to conflicts

* gmsh: revert changing version

* trilinos: limit version of suite-sparse to fix build

* gtkorvo-atl: deprecate develop before remove

* msgpack-c: deprecate v1.4.1

* msgpack-c: fix style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-30 13:27:11 -06:00
Teague Sterling
0b51f25034 Package/gettext: Old version issues (#44440)
gcc@:5 hits https://savannah.gnu.org/bugs/index.php?65811 in gettext@0.22:

also fix patch of configure script

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-05-30 12:35:20 -06:00
Sreenivasa Murthy Kolam
d6a182fb5d Bump-up the version for RoCm-6.1.1 release (#44178)
* initial commit for RoCm-6.1.1 release

* fix style issues

* update the version for rocmlir and rocm-cmake

* restrict the patch for 6.0.0 release

* fix build failure for hipsolver-6.1.1 release

* update the hipblaslt for rocm-6.1.1 release

* add the patch for hipsparselt for 6.1.1 rel
2024-05-30 08:57:43 -07:00
Satish Balay
e8635adb21 petsc, py-petsc4py: add v3.21.2 (#44439) 2024-05-30 07:10:24 -07:00
Massimiliano Culpo
f242e0fd0c remove platform=cray (#43796)
Remove support for `cray` as a separate platform.

Any platform previously detected as `cray` is now detected as `linux`.

Users who still need platform=cray have to stick to Spack 0.22
2024-05-30 14:21:32 +02:00
renjithravindrankannath
67b5f6b838 rocm-validation-suite: custom patch update for lib and lib64 yaml library path (#44287) 2024-05-30 09:06:57 +02:00
Wouter Deconinck
9d16f17463 conmon: add v2.1.12 (#44389) 2024-05-30 08:57:05 +02:00
Harmen Stoppels
f44f5b0db0 tests: use fewer default paths (#44432)
Set config:install_tree:root and modules:default:roots to something
sensible.
2024-05-30 08:12:19 +02:00
Harmen Stoppels
39ace5fc45 concretizer: enforce host compat when reuse only (#44428)
Fixes a bug in the concretizer where specs depending on a host
incompatible libc would be used. This bug triggers when nothing is
built.

In the case where everything is reused, there is no libc provider from
the perspective of the solver, there is only compatible_libc. This
commit ensures that we require a host compatible libc on any reused
spec, additionally to requiring compat with the chosen libc provider.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-05-30 07:31:28 +02:00
Diego Alvarez S
0601d6a0c5 nim: add v2.0.4 (#44375)
* Add nim 2.0.4
* Use install instead of copy

---------

Co-authored-by: dialvarezs <dialvarezs@users.noreply.github.com>
2024-05-29 18:05:11 -07:00
renjithravindrankannath
11869ff872 Patch to add hsa prefix/include path (#44334)
* Patch to add hsa prefix/include path
* Style check error fix
* The patch needs to be applied to 2024.01.stable. ROCm 6.0 requires hpctoolkit 2024.01.1 or later
* Fixing style check error
2024-05-29 18:02:34 -07:00
Teague Sterling
6753605807 Update py-pyspark and py-py4j (#44263)
* Adding a py4j variant that requires Java via spack to avoid situations where a system doesn't have Java and py4j expects it
* Adding new versions of py-pyspark
* Adding a new variant to require java (via py4j) and clean up dependency handling
* Adding myself as a maintainer for py-pyspark and py-py4j
* Fix overlooked version bump in py4j
* Version bump to meet py-spark expectations
* Version bump to add latest compatible version with pyspark
* Matching py-grpcio bump
* Adding variants and dependents for pyspark
* Adding runtime deps
* Changing default java requirement. I'm not sure this is the right call
* Changing py4j with java dependency handling
* Fix style
* Update package.py fix unnecessary f-string
* Make +java the default for both
* Fix nested deps
* Revert styles after default change
* Added new versions and GCC 14 compatibility conflicts
* Added new versions and compatibility conflicts for gcc 14
* Added new versions paired to arrow (for gcc14 compat)
* Update py-protobuf compiler conflict
* Update depends to match 
    See https://github.com/grpc/grpc/blob/master/src/python/grpcio_status/setup.py
* Updating dependencies and conflicts for py-googleapis-common-protos. Added new version to avoid future issues
* Remove upper bound version on py-protobuf and add default_args
* Adding new versions and updating dependencies back to versions 1.35.0
* Updating oldest numpy deps
* Fixing merge
* bit more cleaniness for var/spack/repos/builtin/packages/py-googleapis-common-protos/package.py
* Adding latest matching version of py-grpcio and py-grpcio-status
* Update package.py
   https://github.com/spack/spack/pull/44263#discussion_r1612317943
* Update dependencies
* Adding additional versions for dependent packages. Deprecated two versions: 1.16 is old, built for python ~3.6, and does not build for 3.8. 1.52.0 was removed from pypi
* Revert py-grpcio-tools changes. Will include in separate PR
* Adding patches and constraints to get 1.48 to build as it's a dependency that is called out for some other packages
* Updating to account for yanked packages for dependencies
* Fix style
* Update sha256 for py-grpcio v0.16.0 to reflect change

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-29 18:58:46 -06:00
Tony Weaver
918db85737 py-astropy: add v6.1 and new py-astropy-iers-data dependency (#44318)
* py-astropy: Add version 6.1

Added build info for version 6.1 in py-astropy. It requires a new additional package, astropy-iers-data, which has been included as py-astropy-iers-data to match Spack's general naming conventions.

Below is the output of spack install showing a successful build of version 6.1 and the new py-astropy-iers-data package:

[+] /usr (external glibc-2.28-oj2wjfl2ao5inhfz4qehw6hlck2hizvp)
[+] /opt/apps/spack/gcc-runtime-8.5.0-5k6kvi5
[+] /opt/apps/spack/bzip2-1.0.8-t65bq3t
[+] /opt/apps/spack/libmd-1.0.4-zbdiprt
[+] /opt/apps/spack/libiconv-1.17-jskazis
[+] /opt/apps/spack/util-linux-uuid-2.38.1-w3kgjq3
[+] /opt/apps/spack/libxcrypt-4.4.35-zigqpjo
[+] /opt/apps/spack/xz-5.4.6-axoznvt
[+] /opt/apps/spack/zlib-ng-2.1.6-ccn5qny
[+] /opt/apps/spack/libyaml-0.2.5-fxathvq
[+] /opt/apps/spack/ncurses-6.4-xbvwv2w
[+] /opt/apps/spack/zstd-1.5.6-nyk6gt6
[+] /opt/apps/spack/pcre2-10.42-fu62kky
[+] /opt/apps/spack/libunistring-1.2-whrov3e
[+] /opt/apps/spack/nghttp2-1.57.0-u72gxms
[+] /opt/apps/spack/openblas-0.3.26-pfyk2vi
[+] /opt/apps/spack/berkeley-db-18.1.40-jftva2u
[+] /opt/apps/spack/wcslib-7.3-zvcqq7o
[+] /opt/apps/spack/libffi-3.4.6-ibucrfe
[+] /opt/apps/spack/erfa-2.0.0-4qkta2n
[+] /opt/apps/spack/pkgconf-1.9.5-ckjdqjm
[+] /opt/apps/spack/libbsd-0.12.1-njt5grs
[+] /opt/apps/spack/openssl-3.2.1-4lqdgni
[+] /opt/apps/spack/pigz-2.8-rx263bp
[+] /opt/apps/spack/readline-8.2-2ys6ede
[+] /opt/apps/spack/libidn2-2.3.7-vnie4rz
[+] /opt/apps/spack/libedit-3.1-20230828-676jwbd
[+] /opt/apps/spack/libxml2-2.10.3-37klvxv
[+] /opt/apps/spack/expat-2.6.2-7kfe3hb
[+] /opt/apps/spack/curl-8.6.0-gpzsr3p
[+] /opt/apps/spack/tar-1.34-wjzs4wj
[+] /opt/apps/spack/gdbm-1.23-cylmqwx
[+] /opt/apps/spack/sqlite-3.43.2-axuxulg
[+] /opt/apps/spack/cfitsio-3.49-mmy3dbr
[+] /opt/apps/spack/gettext-0.22.4-zjsp346
[+] /opt/apps/spack/perl-5.38.0-gzljgek
[+] /opt/apps/spack/python-3.10.13-fz7fymx
[+] /opt/apps/spack/krb5-1.20.1-tqiapsx
[+] /opt/apps/spack/py-pyyaml-6.0-rju7jls
[+] /opt/apps/spack/py-tomli-2.0.1-eanxpu2
[+] /opt/apps/spack/py-numpy-1.26.4-t5acjcv
[+] /opt/apps/spack/python-venv-1.0-2cz5c3s
[+] /opt/apps/spack/py-pip-23.0-lxkcvby
[+] /opt/apps/spack/openssh-9.7p1-jxrkzso
[+] /opt/apps/spack/py-pyerfa-2.0.0.1-kyfazhs
[+] /opt/apps/spack/py-packaging-23.1-wkeyuk6
[+] /opt/apps/spack/py-typing-extensions-4.8.0-ujwbb6g
[+] /opt/apps/spack/py-setuptools-69.2.0-3do76jw
[+] /opt/apps/spack/py-wheel-0.41.2-brm3k3h
[+] /opt/apps/spack/git-2.45.1-tuc5jnb
[+] /opt/apps/spack/py-cython-3.0.0-zx62ssd
==> Installing py-astropy-iers-data-main-ukchsfzhfcyz6e6fxar6mtykqiavporj [52/55]
==> No binary for py-astropy-iers-data-main-ukchsfzhfcyz6e6fxar6mtykqiavporj found: installing from source
==> No patches needed for py-astropy-iers-data
==> py-astropy-iers-data: Executing phase: 'install'
==> py-astropy-iers-data: Successfully installed py-astropy-iers-data-main-ukchsfzhfcyz6e6fxar6mtykqiavporj
  Stage: 1.74s.  Install: 0.93s.  Post-install: 0.52s.  Total: 3.36s
[+] /opt/apps/spack/py-astropy-iers-data-main-ukchsfz
[+] /opt/apps/spack/py-extension-helpers-0.1-a5hmr6j
[+] /opt/apps/spack/py-setuptools-scm-8.0.4-qdhxchg
==> Installing py-astropy-6.1.0-f4pffru3kmyion2kq6muomgrfs5y4gdo [55/55]
==> No binary for py-astropy-6.1.0-f4pffru3kmyion2kq6muomgrfs5y4gdo found: installing from source
==> Fetching https://files.pythonhosted.org/packages/source/a/astropy/astropy-6.1.0.tar.gz
==> Ran patch() for py-astropy
==> py-astropy: Executing phase: 'install'
==> Warning: Module file /opt/modulefiles/spack/Core/py-astropy/6.1.0.lua exists and will not be overwritten
==> py-astropy: Successfully installed py-astropy-6.1.0-f4pffru3kmyion2kq6muomgrfs5y4gdo
  Stage: 1.29s.  Install: 1m 5.77s.  Post-install: 0.60s.  Total: 1m 7.94s

* Removed extra-whitespace

A blank line had white space, removed the white space

* Additional formatting changes for black

* Additional package updates

Based on previous recommendations, I updated the py-astropy and py-astropy-iers-data packages. Also added a new version to the py-pyerfa package to better match the 6.1.0 dependencies.

Of importance in these updates, I did add pypi and version info to py-astropy-iers-data. Originally I had argued that this package updates quite frequently (on a weekly basis), so it did not make sense to include pypi/versions and we should instead use the non-versioned git-repo structure based on the master branch. However, when I tried to build the package that way, py-setuptools-scm errored out:

/opt/apps/spack/py-setuptools-scm-8.0.4-ax2zqro/lib/python3.10/site-packages/setuptools_scm/git.py:163: UserWarning: "/tmp/root/spack-stage/spack-stage-py-astropy-iers-data-main-iw2mdzlukb37mkcbcozjjefjoefw2eyp/spack-src" is shallow and may cause errors

I believe this is due to the download/stage directory containing the branch name rather than a version. I changed the package to use versions instead and it worked fine, as shown below. In addition, in some earlier preliminary testing the package installed fine from the non-versioned master branch; that installation used py-setuptools-scm@7.1, while this run used the much more recent 8.0.4, so it is possible that something changed between 7.1 and 8.0.4 that causes this error to appear. Since setuptools-scm is responsible for extracting package versions, I suspect it is some kind of mismatch issue.
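
The change, in miniature (a hedged sketch: the pypi path, version string, and checksum below are placeholders, not the values from the real package.py):

```python
from spack.package import *


class PyAstropyIersData(PythonPackage):
    """Illustrative stub of the change described above; not the real recipe."""

    homepage = "https://github.com/astropy/astropy-iers-data"
    # Placeholder pypi path -- the exact sdist name is not verified here.
    pypi = "astropy-iers-data/astropy-iers-data-0.2024.5.20.0.29.40.tar.gz"

    # Before: only a branch-based version, which yields a shallow, unversioned
    # stage directory that setuptools-scm warns about.
    # version("main", branch="main")

    # After: a dated PyPI release, so the stage carries a real version and
    # setuptools-scm can resolve it without git metadata.
    version("0.2024.5.20.0.29.40", sha256="<placeholder>")

    depends_on("py-setuptools", type="build")
```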

Output from build:
spack install py-astropy
[+] /usr (external glibc-2.28-oj2wjfl2ao5inhfz4qehw6hlck2hizvp)
[+] /opt/apps/spack/gcc-runtime-8.5.0-5k6kvi5
[+] /opt/apps/spack/wcslib-7.3-zvcqq7o
[+] /opt/apps/spack/xz-5.4.6-axoznvt
[+] /opt/apps/spack/libffi-3.4.6-ibucrfe
[+] /opt/apps/spack/erfa-2.0.0-4qkta2n
[+] /opt/apps/spack/libmd-1.0.4-zbdiprt
[+] /opt/apps/spack/util-linux-uuid-2.38.1-w3kgjq3
[+] /opt/apps/spack/libiconv-1.17-jskazis
[+] /opt/apps/spack/berkeley-db-18.1.40-jftva2u
[+] /opt/apps/spack/zstd-1.5.6-nyk6gt6
[+] /opt/apps/spack/ncurses-6.4-xbvwv2w
[+] /opt/apps/spack/bzip2-1.0.8-t65bq3t
[+] /opt/apps/spack/libunistring-1.2-whrov3e
[+] /opt/apps/spack/pcre2-10.42-fu62kky
[+] /opt/apps/spack/pkgconf-1.9.5-ckjdqjm
[+] /opt/apps/spack/zlib-ng-2.1.6-ccn5qny
[+] /opt/apps/spack/openblas-0.3.26-pfyk2vi
[+] /opt/apps/spack/libxcrypt-4.4.35-zigqpjo
[+] /opt/apps/spack/libyaml-0.2.5-fxathvq
[+] /opt/apps/spack/libbsd-0.12.1-njt5grs
[+] /opt/apps/spack/readline-8.2-2ys6ede
[+] /opt/apps/spack/libidn2-2.3.7-vnie4rz
[+] /opt/apps/spack/nghttp2-1.57.0-u72gxms
[+] /opt/apps/spack/libedit-3.1-20230828-676jwbd
[+] /opt/apps/spack/libxml2-2.10.3-37klvxv
[+] /opt/apps/spack/openssl-3.2.1-4lqdgni
[+] /opt/apps/spack/pigz-2.8-rx263bp
[+] /opt/apps/spack/expat-2.6.2-7kfe3hb
[+] /opt/apps/spack/sqlite-3.43.2-axuxulg
[+] /opt/apps/spack/gdbm-1.23-cylmqwx
[+] /opt/apps/spack/curl-8.6.0-gpzsr3p
[+] /opt/apps/spack/tar-1.34-wjzs4wj
[+] /opt/apps/spack/perl-5.38.0-gzljgek
[+] /opt/apps/spack/cfitsio-3.49-mmy3dbr
[+] /opt/apps/spack/gettext-0.22.4-zjsp346
[+] /opt/apps/spack/krb5-1.20.1-tqiapsx
[+] /opt/apps/spack/python-3.10.13-fz7fymx
[+] /opt/apps/spack/openssh-9.7p1-jxrkzso
[+] /opt/apps/spack/py-tomli-2.0.1-eanxpu2
[+] /opt/apps/spack/py-setuptools-69.2.0-3do76jw
[+] /opt/apps/spack/py-numpy-1.26.4-t5acjcv
[+] /opt/apps/spack/python-venv-1.0-2cz5c3s
[+] /opt/apps/spack/py-packaging-23.1-wkeyuk6
[+] /opt/apps/spack/py-pip-23.0-lxkcvby
[+] /opt/apps/spack/git-2.45.1-zu6qkoc
[+] /opt/apps/spack/py-markupsafe-2.1.3-isgtki6
[+] /opt/apps/spack/py-wheel-0.41.2-brm3k3h
==> Installing py-extension-helpers-0.1-a5hmr6jtrvpcq3ibwkwhvwlydthjif5a [49/57]
==> No binary for py-extension-helpers-0.1-a5hmr6jtrvpcq3ibwkwhvwlydthjif5a found: installing from source
==> Fetching ac8a6fe91c.tar.gz
==> No patches needed for py-extension-helpers
==> py-extension-helpers: Executing phase: 'install'
==> py-extension-helpers: Successfully installed py-extension-helpers-0.1-a5hmr6jtrvpcq3ibwkwhvwlydthjif5a
  Stage: 0.37s.  Install: 0.88s.  Post-install: 0.52s.  Total: 1.89s
[+] /opt/apps/spack/py-extension-helpers-0.1-a5hmr6j
==> Installing py-cython-3.0.0-zx62ssdy4p6ddwuqbixel2vcsihjcs6m [50/57]
==> No binary for py-cython-3.0.0-zx62ssdy4p6ddwuqbixel2vcsihjcs6m found: installing from source
==> Fetching 350b18f967.tar.gz
==> No patches needed for py-cython
==> py-cython: Executing phase: 'install'
==> py-cython: Successfully installed py-cython-3.0.0-zx62ssdy4p6ddwuqbixel2vcsihjcs6m
  Stage: 1.07s.  Install: 2m 49.80s.  Post-install: 0.40s.  Total: 2m 51.37s
[+] /opt/apps/spack/py-cython-3.0.0-zx62ssd
[+] /opt/apps/spack/py-jinja2-3.1.2-wacpq7j
[+] /opt/apps/spack/py-typing-extensions-4.8.0-ujwbb6g
[+] /opt/apps/spack/py-pyyaml-6.0-rju7jls
[+] /opt/apps/spack/py-setuptools-scm-8.0.4-ax2zqro
==> Installing py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6icqn3aqhyjidfir3byyjq5aq6 [55/57]
==> No binary for py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6icqn3aqhyjidfir3byyjq5aq6 found: installing from source
==> Using cached archive: /opt/spack/var/spack/cache/_source-cache/archive/7f/7fff3d3404ae8560533ac0e685db7acc02c4d8984faa4ac3d607096879fba2d1.tar.gz
==> No patches needed for py-astropy-iers-data
==> py-astropy-iers-data: Executing phase: 'install'
==> py-astropy-iers-data: Successfully installed py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6icqn3aqhyjidfir3byyjq5aq6
  Stage: 0.07s.  Install: 1.21s.  Post-install: 0.39s.  Total: 1.88s
[+] /opt/apps/spack/py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6
==> Installing py-pyerfa-2.0.1.1-pkokp6usk7m2bjxcba3nwgqgrjufumcp [56/57]
==> No binary for py-pyerfa-2.0.1.1-pkokp6usk7m2bjxcba3nwgqgrjufumcp found: installing from source
==> Fetching https://files.pythonhosted.org/packages/source/p/pyerfa/pyerfa-2.0.1.1.tar.gz
==> No patches needed for py-pyerfa
==> py-pyerfa: Executing phase: 'install'
==> py-pyerfa: Successfully installed py-pyerfa-2.0.1.1-pkokp6usk7m2bjxcba3nwgqgrjufumcp
  Stage: 0.93s.  Install: 17.72s.  Post-install: 0.33s.  Total: 19.15s
[+] /opt/apps/spack/py-pyerfa-2.0.1.1-pkokp6u
==> Installing py-astropy-6.1.0-5brbkjnjzfg3lc6h34qku24ep5pwsxzs [57/57]
==> No binary for py-astropy-6.1.0-5brbkjnjzfg3lc6h34qku24ep5pwsxzs found: installing from source
==> Fetching https://files.pythonhosted.org/packages/source/a/astropy/astropy-6.1.0.tar.gz
==> Ran patch() for py-astropy
==> py-astropy: Executing phase: 'install'
==> py-astropy: Successfully installed py-astropy-6.1.0-5brbkjnjzfg3lc6h34qku24ep5pwsxzs
  Stage: 1.16s.  Install: 1m 3.34s.  Post-install: 1.05s.  Total: 1m 5.77s
[+] /opt/apps/spack/py-astropy-6.1.0-5brbkjn

* Updated to match with black formatting
2024-05-29 15:16:54 -07:00
Robert Cohn
1184de8352 intel-oneapi-mkl: change logic for gfortran compatibility (#44242)
* [intel-oneapi-mkl]: change logic for gfortran compatibility

* review comments

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

* warn if fortran-rt and gcc are used without gfortran

* remove warning

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2024-05-29 15:02:28 -06:00
Vanessasaurus
2470fde5d9 flux-sched: add v0.34.0 (#44205)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-05-29 12:02:58 -07:00
Harmen Stoppels
abfff43976 remove non-existent when kwarg (#44437)
Not sure why this was added, it is ignored.
2024-05-29 20:39:37 +02:00
Chris Green
230687a501 postgresql: fix thinko with thread-safety option (#44381) 2024-05-29 11:30:54 -07:00
eugeneswalker
5ff8908ff3 e4s hopper ci: minimize root specs (#44436) 2024-05-29 10:09:04 -07:00
Diego Alvarez S
882e09e50b mmseqs2: add v15-6f452 (#44362) 2024-05-29 10:05:38 -07:00
Harmen Stoppels
6753f4a7cb bootstrap: fix broken test (#44403) 2024-05-29 19:03:49 +02:00
Wouter Deconinck
1dc63dbea6 acts: add v34.1.0, v35.0.0 (identification, sycl variants changes) (#44407)
* acts: add v34.1.0, v35.0.0 (identification, sycl variants changes)

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-29 08:52:20 -07:00
Vicente Bolea
b9dfae4722 adios2: update to latest version 2.10.1 (#44426) 2024-05-29 06:46:45 -05:00
Harmen Stoppels
70412612c7 installer: improve init signature and explicits (#44374)
Change the installer to take `([pkg], args)` in the constructor instead
of `[(pkg, args)]`. The reason is that certain arguments are global
settings, and the new API ensures that those arguments cannot be
different across different "build requests".

The `explicit` install arg is now a list of hashes, and the installer is
no longer responsible for determining what package is installed
explicitly. This way environment installs can simply pass the list of
environment roots, without them necessarily being explicit build
requests. For example an env with two roots [a, b], where b depends on
a, would not always cause spack install to mark b as explicit.

Notice that `overwrite` already took a list of hashes, this makes
`explicit` consistent.

`package.do_install(explicit=True)` continues to take a boolean.
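
Roughly, the calling convention changes like this (a hedged sketch: the class location and keyword names are inferred from the description above, not checked against the final signature):

```python
from spack.installer import PackageInstaller  # location/name assumed


def install_environment(packages, root_hashes):
    """Install `packages`, marking only the given DAG hashes as explicit."""
    # Old style (removed): one (pkg, install_args) tuple per package, e.g.
    #   PackageInstaller([(pkg, {"explicit": True}) for pkg in packages])
    # New style: a single package list plus global arguments; `explicit` is a
    # list of hashes (e.g. the environment roots), matching `overwrite`.
    installer = PackageInstaller(packages, explicit=root_hashes, overwrite=[])
    installer.install()
```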
2024-05-29 08:25:34 +02:00
Harmen Stoppels
cd741c368c py-pythran: add 0.16, fix compat bounds (#43983) 2024-05-29 08:13:11 +02:00
Teague Sterling
16a7bef456 fix: mariadb conflict (#44418)
This is a small fix to the mariadb conflicts recently pushed. The maximum conflicting version for gcc>=13 was off by one.
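
For illustration only (the version bounds below are hypothetical; the real ranges live in the mariadb package.py), the shape of an off-by-one fix like this is:

```python
from spack.package import *


class Mariadb(CMakePackage):
    """Stub for illustration; the versions in the ranges are made up."""

    # Before (off by one): the newest release that actually fails with
    # gcc >= 13 fell just outside the conflict range.
    # conflicts("%gcc@13:", when="@:10.5")

    # After: widen the upper bound by one release so it is covered.
    conflicts("%gcc@13:", when="@:10.6")
```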
2024-05-28 22:14:09 -06:00
Wouter Deconinck
85f62728c6 libxcb, xcb-proto: add v1.17.1 (#44394)
* libxcb, xcb-proto: add v1.17.1
* libxcb, xcb-proto: inherit XorgPackage
* xcb-proto: http -> https
2024-05-28 14:36:16 -06:00
Wouter Deconinck
092dc96e6c acts: pass cuda_arch to CMAKE_CUDA_ARCHITECTURES (#44397) 2024-05-28 11:41:32 -07:00
Carsten Uphoff
2bb20caa5f New package: tiny tensor compiler (#44401)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-28 10:27:44 -07:00
Nathalie Furmento
00bcf935e8 starpu: add v1.4.7 (#44415) 2024-05-28 10:24:14 -07:00
Martin Pokorny
3751372396 legion: add missing build dependency and new variants (#44329)
* legion: add pip dependency for build

pip is needed for the build when the Legion Python binding is enabled

* legion: add variants for gc logging and system OpenMP use

This also removes the `cmake_cxx_flags` variable from `cmake_args()`
because that variable had no effect.

* [@spackbot] updating style on behalf of mpokorny

* legion: use spec.satisfies() in cmake_args()

e.g., replace use of '"+foo" in spec' with 'spec.satisfies("+foo")' (see the sketch after this list)

* legion: avoid configuring with multiple "-DLegion_REDOP_COMPLEX=ON" arguments

In the previous version, when both '+redop_complex' and '+bindings'
was set, two instances of "-DLegion_REDOP_COMPLEX=ON" arguments were
generated for cmake configuration.

* legion: fix value of "Legion_OUTPUT_LEVEL" for configuration

the previous version had no effect on setting the configuration value
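
A minimal sketch of the idiom change from the spec.satisfies() bullet above ('+foo' and the CMake option name are placeholders, not real legion names):

```python
from spack.package import *


class SatisfiesDemo(CMakePackage):
    """Placeholder package; only the cmake_args() idiom matters here."""

    variant("foo", default=False, description="placeholder variant")

    def cmake_args(self):
        args = []
        # Old idiom:        if "+foo" in self.spec:
        # Preferred idiom:  explicit satisfies() query, which handles
        # variants, versions and compiler constraints uniformly.
        if self.spec.satisfies("+foo"):
            args.append(self.define("ENABLE_FOO", True))
        return args
```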

---------

Co-authored-by: mpokorny <mpokorny@users.noreply.github.com>
2024-05-28 10:21:36 -07:00
Wouter Deconinck
e6afeca92f xorg-docs: add v1.7.3 (#44388)
* xorg-sgml-doctools: add v1.12.1

* xorg-docs: add v1.7.3

* util-macros: add v1.20.1 (now distributed as xz)

* util-macros: prefer spec.satisfies

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-28 10:19:31 -07:00
Wouter Deconinck
35b9307af6 py-uhi: add v0.4.0 (#44411) 2024-05-28 10:05:52 -07:00
Alberto Sartori
567f728579 justbuild: add v1.3.1 (#44398) 2024-05-28 07:03:07 -07:00
Teague Sterling
404c5c29a1 mariadb: add v10.8.8, v10.9.6, v11.3.2 (#44412) 2024-05-28 06:58:23 -07:00
Matthieu Dorier
63712ba6c6 mochi-margo: add v0.16.0, v0.17.0 (#44406) 2024-05-28 06:53:04 -07:00
Wouter Deconinck
ef62d47dc7 prmon: add v3.1.0 (#44410) 2024-05-28 06:52:17 -07:00
Wouter Deconinck
a4594857fc nlohmann-json: add v3.11.3 (#44409) 2024-05-28 14:14:47 +02:00
Wouter Deconinck
e77572b753 git-lfs: add v3.4.1, v3.5.1 (#44392)
* git-lfs: add v3.4.1, v3.5.1

* git-lfs: rm trailing spaces
2024-05-28 07:42:18 -04:00
Juan Miguel Carceller
8c84c5ff66 whizard: add a dependency on ghostscript and fix +openmp (#44414) 2024-05-28 10:57:56 +02:00
Wouter Deconinck
5d8beaf0ed doxygen: add v1.11.0, v1.10.0 (#44390) 2024-05-28 10:05:12 +02:00
Wouter Deconinck
ac405f3d79 elfutils: add v0.191 (#44391) 2024-05-27 22:29:00 -07:00
Wouter Deconinck
2e30553310 madx: add v5.09.03 (#44408) 2024-05-27 18:56:53 -07:00
Diego Alvarez S
85a61772d8 seqkit: add v2.8.2 (#44356) 2024-05-27 11:41:56 -07:00
Teague Sterling
4007f8726d rust: add conflicts with gcc >= 13 (#44404) 2024-05-27 11:32:48 -07:00
Adam J. Stewart
a097f7791b py-ruff: add v0.4.5 (#44355) 2024-05-27 11:28:48 -07:00
kwryankrattiger
3d4d89b2c0 ParaView Release 5.12.1 (#44396)
Add ParaView 5.12.1
Update preferred ParaView to 5.12.1
2024-05-27 10:45:02 -07:00
Harmen Stoppels
e461234865 link: directly bind to os.* on non-windows (#44400)
The windows wrappers for basic functions like `os.symlink`,
`os.readlink` and `os.path.islink` in the `llnl.util.symlink` module
have bugs, and trigger more file system operations on non-windows than
they should.

This commit just binds `llnl.util.symlink.symlink = os.symlink` etc so
built-in functions are used on non-windows
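
Conceptually (this is a hedged sketch, not the actual module source), the binding looks like:

```python
import os
import sys

if sys.platform == "win32":
    # Windows keeps wrapper implementations (junction handling, longpath
    # prefixes, and so on).
    def symlink(src, dst):
        ...

    def readlink(path):
        ...

    def islink(path):
        ...

else:
    # Everywhere else, bind the names straight to the built-ins so no extra
    # file system operations are triggered.
    symlink = os.symlink
    readlink = os.readlink
    islink = os.path.islink
```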
2024-05-27 13:37:04 +02:00
Adam J. Stewart
2c1d5f9844 Remove deprecated versions from packages (#44157) 2024-05-27 09:30:55 +02:00
Michael Kuhn
c4b682b983 rocksdb: add 9.2.1 (#44384) 2024-05-26 14:15:03 -05:00
Derek Ryan Strong
de0b784d5a nano: add v8.0 (#44366)
* Add nano v8.0

* Change to pkgconfig virtual provider

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-25 22:12:08 -05:00
Derek Ryan Strong
5f38afdfc7 Add vim 9.1.0437 (#44364) 2024-05-25 17:20:26 -07:00
Derek Ryan Strong
ac67c6e34b htop: add v3.3.0 (#44369) 2024-05-25 17:18:59 -07:00
Peter Scheibel
72deb53832 Make spack clean env-aware (#44227)
`spack clean <spec>` will now resolve specs based on the active environment if one is active.

If an env is active but no matching spec is found, this will fall back on fully concretizing.
2024-05-24 15:00:50 -07:00
Massimiliano Culpo
7c87253fd8 Make strong preferences even stronger (#44373)
Before this PR, if Spack could see a possibility to reuse a spec that
doesn't match a strong preference, it would do so. After the PR, a
strong preference would take precedence.
2024-05-24 10:06:28 -07:00
Carsten Uphoff
1136aedd08 Add Khronos official OpenCL ICD loader (#44351)
* Add Khronos official OpenCL ICD loader

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>

* Formatting; add missing opencl-c-headers version

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>

* opencl-icd-loader: use define instead of f-string

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>

---------

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-24 09:57:39 -07:00
AMD Toolchain Support
24e1b56268 uprof: update recipe, add missing dependency (#44293) 2024-05-24 16:19:18 +02:00
Harmen Stoppels
eef6a79b35 Prefer libiconv for iconv (#44335)
`glibc` and `musl` provide a basic implementation of `iconv` (`iconv`,
`iconv_open`, `iconv_close`), but in practice the installation may be
missing the character encoding methods to make them usable. On Fedora
for example, users need to

```yum install glibc-gconv-extra```

to get the character encodings that `gettext` requires during configure,
namely EUC-JP. Users may not have permissions to install the missing
parts of glibc.

Since Spack can install `libiconv`, it is simpler to use that by
default.
2024-05-24 13:25:59 +02:00
Adam J. Stewart
556a36cbd7 py-scikit-learn: add v1.5.0 (#44303)
* py-scikit-learn: add v1.5.0

* Add maintainers

* py-scikit-learn-extra: latest py-scikit-learn not supported
2024-05-24 12:41:59 +02:00
Rocco Meli
8aa490d6b7 DLA-Future-Fortran: new package (#44314)
* Spglib: add version 2.4.0

* dla-future-fortran: new package version 0.1.0

* [@spackbot] updating style on behalf of RMeli

* apply suggestion and add maintainer

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-05-24 09:53:33 +02:00
Chris Marsh
d9d085da10 Fix patch being applied @7 which causes an error (#44367) 2024-05-24 09:45:20 +02:00
Samuel Khuvis
d88d720577 add mvapich support for intel scalapack_libs (#44246)
* add mvapich support for intel scalapack_libs

* Add mvapich support for oneapi scalapack_libs
2024-05-24 02:51:53 +00:00
Harry Sharma
1d670ae744 feat: add metacarpa@1.0.1 to spack (#44339)
* feat: add metacarpa@1.0.1 to spack
* fix: style issue
* Update copyright year
2024-05-23 20:43:13 -06:00
Veselin Dobrev
35ad6f52c1 Better Emacs build on Mac OS (#37294)
* [emacs] When installing on Mac OS, build and install the native
        Emacs.app along with the standard executables.

* [emacs] Make the GUI build on Mac optional by adding "gui" variant

* Apply reviewer suggestion

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Add emacs version 29.3

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-23 18:17:38 -07:00
Greg Becker
b61bae7640 bugfix: external detection for compilers with os but not target (#44156)
avoid calling `spec.target` when None.

When an external compiler package has an `os` set but no `target` set, Spack
currently falls into a codepath that calls `spec.target` (which itself calls
`spec.architecture.target.Microarchitecture`) when `spec.architecture.target`
is None, throwing an error.

e.g.

```
packages:
  gcc:
    externals:
    - spec: gcc@12.3.1 os=rhel7
      prefix: /usr
```

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-24 00:13:36 +00:00
Massimiliano Culpo
8b7abace8b Enforce consistency of gl providers (#44307)
* glew: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual (see the sketch after this list).

* mesa-demos: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* mesa-glu: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* paraview: fix dependency on glew

* mesa: group dependency on when("+glx")

* Add missing dependency on libxml2

* paraview: remove the "osmesa" and "egl" variant

Instead, enforce consistency using the "gl" virtual that allows
only one provider.

* visit: remove osmesa variant

* Disable paraview in the aws-isc stacks

* data-vis-sdk: rework constrains to enforce front-ends

* e4s-power: remove redundant paraview

* Pipelines: update osmesa variants

* trilinos-catalyst-ioss-adapter: make gl a run dependency
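
The distinction the reworks above rely on, as a hedged sketch (the package is a placeholder; "mesa" and "opengl" stand in for any concrete gl providers):

```python
from spack.package import *


class GlClientDemo(CMakePackage):
    """Placeholder package illustrating the gl virtual; not a real recipe."""

    # Direct dependencies on concrete providers are not unified through the
    # virtual, so nothing stops two different GL stacks from being pulled in:
    # depends_on("mesa")
    # depends_on("opengl")

    # Depending on the virtual lets the concretizer pick exactly one provider
    # of "gl" for the whole DAG, which is what these reworks enforce.
    depends_on("gl")
```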
2024-05-23 20:17:51 +00:00
pauleonix
5cf98d9564 asio: Add 1.30.0:1.30.2 (#44346) 2024-05-23 13:15:02 -07:00
Diego Alvarez S
973a961cb5 Add openjdk 11.0.23+9, 17.0.11+9, 21.0.3+9 (#44340) 2024-05-23 12:48:57 -05:00
Adam J. Stewart
868d0cb957 py-scipy: add v1.13.1 (#44337) 2024-05-23 10:45:51 -07:00
Diego Alvarez S
497f3a3832 nextflow: add 24.04.1 (#44338)
* Add nextflow 24.04.1
* Install java 17 in this version
2024-05-23 10:44:36 -07:00
Wouter Deconinck
9843f41bce libxkbcommon: add v1.6.0, v1.7.0 (#44344) 2024-05-23 10:37:07 -07:00
Carsten Uphoff
e54fefc2b7 Add double-batched-fft-library@0.5.1 (#44345)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-23 10:34:57 -07:00
pauleonix
90c0889533 benchmark: Add 1.8.4 (#44347) 2024-05-23 10:31:29 -07:00
Matt Thompson
6696e82ce7 mapl: add conflicts for intel 2021.7 (#44350) 2024-05-23 10:20:53 -07:00
Carsten Uphoff
dcc55d53db Add level zero loader versions (#44349)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-23 10:56:31 -06:00
Harmen Stoppels
92000e81b8 absolutify_elf_sonames.py: fix _patchelf (#44343) 2024-05-23 14:05:10 +00:00
Mikael Simberg
125175ae25 gperftools: Don't build benchmarks or tests (#44336) 2024-05-23 06:55:52 -06:00
Massimiliano Culpo
f60e548a0d ASP-based solver: fix reusing externals on linux (#44316)
We need to tell clingo the libc compatibility of external nodes
in buildcaches or stores, to allow reuse.
2024-05-23 14:37:48 +02:00
John W. Parent
04dc16a6b1 cmake: add v3.28 (#43723) 2024-05-23 07:58:05 +02:00
Wouter Deconinck
27b90e38db py-cloudpickle: new version 3.0.0 (switch to flit-core) (#44324) 2024-05-23 07:40:16 +02:00
Wouter Deconinck
7e5ce3ba48 py-ordered-set: new version 4.1.0 (flit-core) (#44325) 2024-05-23 07:38:53 +02:00
Paolo
f5f7cfdc8f armpl-gcc: add v24.04 (#43553)
Starting from 24.04, armpl-gcc will only have three package files: dev, rpm, and macOS.

Only one gcc version is provided, so the code now reflects that a single tar is provided
for gcc rather than many.
2024-05-23 06:37:23 +02:00
Alex Richert
3e1a562312 Update package.py (#44322) 2024-05-22 21:13:35 -06:00
Todd Gamblin
ce4d962faa README.md: add windows 2024-05-22 18:12:12 -07:00
mSamiolo
b9816a97fc docs: update chain.rst to improve discussion of upstreams (#43918)
* Update chain.rst

* Update lib/spack/docs/chain.rst

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update lib/spack/docs/chain.rst

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* docs: rm leading spaces to avoid indent

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-22 22:04:53 +00:00
Alex Richert
f7b9c30456 Add develop version to ufs-weather-model (major updates) (#39265)
* Add develop version to ufs-weather-model (major updates)

* Update ufs-weather-model maintainers

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update ufs-weather-model defaults and fms dep

* Update package.py

* Update package.py

* Update package.py
2024-05-22 11:33:17 -06:00
Harmen Stoppels
884620a38a gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
to make macOS's linker happy.
2024-05-22 17:36:42 +02:00
pauleonix
7503a41773 cuda: add v12.4.1 (#43488) 2024-05-22 10:50:56 +02:00
Matt Thompson
9a5fc6b4a3 mapl: add v2.46 (#44230) 2024-05-22 10:22:57 +02:00
Cyrus Harrison
a31aeed167 conduit: add v0.9.2 (#44308) 2024-05-22 10:05:40 +02:00
Tamara Dahlgren
71f542a951 kcov: convert to new stand-alone test process (tested with latest version) (#44309) 2024-05-22 10:04:27 +02:00
Carlos Bederián
322bd48788 gcc: add v13.3.0 (#44306) 2024-05-22 09:44:54 +02:00
Wouter Deconinck
b752fa59d4 qt: add version 5.15.13 (#44310) 2024-05-22 09:44:35 +02:00
Wouter Deconinck
d53e4cc426 rsync: add v3.3.0 (#44311) 2024-05-22 09:36:16 +02:00
Tom Scogland
ee4b7fa3a1 flux-sched: add docs variant to match core, fix ver (#44277) 2024-05-22 00:55:57 -06:00
Harmen Stoppels
d6f02c86d9 grpc: add 1.61 to 1.64 (#44297) 2024-05-22 08:04:35 +02:00
John W. Parent
62efde8e3c cmake/add-3.29 (#43349) 2024-05-21 23:35:54 -06:00
Harmen Stoppels
bda1d94d49 bison: add missing diffutils build dep (#44296) 2024-05-21 19:33:29 -06:00
Massimiliano Culpo
3f472039c5 Take a lock before querying installed_dependents (#44301)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-21 13:16:04 -06:00
Robert Underwood
912ef34206 get cupy compiling with latest cuda (#44266)
* get cupy compiling with latest cuda
* fix other cupy-deps
* fix version bounds

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-05-21 10:44:18 -07:00
Massimiliano Culpo
9c88a48a73 Remove mesa18 and libosmesa (#44264)
* Remove mesa18 and libosmesa

mesa18 was introduced in #19528 as a way to maintain the old
autotools build of mesa separate from the new meson build.

We could add a second build system to mesa, but since mesa18 has
been deprecated for a long time, we'll just remove it.

libosmesa was used to multiplex the gl provider between mesa18
and mesa, and is thus unnecessary. Remove it to reduce complexity
in the graphical stack.

* Remove references to mesa18 and libosmesa

* vtk: rework dependency on gl and osmesa

* memsurfer: rework dependency on vtk

* visit: minimal fix to avoid having both osmesa and glx
2024-05-21 10:18:14 -05:00
Dan Lipsa
4bf5cc9a9a ParaView on Windows: Use hdf5::hdf5 (#44161)
* Use hdf5::hdf5 on Windows from Paraview CMake
   This patch is already applied on VTK 9 or greater.
* Add comments stating that vtk and paraview patches are the same and should be modified in concert.
2024-05-21 11:05:32 -04:00
Chris Marsh
08834e2b03 Add homebrew gcc@14.1.0 patch for aarch64 darwin (#44280) 2024-05-21 15:31:36 +02:00
Massimiliano Culpo
8020a111df Demote a warning to debug message, if C compiler is not there (#44182) 2024-05-21 14:09:29 +02:00
dependabot[bot]
86fb547f7c bump pytest from 8.2.0 to 8.2.1 --- (#44282)
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 13:42:22 +02:00
Robert Underwood
b9556c7c44 Additional packages for devtools-manylinux (#44273)
Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-05-21 13:37:04 +02:00
dependabot[bot]
7bdb106b1b --- (#44281)
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-21 13:15:05 +02:00
kenche-linaro
2b191cd7f4 linaro-forge: added 24.0 version (#44292) 2024-05-21 05:13:42 -06:00
Federico Ficarelli
774f0a4e60 grpc: remove a maintainer (#44294) 2024-05-21 12:58:26 +02:00
Keita Iwabuchi
faf11efa72 Metall: add v0.26, v0.27, and v0.28 (#44284)
Co-authored-by: Keita Iwabuchi <iwabuchi1@lln.gov>
2024-05-21 12:50:44 +02:00
Massimiliano Culpo
5a99142b41 Cleanup of Apple OpenGL shim packages (#44269)
- Remove duplicated code
- Use BundlePackage as a base class
2024-05-21 12:49:18 +02:00
Harmen Stoppels
a3aca0242a grpc: forward compat bound abseil (#44290) 2024-05-21 11:48:59 +02:00
Massimiliano Culpo
72f276fab3 ASP-based solver: fix version optimization for roots (#44272)
This fixes a bug occurring when two root specs need to select
old versions, and these versions have the same penalty in the
optimization. This sometimes caused an older version to be
preferred to a more recent one.

The issue was the omission of `PackageNode` in the optimization
tuple.
2024-05-21 08:41:09 +02:00
Philipp Edelmann
21139945df rayleigh: new package (#38338)
* rayleigh: new package

* Update var/spack/repos/builtin/packages/rayleigh/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* split edit into three methods

* add comments to clarify use of configure

* rayleigh: copyright year

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-20 19:23:02 -06:00
George Young
900bd2f477 seqkit: add 2.4.0, switching to 'go build' install mechanic (#38192)
* seqkit: add 2.4.0

* seqkit: add 2.4.0, switching to 'go build' install mechanic

* seqkit: add 2.4.0, switching to 'go build' install mechanic

* seqkit: update to @2.6.1, drop deprecated version and build system

* seqkit: add @2.7.0, convert to GoPackage

* tidying

* tidying

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-05-20 18:08:57 -07:00
Scott Wittenburg
29d4a5af44 gitlab ci: fix untouched spec pruning on windows (#44279)
Use correct path separator in get_all_package_diffs for all platforms.
Ensures correct package change computation on Windows when pruning unchanged specs in Gitlab CI
2024-05-21 00:56:48 +00:00
dependabot[bot]
dd9b7ed6a7 build(deps): bump types-six in /.github/workflows/style (#44167)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.9 to 1.16.21.20240513.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-20 17:41:42 -07:00
Wouter Deconinck
09ff74be62 libsigsegv: fix patch filename extension 2024-05-21 02:12:24 +02:00
Robert Underwood
a94ebfea11 libpressio update (#44076)
* libpressio update
* fix typos in libpressio packages
* Addressed review feedback from @tldahlgren
* fix ci issues
* add missing package for SZx
* simplify variant logic and fix GPU deps
* Update var/spack/repos/builtin/packages/py-langsmith/package.py

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-20 18:12:00 -06:00
Paul R. C. Kent
8f5fe1d123 Add llvm v1816 (#44271) 2024-05-20 16:39:43 -07:00
Loris Ercole
d4fb58efa3 Update, fix serenity & serenity-libint (#43816)
* Update, fix serenity & serenity-libint
   Version 1.6.1 of `serenity` is added, some dependencies are made more
   explicit, some options are improved or fixed.
   The url of `serenity-libint` is fixed. The old url is not reachable
   anymore.
* Use upstream patch, modify cmake patch
* Update var/spack/repos/builtin/packages/serenity/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-20 16:07:25 -07:00
Tony Weaver
ce900346cc heimdall: Astronomy software package (#38328)
* heimdall: Astronomy software package.  Requires dedisp and psrdada (included as part of this commit)

* Updated packages to align with Spack's style

Minor updates based on wdconinc's comments regarding Spack's style guide

* [@spackbot] updating style on behalf of aweaver1fandm

* Minor edits to fix copyright year and dedisp install

Fixed copyright year to be 2024 instead of 2023

Removed the overridden version of install and created a preinstall function to create the missing lib and inc directories, therefore allowing the default install to run

Here is output from the spack install of heimdall showing a successful build with cuda. If necessary, I can do a spack clean and a fresh install:
./spack install heimdall +cuda cuda_arch=86
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libiconv-1.17-enpmbhsi3kztebwmpclpub2afhlbr3gy
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/xz-5.4.1-pte76kujkezxb3laqse3o4sctlbygsaw
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/zlib-1.2.13-utlfo5ltxz5v5bckirn5v3amtbxjdvwh
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/bzip2-1.0.8-x6navz7ucgfnb5xq7aelqmgd4zxsz5bs
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libmd-1.0.4-ncomhrodpdul4dm64o6b7426fhmc2u64
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/ncurses-6.4-c77h34rooycbzapxjvc27sg5td5jiwyb
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/zstd-1.5.5-eporpybumydxveg5rwtfzysrsu4eqzcv
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libffi-3.4.4-vskntokaojclfqxjfzkbyirkeogddbpx
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libxcrypt-4.4.33-gtpn32p6mxztul3c3dxzqj7gvcyh555j
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/util-linux-uuid-2.38.1-g322a5peqjaad6gl5q64cdu4qo7kvw6o
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libxml2-2.10.3-pza3kz2mtbncbbeim6rejfqkgftnf4rz
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/openssl-1.1.1t-adbquvgg4qpc3vq6jynf44qzq3gfwrv5
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/pigz-2.7-e7mxj4ya2u4a6zb4hu64g7docujmkxeb
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/libbsd-0.11.7-wh4xleivbe7wndiqt5nsehzlfrccnjcg
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/readline-8.2-hhb647bwmbcj7iwpmtetbylninfm5rxf
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/cuda-11.7.1-osue2sx5rv7dgzhsmaemydpwhyribxng
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/tar-1.34-35f5gki2ycxmy5zd7zs5tsvp3xoszxum
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/expat-2.5.0-4gizyhhqklciuyrbyinq2tdggt73gds4
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/gdbm-1.23-w3uzihtubj2iwv6es55fis6nt2q5zwlr
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/sqlite-3.40.1-m4ntzvuupsnbtkdfbz7oqpbjdlaffp2a
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/dedisp-1.0-pq6r3jnyxq6jzoygz3fp2e6jc2ojpvap
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/gettext-0.21.1-hglzdeadmgkzjb76bmemt6dnulfkrpha
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/python-3.10.10-khu36qq4p2te7jf475ewr2h7egidekfl
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/psrdada-master-by4w6mrpcfnoihlhos2jcfo2roiyagaz
[+] /home/htony/spack/opt/spack/linux-ubuntu22.04-zen3/gcc-11.3.0/heimdall-1.0-ohtdnltuhhejysshcert25h6nmuvluqp

* [@spackbot] updating style on behalf of aweaver1fandm
2024-05-20 13:39:23 -06:00
Adam J. Stewart
7cb64e465f py-pytest: add v8.2.1 (#44267)
* py-pytest: add v8.2.1
* py-pluggy: add v1.5.0
2024-05-20 11:28:59 -07:00
Pramod Kumbhar
eb70c9f5b9 caliper: add new variant to support Intel Vtune (#44147) 2024-05-20 18:27:42 +02:00
Teague Sterling
a28405700e awscli-v2: add v2.15.53, and other updates (#44258)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-05-20 18:21:09 +02:00
Adam J. Stewart
f8f4d94d7a Deprecate py-cmake and py-ninja (#44257) 2024-05-20 18:05:06 +02:00
Matt Thompson
32dfb522d6 gh: add v2.49.2 (#44231) 2024-05-20 13:51:46 +02:00
Mark W. Krentel
c61c707aa5 hpctoolkit: restrict one patch to :2022 (#44268)
Restrict the hpcrun-fmt.txt patch to :2022.  It's fixed in the code
after that, and in recent develop, some code paths have moved causing
this patch to fail.
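
The restriction described above amounts to a when= bound on the patch directive; a hedged sketch (base class and patch file name simplified for illustration):

```python
from spack.package import *


class Hpctoolkit(AutotoolsPackage):
    """Fragment for illustration; only the patch directive is shown."""

    # Before: applied unconditionally, which now fails on recent develop
    # where the patched code paths have moved.
    # patch("hpcrun-fmt.patch")

    # After: only apply where the problem still exists.
    patch("hpcrun-fmt.patch", when="@:2022")
```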
2024-05-20 01:44:28 -06:00
Harmen Stoppels
60d10848c8 gettext: no link dep on tar (#44256) 2024-05-20 09:23:46 +02:00
Adam J. Stewart
dcd6b530f9 py-matplotlib: add v3.9.0 (#44225) 2024-05-20 09:22:30 +02:00
Teague Sterling
419f0742a0 htslib, STAR: add zlib-ng conflict (#44261) 2024-05-20 09:20:19 +02:00
Jacob King
c99174798b nimrod-aai: add version 24.2 and fix url (#42464)
shas changed due to reorganization of GitLab.com repo 
into an open subgroup.
2024-05-20 09:13:33 +02:00
Adam J. Stewart
8df2a4b511 py-gdown: add new package (#44265) 2024-05-20 09:09:35 +02:00
Benjamin Meyers
c174cf6830 Add openjdk@15.0.2 (#35936) 2024-05-19 10:27:26 -06:00
Wouter Deconinck
5eebd65366 audit: disallow github.com/org/repo/pull/n/commits/hash.patch?full_index=1 (#44212)
* audit: disallow github.com/org/repo/pull/n/commits/hash.patch?full_index=1

* [@spackbot] updating style on behalf of wdconinc

* audit: fix style

* audit: github.com/o/r/pull/n/commits/sha.patch -> sha.patch

* [@spackbot] updating style on behalf of wdconinc

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit 2ecec99238.

* Revert "audit: github.com/o/r/pull/n/commits/sha.patch -> sha.patch"

This reverts commit 5bd7da2cad.

* fix: modify audit message with suggested fix

* audit: github.com/o/r/pull/n/commits/sha.patch -> /o/r/commit/sha.patch?full_index=1
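
In a package recipe, the two URL shapes look like this (a hedged example: org/repo, the PR number, the sha, and the checksum are all placeholders):

```python
from spack.package import *


class PatchUrlDemo(Package):
    """Placeholder package showing the audited patch URL forms."""

    # Flagged by the audit: per-PR commit URLs are not stable references.
    # patch(
    #     "https://github.com/org/repo/pull/123/commits/<sha>.patch?full_index=1",
    #     sha256="...",
    # )

    # Suggested form: reference the commit on the repository itself.
    patch(
        "https://github.com/org/repo/commit/<sha>.patch?full_index=1",
        sha256="0" * 64,  # placeholder checksum
    )
```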

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-19 09:30:19 -05:00
Michael Kuhn
625f5323c0 py-pyqt5-sip: add 12.13.0 and fix build with gcc@14 (#44133) 2024-05-19 08:05:18 +02:00
Wouter Deconinck
e05a32cead gaudi: don't apply patch for 38.2 (#44252) 2024-05-18 23:13:46 -06:00
Juan Miguel Carceller
c69af5d1e5 podio: cleanup recipe, remove deprecated versions and patches (#44111)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-18 15:17:54 -06:00
Miranda Mundt
1ac2ee8043 Add Pyomo 6.7.2 (#44097) 2024-05-18 10:31:57 -05:00
Teague Sterling
36af1c1c73 perl-xml-libxml: add new versions and conflicts (fixes #44253) (#44254)
* Address #44253 by adding new versions and declaring conflicts for perl-xml-libxml

* [@spackbot] updating style on behalf of teaguesterling

* Update var/spack/repos/builtin/packages/perl-xml-libxml/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-18 09:28:32 -06:00
Carlos Bederián
e2fa087002 namd: add 3.0b7 (#44198) 2024-05-18 10:15:55 -05:00
Teague Sterling
df02bfbad2 Adding new spark versions (#44250)
* Adding new spark versions (in preparation for the HAIL package)

* Adding myself as potential maintainer
2024-05-18 10:06:12 -05:00
Teague Sterling
fecb63843e yq: add new package (#44249) 2024-05-18 06:38:26 -06:00
Scott Wittenburg
b33e2d09d3 oci buildcache: handle pagination of tags (#43136)
This fixes an issue where ghcr, gitlab and possibly other container registries paginate tags by default, which violates the OCI spec v1.0, but is common practice (the spec was broken itself). After this commit, you can create build cache indices of > 100 specs on ghcr.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-18 11:57:53 +02:00
Chris Green
f8054aa21a git: bump v2.39 to 2.45; deprecate unsafe versions (#44248) 2024-05-18 03:32:06 -06:00
Valentin Volkl
8f3a2acc54 whizard: add gosam variant (#43595)
* whizard: add gosam variant

* address comments, fix compiler wrapper issue
2024-05-18 10:11:26 +02:00
Kevin Huck
d1a20908b8 ZeroSum: add new package (#44228) 2024-05-18 09:23:45 +02:00
Derek Ryan Strong
dd781f7368 perl: add v5.36.3, v5.38.2; deprecate unsafe versions (#44186) 2024-05-18 09:21:03 +02:00
dmagdavector
9bcc43c4c1 protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
* protobuf: update hash for patch needed when="@3.4:3.21"

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-17 17:50:29 -06:00
Todd Gamblin
77c83af17d docs: remove warning about repositories and package extension (#44247)
Local package repositories are very well supported and we test them extensively, so this
warning from 8 years ago can be removed from the docs.
2024-05-17 22:03:57 +00:00
Michael Kuhn
574bd2db99 netlib-scalapack: fix build with gcc@14 (#44120) 2024-05-17 22:38:46 +02:00
James Taliaferro
a76f37da96 kakoune: add v2024.05.09 (#44124) 2024-05-17 22:36:58 +02:00
Adam J. Stewart
9e75f3ec0a GDAL: add v3.9.0 (#44128) 2024-05-17 22:35:33 +02:00
Derek Ryan Strong
4d42d45897 Add latest Python versions (#44130) 2024-05-17 22:35:06 +02:00
SXS Bot
a4b4bfda73 spectre: add v2024.05.11 (#44139)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-05-17 22:28:25 +02:00
Michael Kuhn
1bcdd3a57e py-cython: add 3.0.10 (#44140) 2024-05-17 22:27:42 +02:00
Stephen Sachs
297a3a1bc9 Add mpas-model and mpich to pcluster neoverse stack (#44151)
Should build now since https://github.com/spack/spack/pull/43547 has been merged.
2024-05-17 22:00:17 +02:00
Adam J. Stewart
8d01e8c978 JAX: add v0.4.28 (#44112) 2024-05-17 21:58:44 +02:00
Wouter Deconinck
6be28aa303 pythia8: patch latest 8.311 for upstream bug (#43803)
* pythia8: prefer 8.310

* [@spackbot] updating style on behalf of wdconinc

* pythia8: filter_file to remove sed n

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit e2a3decaffbd3f464d1bd992025e1812df49f088.

* Revert "pythia8: prefer 8.310"

This reverts commit 568cb056b87129085e245d9dbef1732ee1c6c0aa.

* [@spackbot] updating style on behalf of wdconinc

* pythia8: comment for fix

* pythia8: fix style

* pythia8: filter_file with raw string because of escaped pipe

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-17 21:31:05 +02:00
Jon Rood
5e38310515 nalu-wind: fix trilinos rocm dependency (#44233) 2024-05-17 13:00:24 -06:00
wspear
ddfed65485 tau: fix lib/include paths with oneapi (#44170) 2024-05-17 18:55:27 +02:00
dependabot[bot]
2a16d8bfa8 build(deps): bump codecov/codecov-action from 4.3.1 to 4.4.0 (#44195)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.1 to 4.4.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5ecb98a3c6...6d798873df)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 18:46:17 +02:00
Jonathon Anderson
6a40a50a29 hpcviewer: Update URLs to use GitLab release assets (#44129) 2024-05-17 18:43:13 +02:00
Carlos Bederián
b2924f68c0 blis: add v1.0 (#44199) 2024-05-17 18:39:42 +02:00
Rocco Meli
41ffe36636 spglib: add v2.4.0 (#44202) 2024-05-17 18:33:40 +02:00
Chris Marsh
24edc72252 docs: show phase signature for builders (#44067) 2024-05-17 18:16:31 +02:00
Raffaele Solcà
83b38a26a0 New versions nvpl-blas and nvpl-lapack (#44244) 2024-05-17 17:46:25 +02:00
Nathalie Furmento
914d785e3b starpu: add v1.4.6 (#44203) 2024-05-17 08:17:01 -06:00
Garth N. Wells
f99f642fa8 fenicsx: remove deprecated versions (#44223) 2024-05-17 07:35:21 -06:00
Seth R. Johnson
e0bf3667e3 cli11: new version and enable library (#44204) 2024-05-17 15:08:28 +02:00
Andrew-Dunning-NNL
a24ca50fed Fix broken link in docs (#44217) 2024-05-17 12:58:11 +00:00
Fabien Bruneval
51e9f37252 MOLGW: add v3.3 (#44241)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 06:56:05 -06:00
Rémi Lacroix
453900c884 libxc: Fix compilation after distribution changes. (#44206)
The release tarballs are not available anymore, which means autoconf, automake and libtool are always needed.

The NVHPC-specific patches no longer make sense, since the patched files are not distributed in the new tar files.
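
In package.py terms this boils down to unconditional build dependencies; a hedged fragment (base class simplified, version ranges omitted):

```python
from spack.package import *


class Libxc(AutotoolsPackage):
    """Fragment for illustration; only the build dependencies are shown."""

    # With no pre-generated configure script in the distributed tar files,
    # the autotools toolchain is required for every build, not just for
    # git checkouts.
    depends_on("autoconf", type="build")
    depends_on("automake", type="build")
    depends_on("libtool", type="build")
```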
2024-05-17 14:52:03 +02:00
Alberto Invernizzi
4696459d2d libcatalyst: add missing python dependencies (#44224) 2024-05-17 14:45:26 +02:00
Fabien Bruneval
ad1e3231e5 libcint: add v6.1.2, v5.5.0 (#44239)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 05:45:26 -06:00
Richard Berger
2ef7eb1826 exodusii: only use MPI fortran compiler if +fortran (#44211) 2024-05-17 13:01:09 +02:00
Dom Heinzeller
fe86019f9a ecflow: versions up to 5.11.4 require boost 1.84 or earlier (#44181) 2024-05-17 12:53:48 +02:00
Harmen Stoppels
9dbb18219f build_environment.py: deal with rpathing identical packages (#44219)
When multiple gcc-runtime packages exist in the same link sub-dag, only rpath
the latest.
2024-05-17 12:29:56 +02:00
Mikael Simberg
451a977de0 hpx: change default of max_cpu_count variant to auto (#44220) 2024-05-17 11:54:48 +02:00
Jack S. Hale
e604929a4c FEniCS: add more maintainers (#44240) 2024-05-17 03:03:21 -06:00
dependabot[bot]
9d591f9f7c build(deps): bump actions/checkout from 4.1.5 to 4.1.6 (#44234)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.5 to 4.1.6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](44c2b7a8a4...a5ac7e51b4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 10:57:08 +02:00
Matt Thompson
f8ad915100 esmf: add v8.6.1 (#44229) 2024-05-17 10:49:47 +02:00
Garth N. Wells
cbbabe6920 fenics-dolfinx: add spdlog dependency (#44237) 2024-05-17 10:48:55 +02:00
John W. Parent
81fe460194 Gitlab CI: Windows Configs (#43967)
Add support for Gitlab CI on Windows

This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows 
infrastructure).

* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2024-05-16 17:00:02 -06:00
Paul Kuberry
b894f996c0 trilinos: catch kokkos inconsistency with trilinos (#44209)
* trilinos: catch kokkos inconsistency with trilinos

* trilinos: update kokkos version range
2024-05-16 13:36:11 -06:00
John W. Parent
1ce09847d9 Prefer llnl.util.symlink.readlink to os.readlink (#44126)
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).

Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
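
The preference in miniature (a hedged sketch; llnl.util.symlink.readlink is the wrapper named above):

```python
from llnl.util.symlink import readlink


def resolve_link(path: str) -> str:
    # Avoid os.readlink(path): on Windows it may return a \\?\-prefixed
    # longpath, which breaks string matching in archiving, buildcaches,
    # view regeneration, and so on.
    # Spack's wrapper strips the prefix and otherwise behaves exactly like
    # os.readlink.
    return readlink(path)
```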
2024-05-16 10:56:04 -07:00
Thomas Madlener
722d401394 gaudi: Don't apply the patch if it has already landed upstream (#44180) 2024-05-16 17:15:39 +02:00
Howard Pritchard
e6f04d5ef9 py-matplotlib: qualify when to do a post install (#44191)
* py-matplotlib: qualify when to do a post install

Older versions of py-matplotlib don't seem to have some of the
files that the post install step is trying to install.
Looks like the files first appeared in 3.6.0 and later.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>

* Change install paths for older matplotlib

---------

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-16 16:18:44 +02:00
Mosè Giordano
b8e3ecbf00 suite-sparse: improve setting of the libs property (#44214)
on some distros it is in lib64/
2024-05-16 11:07:18 +02:00
Todd Gamblin
d189387c24 bugfix: add arg to write_line_break() in spack_yaml (#42727)
`ruamel`'s `Emitter.write_line_break()` method takes an extra argument that we forgot to
implement in our custom emitter.
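
A hedged sketch of the fix (the subclass name is illustrative; the real custom emitter lives in spack_yaml):

```python
from ruamel.yaml.emitter import Emitter


class CustomEmitter(Emitter):
    # Before: def write_line_break(self): ...
    # ruamel calls this with an optional argument, so the override must
    # accept and forward it, like the base class does.
    def write_line_break(self, data=None):
        super().write_line_break(data)
```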
2024-05-15 19:25:06 -07:00
Andrew Lister
9e96ddc5ae CoinHSL: Support the Meson build system and add new release (#43610)
* Add maintainer and fix linting
* allow for fewer deps in archive
* use meson for archive packages
* Fix version spec and f-string
* fix blas dependency links
* Add new release to spack
* Fix checksums for latest release
2024-05-15 16:48:23 -06:00
dmagdavector
543bd189af iperf3: updated versions from 3.6 to 3.16 (#44152)
* Update version of iperf3 from 3.6 to 3.16

Spack currently only explicitly has version 3.6 of the iPerf3 package
(out of ESnet / LBNL). This makes the latest version, 3.16, the default
and adds some other versions (found in some Linux distros, for possible
compatibility purposes).

* iperf3: update to 3.17; update 3.6 hash for new url

* protobuf: update hash for patch needed when="@3.4:3.21"

* Revert "protobuf: update hash for patch needed when="@3.4:3.21""

This reverts commit 4d168d0b27.
2024-05-15 15:48:15 -06:00
John W. Parent
43291aa723 Cdash reporting timeout (#44213)
* Add timeout to cdash reporter PUT request

Add cdash timeout everywhere
Correct mock responder api

* Style

* brief doc
2024-05-15 15:41:51 -04:00
Alec Scott
d0589285f7 unmaintained pkgs: bump versions 2024-05-10 (#44131)
* unmaintained pkgs: bump versions 2024-05-10

* openblas: fix satisfies syntax

* pixman: add autotools dependencies

* [@spackbot] updating style on behalf of alecbcs

* pixman: revert

* chapel: revert changes in favor of other PR

* openblas: revert due to failing tests

* Address review feedback for flint, biobam2, and pango

* pango: add version comment about v2.0

* numactl: revert changes due to ppc4le bug

* flint: remove duplicate configure arg

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* openvkl, rkcommon: remove commented maintainers template

* flint: fix style

---------

Co-authored-by: alecbcs <alecbcs@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-15 11:04:52 -07:00
afzpatel
d079aaa083 enable tensorflow-2.14 and 2.16 for spack built ROCm (#44095)
* initial commit to enable tensorflow-2.14 for spack built ROCm

* fix style errors

* modify hipcc patch

* updates for rocm 6.1

* updates for tf-rocm-enhanced 2.16

* fix styling

* fix styling

* add patch for 2.16

* add patch for 2.16

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* update rocm enhanced version names

* changes for rocm-enhanced version name change

* fix styling

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-15 17:06:35 +02:00
Paolo
6c65977e0d Fix gromacs installation with SVE. Issue 44062 (#44183)
* Fix gromacs installation with SVE. Issue 44062

* [@spackbot] updating style on behalf of paolotricerri

* Remove `neoverse_n2` target

We have removed the neoverse_n2 target as its detection is more involved
compared to neoverse_v*.
2024-05-15 07:22:47 -06:00
Carlos Bederián
1b5d786cf5 gromacs: add 2024.2, 2023.5 (#44197) 2024-05-15 07:21:55 -06:00
Ben Morgan
4cf00645bd VecGeom: new version 1.2.8 (#44179) 2024-05-15 07:50:48 -04:00
Jon Rood
e9149cfc3c nalu-wind: fix mistake (#44188) 2024-05-14 23:13:13 -06:00
Jon Rood
a5c8111076 exawind: updates to package to allow mixed device (#44159)
* exawind: updates to package to allow mixed device

* Style.

* Remove ninja variants.

* Add conflict for amr-wind+hypre with mixed device.

* Relax amr-wind~hypre requirement.

* Move runtime variables to nalu-wind.

* Update suggestions.

* Remove umpire.
2024-05-14 12:26:07 -06:00
Alec Scott
c3576f712d pkg-config: support apple-clang@15: (#44007) 2024-05-14 17:24:36 +02:00
Alec Scott
410e6a59b7 rust: fix v1.78.0 instructions (#44127) 2024-05-14 08:14:34 -07:00
Carlos Bederián
bd2b2fb75a python: add 3.10.14, 3.9.19, 3.8.19 (#44162) 2024-05-14 17:03:45 +02:00
Zachary Newell
7ae318efd0 Added NCCL version 2.21.5-1 (#44158) 2024-05-14 03:34:21 -06:00
Derek Ryan Strong
73e9d56647 Update bash 5.2 patches (#44172) 2024-05-14 09:08:39 +02:00
Derek Ryan Strong
f87a752b63 Add zsh 5.8.1 and 5.9 (#44173) 2024-05-14 09:08:08 +02:00
Derek Ryan Strong
ae2fec30c3 Add newer fish versions (#44174) 2024-05-14 09:07:49 +02:00
Derek Ryan Strong
1af5564cbe Add file 5.45 (#44175) 2024-05-14 09:07:25 +02:00
Derek Ryan Strong
a8f057a701 Add man-db 2.12.0 and 2.12.1 (#44176) 2024-05-14 09:07:10 +02:00
Michael B Kuhn
7f3dd38ccc amr-wind: add latest versions and correct waves2amr details (#44099)
* add latest versions and correct waves2amr details

* update commit associated with v2.1.0
2024-05-13 14:21:25 -06:00
Adam J. Stewart
8e9adefcd5 ML CI: update image (#43751)
* ML CI: update image

* Use main branch

* Use tagged version
2024-05-13 21:43:52 +02:00
jdomke
d276f9700f fujitsu-mpi package: help CMake find wrappers (#43979)
Set MPI_C_COMPILER etc. env vars when building with fujitsu-mpi, which
override CMake logic. Without using these variables to explicitly
request the Fujitsu MPI wrappers, the builtin CMake logic is unwilling
to use these wrappers unless the Fujitsu compiler is used, but they
should be used for %clang as well.

This avoids patching CMake module files like in #16864, primarily to
avoid the possibility of altering behavior for specs that do not use
%fj or ^fujitsu-mpi.
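
A hedged sketch of that approach (the wrapper names mpifcc/mpiFCC/mpifrt are assumptions about the Fujitsu toolchain, not taken from the recipe):

```python
from spack.package import *


class FujitsuMpi(Package):
    """Fragment for illustration; only the dependent-environment hook is shown."""

    def setup_dependent_build_environment(self, env, dependent_spec):
        # MPI_<lang>_COMPILER overrides CMake's FindMPI selection, so the
        # Fujitsu wrappers are used even when the dependent builds with %clang.
        env.set("MPI_C_COMPILER", self.prefix.bin.mpifcc)
        env.set("MPI_CXX_COMPILER", self.prefix.bin.mpiFCC)
        env.set("MPI_Fortran_COMPILER", self.prefix.bin.mpifrt)
```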
2024-05-13 11:15:45 -07:00
Harmen Stoppels
4f111659ec glibc: detect from "Free Software Foundation" not "gnu" (#44154)
which should be more generic
2024-05-13 20:11:27 +02:00
Dave Keeshan
eaf330f2a8 yosys: add v0.41 (#44153) 2024-05-13 09:40:42 -07:00
Julien Cortial
cdaeb74dc7 cdt: Add version 1.4.1 (#44155) 2024-05-13 09:38:25 -07:00
Alberto Sartori
fbaac46604 justbuild: add version 1.3.0 (#44148) 2024-05-13 09:35:29 -07:00
Jon Rood
7f6210ee90 nalu-wind: updates (#44046) 2024-05-13 10:28:24 -06:00
Wouter Deconinck
63f6e6079a gaudi: upstream patch when @38.1 for missing #include <list> (#44121)
* gaudi: upstream patch when @38.1 for missing #include <list>

* gaudi: apply list patch for all versions
2024-05-13 07:37:31 -06:00
Greg Becker
d4fd6caae0 spack uninstall: improve error message for dependent environment (#44149) 2024-05-13 14:58:13 +02:00
Mikael Simberg
fd3c18b6fd whip: Add 0.3.0 (#44146)
Co-authored-by: Mikael Simberg <simbergm@cscs.ch>
2024-05-13 03:09:32 -06:00
Adam J. Stewart
725f427f25 spack checksum: do not add expand=False to wheels (#44118) 2024-05-13 10:01:47 +02:00
Paul R. C. Kent
32b3e91ef7 llvm: add 18.1.5, 18.1.4 (#44123) 2024-05-12 16:28:32 -07:00
bk
b7e4602268 R: common package updates for 4.4.0 (#44088)
* r-ggplot2: v3.5.1, r-haven: v2.5.4, r-jsonlite: v1.8.8, r-pkgload: v1.3.4, r-vctrs: v0.6.5

* r-scales: v1.3.0
2024-05-12 10:49:38 -07:00
Stephen Sachs
4a98d4db93 Add applications to aws-pcluster-* stacks (#43901)
* Add openfoam to aws-pcluster-neoverse_v1 stack

* Add more apps to aws-pcluster-x86_64_v4 stack

* Remove WRF while hdf5 cannot build in buildcache at the moment

* Update comment

* Add more apps for aws-pcluster-neoverse_v1 stack

* Remove apps that currently do not build

* Disable those packages that won't build

* Modify syntax such that correct cflags are used

* Changing syntax again to what works with other packages

* Fix overriding packages.yaml entry for gettext

* Use new `prefer` and `require:when` clauses to clarify intent

* Use newer spack version to install intel compiler

This removes the need for patches and makes sure the `prefer` directives in
`package.yaml` are understood.

* `prefer` not strong enough, need to set compilers

* Revert "Use newer spack version to install intel compiler"

This reverts commit ecb25a192c.

Cannot update the spack version to install intel compiler as this changes the
compiler hash but not the version. This leads to incompatible compiler paths. If
we update this spack version in the future make sure the compiler version also updates.

Tested-by: Stephen Sachs <stesachs@amazon.com>

* Remove `prefer` clause as it is not strong enough for our needs

This way we can safely go back to installing the intel compiler with an older
spack version.

* Prefer gcc or oneapi to build gettext

* Pin gettext version compatible with system glibc-headers

* relax gettext version requirement to enable later versions

* oneapi cannot build older gettext version
2024-05-12 10:48:02 -07:00
Juan Miguel Carceller
9d6bf373be whizard: Add version 3.1.4 (#43045)
* whizard: Add a patch to fix builds with pythia8 >= 8.310

* Add whizard 3.1.4 and update accordingly

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:44:56 -07:00
Alex Richert
cff35c4987 octave: use pcre2 for @8: (#42636)
* octave: use pcre2 for @8:

* Add 'pcre2' variant to octave to control pcre vs. pcre2

* Update var/spack/repos/builtin/packages/octave/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alex Richert <alexander.richert@noaa.gov>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-12 10:44:31 -07:00
Juan Miguel Carceller
d594f84b8f fastjet: Add a cxxstd variant (#44072)
* fastjet: Add a cxxstd variant

* Use f-strings

* Add multi=False

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:42:56 -07:00
Wouter Deconinck
f8f01c336c clang: support cxx20_flag and cxx23_flag (#43438)
* clang: support cxx20_flag and cxx23_flag

* clang: coverage test cxx{}_flag and c{}_flag additions
2024-05-12 07:45:59 -07:00
Todd Gamblin
12e3665df3 Bump version on develop to v0.23dev0 (#44137) 2024-05-11 18:01:50 +02:00
Todd Gamblin
fa4778b9fc changelog: add changes form 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:45:14 +02:00
Harmen Stoppels
66d297d420 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
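
A hedged sketch of retry logic that has to cover all three shapes (the real default_retry decides per-status which HTTPErrors are worth retrying):

```python
import time
import urllib.error


def default_retry(fn, retries=3, sleep=1.0):
    def wrapper(*args, **kwargs):
        for attempt in range(retries):
            try:
                return fn(*args, **kwargs)
            # 1. HTTPError for HTTP status failures
            # 2. URLError, with the underlying exception in e.reason
            # 3. a bare TimeoutError escaping from getresponse()
            except (urllib.error.HTTPError, urllib.error.URLError, TimeoutError):
                if attempt == retries - 1:
                    raise
                time.sleep(sleep)

    return wrapper
```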
2024-05-11 15:43:32 +02:00
James Taliaferro
56251c11f3 qpdf: new package (#44066)
* New package: qpdf

* Update var/spack/repos/builtin/packages/qpdf/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix format of cmake_args

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-10 14:32:57 -06:00
James Smillie
40bf9a179b New package: py-nuitka (#44079) 2024-05-10 12:07:02 -07:00
John W. Parent
095aba0b9f Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure all links written into tarfiles generated from Spack prefixes do not contain symlinks pointing outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-05-10 13:00:40 -05:00
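A minimal sketch, independent of the buildcache code itself, of the invariant the commit above enforces: a symlink written into a tarball generated from a Spack prefix must resolve inside that prefix (relative links only; absolute links that escape are rejected). The paths are illustrative:

```python
import os


def link_stays_inside_prefix(link_path: str, target: str, prefix: str) -> bool:
    """Return True if ``target`` (the symlink destination, possibly relative to
    the link's directory) resolves to a path inside ``prefix``."""
    if not os.path.isabs(target):
        target = os.path.join(os.path.dirname(link_path), target)
    resolved = os.path.normpath(target)
    prefix = os.path.normpath(prefix)
    return resolved == prefix or resolved.startswith(prefix + os.sep)


# A relative link inside the prefix is acceptable; an absolute link outside is not.
assert link_stays_inside_prefix("/opt/spack/pkg/bin/foo", "../lib/libfoo.so", "/opt/spack/pkg")
assert not link_stays_inside_prefix("/opt/spack/pkg/bin/foo", "/usr/lib/libfoo.so", "/opt/spack/pkg")
```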
John W. Parent
4270136598 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running spack on windows, a path beginning with `C://...` is a valid path.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix typo in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* Review

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-05-10 13:00:13 -05:00
Juan Miguel Carceller
f73d7d2dce dd4hep: cleanup recipe, remove deprecated versions and patches (#44110)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:40:38 -07:00
Juan Miguel Carceller
567566da08 edm4hep: cleanup recipe, remove deprecated versions and patches (#44113)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:39:06 -07:00
Chris Marsh
30a9ab749d libtheora: Remove unneeded autoreconf section (#44063)
* Remove autoreconf section that was causing issues with libtool mismatch. Fixes issue #43498
2024-05-10 10:27:20 -04:00
Mikael Simberg
8160a96b27 dla-future: Add 0.4.1 (#44115)
* dla-future: Add 0.4.1

* Use ninja as generator in dla-future
2024-05-10 15:21:47 +02:00
Mikael Simberg
10414d3e6c pika: Add 0.25.0 (#44114) 2024-05-10 14:37:31 +02:00
Stephen Sachs
1d96c09094 libhugetlbfs: Fix the build with an update to 2.24 (#44059)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-05-10 10:50:36 +02:00
Harmen Stoppels
e7112fbc6a PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:47:37 +02:00
Tom Scogland
b79761b7eb flux-sched: set the version if the ver file is missing (#44068)
* flux-sched: set the version if the ver file is missing

problem: flux-sched needs a version, it normally gets this from a
release tarball or from git tags, but if using a source archive or a git
clone without tags the version is missing

solution: set the version through cmake based on the version spack sees
when the version file is missing

* Update var/spack/repos/builtin/packages/flux-sched/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-09 11:50:42 -07:00
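A minimal sketch of the approach described in the commit above: when the version file is missing, pass the version Spack already knows to CMake. The class, URLs, and the `FLUX_SCHED_VER` cache variable name are hypothetical, not taken from the real flux-sched package:

```python
from spack.package import *


class FluxSchedSketch(CMakePackage):
    """Illustrative only: forward Spack's idea of the version to CMake."""

    homepage = "https://example.invalid"
    git = "https://example.invalid/flux-sched.git"

    version("master", branch="master")

    def cmake_args(self):
        # A release tarball carries a version file; a tagless clone does not,
        # so tell CMake what version Spack sees.
        return [self.define("FLUX_SCHED_VER", str(self.spec.version))]
```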
Christopher Christofi
3381899c69 miniforge3: add new versions with mamba installation (#43995)
* miniforge3: add new version with mamba installation
* fix styling
* update maintainers
* Fix variant directive ordering
2024-05-09 12:48:40 -06:00
Massimiliano Culpo
c7cf5eabc1 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 18:50:15 +02:00
Juan Miguel Carceller
d88fa5cf8e pythia8: add a cxxstd variant (#44077)
* pythia8: add a cxxstd variant
* Add multi=False, fix regexp

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-09 09:32:38 -07:00
Richard Berger
2ed0e3d737 kokkos-nvcc-wrapper: add missing versions (#44089) 2024-05-09 09:23:16 -07:00
Julien Cortial
506a40cac1 mmg: Build & install shared libraries, add 5.7.2 & 5.7.3 (#43749) 2024-05-09 16:59:53 +02:00
Rémi Lacroix
447739fcef Universal Ctags: Add version 6.1.20240505.0 (#44058) 2024-05-09 15:55:41 +02:00
Massimiliano Culpo
e60f6f4a6e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 14:28:17 +02:00
Vanessasaurus
7df35d0da0 package: libsodium update URL to GitHub (#44090)
Problem: the libsodium download endpoints are not reliable,
they fail multiple times a day for several of our automated
pipelines.
Solution: use the GitHub releases and branches.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2024-05-09 14:26:02 +02:00
Luc Berger
71b035ece1 Kokkos: adding new release 4.3.01 (#44086) 2024-05-09 03:33:35 -06:00
Ashwin Kumar Karnad
86a134235e Octopus 14.1 : Add new version hash (#44083)
* octopus: add hash for new version
* octopus: add --enable-silent-deprecation flag when @14.1:
2024-05-09 03:18:22 -06:00
Jon Rood
24cd0da7fb amr-wind: add missing variants (#44078)
* amr-wind: add missing variants

* Fix copy and paste.

* waves2amr is only available on amr-wind@2.0

* Style.

* Use satisfies.
2024-05-09 02:38:16 -06:00
alvaro-sch
762833a663 Orca (standalone) 5.0.4 (#44011)
* orca: added latest version 5.0.4
* Fixed openmpi versions
2024-05-08 15:36:27 -07:00
Ken Raffenetti
636d479e5f mpich: Add license (#43821) 2024-05-08 12:07:46 -07:00
Sreenivasa Murthy Kolam
f2184f26fa hipsparselt: new package (#44080)
* Initial commit for adding hipsparselt recipe
* correct the style errors
* remove master version
2024-05-08 12:03:21 -07:00
Harmen Stoppels
e1686eef7c gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:04 +02:00
Adam J. Stewart
314893982e JAX: add v0.4.27, NCCL variant (#44071) 2024-05-08 11:36:24 -07:00
Jake Koester
9ab6c30a3d add flag to turn off building tests and examples (#44075) 2024-05-08 11:33:49 -07:00
Adam J. Stewart
ddf94291d4 py-jsonargparse: add v4.28.0 (#44074) 2024-05-08 11:31:07 -07:00
Jon Rood
5d1038c512 hypre: add cublas and rocblas variants (#44038)
* Add cublas and roblas variants to hypre.

* Fix mistakes.

* Remove newline.

* Address suggestions.
2024-05-08 12:04:11 -06:00
renjithravindrankannath
2e40c88d50 Bump up the version for ROCm-6.1.0 (#43843)
* Bump up the version for ROCm-6.1.0
* Style check error correction and patch files
* Update for rocm-openmp-extras 6.1
* updating rocm-openmp-extras and math libraries with 6.1
* Style check error correction
* updating hipcub, hipfort & miopen-hip for 6.1
* Rocm-openmp-extras and some mathlib updates
* iAudit error correction and rocmlir update
* Updating dependency on suite-sparse and adding path
* Style check error correction
* hip-tensor 6.1.0 update
* rdc 6.1 needs grpc 1.59.1
* rvs 6.1 updates and patch
2024-05-08 10:26:30 -07:00
dependabot[bot]
2bcba57757 build(deps): bump actions/checkout from 4.1.4 to 4.1.5 (#44045)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.4 to 4.1.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](0ad4b8fada...44c2b7a8a4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-08 09:22:48 -07:00
shanedsnyder
37330e5e2b darshan-runtime,darshan-util,py-darshan: add darshan-3.4.5 packages (#43989)
* add darshan-3.4.5 packages, update URLs
* py-setuptools version switches for py-darshan
* more py-darshan test dependencies
* try a conditional importlib_resources dependency
2024-05-08 09:06:24 -07:00
snehring
b4411cf2db iq-tree: add new version delete duplicate package (#44043)
* iq-tree: add new version delete duplicate package
* Replace iqtree2 dependency
   orthofinder: replace iqtree2 with iq-tree@2
   py-phylophlan: replace iqtree2 with iq-tree@2
2024-05-08 09:02:06 -07:00
Harmen Stoppels
65d1ae083c gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:18:12 +02:00
andriish
0b8faa3918 cernlib: add 2023.08.14.0-free (#40211)
* Update CERNLIB

Update CERNLIB

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of andriish

* cernlib: merge crypto->crypt patches

* cernlib: depends_on xbae when `@2023:`

* cernlib: patch for Xbae and Xm link order (DSO)

* [@spackbot] updating style on behalf of andriish

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 06:58:07 -05:00
Alec Scott
f077c7e33b unmaintained pkg bump: 2024-05-05 (#44021)
* unmaintained pkgs: bump versions

* Changes following review feedback

* glm: update cmake dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 05:53:49 -06:00
Massimiliano Culpo
9d7410d22e gcc: add v14.1.0 (#44053) 2024-05-08 12:14:44 +02:00
Tamara Dahlgren
e295730d0e mold: Replace maintainer (#44047)
* Remove maintainer at his request

* Add msimberg as the maintainer
2024-05-08 09:29:43 +02:00
Todd Gamblin
868327ee14 r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 08:56:37 +02:00
Massimiliano Culpo
f5430b16bc Bump removal version in deprecation messages (#44064) 2024-05-08 08:49:14 +02:00
Tamara Dahlgren
2446695113 Remove dead environment creation code (#44065) 2024-05-07 22:49:06 -06:00
snehring
e0c6cca65c orca: switching to xz archives, removing old version (#44035) 2024-05-07 15:18:53 -07:00
Auriane R
84ed4cd331 Add transformer engine package (#43982)
* Add py-flash-attn@2.4.2
* Add py-transfomer-engine package

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-07 14:56:34 -07:00
Juan Miguel Carceller
f6d50f790e gaudi: Add version 38.1 (#44055)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-07 14:20:02 -07:00
Tim Haines
d3c3d23d1e Dyninst: update compiler requirements (#44033)
As of 13.0.0, Dyninst can now build with any Linux compiler.
2024-05-07 15:03:17 -06:00
Vicente Bolea
33752c2b55 fix(adios2): fix missing stdind include in 2.7 (#43786) 2024-05-07 15:52:20 -05:00
Harmen Stoppels
26759249ca gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-07 18:59:12 +02:00
dependabot[bot]
8b4cbbe7b3 build(deps): bump pygments from 2.17.2 to 2.18.0 in /lib/spack/docs (#44044)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.2...2.18.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-07 18:55:04 +02:00
Richarda Butler
be71f9fdc4 Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
   spack:
     specs: []
     concretizer:
         unify: true
     include_concrete:
     - /path/to/environment1
     - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected after the environment
is re-concretized from the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

    Co-authored-by: Kayla Butler <butler59@llnl.gov>
    Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
    Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
    Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-07 09:32:40 -07:00
Massimiliano Culpo
05c1e7ecc2 Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-07 17:56:29 +02:00
Massimiliano Culpo
f7afd67a26 Remove spurious ASP debug lines (#44051) 2024-05-07 11:28:38 +02:00
psakievich
d22bdc1c4e certs: fix interpolation and disallow relative paths (#44030) 2024-05-07 11:16:32 +02:00
Mikael Simberg
540f9eefb7 mold: Add 2.30.0 and 2.31.0 (#44034) 2024-05-07 10:41:13 +02:00
Massimiliano Culpo
2db5bca778 Warn users of the future removal of platform=cray (#43980) 2024-05-07 10:30:23 +02:00
Harmen Stoppels
bcd05407b8 llnl.util.tty.color._force_color: init in global scope (#44036)
Currently SPACK_COLOR=always is not respected in the build process on
macOS, because the global `_force_color` is re-evaluated in global scope
during module setup, where it is always `None`.

So, move global init bits from main.py to the module itself.
2024-05-07 09:49:46 +02:00
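A minimal sketch of the pattern the commit above moves to, not the actual `llnl.util.tty.color` module: evaluate the environment once in the module's own global scope instead of in `main.py`, so a build subprocess that imports the module still honors `SPACK_COLOR=always` (the accepted values always/never/auto are assumed here):

```python
import os


def _color_from_environment():
    value = os.environ.get("SPACK_COLOR")
    return value if value in ("always", "never", "auto") else None


# Evaluated once, at import time, inside this module -- not in main.py.
_force_color = _color_from_environment()


def color_enabled(tty_attached: bool) -> bool:
    if _force_color == "always":
        return True
    if _force_color == "never":
        return False
    return tty_attached
```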
John W. Parent
b35ec605fe python-venv: use correct python name for which call (#44048) 2024-05-07 09:32:44 +02:00
Massimiliano Culpo
0a353abc42 Fix issues in packages with the new release of archspec (#44037) 2024-05-07 09:20:34 +02:00
Massimiliano Culpo
e178c58847 Respect requests when filtering reused specs (#44042)
Some specs which were excluded from reuse
are currently added back to the solve when
we traverse dependencies of other reusable
specs.

This fixes the issue by keeping track of what
we can explicitly reuse.
2024-05-07 09:06:51 +02:00
Massimiliano Culpo
d7297e67a5 Maintenance for the "bootstrap" workflow in CI (#44031)
Removed a lot of duplication.

Fixed an issue in containers, leading to false negative
2024-05-07 08:49:53 +02:00
snehring
ee8addf04a neic-finitefault: adding new package and dependencies (#44032)
* neic-finitefault: adding new package
* py-obspy: adding new package
* py-okada-wrapper: adding new package
* py-pygmt: adding new package
* py-pyrocko: adding new package

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-06 15:45:32 -07:00
Jon Rood
fd3cd3a1c6 amr-wind: updates and fixes (#43993)
* Updates and fixes for AMR-Wind.

* Add versions and update openfast constraints.

* Style.

* Fix version.
2024-05-06 13:31:29 -06:00
Garth N. Wells
e585aeb883 (py)-fenics-dolfinx: version update (#44002)
* Update DOLFINx to v0.8
* Fix typo
2024-05-06 12:11:02 -07:00
James Taliaferro
1f43384db4 Add LSP client extension for Kakoune (#44000)
* new package: kakoune-lsp
* blacken
* add myself as a maintainer
2024-05-06 11:18:45 -07:00
Adam J. Stewart
814b328fca py-torchmetrics: add v1.4.0 (#44027) 2024-05-06 10:21:20 -07:00
Harmen Stoppels
125206d44d python: always use a venv (#40773)
This commit adds a layer of indirection to improve build isolation with 
and without external Python, as well as usability of environment views.

It adds `python-venv` as a dependency to all packages that `extends("python")`, 
which has the following advantages:

1. Build isolation: only `PYTHONPATH` is considered in builds, not 
   user / system packages
2. Stable install layout: fixes the problem on Debian, RHEL and Fedora where 
   external / system python produces `bin/local` subdirs in Spack install prefixes. 
3. Environment views are Python virtual environments (and if you add 
   `py-pip` things like `pip list` work)

Views work whether they're symlink, hardlink or copy type.

This commit additionally makes `spec["python"].command` return 
`spec["python-venv"].command`. The rationale is that packages in repos we do 
not own do not pass the underlying python to the build system, which could still 
result in incorrectly computed install layouts.

Other attributes like `libs`, `headers` should be on `python` anyways and need no change.
2024-05-06 16:17:35 +02:00
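A minimal sketch of how the indirection described above looks from a dependent package's side; the package name, URL, and checksum are hypothetical. After this change, `spec["python"].command` resolves to the `python-venv` interpreter:

```python
from spack.package import *


class PyExampleTool(Package):
    """Illustrative only: an extension that invokes the Python interpreter."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/example-tool-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum

    extends("python")

    def install(self, spec, prefix):
        # Resolves to the python-venv interpreter, keeping the PYTHONPATH-only
        # build isolation intact even when python itself is an external.
        python = spec["python"].command
        python("setup.py", "install", f"--prefix={prefix}")
```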
John W. Parent
a081b875b4 proj: patch for modern cmake (#43780) 2024-05-06 16:01:10 +02:00
Harmen Stoppels
a16ee3348b Do not cache indices in Gitlab (#44029) 2024-05-06 15:54:21 +02:00
Massimiliano Culpo
d654d6b1f4 Remove Fedora 37 and 38, Ubuntu 18 from CI (#44006) 2024-05-06 15:51:45 +02:00
Harmen Stoppels
9b4ca0be40 clingo bootstrap: remove 3.12 patch and concretizer workarounds (#44028) 2024-05-06 15:00:41 +02:00
Harmen Stoppels
dc71dcfdc2 bootstrap: lazy bootstrapping of clingo and GnuPG (#44026)
Currently bootstrapping from source fails because clingo requires gnupg
requires clingo.

This commit stops eager bootstrapping. We don't need `patchelf` nor `gnupg`
generally. They're bootstrapped when needed.
2024-05-06 14:02:39 +02:00
Greg Becker
1f31c3374c External package detection for compilers (#43464)
This creates shared infrastructure for compiler packages to implement the 
detailed search capabilities from the `spack compiler find` command for the 
`spack external find` command.

After this commit, `spack compiler find` can be replaced with 
`spack external find --tag compiler`, with the exception of mixed toolchains.
2024-05-06 10:33:33 +02:00
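A minimal sketch of the detection hooks that `spack external find` builds on, applied to a compiler-like package as described above; the class, regexes, and version flag are illustrative, not the real gcc package:

```python
from spack.package import *


class GccLikeSketch(Package):
    """Illustrative only: external detection hooks for a compiler package."""

    homepage = "https://example.invalid"

    # Makes the package discoverable via `spack external find --tag compiler`
    tags = ["compiler"]

    # Executable names the external search looks for on PATH
    executables = [r"^gcc(-\d+)?$", r"^g\+\+(-\d+)?$"]

    @classmethod
    def determine_version(cls, exe):
        # Ask the binary for its version; return None if it cannot be parsed.
        output = Executable(exe)("-dumpfullversion", output=str, error=str)
        return output.strip() or None
```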
Massimiliano Culpo
27aeb6e293 Update vendored archspec to v0.2.4 (#44005) 2024-05-06 10:20:56 +02:00
Harmen Stoppels
715214c1a1 spack env create <env>: dir if dir-like (#44024)
A named env cannot contain `.` or `/`.

So when a user runs `spack env create ./here` do not error but treat it
as `spack env create -d ./here`.

Also fix help string of `spack env create`, which seems to have been
copied from `activate` incorrectly.
2024-05-06 02:00:23 -06:00
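A minimal sketch, not Spack's CLI code, of the dispatch described above: an argument containing `.` or `/` cannot be a named environment, so it is treated as a directory environment, as if `-d` had been passed:

```python
import os


def create_env(name_or_path, create_named, create_in_dir):
    # Anything with a path separator or a dot cannot be a valid env name.
    looks_like_path = os.sep in name_or_path or "." in name_or_path
    if looks_like_path:
        return create_in_dir(name_or_path)   # like `spack env create -d <dir>`
    return create_named(name_or_path)        # like `spack env create <name>`


# `spack env create ./here` would take the directory branch:
create_env("./here", create_named=print, create_in_dir=lambda p: print("dir env at", p))
```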
dependabot[bot]
b471d62dbd build(deps): bump black from 24.4.0 to 24.4.2 in /lib/spack/docs (#43878)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-06 09:55:43 +02:00
Massimiliano Culpo
a5f62889ca archspec: add v0.2.4 (#44022) 2024-05-06 09:51:01 +02:00
Alec Scott
2a942d98e3 pkgconf: add v2.2.0 (#44008) 2024-05-06 09:31:41 +02:00
Alec Scott
4a4077d4ef rust: add v1.78.0 (#44012) 2024-05-06 09:30:34 +02:00
Alec Scott
c0fcccc232 go: add v1.22.2 (#44013) 2024-05-06 09:30:13 +02:00
Alec Scott
0b2cbfefce ncurses: add v6.5 (#44015) 2024-05-06 09:30:00 +02:00
Massimiliano Culpo
c499514322 libgpg-error: add v1.49 (#44023) 2024-05-06 09:28:31 +02:00
Alec Scott
ae392b5935 pcre2: add v10.43 (#44020) 2024-05-06 09:25:11 +02:00
Alec Scott
62e9bb5d51 libunistring: add v1.2 (#44018) 2024-05-06 09:24:12 +02:00
Alec Scott
6cd948184e libidn2: add v2.3.7 (#44017) 2024-05-06 09:23:50 +02:00
Alec Scott
44ff24f558 openssl: v3.3.0 (#44019) 2024-05-06 09:22:15 +02:00
Alec Scott
c657dfb768 gettext: v0.22.5 (#44014) 2024-05-05 06:57:42 -06:00
Pranav Sivaraman
f2e410d95a lazygit: add v0.41.0 (#43903) 2024-05-04 17:18:30 -06:00
dependabot[bot]
df443a38d6 build(deps): bump actions/checkout from 4.1.2 to 4.1.4 (#43831)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.2 to 4.1.4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](9bb56186c3...0ad4b8fada)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:49:26 -07:00
dependabot[bot]
74fe498cb8 build(deps): bump mypy from 1.9.0 to 1.10.0 in /lib/spack/docs (#43834)
Bumps [mypy](https://github.com/python/mypy) from 1.9.0 to 1.10.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/1.9.0...v1.10.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:48:49 -07:00
Carlos Bederián
5f13a48bf2 mesa: add 23.3.6 (#43736) 2024-05-04 15:46:39 -07:00
Jon Rood
c4824f7fd2 ccls: add new versions (#43923) 2024-05-04 15:42:07 -07:00
dependabot[bot]
49a8634584 build(deps): bump codecov/codecov-action from 4.3.0 to 4.3.1 (#43945)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](84508663e9...5ecb98a3c6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-04 15:39:42 -07:00
Seth Bromberger
eac5ea869f tmux: add v3.4 and missing yacc build dep (#43975) 2024-05-04 22:33:19 +02:00
Juan Miguel Carceller
f5946c4621 pythia8: Add a gzip variant and filter some compiler wrapper paths (#43807)
* Add a gzip variant and filter some compiler wrapper paths

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-04 09:33:44 -05:00
Harmen Stoppels
8564ab19c3 fix iconv from libc (#43996)
* fix iconv from libc

* fix args in glib
2024-05-04 10:30:43 +02:00
Scott Wittenburg
aae7a22d39 gitlab: release branch pipelines rebuild what changed (#43990) 2024-05-04 10:11:48 +02:00
eugeneswalker
09cea230b4 e4s-alc: add v1.0.2 (#44001) 2024-05-03 22:24:08 -06:00
wspear
a1f34ec58b Add a gmake dependency for TAU (#43870)
We discovered a case where the system gmake can get confused by spack's build environment. Let's use a spack-provided gmake for consistent builds.
2024-05-03 20:15:40 -07:00
Auriane R
4d7cd4c0bf Add py-flash-attn@2.4.2 (#43960) 2024-05-03 16:40:46 -07:00
Christopher Christofi
4adbfa3835 fix package permissions docs link for gaussian packages (#43998) 2024-05-03 17:35:39 -06:00
Adam J. Stewart
8a1b69c1d3 Modernize py-torch-geometric and dependencies (#43984)
* Modernize py-torch-geometric and dependencies
* Forgot one maintainer
2024-05-03 16:21:41 -07:00
Matthew Thompson
a1d69f8661 hdf package: fix building with apple-clang@15: (#43964) 2024-05-03 16:11:55 -07:00
Carlos Bederián
e05dbc529e globalarrays package: add missing dependencies for armci=ofi/openib (#43971) 2024-05-03 16:08:29 -07:00
jdomke
99d33bf1f2 hpl: linking against fujitsu ssl2 if available (#43978)
* linking against fujitsu ssl2 if available

---------

Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:08:12 -07:00
jdomke
bd1918cd71 adding clang compiler checks which behaves exactly like aocc (#43976)
Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:01:14 -07:00
snehring
2a967c7df4 graphicsmagick package: add version 1.3.43 (#43977) 2024-05-03 15:59:10 -07:00
downloadico
7596aac958 New package: py-pylith (#43987) 2024-05-03 15:56:21 -07:00
Frédéric Simonis
c73ded8ed6 preCICE: increase baseline versions for next ver (#43985) 2024-05-03 15:55:46 -07:00
Satish Balay
df1d783035 sowing: add @master (#43991) 2024-05-03 15:37:39 -07:00
Massimiliano Culpo
47051c3690 Add a default provider for armci (#43997) 2024-05-03 23:48:58 +02:00
Owen Solberg
3fd83c637d Add checksum for R v4.4.0 (#43973)
Co-authored-by: Owen Solberg <owen.solberg@sana.com>
2024-05-03 14:47:53 -07:00
Jon Rood
ef5afb66da openfast: updates to package (#43994) 2024-05-03 15:22:12 -06:00
snehring
ecc4336bf9 r-asreml: adding new package (#43970)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-03 12:20:58 -07:00
Harmen Stoppels
d2ed217796 concretizer args: --fresh-roots == --reuse-deps (#43988)
Since reuse is the default now, `--reuse-deps` can be confusing, as it
technically does not imply roots are fresh.

So add `--fresh-roots`, which is also easier to discover when running
`spack concretize --fre<tab>`
2024-05-03 12:12:36 -07:00
Dom Heinzeller
272c7c069a Add -fPIC to hdf-eos2 builds (#43794)
* Add -fPIC to hdf-eos2 builds
* Use self.compiler.cc_pic_flag instead of -fPIC
* Fix style in var/spack/repos/builtin/packages/hdf-eos2/package.py
2024-05-03 11:38:33 -07:00
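A minimal sketch of the final approach in the commit above: append `self.compiler.cc_pic_flag` instead of hard-coding `-fPIC`, so compilers that spell the flag differently still work. The class, URL, and checksum are placeholders:

```python
from spack.package import *


class HdfEos2Sketch(AutotoolsPackage):
    """Illustrative only: add the compiler's PIC flag to C builds."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/hdf-eos2-3.0.tar.gz"

    version("3.0", sha256="0" * 64)  # placeholder checksum

    def setup_build_environment(self, env):
        # cc_pic_flag expands to -fPIC for GCC/Clang, but to the right spelling
        # for compilers that differ.
        env.append_flags("CFLAGS", self.compiler.cc_pic_flag)
```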
Jeff Hammond
23f16041cd NWChem ARMCI-MPI variant and optional TCE builds (#43883)
* WIP NWChem with ARMCI-MPI and TCE optional
* rename armci-mpi to armcimpi
* add version for git
* add ARMCI-MPI support to NWChem package
   EXTERNAL_ARMCI_PATH needs to be set to the ARMCI-MPI package install prefix.
   I could not get it from spec["armcimpi"], but it worked to use the dependent build environment method to export a generic environment variable to use instead.
* i suppose i can maintain NWChem as well
* check rejects option that dozens of packages use
   i do not have time to fight with this nonsense
   Run . share/spack/setup-env.sh
   ==> Error: armcimpi version 'master' has extra arguments: 'branch'
   Valid arguments for a url fetcher are:
       'url', 'sha256', 'md5', 'sha1', 'sha224', 'sha384', 'sha512', and 'checksum'
   Error: Process completed with exit code 1.
* style
* ARMCI-MPI needs depends_on; the rest seems fixed
* fix ARMCI selection per review feedback from @zzzoom
   https://github.com/spack/spack/pull/43883#discussion_r1585014147
* address reviewer feedback from @yizeyi18
   https://github.com/spack/spack/pull/43883#discussion_r1587228084
   elaborate on what +extratce does, in terms of the NWChem build environment variables,
   some of which are documented on https://nwchemgit.github.io/TCE.html.
* style
* Update var/spack/repos/builtin/packages/nwchem/package.py

---------

Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Carlos Bederián <4043375+zzzoom@users.noreply.github.com>
2024-05-03 11:13:54 -07:00
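A minimal sketch of the "dependent build environment" workaround mentioned above: the armcimpi package exports its install prefix into every dependent's build environment, so nwchem can point `EXTERNAL_ARMCI_PATH` at it. The exported variable name `ARMCIMPI_PREFIX` and the URLs are hypothetical:

```python
from spack.package import *


class Armcimpi(Package):
    """Illustrative only: export the install prefix to dependents."""

    homepage = "https://example.invalid"
    git = "https://example.invalid/armci-mpi.git"

    version("master", branch="master")

    def setup_dependent_build_environment(self, env, dependent_spec):
        # Visible in the build environment of every package depending on us,
        # e.g. nwchem can set EXTERNAL_ARMCI_PATH from it.
        env.set("ARMCIMPI_PREFIX", self.prefix)
```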
Frank Willmore
e2329adac0 Update package.py for vacuumms 1.2.0 (#43948)
* updating vacuumms for new release
* formatting
* re-order versions and remove triple quote
2024-05-03 11:06:15 -07:00
jalcaraz
4ec788ca12 [TAU package] Update with +rocprofv2. Updated some tests. (#43937)
* [TAU package] Update with +rocprofv2. Updated some tests.

Updated with +rocprofv2 flag. Only works with rocm-core >= 6.0.0

Can be tested with (Omnia):
spack install tau@master +rocm+rocprofv2 %gcc@11
This needs the latest commit in our local repository, which can be picked up by modifying the "git = " line or by waiting until the public repository is updated.


In the case that tests cause issues when building TAU, there is the flag:
disable_tests = False

The rocm test is disabled by default, as the PR regarding tests loading dependencies is not solved (PR#43682).

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-05-03 08:33:13 -07:00
Andrew W Elble
c1cea9d304 py-tensorflow: fix aarch64 build (#43852)
* py-tensorflow: fix aarch64 build

* [@spackbot] updating style on behalf of aweits

* patch format

* change patch strategy, actually get the logic correct in
the patch

* only patch 2.16.1 onwards for aarch64

* __CUDACC__ not __NVCC__

* !(defined(__NVCC__) && defined(__CUDACC__))

---------

Co-authored-by: aweits <aweits@users.noreply.github.com>
2024-05-03 15:46:13 +02:00
Massimiliano Culpo
5c96e67bb1 Make runtimes depend on libc only on linux (#43981) 2024-05-03 15:40:02 +02:00
Marc T. Henry de Frahan
7008bb6335 Add rosco package (#43822)
* Add rosco package

* format

* remove unused

* Add in other versions

* start adding some patches

* finish adding the patches
2024-05-03 01:58:35 -06:00
Adam J. Stewart
14561fafff py-torch: set TORCH_CUDA_ARCH_LIST globally for dependents (#43962) 2024-05-03 09:11:59 +02:00
Massimiliano Culpo
89bf1edb6e Spec.satisfies: fix a bug with concrete spec from JSON (#43968)
Fix a bug triggered by missing a virtual on some transitive edge, in a subdag of a pure build dependency.
2024-05-02 22:16:02 -04:00
Todd Gamblin
cc85dc05b7 docs: re-enable google analytics (#43974)
We recently switched to using the new ReadTheDocs with "addons". That includes its own
analytics, which is nice, but we also want to continue using our GA4 analytics.

Adding GA4 is no longer supported by RTD, so we have to add it manually.

- [x] re-add the gtag to all pages, manually

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-02 21:56:19 -04:00
Adam J. Stewart
ae171f8b83 py-torch: conflict with newer cuda (#43969) 2024-05-02 18:47:33 -07:00
snehring
578dd18b34 minimap2: adding version 2.28 (#43972)
* minimap2: adding version 2.28
* minimap2: adding maintainer

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-02 18:45:59 -07:00
Garth N. Wells
a7a51ee5cf py-fenics-ffcx: update to v0.8 (#43929)
* Update FFCx and UFCx to v0.8
* Syntax fix
2024-05-02 09:53:31 -07:00
Stephanie Labasan Brink
960cc90667 variorum: add branch dev and version 0.8.0 (#43829)
* variorum: add branch dev and version 0.8.0
* formatting
2024-05-02 09:50:05 -07:00
Alex Richert
dea44bad8b grads: fix libpng and g2c deps (#43909)
* grads: fix libpng and g2c deps
* Update package.py
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:42:15 -07:00
Richard Berger
e37870ff43 rdma-core: add versions 51.0, 50.0 and 49.1 (#43926) 2024-05-02 09:41:23 -07:00
Sajid Ali
3751642a27 mold: add new versions (#43931) 2024-05-02 09:39:09 -07:00
Nils Vu
0f386697c6 spectre: add versions, update constraints (#43936) 2024-05-02 09:38:06 -07:00
Alex Richert
67ce103b2c Update prod-util recipe (#43940)
* update prod-util recipe
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:24:39 -07:00
Alex Richert
a8c9fa0e45 g2tmpl: add v1.12.0 (#43941) 2024-05-02 09:23:42 -07:00
Alex Richert
b56a133fce update grib-util recipe (#43939) 2024-05-02 09:22:03 -07:00
Seth R. Johnson
f0b3d33145 Celeritas: new version 0.4.3 (#43949) 2024-05-02 09:04:30 -07:00
Adam J. Stewart
32564da9d0 rust: add conflict for older GCC (#43958) 2024-05-02 08:36:06 -07:00
Stephen Hudson
8f2faf65dc libEnsemble: add v1.3.0 (#43944) 2024-05-02 08:35:06 -07:00
eugeneswalker
1d59637051 e4s ci: add py-amrex (#43904) 2024-05-02 08:57:26 -06:00
jdomke
97dc353cb0 libc: detect ARM flavor (#43959)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-05-02 15:59:29 +02:00
wspear
beebe2c9d3 Accept later pythons for tau@2.33+ (#43911)
* Accept later pythons for tau@2.33+

* separate forward compat bounds
2024-05-02 07:13:21 -06:00
Harmen Stoppels
2eb7add8c4 gcc: write builtin specs before modifications (#43956) 2024-05-02 06:40:10 -06:00
Harmen Stoppels
a9fea9f611 nghttp2: add diffutils (#43954) 2024-05-02 14:22:53 +02:00
Harmen Stoppels
9b62a9c238 gitlab ci: cache user cache (#43952) 2024-05-02 12:06:30 +02:00
wspear
f7eb0ccfc9 Revert #43701 (#43935)
PR #43701 is broken, preventing scorep from being installed. As explained in #43700, the issue is internal to scorep and cannot be fixed in the package, at least not by the method attempted here.
2024-05-02 09:56:21 +02:00
Adrien Bernede
a0aa35667c Ignore external packages when pushing to buildcache automatically (--autopush) (#43930) 2024-05-02 09:50:10 +02:00
Luc Berger
b1d4fd14bc Nalu: adding support for Trilinos 14.2.0 for Nalu 1.6.0 (#43857) 2024-05-02 01:10:27 -06:00
John W. Parent
7e8415a3a6 Windows: auto-add WGL/SDK as externals (#43752)
Adds a pre-concretization check for the Windows SDK and WGL (Windows
GL) packages as non-buildable externals.

This is a redo of https://github.com/spack/spack/pull/43459, but makes
sure to modify the configuration scope outside of the bootstrap scope:
whichever is highest-precedence in the user's environment at the time
the concretization runs, which should either be an env scope or the
~ scope.

Adds pytest fixture mocking the check for WGL and WSDK as if they were
present.
2024-05-02 01:05:03 -06:00
Simon Pintarelli
7f4f42894d update sirius package.py (#43872) 2024-05-02 08:23:10 +02:00
Massimiliano Culpo
4e876b4014 Allow more control over which specs are reused (#42782)
This PR gives users finer control over which specs are reused during concretization.

The value of the `concretizer:reuse` config option now can take an object with the following properties:
- `roots`: true if reusing roots, false if reusing just dependencies
- `exclude`: list of constraints used to select reusable specs 
- `include`: list of constraints used to select reusable specs 
- `from`: allows to select the sources of reused specs

### Examples

#### Reuse only specs compiled with GCC
```yaml
concretizer:
  reuse:
    roots: true
    include:
    - "%gcc"
```

#### `openmpi` must be used from externals, and it must be the only external used
```yaml
concretizer:
  reuse:
    roots: true
    from:
    - type: local
      exclude:
      - "openmpi"
    - type: buildcache
      exclude:
      - "openmpi"
    - type: external
      include:
      - "openmpi"
```
2024-05-01 23:05:26 -04:00
Wouter Deconinck
77a8a4fe08 abseil-cpp: new version 20240116.2 (#43666) 2024-05-01 17:06:57 -07:00
Thomas Bouvier
597e5a4e5e Update py-nvidia-dali to v1.36.0 (#43928)
* py-nvidia-dali: add v1.36.0
* fix: do not expand downloaded wheel
2024-05-01 16:35:43 -07:00
MatthewLieber
3c31c32f62 Adding sha for 7.4 release of OSU Micro Benchmarks (#43899)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-05-01 16:27:52 -07:00
Weiqun Zhang
3a93a716e4 amrex: add v24.05 (#43938) 2024-05-01 16:17:05 -07:00
Filippo Spiga
82229a0784 Adding EXTRAE 4.1.2 (#43907) 2024-05-01 15:55:54 -07:00
Axel Huebl
5d846a69d1 ADIOS2: Campaign Variant (#43906)
With v2.10+, ADIOS added a campaign manager. This is auto-enabled
if SQLite3 is found.

Add explicit control for it now and disable it by default, to avoid
picking up system dependencies or bloating the default ADIOS2
dependencies. The feature is also not yet mature enough to be enabled by default:
https://github.com/ornladios/ADIOS2/issues/4148
2024-05-01 15:50:45 -07:00
Stephen Herbener
d21aa1cc12 Removed use of mpi wrappers for fms and mapl package.py scripts (#43726)
These were causing builds to fail on MacOS, and CMake appears to handle the build fine without the mpi wrappers.
2024-05-01 15:41:30 -07:00
Jake Koester
7896ff51f6 bumped up version of kokkos-kernels (#43920) 2024-05-01 12:20:51 -06:00
Chris Mc
5849a24a74 jwt-cpp: add v0.7.0 (#41894)
* jwt-cpp: add version 0.7.0

* Update var/spack/repos/builtin/packages/jwt-cpp/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-01 12:20:35 -06:00
Alex Leute
38c49d6b82 py-simpy: New package (#43497)
* pypi build of py-simpy

* [py-simpy] toml explicitly called out

* py-simpy: New version

* py-setuptools-scm: Added new version

* py-setuptools-scm: add url_for_version

Because versions @:7 have an underscore (setuptools_scm) in the URL, and
newer versions have a dash (setuptools-scm).

* py-setuptools-scm: fix flake8 line too long

* py-simpy: Fix hash

---------

Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
2024-05-01 12:06:52 -06:00
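A minimal sketch of the `url_for_version` hook mentioned in the py-setuptools-scm bullet above; the base URL and checksums are placeholders, and the only point is the underscore/dash switch at version 8:

```python
from spack.package import *


class PySetuptoolsScmSketch(PythonPackage):
    """Illustrative only: per-version URL naming."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/setuptools-scm/setuptools_scm-7.1.0.tar.gz"

    version("8.0.4", sha256="0" * 64)  # placeholder checksum
    version("7.1.0", sha256="0" * 64)  # placeholder checksum

    def url_for_version(self, version):
        # @:7 tarballs are named setuptools_scm-x.y.z, newer ones setuptools-scm-x.y.z
        name = "setuptools_scm" if version < Version("8") else "setuptools-scm"
        return f"https://example.invalid/setuptools-scm/{name}-{version}.tar.gz"
```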
fpruvost
0d8900986d qrmumps: 3.1 (#43913) 2024-05-01 12:00:25 -06:00
dependabot[bot]
62554cebc4 build(deps): bump black in /.github/workflows/style (#43879)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:37:54 +02:00
Wouter Deconinck
067155cff5 containers: add ubuntu 24.04 (#43881)
* containers: add ubuntu 24.04

* containers: use python3-boto3 pkg instead of pip install
2024-05-01 13:37:13 +02:00
Wouter Deconinck
08e68d779f ci: update upload-artifact to v4 (in build-containers) (#43880) 2024-05-01 13:35:12 +02:00
Adam J. Stewart
05b04cd4c3 Update PyTorch ecosystem (#43823)
* Update PyTorch ecosystem

* Don't expose protobuf namespace to other packages, breaks torchtext build

* Revert "Don't expose protobuf namespace to other packages, breaks torchtext build"

This reverts commit 50a4b08f65.
2024-05-01 13:30:46 +02:00
dependabot[bot]
be48f762a9 build(deps): bump pytest from 8.1.1 to 8.2.0 in /lib/spack/docs (#43908)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.1.1 to 8.2.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.1.1...8.2.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:29:22 +02:00
Axel Huebl
de5b4840e9 Add quantiphy (#43894)
* Add quantiphy

Add https://github.com/KenKundert/quantiphy .

* Simplify Version Range
2024-04-30 15:26:35 -07:00
Harmen Stoppels
20f9884445 curl: perl is temporarily required (#43921) 2024-04-30 23:32:06 +02:00
Harmen Stoppels
deb78bcd93 re2c: depends on gmake (#43916)
regressed in 650a668a9d
2024-04-30 23:30:36 +02:00
Garth N. Wells
06239de0e9 py-fenics-ufl: add new version (#43848)
* Bump version to 2024.1.0
* Bump to .post0
* Dependency fix
* Format file
* Run black on package file
2024-04-30 14:10:55 -07:00
Garth N. Wells
1f904c38b3 Add version 1.9.2. (#43838) 2024-04-30 12:19:31 -07:00
Wouter Deconinck
f2d0ba8fcc PackageStillNeededError: add pkg that needs spec to exception msg (#43845)
* PackageStillNeededError: add pkg that needs spec to exception msg
* PackageStillNeededError: f-string with short fmt and hash
* PackageStillNeededError: split long string
2024-04-30 12:11:47 -07:00
Satish Balay
49d3eb1723 petsc, py-petsc4py: add v3.21.1 (#43887) 2024-04-30 10:07:33 -07:00
Dom Heinzeller
7c5439f48a Update crtm-fix and crtm from JCSDA fork (#43855) 2024-04-30 10:07:16 -07:00
Harmen Stoppels
7f2cedd31f hack: drop glibc and musl in old concretizer (#43914)
The old concretizer creates a cyclic graph when expanding virtuals for
`iconv`, which is a bug. This hack drops glibc and musl as possible
providers for `iconv` in the old concretizer to work around it.
2024-04-30 13:55:48 +02:00
Harmen Stoppels
d47951a1e3 glibc: provides iconv (#43897)
`iconv` is a bit of weird virtual because the only shared API between
`glibc` and `libiconv` is:

```
iconv
iconv_open
iconv_close
```

whereas `libiconv` has further symbols [iconvctl](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconvctl.3.html), [iconv_open_into](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconv_open_into.3.html), and an `iconv` executable and `libcharset.so`. Packages that need those will have to do `depends_on("[virtuals=iconv] libiconv")`.
2024-04-30 07:40:00 +02:00
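A minimal sketch of the dependency syntax quoted above, for a hypothetical package that needs more than the three shared symbols (e.g. `iconvctl` or the `iconv` executable) and therefore must pin the libiconv provider:

```python
from spack.package import *


class NeedsFullIconv(AutotoolsPackage):
    """Illustrative only: force libiconv as the iconv provider."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/needs-full-iconv-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum

    # Accepting any iconv provider would also allow glibc, which only offers
    # iconv/iconv_open/iconv_close, so pin the provider to libiconv:
    depends_on("[virtuals=iconv] libiconv")
```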
Teague Sterling
f2bd0c5cf1 Fix duckdb version handling again (#43762)
* Added package to build Ollama
* Update package.py
   Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
* We can now use OVERRIDE_GIT_DESCRIBE to set the version in DuckDB
* Update duckdb/package.py
   - use f-string
   - fix version specs to address inconsistencies
* Update package.py
   Fix spec definitions further
* [@spackbot] updating style on behalf of teaguesterling

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-29 18:51:04 -07:00
one
4362382223 Update package.py (#43764) 2024-04-29 18:49:49 -07:00
Nathalie Furmento
ba4859b33d starpu: add release 1.4.5 and dependancy for building starpupy (#43810) 2024-04-29 18:48:37 -07:00
Carlos Bederián
e8472714ef ucc: add 1.3.0 (#43828) 2024-04-29 18:45:00 -07:00
Mikael Simberg
ee6960e53e boost: Add 1.85.0 (#43788)
* boost: Add 1.85.0
* Add conflict for Boost 1.85.0 stacktrace change
2024-04-29 18:41:38 -07:00
Nathalie Furmento
dad266c955 starpu: add branch 1.4 (#43837) 2024-04-29 18:38:43 -07:00
Garth N. Wells
7a234ce00a (py-)fenics-basix: update for new version (#43841)
* Bump version to 2024.1.0
* Bump py-basix version
* Revert UFL change
* Python version update
2024-04-29 18:11:36 -07:00
Loris Ercole
a0c2ed97c8 Update scine-qcmaquis (#43817)
`scine-qcmaquis` is updated with a version 3.1.4.
The option to build the OpenMolcas interface is added, and some
dependencies are clarified.
2024-04-29 16:38:29 -07:00
Rémi Lacroix
a3aa5b59cd dotnet-core-sdk: Fix environment setup (#43842)
The "DOTNET_CLI_TELEMETRY_OPTOUT" environment variable should be defined when using the product, not when installing it (the installation phase is just extract the files anyway).

Also use `~` instead of `-` to check for the variant, and fix the second argument of `env.set`, which should also be a string.
2024-04-29 14:47:37 -07:00
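A minimal sketch of the fix described above: set the opt-out in the run environment rather than at install time, test the variant with `~`, and pass the value to `env.set` as a string. The variant name `telemetry`, URL, and checksum are hypothetical:

```python
from spack.package import *


class DotnetCoreSdkSketch(Package):
    """Illustrative only: run-time environment setup for a binary SDK."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/dotnet-sdk-8.0.tar.gz"

    version("8.0", sha256="0" * 64)  # placeholder checksum

    variant("telemetry", default=False, description="Allow .NET CLI telemetry")

    def install(self, spec, prefix):
        # Installation just unpacks the SDK; no environment variable needed here.
        install_tree(".", prefix)

    def setup_run_environment(self, env):
        if self.spec.satisfies("~telemetry"):
            env.set("DOTNET_CLI_TELEMETRY_OPTOUT", "1")
```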
Zach Jibben
f7dbb59d13 Update Petaca (#43849)
* Add recent petaca releases
* Add Petaca 24.04
2024-04-29 14:42:11 -07:00
Wouter Deconinck
0df27bc0f7 nopayloadclient: new package (#43853) 2024-04-29 14:16:57 -07:00
Massimiliano Culpo
877e09dcc1 Delete leftover file (#43869)
This file is not needed anymore, was introduced in #19688
2024-04-29 22:45:41 +02:00
Adam J. Stewart
c4439e86a2 Prettier: add new package (#43866) 2024-04-29 13:39:52 -07:00
Pramod Kumbhar
aa00dcac96 gprofng-gui: new package (#43873) 2024-04-29 13:38:48 -07:00
Adam J. Stewart
4c9a946b3b py-keras: add v3.3.3 (#43885) 2024-04-29 13:30:52 -07:00
Wouter Deconinck
0c6e6ad226 xbae: new package (#43889) 2024-04-29 13:05:56 -07:00
Adam J. Stewart
bf8f32443f py-einops: add v0.8.0 (#43890) 2024-04-29 12:54:33 -07:00
Wouter Deconinck
c2eef8bab2 fontconfig: depends_on gperf when 2.11.1: (#43892) 2024-04-29 11:56:59 -07:00
afzpatel
2df4b307d7 hipblaslt: new package (#43846)
* initial commit to add hipblaslt package
* remove master and update patch
* add docstring comment
2024-04-29 11:48:03 -07:00
Ryan Marcellino
3c57440c10 openjdk: add v21_35 (#40699)
* openjdk: add v21+35

* add provides java@21

* Update var/spack/repos/builtin/packages/openjdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-04-29 10:51:17 -05:00
Harmen Stoppels
3e6e9829da compiler.py: fix early return (#43898) 2024-04-29 08:53:27 -06:00
Gregory Becker
859745f1a9 Run audits on windows
Add debug log for external detection tests. The debug log
is used to print which test is being executed.

Skip version audit on Windows where appropriate
2024-04-29 14:13:10 +02:00
1326 changed files with 18424 additions and 11929 deletions

View File

@@ -17,33 +17,51 @@ concurrency:
jobs:
# Run audits on all the packages in the built-in repository
package-audits:
runs-on: ${{ matrix.operating_system }}
runs-on: ${{ matrix.system.os }}
strategy:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
system:
- { os: windows-latest, shell: 'powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}' }
- { os: ubuntu-latest, shell: bash }
- { os: macos-latest, shell: bash }
defaults:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml]
- name: Setup for Windows run
if: runner.os == 'Windows'
run: |
python -m pip install --upgrade pywin32
- name: Package audits (with coverage)
if: ${{ inputs.with_coverage == 'true' }}
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) -d audit externals
coverage run $(which spack) -d audit externals
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' }}
if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits

View File

@@ -1,7 +1,8 @@
#!/bin/bash
set -ex
set -e
source share/spack/setup-env.sh
$PYTHON bin/spack bootstrap disable github-actions-v0.4
$PYTHON bin/spack bootstrap disable spack-install
$PYTHON bin/spack -d solve zlib
$PYTHON bin/spack $SPACK_FLAGS solve zlib
tree $BOOTSTRAP/store
exit 0

View File

@@ -13,118 +13,22 @@ concurrency:
cancel-in-progress: true
jobs:
fedora-clingo-sources:
distros-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
container: ${{ matrix.image }}
strategy:
matrix:
image: ["fedora:latest", "opensuse/leap:latest"]
steps:
- name: Install dependencies
- name: Setup Fedora
if: ${{ matrix.image == 'fedora:latest' }}
run: |
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
tree ~/.spack/bootstrap/store/
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
steps:
- name: Install dependencies
- name: Setup OpenSUSE
if: ${{ matrix.image == 'opensuse/leap:latest' }}
run: |
# Harden CI by applying the workaround described here: https://www.suse.com/support/kb/doc/?id=000019505
zypper update -y || zypper update -y
@@ -133,15 +37,9 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Setup repo
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -151,77 +49,100 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-sources:
runs-on: macos-latest
clingo-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install cmake bison@2.7 tree
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: "3.12"
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
export PATH=/usr/local/opt/bison@2.7/bin:$PATH
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-binaries:
runs-on: ${{ matrix.macos-version }}
gnupg-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
macos-version: ['macos-11', 'macos-12']
runner: [ 'macos-13', 'macos-14', "ubuntu-latest" ]
steps:
- name: Install dependencies
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
brew install tree gawk
sudo rm -rf $(command -v gpg gpg2)
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: sudo rm -rf $(command -v gpg gpg2 patchelf)
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap clingo
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
export PATH="$old_path"
fi
fi
# NOTE: test all pythons that exist, not all do on 12
done
ubuntu-clingo-binaries:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Setup repo
- name: Bootstrap GnuPG
run: |
git --version
. .github/workflows/setup_git.sh
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
from-binaries:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: |
3.8
3.9
3.10
3.11
3.12
- name: Set bootstrap sources
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
- name: Bootstrap clingo
run: |
set -ex
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
set -e
for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
echo "Testing $ver_dir"
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
@@ -236,122 +157,9 @@ jobs:
exit 1
fi
done
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
useradd spack-test && mkdir -p ~spack-test
chown -R spack-test . ~spack-test
- name: Setup repo
shell: runuser -u spack-test -- bash {0}
run: |
git --version
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-binaries:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-sources:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install gawk tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack -d gpg list
tree ~/.spack/bootstrap/store/
# [1] Distros that have patched git to resolve CVE-2022-24765 (e.g. Ubuntu patching v2.25.1)
# introduce breaking behavior, so we have to set `safe.directory` in gitconfig ourselves.
# See:
# - https://github.blog/2022-04-12-git-security-vulnerability-announced/
# - https://github.com/actions/checkout/issues/760
# - http://changelogs.ubuntu.com/changelogs/pool/main/g/git/git_2.25.1-1ubuntu3.3/changelog

View File

@@ -45,19 +45,18 @@ jobs:
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:9'],
[rockylinux8, 'linux/amd64,linux/arm64', 'rockylinux:8'],
[rockylinux9, 'linux/amd64,linux/arm64', 'rockylinux:9'],
[fedora37, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:37'],
[fedora38, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:38'],
[fedora39, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:40']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -89,9 +88,9 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@a8a3f3ad30e3422c9c7b888a15615d19a852ae32
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
@@ -122,3 +121,14 @@ jobs:
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
merge-dockerfiles:
runs-on: ubuntu-latest
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
pattern: dockerfiles_*
delete-merged: true

View File

@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -77,13 +77,8 @@ jobs:
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
secrets: inherit
windows:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks ]
uses: ./.github/workflows/windows_python.yml
secrets: inherit
all:
needs: [ windows, unit-tests, bootstrap ]
needs: [ unit-tests, bootstrap ]
runs-on: ubuntu-latest
steps:
- name: Success

View File

@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

View File

@@ -1,7 +1,7 @@
black==24.4.0
black==24.4.2
clingo==5.7.1
flake8==7.0.0
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.9
types-six==1.16.21.20240513
vermin==1.6.0

View File

@@ -51,7 +51,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
@@ -100,7 +100,7 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -141,7 +141,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version
@@ -160,7 +160,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -195,10 +195,10 @@ jobs:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [macos-latest, macos-14]
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -223,8 +223,39 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Run unit tests on Windows
windows:
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -70,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -1,83 +0,0 @@
name: windows
on:
workflow_call:
concurrency:
group: windows-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Command Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage
- name: Build Test
run: |
spack compiler find
spack -d external find cmake ninja
spack -d install abseil-cpp

View File

@@ -1,3 +1,48 @@
# v0.21.2 (2024-03-01)
## Bugfixes
- Containerize: accommodate nested or pre-existing spack-env paths (#41558)
- Fix setup-env script, when going back and forth between instances (#40924)
- Fix using fully-qualified namespaces from root specs (#41957)
- Fix a bug when a required provider is requested for multiple virtuals (#42088)
- OCI buildcaches:
- only push in parallel when forking (#42143)
- use pickleable errors (#42160)
- Fix using sticky variants in externals (#42253)
- Fix a rare issue with conditional requirements and multi-valued variants (#42566)
## Package updates
- rust: add v1.75, rework a few variants (#41161,#41903)
- py-transformers: add v4.35.2 (#41266)
- mgard: fix OpenMP on AppleClang (#42933)
# v0.21.1 (2024-01-11)
## New features
- Add support for reading buildcaches created by Spack v0.22 (#41773)
## Bugfixes
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)
- Improve the error message for deprecated preferences (#41075)
- Fix MSVC preview version breaking clingo build on Windows (#41185)
- Fix multi-word aliases (#41126)
- Add a warning for unconfigured compiler (#41213)
- environment: fix an issue with deconcretization/reconcretization of specs (#41294)
- buildcache: don't error if a patch is missing, when installing from binaries (#41986)
- Multiple improvements to unit-tests (#41215,#41369,#41495,#41359,#41361,#41345,#41342,#41308,#41226)
## Package updates
- root: add a webgui patch to address security issue (#41404)
- BerkeleyGW: update source urls (#38218)
# v0.21.0 (2023-11-11)
`v0.21.0` is a major feature release.

View File

@@ -32,7 +32,7 @@
Spack is a multi-platform package manager that builds and installs
multiple versions and configurations of software. It works on Linux,
macOS, and many supercomputers. Spack is non-destructive: installing a
macOS, Windows, and many supercomputers. Spack is non-destructive: installing a
new version of a package does not break existing installations, so many
configurations of the same package can coexist.

View File

@@ -22,4 +22,4 @@
#
# This is compatible across platforms.
#
exec /usr/bin/env spack python "$@"
exec spack python "$@"

View File

@@ -144,3 +144,5 @@ switch($SpackSubCommand)
"unload" {Invoke-SpackLoad}
default {python "$Env:SPACK_ROOT/bin/spack" $SpackCMD_params $SpackSubCommand $SpackSubCommandArgs}
}
exit $LASTEXITCODE

View File

@@ -1,16 +0,0 @@
# -------------------------------------------------------------------------
# This is the default configuration for Spack's module file generation.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/modules.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/modules.yaml
# -------------------------------------------------------------------------
modules: {}

View File

@@ -18,6 +18,7 @@ packages:
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
D: [ldc]
daal: [intel-oneapi-daal]
@@ -37,10 +38,9 @@ packages:
lapack: [openblas, amdlibflame]
libc: [glibc, musl]
libgfortran: [ gcc-runtime ]
libglx: [mesa+glx, mesa18+glx]
libglx: [mesa+glx]
libifcore: [ intel-oneapi-runtime ]
libllvm: [llvm]
libosmesa: [mesa+osmesa, mesa18+osmesa]
lua-lang: [lua, lua-luajit-openresty, lua-luajit]
luajit: [lua-luajit-openresty, lua-luajit]
mariadb-client: [mariadb-c-client, mariadb]

lib/spack/docs/_templates/layout.html vendored (new file, 12 lines)
View File

@@ -0,0 +1,12 @@
{% extends "!layout.html" %}
{%- block extrahead %}
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-S0PQ7WV75K"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'G-S0PQ7WV75K');
</script>
{% endblock %}

View File

@@ -865,7 +865,7 @@ There are several different ways to use Spack packages once you have
installed them. As you've seen, spack packages are installed into long
paths with hashes, and you need a way to get them into your path. The
easiest way is to use :ref:`spack load <cmd-spack-load>`, which is
described in the next section.
described in this section.
Some more advanced ways to use Spack packages include:
@@ -959,7 +959,86 @@ use ``spack find --loaded``.
You can also use ``spack load --list`` to get the same output, but it
does not have the full set of query options that ``spack find`` offers.
We'll learn more about Spack's spec syntax in the next section.
We'll learn more about Spack's spec syntax in :ref:`a later section <sec-specs>`.
.. _extensions:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Python packages and virtual environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack can install a large number of Python packages. Their names are
typically prefixed with ``py-``. Installing and using them is no
different from any other package:
.. code-block:: console
$ spack install py-numpy
$ spack load py-numpy
$ python3
>>> import numpy
The ``spack load`` command sets the ``PATH`` variable so that the right Python
executable is used, and makes sure that ``numpy`` and its dependencies can be
located in the ``PYTHONPATH``.
Spack is different from other Python package managers in that it installs
every package into its *own* prefix. This is in contrast to ``pip``, which
installs all packages into the same prefix, be it in a virtual environment
or not.
For many users, **virtual environments** are more convenient than repeated
``spack load`` commands, particularly when working with multiple Python
packages. Fortunately Spack supports environments itself, which together
with a view are no different from Python virtual environments.
The recommended way of working with Python extensions such as ``py-numpy``
is through :ref:`Environments <environments>`. The following example creates
a Spack environment with ``numpy`` in the current working directory. It also
puts a filesystem view in ``./view``, which is a more traditional combined
prefix for all packages in the environment.
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
Now you can activate the environment and start using the packages:
.. code-block:: console
$ spack env activate .
$ python3
>>> import numpy
The environment view is also a virtual environment, which is useful if you are
sharing the environment with others who are unfamiliar with Spack. They can
either use the Python executable directly:
.. code-block:: console
$ ./view/bin/python3
>>> import numpy
or use the activation script:
.. code-block:: console
$ source ./view/bin/activate
$ python3
>>> import numpy
In general, there should not be much difference between ``spack env activate``
and using the virtual environment. The main advantage of ``spack env activate``
is that it knows about more packages than just Python packages, and it may set
additional runtime variables that are not covered by the virtual environment
activation script.
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
.. _sec-specs:
@@ -1354,22 +1433,12 @@ the reserved keywords ``platform``, ``os`` and ``target``:
$ spack install libelf os=ubuntu18.04
$ spack install libelf target=broadwell
or together by using the reserved keyword ``arch``:
.. code-block:: console
$ spack install libelf arch=cray-CNL10-haswell
Normally users don't have to bother specifying the architecture if they
are installing software for their current host, as in that case the
values will be detected automatically. If you need fine-grained control
over which packages use which targets (or over *all* packages' default
target), see :ref:`package-preferences`.
.. admonition:: Cray machines
The situation is a little bit different for Cray machines and a detailed
explanation on how the architecture can be set on them can be found at :ref:`cray-support`
.. _support-for-microarchitectures:
@@ -1705,165 +1774,6 @@ check only local packages (as opposed to those used transparently from
``upstream`` spack instances) and the ``-j,--json`` option to output
machine-readable json data for any errors.
.. _extensions:
---------------------------
Extensions & Python support
---------------------------
Spack's installation model assumes that each package will live in its
own install prefix. However, certain packages are typically installed
*within* the directory hierarchy of other packages. For example,
`Python <https://www.python.org>`_ packages are typically installed in the
``$prefix/lib/python-2.7/site-packages`` directory.
In Spack, installation prefixes are immutable, so this type of installation
is not directly supported. However, it is possible to create views that
allow you to merge install prefixes of multiple packages into a single new prefix.
Views are a convenient way to get a more traditional filesystem structure.
Using *extensions*, you can ensure that Python packages always share the
same prefix in the view as Python itself. Suppose you have
Python installed like so:
.. code-block:: console
$ spack find python
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
python@2.7.8
.. _cmd-spack-extensions:
^^^^^^^^^^^^^^^^^^^^
``spack extensions``
^^^^^^^^^^^^^^^^^^^^
You can find extensions for your Python installation like this:
.. code-block:: console
$ spack extensions python
==> python@2.7.8%gcc@4.4.7 arch=linux-debian7-x86_64-703c7a96
==> 36 extensions:
geos py-ipython py-pexpect py-pyside py-sip
py-basemap py-libxml2 py-pil py-pytz py-six
py-biopython py-mako py-pmw py-rpy2 py-sympy
py-cython py-matplotlib py-pychecker py-scientificpython py-virtualenv
py-dateutil py-mpi4py py-pygments py-scikit-learn
py-epydoc py-mx py-pylint py-scipy
py-gnuplot py-nose py-pyparsing py-setuptools
py-h5py py-numpy py-pyqt py-shiboken
==> 12 installed:
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-dateutil@2.4.0 py-nose@1.3.4 py-pyside@1.2.2
py-dateutil@2.4.0 py-numpy@1.9.1 py-pytz@2014.10
py-ipython@2.3.1 py-pygments@2.0.1 py-setuptools@11.3.1
py-matplotlib@1.4.2 py-pyparsing@2.0.3 py-six@1.9.0
The extensions are a subset of what's returned by ``spack list``, and
they are packages like any other. They are installed into their own
prefixes, and you can see this with ``spack find --paths``:
.. code-block:: console
$ spack find --paths py-numpy
==> 1 installed packages.
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-numpy@1.9.1 ~/spack/opt/linux-debian7-x86_64/gcc@4.4.7/py-numpy@1.9.1-66733244
However, even though this package is installed, you cannot use it
directly when you run ``python``:
.. code-block:: console
$ spack load python
$ python
Python 2.7.8 (default, Feb 17 2015, 01:35:25)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named numpy
>>>
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Extensions in Environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The recommended way of working with extensions such as ``py-numpy``
above is through :ref:`Environments <environments>`. For example,
the following creates an environment in the current working directory
with a filesystem view in the ``./view`` directory:
.. code-block:: console
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
We recommend environments for two reasons. Firstly, environments
can be activated (requires :ref:`shell-support`):
.. code-block:: console
$ spack env activate .
which sets all the right environment variables such as ``PATH`` and
``PYTHONPATH``. This ensures that
.. code-block:: console
$ python
>>> import numpy
works. Secondly, even without shell support, the view ensures
that Python can locate its extensions:
.. code-block:: console
$ ./view/bin/python
>>> import numpy
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
^^^^^^^^^^^^^^^^^^^^
Using ``spack load``
^^^^^^^^^^^^^^^^^^^^
A more traditional way of using Spack and extensions is ``spack load``
(requires :ref:`shell-support`). This will add the extension to ``PYTHONPATH``
in your current shell, and Python itself will be available in the ``PATH``:
.. code-block:: console
$ spack load py-numpy
$ python
>>> import numpy
The loaded packages can be checked using ``spack find --loaded``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Loading Extensions via Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Apart from ``spack env activate`` and ``spack load``, you can load numpy
through your environment modules (using ``environment-modules`` or
``lmod``). This will also add the extension to the ``PYTHONPATH`` in
your current shell.
.. code-block:: console
$ module load <name of numpy module>
If you do not know the name of the specific numpy module you wish to
load, you can use the ``spack module tcl|lmod loads`` command to get
the name of the module from the Spack spec.
-----------------------
Filesystem requirements
-----------------------

View File

@@ -21,23 +21,86 @@ is the following:
Reuse already installed packages
--------------------------------
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
The ``reuse`` attribute controls how aggressively Spack reuses binary packages during concretization. The
attribute can either be a single value, or an object for more complex configurations.
In the former case ("single value") it allows Spack to:
1. Reuse installed packages and buildcaches for all the specs to be concretized, when ``true``
2. Reuse installed packages and buildcaches only for the dependencies of the root specs, when ``dependencies``
3. Disregard reusing installed packages and buildcaches, when ``false``
If finer control over which specs are reused is needed, the value of this attribute can be
an object, with the following keys:
1. ``roots``: if ``true`` root specs are reused, if ``false`` only dependencies of root specs are reused
2. ``from``: list of sources from which reused specs are taken
Each source in ``from`` is itself an object:
.. list-table:: Attributes for a source of reusable specs
:header-rows: 1
* - Attribute name
- Description
* - type (mandatory, string)
- Can be ``local``, ``buildcache``, or ``external``
* - include (optional, list of specs)
- If present, reusable specs must match at least one of the constraints in the list
* - exclude (optional, list of specs)
- If present, reusable specs must not match any of the constraints in the list.
For instance, the following configuration:
.. code-block:: yaml
concretizer:
reuse:
roots: true
from:
- type: local
include:
- "%gcc"
- "%clang"
tells the concretizer to reuse all specs compiled with either ``gcc`` or ``clang`` that are installed
in the local store. Any spec from remote buildcaches is disregarded.
To reduce the boilerplate in configuration files, default values for the ``include`` and
``exclude`` options can be pushed up one level:
.. code-block:: yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
from:
- type: local
- type: buildcache
- type: local
include:
- "foo %oneapi"
In the example above we reuse all specs compiled with ``gcc`` from the local store
and remote buildcaches, and we also reuse ``foo %oneapi``. Note that the last source of
specs overrides the default ``include`` attribute.
For one-off concretizations, there are command line arguments for each of the simple "single value"
configurations. This means a user can:
.. code-block:: console
% spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
to enable reuse for a single installation, or:
.. code-block:: console
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: dependencies`` is the default.
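As a minimal sketch, the default behavior corresponds to a ``concretizer.yaml`` entry such as:
.. code-block:: yaml
   concretizer:
     # reuse installed packages and buildcaches for dependencies of the root specs
     reuse: dependencies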
.. seealso::

View File

@@ -147,6 +147,15 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
def autoreconf(self, spec, prefix):
which("bash")("autogen.sh")
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
class AutotoolsBuilder(AutotoolsBuilder):
def autoreconf(self, pkg, spec, prefix):
which("bash")("autogen.sh")
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""

View File

@@ -25,7 +25,7 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI

View File

@@ -718,23 +718,45 @@ command-line tool, or C/C++/Fortran program with optional Python
modules? The former should be prepended with ``py-``, while the
latter should not.
""""""""""""""""""""""
extends vs. depends_on
""""""""""""""""""""""
""""""""""""""""""""""""""""""
``extends`` vs. ``depends_on``
""""""""""""""""""""""""""""""
This is very similar to the naming dilemma above, with a slight twist.
As mentioned in the :ref:`Packaging Guide <packaging_extensions>`,
``extends`` and ``depends_on`` are very similar, but ``extends`` ensures
that the extension and extendee share the same prefix in views.
This allows the user to import a Python module without
having to add that module to ``PYTHONPATH``.
When deciding between ``extends`` and ``depends_on``, the best rule of
thumb is to check the installation prefix. If Python libraries are
installed to ``<prefix>/lib/pythonX.Y/site-packages``, then you
should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``<prefix>/bin``, then
don't use ``extends``.
Additionally, ``extends("python")`` adds a dependency on the package
``python-venv``. This improves isolation from the system, whether
it's during the build or at runtime: user and system site packages
cannot accidentally be used by any package that ``extends("python")``.
As a rule of thumb: if a package does not install any Python modules
of its own, and merely puts a Python script in the ``bin`` directory,
then there is no need for ``extends``. If the package installs modules
in the ``site-packages`` directory, it requires ``extends``.
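As a sketch, using hypothetical package names, the two cases look roughly like this in a ``package.py``:
.. code-block:: python
   class PyFoo(Package):
       """Hypothetical package that installs modules into site-packages."""
       extends("python")  # shares the prefix with python in views
   class BarTool(Package):
       """Hypothetical package that only installs a script into <prefix>/bin."""
       depends_on("python", type=("build", "run"))  # no extends needed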
"""""""""""""""""""""""""""""""""""""
Executing ``python`` during the build
"""""""""""""""""""""""""""""""""""""
Whenever you need to execute a Python command or pass the path of the
Python interpreter to the build system, it is best to use the global
variable ``python`` directly. For example:
.. code-block:: python
@run_before("install")
def recythonize(self):
python("setup.py", "clean") # use the `python` global
As mentioned in the previous section, ``extends("python")`` adds an
automatic dependency on ``python-venv``, which is a virtual environment
that guarantees build isolation. The ``python`` global always refers to
the correct Python interpreter, whether the package uses ``extends("python")``
or ``depends_on("python")``.
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack

View File

@@ -11,7 +11,8 @@ Chaining Spack Installations
You can point your Spack installation to another installation to use any
packages that are installed there. To register the other Spack instance,
you can add it as an entry to ``upstreams.yaml``:
you can add it as an entry to ``upstreams.yaml`` at any of the
:ref:`configuration-scopes`:
.. code-block:: yaml
@@ -22,7 +23,8 @@ you can add it as an entry to ``upstreams.yaml``:
install_tree: /path/to/another/spack/opt/spack
``install_tree`` must point to the ``opt/spack`` directory inside of the
Spack base directory.
Spack base directory, or the location of the ``install_tree`` defined
in :ref:`config.yaml <config-yaml>`.
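For instance, a minimal ``upstreams.yaml`` (the instance name and path below are placeholders) might look like:
.. code-block:: yaml
   upstreams:
     spack-instance-1:
       install_tree: /path/to/another/spack/opt/spack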
Once the upstream Spack instance has been added, ``spack find`` will
automatically check the upstream instance when querying installed packages,

View File

@@ -150,7 +150,7 @@ this can expose you to attacks. Use at your own risk.
--------------------
Path to custom certificates for SSL verification. The value can be a
filesystem path, or an environment variable that expands to a file path.
filesystem path, or an environment variable that expands to an absolute file path.
The default value is set to the environment variable ``SSL_CERT_FILE``
to use the same syntax used by many other applications that automatically
detect custom certificates.
@@ -160,6 +160,9 @@ in the subprocess calling ``curl``.
If ``url_fetch_method:urllib`` then files and directories are supported i.e.
``config:ssl_certs:$SSL_CERT_FILE`` or ``config:ssl_certs:$SSL_CERT_DIR``
will work.
In all cases the expanded path must be absolute for Spack to use the certificates.
Certificates relative to an environment can be referenced by prepending the path
with the Spack configuration variable ``$env``.
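For example, a ``config.yaml`` entry may point at the environment variable, or at a certificate stored inside an environment (the file name below is a placeholder):
.. code-block:: yaml
   config:
     ssl_certs: $SSL_CERT_FILE
     # or, relative to an active environment:
     # ssl_certs: $env/cacert.pem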
--------------------
``checksum``

View File

@@ -194,15 +194,15 @@ The OS that are currently supported are summarized in the table below:
* - Operating System
- Base Image
- Spack Image
* - Ubuntu 18.04
- ``ubuntu:18.04``
- ``spack/ubuntu-bionic``
* - Ubuntu 20.04
- ``ubuntu:20.04``
- ``spack/ubuntu-focal``
* - Ubuntu 22.04
- ``ubuntu:22.04``
- ``spack/ubuntu-jammy``
* - Ubuntu 24.04
- ``ubuntu:24.04``
- ``spack/ubuntu-noble``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
@@ -227,12 +227,6 @@ The OS that are currently supported are summarized in the table below:
* - Rocky Linux 9
- ``rockylinux:9``
- ``spack/rockylinux9``
* - Fedora Linux 37
- ``fedora:37``
- ``spack/fedora37``
* - Fedora Linux 38
- ``fedora:38``
- ``spack/fedora38``
* - Fedora Linux 39
- ``fedora:39``
- ``spack/fedora39``

View File

@@ -142,12 +142,8 @@ user's prompt to begin with the environment name in brackets.
$ spack env activate -p myenv
[myenv] $ ...
The ``activate`` command can also be used to create a new environment, if it is
not already defined, by adding the ``--create`` flag. Managed and anonymous
environments, anonymous environments are explained in the next section,
can both be created using the same flags that `spack env create` accepts.
If an environment already exists then spack will simply activate it and ignore the
create specific flags.
The ``activate`` command can also be used to create a new environment if it does not already
exist.
.. code-block:: console
@@ -176,21 +172,36 @@ environment will remove the view from the user environment.
Anonymous Environments
^^^^^^^^^^^^^^^^^^^^^^
Any directory can be treated as an environment if it contains a file
``spack.yaml``. To load an anonymous environment, use:
Apart from managed environments, Spack also supports anonymous environments.
Anonymous environments can be placed in any directory of choice.
.. note::
When uninstalling packages, Spack asks the user to confirm the removal of packages
that are still used in a managed environment. This is not the case for anonymous
environments.
To create an anonymous environment, use one of the following commands:
.. code-block:: console
$ spack env activate -d /path/to/directory
$ spack env create --dir my_env
$ spack env create ./my_env
Anonymous specs can be created in place using the command:
As a shorthand, you can also create an anonymous environment upon activation if it does not
already exist:
.. code-block:: console
$ spack env create -d .
$ spack env activate --create ./my_env
For convenience, Spack can also place an anonymous environment in a temporary directory for you:
.. code-block:: console
$ spack env activate --temp
In this case Spack simply creates a ``spack.yaml`` file in the requested
directory.
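At its simplest, the generated ``spack.yaml`` contains little more than an empty ``specs`` list:
.. code-block:: yaml
   spack:
     specs: []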
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Environment Sensitive Commands
@@ -449,6 +460,125 @@ Sourcing that file in Bash will make the environment available to the
user; and can be included in ``.bashrc`` files, etc. The ``loads``
file may also be copied out of the environment, renamed, etc.
.. _environment_include_concrete:
------------------------------
Included Concrete Environments
------------------------------
Spack can create an environment based on information from already
established environments. You can think of it as a combination of existing
environments. Spack gathers information from the existing environments'
``spack.lock`` files and uses it during the creation of the included concrete
environment. When an included concrete environment is created, it generates
a ``spack.lock`` file of its own.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Creating included environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To create a combined concrete environment, you must have at least one existing
concrete environment. You will use the command ``spack env create`` with the
argument ``--include-concrete`` followed by the name or path of the environment
you'd like to include. Here is an example of how to create a combined environment
from the command line.
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
You can also include an environment directly in the ``spack.yaml`` file. It
involves adding the ``include_concrete`` heading in the YAML, followed by the
absolute paths to the independent environments.
.. code-block:: yaml
spack:
specs: []
concretizer:
unify: true
include_concrete:
- /absolute/path/to/environment1
- /absolute/path/to/environment2
Once the ``spack.yaml`` has been updated you must concretize the environment to
get the concrete specs from the included environments.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Updating an included environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If changes were made to the base environment and you want that reflected in the
included environment you will need to reconcretize both the base environment and the
included environment for the change to be implemented. For example:
.. code-block:: console
$ spack env create myenv
$ spack -e myenv add python
$ spack -e myenv concretize
$ spack env create --include-concrete myenv included_env
$ spack -e myenv find
==> In environment myenv
==> Root specs
python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
Here we see that ``included_env`` has access to the python package through
the ``myenv`` environment. But if we were to add another spec to ``myenv``,
``included_env`` will not be able to access the new information.
.. code-block:: console
$ spack -e myenv add perl
$ spack -e myenv concretize
$ spack -e myenv find
==> In environment myenv
==> Root specs
perl python
==> 0 installed packages
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
python
==> 0 installed packages
It isn't until you run the ``spack concretize`` command that the combined
environment will get the updated information from the reconcretized base environment.
.. code-block:: console
$ spack -e included_env concretize
$ spack -e included_env find
==> In environment included_env
==> No root specs
==> Included specs
perl python
==> 0 installed packages
.. _environment-configuration:
------------------------
@@ -800,6 +930,7 @@ For example, the following environment has three root packages:
This allows for a much-needed reduction in redundancy between packages
and constraints.
----------------
Filesystem Views
----------------
@@ -1033,7 +1164,7 @@ other targets to depend on the environment installation.
A typical workflow is as follows:
.. code:: console
.. code-block:: console
spack env create -d .
spack -e . add perl
@@ -1126,7 +1257,7 @@ its dependencies. This can be useful when certain flags should only apply to
dependencies. Below we show a use case where a spec is installed with verbose
output (``spack install --verbose``) while its dependencies are installed silently:
.. code:: console
.. code-block:: console
$ spack env depfile -o Makefile
@@ -1148,7 +1279,7 @@ This can be accomplished through the generated ``[<prefix>/]SPACK_PACKAGE_IDS``
variable. Assuming we have an active and concrete environment, we generate the
associated ``Makefile`` with a prefix ``example``:
.. code:: console
.. code-block:: console
$ spack env depfile -o env.mk --make-prefix example

View File

@@ -478,6 +478,13 @@ prefix, you can add them to the ``extra_attributes`` field. Similarly,
all other fields from the compilers config can be added to the
``extra_attributes`` field for an external representing a compiler.
Note that the format for the ``paths`` field in the
``extra_attributes`` section is different than in the ``compilers``
config. For compilers configured as external packages, the section is
named ``compilers`` and the dictionary maps language names (``c``,
``cxx``, ``fortran``) to paths, rather than using the names ``cc``,
``fc``, and ``f77``.
.. code-block:: yaml
packages:
@@ -493,11 +500,10 @@ all other fields from the compilers config can be added to the
- spec: llvm+clang@15.0.0 arch=linux-rhel8-skylake
prefix: /usr
extra_attributes:
paths:
cc: /usr/bin/clang-with-suffix
compilers:
c: /usr/bin/clang-with-suffix
cxx: /usr/bin/clang++-with-extra-info
fc: /usr/bin/gfortran
f77: /usr/bin/gfortran
fortran: /usr/bin/gfortran
extra_rpaths:
- /usr/lib/llvm/
@@ -1358,187 +1364,6 @@ This will write the private key to the file `dinosaur.priv`.
or for help on an issue or the Spack slack.
.. _cray-support:
-------------
Spack on Cray
-------------
Spack differs slightly when used on a Cray system. The architecture spec
can differentiate between the front-end and back-end processor and operating system.
For example, on Edison at NERSC, the back-end target processor
is "Ivy Bridge", so you can specify to use the back-end this way:
.. code-block:: console
$ spack install zlib target=ivybridge
You can also use the operating system to build against the back-end:
.. code-block:: console
$ spack install zlib os=CNL10
Notice that the name includes both the operating system name and the major
version number concatenated together.
Alternatively, if you want to build something for the front-end,
you can specify the front-end target processor. The processor for a login node
on Edison is "Sandy bridge" so we specify on the command line like so:
.. code-block:: console
$ spack install zlib target=sandybridge
And the front-end operating system is:
.. code-block:: console
$ spack install zlib os=SuSE11
^^^^^^^^^^^^^^^^^^^^^^^
Cray compiler detection
^^^^^^^^^^^^^^^^^^^^^^^
Spack can detect compilers using two methods. For the front-end, we treat
everything the same. The difference lies in back-end compiler detection.
Back-end compiler detection is made via the Tcl module avail command.
Once it detects the compiler it writes the appropriate PrgEnv and compiler
module name to compilers.yaml and sets the paths to each compiler with Cray\'s
compiler wrapper names (i.e. cc, CC, ftn). During build time, Spack will load
the correct PrgEnv and compiler module and will call appropriate wrapper.
The compilers.yaml config file will also differ. There is a
modules section that is filled with the compiler's Programming Environment
and module name. On other systems, this field is empty []:
.. code-block:: yaml
- compiler:
modules:
- PrgEnv-intel
- intel/15.0.109
As mentioned earlier, the compiler paths will look different on a Cray system.
Since most compilers are invoked using cc, CC and ftn, the paths for each
compiler are replaced with their respective Cray compiler wrapper names:
.. code-block:: yaml
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
As opposed to an explicit path to the compiler executable. This allows Spack
to call the Cray compiler wrappers during build time.
For more on compiler configuration, check out :ref:`compiler-config`.
Spack sets the default Cray link type to dynamic, to better match other
other platforms. Individual packages can enable static linking (which is the
default outside of Spack on cray systems) using the ``-static`` flag.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting defaults and using Cray modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you want to use default compilers for each PrgEnv and also be able
to load cray external modules, you will need to set up a ``packages.yaml``.
Here's an example of an external configuration for cray modules:
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
all:
providers:
mpi: [mpich]
This tells Spack that for whatever package that depends on mpi, load the
cray-mpich module into the environment. You can then be able to use whatever
environment variables, libraries, etc, that are brought into the environment
via module load.
.. note::
For Cray-provided packages, it is best to use ``modules:`` instead of ``prefix:``
in ``packages.yaml``, because the Cray Programming Environment heavily relies on
modules (e.g., loading the ``cray-mpich`` module adds MPI libraries to the
compiler wrapper link line).
You can set the default compiler that Spack can use for each compiler type.
If you want to use the Cray defaults, then set them under ``all:`` in packages.yaml.
In the compiler field, set the compiler specs in your order of preference.
Whenever you build with that compiler type, Spack will concretize to that version.
Here is an example of a full packages.yaml used at NERSC
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge"
modules:
- cray-mpich
buildable: False
netcdf:
externals:
- spec: "netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
- spec: "netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
buildable: False
hdf5:
externals:
- spec: "hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
- spec: "hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
buildable: False
all:
compiler: [gcc@5.2.0, intel@16.0.0.109]
providers:
mpi: [mpich]
Here we tell spack that whenever we want to build with gcc use version 5.2.0 or
if we want to build with intel compilers, use version 16.0.0.109. We add a spec
for each compiler type for each cray modules. This ensures that for each
compiler on our system we can use that external module.
For more on external packages check out the section :ref:`sec-external-packages`.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Linux containers on Cray machines
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack uses environment variables particular to the Cray programming
environment to determine which systems are Cray platforms. These
environment variables may be propagated into containers that are not
using the Cray programming environment.
To ensure that Spack does not autodetect the Cray programming
environment, unset the environment variable ``MODULEPATH``. This
will cause Spack to treat a linux container on a Cray system as a base
linux distro.
.. _windows_support:
----------------
@@ -1572,6 +1397,8 @@ Microsoft Visual Studio
"""""""""""""""""""""""
Microsoft Visual Studio provides the only Windows C/C++ compiler that is currently supported by Spack.
Spack additionally requires the Windows SDK (including WGL) to be installed as part of your
Visual Studio installation, as it is needed to build many packages from source.
We require several specific components to be included in the Visual Studio installation.
One is the C/C++ toolset, which can be selected as "Desktop development with C++" or "C++ build tools,"
@@ -1579,6 +1406,7 @@ depending on installation type (Professional, Build Tools, etc.) The other requ
"C++ CMake tools for Windows," which can be selected from among the optional packages.
This provides CMake and Ninja for use during Spack configuration.
If you already have Visual Studio installed, you can make sure these components are installed by
rerunning the installer. Next to your installation, select "Modify" and look at the
"Installation details" pane on the right.

View File

@@ -476,9 +476,3 @@ implemented using Python's built-in `sys.path
:py:mod:`spack.repo` module implements a custom `Python importer
<https://docs.python.org/2/library/imp.html>`_.
.. warning::
The mechanism for extending packages is not yet extensively tested,
and extending packages across repositories imposes inter-repo
dependencies, which may be hard to manage. Use this feature at your
own risk, but let us know if you have a use case for it.

View File

@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx_design==0.6.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.17.2
pygments==2.18.0
urllib3==2.2.1
pytest==8.1.1
pytest==8.2.2
isort==5.13.2
black==24.4.0
black==24.4.2
flake8==7.0.0
mypy==1.9.0
mypy==1.10.0

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.3 (commit 7b8fe60b69e2861e7dac104bc1c183decfcd3daf)
* Version: 0.2.4 (commit 48b92512b9ce203ded0ebd1ac41b42593e931f7c)
astunparse
----------------

View File

@@ -1,3 +1,3 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.3"
__version__ = "0.2.4"

View File

@@ -5,9 +5,10 @@
"""The "cpu" package permits to query and compare different
CPU microarchitectures.
"""
from .detect import host
from .detect import brand_string, host
from .microarchitecture import (
TARGETS,
InvalidCompilerVersion,
Microarchitecture,
UnsupportedMicroarchitecture,
generic_microarchitecture,
@@ -15,10 +16,12 @@
)
__all__ = [
"brand_string",
"host",
"TARGETS",
"InvalidCompilerVersion",
"Microarchitecture",
"UnsupportedMicroarchitecture",
"TARGETS",
"generic_microarchitecture",
"host",
"version_components",
]

View File

@@ -155,6 +155,31 @@ def _is_bit_set(self, register: int, bit: int) -> bool:
mask = 1 << bit
return register & mask > 0
def brand_string(self) -> Optional[str]:
"""Returns the brand string, if available."""
if self.highest_extension_support < 0x80000004:
return None
r1 = self.cpuid.registers_for(eax=0x80000002, ecx=0)
r2 = self.cpuid.registers_for(eax=0x80000003, ecx=0)
r3 = self.cpuid.registers_for(eax=0x80000004, ecx=0)
result = struct.pack(
"IIIIIIIIIIII",
r1.eax,
r1.ebx,
r1.ecx,
r1.edx,
r2.eax,
r2.ebx,
r2.ecx,
r2.edx,
r3.eax,
r3.ebx,
r3.ecx,
r3.edx,
).decode("utf-8")
return result.strip("\x00")
@detection(operating_system="Windows")
def cpuid_info():
@@ -174,8 +199,8 @@ def _check_output(args, env):
WINDOWS_MAPPING = {
"AMD64": "x86_64",
"ARM64": "aarch64",
"AMD64": X86_64,
"ARM64": AARCH64,
}
@@ -409,3 +434,16 @@ def compatibility_check_for_riscv64(info, target):
return (target == arch_root or arch_root in target.ancestors) and (
target.name == info.name or target.vendor == "generic"
)
def brand_string() -> Optional[str]:
"""Returns the brand string of the host, if detected, or None."""
if platform.system() == "Darwin":
return _check_output(
["sysctl", "-n", "machdep.cpu.brand_string"], env=_ensure_bin_usrbin_in_path()
).strip()
if host().family == X86_64:
return CpuidInfoCollector().brand_string()
return None
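# Usage sketch (illustrative, not part of this module): the helper is re-exported
# at package level, so callers can do, for example:
#
#   import archspec.cpu
#   print(archspec.cpu.brand_string())  # e.g. an Intel/AMD/Apple brand string, or None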

View File

@@ -208,6 +208,8 @@ def optimization_flags(self, compiler, version):
"""Returns a string containing the optimization flags that needs
to be used to produce code optimized for this micro-architecture.
The version is expected to be a string of dot separated digits.
If there is no information on the compiler passed as argument the
function returns an empty string. If it is known that the compiler
version we want to use does not support this architecture the function
@@ -216,6 +218,11 @@ def optimization_flags(self, compiler, version):
Args:
compiler (str): name of the compiler to be used
version (str): version of the compiler to be used
Raises:
UnsupportedMicroarchitecture: if the requested compiler does not support
this micro-architecture.
ValueError: if the version doesn't match the expected format
"""
# If we don't have information on compiler at all return an empty string
if compiler not in self.family.compilers:
@@ -232,6 +239,14 @@ def optimization_flags(self, compiler, version):
msg = msg.format(compiler, best_target, best_target.family)
raise UnsupportedMicroarchitecture(msg)
# Check that the version matches the expected format
if not re.match(r"^(?:\d+\.)*\d+$", version):
msg = (
"invalid format for the compiler version argument. "
"Only dot separated digits are allowed."
)
raise InvalidCompilerVersion(msg)
# If we have information on this compiler we need to check the
# version being used
compiler_info = self.compilers[compiler]
@@ -292,7 +307,7 @@ def generic_microarchitecture(name):
Args:
name (str): name of the micro-architecture
"""
return Microarchitecture(name, parents=[], vendor="generic", features=[], compilers={})
return Microarchitecture(name, parents=[], vendor="generic", features=set(), compilers={})
def version_components(version):
@@ -367,7 +382,15 @@ def fill_target_from_dict(name, data, targets):
TARGETS = LazyDictionary(_known_microarchitectures)
class UnsupportedMicroarchitecture(ValueError):
class ArchspecError(Exception):
"""Base class for errors within archspec"""
class UnsupportedMicroarchitecture(ArchspecError, ValueError):
"""Raised if a compiler version does not support optimization for a given
micro-architecture.
"""
class InvalidCompilerVersion(ArchspecError, ValueError):
"""Raised when an invalid format is used for compiler versions in archspec."""

View File

@@ -2937,8 +2937,6 @@
"ilrcpc",
"flagm",
"ssbs",
"paca",
"pacg",
"dcpodp",
"svei8mm",
"svebf16",
@@ -3066,8 +3064,6 @@
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"sve2",
"sveaes",
@@ -3081,8 +3077,7 @@
"svebf16",
"i8mm",
"bf16",
"dgh",
"bti"
"dgh"
],
"compilers" : {
"gcc": [

View File

@@ -98,3 +98,10 @@ def path_filter_caller(*args, **kwargs):
if _func:
return holder_func(_func)
return holder_func
def sanitize_win_longpath(path: str) -> str:
"""Strip Windows extended path prefix from strings
Returns sanitized string.
no-op if extended path prefix is not present"""
return path.lstrip("\\\\?\\")

View File

@@ -187,12 +187,18 @@ def polite_filename(filename: str) -> str:
return _polite_antipattern().sub("_", filename)
def getuid():
def getuid() -> Union[str, int]:
"""Returns os getuid on non Windows
On Windows returns 0 for admin users, login string otherwise
This is in line with behavior from get_owner_uid which
always returns the login string on Windows
"""
if sys.platform == "win32":
import ctypes
# If not admin, use the string name of the login as a unique ID
if ctypes.windll.shell32.IsUserAnAdmin() == 0:
return 1
return os.getlogin()
return 0
else:
return os.getuid()
@@ -213,6 +219,15 @@ def _win_rename(src, dst):
os.replace(src, dst)
@system_path_filter
def msdos_escape_parens(path):
"""MS-DOS interprets parens as grouping parameters even in a quoted string"""
if sys.platform == "win32":
return path.replace("(", "^(").replace(")", "^)")
else:
return path
@system_path_filter
def rename(src, dst):
# On Windows, os.rename will fail if the destination file already exists
@@ -553,7 +568,13 @@ def exploding_archive_handler(tarball_container, stage):
@system_path_filter(arg_slice=slice(1))
def get_owner_uid(path, err_msg=None):
def get_owner_uid(path, err_msg=None) -> Union[str, int]:
"""Returns owner UID of path destination
On non Windows this is the value of st_uid
On Windows this is the login string associated with the
owning user.
"""
if not os.path.exists(path):
mkdirp(path, mode=stat.S_IRWXU)
@@ -745,7 +766,6 @@ def copy_tree(
src: str,
dest: str,
symlinks: bool = True,
allow_broken_symlinks: bool = sys.platform != "win32",
ignore: Optional[Callable[[str], bool]] = None,
_permissions: bool = False,
):
@@ -768,8 +788,6 @@ def copy_tree(
src (str): the directory to copy
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks,
On Windows, setting this to True will raise an exception. Defaults to true on unix.
ignore (typing.Callable): function indicating which files to ignore
_permissions (bool): for internal use only
@@ -777,8 +795,6 @@ def copy_tree(
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
if allow_broken_symlinks and sys.platform == "win32":
raise llnl.util.symlink.SymlinkError("Cannot allow broken symlinks on Windows!")
if _permissions:
tty.debug("Installing {0} to {1}".format(src, dest))
else:
@@ -822,7 +838,7 @@ def copy_tree(
if islink(s):
link_target = resolve_link_target_relative_to_the_link(s)
if symlinks:
target = os.readlink(s)
target = readlink(s)
if os.path.isabs(target):
def escaped_path(path):
@@ -851,16 +867,14 @@ def escaped_path(path):
copy_mode(s, d)
for target, d, s in links:
symlink(target, d, allow_broken_symlinks=allow_broken_symlinks)
symlink(target, d)
if _permissions:
set_install_permissions(d)
copy_mode(s, d)
@system_path_filter
def install_tree(
src, dest, symlinks=True, ignore=None, allow_broken_symlinks=sys.platform != "win32"
):
def install_tree(src, dest, symlinks=True, ignore=None):
"""Recursively install an entire directory tree rooted at *src*.
Same as :py:func:`copy_tree` with the addition of setting proper
@@ -871,21 +885,12 @@ def install_tree(
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
ignore (typing.Callable): function indicating which files to ignore
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks,
On Windows, setting this to True will raise an exception.
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
copy_tree(
src,
dest,
symlinks=symlinks,
allow_broken_symlinks=allow_broken_symlinks,
ignore=ignore,
_permissions=True,
)
copy_tree(src, dest, symlinks=symlinks, ignore=ignore, _permissions=True)
@system_path_filter
@@ -2429,9 +2434,10 @@ def add_library_dependent(self, *dest):
"""
for pth in dest:
if os.path.isfile(pth):
self._additional_library_dependents.add(pathlib.Path(pth).parent)
new_pth = pathlib.Path(pth).parent
else:
self._additional_library_dependents.add(pathlib.Path(pth))
new_pth = pathlib.Path(pth)
self._additional_library_dependents.add(new_pth)
@property
def rpaths(self):
@@ -2509,8 +2515,14 @@ def establish_link(self):
# for each binary install dir in self.pkg (i.e. pkg.prefix.bin, pkg.prefix.lib)
# install a symlink to each dependent library
for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
self._link(library, lib_dir)
# do not rpath for system libraries included in the dag
# we should not be modifying libraries managed by the Windows system
# as this will negatively impact linker behavior and can result in permission
# errors if those system libs are not modifiable by Spack
if "windows-system" not in getattr(self.pkg, "tags", []):
for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
self._link(library, lib_dir)
@system_path_filter

View File

@@ -8,100 +8,75 @@
import subprocess
import sys
import tempfile
from typing import Union
from llnl.util import lang, tty
from ..path import system_path_filter
from ..path import sanitize_win_longpath, system_path_filter
if sys.platform == "win32":
from win32file import CreateHardLink
is_windows = sys.platform == "win32"
def _windows_symlink(
src: str, dst: str, target_is_directory: bool = False, *, dir_fd: Union[int, None] = None
):
"""On Windows with System Administrator privileges this will be a normal symbolic link via
os.symlink. On Windows without privileges the link will be a junction for a directory and a
hardlink for a file. On Windows the various link types are:
def symlink(source_path: str, link_path: str, allow_broken_symlinks: bool = not is_windows):
"""
Create a link.
Symbolic Link: A link to a file or directory on the same or different volume (drive letter) or
even to a remote file or directory (using UNC in its path). Need System Administrator
privileges to make these.
On non-Windows and Windows with System Administrator
privleges this will be a normal symbolic link via
os.symlink.
Hard Link: A link to a file on the same volume (drive letter) only. Every file (file's data)
has at least 1 hard link (file's name). But when this method creates a new hard link there will
be 2. Deleting all hard links effectively deletes the file. Don't need System Administrator
privileges.
On Windows without privileges the link will be a
junction for a directory and a hardlink for a file.
On Windows the various link types are:
Symbolic Link: A link to a file or directory on the
same or different volume (drive letter) or even to
a remote file or directory (using UNC in its path).
Need System Administrator privileges to make these.
Hard Link: A link to a file on the same volume (drive
letter) only. Every file (file's data) has at least 1
hard link (file's name). But when this method creates
a new hard link there will be 2. Deleting all hard
links effectively deletes the file. Don't need System
Administrator privileges.
Junction: A link to a directory on the same or different
volume (drive letter) but not to a remote directory. Don't
need System Administrator privileges.
Parameters:
source_path (str): The real file or directory that the link points to.
Must be absolute OR relative to the link.
link_path (str): The path where the link will exist.
allow_broken_symlinks (bool): On Linux or Mac, don't raise an exception if the source_path
doesn't exist. This will still raise an exception on Windows.
"""
source_path = os.path.normpath(source_path)
Junction: A link to a directory on the same or different volume (drive letter) but not to a
remote directory. Don't need System Administrator privileges."""
source_path = os.path.normpath(src)
win_source_path = source_path
link_path = os.path.normpath(link_path)
link_path = os.path.normpath(dst)
# Never allow broken links on Windows.
if sys.platform == "win32" and allow_broken_symlinks:
raise ValueError("allow_broken_symlinks parameter cannot be True on Windows.")
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise AlreadyExistsError(f"Link path ({link_path}) already exists. Cannot create link.")
if not allow_broken_symlinks:
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise AlreadyExistsError(
f"Link path ({link_path}) already exists. Cannot create link."
if not os.path.exists(source_path):
if os.path.isabs(source_path):
# An absolute source path that does not exist will result in a broken link.
raise SymlinkError(
f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"link would be broken so not making link."
)
if not os.path.exists(source_path):
if os.path.isabs(source_path) and not allow_broken_symlinks:
# An absolute source path that does not exist will result in a broken link.
raise SymlinkError(
f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"link would be broken so not making link."
)
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on windows, the source path needs to be modified to be
# relative because hardlink/junction don't resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on windows, the source path needs to be modified to be
# relative because hardlink/junction don't resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
elif not allow_broken_symlinks:
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
)
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
)
# Create the symlink
if sys.platform == "win32" and not _windows_can_symlink():
if not _windows_can_symlink():
_windows_create_link(win_source_path, link_path)
else:
os.symlink(source_path, link_path, target_is_directory=os.path.isdir(source_path))
def islink(path: str) -> bool:
def _windows_islink(path: str) -> bool:
"""Override os.islink to give correct answer for spack logic.
For Non-Windows: a link can be determined with the os.path.islink method.
@@ -247,9 +222,9 @@ def _windows_create_junction(source: str, link: str):
out, err = proc.communicate()
tty.debug(out.decode())
if proc.returncode != 0:
err = err.decode()
tty.error(err)
raise SymlinkError("Make junction command returned a non-zero return code.", err)
err_str = err.decode()
tty.error(err_str)
raise SymlinkError("Make junction command returned a non-zero return code.", err_str)
def _windows_create_hard_link(path: str, link: str):
@@ -269,14 +244,14 @@ def _windows_create_hard_link(path: str, link: str):
CreateHardLink(link, path)
def readlink(path: str):
def _windows_readlink(path: str, *, dir_fd=None):
"""Spack utility to override of os.readlink method to work cross platform"""
if _windows_is_hardlink(path):
return _windows_read_hard_link(path)
elif _windows_is_junction(path):
return _windows_read_junction(path)
else:
return os.readlink(path)
return sanitize_win_longpath(os.readlink(path, dir_fd=dir_fd))
def _windows_read_hard_link(link: str) -> str:
@@ -338,6 +313,16 @@ def resolve_link_target_relative_to_the_link(link):
return os.path.join(link_dir, target)
if sys.platform == "win32":
symlink = _windows_symlink
readlink = _windows_readlink
islink = _windows_islink
else:
symlink = os.symlink
readlink = os.readlink
islink = os.path.islink
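A minimal usage sketch of the module-level aliases (paths are hypothetical); on Windows without administrator privileges the created link may be a junction or hard link rather than a true symbolic link:

symlink("pkg-1.0", "pkg-latest")      # create the platform-appropriate link type
if islink("pkg-latest"):              # also detects junctions and hard links on Windows
    target = readlink("pkg-latest")   # extended-path prefixes are stripped on Windows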
class SymlinkError(RuntimeError):
"""Exception class for errors raised while creating symlinks,
junctions and hard links

View File

@@ -59,6 +59,7 @@
To output an @, use '@@'. To output a } inside braces, use '}}'.
"""
import os
import re
import sys
from contextlib import contextmanager
@@ -101,9 +102,29 @@ def __init__(self, message):
# Mapping from color arguments to values for tty.set_color
color_when_values = {"always": True, "auto": None, "never": False}
# Force color; None: Only color if stdout is a tty
# True: Always colorize output, False: Never colorize output
_force_color = None
def _color_when_value(when):
"""Raise a ValueError for an invalid color setting.
Valid values are 'always', 'never', and 'auto', or equivalently,
True, False, and None.
"""
if when in color_when_values:
return color_when_values[when]
elif when not in color_when_values.values():
raise ValueError("Invalid color setting: %s" % when)
return when
def _color_from_environ() -> Optional[bool]:
try:
return _color_when_value(os.environ.get("SPACK_COLOR", "auto"))
except ValueError:
return None
#: When `None` colorize when stdout is tty, when `True` or `False` always or never colorize resp.
_force_color = _color_from_environ()
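For example, a sketch assuming the environment variable is set before llnl.util.tty.color is first imported:

import os

os.environ["SPACK_COLOR"] = "never"   # or "always" / "auto"
# _color_from_environ() maps "never" to False, so output starts uncolorized unless a
# later setting (for example the --color command-line flag) overrides the default.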
def try_enable_terminal_color_on_windows():
@@ -164,19 +185,6 @@ def _err_check(result, func, args):
debug("Unable to support color on Windows terminal")
def _color_when_value(when):
"""Raise a ValueError for an invalid color setting.
Valid values are 'always', 'never', and 'auto', or equivalently,
True, False, and None.
"""
if when in color_when_values:
return color_when_values[when]
elif when not in color_when_values.values():
raise ValueError("Invalid color setting: %s" % when)
return when
def get_color_when():
"""Return whether commands should print color or not."""
if _force_color is not None:

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.22.0.dev0"
__version__ = "0.23.0.dev0"
spack_version = __version__

View File

@@ -254,8 +254,8 @@ def _search_duplicate_specs_in_externals(error_cls):
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.22)"""
# TODO (v0.22): remove this audit as the attributes will not be allowed in config
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.23)"""
# TODO (v0.23): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
@@ -421,6 +421,10 @@ def _check_patch_urls(pkgs, error_cls):
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)"
)
github_pull_commits_re = (
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/pull/\d+/commits/[a-fA-F0-9]+\.(?:patch|diff)"
)
# Only .diff URLs have stable/full hashes:
# https://forum.gitlab.com/t/patches-with-full-index/29313
gitlab_patch_url_re = (
@@ -436,14 +440,24 @@ def _check_patch_urls(pkgs, error_cls):
if not isinstance(patch, spack.patch.UrlPatch):
continue
if re.match(github_patch_url_re, patch.url):
if re.match(github_pull_commits_re, patch.url):
url = re.sub(r"/pull/\d+/commits/", r"/commit/", patch.url)
url = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", url)
errors.append(
error_cls(
f"patch URL in package {pkg_cls.name} "
+ "must not be a pull request commit; "
+ f"instead use {url}",
[patch.url],
)
)
elif re.match(github_patch_url_re, patch.url):
full_index_arg = "?full_index=1"
if not patch.url.endswith(full_index_arg):
errors.append(
error_cls(
"patch URL in package {0} must end with {1}".format(
pkg_cls.name, full_index_arg
),
f"patch URL in package {pkg_cls.name} "
+ f"must end with {full_index_arg}",
[patch.url],
)
)
@@ -451,9 +465,7 @@ def _check_patch_urls(pkgs, error_cls):
if not patch.url.endswith(".diff"):
errors.append(
error_cls(
"patch URL in package {0} must end with .diff".format(
pkg_cls.name
),
f"patch URL in package {pkg_cls.name} must end with .diff",
[patch.url],
)
)
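A stand-alone illustration of the suggested rewrite (the repository name and commit hash below are made up):

import re

url = "https://github.com/example/project/pull/123/commits/0123456789abcdef0123456789abcdef01234567.patch"
fixed = re.sub(r"/pull/\d+/commits/", r"/commit/", url)
fixed = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", fixed)
# fixed == "https://github.com/example/project/commit/0123456789abcdef0123456789abcdef01234567.patch?full_index=1"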
@@ -1046,7 +1058,7 @@ def _extracts_errors(triggers, summary):
group="externals",
tag="PKG-EXTERNALS",
description="Sanity checks for external software detection",
kwargs=("pkgs",),
kwargs=("pkgs", "debug_log"),
)
@@ -1069,7 +1081,7 @@ def packages_with_detection_tests():
@external_detection
def _test_detection_by_executable(pkgs, error_cls):
def _test_detection_by_executable(pkgs, debug_log, error_cls):
"""Test drive external detection for packages"""
import spack.detection
@@ -1095,6 +1107,7 @@ def _test_detection_by_executable(pkgs, error_cls):
for idx, test_runner in enumerate(
spack.detection.detection_tests(pkg_name, spack.repo.PATH)
):
debug_log(f"[{__file__}]: running test {idx} for package {pkg_name}")
specs = test_runner.execute()
expected_specs = test_runner.expected_specs
@@ -1115,11 +1128,10 @@ def _test_detection_by_executable(pkgs, error_cls):
for candidate in expected_specs:
try:
idx = specs.index(candidate)
matched_detection.append((candidate, specs[idx]))
except (AttributeError, ValueError):
pass
matched_detection.append((candidate, specs[idx]))
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type

View File

@@ -29,6 +29,7 @@
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, visit_directory_tree
from llnl.util.symlink import readlink
import spack.caches
import spack.cmd
@@ -658,7 +659,7 @@ def get_buildfile_manifest(spec):
# 2. paths are used as strings.
for rel_path in visitor.symlinks:
abs_path = os.path.join(root, rel_path)
link = os.readlink(abs_path)
link = readlink(abs_path)
if os.path.isabs(link) and link.startswith(spack.store.STORE.layout.root):
data["link_to_relocate"].append(rel_path)
@@ -2001,6 +2002,7 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
with spack.util.path.filter_padding():
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, force)
spec.package.windows_establish_runtime_linkage()
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout)

View File

@@ -5,7 +5,13 @@
"""Function and classes needed to bootstrap Spack itself."""
from .config import ensure_bootstrap_configuration, is_bootstrapping, store_path
from .core import all_core_root_specs, ensure_core_dependencies, ensure_patchelf_in_path_or_raise
from .core import (
all_core_root_specs,
ensure_clingo_importable_or_raise,
ensure_core_dependencies,
ensure_gpg_in_path_or_raise,
ensure_patchelf_in_path_or_raise,
)
from .environment import BootstrapEnvironment, ensure_environment_dependencies
from .status import status_message
@@ -13,6 +19,8 @@
"is_bootstrapping",
"ensure_bootstrap_configuration",
"ensure_core_dependencies",
"ensure_gpg_in_path_or_raise",
"ensure_clingo_importable_or_raise",
"ensure_patchelf_in_path_or_raise",
"all_core_root_specs",
"ensure_environment_dependencies",

View File

@@ -54,10 +54,14 @@ def _try_import_from_store(
installed_specs = spack.store.STORE.db.query(query_spec, installed=True)
for candidate_spec in installed_specs:
pkg = candidate_spec["python"].package
# previously bootstrapped specs may not have a python-venv dependency.
if candidate_spec.dependencies("python-venv"):
python, *_ = candidate_spec.dependencies("python-venv")
else:
python, *_ = candidate_spec.dependencies("python")
module_paths = [
os.path.join(candidate_spec.prefix, pkg.purelib),
os.path.join(candidate_spec.prefix, pkg.platlib),
os.path.join(candidate_spec.prefix, python.package.purelib),
os.path.join(candidate_spec.prefix, python.package.platlib),
]
path_before = list(sys.path)
@@ -209,15 +213,18 @@ def _root_spec(spec_str: str) -> str:
Args:
spec_str: spec to be bootstrapped. Must be without compiler and target.
"""
# Add a compiler requirement to the root spec.
# Add a compiler and platform requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
spec_str += " %msvc"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"
target = archspec.cpu.host().family
spec_str += f" target={target}"

View File

@@ -270,10 +270,6 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
# This is needed to help the old concretizer taking the `setuptools` dependency
# only when bootstrapping from sources on Python 3.12
if spec_for_current_python() == "python@3.12":
concrete_spec.constrain("+force_setuptools")
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme
@@ -538,6 +534,41 @@ def ensure_patchelf_in_path_or_raise() -> spack.util.executable.Executable:
)
def ensure_winsdk_external_or_raise() -> None:
"""Ensure the Windows SDK + WGL are available on system
If both of these packages are found, the Spack user or bootstrap
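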
configuration (depending on where Spack is running)
will be updated to include all versions and variants detected.
If either the WDK or WSDK are not found, this method will raise
a RuntimeError.
**NOTE:** This modifies the Spack config in the current scope,
either user or environment depending on the calling context.
This is different from all other current bootstrap dependency
checks.
"""
if set(["win-sdk", "wgl"]).issubset(spack.config.get("packages").keys()):
return
externals = spack.detection.by_path(["win-sdk", "wgl"])
if not set(["win-sdk", "wgl"]) == externals.keys():
missing_packages_lst = []
if "wgl" not in externals:
missing_packages_lst.append("wgl")
if "win-sdk" not in externals:
missing_packages_lst.append("win-sdk")
missing_packages = " & ".join(missing_packages_lst)
raise RuntimeError(
f"Unable to find the {missing_packages}, please install these packages \
via the Visual Studio installer \
before proceeding with Spack or provide the path to a non standard install with \
'spack external find --path'"
)
# wgl/sdk are not required for bootstrapping Spack, but
# are required for building anything non trivial
# add to user config so they can be used by subsequent Spack ops
spack.detection.update_configuration(externals, buildable=False)
def ensure_core_dependencies() -> None:
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":

View File

@@ -3,13 +3,11 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap non-core Spack dependencies from an environment."""
import glob
import hashlib
import os
import pathlib
import sys
import warnings
from typing import List
from typing import Iterable, List
import archspec.cpu
@@ -28,6 +26,16 @@
class BootstrapEnvironment(spack.environment.Environment):
"""Environment to install dependencies of Spack for a given interpreter and architecture"""
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
# Remove python package roots created before python-venv was introduced
for s in self.concrete_roots():
if "python" in s.package.extendees and not s.dependencies("python-venv"):
self.deconcretize(s)
@classmethod
def spack_dev_requirements(cls) -> List[str]:
"""Spack development requirements"""
@@ -59,31 +67,19 @@ def view_root(cls) -> pathlib.Path:
return cls.environment_root().joinpath("view")
@classmethod
def pythonpaths(cls) -> List[str]:
"""Paths to be added to sys.path or PYTHONPATH"""
python_dir_part = f"python{'.'.join(str(x) for x in sys.version_info[:2])}"
glob_expr = str(cls.view_root().joinpath("**", python_dir_part, "**"))
result = glob.glob(glob_expr)
if not result:
msg = f"Cannot find any Python path in {cls.view_root()}"
warnings.warn(msg)
return result
@classmethod
def bin_dirs(cls) -> List[pathlib.Path]:
def bin_dir(cls) -> pathlib.Path:
"""Paths to be added to PATH"""
return [cls.view_root().joinpath("bin")]
return cls.view_root().joinpath("bin")
def python_dirs(self) -> Iterable[pathlib.Path]:
python = next(s for s in self.all_specs_generator() if s.name == "python-venv").package
return {self.view_root().joinpath(p) for p in (python.platlib, python.purelib)}
@classmethod
def spack_yaml(cls) -> pathlib.Path:
"""Environment spack.yaml file"""
return cls.environment_root().joinpath("spack.yaml")
def __init__(self) -> None:
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
@@ -100,21 +96,13 @@ def update_installations(self) -> None:
self.install_all()
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
the environment view.
"""
# Do minimal modifications to sys.path and environment variables. In particular, pay
# attention to have the smallest PYTHONPATH / sys.path possible, since that may impact
# the performance of the current interpreter
sys.path.extend(self.pythonpaths())
os.environ["PATH"] = os.pathsep.join(
[str(x) for x in self.bin_dirs()] + os.environ.get("PATH", "").split(os.pathsep)
)
os.environ["PYTHONPATH"] = os.pathsep.join(
os.environ.get("PYTHONPATH", "").split(os.pathsep)
+ [str(x) for x in self.pythonpaths()]
)
def load(self) -> None:
"""Update PATH and sys.path."""
# Make executables available (shouldn't need PYTHONPATH)
os.environ["PATH"] = f"{self.bin_dir()}{os.pathsep}{os.environ.get('PATH', '')}"
# Spack itself imports pytest
sys.path.extend(str(p) for p in self.python_dirs())
def _write_spack_yaml_file(self) -> None:
tty.msg(
@@ -164,4 +152,4 @@ def ensure_environment_dependencies() -> None:
_add_externals_if_missing()
with BootstrapEnvironment() as env:
env.update_installations()
env.update_syspath_and_environ()
env.load()

View File

@@ -43,7 +43,7 @@
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Set, Tuple
from typing import Dict, List, Set, Tuple
import llnl.util.tty as tty
from llnl.string import plural
@@ -91,7 +91,7 @@
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
from spack.util.module_cmd import load_module, module, path_from_modules
from spack.util.module_cmd import load_module, path_from_modules
#
# This can be set by the user to globally disable parallel builds.
@@ -190,14 +190,6 @@ def __call__(self, *args, **kwargs):
return super().__call__(*args, **kwargs)
def _on_cray():
host_platform = spack.platforms.host()
host_os = host_platform.operating_system("default_os")
on_cray = str(host_platform) == "cray"
using_cnl = re.match(r"cnl\d+", str(host_os))
return on_cray, using_cnl
def clean_environment():
# Stuff in here sanitizes the build environment to eliminate
# anything the user has set that may interfere. We apply it immediately
@@ -241,17 +233,6 @@ def clean_environment():
if varname.endswith("_ROOT") and varname != "SPACK_ROOT":
env.unset(varname)
# On Cray "cluster" systems, unset CRAY_LD_LIBRARY_PATH to avoid
# interference with Spack dependencies.
# CNL requires these variables to be set (or at least some of them,
# depending on the CNL version).
on_cray, using_cnl = _on_cray()
if on_cray and not using_cnl:
env.unset("CRAY_LD_LIBRARY_PATH")
for varname in os.environ.keys():
if "PKGCONF" in varname:
env.unset(varname)
# Unset the following variables because they can affect installation of
# Autotools and CMake packages.
build_system_vars = [
@@ -381,11 +362,7 @@ def set_compiler_environment_variables(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
# Don't set on cray platform because the targeting module handles this
if spec.satisfies("platform=cray"):
isa_arg = ""
else:
isa_arg = spec.architecture.target.optimization_flags(compiler)
isa_arg = spec.architecture.target.optimization_flags(compiler)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
@@ -730,12 +707,28 @@ def _static_to_shared_library(arch, compiler, static_lib, shared_lib=None, **kwa
return compiler(*compiler_args, output=compiler_output)
def get_rpath_deps(pkg):
"""Return immediate or transitive RPATHs depending on the package."""
if pkg.transitive_rpaths:
return [d for d in pkg.spec.traverse(root=False, deptype=("link"))]
else:
return pkg.spec.dependencies(deptype="link")
def _get_rpath_deps_from_spec(
spec: spack.spec.Spec, transitive_rpaths: bool
) -> List[spack.spec.Spec]:
if not transitive_rpaths:
return spec.dependencies(deptype=dt.LINK)
by_name: Dict[str, spack.spec.Spec] = {}
for dep in spec.traverse(root=False, deptype=dt.LINK):
lookup = by_name.get(dep.name)
if lookup is None:
by_name[dep.name] = dep
elif lookup.version < dep.version:
by_name[dep.name] = dep
return list(by_name.values())
def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]:
"""Return immediate or transitive dependencies (depending on the package) that need to be
rpath'ed. If a package occurs multiple times, the newest version is kept."""
return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)
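A plain-Python illustration of the "newest version wins" deduplication (tuples stand in for Spec objects; names and versions are made up):

deps = [("zlib-ng", (2, 0, 7)), ("zlib-ng", (2, 1, 6)), ("bzip2", (1, 0, 8))]
by_name = {}
for name, version in deps:
    if name not in by_name or by_name[name] < version:
        by_name[name] = version
print(sorted(by_name.items()))  # [('bzip2', (1, 0, 8)), ('zlib-ng', (2, 1, 6))]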
def get_rpaths(pkg):
@@ -747,7 +740,9 @@ def get_rpaths(pkg):
# Second module is our compiler mod name. We use that to get rpaths from
# module show output.
if pkg.compiler.modules and len(pkg.compiler.modules) > 1:
rpaths.append(path_from_modules([pkg.compiler.modules[1]]))
mod_rpath = path_from_modules([pkg.compiler.modules[1]])
if mod_rpath:
rpaths.append(mod_rpath)
return list(dedupe(filter_system_paths(rpaths)))
@@ -817,14 +812,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
for mod in pkg.compiler.modules:
load_module(mod)
# kludge to handle cray mpich and libsci being automatically loaded by
# PrgEnv modules on cray platform. Module unload does no damage when
# unnecessary
on_cray, _ = _on_cray()
if on_cray and not dirty:
for mod in ["cray-mpich", "cray-libsci"]:
module("unload", mod)
if target and target.module_name:
load_module(target.module_name)

View File

@@ -162,7 +162,9 @@ def initconfig_compiler_entries(self):
ld_flags = " ".join(flags["ldflags"])
ld_format_string = "CMAKE_{0}_LINKER_FLAGS"
# CMake has separate linker arguments for types of builds.
for ld_type in ["EXE", "MODULE", "SHARED", "STATIC"]:
# 'ldflags' should not be used with CMAKE_STATIC_LINKER_FLAGS which
# is used by the archiver, so don't include "STATIC" in this loop:
for ld_type in ["EXE", "MODULE", "SHARED"]:
ld_string = ld_format_string.format(ld_type)
entries.append(cmake_cache_string(ld_string, ld_flags))

View File

@@ -39,16 +39,11 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
"""Set the PYTHON_EXECUTABLE, Python_EXECUTABLE, and Python3_EXECUTABLE CMake variables
if the package has Python as build or link dep and ``find_python_hints`` is set to True. See
``find_python_hints`` for context."""
if not getattr(pkg, "find_python_hints", False):
if not getattr(pkg, "find_python_hints", False) or not pkg.spec.dependencies(
"python", dt.BUILD | dt.LINK
):
return
pythons = pkg.spec.dependencies("python", dt.BUILD | dt.LINK)
if len(pythons) != 1:
return
try:
python_executable = pythons[0].package.command.path
except RuntimeError:
return
python_executable = pkg.spec["python"].command.path
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),

View File

@@ -0,0 +1,144 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import os
import pathlib
import re
import sys
from typing import Dict, List, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty
import spack.compiler
import spack.package_base
# Local "type" for type hints
Path = Union[str, pathlib.Path]
class CompilerPackage(spack.package_base.PackageBase):
"""A Package mixin for all common logic for packages that implement compilers"""
# TODO: how do these play nicely with other tags
tags: Sequence[str] = ["compiler"]
#: Optional suffix regexes for searching for this type of compiler.
#: Suffixes are used by some frameworks, e.g. macports uses an '-mp-X.Y'
#: version suffix for gcc.
compiler_suffixes: List[str] = [r"-.*"]
#: Optional prefix regexes for searching for this compiler
compiler_prefixes: List[str] = []
#: Compiler argument(s) that produces version information
#: If multiple arguments, the earlier arguments must produce errors when invalid
compiler_version_argument: Union[str, Tuple[str]] = "-dumpversion"
#: Regex used to extract version from compiler's output
compiler_version_regex: str = "(.*)"
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
msg += f" supports: {self.supported_languages}, valid values: {self.compiler_languages}"
assert set(self.supported_languages) <= set(self.compiler_languages), msg
@property
def supported_languages(self) -> Sequence[str]:
"""Dynamic definition of languages supported by this package"""
return self.compiler_languages
@classproperty
def compiler_names(cls) -> Sequence[str]:
"""Construct list of compiler names from per-language names"""
names = []
for language in cls.compiler_languages:
names.extend(getattr(cls, f"{language}_names"))
return names
@classproperty
def executables(cls) -> Sequence[str]:
"""Construct executables for external detection from names, prefixes, and suffixes."""
regexp_fmt = r"^({0}){1}({2})$"
prefixes = [""] + cls.compiler_prefixes
suffixes = [""] + cls.compiler_suffixes
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
suffixes += [suf + ext for suf in suffixes]
return [
regexp_fmt.format(prefix, re.escape(name), suffix)
for prefix, name, suffix in itertools.product(prefixes, cls.compiler_names, suffixes)
]
@classmethod
def determine_version(cls, exe: Path):
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = spack.compiler.get_compiler_version_output(exe, va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
except spack.util.executable.ProcessError:
pass
except Exception as e:
tty.debug(
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
"""Overridable method for the location of the compiler bindir within the preifx"""
return os.path.join(prefix, "bin")
@classmethod
def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
"""Compute the paths to compiler executables associated with this package
This is a helper method for ``determine_variants`` to compute the ``extra_attributes``
to include with each spec object."""
# There are often at least two copies (not symlinks) of each compiler executable in the
# same directory: one with a canonical name, e.g. "gfortran", and another one with the
# target prefix, e.g. "x86_64-pc-linux-gnu-gfortran". There also might be a copy of "gcc"
# with the version suffix, e.g. "x86_64-pc-linux-gnu-gcc-6.3.0". To ensure the consistency
# of values in the "paths" dictionary (i.e. we prefer all of them to reference copies
# with canonical names if possible), we iterate over the executables in the reversed sorted
# order:
# First pass over languages identifies exes that are perfect matches for canonical names
# Second pass checks for names with prefix/suffix
# Second pass is sorted by language name length because longer named languages
# e.g. cxx can often contain the names of shorter named languages
# e.g. c (e.g. clang/clang++)
paths = {}
exes = sorted(exes, reverse=True)
languages = {
lang: getattr(cls, f"{lang}_names")
for lang in sorted(cls.compiler_languages, key=len, reverse=True)
}
for exe in exes:
for lang, names in languages.items():
if os.path.basename(exe) in names:
paths[lang] = exe
break
else:
for lang, names in languages.items():
if any(name in os.path.basename(exe) for name in names):
paths[lang] = exe
break
return paths
@classmethod
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}

View File

@@ -110,9 +110,8 @@ def cuda_flags(arch_list):
# From the NVIDIA install guide we know of conflicts for particular
# platforms (linux, darwin), architectures (x86, powerpc) and compilers
# (gcc, clang). We don't restrict %gcc and %clang conflicts to
# platform=linux, since they should also apply to platform=cray, and may
# apply to platform=darwin. We currently do not provide conflicts for
# platform=darwin with %apple-clang.
# platform=linux, since they may apply to platform=darwin. We currently
# do not provide conflicts for platform=darwin with %apple-clang.
# Linux x86_64 compiler conflicts from here:
# https://gist.github.com/ax3l/9489132
@@ -137,11 +136,14 @@ def cuda_flags(arch_list):
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.4")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.3")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.4")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -846,6 +846,7 @@ def scalapack_libs(self):
"^mpich@2:" in spec_root
or "^cray-mpich" in spec_root
or "^mvapich2" in spec_root
or "^mvapich" in spec_root
or "^intel-mpi" in spec_root
or "^intel-oneapi-mpi" in spec_root
or "^intel-parallel-studio" in spec_root
@@ -936,32 +937,15 @@ def mpi_setup_dependent_build_environment(self, env, dependent_spec, compilers_o
"I_MPI_ROOT": self.normalize_path("mpi"),
}
# CAUTION - SIMILAR code in:
# var/spack/repos/builtin/packages/mpich/package.py
# var/spack/repos/builtin/packages/openmpi/package.py
# var/spack/repos/builtin/packages/mvapich2/package.py
#
# On Cray, the regular compiler wrappers *are* the MPI wrappers.
if "platform=cray" in self.spec:
# TODO: Confirm
wrapper_vars.update(
{
"MPICC": compilers_of_client["CC"],
"MPICXX": compilers_of_client["CXX"],
"MPIF77": compilers_of_client["F77"],
"MPIF90": compilers_of_client["F90"],
}
)
else:
compiler_wrapper_commands = self.mpi_compiler_wrappers
wrapper_vars.update(
{
"MPICC": compiler_wrapper_commands["MPICC"],
"MPICXX": compiler_wrapper_commands["MPICXX"],
"MPIF77": compiler_wrapper_commands["MPIF77"],
"MPIF90": compiler_wrapper_commands["MPIF90"],
}
)
compiler_wrapper_commands = self.mpi_compiler_wrappers
wrapper_vars.update(
{
"MPICC": compiler_wrapper_commands["MPICC"],
"MPICXX": compiler_wrapper_commands["MPICXX"],
"MPIF77": compiler_wrapper_commands["MPIF77"],
"MPIF90": compiler_wrapper_commands["MPIF90"],
}
)
# Ensure that the directory containing the compiler wrappers is in the
# PATH. Spack packages add `prefix.bin` to their dependents' paths,

View File

@@ -24,7 +24,6 @@ class MSBuildPackage(spack.package_base.PackageBase):
build_system("msbuild")
conflicts("platform=linux", when="build_system=msbuild")
conflicts("platform=darwin", when="build_system=msbuild")
conflicts("platform=cray", when="build_system=msbuild")
@spack.builder.builder("msbuild")

View File

@@ -24,7 +24,6 @@ class NMakePackage(spack.package_base.PackageBase):
build_system("nmake")
conflicts("platform=linux", when="build_system=nmake")
conflicts("platform=darwin", when="build_system=nmake")
conflicts("platform=cray", when="build_system=nmake")
@spack.builder.builder("nmake")
@@ -145,7 +144,7 @@ def install(self, pkg, spec, prefix):
opts += self.nmake_install_args()
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
opts.append(self.define("PREFIX", prefix))
opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.install_targets, ignore_quotes=self.ignore_quotes

View File

@@ -36,9 +36,8 @@ class IntelOneApiPackage(Package):
"target=ppc64:",
"target=ppc64le:",
"target=aarch64:",
"platform=darwin:",
"platform=cray:",
"platform=windows:",
"platform=darwin",
"platform=windows",
]:
conflicts(c, msg="This package in only available for x86_64 and Linux")

View File

@@ -138,16 +138,21 @@ def view_file_conflicts(self, view, merge_map):
return conflicts
def add_files_to_view(self, view, merge_map, skip_if_exists=True):
# Patch up shebangs to the python linked in the view only if python is built by Spack.
if not self.extendee_spec or self.extendee_spec.external:
# Patch up shebangs if the package extends Python and we put a Python interpreter in the
# view.
if not self.extendee_spec:
return super().add_files_to_view(view, merge_map, skip_if_exists)
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
if python.external:
return super().add_files_to_view(view, merge_map, skip_if_exists)
# We only patch shebangs in the bin directory.
copied_files: Dict[Tuple[int, int], str] = {} # File identifier -> source
delayed_links: List[Tuple[str, str]] = [] # List of symlinks from merge map
bin_dir = self.spec.prefix.bin
python_prefix = self.extendee_spec.prefix
for src, dst in merge_map.items():
if skip_if_exists and os.path.lexists(dst):
continue
@@ -168,7 +173,7 @@ def add_files_to_view(self, view, merge_map, skip_if_exists=True):
copied_files[(s.st_dev, s.st_ino)] = dst
shutil.copy2(src, dst)
fs.filter_file(
python_prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
python.prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
)
else:
view.link(src, dst)
@@ -199,14 +204,13 @@ def remove_files_from_view(self, view, merge_map):
ignore_namespace = True
bin_dir = self.spec.prefix.bin
global_view = self.extendee_spec.prefix == view.get_projection_for_spec(self.spec)
to_remove = []
for src, dst in merge_map.items():
if ignore_namespace and namespace_init(dst):
continue
if global_view or not fs.path_contains_subdirectory(src, bin_dir):
if not fs.path_contains_subdirectory(src, bin_dir):
to_remove.append(dst)
else:
os.remove(dst)
@@ -362,6 +366,12 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return f"https://pypi.org/simple/{name}/"
return None
@property
def python_spec(self):
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
@property
def headers(self) -> HeaderList:
"""Discover header files in platlib."""
@@ -371,8 +381,9 @@ def headers(self) -> HeaderList:
# Headers should only be in include or platlib, but no harm in checking purelib too
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
@@ -391,8 +402,9 @@ def libs(self) -> LibraryList:
name = self.spec.name[3:]
# Libraries should only be in platlib, but no harm in checking purelib too
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
python = self.python_spec
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
find_all_libraries = functools.partial(fs.find_all_libraries, recursive=True)
libs_list = map(find_all_libraries, [platlib, purelib])
@@ -504,6 +516,8 @@ def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
"""Install everything from build directory."""
pip = spec["python"].command
pip.add_default_arg("-m", "pip")
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]
@@ -519,14 +533,6 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
else:
args.append(".")
pip = spec["python"].command
# Hide user packages, since we don't have build isolation. This is
# necessary because pip / setuptools may run hooks from arbitrary
# packages during the build. There is no equivalent variable to hide
# system packages, so this is not reliable for external Python.
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
with fs.working_dir(self.build_directory):
pip(*args)

View File

@@ -44,6 +44,7 @@
from spack import traverse
from spack.error import SpackError
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.reporters.cdash import build_stamp as cdash_build_stamp
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
@@ -683,6 +684,22 @@ def generate_gitlab_ci_yaml(
"instead.",
)
def ensure_expected_target_path(path):
"""Returns passed paths with all Windows path separators exchanged
for posix separators only if copy_only_pipeline is enabled
This is required as copy_only_pipelines are a unique scenario where
the generate job and child pipelines are run on different platforms.
To make this compatible w/ Windows, we cannot write Windows style path separators
that will be consumed by the Posix copy job runner.
TODO (johnwparent): Refactor config + cli read/write to deal only in posix
style paths
"""
if copy_only_pipeline and path:
path = path.replace("\\", "/")
return path
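A stand-alone equivalent of the conversion, for illustration (the include path is made up):

win_style = "concrete_envs\\ci\\spack.yaml"
posix_style = win_style.replace("\\", "/")   # -> "concrete_envs/ci/spack.yaml"
# ensure_expected_target_path only applies this when copy_only_pipeline is enabled.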
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
deprecated_mirror_config = False
buildcache_destination = None
@@ -806,7 +823,7 @@ def generate_gitlab_ci_yaml(
if scope not in include_scopes and scope not in env_includes:
include_scopes.insert(0, scope)
env_includes.extend(include_scopes)
env_yaml_root["spack"]["include"] = env_includes
env_yaml_root["spack"]["include"] = [ensure_expected_target_path(i) for i in env_includes]
if "gitlab-ci" in env_yaml_root["spack"] and "ci" not in env_yaml_root["spack"]:
env_yaml_root["spack"]["ci"] = env_yaml_root["spack"].pop("gitlab-ci")
@@ -1227,6 +1244,9 @@ def main_script_replacements(cmd):
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
"SPACK_REQUIRE_SIGNING": os.environ.get("SPACK_REQUIRE_SIGNING", "False"),
}
output_vars = output_object["variables"]
for item, val in output_vars.items():
output_vars[item] = ensure_expected_target_path(val)
# TODO: Remove this block in Spack 0.23
if deprecated_mirror_config and remote_mirror_override:
@@ -1283,7 +1303,6 @@ def main_script_replacements(cmd):
sorted_output = {}
for output_key, output_value in sorted(output_object.items()):
sorted_output[output_key] = output_value
if known_broken_specs_encountered:
tty.error("This pipeline generated hashes known to be broken on develop:")
display_broken_spec_messages(broken_specs_url, known_broken_specs_encountered)
@@ -1478,6 +1497,12 @@ def copy_test_logs_to_artifacts(test_stage, job_test_dir):
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
def win_quote(quote_str: str) -> str:
if IS_WINDOWS:
quote_str = f'"{quote_str}"'
return quote_str
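Usage sketch (the build name is hypothetical): on Windows the CDash arguments are wrapped in double quotes so PowerShell keeps them as single tokens; elsewhere they pass through unchanged.

quoted = win_quote("Spack CI build")
# Windows: '"Spack CI build"'; other platforms: 'Spack CI build'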
def download_and_extract_artifacts(url, work_dir):
"""Look for gitlab artifacts.zip at the given url, and attempt to download
and extract the contents into the given work_dir
@@ -1500,7 +1525,7 @@ def download_and_extract_artifacts(url, work_dir):
request = Request(url, headers=headers)
request.get_method = lambda: "GET"
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:
@@ -1942,9 +1967,9 @@ def compose_command_err_handling(args):
# but we need to handle EXEs (git, etc) ourselves
catch_exe_failure = (
"""
if ($LASTEXITCODE -ne 0){
throw "Command {} has failed"
}
if ($LASTEXITCODE -ne 0){{
throw 'Command {} has failed'
}}
"""
if IS_WINDOWS
else ""
@@ -2176,13 +2201,13 @@ def __init__(self, ci_cdash):
def args(self):
return [
"--cdash-upload-url",
self.upload_url,
win_quote(self.upload_url),
"--cdash-build",
self.build_name,
win_quote(self.build_name),
"--cdash-site",
self.site,
win_quote(self.site),
"--cdash-buildstamp",
self.build_stamp,
win_quote(self.build_stamp),
]
@property # type: ignore
@@ -2248,7 +2273,7 @@ def create_buildgroup(self, opener, headers, url, group_name, group_type):
request = Request(url, data=enc_data, headers=headers)
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code not in [200, 201]:
@@ -2294,7 +2319,7 @@ def populate_buildgroup(self, job_names):
request = Request(url, data=enc_data, headers=headers)
request.get_method = lambda: "PUT"
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:

View File

@@ -84,7 +84,7 @@ def externals(parser, args):
return
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs, debug_log=tty.debug)
_process_reports(reports)

View File

@@ -13,7 +13,6 @@
import shutil
import sys
import tempfile
import urllib.request
from typing import Dict, List, Optional, Tuple, Union
import llnl.util.tty as tty
@@ -54,6 +53,7 @@
from spack.oci.oci import (
copy_missing_layers_with_retry,
get_manifest_and_config_with_retry,
list_tags,
upload_blob_with_retry,
upload_manifest_with_retry,
)
@@ -856,10 +856,7 @@ def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
def _update_index_oci(image_ref: ImageReference, tmpdir: str, pool: MaybePool) -> None:
request = urllib.request.Request(url=image_ref.tags_url())
response = spack.oci.opener.urlopen(request)
spack.oci.opener.ensure_status(request, response, 200)
tags = json.load(response)["tags"]
tags = list_tags(image_ref)
# Fetch all image config files in parallel
spec_dicts = pool.starmap(

View File

@@ -31,7 +31,6 @@
level = "long"
SPACK_COMMAND = "spack"
MAKE_COMMAND = "make"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
@@ -40,6 +39,12 @@ def deindent(desc):
return desc.replace(" ", "")
def unicode_escape(path: str) -> str:
"""Returns transformed path with any unicode
characters replaced with their corresponding escapes"""
return path.encode("unicode-escape").decode("utf-8")
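For example, with a made-up environment path containing a non-ASCII character:

escaped = unicode_escape("env/münchen")
# escaped == 'env/m\\xfcnchen'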
def setup_parser(subparser):
setup_parser.parser = subparser
subparsers = subparser.add_subparsers(help="CI sub-commands")
@@ -551,75 +556,35 @@ def ci_rebuild(args):
# No hash match anywhere means we need to rebuild spec
# Start with spack arguments
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose"]
spack_cmd = [SPACK_COMMAND, "--color=always", "--backtrace", "--verbose", "install"]
config = cfg.get("config")
if not config["verify_ssl"]:
spack_cmd.append("-k")
install_args = []
install_args = [f'--use-buildcache={spack_ci.win_quote("package:never,dependencies:only")}']
can_verify = spack_ci.can_verify_binaries()
verify_binaries = can_verify and spack_is_pr_pipeline is False
if not verify_binaries:
install_args.append("--no-check-signature")
slash_hash = "/{}".format(job_spec.dag_hash())
# Arguments when installing dependencies from cache
deps_install_args = install_args
slash_hash = spack_ci.win_quote("/" + job_spec.dag_hash())
# Arguments when installing the root from sources
root_install_args = install_args + [
"--keep-stage",
"--only=package",
"--use-buildcache=package:never,dependencies:only",
]
deps_install_args = install_args + ["--only=dependencies"]
root_install_args = install_args + ["--keep-stage", "--only=package"]
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
root_install_args.extend(cdash_handler.args())
root_install_args.append(slash_hash)
# ["x", "y"] -> "'x' 'y'"
args_to_string = lambda args: " ".join("'{}'".format(arg) for arg in args)
commands = [
# apparently there's a race when spack bootstraps? do it up front once
[SPACK_COMMAND, "-e", env.path, "bootstrap", "now"],
[
SPACK_COMMAND,
"-e",
env.path,
"env",
"depfile",
"-o",
"Makefile",
"--use-buildcache=package:never,dependencies:only",
slash_hash, # limit to spec we're building
],
[
# --output-sync requires GNU make 4.x.
# Old make errors when you pass it a flag it doesn't recognize,
# but it doesn't error or warn when you set unrecognized flags in
# this variable.
"export",
"GNUMAKEFLAGS=--output-sync=recurse",
],
[
MAKE_COMMAND,
"SPACK={}".format(args_to_string(spack_cmd)),
"SPACK_COLOR=always",
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
),
],
spack_cmd + ["install"] + root_install_args,
[SPACK_COMMAND, "-e", unicode_escape(env.path), "bootstrap", "now"],
spack_cmd + deps_install_args + [slash_hash],
spack_cmd + root_install_args + [slash_hash],
]
tty.debug("Installing {0} from source".format(job_spec.name))
install_exit_code = spack_ci.process_command("install", commands, repro_dir)

View File

@@ -106,7 +106,8 @@ def clean(parser, args):
# Then do the cleaning falling through the cases
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=True)
specs = spack.cmd.parse_specs(args.specs, concretize=False)
specs = list(spack.cmd.matching_spec_from_env(x) for x in specs)
for spec in specs:
msg = "Cleaning build stage [{0}]"
tty.msg(msg.format(spec.short_spec))

View File

@@ -563,12 +563,13 @@ def add_concretizer_args(subparser):
help="reuse installed packages/buildcaches when possible",
)
subgroup.add_argument(
"--fresh-roots",
"--reuse-deps",
action=ConfigSetAction,
dest="concretizer:reuse",
const="dependencies",
default=None,
help="reuse installed dependencies only",
help="concretize with fresh roots and reused dependencies",
)
subgroup.add_argument(
"--deprecated",

View File

@@ -3,6 +3,9 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.tty as tty
from llnl.string import plural
import spack.cmd
import spack.cmd.common.arguments
import spack.environment as ev
@@ -43,5 +46,9 @@ def concretize(parser, args):
with env.write_transaction():
concretized_specs = env.concretize(force=args.force, tests=tests)
if not args.quiet:
ev.display_specs(concretized_specs)
if concretized_specs:
tty.msg(f"Concretized {plural(len(concretized_specs), 'spec')}:")
ev.display_specs([concrete for _, concrete in concretized_specs])
else:
tty.msg("No new specs to concretize.")
env.write()

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import glob
import os
@@ -11,43 +12,13 @@
import spack.cmd
import spack.paths
import spack.repo
from spack.spec import Spec
from spack.util.editor import editor
import spack.util.editor
description = "open package files in $EDITOR"
section = "packaging"
level = "short"
def edit_package(name, repo_path, namespace):
"""Opens the requested package file in your favorite $EDITOR.
Args:
name (str): The name of the package
repo_path (str): The path to the repository containing this package
namespace (str): A valid namespace registered with Spack
"""
# Find the location of the package
if repo_path:
repo = spack.repo.Repo(repo_path)
elif namespace:
repo = spack.repo.PATH.get_repo(namespace)
else:
repo = spack.repo.PATH
path = repo.filename_for_package_name(name)
spec = Spec(name)
if os.path.exists(path):
if not os.path.isfile(path):
tty.die("Something is wrong. '{0}' is not a file!".format(path))
if not os.access(path, os.R_OK):
tty.die("Insufficient permissions on '%s'!" % path)
else:
raise spack.repo.UnknownPackageError(spec.name)
editor(path)
def setup_parser(subparser):
excl_args = subparser.add_mutually_exclusive_group()
@@ -98,41 +69,67 @@ def setup_parser(subparser):
excl_args.add_argument("-r", "--repo", default=None, help="path to repo to edit package in")
excl_args.add_argument("-N", "--namespace", default=None, help="namespace of package to edit")
subparser.add_argument("package", nargs="?", default=None, help="package name")
subparser.add_argument("package", nargs="*", default=None, help="package name")
def locate_package(name: str, repo: spack.repo.Repo) -> str:
path = repo.filename_for_package_name(name)
try:
with open(path, "r"):
return path
except OSError as e:
if e.errno == errno.ENOENT:
raise spack.repo.UnknownPackageError(name) from e
tty.die(f"Cannot edit package: {e}")
def locate_file(name: str, path: str) -> str:
# convert command names to python module name
if path == spack.paths.command_path:
name = spack.cmd.python_name(name)
file_path = os.path.join(path, name)
# Try to open direct match.
try:
with open(file_path, "r"):
return file_path
except OSError as e:
if e.errno != errno.ENOENT:
tty.die(f"Cannot edit file: {e}")
pass
# Otherwise try to find a file that starts with the name
candidates = glob.glob(file_path + "*")
exclude_list = [".pyc", "~"] # exclude binaries and backups
files = [f for f in candidates if not any(f.endswith(ext) for ext in exclude_list)]
if len(files) > 1:
tty.die(
f"Multiple files start with `{name}`:\n"
+ "\n".join(f" {os.path.basename(f)}" for f in files)
)
elif not files:
tty.die(f"No file for '{name}' was found in {path}")
return files[0]
def edit(parser, args):
name = args.package
# By default, edit package files
path = spack.paths.packages_path
names = args.package
# If `--command`, `--test`, or `--module` is chosen, edit those instead
if args.path:
path = args.path
if name:
# convert command names to python module name
if path == spack.paths.command_path:
name = spack.cmd.python_name(name)
path = os.path.join(path, name)
if not os.path.exists(path):
files = glob.glob(path + "*")
exclude_list = [".pyc", "~"] # exclude binaries and backups
files = list(filter(lambda x: all(s not in x for s in exclude_list), files))
if len(files) > 1:
m = "Multiple files exist with the name {0}.".format(name)
m += " Please specify a suffix. Files are:\n\n"
for f in files:
m += " " + os.path.basename(f) + "\n"
tty.die(m)
if not files:
tty.die("No file for '{0}' was found in {1}".format(name, path))
path = files[0] # already confirmed only one entry in files
editor(path)
elif name:
edit_package(name, args.repo, args.namespace)
paths = [locate_file(name, args.path) for name in names] if names else [args.path]
spack.util.editor.editor(*paths)
elif names:
if args.repo:
repo = spack.repo.Repo(args.repo)
elif args.namespace:
repo = spack.repo.PATH.get_repo(args.namespace)
else:
repo = spack.repo.PATH
paths = [locate_package(name, repo) for name in names]
spack.util.editor.editor(*paths)
else:
# By default open the directory where packages live
editor(path)
spack.util.editor.editor(spack.paths.packages_path)


@@ -10,13 +10,13 @@
import sys
import tempfile
from pathlib import Path
from typing import Optional
from typing import List, Optional
import llnl.string as string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
from llnl.util.tty.color import cescape, colorize
import spack.cmd
import spack.cmd.common
@@ -61,14 +61,7 @@
#
def env_create_setup_parser(subparser):
"""create a new environment"""
subparser.add_argument(
"env_name",
metavar="env",
help=(
"name of managed environment or directory of the anonymous env "
"(when using --dir/-d) to activate"
),
)
subparser.add_argument("env_name", metavar="env", help="name or directory of environment")
subparser.add_argument(
"-d", "--dir", action="store_true", help="create an environment in a specific directory"
)
@@ -94,6 +87,9 @@ def env_create_setup_parser(subparser):
default=None,
help="either a lockfile (must end with '.json' or '.lock') or a manifest file",
)
subparser.add_argument(
"--include-concrete", action="append", help="name of old environment to copy specs from"
)
def env_create(args):
@@ -111,19 +107,32 @@ def env_create(args):
# the environment should not include a view.
with_view = None
include_concrete = None
if hasattr(args, "include_concrete"):
include_concrete = args.include_concrete
env = _env_create(
args.env_name,
init_file=args.envfile,
dir=args.dir,
dir=args.dir or os.path.sep in args.env_name or args.env_name in (".", ".."),
with_view=with_view,
keep_relative=args.keep_relative,
include_concrete=include_concrete,
)
# Generate views, only really useful for environments created from spack.lock files.
env.regenerate_views()
def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
def _env_create(
name_or_path: str,
*,
init_file: Optional[str] = None,
dir: bool = False,
with_view: Optional[str] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
):
"""Create a new environment, with an optional yaml description.
Arguments:
@@ -135,22 +144,31 @@ def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep
keep_relative (bool): if True, develop paths are copied verbatim into
the new environment file, otherwise they may be made absolute if the
new environment is in a different location
include_concrete (list): list of the included concrete environments
"""
if not dir:
env = ev.create(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg("Created environment '%s' in %s" % (name_or_path, env.path))
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % (name_or_path))
return env
env = ev.create_in_dir(
name_or_path, init_file=init_file, with_view=with_view, keep_relative=keep_relative
)
tty.msg("Created environment in %s" % env.path)
tty.msg("You can activate this environment with:")
tty.msg(" spack env activate %s" % env.path)
tty.msg(
colorize(
f"Created environment @c{{{cescape(name_or_path)}}} in: @c{{{cescape(env.path)}}}"
)
)
else:
env = ev.create_in_dir(
name_or_path,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
tty.msg(colorize(f"Created independent environment in: @c{{{cescape(env.path)}}}"))
tty.msg(f"Activate with: {colorize(f'@c{{spack env activate {cescape(name_or_path)}}}')}")
return env
@@ -436,6 +454,12 @@ def env_remove_setup_parser(subparser):
"""remove an existing environment"""
subparser.add_argument("rm_env", metavar="env", nargs="+", help="environment(s) to remove")
arguments.add_common_arguments(subparser, ["yes_to_all"])
subparser.add_argument(
"-f",
"--force",
action="store_true",
help="remove the environment even if it is included in another environment",
)
def env_remove(args):
@@ -445,13 +469,35 @@ def env_remove(args):
and manifests embedded in repositories should be removed manually.
"""
read_envs = []
valid_envs = []
bad_envs = []
for env_name in args.rm_env:
invalid_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
read_envs.append(env)
valid_envs.append(env_name)
if env_name in args.rm_env:
read_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
bad_envs.append(env_name)
invalid_envs.append(env_name)
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if env is linked to another before trying to remove
for name in valid_envs:
# don't check if environment is included to itself
if name == env_name:
continue
environ = ev.Environment(ev.root(name))
if ev.root(env_name) in environ.included_concrete_envs:
msg = f'Environment "{env_name}" is being used by environment "{name}"'
if args.force:
tty.warn(msg)
else:
tty.die(msg)
if not args.yes_to_all:
environments = string.plural(len(args.rm_env), "environment", show_n=False)

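For orientation, a minimal sketch of how the new `--include-concrete` option reaches the environment API; the environment name and path below are hypothetical:

import spack.environment as ev

# `spack env create combined --include-concrete <env>` ends up calling:
env = ev.create(
    "combined",                                   # new managed environment
    include_concrete=["/path/to/existing/env"],   # already-concretized environments to pull in
)
env.regenerate_views()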

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import copy
import sys
import llnl.util.lang
@@ -271,6 +272,27 @@ def root_decorator(spec, string):
print()
if env.included_concrete_envs:
tty.msg("Included specs")
# Root specs cannot be displayed with prefixes, since those are not
# set for abstract specs. Same for hashes
root_args = copy.copy(args)
root_args.paths = False
# Roots are displayed with variants, etc. so that we can see
# specifically what the user asked for.
cmd.display_specs(
env.included_user_specs,
root_args,
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespace=True,
show_flags=True,
show_full_compiler=True,
variants=True,
)
print()
if args.show_concretized:
tty.msg("Concretized roots")
cmd.display_specs(env.specs_by_hash.values(), args, decorator=decorator)


@@ -50,7 +50,7 @@
@B{++}, @r{--}, @r{~~}, @B{==} propagate variants to package dependencies
architecture variants:
@m{platform=platform} linux, darwin, cray, etc.
@m{platform=platform} linux, darwin, freebsd, windows
@m{os=operating_system} specific <operating_system>
@m{target=target} specific <target> processor
@m{arch=platform-os-target} shortcut for all three above


@@ -10,6 +10,7 @@
from typing import List
import llnl.util.filesystem as fs
from llnl.string import plural
from llnl.util import lang, tty
import spack.build_environment
@@ -61,7 +62,6 @@ def install_kwargs_from_args(args):
"dependencies_use_cache": cache_opt(args.use_cache, dep_use_bc),
"dependencies_cache_only": cache_opt(args.cache_only, dep_use_bc),
"include_build_deps": args.include_build_deps,
"explicit": True, # Use true as a default for install command
"stop_at": args.until,
"unsigned": args.unsigned,
"install_deps": ("dependencies" in args.things_to_install),
@@ -376,7 +376,9 @@ def _maybe_add_and_concretize(args, env, specs):
# `spack concretize`
tests = compute_tests_install_kwargs(env.user_specs, args.test)
concretized_specs = env.concretize(tests=tests)
ev.display_specs(concretized_specs)
if concretized_specs:
tty.msg(f"Concretized {plural(len(concretized_specs), 'spec')}")
ev.display_specs([concrete for _, concrete in concretized_specs])
# save view regeneration for later, so that we only do it
# once, as it can be slow.
@@ -473,6 +475,7 @@ def install_without_active_env(args, install_kwargs, reporter_factory):
require_user_confirmation_for_overwrite(concrete_specs, args)
install_kwargs["overwrite"] = [spec.dag_hash() for spec in concrete_specs]
installs = [(s.package, install_kwargs) for s in concrete_specs]
builder = PackageInstaller(installs)
installs = [s.package for s in concrete_specs]
install_kwargs["explicit"] = [s.dag_hash() for s in concrete_specs]
builder = PackageInstaller(installs, install_kwargs)
builder.install()


@@ -114,15 +114,16 @@ def _process_result(result, show, required_format, kwargs):
# dump the solutions as concretized specs
if "solutions" in show:
for spec in result.specs:
# With -y, just print YAML to output.
if required_format == "yaml":
# use write because to_yaml already has a newline.
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif required_format == "json":
sys.stdout.write(spec.to_json(hash=ht.dag_hash))
else:
sys.stdout.write(spec.tree(color=sys.stdout.isatty(), **kwargs))
if required_format:
for spec in result.specs:
# With -y, just print YAML to output.
if required_format == "yaml":
# use write because to_yaml already has a newline.
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif required_format == "json":
sys.stdout.write(spec.to_json(hash=ht.dag_hash))
else:
sys.stdout.write(spack.spec.tree(result.specs, color=sys.stdout.isatty(), **kwargs))
print()
if result.unsolved_specs and "solutions" in show:


@@ -105,11 +105,19 @@ def spec(parser, args):
if env:
env.concretize()
specs = env.concretized_specs()
# environments are printed together in a combined tree() invocation,
# except when using --yaml or --json, which we print spec by spec below.
if not args.format:
tree_kwargs["key"] = spack.traverse.by_dag_hash
tree_kwargs["hashes"] = args.long or args.very_long
print(spack.spec.tree([concrete for _, concrete in specs], **tree_kwargs))
return
else:
tty.die("spack spec requires at least one spec or an active environment")
for input, output in specs:
# With -y, just print YAML to output.
# With --yaml or --json, just print the raw specs to output
if args.format:
if args.format == "yaml":
# use write because to_yaml already has a newline.


@@ -23,7 +23,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.21"
tutorial_branch = "releases/v0.22"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")


@@ -151,7 +151,8 @@ def is_installed(spec):
key=lambda s: s.dag_hash(),
)
return [spec for spec in specs if is_installed(spec)]
with spack.store.STORE.db.read_transaction():
return [spec for spec in specs if is_installed(spec)]
def dependent_environments(
@@ -239,6 +240,8 @@ def get_uninstall_list(args, specs: List[spack.spec.Spec], env: Optional[ev.Envi
print()
tty.info("The following environments still reference these specs:")
colify([e.name for e in other_dependent_envs.keys()], indent=4)
if env:
msgs.append("use `spack remove` to remove the spec from the current environment")
msgs.append("use `spack env remove` to remove environments")
msgs.append("use `spack uninstall --force` to override")
print()


@@ -214,8 +214,6 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
# Note: skip on windows here because for the moment,
# clingo is wholly unsupported from bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:


@@ -38,10 +38,10 @@
import spack.cmd
import spack.environment as ev
import spack.filesystem_view as fsv
import spack.schema.projections
import spack.store
from spack.config import validate
from spack.filesystem_view import YamlFilesystemView, view_func_parser
from spack.util import spack_yaml as s_yaml
description = "project packages to a compact naming scheme on the filesystem"
@@ -193,17 +193,13 @@ def view(parser, args):
ordered_projections = {}
# What method are we using for this view
if args.action in actions_link:
link_fn = view_func_parser(args.action)
else:
link_fn = view_func_parser("symlink")
view = YamlFilesystemView(
link_type = args.action if args.action in actions_link else "symlink"
view = fsv.YamlFilesystemView(
path,
spack.store.STORE.layout,
projections=ordered_projections,
ignore_conflicts=getattr(args, "ignore_conflicts", False),
link=link_fn,
link_type=link_type,
verbose=args.verbose,
)


@@ -20,6 +20,7 @@
import spack.compilers
import spack.error
import spack.schema.environment
import spack.spec
import spack.util.executable
import spack.util.libc
@@ -683,8 +684,8 @@ def __str__(self):
@contextlib.contextmanager
def compiler_environment(self):
# yield immediately if no modules
if not self.modules:
# Avoid modifying os.environ if possible.
if not self.modules and not self.environment:
yield
return
@@ -694,20 +695,12 @@ def compiler_environment(self):
try:
# load modules and set env variables
for module in self.modules:
# On cray, mic-knl module cannot be loaded without cce module
# See: https://github.com/spack/spack/issues/3153
if os.environ.get("CRAY_CPU_TARGET") == "mic-knl":
spack.util.module_cmd.load_module("cce")
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
env = spack.util.environment.EnvironmentModifications()
env.extend(spack.schema.environment.parse(self.environment))
env.apply_modifications()
spack.schema.environment.parse(self.environment).apply_modifications()
yield
except BaseException:
raise
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()


@@ -164,43 +164,66 @@ def _compiler_config_from_package_config(config):
def _compiler_config_from_external(config):
extra_attributes_key = "extra_attributes"
compilers_key = "compilers"
c_key, cxx_key, fortran_key = "c", "cxx", "fortran"
# Allow `@x.y.z` instead of `@=x.y.z`
spec = spack.spec.parse_with_version_concrete(config["spec"])
# use str(spec.versions) to allow `@x.y.z` instead of `@=x.y.z`
compiler_spec = spack.spec.CompilerSpec(
package_name_to_compiler_name.get(spec.name, spec.name), spec.version
)
extra_attributes = config.get("extra_attributes", {})
prefix = config.get("prefix", None)
err_header = f"The external spec '{spec}' cannot be used as a compiler"
compiler_class = class_for_compiler_name(compiler_spec.name)
paths = extra_attributes.get("paths", {})
compiler_langs = ["cc", "cxx", "fc", "f77"]
for lang in compiler_langs:
if paths.setdefault(lang, None):
continue
if not prefix:
continue
# Check for files that satisfy the naming scheme for this compiler
bindir = os.path.join(prefix, "bin")
for f, regex in itertools.product(os.listdir(bindir), compiler_class.search_regexps(lang)):
if regex.match(f):
paths[lang] = os.path.join(bindir, f)
if all(v is None for v in paths.values()):
# If extra_attributes is not there I might not want to use this entry as a compiler,
# therefore just leave a debug message, but don't be loud with a warning.
if extra_attributes_key not in config:
tty.debug(f"[{__file__}] {err_header}: missing the '{extra_attributes_key}' key")
return None
extra_attributes = config[extra_attributes_key]
# If I have 'extra_attributes' warn if 'compilers' is missing, or we don't have a C compiler
if compilers_key not in extra_attributes:
warnings.warn(
f"{err_header}: missing the '{compilers_key}' key under '{extra_attributes_key}'"
)
return None
attribute_compilers = extra_attributes[compilers_key]
if c_key not in attribute_compilers:
warnings.warn(
f"{err_header}: missing the C compiler path under "
f"'{extra_attributes_key}:{compilers_key}'"
)
return None
c_compiler = attribute_compilers[c_key]
# C++ and Fortran compilers are not mandatory, so let's just leave a debug trace
if cxx_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a C++ compiler")
if fortran_key not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a Fortran compiler")
# compilers format has cc/fc/f77, externals format has "c/fortran"
paths = {
"cc": c_compiler,
"cxx": attribute_compilers.get(cxx_key, None),
"fc": attribute_compilers.get(fortran_key, None),
"f77": attribute_compilers.get(fortran_key, None),
}
if not spec.architecture:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target").microarchitecture
else:
target = spec.target
target = spec.architecture.target
if not target:
host_platform = spack.platforms.host()
target = host_platform.target("default_target").microarchitecture
target = spack.platforms.host().target("default_target")
target = target.microarchitecture
operating_system = spec.os
if not operating_system:

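To make the new external-compiler handling above easier to follow, here is a hedged example of the entry that _compiler_config_from_external now expects, written as the Python dict the function receives; the spec, prefix, and paths are made up:

external_config = {
    "spec": "llvm@17.0.6",
    "prefix": "/opt/llvm-17.0.6",
    "extra_attributes": {
        "compilers": {
            # keys are "c"/"cxx"/"fortran"; they are remapped to the classic
            # cc/cxx/fc/f77 layout, with the fortran path reused for fc and f77
            "c": "/opt/llvm-17.0.6/bin/clang",
            "cxx": "/opt/llvm-17.0.6/bin/clang++",
            "fortran": "/opt/llvm-17.0.6/bin/flang",
        }
    },
}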

@@ -96,6 +96,8 @@ def verbose_flag(self):
openmp_flag = "-fopenmp"
# C++ flags based on CMake Modules/Compiler/Clang.cmake
@property
def cxx11_flag(self):
if self.real_version < Version("3.3"):
@@ -120,6 +122,24 @@ def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("5.0"):
raise UnsupportedCompilerFlag(self, "the C++20 standard", "cxx20_flag", "< 5.0")
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("12.0"):
raise UnsupportedCompilerFlag(self, "the C++23 standard", "cxx23_flag", "< 12.0")
elif self.real_version < Version("17.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
return "-std=c99"
@@ -142,7 +162,10 @@ def c17_flag(self):
def c23_flag(self):
if self.real_version < Version("9.0"):
raise UnsupportedCompilerFlag(self, "the C23 standard", "c23_flag", "< 9.0")
return "-std=c2x"
elif self.real_version < Version("18.0"):
return "-std=c2x"
else:
return "-std=c23"
@property
def cc_pic_flag(self):


@@ -74,6 +74,10 @@ class Concretizer:
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
#: Packages that the old concretizer cannot deal with correctly, and cannot build anyway.
#: Those will not be considered as providers for virtuals.
non_buildable_packages = {"glibc", "musl"}
def __init__(self, abstract_spec=None):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
@@ -113,7 +117,11 @@ def _valid_virtuals_and_externals(self, spec):
pref_key = lambda spec: 0 # no-op pref key
if spec.virtual:
candidates = spack.repo.PATH.providers_for(spec)
candidates = [
s
for s in spack.repo.PATH.providers_for(spec)
if s.name not in self.non_buildable_packages
]
if not candidates:
raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)


@@ -34,28 +34,6 @@
"image": "docker.io/fedora:39"
}
},
"fedora:38": {
"bootstrap": {
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:38"
},
"os_package_manager": "dnf",
"build": "spack/fedora38",
"final": {
"image": "docker.io/fedora:38"
}
},
"fedora:37": {
"bootstrap": {
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:37"
},
"os_package_manager": "dnf",
"build": "spack/fedora37",
"final": {
"image": "docker.io/fedora:37"
}
},
"rockylinux:9": {
"bootstrap": {
"template": "container/rockylinux_9.dockerfile",
@@ -138,6 +116,13 @@
},
"os_package_manager": "apt"
},
"ubuntu:24.04": {
"bootstrap": {
"template": "container/ubuntu_2404.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-noble"
},
"ubuntu:22.04": {
"bootstrap": {
"template": "container/ubuntu_2204.dockerfile"
@@ -151,13 +136,6 @@
},
"build": "spack/ubuntu-focal",
"os_package_manager": "apt"
},
"ubuntu:18.04": {
"bootstrap": {
"template": "container/ubuntu_1804.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-bionic"
}
},
"os_package_managers": {


@@ -662,6 +662,7 @@ def _execute_redistribute(
@directive(("extendees", "dependencies"))
def extends(spec, when=None, type=("build", "run"), patches=None):
"""Same as depends_on, but also adds this package to the extendee list.
In case of Python, also adds a dependency on python-venv.
keyword arguments can be passed to extends() so that extension
packages can pass parameters to the extendee's extension
@@ -677,6 +678,11 @@ def _execute_extends(pkg):
_depends_on(pkg, spec, when=when, type=type, patches=patches)
spec_obj = spack.spec.Spec(spec)
# When extending python, also add a dependency on python-venv. This is done so that
# Spack environment views are Python virtual environments.
if spec_obj.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, "python-venv", when=when, type=("build", "run"))
# TODO: the values of the extendees dictionary are not used. Remove in next refactor.
pkg.extendees[spec_obj.name] = (spec_obj, None)

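A hedged illustration of the directive change above, using a made-up package recipe: after this change, extending python implies a build/run dependency on python-venv unless the package is python-venv itself.

from spack.package import *

class ExampleExtension(Package):  # hypothetical package, for illustration only
    homepage = "https://example.com"
    url = "https://example.com/example-1.0.tar.gz"
    version("1.0", sha256="0" * 64)  # placeholder checksum
    extends("python")  # now also adds depends_on("python-venv", type=("build", "run"))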

@@ -15,6 +15,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.config
import spack.hash_types as ht
@@ -181,7 +182,7 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
base_dir = (
self.path_for_spec(deprecator_spec)
if deprecator_spec
else os.readlink(deprecated_spec.prefix)
else readlink(deprecated_spec.prefix)
)
yaml_path = os.path.join(


@@ -34,6 +34,9 @@
* ``spec``: a string representation of the abstract spec that was concretized
4. ``concrete_specs``: a dictionary containing the specs in the environment.
5. ``include_concrete`` (dictionary): an optional dictionary that includes the roots
and concrete specs from the included environments, keyed by the path to that
environment
Compatibility
-------------
@@ -50,26 +53,37 @@
- ``v2``
- ``v3``
- ``v4``
- ``v5``
* - ``v0.12:0.14``
-
-
-
-
-
* - ``v0.15:0.16``
-
-
-
-
-
* - ``v0.17``
-
-
-
-
-
* - ``v0.18:``
-
-
-
-
-
* - ``v0.22:``
-
-
-
-
-
Version 1
---------
@@ -334,6 +348,118 @@
}
}
}
Version 5
---------
Version 5 doesn't change the top-level lockfile format, but adds an optional
``include_concrete`` dictionary holding the ``roots`` and ``concrete_specs`` of the
included environments, keyed by the path to each environment. Since the entry is
optional, ``include_concrete`` is omitted from the lockfile when the environment has
no included environments.
.. code-block:: json
{
"_meta": {
"file-type": "spack-lockfile",
"lockfile-version": 5,
"specfile-version": 3
},
"roots": [
{
"hash": "<dag_hash 1>",
"spec": "<abstract spec 1>"
},
{
"hash": "<dag_hash 2>",
"spec": "<abstract spec 2>"
}
],
"concrete_specs": {
"<dag_hash 1>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_1",
"hash": "<dag_hash for depname_1>",
"type": ["build", "link"]
},
{
"name": "depname_2",
"hash": "<dag_hash for depname_2>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 1>",
},
"<daghash 2>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_3",
"hash": "<dag_hash for depname_3>",
"type": ["build", "link"]
},
{
"name": "depname_4",
"hash": "<dag_hash for depname_4>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 2>"
}
},
"include_concrete": {
"<path to environment>": {
"roots": [
{
"hash": "<dag_hash 1>",
"spec": "<abstract spec 1>"
},
{
"hash": "<dag_hash 2>",
"spec": "<abstract spec 2>"
}
],
"concrete_specs": {
"<dag_hash 1>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_1",
"hash": "<dag_hash for depname_1>",
"type": ["build", "link"]
},
{
"name": "depname_2",
"hash": "<dag_hash for depname_2>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 1>",
},
"<daghash 2>": {
"... <spec dict attributes> ...": { },
"dependencies": [
{
"name": "depname_3",
"hash": "<dag_hash for depname_3>",
"type": ["build", "link"]
},
{
"name": "depname_4",
"hash": "<dag_hash for depname_4>",
"type": ["build", "link"]
}
],
"hash": "<dag_hash 2>"
}
}
}
}
}
"""
from .environment import (

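As a reading aid for the format described above, a small sketch (not a Spack API) that walks the optional include_concrete section of a version-5 lockfile; the file name is assumed:

import json

with open("spack.lock") as f:
    lock = json.load(f)

# include_concrete maps the path of each included environment to its roots and
# concrete specs, and is simply absent when nothing is included
for env_path, data in lock.get("include_concrete", {}).items():
    root_specs = [r["spec"] for r in data["roots"]]
    print(f"{env_path}: {len(root_specs)} root spec(s)")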

@@ -16,20 +16,22 @@
import urllib.parse
import urllib.request
import warnings
from typing import Dict, Iterable, List, Optional, Set, Tuple, Union
from typing import Any, Dict, Iterable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import symlink
from llnl.util.symlink import readlink, symlink
import spack.cmd
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.error
import spack.fetch_strategy
import spack.filesystem_view as fsv
import spack.hash_types as ht
import spack.hooks
import spack.main
@@ -52,7 +54,6 @@
import spack.util.url
import spack.version
from spack import traverse
from spack.filesystem_view import SimpleFilesystemView, inverse_view_func_parser, view_func_parser
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
from spack.spec import Spec
@@ -159,6 +160,8 @@ def default_manifest_yaml():
default_view_name = "default"
# Default behavior to link all packages into views (vs. only root packages)
default_view_link = "all"
# The name for any included concrete specs
included_concrete_name = "include_concrete"
def installed_specs():
@@ -293,6 +296,7 @@ def create(
init_file: Optional[Union[str, pathlib.Path]] = None,
with_view: Optional[Union[str, pathlib.Path, bool]] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
) -> "Environment":
"""Create a managed environment in Spack and returns it.
@@ -309,10 +313,15 @@ def create(
string, it specifies the path to the view
keep_relative: if True, develop paths are copied verbatim into the new environment file,
otherwise they are made absolute
include_concrete: list of concrete environment names/paths to be included
"""
environment_dir = environment_dir_from_name(name, exists_ok=False)
return create_in_dir(
environment_dir, init_file=init_file, with_view=with_view, keep_relative=keep_relative
environment_dir,
init_file=init_file,
with_view=with_view,
keep_relative=keep_relative,
include_concrete=include_concrete,
)
@@ -321,6 +330,7 @@ def create_in_dir(
init_file: Optional[Union[str, pathlib.Path]] = None,
with_view: Optional[Union[str, pathlib.Path, bool]] = None,
keep_relative: bool = False,
include_concrete: Optional[List[str]] = None,
) -> "Environment":
"""Create an environment in the directory passed as input and returns it.
@@ -334,6 +344,7 @@ def create_in_dir(
string, it specifies the path to the view
keep_relative: if True, develop paths are copied verbatim into the new environment file,
otherwise they are made absolute
include_concrete: concrete environment names/paths to be included
"""
initialize_environment_dir(root, envfile=init_file)
@@ -346,6 +357,12 @@ def create_in_dir(
if with_view is not None:
manifest.set_default_view(with_view)
if include_concrete is not None:
set_included_envs_to_env_paths(include_concrete)
validate_included_envs_exists(include_concrete)
validate_included_envs_concrete(include_concrete)
manifest.set_include_concrete(include_concrete)
manifest.flush()
except (spack.config.ConfigFormatError, SpackEnvironmentConfigError) as e:
@@ -419,6 +436,67 @@ def ensure_env_root_path_exists():
fs.mkdirp(env_root_path())
def set_included_envs_to_env_paths(include_concrete: List[str]) -> None:
"""If the included environment(s) is the environment name
it is replaced by the path to the environment
Args:
include_concrete: list of env name or path to env"""
for i, env_name in enumerate(include_concrete):
if is_env_dir(env_name):
include_concrete[i] = env_name
elif exists(env_name):
include_concrete[i] = root(env_name)
def validate_included_envs_exists(include_concrete: List[str]) -> None:
"""Checks that all of the included environments exist
Args:
include_concrete: list of already existing concrete environments to include
Raises:
SpackEnvironmentError: if any of the included environments do not exist
"""
missing_envs = set()
for i, env_name in enumerate(include_concrete):
if not is_env_dir(env_name):
missing_envs.add(env_name)
if missing_envs:
msg = "The following environment(s) are missing: {0}".format(", ".join(missing_envs))
raise SpackEnvironmentError(msg)
def validate_included_envs_concrete(include_concrete: List[str]) -> None:
"""Checks that all of the included environments are concrete
Args:
include_concrete: list of already existing concrete environments to include
Raises:
SpackEnvironmentError: if any of the included environments are not concrete
"""
non_concrete_envs = set()
for env_path in include_concrete:
if not os.path.exists(Environment(env_path).lock_path):
non_concrete_envs.add(Environment(env_path).name)
if non_concrete_envs:
msg = "The following environment(s) are not concrete: {0}\n" "Please run:".format(
", ".join(non_concrete_envs)
)
for env in non_concrete_envs:
msg += f"\n\t`spack -e {env} concretize`"
raise SpackEnvironmentError(msg)
def all_environment_names():
"""List the names of environments that currently exist."""
# just return empty if the env path does not exist. A read-only
@@ -529,7 +607,7 @@ def __init__(
self.projections = projections
self.select = select
self.exclude = exclude
self.link_type = view_func_parser(link_type)
self.link_type = fsv.canonicalize_link_type(link_type)
self.link = link
def select_fn(self, spec):
@@ -563,7 +641,7 @@ def to_dict(self):
if self.exclude:
ret["exclude"] = self.exclude
if self.link_type:
ret["link_type"] = inverse_view_func_parser(self.link_type)
ret["link_type"] = self.link_type
if self.link != default_view_link:
ret["link"] = self.link
return ret
@@ -585,7 +663,7 @@ def _current_root(self):
if not os.path.islink(self.root):
return None
root = os.readlink(self.root)
root = readlink(self.root)
if os.path.isabs(root):
return root
@@ -613,7 +691,7 @@ def get_projection_for_spec(self, spec):
to exist on the filesystem."""
return self._view(self.root).get_projection_for_spec(spec)
def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
def view(self, new: Optional[str] = None) -> fsv.SimpleFilesystemView:
"""
Returns a view object for the *underlying* view directory. This means that the
self.root symlink is followed, and that the view has to exist on the filesystem
@@ -633,14 +711,14 @@ def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
)
return self._view(path)
def _view(self, root: str) -> SimpleFilesystemView:
def _view(self, root: str) -> fsv.SimpleFilesystemView:
"""Returns a view object for a given root dir."""
return SimpleFilesystemView(
return fsv.SimpleFilesystemView(
root,
spack.store.STORE.layout,
ignore_conflicts=True,
projections=self.projections,
link=self.link_type,
link_type=self.link_type,
)
def __contains__(self, spec):
@@ -821,6 +899,18 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
self.specs_by_hash: Dict[str, Spec] = {}
#: Repository for this environment (memoized)
self._repo = None
#: Environment paths for concrete (lockfile) included environments
self.included_concrete_envs: List[str] = []
#: First-level included concretized spec data from/to the lockfile.
self.included_concrete_spec_data: Dict[str, Dict[str, List[str]]] = {}
#: User specs from included environments from the last concretization
self.included_concretized_user_specs: Dict[str, List[Spec]] = {}
#: Roots from included environments with the last concretization, in order
self.included_concretized_order: Dict[str, List[str]] = {}
#: Concretized specs by hash from the included environments
self.included_specs_by_hash: Dict[str, Dict[str, Spec]] = {}
#: Previously active environment
self._previous_active = None
self._dev_specs = None
@@ -858,7 +948,7 @@ def _read(self):
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
read_lock_version = self._read_lockfile(f)
read_lock_version = self._read_lockfile(f)["_meta"]["lockfile-version"]
if read_lock_version == 1:
tty.debug(f"Storing backup of {self.lock_path} at {self._lock_backup_v1_path}")
@@ -926,6 +1016,20 @@ def add_view(name, values):
if self.views == dict():
self.views[default_view_name] = ViewDescriptor(self.path, self.view_path_default)
def _process_concrete_includes(self):
"""Extract and load into memory included concrete spec data."""
self.included_concrete_envs = self.manifest[TOP_LEVEL_KEY].get(included_concrete_name, [])
if self.included_concrete_envs:
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
data = self._read_lockfile(f)
if included_concrete_name in data:
self.included_concrete_spec_data = data[included_concrete_name]
else:
self.include_concrete_envs()
def _construct_state_from_manifest(self):
"""Set up user specs and views from the manifest file."""
self.spec_lists = collections.OrderedDict()
@@ -942,6 +1046,31 @@ def _construct_state_from_manifest(self):
self.spec_lists[user_speclist_name] = user_specs
self._process_view(spack.config.get("view", True))
self._process_concrete_includes()
def all_concretized_user_specs(self) -> List[Spec]:
"""Returns all of the concretized user specs of the environment and
its included environment(s)."""
concretized_user_specs = self.concretized_user_specs[:]
for included_specs in self.included_concretized_user_specs.values():
for included in included_specs:
# Don't duplicate included spec(s)
if included not in concretized_user_specs:
concretized_user_specs.append(included)
return concretized_user_specs
def all_concretized_orders(self) -> List[str]:
"""Returns all of the concretized order of the environment and
its included environment(s)."""
concretized_order = self.concretized_order[:]
for included_concretized_order in self.included_concretized_order.values():
for included in included_concretized_order:
# Don't duplicate included spec(s)
if included not in concretized_order:
concretized_order.append(included)
return concretized_order
@property
def user_specs(self):
@@ -966,6 +1095,26 @@ def _read_dev_specs(self):
dev_specs[name] = local_entry
return dev_specs
@property
def included_user_specs(self) -> SpecList:
"""Included concrete user (or root) specs from last concretization."""
spec_list = SpecList()
if not self.included_concrete_envs:
return spec_list
def add_root_specs(included_concrete_specs):
# add specs from the include *and* any nested includes it may have
for env, info in included_concrete_specs.items():
for root_list in info["roots"]:
spec_list.add(root_list["spec"])
if "include_concrete" in info:
add_root_specs(info["include_concrete"])
add_root_specs(self.included_concrete_spec_data)
return spec_list
def clear(self, re_read=False):
"""Clear the contents of the environment
@@ -977,9 +1126,15 @@ def clear(self, re_read=False):
self.spec_lists[user_speclist_name] = SpecList()
self._dev_specs = {}
self.concretized_user_specs = [] # user specs from last concretize
self.concretized_order = [] # roots of last concretize, in order
self.concretized_user_specs = [] # user specs from last concretize
self.specs_by_hash = {} # concretized specs by hash
self.included_concrete_spec_data = {} # concretized specs from lockfile of included envs
self.included_concretized_order = {} # root specs of the included envs, keyed by env path
self.included_concretized_user_specs = {} # user specs from last concretize's included env
self.included_specs_by_hash = {} # concretized specs by hash from the included envs
self.invalidate_repository_cache()
self._previous_active = None # previously active environment
if not re_read:
@@ -1033,6 +1188,55 @@ def scope_name(self):
"""Name of the config scope of this environment's manifest file."""
return self.manifest.scope_name
def include_concrete_envs(self):
"""Copy and save the included envs' specs internally"""
lockfile_meta = None
root_hash_seen = set()
concrete_hash_seen = set()
self.included_concrete_spec_data = {}
for env_path in self.included_concrete_envs:
# Check that environment exists
if not is_env_dir(env_path):
raise SpackEnvironmentError(f"Unable to find env at {env_path}")
env = Environment(env_path)
with open(env.lock_path) as f:
lockfile_as_dict = env._read_lockfile(f)
# Lockfile_meta must match each env and use at least format version 5
if lockfile_meta is None:
lockfile_meta = lockfile_as_dict["_meta"]
elif lockfile_meta != lockfile_as_dict["_meta"]:
raise SpackEnvironmentError("All lockfile _meta values must match")
elif lockfile_meta["lockfile-version"] < 5:
raise SpackEnvironmentError("The lockfile format must be at version 5 or higher")
# Copy unique root specs from env
self.included_concrete_spec_data[env_path] = {"roots": []}
for root_dict in lockfile_as_dict["roots"]:
if root_dict["hash"] not in root_hash_seen:
self.included_concrete_spec_data[env_path]["roots"].append(root_dict)
root_hash_seen.add(root_dict["hash"])
# Copy unique concrete specs from env
for concrete_spec in lockfile_as_dict["concrete_specs"]:
if concrete_spec not in concrete_hash_seen:
self.included_concrete_spec_data[env_path].update(
{"concrete_specs": lockfile_as_dict["concrete_specs"]}
)
concrete_hash_seen.add(concrete_spec)
if "include_concrete" in lockfile_as_dict.keys():
self.included_concrete_spec_data[env_path]["include_concrete"] = lockfile_as_dict[
"include_concrete"
]
self._read_lockfile_dict(self._to_lockfile_dict())
self.write()
def destroy(self):
"""Remove this environment from Spack entirely."""
shutil.rmtree(self.path)
@@ -1232,6 +1436,10 @@ def concretize(self, force=False, tests=False):
for spec in set(self.concretized_user_specs) - set(self.user_specs):
self.deconcretize(spec, concrete=False)
# If a combined env, check updated spec is in the linked envs
if self.included_concrete_envs:
self.include_concrete_envs()
# Pick the right concretization strategy
if self.unify == "when_possible":
return self._concretize_together_where_possible(tests=tests)
@@ -1415,7 +1623,7 @@ def _concretize_separately(self, tests=False):
# Ensure we don't try to bootstrap clingo in parallel
if spack.config.get("config:concretizer", "clingo") == "clingo":
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -1704,8 +1912,14 @@ def _partition_roots_by_install_status(self):
of per spec."""
installed, uninstalled = [], []
with spack.store.STORE.db.read_transaction():
for concretized_hash in self.concretized_order:
spec = self.specs_by_hash[concretized_hash]
for concretized_hash in self.all_concretized_orders():
if concretized_hash in self.specs_by_hash:
spec = self.specs_by_hash[concretized_hash]
else:
for env_path in self.included_specs_by_hash.keys():
if concretized_hash in self.included_specs_by_hash[env_path]:
spec = self.included_specs_by_hash[env_path][concretized_hash]
break
if not spec.installed or (
spec.satisfies("dev_path=*") or spec.satisfies("^dev_path=*")
):
@@ -1735,13 +1949,19 @@ def install_specs(self, specs: Optional[List[Spec]] = None, **install_args):
specs = specs if specs is not None else roots
# Extend the set of specs to overwrite with modified dev specs and their parents
install_args["overwrite"] = (
install_args.get("overwrite", []) + self._dev_specs_that_need_overwrite()
overwrite: Set[str] = set()
overwrite.update(install_args.get("overwrite", []), self._dev_specs_that_need_overwrite())
install_args["overwrite"] = overwrite
explicit: Set[str] = set()
explicit.update(
install_args.get("explicit", []),
(s.dag_hash() for s in specs),
(s.dag_hash() for s in roots),
)
install_args["explicit"] = explicit
installs = [(spec.package, {**install_args, "explicit": spec in roots}) for spec in specs]
PackageInstaller(installs).install()
PackageInstaller([spec.package for spec in specs], install_args).install()
def all_specs_generator(self) -> Iterable[Spec]:
"""Returns a generator for all concrete specs"""
@@ -1785,8 +2005,14 @@ def added_specs(self):
def concretized_specs(self):
"""Tuples of (user spec, concrete spec) for all concrete specs."""
for s, h in zip(self.concretized_user_specs, self.concretized_order):
yield (s, self.specs_by_hash[h])
for s, h in zip(self.all_concretized_user_specs(), self.all_concretized_orders()):
if h in self.specs_by_hash:
yield (s, self.specs_by_hash[h])
else:
for env_path in self.included_specs_by_hash.keys():
if h in self.included_specs_by_hash[env_path]:
yield (s, self.included_specs_by_hash[env_path][h])
break
def concrete_roots(self):
"""Same as concretized_specs, except it returns the list of concrete
@@ -1915,8 +2141,7 @@ def _get_environment_specs(self, recurse_dependencies=True):
If these specs appear under different user_specs, only one copy
is added to the list returned.
"""
specs = [self.specs_by_hash[h] for h in self.concretized_order]
specs = [self.specs_by_hash[h] for h in self.all_concretized_orders()]
if recurse_dependencies:
specs.extend(
traverse.traverse_nodes(
@@ -1961,31 +2186,76 @@ def _to_lockfile_dict(self):
"concrete_specs": concrete_specs,
}
if self.included_concrete_envs:
data[included_concrete_name] = self.included_concrete_spec_data
return data
def _read_lockfile(self, file_or_json):
"""Read a lockfile from a file or from a raw string."""
lockfile_dict = sjson.load(file_or_json)
self._read_lockfile_dict(lockfile_dict)
return lockfile_dict["_meta"]["lockfile-version"]
return lockfile_dict
def set_included_concretized_user_specs(
self,
env_name: str,
env_info: Dict[str, Dict[str, Any]],
included_json_specs_by_hash: Dict[str, Dict[str, Any]],
) -> Dict[str, Dict[str, Any]]:
"""Sets all of the concretized user specs from included environments
to include those from nested included environments.
Args:
env_name: the name (technically the path) of the included environment
env_info: included concrete environment data
included_json_specs_by_hash: concrete spec data keyed by hash
Returns: updated specs_by_hash
"""
self.included_concretized_order[env_name] = []
self.included_concretized_user_specs[env_name] = []
def add_specs(name, info, specs_by_hash):
# Add specs from the environment as well as any of its nested
# environments.
for root_info in info["roots"]:
self.included_concretized_order[name].append(root_info["hash"])
self.included_concretized_user_specs[name].append(Spec(root_info["spec"]))
if "concrete_specs" in info:
specs_by_hash.update(info["concrete_specs"])
if included_concrete_name in info:
for included_name, included_info in info[included_concrete_name].items():
if included_name not in self.included_concretized_order:
self.included_concretized_order[included_name] = []
self.included_concretized_user_specs[included_name] = []
add_specs(included_name, included_info, specs_by_hash)
add_specs(env_name, env_info, included_json_specs_by_hash)
return included_json_specs_by_hash
def _read_lockfile_dict(self, d):
"""Read a lockfile dictionary into this environment."""
self.specs_by_hash = {}
self.included_specs_by_hash = {}
self.included_concretized_user_specs = {}
self.included_concretized_order = {}
roots = d["roots"]
self.concretized_user_specs = [Spec(r["spec"]) for r in roots]
self.concretized_order = [r["hash"] for r in roots]
json_specs_by_hash = d["concrete_specs"]
included_json_specs_by_hash = {}
# Track specs by their lockfile key. Currently spack uses the finest
# grained hash as the lockfile key, while older formats used the build
# hash or a previous incarnation of the DAG hash (one that did not
# include build deps or package hash).
specs_by_hash = {}
if included_concrete_name in d:
for env_name, env_info in d[included_concrete_name].items():
included_json_specs_by_hash.update(
self.set_included_concretized_user_specs(
env_name, env_info, included_json_specs_by_hash
)
)
# Track specs by their DAG hash, allows handling DAG hash collisions
first_seen = {}
current_lockfile_format = d["_meta"]["lockfile-version"]
try:
reader = READER_CLS[current_lockfile_format]
@@ -1998,6 +2268,39 @@ def _read_lockfile_dict(self, d):
msg += " You need to use a newer Spack version."
raise SpackEnvironmentError(msg)
first_seen, self.concretized_order = self.filter_specs(
reader, json_specs_by_hash, self.concretized_order
)
for spec_dag_hash in self.concretized_order:
self.specs_by_hash[spec_dag_hash] = first_seen[spec_dag_hash]
if any(self.included_concretized_order.values()):
first_seen = {}
for env_name, concretized_order in self.included_concretized_order.items():
filtered_spec, self.included_concretized_order[env_name] = self.filter_specs(
reader, included_json_specs_by_hash, concretized_order
)
first_seen.update(filtered_spec)
for env_path, spec_hashes in self.included_concretized_order.items():
self.included_specs_by_hash[env_path] = {}
for spec_dag_hash in spec_hashes:
self.included_specs_by_hash[env_path].update(
{spec_dag_hash: first_seen[spec_dag_hash]}
)
def filter_specs(self, reader, json_specs_by_hash, order_concretized):
# Track specs by their lockfile key. Currently spack uses the finest
# grained hash as the lockfile key, while older formats used the build
# hash or a previous incarnation of the DAG hash (one that did not
# include build deps or package hash).
specs_by_hash = {}
# Track specs by their DAG hash, allows handling DAG hash collisions
first_seen = {}
# First pass: Put each spec in the map ignoring dependencies
for lockfile_key, node_dict in json_specs_by_hash.items():
spec = reader.from_node_dict(node_dict)
@@ -2020,7 +2323,8 @@ def _read_lockfile_dict(self, d):
# keep. This is only required as long as we support older lockfile
# formats where the mapping from DAG hash to lockfile key is possibly
# one-to-many.
for lockfile_key in self.concretized_order:
for lockfile_key in order_concretized:
for s in specs_by_hash[lockfile_key].traverse():
if s.dag_hash() not in first_seen:
first_seen[s.dag_hash()] = s
@@ -2028,12 +2332,10 @@ def _read_lockfile_dict(self, d):
# Now make sure concretized_order and our internal specs dict
# contains the keys used by modern spack (i.e. the dag_hash
# that includes build deps and package hash).
self.concretized_order = [
specs_by_hash[h_key].dag_hash() for h_key in self.concretized_order
]
for spec_dag_hash in self.concretized_order:
self.specs_by_hash[spec_dag_hash] = first_seen[spec_dag_hash]
order_concretized = [specs_by_hash[h_key].dag_hash() for h_key in order_concretized]
return first_seen, order_concretized
def write(self, regenerate: bool = True) -> None:
"""Writes an in-memory environment to its location on disk.
@@ -2046,7 +2348,7 @@ def write(self, regenerate: bool = True) -> None:
regenerate: regenerate views and run post-write hooks as well as writing if True.
"""
self.manifest_uptodate_or_warn()
if self.specs_by_hash:
if self.specs_by_hash or self.included_concrete_envs:
self.ensure_env_directory_exists(dot_env=True)
self.update_environment_repository()
self.manifest.flush()
@@ -2172,27 +2474,21 @@ def _equiv_dict(first, second):
return same_values and same_keys_with_same_overrides
def display_specs(concretized_specs):
"""Displays the list of specs returned by `Environment.concretize()`.
def display_specs(specs):
"""Displays a list of specs traversed breadth-first, covering nodes, with install status.
Args:
concretized_specs (list): list of specs returned by
`Environment.concretize()`
specs (list): list of specs
"""
def _tree_to_display(spec):
return spec.tree(
recurse_dependencies=True,
format=spack.spec.DISPLAY_FORMAT,
status_fn=spack.spec.Spec.install_status,
hashlen=7,
hashes=True,
)
for user_spec, concrete_spec in concretized_specs:
tty.msg("Concretized {0}".format(user_spec))
sys.stdout.write(_tree_to_display(concrete_spec))
print("")
tree_string = spack.spec.tree(
specs,
format=spack.spec.DISPLAY_FORMAT,
hashes=True,
hashlen=7,
status_fn=spack.spec.Spec.install_status,
key=traverse.by_dag_hash,
)
print(tree_string)
def _concretize_from_constraints(spec_constraints, tests=False):
@@ -2545,6 +2841,19 @@ def override_user_spec(self, user_spec: str, idx: int) -> None:
raise SpackEnvironmentError(msg) from e
self.changed = True
def set_include_concrete(self, include_concrete: List[str]) -> None:
"""Sets the included concrete environments in the manifest to the value(s) passed as input.
Args:
include_concrete: list of already existing concrete environments to include
"""
self.pristine_configuration[included_concrete_name] = []
for env_path in include_concrete:
self.pristine_configuration[included_concrete_name].append(env_path)
self.changed = True
def add_definition(self, user_spec: str, list_name: str) -> None:
"""Appends a user spec to the first active definition matching the name passed as argument.
@@ -2728,54 +3037,56 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# If scheme is not valid, config_path is not a url
# of a type Spack is generally aware
if spack.util.url.validate_scheme(include_url.scheme):
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, str(self.config_stage_dir), skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, str(self.config_stage_dir), skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):


@@ -10,8 +10,9 @@
import shutil
import stat
import sys
from typing import Optional
from typing import Callable, Dict, Optional
from llnl.string import comma_or
from llnl.util import tty
from llnl.util.filesystem import (
mkdirp,
@@ -49,19 +50,20 @@
_projections_path = ".spack/projections.yaml"
def view_symlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
LinkCallbackType = Callable[[str, str, "FilesystemView", Optional["spack.spec.Spec"]], None]
def view_symlink(src: str, dst: str, *args, **kwargs) -> None:
symlink(src, dst)
def view_hardlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
def view_hardlink(src: str, dst: str, *args, **kwargs) -> None:
os.link(src, dst)
def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
def view_copy(
src: str, dst: str, view: "FilesystemView", spec: Optional["spack.spec.Spec"] = None
) -> None:
"""
Copy a file from src to dst.
@@ -104,27 +106,40 @@ def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
tty.debug(f"Can't change the permissions for {dst}")
def view_func_parser(parsed_name):
# What method are we using for this view
if parsed_name in ("hardlink", "hard"):
#: supported string values for `link_type` in an env, mapped to canonical values
_LINK_TYPES = {
"hardlink": "hardlink",
"hard": "hardlink",
"copy": "copy",
"relocate": "copy",
"add": "symlink",
"symlink": "symlink",
"soft": "symlink",
}
_VALID_LINK_TYPES = sorted(set(_LINK_TYPES.values()))
def canonicalize_link_type(link_type: str) -> str:
"""Return canonical"""
canonical = _LINK_TYPES.get(link_type)
if not canonical:
raise ValueError(
f"Invalid link type: '{link_type}. Must be one of {comma_or(_VALID_LINK_TYPES)}'"
)
return canonical
def function_for_link_type(link_type: str) -> LinkCallbackType:
link_type = canonicalize_link_type(link_type)
if link_type == "hardlink":
return view_hardlink
elif parsed_name in ("copy", "relocate"):
return view_copy
elif parsed_name in ("add", "symlink", "soft"):
elif link_type == "symlink":
return view_symlink
else:
raise ValueError(f"invalid link type for view: '{parsed_name}'")
elif link_type == "copy":
return view_copy
def inverse_view_func_parser(view_type):
# get string based on view type
if view_type is view_hardlink:
link_name = "hardlink"
elif view_type is view_copy:
link_name = "copy"
else:
link_name = "symlink"
return link_name
assert False, "invalid link type" # need mypy Literal values
class FilesystemView:
@@ -140,7 +155,16 @@ class FilesystemView:
directory structure.
"""
def __init__(self, root, layout, **kwargs):
def __init__(
self,
root: str,
layout: "spack.directory_layout.DirectoryLayout",
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
"""
Initialize a filesystem view under the given `root` directory with
corresponding directory `layout`.
@@ -149,15 +173,14 @@ def __init__(self, root, layout, **kwargs):
"""
self._root = root
self.layout = layout
self.projections = {} if projections is None else projections
self.projections = kwargs.get("projections", {})
self.ignore_conflicts = kwargs.get("ignore_conflicts", False)
self.verbose = kwargs.get("verbose", False)
self.ignore_conflicts = ignore_conflicts
self.verbose = verbose
# Setup link function to include view
link_func = kwargs.get("link", view_symlink)
self.link = ft.partial(link_func, view=self)
self.link_type = link_type
self.link = ft.partial(function_for_link_type(link_type), view=self)
def add_specs(self, *specs, **kwargs):
"""
@@ -255,8 +278,24 @@ class YamlFilesystemView(FilesystemView):
Filesystem view to work with a yaml based directory layout.
"""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def __init__(
self,
root: str,
layout: "spack.directory_layout.DirectoryLayout",
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
super().__init__(
root,
layout,
projections=projections,
ignore_conflicts=ignore_conflicts,
verbose=verbose,
link_type=link_type,
)
# Super class gets projections from the kwargs
# YAML specific to get projections from YAML file
@@ -638,9 +677,6 @@ class SimpleFilesystemView(FilesystemView):
"""A simple and partial implementation of FilesystemView focused on performance and immutable
views, where specs cannot be removed after they were added."""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def _sanity_check_view_projection(self, specs):
"""A very common issue is that we end up with two specs of the same package, that project
to the same prefix. We want to catch that as early as possible and give a sensible error to

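A short usage sketch of the helpers introduced above; the behavior follows the _LINK_TYPES table shown in the diff:

from spack.filesystem_view import canonicalize_link_type, function_for_link_type

assert canonicalize_link_type("hard") == "hardlink"      # alias folded to canonical value
assert canonicalize_link_type("relocate") == "copy"
link_fn = function_for_link_type("soft")                 # returns the view_symlink callback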

@@ -13,7 +13,6 @@
import spack.config
import spack.relocate
from spack.util.elf import ElfParsingError, parse_elf
from spack.util.executable import Executable
def is_shared_library_elf(filepath):
@@ -141,7 +140,7 @@ def post_install(spec, explicit=None):
return
# Only enable on platforms using ELF.
if not spec.satisfies("platform=linux") and not spec.satisfies("platform=cray"):
if not spec.satisfies("platform=linux"):
return
# Disable this hook when bootstrapping, to avoid recursion.
@@ -149,10 +148,9 @@ def post_install(spec, explicit=None):
return
# Should failing to locate patchelf be a hard error?
patchelf_path = spack.relocate._patchelf()
if not patchelf_path:
patchelf = spack.relocate._patchelf()
if not patchelf:
return
patchelf = Executable(patchelf_path)
fixes = find_and_patch_sonames(spec.prefix, spec.package.non_bindable_shared_objects, patchelf)

View File

@@ -12,6 +12,10 @@
def post_install(spec, explicit):
# Push package to all buildcaches with autopush==True
# Do nothing if spec is an external package
if spec.external:
return
# Do nothing if package was not installed from source
pkg = spec.package
if pkg.installed_from_binary_cache:

View File

@@ -117,7 +117,7 @@ def post_install(spec, explicit=None):
return
# Only enable on platforms using ELF.
if not spec.satisfies("platform=linux") and not spec.satisfies("platform=cray"):
if not spec.satisfies("platform=linux"):
return
visit_directory_tree(spec.prefix, ElfFilesWithRPathVisitor())

View File

@@ -0,0 +1,8 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
def post_install(spec, explicit=None):
spec.package.windows_establish_runtime_linkage()

View File

@@ -488,6 +488,7 @@ def _process_binary_cache_tarball(
with timer.measure("install"), spack.util.path.filter_padding():
binary_distribution.extract_tarball(pkg.spec, download_result, force=False, timer=timer)
pkg.windows_establish_runtime_linkage()
if hasattr(pkg, "_post_buildcache_install_hook"):
pkg._post_buildcache_install_hook()
@@ -599,9 +600,7 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
if node is spec:
spack.repo.PATH.dump_provenance(node, dest_pkg_dir)
elif source_pkg_dir:
fs.install_tree(
source_pkg_dir, dest_pkg_dir, allow_broken_symlinks=(sys.platform != "win32")
)
fs.install_tree(source_pkg_dir, dest_pkg_dir)
def get_dependent_ids(spec: "spack.spec.Spec") -> List[str]:
@@ -760,12 +759,8 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
if not self.pkg.spec.concrete:
raise ValueError(f"{self.pkg.name} must have a concrete spec")
# Cache the package phase options with the explicit package,
# popping the options to ensure installation of associated
# dependencies is NOT affected by these options.
self.pkg.stop_before_phase = install_args.pop("stop_before", None) # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.pop("stop_at", None) # type: ignore[attr-defined]
self.pkg.stop_before_phase = install_args.get("stop_before") # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.get("stop_at") # type: ignore[attr-defined]
# Cache the package id for convenience
self.pkg_id = package_id(pkg.spec)
@@ -1075,19 +1070,17 @@ def flag_installed(self, installed: List[str]) -> None:
@property
def explicit(self) -> bool:
"""The package was explicitly requested by the user."""
return self.is_root and self.request.install_args.get("explicit", True)
return self.pkg.spec.dag_hash() in self.request.install_args.get("explicit", [])
@property
def is_root(self) -> bool:
"""The package was requested directly, but may or may not be explicit
in an environment."""
def is_build_request(self) -> bool:
"""The package was requested directly"""
return self.pkg == self.request.pkg
@property
def use_cache(self) -> bool:
_use_cache = True
if self.is_root:
if self.is_build_request:
return self.request.install_args.get("package_use_cache", _use_cache)
else:
return self.request.install_args.get("dependencies_use_cache", _use_cache)
@@ -1095,7 +1088,7 @@ def use_cache(self) -> bool:
@property
def cache_only(self) -> bool:
_cache_only = False
if self.is_root:
if self.is_build_request:
return self.request.install_args.get("package_cache_only", _cache_only)
else:
return self.request.install_args.get("dependencies_cache_only", _cache_only)
@@ -1121,24 +1114,17 @@ def priority(self):
class PackageInstaller:
"""
Class for managing the install process for a Spack instance based on a
bottom-up DAG approach.
Class for managing the install process for a Spack instance based on a bottom-up DAG approach.
This installer can coordinate concurrent batch and interactive, local
and distributed (on a shared file system) builds for the same Spack
instance.
This installer can coordinate concurrent batch and interactive, local and distributed (on a
shared file system) builds for the same Spack instance.
"""
def __init__(self, installs: List[Tuple["spack.package_base.PackageBase", dict]] = []) -> None:
"""Initialize the installer.
Args:
installs (list): list of tuples, where each
tuple consists of a package (PackageBase) and its associated
install arguments (dict)
"""
def __init__(
self, packages: List["spack.package_base.PackageBase"], install_args: dict
) -> None:
# List of build requests
self.build_requests = [BuildRequest(pkg, install_args) for pkg, install_args in installs]
self.build_requests = [BuildRequest(pkg, install_args) for pkg in packages]
# Priority queue of build tasks
self.build_pq: List[Tuple[Tuple[int, int], BuildTask]] = []
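A hedged sketch of the new calling convention, mirroring the do_install() shim further down in this diff: callers pass the packages plus a single install_args dict, where "explicit" holds the DAG hashes of explicitly requested specs instead of a boolean:

pkgs = [pkg_a, pkg_b]  # hypothetical, already-concretized PackageBase instances
install_args = {"explicit": {pkg_a.spec.dag_hash()}}  # pkg_b is treated as non-explicit
PackageInstaller(pkgs, install_args).install()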
@@ -1561,7 +1547,7 @@ def _add_tasks(self, request: BuildRequest, all_deps):
#
# External and upstream packages need to get flagged as installed to
# ensure proper status tracking for environment build.
explicit = request.install_args.get("explicit", True)
explicit = request.pkg.spec.dag_hash() in request.install_args.get("explicit", [])
not_local = _handle_external_and_upstream(request.pkg, explicit)
if not_local:
self._flag_installed(request.pkg)
@@ -1682,10 +1668,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
if not pkg.unit_test_check():
return
# Injecting information to know if this installation request is the root one
# to determine in BuildProcessInstaller whether installation is explicit or not
install_args["is_root"] = task.is_root
try:
self._setup_install_dir(pkg)
@@ -1698,10 +1680,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
spack.package_base.PackageBase._verbose = spack.build_environment.start_build_process(
pkg, build_process, install_args
)
# Currently this is how RPATH-like behavior is achieved on Windows, after install
# establish runtime linkage via Windows Runtime link object
# Note: this is a no-op on non Windows platforms
pkg.windows_establish_runtime_linkage()
# Note: PARENT of the build process adds the new package to
# the database, so that we don't need to re-read from file.
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
@@ -2001,8 +1979,8 @@ def install(self) -> None:
self._init_queue()
fail_fast_err = "Terminating after first install failure"
single_explicit_spec = len(self.build_requests) == 1
failed_explicits = []
single_requested_spec = len(self.build_requests) == 1
failed_build_requests = []
install_status = InstallStatus(len(self.build_pq))
@@ -2200,14 +2178,11 @@ def install(self) -> None:
if self.fail_fast:
raise InstallError(f"{fail_fast_err}: {str(exc)}", pkg=pkg)
# Terminate at this point if the single explicit spec has
# failed to install.
if single_explicit_spec and task.explicit:
raise
# Track explicit spec id and error to summarize when done
if task.explicit:
failed_explicits.append((pkg, pkg_id, str(exc)))
# Terminate when a single build request has failed, or summarize errors later.
if task.is_build_request:
if single_requested_spec:
raise
failed_build_requests.append((pkg, pkg_id, str(exc)))
finally:
# Remove the install prefix if anything went wrong during
@@ -2230,16 +2205,16 @@ def install(self) -> None:
if request.install_args.get("install_package") and request.pkg_id not in self.installed
]
if failed_explicits or missing:
for _, pkg_id, err in failed_explicits:
if failed_build_requests or missing:
for _, pkg_id, err in failed_build_requests:
tty.error(f"{pkg_id}: {err}")
for _, pkg_id in missing:
tty.error(f"{pkg_id}: Package was not installed")
if len(failed_explicits) > 0:
pkg = failed_explicits[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_explicits]
if len(failed_build_requests) > 0:
pkg = failed_build_requests[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_build_requests]
tty.debug(
"Associating installation failure with first failed "
f"explicit package ({ids[0]}) from {', '.join(ids)}"
@@ -2298,7 +2273,7 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
self.verbose = bool(install_args.get("verbose", False))
# whether installation was explicitly requested by the user
self.explicit = install_args.get("is_root", False) and install_args.get("explicit", True)
self.explicit = pkg.spec.dag_hash() in install_args.get("explicit", [])
# env before starting installation
self.unmodified_env = install_args.get("unmodified_env", {})
@@ -2383,9 +2358,7 @@ def _install_source(self) -> None:
src_target = os.path.join(pkg.spec.prefix, "share", pkg.name, "src")
tty.debug(f"{self.pre} Copying source to {src_target}")
fs.install_tree(
pkg.stage.source_path, src_target, allow_broken_symlinks=(sys.platform != "win32")
)
fs.install_tree(pkg.stage.source_path, src_target)
def _real_install(self) -> None:
import spack.builder

View File

@@ -427,7 +427,7 @@ def make_argument_parser(**kwargs):
parser.add_argument(
"--color",
action="store",
default=os.environ.get("SPACK_COLOR", "auto"),
default=None,
choices=("always", "never", "auto"),
help="when to colorize output (default: auto)",
)
@@ -622,7 +622,8 @@ def setup_main_options(args):
# with color
color.try_enable_terminal_color_on_windows()
# when to use color (takes always, auto, or never)
color.set_color_when(args.color)
if args.color is not None:
color.set_color_when(args.color)
def allows_unknown_args(command):

View File

@@ -11,7 +11,7 @@
import urllib.parse
import urllib.request
from http.client import HTTPResponse
from typing import NamedTuple, Tuple
from typing import List, NamedTuple, Tuple
from urllib.request import Request
import llnl.util.tty as tty
@@ -27,6 +27,7 @@
import spack.stage
import spack.traverse
import spack.util.crypto
import spack.util.url
from .image import Digest, ImageReference
@@ -69,6 +70,42 @@ def with_query_param(url: str, param: str, value: str) -> str:
)
def list_tags(ref: ImageReference, _urlopen: spack.oci.opener.MaybeOpen = None) -> List[str]:
"""Retrieves the list of tags associated with an image, handling pagination."""
_urlopen = _urlopen or spack.oci.opener.urlopen
tags = set()
fetch_url = ref.tags_url()
while True:
# Fetch tags
request = Request(url=fetch_url)
response = _urlopen(request)
spack.oci.opener.ensure_status(request, response, 200)
tags.update(json.load(response)["tags"])
# Check for pagination
link_header = response.headers["Link"]
if link_header is None:
break
tty.debug(f"OCI tag pagination: {link_header}")
rel_next_value = spack.util.url.parse_link_rel_next(link_header)
if rel_next_value is None:
break
rel_next = urllib.parse.urlparse(rel_next_value)
if rel_next.scheme not in ("https", ""):
break
fetch_url = ref.endpoint(rel_next_value)
return sorted(tags)
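A hedged usage sketch for the new helper; `ImageReference.from_string` is assumed here based on its use elsewhere in spack.oci, and the registry/image name is made up:

ref = ImageReference.from_string("ghcr.io/example-org/example-cache")
for tag in list_tags(ref):  # follows Link-header pagination transparently
    print(tag)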
def upload_blob(
ref: ImageReference,
file: str,

View File

@@ -398,7 +398,7 @@ def create_opener():
opener = urllib.request.OpenerDirector()
for handler in [
urllib.request.UnknownHandler(),
urllib.request.HTTPSHandler(),
urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()),
spack.util.web.SpackHTTPDefaultErrorHandler(),
urllib.request.HTTPRedirectHandler(),
urllib.request.HTTPErrorProcessor(),
@@ -418,18 +418,27 @@ def ensure_status(request: urllib.request.Request, response: HTTPResponse, statu
)
def default_retry(f, retries: int = 3, sleep=None):
def default_retry(f, retries: int = 5, sleep=None):
sleep = sleep or time.sleep
def wrapper(*args, **kwargs):
for i in range(retries):
try:
return f(*args, **kwargs)
except urllib.error.HTTPError as e:
except (urllib.error.URLError, TimeoutError) as e:
# Retry on internal server errors, and rate limit errors
# Potentially this could take into account the Retry-After header
# if registries support it
if i + 1 != retries and (500 <= e.code < 600 or e.code == 429):
if i + 1 != retries and (
(
isinstance(e, urllib.error.HTTPError)
and (500 <= e.code < 600 or e.code == 429)
)
or (
isinstance(e, urllib.error.URLError) and isinstance(e.reason, TimeoutError)
)
or isinstance(e, TimeoutError)
):
# Exponential backoff
sleep(2**i)
continue
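A small illustration (not part of the diff) of the backoff above: with the new default of retries=5, a call that keeps hitting a retryable error sleeps 2**i seconds after attempt i, and only the final attempt is allowed to fail:

retries = 5
backoff = [2**i for i in range(retries - 1)]
print(backoff)  # [1, 2, 4, 8] seconds of sleep between the five attempts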

View File

@@ -3,22 +3,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from ._operating_system import OperatingSystem
from .cray_backend import CrayBackend
from .cray_frontend import CrayFrontend
from .freebsd import FreeBSDOs
from .linux_distro import LinuxDistro
from .mac_os import MacOs
from .windows_os import WindowsOs
__all__ = [
"OperatingSystem",
"LinuxDistro",
"MacOs",
"CrayFrontend",
"CrayBackend",
"WindowsOs",
"FreeBSDOs",
]
__all__ = ["OperatingSystem", "LinuxDistro", "MacOs", "WindowsOs", "FreeBSDOs"]
#: List of all the Operating Systems known to Spack
operating_systems = [LinuxDistro, MacOs, CrayFrontend, CrayBackend, WindowsOs, FreeBSDOs]
operating_systems = [LinuxDistro, MacOs, WindowsOs, FreeBSDOs]

View File

@@ -1,172 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import llnl.util.tty as tty
import spack.error
import spack.version
from spack.util.module_cmd import module
from .linux_distro import LinuxDistro
#: Possible locations of the Cray CLE release file,
#: which we look at to get the CNL OS version.
_cle_release_file = "/etc/opt/cray/release/cle-release"
_clerelease_file = "/etc/opt/cray/release/clerelease"
def read_cle_release_file():
"""Read the CLE release file and return a dict with its attributes.
This file is present on newer versions of Cray.
The release file looks something like this::
RELEASE=6.0.UP07
BUILD=6.0.7424
...
The dictionary we produce looks like this::
{
"RELEASE": "6.0.UP07",
"BUILD": "6.0.7424",
...
}
Returns:
dict: dictionary of release attributes
"""
with open(_cle_release_file) as release_file:
result = {}
for line in release_file:
# use partition instead of split() to ensure we only split on
# the first '=' in the line.
key, _, value = line.partition("=")
result[key] = value.strip()
return result
def read_clerelease_file():
"""Read the CLE release file and return the Cray OS version.
This file is present on older versions of Cray.
The release file looks something like this::
5.2.UP04
Returns:
str: the Cray OS version
"""
with open(_clerelease_file) as release_file:
for line in release_file:
return line.strip()
class CrayBackend(LinuxDistro):
"""Compute Node Linux (CNL) is the operating system used for the Cray XC
series super computers. It is a very stripped down version of GNU/Linux.
Any compilers found through this operating system will be used with
modules. If updated, user must make sure that version and name are
updated to indicate that OS has been upgraded (or downgraded)
"""
def __init__(self):
name = "cnl"
version = self._detect_crayos_version()
if version:
# If we found a CrayOS version, we do not want the information
# from LinuxDistro. In order to skip the logic from
# distro.linux_distribution, while still calling __init__
# methods further up the MRO, we skip LinuxDistro in the MRO and
# call the OperatingSystem superclass __init__ method
super(LinuxDistro, self).__init__(name, version)
else:
super().__init__()
self.modulecmd = module
def __str__(self):
return self.name + str(self.version)
@classmethod
def _detect_crayos_version(cls):
if os.path.isfile(_cle_release_file):
release_attrs = read_cle_release_file()
if "RELEASE" not in release_attrs:
# This Cray system uses a base OS not CLE/CNL
return None
v = spack.version.Version(release_attrs["RELEASE"])
return v[0]
elif os.path.isfile(_clerelease_file):
v = read_clerelease_file()
return spack.version.Version(v)[0]
else:
# Not all Cray systems run CNL on the backend.
# Systems running in what Cray calls "cluster" mode run other
# linux OSs under the Cray PE.
# So if we don't detect any Cray OS version on the system,
# we return None. We can't ever be sure we will get a Cray OS
# version.
# Returning None allows the calling code to test for the value
# being "True-ish" rather than requiring a try/except block.
return None
def arguments_to_detect_version_fn(self, paths):
import spack.compilers
command_arguments = []
for compiler_name in spack.compilers.supported_compilers():
cmp_cls = spack.compilers.class_for_compiler_name(compiler_name)
# If the compiler doesn't have a corresponding
# Programming Environment, skip to the next
if cmp_cls.PrgEnv is None:
continue
if cmp_cls.PrgEnv_compiler is None:
tty.die("Must supply PrgEnv_compiler with PrgEnv")
compiler_id = spack.compilers.CompilerID(self, compiler_name, None)
detect_version_args = spack.compilers.DetectVersionArgs(
id=compiler_id, variation=(None, None), language="cc", path="cc"
)
command_arguments.append(detect_version_args)
return command_arguments
def detect_version(self, detect_version_args):
import spack.compilers
modulecmd = self.modulecmd
compiler_name = detect_version_args.id.compiler_name
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
output = modulecmd("avail", compiler_cls.PrgEnv_compiler)
version_regex = r"({0})/([\d\.]+[\d]-?[\w]*)".format(compiler_cls.PrgEnv_compiler)
matches = re.findall(version_regex, output)
version = tuple(version for _, version in matches if "classic" not in version)
compiler_id = detect_version_args.id
value = detect_version_args._replace(id=compiler_id._replace(version=version))
return value, None
def make_compilers(self, compiler_id, paths):
import spack.spec
name = compiler_id.compiler_name
cmp_cls = spack.compilers.class_for_compiler_name(name)
compilers = []
for v in compiler_id.version:
comp = cmp_cls(
spack.spec.CompilerSpec(name + "@=" + v),
self,
"any",
["cc", "CC", "ftn"],
[cmp_cls.PrgEnv, name + "/" + v],
)
compilers.append(comp)
return compilers

View File

@@ -1,105 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import os
import re
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from spack.util.environment import get_path
from spack.util.module_cmd import module
from .linux_distro import LinuxDistro
@contextlib.contextmanager
def unload_programming_environment():
"""Context manager that unloads Cray Programming Environments."""
env_bu = None
# We rely on the fact that the PrgEnv-* modules set the PE_ENV
# environment variable.
if "PE_ENV" in os.environ:
# Copy environment variables to restore them after the compiler
# detection. We expect that the only thing PrgEnv-* modules do is
# the environment variables modifications.
env_bu = os.environ.copy()
# Get the name of the module from the environment variable.
prg_env = "PrgEnv-" + os.environ["PE_ENV"].lower()
# Unload the PrgEnv-* module. By doing this we intentionally
# provoke errors when the Cray's compiler wrappers are executed
# (Error: A PrgEnv-* modulefile must be loaded.) so they will not
# be detected as valid compilers by the overridden method. We also
# expect that the modules that add the actual compilers' binaries
# into the PATH environment variable (i.e. the following modules:
# 'intel', 'cce', 'gcc', etc.) will also be unloaded since they are
# specified as prerequisites in the PrgEnv-* modulefiles.
module("unload", prg_env)
yield
# Restore the environment.
if env_bu is not None:
os.environ.clear()
os.environ.update(env_bu)
class CrayFrontend(LinuxDistro):
"""Represents OS that runs on login and service nodes of the Cray platform.
It acts as a regular Linux without Cray-specific modules and compiler
wrappers."""
@property
def compiler_search_paths(self):
"""Calls the default function but unloads Cray's programming
environments first.
This prevents Spack from detecting Cray compiler wrappers and avoids
possible false detections.
"""
import spack.compilers
with unload_programming_environment():
search_paths = get_path("PATH")
extract_path_re = re.compile(r"prepend-path[\s]*PATH[\s]*([/\w\.:-]*)")
for compiler_cls in spack.compilers.all_compiler_types():
# Check if the compiler class is supported on Cray
prg_env = getattr(compiler_cls, "PrgEnv", None)
compiler_module = getattr(compiler_cls, "PrgEnv_compiler", None)
if not (prg_env and compiler_module):
continue
# It is supported, check which versions are available
output = module("avail", compiler_cls.PrgEnv_compiler)
version_regex = r"({0})/([\d\.]+[\d]-?[\w]*)".format(compiler_cls.PrgEnv_compiler)
matches = re.findall(version_regex, output)
versions = tuple(version for _, version in matches if "classic" not in version)
# Now inspect the modules and add to paths
msg = "[CRAY FE] Detected FE compiler [name={0}, versions={1}]"
tty.debug(msg.format(compiler_module, versions))
for v in versions:
try:
current_module = compiler_module + "/" + v
out = module("show", current_module)
match = extract_path_re.search(out)
search_paths += match.group(1).split(":")
except Exception as e:
msg = (
"[CRAY FE] An unexpected error occurred while "
"detecting FE compiler [compiler={0}, "
" version={1}, error={2}]"
)
tty.debug(msg.format(compiler_cls.name, v, str(e)))
search_paths = list(llnl.util.lang.dedupe(search_paths))
return fs.search_paths_for_executables(*search_paths)

View File

@@ -39,6 +39,7 @@
)
from spack.build_systems.cargo import CargoPackage
from spack.build_systems.cmake import CMakePackage, generator
from spack.build_systems.compiler import CompilerPackage
from spack.build_systems.cuda import CudaPackage
from spack.build_systems.generic import Package
from spack.build_systems.gnu import GNUMirrorPackage

View File

@@ -161,7 +161,11 @@ def windows_establish_runtime_linkage(self):
Performs symlinking to incorporate rpath dependencies to Windows runtime search paths
"""
if sys.platform == "win32":
# If the spec is an external, we should not modify its bin directory, as we would
# be doing in this method. Spack should in general not modify things it has not
# installed, and we can reasonably expect externals to have their link interface
# properly established.
if sys.platform == "win32" and not self.spec.external:
self.win_rpath.add_library_dependent(*self.win_add_library_dependent())
self.win_rpath.add_rpath(*self.win_add_rpath())
self.win_rpath.establish_link()
@@ -617,10 +621,6 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
#: By default do not run tests within package's install()
run_tests = False
#: Keep -Werror flags, matches config:flags:keep_werror to override config
# NOTE: should be type Optional[Literal['all', 'specific', 'none']] in 3.8+
keep_werror: Optional[str] = None
#: Most packages are NOT extendable. Set to True if you want extensions.
extendable = False
@@ -926,6 +926,32 @@ def global_license_file(self):
self.global_license_dir, self.name, os.path.basename(self.license_files[0])
)
# NOTE: return type should be Optional[Literal['all', 'specific', 'none']] in
# Python 3.8+, but we still support 3.6.
@property
def keep_werror(self) -> Optional[str]:
"""Keep ``-Werror`` flags, matches ``config:flags:keep_werror`` to override config.
Valid return values are:
* ``"all"``: keep all ``-Werror`` flags.
* ``"specific"``: keep only ``-Werror=specific-warning`` flags.
* ``"none"``: filter out all ``-Werror*`` flags.
* ``None``: respect the user's configuration (``"none"`` by default).
"""
if self.spec.satisfies("%nvhpc@:23.3") or self.spec.satisfies("%pgi"):
# Filtering works by replacing -Werror with -Wno-error, but older nvhpc and
# PGI do not understand -Wno-error, so we disable filtering.
return "all"
elif self.spec.satisfies("%nvhpc@23.4:"):
# newer nvhpc supports -Wno-error but can't disable specific warnings with
# -Wno-error=warning. Skip -Werror=warning, but still filter -Werror.
return "specific"
else:
# use -Werror disablement by default for other compilers
return None
@property
def version(self):
if not self.spec.versions.concrete:
@@ -1240,7 +1266,7 @@ def install_test_root(self):
"""Return the install test root directory."""
tty.warn(
"The 'pkg.install_test_root' property is deprecated with removal "
"expected v0.22. Use 'install_test_root(pkg)' instead."
"expected v0.23. Use 'install_test_root(pkg)' instead."
)
return install_test_root(self)
@@ -1877,7 +1903,10 @@ def do_install(self, **kwargs):
verbose (bool): Display verbose build output (by default,
suppresses it)
"""
PackageInstaller([(self, kwargs)]).install()
explicit = kwargs.get("explicit", True)
if isinstance(explicit, bool):
kwargs["explicit"] = {self.spec.dag_hash()} if explicit else set()
PackageInstaller([self], kwargs).install()
# TODO (post-34236): Update tests and all packages that use this as a
# TODO (post-34236): package method to the routine made available to
@@ -1898,7 +1927,7 @@ def cache_extra_test_sources(self, srcs):
"""
msg = (
"'pkg.cache_extra_test_sources(srcs) is deprecated with removal "
"expected in v0.22. Use 'cache_extra_test_sources(pkg, srcs)' "
"expected in v0.23. Use 'cache_extra_test_sources(pkg, srcs)' "
"instead."
)
warnings.warn(msg)
@@ -2446,9 +2475,18 @@ def rpath(self):
# on Windows, libraries of runtime interest are typically
# stored in the bin directory
# Do not include Windows system libraries in the rpath interface
# these libraries are handled automatically by VS/VCVARS and adding
# Spack derived system libs into the link path or address space of a program
# can result in conflicting versions, which makes Spack packages less usable
if sys.platform == "win32":
rpaths = [self.prefix.bin]
rpaths.extend(d.prefix.bin for d in deps if os.path.isdir(d.prefix.bin))
rpaths.extend(
d.prefix.bin
for d in deps
if os.path.isdir(d.prefix.bin)
and "windows-system" not in getattr(d.package, "tags", [])
)
else:
rpaths = [self.prefix.lib, self.prefix.lib64]
rpaths.extend(d.prefix.lib for d in deps if os.path.isdir(d.prefix.lib))
@@ -2555,7 +2593,12 @@ class PackageStillNeededError(InstallError):
"""Raised when package is still needed by another on uninstall."""
def __init__(self, spec, dependents):
super().__init__("Cannot uninstall %s" % spec)
spec_fmt = spack.spec.DEFAULT_FORMAT + " /{hash:7}"
dep_fmt = "{name}{@versions} /{hash:7}"
super().__init__(
f"Cannot uninstall {spec.format(spec_fmt)}, "
f"needed by {[dep.format(dep_fmt) for dep in dependents]}"
)
self.spec = spec
self.dependents = dependents

View File

@@ -6,7 +6,6 @@
from ._functions import _host, by_name, platforms, prevent_cray_detection, reset
from ._platform import Platform
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
@@ -15,7 +14,6 @@
__all__ = [
"Platform",
"Cray",
"Darwin",
"Linux",
"FreeBSD",

View File

@@ -8,7 +8,6 @@
import spack.util.environment
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
@@ -16,7 +15,7 @@
from .windows import Windows
#: List of all the platform classes known to Spack
platforms = [Cray, Darwin, Linux, Windows, FreeBSD, Test]
platforms = [Darwin, Linux, Windows, FreeBSD, Test]
@llnl.util.lang.memoized

View File

@@ -2,253 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import platform
import re
import archspec.cpu
import llnl.util.tty as tty
import spack.target
import spack.version
from spack.operating_systems.cray_backend import CrayBackend
from spack.operating_systems.cray_frontend import CrayFrontend
from spack.paths import build_env_path
from spack.util.executable import Executable
from spack.util.module_cmd import module
from ._platform import NoPlatformError, Platform
_craype_name_to_target_name = {
"x86-cascadelake": "cascadelake",
"x86-naples": "zen",
"x86-rome": "zen2",
"x86-milan": "zen3",
"x86-skylake": "skylake_avx512",
"mic-knl": "mic_knl",
"interlagos": "bulldozer",
"abudhabi": "piledriver",
}
_ex_craype_dir = "/opt/cray/pe/cpe"
_xc_craype_dir = "/opt/cray/pe/cdt"
def slingshot_network():
return os.path.exists("/opt/cray/pe") and (
os.path.exists("/lib64/libcxi.so") or os.path.exists("/usr/lib64/libcxi.so")
)
def _target_name_from_craype_target_name(name):
return _craype_name_to_target_name.get(name, name)
class Cray(Platform):
priority = 10
def __init__(self):
"""Create a Cray system platform.
Target names should use craype target names but not include the
'craype-' prefix. Uses first viable target from:
self
envars [SPACK_FRONT_END, SPACK_BACK_END]
configuration file "targets.yaml" with keys 'front_end', 'back_end'
scanning /etc/bash/bashrc.local for back_end only
"""
super().__init__("cray")
# Make all craype targets available.
for target in self._avail_targets():
name = _target_name_from_craype_target_name(target)
self.add_target(name, spack.target.Target(name, "craype-%s" % target))
self.back_end = os.environ.get("SPACK_BACK_END", self._default_target_from_env())
self.default = self.back_end
if self.back_end not in self.targets:
# We didn't find a target module for the backend
raise NoPlatformError()
# Setup frontend targets
for name in archspec.cpu.TARGETS:
if name not in self.targets:
self.add_target(name, spack.target.Target(name))
self.front_end = os.environ.get("SPACK_FRONT_END", archspec.cpu.host().name)
if self.front_end not in self.targets:
self.add_target(self.front_end, spack.target.Target(self.front_end))
front_distro = CrayFrontend()
back_distro = CrayBackend()
self.default_os = str(back_distro)
self.back_os = self.default_os
self.front_os = str(front_distro)
self.add_operating_system(self.back_os, back_distro)
if self.front_os != self.back_os:
self.add_operating_system(self.front_os, front_distro)
def setup_platform_environment(self, pkg, env):
"""Change the linker to default dynamic to be more
similar to linux/standard linker behavior
"""
# Unload these modules to prevent any silent linking or unnecessary
# I/O profiling in the case of darshan.
modules_to_unload = ["cray-mpich", "darshan", "cray-libsci", "altd"]
for mod in modules_to_unload:
module("unload", mod)
env.set("CRAYPE_LINK_TYPE", "dynamic")
cray_wrapper_names = os.path.join(build_env_path, "cray")
if os.path.isdir(cray_wrapper_names):
env.prepend_path("PATH", cray_wrapper_names)
env.prepend_path("SPACK_ENV_PATH", cray_wrapper_names)
# Makes spack installed pkg-config work on Crays
env.append_path("PKG_CONFIG_PATH", "/usr/lib64/pkgconfig")
env.append_path("PKG_CONFIG_PATH", "/usr/local/lib64/pkgconfig")
# CRAY_LD_LIBRARY_PATH is used at build time by the cray compiler
# wrappers to augment LD_LIBRARY_PATH. This is to avoid long load
# times at runtime. This behavior is not always respected on cray
# "cluster" systems, so we reproduce it here.
if os.environ.get("CRAY_LD_LIBRARY_PATH"):
env.prepend_path("LD_LIBRARY_PATH", os.environ["CRAY_LD_LIBRARY_PATH"])
@classmethod
def craype_type_and_version(cls):
if os.path.isdir(_ex_craype_dir):
craype_dir = _ex_craype_dir
craype_type = "EX"
elif os.path.isdir(_xc_craype_dir):
craype_dir = _xc_craype_dir
craype_type = "XC"
else:
return (None, None)
# Take the default version from known symlink path
default_path = os.path.join(craype_dir, "default")
if os.path.islink(default_path):
version = spack.version.Version(os.readlink(default_path))
return (craype_type, version)
# If no default version, sort available versions and return latest
versions_available = [spack.version.Version(v) for v in os.listdir(craype_dir)]
versions_available.sort(reverse=True)
if not versions_available:
return (craype_type, None)
return (craype_type, versions_available[0])
@classmethod
def detect(cls):
"""
Detect whether this system requires CrayPE module support.
Systems with newer CrayPE (21.10 for EX systems, future work for CS and
XC systems) have compilers and MPI wrappers that can be used directly
by path. These systems are considered ``linux`` platforms.
For systems running an older CrayPE, we detect the Cray platform based
on the availability through `module` of the Cray programming
environment. If this environment is available, we can use it to find
compilers, target modules, etc. If the Cray programming environment is
not available via modules, then we will treat it as a standard linux
system, as the Cray compiler wrappers and other components of the Cray
programming environment are irrelevant without module support.
"""
if "opt/cray" not in os.environ.get("MODULEPATH", ""):
return False
craype_type, craype_version = cls.craype_type_and_version()
if craype_type == "XC":
return True
if craype_type == "EX" and craype_version < spack.version.Version("21.10"):
return True
return False
def _default_target_from_env(self):
"""Set and return the default CrayPE target loaded in a clean login
session.
A bash subshell is launched with a wiped environment and the list of
loaded modules is parsed for the first acceptable CrayPE target.
"""
# env -i /bin/bash -lc echo $CRAY_CPU_TARGET 2> /dev/null
if getattr(self, "default", None) is None:
bash = Executable("/bin/bash")
output = bash(
"--norc",
"--noprofile",
"-lc",
"echo $CRAY_CPU_TARGET",
env={"TERM": os.environ.get("TERM", "")},
output=str,
error=os.devnull,
)
default_from_module = "".join(output.split()) # rm all whitespace
if default_from_module:
tty.debug("Found default module:%s" % default_from_module)
return default_from_module
else:
front_end = archspec.cpu.host()
# Look for the frontend architecture or closest ancestor
# available in cray target modules
avail = [_target_name_from_craype_target_name(x) for x in self._avail_targets()]
for front_end_possibility in [front_end] + front_end.ancestors:
if front_end_possibility.name in avail:
tty.debug("using front-end architecture or available ancestor")
return front_end_possibility.name
else:
tty.debug("using platform.machine as default")
return platform.machine()
def _avail_targets(self):
"""Return a list of available CrayPE CPU targets."""
def modules_in_output(output):
"""Returns a list of valid modules parsed from modulecmd output"""
return [i for i in re.split(r"\s\s+|\n", output)]
def target_names_from_modules(modules):
# Craype- module prefixes that are not valid CPU targets.
targets = []
for mod in modules:
if "craype-" in mod:
name = mod[7:]
name = name.split()[0]
_n = name.replace("-", "_") # test for mic-knl/mic_knl
is_target_name = name in archspec.cpu.TARGETS or _n in archspec.cpu.TARGETS
is_cray_target_name = name in _craype_name_to_target_name
if is_target_name or is_cray_target_name:
targets.append(name)
return targets
def modules_from_listdir():
craype_default_path = "/opt/cray/pe/craype/default/modulefiles"
if os.path.isdir(craype_default_path):
return os.listdir(craype_default_path)
return []
if getattr(self, "_craype_targets", None) is None:
strategies = [
lambda: modules_in_output(module("avail", "-t", "craype-")),
modules_from_listdir,
]
for available_craype_modules in strategies:
craype_modules = available_craype_modules()
craype_targets = target_names_from_modules(craype_modules)
if craype_targets:
self._craype_targets = craype_targets
break
else:
# If nothing is found add platform.machine()
# to avoid Spack erroring out
self._craype_targets = [platform.machine()]
return self._craype_targets

Some files were not shown because too many files have changed in this diff.