Compare commits

90 Commits

Author SHA1 Message Date
Miranda Mundt
1ac2ee8043 Add Pyomo 6.7.2 (#44097) 2024-05-18 10:31:57 -05:00
Teague Sterling
36af1c1c73 perl-xml-libxml: add new versions and conflicts (fixes #44253) (#44254)
* Address #44253 by adding new versions and declaring conflicts for perl-xml-libxml

* [@spackbot] updating style on behalf of teaguesterling

* Update var/spack/repos/builtin/packages/perl-xml-libxml/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-18 09:28:32 -06:00
Carlos Bederián
e2fa087002 namd: add 3.0b7 (#44198) 2024-05-18 10:15:55 -05:00
Teague Sterling
df02bfbad2 Adding new spark versions (#44250)
* Adding new spark versions (in preparation for the HAIL package)

* Adding myself as potential maintainer
2024-05-18 10:06:12 -05:00
Teague Sterling
fecb63843e yq: add new package (#44249) 2024-05-18 06:38:26 -06:00
Scott Wittenburg
b33e2d09d3 oci buildcache: handle pagination of tags (#43136)
This fixes an issue where ghcr, gitlab, and possibly other container registries paginate tags by default, which violates the OCI spec v1.0 but is common practice (the spec itself was broken). After this commit, you can create build cache indices of more than 100 specs on ghcr.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-05-18 11:57:53 +02:00
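
A minimal sketch of the pagination loop described in the commit above, using only the Python standard library: keep requesting the tags endpoint and follow the Link header's rel="next" reference until the registry stops paginating. Authentication and error handling are omitted, and the function and argument names are illustrative; Spack's actual implementation is the list_tags helper plus the parse_link_rel_next parser shown in the diffs further down.

    import json
    import urllib.parse
    import urllib.request


    def list_all_tags(registry: str, repository: str) -> list:
        """Collect every tag, following rel="next" Link headers across pages."""
        url = f"https://{registry}/v2/{repository}/tags/list"
        tags = set()
        while url:
            with urllib.request.urlopen(url) as response:
                tags.update(json.load(response)["tags"])
                link = response.headers.get("Link")
            url = None
            if link:
                # Link: </v2/.../tags/list?last=x&n=100>; rel="next", ...
                for part in link.split(","):
                    if 'rel="next"' in part:
                        next_ref = part[part.find("<") + 1 : part.find(">")]
                        url = urllib.parse.urljoin(f"https://{registry}/", next_ref)
                        break
        return sorted(tags)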
Chris Green
f8054aa21a git: bump v2.39 to 2.45; deprecate unsafe versions (#44248) 2024-05-18 03:32:06 -06:00
Valentin Volkl
8f3a2acc54 whizard: add gosam variant (#43595)
* whizard: add gosam variant

* address comments, fix compiler wrapper issue
2024-05-18 10:11:26 +02:00
Kevin Huck
d1a20908b8 ZeroSum: add new package (#44228) 2024-05-18 09:23:45 +02:00
Derek Ryan Strong
dd781f7368 perl: add v5.36.3, v5.38.2; deprecate unsafe versions (#44186) 2024-05-18 09:21:03 +02:00
dmagdavector
9bcc43c4c1 protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
* protobuf: update hash for patch needed when="@3.4:3.21"

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-17 17:50:29 -06:00
Todd Gamblin
77c83af17d docs: remove warning about repositories and package extension (#44247)
Local package repositories are very well supported and we test them extensively, so this
warning from 8 years ago can be removed from the docs.
2024-05-17 22:03:57 +00:00
Michael Kuhn
574bd2db99 netlib-scalapack: fix build with gcc@14 (#44120) 2024-05-17 22:38:46 +02:00
James Taliaferro
a76f37da96 kakoune: add v2024.05.09 (#44124) 2024-05-17 22:36:58 +02:00
Adam J. Stewart
9e75f3ec0a GDAL: add v3.9.0 (#44128) 2024-05-17 22:35:33 +02:00
Derek Ryan Strong
4d42d45897 Add latest Python versions (#44130) 2024-05-17 22:35:06 +02:00
SXS Bot
a4b4bfda73 spectre: add v2024.05.11 (#44139)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-05-17 22:28:25 +02:00
Michael Kuhn
1bcdd3a57e py-cython: add 3.0.10 (#44140) 2024-05-17 22:27:42 +02:00
Stephen Sachs
297a3a1bc9 Add mpas-model and mpich to pcluster neoverse stack (#44151)
Should build now since https://github.com/spack/spack/pull/43547 has been merged.
2024-05-17 22:00:17 +02:00
Adam J. Stewart
8d01e8c978 JAX: add v0.4.28 (#44112) 2024-05-17 21:58:44 +02:00
Wouter Deconinck
6be28aa303 pythia8: patch latest 8.311 for upstream bug (#43803)
* pythia8: prefer 8.310

* [@spackbot] updating style on behalf of wdconinc

* pythia8: filter_file to remove sed n

* Revert "[@spackbot] updating style on behalf of wdconinc"

This reverts commit e2a3decaffbd3f464d1bd992025e1812df49f088.

* Revert "pythia8: prefer 8.310"

This reverts commit 568cb056b87129085e245d9dbef1732ee1c6c0aa.

* [@spackbot] updating style on behalf of wdconinc

* pythia8: comment for fix

* pythia8: fix style

* pythia8: filter_file with raw string because of escaped pipe

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-17 21:31:05 +02:00
Jon Rood
5e38310515 nalu-wind: fix trilinos rocm dependency (#44233) 2024-05-17 13:00:24 -06:00
wspear
ddfed65485 tau: fix lib/include paths with oneapi (#44170) 2024-05-17 18:55:27 +02:00
dependabot[bot]
2a16d8bfa8 build(deps): bump codecov/codecov-action from 4.3.1 to 4.4.0 (#44195)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.3.1 to 4.4.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5ecb98a3c6...6d798873df)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 18:46:17 +02:00
Jonathon Anderson
6a40a50a29 hpcviewer: Update URLs to use GitLab release assets (#44129) 2024-05-17 18:43:13 +02:00
Carlos Bederián
b2924f68c0 blis: add v1.0 (#44199) 2024-05-17 18:39:42 +02:00
Rocco Meli
41ffe36636 spglib: add v2.4.0 (#44202) 2024-05-17 18:33:40 +02:00
Chris Marsh
24edc72252 docs: show phase signature for builders (#44067) 2024-05-17 18:16:31 +02:00
Raffaele Solcà
83b38a26a0 New versions nvpl-blas and nvpl-lapack (#44244) 2024-05-17 17:46:25 +02:00
Nathalie Furmento
914d785e3b starpu: add v1.4.6 (#44203) 2024-05-17 08:17:01 -06:00
Garth N. Wells
f99f642fa8 fenicsx: remove deprecated versions (#44223) 2024-05-17 07:35:21 -06:00
Seth R. Johnson
e0bf3667e3 cli11: new version and enable library (#44204) 2024-05-17 15:08:28 +02:00
Andrew-Dunning-NNL
a24ca50fed Fix broken link in docs (#44217) 2024-05-17 12:58:11 +00:00
Fabien Bruneval
51e9f37252 MOLGW: add v3.3 (#44241)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 06:56:05 -06:00
Rémi Lacroix
453900c884 libxc: Fix compilation after distribution changes. (#44206)
The release tarballs are not available anymore, which means autoconf, automake, and libtool are always needed.

The NVHPC-specific patches no longer make sense, since the patched files are not distributed in the new tarballs.
2024-05-17 14:52:03 +02:00
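
A hedged sketch of how a Spack recipe typically expresses "the autotools are always needed now": unconditional build-time dependencies plus a request to regenerate configure. The class name is made up, and the real libxc package.py may arrange this differently.

    from spack.package import *


    class LibxcLike(AutotoolsPackage):
        """Illustrative only: sources now ship without a pre-generated configure."""

        # With no release tarball, the autotools are required at build time.
        depends_on("autoconf", type="build")
        depends_on("automake", type="build")
        depends_on("libtool", type="build")
        depends_on("m4", type="build")

        # Ask the autotools builder to regenerate configure even if one exists;
        # the actual libxc recipe may express this differently.
        force_autoreconf = True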
Alberto Invernizzi
4696459d2d libcatalyst: add missing python dependencies (#44224) 2024-05-17 14:45:26 +02:00
Fabien Bruneval
ad1e3231e5 libcint: add v6.1.2, v5.5.0 (#44239)
Co-authored-by: Fabien Bruneval <fabien.bruneval@.cea.fr>
2024-05-17 05:45:26 -06:00
Richard Berger
2ef7eb1826 exodusii: only use MPI fortran compiler if +fortran (#44211) 2024-05-17 13:01:09 +02:00
Dom Heinzeller
fe86019f9a ecflow: versions up to 5.11.4 require boost 1.84 or earlier (#44181) 2024-05-17 12:53:48 +02:00
Harmen Stoppels
9dbb18219f build_environment.py: deal with rpathing identical packages (#44219)
When multiple gcc-runtime packages exist in the same link sub-dag, only rpath
the latest.
2024-05-17 12:29:56 +02:00
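
The core idea, sketched as a standalone helper under the assumption that link dependencies are deduplicated by package name and the highest version wins; the actual change (_get_rpath_deps_from_spec in build_environment.py) appears verbatim in the diffs further down.

    def dedupe_rpath_deps(link_deps):
        """Keep one dependency per package name, preferring the newest version.

        Works on any objects exposing .name and .version (e.g. spack.spec.Spec).
        """
        newest = {}
        for dep in link_deps:
            current = newest.get(dep.name)
            if current is None or current.version < dep.version:
                newest[dep.name] = dep
        return list(newest.values())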
Mikael Simberg
451a977de0 hpx: change default of max_cpu_count variant to auto (#44220) 2024-05-17 11:54:48 +02:00
Jack S. Hale
e604929a4c FEniCS: add more maintainers (#44240) 2024-05-17 03:03:21 -06:00
dependabot[bot]
9d591f9f7c build(deps): bump actions/checkout from 4.1.5 to 4.1.6 (#44234)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.5 to 4.1.6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](44c2b7a8a4...a5ac7e51b4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-17 10:57:08 +02:00
Matt Thompson
f8ad915100 esmf: add v8.6.1 (#44229) 2024-05-17 10:49:47 +02:00
Garth N. Wells
cbbabe6920 fenics-dolfinx: add spdlog dependency (#44237) 2024-05-17 10:48:55 +02:00
John W. Parent
81fe460194 Gitlab CI: Windows Configs (#43967)
Add support for Gitlab CI on Windows

This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows 
infrastructure).

* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2024-05-16 17:00:02 -06:00
Paul Kuberry
b894f996c0 trilinos: catch kokkos inconsistency with trilinos (#44209)
* trilinos: catch kokkos inconsistency with trilinos

* trilinos: update kokkos version range
2024-05-16 13:36:11 -06:00
John W. Parent
1ce09847d9 Prefer llnl.util.symlink.readlink to os.readlink (#44126)
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).

Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
2024-05-16 10:56:04 -07:00
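
A minimal sketch of the normalization described above, assuming the only cleanup needed is stripping the \\?\ and \\?\UNC\ long-path prefixes from the link target; it is not a copy of llnl.util.symlink.readlink, just an illustration of the idea.

    import os

    _LONGPATH_PREFIX = "\\\\?\\"          # the string \\?\
    _UNC_LONGPATH_PREFIX = "\\\\?\\UNC\\"  # the string \\?\UNC\


    def readlink_without_longpath_prefix(path):
        """os.readlink, with Windows long-path prefixes stripped from the target."""
        target = os.readlink(path)
        if target.startswith(_UNC_LONGPATH_PREFIX):
            # \\?\UNC\server\share\... -> \\server\share\...
            return "\\\\" + target[len(_UNC_LONGPATH_PREFIX):]
        if target.startswith(_LONGPATH_PREFIX):
            return target[len(_LONGPATH_PREFIX):]
        return target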
Thomas Madlener
722d401394 gaudi: Don't apply the patch if it has already landed upstream (#44180) 2024-05-16 17:15:39 +02:00
Howard Pritchard
e6f04d5ef9 py-matplotlib: qualify when to do a post install (#44191)
* py-matplotlib: qualify when to do a post install

Older versions of py-matplotlib don't seem to have some of the
files that the post install step is trying to install.
Looks like the files first appeared in 3.6.0 and later.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>

* Change install paths for older matplotlib

---------

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-16 16:18:44 +02:00
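
A hedged sketch of the version-gating pattern this commit describes: run the post-install step only when the spec is new enough to ship the extra files. The class name and the created path are placeholders, not the real py-matplotlib recipe.

    from spack.package import *


    class PyMatplotlibLike(PythonPackage):
        """Illustrative only: gate a post-install step on the package version."""

        @run_after("install")
        def install_extra_files(self):
            # the extra files only ship with 3.6.0 and newer
            if not self.spec.satisfies("@3.6.0:"):
                return
            mkdirp(self.prefix.share.matplotlib)  # stand-in for the real copy step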
Mosè Giordano
b8e3ecbf00 suite-sparse: improve setting of the libs property (#44214)
on some distros it is in lib64/
2024-05-16 11:07:18 +02:00
Todd Gamblin
d189387c24 bugfix: add arg to write_line_break() in spack_yaml (#42727)
`ruamel`'s `Emitter.write_line_break()` method takes an extra argument that we forgot to
implement in our custom emitter.
2024-05-15 19:25:06 -07:00
Andrew Lister
9e96ddc5ae CoinHSL: Support the Meson build system and add new release (#43610)
* Add maintainer and fix linting
* allow for fewer deps in archive
* use meson for archive packages
* Fix version spec and f-string
* fix blas dependency links
* Add new release to spack
* Fix checksums for latest release
2024-05-15 16:48:23 -06:00
dmagdavector
543bd189af iperf3: updated versions from 3.6 to 3.16 (#44152)
* Update version of iperf3 from 3.6 to 3.16

Spack currently only has version 3.6 of the iPerf3 package (from ESnet / LBNL) explicitly listed. This makes the latest version, 3.16, the default and adds some other versions (found in some Linux distros, for possible compatibility purposes).

* iperf3: update to 3.17; update 3.6 hash for new url

* protobuf: update hash for patch needed when="@3.4:3.21"

* Revert "protobuf: update hash for patch needed when="@3.4:3.21""

This reverts commit 4d168d0b27.
2024-05-15 15:48:15 -06:00
John W. Parent
43291aa723 Cdash reporting timeout (#44213)
* Add timeout to cdash reporter PUT request

Add cdash timeout everywhere
Correct mock responder api

* Style

* brief doc
2024-05-15 15:41:51 -04:00
Alec Scott
d0589285f7 unmaintained pkgs: bump versions 2024-05-10 (#44131)
* unmaintained pkgs: bump versions 2024-05-10

* openblas: fix satisfies syntax

* pixman: add autotools dependencies

* [@spackbot] updating style on behalf of alecbcs

* pixman: revert

* chapel: revert changes in favor of other PR

* openblas: revert due to failing tests

* Address review feedback for flint, biobam2, and pango

* pango: add version comment about v2.0

* numactl: revert changes due to ppc4le bug

* flint: remove duplicate configure arg

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* openvkl, rkcommon: remove commented maintainers template

* flint: fix style

---------

Co-authored-by: alecbcs <alecbcs@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-15 11:04:52 -07:00
afzpatel
d079aaa083 enable tensorflow-2.14 and 2.16 for spack built ROCm (#44095)
* initial commit to enable tensorflow-2.14 for spack built ROCm

* fix style errors

* modify hipcc patch

* updates for rocm 6.1

* updates for tf-rocm-enhanced 2.16

* fix styling

* fix styling

* add patch for 2.16

* add patch for 2.16

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* update rocm enhanced version names

* changes for rocm-enhanced version name change

* fix styling

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-05-15 17:06:35 +02:00
Paolo
6c65977e0d Fix gromacs installation with SVE. Issue 44062 (#44183)
* Fix gromacs installation with SVE. Issue 44062

* [@spackbot] updating style on behalf of paolotricerri

* Remove `neoverse_n2` target

We have removed the neoverse_n2 target, as its detection is more involved
than for neoverse_v*.
2024-05-15 07:22:47 -06:00
Carlos Bederián
1b5d786cf5 gromacs: add 2024.2, 2023.5 (#44197) 2024-05-15 07:21:55 -06:00
Ben Morgan
4cf00645bd VecGeom: new version 1.2.8 (#44179) 2024-05-15 07:50:48 -04:00
Jon Rood
e9149cfc3c nalu-wind: fix mistake (#44188) 2024-05-14 23:13:13 -06:00
Jon Rood
a5c8111076 exawind: updates to package to allow mixed device (#44159)
* exawind: updates to package to allow mixed device

* Style.

* Remove ninja variants.

* Add conflict for amr-wind+hypre with mixed device.

* Relax amr-wind~hypre requirement.

* Move runtime variables to nalu-wind.

* Update suggestions.

* Remove umpire.
2024-05-14 12:26:07 -06:00
Alec Scott
c3576f712d pkg-config: support apple-clang@15: (#44007) 2024-05-14 17:24:36 +02:00
Alec Scott
410e6a59b7 rust: fix v1.78.0 instructions (#44127) 2024-05-14 08:14:34 -07:00
Carlos Bederián
bd2b2fb75a python: add 3.10.14, 3.9.19, 3.8.19 (#44162) 2024-05-14 17:03:45 +02:00
Zachary Newell
7ae318efd0 Added NCCL version 2.21.5-1 (#44158) 2024-05-14 03:34:21 -06:00
Derek Ryan Strong
73e9d56647 Update bash 5.2 patches (#44172) 2024-05-14 09:08:39 +02:00
Derek Ryan Strong
f87a752b63 Add zsh 5.8.1 and 5.9 (#44173) 2024-05-14 09:08:08 +02:00
Derek Ryan Strong
ae2fec30c3 Add newer fish versions (#44174) 2024-05-14 09:07:49 +02:00
Derek Ryan Strong
1af5564cbe Add file 5.45 (#44175) 2024-05-14 09:07:25 +02:00
Derek Ryan Strong
a8f057a701 Add man-db 2.12.0 and 2.12.1 (#44176) 2024-05-14 09:07:10 +02:00
Michael B Kuhn
7f3dd38ccc amr-wind: add latest versions and correct waves2amr details (#44099)
* add latest versions and correct waves2amr details

* update commit associated with v2.1.0
2024-05-13 14:21:25 -06:00
Adam J. Stewart
8e9adefcd5 ML CI: update image (#43751)
* ML CI: update image

* Use main branch

* Use tagged version
2024-05-13 21:43:52 +02:00
jdomke
d276f9700f fujitsu-mpi package: help CMake find wrappers (#43979)
Set the MPI_C_COMPILER etc. env vars when building with fujitsu-mpi; these
override CMake's MPI detection logic. Without these variables explicitly
requesting the Fujitsu MPI wrappers, CMake's builtin logic is unwilling to
use the wrappers unless the Fujitsu compiler is used, but they should be
used for %clang as well.

This avoids patching CMake module files like in #16864, primarily to
avoid the possibility of altering behavior for specs that do not use
%fj or ^fujitsu-mpi.
2024-05-13 11:15:45 -07:00
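
A sketch of the approach under the assumptions that setup_dependent_build_environment is the hook used and that the wrapper names are mpifcc/mpiFCC/mpifrt; the real fujitsu-mpi package may differ in both respects. CMake's FindMPI honors MPI_<lang>_COMPILER over its own detection, so exporting these makes dependents pick up the vendor wrappers even with %clang.

    from spack.package import *


    class FujitsuMpiLike(Package):
        """Illustrative only: steer CMake's FindMPI to the vendor MPI wrappers."""

        def setup_dependent_build_environment(self, env, dependent_spec):
            # CMake prefers MPI_<lang>_COMPILER over its builtin wrapper search.
            env.set("MPI_C_COMPILER", self.prefix.bin.mpifcc)
            env.set("MPI_CXX_COMPILER", self.prefix.bin.mpiFCC)
            env.set("MPI_Fortran_COMPILER", self.prefix.bin.mpifrt)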
Harmen Stoppels
4f111659ec glibc: detect from "Free Software Foundation" not "gnu" (#44154)
which should be more generic
2024-05-13 20:11:27 +02:00
Dave Keeshan
eaf330f2a8 yosys: add v0.41 (#44153) 2024-05-13 09:40:42 -07:00
Julien Cortial
cdaeb74dc7 cdt: Add version 1.4.1 (#44155) 2024-05-13 09:38:25 -07:00
Alberto Sartori
fbaac46604 justbuild: add version 1.3.0 (#44148) 2024-05-13 09:35:29 -07:00
Jon Rood
7f6210ee90 nalu-wind: updates (#44046) 2024-05-13 10:28:24 -06:00
Wouter Deconinck
63f6e6079a gaudi: upstream patch when @38.1 for missing #include <list> (#44121)
* gaudi: upstream patch when @38.1 for missing #include <list>

* gaudi: apply list patch for all versions
2024-05-13 07:37:31 -06:00
Greg Becker
d4fd6caae0 spack uninstall: improve error message for dependent environment (#44149) 2024-05-13 14:58:13 +02:00
Mikael Simberg
fd3c18b6fd whip: Add 0.3.0 (#44146)
Co-authored-by: Mikael Simberg <simbergm@cscs.ch>
2024-05-13 03:09:32 -06:00
Adam J. Stewart
725f427f25 spack checksum: do not add expand=False to wheels (#44118) 2024-05-13 10:01:47 +02:00
Paul R. C. Kent
32b3e91ef7 llvm: add 18.1.5, 18.1.4 (#44123) 2024-05-12 16:28:32 -07:00
bk
b7e4602268 R: common package updates for 4.4.0 (#44088)
* r-ggplot2: v3.5.1, r-haven: v2.5.4, r-jsonlite: v1.8.8, r-pkgload: v1.3.4, r-vctrs: v0.6.5

* r-scales: v1.3.0
2024-05-12 10:49:38 -07:00
Stephen Sachs
4a98d4db93 Add applications to aws-pcluster-* stacks (#43901)
* Add openfoam to aws-pcluster-neoverse_v1 stack

* Add more apps to aws-pcluster-x86_64_v4 stack

* Remove WRF while hdf5 cannot build in buildcache at the moment

* Update comment

* Add more apps for aws-pcluster-neoverse_v1 stack

* Remove apps that currently do not build

* Disable those packages that won't build

* Modify syntax such that correct cflags are used

* Changing syntax again to what works with other packages

* Fix overriding packages.yaml entry for gettext

* Use new `prefer` and `require:when` clauses to clarify intent

* Use newer spack version to install intel compiler

This removes the need for patches and makes sure the `prefer` directives in
`package.yaml` are understood.

* `prefer` not strong enough, need to set compilers

* Revert "Use newer spack version to install intel compiler"

This reverts commit ecb25a192c.

Cannot update the spack version used to install the intel compiler, as this changes the
compiler hash but not the version. This leads to incompatible compiler paths. If
we update this spack version in the future, make sure the compiler version also updates.

Tested-by: Stephen Sachs <stesachs@amazon.com>

* Remove `prefer` clause as it is not strong enough for our needs

This way we can safely go back to installing the intel compiler with an older
spack version.

* Prefer gcc or oneapi to build gettext

* Pin gettext version compatible with system glibc-headers

* relax gettext version requirement to enable later versions

* oneapi cannot build older gettext version
2024-05-12 10:48:02 -07:00
Juan Miguel Carceller
9d6bf373be whizard: Add version 3.1.4 (#43045)
* whizard: Add a patch to fix builds with pythia8 >= 8.310

* Add whizard 3.1.4 and update accordingly

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:44:56 -07:00
Alex Richert
cff35c4987 octave: use pcre2 for @8: (#42636)
* octave: use pcre2 for @8:

* Add 'pcre2' variant to octave to control pcre vs. pcre2

* Update var/spack/repos/builtin/packages/octave/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alex Richert <alexander.richert@noaa.gov>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-12 10:44:31 -07:00
Juan Miguel Carceller
d594f84b8f fastjet: Add a cxxstd variant (#44072)
* fastjet: Add a cxxstd variant

* Use f-strings

* Add multi=False

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-12 10:42:56 -07:00
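
A hedged sketch of the single-valued variant pattern mentioned in this commit; the value list and the configure flag are illustrative, not the real fastjet recipe.

    from spack.package import *


    class FastjetLike(AutotoolsPackage):
        """Illustrative only: a single-valued cxxstd variant forwarded to configure."""

        variant(
            "cxxstd",
            default="11",
            values=("11", "14", "17"),
            multi=False,  # exactly one C++ standard may be selected
            description="C++ standard used to build",
        )

        def configure_args(self):
            # hypothetical flag name; the real package may spell this differently
            return [f"--enable-cxxstd={self.spec.variants['cxxstd'].value}"]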
Wouter Deconinck
f8f01c336c clang: support cxx20_flag and cxx23_flag (#43438)
* clang: support cxx20_flag and cxx23_flag

* clang: coverage test cxx{}_flag and c{}_flag additions
2024-05-12 07:45:59 -07:00
232 changed files with 2590 additions and 1049 deletions

View File

@@ -28,7 +28,7 @@ jobs:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{inputs.python_version}}
@@ -61,7 +61,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits

View File

@@ -37,7 +37,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Bootstrap clingo
@@ -60,7 +60,7 @@ jobs:
run: |
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -92,7 +92,7 @@ jobs:
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- name: Bootstrap GnuPG
@@ -121,7 +121,7 @@ jobs:
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

View File

@@ -56,7 +56,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta

View File

@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0

View File

@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

View File

@@ -51,7 +51,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
@@ -100,7 +100,7 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -141,7 +141,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version
@@ -160,7 +160,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -198,7 +198,7 @@ jobs:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -223,7 +223,7 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -70,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -15,7 +15,7 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -33,7 +33,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
@@ -41,7 +41,7 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -59,7 +59,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
- uses: codecov/codecov-action@6d798873df2b1b8e5846dba6fb86631229fbcb17
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
@@ -67,7 +67,7 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

View File

@@ -144,3 +144,5 @@ switch($SpackSubCommand)
"unload" {Invoke-SpackLoad}
default {python "$Env:SPACK_ROOT/bin/spack" $SpackCMD_params $SpackSubCommand $SpackSubCommandArgs}
}
exit $LASTEXITCODE

View File

@@ -147,6 +147,15 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
def autoreconf(self, spec, prefix):
which("bash")("autogen.sh")
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
class AutotoolsBuilder(AutotoolsBuilder):
def autoreconf(self, pkg, spec, prefix):
which("bash")("autogen.sh")
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""

View File

@@ -25,7 +25,7 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI

View File

@@ -476,9 +476,3 @@ implemented using Python's built-in `sys.path
:py:mod:`spack.repo` module implements a custom `Python importer
<https://docs.python.org/2/library/imp.html>`_.
.. warning::
The mechanism for extending packages is not yet extensively tested,
and extending packages across repositories imposes inter-repo
dependencies, which may be hard to manage. Use this feature at your
own risk, but let us know if you have a use case for it.

View File

@@ -843,7 +843,7 @@ def copy_tree(
if islink(s):
link_target = resolve_link_target_relative_to_the_link(s)
if symlinks:
target = os.readlink(s)
target = readlink(s)
if os.path.isabs(target):
def escaped_path(path):
@@ -2531,8 +2531,14 @@ def establish_link(self):
# for each binary install dir in self.pkg (i.e. pkg.prefix.bin, pkg.prefix.lib)
# install a symlink to each dependent library
for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
self._link(library, lib_dir)
# do not rpath for system libraries included in the dag
# we should not be modifying libraries managed by the Windows system
# as this will negatively impact linker behavior and can result in permission
# errors if those system libs are not modifiable by Spack
if "windows-system" not in getattr(self.pkg, "tags", []):
for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
self._link(library, lib_dir)
@system_path_filter

View File

@@ -43,7 +43,7 @@
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Set, Tuple
from typing import Dict, List, Set, Tuple
import llnl.util.tty as tty
from llnl.string import plural
@@ -730,12 +730,28 @@ def _static_to_shared_library(arch, compiler, static_lib, shared_lib=None, **kwa
return compiler(*compiler_args, output=compiler_output)
def get_rpath_deps(pkg):
"""Return immediate or transitive RPATHs depending on the package."""
if pkg.transitive_rpaths:
return [d for d in pkg.spec.traverse(root=False, deptype=("link"))]
else:
return pkg.spec.dependencies(deptype="link")
def _get_rpath_deps_from_spec(
spec: spack.spec.Spec, transitive_rpaths: bool
) -> List[spack.spec.Spec]:
if not transitive_rpaths:
return spec.dependencies(deptype=dt.LINK)
by_name: Dict[str, spack.spec.Spec] = {}
for dep in spec.traverse(root=False, deptype=dt.LINK):
lookup = by_name.get(dep.name)
if lookup is None:
by_name[dep.name] = dep
elif lookup.version < dep.version:
by_name[dep.name] = dep
return list(by_name.values())
def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]:
"""Return immediate or transitive dependencies (depending on the package) that need to be
rpath'ed. If a package occurs multiple times, the newest version is kept."""
return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)
def get_rpaths(pkg):

View File

@@ -44,6 +44,7 @@
from spack import traverse
from spack.error import SpackError
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.reporters.cdash import build_stamp as cdash_build_stamp
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
@@ -683,6 +684,22 @@ def generate_gitlab_ci_yaml(
"instead.",
)
def ensure_expected_target_path(path):
"""Returns passed paths with all Windows path separators exchanged
for posix separators only if copy_only_pipeline is enabled
This is required as copy_only_pipelines are a unique scenario where
the generate job and child pipelines are run on different platforms.
To make this compatible w/ Windows, we cannot write Windows style path separators
that will be consumed on by the Posix copy job runner.
TODO (johnwparent): Refactor config + cli read/write to deal only in posix
style paths
"""
if copy_only_pipeline and path:
path = path.replace("\\", "/")
return path
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
deprecated_mirror_config = False
buildcache_destination = None
@@ -806,7 +823,7 @@ def generate_gitlab_ci_yaml(
if scope not in include_scopes and scope not in env_includes:
include_scopes.insert(0, scope)
env_includes.extend(include_scopes)
env_yaml_root["spack"]["include"] = env_includes
env_yaml_root["spack"]["include"] = [ensure_expected_target_path(i) for i in env_includes]
if "gitlab-ci" in env_yaml_root["spack"] and "ci" not in env_yaml_root["spack"]:
env_yaml_root["spack"]["ci"] = env_yaml_root["spack"].pop("gitlab-ci")
@@ -1227,6 +1244,9 @@ def main_script_replacements(cmd):
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
"SPACK_REQUIRE_SIGNING": os.environ.get("SPACK_REQUIRE_SIGNING", "False"),
}
output_vars = output_object["variables"]
for item, val in output_vars.items():
output_vars[item] = ensure_expected_target_path(val)
# TODO: Remove this block in Spack 0.23
if deprecated_mirror_config and remote_mirror_override:
@@ -1283,7 +1303,6 @@ def main_script_replacements(cmd):
sorted_output = {}
for output_key, output_value in sorted(output_object.items()):
sorted_output[output_key] = output_value
if known_broken_specs_encountered:
tty.error("This pipeline generated hashes known to be broken on develop:")
display_broken_spec_messages(broken_specs_url, known_broken_specs_encountered)
@@ -1506,7 +1525,7 @@ def download_and_extract_artifacts(url, work_dir):
request = Request(url, headers=headers)
request.get_method = lambda: "GET"
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:
@@ -2254,7 +2273,7 @@ def create_buildgroup(self, opener, headers, url, group_name, group_type):
request = Request(url, data=enc_data, headers=headers)
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code not in [200, 201]:
@@ -2300,7 +2319,7 @@ def populate_buildgroup(self, job_names):
request = Request(url, data=enc_data, headers=headers)
request.get_method = lambda: "PUT"
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:

View File

@@ -13,7 +13,6 @@
import shutil
import sys
import tempfile
import urllib.request
from typing import Dict, List, Optional, Tuple, Union
import llnl.util.tty as tty
@@ -54,6 +53,7 @@
from spack.oci.oci import (
copy_missing_layers_with_retry,
get_manifest_and_config_with_retry,
list_tags,
upload_blob_with_retry,
upload_manifest_with_retry,
)
@@ -856,10 +856,7 @@ def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
def _update_index_oci(image_ref: ImageReference, tmpdir: str, pool: MaybePool) -> None:
request = urllib.request.Request(url=image_ref.tags_url())
response = spack.oci.opener.urlopen(request)
spack.oci.opener.ensure_status(request, response, 200)
tags = json.load(response)["tags"]
tags = list_tags(image_ref)
# Fetch all image config files in parallel
spec_dicts = pool.starmap(

View File

@@ -239,6 +239,8 @@ def get_uninstall_list(args, specs: List[spack.spec.Spec], env: Optional[ev.Envi
print()
tty.info("The following environments still reference these specs:")
colify([e.name for e in other_dependent_envs.keys()], indent=4)
if env:
msgs.append("use `spack remove` to remove the spec from the current environment")
msgs.append("use `spack env remove` to remove environments")
msgs.append("use `spack uninstall --force` to override")
print()

View File

@@ -96,6 +96,8 @@ def verbose_flag(self):
openmp_flag = "-fopenmp"
# C++ flags based on CMake Modules/Compiler/Clang.cmake
@property
def cxx11_flag(self):
if self.real_version < Version("3.3"):
@@ -120,6 +122,24 @@ def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("5.0"):
raise UnsupportedCompilerFlag(self, "the C++20 standard", "cxx20_flag", "< 5.0")
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("12.0"):
raise UnsupportedCompilerFlag(self, "the C++23 standard", "cxx23_flag", "< 12.0")
elif self.real_version < Version("17.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
return "-std=c99"
@@ -142,7 +162,10 @@ def c17_flag(self):
def c23_flag(self):
if self.real_version < Version("9.0"):
raise UnsupportedCompilerFlag(self, "the C23 standard", "c23_flag", "< 9.0")
return "-std=c2x"
elif self.real_version < Version("18.0"):
return "-std=c2x"
else:
return "-std=c23"
@property
def cc_pic_flag(self):

View File

@@ -15,6 +15,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.config
import spack.hash_types as ht
@@ -181,7 +182,7 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
base_dir = (
self.path_for_spec(deprecator_spec)
if deprecator_spec
else os.readlink(deprecated_spec.prefix)
else readlink(deprecated_spec.prefix)
)
yaml_path = os.path.join(

View File

@@ -22,7 +22,7 @@
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import symlink
from llnl.util.symlink import readlink, symlink
import spack.compilers
import spack.concretize
@@ -662,7 +662,7 @@ def _current_root(self):
if not os.path.islink(self.root):
return None
root = os.readlink(self.root)
root = readlink(self.root)
if os.path.isabs(root):
return root

View File

@@ -11,7 +11,7 @@
import urllib.parse
import urllib.request
from http.client import HTTPResponse
from typing import NamedTuple, Tuple
from typing import List, NamedTuple, Tuple
from urllib.request import Request
import llnl.util.tty as tty
@@ -27,6 +27,7 @@
import spack.stage
import spack.traverse
import spack.util.crypto
import spack.util.url
from .image import Digest, ImageReference
@@ -69,6 +70,42 @@ def with_query_param(url: str, param: str, value: str) -> str:
)
def list_tags(ref: ImageReference, _urlopen: spack.oci.opener.MaybeOpen = None) -> List[str]:
"""Retrieves the list of tags associated with an image, handling pagination."""
_urlopen = _urlopen or spack.oci.opener.urlopen
tags = set()
fetch_url = ref.tags_url()
while True:
# Fetch tags
request = Request(url=fetch_url)
response = _urlopen(request)
spack.oci.opener.ensure_status(request, response, 200)
tags.update(json.load(response)["tags"])
# Check for pagination
link_header = response.headers["Link"]
if link_header is None:
break
tty.debug(f"OCI tag pagination: {link_header}")
rel_next_value = spack.util.url.parse_link_rel_next(link_header)
if rel_next_value is None:
break
rel_next = urllib.parse.urlparse(rel_next_value)
if rel_next.scheme not in ("https", ""):
break
fetch_url = ref.endpoint(rel_next_value)
return sorted(tags)
def upload_blob(
ref: ImageReference,
file: str,

View File

@@ -161,7 +161,11 @@ def windows_establish_runtime_linkage(self):
Performs symlinking to incorporate rpath dependencies to Windows runtime search paths
"""
if sys.platform == "win32":
# If spec is an external, we should not be modifying its bin directory, as we would
# be doing in this method
# Spack should in general not modify things it has not installed
# we can reasonably expect externals to have their link interface properly established
if sys.platform == "win32" and not self.spec.external:
self.win_rpath.add_library_dependent(*self.win_add_library_dependent())
self.win_rpath.add_rpath(*self.win_add_rpath())
self.win_rpath.establish_link()
@@ -2446,9 +2450,18 @@ def rpath(self):
# on Windows, libraries of runtime interest are typically
# stored in the bin directory
# Do not include Windows system libraries in the rpath interface
# these libraries are handled automatically by VS/VCVARS and adding
# Spack derived system libs into the link path or address space of a program
# can result in conflicting versions, which makes Spack packages less useable
if sys.platform == "win32":
rpaths = [self.prefix.bin]
rpaths.extend(d.prefix.bin for d in deps if os.path.isdir(d.prefix.bin))
rpaths.extend(
d.prefix.bin
for d in deps
if os.path.isdir(d.prefix.bin)
and "windows-system" not in getattr(d.package, "tags", [])
)
else:
rpaths = [self.prefix.lib, self.prefix.lib64]
rpaths.extend(d.prefix.lib for d in deps if os.path.isdir(d.prefix.lib))

View File

@@ -10,6 +10,7 @@
import archspec.cpu
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.target
import spack.version
@@ -133,7 +134,7 @@ def craype_type_and_version(cls):
# Take the default version from known symlink path
default_path = os.path.join(craype_dir, "default")
if os.path.islink(default_path):
version = spack.version.Version(os.readlink(default_path))
version = spack.version.Version(readlink(default_path))
return (craype_type, version)
# If no default version, sort available versions and return latest

View File

@@ -566,7 +566,7 @@ def make_link_relative(new_links, orig_links):
orig_links (list): original links
"""
for new_link, orig_link in zip(new_links, orig_links):
target = os.readlink(orig_link)
target = readlink(orig_link)
relative_target = os.path.relpath(target, os.path.dirname(orig_link))
os.unlink(new_link)
symlink(relative_target, new_link)

View File

@@ -58,7 +58,8 @@
# Initialize data structures common to each phase's report.
CDASH_PHASES = set(MAP_PHASES_TO_CDASH.values())
CDASH_PHASES.add("update")
# CDash request timeout in seconds
SPACK_CDASH_TIMEOUT = 45
CDashConfiguration = collections.namedtuple(
"CDashConfiguration", ["upload_url", "packages", "build", "site", "buildstamp", "track"]
@@ -447,7 +448,7 @@ def upload(self, filename):
# By default, urllib2 only support GET and POST.
# CDash expects this file to be uploaded via PUT.
request.get_method = lambda: "PUT"
response = opener.open(request)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
if self.current_package_name not in self.buildIds:
resp_value = response.read()
if isinstance(resp_value, bytes):

View File

@@ -9,7 +9,7 @@
import tempfile
from collections import OrderedDict
from llnl.util.symlink import symlink
from llnl.util.symlink import readlink, symlink
import spack.binary_distribution as bindist
import spack.error
@@ -26,7 +26,7 @@ def _relocate_spliced_links(links, orig_prefix, new_prefix):
in our case. This still needs to be called after the copy to destination
because it expects the new directory structure to be in place."""
for link in links:
link_target = os.readlink(os.path.join(orig_prefix, link))
link_target = readlink(os.path.join(orig_prefix, link))
link_target = re.sub("^" + orig_prefix, new_prefix, link_target)
new_link_path = os.path.join(new_prefix, link)
os.unlink(new_link_path)

View File

@@ -22,6 +22,7 @@
import archspec.cpu
from llnl.util.filesystem import join_path, visit_directory_tree
from llnl.util.symlink import readlink
import spack.binary_distribution as bindist
import spack.caches
@@ -1062,10 +1063,10 @@ def test_tarball_common_prefix(dummy_prefix, tmpdir):
assert set(os.listdir(os.path.join("prefix2", "share"))) == {"file"}
# Relative symlink should still be correct
assert os.readlink(os.path.join("prefix2", "bin", "relative_app_link")) == "app"
assert readlink(os.path.join("prefix2", "bin", "relative_app_link")) == "app"
# Absolute symlink should remain absolute -- this is for relocation to fix up.
assert os.readlink(os.path.join("prefix2", "bin", "absolute_app_link")) == os.path.join(
assert readlink(os.path.join("prefix2", "bin", "absolute_app_link")) == os.path.join(
dummy_prefix, "bin", "app"
)

View File

@@ -14,6 +14,7 @@
import spack.build_environment
import spack.config
import spack.deptypes as dt
import spack.package_base
import spack.spec
import spack.util.spack_yaml as syaml
@@ -716,3 +717,21 @@ def test_build_system_globals_only_set_on_root_during_build(default_mock_concret
for depth, spec in root.traverse(depth=True, root=True):
for variable in build_variables:
assert hasattr(spec.package.module, variable) == should_be_set(depth)
def test_rpath_with_duplicate_link_deps():
"""If we have two instances of one package in the same link sub-dag, only the newest version is
rpath'ed. This is for runtime support without splicing."""
runtime_1 = spack.spec.Spec("runtime@=1.0")
runtime_2 = spack.spec.Spec("runtime@=2.0")
child = spack.spec.Spec("child@=1.0")
root = spack.spec.Spec("root@=1.0")
root.add_dependency_edge(child, depflag=dt.LINK, virtuals=())
root.add_dependency_edge(runtime_2, depflag=dt.LINK, virtuals=())
child.add_dependency_edge(runtime_1, depflag=dt.LINK, virtuals=())
rpath_deps = spack.build_environment._get_rpath_deps_from_spec(root, transitive_rpaths=True)
assert child in rpath_deps
assert runtime_2 in rpath_deps
assert runtime_1 not in rpath_deps

View File

@@ -51,7 +51,7 @@ def __init__(self, response_code=200, content_to_read=[]):
self._content = content_to_read
self._read = [False for c in content_to_read]
def open(self, request):
def open(self, request, data=None, timeout=object()):
return self
def getcode(self):

View File

@@ -15,6 +15,7 @@
import llnl.util.filesystem as fs
import llnl.util.link_tree
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.cmd.env
import spack.config
@@ -4414,8 +4415,8 @@ def test_env_view_resolves_identical_file_conflicts(tmp_path, install_mockery, m
# view-file/bin/
# x # expect this x to be linked
assert os.readlink(tmp_path / "view" / "bin" / "x") == bottom.bin.x
assert os.readlink(tmp_path / "view" / "bin" / "y") == top.bin.y
assert readlink(tmp_path / "view" / "bin" / "x") == bottom.bin.x
assert readlink(tmp_path / "view" / "bin" / "y") == top.bin.y
def test_env_view_ignores_different_file_conflicts(tmp_path, install_mockery, mock_fetch):
@@ -4426,4 +4427,4 @@ def test_env_view_ignores_different_file_conflicts(tmp_path, install_mockery, mo
install()
prefix_dependent = e.matching_spec("view-ignore-conflict").prefix
# The dependent's file is linked into the view
assert os.readlink(tmp_path / "view" / "bin" / "x") == prefix_dependent.bin.x
assert readlink(tmp_path / "view" / "bin" / "x") == prefix_dependent.bin.x

View File

@@ -384,9 +384,18 @@ def test_clang_flags():
unsupported_flag_test("cxx17_flag", "clang@3.4")
supported_flag_test("cxx17_flag", "-std=c++1z", "clang@3.5")
supported_flag_test("cxx17_flag", "-std=c++17", "clang@5.0")
unsupported_flag_test("cxx20_flag", "clang@4.0")
supported_flag_test("cxx20_flag", "-std=c++2a", "clang@5.0")
supported_flag_test("cxx20_flag", "-std=c++20", "clang@11.0")
unsupported_flag_test("cxx23_flag", "clang@11.0")
supported_flag_test("cxx23_flag", "-std=c++2b", "clang@12.0")
supported_flag_test("cxx23_flag", "-std=c++23", "clang@17.0")
supported_flag_test("c99_flag", "-std=c99", "clang@3.3")
unsupported_flag_test("c11_flag", "clang@2.0")
supported_flag_test("c11_flag", "-std=c11", "clang@6.1.0")
unsupported_flag_test("c23_flag", "clang@8.0")
supported_flag_test("c23_flag", "-std=c2x", "clang@9.0")
supported_flag_test("c23_flag", "-std=c23", "clang@18.0")
supported_flag_test("cc_pic_flag", "-fPIC", "clang@3.3")
supported_flag_test("cxx_pic_flag", "-fPIC", "clang@3.3")
supported_flag_test("f77_pic_flag", "-fPIC", "clang@3.3")

View File

@@ -14,7 +14,7 @@
import pytest
import llnl.util.filesystem as fs
from llnl.util.symlink import islink, symlink
from llnl.util.symlink import islink, readlink, symlink
import spack.paths
@@ -181,7 +181,7 @@ def test_symlinks_true(self, stage):
assert os.path.exists("dest/a/b2")
with fs.working_dir("dest/a"):
assert os.path.exists(os.readlink("b2"))
assert os.path.exists(readlink("b2"))
assert os.path.realpath("dest/f/2") == os.path.abspath("dest/a/b/2")
assert os.path.realpath("dest/2") == os.path.abspath("dest/1")
@@ -281,7 +281,7 @@ def test_allow_broken_symlinks(self, stage):
symlink("nonexistant.txt", "source/broken", allow_broken_symlinks=True)
fs.install_tree("source", "dest", symlinks=True, allow_broken_symlinks=True)
assert os.path.islink("dest/broken")
assert not os.path.exists(os.readlink("dest/broken"))
assert not os.path.exists(readlink("dest/broken"))
def test_glob_src(self, stage):
"""Test using a glob as the source."""

View File

@@ -7,6 +7,8 @@
import pytest
from llnl.util.symlink import readlink
import spack.cmd.modules
import spack.config
import spack.error
@@ -78,7 +80,7 @@ def test_modules_default_symlink(
link_path = os.path.join(os.path.dirname(mock_module_filename), "default")
assert os.path.islink(link_path)
assert os.readlink(link_path) == mock_module_filename
assert readlink(link_path) == mock_module_filename
generator.remove()
assert not os.path.lexists(link_path)

View File

@@ -151,7 +151,9 @@ class InMemoryOCIRegistry(DummyServer):
A third option is to use the chunked upload, but this is not implemented here, because
it's typically a major performance hit in upload speed, so we're not using it in Spack."""
def __init__(self, domain: str, allow_single_post: bool = True) -> None:
def __init__(
self, domain: str, allow_single_post: bool = True, tags_per_page: int = 100
) -> None:
super().__init__(domain)
self.router.register("GET", r"/v2/", self.index)
self.router.register("HEAD", r"/v2/(?P<name>.+)/blobs/(?P<digest>.+)", self.head_blob)
@@ -165,6 +167,9 @@ def __init__(self, domain: str, allow_single_post: bool = True) -> None:
# If True, allow single POST upload, not all registries support this
self.allow_single_post = allow_single_post
# How many tags are returned in a single request
self.tags_per_page = tags_per_page
# Used for POST + PUT upload. This is a map from session ID to image name
self.sessions: Dict[str, str] = {}
@@ -280,10 +285,34 @@ def handle_upload(self, req: Request, name: str, digest: Digest):
return MockHTTPResponse(201, "Created", headers={"Location": f"/v2/{name}/blobs/{digest}"})
def list_tags(self, req: Request, name: str):
# Paginate using Link headers, this was added to the spec in the following commit:
# https://github.com/opencontainers/distribution-spec/commit/2ed79d930ecec11dd755dc8190409a3b10f01ca9
# List all tags, exclude digests.
tags = [_tag for _name, _tag in self.manifests.keys() if _name == name and ":" not in _tag]
tags.sort()
return MockHTTPResponse.with_json(200, "OK", body={"tags": tags})
all_tags = sorted(
_tag for _name, _tag in self.manifests.keys() if _name == name and ":" not in _tag
)
query = urllib.parse.parse_qs(urllib.parse.urlparse(req.full_url).query)
n = int(query["n"][0]) if "n" in query else self.tags_per_page
if "last" in query:
try:
offset = all_tags.index(query["last"][0]) + 1
except ValueError:
return MockHTTPResponse(404, "Not found")
else:
offset = 0
tags = all_tags[offset : offset + n]
if offset + n < len(all_tags):
headers = {"Link": f'</v2/{name}/tags/list?last={tags[-1]}&n={n}>; rel="next"'}
else:
headers = None
return MockHTTPResponse.with_json(200, "OK", headers=headers, body={"tags": tags})
class DummyServerUrllibHandler(urllib.request.BaseHandler):

View File

@@ -6,6 +6,7 @@
import hashlib
import json
import random
import urllib.error
import urllib.parse
import urllib.request
@@ -19,6 +20,7 @@
copy_missing_layers,
get_manifest_and_config,
image_from_mirror,
list_tags,
upload_blob,
upload_manifest,
)
@@ -670,3 +672,31 @@ def test_retry(url, max_retries, expect_failure, expect_requests):
assert len(server.requests) == expect_requests
assert sleep_time == [2**i for i in range(expect_requests - 1)]
def test_list_tags():
# Follows a relatively new rewording of the OCI distribution spec, which is not yet tagged.
# https://github.com/opencontainers/distribution-spec/commit/2ed79d930ecec11dd755dc8190409a3b10f01ca9
N = 20
urlopen = create_opener(InMemoryOCIRegistry("example.com", tags_per_page=5)).open
image = ImageReference.from_string("example.com/image")
to_tag = lambda i: f"tag-{i:02}"
# Create N tags in arbitrary order
_tags_to_create = [to_tag(i) for i in range(N)]
random.shuffle(_tags_to_create)
for tag in _tags_to_create:
upload_manifest(image.with_tag(tag), default_manifest(), tag=True, _urlopen=urlopen)
# list_tags should return all tags from all pages in order
tags = list_tags(image, urlopen)
assert len(tags) == N
assert [to_tag(i) for i in range(N)] == tags
# Test a single request, which should give the first 5 tags
assert json.loads(urlopen(image.tags_url()).read())["tags"] == [to_tag(i) for i in range(5)]
# Test response at an offset, which should exclude the `last` tag.
assert json.loads(urlopen(image.tags_url() + f"?last={to_tag(N - 3)}").read())["tags"] == [
to_tag(i) for i in range(N - 2, N)
]

View File

@@ -16,7 +16,7 @@
import pytest
from llnl.util import filesystem as fs
from llnl.util.symlink import symlink
from llnl.util.symlink import readlink, symlink
import spack.binary_distribution as bindist
import spack.cmd.buildcache as buildcache
@@ -181,12 +181,12 @@ def test_relocate_links(tmpdir):
relocate_links(["to_self", "to_dependency", "to_system"], prefix_to_prefix)
# These two are relocated
assert os.readlink("to_self") == str(tmpdir.join("new_prefix_a", "file"))
assert os.readlink("to_dependency") == str(tmpdir.join("new_prefix_b", "file"))
assert readlink("to_self") == str(tmpdir.join("new_prefix_a", "file"))
assert readlink("to_dependency") == str(tmpdir.join("new_prefix_b", "file"))
# These two are not.
assert os.readlink("to_system") == system_path
assert os.readlink("to_self_but_relative") == "relative"
assert readlink("to_system") == system_path
assert readlink("to_self_but_relative") == "relative"
def test_needs_relocation():

View File

@@ -15,6 +15,7 @@
import pytest
from llnl.util.filesystem import getuid, mkdirp, partition_path, touch, working_dir
from llnl.util.symlink import readlink
import spack.error
import spack.paths
@@ -872,7 +873,7 @@ def _create_files_from_tree(base, tree):
def _create_tree_from_dir_recursive(path):
if os.path.islink(path):
return os.readlink(path)
return readlink(path)
elif os.path.isdir(path):
tree = {}
for name in os.listdir(path):

View File

@@ -87,6 +87,13 @@ def test_url_strip_name_suffixes(url, version, expected):
59,
"https://github.com/nextflow-io/nextflow/releases/download/v0.20.1/nextflow",
),
(
"hpcviewer",
30,
"2024.02",
51,
"https://gitlab.com/hpctoolkit/hpcviewer/-/releases/2024.02/downloads/hpcviewer.tgz",
),
# Version in stem
("zlib", 24, "1.2.10", 29, "http://zlib.net/fossils/zlib-1.2.10.tar.gz"),
(

View File

@@ -207,3 +207,29 @@ def test_default_download_name_dot_dot():
assert url_util.default_download_filename("https://example.com/.") == "_"
assert url_util.default_download_filename("https://example.com/..") == "_."
assert url_util.default_download_filename("https://example.com/.abcdef") == "_abcdef"
def test_parse_link_rel_next():
parse = url_util.parse_link_rel_next
assert parse(r'</abc>; rel="next"') == "/abc"
assert parse(r'</abc>; x=y; rel="next", </def>; x=y; rel="prev"') == "/abc"
assert parse(r'</abc>; rel="prev"; x=y, </def>; x=y; rel="next"') == "/def"
# example from RFC5988
assert (
parse(
r"""</TheBook/chapter2>; title*=UTF-8'de'letztes%20Kapitel; rel="previous","""
r"""</TheBook/chapter4>; title*=UTF-8'de'n%c3%a4chstes%20Kapitel; rel="next" """
)
== "/TheBook/chapter4"
)
assert (
parse(r"""<https://example.com/example>; key=";a=b, </c/d>; e=f"; rel="next" """)
== "https://example.com/example"
)
assert parse("https://example.com/example") is None
assert parse("<https://example.com/example; broken=broken") is None
assert parse("https://example.com/example; rel=prev") is None
assert parse("https://example.com/example; a=b; c=d; g=h") is None

View File

@@ -258,7 +258,9 @@ def parse_version_offset(path):
# 9th Pass: Version in path
# github.com/repo/name/releases/download/vver/name
# e.g. https://github.com/nextflow-io/nextflow/releases/download/v0.20.1/nextflow
# e.g. https://gitlab.com/hpctoolkit/hpcviewer/-/releases/2024.02/downloads/hpcviewer.tgz
(r"github\.com/[^/]+/[^/]+/releases/download/[a-zA-Z+._-]*v?(\d[\da-zA-Z._-]*)/", path),
(r"gitlab\.com/[^/]+/.+/-/releases/[a-zA-Z+._-]*v?(\d[\da-zA-Z._-]*)/downloads/", path),
# e.g. ftp://ftp.ncbi.nlm.nih.gov/blast/executables/legacy.NOTSUPPORTED/2.2.26/ncbi.tar.gz
(r"(\d[\da-zA-Z._-]*)/[^/]+$", path),
]

View File

@@ -21,16 +21,6 @@ def get_version_lines(version_hashes_dict: dict, url_dict: Optional[dict] = None
version_lines = []
for v, h in version_hashes_dict.items():
expand_arg = ""
# Extract the url for a version if url_dict is provided.
url = ""
if url_dict is not None and v in url_dict:
url = url_dict[v]
# Add expand_arg since wheels should not be expanded during staging
if url.endswith(".whl") or ".whl#" in url:
expand_arg = ", expand=False"
version_lines.append(f' version("{v}", sha256="{h}"{expand_arg})')
version_lines.append(f' version("{v}", sha256="{h}")')
return "\n".join(version_lines)

View File

@@ -22,7 +22,7 @@ def _libc_from_ldd(ldd: str) -> Optional["spack.spec.Spec"]:
except Exception:
return None
if not re.search(r"\b(?:gnu|glibc|arm)\b", stdout, re.IGNORECASE):
if not re.search(r"\bFree Software Foundation\b", stdout):
return None
version_str = re.match(r".+\(.+\) (.+)", stdout)
@@ -75,7 +75,7 @@ def libc_from_dynamic_linker(dynamic_linker: str) -> Optional["spack.spec.Spec"]
return spec
except Exception:
return None
elif re.search(r"\b(?:gnu|glibc|arm)\b", stdout, re.IGNORECASE):
elif re.search(r"\bFree Software Foundation\b", stdout):
# output is like "ld.so (...) stable release version 2.33."
match = re.search(r"version (\d+\.\d+(?:\.\d+)?)", stdout)
if not match:
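As a standalone sanity check of the version pattern used above (the sample string is illustrative, following the format noted in the comment; it is not part of the change itself):
import re
stdout = "ld.so (Ubuntu GLIBC 2.33-0ubuntu5) stable release version 2.33."
match = re.search(r"version (\d+\.\d+(?:\.\d+)?)", stdout)
assert match is not None and match.group(1) == "2.33"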

View File

@@ -296,8 +296,8 @@ def process_scalar(self):
if marked(self.event.value):
self.saved = self.event.value
def write_line_break(self):
super().write_line_break()
def write_line_break(self, data=None):
super().write_line_break(data)
if self.saved is None:
_ANNOTATIONS.append(colorize("@K{---}"))
return

View File

@@ -10,9 +10,11 @@
import itertools
import os
import posixpath
import re
import sys
import urllib.parse
import urllib.request
from typing import Optional
from llnl.path import convert_to_posix_path
@@ -254,3 +256,43 @@ def default_download_filename(url: str) -> str:
valid_name = "_" + valid_name[1:]
return valid_name
def parse_link_rel_next(link_value: str) -> Optional[str]:
"""Return the next link from a Link header value, if any."""
# Relaxed version of RFC5988
uri = re.compile(r"\s*<([^>]+)>\s*")
param_key = r"[^;=\s]+"
quoted_string = r"\"([^\"]+)\""
unquoted_param_value = r"([^;,\s]+)"
param = re.compile(rf";\s*({param_key})\s*=\s*(?:{quoted_string}|{unquoted_param_value})\s*")
data = link_value
# Parse a list of <url>; key=value; key=value, <url>; key=value; key=value, ... links.
while True:
uri_match = re.match(uri, data)
if not uri_match:
break
uri_reference = uri_match.group(1)
data = data[uri_match.end() :]
# Parse parameter list
while True:
param_match = re.match(param, data)
if not param_match:
break
key, quoted_value, unquoted_value = param_match.groups()
value = quoted_value or unquoted_value
data = data[param_match.end() :]
if key == "rel" and value == "next":
return uri_reference
if not data.startswith(","):
break
data = data[1:]
return None
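For illustration, a minimal sketch (not part of the diff) of how a caller could walk a paginated listing with this helper; it assumes parse_link_rel_next is in scope and that each page returns a JSON body with a "tags" list, as in an OCI tags listing:
import json
import urllib.parse
import urllib.request

def fetch_all_tags(tags_url: str) -> list:
    """Collect tags across pages by following Link: rel="next" headers."""
    tags, url = [], tags_url
    while url:
        response = urllib.request.urlopen(url)
        tags.extend(json.loads(response.read())["tags"])
        # parse_link_rel_next returns the rel="next" URI reference, or None on the last page
        next_ref = parse_link_rel_next(response.headers.get("Link", ""))
        # the next reference may be relative, so resolve it against the current URL
        url = urllib.parse.urljoin(url, next_ref) if next_ref else None
    return tags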

View File

@@ -9,6 +9,7 @@
from typing import Any, Dict
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.filesystem_view
import spack.store
@@ -38,7 +39,7 @@ def create_manifest_entry(path: str) -> Dict[str, Any]:
data: Dict[str, Any] = {"mode": s.st_mode, "owner": s.st_uid, "group": s.st_gid}
if stat.S_ISLNK(s.st_mode):
data["dest"] = os.readlink(path)
data["dest"] = readlink(path)
elif stat.S_ISREG(s.st_mode):
data["hash"] = compute_hash(path)
@@ -90,7 +91,7 @@ def check_entry(path, data):
# instead of `lstat(...).st_mode`. So, ignore mode errors for symlinks.
if not stat.S_ISLNK(s.st_mode) and s.st_mode != data["mode"]:
res.add_error(path, "mode")
elif stat.S_ISLNK(s.st_mode) and os.readlink(path) != data.get("dest"):
elif stat.S_ISLNK(s.st_mode) and readlink(path) != data.get("dest"):
res.add_error(path, "link")
elif stat.S_ISREG(s.st_mode):
# Check file contents against hash and listed as file

View File

@@ -64,6 +64,11 @@ default:
SPACK_TARGET_PLATFORM: "linux"
SPACK_TARGET_ARCH: "ppc64le"
.win64-msvc2019:
variables:
SPACK_TARGET_PLATFORM: "win64"
SPACK_TARGET_ARCH: "x86_64"
########################################
# Job templates
########################################
@@ -72,6 +77,8 @@ default:
PIPELINE_MIRROR_TEMPLATE: "single-src-protected-mirrors.yaml.in"
# TODO: We can remove this when we drop the "deprecated" stack
PUSH_BUILDCACHE_DEPRECATED: "${PROTECTED_MIRROR_PUSH_DOMAIN}/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}"
SPACK_CI_CONFIG_ROOT: "${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/configs"
SPACK_CI_SCRIPTS_ROOT: "${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/scripts"
rules:
- if: $SPACK_CI_DISABLE_STACKS =~ /.+/ && $SPACK_CI_STACK_NAME =~ $SPACK_CI_DISABLE_STACKS
@@ -114,16 +121,8 @@ default:
.generate-common:
stage: generate
script:
- uname -a || true
- grep -E 'vendor|model name' /proc/cpuinfo 2>/dev/null | sort -u || head -n10 /proc/cpuinfo 2>/dev/null || true
- nproc || true
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
- . "./share/spack/setup-env.sh"
- spack --version
- cd share/spack/gitlab/cloud_pipelines/stacks/${SPACK_CI_STACK_NAME}
- spack env activate --without-view .
- export SPACK_CI_CONFIG_ROOT="${SPACK_ROOT}/share/spack/gitlab/cloud_pipelines/configs"
- spack env activate --without-view share/spack/gitlab/cloud_pipelines/stacks/${SPACK_CI_STACK_NAME}
- spack
--config-scope "${SPACK_CI_CONFIG_ROOT}"
--config-scope "${SPACK_CI_CONFIG_ROOT}/${SPACK_TARGET_PLATFORM}"
@@ -134,29 +133,25 @@ default:
--config-scope "${SPACK_CI_CONFIG_ROOT}"
--config-scope "${SPACK_CI_CONFIG_ROOT}/${SPACK_TARGET_PLATFORM}"
--config-scope "${SPACK_CI_CONFIG_ROOT}/${SPACK_TARGET_PLATFORM}/${SPACK_TARGET_ARCH}"
${CI_STACK_CONFIG_SCOPES}
audit configs
- spack python -c "import os,sys; print(os.path.expandvars(sys.stdin.read()))"
< "${SPACK_CI_CONFIG_ROOT}/${PIPELINE_MIRROR_TEMPLATE}" > "${SPACK_CI_CONFIG_ROOT}/mirrors.yaml"
# Command below needs to be `spack python` due to naming differences across platforms
- spack python ${SPACK_CI_SCRIPTS_ROOT}/common/expand_vars.py
"${SPACK_CI_CONFIG_ROOT}/${PIPELINE_MIRROR_TEMPLATE}"
"${SPACK_CI_CONFIG_ROOT}/mirrors.yaml"
- spack config add -f "${SPACK_CI_CONFIG_ROOT}/mirrors.yaml"
- mkdir -p "${CI_PROJECT_DIR}/jobs_scratch_dir"
- mkdir "${CI_PROJECT_DIR}/jobs_scratch_dir"
- spack
--config-scope "${SPACK_CI_CONFIG_ROOT}"
--config-scope "${SPACK_CI_CONFIG_ROOT}/${SPACK_TARGET_PLATFORM}"
--config-scope "${SPACK_CI_CONFIG_ROOT}/${SPACK_TARGET_PLATFORM}/${SPACK_TARGET_ARCH}"
${CI_STACK_CONFIG_SCOPES}
config blame > "${CI_PROJECT_DIR}/jobs_scratch_dir/spack.yaml.blame"
- spack -v --color=always
--config-scope "${SPACK_CI_CONFIG_ROOT}"
--config-scope "${SPACK_CI_CONFIG_ROOT}/${SPACK_TARGET_PLATFORM}"
--config-scope "${SPACK_CI_CONFIG_ROOT}/${SPACK_TARGET_PLATFORM}/${SPACK_TARGET_ARCH}"
${CI_STACK_CONFIG_SCOPES}
ci generate --check-index-only
--artifacts-root "${CI_PROJECT_DIR}/jobs_scratch_dir"
--output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/cloud-ci-pipeline.yml"
after_script:
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
artifacts:
paths:
- "${CI_PROJECT_DIR}/jobs_scratch_dir"
@@ -179,6 +174,16 @@ default:
# Generate without tags for cases using external runners
.generate-base:
extends: [ ".base-job", ".generate-common" ]
before_script:
- uname -a || true
- grep -E 'vendor|model name' /proc/cpuinfo 2>/dev/null | sort -u || head -n10 /proc/cpuinfo 2>/dev/null || true
- nproc || true
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
- . "./share/spack/setup-env.sh"
after_script:
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
.generate-x86_64:
extends: [ ".generate-base" ]
@@ -196,6 +201,25 @@ default:
extends: [ ".generate-base" ]
tags: ["spack", "public", "medium", "neoverse_v2"]
.generate-win64:
extends: [ ".base-job", ".generate-common" ]
before_script:
- $ErrorActionOld=$ErrorActionPreference
- $ErrorActionPreference="SilentlyContinue"
- python -c"import psutil;print(psutil.getloadavg())"
- (Get-WmiObject Win32_PhysicalMemory | measure-object Capacity -sum).sum/1kb
- $ErrorActionPreference=$ErrorActionOld
- . .\share\spack\setup-env.ps1
after_script:
- $ErrorActionOld=$ErrorActionPreference
- $ErrorActionPreference="SilentlyContinue"
- python -c"import psutil;print(psutil.getloadavg())"
- (Get-WmiObject Win32_PhysicalMemory | measure-object Capacity -sum).sum/1kb
- $ErrorActionPreference=$ErrorActionOld
tags: ["spack", "public", "medium", "x86_64-win"]
image: "ghcr.io/johnwparent/windows-server21h2:sha-c749cf3"
.generate-deprecated:
extends: [ ".base-job" ]
stage: generate
@@ -718,7 +742,7 @@ tutorial-build:
ml-linux-x86_64-cpu-generate:
extends: [ ".generate-x86_64", .ml-linux-x86_64-cpu, ".tags-x86_64_v4" ]
image: ghcr.io/spack/linux-ubuntu22.04-x86_64_v2:v2024-01-29
image: ghcr.io/spack/ubuntu-22.04:v2024-05-07
ml-linux-x86_64-cpu-build:
extends: [ ".build", ".ml-linux-x86_64-cpu" ]
@@ -741,7 +765,7 @@ ml-linux-x86_64-cpu-build:
ml-linux-x86_64-cuda-generate:
extends: [ ".generate-x86_64", .ml-linux-x86_64-cuda, ".tags-x86_64_v4" ]
image: ghcr.io/spack/linux-ubuntu22.04-x86_64_v2:v2024-01-29
image: ghcr.io/spack/ubuntu-22.04:v2024-05-07
ml-linux-x86_64-cuda-build:
extends: [ ".build", ".ml-linux-x86_64-cuda" ]
@@ -859,6 +883,15 @@ aws-pcluster-build-neoverse_v1:
- echo $PATH
- module avail
- module list
- uname -a || true
- grep -E 'vendor|model name' /proc/cpuinfo 2>/dev/null | sort -u || head -n10 /proc/cpuinfo 2>/dev/null || true
- nproc || true
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
- . "./share/spack/setup-env.sh"
after_script:
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
.generate-cray-rhel:
tags: [ "cray-rhel-zen4", "public" ]
@@ -912,3 +945,25 @@ e4s-cray-sles-build:
needs:
- artifacts: True
job: e4s-cray-sles-generate
#######################################
# Windows Visualization Tools
#######################################
.windows-vis:
extends: [".win64-msvc2019"]
variables:
SPACK_CI_STACK_NAME: windows-vis
windows-vis-generate:
extends: [ ".generate-win64", ".windows-vis" ]
windows-vis-build:
extends: [ ".build", ".windows-vis"]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: windows-vis-generate
strategy: depend
needs:
- artifacts: True
job: windows-vis-generate

View File

@@ -0,0 +1,18 @@
ci:
pipeline-gen:
- build-job:
after_script::
- Write-Output "Done"
before_script::
- fsutil 8dot3name set C:\ 0
- . .\share\spack\setup-env.ps1
- If (Test-Path -path C:\\key\intermediate_ci_signing_key.gpg) { spack.ps1 gpg trust C:\\key\intermediate_ci_signing_key.gpg }
- If (Test-Path -path C:\\key\spack_public_key.gpg) { spack.ps1 gpg trust C:\\key\spack_public_key.gpg }
script::
- spack.ps1 env activate --without-view ${SPACK_CONCRETE_ENV_DIR}
- spack.ps1 config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{hash}'"
- mkdir ${SPACK_ARTIFACTS_ROOT}/user_data
- spack.ps1 --backtrace ci rebuild | Tee-Object -FilePath "${env:SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt" 2>&1 | Tee-Object -FilePath "${env:SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt"
image: "ghcr.io/johnwparent/windows-server21h2:sha-c749cf3"

View File

@@ -0,0 +1,10 @@
config:
build_stage::
- 'C:/spack stage'
install_tree:
root: "C:/spack install"
# Path lengths on Windows don't support much padding
padded_length: 0
# Reduce the projections to only include the hash to avoid path length issues
projections:
all: '{hash}'

View File

@@ -0,0 +1,25 @@
packages:
all:
target: [x86_64]
tbb:
require: "intel-tbb"
cmake:
externals:
- spec: cmake@3.28.0-msvc1
prefix: "C:\\Program Files\\Microsoft Visual Studio\\2022\\Community\\Common7\\IDE\\CommonExtensions\\Microsoft\\CMake\\CMake"
buildable: False
ninja:
externals:
- spec: ninja@1.11.0
prefix: "C:\\Program Files\\Microsoft Visual Studio\\2022\\Community\\Common7\\IDE\\CommonExtensions\\Microsoft\\CMake\\Ninja"
buildable: False
wgl:
externals:
- spec: wgl@10.0.22621 plat=x64
prefix: "C:\\Program Files (x86)\\Windows Kits\\10"
buildable: False
win-sdk:
externals:
- spec: win-sdk@10.0.22621 plat=x64
prefix: "C:\\Program Files (x86)\\Windows Kits\\10"
buildable: False

View File

@@ -0,0 +1,4 @@
ci:
pipeline-gen:
- build-job:
tags: [x86_64-win]

View File

@@ -0,0 +1,3 @@
packages:
all:
target: [x86_64]

View File

@@ -0,0 +1,10 @@
import argparse
import os
# Expand environment variables in the input file and write the result to the output file.
parser = argparse.ArgumentParser()
parser.add_argument("input", type=argparse.FileType("r"))
parser.add_argument("out", type=argparse.FileType("w"))
args = parser.parse_args()
args.out.write(os.path.expandvars(args.input.read()))
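A quick standalone illustration of the expansion the script performs (the variable name and value below are examples only):
import os
os.environ["SPACK_TARGET_PLATFORM"] = "linux"  # example value
assert os.path.expandvars("platform: ${SPACK_TARGET_PLATFORM}") == "platform: linux"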

View File

@@ -10,6 +10,7 @@ set -e
# The best solution would be to have the compilers' hash (or package contents) be part of the
# individual packages' hashes. I don't see this at the moment.
# Set to the latest tag including a recent oneapi compiler.
# NOTE: If we update this spack version in the future make sure the compiler version also updates.
spack_intel_compiler_commit="develop-2023-08-06"
set_pcluster_defaults() {
@@ -23,10 +24,9 @@ set_pcluster_defaults() {
setup_spack() {
spack compiler add --scope site
spack external find --scope site
# Remove all autotools/buildtools packages. These versions need to be managed by spack or it will
# Do not add autotools/buildtools packages. These versions need to be managed by spack or it will
# eventually end up in a version mismatch (e.g. when compiling gmp).
spack tags build-tools | xargs -I {} spack config --scope site rm packages:{}
spack external find --scope site --tag core-packages
}
patch_compilers_yaml() {
@@ -99,7 +99,7 @@ install_compilers() {
# The compilers need to be in the same install tree as the rest of the software such that the path
# relocation works correctly. This holds the danger that this part will fail when the current spack becomes
# incompatible with the one in $spack_intel_compiler_commit. Therefore, we make intel installations optional
# in package.yaml files.
# in package.yaml files and add a fallback `%gcc` version for each application.
if [ "x86_64" == "$(arch)" ]; then
(
CURRENT_SPACK_ROOT=${SPACK_ROOT}

View File

@@ -19,7 +19,10 @@ packages:
llvm:
variants: ~lldb
mpas-model:
require: "precision=single make_target=llvm %arm ^parallelio+pnetcdf"
require:
- one_of:
- "precision=single make_target=llvm %arm ^parallelio+pnetcdf"
- "precision=single %gcc ^parallelio+pnetcdf"
mpich:
require: "mpich pmi=pmi2 device=ch4 netmod=ofi +slurm"
nvhpc:

View File

@@ -2,13 +2,23 @@ spack:
view: false
definitions:
- optimized_configs:
- gromacs target=neoverse_v1
- gromacs target=neoverse_n1
- apps:
- gromacs
- mpas-model
- mpich
- openfoam
# - quantum-espresso : %gcc@12.3.0 on neoverse_v1 fails.
# Root cause: internal compiler error: in compute_live_loop_exits, at tree-ssa-loop-manip.cc:247
- wrf
- targets:
- 'target=neoverse_v1'
- 'target=neoverse_n1'
specs:
- $optimized_configs
- matrix:
- [$apps]
- [$targets]
ci:
pipeline-gen:
- build-job:

View File

@@ -5,13 +5,20 @@ packages:
- one_of:
- "cflags=-std=c18 target=x86_64_v4"
- "cflags=-std=c18 target=x86_64_v3"
- "%gcc"
when: "%intel"
gettext:
# Newer gettext cannot build with gcc@12 and old AL2 glibc headers
# Older gettext versions do not build correctly with oneapi.
require:
- one_of:
- '@:0.20'
- '%oneapi'
gromacs:
require:
- one_of:
- "+intel_provided_gcc %intel ^intel-oneapi-mkl target=x86_64_v4"
- "+intel_provided_gcc %intel ^intel-oneapi-mkl target=x86_64_v3"
- "%gcc"
- "+intel_provided_gcc ^intel-oneapi-mkl target=x86_64_v4"
- "+intel_provided_gcc ^intel-oneapi-mkl target=x86_64_v3"
when: "%intel"
intel-mpi:
variants: +external-libfabric
intel-oneapi-compilers:
@@ -21,15 +28,15 @@ packages:
lammps:
require:
- one_of:
- "lammps_sizes=bigbig +molecule +kspace +rigid +asphere +opt +openmp +openmp-package +intel %intel ^intel-oneapi-mkl target=x86_64_v4"
- "lammps_sizes=bigbig +molecule +kspace +rigid +asphere +opt +openmp +openmp-package %intel ^intel-oneapi-mkl target=x86_64_v3"
- "%gcc"
- "lammps_sizes=bigbig +molecule +kspace +rigid +asphere +opt +openmp +openmp-package +intel ^intel-oneapi-mkl target=x86_64_v4"
- "lammps_sizes=bigbig +molecule +kspace +rigid +asphere +opt +openmp +openmp-package ^intel-oneapi-mkl target=x86_64_v3"
when: "%intel"
libidn2:
require:
- one_of:
- "cflags=-std=c18 target=x86_64_v4"
- "cflags=-std=c18 target=x86_64_v3"
- '%gcc'
when: "%intel"
libfabric:
buildable: true
externals:
@@ -41,13 +48,13 @@ packages:
- one_of:
- "cflags=-std=c18 target=x86_64_v4"
- "cflags=-std=c18 target=x86_64_v3"
- "%gcc"
when: "%intel"
mpas-model:
require:
- one_of:
- "precision=single %intel ^parallelio+pnetcdf target=x86_64_v4"
- "precision=single %intel ^parallelio+pnetcdf target=x86_64_v3"
- "%gcc"
- "precision=single ^parallelio+pnetcdf target=x86_64_v4"
- "precision=single ^parallelio+pnetcdf target=x86_64_v3"
when: "%intel"
mpich:
require:
- one_of:
@@ -67,9 +74,12 @@ packages:
palace:
require:
- one_of:
- "palace %oneapi ^fmt@9.1.0 target=x86_64_v4"
- "palace %oneapi ^fmt@9.1.0 target=x86_64_v3"
- "%gcc ^fmt@9.1.0"
- "palace ^fmt@9.1.0 target=x86_64_v4"
- "palace ^fmt@9.1.0 target=x86_64_v3"
when: "%oneapi"
- one_of:
- "palace ^fmt@9.1.0"
when: "%gcc"
pmix:
require:
- one_of:
@@ -78,9 +88,9 @@ packages:
quantum-espresso:
require:
- one_of:
- "quantum-espresso@6.6 %intel ^intel-oneapi-mkl+cluster target=x86_64_v4"
- "quantum-espresso@6.6 %intel ^intel-oneapi-mkl+cluster target=x86_64_v3"
- "%gcc"
- "quantum-espresso@6.6 ^intel-oneapi-mkl+cluster target=x86_64_v4"
- "quantum-espresso@6.6 ^intel-oneapi-mkl+cluster target=x86_64_v3"
when: "%intel"
slurm:
buildable: false
externals:
@@ -89,12 +99,13 @@ packages:
wrf:
require:
- one_of:
- "wrf@4 build_type=dm+sm %intel target=x86_64_v4"
- "wrf@4 build_type=dm+sm %intel target=x86_64_v3"
- "wrf@4.2.2 +netcdf_classic fflags=\"-fp-model fast=2 -no-heap-arrays -no-prec-div -no-prec-sqrt -fno-common\" build_type=dm+sm %intel target=x86_64_v3"
- "%gcc"
- "wrf@4 build_type=dm+sm target=x86_64_v4"
- "wrf@4 build_type=dm+sm target=x86_64_v3"
- "wrf@4.2.2 +netcdf_classic fflags=\"-fp-model fast=2 -no-heap-arrays -no-prec-div -no-prec-sqrt -fno-common\" build_type=dm+sm target=x86_64_v3"
when: "%intel"
all:
compiler: [intel, gcc]
compiler: [intel, oneapi, gcc]
permissions:
read: world
write: user

View File

@@ -2,14 +2,24 @@ spack:
view: false
definitions:
- apps:
- gromacs %intel
- lammps %intel
- mpas-model %intel
- openfoam %gcc
- palace %oneapi
- quantum-espresso %intel
# - wrf : While building hdf5 cmake errors out with Detecting Fortran/C Interface: Failed to compile
# Root cause: ifort cannot deal with arbitrarily long file names.
- optimized_configs:
- palace target=x86_64_v4
- palace target=x86_64_v3
- targets:
- 'target=x86_64_v4'
- 'target=x86_64_v3'
specs:
- $optimized_configs
- matrix:
- [$apps]
- [$targets]
ci:
pipeline-gen:
- build-job:
@@ -28,5 +38,6 @@ spack:
# Do not distribute Intel & ARM binaries
- - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/ | grep intel-oneapi | awk '{print $4}' | sed -e 's?^.*build_cache/??g'); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/$i; done
- for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/ | grep armpl | awk '{print $4}' | sed -e 's?^.*build_cache/??g'); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/$i; done
cdash:
build-group: AWS Packages

View File

@@ -79,7 +79,7 @@ spack:
pipeline-gen:
- build-job:
image:
name: ghcr.io/spack/linux-ubuntu22.04-x86_64_v2:v2024-01-29
name: ghcr.io/spack/ubuntu-22.04:v2024-05-07
entrypoint: ['']
cdash:

View File

@@ -83,7 +83,7 @@ spack:
pipeline-gen:
- build-job:
image:
name: ghcr.io/spack/linux-ubuntu22.04-x86_64_v2:v2024-01-29
name: ghcr.io/spack/ubuntu-22.04:v2024-05-07
entrypoint: ['']
cdash:

View File

@@ -0,0 +1,12 @@
# Windows Visualization Stack
# maintainers:
# - John Parent (@johnwparent)
# - Ryan Krattiger (@kwryankrattiger)
spack:
view: false
specs:
- vtk
cdash:
build-group: Windows Visualization (Kitware)

View File

@@ -21,6 +21,12 @@ class AmrWind(CMakePackage, CudaPackage, ROCmPackage):
license("BSD-3-Clause")
version("main", branch="main", submodules=True)
version(
"2.1.0", tag="v2.1.0", commit="bc787f21deca9239928182e27400133934c62658", submodules=True
)
version(
"2.0.0", tag="v2.0.0", commit="ea448365033fc6bc9ee0febeb369b377f4fd8240", submodules=True
)
version(
"1.4.0", tag="v1.4.0", commit="bdddf133e41a9b7b4c8ce28f1ea1bebec47678f5", submodules=True
)
@@ -91,7 +97,7 @@ class AmrWind(CMakePackage, CudaPackage, ROCmPackage):
depends_on("openfast@3.5:", when="@2: +openfast")
depends_on("helics@:3.3.2", when="+helics")
depends_on("helics@:3.3.2+mpi", when="+helics+mpi")
depends_on("fftw", when="@2: +waves2amr")
depends_on("fftw", when="@2.1: +waves2amr")
for arch in CudaPackage.cuda_arch_values:
depends_on("hypre+cuda cuda_arch=%s" % arch, when="+cuda+hypre cuda_arch=%s" % arch)
@@ -104,7 +110,7 @@ class AmrWind(CMakePackage, CudaPackage, ROCmPackage):
conflicts("+openmp", when="+cuda")
conflicts("+shared", when="+cuda")
conflicts("@:1.4.0", when="+waves2amr")
conflicts("@:2.0", when="+waves2amr")
def setup_build_environment(self, env):
# Avoid compile errors with Intel interprocedural optimization

View File

@@ -14,14 +14,21 @@ class Axel(AutotoolsPackage):
license("GPL-2.0-or-later WITH OpenSSL-Exception")
version("2.17.14", sha256="73f3aeafcb00b8101b212fcf47969a4962e7a1b50843306178b527a9942d8785")
version("2.17.13", sha256="aedd5e0f22d6eda23eece483ce89be4adfdf1e16ba18d54fd6b743da9d49911b")
version("2.17.10", sha256="c0d26eba6b94945cd98c5b69ca6df2744639d17bfd49047ef51a8a48f067de10")
version("2.16.1", sha256="763066efc61e4f7be2eb59afa049bdbc520837e01c95a78f403e542ad82f2719")
depends_on("pkgconfig", type="build")
# For systems not providing the libintl API in the system libc (glibc integrated it)
depends_on("gettext")
depends_on("openssl")
def installcheck(self):
Executable(self.prefix.bin.axel)("--version")
# check if we can run axel
@run_after("install")
@on_package_attributes(run_tests=True)
def check_version(self):
with working_dir(self.stage.source_path):
axel = Executable(self.prefix.bin.axel)
axel("--version")

View File

@@ -44,6 +44,17 @@ class Bash(AutotoolsPackage, GNUMirrorPackage):
("5.2", "013", "094b4fd81bc488a26febba5d799689b64d52a5505b63e8ee854f48d356bc7ce6"),
("5.2", "014", "3ef9246f2906ef1e487a0a3f4c647ae1c289cbd8459caa7db5ce118ef136e624"),
("5.2", "015", "ef73905169db67399a728e238a9413e0d689462cb9b72ab17a05dba51221358a"),
("5.2", "016", "155853bc5bd10e40a9bea369fb6f50a203a7d0358e9e32321be0d9fa21585915"),
("5.2", "017", "1c48cecbc9b7b4217990580203b7e1de19c4979d0bd2c0e310167df748df2c89"),
("5.2", "018", "4641dd49dd923b454dd0a346277907090410f5d60a29a2de3b82c98e49aaaa80"),
("5.2", "019", "325c26860ad4bba8558356c4ab914ac57e7b415dac6f5aae86b9b05ccb7ed282"),
("5.2", "020", "b6fc252aeb95ce67c9b017d29d81e8a5e285db4bf20d4ec8cdca35892be5c01d"),
("5.2", "021", "8334b88117ad047598f23581aeb0c66c0248cdd77abc3b4e259133aa307650cd"),
("5.2", "022", "78b5230a49594ec30811e72dcd0f56d1089710ec7828621022d08507aa57e470"),
("5.2", "023", "af905502e2106c8510ba2085aa2b56e64830fc0fdf6ee67ebb459ac11696dcd3"),
("5.2", "024", "971534490117eb05d97d7fd81f5f9d8daf927b4d581231844ffae485651b02c3"),
("5.2", "025", "5138f487e7cf71a6323dc81d22419906f1535b89835cc2ff68847e1a35613075"),
("5.2", "026", "96ee1f549aa0b530521e36bdc0ba7661602cfaee409f7023cac744dd42852eac"),
("5.1", "001", "ebb07b3dbadd98598f078125d0ae0d699295978a5cdaef6282fe19adef45b5fa"),
("5.1", "002", "15ea6121a801e48e658ceee712ea9b88d4ded022046a6147550790caf04f5dbe"),
("5.1", "003", "22f2cc262f056b22966281babf4b0a2f84cb7dd2223422e5dcd013c3dcbab6b1"),

View File

@@ -27,7 +27,7 @@ class Biobambam2(AutotoolsPackage):
test_src_dir = "test"
def configure_args(self):
args = ["--with-libmaus2={0}".format(self.spec["libmaus2"].prefix)]
args = [f"--with-libmaus2={self.spec['libmaus2'].prefix}"]
return args
@run_after("install")

View File

@@ -124,6 +124,7 @@ class Blis(BlisBase):
license("BSD-3-Clause")
version("master", branch="master")
version("1.0", sha256="9c12972aa1e50f64ca61684eba6828f2f3dd509384b1e41a1e8a9aedea4b16a6")
version("0.9.0", sha256="1135f664be7355427b91025075562805cdc6cc730d3173f83533b2c5dcc2f308")
version("0.8.1", sha256="729694128719801e82fae7b5f2489ab73e4a467f46271beff09588c9265a697b")
version("0.8.0", sha256="5e05868c4a6cf5032a7492f8861653e939a8f907a4fa524bbb6e14394e170a3d")

View File

@@ -16,6 +16,7 @@ class Cdt(CMakePackage):
license("MPL-2.0-no-copyleft-exception")
version("1.4.1", sha256="86df99eb5f02a73eeb8c6ea45765eed0d7f206e8d4d9f6479f77e3c590ae5bb3")
version("1.4.0", sha256="cb5a95a39b417f5a4d170c7ebe97232d0ed36ea64069339b14964dd52dea95ab")
version("1.3.6", sha256="15881e4c451f3b7cceade9b11884b3813ff674dff3edae4fb7c440634f8d4c33")
version("1.3.0", sha256="7e8feadf9534cf79f9bf188365510fd6bc68ea997758e1c68d1569f98da924da")

View File

@@ -15,4 +15,5 @@ class Cjson(CMakePackage):
license("MIT")
version("1.7.17", sha256="51f3b07aece8d1786e74b951fd92556506586cb36670741b6bfb79bf5d484216")
version("1.7.15", sha256="c55519316d940757ef93a779f1db1ca809dbf979c551861f339d35aaea1c907c")

View File

@@ -14,8 +14,9 @@ class Cli11(CMakePackage):
url = "https://github.com/CLIUtils/CLI11/archive/v1.9.1.tar.gz"
maintainers("nightlark")
license("BitTorrent-1.0")
license("BSD-3-Clause")
version("2.4.1", sha256="73b7ec52261ce8fe980a29df6b4ceb66243bb0b779451dbd3d014cfec9fdbb58")
version("2.3.2", sha256="aac0ab42108131ac5d3344a9db0fdf25c4db652296641955720a4fbe52334e22")
version("2.3.1", sha256="378da73d2d1d9a7b82ad6ed2b5bda3e7bc7093c4034a1d680a2e009eb067e7b2")
version("2.1.1", sha256="d69023d1d0ab6a22be86b4f59d449422bc5efd9121868f4e284d6042e52f682e")
@@ -26,5 +27,10 @@ class Cli11(CMakePackage):
depends_on("cmake@3.4:", type="build")
def cmake_args(self):
args = ["-DCLI11_BUILD_EXAMPLES=OFF", "-DCLI11_BUILD_DOCS=OFF", "-DCLI11_BUILD_TESTS=OFF"]
args = [
self.define("CLI11_BUILD_EXAMPLES", False),
self.define("CLI11_BUILD_DOCS", False),
self.define("CLI11_BUILD_TESTS", False),
self.define("CLI11_PRECOMPILED", True),
]
return args

View File

@@ -13,16 +13,17 @@ class Clp(AutotoolsPackage):
homepage = "https://projects.coin-or.org/Clp"
url = "https://github.com/coin-or/Clp/archive/releases/1.17.6.tar.gz"
depends_on("pkgconfig", type="build")
depends_on("coinutils")
depends_on("osi")
depends_on("pkgconfig", type="build")
license("EPL-2.0")
version("1.17.9", sha256="b02109be54e2c9c6babc9480c242b2c3c7499368cfca8c0430f74782a694a49f")
version("1.17.7", sha256="c4c2c0e014220ce8b6294f3be0f3a595a37bef58a14bf9bac406016e9e73b0f5")
version("1.17.6", sha256="afff465b1620cfcbb7b7c17b5d331d412039650ff471c4160c7eb24ae01284c9")
version("1.17.4", sha256="ef412cde00cb1313d9041115a700d8d59d4b8b8b5e4dde43e9deb5108fcfbea8")
version("1.16.11", sha256="b525451423a9a09a043e6a13d9436e13e3ee7a7049f558ad41a110742fa65f39")
depends_on("pkgconfig", type="build")
depends_on("coinutils")
depends_on("osi")
depends_on("pkgconfig", type="build")
build_directory = "spack-build"

View File

@@ -17,6 +17,7 @@ class Cmor(AutotoolsPackage):
license("BSD-3-Clause")
version("3.8.0", sha256="5f5a44e660104916dd0a3d0d942234db375d2a4ffb4f4113ec88cfdd93f99ef4")
version("3.7.2", sha256="5e19a9be8e6a8bd18a2035772732c34b87b3448319bf0b8fa12ccd4a351b8e86")
version("3.6.1", sha256="991035a41424f72ea6f0f85653fc13730eb035e63c7dff6ca740aa7a70976fb4")
version("3.6.0", sha256="1608904a35106e83d365f27522209c325bd4bfc19d022b1a8abfb12cdf85fe20")
@@ -46,22 +47,23 @@ class Cmor(AutotoolsPackage):
@run_before("configure")
def validate(self):
if "+fortran" in self.spec and not self.compiler.fc:
if self.spec.satisfies("+fortran") and not self.compiler.fc:
msg = "cannot build a fortran variant without a fortran compiler"
raise RuntimeError(msg)
def configure_args(self):
extra_args = ["--disable-debug"]
spec = self.spec
args = ["--disable-debug"]
if "+fortran" in self.spec:
extra_args.append("--enable-fortran")
if spec.satisfies("+fortran"):
args.append("--enable-fortran")
else:
extra_args.append("--disable-fortran")
args.append("--disable-fortran")
if "+python" in self.spec:
extra_args.append("--with-python={0}".format(self.spec["python"].prefix))
if spec.satisfies("+python"):
args.append(f"--with-python={self.spec['python'].prefix}")
return extra_args
return args
def check(self):
"""tests need downloaded files, testcases have manual instructions for that."""
@@ -70,6 +72,6 @@ def check(self):
def install(self, spec, prefix):
make("install")
if "+python" in spec:
if spec.satisfies("+python"):
args = std_pip_args + ["--prefix=" + prefix, "."]
pip(*args)

View File

@@ -8,7 +8,7 @@
from spack.package import *
class Coinhsl(AutotoolsPackage):
class Coinhsl(MesonPackage, AutotoolsPackage):
"""CoinHSL is a collection of linear algebra libraries (KB22, MA27,
MA28, MA54, MA57, MA64, MA77, MA86, MA97, MC19, MC34, MC64, MC68,
MC69, MC78, MC80, OF01, ZB01, ZB11) bundled for use with IPOPT and
@@ -21,60 +21,85 @@ class Coinhsl(AutotoolsPackage):
that Spack can find it. For instructions on how to set up a
mirror, see https://spack.readthedocs.io/en/latest/mirrors.html"""
# NOTE(oxberry1@llnl.gov): an HTTPS version of the URL below does not
# exist
build_system(
conditional("autotools", when="@b:2019.05.21"),
conditional("meson", when="@2023:,:b"),
default="meson",
)
homepage = "https://www.hsl.rl.ac.uk/ipopt/"
url = "file://{0}/coinhsl-archive-2014.01.17.tar.gz".format(os.getcwd())
url = f"file://{os.getcwd()}/coinhsl-2023.11.17.tar.gz"
manual_download = True
# CoinHSL has a few versions that vary with respect to stability/features
# and licensing terms.
maintainers("AndrewLister-STFC")
# Version 2019.05.21 is a full-featured "release candidate"
# version available via an "academic license" that can be used for
# personal teaching and research purposes only. For a full list of
# conditions, see https://www.hsl.rl.ac.uk/academic.html.
# Meson builds
version(
"2024.05.15",
sha256="2534807b4f6a4a69661c82dc0da7094f685f0fce6443a9147ee90a21caba9e63",
preferred=True,
)
version(
"archive-2024.05.15",
sha256="1d907ce5d84331ce8f78125d5fc766184f0fce9a7b340db7f3c4821a7f4b7c4c",
)
with when("build_system=meson @2023:"):
depends_on("blas")
depends_on("lapack")
variant("metis", default=True, description="Build with Metis support.")
depends_on("metis", when="+metis")
def meson_args(self):
spec = self.spec
args = []
if spec.satisfies("@:b"):
return []
# Strip the leading "-L" from each search flag to pass plain library directories to meson
blas = spec["blas"].libs.names[0]
blas_paths = [sf[2:] for sf in spec["blas"].libs.search_flags.split()]
lapack = spec["lapack"].libs.names[0]
lapack_paths = [sf[2:] for sf in spec["lapack"].libs.search_flags.split()]
args.append(f"-Dlibblas={blas}")
args.extend([f"-Dlibblas_path={p}" for p in blas_paths])
args.append(f"-Dliblapack={lapack}")
args.extend([f"-Dliblapack_path={p}" for p in lapack_paths])
if spec.satisfies("+metis"):
metis = spec["metis"]
if metis.satisfies("@5"):
args.append("-Dlibmetis_version=5")
else:
args.append("-Dlibmetis_version=4")
args.extend(
[
f"-Dlibmetis_include={metis.prefix.include}",
f"-Dlibmetis_path={metis.prefix.lib}",
]
)
return args
# Autotools builds
version(
"2019.05.21", sha256="95ce1160f0b013151a3e25d40337775c760a8f3a79d801a1d190598bf4e4c0c3"
)
# Version 2015.06.23 is a full-featured "stable"
# version available via an "academic license" that can be used for
# personal teaching and research purposes only. For a full list of
# conditions, see https://www.hsl.rl.ac.uk/academic.html.
version(
"2015.06.23",
sha256="3e955a2072f669b8f357ae746531b37aea921552e415dc219a5dd13577575fb3",
preferred=True,
"2015.06.23", sha256="3e955a2072f669b8f357ae746531b37aea921552e415dc219a5dd13577575fb3"
)
# Version 2014.01.17 is a full-featured "stable" version available
# via an "academic license" that can be used for personal teaching
# and research purposes only.
version(
"2014.01.17", sha256="ed49fea62692c5d2f928d4007988930da9ff9a2e944e4c559d028671d122437b"
)
# Version 2014.01.10 only has MA27, MA28, and MC19, and is
# available as a "personal license" that is free to all, and
# permits commercial use, but *not redistribution* (emphasis from
# original source).
version(
"2014.01.10", sha256="7c2be60a3913b406904c66ee83acdbd0709f229b652c4e39ee5d0876f6b2e907"
)
# CoinHSL fails to build in parallel
parallel = False
variant("blas", default=False, description="Link to external BLAS library")
depends_on("blas", when="+blas")
with when("build_system=autotools"):
parallel = False
variant("blas", default=False, description="Link to external BLAS library")
depends_on("blas", when="+blas")
def configure_args(self):
spec = self.spec
args = []
if spec.satisfies("+blas"):
args.append("--with-blas={0}".format(spec["blas"].libs.ld_flags))
args.append(f"--with-blas={spec['blas'].libs.ld_flags}")
return args

View File

@@ -14,9 +14,15 @@ class Cppad(CMakePackage):
git = "https://github.com/coin-or/CppAD.git"
version("develop", branch="master")
version(
"20180000.0", sha256="1c355713e720fc5226ff3d6db2909922d46cd7ee0d36ee7985882f86905f655a"
)
version("20170114", sha256="fa3980a882be2a668a7522146273a1b4f1d8dabe66ad4aafa8964c8c1fd6f957")
def cmake_args(self):
# This package does not obey CMAKE_INSTALL_PREFIX
args = ["-Dcppad_prefix=%s" % (self.prefix), "-Dcmake_install_docdir=share/cppad/doc"]
args = [
self.define("cppad_prefix", self.prefix),
self.define("cmake_install_docdir", "share/cppad/doc"),
]
return args

View File

@@ -18,6 +18,7 @@ class Cryptopp(MakefilePackage):
license("BSL-1.0")
version("8.9.0", sha256="4cc0ccc324625b80b695fcd3dee63a66f1a460d3e51b71640cdbfc4cd1a3779c")
version("8.7.0", sha256="d0d3a28fcb5a1f6ed66b3adf57ecfaed234a7e194e42be465c2ba70c744538dd")
version("7.0.0", sha256="a4bc939910edd3d29fb819a6fc0dfdc293f686fa62326f61c56d72d0a366ceb0")
version("6.1.0", sha256="21289d2511101a9350c87c8eb1f4982d4a266e8037b19dab79a32cc13ea108c7")
@@ -39,7 +40,7 @@ def url_for_version(self, version):
def build(self, spec, prefix):
cxx_flags = []
if "+shared" in spec:
if spec.satisfies("+shared"):
cxx_flags.append(self.compiler.cxx_pic_flag)
target = self.spec.target
@@ -51,7 +52,7 @@ def build(self, spec, prefix):
cxx_flags.append("-DCRYPTOPP_DISABLE_SSE2")
make_target = "dynamic" if "+shared" in spec else "static"
make(make_target, "CXXFLAGS={0}".format(" ".join(cxx_flags)))
make(make_target, f"CXXFLAGS={' '.join(cxx_flags)}")
def install(self, spec, prefix):
make("install", "PREFIX={0}".format(prefix))
make("install", f"PREFIX={prefix}")

View File

@@ -13,11 +13,12 @@ class Dlib(CMakePackage):
url = "https://github.com/davisking/dlib/archive/v19.19.tar.gz"
git = "https://github.com/davisking/dlib"
maintainer = ["robertu94"]
maintainers("robertu94")
license("BSL-1.0")
version("master", branch="master")
version("19.24.4", sha256="d881911d68972d11563bb9db692b8fcea0ac1b3fd2e3f03fa0b94fde6c739e43")
version("19.22", sha256="5f44b67f762691b92f3e41dcf9c95dd0f4525b59cacb478094e511fdacb5c096")
version("19.21", sha256="116f52e58be04b47dab52057eaad4b5c4d5c3032d927fe23d55b0741fc4107a0")
version("19.20", sha256="fc3f0986350e8e53aceadf95a71d2f413f1eedc469abda99a462cb528741d411")
@@ -34,7 +35,9 @@ class Dlib(CMakePackage):
depends_on("libx11")
def cmake_args(self):
spec = self.spec
args = []
if "+shared" in self.spec:
args.append("-DBUILD_SHARED_LIBS=ON")
if spec.satisfies("+shared"):
args.append(self.define("BUILD_SHARED_LIBS", "ON"))
return args

View File

@@ -52,7 +52,8 @@ class Ecflow(CMakePackage):
# See https://github.com/spack/spack/pull/22303 for reference
depends_on(Boost.with_default_variants, when="@:4")
# Use newer boost with v5
# Use newer boost with v5 up to 1.84.0 - https://github.com/spack/spack/issues/44116
conflicts("boost@1.85:", when="@:5.11.4")
depends_on(
"boost@1.72:+chrono+date_time+exception+filesystem+program_options+python+regex+serialization+system+test+thread+timer", # noqa
when="@5:",

View File

@@ -16,6 +16,7 @@ class Entt(CMakePackage):
license("MIT")
version("3.13.2", sha256="cb556aa543d01177b62de41321759e02d96078948dda72705b3d7fe68af88489")
version("3.13.1", sha256="a4f290b601a70333126abd2cec7b0c232c74a4f85dcf1e04d969e8122dae8652")
version("3.11.1", sha256="0ac010f232d3089200c5e545bcbd6480cf68b705de6930d8ff7cdb0a29f5b47b")
version("3.5.2", sha256="f9271293c44518386c402c9a2188627819748f66302df48af4f6d08e30661036")
@@ -30,4 +31,4 @@ class Entt(CMakePackage):
conflicts("%gcc@:7.1", msg=compiler_warning)
def cmake_args(self):
return ["-DBUILD_DOCS=ON"]
return [self.define("BUILD_DOCS", "ON")]

View File

@@ -29,6 +29,7 @@ class Esmf(MakefilePackage):
# Develop is a special name for spack and is always considered the newest version
version("develop", branch="develop")
# generate chksum with 'spack checksum esmf@x.y.z'
version("8.6.1", sha256="dc270dcba1c0b317f5c9c6a32ab334cb79468dda283d1e395d98ed2a22866364")
version("8.6.0", sha256="ed057eaddb158a3cce2afc0712b49353b7038b45b29aee86180f381457c0ebe7")
version("8.5.0", sha256="acd0b2641587007cc3ca318427f47b9cae5bfd2da8d2a16ea778f637107c29c4")
version("8.4.2", sha256="969304efa518c7859567fa6e65efd960df2b4f6d72dbf2c3f29e39e4ab5ae594")

View File

@@ -16,59 +16,84 @@ class Exawind(CMakePackage, CudaPackage, ROCmPackage):
tags = ["ecp", "ecp-apps"]
# Testing is currently always enabled, but should be optional in the future
# to avoid cloning the mesh submodule
version("master", branch="main", submodules=True)
version("1.0.0", tag="v1.0.0", submodules=True)
license("Apache-2.0")
variant("openfast", default=False, description="Enable OpenFAST integration")
variant("hypre", default=True, description="Enable hypre solver")
variant("stk_simd", default=False, description="Enable SIMD in STK")
variant("umpire", default=False, description="Enable Umpire")
variant("tiny_profile", default=False, description="Turn on AMR-wind with tiny profile")
version("master", branch="main", submodules=True, preferred=True)
version("1.0.0", tag="v1.0.0", submodules=True)
variant("amr_wind_gpu", default=False, description="Enable AMR-Wind on the GPU")
variant("nalu_wind_gpu", default=False, description="Enable Nalu-Wind on the GPU")
variant("sycl", default=False, description="Enable SYCL backend for AMR-Wind")
variant("gpu-aware-mpi", default=False, description="gpu-aware-mpi")
conflicts("amr-wind+hypre", when="+sycl")
for arch in CudaPackage.cuda_arch_values:
depends_on("amr-wind+cuda cuda_arch=%s" % arch, when="+cuda cuda_arch=%s" % arch)
depends_on("nalu-wind+cuda cuda_arch=%s" % arch, when="+cuda cuda_arch=%s" % arch)
depends_on(
"amr-wind+cuda cuda_arch=%s" % arch, when="+amr_wind_gpu+cuda cuda_arch=%s" % arch
)
depends_on(
"nalu-wind+cuda cuda_arch=%s" % arch, when="+nalu_wind_gpu+cuda cuda_arch=%s" % arch
)
depends_on(
"trilinos+cuda cuda_arch=%s" % arch, when="+nalu_wind_gpu+cuda cuda_arch=%s" % arch
)
for arch in ROCmPackage.amdgpu_targets:
depends_on("amr-wind+rocm amdgpu_target=%s" % arch, when="+rocm amdgpu_target=%s" % arch)
depends_on("nalu-wind+rocm amdgpu_target=%s" % arch, when="+rocm amdgpu_target=%s" % arch)
depends_on(
"amr-wind+rocm amdgpu_target=%s" % arch,
when="+amr_wind_gpu+rocm amdgpu_target=%s" % arch,
)
depends_on(
"nalu-wind+rocm amdgpu_target=%s" % arch,
when="+nalu_wind_gpu+rocm amdgpu_target=%s" % arch,
)
depends_on(
"trilinos+rocm amdgpu_target=%s" % arch,
when="+nalu_wind_gpu+rocm amdgpu_target=%s" % arch,
)
depends_on("nalu-wind+tioga")
depends_on("amr-wind+netcdf+mpi")
depends_on("tioga~nodegid")
depends_on("nalu-wind+hypre+fsi+openfast+tioga")
depends_on("amr-wind+netcdf+mpi+tiny_profile")
depends_on("trilinos")
depends_on("yaml-cpp@0.6:")
depends_on("nalu-wind+openfast", when="+openfast")
depends_on("amr-wind+hypre", when="+hypre~sycl")
depends_on("amr-wind~hypre", when="~hypre")
depends_on("nalu-wind+hypre", when="+hypre")
depends_on("nalu-wind~hypre", when="~hypre")
depends_on("amr-wind+sycl", when="+sycl")
depends_on("nalu-wind+umpire", when="+umpire")
depends_on("amr-wind+umpire", when="+umpire")
depends_on("amr-wind+tiny_profile", when="+tiny_profile")
depends_on("tioga~nodegid")
depends_on("openfast+cxx@2.6.0:")
depends_on("amr-wind+sycl", when="+amr_wind_gpu+sycl")
depends_on("kokkos-nvcc-wrapper", type="build", when="+cuda")
depends_on("mpi")
depends_on("nalu-wind+gpu-aware-mpi", when="+gpu-aware-mpi")
depends_on("amr-wind+gpu-aware-mpi", when="+gpu-aware-mpi")
depends_on("nalu-wind@2.0.0:", when="@1.0.0:")
depends_on("amr-wind@0.9.0:", when="@1.0.0:")
depends_on("tioga@1.0.0:", when="@1.0.0:")
with when("~amr_wind_gpu~nalu_wind_gpu"):
conflicts("+cuda")
conflicts("+rocm")
conflicts("+sycl")
with when("~nalu_wind_gpu"):
conflicts("^nalu-wind+cuda")
conflicts("^nalu-wind+rocm")
with when("~amr_wind_gpu"):
conflicts("^amr-wind+cuda")
conflicts("^amr-wind+rocm")
conflicts("^amr-wind+sycl")
conflicts("+amr_wind_gpu", when="~cuda~rocm~sycl")
conflicts("+nalu_wind_gpu", when="~cuda~rocm")
conflicts("+nalu_wind_gpu", when="+sycl")
conflicts("^amr-wind+hypre", when="~amr_wind_gpu+nalu_wind_gpu")
conflicts("^amr-wind+hypre", when="+amr_wind_gpu~nalu_wind_gpu")
conflicts("+sycl", when="+cuda")
conflicts("+rocm", when="+cuda")
conflicts("+sycl", when="+rocm")
def cmake_args(self):
spec = self.spec
args = [self.define("MPI_HOME", spec["mpi"].prefix)]
if "+umpire" in self.spec:
args.append(self.define_from_variant("EXAWIND_ENABLE_UMPIRE", "umpire"))
args.append(self.define("UMPIRE_DIR", self.spec["umpire"].prefix))
if spec.satisfies("+cuda"):
args.append(self.define("CMAKE_CXX_COMPILER", spec["mpi"].mpicxx))
args.append(self.define("CMAKE_C_COMPILER", spec["mpi"].mpicc))
args.append(self.define("EXAWIND_ENABLE_CUDA", True))
args.append(self.define("CUDAToolkit_ROOT", self.spec["cuda"].prefix))
args.append(self.define("EXAWIND_CUDA_ARCH", self.spec.variants["cuda_arch"].value))
@@ -77,6 +102,7 @@ def cmake_args(self):
targets = self.spec.variants["amdgpu_target"].value
args.append(self.define("EXAWIND_ENABLE_ROCM", True))
args.append(self.define("CMAKE_CXX_COMPILER", self.spec["hip"].hipcc))
# Optimization to only build one specific target architecture:
args.append(self.define("CMAKE_HIP_ARCHITECTURES", ";".join(str(x) for x in targets)))
args.append(self.define("AMDGPU_TARGETS", ";".join(str(x) for x in targets)))
args.append(self.define("GPU_TARGETS", ";".join(str(x) for x in targets)))
@@ -84,14 +110,17 @@ def cmake_args(self):
if spec.satisfies("^amr-wind+hdf5"):
args.append(self.define("H5Z_ZFP_USE_STATIC_LIBS", True))
if spec.satisfies("^amr-wind+ascent"):
args.append(self.define("CMAKE_EXE_LINKER_FLAGS", self.compiler.openmp_flag))
return args
def setup_build_environment(self, env):
if "~stk_simd" in self.spec:
env.append_flags("CXXFLAGS", "-DUSE_STK_SIMD_NONE")
env.append_flags("CXXFLAGS", "-DUSE_STK_SIMD_NONE")
if "+rocm+amr_wind_gpu~nalu_wind_gpu" in self.spec:
# Manually turn off device self.defines to solve Kokkos issues in Nalu-Wind headers
env.append_flags("CXXFLAGS", "-U__HIP_DEVICE_COMPILE__ -DDESUL_HIP_RDC")
if "+cuda" in self.spec:
env.set("OMPI_CXX", self.spec["kokkos-nvcc-wrapper"].kokkos_cxx)
env.set("MPICH_CXX", self.spec["kokkos-nvcc-wrapper"].kokkos_cxx)
env.set("MPICXX_CXX", self.spec["kokkos-nvcc-wrapper"].kokkos_cxx)
if "+rocm" in self.spec:
env.set("OMPI_CXX", self.spec["hip"].hipcc)
env.set("MPICH_CXX", self.spec["hip"].hipcc)

View File

@@ -180,10 +180,11 @@ def cmake_args(self):
[
define("CMAKE_C_COMPILER", spec["mpi"].mpicc),
define("CMAKE_CXX_COMPILER", spec["mpi"].mpicxx),
define("CMAKE_Fortran_COMPILER", spec["mpi"].mpifc),
define("MPI_BASE_DIR", spec["mpi"].prefix),
]
)
if "+fortran" in self.spec:
options.append(define("CMAKE_Fortran_COMPILER", spec["mpi"].mpifc))
# ##################### Dependencies ##########################
# Always need NetCDF-C

View File

@@ -72,6 +72,14 @@ class Fastjet(AutotoolsPackage):
)
variant("atlas", default=False, description="Patch to make random generator thread_local")
variant(
"cxxstd",
default="11",
values=("11", "17", "20", "23"),
multi=False,
description="Use the specified C++ standard when building",
)
available_plugins = (
conditional("atlascone", when="@2.4.0:"),
conditional("cdfcones", when="@2.1.0:"),
@@ -126,3 +134,8 @@ def configure_args(self):
extra_args += ["--enable-thread-safety"]
return extra_args
def flag_handler(self, name, flags):
if name == "cxxflags":
flags.append(f"-std=c++{self.spec.variants['cxxstd'].value}")
return (None, flags, None)

View File

@@ -20,31 +20,12 @@ class FenicsBasix(CMakePackage):
version("0.8.0", sha256="b299af82daf8fa3e4845e17f202491fe71b313bf6ab64c767a5287190b3dd7fe")
version("0.7.0", sha256="9bee81b396ee452eec8d9735f278cb44cb6994c6bc30aec8ed9bb4b12d83fa7f")
version("0.6.0", sha256="687ae53153c98facac4080dcdc7081701db1dcea8c5e7ae3feb72aec17f83304")
version(
"0.5.1",
sha256="69133476ac35f0bd0deccb480676030378c341d7dfb2adaca22cd16b7e1dc1cb",
deprecated=True,
)
version(
"0.4.2",
sha256="a54f5e442b7cbf3dbb6319c682f9161272557bd7f42e2b8b8ccef88bc1b7a22f",
deprecated=True,
)
depends_on("cmake@3.19:", type="build")
depends_on("blas")
depends_on("lapack")
depends_on("xtensor@0.23.10:", when="@:0.4")
depends_on("xtl@0.7.2:", when="@:0.4")
conflicts(
"%gcc@:9.10", when="@0.5.0:", msg="fenics-basix requires GCC-10 or newer for C++20 support"
)
conflicts(
"%clang@:9.10",
when="@0.5.0:",
msg="fenics-basix requires Clang-10 or newer for C++20 support",
)
conflicts("%gcc@:9.10", msg="fenics-basix requires GCC-10 or newer for C++20 support")
conflicts("%clang@:9.10", msg="fenics-basix requires Clang-10 or newer for C++20 support")
root_cmakelists_dir = "cpp"

View File

@@ -20,32 +20,6 @@ class FenicsDolfinx(CMakePackage):
version("0.8.0", sha256="acf3104d9ecc0380677a6faf69eabfafc58d0cce43f7777e1307b95701c7cad9")
version("0.7.2", sha256="7d9ce1338ce66580593b376327f23ac464a4ce89ef63c105efc1a38e5eae5c0b")
version("0.6.0", sha256="eb8ac2bb2f032b0d393977993e1ab6b4101a84d54023a67206e3eac1a8d79b80")
version(
"0.5.1",
sha256="a570e3f6ed8e7c570e7e61d0e6fd44fa9dad2c5f8f1f48a6dc9ad22bacfbc973",
deprecated=True,
)
version(
"0.5.0",
sha256="503c70c01a44d1ffe48e052ca987693a49f8d201877652cabbe2a44eb3b7c040",
deprecated=True,
)
version(
"0.4.1",
sha256="68dcf29a26c750fcea5e02d8d58411e3b054313c3bf6fcbc1d0f08dd2851117f",
deprecated=True,
)
conflicts(
"%gcc@:9.10",
when="@0.5.0:",
msg="fenics-dolfinx requires GCC-10 or newer for C++20 support",
)
conflicts(
"%clang@:9.10",
when="@0.5.0:",
msg="fenics-dolfinx requires Clang-10 or newer for C++20 support",
)
# Graph partitioner variants
variant(
@@ -57,8 +31,7 @@ class FenicsDolfinx(CMakePackage):
)
# Graph partitioner dependencies
depends_on("kahip@3.12:", when="partitioners=kahip @0.5.0:")
depends_on("kahip@3.11", when="partitioners=kahip @:0.4.1")
depends_on("kahip@3.12:", when="partitioners=kahip")
depends_on("parmetis", when="partitioners=parmetis")
depends_on("scotch+mpi", when="partitioners=scotch")
@@ -70,41 +43,26 @@ class FenicsDolfinx(CMakePackage):
depends_on("mpi")
depends_on("hdf5+mpi")
depends_on("boost@1.7.0:+filesystem+program_options+timer")
depends_on("pugixml")
depends_on("spdlog", when="@0.9:")
depends_on("petsc+mpi+shared")
depends_on("xtensor@0.23.10:", when="@:0.5")
depends_on("xtl@0.7.2:", when="@:0.5")
depends_on("slepc", when="+slepc")
depends_on("adios2+mpi", when="+adios2")
depends_on("pugixml", when="@0.5.0:")
depends_on("fenics-ufcx@main", when="@main")
depends_on("fenics-ufcx@0.8", when="@0.8")
depends_on("fenics-ufcx@0.7", when="@0.7")
depends_on("fenics-ufcx@0.6.0:0.6", when="@0.6.0:0.6")
depends_on("fenics-ufcx@0.5.0", when="@0.5.1:0.5")
depends_on("fenics-ufcx@0.4.2", when="@0.4.1")
depends_on("fenics-ufcx@0.6", when="@0.6")
depends_on("fenics-basix@main", when="@main")
depends_on("fenics-basix@0.8", when="@0.8")
depends_on("fenics-basix@0.7", when="@0.7")
depends_on("fenics-basix@0.6.0:0.6", when="@0.6.0:0.6")
depends_on("fenics-basix@0.5.1:0.5", when="@0.5.0:0.5")
depends_on("fenics-basix@0.4.2", when="@0.4.1")
depends_on("fenics-basix@0.6", when="@0.6")
conflicts(
"%gcc@:9.10",
when="@0.5.0:",
msg="fenics-dolfinx requires GCC-10 or newer for C++20 support",
)
conflicts(
"%clang@:9.10",
when="@0.5.0:",
msg="fenics-dolfinx requires Clang-10 or newer for C++20 support",
)
conflicts("%gcc@:8", msg="fenics-dolfinx requires GCC-9 or newer for improved C++17 support")
conflicts("%gcc@:9.10", msg="fenics-dolfinx requires GCC-10 or newer for C++20 support")
conflicts("%clang@:9.10", msg="fenics-dolfinx requires Clang-10 or newer for C++20 support")
root_cmakelists_dir = "cpp"

View File

@@ -14,7 +14,7 @@ class FenicsUfcx(CMakePackage):
homepage = "https://github.com/FEniCS/ffcx"
git = "https://github.com/FEniCS/ffcx.git"
url = "https://github.com/FEniCS/ffcx/archive/v0.4.2.tar.gz"
maintainers("ma595", "jhale")
maintainers("ma595", "jhale", "garth-wells", "chrisrichardson")
license("LGPL-3.0-or-later")
@@ -22,21 +22,6 @@ class FenicsUfcx(CMakePackage):
version("0.8.0", sha256="8a854782dbd119ec1c23c4522a2134d5281e7f1bd2f37d64489f75da055282e3")
version("0.7.0", sha256="7f3c3ca91d63ce7831d37799cc19d0551bdcd275bdfa4c099711679533dd1c71")
version("0.6.0", sha256="076fad61d406afffd41019ae1abf6da3f76406c035c772abad2156127667980e")
version(
"0.5.0.post0",
sha256="039908c9998b51ba53e5deb3a97016062c262f0a4285218644304f7d3cd35882",
deprecated=True,
)
version(
"0.5.0",
sha256="3413409e5885e41e220f99e0f95cc817e94c4931143d1f700c6e0c5e1bfad1f6",
deprecated=True,
)
version(
"0.4.2",
sha256="3be6eef064d6ef907245db5b6cc15d4e603762e68b76e53e099935ca91ef1ee4",
deprecated=True,
)
depends_on("cmake@3.19:", type="build")

View File

@@ -19,6 +19,7 @@ class File(AutotoolsPackage):
license("BSD-2-Clause")
version("5.45", sha256="fc97f51029bb0e2c9f4e3bffefdaf678f0e039ee872b9de5c002a6d09c784d82")
version("5.44", sha256="3751c7fba8dbc831cb8d7cc8aff21035459b8ce5155ef8b0880a27d028475f3b")
version("5.43", sha256="8c8015e91ae0e8d0321d94c78239892ef9dbc70c4ade0008c0e95894abfb1991")
version("5.42", sha256="c076fb4d029c74073f15c43361ef572cfb868407d347190ba834af3b1639b0e4")

View File

@@ -22,6 +22,11 @@ class Fish(CMakePackage):
license("GPL-2.0-only")
version("master", branch="master")
version("3.7.1", sha256="614c9f5643cd0799df391395fa6bbc3649427bb839722ce3b114d3bbc1a3b250")
version("3.7.0", sha256="df1b7378b714f0690b285ed9e4e58afe270ac98dbc9ca5839589c1afcca33ab1")
version("3.6.4", sha256="0f3f610e580de092fbe882c8aa76623ecf91bb16fdf0543241e6e90d5d4bc393")
version("3.6.3", sha256="55520128c8ef515908a3821423b430db9258527a6c6acb61c7cb95626b5a48d5")
version("3.6.2", sha256="a21a6c986f1f80273895ba7e905fa80ad7e1a262ddb3d979efa443367eaf4863")
version("3.6.1", sha256="55402bb47ca6739d8aba25e41780905b5ce1bce0a5e0dd17dca908b5bc0b49b2")
version("3.6.0", sha256="97044d57773ee7ca15634f693d917ed1c3dc0fa7fde1017f1626d60b83ea6181")
version("3.5.1", sha256="a6d45b3dc5a45dd31772e7f8dfdfecabc063986e8f67d60bd7ca60cc81db6928")

View File

@@ -6,16 +6,20 @@
from spack.package import *
class Flint(Package):
class Flint(AutotoolsPackage):
"""FLINT (Fast Library for Number Theory)."""
homepage = "https://www.flintlib.org"
url = "https://mirrors.mit.edu/sage/spkg/upstream/flint/flint-2.5.2.tar.gz"
git = "https://github.com/wbhart/flint2.git"
homepage = "https://flintlib.org"
url = "https://flintlib.org/flint-3.1.2.tar.gz"
git = "https://github.com/flintlib/flint.git"
list_url = "https://flintlib.org/downloads.html"
list_depth = 0
license("LGPL-2.1-or-later")
version("develop", branch="trunk")
version("main", branch="main")
version("3.1.2", sha256="fdb3a431a37464834acff3bdc145f4fe8d0f951dd5327c4c6f93f4cbac5c2700")
version("3.0.1", sha256="7b311a00503a863881eb8177dbeb84322f29399f3d7d72f3b1a4c9ba1d5794b4")
version("2.5.2", sha256="cbf1fe0034533c53c5c41761017065f85207a1b770483e98b2392315f6575e87")
version("2.4.5", sha256="e489354df00f0d84976ccdd0477028693977c87ccd14f3924a89f848bb0e01e3")
@@ -25,28 +29,9 @@ class Flint(Package):
# variant('mpir', default=False,
# description='Compile with the MPIR library')
# Build dependencies
depends_on("autoconf", type="build")
# Other dependencies
depends_on("gmp") # mpir is a drop-in replacement for this
depends_on("mpfr") # Could also be built against mpir
def install(self, spec, prefix):
options = []
options = [
"--prefix=%s" % prefix,
"--with-gmp=%s" % spec["gmp"].prefix,
"--with-mpfr=%s" % spec["mpfr"].prefix,
]
# if '+mpir' in spec:
# options.extend([
# "--with-mpir=%s" % spec['mpir'].prefix
# ])
configure(*options)
make()
if self.run_tests:
make("check")
make("install")
def configure_args(self):
spec = self.spec
return [f"--with-gmp={spec['gmp'].prefix}", f"--with-mpfr={spec['mpfr'].prefix}"]

View File

@@ -17,7 +17,8 @@ class Fpocket(MakefilePackage):
license("MIT")
version("4.1", "1a2af2d3f2df42de67301996db3b93c7eaff0375f866443c0468dcf4b1750688")
version("4.2", sha256="8aea4ccdf4243606110c8f6978b13dd90f9cae092660eca4c6970206011de4aa")
version("4.1", sha256="1a2af2d3f2df42de67301996db3b93c7eaff0375f866443c0468dcf4b1750688")
depends_on("netcdf-c")
depends_on("netcdf-cxx")

View File

@@ -53,6 +53,16 @@ def setup_dependent_package(self, module, dependent_spec):
self.spec.mpif77 = self.prefix.bin.mpifrt
self.spec.mpifc = self.prefix.bin.mpifrt
def setup_dependent_build_environment(self, env, dependent_spec):
if self.spec.satisfies("%gcc"):
env.set("MPI_C_COMPILER", self.prefix.bin.mpicc)
env.set("MPI_CXX_COMPILER", self.prefix.bin.mpicxx)
env.set("MPI_Fortran_COMPILER", self.prefix.bin.mpifort)
else:
env.set("MPI_C_COMPILER", self.prefix.bin.mpifcc)
env.set("MPI_CXX_COMPILER", self.prefix.bin.mpiFCC)
env.set("MPI_Fortran_COMPILER", self.prefix.bin.mpifrt)
def setup_run_environment(self, env):
# Because MPI are both compilers and runtimes, we set up the compilers
# as part of run environment

View File

@@ -65,6 +65,12 @@ class Gaudi(CMakePackage):
sha256="b05f6b7c1efb8c3af291c8d81fd1627e58af7c5f9a78a0098c6e3bfd7ec80c15",
when="@37.1 ^catch2@3.1:",
)
# add missing <list> include for newer compilers
patch(
"https://gitlab.cern.ch/gaudi/Gaudi/-/commit/54b727f08a685606420703098131b387d3026637.diff",
sha256="41aa1587a3e59d49e0fa9659577073c091871c2eca1b8b237c177ab98fbacf3f",
when="@:38.2",
)
# These dependencies are needed for a minimal Gaudi build
depends_on("aida")

View File

@@ -32,6 +32,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
license("MIT")
version("3.9.0", sha256="577f80e9d14ff7c90b6bfbc34201652b4546700c01543efb4f4c3050e0b3fda2")
version("3.8.5", sha256="e8b4df2a8a7d25272f867455c0c230459545972f81f0eff2ddbf6a6f60dcb1e4")
version("3.8.4", sha256="0c53ced95d29474236487202709b49015854f8e02e35e44ed0f4f4e12a7966ce")
version("3.8.3", sha256="ae2d160f65016e208eca34ff14490ec4511f1fa03fd386ac130449d15e82929d")
@@ -73,29 +74,30 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
version("3.0.2", sha256="c3765371ce391715c8f28bd6defbc70b57aa43341f6e94605f04fe3c92468983")
version("3.0.1", sha256="45b4ae25dbd87282d589eca76481c426f72132d7a599556470d5c38263b09266")
version("3.0.0", sha256="ad316fa052d94d9606e90b20a514b92b2dd64e3142dfdbd8f10981a5fcd5c43e")
version("2.4.4", sha256="a383bd3cf555d6e1169666b01b5b3025b2722ed39e834f1b65090f604405dcd8")
version("2.4.3", sha256="d52dc3e0cff3af3e898d887c4151442989f416e839948e73f0994f0224bbff60")
version("2.4.2", sha256="dcc132e469c5eb76fa4aaff238d32e45a5d947dc5b6c801a123b70045b618e0c")
version("2.4.1", sha256="fd51b4900b2fc49b98d8714f55fc8a78ebfd07218357f93fb796791115a5a1ad")
version("2.4.0", sha256="c3791dcc6d37e59f6efa86e2df2a55a4485237b0a48e330ae08949f0cdf00f27")
version("2.3.3", sha256="c3635e41766a648f945d235b922e3c5306e26a2ee5bbd730d2181e242f5f46fe")
version("2.3.2", sha256="3f6d78fe8807d1d6afb7bed27394f19467840a82bc36d65e66316fa0aa9d32a4")
version("2.3.1", sha256="9c4625c45a3ee7e49a604ef221778983dd9fd8104922a87f20b99d9bedb7725a")
version("2.3.0", sha256="6f75e49aa30de140525ccb58688667efe3a2d770576feb7fbc91023b7f552aa2")
version("2.2.4", sha256="441eb1d1acb35238ca43a1a0a649493fc91fdcbab231d0747e9d462eea192278")
version("2.2.3", sha256="a328d63d476b3653f5a25b5f7971e87a15cdf8860ab0729d4b1157ba988b8d0b")
version("2.2.2", sha256="eb25d6ee85f4f5ac1d5581958f8c6eed9b1d50746f82866fe92e507541def35b")
version("2.2.1", sha256="927098d54083ac919a497f787b835b099e9a194f2e5444dbff901f7426b86066")
version("2.2.0", sha256="0d4c326862e0f118e17418c042c2bcd037b25abd3fb198e1fc5d40b11a9fc8ea")
version("2.1.4", sha256="e06a7ae4c4ed2fd678cd045ff50a10ff5002f3b81cdfcd8ab03c39ce962d9b63")
version("2.1.3", sha256="b489793627e6cb8d2ff8d7737b61daf58382fe189fae4c581ddfd48c04b49005")
version("2.1.2", sha256="b597f36bd29a2b4368998ddd32b28c8cdf3c8192237a81b99af83cc17d7fa374")
version("2.1.1", sha256="87ce516ce757ad1edf1e21f007fbe232ed2e932af422e9893f40199711c41f92")
version("2.1.0", sha256="568b43441955b306364fcf97fb47d4c1512ac6f2f5f76b2ec39a890d2418ee03")
version("2.0.3", sha256="3c6c5ade299c7a52fc9c5d2111110c97032e1f0c2593ce6091c364b1a43b442a")
version("2.0.2", sha256="90f838853cc1c07e55893483faa7e923e4b4b1659c6bc9df3538366030a7e622")
version("2.0.1", sha256="2564c91ed8ed36274ee31002a25798f5babc4221e879cb5013867733d80f9920")
version("2.0.0", sha256="91704fafeea2349c5e268dc1e2d03921b3aae64b05ee01d59fdfc1a6b0ffc061")
with default_args(deprecated=True):
version("2.4.4", sha256="a383bd3cf555d6e1169666b01b5b3025b2722ed39e834f1b65090f604405dcd8")
version("2.4.3", sha256="d52dc3e0cff3af3e898d887c4151442989f416e839948e73f0994f0224bbff60")
version("2.4.2", sha256="dcc132e469c5eb76fa4aaff238d32e45a5d947dc5b6c801a123b70045b618e0c")
version("2.4.1", sha256="fd51b4900b2fc49b98d8714f55fc8a78ebfd07218357f93fb796791115a5a1ad")
version("2.4.0", sha256="c3791dcc6d37e59f6efa86e2df2a55a4485237b0a48e330ae08949f0cdf00f27")
version("2.3.3", sha256="c3635e41766a648f945d235b922e3c5306e26a2ee5bbd730d2181e242f5f46fe")
version("2.3.2", sha256="3f6d78fe8807d1d6afb7bed27394f19467840a82bc36d65e66316fa0aa9d32a4")
version("2.3.1", sha256="9c4625c45a3ee7e49a604ef221778983dd9fd8104922a87f20b99d9bedb7725a")
version("2.3.0", sha256="6f75e49aa30de140525ccb58688667efe3a2d770576feb7fbc91023b7f552aa2")
version("2.2.4", sha256="441eb1d1acb35238ca43a1a0a649493fc91fdcbab231d0747e9d462eea192278")
version("2.2.3", sha256="a328d63d476b3653f5a25b5f7971e87a15cdf8860ab0729d4b1157ba988b8d0b")
version("2.2.2", sha256="eb25d6ee85f4f5ac1d5581958f8c6eed9b1d50746f82866fe92e507541def35b")
version("2.2.1", sha256="927098d54083ac919a497f787b835b099e9a194f2e5444dbff901f7426b86066")
version("2.2.0", sha256="0d4c326862e0f118e17418c042c2bcd037b25abd3fb198e1fc5d40b11a9fc8ea")
version("2.1.4", sha256="e06a7ae4c4ed2fd678cd045ff50a10ff5002f3b81cdfcd8ab03c39ce962d9b63")
version("2.1.3", sha256="b489793627e6cb8d2ff8d7737b61daf58382fe189fae4c581ddfd48c04b49005")
version("2.1.2", sha256="b597f36bd29a2b4368998ddd32b28c8cdf3c8192237a81b99af83cc17d7fa374")
version("2.1.1", sha256="87ce516ce757ad1edf1e21f007fbe232ed2e932af422e9893f40199711c41f92")
version("2.1.0", sha256="568b43441955b306364fcf97fb47d4c1512ac6f2f5f76b2ec39a890d2418ee03")
version("2.0.3", sha256="3c6c5ade299c7a52fc9c5d2111110c97032e1f0c2593ce6091c364b1a43b442a")
version("2.0.2", sha256="90f838853cc1c07e55893483faa7e923e4b4b1659c6bc9df3538366030a7e622")
version("2.0.1", sha256="2564c91ed8ed36274ee31002a25798f5babc4221e879cb5013867733d80f9920")
version("2.0.0", sha256="91704fafeea2349c5e268dc1e2d03921b3aae64b05ee01d59fdfc1a6b0ffc061")
# Optional dependencies
variant("archive", default=False, when="@3.7:", description="Optional for vsi7z VFS driver")
@@ -251,18 +253,22 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
with when("build_system=cmake"):
generator("ninja")
depends_on("cmake@3.16:", type="build", when="@3.9:")
depends_on("cmake@3.9:", type="build")
with when("build_system=autotools"):
depends_on("gmake", type="build")
# Required dependencies
# Versions come from gdal_check_package in cmake/helpers/CheckDependentLibraries.cmake
depends_on("pkgconfig@0.25:", type="build")
depends_on("proj@6.3.1:", when="@3.9:")
depends_on("proj@6:", when="@3:")
depends_on("proj@:6", when="@2.5:2")
depends_on("proj@:5", when="@2.4")
depends_on("proj@:4", when="@:2.3")
depends_on("zlib-api")
depends_on("libtiff@4.1:", when="@3.9:")
depends_on("libtiff@4:", when="@3:")
depends_on("libtiff@3.6.0:") # 3.9.0+ needed to pass testsuite
depends_on("libgeotiff@1.5:", when="@3:")
@@ -283,6 +289,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# depends_on('bsb', when='+bsb')
depends_on("cfitsio", when="+cfitsio")
depends_on("crunch", when="+crnlib")
depends_on("curl@7.68:", when="@3.9:+curl")
depends_on("curl", when="+curl")
depends_on("cryptopp", when="+cryptopp")
depends_on("libdeflate", when="+deflate")
@@ -294,6 +301,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# depends_on('fme', when='+fme')
depends_on("freexl", when="+freexl")
depends_on("fyba", when="+fyba")
depends_on("geos@3.8:", when="@3.9:+geos")
depends_on("geos@3.1:", when="+geos")
depends_on("giflib", when="+gif")
depends_on("grass@5.7:", when="+grass")
@@ -301,9 +309,11 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
depends_on("libgta", when="+gta")
depends_on("libheif@1.1:", when="+heif")
depends_on("hdf", when="+hdf4")
depends_on("hdf5+cxx", when="+hdf5")
depends_on("hdf5@:1.13", when="@:3.5 +hdf5")
depends_on("hdf5@:1.12", when="@:3.4 +hdf5")
depends_on("hdf5@1.10:", when="@3.9:+hdf5")
depends_on("hdf5@:1.13", when="@:3.5+hdf5")
depends_on("hdf5@:1.12", when="@:3.4+hdf5")
depends_on("hdf5+cxx", when="@3.8:+hdf5+kea")
depends_on("hdf5+cxx", when="@:3.7+hdf5")
depends_on("hadoop", when="+hdfs")
depends_on("iconv", when="+iconv")
# depends_on('idb', when='+idb')
@@ -318,6 +328,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# depends_on('libcsf', when='+libcsf')
depends_on("libkml@1.3:", when="+libkml")
depends_on("xz", when="+liblzma")
depends_on("qb3", when="+libqb3")
depends_on("libxml2", when="+libxml2")
# depends_on('luratech', when='+luratech')
depends_on("lz4", when="+lz4")
@@ -330,6 +341,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# depends_on('mssql_ncli', when='+mssql_ncli')
# depends_on('mssql_odbc', when='+mssql_odbc')
depends_on("mysql", when="+mysql")
depends_on("netcdf-c@4.7:", when="@3.9:+netcdf")
depends_on("netcdf-c", when="+netcdf")
depends_on("unixodbc", when="+odbc")
# depends_on('odbc-cpp-wrapper', when='+odbccpp')
@@ -337,6 +349,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# depends_on('lib-opencad', when='+opencad')
depends_on("opencl", when="+opencl")
depends_on("openexr@2.2:", when="+openexr")
depends_on("openjpeg@2.3.1:", when="@3.9:+openjpeg")
depends_on("openjpeg", when="+openjpeg")
depends_on("openssl", when="+openssl")
depends_on("oracle-instant-client", when="+oracle")
@@ -345,26 +358,32 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
depends_on("pcre2", when="@3.5:+pcre2")
depends_on("pcre", when="@:3.4+pcre2")
# depends_on('pdfium', when='+pdfium')
depends_on("libpng@1.6:", when="@3.9:+png")
depends_on("libpng", when="+png")
# depends_on('podofo', when='+podofo')
depends_on("poppler@0.86:", when="@3.9:+poppler")
depends_on("poppler@0.24:", when="@3:+poppler")
depends_on("poppler@:0.63", when="@:2.3+poppler")
depends_on("poppler@:0.71", when="@:2.4+poppler")
depends_on("poppler@:21", when="@:3.4.1+poppler")
depends_on("poppler", when="+poppler")
depends_on("poppler@0.24:", when="@3: +poppler")
depends_on("poppler@:0.63", when="@:2.3 +poppler")
depends_on("poppler@:0.71", when="@:2.4 +poppler")
depends_on("poppler@:21", when="@:3.4.1 +poppler")
depends_on("postgresql", when="+postgresql")
depends_on("qb3", when="+libqb3")
depends_on("qhull", when="+qhull")
depends_on("qhull@2015:", when="@3.5:+qhull")
depends_on("qhull@:2020.1", when="@:3.3+qhull")
# depends_on('rasdaman', when='+rasdaman')
# depends_on('rasterlite2@1.1:', when='+rasterlite2')
# depends_on('rasterlite2@1.1:', when='@3.7:+rasterlite2')
# depends_on('rasterlite2', when='+rasterlite2')
# depends_on('rdblib', when='+rdb')
# depends_on('sde', when='+sde')
depends_on("sfcgal", when="+sfcgal")
depends_on("libspatialite@4.1.2:", when="@3.7:+spatialite")
depends_on("libspatialite", when="+spatialite")
depends_on("sqlite@3.31:", when="@3.9:+sqlite3")
depends_on("sqlite@3:", when="+sqlite3")
# depends_on('teigha', when='+teigha')
# depends_on('tiledb@2.15:', when='@3.9:+tiledb')
# depends_on('tiledb@2.7:', when='@3.7:+tiledb')
# depends_on('tiledb', when='+tiledb')
depends_on("libwebp", when="+webp")
depends_on("xerces-c@3.1:", when="+xercesc")
@@ -377,14 +396,15 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# extends('openjdk', when='+java')
# extends('perl', when='+perl')
# see gdal_version_and_min_supported_python_version
# in swig/python/osgeo/__init__.py
depends_on("python@3.6:", type=("build", "link", "run"), when="@3.3:+python")
depends_on("python@2.0:", type=("build", "link", "run"), when="@3.2:+python")
# swig/python/pyproject.toml (3.9+)
# swig/python/setup.py.in (3.5-3.8)
# swig/python/osgeo/__init__.py (3.4-)
depends_on("python", type=("build", "link", "run"), when="+python")
# Uses distutils
depends_on("python@:3.11", type=("build", "link", "run"), when="@:3.4+python")
# swig/python/setup.py
# swig/python/pyproject.toml (3.9+)
# swig/python/setup.py (3.8-)
depends_on("py-setuptools@67:", type="build", when="@3.9:+python")
depends_on("py-setuptools@:57", type="build", when="@:3.2+python") # needs 2to3
depends_on("py-setuptools", type="build", when="+python")
depends_on("py-numpy@1.0.0:", type=("build", "run"), when="+python")
@@ -403,13 +423,17 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# https://gdal.org/development/rfc/rfc88_googletest.html
depends_on("googletest@1.10:", type="test")
# https://trac.osgeo.org/gdal/wiki/SupportedCompilers
# https://gdal.org/development/rfc/rfc98_build_requirements_gdal_3_9.html
msg = "GDAL requires C++17 support"
conflicts("%gcc@:7", msg=msg)
conflicts("%clang@:4", msg=msg)
conflicts("%msvc@:19.14", msg=msg)
# https://gdal.org/development/rfc/rfc68_cplusplus11.html
msg = "GDAL requires C++11 support"
conflicts("%gcc@:4.8.0", msg=msg)
conflicts("%clang@:3.2", msg=msg)
conflicts("%intel@:12", msg=msg)
conflicts("%xl@:13.0", msg=msg)
conflicts("%xl_r@:13.0", msg=msg)
conflicts("%msvc@:13", msg=msg)
# https://github.com/OSGeo/gdal/issues/5994
conflicts("~png", when="@3:3.5.0")


@@ -29,26 +29,40 @@ class Git(AutotoolsPackage):
# Every new git release comes with a corresponding manpage resource:
# https://www.kernel.org/pub/software/scm/git/git-manpages-{version}.tar.gz
# https://mirrors.edge.kernel.org/pub/software/scm/git/sha256sums.asc
version("2.42.0", sha256="34aedd54210d7216a55d642bbb4cfb22695b7610719a106bf0ddef4c82a8beed")
version("2.41.0", sha256="c4a6a3dd1827895a80cbd824e14d94811796ae54037549e0da93f7b84cb45b9f")
version("2.40.1", sha256="55511f10f3b1cdf5db4e0e3dea61819dfb67661b0507a5a2b061c70e4f87e14c")
version("2.39.3", sha256="2f9aa93c548941cc5aff641cedc24add15b912ad8c9b36ff5a41b1a9dcad783e")
version("2.38.5", sha256="09392caf6ff296341022595a175d8b075bc98b6a82f6227d3bd21e36a2a812c3")
version("2.37.7", sha256="2108fa57b74add4300b8960e9404e0ed3e5f0efda7470450c67c67e8ab7616d5")
version("2.36.6", sha256="a8c09f46d5d16a8d8f19e8089aeb408d95d8097af03de297061e83a2c74890dd")
version("2.35.8", sha256="3a675e0128a7153e1492bbe14d08195d44b5916e6b8879addf94b1f4add77dca")
version("2.34.8", sha256="10a6c233471d7d4439cd4004961a3f4ff7e6de308645a1074ec3522b8ea52c83")
version("2.33.8", sha256="eafd10da9fdf86be0a79beb67c3537eead114f91836c685d5b9c969c961516ae")
version("2.32.7", sha256="f09904d13a9bfca5fcb228c3caba1d4c17426dec0000bf67672af257b8a73db4")
version("2.31.8", sha256="d2443e368b1394858a1040bd74dacfba46bce2cf3410ef3bc5089a703fc91e9a")
version("2.30.9", sha256="b14b5f4ce1fe23ed78839664c7ba888fb5cedba3dd98d9f5a499a36fa3a4a2d8")
version("2.45.1", sha256="10acb581993061e616be9c5674469335922025a666318e0748cb8306079fef24")
version("2.44.1", sha256="118214bb8d7ba971a62741416e757562b8f5451cefc087a407e91857897c92cc")
version("2.43.4", sha256="bfd717dc31922f718232a25a929d199e26146df5e876fdf0ff90a7cc95fa06e2")
version("2.42.2", sha256="3b24b712fa6e9a3da5b7d3e68b1854466905aadb93a43088a38816bcc3b9d043")
version("2.41.1", sha256="06d2a681aa7f1bdb6e7f7101631407e7412faa534e1fa0eb6fdcb9975d867d31")
version("2.40.2", sha256="1dcdfbb4eeb3ef2c2d9154f888d4a6f0cf19f19acad76f0d32e725e7bc147753")
version("2.39.4", sha256="b895ed2b5d98fd3dcfde5807f16d5fb17c4f83044e7d08e597ae13de222f0d26")
# Deprecated versions
# Deprecated versions (see https://github.blog/2024-05-14-securing-git-addressing-5-new-vulnerabilities/).
version(
"2.42.0",
sha256="34aedd54210d7216a55d642bbb4cfb22695b7610719a106bf0ddef4c82a8beed",
deprecated=True,
)
version(
"2.41.0",
sha256="c4a6a3dd1827895a80cbd824e14d94811796ae54037549e0da93f7b84cb45b9f",
deprecated=True,
)
version(
"2.40.1",
sha256="55511f10f3b1cdf5db4e0e3dea61819dfb67661b0507a5a2b061c70e4f87e14c",
deprecated=True,
)
version(
"2.40.0",
sha256="ab37c343c0ad097282fd311ab9ca521ab3da836e5c4ed2093994f1b7f8575b09",
deprecated=True,
)
version(
"2.39.3",
sha256="2f9aa93c548941cc5aff641cedc24add15b912ad8c9b36ff5a41b1a9dcad783e",
deprecated=True,
)
version(
"2.39.2",
sha256="fb6807d1eb4094bb2349ab97d203fe1e6c3eb28af73ea391decfbd3a03c02e85",
@@ -59,6 +73,11 @@ class Git(AutotoolsPackage):
sha256="ae8d3427e4ccd677abc931f16183c0ec953e3bfcd866493601351e04a2b97398",
deprecated=True,
)
version(
"2.38.5",
sha256="09392caf6ff296341022595a175d8b075bc98b6a82f6227d3bd21e36a2a812c3",
deprecated=True,
)
version(
"2.38.3",
sha256="ba8f1c56763cfde0433657b045629a4c55047c243eb3091d39dea6f281c8d8e1",
@@ -69,6 +88,11 @@ class Git(AutotoolsPackage):
sha256="620ed3df572a34e782a2be4c7d958d443469b2665eac4ae33f27da554d88b270",
deprecated=True,
)
version(
"2.37.7",
sha256="2108fa57b74add4300b8960e9404e0ed3e5f0efda7470450c67c67e8ab7616d5",
deprecated=True,
)
version(
"2.37.5",
sha256="5c11f90652afee6c77ef7ddfc672facd4bc6f2596d9627df2f1780664b058b9a",
@@ -79,11 +103,21 @@ class Git(AutotoolsPackage):
sha256="a638c9bf9e45e8d48592076266adaa9b7aa272a99ee2aee2e166a649a9ba8a03",
deprecated=True,
)
version(
"2.36.6",
sha256="a8c09f46d5d16a8d8f19e8089aeb408d95d8097af03de297061e83a2c74890dd",
deprecated=True,
)
version(
"2.36.3",
sha256="0c831b88b0534f08051d1287505dfe45c367108ee043de6f1c0502711a7aa3a6",
deprecated=True,
)
version(
"2.35.8",
sha256="3a675e0128a7153e1492bbe14d08195d44b5916e6b8879addf94b1f4add77dca",
deprecated=True,
)
version(
"2.35.6",
sha256="6bd51e0487028543ba40fe3d5b33bd124526a7f7109824aa7f022e79edf93bd1",
@@ -94,6 +128,11 @@ class Git(AutotoolsPackage):
sha256="2cca63fe7bebb5b4bf8efea7b46b12bb89c16ff9711b6b6d845928501d00d0a3",
deprecated=True,
)
version(
"2.34.8",
sha256="10a6c233471d7d4439cd4004961a3f4ff7e6de308645a1074ec3522b8ea52c83",
deprecated=True,
)
version(
"2.34.6",
sha256="01c0ae4161a07ffeb89cfb8bda564eb2dcb83b45b678cf2930cdbdd8e81784d0",
@@ -104,6 +143,11 @@ class Git(AutotoolsPackage):
sha256="26831c5e48a8c2bf6a4fede1b38e1e51ffd6dad85952cf69ac520ebd81a5ae82",
deprecated=True,
)
version(
"2.33.8",
sha256="eafd10da9fdf86be0a79beb67c3537eead114f91836c685d5b9c969c961516ae",
deprecated=True,
)
version(
"2.33.6",
sha256="76f6a64a198bec38e83044f97fb5a2dfa8404091df5a905404615d2a4c5ebfb7",
@@ -114,6 +158,11 @@ class Git(AutotoolsPackage):
sha256="d061ed97f890befaef18b4aad80a37b40db90bcf24113c42765fee157a69c7de",
deprecated=True,
)
version(
"2.32.7",
sha256="f09904d13a9bfca5fcb228c3caba1d4c17426dec0000bf67672af257b8a73db4",
deprecated=True,
)
version(
"2.32.5",
sha256="9982e17209cf4a385ce4a6167863cdd29f68e425d4249aac186434dc3536fe5f",
@@ -124,6 +173,11 @@ class Git(AutotoolsPackage):
sha256="4c791b8e1d96948c9772efc21373ab9b3187af42cdebc3bcbb1a06d794d4e494",
deprecated=True,
)
version(
"2.31.8",
sha256="d2443e368b1394858a1040bd74dacfba46bce2cf3410ef3bc5089a703fc91e9a",
deprecated=True,
)
version(
"2.31.6",
sha256="73971208dccdd6d87639abe50ee3405421ec4ba05dec9f8aa90b4e7f1985e15c",
@@ -134,6 +188,11 @@ class Git(AutotoolsPackage):
sha256="2d4197660322937cc44cab5742deef727ba519ef7405455e33100912e3b019f2",
deprecated=True,
)
version(
"2.30.9",
sha256="b14b5f4ce1fe23ed78839664c7ba888fb5cedba3dd98d9f5a499a36fa3a4a2d8",
deprecated=True,
)
version(
"2.30.7",
sha256="c98bf38a296f23ad5619a097df928044b31859df8f89b3ae5a8ea109d3ebd88e",
@@ -146,10 +205,17 @@ class Git(AutotoolsPackage):
)
for _version, _sha256_manpage in {
"2.45.1": "d9098fd93a3c0ef242814fc856a99886ce31dae2ba457afc416ba4e92af8f8f5",
"2.44.1": "8d80359e44cbcce256c1eb1389cb8e15ccfcd267fbb8df567d5ce19ce006eb42",
"2.43.4": "99d3a0394a6093237123237fd6c0d3de1041d5ceaedc3bfc016807914275d3e2",
"2.42.2": "2ddfa2187fdaf9ab2b27c0ab043e46793127c26c82a824ffe980f006be049286",
"2.42.0": "51643c53d70ce15dde83b6da2bad76ba0c7bbcd4f944d7c378f03a15b9f2e1de",
"2.41.1": "7093ef7dacfa8cdb3c4689d8bc1f06186d9b2420bec49087a3a6a4dee26ddcec",
"2.41.0": "7b77c646b36d33c5c0f62677a147142011093270d6fd628ca38c42d5301f3888",
"2.40.2": "2c71f3f3e4801176f97708f2093756bce672ef260c6d95c255046e6727b3a031",
"2.40.1": "6bbde434121bd0bf8aa574c60fd9a162388383679bd5ddd99921505149ffd4c2",
"2.40.0": "fda16047e9c1dd07d9585cc26bbf4002ebf8462ada54cb72b97a0e48135fd435",
"2.39.4": "fedd01dd22a15b84bcbcad68c1b37113ba2c64381c19b6c9f3aa9b2818e126dc",
"2.39.3": "c8377b5a3ff497d7e6377363c270931496e982509ff27a1e46956d6637671642",
"2.39.2": "fd92e67fb750ceb2279dcee967a21303f2f8117834a21c1e0c9f312ebab6d254",
"2.39.1": "b2d1b2c6cba2343934792c4409a370a8c684add1b3c0f9b757e71189b1a2e80e",

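The loop body that consumes the manpage version-to-checksum mapping above is cut off by the hunk. Assuming the conventional Spack resource() pattern and the manpage URL scheme quoted in the comment at the top of this file's diff — the resource name, placement, and when spec below are assumptions, not necessarily the package's exact code — the shape would be roughly:

for _version, _sha256_manpage in {
    "2.45.1": "d9098fd93a3c0ef242814fc856a99886ce31dae2ba457afc416ba4e92af8f8f5",
    # ... remaining entries as listed above ...
}.items():
    resource(
        name="git-manpages",
        url=f"https://www.kernel.org/pub/software/scm/git/git-manpages-{_version}.tar.gz",
        sha256=_sha256_manpage,
        placement="git-manpages",
        when=f"@{_version}",
    )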

@@ -17,6 +17,9 @@ class GosamContrib(AutotoolsPackage):
version("2.0", sha256="c05beceea74324eb51c1049773095e2cb0c09c8c909093ee913d8b0da659048d")
version("1.0", sha256="a29d4232d9190710246abc2ed97fdcd8790ce83580f56a360f3456b0377c40ec")
# whizard checks for .la files ( but does not use them )
install_libtool_archives = True
variant(
"libs",
default="shared,static",
@@ -26,6 +29,11 @@ class GosamContrib(AutotoolsPackage):
)
variant("pic", default=False, description="Build position-independent code")
def patch(self):
# remove spack compiler wrapper path
mf = FileFilter("gosam.conf.in")
mf.filter("^fc.bin=.*", "fc.bin=" + self.compiler.fc)
def flag_handler(self, name, flags):
if name in ["cflags", "cxxflags", "cppflags"]:
if "+pic" in self.spec:


@@ -33,8 +33,10 @@ class Gromacs(CMakePackage, CudaPackage):
version("main", branch="main")
version("master", branch="main", deprecated=True)
version("2024.2", sha256="802a7e335f2e895770f57b159e4ec368ebb0ff2ce6daccf706c6e8025c36852b")
version("2024.1", sha256="937d8f12a36fffbf2af7add71adbb5aa5c5537892d46c9a76afbecab1aa0aac7")
version("2024", sha256="04d226d52066a8bc3a42e00d6213de737b4ec292e26703065924ff01956801e2")
version("2023.5", sha256="9cc491d3601a5fe0ec0de727e4432c34877f596fe8a463d4cf0f0f53fb34d08b")
version("2023.4", sha256="e5d6c4d9e7ccacfaccb0888619bd21b5ea8911f82b410e68d6db5d40f695f231")
version("2023.3", sha256="4ec8f8d0c7af76b13f8fd16db8e2c120e749de439ae9554d9f653f812d78d1cb")
version("2023.2", sha256="bce1480727e4b2bb900413b75d99a3266f3507877da4f5b2d491df798f9fcdae")
@@ -161,7 +163,7 @@ class Gromacs(CMakePackage, CudaPackage):
"sve",
default=True,
description="Enable SVE on aarch64 if available",
when="target=neoverse_v1",
when="target=neoverse_v1:,neoverse_v2:",
)
variant(
"sve", default=True, description="Enable SVE on aarch64 if available", when="target=a64fx"


@@ -1,25 +1,24 @@
diff --git a/bin/hipcc.pl b/bin/hipcc.pl
index 513a427..cd2d6ac 100755
index 513a427..780dc5c 100755
--- a/bin/hipcc.pl
+++ b/bin/hipcc.pl
@@ -160,11 +160,14 @@ if ($HIP_PLATFORM eq "amd") {
@@ -160,11 +160,13 @@ if ($HIP_PLATFORM eq "amd") {
if($isWindows) {
$execExtension = ".exe";
}
- $HIPCC=get_normalized_path("$HIP_CLANG_PATH/clang++" . $execExtension);
+ # llvm_path is set inside the hip recipe
+ $LLVM_PATH= $ENV{'LLVM_PATH'};
+ $HIPCC="${LLVM_PATH}/bin/clang++" . $execExtension;
+ $HIP_CLANG_PATH= $ENV{'HIP_CLANG_PATH'};
+ $HIPCC="${HIP_CLANG_PATH}/clang++" . $execExtension;
# If $HIPCC clang++ is not compiled, use clang instead
if ( ! -e $HIPCC ) {
- $HIPCC=get_normalized_path("$HIP_CLANG_PATH/clang" . $execExtension);
+ $LLVM_PATH= $ENV{'LLVM_PATH'};
+ $HIPCC="${LLVM_PATH}/bin/clang" . $execExtension;
+ $HIPCC="${HIP_CLANG_PATH}/clang" . $execExtension;
$HIPLDFLAGS = "--driver-mode=g++";
}
# to avoid using dk linker or MSVC linker
@@ -484,7 +487,8 @@ if($HIP_PLATFORM eq "amd"){
@@ -484,7 +486,8 @@ if($HIP_PLATFORM eq "amd"){
$targetsStr = $ENV{HCC_AMDGPU_TARGET};
} elsif (not $isWindows) {
# Else try using rocm_agent_enumerator


@@ -1,26 +1,24 @@
diff --git a/amd/hipcc/bin/hipcc.pl b/amd/hipcc/bin/hipcc.pl
index 513a427..ffeafaa 100755
index 513a427..780dc5c 100755
--- a/amd/hipcc/bin/hipcc.pl
+++ b/amd/hipcc/bin/hipcc.pl
@@ -160,11 +160,15 @@ if ($HIP_PLATFORM eq "amd") {
@@ -160,11 +160,13 @@ if ($HIP_PLATFORM eq "amd") {
if($isWindows) {
$execExtension = ".exe";
}
- $HIPCC=get_normalized_path("$HIP_CLANG_PATH/clang++" . $execExtension);
+ # llvm_path is set inside the hip recipe
+ $LLVM_PATH= $ENV{'LLVM_PATH'};
+ $HIPCC="${LLVM_PATH}/bin/clang++" . $execExtension;
+
+ # hip_clang_path is set inside the hip recipe
+ $HIP_CLANG_PATH= $ENV{'HIP_CLANG_PATH'};
+ $HIPCC="${HIP_CLANG_PATH}/clang++" . $execExtension;
# If $HIPCC clang++ is not compiled, use clang instead
if ( ! -e $HIPCC ) {
- $HIPCC=get_normalized_path("$HIP_CLANG_PATH/clang" . $execExtension);
+ $LLVM_PATH= $ENV{'LLVM_PATH'};
+ $HIPCC="${LLVM_PATH}/bin/clang" . $execExtension;
+ $HIPCC="${HIP_CLANG_PATH}/clang" . $execExtension;
$HIPLDFLAGS = "--driver-mode=g++";
}
# to avoid using dk linker or MSVC linker
@@ -484,7 +488,8 @@ if($HIP_PLATFORM eq "amd"){
@@ -484,7 +486,8 @@ if($HIP_PLATFORM eq "amd"){
$targetsStr = $ENV{HCC_AMDGPU_TARGET};
} elsif (not $isWindows) {
# Else try using rocm_agent_enumerator


@@ -31,7 +31,12 @@ class Hisat2(MakefilePackage):
url="https://cloud.biohpc.swmed.edu/index.php/s/hisat2-220-source/download",
extension="zip",
)
version("2.1.0", sha256="89a276eed1fc07414b1601947bc9466bdeb50e8f148ad42074186fe39a1ee781")
version(
"2.1.0",
sha256="89a276eed1fc07414b1601947bc9466bdeb50e8f148ad42074186fe39a1ee781",
url="ftp://ftp.ccb.jhu.edu/pub/infphilo/hisat2/downloads/hisat2-2.1.0-source.zip",
extension="zip",
)
variant("sra", default=False, description="Add SRA (Sequence Read Archive) support")
@@ -48,8 +53,8 @@ class Hisat2(MakefilePackage):
def build(self, spec, prefix):
make(
"USE_SRA=1",
"NCBI_NGS_DIR={0}".format(spec["sra-tools"].prefix),
"NCBI_VDB_DIR={0}".format(spec["ncbi-vdb"].prefix),
f"NCBI_NGS_DIR={spec['sra-tools'].prefix}",
f"NCBI_VDB_DIR={spec['ncbi-vdb'].prefix}",
)
def install(self, spec, prefix):
@@ -59,7 +64,7 @@ def install(self, spec, prefix):
install_tree("example", prefix.example)
install_tree("scripts", prefix.scripts)
if "@:2.2.0" in spec:
if spec.satisfies("@:2.2.0"):
install_tree("hisatgenotype_modules", prefix.hisatgenotype_modules)
install_tree("hisatgenotype_scripts", prefix.hisatgenotype_scripts)
@@ -75,33 +80,33 @@ def install(self, spec, prefix):
install("hisat2-inspect-l", prefix.bin)
install("*.py", prefix.bin)
if "@2.2:" in spec:
if spec.satisfies("@2.2:"):
install("hisat2-repeat", prefix.bin)
@run_after("install")
def filter_sbang(self):
with working_dir(self.prefix.bin):
pattern = "^#!.*/usr/bin/env python"
repl = "#!{0}".format(self.spec["python"].command.path)
repl = f"#!{self.spec['python'].command.path}"
files = ["hisat2-build", "hisat2-inspect"]
for file in files:
filter_file(pattern, repl, *files, backup=False)
pattern = "^#!.*/usr/bin/env perl"
repl = "#!{0}".format(self.spec["perl"].command.path)
repl = f"#!{self.spec['perl'].command.path}"
files = ["hisat2"]
for file in files:
filter_file(pattern, repl, *files, backup=False)
pattern = "^#!.*/usr/bin/env python3"
repl = "#!{0}".format(self.spec["python"].command.path)
repl = f"#!{self.spec['python'].command.path}"
files = glob.glob("*.py")
for file in files:
filter_file(pattern, repl, *files, backup=False)
with working_dir(self.prefix.scripts):
pattern = "^#!.*/usr/bin/perl"
repl = "#!{0}".format(self.spec["perl"].command.path)
repl = f"#!{self.spec['perl'].command.path}"
files = glob.glob("*.pl")
for file in files:
filter_file(pattern, repl, *files, backup=False)


@@ -10,29 +10,6 @@
from spack.package import *
# The viewer and trace viewer tar files and sha256sum depend on the
# version and machine type. Starting with 2019.08, the name of the
# tar file contains the version number.
def viewer_url(ver, mach):
ver2 = ("-" + ver) if ver >= "2019.08" else ""
return ("http://hpctoolkit.org/download/hpcviewer/{0}/hpcviewer{1}-linux.gtk.{2}.tgz").format(
ver, ver2, mach
)
def trace_url(ver, mach):
ver2 = ("-" + ver) if ver >= "2019.08" else ""
return (
"http://hpctoolkit.org/download/hpcviewer/{0}/hpctraceviewer{1}-linux.gtk.{2}.tgz"
).format(ver, ver2, mach)
def darwin_url(ver, mach):
return (
"http://hpctoolkit.org/download/hpcviewer/{0}/hpcviewer-{0}-macosx.cocoa.{1}.zip"
).format(ver, mach)
class Hpcviewer(Package):
"""Binary distribution of hpcviewer and integrated hpctraceviewer for
the Rice HPCToolkit (Linux x86_64, ppc64le and aarch64, and MacOSX
@@ -43,7 +20,7 @@ class Hpcviewer(Package):
run hpcrun and hpcviewer on different machines.
"""
homepage = "http://hpctoolkit.org"
homepage = "https://hpctoolkit.org"
maintainers("mwkrentel")
skip_version_audit = ["platform=windows"]
@@ -129,27 +106,6 @@ class Hpcviewer(Package):
("2020.02", "x86_64"): "af1f514547a9325aee30eb891b31e38c7ea3f33d2d1978b44f83e7daa3d5de6b",
("2020.02", "ppc64"): "7bb4926202db663aedd5a6830778c5f73f6b08a65d56861824ea95ba83b1f59c",
("2020.02", "ppc64le"): "cfcebb7ba301affd6d21d2afd43c540e6dd4c5bc39b0d20e8bd1e4fed6aa3481",
("2020.01", "x86_64"): "3cd5a2a382cec1d64c8bd0abaf2b1461dcd4092a4b4074ddbdc1b96d2a0b4220",
("2020.01", "ppc64"): "814394a5f410033cc1019526c268ef98b5b381e311fcd39ae8b2bde6c6ff017c",
("2020.01", "ppc64le"): "e830e956b8088c415fb25ef44a8aca16ebcb27bcd34536866612343217e3f9e4",
("2019.12", "x86_64"): "6ba149c8d23d9913291655602894f7a91f9c838e69ae5682fd7b605467255c2d",
("2019.12", "ppc64"): "787257272381fac26401e1013952bea94635172503e7abf8063081fe03f08384",
("2019.12", "ppc64le"): "fd20891fdae6dd5c2313cdd98e53c52023a0cf146a1121d0c889ebedc08a8bb9",
("2019.09", "x86_64"): "40982a43880fe646b7f9d03ac4911b55f8a4464510eb8c7304ffaf4d4205ecc6",
("2019.09", "ppc64"): "3972d604bd160c058185b6f8f3f3a63c4031046734b29cc386c24e40831e6798",
("2019.09", "ppc64le"): "c348f442b7415aadb94ead06bd35e96442a49a9768fd8c972ca707d77d61e0c3",
("2019.08", "x86_64"): "249aae6a23dca19286ee15909afbeba5e515388f1c1ad87f572454534fccb9f2",
("2019.08", "ppc64"): "f91b4772c92c05a4a35c88eec094604f3c233c7233adeede97acba38592da379",
("2019.08", "ppc64le"): "b1bd5c76b37f225a01631193e0a62524bd41a54b3354a658fdfd0f66c444cc28",
("2019.07", "x86_64"): "e999781d6a7d178cb1db5b549650024fa9b19891e933bac8b0441d24e7bf015c",
("2019.07", "ppc64"): "057ce0e2d6be5639639f762fb43b116fe31fb855745abaf4ea26bd281cffaab1",
("2019.07", "ppc64le"): "40d6928e0761568168f3ce34f3ed320916ea60bda830dd74513897ef77386b28",
("2019.04", "x86_64"): "c524498ef235171e298c8142b7e73b0a1f7c433f9c471fb692d31f0685e53aa4",
("2019.04", "ppc64"): "dc9daee886ba72c0615db909860ee1aed0979f12c0d113efbe721ddabdf55199",
("2019.04", "ppc64le"): "dddabccef156996d390653639096ad3e27b7384a5754f42084f50c4a50a9009b",
("2019.02", "x86_64"): "e24368a3ec27b82736a781971a8371abfe7744b2a4f68b7b41d76f84af306b83",
("2019.02", "ppc64"): "72c1ef1a5682c3273e900bb248f126428a02dfe728af0c49c7ee8381938d1e18",
("2019.02", "ppc64le"): "02aaf27bb5b0f72d5b5738289bce60f6ef0ef7327ca96a890892509a09adc946",
}
trace_sha = {
@@ -165,27 +121,6 @@ class Hpcviewer(Package):
("2020.02", "x86_64"): "b7b634e91108aa50a2e8647ac6bac87df775ae38aff078545efaa84735e0a666",
("2020.02", "ppc64"): "a3e845901689e1b32bc6ab2826c6ac6ed352df4839090fa530b20f747e6e0957",
("2020.02", "ppc64le"): "a64a283f61e706d988952a7cede9fac0328b09d2d0b64e4c08acc54e38781c98",
("2020.01", "x86_64"): "9459177a2445e85d648384e2ccee20524592e91a74d615262f32d0876831cd7c",
("2020.01", "ppc64"): "02366a2ba30b9b2450d50cf44933288f04fae5bf9868eef7bb2ae1b49d4f454e",
("2020.01", "ppc64le"): "39970e84e397ed96bc994e7b8db3b7b3aab4e3155fa7ca8e68b9274bb58115f0",
("2019.12", "x86_64"): "6339b36e655e2c2b07af4cb40946f325acc46da3ec590d36069661e69b046a92",
("2019.12", "ppc64"): "fe4ee5af22a983fa0ddbfbd97fa6676f07492400536e900188455f21e489c59b",
("2019.12", "ppc64le"): "2688ea834c546b9e2c6e9d69d271a62dd00f6bc7ff4cb874563ba8d0ae5824e3",
("2019.09", "x86_64"): "8d7ce0710570bb8cd424d88cc4b5bfe821330f24fef84bbbbb370fa291b60a14",
("2019.09", "ppc64"): "dfb3fe8283cbaeaa1653e8c8bf68267a3f25886bc452309b10f88a7b1e713ec6",
("2019.09", "ppc64le"): "c1b6ab4f6c91e3a226e8629de62e718c92318ffd83d03db3c40678d578b99b20",
("2019.08", "x86_64"): "6cefed6a397298ab31cadd10831f5d5533d3f634a4a76bb93f686e603a42c5ed",
("2019.08", "ppc64"): "64ca5605c89dd3065cacaeee4a8e2ac14b47953530711ed9e04666c8435e44e8",
("2019.08", "ppc64le"): "bee03b5cb2de7e8556cf1249f98ece7848c13a0de6b8ba71786c430da68f7bcc",
("2019.07", "x86_64"): "267052cf742d12bbe900bc03bc7c47c8e1704fbaad0e1a3fc77b73dc506d5a68",
("2019.07", "ppc64"): "5ae63d8e2f2edf5c3b982d3663311e4d55f9b378f512926b3ebadab27ba72e22",
("2019.07", "ppc64le"): "c2883714cbafa5252432c52d1d32ab5f34554b33a9bad20dcd2c0632388fbee5",
("2019.04", "x86_64"): "f5f908c0e52c97a72af1af8519f4b191298fe52bd811dd06a051b68cd7bcce27",
("2019.04", "ppc64"): "221683c992e4fe2cd9079ad2ebb531d99d04a3cbb3a8860f795b276b1eaeab19",
("2019.04", "ppc64le"): "fe539c6a165a72bba6ea7bdb34a90d862d427c4d55095c97794d54e6dd9d3075",
("2019.02", "x86_64"): "5ff11317a638318295821204ffcb1276e9da1684cd5f298410ae2bf78ce88b6b",
("2019.02", "ppc64"): "95b2a7d848ecb924591c248f5e47c641646ef90a071db48237ddb96c4b71a8fb",
("2019.02", "ppc64le"): "01a159306e7810efe07157ec823ac6ca7570ec2014c95db599a3f90eee33355c",
}
system = platform.system().lower()
@@ -195,31 +130,38 @@ class Hpcviewer(Package):
# Versions for MacOSX / Darwin
if system == "darwin":
for key in darwin_sha.keys():
if key[1] == machine:
version(key[0], url=darwin_url(*key), sha256=darwin_sha[key])
for (ver, arch), sha in darwin_sha.items():
if arch == machine:
version(
ver,
url=f"https://gitlab.com/hpctoolkit/hpcviewer/-/releases/{ver}/downloads/hpcviewer-macosx.cocoa.{arch}.zip",
sha256=sha,
# Versions before 2022.01 are dead links
deprecated=(ver < "2022.01"),
)
# Versions for Linux and Cray front-end
if system == "linux":
for key in viewer_sha.keys():
if key[1] == machine:
for (ver, arch), sha in viewer_sha.items():
if arch == machine:
version(
key[0],
url=viewer_url(*key),
sha256=viewer_sha[key],
deprecated=(key[0] <= "2020.99"),
ver,
url=f"https://gitlab.com/hpctoolkit/hpcviewer/-/releases/{ver}/downloads/hpcviewer-linux.gtk.{arch}.tgz",
sha256=sha,
# Versions before 2022.01 are dead links
deprecated=(ver < "2022.01"),
)
# Current versions include the viewer and trace viewer in
# one tar file. Before 2020.07, the trace viewer was a
# separate tar file (resource).
if key in trace_sha:
if (ver, arch) in trace_sha:
resource(
name="hpctraceviewer",
url=trace_url(*key),
sha256=trace_sha[key],
url=f"https://gitlab.com/hpctoolkit/hpcviewer/-/releases/{ver}/downloads/hpctraceviewer-linux.gtk.{arch}.tgz",
sha256=trace_sha[ver, arch],
placement="TRACE",
when="@{0}".format(key[0]),
when=f"@{ver}",
)
depends_on("java@11:", type=("build", "run"), when="@2021.0:")


@@ -60,9 +60,9 @@ class Hpx(CMakePackage, CudaPackage, ROCmPackage):
variant(
"max_cpu_count",
default="64",
default="auto",
description="Max number of OS-threads for HPX applications",
values=lambda x: isinstance(x, str) and x.isdigit(),
values=lambda x: isinstance(x, str) and (x.isdigit() or x == "auto"),
)
instrumentation_values = ("apex", "google_perftools", "papi", "valgrind")
@@ -224,6 +224,9 @@ def instrumentation_args(self):
def cmake_args(self):
spec, args = self.spec, []
format_max_cpu_count = lambda max_cpu_count: (
"" if max_cpu_count == "auto" else max_cpu_count
)
args += [
self.define("HPX_WITH_CXX{0}".format(spec.variants["cxxstd"].value), True),
self.define_from_variant("HPX_WITH_MALLOC", "malloc"),
@@ -237,7 +240,10 @@ def cmake_args(self):
self.define("HPX_WITH_NETWORKING", "networking=none" not in spec),
self.define("HPX_WITH_PARCELPORT_TCP", "networking=tcp" in spec),
self.define("HPX_WITH_PARCELPORT_MPI", "networking=mpi" in spec),
self.define_from_variant("HPX_WITH_MAX_CPU_COUNT", "max_cpu_count"),
self.define(
"HPX_WITH_MAX_CPU_COUNT",
format_max_cpu_count(spec.variants["max_cpu_count"].value),
),
self.define_from_variant("HPX_WITH_GENERIC_CONTEXT_COROUTINES", "generic_coroutines"),
self.define("BOOST_ROOT", spec["boost"].prefix),
self.define("HWLOC_ROOT", spec["hwloc"].prefix),

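The net effect of the two HPX changes above is that max_cpu_count now defaults to "auto", and format_max_cpu_count turns "auto" into an empty string so that HPX_WITH_MAX_CPU_COUNT is effectively left unset, presumably letting HPX fall back to its own default. A tiny illustrative check that mirrors the lambda in the hunk:

format_max_cpu_count = lambda max_cpu_count: "" if max_cpu_count == "auto" else max_cpu_count
assert format_max_cpu_count("auto") == ""    # no explicit cap passed to CMake
assert format_max_cpu_count("128") == "128"  # explicit OS-thread cap passed through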

@@ -14,9 +14,10 @@ class HwProbe(MakefilePackage):
license("LGPL-2.1-or-later OR BSD-4-Clause")
version("1.6", sha256="de048be6aef357d3142c9e2327d6f79d205a42aa3396ad381ed319115d1c9a22")
version("1.5", sha256="8bb7d6ff272c1412e26fcfd86e9df5c3e34e1584552404b930c281b8498b25ea")
version("1.4", sha256="90f3ea83bf641348b209e4a2a910f65d836ae7828c0be0f660236ea413bc46bb")
version("1.3", sha256="820ada4f16cb827e0990eb918e75423845fef54a863fdd88aa5bd23127354229")
def install(self, spec, prefix):
make("install", "prefix={0}".format(prefix))
make("install", f"prefix={prefix}")


@@ -6,6 +6,8 @@
import os
import platform
from llnl.util.symlink import readlink
from spack.package import *
@@ -94,7 +96,7 @@ def install(self, spec, prefix):
# The archive.bin file is quite fussy and doesn't work as a
# symlink.
if os.path.islink(archive):
targ = os.readlink(archive)
targ = readlink(archive)
os.unlink(archive)
copy(targ, archive)

Some files were not shown because too many files have changed in this diff.