Compare commits

..

201 Commits

Author SHA1 Message Date
Harmen Stoppels
25c74506a3 cmake.py: include transitive run deps in CMAKE_PREFIX_PATH
- Run deps often need to be located during the build
- In cases like python-venv and python, there is a direct build dep on
  both, but python-venv has a run dep on python, so we'd rather see
  python-venv in the path before python, in topological order.
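The ordering idea, as a minimal sketch over a toy dependency table (an illustration, not Spack's actual traversal code):

```python
# Toy sketch: take the direct build deps plus their transitive run deps,
# visiting each dependent before its own run deps, so that python-venv
# lands in CMAKE_PREFIX_PATH ahead of python.
def prefix_path_order(direct_build_deps, run_deps_of):
    order, seen = [], set()

    def visit(node):
        if node in seen:
            return
        seen.add(node)
        order.append(node)            # dependent first...
        for dep in run_deps_of(node):
            visit(dep)                # ...then its run deps

    for dep in direct_build_deps:
        visit(dep)
    return order

run_deps = {"python-venv": ["python"], "python": [], "cmake": []}
print(prefix_path_order(["python-venv", "python", "cmake"], run_deps.get))
# -> ['python-venv', 'python', 'cmake']
```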
2024-11-28 18:27:15 +01:00
Harmen Stoppels
e37e53cfe8 traverse: add MixedDepthVisitor, use in cmake (#47750)
This visitor accepts the sub-DAG of all nodes and unique edges that have
deptype X directly from the given roots, or deptype Y transitively from
any of the roots.
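A rough sketch of that traversal, with illustrative names rather than Spack's real API:

```python
# Follow one deptype a single hop from the roots, and another deptype to
# arbitrary depth; yield each unique edge exactly once.
def mixed_depth_edges(roots, edges_of, direct="build", transitive="run"):
    """edges_of(node) -> iterable of (deptype, child) pairs."""
    seen, frontier = set(), []
    for root in roots:                  # one hop of either deptype
        for deptype, child in edges_of(root):
            if deptype in (direct, transitive) and (root, child) not in seen:
                seen.add((root, child))
                yield root, child
                frontier.append(child)
    while frontier:                     # then only the transitive deptype
        node = frontier.pop()
        for deptype, child in edges_of(node):
            if deptype == transitive and (node, child) not in seen:
                seen.add((node, child))
                yield node, child
                frontier.append(child)

graph = {
    "root": [("build", "a"), ("link", "b"), ("run", "c")],
    "a": [("run", "d")], "c": [("run", "d")], "b": [], "d": [],
}
print(list(mixed_depth_edges(["root"], graph.get)))
```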
2024-11-28 17:48:48 +01:00
Andrey Perestoronin
cf31d20d4c add new packages (#47817) 2024-11-28 09:49:52 -05:00
Harmen Stoppels
b74db341c8 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2024-11-28 14:57:28 +01:00
Daryl W. Grunau
e88a3f6f85 eospac: version 6.5.12 (#47826)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2024-11-28 12:32:35 +01:00
Massimiliano Culpo
9bd7483e73 Add further C and C++ dependencies to packages (#47821) 2024-11-28 10:50:35 +01:00
Harmen Stoppels
04c76fab63 hip: hints for find_package llvm/clang (#47788)
LLVM can be a transitive link dependency of hip through gl's dependency mesa, which uses it for software rendering.

In this case, make sure llvm-amdgpu is found with find_package(LLVM) and
find_package(Clang) by setting LLVM_ROOT and Clang_ROOT.

That makes the patch adding HINTS to find_package redundant, so remove it.
It did not work anyway, because CMAKE_PREFIX_PATH has higher precedence
than HINTS.
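In package.py terms the fix amounts to roughly the following (a sketch, not the verbatim hip recipe):

```python
# Hedged sketch: point find_package(LLVM) and find_package(Clang) at
# llvm-amdgpu by setting the *_ROOT variables CMake consults first.
def cmake_args(self):
    args = []
    if "llvm-amdgpu" in self.spec:
        prefix = self.spec["llvm-amdgpu"].prefix
        args.append(self.define("LLVM_ROOT", prefix))
        args.append(self.define("Clang_ROOT", prefix))
    return args
```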
2024-11-28 10:23:09 +01:00
Adam J. Stewart
ecbf9fcacf py-scooby: add v0.10.0 (#47790) 2024-11-28 10:21:36 +01:00
Victor A. P. Magri
69fb594699 hypre: add a variant to allow using internal lapack functions (#47780) 2024-11-28 10:15:12 +01:00
Howard Pritchard
d28614151f nghttp2: add v1.64.0 (#47800)
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2024-11-28 10:12:41 +01:00
etiennemlb
f1d6af6c94 netlib-scalapack: fix for some clang derivative (cce/rocmcc) (#45434) 2024-11-28 09:25:33 +01:00
Adam J. Stewart
192821f361 py-river: mark numpy 2 compatibility (#47813) 2024-11-28 09:24:21 +01:00
Adam J. Stewart
18790ca397 py-pyvista: VTK 9.4 not yet supported (#47815) 2024-11-28 09:23:41 +01:00
BOUDAOUD34
c22d77a38e dbcsr: patch for resolving .mod file conflicts in ROCm by implementing USE, INTRINSIC (#46181)
Co-authored-by: U-PALLAS\boudaoud <boudaoud@pc44.pallas.cines.fr>
2024-11-28 09:20:48 +01:00
Tom Payerle
d82bdb3bf7 seacas: update recipe to find faodel (#40239)
Explicitly sets the CMake variables Faodel_INCLUDE_DIRS and Faodel_LIBRARY_DIRS when +faodel.
This seems to be needed for recent versions of seacas (seacas@2021-04-02:), but should be safe
to do for all versions.

For Faodel_INCLUDE_DIRS, it looks like Faodel has header files under $(Faodel_Prefix)/include/faodel,
but seacas is not including the "faodel" part in its #includes. So add both $(Faodel_Prefix)/include
and $(Faodel_Prefix)/include/faodel
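A sketch of what that change amounts to in the recipe (the exact plumbing in the real package.py may differ):

```python
# Hedged sketch: pass both include dirs so seacas finds the Faodel headers
# whether or not the "faodel" subdirectory appears in its #include lines.
if spec.satisfies("+faodel"):
    faodel = spec["faodel"].prefix
    args.append(self.define("Faodel_INCLUDE_DIRS",
                            [faodel.include, faodel.include.faodel]))
    args.append(self.define("Faodel_LIBRARY_DIRS", faodel.lib))
```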

Co-authored-by: payerle <payerle@users.noreply.github.com>
2024-11-28 09:17:44 +01:00
Matt Thompson
a042bdfe0b mapl: add hpcx-mpi (#47793) 2024-11-28 09:15:25 +01:00
Adam J. Stewart
60e3e645e8 py-joblib: add v1.4.2 (#47789) 2024-11-28 08:28:44 +01:00
Chris Marsh
51785437bc Patch to fix building gcc@14.2 on darwin. Fixes #45628 (#47830) 2024-11-27 20:58:18 -07:00
dependabot[bot]
2e8db0815d build(deps): bump docker/build-push-action from 6.9.0 to 6.10.0 (#47819)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.9.0 to 6.10.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](4f58ea7922...48aba3b46d)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-27 16:29:41 -07:00
George Malerbo
8a6428746f raylib: add v5.5 (#47708)
* Add version 5.5 in package.py

* Update package.py
2024-11-27 16:25:22 -07:00
Adam J. Stewart
6b9c099af8 py-keras: add v3.7.0 (#47816) 2024-11-27 16:12:47 -07:00
Derek Ryan Strong
30814fb4e0 Deprecate rsync releases before v3.2.5 (#47820) 2024-11-27 16:14:34 -06:00
Harmen Stoppels
3194be2e92 gcc-runtime: remove libz.so from libgfortran.so if present (#47812) 2024-11-27 22:32:37 +01:00
snehring
41be2f5899 ltr-retriever: changing directory layout (#38513) 2024-11-27 14:16:57 -07:00
kwryankrattiger
02af41ebb3 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2024-11-27 15:13:55 -06:00
snehring
9d33c89030 r-rsamtools: add -lz to Makevars (#38649) 2024-11-27 13:44:48 -07:00
Erik Heeren
51ab7bad3b julia: conflict for %gcc@12: support (#35931) 2024-11-27 04:31:44 -07:00
kwryankrattiger
0b094f2473 Docs: Reference 7z requirement on Windows (#35943) 2024-11-26 17:11:12 -05:00
Christoph Junghans
cd306d0bc6 all-library: add voronoi support and git version (#47798)
* all-library: add voronoi support and git version

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-26 14:56:22 -07:00
Dom Heinzeller
fdb9cf2412 Intel/oneapi compilers: correct version ranges for diab-disable flag (#47428)
* C/C++ flags should have been modified for all 2023.x.y versions, but
  the upper bound was too low
* Fortran flags should have been modified for all 2024.x.y versions, but
  likewise the upper bound was too low
2024-11-26 12:34:37 -07:00
etiennemlb
a546441d2e siesta: remove link args on a non-declared dependency (#46080) 2024-11-26 20:25:04 +01:00
IHuismann
141cdb6810 adol-c: fix libs property (#36614) 2024-11-26 17:01:18 +01:00
Brian Van Essen
f2ab74efe5 cray: add further versions of Cray packages. (#37733) 2024-11-26 16:59:53 +01:00
Martin Aumüller
38b838e405 openscenegraph: remove X11 dependencies for macos (#39037) 2024-11-26 16:59:10 +01:00
Mark Abraham
c037188b59 gromacs: announce deprecation policy and start to implement (#47804)
* gromacs: announce deprecation policy and start to implement

* Style it up

* [@spackbot] updating style on behalf of mabraham

* Bump versions used in CI

---------

Co-authored-by: mabraham <mabraham@users.noreply.github.com>
2024-11-26 05:54:07 -07:00
Mark Abraham
0835a3c5f2 gromacs: obtain SYCL from either ACpp or intel-oneapi-runtime (#47806) 2024-11-26 05:51:54 -07:00
Mark Abraham
38a2f9c2f2 gromacs: Improve HeFFTe dependency (#47805)
GROMACS supports HeFFTe with either a SYCL or a CUDA build, and requires
a matching HeFFTe build.
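In recipe terms this is the usual matching-variant pattern, roughly as below (the variant names are assumptions, not the exact package code):

```python
# Hedged sketch: require a HeFFTe built for the same backend GROMACS uses.
depends_on("heffte+cuda", when="+heffte+cuda")
depends_on("heffte+sycl", when="+heffte+sycl")
```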
2024-11-26 05:50:41 -07:00
Massimiliano Culpo
eecd4afe58 gromacs: fix the value used for the ITT directory (#47795) 2024-11-26 08:14:45 +01:00
Seth R. Johnson
83624551e0 ROOT: default to +aqua~x on macOS (#47792) 2024-11-25 14:27:38 -06:00
Victor A. P. Magri
741652caa1 caliper: add "tools" variant (#47779) 2024-11-25 18:26:53 +01:00
Mark Abraham
8e914308f0 gromacs: add itt variant (#47764)
Permit configuring GROMACS with support for mdrun to trace its timing
regions by calling the ITT API. This lets tools like VTune and unitrace
augment their analysis with GROMACS-specific annotations.
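The shape of such a change in the recipe, sketched below; the CMake option name is an assumption:

```python
# Hedged sketch of an opt-in instrumentation variant.
variant("itt", default=False,
        description="Instrument timing regions via the ITT API")

def cmake_args(self):
    # The real flag name may differ; GMX_USE_ITT is assumed here.
    return [self.define_from_variant("GMX_USE_ITT", "itt")]
```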
2024-11-25 16:12:55 +01:00
Mikael Simberg
3c220d0989 apex: add 2.7.0 (#47736) 2024-11-25 13:22:16 +01:00
Wouter Deconinck
8094fa1e2f py-gradio: add v5.1.0 (and add/update dependencies) (fix CVEs) (#47504)
* py-pdm-backend: add v2.4.3
* py-starlette: add v0.28.0, v0.32.0, v0.35.1, v0.36.3, v0.37.2, v0.41.2
* py-fastapi: add v0.110.2, v0.115.4
* py-pydantic-extra-types: add v2.10.0
* py-pydantic-settings: add v2.6.1
* py-python-multipart: add v0.0.17
* py-email-validator: add v2.2.0
2024-11-25 13:07:56 +01:00
Massimiliano Culpo
5c67051980 Add missing C/C++ dependencies (#47782) 2024-11-25 12:56:39 +01:00
John W. Parent
c01fb9a6d2 Add CMake 3.31 minor release (#47676) 2024-11-25 04:32:57 -07:00
Harmen Stoppels
bf12bb57e7 install_test: first look at builder, then package (#47735) 2024-11-25 11:53:28 +01:00
Wouter Deconinck
406c73ae11 py-boto*: add v1.34.162 (#47528)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:39:09 +01:00
Wouter Deconinck
3f50ccfcdd py-azure-*: updated versions (#47525) 2024-11-25 11:38:49 +01:00
Wouter Deconinck
9883a2144d py-quart: add v0.19.8 (#47508)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:38:22 +01:00
Wouter Deconinck
94815d2227 py-netifaces: add v0.10.9, v0.11.0 (#47451) 2024-11-25 11:37:41 +01:00
Wouter Deconinck
a15563f890 py-werkzeug: add v3.1.3 (and deprecate older versions) (#47507)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:28:01 +01:00
Wouter Deconinck
ac2ede8d2f py-pyzmq: add v25.1.2, v26.0.3, v26.1.1, v26.2.0 (switch to py-scikit-build-core) (#44493) 2024-11-25 11:00:00 +01:00
david-edwards-linaro
b256a7c50d linaro-forge: remove v21.1.3 (#47688) 2024-11-25 10:53:27 +01:00
Szabolcs Horvát
21e10d6d98 igraph: add v0.10.15 (#47692) 2024-11-25 10:51:24 +01:00
afzpatel
ed39967848 rocm-tensile: add 6.2.1 (#47702) 2024-11-25 10:40:21 +01:00
Alex Richert
eda0c6888e ip: add cmake version requirement for @5.1: (#47754) 2024-11-25 02:38:08 -07:00
pauleonix
66055f903c cuda: Add v12.6.3 (#47721) 2024-11-25 10:36:11 +01:00
Dave Keeshan
a1c57d86c3 libusb: add v1.0.23:1.0.27 (#47727) 2024-11-25 10:33:08 +01:00
Dave Keeshan
9da8dcae97 verible: add v0.0.3841 (#47729) 2024-11-25 10:30:48 +01:00
jflrichard
c93f223a73 postgis: add version 3.1.2 (#47743) 2024-11-25 10:24:03 +01:00
Wouter Deconinck
f1faf31735 build-containers: determine latest release tag and push that as latest (#47742) 2024-11-25 10:20:58 +01:00
Stephen Herbener
8957ef0df5 Updated version specs for bufr-query package. (#47752) 2024-11-25 10:14:16 +01:00
Veselin Dobrev
347ec87fc5 mfem: add logic for the C++ standard level when using rocPRIM (#47751) 2024-11-25 10:13:22 +01:00
Adam J. Stewart
cd8c46e54e py-ruff: add v0.8.0 (#47758) 2024-11-25 10:02:31 +01:00
Adam J. Stewart
75b03bc12f glib: add v2.82.2 (#47766) 2024-11-24 20:55:18 +01:00
Adam J. Stewart
58511a3352 py-pandas: correct Python version constraint (#47770) 2024-11-24 17:48:16 +01:00
Adam J. Stewart
325873a4c7 py-fsspec: add v2024.10.0 (#47778) 2024-11-24 15:42:30 +01:00
Adam J. Stewart
9156e4be04 awscli-v2: add v2.22.4 (#47765) 2024-11-24 15:42:06 +01:00
Adam J. Stewart
12d3abc736 py-pytz: add v2024.2 (#47777) 2024-11-24 15:40:45 +01:00
Adam J. Stewart
4208aa6291 py-torchvision: add Python 3.13 support (#47776) 2024-11-24 15:40:11 +01:00
Adam J. Stewart
0bad754e23 py-scikit-learn: add Python 3.13 support (#47775) 2024-11-24 15:39:36 +01:00
Adam J. Stewart
cde2620f41 py-safetensors: add v0.4.5 (#47774) 2024-11-24 15:38:05 +01:00
Adam J. Stewart
a35aa038b0 py-pystac: add support for Python 3.12+ (#47772) 2024-11-24 15:37:43 +01:00
Adam J. Stewart
150416919e py-pydantic-core: add v2.27.1 (#47771) 2024-11-24 15:37:06 +01:00
Adam J. Stewart
281c274e0b py-jupyter-packaging: add Python 3.13 support (#47769) 2024-11-24 15:31:31 +01:00
Adam J. Stewart
16e130ece1 py-cryptography: mark Python 3.13 support (#47768) 2024-11-24 15:31:08 +01:00
Adam J. Stewart
7586303fba py-ruamel-yaml-clib: add Python compatibility bounds (#47773) 2024-11-24 15:28:45 +01:00
Harmen Stoppels
6501880fbf py-node-env: add v1.9.1 (#47762) 2024-11-24 15:27:16 +01:00
Harmen Stoppels
c76098038c py-ipykernel: split forward and backward compat bounds (#47763) 2024-11-24 15:26:15 +01:00
Harmen Stoppels
124b616b27 add a few forward compat bounds with python (#47761) 2024-11-24 15:23:11 +01:00
Adam J. Stewart
1148c8f195 gobject-introspection: Python 3.12 still not supported (#47767) 2024-11-24 03:53:32 -07:00
Adam J. Stewart
c57452dd08 py-cffi: support Python 3.12+ (#47713) 2024-11-24 08:41:29 +01:00
Harmen Stoppels
a7e57c9a14 py-opt-einsum: add v3.4.0 (#47759) 2024-11-24 08:40:29 +01:00
Teague Sterling
85d83f9c26 duckdb: add v1.1.3, deprecate <v1.1.0 (#47653)
* duckdb: add v1.0.0, v0.10.3

* Adding issue reference

* Adding issue reference

* duckdb: add v1.1.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: add v1.1.1

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: Fix missing depends_on(unixodbc, when=+odbc)

* Adding duckdb variants, removing old variants, removing deprecated versions

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb+static_openssl: Add pkgconfig and zlib-api to link zlib when needed

* duckdb: add v1.1.3

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py for CVE-2024-41672 as suggested

* [@spackbot] updating style on behalf of teaguesterling

* duckdb: add CVE comment before deprecated versions

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-23 13:13:40 -07:00
finkandreas
39a081d7fd Kokkos complex_align variant, Trilinos+PETSc enforcement for Kokkos~complex_align (#47686) 2024-11-23 07:45:22 -07:00
Harmen Stoppels
71b65bb424 py-opt-einsum: missing forward compat bound for python (#47757) 2024-11-23 10:48:07 +01:00
Adam J. Stewart
3dcbd118df py-cython: support Python 3.12+ (#47714)
and add various other compat bounds on dependents
2024-11-22 22:20:41 +01:00
Harmen Stoppels
5dacb774f6 itk: use vendored googletest (#47687)
External googletest breaks dependents because they end up with
ITK_LIBRARIES set to `GTest::GTest;GTest::Main`, which then ends up
literally in a nonsensical link line `-lGTest::GtTest`.

The vendored googletest produces a cmake config file where
`ITKGoogleTest_LIBRARIES` is empty.
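Assuming ITK exposes a switch for its bundled googletest (the option name below is an assumption), the change is roughly:

```python
# Hedged sketch: prefer ITK's vendored googletest over an external one.
args.append(self.define("ITK_USE_SYSTEM_GOOGLETEST", False))
```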
2024-11-22 18:41:23 +01:00
Harmen Stoppels
cb3d6549c9 traverse.py: ensure topo order is bfs for trees (#47720) 2024-11-22 15:04:19 +01:00
Mark Abraham
559c2f1eb9 gromacs: oneapi does not always require gcc (#47679)
* gromacs: oneapi does not always require gcc

* Support intel_provided_gcc only with %intel classic compiler

Require gcc only when needed with %intel

* New approach depending on gcc-runtime directly

* Update var/spack/repos/builtin/packages/gromacs/package.py

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>

---------

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>
2024-11-22 06:33:30 -07:00
Harmen Stoppels
ed1dbea77b eigen: self.builder.build_directory -> self.build_directory (#47728) 2024-11-22 07:20:38 +01:00
Seth R. Johnson
6ebafe4631 vecgeom: add v1.2.10 and delete unused, deprecated versions (#47725)
* vecgeom: add v1.2.10

* Remove unused+deprecated versions of vecgeom

* Deprecate older v1.2.x  versions

* [@spackbot] updating style on behalf of sethrj
2024-11-21 17:03:09 -07:00
Harmen Stoppels
7f0bb7147d README.md update old tutorial URL (#47718) 2024-11-21 16:46:46 +01:00
Satish Balay
f41b38e93d xsdk: add v1.1.0 (#47635)
xsdk: exclude pflotran, alquimia, exago

heffte: ~fftw when=+hip

dealii: ~sundials ~opencascade ~vtk ~taskflow
2024-11-21 08:08:27 -06:00
Massimiliano Culpo
5fd12b7bea Add further missing C, C++ dependencies to packages (#47662) 2024-11-21 14:49:12 +01:00
Mikael Simberg
fe746bdebb aws-ofi-nccl: Add 1.8.1 to 1.13.0 (#47717) 2024-11-21 05:37:57 -07:00
eugeneswalker
453af4b9f7 hdf5-vol-cache %cce: add -Wno-error=incompatible-function-pointer-types (#47698) 2024-11-20 14:56:19 -08:00
eugeneswalker
29cf1559cc netlib-scalapack %cce: add cflags -Wno-error=implicit-function-declaration (#47701) 2024-11-20 15:09:14 -07:00
eugeneswalker
a9b3e1670b mpifileutils%cce: append cflags -Wno-error=implicit-function-declaration (#47700) 2024-11-20 14:19:05 -07:00
kwryankrattiger
4f9aa6004b visit: add v3.4.0, v3.4.1 (#47161)
* Visit: Add new versions 3.4.0 and 3.4.1

* Adios2: Restrict python; 3.11 does not work for older Adios2

* VisIt: Set the VTK_VERSION for @3.4:

Older versions of VTK used the VTK_{MAJOR, MINOR}_VERSION variables for
VTK detection. VisIt >= 3.4 uses the full string VTK_VERSION.

* CI: Don't build llvm-amdgpu for non-HIP stack

* VisIt: v3.4.1 handles newer Adios2 correctly

* Visit: Add missing links in HDF5, set correct VTK version configuration parameter

* VisIt: Add py-pip requirement and patch visit with configuration changes

* HDF5 symlinks move when inside of callback

* VisIt ninja install fails with python module. Using make does not

* VisIt 3.4 has a high minimum cmake requirement

* HDF5: Early return when not mpi for mpi symlinks

* HDF5: Use platform agnostic method for creating legacy compatible MPI symlinks

* Fix VISIT_VTK_VERSION handling for 8.2.1a hack
2024-11-20 18:37:56 +01:00
Harmen Stoppels
aa2c18e4df spack style: import-check -> import, fix bugs, exclude spack.pkg (#47690) 2024-11-20 16:15:28 +01:00
dependabot[bot]
0ff3e86315 build(deps): bump codecov/codecov-action from 5.0.2 to 5.0.3 (#47683)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 5.0.2 to 5.0.3.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5c47607acb...05f5a9cfad)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:40:01 -06:00
dependabot[bot]
df208c1095 build(deps): bump docker/metadata-action from 5.5.1 to 5.6.1 (#47682)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.5.1 to 5.6.1.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](8e5442c4ef...369eb591f4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:39:45 -06:00
Chris Marsh
853f70edc8 cgal: update depends versions for 6.0.1 (#47516)
* This extends PR #47285 to properly include some of the required version constraints of cgal 6, including the C++ standard. It also adds the new no-gmp backend as a variant.

* fix style

* disable cgal@6 +demo variant as the demos require qt6, which is not in spack

* disable the gmp variant until clarity on how it's supposed to work is provided; bound shared and header_only variants to relevant versions

* Fix missing msvc compiler limit, fix variant left in

* Add more comments. Better describe the gmp variant. Remove testing code

* fix style
2024-11-19 16:43:21 -07:00
Paul R. C. Kent
50970f866e Add llvm v19.1.4 (#47681) 2024-11-19 16:03:28 -07:00
Wouter Deconinck
8821300985 py-gevent: add v24.2.1, v24.10.3, v24.11.1 (#47646)
* py-gevent: add v24.2.1, v24.10.3
* py-gevent: add v24.11.1
2024-11-19 12:14:52 -08:00
AMD Toolchain Support
adc8e1d996 Restrict disabling dynamic thread scaling to version 3.1 only (#47673)
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-11-19 12:12:21 -08:00
Andrey Perestoronin
1e0aac6ac3 Add new 2025.0.1 Oneapi patch packages (#47678) 2024-11-19 11:38:42 -07:00
Harmen Stoppels
99e2313d81 openturns: fix deps (#47669) 2024-11-19 18:13:47 +01:00
Mark Abraham
22690a7576 Make oneAPI library-with-sdk specialize library class (#47632) 2024-11-19 12:12:10 -05:00
Harmen Stoppels
5325cfe865 systemd: symlink the internal libraries so they are found in rpath (#47667) 2024-11-19 15:28:49 +01:00
Harmen Stoppels
5333925dd7 sensei: fix install rpath for python extension (#47670) 2024-11-19 15:23:54 +01:00
Massimiliano Culpo
2db99e1ff6 gmp: fix cxx dependency, remove dependency on fortran (#47671) 2024-11-19 15:19:08 +01:00
Massimiliano Culpo
68aa712a3e solver: add a timeout handle for users (#47661)
This PR adds a configuration setting to allow setting time limits for concretization.

For backward compatibility, the default is to set no time limit.
2024-11-19 15:00:26 +01:00
Mikael Simberg
2e71bc640c pika: Add 0.30.1 (#47666) 2024-11-19 05:44:41 -07:00
Dom Heinzeller
661f3621a7 netcdf-cxx: add a maintainer (#47665) 2024-11-19 05:28:38 -07:00
Massimiliano Culpo
f182032337 Restore message when concretizing in parallel (#47663)
It was lost in #44843
2024-11-19 12:28:14 +00:00
teddy
066666b7b1 py-non-regression-test-tools: add v1.1.6 & remove v1.1.2 (tag removed) (#47622)
* py-non-regression-test-tools: add v1.1.6  & remove v1.1.2 (tag removed)
* Update var/spack/repos/builtin/packages/py-non-regression-test-tools/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-19 04:38:33 -07:00
Richard Berger
73316c3e28 cached_cmake: mpifc is not always defined (#46861)
* cached_cmake: mpifc is not always defined (see the sketch below)
* mpich: only depend on fortran when +fortran
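A sketch of the guard, assuming the usual `spec["mpi"].mpifc` attribute convention (the real cached_cmake code may differ):

```python
# Hedged sketch: only emit the Fortran wrapper entry when it exists.
mpifc = getattr(spec["mpi"], "mpifc", None)
if mpifc:
    entries.append(cmake_cache_path("MPI_Fortran_COMPILER", mpifc))
```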
2024-11-18 14:49:56 -08:00
afzpatel
c8e4ae08da eigen: enable ROCm support and add master version (#47332)
* eigen: enable ROCm support and add master version
* change boost dependency to only for tests
2024-11-18 14:41:02 -08:00
Tom Scogland
44225caade llvm: fix sysroot and build on darwin (#47336)
The default build of clang on darwin couldn't actually build anything,
because it lacked a built-in sysroot. Several compilation errors finding
the system libc++ also cropped up, much like those in GCC; these have
been fixed.
2024-11-18 14:39:16 -08:00
Lydéric Debusschère
8d325d3e30 yambo: add v5.2.3, v5.2.4 (#47350)
* packages: Update 'yambo'
* add call to 'resource' method to download Ydriver and iotk during fetch instead of during build
* air-gapped installation is possible since version 5.2.1
* add versions 5.2.3 and 5.2.4
* remove some nonexistent configure options for versions "@5:"
* add a sanity_check on 'bin/yambo'
2024-11-18 14:36:16 -08:00
Mosè Giordano
d0fd112006 grep: add executables attribute and determine_version method (#47438) 2024-11-18 14:30:46 -08:00
Wouter Deconinck
50f43ca71d py-gitpython: add v3.1.43 (fix CVEs) (#47444)
* py-gitpython: add v3.1.43
2024-11-18 14:24:20 -08:00
Vanessasaurus
2546fb6afa Automated deployment to update package flux-sched 2024-11-05 (#47449)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-11-18 14:23:35 -08:00
Vanessasaurus
10f6863d91 Automated deployment to update package flux-security 2024-11-05 (#47450)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-11-18 14:21:42 -08:00
afzpatel
63ea528606 mivisionx, migraphx and rocal: add and fix tests (#47453)
* fix mivisionx test, add migraphx build-time test and add rocal unit test
* fix miopen-hip linking issue
* remove setup_run_environment from roctracer-dev
* change patch file
2024-11-18 14:09:18 -08:00
snehring
89d2b9553d tb-lmto: new package (#47482)
* tb-lmto: adding new package tb-lmto
* tb-lmto: update copyright year
2024-11-18 14:01:59 -08:00
Dom Heinzeller
278326b4d9 Add py-pyhdf@0.11.4 and add conflict for py-pyhdf@0.10.4 with numpy@1.25: (#47656) 2024-11-18 13:53:38 -08:00
alvaro-sch
43c1a5e0ec orca: add v6.0.1, avx2-6.0.1 (#47489)
* orca: add 6.0.1 versions
* orca: checksum fix for avx2-6.0.1
* orca: fix version string for avx2-6.0.1
* orca: checksum fix for 6.0.1
2024-11-18 13:34:38 -08:00
Paul Gessinger
8feb506b3a geomodel: Fix dependencies (#47437)
* geomodel: Add dependency on `hdf5` for `+pythia`, require `hdf5+cxx`

* fix visualization dependencies

* geomodel: Add soqt dependency

* update dependency on soqt to drop explicit qt variant
2024-11-18 15:31:53 -06:00
Wouter Deconinck
627544191a py-pymongo: add v4.10.1 (fix CVE) (#47501)
* py-pymongo: add v4.10.1
* py-pymongo: fix copyright header spacing
* py-hatch-requirements-txt: add v0.4.1

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-18 12:53:32 -08:00
Wouter Deconinck
cf672ea8af py-waitress: add v3.0.1 (#47509) 2024-11-18 12:50:49 -08:00
green-br
2c4ac02adf Add option to not build GUI for WxWidgets. (#47526) 2024-11-18 12:43:23 -08:00
Niclas Jansson
7f76490b31 neko: add v0.8.1, v0.9.0 and fix package (#47558) 2024-11-18 12:38:08 -08:00
Thomas-Ulrich
46e4c1fd30 seissol: add versions, conflict (#47562)
* add a couple of seissol versions
2024-11-18 12:34:59 -08:00
Rémi Lacroix
85c5533e62 hpctoolkit: Update the minimum version for Python dependency (#47564)
New versions of HPCToolkit support Python from version 3.8.
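In recipe terms that is just a version bound, roughly as below (the `when=` range is an assumption):

```python
# Hedged sketch: newer HPCToolkit accepts any Python >= 3.8.
depends_on("python@3.8:", when="@2024.01.1:")
```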
2024-11-18 12:31:20 -08:00
Vicente Bolea
c47cafd11a diy: apply smoke_test patch in 3.6 only (#47572) 2024-11-18 12:26:45 -08:00
jmuddnv
8e33cc158b Changes for NVIDIA HPC SDK 24.11 (#47592) 2024-11-18 12:24:31 -08:00
Adam J. Stewart
f07173e5ee py-torchmetrics: add v1.6.0 (#47580) 2024-11-18 12:22:57 -08:00
Ken Raffenetti
118f5d2683 mpich: Remove incorrect dependency (#47586)
The gni libfabric provider works on some Cray systems, but not all. For
example, Slingshot-based machines use a different libfabric provider
(cxi). Therefore libfabric/gni should not be a dependency when using
Cray PMI.
2024-11-18 12:20:10 -08:00
Louise Spellacy
8fb2abc3cd linaro-forge: added version 24.1 (#47594) 2024-11-18 12:16:12 -08:00
afzpatel
3bcb8a9236 rocm-dbgapi: add pciutils dependency (#47605)
* add pciutils dependency
* add maintainer
2024-11-18 12:12:55 -08:00
Adam J. Stewart
a6fdd7608f py-huggingface-hub: add v0.26.2, hf_transfer (#47600) 2024-11-18 11:51:16 -08:00
Cody Balos
1ffd7125a6 sundials: set CUDAToolkit_ROOT (#47601)
* sundials: specify CUDAToolkit_ROOT
2024-11-18 11:48:16 -08:00
Jerome Soumagne
d1166fd316 mercury: add v2.4.0 (#47606) 2024-11-18 11:36:23 -08:00
Weiqun Zhang
b8eba1c677 amrex: add new variant fft for >= 24.11 (#47611) 2024-11-18 11:35:00 -08:00
Pranav Sivaraman
e3c0515076 tinycbor: new package (#47623) 2024-11-18 11:19:14 -08:00
Pranav Sivaraman
97406f241c tomlplusplus: new package (#47624)
* tomlplusplus: new package
* tomlplusplus: remove period
2024-11-18 11:15:30 -08:00
Wouter Deconinck
e1dfbbf611 py-greenlet: add v3.0.3, v3.1.1 (#47647)
* py-greenlet: add v3.0.3, v3.1.1
* py-greenlet: depends_on py-setuptools@40.8.0:
2024-11-18 12:13:41 -07:00
Paul
52147348c7 Added Go v1.23.3 (#47633) 2024-11-18 10:52:17 -08:00
Wouter Deconinck
aeb0ab6acf pocl: add v3.1, v4.0, v5.0, v6.0 (#47643) 2024-11-18 10:43:26 -08:00
Wouter Deconinck
6cd26b7603 clinfo: add v3.0.23.01.25 (#47644) 2024-11-18 10:41:32 -08:00
wspear
1c75d07f05 Fix self.spec reference (#47610)
@teaguesterling
2024-11-18 10:37:53 -08:00
Wouter Deconinck
15b3ff2a0a harfbuzz: add v10.1.0 (#47645) 2024-11-18 10:29:29 -08:00
Teague Sterling
e9f94d9bf2 ollama: add v0.4.2 (#47654)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-11-18 10:14:16 -08:00
Wouter Deconinck
299324c7ca dbus: add v1.15.12 (#47655) 2024-11-18 10:10:14 -08:00
Stephen Nicholas Swatman
dfab174f31 benchmark: add version 1.9.0 (#47658)
This commit adds Google Benchmark v1.9.0.
2024-11-18 07:04:52 -06:00
Stephen Nicholas Swatman
a86953fcb1 acts: add version 37.4.0 (#47657)
This commit adds v37.4.0 of the ACTS project; no new versions of the
dependencies were released.
2024-11-18 06:58:59 -06:00
Massimiliano Culpo
5f262eb5d3 Add further missing C, C++ dependencies to packages (#47649) 2024-11-18 09:39:47 +01:00
Wouter Deconinck
00f179ee6d root: add v6.32.08 (#47641) 2024-11-17 18:39:17 -06:00
Massimiliano Culpo
da4f7c2952 Add an audit to prevent using the name "all" in packages (#47651)
Packages cannot be named like that, since we use "all" to indicate
default settings under the "packages" section of the configuration.
2024-11-17 13:32:24 -07:00
Harmen Stoppels
fdedb6f95d style.py: add import-check for missing & redundant imports (#47619) 2024-11-17 09:18:48 +01:00
Massimiliano Culpo
067fefc46a Set "generic" uarch granularity for a few pipelines (#47640) 2024-11-17 09:07:42 +01:00
Massimiliano Culpo
42c9961bbe Added a few missing language deps to packages (#47639) 2024-11-17 09:07:08 +01:00
Wouter Deconinck
fe2bf4c0f9 pixman: add missing MesonPackage (#47607) 2024-11-17 09:03:15 +01:00
Harmen Stoppels
4d3b85c4d4 spack.package / builtin repo: fix exports/imports (#47617)
Add various missing imports in packages.
Remove redundant imports.
Export NoLibrariesError, NoHeadersError, which_string in spack.package.
2024-11-17 09:02:04 +01:00
Satish Balay
f05cbfbf44 xsdk: dealii has changes to variant defaults, update xsdk accordingly (#47602) 2024-11-16 16:42:16 -06:00
Wouter Deconinck
448049ccfc qt-tools: new package (#45602)
* qt-tools: new pkg with +designer to build Qt Designer for QWT

* qt-tools: fix style

* qt-tools: fix unused variable

* qt-tools: rm setup_run_environments (now in qt-base)

* qt-tools: add myself as maintainer

* qt-tools: add variant assistant; use commits with submodule

* qt-base: define QtPackage.get_git
2024-11-16 09:09:41 -06:00
etiennemlb
e56057fd79 gobject-introspection: Do not write to user home (#47621) 2024-11-16 11:11:52 +01:00
Harmen Stoppels
26d80e7bc5 py-blosc2: use external libblosc2 (#47566) 2024-11-16 09:43:54 +01:00
Dom Heinzeller
60eb0e9c80 Bug fix in py-scipy for versions 1.8.0 to 1.14.0 that surfaces with latest Clang and Intel LLVM compilers (#47620) 2024-11-16 06:56:25 +01:00
Thomas Bouvier
7443a3b572 py-wandb: add v0.16.6 (#43891)
* py-wandb: add version v0.16.6

* fix: typo

* py-wandb: py-click when @0.15.5:, py-pathtools when @:0.15

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 21:17:51 -07:00
dependabot[bot]
a5ba4f8d91 build(deps): bump codecov/codecov-action from 4.6.0 to 5.0.2 (#47631)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.6.0 to 5.0.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](b9fd7d16f6...5c47607acb)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-15 21:41:14 -06:00
Matthias Wolf
6ef0f495a9 py-libsonata: add v0.1.29 (#47466)
* py-libsonata: new version.

* Fix Python version dependency.

* py-libsonata: fix typo

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 20:27:41 -07:00
Matthias Wolf
e91b8c291a py-numpy-quaternion: add v2024.0.3 (#47469)
* py-numpy-quaternion: add new version.

* Update dependency version bounds

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix build dependencies.

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 21:16:16 -06:00
Wouter Deconinck
6662046aca armadillo: add v14.0.3 (#47634) 2024-11-15 19:57:48 -07:00
Matthieu Dorier
db83c62fb1 arrow: add v18.0.0 (#47494)
* arrow: added version 18.0.0

This PR adds version 18.0.0 to the arrow package.

* arrow: updated dependency on llvm
2024-11-15 20:54:43 -06:00
teddy
d4adfda385 costo: add v0.0.8 (#47625)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-15 18:10:52 -08:00
Matt Thompson
e8a8e2d98b mapl: add 2.40.3.1 (#47627)
* mapl: add 2.40.3.1
* Relax ESMF requirement
2024-11-15 18:09:22 -08:00
Paolo
55c770c556 Add ACfL 24.10.1 (#47616) 2024-11-15 18:05:38 -08:00
Thomas Gruber
33a796801c Likwid: add version 5.4.0 (#47630) 2024-11-15 18:04:21 -08:00
Seth R. Johnson
b90ac6441c celeritas: remove ancient versions and add CUDA package dependency (#47629)
* celeritas: remove deprecated versions through 0.3

* celeritas: deprecate old versions

* celeritas: add c++20 option

* Propagate vecgeom CUDA requirements

* Remove outdated conflicts and format it
2024-11-15 17:27:22 -07:00
dependabot[bot]
68b69aa9e3 build(deps): bump sphinx-rtd-theme in /lib/spack/docs (#47588)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 3.0.1 to 3.0.2.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/3.0.1...3.0.2)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-15 17:21:42 -06:00
Veselin Dobrev
ac0ed2c4cc [mfem] Add a patch for MFEM v4.7 that adds support for SUNDIALS v7 (#47591) 2024-11-15 08:50:44 -06:00
Harmen Stoppels
66a93b5433 Add missing llnl.* imports (#47618) 2024-11-15 15:49:25 +01:00
Harmen Stoppels
b7993317ea Improve type hints for package API (#47576)
by disentangling `package_base`, `builder` and `directives`.
2024-11-15 09:13:10 +01:00
etiennemlb
66622ec4d0 py-easybuild-framework: add python forward compat bound (#47597) 2024-11-14 23:47:23 -07:00
Pranav Sivaraman
9b2cd1b208 yyjson: new package (#47563)
* yyjson: new package

* [@spackbot] updating style on behalf of pranav-sivaraman

---------

Co-authored-by: pranav-sivaraman <pranav-sivaraman@users.noreply.github.com>
2024-11-14 11:01:21 -08:00
Dominic Hofer
9888683a21 eccodes: add v2.38.0 (#47581)
* eccodes: Add 2.38.0

* Update var/spack/repos/builtin/packages/eccodes/package.py
2024-11-14 10:57:59 -08:00
Massimiliano Culpo
fb46c7a72d Rework spack.database.InstallStatuses into a flag (#47321) 2024-11-14 15:43:31 +01:00
Massimiliano Culpo
c0196cde39 Remove support for PGI compilers (#47195) 2024-11-14 09:17:41 +01:00
Todd Gamblin
d091172d67 Spec: prefer a splice-specific method to __len__ (#47585)
Automatic splicing saw `Spec` grow a `__len__` method, but it's only used
in one place and it's not clear the semantics are useful elsewhere. It also
runs the risk of Specs one day being confused for other types of containers.

Rather than introduce a new function for one algorithm, let's use a more
specific method in the splice code.

- [x] Use topological ordering in `_resolve_automatic_splices` instead of 
      sorting by node count
- [x] delete `Spec.__len__()` and `Spec.__bool__()`

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-13 23:20:03 -08:00
psakievich
ab51369087 Update tutorial version (#47593) 2024-11-14 08:15:11 +01:00
Matthieu Dorier
1cea82b629 xfsprogs: fix dependency on liburcu (#47582)
* xfsprogs: fix dependency on liburcu

* xfsprogs: fix install rules.d

* xfsprogs: edited xfsprogs requirement on liburcu

* xfsprogs: many more versions
2024-11-13 16:17:20 -06:00
H. Joe Lee
2abb711337 hermes-shm: remove duplicate line (#47575)
close #10
2024-11-13 09:29:25 -07:00
Todd Gamblin
6f948eb847 spack spec: simplify and unify output (#47574)
`spack spec` output has looked like this for a while:

```console
> spack spec /v5fn6xo /wd2p2v7
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...

Input spec
--------------------------------
 -   /wd2p2v7

Concretized
--------------------------------
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
```

But the input spec is right there on the CLI, and it doesn't add anything to the output.
Also, since #44843, specs concretized in the CLI line can be unified, so it makes sense
to display them as we did in #44489 -- as one multi-root tree instead of as multiple
single-root trees.

With this PR, concretize output now looks like this:

```console
> spack spec /v5fn6xo /wd2p2v7
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^gmake@4.4.1%apple-clang@16.0.0~guile build_system=generic arch=darwin-sequoia-m1
[+]      ^perl@5.40.0%apple-clang@16.0.0+cpanm+opcode+open+shared+threads build_system=generic arch=darwin-sequoia-m1
[+]          ^berkeley-db@18.1.40%apple-clang@16.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-sequoia-m1
[+]          ^bzip2@1.0.8%apple-clang@16.0.0~debug~pic+shared build_system=generic arch=darwin-sequoia-m1
[+]              ^diffutils@3.10%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]                  ^libiconv@1.17%apple-clang@16.0.0 build_system=autotools libs=shared,static arch=darwin-sequoia-m1
[+]          ^gdbm@1.23%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]              ^readline@8.2%apple-clang@16.0.0 build_system=autotools patches=bbf97f1 arch=darwin-sequoia-m1
[+]                  ^ncurses@6.5%apple-clang@16.0.0~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=darwin-sequoia-m1
[+]                      ^pkgconf@2.2.0%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]      ^zlib-ng@2.2.1%apple-clang@16.0.0+compat+new_strategies+opt+pic+shared build_system=autotools arch=darwin-sequoia-m1
[+]          ^gnuconfig@2022-09-17%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^py-setuptools@69.2.0%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[-]      ^py-wheel@0.41.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...
```

With no input spec displayed -- just the concretization output shown as one consolidated
tree and multiple roots.

- [x] remove "Input Spec" section and "Concretized" header from `spack spec` output
- [x] print concretized specs as one BFS tree instead of multiple

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-11-13 08:21:16 -07:00
Alec Scott
93bf0634f3 nlopt: reformat for best practices (#47340) 2024-11-13 08:20:56 -07:00
Luca Heltai
badb3cedcd dealii: add v9.6.0 (#45554)
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-11-13 15:34:27 +01:00
Harmen Stoppels
be918817d6 bump version to 0.24.0.dev0 (#47578) 2024-11-13 13:05:14 +01:00
Harmen Stoppels
41d9f687f6 missing and redundant imports (#47577) 2024-11-13 13:03:09 +01:00
554 changed files with 6180 additions and 5411 deletions

View File

@@ -57,7 +57,13 @@ jobs:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
- name: Determine latest release tag
id: latest
run: |
git fetch --quiet --tags
echo "tag=$(git tag --list --sort=-v:refname | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1)" | tee -a $GITHUB_OUTPUT
- uses: docker/metadata-action@369eb591f429131d6889c46b94e711f089e6ca96
id: docker_meta
with:
images: |
@@ -71,6 +77,7 @@ jobs:
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
type=raw,value=latest,enable=${{ github.ref == format('refs/tags/{0}', steps.latest.outputs.tag) }}
- name: Generate the Dockerfile
env:
@@ -113,7 +120,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@4f58ea79222b3b9dc2c8bbdd6debcef730109a75
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -29,6 +29,6 @@ jobs:
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@b9fd7d16f6d7d1b5d2bec1a2887e65ceed900238
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
with:
verbose: true

View File

@@ -52,13 +52,7 @@ jobs:
# Needed for unit tests
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
cmake bison libbison-dev subversion
# On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
- name: Set up Homebrew
id: set-up-homebrew
uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
- name: Install kcov with brew
run: "brew install kcov"
cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov
@@ -105,13 +99,7 @@ jobs:
run: |
sudo apt-get -y update
# Needed for shell tests
sudo apt-get install -y coreutils csh zsh tcsh fish dash bash subversion
# On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
- name: Set up Homebrew
id: set-up-homebrew
uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
- name: Install kcov with brew
run: "brew install kcov"
sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml] pytest-xdist
@@ -146,7 +134,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory '*'
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test

View File

@@ -74,7 +74,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory '*'
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test

View File

@@ -1,395 +1,3 @@
# v0.23.1 (2025-02-19)
## Bugfixes
- Fix a correctness issue of `ArchSpec.intersects` (#48741)
- Make `extra_attributes` order independent in Spec hashing (#48615, #48854)
- Fix issue where system proxy settings were not respected in OCI build caches (#48783)
- Fix an issue where the `--test` concretizer flag was not forwarded correctly (#48417)
- Fix an issue where `codesign` and `install_name_tool` would not preserve hardlinks on
Darwin (#47808)
- Fix an issue on Darwin where codesign would run on unmodified binaries (#48568)
- Patch configure scripts generated with libtool < 2.5.4, to avoid redundant flags when
creating shared libraries on Darwin (#48671)
- Fix issue related to mirror URL paths on Windows (#47898)
- Ensure proper UTF-8 encoding/decoding in logging (#48005)
- Fix issues related to `filter_file` (#48038, #48108)
- Fix issue related to creating bootstrap source mirrors (#48235)
- Fix issue where command line config arguments were not always top level (#48255)
- Fix an incorrect typehint of `concretized()` (#48504)
- Improve mention of next Spack version in warning (#47887)
- Tests: fix forward compatibility with Python 3.13 (#48209)
- Docs: encourage use of `--oci-username-variable` and `--oci-password-variable` (#48189)
- Docs: ensure Getting Started has bootstrap list output in correct place (#48281)
- CI: allow GitHub actions to run on forks of Spack with different project name (#48041)
- CI: make unit tests work on Ubuntu 24.04 (#48151)
- CI: re-enable cray pipelines (#47697)
## Package updates
- `qt-base`: fix rpath for dependents (#47424)
- `gdk-pixbuf`: fix outdated URL (#47825)
# v0.23.0 (2024-11-13)
`v0.23.0` is a major feature release.
We are planning to make this the last major release before Spack `v1.0`
in June 2025. Alongside `v0.23`, we will be making pre-releases (alpha,
beta, etc.) of `v1.0`, and we encourage users to try them and send us
feedback, either on GitHub or on Slack. You can track the road to
`v1.0` here:
* https://github.com/spack/spack/releases
* https://github.com/spack/spack/discussions/30634
## Features in this Release
1. **Language virtuals**
Your packages can now explicitly depend on the languages they require.
Historically, Spack has considered C, C++, and Fortran compiler
dependencies to be implicit. In `v0.23`, you should ensure that
new packages add relevant C, C++, and Fortran dependencies like this:
```python
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build")
```
We encourage you to add these annotations to your packages now, to prepare
for Spack `v1.0.0`. In `v1.0.0`, these annotations will be necessary for
your package to use C, C++, and Fortran compilers. Note that you should
*not* add language dependencies to packages that don't need them, e.g.,
pure python packages.
We have already auto-generated these dependencies for packages in the
`builtin` repository (see #45217), based on the types of source files
present in each package's source code. We *may* have added too many or too
few language dependencies, so please submit pull requests to correct
packages if you find that the language dependencies are incorrect.
Note that we have also backported support for these dependencies to
`v0.21.3` and `v0.22.2`, to make all of them forward-compatible with
`v0.23`. This should allow you to move easily between older and newer Spack
releases without breaking your packages.
2. **Spec splicing**
We are working to make binary installation more seamless in Spack. `v0.23`
introduces "splicing", which allows users to deploy binaries using local,
optimized versions of a binary interface, even if they were not built with
that interface. For example, this would allow you to build binaries in the
cloud using `mpich` and install them on a system using a local, optimized
version of `mvapich2` *without rebuilding*. Spack preserves full provenance
for the installed packages and knows that they were built one way but
deployed another.
Our intent is to leverage this across many key HPC binary packages,
e.g. MPI, CUDA, ROCm, and libfabric.
Fundamentally, splicing allows Spack to redeploy an existing spec with
different dependencies than how it was built. There are two interfaces to
splicing.
a. Explicit Splicing
#39136 introduced the explicit splicing interface. In the
concretizer config, you can specify a target spec and a replacement
by hash.
```yaml
concretizer:
splice:
explicit:
- target: mpi
replacement: mpich/abcdef
```
Here, every installation that would normally use the target spec will
instead use its replacement. Above, any spec using *any* `mpi` will be
spliced to depend on the specific `mpich` installation requested. This
*can* go wrong if you try to replace something built with, e.g.,
`openmpi` with `mpich`, and it is on the user to ensure ABI
compatibility between target and replacement specs. This currently
requires some expertise to use, but it will allow users to reuse the
binaries they create across more machines and environments.
b. Automatic Splicing (experimental)
#46729 introduced automatic splicing. In the concretizer config, enable
automatic splicing:
```yaml
concretizer:
splice:
automatic: true
```
or run:
```console
spack config add concretizer:splice:automatic:true
```
The concretizer will select splices for ABI compatibility to maximize
package reuse. Packages can denote ABI compatibility using the
`can_splice` directive. No packages in Spack yet use this directive, so
if you want to use this feature you will need to add `can_splice`
annotations to your packages. We are working on ways to add more ABI
compatibility information to the Spack package repository, and this
directive may change in the future.
See the documentation for more details:
* https://spack.readthedocs.io/en/latest/build_settings.html#splicing
* https://spack.readthedocs.io/en/latest/packaging_guide.html#specifying-abi-compatibility
3. Broader variant propagation
Since #42931, you can specify propagated variants like `hdf5
build_type==RelWithDebInfo` or `trilinos ++openmp` to propagate a variant
to all dependencies for which it is relevant. This is valid *even* if the
variant does not exist on the package or its dependencies.
See https://spack.readthedocs.io/en/latest/basic_usage.html#variants.
4. Query specs by namespace
#45416 allows a package's namespace (indicating the repository it came from)
to be treated like a variant. You can request packages from particular repos
like this:
```console
spack find zlib namespace=builtin
spack find zlib namespace=myrepo
```
Previously, the spec syntax only allowed namespaces to be prefixes of spec
names, e.g. `builtin.zlib`. The previous syntax still works.
5. `spack spec` respects environment settings and `unify:true`
`spack spec` did not previously respect environment lockfiles or
unification settings, which made it difficult to see exactly how a spec
would concretize within an environment. Now it does, so the output you get
with `spack spec` will be *the same* as what your environment will
concretize to when you run `spack concretize`. Similarly, if you provide
multiple specs on the command line with `spack spec`, it will concretize
them together if `unify:true` is set.
See #47556 and #44843.
6. Less noisy `spack spec` output
`spack spec` previously showed output like this:
```console
> spack spec /v5fn6xo
Input spec
--------------------------------
- /v5fn6xo
Concretized
--------------------------------
[+] openssl@3.3.1%apple-clang@16.0.0~docs+shared arch=darwin-sequoia-m1
...
```
But the input spec is redundant, and we know we run `spack spec` to concretize
the input spec. `spack spec` now *only* shows the concretized spec. See #47574.
7. Better output for `spack find -c`
In an environment, `spack find -c` lets you search the concretized, but not
yet installed, specs, just as you would the installed ones. As with `spack
spec`, this should make it easier for you to see what *will* be built
before building and installing it. See #44713.
8. `spack -C <env>`: use an environment's configuration without activation
Spack environments allow you to associate:
1. a set of (possibly concretized) specs, and
2. configuration
When you activate an environment, you're using both of these. Previously, we
supported:
* `spack -e <env>` to run spack in the context of a specific environment, and
* `spack -C <directory>` to run spack using a directory with configuration files.
You can now also pass an environment to `spack -C` to use *only* the environment's
configuration, but not the specs or lockfile. See #45046.
## New commands, options, and directives
* The new `spack env track` command (#41897) takes a non-managed Spack
environment and adds a symlink to Spack's `$environments_root` directory, so
that it will be included for reference counting for commands like `spack
uninstall` and `spack gc`. If you use free-standing directory environments,
this is useful for preventing Spack from removing things required by your
environments. You can undo this tracking with the `spack env untrack`
command.
* Add `-t` short option for `spack --backtrace` (#47227)
`spack -d / --debug` enables backtraces on error, but it can be very
verbose, and sometimes you just want the backtrace. `spack -t / --backtrace`
provides that option.
* `gc`: restrict to specific specs (#46790)
If you only want to garbage-collect specific packages, you can now provide
them on the command line. This gives users finer-grained control over what
is uninstalled.
* oci buildcaches now support `--only=package`. You can now push *just* a
package and not its dependencies to an OCI registry. This allows dependents
of non-redistributable specs to be stored in OCI registries without an
error. See #45775.
## Notable refactors
* Variants are now fully conditional
The `variants` dictionary on packages was previously keyed by variant name,
and allowed only one definition of any given variant. Spack is now smart
enough to understand that variants may have different values and defaults
for different versions. For example, `warpx` prior to `23.06` only supported
builds for one dimensionality, and newer `warpx` versions could be built
with support for many different dimensions:
```python
variant(
"dims",
default="3",
values=("1", "2", "3", "rz"),
multi=False,
description="Number of spatial dimensions",
when="@:23.05",
)
variant(
"dims",
default="1,2,rz,3",
values=("1", "2", "3", "rz"),
multi=True,
description="Number of spatial dimensions",
when="@23.06:",
)
```
Previously, the default for the old version of `warpx` was not respected and
had to be specified manually. Now, Spack will select the right variant
definition for each version at concretization time. This allows variants to
evolve more smoothly over time. See #44425 for details.
## Highlighted bugfixes
1. Externals no longer override the preferred provider (#45025).
External definitions could interfere with package preferences. Now, if
`openmpi` is the preferred `mpi`, and an external `mpich` is defined, a new
`openmpi` *will* be built if building it is possible. Previously we would
prefer `mpich` despite the preference.
2. Composable `cflags` (#41049).
This release fixes a longstanding bug where concretization would fail if
there were different `cflags` specified in `packages.yaml`,
`compilers.yaml`, or on the CLI. Flags and their ordering are now tracked
in the concretizer and flags from multiple sources will be merged.
3. Fix concretizer Unification for included environments (#45139).
## Deprecations, removals, and syntax changes
1. The old concretizer has been removed from Spack, along with the
`config:concretizer` config option. Spack will emit a warning if the option
is present in user configuration, since it now has no effect. Spack now
uses a simpler bootstrapping mechanism, where a JSON prototype is tweaked
slightly to get an initial concrete spec to download. See #45215.
2. Best-effort expansion of spec matrices has been removed. This feature did
not work with the "new" ASP-based concretizer, and did not work with
`unify: True` or `unify: when_possible`. Use the
[exclude key](https://spack.readthedocs.io/en/latest/environments.html#spec-matrices)
for the environment to exclude invalid components, or use multiple spec
matrices to combine the list of specs for which the constraint is valid and
the list of specs for which it is not. See #40792.
3. The old Cray `platform` (based on Cray PE modules) has been removed, and
`platform=cray` is no longer supported. Since `v0.19`, Spack has handled
Cray machines like Linux clusters with extra packages, and we have
encouraged using this option to support Cray. The new approach allows us to
correctly handle Cray machines with non-SLES operating systems, and it is
much more reliable than making assumptions about Cray modules. See the
`v0.19` release notes and #43796 for more details.
4. The `config:install_missing_compilers` config option has been deprecated,
and it is a no-op when set in `v0.23`. Our new compiler dependency model
will replace it with a much more reliable and robust mechanism in `v1.0`.
See #46237.
5. Config options that were deprecated in `v0.21` have been removed in `v0.23`. You
can now only specify preferences for `compilers`, `targets`, and
`providers` globally via the `packages:all:` section. Similarly, you can
only specify `versions:` locally for a specific package. See #44061 and
#31261 for details.
6. Spack's old test interface has been removed (#45752), having been
deprecated in `v0.22.0` (#34236). All `builtin` packages have been updated
to use the new interface. See the [stand-alone test documentation](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests)
7. The `spack versions --safe-only` option, deprecated since `v0.21.0`, has
been removed. See #45765.
* The `--dependencies` and `--optimize` arguments to `spack ci` have been
deprecated. See #45005.
## Binary caches
1. Public binary caches now include an ML stack for Linux/aarch64 (#39666). We
now build an ML stack for Linux/aarch64 for all pull requests and on
develop. The ML stack includes both CPU-only and CUDA builds for Horovod,
Hugging Face, JAX, Keras, PyTorch, scikit-learn, TensorBoard, and
TensorFlow, and related packages. The CPU-only stack also includes XGBoost.
See https://cache.spack.io/tag/develop/?stack=ml-linux-aarch64-cuda.
2. There is also now a stack of developer tools for macOS (#46910), which is
analogous to the Linux devtools stack. You can use this to avoid building
many common build dependencies. See
https://cache.spack.io/tag/develop/?stack=developer-tools-darwin.
## Architecture support
* archspec has been updated to `v0.2.5`, with support for `zen5`
* Spack's CUDA package now supports the Grace Hopper `9.0a` compute capability (#45540)
## Windows
* Windows bootstrapping: `file` and `gpg` (#41810)
* `scripts` directory added to PATH on Windows for python extensions (#45427)
* Fix `spack load --list` and `spack unload` on Windows (#35720)
## Other notable changes
* Bugfix: `spack find -x` in environments (#46798)
* Spec splices are now robust to duplicate nodes with the same name in a spec (#46382)
* Cache per-compiler libc calculations for performance (#47213)
* Fixed a bug in external detection for openmpi (#47541)
* Mirror configuration allows username/password as environment variables (#46549)
* Default library search caps maximum depth (#41945)
* Unify interface for `spack spec` and `spack solve` commands (#47182)
* Spack no longer RPATHs directories in the default library search path (#44686)
* Improved performance of Spack database (#46554)
* Enable package reuse for packages with versions from git refs (#43859)
* Improved handling for `uuid` virtual on macOS (#43002)
* Improved tracking of task queueing/requeueing in the installer (#46293)
## Spack community stats
* Over 2,000 pull requests updated package recipes
* 8,307 total packages, 329 new since `v0.22.0`
* 140 new Python packages
* 14 new R packages
* 373 people contributed to this release
* 357 committers to packages
* 60 committers to core
# v0.22.2 (2024-09-21)
## Bugfixes
@@ -811,7 +419,7 @@ feedback, either on GitHub or on Slack. You can track the road to
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)

View File

@@ -70,7 +70,7 @@ Tutorial
----------------
We maintain a
[**hands-on tutorial**](https://spack.readthedocs.io/en/latest/tutorial.html).
[**hands-on tutorial**](https://spack-tutorial.readthedocs.io/).
It covers basic to advanced usage, packaging, developer features, and large HPC
deployments. You can do all of the exercises on your own laptop using a
Docker container.

View File

@@ -55,3 +55,11 @@ concretizer:
splice:
explicit: []
automatic: false
# Maximum time, in seconds, allowed for the 'solve' phase. If set to 0, there is no time limit.
timeout: 0
# If set to true, exceeding the timeout will always result in a concretization error. If false,
# the best (suboptimal) model computed before the timeout is used.
#
# Setting this to false yields unreproducible results, so we advise using that value only
# for debugging purposes (e.g. check which constraints can help Spack concretize faster).
error_on_timeout: true

View File

@@ -265,30 +265,25 @@ infrastructure, or to cache Spack built binaries in Github Actions and
GitLab CI.
To get started, configure an OCI mirror using ``oci://`` as the scheme,
and optionally specify variables that hold the username and password (or
personal access token) for the registry:
and optionally specify a username and password (or personal access token):
.. code-block:: console
$ spack mirror add --oci-username-variable REGISTRY_USER \
--oci-password-variable REGISTRY_TOKEN \
my_registry oci://example.com/my_image
$ spack mirror add --oci-username username --oci-password password my_registry oci://example.com/my_image
Spack follows the naming conventions of Docker, with Dockerhub as the default
registry. To use Dockerhub, you can omit the registry domain:
.. code-block:: console
$ spack mirror add ... my_registry oci://username/my_image
$ spack mirror add --oci-username username --oci-password password my_registry oci://username/my_image
From here, you can use the mirror as any other build cache:
.. code-block:: console
$ export REGISTRY_USER=...
$ export REGISTRY_TOKEN=...
$ spack buildcache push my_registry <specs...> # push to the registry
$ spack install <specs...> # or install from the registry
$ spack install <specs...> # install from the registry
A unique feature of buildcaches on top of OCI registries is that it's incredibly
easy to generate a runnable container image with the binaries installed. This

View File

@@ -210,7 +210,7 @@ def setup(sphinx):
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),
("py:class", "spack.build_systems._checks.BaseBuilder"),
("py:class", "spack.build_systems._checks.BuilderWithDefaults"),
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.StandardVersion"),
("py:class", "spack.spec.DependencySpec"),

View File

@@ -38,11 +38,9 @@ just have to configure an OCI registry and run ``spack buildcache push``.
spack -e . install
# Configure the registry
spack -e . mirror add --oci-username-variable REGISTRY_USER \
--oci-password-variable REGISTRY_TOKEN \
container-registry oci://example.com/name/image
spack -e . mirror add --oci-username ... --oci-password ... container-registry oci://example.com/name/image
# Push the image (do set REGISTRY_USER and REGISTRY_TOKEN)
# Push the image
spack -e . buildcache push --update-index --base-image ubuntu:22.04 --tag my_env container-registry
The resulting container image can then be run as follows:

View File

@@ -1042,7 +1042,7 @@ file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given
package. This view selects all packages that depend on MPI, and
excludes those built with the PGI compiler at version 18.5.
excludes those built with the GCC compiler at version 18.5.
The root specs with their (transitive) link and run type dependencies
will be put in the view due to the ``link: all`` option,
and the files in the view will be symlinks to the spack install
@@ -1056,7 +1056,7 @@ directories.
mpis:
root: /path/to/view
select: [^mpi]
exclude: ['%pgi@18.5']
exclude: ['%gcc@18.5']
projections:
all: '{name}/{version}-{compiler.name}'
link: all

View File

@@ -148,22 +148,20 @@ The first time you concretize a spec, Spack will bootstrap automatically:
--------------------------------
zlib@1.2.13%gcc@9.4.0+optimize+pic+shared build_system=makefile arch=linux-ubuntu20.04-icelake
The default bootstrap behavior is to use pre-built binaries. You can verify the
active bootstrap repositories with:
.. command-output:: spack bootstrap list
If for security concerns you cannot bootstrap ``clingo`` from pre-built
binaries, you have to disable fetching the binaries generated with GitHub Actions.
.. code-block:: console
$ spack bootstrap disable github-actions-v0.6
==> "github-actions-v0.6" is now disabled and will not be used for bootstrapping
$ spack bootstrap disable github-actions-v0.5
==> "github-actions-v0.5" is now disabled and will not be used for bootstrapping
$ spack bootstrap disable github-actions-v0.4
==> "github-actions-v0.4" is now disabled and will not be used for bootstrapping
$ spack bootstrap disable github-actions-v0.3
==> "github-actions-v0.3" is now disabled and will not be used for bootstrapping
You can verify that the new settings are effective with:
.. command-output:: spack bootstrap list
You can verify that the new settings are effective with ``spack bootstrap list``.
.. note::
@@ -285,10 +283,6 @@ compilers`` or ``spack compiler list``:
intel@14.0.1 intel@13.0.1 intel@12.1.2 intel@10.1
-- clang -------------------------------------------------------
clang@3.4 clang@3.3 clang@3.2 clang@3.1
-- pgi ---------------------------------------------------------
pgi@14.3-0 pgi@13.2-0 pgi@12.1-0 pgi@10.9-0 pgi@8.0-1
pgi@13.10-0 pgi@13.1-1 pgi@11.10-0 pgi@10.2-0 pgi@7.1-3
pgi@13.6-0 pgi@12.8-0 pgi@11.1-0 pgi@9.0-4 pgi@7.0-6
Any of these compilers can be used to build Spack packages. More on
how this is done is in :ref:`sec-specs`.
@@ -808,65 +802,6 @@ flags to the ``icc`` command:
spec: intel@15.0.24.4.9.3
^^^
PGI
^^^
PGI comes with two sets of compilers for C++ and Fortran,
distinguishable by their names. "Old" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgCC
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgf77
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgf90
"New" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgc++
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
Older installations of PGI contain just the old compilers, whereas
newer installations contain both the old and the new. The new compilers are
considered preferable, as some packages
(e.g. ``hdf``) will not build with the old ones.
When auto-detecting a PGI compiler, there are cases where Spack will
find the old compilers when you really want it to find the new
ones. It is best to check ``compilers.yaml``, and if the old
compilers are being used, change ``pgf77`` and ``pgf90`` to
``pgfortran``.
Other issues:
* There are reports that some packages will not build with PGI,
including ``libpciaccess`` and ``openssl``. A workaround is to
build these packages with another compiler and then use them as
dependencies for PGI-built packages. For example:
.. code-block:: console
$ spack install openmpi%pgi ^libpciaccess%gcc
* PGI requires a license to use; see :ref:`licensed-compilers` for more
information on installation.
.. note::
It is believed the problem with HDF 4 is that everything is
compiled with the ``F77`` compiler, but at some point some Fortran
90 code slipped in there. So compilers that can handle both FORTRAN
77 and Fortran 90 (``gfortran``, ``pgfortran``, etc) are fine. But
compilers specific to one or the other (``pgf77``, ``pgf90``) won't
work.
^^^
NAG
^^^
@@ -1391,6 +1326,7 @@ Required:
* Microsoft Visual Studio
* Python
* Git
* 7z
Optional:
* Intel Fortran (needed for some packages)
@@ -1456,6 +1392,13 @@ as the project providing Git support on Windows. This is additionally the recomm
for installing Git on Windows, a link to which can be found above. Spack requires the
utilities vendored by this project.
"""
7zip
"""
A tool for extracting ``.xz`` files is required to unpack source tarballs. The latest 7zip
can be located at https://sourceforge.net/projects/sevenzip/.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 2: Install and setup Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -1928,71 +1928,29 @@ to the empty list.
String. A URL pointing to license setup instructions for the software.
Defaults to the empty string.
For example, let's take a look at the package for the PGI compilers.
For example, let's take a look at the Arm Forge package.
.. code-block:: python
# Licensing
license_required = True
license_comment = "#"
license_files = ["license.dat"]
license_vars = ["PGROUPD_LICENSE_FILE", "LM_LICENSE_FILE"]
license_url = "http://www.pgroup.com/doc/pgiinstall.pdf"
license_comment = "#"
license_files = ["licences/Licence"]
license_vars = [
"ALLINEA_LICENSE_DIR",
"ALLINEA_LICENCE_DIR",
"ALLINEA_LICENSE_FILE",
"ALLINEA_LICENCE_FILE",
]
license_url = "https://developer.arm.com/documentation/101169/latest/Use-Arm-Licence-Server"
As you can see, PGI requires a license. Its license manager, FlexNet, uses
the ``#`` symbol to denote a comment. It expects the license file to be
named ``license.dat`` and to be located directly in the installation prefix.
If you would like the license file to be located elsewhere, simply set
``PGROUPD_LICENSE_FILE`` or ``LM_LICENSE_FILE`` after installation. For
further instructions on installation and licensing, see the URL provided.
Arm Forge requires a license. Its license manager uses the ``#`` symbol to denote a comment.
It expects the license file to be named ``Licence`` and to be located in a ``licences`` directory
in the installation prefix.
Let's walk through a sample PGI installation to see exactly what Spack is
and isn't capable of. Since PGI does not provide a download URL, it must
be downloaded manually. It can either be added to a mirror or located in
the current directory when ``spack install pgi`` is run. See :ref:`mirrors`
for instructions on setting up a mirror.
After running ``spack install pgi``, the first thing that will happen is that
Spack will create a global license file located at
``$SPACK_ROOT/etc/spack/licenses/pgi/license.dat``. It will then open up the
file using :ref:`your favorite editor <controlling-the-editor>`. It will look like
this:
.. code-block:: sh
# A license is required to use pgi.
#
# The recommended solution is to store your license key in this global
# license file. After installation, the following symlink(s) will be
# added to point to this file (relative to the installation prefix):
#
# license.dat
#
# Alternatively, use one of the following environment variable(s):
#
# PGROUPD_LICENSE_FILE
# LM_LICENSE_FILE
#
# If you choose to store your license in a non-standard location, you may
# set one of these variable(s) to the full pathname to the license file, or
# port@host if you store your license keys on a dedicated license server.
# You will likely want to set this variable in a module file so that it
# gets loaded every time someone tries to use pgi.
#
# For further information on how to acquire a license, please refer to:
#
# http://www.pgroup.com/doc/pgiinstall.pdf
#
# You may enter your license below.
You can add your license directly to this file, or tell FlexNet to use a
license stored on a separate license server. Here is an example that
points to a license server called licman1:
.. code-block:: none
SERVER licman1.mcs.anl.gov 00163eb7fba5 27200
USE_SERVER
If you would like the license file to be located elsewhere, simply set ``ALLINEA_LICENSE_DIR`` or
one of the other license variables after installation. For further instructions on installation and
licensing, see the URL provided.
If your package requires the license to install, you can reference the
location of this global license using ``self.global_license_file``.
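As a hedged illustration of these attributes in a recipe, here is a minimal
sketch; the package name is hypothetical and the values mirror the Arm Forge
example above.

.. code-block:: python

   from spack.package import *


   class MyLicensedApp(Package):
       """Hypothetical licensed package (illustration only)."""

       license_required = True
       license_comment = "#"
       license_files = ["licences/Licence"]
       license_vars = ["ALLINEA_LICENSE_DIR", "ALLINEA_LICENSE_FILE"]
       license_url = "https://developer.arm.com/documentation/101169/latest/Use-Arm-Licence-Server"

       def install(self, spec, prefix):
           # self.global_license_file points at the global license file that
           # Spack manages for this package, if the build needs it.
           mkdirp(prefix.bin)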
@@ -2967,9 +2925,9 @@ make sense during the build phase may not be needed at runtime, and vice versa.
it makes sense to let a dependency set the environment variables for its dependents. To allow all
this, Spack provides four different methods that can be overridden in a package:
1. :meth:`setup_build_environment <spack.builder.Builder.setup_build_environment>`
1. :meth:`setup_build_environment <spack.builder.BaseBuilder.setup_build_environment>`
2. :meth:`setup_run_environment <spack.package_base.PackageBase.setup_run_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.Builder.setup_dependent_build_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.BaseBuilder.setup_dependent_build_environment>`
4. :meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
The Qt package, for instance, uses this call:

View File

@@ -1,7 +1,7 @@
sphinx==8.1.3
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.1
sphinx-rtd-theme==3.0.2
python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.18.0

View File

@@ -301,32 +301,35 @@ def filter_file(
ignore_absent: bool = False,
start_at: Optional[str] = None,
stop_at: Optional[str] = None,
encoding: Optional[str] = "utf-8",
) -> None:
r"""Like sed, but uses python regular expressions.
Filters every line of each file through regex and replaces the file with a filtered version.
Preserves mode of filtered files.
Filters every line of each file through regex and replaces the file
with a filtered version. Preserves mode of filtered files.
As with re.sub, ``repl`` can be either a string or a callable. If it is a callable, it is
passed the match object and should return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution as sed would allow.
As with re.sub, ``repl`` can be either a string or a callable.
If it is a callable, it is passed the match object and should
return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution
as sed would allow.
Args:
regex: The regular expression to search for
repl: The string to replace matches with
*filenames: One or more files to search and replace
string: Treat regex as a plain string. Default is False
backup: Make backup file(s) suffixed with ``~``. Default is False
ignore_absent: Ignore any files that don't exist. Default is False
start_at: Marker used to start applying the replacements. If a text line matches this
marker filtering is started at the next line. All contents before the marker and the
marker itself are copied verbatim. Default is to start filtering from the first line of
the file.
stop_at: Marker used to stop scanning the file further. If a text line matches this marker
filtering is stopped and the rest of the file is copied verbatim. Default is to filter
until the end of the file.
encoding: The encoding to use when reading and writing the files. Default is ``utf-8``.
regex (str): The regular expression to search for
repl (str): The string to replace matches with
*filenames: One or more files to search and replace
string (bool): Treat regex as a plain string. Default it False
backup (bool): Make backup file(s) suffixed with ``~``. Default is False
ignore_absent (bool): Ignore any files that don't exist.
Default is False
start_at (str): Marker used to start applying the replacements. If a
text line matches this marker filtering is started at the next line.
All contents before the marker and the marker itself are copied
verbatim. Default is to start filtering from the first line of the
file.
stop_at (str): Marker used to stop scanning the file further. If a text
line matches this marker filtering is stopped and the rest of the
file is copied verbatim. Default is to filter until the end of the
file.
"""
# Allow strings to use \1, \2, etc. for replacement, like sed
if not callable(repl):
@@ -342,56 +345,72 @@ def groupid_to_group(x):
if string:
regex = re.escape(regex)
regex_compiled = re.compile(regex)
for path in path_to_os_path(*filenames):
if ignore_absent and not os.path.exists(path):
tty.debug(f'FILTER FILE: file "{path}" not found. Skipping to next file.')
for filename in path_to_os_path(*filenames):
msg = 'FILTER FILE: {0} [replacing "{1}"]'
tty.debug(msg.format(filename, regex))
backup_filename = filename + "~"
tmp_filename = filename + ".spack~"
if ignore_absent and not os.path.exists(filename):
msg = 'FILTER FILE: file "{0}" not found. Skipping to next file.'
tty.debug(msg.format(filename))
continue
else:
tty.debug(f'FILTER FILE: {path} [replacing "{regex}"]')
fd, temp_path = tempfile.mkstemp(
prefix=f"{os.path.basename(path)}.", dir=os.path.dirname(path)
)
os.close(fd)
# Create backup file. Don't overwrite an existing backup
# file in case this file is being filtered multiple times.
if not os.path.exists(backup_filename):
shutil.copy(filename, backup_filename)
shutil.copy(path, temp_path)
errored = False
# Create a temporary file to read from. We cannot use backup_filename
# in case filter_file is invoked multiple times on the same file.
shutil.copy(filename, tmp_filename)
try:
# Open as a text file and filter until the end of the file is reached, or we found a
# marker in the line if it was specified. To avoid translating line endings (\n to
# \r\n and vice-versa) use newline="".
with open(
temp_path, mode="r", errors="surrogateescape", newline="", encoding=encoding
) as input_file, open(
path, mode="w", errors="surrogateescape", newline="", encoding=encoding
) as output_file:
if start_at is None and stop_at is None: # common case, avoids branching in loop
for line in input_file:
output_file.write(re.sub(regex_compiled, repl, line))
else:
# state is -1 before start_at; 0 between; 1 after stop_at
state = 0 if start_at is None else -1
for line in input_file:
if state == 0:
# Open as a text file and filter until the end of the file is
# reached, or we found a marker in the line if it was specified
#
# To avoid translating line endings (\n to \r\n and vice-versa)
# we force os.open to ignore translations and use the line endings
# the file comes with
with open(tmp_filename, mode="r", errors="surrogateescape", newline="") as input_file:
with open(filename, mode="w", errors="surrogateescape", newline="") as output_file:
do_filtering = start_at is None
# Using iter and readline is a workaround needed not to
# disable input_file.tell(), which will happen if we call
# input_file.next() implicitly via the for loop
for line in iter(input_file.readline, ""):
if stop_at is not None:
current_position = input_file.tell()
if stop_at == line.strip():
state = 1
else:
line = re.sub(regex_compiled, repl, line)
elif state == -1 and start_at == line.strip():
state = 0
output_file.write(line)
output_file.write(line)
break
if do_filtering:
filtered_line = re.sub(regex, repl, line)
output_file.write(filtered_line)
else:
do_filtering = start_at == line.strip()
output_file.write(line)
else:
current_position = None
# If we stopped filtering at some point, reopen the file in
# binary mode and copy verbatim the remaining part
if current_position and stop_at:
with open(tmp_filename, mode="rb") as input_binary_buffer:
input_binary_buffer.seek(current_position)
with open(filename, mode="ab") as output_binary_buffer:
output_binary_buffer.writelines(input_binary_buffer.readlines())
except BaseException:
# restore the original file
os.rename(temp_path, path)
errored = True
# clean up the original file on failure.
shutil.move(backup_filename, filename)
raise
finally:
if not errored and not backup:
os.unlink(temp_path)
os.remove(tmp_filename)
if not backup and os.path.exists(backup_filename):
os.remove(backup_filename)
class FileFilter:

View File
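A hedged usage sketch of ``filter_file`` as documented in the diff above; the
file name, patterns, and markers are illustrative assumptions, not taken from
real Spack code.

```python
from llnl.util.filesystem import filter_file

# Plain-string replacement (string=True disables regex interpretation),
# keeping a "config.mk~" backup of the original file.
filter_file("/usr/bin/python", "python", "config.mk", string=True, backup=True)

# Regex replacement applied only between two marker lines; everything before
# start_at and after stop_at is copied verbatim.
filter_file(r"-O2\b", "-O3", "config.mk", start_at="# BEGIN FLAGS", stop_at="# END FLAGS")
```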

@@ -879,13 +879,10 @@ def _writer_daemon(
write_fd.close()
# 1. Use line buffering (3rd param = 1) since Python 3 has a bug
# that prevents unbuffered text I/O. [needs citation]
# 2. Enforce a UTF-8 interpretation of build process output with errors replaced by '?'.
# The downside is that the log file will not contain the exact output of the build process.
# that prevents unbuffered text I/O.
# 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
# 3. closefd=False because Connection has "ownership"
read_file = os.fdopen(
read_fd.fileno(), "r", 1, encoding="utf-8", errors="replace", closefd=False
)
read_file = os.fdopen(read_fd.fileno(), "r", 1, encoding="utf-8", closefd=False)
if stdin_fd:
stdin_file = os.fdopen(stdin_fd.fileno(), closefd=False)
@@ -931,7 +928,11 @@ def _writer_daemon(
try:
while line_count < 100:
# Handle output from the calling process.
line = _retry(read_file.readline)()
try:
line = _retry(read_file.readline)()
except UnicodeDecodeError:
# installs like --test=root gpgme produce non-UTF8 logs
line = "<line lost: output was not encoded as UTF-8>\n"
if not line:
return
@@ -945,13 +946,6 @@ def _writer_daemon(
output_line = clean_line
if filter_fn:
output_line = filter_fn(clean_line)
enc = sys.stdout.encoding
if enc != "utf-8":
# On Python 3.6 and 3.7-3.14 with non-{utf-8,C} locale stdout
# may not be able to handle utf-8 output. We do an inefficient
# dance of re-encoding with errors replaced, so stdout.write
# does not raise.
output_line = output_line.encode(enc, "replace").decode(enc)
sys.stdout.write(output_line)
# Stripped output to log file.

View File
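A self-contained sketch of the decoding strategy described above, using a toy
pipe: with ``errors="replace"``, a build process emitting non-UTF-8 bytes
cannot crash the log reader with ``UnicodeDecodeError``.

```python
import os

read_fd, write_fd = os.pipe()
os.write(write_fd, b"ok\n\xff not utf-8\n")  # second line is invalid UTF-8
os.close(write_fd)

# Line-buffered text wrapper; invalid bytes decode to U+FFFD instead of
# raising while the daemon reads build output line by line.
with os.fdopen(read_fd, "r", 1, encoding="utf-8", errors="replace") as f:
    for line in f:
        print(repr(line))
```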

@@ -11,7 +11,7 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.23.1"
__version__ = "0.24.0.dev0"
spack_version = __version__

View File

@@ -571,8 +571,13 @@ def _search_for_deprecated_package_methods(pkgs, error_cls):
@package_properties
def _ensure_all_package_names_are_lowercase(pkgs, error_cls):
"""Ensure package names are lowercase and consistent"""
reserved_names = ("all",)
badname_regex, errors = re.compile(r"[_A-Z]"), []
for pkg_name in pkgs:
if pkg_name in reserved_names:
error_msg = f"The name '{pkg_name}' is reserved, and cannot be used for packages"
errors.append(error_cls(error_msg, []))
if badname_regex.search(pkg_name):
error_msg = f"Package name '{pkg_name}' should be lowercase and must not contain '_'"
errors.append(error_cls(error_msg, []))
@@ -1366,8 +1371,14 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str) and isinstance(_detected, str):
if isinstance(_expected, str):
try:
_regex = re.compile(_expected)
except re.error:
@@ -1383,7 +1394,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
elif isinstance(_expected, dict) and isinstance(_detected, dict):
if isinstance(_expected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@@ -1398,10 +1409,6 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
else:
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
return result

View File

@@ -87,6 +87,8 @@
from spack.stage import Stage
from spack.util.executable import which
from .enums import InstallRecordStatus
BUILD_CACHE_RELATIVE_PATH = "build_cache"
BUILD_CACHE_KEYS_RELATIVE_PATH = "_pgp"
@@ -252,7 +254,7 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
spec_list = [
s
for s in db.query_local(installed=any)
for s in db.query_local(installed=InstallRecordStatus.ANY)
if s.external or db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
]

View File

@@ -56,7 +56,6 @@
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
import spack.build_systems._checks
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
@@ -883,6 +882,9 @@ def __init__(self, *roots: spack.spec.Spec, context: Context):
elif context == Context.RUN:
self.root_depflag = dt.RUN | dt.LINK
def accept(self, item):
return True
def neighbors(self, item):
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
@@ -920,19 +922,19 @@ def effective_deptypes(
a flag specifying in what way they do so. The list is ordered topologically
from root to leaf, meaning that environment modifications should be applied
in reverse so that dependents override dependencies, not the other way around."""
visitor = traverse.TopoVisitor(
EnvironmentVisitor(*specs, context=context),
key=lambda x: x.dag_hash(),
topo_sorted_edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges(specs),
visitor=EnvironmentVisitor(*specs, context=context),
key=traverse.by_dag_hash,
root=True,
all_edges=True,
)
traverse.traverse_depth_first_with_visitor(traverse.with_artificial_edges(specs), visitor)
# Dictionary with "no mode" as default value, so it's easy to write modes[x] |= flag.
use_modes = defaultdict(lambda: UseMode(0))
nodes_with_type = []
for edge in visitor.edges:
for edge in topo_sorted_edges:
parent, child, depflag = edge.parent, edge.spec, edge.depflag
# Mark the starting point
@@ -1375,7 +1377,7 @@ def exitcode_msg(p):
return child_result
CONTEXT_BASES = (spack.package_base.PackageBase, spack.build_systems._checks.BaseBuilder)
CONTEXT_BASES = (spack.package_base.PackageBase, spack.builder.Builder)
def get_package_context(traceback, context=3):

View File

@@ -9,6 +9,7 @@
import spack.builder
import spack.error
import spack.phase_callbacks
import spack.relocate
import spack.spec
import spack.store
@@ -63,7 +64,7 @@ def apply_macos_rpath_fixups(builder: spack.builder.Builder):
def ensure_build_dependencies_or_raise(
spec: spack.spec.Spec, dependencies: List[spack.spec.Spec], error_msg: str
spec: spack.spec.Spec, dependencies: List[str], error_msg: str
):
"""Ensure that some build dependencies are present in the concrete spec.
@@ -71,7 +72,7 @@ def ensure_build_dependencies_or_raise(
Args:
spec: concrete spec to be checked.
dependencies: list of abstract specs to be satisfied
dependencies: list of package names of required build dependencies
error_msg: brief error message to be prepended to a longer description
Raises:
@@ -127,8 +128,8 @@ def execute_install_time_tests(builder: spack.builder.Builder):
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
class BaseBuilder(spack.builder.Builder):
"""Base class for builders to register common checks"""
class BuilderWithDefaults(spack.builder.Builder):
"""Base class for all specific builders with common callbacks registered."""
# Check that self.prefix is there after installation
spack.builder.run_after("install")(sanity_check_prefix)
spack.phase_callbacks.run_after("install")(sanity_check_prefix)

View File

@@ -6,7 +6,7 @@
import os.path
import stat
import subprocess
from typing import List
from typing import Callable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -15,6 +15,9 @@
import spack.builder
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from spack.operating_systems.mac_os import macos_version
@@ -22,7 +25,7 @@
from spack.version import Version
from ._checks import (
BaseBuilder,
BuilderWithDefaults,
apply_macos_rpath_fixups,
ensure_build_dependencies_or_raise,
execute_build_time_tests,
@@ -69,14 +72,14 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def enable_or_disable(self, *args, **kwargs):
return self.builder.enable_or_disable(*args, **kwargs)
return spack.builder.create(self).enable_or_disable(*args, **kwargs)
def with_or_without(self, *args, **kwargs):
return self.builder.with_or_without(*args, **kwargs)
return spack.builder.create(self).with_or_without(*args, **kwargs)
@spack.builder.builder("autotools")
class AutotoolsBuilder(BaseBuilder):
class AutotoolsBuilder(BuilderWithDefaults):
"""The autotools builder encodes the default way of installing software built
with autotools. It has four phases that can be overridden, if need be:
@@ -157,7 +160,7 @@ class AutotoolsBuilder(BaseBuilder):
install_libtool_archives = False
@property
def patch_config_files(self):
def patch_config_files(self) -> bool:
"""Whether to update old ``config.guess`` and ``config.sub`` files
distributed with the tarball.
@@ -177,7 +180,7 @@ def patch_config_files(self):
)
@property
def _removed_la_files_log(self):
def _removed_la_files_log(self) -> str:
"""File containing the list of removed libtool archives"""
build_dir = self.build_directory
if not os.path.isabs(self.build_directory):
@@ -185,15 +188,15 @@ def _removed_la_files_log(self):
return os.path.join(build_dir, "removed_la_files.txt")
@property
def archive_files(self):
def archive_files(self) -> List[str]:
"""Files to archive for packages based on autotools"""
files = [os.path.join(self.build_directory, "config.log")]
if not self.install_libtool_archives:
files.append(self._removed_la_files_log)
return files
@spack.builder.run_after("autoreconf")
def _do_patch_config_files(self):
@spack.phase_callbacks.run_after("autoreconf")
def _do_patch_config_files(self) -> None:
"""Some packages ship with older config.guess/config.sub files and need to
have these updated when installed on a newer architecture.
@@ -294,7 +297,7 @@ def runs_ok(script_abs_path):
and set the prefix to the directory containing the `config.guess` and
`config.sub` files.
"""
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.name))
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
# Copy the good files over the bad ones
for abs_path in to_be_patched:
@@ -304,8 +307,8 @@ def runs_ok(script_abs_path):
fs.copy(substitutes[name], abs_path)
os.chmod(abs_path, mode)
@spack.builder.run_before("configure")
def _patch_usr_bin_file(self):
@spack.phase_callbacks.run_before("configure")
def _patch_usr_bin_file(self) -> None:
"""On NixOS file is not available in /usr/bin/file. Patch configure
scripts to use file from path."""
@@ -316,8 +319,8 @@ def _patch_usr_bin_file(self):
with fs.keep_modification_time(*x.filenames):
x.filter(regex="/usr/bin/file", repl="file", string=True)
@spack.builder.run_before("configure")
def _set_autotools_environment_variables(self):
@spack.phase_callbacks.run_before("configure")
def _set_autotools_environment_variables(self) -> None:
"""Many autotools builds use a version of mknod.m4 that fails when
running as root unless FORCE_UNSAFE_CONFIGURE is set to 1.
@@ -330,8 +333,8 @@ def _set_autotools_environment_variables(self):
"""
os.environ["FORCE_UNSAFE_CONFIGURE"] = "1"
@spack.builder.run_before("configure")
def _do_patch_libtool_configure(self):
@spack.phase_callbacks.run_before("configure")
def _do_patch_libtool_configure(self) -> None:
"""Patch bugs that propagate from libtool macros into "configure" and
further into "libtool". Note that patches that can be fixed by patching
"libtool" directly should be implemented in the _do_patch_libtool method
@@ -357,16 +360,9 @@ def _do_patch_libtool_configure(self):
)
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
# Configure scripts generated with libtool < 2.5.4 have a faulty test for the
# -single_module linker flag. A deprecation warning makes it think the default is
# -multi_module, triggering it to use problematic linker flags (such as ld -r). The
# linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
# `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
@spack.builder.run_after("configure")
def _do_patch_libtool(self):
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:
"""If configure generates a "libtool" script that does not correctly
detect the compiler (and patch_libtool is set), patch in the correct
values for libtool variables.
@@ -514,27 +510,64 @@ def _do_patch_libtool(self):
)
@property
def configure_directory(self):
def configure_directory(self) -> str:
"""Return the directory where 'configure' resides."""
return self.pkg.stage.source_path
@property
def configure_abs_path(self):
def configure_abs_path(self) -> str:
# Absolute path to configure
configure_abs_path = os.path.join(os.path.abspath(self.configure_directory), "configure")
return configure_abs_path
@property
def build_directory(self):
def build_directory(self) -> str:
"""Override to provide another place to build the package"""
return self.configure_directory
@spack.builder.run_before("autoreconf")
def delete_configure_to_force_update(self):
@spack.phase_callbacks.run_before("autoreconf")
def delete_configure_to_force_update(self) -> None:
if self.force_autoreconf:
fs.force_remove(self.configure_abs_path)
def autoreconf(self, pkg, spec, prefix):
@property
def autoreconf_search_path_args(self) -> List[str]:
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skip the default path of automake, and move external include
flags to the back, since they might pull in unrelated m4 files shadowing
Spack dependencies.
return _autoreconf_search_path_args(self.spec)
@spack.phase_callbacks.run_after("autoreconf")
def set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self) -> List[str]:
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def autoreconf(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
@@ -561,39 +594,12 @@ def autoreconf(self, pkg, spec, prefix):
autoreconf_args += self.autoreconf_extra_args
self.pkg.module.autoreconf(*autoreconf_args)
@property
def autoreconf_search_path_args(self):
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skip the default path of automake, and move external include
flags to the back, since they might pull in unrelated m4 files shadowing
Spack dependencies.
return _autoreconf_search_path_args(self.spec)
@spack.builder.run_after("autoreconf")
def set_configure_or_die(self):
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self):
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def configure(self, pkg, spec, prefix):
def configure(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "configure", with the arguments specified by the builder and an
appropriately set prefix.
"""
@@ -604,7 +610,12 @@ def configure(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.configure(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the build targets specified by the builder."""
# See https://autotools.io/automake/silent.html
params = ["V=1"]
@@ -612,41 +623,49 @@ def build(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
pkg.module.make(*params)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
def _activate_or_not(
self, name, activation_word, deactivation_word, activation_value=None, variant=None
):
self,
name: str,
activation_word: str,
deactivation_word: str,
activation_value: Optional[Union[Callable, str]] = None,
variant=None,
) -> List[str]:
"""This function contain the current implementation details of
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without` and
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.enable_or_disable`.
Args:
name (str): name of the option that is being activated or not
activation_word (str): the default activation word ('with' in the
case of ``with_or_without``)
deactivation_word (str): the default deactivation word ('without'
in the case of ``with_or_without``)
activation_value (typing.Callable): callable that accepts a single
value. This value is either one of the allowed values for a
multi-valued variant or the name of a bool-valued variant.
name: name of the option that is being activated or not
activation_word: the default activation word ('with' in the case of
``with_or_without``)
deactivation_word: the default deactivation word ('without' in the case of
``with_or_without``)
activation_value: callable that accepts a single value. This value is either one of the
allowed values for a multi-valued variant or the name of a bool-valued variant.
Returns the parameter to be used when the value is activated.
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
variant (str): name of the variant that is being processed
(if different from option name)
variant: name of the variant that is being processed (if different from option name)
Examples:
@@ -654,19 +673,19 @@ def _activate_or_not(
.. code-block:: python
variant('foo', values=('x', 'y'), description='')
variant('bar', default=True, description='')
variant('ba_z', default=True, description='')
variant("foo", values=("x", "y"), description=")
variant("bar", default=True, description=")
variant("ba_z", default=True, description=")
calling this function like:
.. code-block:: python
_activate_or_not(
'foo', 'with', 'without', activation_value='prefix'
"foo", "with", "without", activation_value="prefix"
)
_activate_or_not('bar', 'with', 'without')
_activate_or_not('ba-z', 'with', 'without', variant='ba_z')
_activate_or_not("bar", "with", "without")
_activate_or_not("ba-z", "with", "without", variant="ba_z")
will generate the following configuration options:
@@ -686,8 +705,8 @@ def _activate_or_not(
Raises:
KeyError: if name is not among known variants
"""
spec = self.pkg.spec
args = []
spec: spack.spec.Spec = self.pkg.spec
args: List[str] = []
if activation_value == "prefix":
activation_value = lambda x: spec[x].prefix
@@ -705,7 +724,7 @@ def _activate_or_not(
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
vdef = self.pkg.get_variant(variant)
if set(vdef.values) == set((True, False)):
if set(vdef.values) == set((True, False)): # type: ignore
# BoolValuedVariant carry information about a single option.
# Nonetheless, for uniformity of treatment we'll package them
# in an iterable of one element.
@@ -716,14 +735,12 @@ def _activate_or_not(
# package's build system. It excludes values which have special
# meanings and do not correspond to features (e.g. "none")
feature_values = getattr(vdef.values, "feature_values", None) or vdef.values
options = [(value, f"{variant}={value}" in spec) for value in feature_values]
options = [(v, f"{variant}={v}" in spec) for v in feature_values] # type: ignore
# For each allowed value in the list of values
for option_value, activated in options:
# Search for an override in the package for this value
override_name = "{0}_or_{1}_{2}".format(
activation_word, deactivation_word, option_value
)
override_name = f"{activation_word}_or_{deactivation_word}_{option_value}"
line_generator = getattr(self, override_name, None) or getattr(
self.pkg, override_name, None
)
@@ -732,19 +749,24 @@ def _activate_or_not(
def _default_generator(is_activated):
if is_activated:
line = "--{0}-{1}".format(activation_word, option_value)
line = f"--{activation_word}-{option_value}"
if activation_value is not None and activation_value(
option_value
): # NOQA=ignore=E501
line += "={0}".format(activation_value(option_value))
line = f"{line}={activation_value(option_value)}"
return line
return "--{0}-{1}".format(deactivation_word, option_value)
return f"--{deactivation_word}-{option_value}"
line_generator = _default_generator
args.append(line_generator(activated))
return args
def with_or_without(self, name, activation_value=None, variant=None):
def with_or_without(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Inspects a variant and returns the arguments that activate
or deactivate the selected feature(s) for the configure options.
@@ -759,12 +781,11 @@ def with_or_without(self, name, activation_value=None, variant=None):
``variant=value`` is in the spec.
Args:
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): callable that accepts a single
value and returns the parameter to be used leading to an entry
of the type ``--with-{name}={parameter}``.
name: name of a valid multi-valued variant
activation_value: callable that accepts a single value and returns the parameter to be
used leading to an entry of the type ``--with-{name}={parameter}``.
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -772,18 +793,22 @@ def with_or_without(self, name, activation_value=None, variant=None):
"""
return self._activate_or_not(name, "with", "without", activation_value, variant)
def enable_or_disable(self, name, activation_value=None, variant=None):
def enable_or_disable(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Same as
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`
but substitute ``with`` with ``enable`` and ``without`` with ``disable``.
Args:
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): if present accepts a single value
and returns the parameter to be used leading to an entry of the
type ``--enable-{name}={parameter}``
name: name of a valid multi-valued variant
activation_value: if present accepts a single value and returns the parameter to be
used leading to an entry of the type ``--enable-{name}={parameter}``
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -791,15 +816,15 @@ def enable_or_disable(self, name, activation_value=None, variant=None):
"""
return self._activate_or_not(name, "enable", "disable", activation_value, variant)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self):
def installcheck(self) -> None:
"""Run "make" on the ``installcheck`` target, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("installcheck")
@spack.builder.run_after("install")
def remove_libtool_archives(self):
@spack.phase_callbacks.run_after("install")
def remove_libtool_archives(self) -> None:
"""Remove all .la files in prefix sub-folders if the package sets
``install_libtool_archives`` to be False.
"""
@@ -821,12 +846,13 @@ def setup_build_environment(self, env):
env.set("MACOSX_DEPLOYMENT_TARGET", "10.16")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
def _autoreconf_search_path_args(spec):
dirs_seen = set()
flags_spack, flags_external = [], []
def _autoreconf_search_path_args(spec: spack.spec.Spec) -> List[str]:
dirs_seen: Set[Tuple[int, int]] = set()
flags_spack: List[str] = []
flags_external: List[str] = []
# We don't want to add an include flag for automake's default search path.
for automake in spec.dependencies(name="automake", deptype="build"):

View File
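A hedged sketch of how a recipe typically uses the ``with_or_without`` and
``enable_or_disable`` helpers shown in the diff above; the package and its
variants are invented for illustration.

```python
from spack.package import *


class Example(AutotoolsPackage):
    """Hypothetical autotools package (illustration only)."""

    variant("mpi", default=True, description="Enable MPI support")
    variant("shared", default=True, description="Build shared libraries")

    depends_on("mpi", when="+mpi")

    def configure_args(self):
        args = []
        # +mpi -> --with-mpi=<mpi prefix>, ~mpi -> --without-mpi
        args.extend(self.with_or_without("mpi", activation_value="prefix"))
        # +shared -> --enable-shared, ~shared -> --disable-shared
        args.extend(self.enable_or_disable("shared"))
        return args
```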

@@ -10,7 +10,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.builder
import spack.phase_callbacks
from .cmake import CMakeBuilder, CMakePackage
@@ -192,7 +192,10 @@ def initconfig_mpi_entries(self):
entries.append(cmake_cache_path("MPI_C_COMPILER", spec["mpi"].mpicc))
entries.append(cmake_cache_path("MPI_CXX_COMPILER", spec["mpi"].mpicxx))
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# not all MPIs have Fortran wrappers
if hasattr(spec["mpi"], "mpifc"):
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# Check for slurm
using_slurm = False
@@ -332,7 +335,7 @@ def std_cmake_args(self):
args.extend(["-C", self.cache_path])
return args
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def install_cmake_cache(self):
fs.mkdirp(self.pkg.spec.prefix.share.cmake)
fs.install(self.cache_path, self.pkg.spec.prefix.share.cmake)

View File

@@ -7,10 +7,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class CargoPackage(spack.package_base.PackageBase):
@@ -27,7 +28,7 @@ class CargoPackage(spack.package_base.PackageBase):
@spack.builder.builder("cargo")
class CargoBuilder(BaseBuilder):
class CargoBuilder(BuilderWithDefaults):
"""The Cargo builder encodes the most common way of building software with
a Rust Cargo.toml file. It has two phases that can be overridden, if need be:
@@ -77,7 +78,7 @@ def install(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""

View File

@@ -9,7 +9,7 @@
import re
import sys
from itertools import chain
from typing import List, Optional, Set, Tuple
from typing import Any, List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util.lang import stable_partition
@@ -18,11 +18,15 @@
import spack.deptypes as dt
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack import traverse
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
# Regex to extract the primary generator from the CMake generator
# string.
@@ -48,9 +52,9 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
python_executable = pkg.spec["python"].command.path
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),
CMakeBuilder.define("Python_EXECUTABLE", python_executable),
CMakeBuilder.define("Python3_EXECUTABLE", python_executable),
define("PYTHON_EXECUTABLE", python_executable),
define("Python_EXECUTABLE", python_executable),
define("Python3_EXECUTABLE", python_executable),
]
)
@@ -85,7 +89,7 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
ipo = False
if cmake.satisfies("@3.9:"):
args.append(CMakeBuilder.define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
args.append(define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
# Disable Package Registry: export(PACKAGE) may put files in the user's home directory, and
# find_package may search there. This is not what we want.
@@ -93,30 +97,36 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
# Do not populate CMake User Package Registry
if cmake.satisfies("@3.15:"):
# see https://cmake.org/cmake/help/latest/policy/CMP0090.html
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
args.append(define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
elif cmake.satisfies("@3.1:"):
# see https://cmake.org/cmake/help/latest/variable/CMAKE_EXPORT_NO_PACKAGE_REGISTRY.html
args.append(CMakeBuilder.define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
args.append(define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
# Do not use CMake User/System Package Registry
# https://cmake.org/cmake/help/latest/manual/cmake-packages.7.html#disabling-the-package-registry
if cmake.satisfies("@3.16:"):
args.append(CMakeBuilder.define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
elif cmake.satisfies("@3.1:3.15"):
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
# Export a compilation database if supported.
if _supports_compilation_databases(pkg):
args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
args.append(define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
# Enable MACOSX_RPATH by default when cmake_minimum_required < 3
# https://cmake.org/cmake/help/latest/policy/CMP0042.html
if pkg.spec.satisfies("platform=darwin") and cmake.satisfies("@3:"):
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
args.append(define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
# Disable find package's config mode for versions of Boost that
# didn't provide it. See https://github.com/spack/spack/issues/20169
# and https://cmake.org/cmake/help/latest/module/FindBoost.html
if pkg.spec.satisfies("^boost@:1.69.0"):
args.append(define("Boost_NO_BOOST_CMAKE", True))
def generator(*names: str, default: Optional[str] = None):
def generator(*names: str, default: Optional[str] = None) -> None:
"""The build system generator to use.
See ``cmake --help`` for a list of valid generators.
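# As a hedged illustration (not part of this file), a package would use this
# directive roughly as follows; the package name is invented:
#
#     class Example(CMakePackage):
#         # adds a "generator" variant restricted to the given values
#         generator("ninja", "make", default="ninja")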
@@ -157,15 +167,18 @@ def _values(x):
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
attribute of direct build/test and transitive link dependencies."""
# Add direct build/test deps
selected: Set[str] = {s.dag_hash() for s in pkg.spec.dependencies(deptype=dt.BUILD | dt.TEST)}
# Add transitive link deps
selected.update(s.dag_hash() for s in pkg.spec.traverse(root=False, deptype=dt.LINK))
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition(
(s for s in pkg.spec.traverse(root=False, order="topo") if s.dag_hash() in selected),
lambda x: x.external,
edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges([pkg.spec]),
visitor=traverse.MixedDepthVisitor(
direct=dt.BUILD | dt.TEST, transitive=dt.LINK | dt.RUN, key=traverse.by_dag_hash
),
key=traverse.by_dag_hash,
root=False,
all_edges=False, # cover all nodes, not all edges
)
ordered_specs = [edge.spec for edge in edges]
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition((s for s in ordered_specs), lambda x: x.external)
return filter_system_paths(
path for spec in chain(spack_built, externals) for path in spec.package.cmake_prefix_paths
@@ -263,15 +276,15 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def define(self, *args, **kwargs):
return self.builder.define(*args, **kwargs)
def define(self, cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
def define_from_variant(self, *args, **kwargs):
return self.builder.define_from_variant(*args, **kwargs)
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self, cmake_var, variant)
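# Illustrative sketch (not part of this file) of what define() returns, per
# its docstring further below: booleans become OFF/ON, and lists become CMake
# semicolon-separated string lists.
#
#     define("BUILD_SHARED_LIBS", True)   # -> "-DBUILD_SHARED_LIBS:BOOL=ON"
#     define("CMAKE_CXX_STANDARD", 14)    # -> "-DCMAKE_CXX_STANDARD:STRING=14"
#     define("swr", ["avx", "avx2"])      # -> "-Dswr:STRING=avx;avx2"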
@spack.builder.builder("cmake")
class CMakeBuilder(BaseBuilder):
class CMakeBuilder(BuilderWithDefaults):
"""The cmake builder encodes the default way of building software with CMake. IT
has three phases that can be overridden:
@@ -321,15 +334,15 @@ class CMakeBuilder(BaseBuilder):
build_time_test_callbacks = ["check"]
@property
def archive_files(self):
def archive_files(self) -> List[str]:
"""Files to archive for packages based on CMake"""
files = [os.path.join(self.build_directory, "CMakeCache.txt")]
if _supports_compilation_databases(self):
if _supports_compilation_databases(self.pkg):
files.append(os.path.join(self.build_directory, "compile_commands.json"))
return files
@property
def root_cmakelists_dir(self):
def root_cmakelists_dir(self) -> str:
"""The relative path to the directory containing CMakeLists.txt
This path is relative to the root of the extracted tarball,
@@ -338,16 +351,17 @@ def root_cmakelists_dir(self):
return self.pkg.stage.source_path
@property
def generator(self):
def generator(self) -> str:
if self.spec.satisfies("generator=make"):
return "Unix Makefiles"
if self.spec.satisfies("generator=ninja"):
return "Ninja"
msg = f'{self.spec.format()} has an unsupported value for the "generator" variant'
raise ValueError(msg)
raise ValueError(
f'{self.spec.format()} has an unsupported value for the "generator" variant'
)
@property
def std_cmake_args(self):
def std_cmake_args(self) -> List[str]:
"""Standard cmake arguments provided as a property for
convenience of package writers
"""
@@ -356,7 +370,9 @@ def std_cmake_args(self):
return args
@staticmethod
def std_args(pkg, generator=None):
def std_args(
pkg: spack.package_base.PackageBase, generator: Optional[str] = None
) -> List[str]:
"""Computes the standard cmake arguments for a generic package"""
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
generator = generator or default_generator
@@ -373,7 +389,6 @@ def std_args(pkg, generator=None):
except KeyError:
build_type = "RelWithDebInfo"
define = CMakeBuilder.define
args = [
"-G",
generator,
@@ -405,152 +420,31 @@ def std_args(pkg, generator=None):
return args
@staticmethod
def define_cuda_architectures(pkg):
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures and
it is declared in the cuda package.
This method is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is not set.
"""
cmake_flag = str()
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value
)
return cmake_flag
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_cuda_architectures(pkg)
@staticmethod
def define_hip_architectures(pkg):
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures and it is declared in the rocm package.
This method is a no-op for cmake<3.21 and when the ``amdgpu_target`` variant is
not set.
"""
cmake_flag = str()
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value
)
return cmake_flag
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_hip_architectures(pkg)
@staticmethod
def define(cmake_var, value):
"""Return a CMake command line argument that defines a variable.
def define(cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
The resulting argument will convert boolean values to OFF/ON
and lists/tuples to CMake semicolon-separated string lists. All other
values will be interpreted as strings.
Examples:
.. code-block:: python
[define('BUILD_SHARED_LIBS', True),
define('CMAKE_CXX_STANDARD', 14),
define('swr', ['avx', 'avx2'])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
"""
# Determine the CMake type of the value and convert it to
# its command-line representation
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(self, cmake_var, variant=None):
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
This utility function is similar to
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`.
Examples:
Given a package with:
.. code-block:: python
variant('cxxstd', default='11', values=('11', '14'),
multi=False, description='')
variant('shared', default=True, description='')
variant('swr', values=any_combination_of('avx', 'avx2'),
description='')
calling this function like:
.. code-block:: python
[self.define_from_variant('BUILD_SHARED_LIBS', 'shared'),
self.define_from_variant('CMAKE_CXX_STANDARD', 'cxxstd'),
self.define_from_variant('SWR')]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met,
this function returns an empty string. CMake discards empty strings
provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not self.pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))
if variant not in self.pkg.spec.variants:
return ""
value = self.pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return self.define(cmake_var, value)
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self.pkg, cmake_var, variant)
@property
def build_dirname(self):
def build_dirname(self) -> str:
"""Directory name to use when building the package."""
return "spack-build-%s" % self.pkg.spec.dag_hash(7)
return f"spack-build-{self.pkg.spec.dag_hash(7)}"
@property
def build_directory(self):
def build_directory(self) -> str:
"""Full-path to the directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def cmake_args(self):
def cmake_args(self) -> List[str]:
"""List of all the arguments that must be passed to cmake, except:
* CMAKE_INSTALL_PREFIX
@@ -560,7 +454,12 @@ def cmake_args(self):
"""
return []
def cmake(self, pkg, spec, prefix):
def cmake(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Runs ``cmake`` in the build directory"""
# skip cmake phase if it is an incremental develop build
@@ -575,7 +474,12 @@ def cmake(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.cmake(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the build targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -584,7 +488,12 @@ def build(self, pkg, spec, prefix):
self.build_targets.append("-v")
pkg.module.ninja(*self.build_targets)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -592,9 +501,9 @@ def install(self, pkg, spec, prefix):
elif self.generator == "Ninja":
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Search the CMake-generated files for the targets ``test`` and ``check``,
and runs them if found.
"""
@@ -605,3 +514,133 @@ def check(self):
elif self.generator == "Ninja":
self.pkg._if_ninja_target_execute("test", jobs_env="CTEST_PARALLEL_LEVEL")
self.pkg._if_ninja_target_execute("check")
def define(cmake_var: str, value: Any) -> str:
"""Return a CMake command line argument that defines a variable.
The resulting argument will convert boolean values to OFF/ON and lists/tuples to CMake
semicolon-separated string lists. All other values will be interpreted as strings.
Examples:
.. code-block:: python
[define("BUILD_SHARED_LIBS", True),
define("CMAKE_CXX_STANDARD", 14),
define("swr", ["avx", "avx2"])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
"""
# Determine the CMake type of the value and convert it to
# its command-line representation
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(
pkg: spack.package_base.PackageBase, cmake_var: str, variant: Optional[str] = None
) -> str:
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
Examples:
Given a package with:
.. code-block:: python
variant("cxxstd", default="11", values=("11", "14"),
multi=False, description="")
variant("shared", default=True, description="")
variant("swr", values=any_combination_of("avx", "avx2"),
description="")
calling this function like:
.. code-block:: python
[
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
self.define_from_variant("SWR"),
]
will generate the following configuration options:
.. code-block:: console
[
"-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2",
]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met, this function
returns an empty string. CMake discards empty strings provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, pkg.name))
if variant not in pkg.spec.variants:
return ""
value = pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return define(cmake_var, value)
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures and it is declared in the rocm package.
This method is a no-op for cmake<3.21 and when the ``amdgpu_target`` variant is
not set.
"""
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
return define("CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value)
return ""
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures and
it is declared in the cuda package.
This method is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is not set.
"""
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
return define("CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value)
return ""


@@ -180,13 +180,6 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@7:", when="+cuda ^cuda@:9.1 target=x86_64:")
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=x86_64:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.2.89 target=x86_64:")
conflicts("%pgi@:14.8", when="+cuda ^cuda@:7.0.27 target=x86_64:")
conflicts("%pgi@:15.3,15.5:", when="+cuda ^cuda@7.5 target=x86_64:")
conflicts("%pgi@:16.2,16.0:16.3", when="+cuda ^cuda@8 target=x86_64:")
conflicts("%pgi@:15,18:", when="+cuda ^cuda@9.0:9.1 target=x86_64:")
conflicts("%pgi@:16,19:", when="+cuda ^cuda@9.2.88:10.0 target=x86_64:")
conflicts("%pgi@:17,20:", when="+cuda ^cuda@10.1.105:10.2.89 target=x86_64:")
conflicts("%pgi@:17,21:", when="+cuda ^cuda@11.0.2:11.1.0 target=x86_64:")
conflicts("%clang@:3.4", when="+cuda ^cuda@:7.5 target=x86_64:")
conflicts("%clang@:3.7,4:", when="+cuda ^cuda@8.0:9.0 target=x86_64:")
conflicts("%clang@:3.7,4.1:", when="+cuda ^cuda@9.1 target=x86_64:")
@@ -212,9 +205,6 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=ppc64le:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.1.243 target=ppc64le:")
# officially, CUDA 11.0.2 only supports the system GCC 8.3 on ppc64le
conflicts("%pgi", when="+cuda ^cuda@:8 target=ppc64le:")
conflicts("%pgi@:16", when="+cuda ^cuda@:9.1.185 target=ppc64le:")
conflicts("%pgi@:17", when="+cuda ^cuda@:10 target=ppc64le:")
conflicts("%clang@4:", when="+cuda ^cuda@:9.0.176 target=ppc64le:")
conflicts("%clang@5:", when="+cuda ^cuda@:9.1 target=ppc64le:")
conflicts("%clang@6:", when="+cuda ^cuda@:9.2 target=ppc64le:")


@@ -7,8 +7,9 @@
import spack.builder
import spack.directives
import spack.package_base
import spack.phase_callbacks
from ._checks import BaseBuilder, apply_macos_rpath_fixups, execute_install_time_tests
from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests
class Package(spack.package_base.PackageBase):
@@ -26,7 +27,7 @@ class Package(spack.package_base.PackageBase):
@spack.builder.builder("generic")
class GenericBuilder(BaseBuilder):
class GenericBuilder(BuilderWithDefaults):
"""A builder for a generic build system, that require packagers
to implement an "install" phase.
"""
@@ -44,7 +45,7 @@ class GenericBuilder(BaseBuilder):
install_time_test_callbacks = []
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
# unconditionally perform any post-install phase tests
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -7,10 +7,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class GoPackage(spack.package_base.PackageBase):
@@ -32,7 +33,7 @@ class GoPackage(spack.package_base.PackageBase):
@spack.builder.builder("go")
class GoBuilder(BaseBuilder):
class GoBuilder(BuilderWithDefaults):
"""The Go builder encodes the most common way of building software with
a golang go.mod file. It has two phases that can be overridden, if need be:
@@ -99,7 +100,7 @@ def install(self, pkg, spec, prefix):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""


@@ -22,8 +22,8 @@
install,
)
import spack.builder
import spack.error
import spack.phase_callbacks
from spack.build_environment import dso_suffix
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
@@ -1163,7 +1163,7 @@ def _determine_license_type(self):
debug_print(license_type)
return license_type
@spack.builder.run_before("install")
@spack.phase_callbacks.run_before("install")
def configure(self):
"""Generates the silent.cfg file to pass to installer.sh.
@@ -1250,7 +1250,7 @@ def install(self, spec, prefix):
for f in glob.glob("%s/intel*log" % tmpdir):
install(f, dst)
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def validate_install(self):
# Sometimes the installer exits with an error but doesn't pass a
# non-zero exit code to spack. Check for the existence of a 'bin'
@@ -1258,7 +1258,7 @@ def validate_install(self):
if not os.path.exists(self.prefix.bin):
raise InstallError("The installer has failed to install anything.")
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def configure_rpath(self):
if "+rpath" not in self.spec:
return
@@ -1276,7 +1276,7 @@ def configure_rpath(self):
with open(compiler_cfg, "w") as fh:
fh.write("-Xlinker -rpath={0}\n".format(compilers_lib_dir))
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def configure_auto_dispatch(self):
if self._has_compilers:
if "auto_dispatch=none" in self.spec:
@@ -1300,7 +1300,7 @@ def configure_auto_dispatch(self):
with open(compiler_cfg, "a") as fh:
fh.write("-ax{0}\n".format(",".join(ad)))
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def filter_compiler_wrappers(self):
if ("+mpi" in self.spec or self.provides("mpi")) and "~newdtags" in self.spec:
bin_dir = self.component_bin_dir("mpi")
@@ -1308,7 +1308,7 @@ def filter_compiler_wrappers(self):
f = os.path.join(bin_dir, f)
filter_file("-Xlinker --enable-new-dtags", " ", f, string=True)
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def uninstall_ism(self):
# The "Intel(R) Software Improvement Program" [ahem] gets installed,
# apparently regardless of PHONEHOME_SEND_USAGE_DATA.
@@ -1340,7 +1340,7 @@ def base_lib_dir(self):
debug_print(d)
return d
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def modify_LLVMgold_rpath(self):
"""Add libimf.so and other required libraries to the RUNPATH of LLVMgold.so.


@@ -8,11 +8,14 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from ._checks import (
BaseBuilder,
BuilderWithDefaults,
apply_macos_rpath_fixups,
execute_build_time_tests,
execute_install_time_tests,
@@ -36,7 +39,7 @@ class MakefilePackage(spack.package_base.PackageBase):
@spack.builder.builder("makefile")
class MakefileBuilder(BaseBuilder):
class MakefileBuilder(BuilderWithDefaults):
"""The Makefile builder encodes the most common way of building software with
Makefiles. It has three phases that can be overridden, if need be:
@@ -91,35 +94,50 @@ class MakefileBuilder(BaseBuilder):
install_time_test_callbacks = ["installcheck"]
@property
def build_directory(self):
def build_directory(self) -> str:
"""Return the directory containing the main Makefile."""
return self.pkg.stage.source_path
def edit(self, pkg, spec, prefix):
def edit(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Edit the Makefile before calling make. The default is a no-op."""
pass
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.build_targets)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self):
def installcheck(self) -> None:
"""Searches the Makefile for an ``installcheck`` target
and runs it if found.
"""
@@ -127,4 +145,4 @@ def installcheck(self):
self.pkg._if_make_target_execute("installcheck")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)


@@ -10,7 +10,7 @@
from spack.multimethod import when
from spack.util.executable import which
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class MavenPackage(spack.package_base.PackageBase):
@@ -34,7 +34,7 @@ class MavenPackage(spack.package_base.PackageBase):
@spack.builder.builder("maven")
class MavenBuilder(BaseBuilder):
class MavenBuilder(BuilderWithDefaults):
"""The Maven builder encodes the default way to build software with Maven.
It has two phases that can be overridden, if need be:


@@ -9,10 +9,13 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class MesonPackage(spack.package_base.PackageBase):
@@ -62,7 +65,7 @@ def flags_to_build_system_args(self, flags):
@spack.builder.builder("meson")
class MesonBuilder(BaseBuilder):
class MesonBuilder(BuilderWithDefaults):
"""The Meson builder encodes the default way to build software with Meson.
The builder has three phases that can be overridden, if need be:
@@ -112,7 +115,7 @@ def archive_files(self):
return [os.path.join(self.build_directory, "meson-logs", "meson-log.txt")]
@property
def root_mesonlists_dir(self):
def root_mesonlists_dir(self) -> str:
"""Relative path to the directory containing meson.build
This path is relative to the root of the extracted tarball,
@@ -121,7 +124,7 @@ def root_mesonlists_dir(self):
return self.pkg.stage.source_path
@property
def std_meson_args(self):
def std_meson_args(self) -> List[str]:
"""Standard meson arguments provided as a property for convenience
of package writers.
"""
@@ -132,7 +135,7 @@ def std_meson_args(self):
return std_meson_args
@staticmethod
def std_args(pkg):
def std_args(pkg) -> List[str]:
"""Standard meson arguments for a generic package."""
try:
build_type = pkg.spec.variants["buildtype"].value
@@ -172,7 +175,7 @@ def build_directory(self):
"""Directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def meson_args(self):
def meson_args(self) -> List[str]:
"""List of arguments that must be passed to meson, except:
* ``--prefix``
@@ -185,7 +188,12 @@ def meson_args(self):
"""
return []
def meson(self, pkg, spec, prefix):
def meson(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run ``meson`` in the build directory"""
options = []
if self.spec["meson"].satisfies("@0.64:"):
@@ -196,21 +204,31 @@ def meson(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.meson(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the build targets"""
options = ["-v"]
options += self.build_targets
with fs.working_dir(self.build_directory):
pkg.module.ninja(*options)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Search Meson-generated files for the target ``test`` and run it if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_ninja_target_execute("test")


@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class MSBuildPackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class MSBuildPackage(spack.package_base.PackageBase):
@spack.builder.builder("msbuild")
class MSBuildBuilder(BaseBuilder):
class MSBuildBuilder(BuilderWithDefaults):
"""The MSBuild builder encodes the most common way of building software with
Microsoft's MSBuild tool. It has two phases that can be overridden, if need be:


@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class NMakePackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class NMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("nmake")
class NMakeBuilder(BaseBuilder):
class NMakeBuilder(BuilderWithDefaults):
"""The NMake builder encodes the most common way of building software with
Microsoft's NMake tool. It has two phases that can be overridden, if need be:


@@ -7,7 +7,7 @@
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class OctavePackage(spack.package_base.PackageBase):
@@ -29,7 +29,7 @@ class OctavePackage(spack.package_base.PackageBase):
@spack.builder.builder("octave")
class OctaveBuilder(BaseBuilder):
class OctaveBuilder(BuilderWithDefaults):
"""The octave builder provides the following phases that can be overridden:
1. :py:meth:`~.OctaveBuilder.install`


@@ -255,7 +255,7 @@ def libs(self):
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
class IntelOneApiLibraryPackageWithSdk(IntelOneApiLibraryPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries


@@ -10,11 +10,12 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.install_test import SkipTest, test_part
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class PerlPackage(spack.package_base.PackageBase):
@@ -84,7 +85,7 @@ def test_use(self):
@spack.builder.builder("perl")
class PerlBuilder(BaseBuilder):
class PerlBuilder(BuilderWithDefaults):
"""The perl builder provides four phases that can be overridden, if required:
1. :py:meth:`~.PerlBuilder.configure`
@@ -163,7 +164,7 @@ def configure(self, pkg, spec, prefix):
# Build.PL may be too long causing the build to fail. Patching the shebang
# does not happen until after install so set '/usr/bin/env perl' here in
# the Build script.
@spack.builder.run_after("configure")
@spack.phase_callbacks.run_after("configure")
def fix_shebang(self):
if self.build_method == "Build.PL":
pattern = "#!{0}".format(self.spec["perl"].command.path)
@@ -175,7 +176,7 @@ def build(self, pkg, spec, prefix):
self.build_executable()
# Ensure that tests run after build (if requested):
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
"""Runs built-in tests of a Perl package."""


@@ -24,6 +24,7 @@
import spack.detection
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.platforms
import spack.repo
import spack.spec
@@ -34,7 +35,7 @@
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
@@ -374,7 +375,7 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return None
@property
def python_spec(self):
def python_spec(self) -> Spec:
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
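The fallback relies on ``dependencies()`` returning a list, so an empty result falls through the ``or`` to the second query; a one-line sketch with hypothetical values:

first, *_ = [] or ["python@3.11"]  # empty list is falsy, so the fallback wins
assert first == "python@3.11"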
@@ -425,7 +426,7 @@ def libs(self) -> LibraryList:
@spack.builder.builder("python_pip")
class PythonPipBuilder(BaseBuilder):
class PythonPipBuilder(BuilderWithDefaults):
phases = ("install",)
#: Names associated with package methods in the old build-system format
@@ -543,4 +544,4 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
with fs.working_dir(self.build_directory):
pip(*args)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -6,9 +6,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class QMakePackage(spack.package_base.PackageBase):
@@ -30,7 +31,7 @@ class QMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("qmake")
class QMakeBuilder(BaseBuilder):
class QMakeBuilder(BuilderWithDefaults):
"""The qmake builder provides three phases that can be overridden:
1. :py:meth:`~.QMakeBuilder.qmake`
@@ -81,4 +82,4 @@ def check(self):
with working_dir(self.build_directory):
self.pkg._if_make_target_execute("check")
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)


@@ -8,7 +8,7 @@
import spack.package_base
from spack.directives import build_system, extends, maintainers
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class RubyPackage(spack.package_base.PackageBase):
@@ -28,7 +28,7 @@ class RubyPackage(spack.package_base.PackageBase):
@spack.builder.builder("ruby")
class RubyBuilder(BaseBuilder):
class RubyBuilder(BuilderWithDefaults):
"""The Ruby builder provides two phases that can be overridden if required:
#. :py:meth:`~.RubyBuilder.build`


@@ -4,9 +4,10 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class SConsPackage(spack.package_base.PackageBase):
@@ -28,7 +29,7 @@ class SConsPackage(spack.package_base.PackageBase):
@spack.builder.builder("scons")
class SConsBuilder(BaseBuilder):
class SConsBuilder(BuilderWithDefaults):
"""The Scons builder provides the following phases that can be overridden:
1. :py:meth:`~.SConsBuilder.build`
@@ -79,4 +80,4 @@ def build_test(self):
"""
pass
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)


@@ -11,11 +11,12 @@
import spack.builder
import spack.install_test
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class SIPPackage(spack.package_base.PackageBase):
@@ -103,7 +104,7 @@ def test_imports(self):
@spack.builder.builder("sip")
class SIPBuilder(BaseBuilder):
class SIPBuilder(BuilderWithDefaults):
"""The SIP builder provides the following phases that can be overridden:
* configure
@@ -170,4 +171,4 @@ def install_args(self):
"""Arguments to pass to install."""
return []
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -6,9 +6,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests, execute_install_time_tests
class WafPackage(spack.package_base.PackageBase):
@@ -30,7 +31,7 @@ class WafPackage(spack.package_base.PackageBase):
@spack.builder.builder("waf")
class WafBuilder(BaseBuilder):
class WafBuilder(BuilderWithDefaults):
"""The WAF builder provides the following phases that can be overridden:
* configure
@@ -136,7 +137,7 @@ def build_test(self):
"""
pass
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def install_test(self):
"""Run unit tests after install.
@@ -146,4 +147,4 @@ def install_test(self):
"""
pass
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)

View File

@@ -6,44 +6,30 @@
import collections.abc
import copy
import functools
from typing import List, Optional, Tuple
from llnl.util import lang
from typing import Dict, List, Optional, Tuple, Type
import spack.error
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.util.environment
#: Builder classes, as registered by the "builder" decorator
BUILDER_CLS = {}
#: An object of this kind is a shared global state used to collect callbacks during
#: class definition time, and is flushed when the class object is created at the end
#: of the class definition
#:
#: Args:
#: attribute_name (str): name of the attribute that will be attached to the builder
#: callbacks (list): container used to temporarily aggregate the callbacks
CallbackTemporaryStage = collections.namedtuple(
"CallbackTemporaryStage", ["attribute_name", "callbacks"]
)
#: Shared global state to aggregate "@run_before" callbacks
_RUN_BEFORE = CallbackTemporaryStage(attribute_name="run_before_callbacks", callbacks=[])
#: Shared global state to aggregate "@run_after" callbacks
_RUN_AFTER = CallbackTemporaryStage(attribute_name="run_after_callbacks", callbacks=[])
BUILDER_CLS: Dict[str, Type["Builder"]] = {}
#: Map id(pkg) to a builder, to avoid creating multiple
#: builders for the same package object.
_BUILDERS = {}
_BUILDERS: Dict[int, "Builder"] = {}
def builder(build_system_name):
def builder(build_system_name: str):
"""Class decorator used to register the default builder
for a given build-system.
Args:
build_system_name (str): name of the build-system
build_system_name: name of the build-system
"""
def _decorator(cls):
@@ -54,13 +40,9 @@ def _decorator(cls):
return _decorator
def create(pkg):
"""Given a package object with an associated concrete spec,
return the builder object that can install it.
Args:
pkg (spack.package_base.PackageBase): package for which we want the builder
"""
def create(pkg: spack.package_base.PackageBase) -> "Builder":
"""Given a package object with an associated concrete spec, return the builder object that can
install it."""
if id(pkg) not in _BUILDERS:
_BUILDERS[id(pkg)] = _create(pkg)
return _BUILDERS[id(pkg)]
@@ -75,7 +57,7 @@ def __call__(self, spec, prefix):
return self.phase_fn(self.builder.pkg, spec, prefix)
def get_builder_class(pkg, name: str) -> Optional[type]:
def get_builder_class(pkg, name: str) -> Optional[Type["Builder"]]:
"""Return the builder class if a package module defines it."""
cls = getattr(pkg.module, name, None)
if cls and cls.__module__.startswith(spack.repo.ROOT_PYTHON_NAMESPACE):
@@ -83,7 +65,7 @@ def get_builder_class(pkg, name: str) -> Optional[type]:
return None
def _create(pkg):
def _create(pkg: spack.package_base.PackageBase) -> "Builder":
"""Return a new builder object for the package object being passed as argument.
The function inspects the build-system used by the package object and tries to:
@@ -103,7 +85,7 @@ class hierarchy (look at AspellDictPackage for an example of that)
to look for build-related methods in the ``*Package``.
Args:
pkg (spack.package_base.PackageBase): package object for which we need a builder
pkg: package object for which we need a builder
"""
package_buildsystem = buildsystem_name(pkg)
default_builder_cls = BUILDER_CLS[package_buildsystem]
@@ -168,8 +150,8 @@ def __forward(self, *args, **kwargs):
# with the same name is defined in the Package, it will override this definition
# (when _ForwardToBaseBuilder is initialized)
for method_name in (
base_cls.phases
+ base_cls.legacy_methods
base_cls.phases # type: ignore
+ base_cls.legacy_methods # type: ignore
+ getattr(base_cls, "legacy_long_methods", tuple())
+ ("setup_build_environment", "setup_dependent_build_environment")
):
@@ -181,14 +163,14 @@ def __forward(self):
return __forward
for attribute_name in base_cls.legacy_attributes:
for attribute_name in base_cls.legacy_attributes: # type: ignore
setattr(
_ForwardToBaseBuilder,
attribute_name,
property(forward_property_to_getattr(attribute_name)),
)
class Adapter(base_cls, metaclass=_PackageAdapterMeta):
class Adapter(base_cls, metaclass=_PackageAdapterMeta): # type: ignore
def __init__(self, pkg):
# Deal with custom phases in packages here
if hasattr(pkg, "phases"):
@@ -213,99 +195,18 @@ def setup_dependent_build_environment(self, env, dependent_spec):
return Adapter(pkg)
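The adapter machinery above forwards build-related methods defined on an old-style *Package to the builder; a stripped-down sketch of the forwarding idea (toy names, not the real classes):

def _forward_to_pkg(name):
    def _forward(self, *args, **kwargs):
        return getattr(self.pkg, name)(*args, **kwargs)
    return _forward

class LegacyPkg:
    def build(self):
        return "built by package"

attrs = {name: _forward_to_pkg(name) for name in ("build",)}
attrs["__init__"] = lambda self, pkg: setattr(self, "pkg", pkg)
ToyAdapter = type("ToyAdapter", (), attrs)
assert ToyAdapter(LegacyPkg()).build() == "built by package"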
def buildsystem_name(pkg):
def buildsystem_name(pkg: spack.package_base.PackageBase) -> str:
"""Given a package object with an associated concrete spec,
return the name of its build system.
Args:
pkg (spack.package_base.PackageBase): package for which we want
the build system name
"""
return the name of its build system."""
try:
return pkg.spec.variants["build_system"].value
except KeyError:
# We are reading an old spec without the build_system variant
return pkg.legacy_buildsystem
class PhaseCallbacksMeta(type):
"""Permit to register arbitrary functions during class definition and run them
later, before or after a given install phase.
Each method decorated with ``run_before`` or ``run_after`` gets temporarily
stored in a global shared state when a class being defined is parsed by the Python
interpreter. At class definition time that temporary storage gets flushed and a list
of callbacks is attached to the class being defined.
"""
def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, we deduplicate them, then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]
return super(PhaseCallbacksMeta, mcs).__new__(mcs, name, bases, attr_dict)
@staticmethod
def run_after(phase, when=None):
"""Decorator to register a function for running after a given phase.
Args:
phase (str): phase after which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_AFTER.callbacks.append(item)
return fn
return _decorator
@staticmethod
def run_before(phase, when=None):
"""Decorator to register a function for running before a given phase.
Args:
phase (str): phase before which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_BEFORE.callbacks.append(item)
return fn
return _decorator
return pkg.legacy_buildsystem # type: ignore
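The ``PhaseCallbacksMeta`` machinery removed above now lives in ``spack.phase_callbacks``; the register-then-flush pattern it implements can be sketched in isolation like this (simplified to a single stage):

_STAGED = []  # module-level storage, flushed when a class is created

def run_after(phase, when=None):
    def _decorator(fn):
        _STAGED.append(((phase, when), fn))
        return fn
    return _decorator

class CallbackMeta(type):
    def __new__(mcs, name, bases, attr_dict):
        attr_dict["run_after_callbacks"] = list(_STAGED)
        _STAGED.clear()
        return super().__new__(mcs, name, bases, attr_dict)

class DemoBuilder(metaclass=CallbackMeta):
    @run_after("install")
    def _post_install(self):
        pass

assert DemoBuilder.run_after_callbacks[0][0] == ("install", None)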
class BuilderMeta(
PhaseCallbacksMeta,
spack.phase_callbacks.PhaseCallbacksMeta,
spack.multimethod.MultiMethodMeta,
type(collections.abc.Sequence), # type: ignore
):
@@ -400,8 +301,12 @@ def __new__(mcs, name, bases, attr_dict):
)
combine_callbacks = _PackageAdapterMeta.combine_callbacks
attr_dict[_RUN_BEFORE.attribute_name] = combine_callbacks(_RUN_BEFORE.attribute_name)
attr_dict[_RUN_AFTER.attribute_name] = combine_callbacks(_RUN_AFTER.attribute_name)
attr_dict[spack.phase_callbacks._RUN_BEFORE.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_BEFORE.attribute_name
)
attr_dict[spack.phase_callbacks._RUN_AFTER.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_AFTER.attribute_name
)
return super(_PackageAdapterMeta, mcs).__new__(mcs, name, bases, attr_dict)
@@ -421,8 +326,8 @@ def __init__(self, name, builder):
self.name = name
self.builder = builder
self.phase_fn = self._select_phase_fn()
self.run_before = self._make_callbacks(_RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(_RUN_AFTER.attribute_name)
self.run_before = self._make_callbacks(spack.phase_callbacks._RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(spack.phase_callbacks._RUN_AFTER.attribute_name)
def _make_callbacks(self, callbacks_attribute):
result = []
@@ -483,15 +388,103 @@ def copy(self):
return copy.deepcopy(self)
class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
"""A builder is a class that, given a package object (i.e. associated with
concrete spec), knows how to install it.
class BaseBuilder(metaclass=BuilderMeta):
"""An interface for builders, without any phases defined. This class is exposed in the package
API, so that packagers can create a single class to define ``setup_build_environment`` and
``@run_before`` and ``@run_after`` callbacks that can be shared among different builders.
The builder behaves like a sequence, and when iterated over returns the
"phases" of the installation in the correct order.
Example:
Args:
pkg (spack.package_base.PackageBase): package object to be built
.. code-block:: python
class AnyBuilder(BaseBuilder):
@run_after("install")
def fixup_install(self):
# do something after the package is installed
pass
def setup_build_environment(self, env):
env.set("MY_ENV_VAR", "my_value")
class CMakeBuilder(cmake.CMakeBuilder, AnyBuilder):
pass
class AutotoolsBuilder(autotools.AutotoolsBuilder, AnyBuilder):
pass
"""
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
self.pkg = pkg
@property
def spec(self) -> spack.spec.Spec:
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env: environment modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env) # type: ignore
def setup_dependent_build_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
"""Sets up the build environment of a package that depends on this one.
This is similar to ``setup_build_environment``, but it is used to modify the build
environment of a package that *depends* on this one.
This gives packages the ability to set environment variables for the build of the
dependent, which can be useful to provide search hints for headers or libraries if they are
not in standard locations.
This method will be called before the dependent package prefix exists in Spack's store.
Args:
env: environment modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec: the spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec) # type: ignore
def __repr__(self):
fmt = "{name}{/hash:7}"
return f"{self.__class__.__name__}({self.spec.format(fmt)})"
def __str__(self):
fmt = "{name}{/hash:7}"
return f'"{self.__class__.__name__}" builder for "{self.spec.format(fmt)}"'
class Builder(BaseBuilder, collections.abc.Sequence):
"""A builder is a class that, given a package object (i.e. associated with concrete spec),
knows how to install it.
The builder behaves like a sequence, and when iterated over return the "phases" of the
installation in the correct order.
"""
#: Sequence of phases. Must be defined in derived classes
@@ -506,95 +499,22 @@ class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
build_time_test_callbacks: List[str]
install_time_test_callbacks: List[str]
#: List of glob expressions. Each expression must either be
#: absolute or relative to the package source path.
#: Matching artifacts found at the end of the build process will be
#: copied in the same directory tree as _spack_build_logfile and
#: _spack_build_envfile.
archive_files: List[str] = []
#: List of glob expressions. Each expression must either be absolute or relative to the package
#: source path. Matching artifacts found at the end of the build process will be copied in the
#: same directory tree as _spack_build_logfile and _spack_build_envfile.
@property
def archive_files(self) -> List[str]:
return []
def __init__(self, pkg):
self.pkg = pkg
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
super().__init__(pkg)
self.callbacks = {}
for phase in self.phases:
self.callbacks[phase] = InstallationPhase(phase, self)
@property
def spec(self):
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(self, env):
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env)
def setup_dependent_build_environment(self, env, dependent_spec):
"""Sets up the build environment of packages that depend on this one.
This is similar to ``setup_build_environment``, but it is used to
modify the build environments of packages that *depend* on this one.
This gives packages like Python and others that follow the extension
model a way to implement common environment or compile-time settings
for dependencies.
This method will be called before the dependent package prefix exists
in Spack's store.
Examples:
1. Installing python modules generally requires ``PYTHONPATH``
to point to the ``lib/pythonX.Y/site-packages`` directory in the
module's install prefix. This method could be used to set that
variable.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec): the spec of the dependent package
about to be built. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec)
def __getitem__(self, idx):
key = self.phases[idx]
return self.callbacks[key]
def __len__(self):
return len(self.phases)
def __repr__(self):
msg = "{0}({1})"
return msg.format(type(self).__name__, self.pkg.spec.format("{name}/{hash:7}"))
def __str__(self):
msg = '"{0}" builder for "{1}"'
return msg.format(type(self).build_system, self.pkg.spec.format("{name}/{hash:7}"))
# Export these names as standalone to be used in packages
run_after = PhaseCallbacksMeta.run_after
run_before = PhaseCallbacksMeta.run_before
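Since ``__getitem__`` and ``__len__`` are defined over ``phases``, iterating a builder yields its installation phases in order; a toy sketch of the sequence behavior (phase names illustrative):

import collections.abc

class ToyBuilder(collections.abc.Sequence):
    phases = ("cmake", "build", "install")

    def __getitem__(self, idx):
        return self.phases[idx]

    def __len__(self):
        return len(self.phases)

assert list(ToyBuilder()) == ["cmake", "build", "install"]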


@@ -32,6 +32,7 @@
import spack
import spack.binary_distribution as bindist
import spack.builder
import spack.concretize
import spack.config as cfg
import spack.error
@@ -1387,7 +1388,11 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
stage_dir = job_pkg.stage.path
tty.debug(f"stage dir: {stage_dir}")
for file in [job_pkg.log_path, job_pkg.env_mods_path, *job_pkg.builder.archive_files]:
for file in [
job_pkg.log_path,
job_pkg.env_mods_path,
*spack.builder.create(job_pkg).archive_files,
]:
copy_files_to_artifacts(file, job_log_dir)


@@ -9,7 +9,7 @@
import re
import sys
from collections import Counter
from typing import List, Union
from typing import List, Optional, Union
import llnl.string
import llnl.util.tty as tty
@@ -33,6 +33,8 @@
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
from ..enums import InstallRecordStatus
# cmd has a submodule called "list" so preserve the python list module
python_list = list
@@ -167,9 +169,7 @@ def quote_kvp(string: str) -> str:
def parse_specs(
args: Union[str, List[str]],
concretize: bool = False,
tests: spack.concretize.TestsType = False,
args: Union[str, List[str]], concretize: bool = False, tests: bool = False
) -> List[spack.spec.Spec]:
"""Convenience function for parsing arguments from specs. Handles common
exceptions and dies if there are errors.
@@ -181,13 +181,11 @@ def parse_specs(
if not concretize:
return specs
to_concretize: List[spack.concretize.SpecPairInput] = [(s, None) for s in specs]
to_concretize = [(s, None) for s in specs]
return _concretize_spec_pairs(to_concretize, tests=tests)
def _concretize_spec_pairs(
to_concretize: List[spack.concretize.SpecPairInput], tests: spack.concretize.TestsType = False
) -> List[spack.spec.Spec]:
def _concretize_spec_pairs(to_concretize, tests=False):
"""Helper method that concretizes abstract specs from a list of abstract,concrete pairs.
Any spec with a concrete spec associated with it will concretize to that spec. Any spec
@@ -198,7 +196,7 @@ def _concretize_spec_pairs(
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or abstract.concretized(tests=tests)]
return [concrete or abstract.concretized()]
# Special case if every spec is either concrete or has an abstract hash
if all(
@@ -271,39 +269,48 @@ def matching_specs_from_env(specs):
return _concretize_spec_pairs(spec_pairs + additional_concrete_specs)[: len(spec_pairs)]
def disambiguate_spec(spec, env, local=False, installed=True, first=False):
def disambiguate_spec(
spec: spack.spec.Spec,
env: Optional[ev.Environment],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
"""Given a spec, figure out which installed package it refers to.
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
env (spack.environment.Environment): a spack environment,
if one is active, or None if no environment is active
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
Args:
spec: a spec to disambiguate
env: a spack environment, if one is active, or None if no environment is active
local: do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
"""
hashes = env.all_hashes() if env else None
return disambiguate_spec_from_hashes(spec, hashes, local, installed, first)
def disambiguate_spec_from_hashes(spec, hashes, local=False, installed=True, first=False):
def disambiguate_spec_from_hashes(
spec: spack.spec.Spec,
hashes: List[str],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
"""Given a spec and a list of hashes, get concrete spec the spec refers to.
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
hashes (typing.Iterable): a set of hashes of specs among which to disambiguate
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
spec: a spec to disambiguate
hashes: a set of hashes of specs among which to disambiguate
local: if True, do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
"""
if local:
matching_specs = spack.store.STORE.db.query_local(spec, hashes=hashes, installed=installed)
else:
matching_specs = spack.store.STORE.db.query(spec, hashes=hashes, installed=installed)
if not matching_specs:
tty.die("Spec '%s' matches no installed packages." % spec)
tty.die(f"Spec '{spec}' matches no installed packages.")
elif first:
return matching_specs[0]


@@ -29,7 +29,7 @@
# Tarball to be downloaded if binary packages are requested in a local mirror
BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.6/bootstrap-buildcache.tar.gz"
BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.4/bootstrap-buildcache.tar.gz"
#: Subdirectory where to create the mirror
LOCAL_MIRROR_DIR = "bootstrap_cache"
@@ -51,9 +51,9 @@
},
}
CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/clingo.json"
GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/gnupg.json"
PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/patchelf.json"
CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/clingo.json"
GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/gnupg.json"
PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/patchelf.json"
# Metadata for a generated source mirror
SOURCE_METADATA = {


@@ -34,6 +34,8 @@
from spack.cmd.common import arguments
from spack.spec import Spec, save_dependency_specfiles
from ..enums import InstallRecordStatus
description = "create, download and install binary packages"
section = "packaging"
level = "long"
@@ -308,7 +310,10 @@ def setup_parser(subparser: argparse.ArgumentParser):
def _matching_specs(specs: List[Spec]) -> List[Spec]:
"""Disambiguate specs and return a list of matching specs"""
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
return [
spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=InstallRecordStatus.ANY)
for s in specs
]
def _format_spec(spec: Spec) -> str:


@@ -528,7 +528,6 @@ def __call__(self, parser, namespace, values, option_string):
# the const from the constructor or a value from the CLI.
# Note that this is only called if the argument is actually
# specified on the command line.
spack.config.CONFIG.ensure_scope_ordering()
spack.config.set(self.config_path, self.const, scope="command_line")


@@ -23,9 +23,10 @@
import spack.installer
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from spack.error import SpackError
from ..enums import InstallRecordStatus
description = "replace one package with another via symlinks"
section = "admin"
level = "long"
@@ -95,8 +96,12 @@ def deprecate(parser, args):
if len(specs) != 2:
raise SpackError("spack deprecate requires exactly two specs")
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
deprecate = spack.cmd.disambiguate_spec(specs[0], env, local=True, installed=install_query)
deprecate = spack.cmd.disambiguate_spec(
specs[0],
env,
local=True,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
)
if args.install:
deprecator = specs[1].concretized()

View File

@@ -17,7 +17,8 @@
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "list and search installed packages"
section = "basic"
@@ -137,21 +138,22 @@ def setup_parser(subparser):
subparser.add_argument(
"--loaded", action="store_true", help="show only packages loaded in the user environment"
)
subparser.add_argument(
only_missing_or_deprecated = subparser.add_mutually_exclusive_group()
only_missing_or_deprecated.add_argument(
"-M",
"--only-missing",
action="store_true",
dest="only_missing",
help="show only missing dependencies",
)
only_missing_or_deprecated.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--deprecated",
action="store_true",
help="show deprecated packages as well as installed specs",
)
subparser.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--install-tree",
action="store",
@@ -165,14 +167,23 @@ def setup_parser(subparser):
def query_arguments(args):
# Set up query arguments.
installed = []
if not (args.only_missing or args.only_deprecated):
installed.append(InstallStatuses.INSTALLED)
if (args.deprecated or args.only_deprecated) and not args.only_missing:
installed.append(InstallStatuses.DEPRECATED)
if (args.missing or args.only_missing) and not args.only_deprecated:
installed.append(InstallStatuses.MISSING)
if args.only_missing and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-missing with --deprecated, or --missing")
if args.only_deprecated and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-deprecated with --deprecated, or --missing")
installed = InstallRecordStatus.INSTALLED
if args.only_missing:
installed = InstallRecordStatus.MISSING
elif args.only_deprecated:
installed = InstallRecordStatus.DEPRECATED
if args.missing:
installed |= InstallRecordStatus.MISSING
if args.deprecated:
installed |= InstallRecordStatus.DEPRECATED
predicate_fn = None
if args.unknown:
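
The rewritten logic collapses the old status list into a single flag value; a sketch that mirrors the mapping above (assuming the InstallRecordStatus flag defined in lib/spack/spack/enums.py later in this diff):

    from spack.enums import InstallRecordStatus

    def statuses(missing=False, deprecated=False, only_missing=False, only_deprecated=False):
        # mirrors query_arguments(); keyword names are illustrative
        installed = InstallRecordStatus.INSTALLED
        if only_missing:
            installed = InstallRecordStatus.MISSING
        elif only_deprecated:
            installed = InstallRecordStatus.DEPRECATED
        if missing:
            installed |= InstallRecordStatus.MISSING
        if deprecated:
            installed |= InstallRecordStatus.DEPRECATED
        return installed

    assert statuses() == InstallRecordStatus.INSTALLED
    assert statuses(deprecated=True) == (
        InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED
    )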

View File

@@ -78,8 +78,8 @@
boxlib @B{dim=2} boxlib built for 2 dimensions
libdwarf @g{%intel} ^libelf@g{%gcc}
libdwarf, built with intel compiler, linked to libelf built with gcc
mvapich2 @g{%pgi} @B{fabrics=psm,mrail,sock}
mvapich2, built with pgi compiler, with support for multiple fabrics
mvapich2 @g{%gcc} @B{fabrics=psm,mrail,sock}
mvapich2, built with gcc compiler, with support for multiple fabrics
"""

View File

@@ -11,6 +11,7 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.builder
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
@@ -202,11 +203,13 @@ def print_namespace(pkg, args):
def print_phases(pkg, args):
"""output installation phases"""
if hasattr(pkg.builder, "phases") and pkg.builder.phases:
builder = spack.builder.create(pkg)
if hasattr(builder, "phases") and builder.phases:
color.cprint("")
color.cprint(section_title("Installation Phases:"))
phase_str = ""
for phase in pkg.builder.phases:
for phase in builder.phases:
phase_str += " {0}".format(phase)
color.cprint(phase_str)

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import datetime
import os
import re
from collections import defaultdict
@@ -96,7 +97,7 @@ def list_files(args):
OLD_LICENSE, SPDX_MISMATCH, GENERAL_MISMATCH = range(1, 4)
#: Latest year that copyright applies. UPDATE THIS when bumping copyright.
latest_year = 2024 # year of 0.23 release
latest_year = datetime.date.today().year
strict_date = r"Copyright 2013-%s" % latest_year
#: regexes for valid license lines at tops of files

View File

@@ -10,7 +10,8 @@
import spack.cmd
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "mark packages as explicitly or implicitly installed"
section = "admin"
@@ -67,8 +68,7 @@ def find_matching_specs(specs, allow_multiple_matches=False):
has_errors = False
for spec in specs:
install_query = [InstallStatuses.INSTALLED]
matching = spack.store.STORE.db.query_local(spec, installed=install_query)
matching = spack.store.STORE.db.query_local(spec, installed=InstallRecordStatus.INSTALLED)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't
if not allow_multiple_matches and len(matching) > 1:

View File

@@ -8,6 +8,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.lmod

View File

@@ -7,6 +7,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.tcl

View File

@@ -3,18 +3,21 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import ast
import os
import re
import sys
from itertools import zip_longest
from typing import Dict, List, Optional
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.filesystem import working_dir
import spack.paths
import spack.repo
import spack.util.git
from spack.util.executable import which
from spack.util.executable import Executable, which
description = "runs source code style checks on spack"
section = "developer"
@@ -36,10 +39,7 @@ def grouper(iterable, n, fillvalue=None):
#: double-check the results of other tools (if, e.g., --fix was provided)
#: The list maps an executable name to a method to ensure the tool is
#: bootstrapped or present in the environment.
tool_names = ["isort", "black", "flake8", "mypy"]
#: tools we run in spack style
tools = {}
tool_names = ["import", "isort", "black", "flake8", "mypy"]
#: warnings to ignore in mypy
mypy_ignores = [
@@ -61,14 +61,28 @@ def is_package(f):
#: decorator for adding tools to the list
class tool:
def __init__(self, name, required=False):
def __init__(self, name: str, required: bool = False, external: bool = True) -> None:
self.name = name
self.external = external
self.required = required
def __call__(self, fun):
tools[self.name] = (fun, self.required)
self.fun = fun
tools[self.name] = self
return fun
@property
def installed(self) -> bool:
return bool(which(self.name)) if self.external else True
@property
def executable(self) -> Optional[Executable]:
return which(self.name) if self.external else None
#: tools we run in spack style
tools: Dict[str, tool] = {}
def changed_files(base="develop", untracked=True, all_files=False, root=None):
"""Get list of changed files in the Spack repository.
@@ -176,22 +190,22 @@ def setup_parser(subparser):
"-t",
"--tool",
action="append",
help="specify which tools to run (default: %s)" % ",".join(tool_names),
help="specify which tools to run (default: %s)" % ", ".join(tool_names),
)
tool_group.add_argument(
"-s",
"--skip",
metavar="TOOL",
action="append",
help="specify tools to skip (choose from %s)" % ",".join(tool_names),
help="specify tools to skip (choose from %s)" % ", ".join(tool_names),
)
subparser.add_argument("files", nargs=argparse.REMAINDER, help="specific files to check")
def cwd_relative(path, args):
def cwd_relative(path, root, initial_working_dir):
"""Translate prefix-relative path to current working directory-relative."""
return os.path.relpath(os.path.join(args.root, path), args.initial_working_dir)
return os.path.relpath(os.path.join(root, path), initial_working_dir)
def rewrite_and_print_output(
@@ -201,7 +215,10 @@ def rewrite_and_print_output(
# print results relative to current working directory
def translate(match):
return replacement.format(cwd_relative(match.group(1), args), *list(match.groups()[1:]))
return replacement.format(
cwd_relative(match.group(1), args.root, args.initial_working_dir),
*list(match.groups()[1:]),
)
for line in output.split("\n"):
if not line:
@@ -220,7 +237,7 @@ def print_style_header(file_list, args, tools_to_run):
# translate modified paths to cwd_relative if needed
paths = [filename.strip() for filename in file_list]
if not args.root_relative:
paths = [cwd_relative(filename, args) for filename in paths]
paths = [cwd_relative(filename, args.root, args.initial_working_dir) for filename in paths]
tty.msg("Modified files", *paths)
sys.stdout.flush()
@@ -352,17 +369,137 @@ def run_black(black_cmd, file_list, args):
return returncode
def _module_part(root: str, expr: str):
parts = expr.split(".")
# spack.pkg is for repositories, don't try to resolve it here.
if ".".join(parts[:2]) == spack.repo.ROOT_PYTHON_NAMESPACE:
return None
while parts:
f1 = os.path.join(root, "lib", "spack", *parts) + ".py"
f2 = os.path.join(root, "lib", "spack", *parts, "__init__.py")
if (
os.path.exists(f1)
# ensure case sensitive match
and f"{parts[-1]}.py" in os.listdir(os.path.dirname(f1))
or os.path.exists(f2)
):
return ".".join(parts)
parts.pop()
return None
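# Illustration (hypothetical call): _module_part(spack.paths.prefix, "spack.util.executable.which")
# returns "spack.util.executable", the longest dotted prefix that resolves to a module
# file under lib/spack.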
def _run_import_check(
file_list: List[str],
*,
fix: bool,
root_relative: bool,
root=spack.paths.prefix,
working_dir=spack.paths.prefix,
out=sys.stdout,
):
if sys.version_info < (3, 9):
print("import check requires Python 3.9 or later")
return 0
is_use = re.compile(r"(?<!from )(?<!import )(?:llnl|spack)\.[a-zA-Z0-9_\.]+")
# redundant imports followed by a `# comment` are ignored, because there can be legitimate
# reasons to import a module: to execute module-scope init code, or to deal with circular imports.
is_abs_import = re.compile(r"^import ((?:llnl|spack)\.[a-zA-Z0-9_\.]+)$", re.MULTILINE)
exit_code = 0
for file in file_list:
to_add = set()
to_remove = []
pretty_path = file if root_relative else cwd_relative(file, root, working_dir)
try:
with open(file, "r") as f:
contents = f.read()
parsed = ast.parse(contents)
except Exception:
exit_code = 1
print(f"{pretty_path}: could not parse", file=out)
continue
for m in is_abs_import.finditer(contents):
if contents.count(m.group(1)) == 1:
to_remove.append(m.group(0))
exit_code = 1
print(f"{pretty_path}: redundant import: {m.group(1)}", file=out)
# Clear all strings to avoid matching comments/strings etc.
for node in ast.walk(parsed):
if isinstance(node, ast.Constant) and isinstance(node.value, str):
node.value = ""
filtered_contents = ast.unparse(parsed) # novermin
for m in is_use.finditer(filtered_contents):
module = _module_part(root, m.group(0))
if not module or module in to_add:
continue
if re.search(rf"import {re.escape(module)}\b(?!\.)", contents):
continue
to_add.add(module)
exit_code = 1
print(f"{pretty_path}: missing import: {module} ({m.group(0)})", file=out)
if not fix or not to_add and not to_remove:
continue
with open(file, "r") as f:
lines = f.readlines()
if to_add:
# insert missing imports before the first import, delegate ordering to isort
for node in parsed.body:
if isinstance(node, (ast.Import, ast.ImportFrom)):
first_line = node.lineno
break
else:
print(f"{pretty_path}: could not fix", file=out)
continue
lines.insert(first_line, "\n".join(f"import {x}" for x in to_add) + "\n")
new_contents = "".join(lines)
# remove redundant imports
for statement in to_remove:
new_contents = new_contents.replace(f"{statement}\n", "")
with open(file, "w") as f:
f.write(new_contents)
return exit_code
@tool("import", external=False)
def run_import_check(import_check_cmd, file_list, args):
exit_code = _run_import_check(
file_list,
fix=args.fix,
root_relative=args.root_relative,
root=args.root,
working_dir=args.initial_working_dir,
)
print_tool_result("import", exit_code)
return exit_code
def validate_toolset(arg_value):
"""Validate --tool and --skip arguments (sets of optionally comma-separated tools)."""
tools = set(",".join(arg_value).split(",")) # allow args like 'isort,flake8'
for tool in tools:
if tool not in tool_names:
tty.die("Invaild tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
tty.die("Invalid tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
return tools
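# e.g. validate_toolset(["isort,black", "mypy"]) -> {"isort", "black", "mypy"} (illustrative)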
def missing_tools(tools_to_run):
return [t for t in tools_to_run if which(t) is None]
def missing_tools(tools_to_run: List[str]) -> List[str]:
return [t for t in tools_to_run if not tools[t].installed]
def _bootstrap_dev_dependencies():
@@ -417,9 +554,9 @@ def prefix_relative(path):
print_style_header(file_list, args, tools_to_run)
for tool_name in tools_to_run:
run_function, required = tools[tool_name]
tool = tools[tool_name]
print_tool_header(tool_name)
return_code |= run_function(which(tool_name), file_list, args)
return_code |= tool.fun(tool.executable, file_list, args)
if return_code == 0:
tty.msg(color.colorize("@*{spack style checks were clean}"))
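
Taken together, the new check can be exercised on its own; a sketch assuming _run_import_check keeps the keyword-only signature shown above (the file path is hypothetical, and the import path is inferred from the file's location):

    import io

    from spack.cmd.style import _run_import_check

    out = io.StringIO()
    exit_code = _run_import_check(
        ["lib/spack/spack/cmd/style.py"],  # hypothetical file list
        fix=False,
        root_relative=True,
        out=out,
    )
    # one "missing import" / "redundant import" line per finding
    print(exit_code, out.getvalue())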

View File

@@ -17,7 +17,8 @@
import spack.store
import spack.traverse as traverse
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "remove installed packages"
section = "build"
@@ -99,12 +100,14 @@ def find_matching_specs(
hashes = env.all_hashes() if env else None
# List of specs that match expressions given via command line
specs_from_cli: List["spack.spec.Spec"] = []
specs_from_cli: List[spack.spec.Spec] = []
has_errors = False
for spec in specs:
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
matching = spack.store.STORE.db.query_local(
spec, hashes=hashes, installed=install_query, origin=origin
spec,
hashes=hashes,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
origin=origin,
)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't

View File

@@ -16,7 +16,6 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf_r", os.path.join("xl_r", "xlf_r")),
("xlf", os.path.join("xl", "xlf")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]
@@ -25,7 +24,6 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf90_r", os.path.join("xl_r", "xlf90_r")),
("xlf90", os.path.join("xl", "xlf90")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]

View File

@@ -124,8 +124,8 @@ def setup_custom_environment(self, pkg, env):
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")

View File

@@ -155,10 +155,10 @@ def setup_custom_environment(self, pkg, env):
# icx+icpx+ifx or icx+icpx+ifort. But to be on the safe side (some users may
# want to try to swap icpx against icpc, for example), and since the Intel LLVM
# compilers accept these diag-disable flags, we apply them for all compilers.
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
# 2024 release bumped the libsycl version because of an ABI

View File

@@ -1,77 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
class Pgi(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("pgi", "pgcc"),
"cxx": os.path.join("pgi", "pgc++"),
"f77": os.path.join("pgi", "pgfortran"),
"fc": os.path.join("pgi", "pgfortran"),
}
version_argument = "-V"
ignore_version_errors = [2] # `pgcc -V` on PowerPC annoyingly returns 2
version_regex = r"pg[^ ]* ([0-9.]+)-[0-9]+ (LLVM )?[^ ]+ target on "
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return ["-g", "-gopt"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-O4"]
@property
def openmp_flag(self):
return "-mp"
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cc_pic_flag(self):
return "-fpic"
@property
def cxx_pic_flag(self):
return "-fpic"
@property
def f77_pic_flag(self):
return "-fpic"
@property
def fc_pic_flag(self):
return "-fpic"
required_libs = ["libpgc", "libpgf90"]
@property
def c99_flag(self):
if self.real_version >= Version("12.10"):
return "-c99"
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag", "< 12.10")
@property
def c11_flag(self):
if self.real_version >= Version("15.3"):
return "-c11"
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 15.3")
@property
def stdcxx_libs(self):
return ("-pgc++libs",)

View File

@@ -6,7 +6,7 @@
import sys
import time
from contextlib import contextmanager
from typing import Iterable, List, Optional, Sequence, Tuple, Union
from typing import Iterable, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
@@ -36,7 +36,6 @@ def enable_compiler_existence_check():
CHECK_COMPILER_EXISTENCE = saved
SpecPairInput = Tuple[Spec, Optional[Spec]]
SpecPair = Tuple[Spec, Spec]
SpecLike = Union[Spec, str]
TestsType = Union[bool, Iterable[str]]
@@ -61,8 +60,8 @@ def concretize_specs_together(
def concretize_together(
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together.
Args:
@@ -78,8 +77,8 @@ def concretize_together(
def concretize_together_when_possible(
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together to the extent possible.
See documentation for ``unify: when_possible`` concretization for the precise definition of
@@ -115,8 +114,8 @@ def concretize_together_when_possible(
def concretize_separately(
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Concretizes the input specs separately from each other.
Args:
@@ -161,6 +160,11 @@ def concretize_separately(
# TODO: support parallel concretization on macOS and Windows
num_procs = min(len(args), spack.config.determine_number_of_jobs(parallel=True))
msg = "Starting concretization"
if sys.platform not in ("darwin", "win32") and num_procs > 1:
msg += f" pool with {num_procs} processes"
tty.msg(msg)
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1

View File

@@ -431,19 +431,6 @@ def ensure_unwrapped(self) -> "Configuration":
"""Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
return self
def highest(self) -> ConfigScope:
"""Scope with highest precedence"""
return next(reversed(self.scopes.values())) # type: ignore
@_config_mutator
def ensure_scope_ordering(self):
"""Ensure that scope order matches documented precedent"""
# FIXME: We also need to consider that custom configurations and other orderings
# may not be preserved correctly
if "command_line" in self.scopes:
# TODO (when dropping python 3.6): self.scopes.move_to_end
self.scopes["command_line"] = self.remove_scope("command_line")
@_config_mutator
def push_scope(self, scope: ConfigScope) -> None:
"""Add a higher precedence scope to the Configuration."""

View File

@@ -69,6 +69,8 @@
from spack.error import SpackError
from spack.util.crypto import bit_length
from .enums import InstallRecordStatus
# TODO: Provide an API for automatically retrying a build after detecting and
# TODO: clearing a failure.
@@ -160,36 +162,12 @@ def converter(self, spec_like, *args, **kwargs):
return converter
class InstallStatus(str):
pass
class InstallStatuses:
INSTALLED = InstallStatus("installed")
DEPRECATED = InstallStatus("deprecated")
MISSING = InstallStatus("missing")
@classmethod
def canonicalize(cls, query_arg):
if query_arg is True:
return [cls.INSTALLED]
if query_arg is False:
return [cls.MISSING]
if query_arg is any:
return [cls.INSTALLED, cls.DEPRECATED, cls.MISSING]
if isinstance(query_arg, InstallStatus):
return [query_arg]
try:
statuses = list(query_arg)
if all(isinstance(x, InstallStatus) for x in statuses):
return statuses
except TypeError:
pass
raise TypeError(
"installation query must be `any`, boolean, "
"InstallStatus, or iterable of InstallStatus"
)
def normalize_query(installed: Union[bool, InstallRecordStatus]) -> InstallRecordStatus:
if installed is True:
installed = InstallRecordStatus.INSTALLED
elif installed is False:
installed = InstallRecordStatus.MISSING
return installed
class InstallRecord:
@@ -227,8 +205,8 @@ def __init__(
installation_time: Optional[float] = None,
deprecated_for: Optional[str] = None,
in_buildcache: bool = False,
origin=None,
):
origin: Optional[str] = None,
) -> None:
self.spec = spec
self.path = str(path) if path else None
self.installed = bool(installed)
@@ -239,14 +217,12 @@ def __init__(
self.in_buildcache = in_buildcache
self.origin = origin
def install_type_matches(self, installed):
installed = InstallStatuses.canonicalize(installed)
def install_type_matches(self, installed: InstallRecordStatus) -> bool:
if self.installed:
return InstallStatuses.INSTALLED in installed
return InstallRecordStatus.INSTALLED in installed
elif self.deprecated_for:
return InstallStatuses.DEPRECATED in installed
else:
return InstallStatuses.MISSING in installed
return InstallRecordStatus.DEPRECATED in installed
return InstallRecordStatus.MISSING in installed
def to_dict(self, include_fields=DEFAULT_INSTALL_RECORD_FIELDS):
rec_dict = {}
@@ -1396,7 +1372,13 @@ def installed_extensions_for(self, extendee_spec: "spack.spec.Spec"):
if spec.package.extends(extendee_spec):
yield spec.package
def _get_by_hash_local(self, dag_hash, default=None, installed=any):
def _get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
installed = normalize_query(installed)
# hash is a full hash and is in the data somewhere
if dag_hash in self._data:
rec = self._data[dag_hash]
@@ -1405,8 +1387,7 @@ def _get_by_hash_local(self, dag_hash, default=None, installed=any):
else:
return default
# check if hash is a prefix of some installed (or previously
# installed) spec.
# check if hash is a prefix of some installed (or previously installed) spec.
matches = [
record.spec
for h, record in self._data.items()
@@ -1418,52 +1399,43 @@ def _get_by_hash_local(self, dag_hash, default=None, installed=any):
# nothing found
return default
def get_by_hash_local(self, dag_hash, default=None, installed=any):
def get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
"""Look up a spec in *this DB* by DAG hash, or by a DAG hash prefix.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed
specs in the search; if ``False`` only missing specs, and if
``any``, all specs in database. If an InstallStatus or iterable
of InstallStatus, returns specs whose install status
(installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an InstallRecordStatus flag.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
"""
with self.read_transaction():
return self._get_by_hash_local(dag_hash, default=default, installed=installed)
def get_by_hash(self, dag_hash, default=None, installed=any):
def get_by_hash(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
"""Look up a spec by DAG hash, or by a DAG hash prefix.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed specs in the search; if ``False``
only missing specs, and if ``any``, all specs in database. If an
InstallStatus or iterable of InstallStatus, returns specs whose install
status (installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an InstallRecordStatus flag.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
"""
@@ -1483,7 +1455,7 @@ def _query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1491,6 +1463,7 @@ def _query(
in_buildcache: Optional[bool] = None,
origin: Optional[str] = None,
) -> List["spack.spec.Spec"]:
installed = normalize_query(installed)
# Restrict the set of records over which we iterate first
matching_hashes = self._data
@@ -1560,7 +1533,7 @@ def query_local(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1620,7 +1593,7 @@ def query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1628,7 +1601,7 @@ def query(
hashes: Optional[List[str]] = None,
origin: Optional[str] = None,
install_tree: str = "all",
):
) -> List["spack.spec.Spec"]:
"""Queries the Spack database including all upstream databases.
Args:
@@ -1709,13 +1682,14 @@ def query(
)
results = list(local_results) + list(x for x in upstream_results if x not in local_results)
return sorted(results)
results.sort()
return results
def query_one(
self,
query_spec: Optional[Union[str, "spack.spec.Spec"]],
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
) -> Optional["spack.spec.Spec"]:
"""Query for exactly one spec that matches the query spec.

View File

@@ -34,12 +34,12 @@ class OpenMpi(Package):
import collections.abc
import os.path
import re
from typing import TYPE_CHECKING, Any, Callable, List, Optional, Tuple, Union
from typing import Any, Callable, List, Optional, Tuple, Union
import llnl.util.lang
import llnl.util.tty.color
import spack.deptypes as dt
import spack.package_base
import spack.patch
import spack.spec
import spack.util.crypto
@@ -56,13 +56,8 @@ class OpenMpi(Package):
VersionLookupError,
)
if TYPE_CHECKING:
import spack.package_base
__all__ = [
"DirectiveError",
"DirectiveMeta",
"DisableRedistribute",
"version",
"conditional",
"conflicts",
@@ -85,15 +80,15 @@ class OpenMpi(Package):
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union["spack.spec.Spec", str, bool]]
Patcher = Callable[[Union["spack.package_base.PackageBase", Dependency]], None]
WhenType = Optional[Union[spack.spec.Spec, str, bool]]
Patcher = Callable[[Union[spack.package_base.PackageBase, Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
def _make_when_spec(value: WhenType) -> Optional[spack.spec.Spec]:
"""Create a ``Spec`` that indicates when a directive should be applied.
Directives with ``when`` specs, e.g.:
@@ -138,7 +133,7 @@ def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
return spack.spec.Spec(value)
SubmoduleCallback = Callable[["spack.package_base.PackageBase"], Union[str, List[str], bool]]
SubmoduleCallback = Callable[[spack.package_base.PackageBase], Union[str, List[str], bool]]
directive = DirectiveMeta.directive
@@ -254,8 +249,8 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: "spack.package_base.PackageBase",
spec: "spack.spec.Spec",
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
@@ -334,7 +329,7 @@ def conflicts(conflict_spec: SpecType, when: WhenType = None, msg: Optional[str]
msg (str): optional user defined message
"""
def _execute_conflicts(pkg: "spack.package_base.PackageBase"):
def _execute_conflicts(pkg: spack.package_base.PackageBase):
# If when is not specified the conflict always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -375,19 +370,12 @@ def depends_on(
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
def _execute_depends_on(pkg: spack.package_base.PackageBase):
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
"""Can be used inside a Package definition to declare that
@@ -404,7 +392,7 @@ def redistribute(source=None, binary=None, when: WhenType = None):
def _execute_redistribute(
pkg: "spack.package_base.PackageBase", source=None, binary=None, when: WhenType = None
pkg: spack.package_base.PackageBase, source=None, binary=None, when: WhenType = None
):
if source is None and binary is None:
return
@@ -434,7 +422,7 @@ def _execute_redistribute(
if not binary:
disable.binary = True
else:
pkg.disable_redistribute[when_spec] = DisableRedistribute(
pkg.disable_redistribute[when_spec] = spack.package_base.DisableRedistribute(
source=not source, binary=not binary
)
@@ -480,7 +468,7 @@ def provides(*specs: SpecType, when: WhenType = None):
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg: "spack.package_base.PackageBase"):
def _execute_provides(pkg: spack.package_base.PackageBase):
import spack.parser # Avoid circular dependency
when_spec = _make_when_spec(when)
@@ -528,7 +516,7 @@ def can_splice(
variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: "spack.package_base.PackageBase"):
def _execute_can_splice(pkg: spack.package_base.PackageBase):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
@@ -569,7 +557,7 @@ def patch(
compressed URL patches)
"""
def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependency]):
def _execute_patch(pkg_or_dep: Union[spack.package_base.PackageBase, Dependency]):
pkg = pkg_or_dep
if isinstance(pkg, Dependency):
pkg = pkg.pkg
@@ -893,7 +881,7 @@ def requires(*requirement_specs: str, policy="one_of", when=None, msg=None):
msg: optional user defined message
"""
def _execute_requires(pkg: "spack.package_base.PackageBase"):
def _execute_requires(pkg: spack.package_base.PackageBase):
if policy not in ("one_of", "any_of"):
err_msg = (
f"the 'policy' argument of the 'requires' directive in {pkg.name} is set "
@@ -918,7 +906,7 @@ def _execute_requires(pkg: "spack.package_base.PackageBase"):
def _language(lang_spec_str: str, *, when: Optional[Union[str, bool]] = None):
"""Temporary implementation of language virtuals, until compilers are proper dependencies."""
def _execute_languages(pkg: "spack.package_base.PackageBase"):
def _execute_languages(pkg: spack.package_base.PackageBase):
when_spec = _make_when_spec(when)
if not when_spec:
return
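
For context, a minimal hypothetical package using the directive this implements (assuming redistribute is exported to package scope like the other directives):

    from spack.package import *

    class MyProprietaryLib(Package):  # hypothetical package
        """Example: sources may be mirrored, binaries may not."""

        redistribute(binary=False)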

lib/spack/spack/enums.py (new file, 15 lines)
View File

@@ -0,0 +1,15 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Enumerations used throughout Spack"""
import enum
class InstallRecordStatus(enum.Flag):
"""Enum flag to facilitate querying status from the DB"""
INSTALLED = enum.auto()
DEPRECATED = enum.auto()
MISSING = enum.auto()
ANY = INSTALLED | DEPRECATED | MISSING
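
Because this is an enum.Flag, statuses compose with ``|`` and support membership tests, which is what the query code above relies on; a small illustration:

    from spack.enums import InstallRecordStatus

    query = InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED
    assert InstallRecordStatus.DEPRECATED in query
    assert InstallRecordStatus.MISSING not in query
    assert (query | InstallRecordStatus.MISSING) == InstallRecordStatus.ANY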

View File

@@ -3044,13 +3044,11 @@ def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.push_scope(scope)
spack.config.CONFIG.ensure_scope_ordering()
def deactivate_config_scope(self) -> None:
"""Remove any of the manifest's scopes from the global config path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.remove_scope(scope.name)
spack.config.CONFIG.ensure_scope_ordering()
@contextlib.contextmanager
def use_config(self):

View File

@@ -325,12 +325,7 @@ def write(self, spec, color=None, out=None):
self._out = llnl.util.tty.color.ColorStream(out, color=color)
# We'll traverse the spec in topological order as we graph it.
nodes_in_topological_order = [
edge.spec
for edge in spack.traverse.traverse_edges_topo(
[spec], direction="children", deptype=self.depflag
)
]
nodes_in_topological_order = list(spec.traverse(order="topo", deptype=self.depflag))
nodes_in_topological_order.reverse()
# Work on a copy to be nondestructive

View File

@@ -23,7 +23,6 @@
from llnl.util.tty.color import colorize
import spack.build_environment
import spack.builder
import spack.config
import spack.error
import spack.package_base
@@ -353,9 +352,7 @@ def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
self.test_parts[part_name] = status
self.counts[status] += 1
def phase_tests(
self, builder: spack.builder.Builder, phase_name: str, method_names: List[str]
):
def phase_tests(self, builder, phase_name: str, method_names: List[str]):
"""Execute the builder's package phase-time tests.
Args:
@@ -378,23 +375,16 @@ def phase_tests(
for name in method_names:
try:
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(builder.pkg, name, getattr(builder, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
print_message(logger, msg, verbose)
fn()
fn = getattr(builder, name, None) or getattr(builder.pkg, name)
except AttributeError as e:
msg = f"RUN-TESTS: method not implemented [{name}]"
print_message(logger, msg, verbose)
self.add_failure(e, msg)
print_message(logger, f"RUN-TESTS: method not implemented [{name}]", verbose)
self.add_failure(e, f"RUN-TESTS: method not implemented [{name}]")
if fail_fast:
break
continue
print_message(logger, f"RUN-TESTS: {phase_name}-time tests [{name}]", verbose)
fn()
if have_tests:
print_message(logger, "Completed testing", verbose)
@@ -764,7 +754,7 @@ def virtuals(pkg):
# hack for compilers that are not dependencies (yet)
# TODO: this all eventually goes away
c_names = ("gcc", "intel", "intel-parallel-studio", "pgi")
c_names = ("gcc", "intel", "intel-parallel-studio")
if pkg.name in c_names:
v_names.extend(["c", "cxx", "fortran"])
if pkg.spec.satisfies("llvm+clang"):

View File

@@ -50,6 +50,7 @@
import spack.binary_distribution as binary_distribution
import spack.build_environment
import spack.builder
import spack.config
import spack.database
import spack.deptypes as dt
@@ -212,7 +213,7 @@ def _check_last_phase(pkg: "spack.package_base.PackageBase") -> None:
Raises:
``BadInstallPhase`` if stop_before or last phase is invalid
"""
phases = pkg.builder.phases # type: ignore[attr-defined]
phases = spack.builder.create(pkg).phases # type: ignore[attr-defined]
if pkg.stop_before_phase and pkg.stop_before_phase not in phases: # type: ignore[attr-defined]
raise BadInstallPhase(pkg.name, pkg.stop_before_phase) # type: ignore[attr-defined]
@@ -661,7 +662,7 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
spack.store.STORE.layout.metadata_path(pkg.spec), "archived-files"
)
for glob_expr in pkg.builder.archive_files:
for glob_expr in spack.builder.create(pkg).archive_files:
# Check that we are trying to copy things that are
# in the stage tree (not arbitrary files)
abs_expr = os.path.realpath(glob_expr)
@@ -2394,7 +2395,6 @@ def _install_source(self) -> None:
fs.install_tree(pkg.stage.source_path, src_target)
def _real_install(self) -> None:
import spack.builder
pkg = self.pkg

View File

@@ -29,6 +29,7 @@
import spack.config
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.oci.image
import spack.repo
import spack.spec

View File

@@ -10,7 +10,7 @@
import llnl.util.filesystem
import spack.builder
import spack.phase_callbacks
def filter_compiler_wrappers(*files, **kwargs):
@@ -111,4 +111,4 @@ def _filter_compiler_wrappers_impl(pkg_or_builder):
if pkg.compiler.name == "nag":
x.filter("-Wl,--enable-new-dtags", "", **filter_kwargs)
spack.builder.run_after(after)(_filter_compiler_wrappers_impl)
spack.phase_callbacks.run_after(after)(_filter_compiler_wrappers_impl)

View File

@@ -39,7 +39,7 @@
import llnl.util.filesystem
import llnl.util.tty as tty
from llnl.util.lang import dedupe, memoized
from llnl.util.lang import Singleton, dedupe, memoized
import spack.build_environment
import spack.config
@@ -246,7 +246,7 @@ def _generate_upstream_module_index():
return UpstreamModuleIndex(spack.store.STORE.db, module_indices)
upstream_module_index = llnl.util.lang.Singleton(_generate_upstream_module_index)
upstream_module_index = Singleton(_generate_upstream_module_index)
ModuleIndexEntry = collections.namedtuple("ModuleIndexEntry", ["path", "use_name"])

View File

@@ -397,7 +397,6 @@ def create_opener():
"""Create an opener that can handle OCI authentication."""
opener = urllib.request.OpenerDirector()
for handler in [
urllib.request.ProxyHandler(),
urllib.request.UnknownHandler(),
urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()),
spack.util.web.SpackHTTPDefaultErrorHandler(),

View File

@@ -11,7 +11,7 @@
from os import chdir, environ, getcwd, makedirs, mkdir, remove, removedirs
from shutil import move, rmtree
from spack.error import InstallError
from spack.error import InstallError, NoHeadersError, NoLibrariesError
# Emulate some shell commands for convenience
env = environ
@@ -74,7 +74,7 @@
from spack.build_systems.sourceware import SourcewarePackage
from spack.build_systems.waf import WafPackage
from spack.build_systems.xorg import XorgPackage
from spack.builder import run_after, run_before
from spack.builder import BaseBuilder
from spack.config import determine_number_of_jobs
from spack.deptypes import ALL_TYPES as all_deptypes
from spack.directives import *
@@ -100,6 +100,7 @@
on_package_attributes,
)
from spack.package_completions import *
from spack.phase_callbacks import run_after, run_before
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.util.filesystem import file_command, fix_darwin_install_name, mime_type

View File

@@ -32,18 +32,18 @@
from llnl.util.lang import classproperty, memoized
from llnl.util.link_tree import LinkTree
import spack.builder
import spack.compilers
import spack.config
import spack.dependency
import spack.deptypes as dt
import spack.directives
import spack.directives_meta
import spack.error
import spack.fetch_strategy as fs
import spack.hooks
import spack.mirror
import spack.multimethod
import spack.patch
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.store
@@ -51,9 +51,9 @@
import spack.util.environment
import spack.util.path
import spack.util.web
import spack.variant
from spack.error import InstallError, NoURLError, PackageError
from spack.filesystem_view import YamlFilesystemView
from spack.install_test import PackageTest, TestSuite
from spack.solver.version_order import concretization_version_order
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.package_hash import package_hash
@@ -299,9 +299,9 @@ def determine_variants(cls, objs, version_str):
class PackageMeta(
spack.builder.PhaseCallbacksMeta,
spack.phase_callbacks.PhaseCallbacksMeta,
DetectablePackageMeta,
spack.directives.DirectiveMeta,
spack.directives_meta.DirectiveMeta,
spack.multimethod.MultiMethodMeta,
):
"""
@@ -453,7 +453,7 @@ def _names(when_indexed_dictionary: WhenDict) -> List[str]:
return sorted(all_names)
WhenVariantList = List[Tuple["spack.spec.Spec", "spack.variant.Variant"]]
WhenVariantList = List[Tuple[spack.spec.Spec, spack.variant.Variant]]
def _remove_overridden_vdefs(variant_defs: WhenVariantList) -> None:
@@ -492,41 +492,14 @@ class Hipblas:
i += 1
class RedistributionMixin:
"""Logic for determining whether a Package is source/binary
redistributable.
"""
#: Store whether a given Spec source/binary should not be
#: redistributed.
disable_redistribute: Dict["spack.spec.Spec", "spack.directives.DisableRedistribute"]
# Source redistribution must be determined before concretization
# (because source mirrors work with un-concretized Specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror.
"""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an
installed instance of this package.
"""
for when_spec, disable_redistribute in self.__class__.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass=PackageMeta):
class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
"""This is the superclass for all spack packages.
***The Package class***
@@ -612,17 +585,20 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
# Declare versions dictionary as placeholder for values.
# This allows analysis tools to correctly interpret the class attributes.
versions: dict
dependencies: Dict["spack.spec.Spec", Dict[str, "spack.dependency.Dependency"]]
conflicts: Dict["spack.spec.Spec", List[Tuple["spack.spec.Spec", Optional[str]]]]
dependencies: Dict[spack.spec.Spec, Dict[str, spack.dependency.Dependency]]
conflicts: Dict[spack.spec.Spec, List[Tuple[spack.spec.Spec, Optional[str]]]]
requirements: Dict[
"spack.spec.Spec", List[Tuple[Tuple["spack.spec.Spec", ...], str, Optional[str]]]
spack.spec.Spec, List[Tuple[Tuple[spack.spec.Spec, ...], str, Optional[str]]]
]
provided: Dict["spack.spec.Spec", Set["spack.spec.Spec"]]
provided_together: Dict["spack.spec.Spec", List[Set[str]]]
patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
variants: Dict["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]
languages: Dict["spack.spec.Spec", Set[str]]
splice_specs: Dict["spack.spec.Spec", Tuple["spack.spec.Spec", Union[None, str, List[str]]]]
provided: Dict[spack.spec.Spec, Set[spack.spec.Spec]]
provided_together: Dict[spack.spec.Spec, List[Set[str]]]
patches: Dict[spack.spec.Spec, List[spack.patch.Patch]]
variants: Dict[spack.spec.Spec, Dict[str, spack.variant.Variant]]
languages: Dict[spack.spec.Spec, Set[str]]
splice_specs: Dict[spack.spec.Spec, Tuple[spack.spec.Spec, Union[None, str, List[str]]]]
#: Store whether a given Spec source/binary should not be redistributed.
disable_redistribute: Dict[spack.spec.Spec, DisableRedistribute]
#: By default, packages are not virtual
#: Virtual packages override this attribute
@@ -737,11 +713,11 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
test_requires_compiler: bool = False
#: TestSuite instance used to manage stand-alone tests for 1+ specs.
test_suite: Optional["TestSuite"] = None
test_suite: Optional[Any] = None
def __init__(self, spec):
# this determines how the package should be built.
self.spec: "spack.spec.Spec" = spec
self.spec: spack.spec.Spec = spec
# Allow custom staging paths for packages
self.path = None
@@ -759,7 +735,7 @@ def __init__(self, spec):
# init internal variables
self._stage: Optional[StageComposite] = None
self._fetcher = None
self._tester: Optional["PackageTest"] = None
self._tester: Optional[Any] = None
# Set up timing variables
self._fetch_time = 0.0
@@ -809,9 +785,7 @@ def variant_definitions(cls, name: str) -> WhenVariantList:
return defs
@classmethod
def variant_items(
cls,
) -> Iterable[Tuple["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]]:
def variant_items(cls) -> Iterable[Tuple[spack.spec.Spec, Dict[str, spack.variant.Variant]]]:
"""Iterate over ``cls.variants.items()`` with overridden definitions removed."""
# Note: This is quadratic in the average number of variant definitions per name.
# That is likely close to linear in practice, as there are few variants with
@@ -829,7 +803,7 @@ def variant_items(
if filtered_variants_by_name:
yield when, filtered_variants_by_name
def get_variant(self, name: str) -> "spack.variant.Variant":
def get_variant(self, name: str) -> spack.variant.Variant:
"""Get the highest precedence variant definition matching this package's spec.
Arguments:
@@ -1004,6 +978,26 @@ def global_license_file(self):
self.global_license_dir, self.name, os.path.basename(self.license_files[0])
)
# Source redistribution must be determined before concretization (because source mirrors work
# with abstract specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror."""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an installed instance of this
package."""
for when_spec, disable_redistribute in self.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
# NOTE: return type should be Optional[Literal['all', 'specific', 'none']] in
# Python 3.8+, but we still support 3.6.
@property
@@ -1016,7 +1010,7 @@ def keep_werror(self) -> Optional[str]:
* ``"none"``: filter out all ``-Werror*`` flags.
* ``None``: respect the user's configuration (``"none"`` by default).
"""
if self.spec.satisfies("%nvhpc@:23.3") or self.spec.satisfies("%pgi"):
if self.spec.satisfies("%nvhpc@:23.3"):
# Filtering works by replacing -Werror with -Wno-error, but older nvhpc and
# PGI do not understand -Wno-error, so we disable filtering.
return "all"
@@ -1353,11 +1347,13 @@ def archive_install_test_log(self):
@property
def tester(self):
import spack.install_test
if not self.spec.versions.concrete:
raise ValueError("Cannot retrieve tester for package without concrete version.")
if not self._tester:
self._tester = PackageTest(self)
self._tester = spack.install_test.PackageTest(self)
return self._tester
@property
@@ -2014,72 +2010,58 @@ def build_system_flags(
"""
return None, None, flags
def setup_run_environment(self, env):
def setup_run_environment(self, env: spack.util.environment.EnvironmentModifications) -> None:
"""Sets up the run environment for a package.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is run. Package authors
env: environment modifications to be applied when the package is run. Package authors
can call methods on it to alter the run environment.
"""
pass
def setup_dependent_run_environment(self, env, dependent_spec):
def setup_dependent_run_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
"""Sets up the run environment of packages that depend on this one.
This is similar to ``setup_run_environment``, but it is used to
modify the run environments of packages that *depend* on this one.
This is similar to ``setup_run_environment``, but it is used to modify the run environment
of a package that *depends* on this one.
This gives packages like Python and others that follow the extension
model a way to implement common environment or run-time settings
for dependencies.
This gives packages like Python and others that follow the extension model a way to
implement common environment or run-time settings for dependencies.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is run.
Package authors can call methods on it to alter the build environment.
env: environment modifications to be applied when the dependent package is run. Package
authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be run. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
dependent_spec: The spec of the dependent package about to be run. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
pass
def setup_dependent_package(self, module, dependent_spec):
"""Set up Python module-scope variables for dependent packages.
def setup_dependent_package(self, module, dependent_spec: spack.spec.Spec) -> None:
"""Set up module-scope global variables for dependent packages.
Called before the install() method of dependents.
Default implementation does nothing, but this can be
overridden by an extendable package to set up the module of
its extensions. This is useful if there are some common steps
to installing all extensions for a certain package.
This function is called when setting up the build and run environments of a DAG.
Examples:
1. Extensions often need to invoke the ``python`` interpreter
from the Python installation being extended. This routine
can put a ``python()`` Executable object in the module scope
for the extension package to simplify extension installs.
1. Extensions often need to invoke the ``python`` interpreter from the Python installation
being extended. This routine can put a ``python`` Executable as a global in the module
scope for the extension package to simplify extension installs.
2. MPI compilers could set some variables in the dependent's
scope that point to ``mpicc``, ``mpicxx``, etc., allowing
them to be called by common name regardless of which MPI is used.
3. BLAS/LAPACK implementations can set some variables
indicating the path to their libraries, since these
paths differ by BLAS/LAPACK implementation.
2. MPI compilers could set some variables in the dependent's scope that point to ``mpicc``,
``mpicxx``, etc., allowing them to be called by common name regardless of which MPI is
used.
Args:
module (spack.package_base.PackageBase.module): The Python ``module``
object of the dependent package. Packages can use this to set
module-scope variables for the dependent to use.
module: The Python ``module`` object of the dependent package. Packages can use this to
set module-scope variables for the dependent to use.
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be built. This allows the extendee (self) to
query the dependent's state. Note that *this*
package's spec is available as ``self.spec``.
dependent_spec: The spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``.
"""
pass
@@ -2106,7 +2088,7 @@ def flag_handler(self, var: FLAG_HANDLER_TYPE) -> None:
# arguments. This is implemented for build system classes where
# appropriate and will otherwise raise a NotImplementedError.
def flags_to_build_system_args(self, flags):
def flags_to_build_system_args(self, flags: Dict[str, List[str]]) -> None:
# Takes flags as a dict name: list of values
if any(v for v in flags.values()):
msg = "The {0} build system".format(self.__class__.__name__)
@@ -2309,10 +2291,6 @@ def rpath_args(self):
"""
return " ".join("-Wl,-rpath,%s" % p for p in self.rpath)
@property
def builder(self):
return spack.builder.create(self)
inject_flags = PackageBase.inject_flags
env_flags = PackageBase.env_flags
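
A sketch of the first docstring example, as an extendee might implement it (the attribute name and install layout are hypothetical):

    from spack.util.executable import Executable

    def setup_dependent_package(self, module, dependent_spec):
        # give extensions a module-scope `python` Executable that points at the
        # interpreter being extended (hypothetical layout: <prefix>/bin/python)
        module.python = Executable(self.prefix.bin.python)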

View File

@@ -0,0 +1,105 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import llnl.util.lang as lang
#: An object of this kind is a shared global state used to collect callbacks during
#: class definition time, and is flushed when the class object is created at the end
#: of the class definition
#:
#: Args:
#: attribute_name (str): name of the attribute that will be attached to the builder
#: callbacks (list): container used to temporarily aggregate the callbacks
CallbackTemporaryStage = collections.namedtuple(
"CallbackTemporaryStage", ["attribute_name", "callbacks"]
)
#: Shared global state to aggregate "@run_before" callbacks
_RUN_BEFORE = CallbackTemporaryStage(attribute_name="run_before_callbacks", callbacks=[])
#: Shared global state to aggregate "@run_after" callbacks
_RUN_AFTER = CallbackTemporaryStage(attribute_name="run_after_callbacks", callbacks=[])
class PhaseCallbacksMeta(type):
"""Permit to register arbitrary functions during class definition and run them
later, before or after a given install phase.
Each method decorated with ``run_before`` or ``run_after`` gets temporarily
stored in a global shared state when a class being defined is parsed by the Python
interpreter. At class definition time that temporary storage gets flushed and a list
of callbacks is attached to the class being defined.
"""
def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, deduplicate them, and then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]
return super(PhaseCallbacksMeta, mcs).__new__(mcs, name, bases, attr_dict)
@staticmethod
def run_after(phase, when=None):
"""Decorator to register a function for running after a given phase.
Args:
phase (str): phase after which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_AFTER.callbacks.append(item)
return fn
return _decorator
@staticmethod
def run_before(phase, when=None):
"""Decorator to register a function for running before a given phase.
Args:
phase (str): phase before which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_BEFORE.callbacks.append(item)
return fn
return _decorator
# Export these names as standalone to be used in packages
run_after = PhaseCallbacksMeta.run_after
run_before = PhaseCallbacksMeta.run_before
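A brief usage sketch of the decorators exported above, assuming the names defined in this module are in scope; the class, phase names, and ``when`` condition are illustrative:

class ExampleBuilder(metaclass=PhaseCallbacksMeta):
    @run_after("install", when="+docs")
    def install_docs(self):
        pass  # runs after the "install" phase, only if the spec has +docs

    @run_before("build")
    def sanity_check(self):
        pass  # runs before the "build" phase, unconditionally

# At class creation the metaclass flushed the temporary stages, so the
# collected callbacks are now attached to the class itself:
assert hasattr(ExampleBuilder, "run_after_callbacks")
assert hasattr(ExampleBuilder, "run_before_callbacks")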

View File

@@ -209,7 +209,7 @@ def _apply_to_file(self, f):
# but it's nasty to deal with matches across boundaries, so let's stick to
# something simple.
modified = False
modified = True
for match in self.regex.finditer(f.read()):
# The matching prefix (old) and its replacement (new)

View File

@@ -88,6 +88,8 @@
"strategy": {"type": "string", "enum": ["none", "minimal", "full"]}
},
},
"timeout": {"type": "integer", "minimum": 0},
"error_on_timeout": {"type": "boolean"},
"os_compatible": {"type": "object", "additionalProperties": {"type": "array"}},
},
}
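A hypothetical ``concretizer.yaml`` fragment exercising the two keys added above; the values are made up, and, per the solver change further down, ``timeout: 0`` means no time limit:

concretizer:
  timeout: 120             # give up (or warn) after 120 seconds of solving
  error_on_timeout: false  # warn and keep the best model instead of raising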

View File

@@ -106,8 +106,8 @@
{
"names": ["install_missing_compilers"],
"message": "The config:install_missing_compilers option has been deprecated in "
"Spack v0.23, and is currently ignored. It will be removed from config after "
"Spack v1.0.",
"Spack v0.23, and is currently ignored. It will be removed from config in "
"Spack v0.25.",
"error": False,
},
],

View File

@@ -27,6 +27,7 @@
import spack
import spack.binary_distribution
import spack.compiler
import spack.compilers
import spack.concretize
import spack.config
@@ -885,7 +886,22 @@ def on_model(model):
solve_kwargs["on_unsat"] = cores.append
timer.start("solve")
solve_result = self.control.solve(**solve_kwargs)
time_limit = spack.config.CONFIG.get("concretizer:timeout", -1)
error_on_timeout = spack.config.CONFIG.get("concretizer:error_on_timeout", True)
# Spack uses 0 to set no time limit, clingo API uses -1
if time_limit == 0:
time_limit = -1
with self.control.solve(**solve_kwargs, async_=True) as handle:
finished = handle.wait(time_limit)
if not finished:
specs_str = ", ".join(llnl.util.lang.elide_list([str(s) for s in specs], 4))
header = f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
if error_on_timeout:
raise UnsatisfiableSpecError(f"{header}, stopping concretization")
warnings.warn(f"{header}, using the best configuration found so far")
handle.cancel()
solve_result = handle.get()
timer.stop("solve")
# once done, construct the solve result

View File

@@ -95,6 +95,8 @@
import spack.version as vn
import spack.version.git_ref_lookup
from .enums import InstallRecordStatus
__all__ = [
"CompilerSpec",
"Spec",
@@ -448,9 +450,6 @@ def _target_satisfies(self, other: "ArchSpec", strict: bool) -> bool:
return bool(self._target_intersection(other))
def _target_constrain(self, other: "ArchSpec") -> bool:
if self.target is None and other.target is None:
return False
if not other._target_satisfies(self, strict=False):
raise UnsatisfiableArchitectureSpecError(self, other)
@@ -499,56 +498,21 @@ def _target_intersection(self, other):
if (not s_min or o_comp >= s_min) and (not s_max or o_comp <= s_max):
results.append(o_min)
else:
# Take the "min" of the two maxes, if there is a partial ordering.
n_max = ""
if s_max and o_max:
_s_max = _make_microarchitecture(s_max)
_o_max = _make_microarchitecture(o_max)
if _s_max.family != _o_max.family:
continue
if _s_max <= _o_max:
n_max = s_max
elif _o_max < _s_max:
n_max = o_max
else:
continue
elif s_max:
n_max = s_max
elif o_max:
n_max = o_max
# Take the "max" of the two mins.
n_min = ""
if s_min and o_min:
_s_min = _make_microarchitecture(s_min)
_o_min = _make_microarchitecture(o_min)
if _s_min.family != _o_min.family:
continue
if _s_min >= _o_min:
n_min = s_min
elif _o_min > _s_min:
n_min = o_min
else:
continue
elif s_min:
n_min = s_min
elif o_min:
n_min = o_min
if n_min and n_max:
_n_min = _make_microarchitecture(n_min)
_n_max = _make_microarchitecture(n_max)
if _n_min.family != _n_max.family or not _n_min <= _n_max:
continue
if n_min == n_max:
results.append(n_min)
else:
results.append(f"{n_min}:{n_max}")
elif n_min:
results.append(f"{n_min}:")
elif n_max:
results.append(f":{n_max}")
# Take intersection of two ranges
# Lots of comparisons needed
_s_min = _make_microarchitecture(s_min)
_s_max = _make_microarchitecture(s_max)
_o_min = _make_microarchitecture(o_min)
_o_max = _make_microarchitecture(o_max)
n_min = s_min if _s_min >= _o_min else o_min
n_max = s_max if _s_max <= _o_max else o_max
_n_min = _make_microarchitecture(n_min)
_n_max = _make_microarchitecture(n_max)
if _n_min == _n_max:
results.append(n_min)
elif not n_min or not n_max or _n_min < _n_max:
results.append("%s:%s" % (n_min, n_max))
return results
def constrain(self, other: "ArchSpec") -> bool:
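To make the simplified rule above concrete, a toy sketch with integers standing in for microarchitectures; this is a simplification (the real code compares under a partial order and handles open-ended ranges), and the helper below is an assumption for illustration, not Spack API:

def intersect_range(s_min, s_max, o_min, o_max):
    # "max of the two mins" and "min of the two maxes", as in the new code path
    n_min = s_min if s_min >= o_min else o_min
    n_max = s_max if s_max <= o_max else o_max
    if n_min == n_max:
        return str(n_min)
    if n_min < n_max:
        return f"{n_min}:{n_max}"
    return None  # empty intersection

print(intersect_range(2, 8, 4, 10))  # -> "4:8"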
@@ -1538,8 +1502,9 @@ def __init__(
self._external_path = external_path
self.external_modules = Spec._format_module_list(external_modules)
# This attribute is used to store custom information for external specs.
self.extra_attributes: dict = {}
# This attribute is used to store custom information for
# external specs. None signals that it was not set yet.
self.extra_attributes = None
# This attribute holds the original build copy of the spec if it is
# deployed differently than it was built. None signals that the spec
@@ -2108,7 +2073,7 @@ def _lookup_hash(self):
# First env, then store, then binary cache
matches = (
(active_env.all_matching_specs(self) if active_env else [])
or spack.store.STORE.db.query(self, installed=any)
or spack.store.STORE.db.query(self, installed=InstallRecordStatus.ANY)
or spack.binary_distribution.BinaryCacheQuery(True)(self)
)
@@ -2254,8 +2219,8 @@ def to_node_dict(self, hash=ht.dag_hash):
d["external"] = syaml.syaml_dict(
[
("path", self.external_path),
("module", self.external_modules or None),
("extra_attributes", syaml.sorted_dict(self.extra_attributes)),
("module", self.external_modules),
("extra_attributes", self.extra_attributes),
]
)
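Roughly, the ``external`` block this serializes into a node dict looks like the following hypothetical fragment; the path and values are made up:

external:
  path: /opt/external/openmpi-4.1.5
  module: null
  extra_attributes: {}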
@@ -2993,7 +2958,7 @@ def _finalize_concretization(self):
for spec in self.traverse():
spec._cached_hash(ht.dag_hash)
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "spack.spec.Spec":
"""This is a non-destructive version of concretize().
First clones, then returns a concrete version of this package
@@ -3116,13 +3081,18 @@ def constrain(self, other, deps=True):
if not self.variants[v].compatible(other.variants[v]):
raise vt.UnsatisfiableVariantSpecError(self.variants[v], other.variants[v])
# TODO: Check out the logic here
sarch, oarch = self.architecture, other.architecture
if (
sarch is not None
and oarch is not None
and not self.architecture.intersects(other.architecture)
):
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if sarch is not None and oarch is not None:
if sarch.platform is not None and oarch.platform is not None:
if sarch.platform != oarch.platform:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if sarch.os is not None and oarch.os is not None:
if sarch.os != oarch.os:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if sarch.target is not None and oarch.target is not None:
if sarch.target != oarch.target:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
changed = False
@@ -3145,12 +3115,18 @@ def constrain(self, other, deps=True):
changed |= self.compiler_flags.constrain(other.compiler_flags)
old = str(self.architecture)
sarch, oarch = self.architecture, other.architecture
if sarch is not None and oarch is not None:
changed |= self.architecture.constrain(other.architecture)
elif oarch is not None:
self.architecture = oarch
changed = True
if sarch is None or other.architecture is None:
self.architecture = sarch or oarch
else:
if sarch.platform is None or oarch.platform is None:
self.architecture.platform = sarch.platform or oarch.platform
if sarch.os is None or oarch.os is None:
sarch.os = sarch.os or oarch.os
if sarch.target is None or oarch.target is None:
sarch.target = sarch.target or oarch.target
changed |= str(self.architecture) != old
if deps:
changed |= self._constrain_dependencies(other)
@@ -4861,8 +4837,8 @@ def from_node_dict(cls, node):
spec.external_modules = node["external"]["module"]
if spec.external_modules is False:
spec.external_modules = None
spec.extra_attributes = (
node["external"].get("extra_attributes") or syaml.syaml_dict()
spec.extra_attributes = node["external"].get(
"extra_attributes", syaml.syaml_dict()
)
# specs read in are concrete unless marked abstract

View File

@@ -16,6 +16,7 @@
import llnl.string
import llnl.util.lang
import llnl.util.symlink
import llnl.util.tty as tty
from llnl.util.filesystem import (
can_access,
@@ -487,7 +488,7 @@ def _generate_fetchers(self, mirror_only=False) -> Generator["fs.FetchStrategy",
# Insert fetchers in the order that the URLs are provided.
fetchers[:0] = (
fs.from_url_scheme(
url_util.join(mirror.fetch_url, *self.mirror_layout.path.split(os.sep)),
url_util.join(mirror.fetch_url, self.mirror_layout.path),
checksum=digest,
expand=expand,
extension=extension,

View File

@@ -15,6 +15,7 @@
import spack.build_systems.autotools
import spack.build_systems.cmake
import spack.builder
import spack.environment
import spack.error
import spack.paths
@@ -149,7 +150,7 @@ def test_libtool_archive_files_are_deleted_by_default(self, mutable_database):
# Assert the libtool archive is not there and we have
# a log of removed files
assert not os.path.exists(s.package.builder.libtool_archive_file)
assert not os.path.exists(spack.builder.create(s.package).libtool_archive_file)
search_directory = os.path.join(s.prefix, ".spack")
libtool_deletion_log = fs.find(search_directory, "removed_la_files.txt", recursive=True)
assert libtool_deletion_log
@@ -160,11 +161,13 @@ def test_libtool_archive_files_might_be_installed_on_demand(
# Install a package that creates a mock libtool archive,
# patch its package to preserve the installation
s = Spec("libtool-deletion").concretized()
monkeypatch.setattr(type(s.package.builder), "install_libtool_archives", True)
monkeypatch.setattr(
type(spack.builder.create(s.package)), "install_libtool_archives", True
)
PackageInstaller([s.package], explicit=True).install()
# Assert libtool archives are installed
assert os.path.exists(s.package.builder.libtool_archive_file)
assert os.path.exists(spack.builder.create(s.package).libtool_archive_file)
def test_autotools_gnuconfig_replacement(self, mutable_database):
"""
@@ -261,7 +264,7 @@ def test_cmake_std_args(self, default_mock_concretization):
# Call the function on a CMakePackage instance
s = default_mock_concretization("cmake-client")
expected = spack.build_systems.cmake.CMakeBuilder.std_args(s.package)
assert s.package.builder.std_cmake_args == expected
assert spack.builder.create(s.package).std_cmake_args == expected
# Call it on another kind of package
s = default_mock_concretization("mpich")
@@ -381,7 +384,9 @@ def test_autotools_args_from_conditional_variant(default_mock_concretization):
is not met. When this is the case, the variant is not set in the spec."""
s = default_mock_concretization("autotools-conditional-variants-test")
assert "example" not in s.variants
assert len(s.package.builder._activate_or_not("example", "enable", "disable")) == 0
assert (
len(spack.builder.create(s.package)._activate_or_not("example", "enable", "disable")) == 0
)
def test_autoreconf_search_path_args_multiple(default_mock_concretization, tmpdir):

View File

@@ -17,6 +17,7 @@
import spack
import spack.binary_distribution
import spack.ci as ci
import spack.cmd
import spack.cmd.ci
import spack.environment as ev
import spack.hash_types as ht

View File

@@ -7,7 +7,7 @@
import spack.spec
import spack.store
from spack.database import InstallStatuses
from spack.enums import InstallRecordStatus
from spack.main import SpackCommand
install = SpackCommand("install")
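These tests migrate from passing the Python builtin ``any`` as a query sentinel to an explicit enum. A plausible shape for ``InstallRecordStatus``, inferred from the member names used in these hunks; the ``enum.Flag`` base and the ``MISSING`` member are assumptions:

import enum

class InstallRecordStatus(enum.Flag):
    INSTALLED = enum.auto()
    DEPRECATED = enum.auto()
    MISSING = enum.auto()  # assumed member, not shown in this diff
    ANY = INSTALLED | DEPRECATED | MISSING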
@@ -26,7 +26,7 @@ def test_deprecate(mock_packages, mock_archive, mock_fetch, install_mockery):
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=any)
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
assert all_available == all_installed
assert non_deprecated == spack.store.STORE.db.query("libelf@0.8.13")
@@ -56,7 +56,7 @@ def test_deprecate_install(mock_packages, mock_archive, mock_fetch, install_mock
deprecate("-y", "-i", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
deprecated = spack.store.STORE.db.query(installed=InstallStatuses.DEPRECATED)
deprecated = spack.store.STORE.db.query(installed=InstallRecordStatus.DEPRECATED)
assert deprecated == to_deprecate
assert len(non_deprecated) == 1
assert non_deprecated[0].satisfies("libelf@0.8.13")
@@ -75,8 +75,8 @@ def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery
deprecate("-y", "-d", "libdwarf@20130207", "libdwarf@20130729")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=any)
deprecated = spack.store.STORE.db.query(installed=InstallStatuses.DEPRECATED)
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
deprecated = spack.store.STORE.db.query(installed=InstallRecordStatus.DEPRECATED)
assert all_available == all_installed
assert sorted(all_available) == sorted(deprecated + non_deprecated)
@@ -96,7 +96,9 @@ def test_uninstall_deprecated(mock_packages, mock_archive, mock_fetch, install_m
uninstall("-y", "libelf@0.8.10")
assert spack.store.STORE.db.query() == spack.store.STORE.db.query(installed=any)
assert spack.store.STORE.db.query() == spack.store.STORE.db.query(
installed=InstallRecordStatus.ANY
)
assert spack.store.STORE.db.query() == non_deprecated
@@ -116,7 +118,7 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=any)
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
assert len(non_deprecated) == 2
assert len(all_available) == 3
@@ -143,7 +145,7 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
deprecate("-y", "libelf@0.8.12", "libelf@0.8.13")
non_deprecated = spack.store.STORE.db.query()
all_available = spack.store.STORE.db.query(installed=any)
all_available = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
assert len(non_deprecated) == 1
assert len(all_available) == 3

View File

@@ -17,6 +17,7 @@
import spack.repo
import spack.store
import spack.user_environment as uenv
from spack.enums import InstallRecordStatus
from spack.main import SpackCommand
from spack.spec import Spec
from spack.test.conftest import create_test_repo
@@ -75,7 +76,7 @@ def test_query_arguments():
assert "installed" in q_args
assert "predicate_fn" in q_args
assert "explicit" in q_args
assert q_args["installed"] == ["installed"]
assert q_args["installed"] == InstallRecordStatus.INSTALLED
assert q_args["predicate_fn"] is None
assert q_args["explicit"] is None
assert "start_date" in q_args

View File

@@ -10,6 +10,7 @@
from llnl.util.filesystem import mkdirp, working_dir
import spack.cmd
import spack.cmd.pkg
import spack.main
import spack.paths

View File

@@ -6,6 +6,7 @@
import spack.store
from spack.database import Database
from spack.enums import InstallRecordStatus
from spack.main import SpackCommand
install = SpackCommand("install")
@@ -57,18 +58,18 @@ def test_reindex_with_deprecated_packages(
db = spack.store.STORE.db
all_installed = db.query(installed=any)
all_installed = db.query(installed=InstallRecordStatus.ANY)
non_deprecated = db.query(installed=True)
_clear_db(tmp_path)
reindex()
assert db.query(installed=any) == all_installed
assert db.query(installed=InstallRecordStatus.ANY) == all_installed
assert db.query(installed=True) == non_deprecated
old_libelf = db.query_local_by_spec_hash(
db.query_local("libelf@0.8.12", installed=any)[0].dag_hash()
db.query_local("libelf@0.8.12", installed=InstallRecordStatus.ANY)[0].dag_hash()
)
new_libelf = db.query_local_by_spec_hash(
db.query_local("libelf@0.8.13", installed=True)[0].dag_hash()

View File

@@ -4,8 +4,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import filecmp
import io
import os
import pathlib
import shutil
import sys
import pytest
@@ -15,7 +18,7 @@
import spack.main
import spack.paths
import spack.repo
from spack.cmd.style import changed_files
from spack.cmd.style import _run_import_check, changed_files
from spack.util.executable import which
#: directory with sample style files
@@ -292,5 +295,114 @@ def test_style_with_black(flake8_package_with_errors):
def test_skip_tools():
output = style("--skip", "isort,mypy,black,flake8")
output = style("--skip", "import,isort,mypy,black,flake8")
assert "Nothing to run" in output
@pytest.mark.skipif(sys.version_info < (3, 9), reason="requires Python 3.9+")
def test_run_import_check(tmp_path: pathlib.Path):
file = tmp_path / "issues.py"
contents = '''
import spack.cmd
import spack.config # do not drop this import because of this comment
# this comment about spack.error should not be removed
class Example(spack.build_systems.autotools.AutotoolsPackage):
"""this is a docstring referencing unused spack.error.SpackError, which is fine"""
pass
def foo(config: "spack.error.SpackError"):
# the type hint is quoted, so it should not be removed
spack.util.executable.Executable("example")
print(spack.__version__)
'''
file.write_text(contents)
root = str(tmp_path)
output_buf = io.StringIO()
exit_code = _run_import_check(
[str(file)],
fix=False,
out=output_buf,
root_relative=False,
root=spack.paths.prefix,
working_dir=root,
)
output = output_buf.getvalue()
assert "issues.py: redundant import: spack.cmd" in output
assert "issues.py: redundant import: spack.config" not in output # comment prevents removal
assert "issues.py: missing import: spack" in output # used by spack.__version__
assert "issues.py: missing import: spack.build_systems.autotools" in output
assert "issues.py: missing import: spack.util.executable" in output
assert "issues.py: missing import: spack.error" not in output # not directly used
assert exit_code == 1
assert file.read_text() == contents # fix=False should not change the file
# run it with --fix, should have the same output.
output_buf = io.StringIO()
exit_code = _run_import_check(
[str(file)],
fix=True,
out=output_buf,
root_relative=False,
root=spack.paths.prefix,
working_dir=root,
)
output = output_buf.getvalue()
assert exit_code == 1
assert "issues.py: redundant import: spack.cmd" in output
assert "issues.py: missing import: spack" in output
assert "issues.py: missing import: spack.build_systems.autotools" in output
assert "issues.py: missing import: spack.util.executable" in output
# after the fix, a second fix is idempotent
output_buf = io.StringIO()
exit_code = _run_import_check(
[str(file)],
fix=True,
out=output_buf,
root_relative=False,
root=spack.paths.prefix,
working_dir=root,
)
output = output_buf.getvalue()
assert exit_code == 0
assert not output
# check that the file was fixed
new_contents = file.read_text()
assert "import spack.cmd" not in new_contents
assert "import spack\n" in new_contents
assert "import spack.build_systems.autotools\n" in new_contents
assert "import spack.util.executable\n" in new_contents
@pytest.mark.skipif(sys.version_info < (3, 9), reason="requires Python 3.9+")
def test_run_import_check_syntax_error_and_missing(tmp_path: pathlib.Path):
(tmp_path / "syntax-error.py").write_text("""this 'is n(ot python code""")
output_buf = io.StringIO()
exit_code = _run_import_check(
[str(tmp_path / "syntax-error.py"), str(tmp_path / "missing.py")],
fix=False,
out=output_buf,
root_relative=True,
root=str(tmp_path),
working_dir=str(tmp_path / "does-not-matter"),
)
output = output_buf.getvalue()
assert "syntax-error.py: could not parse" in output
assert "missing.py: could not parse" in output
assert exit_code == 1
def test_case_sensitive_imports(tmp_path: pathlib.Path):
# example.Example is a name, while example.example is a module.
(tmp_path / "lib" / "spack" / "example").mkdir(parents=True)
(tmp_path / "lib" / "spack" / "example" / "__init__.py").write_text("class Example:\n pass")
(tmp_path / "lib" / "spack" / "example" / "example.py").write_text("foo = 1")
assert spack.cmd.style._module_part(str(tmp_path), "example.Example") == "example"
def test_pkg_imports():
assert spack.cmd.style._module_part(spack.paths.prefix, "spack.pkg.builtin.boost") is None
assert spack.cmd.style._module_part(spack.paths.prefix, "spack.pkg") is None

View File

@@ -11,6 +11,7 @@
import spack.cmd.uninstall
import spack.environment
import spack.store
from spack.enums import InstallRecordStatus
from spack.main import SpackCommand, SpackCommandError
uninstall = SpackCommand("uninstall")
@@ -129,10 +130,10 @@ def validate_callpath_spec(installed):
specs = spack.store.STORE.db.get_by_hash(dag_hash[:7], installed=installed)
assert len(specs) == 1 and specs[0] == callpath_spec
specs = spack.store.STORE.db.get_by_hash(dag_hash, installed=any)
specs = spack.store.STORE.db.get_by_hash(dag_hash, installed=InstallRecordStatus.ANY)
assert len(specs) == 1 and specs[0] == callpath_spec
specs = spack.store.STORE.db.get_by_hash(dag_hash[:7], installed=any)
specs = spack.store.STORE.db.get_by_hash(dag_hash[:7], installed=InstallRecordStatus.ANY)
assert len(specs) == 1 and specs[0] == callpath_spec
specs = spack.store.STORE.db.get_by_hash(dag_hash, installed=not installed)
@@ -147,7 +148,7 @@ def validate_callpath_spec(installed):
spec = spack.store.STORE.db.query_one("callpath ^mpich", installed=installed)
assert spec == callpath_spec
spec = spack.store.STORE.db.query_one("callpath ^mpich", installed=any)
spec = spack.store.STORE.db.query_one("callpath ^mpich", installed=InstallRecordStatus.ANY)
assert spec == callpath_spec
spec = spack.store.STORE.db.query_one("callpath ^mpich", installed=not installed)

View File

@@ -537,22 +537,6 @@ def test_nvhpc_flags():
supported_flag_test("stdcxx_libs", ("-c++libs",), "nvhpc@=20.9")
def test_pgi_flags():
supported_flag_test("openmp_flag", "-mp", "pgi@=1.0")
supported_flag_test("cxx11_flag", "-std=c++11", "pgi@=1.0")
unsupported_flag_test("c99_flag", "pgi@=12.9")
supported_flag_test("c99_flag", "-c99", "pgi@=12.10")
unsupported_flag_test("c11_flag", "pgi@=15.2")
supported_flag_test("c11_flag", "-c11", "pgi@=15.3")
supported_flag_test("cc_pic_flag", "-fpic", "pgi@=1.0")
supported_flag_test("cxx_pic_flag", "-fpic", "pgi@=1.0")
supported_flag_test("f77_pic_flag", "-fpic", "pgi@=1.0")
supported_flag_test("fc_pic_flag", "-fpic", "pgi@=1.0")
supported_flag_test("stdcxx_libs", ("-pgc++libs",), "pgi@=1.0")
supported_flag_test("debug_flags", ["-g", "-gopt"], "pgi@=1.0")
supported_flag_test("opt_flags", ["-O", "-O0", "-O1", "-O2", "-O3", "-O4"], "pgi@=1.0")
def test_xl_flags():
supported_flag_test("openmp_flag", "-qsmp=omp", "xl@=1.0")
unsupported_flag_test("cxx11_flag", "xl@=13.0")

View File

@@ -15,6 +15,7 @@
import llnl.util.tty as tty
from llnl.util.filesystem import join_path, touch, touchp
import spack
import spack.config
import spack.directory_layout
import spack.environment as ev
@@ -1532,30 +1533,3 @@ def test_config_path_dsl(path, it_should_work, expected_parsed):
else:
with pytest.raises(ValueError):
spack.config.ConfigPath._validate(path)
@pytest.mark.regression("48254")
def test_env_activation_preserves_config_scopes(mutable_mock_env_path):
"""Check that the "command_line" scope remains the highest-priority scope when we
activate or deactivate environments.
"""
expected_cl_scope = spack.config.CONFIG.highest()
assert expected_cl_scope.name == "command_line"
# Creating an environment pushes a new scope
ev.create("test")
with ev.read("test"):
assert spack.config.CONFIG.highest() == expected_cl_scope
# No active environment pops the scope
with ev.no_active_environment():
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
# Switch the environment to another one
ev.create("test-2")
with ev.read("test-2"):
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope

View File

@@ -38,7 +38,7 @@
import spack.compiler
import spack.compilers
import spack.config
import spack.directives
import spack.directives_meta
import spack.environment as ev
import spack.error
import spack.modules.common
@@ -1754,7 +1754,7 @@ def clear_directive_functions():
# Make sure any directive functions overridden by tests are cleared before
# proceeding with subsequent tests that may depend on the original
# functions.
spack.directives.DirectiveMeta._directives_to_be_executed = []
spack.directives_meta.DirectiveMeta._directives_to_be_executed = []
@pytest.fixture

View File

@@ -1,11 +0,0 @@
Export PGI=/usr/tce/packages/pgi/pgi-16.3
/usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/bin/pgc test.c -opt 1 -x 119 0xa10000 -x 122 0x40 -x 123 0x1000 -x 127 4 -x 127 17 -x 19 0x400000 -x 28 0x40000 -x 120 0x10000000 -x 70 0x8000 -x 122 1 -x 125 0x20000 -quad -x 59 4 -tp haswell -x 120 0x1000 -astype 0 -stdinc /usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/include-gcc48:/usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/include:/usr/lib/gcc/x86_64-redhat-linux/4.8.5/include:/usr/local/include:/usr/include -def unix -def __unix -def __unix__ -def linux -def __linux -def __linux__ -def __NO_MATH_INLINES -def __LP64__ -def __x86_64 -def __x86_64__ -def __LONG_MAX__=9223372036854775807L -def '__SIZE_TYPE__=unsigned long int' -def '__PTRDIFF_TYPE__=long int' -def __THROW= -def __extension__= -def __amd_64__amd64__ -def __k8 -def __k8__ -def __SSE__ -def __MMX__ -def __SSE2__ -def __SSE3__ -def __SSSE3__ -def __STDC_HOSTED__ -predicate '#machine(x86_64) #lint(off) #system(posix) #cpu(x86_64)' -cmdline '+pgcc test.c -v -o test.o' -x 123 0x80000000 -x 123 4 -x 2 0x400 -x 119 0x20 -def __pgnu_vsn=40805 -x 120 0x200000 -x 70 0x40000000 -y 163 0xc0000000 -x 189 0x10 -y 189 0x4000000 -asm /var/tmp/gamblin2/pgccL0MCVCOQsq6l.s
PGC/x86-64 Linux 16.3-0: compilation successful
/usr/bin/as /var/tmp/gamblin2/pgccL0MCVCOQsq6l.s -o /var/tmp/gamblin2/pgcc10MCFxmYXjgo.o
/usr/tce/bin/ld /usr/lib64/crt1.o /usr/lib64/crti.o /usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/lib/trace_init.o /usr/lib/gcc/x86_64-redhat-linux/4.8.5/crtbegin.o /usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/lib/initmp.o --eh-frame-hdr -m elf_x86_64 -dynamic-linker /lib64/ld-linux-x86-64.so.2 /usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/lib/pgi.ld -L/usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/lib -L/usr/lib64 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 /var/tmp/gamblin2/pgcc10MCFxmYXjgo.o -rpath /usr/tce/packages/pgi/pgi-16.3/linux86-64/16.3/lib -o test.o -lpgmp -lnuma -lpthread -lnspgc -lpgc -lm -lgcc -lc -lgcc /usr/lib/gcc/x86_64-redhat-linux/4.8.5/crtend.o /usr/lib64/crtn.o
Unlinking /var/tmp/gamblin2/pgccL0MCVCOQsq6l.s
Unlinking /var/tmp/gamblin2/pgccn0MCNcmgIbh8.ll
Unlinking /var/tmp/gamblin2/pgcc10MCFxmYXjgo.o

View File

@@ -34,6 +34,7 @@
import spack.spec
import spack.store
import spack.version as vn
from spack.enums import InstallRecordStatus
from spack.installer import PackageInstaller
from spack.schema.database_index import schema
from spack.util.executable import Executable
@@ -292,7 +293,7 @@ def _print_ref_counts():
recs = []
def add_rec(spec):
cspecs = spack.store.STORE.db.query(spec, installed=any)
cspecs = spack.store.STORE.db.query(spec, installed=InstallRecordStatus.ANY)
if not cspecs:
recs.append("[ %-7s ] %-20s-" % ("", spec))
@@ -324,7 +325,7 @@ def add_rec(spec):
def _check_merkleiness():
"""Ensure the spack database is a valid merkle graph."""
all_specs = spack.store.STORE.db.query(installed=any)
all_specs = spack.store.STORE.db.query(installed=InstallRecordStatus.ANY)
seen = {}
for spec in all_specs:
@@ -617,7 +618,7 @@ def test_080_root_ref_counts(mutable_database):
mutable_database.remove("mpileaks ^mpich")
# record no longer in DB
assert mutable_database.query("mpileaks ^mpich", installed=any) == []
assert mutable_database.query("mpileaks ^mpich", installed=InstallRecordStatus.ANY) == []
# record's deps have updated ref_counts
assert mutable_database.get_record("callpath ^mpich").ref_count == 0
@@ -627,7 +628,7 @@ def test_080_root_ref_counts(mutable_database):
mutable_database.add(rec.spec)
# record is present again
assert len(mutable_database.query("mpileaks ^mpich", installed=any)) == 1
assert len(mutable_database.query("mpileaks ^mpich", installed=InstallRecordStatus.ANY)) == 1
# dependencies have ref counts updated
assert mutable_database.get_record("callpath ^mpich").ref_count == 1
@@ -643,18 +644,21 @@ def test_090_non_root_ref_counts(mutable_database):
# record still in DB but marked uninstalled
assert mutable_database.query("callpath ^mpich", installed=True) == []
assert len(mutable_database.query("callpath ^mpich", installed=any)) == 1
assert len(mutable_database.query("callpath ^mpich", installed=InstallRecordStatus.ANY)) == 1
# record and its deps have same ref_counts
assert mutable_database.get_record("callpath ^mpich", installed=any).ref_count == 1
assert (
mutable_database.get_record("callpath ^mpich", installed=InstallRecordStatus.ANY).ref_count
== 1
)
assert mutable_database.get_record("mpich").ref_count == 2
# remove only dependent of uninstalled callpath record
mutable_database.remove("mpileaks ^mpich")
# record and parent are completely gone.
assert mutable_database.query("mpileaks ^mpich", installed=any) == []
assert mutable_database.query("callpath ^mpich", installed=any) == []
assert mutable_database.query("mpileaks ^mpich", installed=InstallRecordStatus.ANY) == []
assert mutable_database.query("callpath ^mpich", installed=InstallRecordStatus.ANY) == []
# mpich ref count updated properly.
mpich_rec = mutable_database.get_record("mpich")
@@ -668,14 +672,14 @@ def fail_while_writing():
raise Exception()
with database.read_transaction():
assert len(database.query("mpileaks ^zmpi", installed=any)) == 1
assert len(database.query("mpileaks ^zmpi", installed=InstallRecordStatus.ANY)) == 1
with pytest.raises(Exception):
fail_while_writing()
# reload DB and make sure zmpi is still there.
with database.read_transaction():
assert len(database.query("mpileaks ^zmpi", installed=any)) == 1
assert len(database.query("mpileaks ^zmpi", installed=InstallRecordStatus.ANY)) == 1
def test_110_no_write_with_exception_on_install(database):
@@ -685,14 +689,14 @@ def fail_while_writing():
raise Exception()
with database.read_transaction():
assert database.query("cmake", installed=any) == []
assert database.query("cmake", installed=InstallRecordStatus.ANY) == []
with pytest.raises(Exception):
fail_while_writing()
# reload DB and make sure cmake was not written.
with database.read_transaction():
assert database.query("cmake", installed=any) == []
assert database.query("cmake", installed=InstallRecordStatus.ANY) == []
def test_115_reindex_with_packages_not_in_repo(mutable_database, tmpdir):

View File

@@ -73,5 +73,18 @@ def test_ascii_graph_mpileaks(config, mock_packages, monkeypatch):
o | libdwarf
|/
o libelf
"""
or graph_str
== r"""o mpileaks
|\
| o callpath
|/|
| o dyninst
| |\
o | | mpich
/ /
| o libdwarf
|/
o libelf
"""
)

View File

@@ -69,17 +69,6 @@ def test_icc16_link_paths():
)
def test_pgi_link_paths():
check_link_paths(
"pgcc-16.3.txt",
[
os.path.join(
root, "usr", "tce", "packages", "pgi", "pgi-16.3", "linux86-64", "16.3", "lib"
)
],
)
def test_gcc7_link_paths():
check_link_paths("gcc-7.3.1.txt", [])

Some files were not shown because too many files have changed in this diff.