Compare commits


375 Commits

Author SHA1 Message Date
Gregory Becker
4a595fb216 remove deprecated methods 2024-12-11 10:59:37 -08:00
Gregory Becker
cf3addc69e attach methods to correct class 2024-12-11 10:58:19 -08:00
Gregory Becker
760fbd4bac missing import 2024-12-11 10:51:46 -08:00
Gregory Becker
fbedaa7854 re-add removed methods with deprecation warning 2024-12-11 10:50:08 -08:00
Harmen Stoppels
74a6b61b75 Partially revert f3ccb8e095121d4bccd562f1c851a2373da1b40a 2024-12-09 10:08:28 +01:00
Harmen Stoppels
e995d0543e fix test that held reference to abstract spec 2024-12-09 10:08:28 +01:00
Harmen Stoppels
3827ebb592 remove quotes from quoted type hints 2024-12-09 10:08:28 +01:00
Harmen Stoppels
0239d77842 also break circular import between concretize.py and solver/asp.py 2024-12-09 10:08:28 +01:00
Gregory Becker
4e9fe1fc5d concretization: move single-spec concretization logic to spack.concretize
This resolves a circular import issue between spack.spec and spack.concretize.
It requires removing the `Spec.concretize` and `Spec.concretized` methods and
updating all call-sites to use `spack.concretize.concretized` instead.

This will help with potential future efforts to separate AbstractSpec and
ConcreteSpec classes.

The new import relationship is that `spack.concretize` imports from `spack.spec`,
but not the other way around.
2024-12-09 10:08:28 +01:00
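As an illustration of the call-site migration described in this commit, a minimal sketch; the exact signature of `spack.concretize.concretized` is assumed from the message above:

```python
import spack.concretize
from spack.spec import Spec

# Before this change, concretization was a method on Spec itself,
# e.g. spec.concretize() / spec.concretized().
abstract = Spec("hdf5+mpi")

# After: single-spec concretization is a function in spack.concretize.
concrete = spack.concretize.concretized(abstract)
```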
dependabot[bot]
fc105a1a26 build(deps): bump types-six in /.github/workflows/requirements/style (#47954)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.20241105 to 1.17.0.20241205.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-08 19:34:20 -06:00
Stephen Sachs
8a9e16dc3b aws-pcluster stacks: static spack.yaml (#47918) 2024-12-08 20:26:08 +01:00
eugeneswalker
0b7fc360fa e4s ci: add lammps +rocm (#47929)
* e4s ci: add lammps +rocm

* e4s rocm external stack: add def for rocm-openmp-extras external

* lammps +rocm external rocm has errors, comment out
2024-12-08 10:21:28 -08:00
Wouter Deconinck
79d79969bb celeritas: patch 0.5.0 for geant4@11.3.0: (#47976) 2024-12-08 09:12:14 -05:00
Harmen Stoppels
422f829e4e mirrors: add missing init file (#47977) 2024-12-08 09:31:22 +01:00
Alec Scott
f54c101b44 py-jedi: add v0.19.2 (#47569) 2024-12-07 16:26:31 +01:00
Harmen Stoppels
05acd29f38 extensions.py: remove import of spack.cmd (#47963) 2024-12-07 10:08:04 +01:00
Wouter Deconinck
77e2187e13 coverage.yml: fail_ci_if_error = true (#47731) 2024-12-06 11:01:10 -08:00
Harmen Stoppels
5c88e035f2 directives.py: remove redundant import (#47965) 2024-12-06 19:18:12 +01:00
Harmen Stoppels
94bd7b9afb build_environment: drop off by one fix (#47960) 2024-12-06 17:01:46 +01:00
Stephen Herbener
f181ac199a Upgraded version specs for ECMWF packages: eckit, atlas, ectrans, fckit, fiat (#47749) 2024-12-05 18:46:56 -08:00
Sreenivasa Murthy Kolam
a8da7993ad Bump up the version for rocm-6.2.4 release (#47707)
* Bump up the version for rocm-6.2.4 release
2024-12-05 18:41:02 -08:00
Dom Heinzeller
b808338792 py-uxarray: new package plus dependencies (#47573)
* Add py-param@2.1.1
* Add py-panel@1.5.2
* Add py-bokeh@3.5.2
* New package py-datashader
* New package py-geoviews
* New package py-holoviews
* WIP: new package py-uxarray
* New package py-antimeridian
* New package py-dask-expr
* New package py-spatialpandas
* New package py-hvplot
* Add dependency on py-dask-expr for 'py-dask@2024.3: +dataframe'
* Added all dependencies for py-uxarray; still having problems with py-dask +dataframe / py-dask-expr
* Fix style errors in many packages
* Clean up comments and fix style errors in var/spack/repos/builtin/packages/py-dask-expr/package.py
* In var/spack/repos/builtin/packages/py-dask/package.py: since 2023.8, the dataframe variant requires the array variant
* Fix style errors in py-uxarray package
2024-12-05 18:20:55 -08:00
Massimiliano Culpo
112e47cc23 Don't inject import statements in package recipes
Remove a hack done by RepoLoader, which was injecting an extra
```
from spack.package import *
```
at the beginning of each package.py
2024-12-05 12:48:00 -08:00
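With the injection removed, each recipe must now carry the import explicitly; a minimal, entirely hypothetical package.py (name, URL, and checksum are placeholders):

```python
# var/spack/repos/builtin/packages/example/package.py
from spack.package import *  # now written explicitly, no longer injected


class Example(Package):
    """Hypothetical minimal recipe."""

    homepage = "https://example.org"
    url = "https://example.org/example-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum
```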
Dom Heinzeller
901cea7a54 Add conflict for pixman with Intel Classic (#47922) 2024-12-05 18:14:57 +01:00
Massimiliano Culpo
c1b2ac549d solver: partition classes related to requirement parsing into their own file (#47915) 2024-12-05 18:10:06 +01:00
Harmen Stoppels
4693b323ac spack.mirror: split into submodules (#47936) 2024-12-05 18:09:08 +01:00
Kin Fai Tse
1f2a68f2b6 tar: conditionally link iconv (#47933)
* fix broken packages requiring iconv

* tar: -liconv only when libiconv

* Revert "fix broken packages requiring iconv"

This reverts commit 5fa426b52f.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-12-05 10:09:18 -06:00
Juan Miguel Carceller
3fcc38ef04 pandoramonitoring,pandorasdk: change docstrings that are wrong (#47937)
and are copied from the pandorapfa package

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-12-05 08:53:09 -07:00
Harmen Stoppels
22d104d7a9 ci: add bootstrap stack for python@3.6:3.13 (#47719)
Resurrect latest Python 3.6
Add clingo-bootstrap to Gitlab CI.
2024-12-05 10:07:24 +01:00
Todd Gamblin
8b1009a4a0 resource: clean up arguments and typing
- [x] Clean up arguments on the `resource` directive.
- [x] Add type annotations
- [x] Add `resource` to type annotations on `PackageBase`
- [x] Fix up `resource` docstrings

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
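For context, a typical use of the `resource` directive whose arguments were cleaned up; every value below is a hypothetical placeholder, not taken from the commit:

```python
from spack.package import *


class Example(Package):
    """Hypothetical package exercising the resource directive."""

    resource(
        name="extras",
        url="https://example.org/extras-1.0.tar.gz",
        sha256="0" * 64,  # placeholder checksum
        placement="extras",
        when="@1.0:",
    )
```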
Todd Gamblin
f54526957a directives: add type annotations to DirectiveMeta class
Some of the class-level annotations were wrong, and some were missing. Annotate all the
functions here and fix the class properties to match what's actually happening.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
175a4bf101 directives: use Type[PackageBase] instead of PackageBase
The first argument to each Spack directive is not a `PackageBase` instance but a
`PackageBase` class object, so fix the type annotations to reflect this.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
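A hedged sketch of the annotation in question; the function itself is hypothetical, and only the `Type[PackageBase]` annotation reflects the commit:

```python
from typing import Type

from spack.package_base import PackageBase


def _apply_directive(pkg: Type[PackageBase]) -> None:
    # Directives execute at class-definition time, so they receive the
    # class object itself, never a PackageBase instance.
    print(f"applying directive to {pkg.__name__}")
```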
Todd Gamblin
aa81d59958 directives: don't include Optional in PatchesType
`Optional` shouldn't be part of `PatchesType` -- it's clearer to specify `Optional`
in the methods that need their arguments to be optional.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
James Taliaferro
6aafefd43d package version: Neovim 0.10.2 (#47925) 2024-12-04 23:17:55 +01:00
Satish Balay
ac82f344bd trilinos@develop: update kokkos dependency (#47838) 2024-12-04 19:53:38 +01:00
Harmen Stoppels
16fd77f9da rust-bootstrap: fix zlib dependency (#47894)
2024-12-04 02:28:19 -08:00
Harmen Stoppels
f82554a39b stage.py: improve path to url (#47898) 2024-12-04 09:41:38 +01:00
Massimiliano Culpo
2aaf50b8f7 eigen: remove unnecessary dependency on fortran (#47866) 2024-12-04 08:18:40 +01:00
Mathew Cleveland
b0b9cf15f7 add a '+no_warning' variant to METIS to prevent pervasive warning (#47452)
* add a '+no_warning' variant to metis to prevent pervasive warning
* fix formatting

---------

Co-authored-by: Cleveland <cleveland@lanl.gov>
Co-authored-by: mcourtois <mathieu.courtois@gmail.com>
2024-12-03 17:02:36 -08:00
v
8898e14e69 update py-numl and py-nugraph recipes (#47680)
* update py-numl and py-nugraph recipes

This commit adds the develop branch as a valid option for each of these two packages. In order to enable this, package tarballs are now retrieved from the GitHub source repository instead of PyPI, and their checksums and the build system have been updated accordingly.

* rename versions "develop" -> "main" to be consistent with branch name
2024-12-03 16:59:33 -08:00
Buldram
63c72634ea nim: add latest versions (#47844)
* nim: add latest versions
  In addition:
  - Create separate build and install phases.
  - Remove koch nimble call as it's redundant with koch tools.
  - Install all additional tools bundled with Nim instead of only Nimble.
* Fix 1.6 version
* nim: add devel
  In addition:
  - Fix build accessing user config/cache
2024-12-03 16:57:59 -08:00
Carson Woods
a7eacd77e3 bug fix: updated warning message to reflect impending v1.0 release (#47887) 2024-12-03 17:16:36 +01:00
Cédric Chevalier
09b7ea0400 Bump Kokkos and Kokkos-kernels to 4.5.00 (#47809)
* Bump Kokkos and Kokkos-kernels to 4.5.00

* petsc@:3.22 add a conflict with this new version of kokkos

* Update kokkos/kokkos-kernel dependency

---------

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-12-03 09:09:25 -07:00
Harmen Stoppels
b31dd46ab8 style.py: do not remove import spack in packages (#47895) 2024-12-03 16:04:18 +01:00
Harmen Stoppels
ad7417dee9 nwchem: add resource, remove patch (#47892)
Fixes a build failure due to a broken URL and improves the nwchem build without internet access.
2024-12-03 14:09:05 +01:00
Wouter Deconinck
c3de3b0b6f tar: add v1.35 (fix CVEs) (#47426) 2024-12-03 13:26:04 +01:00
Harmen Stoppels
6da9bf226a python: drop nis module also for < 3.13 (#47862)
The nis module was removed in Python 3.13. We had it default to ~nis, no package
requires +nis, and the required dependencies for +nis were missing, so it is
better to remove the nis module entirely.
2024-12-03 13:01:08 +01:00
Auriane R.
b3ee954e5b Remove duplicate version (#47880) 2024-12-03 10:14:47 +01:00
napulath
db090b0cad Update package.py (#47885) 2024-12-03 08:24:28 +01:00
Massimiliano Culpo
3a6c361a85 cgns: make fortran dependency optional (#47867) 2024-12-03 06:18:37 +01:00
Adam J. Stewart
bb5bd030d4 py-rasterio: add v1.4.3 (#47881) 2024-12-03 06:10:20 +01:00
dependabot[bot]
b9c60f96ea build(deps): bump pytest from 8.3.3 to 8.3.4 in /lib/spack/docs (#47882)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.3 to 8.3.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.3...8.3.4)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-03 06:07:27 +01:00
Stephen Nicholas Swatman
6b16c64c0e acts dependencies: new versions as of 2024/12/02 (#47787)
* acts dependencies: new versions as of 2024/11/25

This commit adds a new version of detray and two new versions of vecmem.

* acts dependencies: new versions as of 2024/12/02

This commit adds version 38 of ACTS and a new version of detray.
2024-12-02 19:50:25 -06:00
Andrey Perestoronin
3ea970746d add compilers packages (#47877) 2024-12-02 15:53:56 -07:00
Satish Balay
d8f2e080e6 petsc, py-petsc4py: add v3.22.2 (#47845) 2024-12-02 14:21:31 -08:00
Harmen Stoppels
ecb8a48376 libseccomp: python forward compat bound (#47876)
* libseccomp: python forward compat bound

* include 2.5.5

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-12-02 14:59:40 -07:00
Massimiliano Culpo
30176582e4 py-torchvision: add dependency on c (#47873) 2024-12-02 22:23:58 +01:00
Massimiliano Culpo
ac17e8bea4 utf8cpp: move to GitHub, make it a CMake package (#47870) 2024-12-02 14:14:24 -07:00
Massimiliano Culpo
c30c85a99c seacas: add a conditional dependency on fortran (#47871)
* seacas: remove unnecessary dependency on fortran
* seacas: add a conditional dependency on fortran
2024-12-02 13:13:14 -08:00
Michael Schlottke-Lakemper
2ae8eb6686 Update HOHQmesh package with newer versions (#47861) 2024-12-02 12:29:45 -08:00
Jose E. Roman
b5cc5b701c New patch release SLEPc 3.22.2 (#47859) 2024-12-02 12:06:52 -08:00
Wouter Deconinck
8e7641e584 onnx: set CMAKE_CXX_STANDARD to abseil-cpp cxxstd value (#47858) 2024-12-02 11:56:33 -08:00
Weiqun Zhang
e692d401eb amrex: add v24.12 (#47857) 2024-12-02 11:55:08 -08:00
Massimiliano Culpo
99319b1d91 oneapi-level-zero: add dependency on c (#47874) 2024-12-02 12:48:49 -07:00
Satish Balay
839ed9447c trilinos@14.4.0 revert kokkos-kernel dependency - as this breaks builds (#47852) 2024-12-02 11:44:37 -08:00
afzpatel
8e5a040985 ucc: add ROCm and rccl support (#46580) 2024-12-02 20:43:53 +01:00
Stephen Nicholas Swatman
5ddbb1566d benchmark: add version 1.9.1 (#47860)
This commit adds version 1.9.1 of Google Benchmark.
2024-12-02 11:42:38 -08:00
Massimiliano Culpo
eb17680d28 double-conversion: add dependency on c, and c++ (#47869) 2024-12-02 12:38:16 -07:00
Massimiliano Culpo
f4d81be9cf py-torch-nvidia-apex: add dependency on C (#47868) 2024-12-02 20:37:33 +01:00
Massimiliano Culpo
ea5ffe35f5 configuration: set egl as buildable:false (#47865) 2024-12-02 11:33:01 -08:00
Wouter Deconinck
1e37a77e72 mlpack: depends_on py-setuptools (#47828) 2024-12-02 12:04:53 +01:00
Todd Gamblin
29427d3e9e ruff: add v0.8.1 (#47851)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 10:49:47 +01:00
Todd Gamblin
2a2d1989c1 version_types: clean up type hierarchy and add annotations (#47781)
In preparation for adding `when=` to `version()`, I'm cleaning up the types in
`version_types` and making sure the methods here pass `mypy` checks. This started as an
attempt to use `ConcreteVersion` outside of `spack.version` and grew into a larger type
refactor.

The hierarchy now looks like this:

* `VersionType`
  * `ConcreteVersion`
    * `StandardVersion`
    * `GitVersion`
  * `ClosedOpenRange`
  * `VersionList`

Note that the top-level thing can't easily be `Version` as that is a method and it
returns only `ConcreteVersion` right now. I *could* do something fancy with `__new__` to
make `Version` a synonym for the `ConcreteVersion` constructor, which would allow it to
be used as a type. I could also do something similar with `VersionRange` but not sure if
it's worth it just to make these into types.

There are still some places where I think `GitVersion` might not be handled properly,
but I have not attempted to fix those here.

- [x] Add a top-level `VersionType` class that all version types extend from
- [x] Define and document common methods and rich comparisons on `VersionType`
- [x] Replace complicated `Union` types with `VersionType` and `ConcreteVersion` as needed
- [x] Annotate most methods (skipping `__getitem__` and friends as the typing is a pain)
- [x] Fix up the `VersionList` constructor a bit
- [x] Add cases to methods that weren't handling all `VersionType`s
- [x] Rework some places to clarify typing for `mypy`
- [x] Simplify / optimize _next_version
- [x] Make StandardVersion.string a property to enable lazy comparison

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 08:21:07 +01:00
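The hierarchy above, rendered as a Python skeleton for orientation only (the real classes carry many more methods and the shared rich comparisons):

```python
class VersionType:
    """Common base for all version objects; defines shared comparisons."""


class ConcreteVersion(VersionType):
    """A single, fully determined version."""


class StandardVersion(ConcreteVersion):
    """An ordinary dotted version such as 1.2.3."""


class GitVersion(ConcreteVersion):
    """A version pinned to a git commit or ref."""


class ClosedOpenRange(VersionType):
    """A half-open range [lo, hi)."""


class VersionList(VersionType):
    """An ordered collection of versions and ranges."""
```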
Wouter Deconinck
c6e292f55f py-nbdime: add v3.2.1 (#47445) 2024-11-29 15:59:11 -07:00
teddy
bf5e6b4aaf py-mpi4py: create mpi.cfg file; this file has been removed since v4.0.0, but the API is retained #47584
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-29 13:28:21 -06:00
Adam J. Stewart
9760089089 VTK: mark Python version compatibility (#47814)
* VTK: mark Python version compatibility

* VTK 8.2.0 also only supports Python 3.7
2024-11-29 13:04:56 -06:00
dmagdavector
da7c5c551d py-pip: add v23.2.1 -> v24.3.1 (#47753)
* py-pip: update to latest version 24.3.1 (plus some others)

* py-pip: note Python version dependency for new PIP versions
2024-11-29 17:18:19 +01:00
Harmen Stoppels
a575fa8529 gcc: add missing patches from Iain Sandoe's branch (#47843) 2024-11-29 08:10:04 +01:00
Massimiliano Culpo
39a65d88f6 fpm: add a dependency on c, and fortran (#47839)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13871774

Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
2024-11-29 08:07:50 +01:00
Massimiliano Culpo
06ff8c88ac py-torch-sparse: add a dependency on c (#47841)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870876
2024-11-29 08:06:46 +01:00
Massimiliano Culpo
a96b67ce3d miopen-hip: add a dependency on c (#47842)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870957
2024-11-29 07:25:43 +01:00
Harmen Stoppels
67d494fa0b filesystem.py: remove unused md5sum (#47832) 2024-11-28 18:43:21 +01:00
Harmen Stoppels
e37e53cfe8 traverse: add MixedDepthVisitor, use in cmake (#47750)
This visitor accepts the sub-dag of all nodes and unique edges that have
deptype X directly from given roots, or deptype Y transitively for any
of the roots.
2024-11-28 17:48:48 +01:00
Andrey Perestoronin
cf31d20d4c add new packages (#47817) 2024-11-28 09:49:52 -05:00
Harmen Stoppels
b74db341c8 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2024-11-28 14:57:28 +01:00
Daryl W. Grunau
e88a3f6f85 eospac: version 6.5.12 (#47826)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2024-11-28 12:32:35 +01:00
Massimiliano Culpo
9bd7483e73 Add further C and C++ dependencies to packages (#47821) 2024-11-28 10:50:35 +01:00
Harmen Stoppels
04c76fab63 hip: hints for find_package llvm/clang (#47788)
LLVM can be a transitive link dependency of hip through gl's dependency mesa, which uses it for software rendering.

In this case make sure llvm-amdgpu is found with find_package(LLVM) and
find_package(Clang) by setting LLVM_ROOT and Clang_ROOT.

That makes the patch of find_package's HINTS redundant, so remove that.
It did not work anyways, because CMAKE_PREFIX_PATH has higher precedence
than HINTS.
2024-11-28 10:23:09 +01:00
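A sketch of how a recipe might pass such hints; only the LLVM_ROOT and Clang_ROOT variable names come from the commit, the surrounding plumbing is an assumption:

```python
from spack.package import *


class Hip(CMakePackage):
    """Hypothetical excerpt of the hip recipe."""

    def cmake_args(self):
        # Point find_package(LLVM) / find_package(Clang) at llvm-amdgpu so a
        # transitive LLVM (e.g. pulled in via gl -> mesa) is not found instead.
        llvm = self.spec["llvm-amdgpu"].prefix
        return [self.define("LLVM_ROOT", llvm), self.define("Clang_ROOT", llvm)]
```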
Adam J. Stewart
ecbf9fcacf py-scooby: add v0.10.0 (#47790) 2024-11-28 10:21:36 +01:00
Victor A. P. Magri
69fb594699 hypre: add a variant to allow using internal lapack functions (#47780) 2024-11-28 10:15:12 +01:00
Howard Pritchard
d28614151f nghttp2: add v1.64.0 (#47800)
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2024-11-28 10:12:41 +01:00
etiennemlb
f1d6af6c94 netlib-scalapack: fix for some clang derivative (cce/rocmcc) (#45434) 2024-11-28 09:25:33 +01:00
Adam J. Stewart
192821f361 py-river: mark numpy 2 compatibility (#47813) 2024-11-28 09:24:21 +01:00
Adam J. Stewart
18790ca397 py-pyvista: VTK 9.4 not yet supported (#47815) 2024-11-28 09:23:41 +01:00
BOUDAOUD34
c22d77a38e dbcsr: patch for resolving .mod file conflicts in ROCm by implementing USE, INTRINSIC (#46181)
Co-authored-by: U-PALLAS\boudaoud <boudaoud@pc44.pallas.cines.fr>
2024-11-28 09:20:48 +01:00
Tom Payerle
d82bdb3bf7 seacas: update recipe to find faodel (#40239)
Explicitly sets the CMake variables Faodel_INCLUDE_DIRS and Faodel_LIBRARY_DIRS when +faodel.
This seems to be needed for recent versions of seacas (seacas@2021-04-02:), but should be safe
to do for all versions.

For Faodel_INCLUDE_DIRS, it looks like Faodel has header files under $(Faodel_Prefix)/include/faodel,
but seacas is not including the "faodel" part in #includes.  So add both $(Faodel_Prefix)/include
and $(Faodel_Prefix)/include/faodel

Co-authored-by: payerle <payerle@users.noreply.github.com>
2024-11-28 09:17:44 +01:00
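A sketch of the CMake plumbing this message describes; the variable names follow the commit, but the exact spelling in the actual recipe may differ:

```python
from spack.package import *


class Seacas(CMakePackage):
    """Hypothetical excerpt of the seacas recipe."""

    def cmake_args(self):
        args = []
        if self.spec.satisfies("+faodel"):
            faodel = self.spec["faodel"].prefix
            # seacas writes `#include <header.h>` rather than
            # `#include <faodel/header.h>`, so expose both include roots.
            args.append(
                self.define(
                    "Faodel_INCLUDE_DIRS",
                    ";".join([faodel.include, faodel.include.faodel]),
                )
            )
            args.append(self.define("Faodel_LIBRARY_DIRS", faodel.lib))
        return args
```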
Matt Thompson
a042bdfe0b mapl: add hpcx-mpi (#47793) 2024-11-28 09:15:25 +01:00
Adam J. Stewart
60e3e645e8 py-joblib: add v1.4.2 (#47789) 2024-11-28 08:28:44 +01:00
Chris Marsh
51785437bc Patch to fix building gcc@14.2 on darwin. Fixes #45628 (#47830) 2024-11-27 20:58:18 -07:00
dependabot[bot]
2e8db0815d build(deps): bump docker/build-push-action from 6.9.0 to 6.10.0 (#47819)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.9.0 to 6.10.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](4f58ea7922...48aba3b46d)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-27 16:29:41 -07:00
George Malerbo
8a6428746f raylib: add v5.5 (#47708)
* Add version 5.5 in package.py

* Update package.py
2024-11-27 16:25:22 -07:00
Adam J. Stewart
6b9c099af8 py-keras: add v3.7.0 (#47816) 2024-11-27 16:12:47 -07:00
Derek Ryan Strong
30814fb4e0 Deprecate rsync releases before v3.2.5 (#47820) 2024-11-27 16:14:34 -06:00
Harmen Stoppels
3194be2e92 gcc-runtime: remove libz.so from libgfortran.so if present (#47812) 2024-11-27 22:32:37 +01:00
snehring
41be2f5899 ltr-retriever: changing directory layout (#38513) 2024-11-27 14:16:57 -07:00
kwryankrattiger
02af41ebb3 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2024-11-27 15:13:55 -06:00
snehring
9d33c89030 r-rsamtools: add -lz to Makevars (#38649) 2024-11-27 13:44:48 -07:00
Erik Heeren
51ab7bad3b julia: conflict for %gcc@12: support (#35931) 2024-11-27 04:31:44 -07:00
kwryankrattiger
0b094f2473 Docs: Reference 7z requirement on Windows (#35943) 2024-11-26 17:11:12 -05:00
Christoph Junghans
cd306d0bc6 all-library: add voronoi support and git version (#47798)
* all-library: add voronoi support and git version

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-26 14:56:22 -07:00
Dom Heinzeller
fdb9cf2412 Intel/oneapi compilers: correct version ranges for diab-disable flag (#47428)
* c/c++ flags should have been modified for all 2023.x.y versions, but
  upper bound was too low
* Fortran flags should have been modified for all 2024.x.y versions, but
  likewise the upper bound was too low
2024-11-26 12:34:37 -07:00
etiennemlb
a546441d2e siesta: remove link args on a non-declared dependency (#46080) 2024-11-26 20:25:04 +01:00
IHuismann
141cdb6810 adol-c: fix libs property (#36614) 2024-11-26 17:01:18 +01:00
Brian Van Essen
f2ab74efe5 cray: add further versions of Cray packages. (#37733) 2024-11-26 16:59:53 +01:00
Martin Aumüller
38b838e405 openscenegraph: remove X11 dependencies for macos (#39037) 2024-11-26 16:59:10 +01:00
Mark Abraham
c037188b59 gromacs: announce deprecation policy and start to implement (#47804)
* gromacs: announce deprecation policy and start to implement

* Style it up

* [@spackbot] updating style on behalf of mabraham

* Bump versions used in CI

---------

Co-authored-by: mabraham <mabraham@users.noreply.github.com>
2024-11-26 05:54:07 -07:00
Mark Abraham
0835a3c5f2 gromacs: obtain SYCL from either ACpp or intel-oneapi-runtime (#47806) 2024-11-26 05:51:54 -07:00
Mark Abraham
38a2f9c2f2 gromacs: Improve HeFFTe dependency (#47805)
GROMACS supports HeFFTe with either a SYCL or a CUDA build and requires
a matching HeFFTe build.
2024-11-26 05:50:41 -07:00
Massimiliano Culpo
eecd4afe58 gromacs: fix the value used for the ITT directory (#47795) 2024-11-26 08:14:45 +01:00
Seth R. Johnson
83624551e0 ROOT: default to +aqua~x on macOS (#47792) 2024-11-25 14:27:38 -06:00
Victor A. P. Magri
741652caa1 caliper: add "tools" variant (#47779) 2024-11-25 18:26:53 +01:00
Mark Abraham
8e914308f0 gromacs: add itt variant (#47764)
Permit configuring GROMACS with support for mdrun to trace its timing
regions by calling the ITT API. This permits tools like VTune and
unitrace to augment their analysis with GROMACS-specific annotation.
2024-11-25 16:12:55 +01:00
Mikael Simberg
3c220d0989 apex: add 2.7.0 (#47736) 2024-11-25 13:22:16 +01:00
Wouter Deconinck
8094fa1e2f py-gradio: add v5.1.0 (and add/update dependencies) (fix CVEs) (#47504)
* py-pdm-backend: add v2.4.3
* py-starlette: add v0.28.0, v0.32.0, v0.35.1, v0.36.3, v0.37.2, v0.41.2
* py-fastapi: add v0.110.2, v0.115.4
* py-pydantic-extra-types: add v2.10.0
* py-pydantic-settings: add v2.6.1
* py-python-multipart: add v0.0.17
* py-email-validator: add v2.2.0
2024-11-25 13:07:56 +01:00
Massimiliano Culpo
5c67051980 Add missing C/C++ dependencies (#47782) 2024-11-25 12:56:39 +01:00
John W. Parent
c01fb9a6d2 Add CMake 3.31 minor release (#47676) 2024-11-25 04:32:57 -07:00
Harmen Stoppels
bf12bb57e7 install_test: first look at builder, then package (#47735) 2024-11-25 11:53:28 +01:00
Wouter Deconinck
406c73ae11 py-boto*: add v1.34.162 (#47528)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:39:09 +01:00
Wouter Deconinck
3f50ccfcdd py-azure-*: updated versions (#47525) 2024-11-25 11:38:49 +01:00
Wouter Deconinck
9883a2144d py-quart: add v0.19.8 (#47508)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:38:22 +01:00
Wouter Deconinck
94815d2227 py-netifaces: add v0.10.9, v0.11.0 (#47451) 2024-11-25 11:37:41 +01:00
Wouter Deconinck
a15563f890 py-werkzeug: add v3.1.3 (and deprecate older versions) (#47507)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:28:01 +01:00
Wouter Deconinck
ac2ede8d2f py-pyzmq: add v25.1.2, v26.0.3, v26.1.1, v26.2.0 (switch to py-scikit-build-core) (#44493) 2024-11-25 11:00:00 +01:00
david-edwards-linaro
b256a7c50d linaro-forge: remove v21.1.3 (#47688) 2024-11-25 10:53:27 +01:00
Szabolcs Horvát
21e10d6d98 igraph: add v0.10.15 (#47692) 2024-11-25 10:51:24 +01:00
afzpatel
ed39967848 rocm-tensile: add 6.2.1 (#47702) 2024-11-25 10:40:21 +01:00
Alex Richert
eda0c6888e ip: add cmake version requirement for @5.1: (#47754) 2024-11-25 02:38:08 -07:00
pauleonix
66055f903c cuda: Add v12.6.3 (#47721) 2024-11-25 10:36:11 +01:00
Dave Keeshan
a1c57d86c3 libusb: add v1.0.23:1.0.27 (#47727) 2024-11-25 10:33:08 +01:00
Dave Keeshan
9da8dcae97 verible: add v0.0.3841 (#47729) 2024-11-25 10:30:48 +01:00
jflrichard
c93f223a73 postgis: add version 3.1.2 (#47743) 2024-11-25 10:24:03 +01:00
Wouter Deconinck
f1faf31735 build-containers: determine latest release tag and push that as latest (#47742) 2024-11-25 10:20:58 +01:00
Stephen Herbener
8957ef0df5 Updated version specs for bufr-query package. (#47752) 2024-11-25 10:14:16 +01:00
Veselin Dobrev
347ec87fc5 mfem: add logic for the C++ standard level when using rocPRIM (#47751) 2024-11-25 10:13:22 +01:00
Adam J. Stewart
cd8c46e54e py-ruff: add v0.8.0 (#47758) 2024-11-25 10:02:31 +01:00
Adam J. Stewart
75b03bc12f glib: add v2.82.2 (#47766) 2024-11-24 20:55:18 +01:00
Adam J. Stewart
58511a3352 py-pandas: correct Python version constraint (#47770) 2024-11-24 17:48:16 +01:00
Adam J. Stewart
325873a4c7 py-fsspec: add v2024.10.0 (#47778) 2024-11-24 15:42:30 +01:00
Adam J. Stewart
9156e4be04 awscli-v2: add v2.22.4 (#47765) 2024-11-24 15:42:06 +01:00
Adam J. Stewart
12d3abc736 py-pytz: add v2024.2 (#47777) 2024-11-24 15:40:45 +01:00
Adam J. Stewart
4208aa6291 py-torchvision: add Python 3.13 support (#47776) 2024-11-24 15:40:11 +01:00
Adam J. Stewart
0bad754e23 py-scikit-learn: add Python 3.13 support (#47775) 2024-11-24 15:39:36 +01:00
Adam J. Stewart
cde2620f41 py-safetensors: add v0.4.5 (#47774) 2024-11-24 15:38:05 +01:00
Adam J. Stewart
a35aa038b0 py-pystac: add support for Python 3.12+ (#47772) 2024-11-24 15:37:43 +01:00
Adam J. Stewart
150416919e py-pydantic-core: add v2.27.1 (#47771) 2024-11-24 15:37:06 +01:00
Adam J. Stewart
281c274e0b py-jupyter-packaging: add Python 3.13 support (#47769) 2024-11-24 15:31:31 +01:00
Adam J. Stewart
16e130ece1 py-cryptography: mark Python 3.13 support (#47768) 2024-11-24 15:31:08 +01:00
Adam J. Stewart
7586303fba py-ruamel-yaml-clib: add Python compatibility bounds (#47773) 2024-11-24 15:28:45 +01:00
Harmen Stoppels
6501880fbf py-node-env: add v1.9.1 (#47762) 2024-11-24 15:27:16 +01:00
Harmen Stoppels
c76098038c py-ipykernel: split forward and backward compat bounds (#47763) 2024-11-24 15:26:15 +01:00
Harmen Stoppels
124b616b27 add a few forward compat bounds with python (#47761) 2024-11-24 15:23:11 +01:00
Adam J. Stewart
1148c8f195 gobject-introspection: Python 3.12 still not supported (#47767) 2024-11-24 03:53:32 -07:00
Adam J. Stewart
c57452dd08 py-cffi: support Python 3.12+ (#47713) 2024-11-24 08:41:29 +01:00
Harmen Stoppels
a7e57c9a14 py-opt-einsum: add v3.4.0 (#47759) 2024-11-24 08:40:29 +01:00
Teague Sterling
85d83f9c26 duckdb: add v1.1.3, deprecate <v1.1.0 (#47653)
* duckdb: add v1.0.0, v0.10.3

* Adding issue reference

* Adding issue reference

* duckdb: add v1.1.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: add v1.1.1

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: Fix missing depends_on(unixodbc, when=+odbc)

* Adding duckdb variants, removing old variants, removing deprecated versions

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb+static_openssl: Add pkgconfig and zlib-api to link zlib when needed

* duckdb: add v1.1.3

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py for CVE-2024-41672 as suggested

* [@spackbot] updating style on behalf of teaguesterling

* duckdb: add CVE comment before deprecated versions

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-23 13:13:40 -07:00
finkandreas
39a081d7fd Kokkos complex_align variant, Trilinos+PETSc enforcement for Kokkos~complex_align (#47686) 2024-11-23 07:45:22 -07:00
Harmen Stoppels
71b65bb424 py-opt-einsum: missing forward compat bound for python (#47757) 2024-11-23 10:48:07 +01:00
Adam J. Stewart
3dcbd118df py-cython: support Python 3.12+ (#47714)
and add various other compat bounds on dependents
2024-11-22 22:20:41 +01:00
Harmen Stoppels
5dacb774f6 itk: use vendored googletest (#47687)
External googletest breaks dependents because they end up with
ITK_LIBRARIES set to `GTest::GTest;GTest::Main`, which then ends up
literally in a nonsensical link line `-lGTest::GTest`.

The vendored googletest produces a cmake config file where
`ITKGoogleTest_LIBRARIES` is empty.
2024-11-22 18:41:23 +01:00
Harmen Stoppels
cb3d6549c9 traverse.py: ensure topo order is bfs for trees (#47720) 2024-11-22 15:04:19 +01:00
Mark Abraham
559c2f1eb9 gromacs: oneapi does not always require gcc (#47679)
* gromacs: oneapi does not always require gcc

* Support intel_provided_gcc only with %intel classic compiler

Require gcc only when needed with %intel

* New approach depending on gcc-runtime directly

* Update var/spack/repos/builtin/packages/gromacs/package.py

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>

---------

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>
2024-11-22 06:33:30 -07:00
Harmen Stoppels
ed1dbea77b eigen: self.builder.build_directory -> self.build_directory (#47728) 2024-11-22 07:20:38 +01:00
Seth R. Johnson
6ebafe4631 vecgeom: add v1.2.10 and delete unused, deprecated versions (#47725)
* vecgeom: add v1.2.10

* Remove unused+deprecated versions of vecgeom

* Deprecate older v1.2.x  versions

* [@spackbot] updating style on behalf of sethrj
2024-11-21 17:03:09 -07:00
Harmen Stoppels
7f0bb7147d README.md update old tutorial URL (#47718) 2024-11-21 16:46:46 +01:00
Satish Balay
f41b38e93d xsdk: add v1.1.0 (#47635)
xsdk: exclude pflotran, alquimia, exago

heffte: ~fftw when=+hip

dealii: ~sundials ~opencascade ~vtk ~taskflow
2024-11-21 08:08:27 -06:00
Massimiliano Culpo
5fd12b7bea Add further missing C, C++ dependencies to packages (#47662) 2024-11-21 14:49:12 +01:00
Mikael Simberg
fe746bdebb aws-ofi-nccl: Add 1.8.1 to 1.13.0 (#47717) 2024-11-21 05:37:57 -07:00
eugeneswalker
453af4b9f7 hdf5-vol-cache %cce: add -Wno-error=incompatible-function-pointer-types (#47698) 2024-11-20 14:56:19 -08:00
eugeneswalker
29cf1559cc netlib-scalapack %cce: add cflags -Wno-error=implicit-function-declaration (#47701) 2024-11-20 15:09:14 -07:00
eugeneswalker
a9b3e1670b mpifileutils%cce: append cflags -Wno-error=implicit-function-declaration (#47700) 2024-11-20 14:19:05 -07:00
kwryankrattiger
4f9aa6004b visit: add v3.4.0, v3.4.1 (#47161)
* Visit: Add new versions 3.4.0 and 3.4.1

* Adios2: Restrict python; 3.11 does not work for older Adios2

* VisIt: Set the VTK_VERSION for @3.4:

Older versions of VTK used the VTK_{MAJOR, MINOR}_VERSION variables for
VTK detection. VisIt >= 3.4 uses the full string VTK_VERSION.

* CI: Don't build llvm-amdgpu for non-HIP stack

* VisIt: v3.4.1 handles newer Adios2 correctly

* Visit: Add missing links in HDF5, set correct VTK version configuration parameter

* VisIt: Add py-pip requirement and patch visit with configuration changes

* HDF5 symlinks move when inside of callback

* VisIt ninja install fails with python module. Using make does not

* VisIt 3.4 has a high minimum cmake requirement

* HDF5: Early return when not mpi for mpi symlinks

* HDF5: Use platform agnostic method for creating legacy compatible MPI symlinks

* Fix VISIT_VTK_VERSION handling for 8.2.1a hack
2024-11-20 18:37:56 +01:00
Harmen Stoppels
aa2c18e4df spack style: import-check -> import, fix bugs, exclude spack.pkg (#47690) 2024-11-20 16:15:28 +01:00
dependabot[bot]
0ff3e86315 build(deps): bump codecov/codecov-action from 5.0.2 to 5.0.3 (#47683)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 5.0.2 to 5.0.3.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5c47607acb...05f5a9cfad)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:40:01 -06:00
dependabot[bot]
df208c1095 build(deps): bump docker/metadata-action from 5.5.1 to 5.6.1 (#47682)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.5.1 to 5.6.1.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](8e5442c4ef...369eb591f4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:39:45 -06:00
Chris Marsh
853f70edc8 cgal: update depends versions for 6.0.1 (#47516)
* This extends PR #47285 to properly include some of the required version constraints of cgal 6, including the C++ standard. It also adds the new no-gmp backend as a variant.

* fix style

* disable cgal@6 +demo variant as the demos require qt6 which is not in spack

* disable the gmp variant until clarity on how it's supposed to work is provided. Bound shared and header_only variants to relevant versions

* Fix missing msvc compiler limit, fix variant left in

* Add more comments. Better describe the gmp variant. Remove testing code

* fix style
2024-11-19 16:43:21 -07:00
Paul R. C. Kent
50970f866e Add llvm v19.1.4 (#47681) 2024-11-19 16:03:28 -07:00
Wouter Deconinck
8821300985 py-gevent: add v24.2.1, v24.10.3, v24.11.1 (#47646)
* py-gevent: add v24.2.1, v24.10.3
* py-gevent: add v24.11.1
2024-11-19 12:14:52 -08:00
AMD Toolchain Support
adc8e1d996 Restrict disabling dynamic thread scaling to version 3.1 only (#47673)
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-11-19 12:12:21 -08:00
Andrey Perestoronin
1e0aac6ac3 Add new 2025.0.1 Oneapi patch packages (#47678) 2024-11-19 11:38:42 -07:00
Harmen Stoppels
99e2313d81 openturns: fix deps (#47669) 2024-11-19 18:13:47 +01:00
Mark Abraham
22690a7576 Make oneAPI library-with-sdk specialize library class (#47632) 2024-11-19 12:12:10 -05:00
Harmen Stoppels
5325cfe865 systemd: symlink the internal libraries so they are found in rpath (#47667) 2024-11-19 15:28:49 +01:00
Harmen Stoppels
5333925dd7 sensei: fix install rpath for python extension (#47670) 2024-11-19 15:23:54 +01:00
Massimiliano Culpo
2db99e1ff6 gmp: fix cxx dependency, remove dependency on fortran (#47671) 2024-11-19 15:19:08 +01:00
Massimiliano Culpo
68aa712a3e solver: add a timeout handle for users (#47661)
This PR adds a configuration setting to allow setting time limits for concretization.

For backward compatibility, the default is to set no time limit.
2024-11-19 15:00:26 +01:00
Mikael Simberg
2e71bc640c pika: Add 0.30.1 (#47666) 2024-11-19 05:44:41 -07:00
Dom Heinzeller
661f3621a7 netcdf-cxx: add a maintainer (#47665) 2024-11-19 05:28:38 -07:00
Massimiliano Culpo
f182032337 Restore message when concretizing in parallel (#47663)
It was lost in #44843
2024-11-19 12:28:14 +00:00
teddy
066666b7b1 py-non-regression-test-tools: add v1.1.6 & remove v1.1.2 (tag removed) (#47622)
* py-non-regression-test-tools: add v1.1.6  & remove v1.1.2 (tag removed)
* Update var/spack/repos/builtin/packages/py-non-regression-test-tools/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-19 04:38:33 -07:00
Richard Berger
73316c3e28 cached_cmake: mpifc is not always defined (#46861)
* cached_cmake: mpifc is not always defined
* mpich: only depend on fortran when +fortran
2024-11-18 14:49:56 -08:00
afzpatel
c8e4ae08da eigen: enable ROCm support and add master version (#47332)
* eigen: enable ROCm support and add master version
* change boost dependency to only for tests
2024-11-18 14:41:02 -08:00
Tom Scogland
44225caade llvm: fix sysroot and build on darwin (#47336)
The default build of clang on darwin couldn't actually build anything
because it lacked a built-in sysroot. Several compilation errors finding
the system libc++ also cropped up, much like those in GCC; these have
been fixed.
2024-11-18 14:39:16 -08:00
Lydéric Debusschère
8d325d3e30 yambo: add v5.2.3, v5.2.4 (#47350)
* packages: Update 'yambo'
* add call to 'resource' method to download Ydriver and iotk during fetch instead of during build
* air-gapped installation could be performed since version 5.2.1
* add versions 5.2.3 and 5.2.4
* remove some nonexistent configure options for versions "@5:"
* add a sanity_check on 'bin/yambo'
2024-11-18 14:36:16 -08:00
Mosè Giordano
d0fd112006 grep: add executables attribute and determine_version method (#47438) 2024-11-18 14:30:46 -08:00
Wouter Deconinck
50f43ca71d py-gitpython: add v3.1.43 (fix CVEs) (#47444)
* py-gitpython: add v3.1.43
2024-11-18 14:24:20 -08:00
Vanessasaurus
2546fb6afa Automated deployment to update package flux-sched 2024-11-05 (#47449)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-11-18 14:23:35 -08:00
Vanessasaurus
10f6863d91 Automated deployment to update package flux-security 2024-11-05 (#47450)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-11-18 14:21:42 -08:00
afzpatel
63ea528606 mivisionx, migraphx and rocal: add and fix tests (#47453)
* fix mivisionx test, add migraphx build-time test and add rocal unit test
* fix miopen-hip linking issue
* remove setup_run_environment from roctracer-dev
* change patch file
2024-11-18 14:09:18 -08:00
snehring
89d2b9553d tb-lmto: new package (#47482)
* tb-lmto: adding new package tb-lmto
* tb-lmto: update copyright year
2024-11-18 14:01:59 -08:00
Dom Heinzeller
278326b4d9 Add py-phdf@0.11.4 and add conflict for py-pyhdf@0.10.4 with numpy@1.25: (#47656) 2024-11-18 13:53:38 -08:00
alvaro-sch
43c1a5e0ec orca: add v6.0.1, avx2-6.0.1 (#47489)
* orca: add 6.0.1 versions
* orca: checksum fix for avx2-6.0.1
* orca: fix version string for avx2-6.0.1
* orca: checksum fix for 6.0.1
2024-11-18 13:34:38 -08:00
Paul Gessinger
8feb506b3a geomodel: Fix dependencies (#47437)
* geomodel: Add dependency on `hdf5` for `+pythia`, require `hdf5+cxx`

* fix visualization dependencies

* geomodel: Add soqt dependency

* update dependency on soqt to drop explicit qt variant
2024-11-18 15:31:53 -06:00
Wouter Deconinck
627544191a py-pymongo: add v4.10.1 (fix CVE) (#47501)
* py-pymongo: add v4.10.1
* py-pymongo: fix copyright header spacing
* py-hatch-requirements-txt: add v0.4.1

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-18 12:53:32 -08:00
Wouter Deconinck
cf672ea8af py-waitress: add v3.0.1 (#47509) 2024-11-18 12:50:49 -08:00
green-br
2c4ac02adf Add option to not build GUI for WxWidgets. (#47526) 2024-11-18 12:43:23 -08:00
Niclas Jansson
7f76490b31 neko: add v0.8.1, v0.9.0 and fix package (#47558) 2024-11-18 12:38:08 -08:00
Thomas-Ulrich
46e4c1fd30 seissol: add versions, conflict (#47562)
* add a couple of seissol versions
2024-11-18 12:34:59 -08:00
Rémi Lacroix
85c5533e62 hpctoolkit: Update the minimum version for Python dependency (#47564)
New versions of HPCToolkit support Python from version 3.8.
2024-11-18 12:31:20 -08:00
Vicente Bolea
c47cafd11a diy: apply smoke_test patch in 3.6 only (#47572) 2024-11-18 12:26:45 -08:00
jmuddnv
8e33cc158b Changes for NVIDIA HPC SDK 24.11 (#47592) 2024-11-18 12:24:31 -08:00
Adam J. Stewart
f07173e5ee py-torchmetrics: add v1.6.0 (#47580) 2024-11-18 12:22:57 -08:00
Ken Raffenetti
118f5d2683 mpich: Remove incorrect dependency (#47586)
The gni libfabric provider works on some Cray systems, but not all. For
example, Slingshot-based machines use a different libfabric provider
(cxi). Therefore libfabric/gni should not be a dependency when using
Cray PMI.
2024-11-18 12:20:10 -08:00
Louise Spellacy
8fb2abc3cd linaro-forge: added version 24.1 (#47594) 2024-11-18 12:16:12 -08:00
afzpatel
3bcb8a9236 rocm-dbgapi: add pciutils dependency (#47605)
* add pciutils dependency
* add maintainer
2024-11-18 12:12:55 -08:00
Adam J. Stewart
a6fdd7608f py-huggingface-hub: add v0.26.2, hf_transfer (#47600) 2024-11-18 11:51:16 -08:00
Cody Balos
1ffd7125a6 sundials: set CUDAToolkit_ROOT (#47601)
* sundials: specify CUDAToolkit_ROOT
2024-11-18 11:48:16 -08:00
Jerome Soumagne
d1166fd316 mercury: add v2.4.0 (#47606) 2024-11-18 11:36:23 -08:00
Weiqun Zhang
b8eba1c677 amrex: add new variant fft for >= 24.11 (#47611) 2024-11-18 11:35:00 -08:00
Pranav Sivaraman
e3c0515076 tinycbor: new package (#47623) 2024-11-18 11:19:14 -08:00
Pranav Sivaraman
97406f241c tomlplusplus: new package (#47624)
* tomlplusplus: new package
* tomlplusplus: remove period
2024-11-18 11:15:30 -08:00
Wouter Deconinck
e1dfbbf611 py-greenlet: add v3.0.3, v3.1.1 (#47647)
* py-greenlet: add v3.0.3, v3.1.1
* py-greenlet: depends_on py-setuptools@40.8.0:
2024-11-18 12:13:41 -07:00
Paul
52147348c7 Added Go v1.23.3 (#47633) 2024-11-18 10:52:17 -08:00
Wouter Deconinck
aeb0ab6acf pocl: add v3.1, v4.0, v5.0, v6.0 (#47643) 2024-11-18 10:43:26 -08:00
Wouter Deconinck
6cd26b7603 clinfo: add v3.0.23.01.25 (#47644) 2024-11-18 10:41:32 -08:00
wspear
1c75d07f05 Fix self.spec reference (#47610)
@teaguesterling
2024-11-18 10:37:53 -08:00
Wouter Deconinck
15b3ff2a0a harfbuzz: add v10.1.0 (#47645) 2024-11-18 10:29:29 -08:00
Teague Sterling
e9f94d9bf2 ollama: add v0.4.2 (#47654)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-11-18 10:14:16 -08:00
Wouter Deconinck
299324c7ca dbus: add v1.15.12 (#47655) 2024-11-18 10:10:14 -08:00
Stephen Nicholas Swatman
dfab174f31 benchmark: add version 1.9.0 (#47658)
This commit adds Google Benchmark v1.9.0.
2024-11-18 07:04:52 -06:00
Stephen Nicholas Swatman
a86953fcb1 acts: add version 37.4.0 (#47657)
This commit adds v37.4.0 of the ACTS project; no new versions of the
dependencies were released.
2024-11-18 06:58:59 -06:00
Massimiliano Culpo
5f262eb5d3 Add further missing C, C++ dependencies to packages (#47649) 2024-11-18 09:39:47 +01:00
Wouter Deconinck
00f179ee6d root: add v6.32.08 (#47641) 2024-11-17 18:39:17 -06:00
Massimiliano Culpo
da4f7c2952 Add an audit to prevent using the name "all" in packages (#47651)
Packages cannot be named like that, since we use "all" to indicate
default settings under the "packages" section of the configuration.
2024-11-17 13:32:24 -07:00
Harmen Stoppels
fdedb6f95d style.py: add import-check for missing & redundant imports (#47619) 2024-11-17 09:18:48 +01:00
Massimiliano Culpo
067fefc46a Set "generic" uarch granularity for a few pipelines (#47640) 2024-11-17 09:07:42 +01:00
Massimiliano Culpo
42c9961bbe Added a few missing language deps to packages (#47639) 2024-11-17 09:07:08 +01:00
Wouter Deconinck
fe2bf4c0f9 pixman: add missing MesonPackage (#47607) 2024-11-17 09:03:15 +01:00
Harmen Stoppels
4d3b85c4d4 spack.package / builtin repo: fix exports/imports (#47617)
Add various missing imports in packages.
Remove redundant imports
Export NoLibrariesError, NoHeadersError, which_string in spack.package
2024-11-17 09:02:04 +01:00
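After this change, recipes can pull those names straight from the umbrella module, e.g.:

```python
from spack.package import NoHeadersError, NoLibrariesError, which_string
```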
Satish Balay
f05cbfbf44 xsdk: dealii has changes to variant defaults, update xsdk accordingly (#47602) 2024-11-16 16:42:16 -06:00
Wouter Deconinck
448049ccfc qt-tools: new package (#45602)
* qt-tools: new pkg with +designer to build Qt Designer for QWT

* qt-tools: fix style

* qt-tools: fix unused variable

* qt-tools: rm setup_run_environments (now in qt-base)

* qt-tools: add myself as maintainer

* qt-tools: add variant assistant; use commits with submodule

* qt-base: define QtPackage.get_git
2024-11-16 09:09:41 -06:00
etiennemlb
e56057fd79 gobject-introspection: Do not write to user home (#47621) 2024-11-16 11:11:52 +01:00
Harmen Stoppels
26d80e7bc5 py-blosc2: use external libblosc2 (#47566) 2024-11-16 09:43:54 +01:00
Dom Heinzeller
60eb0e9c80 Bug fix in py-scipy for versions 1.8.0 to 1.14.0 that surfaces with latest Clang and Intel LLVM compilers (#47620) 2024-11-16 06:56:25 +01:00
Thomas Bouvier
7443a3b572 py-wandb: add v0.16.6 (#43891)
* py-wandb: add version v0.16.6

* fix: typo

* py-wandb: py-click when @0.15.5:, py-pathtools when @:0.15

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 21:17:51 -07:00
dependabot[bot]
a5ba4f8d91 build(deps): bump codecov/codecov-action from 4.6.0 to 5.0.2 (#47631)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.6.0 to 5.0.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](b9fd7d16f6...5c47607acb)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-15 21:41:14 -06:00
Matthias Wolf
6ef0f495a9 py-libsonata: add v0.1.29 (#47466)
* py-libsonata: new version.

* Fix Python version dependency.

* py-libsonata: fix typo

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 20:27:41 -07:00
Matthias Wolf
e91b8c291a py-numpy-quaternion: add v2024.0.3 (#47469)
* py-numpy-quaternion: add new version.

* Update dependency version bounds

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix build dependencies.

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 21:16:16 -06:00
Wouter Deconinck
6662046aca armadillo: add v14.0.3 (#47634) 2024-11-15 19:57:48 -07:00
Matthieu Dorier
db83c62fb1 arrow: add v18.0.0 (#47494)
* arrow: added version 18.0.0

This PR adds version 18.0.0 to the arrow package.

* arrow: updated dependency on llvm
2024-11-15 20:54:43 -06:00
teddy
d4adfda385 costo: add v0.0.8 (#47625)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-15 18:10:52 -08:00
Matt Thompson
e8a8e2d98b mapl: add 2.40.3.1 (#47627)
* mapl: add 2.40.3.1
* Relax ESMF requirement
2024-11-15 18:09:22 -08:00
Paolo
55c770c556 Add ACfL 24.10.1 (#47616) 2024-11-15 18:05:38 -08:00
Thomas Gruber
33a796801c Likwid: add version 5.4.0 (#47630) 2024-11-15 18:04:21 -08:00
Seth R. Johnson
b90ac6441c celeritas: remove ancient versions and add CUDA package dependency (#47629)
* celeritas: remove deprecated versions through 0.3

* celeritas: deprecate old versions

* celeritas: add c++20 option

* Propagate vecgeom CUDA requirements

* Remove outdated conflicts and format it
2024-11-15 17:27:22 -07:00
dependabot[bot]
68b69aa9e3 build(deps): bump sphinx-rtd-theme in /lib/spack/docs (#47588)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 3.0.1 to 3.0.2.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/3.0.1...3.0.2)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-15 17:21:42 -06:00
Veselin Dobrev
ac0ed2c4cc [mfem] Add a patch for MFEM v4.7 that adds support for SUNDIALS v7 (#47591) 2024-11-15 08:50:44 -06:00
Harmen Stoppels
66a93b5433 Add missing llnl.* imports (#47618) 2024-11-15 15:49:25 +01:00
Harmen Stoppels
b7993317ea Improve type hints for package API (#47576)
by disentangling `package_base`, `builder` and `directives`.
2024-11-15 09:13:10 +01:00
etiennemlb
66622ec4d0 py-easybuild-framework: add python forward compat bound (#47597) 2024-11-14 23:47:23 -07:00
Pranav Sivaraman
9b2cd1b208 yyjson: new package (#47563)
* yyjson: new package

* [@spackbot] updating style on behalf of pranav-sivaraman

---------

Co-authored-by: pranav-sivaraman <pranav-sivaraman@users.noreply.github.com>
2024-11-14 11:01:21 -08:00
Dominic Hofer
9888683a21 eccodes: add v2.38.0 (#47581)
* eccodes: Add 2.38.0

* Update var/spack/repos/builtin/packages/eccodes/package.py
2024-11-14 10:57:59 -08:00
Massimiliano Culpo
fb46c7a72d Rework spack.database.InstallStatuses into a flag (#47321) 2024-11-14 15:43:31 +01:00
Massimiliano Culpo
c0196cde39 Remove support for PGI compilers (#47195) 2024-11-14 09:17:41 +01:00
Todd Gamblin
d091172d67 Spec: prefer a splice-specific method to __len__ (#47585)
Automatic splicing saw `Spec` grow a `__len__` method, but it's only used
in one place, and it's not clear the semantics are useful elsewhere. It also
runs the risk of Specs one day being confused for other types of containers.

Rather than introduce a new function for one algorithm, let's use a more
specific method in the splice code.

- [x] Use topological ordering in `_resolve_automatic_splices` instead of 
      sorting by node count
- [x] delete `Spec.__len__()` and `Spec.__bool__()`

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-13 23:20:03 -08:00
psakievich
ab51369087 Update tutorial version (#47593) 2024-11-14 08:15:11 +01:00
Matthieu Dorier
1cea82b629 xfsprogs: fix dependency on liburcu (#47582)
* xfsprogs: fix dependency on liburcu

* xfsprogs: fix install rules.d

* xfsprogs: edited xfsprogs requirement on liburcu

* xfsprogs: many more versions
2024-11-13 16:17:20 -06:00
H. Joe Lee
2abb711337 hermes-shm: remove duplicate line (#47575)
close #10
2024-11-13 09:29:25 -07:00
Todd Gamblin
6f948eb847 spack spec: simplify and unify output (#47574)
`spack spec` output has looked like this for a while:

```console
> spack spec /v5fn6xo /wd2p2v7
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...

Input spec
--------------------------------
 -   /wd2p2v7

Concretized
--------------------------------
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
```

But the input spec is right there on the CLI, and it doesn't add anything to the output.
Also, since #44843, specs concretized in the CLI line can be unified, so it makes sense
to display them as we did in #44489 -- as one multi-root tree instead of as multiple
single-root trees.

With this PR, concretize output now looks like this:

```console
> spack spec /v5fn6xo /wd2p2v7
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^gmake@4.4.1%apple-clang@16.0.0~guile build_system=generic arch=darwin-sequoia-m1
[+]      ^perl@5.40.0%apple-clang@16.0.0+cpanm+opcode+open+shared+threads build_system=generic arch=darwin-sequoia-m1
[+]          ^berkeley-db@18.1.40%apple-clang@16.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-sequoia-m1
[+]          ^bzip2@1.0.8%apple-clang@16.0.0~debug~pic+shared build_system=generic arch=darwin-sequoia-m1
[+]              ^diffutils@3.10%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]                  ^libiconv@1.17%apple-clang@16.0.0 build_system=autotools libs=shared,static arch=darwin-sequoia-m1
[+]          ^gdbm@1.23%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]              ^readline@8.2%apple-clang@16.0.0 build_system=autotools patches=bbf97f1 arch=darwin-sequoia-m1
[+]                  ^ncurses@6.5%apple-clang@16.0.0~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=darwin-sequoia-m1
[+]                      ^pkgconf@2.2.0%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]      ^zlib-ng@2.2.1%apple-clang@16.0.0+compat+new_strategies+opt+pic+shared build_system=autotools arch=darwin-sequoia-m1
[+]          ^gnuconfig@2022-09-17%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^py-setuptools@69.2.0%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[-]      ^py-wheel@0.41.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...
```

With no input spec displayed -- just the concretization output shown as one consolidated
tree and multiple roots.

- [x] remove "Input Spec" section and "Concretized" header from `spack spec` output
- [x] print concretized specs as one BFS tree instead of multiple

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-11-13 08:21:16 -07:00
Alec Scott
93bf0634f3 nlopt: reformat for best practices (#47340) 2024-11-13 08:20:56 -07:00
Luca Heltai
badb3cedcd dealii: add v9.6.0 (#45554)
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-11-13 15:34:27 +01:00
Harmen Stoppels
be918817d6 bump version to 0.24.0.dev0 (#47578) 2024-11-13 13:05:14 +01:00
Harmen Stoppels
41d9f687f6 missing and redundant imports (#47577) 2024-11-13 13:03:09 +01:00
dslarm
9642b04513 Add SVE as a variant for Neoverse N2, defaulting to true (#47567)
This default should be benchmarked to test whether it is the correct decision.
2024-11-12 22:09:05 -07:00
John Gouwar
bf16f0bf74 Add solver capability for synthesizing splices of ABI compatible packages. (#46729)
This PR provides 2 complementary features:
1. An augmentation to the package language to express ABI compatibility relationships among packages. 
2. An extension to the concretizer that can synthesize splices between ABI compatible packages.

1.  The `can_splice` directive and ABI compatibility 
We augment the package language with a single directive: `can_splice`. Here is an example of a package `Foo` exercising the `can_splice` directive:

class Foo(Package):
    version("1.0")
    version("1.1")
    variant("compat", default=True)
    variant("json", default=False)
    variant("pic", default=False)
    can_splice("foo@1.0", when="@1.1")
    can_splice("bar@1.0", when="@1.0+compat")
    can_splice("baz@1.0+compat", when="@1.0+compat", match_variants="*")
    can_splice("quux@1.0", when=@1.1~compat", match_variants="json")

Explanations of the uses of each directive: 
- `can_splice("foo@1.0", when="@1.1")`:  If `foo@1.0` is the dependency of an already installed spec and `foo@1.1` could be a valid dependency for the parent spec, then `foo@1.1` can be spliced in for `foo@1.0` in the parent spec.
- `can_splice("bar@1.0", when="@1.0+compat")`: If `bar@1.0` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `bar@1.0+compat` in the parent spec
-  `can_splice("baz@1.0", when="@1.0+compat", match_variants="*")`: If `baz@1.0+compat` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `baz@1.0+compat` in the parent spec, provided that they have the same value for all other variants (regardless of what those values are). 
-  `can_splice("quux@1.0", when=@1.1~compat", match_variants="json")`:If `quux@1.0` is the dependency of an already installed spec and `foo@1.1~compat` could be a valid dependency for the parent spec, then `foo@1.0~compat` can be spliced in for `quux@1.0` in the parent spec, provided that they have the same value for their `json` variant. 

2. Augmenting the solver to synthesize splices
### Changes to the hash encoding in `asp.py`
Previously, when including concrete specs in the solve, they would have the following form:

```
installed_hash("foo", "xxxyyy")
imposed_constraint("xxxyyy", "foo", "attr1", ...)
imposed_constraint("xxxyyy", "foo", "attr2", ...)
% etc.
```

Concrete specs now have the following form:

```
installed_hash("foo", "xxxyyy")
hash_attr("xxxyyy", "foo", "attr1", ...)
hash_attr("xxxyyy", "foo", "attr2", ...)
```

This transformation allows us to control which constraints are imposed when we select a hash, to facilitate the splicing of dependencies. 

2.1 Compiling `can_splice` directives in `asp.py`
Consider the concrete spec:

```
foo@2.72%gcc@11.4 arch=linux-ubuntu22.04-icelake build_system=autotools ^bar ...
```

It will emit the following facts for reuse (below is a subset):

```
installed_hash("foo", "xxxyyy")
hash_attr("xxxyyy", "hash", "foo", "xxxyyy")
hash_attr("xxxyyy", "version", "foo", "2.72")
hash_attr("xxxyyy", "node_os", "ubuntu22.04")
hash_attr("xxxyyy", "hash", "bar", "zzzqqq")
hash_attr("xxxyyy", "depends_on", "foo", "bar", "link")
```

Rules that derive `abi_splice_conditions_hold` will be generated from uses of the
`can_splice` directive. They will have the following form:

```
can_splice("foo@1.0.0+a", when="@1.0.1+a", match_variants=["b"]) --->

abi_splice_conditions_hold(0, node(SID, "foo"), "foo", BaseHash) :-
  installed_hash("foo", BaseHash),
  attr("node", node(SID, SpliceName)),
  attr("node_version_satisfies", node(SID, "foo"), "1.0.1"),
  hash_attr("hash", "node_version_satisfies", "foo", "1.0.1"),
  attr("variant_value", node(SID, "foo"), "a", "True"),
  hash_attr("hash", "variant_value", "foo", "a", "True"),
  attr("variant_value", node(SID, "foo"), "b", VariVar0),
  hash_attr("hash", "variant_value", "foo", "b", VariVar0).
```


2.2 Synthesizing splices in `concretize.lp` and `splices.lp`

The ASP solver generates "splice_at_hash" attrs to indicate that a particular node has a splice in one of its immediate dependencies. 

Splices can be introduced in the dependencies of concrete specs when `splices.lp` is conditionally loaded (based on the config option `concretizer:splice:True`).

2.3 Constructing spliced specs in `asp.py`

The method `SpecBuilder._resolve_splices` implements a top-down memoized implementation of hybrid splicing. This is an optimization over the more general `Spec.splice`, since the solver gives a global view of exactly which specs can be shared, to ensure the minimal number of splicing operations. 

Misc changes to facilitate configuration and benchmarking:
- Added the method `Solver.solve_with_stats` to expose timers from the public interface for easier benchmarking
- Added the boolean config option `concretizer:splice` to conditionally load splicing behavior (see the sketch below)
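
For context, a hedged sketch of what enabling splicing looks like in the concretizer config (key layout as in this PR's change to `concretizer.yaml`; the `automatic: true` value is illustrative):

```yaml
concretizer:
  splice:
    explicit: []      # explicit splice requests
    automatic: true   # let the solver synthesize splices between ABI-compatible packages
```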

Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-11-12 20:51:19 -08:00
v
ad518d975c py-nugraph, ph5concat, py-numl: Add new nugraph packages (#47315) 2024-11-13 01:34:11 +01:00
SXS Bot
a76e3f2030 spectre: add v2024.03.19 (#43275)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-11-12 15:16:27 -07:00
Greg Becker
1809b81e1d parse_specs: special case for concretizing lookups quickly (#47556)
We added unification semantics for parsing specs from the CLI, but there are a couple
of special cases in which we can avoid calls to the concretizer for speed when the
specs can all be resolved by lookups.

- [x] special case 1: solving a single spec

- [x] special case 2: all specs are either concrete (come from a file) or have an abstract
      hash. In this case, if `concretizer:unify:true`, we need an additional check to confirm
      the specs are compatible (illustrated below).

- [x] add a parameterized test for unifying on the CI
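
For illustration, a hedged example of the second special case (the hashes are hypothetical):

```console
# both specs carry abstract hashes, so they resolve by database lookup;
# with concretizer:unify:true only a compatibility check is added
$ spack install /abc1234 /def5678
```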

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-12 15:04:47 -07:00
Alec Scott
a02b40b670 restic: add v0.17.3 (#47553) 2024-11-12 14:15:53 -07:00
Alec Scott
6d8fdbcf82 direnv: add v2.35.0 (#47551) 2024-11-12 13:54:19 -07:00
Paul Gessinger
3dadf569a4 geomodel: Allow configuring C++ standard (#47422)
* geomodel: Allow configuring C++ standard

* drop c++11
2024-11-12 14:41:14 -05:00
Alec Scott
751585f1e3 glab: add v1.48.0 (#47552) 2024-11-12 12:07:34 -07:00
Wouter Deconinck
f6d6a5a480 parsec: update urls (#47416)
* parsec: update urls
* parsec: fix homepage
2024-11-12 11:31:57 -07:00
Matthieu Dorier
57a1ebc77e xfsprogs: fix dependency on gettext (#47547)
* xfsprogs: fix dependency on gettext

* changed dependency on gettext in xfsprogs

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-12 11:20:48 -07:00
Wouter Deconinck
acdcd1016a openssh: add v9.9p1 (#47555) 2024-11-12 10:04:12 -08:00
Matthieu Dorier
e7c9bb5258 py-constantly: add v23.10.4 (#47548)
* py-constantly: added version 23.10.4
* py-constantly: fixed dependency on py-versioneer
* py-constantly: updated py-versioneer dependency

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-12 09:53:34 -08:00
teddy
e083acdc5d costo: new package; to fix the build, add a pkgconfig dep to vtk (#47121)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-11-12 17:04:20 +01:00
Sebastian Pipping
99fd37931c expat: Add 2.6.4 with security fixes + deprecate vulnerable 2.6.3 (#47521) 2024-11-12 07:10:00 -07:00
Harmen Stoppels
00e68af794 llvm-amdgpu: add missing dependency on libxml2 (#47560) 2024-11-12 14:51:33 +01:00
Harmen Stoppels
e33cbac01f getting_started.rst: fix list of spack deps (#47557) 2024-11-12 08:59:07 +01:00
Wouter Deconinck
ada4c208d4 py-cryptography: add v43.0.3 (switch to maturin) (#47546)
* py-cryptography: add v43.0.3 (switch to maturin)
* py-cryptography: deny some setuptools versions
* py-cryptography: depends_on py-setuptools-rust when @42, no range

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-11 22:09:33 -07:00
Xavier Delaruelle
91310d3ae6 environment-modules: add version 5.5.0 (#47543)
This new version is compatible with Tcl 9.0. It also requires
'util-linux' for new logging capabilities.
2024-11-11 21:45:03 -07:00
Tim Haines
def1613741 gdb: add version 15.2 (#47540) 2024-11-11 21:44:33 -07:00
Mosè Giordano
ac703bc88d prometheus: add v2.55.1 (#47544) 2024-11-11 21:34:32 -07:00
Mikael Simberg
f0f5ffa9de libunwind: Add 1.7.2, 1.8.1, and new *-stable branches (#47412)
* libunwind: Add 1.7.2 and 1.8.1
* libunwind: Remove deprecated 1.1 version
* libunwind: Add newer *-stable branches; remove the 1.5-stable branch and clean up.
* libunwind: Use GitHub url for all versions
* libunwind: Add conflict for PPC and 1.8.*
* libunwind: Add conflict for aarch64 and 1.8:
   Build fails with

   aarch64/Gos-linux.c: In function '_ULaarch64_local_resume':
   aarch64/Gos-linux.c:147:1: error: x29 cannot be used in asm here
    }
    ^
   aarch64/Gos-linux.c:147:1: error: x29 cannot be used in asm here
   make[2]: *** [Makefile:4795: aarch64/Los-linux.lo] Error 1
2024-11-11 18:17:36 -08:00
Alberto Sartori
65929888de justbuild: add version 1.4.0 (#47410) 2024-11-11 18:15:07 -08:00
Luke Diorio-Toth
2987efa93c packages: new versions (diamond, py-alive-progress, py-bakta, py-deepsig-biocomp), new packages (py-pyhmmer, py-pyrodigal) (#47277)
* added updated versions
* added pyhmmer
* updated infernal
* fix blast-plus for apple-clang
* fix py-biopython build on apple-clang
* remove erroneous biopython dep: build issue is with python 3.8, not biopython
* deepsig python 3.9: expanding unnecessary python restrictions
* add pyrodigal
* fix unnecessarily strict diamond version
* builds and updates: blast-plus indexing broken, still need to test db download and bakta pipeline
* builds and runs
* revert blast-plus changes: remove my personal hacks to get blast-plus to build
2024-11-11 18:13:46 -08:00
Tim Haines
37de92e7a2 extrae: Update dyninst dependency (#47359) 2024-11-11 18:09:03 -08:00
Sreenivasa Murthy Kolam
42fd1cafe6 Fix the build error during compilation of rocdecode package (#47283)
* fix the build error during compilation of rocdecode; it was dependent on the libva-devel package
* address review comment
* address review changes; commit the changes
2024-11-11 18:05:21 -08:00
MatthewLieber
370694f112 osu-micro-benchmarks: add v7.5 (#47423)
* Adding sha for 7.4 release of OSU Micro Benchmarks
* Adds the sha256sum for the OSU micro benchmarks 7.5 release.
2024-11-11 18:01:39 -08:00
Wouter Deconinck
fc7125fdf3 py-fsspec-xrootd: new package (#47405)
* py-fsspec-xrootd: new package
* py-fsspec-xrootd: depends_on python@3.8:
2024-11-11 17:58:18 -08:00
Stephen Herbener
3fed708618 openmpi: add two_level_namespace variant for MacOS (#47202)
* Add a two_level_namespace variant (disabled by default) for macOS to enable building
executables and libraries with two-level namespace enabled (usage sketch below).
* Addressed reviewer comments.
* Moved the two_level_namespace variant ahead of the patch that uses it so that
concretization works properly.
* Removed extra print statements.
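
A hedged usage sketch (standard Spack variant syntax; the command line is illustrative):

```console
$ spack install openmpi+two_level_namespace   # macOS-only variant, off by default
```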
2024-11-11 17:55:28 -08:00
renjithravindrankannath
0614ded2ef Removing args to get libraries added in RPATH (#47465) 2024-11-11 17:54:19 -08:00
Satish Balay
e38e51a6bc superlu-dist: add v9.1.0, v9.0.0 (#47461)
Fix typo wrt @xiaoyeli
2024-11-11 19:52:19 -06:00
Wouter Deconinck
c44c938caf rsyslog: add v8.2410.0 (fix CVE) (#47511)
* rsyslog: add v8.2410.0
2024-11-11 17:50:02 -08:00
Wouter Deconinck
cdaacce4db varnish-cache: add v7.6.1 (#47513) 2024-11-11 17:47:53 -08:00
Wouter Deconinck
b98e5886e5 py-pyppeteer: new package (#47375)
* py-pyppeteer: new package
* py-pyee: new package (v11.1.1, v12.0.0)
2024-11-11 17:44:28 -08:00
Wouter Deconinck
09a88ad3bd xerces-c: add v3.3.0 (#47522) 2024-11-11 17:30:30 -08:00
Wouter Deconinck
4d91d3f77f scitokens-cpp: add v1.1.2 (#47523) 2024-11-11 17:28:06 -08:00
Wouter Deconinck
b748907a61 pixman: add v0.44.0 (switch to meson) (#47529)
* pixman: add v0.44.0 (switch to meson)
2024-11-11 17:26:29 -08:00
Wouter Deconinck
cbd9fad66e xtrans: add v1.5.2 (#47530) 2024-11-11 17:22:32 -08:00
Wouter Deconinck
82dd33c04c git: add v2.46.2, v2.47.0 (#47534) 2024-11-11 17:21:27 -08:00
teddy
31b2b790e7 py-non-regression-test-tools: add v1.1.4 (#47520)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-11 18:57:00 -06:00
Alec Scott
9fd698edcb fzf: add v0.56.2 (#47549) 2024-11-11 16:15:34 -08:00
Alec Scott
247446a8f3 bfs: add v4.0.4 (#47550) 2024-11-11 16:12:48 -08:00
Paul Gessinger
993f743245 soqt: new package (#47443)
* soqt: Add SoQt package

The geomodel package needs this if visualization is turned on.

* make qt versions explicit

* use virtual dependency for qt

* pr feedback

Remove myself as maintainer
Remove v1.6.0
Remove unused qt variant
2024-11-11 17:03:08 -06:00
Harmen Stoppels
786f8dfcce openmpi: fix detection (#47541)
Take a simpler approach to listing variant options -- store them in variables instead of trying to
roundtrip them through metadata dictionaries.
2024-11-11 14:14:38 -08:00
Harmen Stoppels
4691301eba Compiler.default_libc: early exit on darwin/win (#47554)
* Compiler.default_libc: early exit on darwin/win

* use .cc when testing c++ compiler if c compiler is missing
2024-11-11 14:12:43 -08:00
eugeneswalker
a55073e7b0 vtk-m %oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47477) 2024-11-11 13:57:45 -08:00
Harmen Stoppels
484c9cf47c py-pillow: patch for disabling optional deps (#47542) 2024-11-11 11:55:47 -07:00
Peter Scheibel
9ed5e1de8e Bugfix: spack find -x in environments (#46798)
This addresses part [1] of #46345

#44713 introduced a bug where all non-spec query parameters like date
ranges, -x, etc. were ignored when an env was active.

This fixes that issue and adds tests for it.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-11-11 10:13:31 -08:00
Massimiliano Culpo
4eb7b998e8 Move concretization tests to the same folder (#47539)
* Move concretization tests to the same folder

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Fix for clingo-cffi

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-11 19:01:24 +01:00
Satish Balay
3b423a67a2 butterflypack: add v3.2.0, strumpack: add v8.0.0 (#47462)
* butterflypack: add v3.2.0

* strumpack: add v8.0.0

* restrict fj patch to @1.2.0

* Update var/spack/repos/builtin/packages/butterflypack/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-11 11:18:03 -06:00
kwryankrattiger
b803dabb2c mirrors: allow username/password as environment variables (#46549)
`spack mirror add` and `set` now have flags `--oci-password-variable`, `--oci-password-variable`, `--s3-access-key-id-variable`, `--s3-access-key-secret-variable`, `--s3-access-token-variable`, which allows users to specify an environment variable in which a username or password is stored.

Storing plain text passwords in config files is considered deprecated.

The schema for mirrors.yaml has changed, notably the `access_pair` list is generally replaced with a dictionary of `{id: ..., secret_variable: ...}` or `{id_variable: ..., secret_variable: ...}`.
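
A hedged sketch of the new schema (the mirror name, URL, and environment-variable names are hypothetical):

```yaml
mirrors:
  my-mirror:
    url: s3://example-bucket/mirror
    access_pair:
      id: MY_KEY_ID                      # plain id, or use id_variable to read it from the environment
      secret_variable: MY_MIRROR_SECRET  # name of the environment variable holding the secret
```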
2024-11-11 16:34:39 +01:00
v
33dd894eff py-oracledb: add v1.4.2, v2.3.0, v2.4.1 (#47313)
The py-oracledb package only has a single outdated version available in its recipe. This PR adds a much broader range of versions and their corresponding checksums.

* add more versions of py-oracledb
* update py-oracledb recipe
* add py-cython version dependencies
* tweak py-cython version dependencies
* remove older versions of py-oracledb
2024-11-11 07:10:09 -08:00
Satish Balay
f458392c1b petsc: use --with-exodusii-dir [as exodus does not have 'libs()' to provide value for --with-exodusii-lib] (#47506) 2024-11-11 08:52:20 -06:00
Wouter Deconinck
8c962a94b0 vbfnlo: add v3.0; depends on tcsh (build) (#47532)
* vbfnlo: depends on tcsh (build)

* vbfnlo: add v3.0

* vbfnlo: comment

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>

---------

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-11-11 07:24:44 -07:00
Wouter Deconinck
8b165c2cfe py-gosam: add v2.1.2 (#47533) 2024-11-11 14:28:52 +01:00
Mikael Simberg
01edde35be ut: add 2.1.0 and 2.1.1 (#47538) 2024-11-11 14:07:08 +01:00
Massimiliano Culpo
84d33fccce llvm: filter clang-ocl from the executables being probed (#47536)
This filters any selected executable ending with `-ocl` from the list of executables being probed as candidates for external `llvm` installations.

I couldn't reproduce the entire issue, but with a simple script:
```
#!/bin/bash

touch foo.o
echo "clang version 10.0.0-4ubuntu1 "
echo "Target: x86_64-pc-linux-gnu"
echo "Thread model: posix"
echo "InstalledDir: /usr/bin"
exit 0
```
I noticed the executable was still probed:
```
$ spack -d compiler find /tmp/ocl
[ ... ]
==> [2024-11-11-08:38:41.933618] '/tmp/ocl/bin/clang-ocl' '--version'
```
and `foo.o` was left in the working directory. With this change, instead the executable is filtered out of the list on which we run `--version`, so `clang-ocl --version` is not run by Spack.
2024-11-11 05:05:01 -07:00
Todd Gamblin
c4a5a996a5 solver: avoid parsing specs in setup
- [x] Get rid of a call to `parser.quote_if_needed()` during solver setup, which
      introduces a circular import and also isn't necessary.

- [x] Rename `spack.variant.Value` to `spack.variant.ConditionalValue`, as it is *only*
      used for conditional values. This makes it much easier to understand some of the
      logic for variant definitions.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-11 01:54:57 -08:00
Todd Gamblin
6961514122 imports: move conditional to directives.py
`conditional()`, which defines conditional variant values, and the other ways to declare
variant values should probably be in a layer above `spack.variant`. This does the simple
thing and moves *just* `conditional()` to `spack.directives` to avoid a circular import.

We can revisit the public variant interface later, when we split packages from core.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-11 01:54:57 -08:00
Harmen Stoppels
a9e6074996 filesystem.py find: return directories and improve performance (#47537) 2024-11-11 09:43:23 +00:00
Giuncan
30db764449 lua: always generate pcfile without patch and remove +pcfile variant (#47353)
* lua: add +pcfile support for @5.4: versions, without using a version-dependent patch

* lua: always generate pcfile, remove +pcfile variant from all packages

* lua: minor fixes

* rpm: minor fix
2024-11-10 20:12:15 -06:00
Wouter Deconinck
f5b8b0ac5d mbedtls: add v2.28.9, v3.6.2 (fix CVEs) (#46637)
* mbedtls: add v2.28.9, v3.6.1 (fix CVEs)
* mbedtls: add v3.6.2
2024-11-10 15:09:06 -08:00
Dave Keeshan
913dcd97bc verilator: add v5.030 (#47455)
* Add 5.030 and remove the requirement to patch verilator; the problem has been fixed in this rev

* Update var/spack/repos/builtin/packages/verilator/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-10 16:07:12 -07:00
Adam J. Stewart
68570b7587 GDAL: add v3.10.0 (#47472) 2024-11-10 14:34:20 -06:00
Stephen Nicholas Swatman
2da4366ba6 benchmark: enable shared libraries by default (#47368)
* benchmark: enable shared libraries by default

The existing behaviour of Google Benchmark yields static objects, which
are of little use for most projects. This PR changes the spec to use
dynamic libraries instead.

* Add shared variant
2024-11-10 14:12:23 -06:00
Adam J. Stewart
2713b0c216 py-kornia: add v0.7.4 (#47435) 2024-11-10 13:21:01 -06:00
Matthieu Dorier
16b01c5661 librdkafka: added missing dependency on curl (#47500)
* librdkafka: added missing dependency on curl

This PR adds a missing dependency on curl in librdkafka.

* librdkafka: added dependency on openssl and zlib
2024-11-10 13:06:41 -06:00
dependabot[bot]
ebd4ef934c build(deps): bump types-six in /.github/workflows/requirements/style (#47454)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.20241009 to 1.16.21.20241105.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-10 13:03:37 -06:00
Kaan
97b5ec6e4f Add support for Codeplay AMD Plugin for Intel OneAPI Compilers (#46749)
* Added support for Codeplay AMD Plugin for Intel OneAPI Compilers

* [@spackbot] updating style on behalf of kaanolgu

* Adding 2025.0.0

* removed HOME and XDG_RUNTIME_DIR

* [@spackbot] updating style on behalf of kaanolgu

---------

Co-authored-by: Kaan Olgu <kaan.olgu@bristol.ac.uk>
2024-11-10 11:51:39 -07:00
Dave Keeshan
4c9bc8d879 Add v0.47 (#47456) 2024-11-10 11:51:07 -06:00
Chris Marsh
825fd1ccf6 Disable optional flexiblas support, since the system flexiblas might otherwise be used: flexiblas is not a dependency, and the build chain to support it is not set up. As it does not seem to be needed with the Spack-provided BLAS and LAPACK, it is easier to disable (#47514) 2024-11-10 10:47:12 -07:00
Matthieu Dorier
33109ce9b9 lksctp-tools: added version 1.0.21 (#47493)
Adds version 1.0.21 of lksctp-tools
2024-11-10 11:11:13 -06:00
Adam J. Stewart
fb5910d139 py-torchmetrics: add v1.5.2 (#47497) 2024-11-10 10:53:15 -06:00
JStewart28
fa6b8a4ceb beatnik: add v1.1 (#47361)
Co-authored-by: Patrick Bridges <patrickb314@gmail.com>
2024-11-09 13:43:55 -07:00
Dom Heinzeller
97acf2614a cprnc: set install rpath and add v1.0.8 (#47505) 2024-11-09 15:39:55 +01:00
eugeneswalker
e99bf48d28 Revert "upcxx %oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47503)" (#47512)
This reverts commit 4322cf56b1.
2024-11-09 06:12:46 -08:00
Massimiliano Culpo
b97015b791 ci: ci/all must always run, and fail if any job has status "fail" or "canceled" (#47517)
This means it succeeds when both jobs have either status "success"
or status "skipped"
2024-11-09 06:04:51 -08:00
Seth R. Johnson
1884520f7b root: fix macos build (#47483)
No ROOT `builtin` should ever be set to true if possible, because that
builds a vendored copy of an existing library that Spack may not know about.

Furthermore, using `builtin_glew` forces the package to be on, even when
not building x/gl/aqua on macos. This causes build failures.

Caused by https://github.com/spack/spack/pull/45632#issuecomment-2276311748 .
2024-11-09 07:30:38 -06:00
Todd Gamblin
7fbfb0f6dc Revert "fix patched dependencies across repositories (#42463)" (#47519)
This reverts commit da1d533877.
2024-11-09 10:25:25 +01:00
Massimiliano Culpo
11d276ab6f Fix style checks on develop (#47518)
`mypy` checks have been accidentally broken by #47213
2024-11-08 23:50:37 -08:00
Greg Becker
da1d533877 fix patched dependencies across repositories (#42463)
Currently, if a package has a dependency from another repository and patches it,
generation of the patch cache will fail. Concretization succeeds if a fixed patch
cache is in place.

- [x] don't assume that patched dependencies are in the same repo when indexing
- [x] add some test fixtures to support multi-repo tests.

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-08 18:07:40 -07:00
Harmen Stoppels
c6997e11a7 spack.compiler/spack.util.libc: add caching (#47213)
* spack.compiler: cache output

* compute libc from the dynamic linker at most once per spack process

* wrap compiler cache entry in class, add type hints

* test compiler caching

* ensure tests do not populate user cache, and fix 2 tests

* avoid recursion: cache lookup -> compute key -> cflags -> real_version -> cache lookup

* allow compiler execution in test that depends on get_real_version
2024-11-08 16:25:02 -08:00
eugeneswalker
4322cf56b1 upcxx %oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47503) 2024-11-08 15:12:50 -07:00
Harmen Stoppels
907a37145f llnl.util.filesystem: multiple entrypoints and max_depth (#47495)
If a package `foo` doesn't implement `libs`, the default was to search recursively for `libfoo` whenever asking for `spec[foo].libs` (this also happens automatically if a package includes `foo` as a link dependency).

This can lead to some strange behavior:
1. A package that is normally used as a build dependency (e.g. `cmake` at one point) is referenced like
`depends_on("cmake")`, which leads to a fully-recursive search for `libcmake` (this can take
   "forever" when CMake is registered as an external with a prefix like `/usr`, particularly on NFS mounts).
2. A similar hang can occur if a package is registered as an external with an incorrect prefix

- [x] Update the default library search to stop after a maximum depth (by default, search
  the root prefix and each directory in it, but no lower).
- [x] 

The following is a list of known changes to `find` compared to `develop`:

1. Matching directories are no longer returned -- `find` consistently only finds non-dirs, 
   even at `max_depth`
2. Symlinked directories are followed (needed to support max_depth)
3. `find(..., "dir/*.txt")` is allowed, for finding files inside certain dirs. These "complex"
   patterns are delegated to `glob`, like they are on `develop`.
4. `root` and `files` arguments both support generic sequences, and `root`
   allows both `str` and `path` types. This allows us to specify multiple entry points to `find`.
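
A hedged sketch of the updated entry points (call forms taken from this PR's docstring; the paths are illustrative, and the import assumes Spack's `lib` directories are on `sys.path`):

```python
from llnl.util.filesystem import find

# multiple roots and multiple patterns, non-recursive (search only directly inside the roots)
paths = find(["/usr/local/bin", "/usr/local/sbin"], ["python3", "getcap"], recursive=False)

# bounded recursion: *.txt files in /usr and subdirectories up to depth 2
texts = find("/usr", "*.txt", max_depth=2)

# "complex" patterns with path components are delegated to glob
headers = find("/usr", "GL/*.h")
```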

---------

Co-authored-by: Peter Scheibel <scheibel1@llnl.gov>
2024-11-08 13:55:53 -08:00
Harmen Stoppels
4778d2d332 Add missing imports (#47496) 2024-11-08 17:51:58 +01:00
Mikael Simberg
eb256476d2 pika: add 0.30.0 (#47498) 2024-11-08 17:07:50 +01:00
Alec Scott
ff26d2f833 spack env track command (#41897)
This PR adds a sub-command to `spack env` (`track`) which allows users to add/link
anonymous environments into their installation as named environments. This allows
users to more easily track their installed packages and the environments they're
dependencies of. For example, with the addition of #41731 it's now easier to remove
all packages not required by any environments with,

```
spack gc -bE
```

#### Usage
```
spack env track /path/to/env
==> Linked environment in /path/to/env
==> You can activate this environment with:
==>     spack env activate env
```

By default `track /path/to/env` will use the last directory in the path as the name of 
the environment. However users may customize the name of the linked environment
with `-n | --name`. Shown below.
```
spack env track /path/to/env --name foo 
==> Tracking environment in /path/to/env
==> You can activate this environment with:
==>     spack env activate foo
```

When removing a linked environment, Spack will remove the link to the environment
but will keep the structure of the environment within the directory. This will allow
users to remove a linked environment from their installation without deleting it from
a shared repository.

There is a `spack env untrack` command that can be used to *only* untrack a tracked
environment -- it will fail if it is used on a managed environment.  Users can also use
`spack env remove` to untrack an environment.

This allows users to continue to share environments in git repositories  while also having
the dependencies of those environments be remembered by Spack.

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-08 00:16:01 -08:00
Harmen Stoppels
ed916ffe6c Revert "filesystem.py: add max_depth argument to find (#41945)"
This reverts commit 38c8069ab4.
2024-11-07 13:09:10 -08:00
Harmen Stoppels
4fbdf2f2c0 Revert "llnl.util.filesystem.find: restore old error handling (#47463)"
This reverts commit a31c525778.
2024-11-07 13:09:10 -08:00
Harmen Stoppels
60ba61f6b2 Revert "llnl.util.filesystem.find: multiple entrypoints (#47436)"
This reverts commit 73219e4b02.
2024-11-07 13:09:10 -08:00
Chris White
0a4563fd02 silo package: update patch (#47457)
Update patch based on LLNL/Silo#319 to fix build of 4.10.2
2024-11-07 12:49:26 -08:00
Marc T. Henry de Frahan
754408ca2b Add fast farm variant to openfast (#47486) 2024-11-07 13:21:38 -07:00
Harmen Stoppels
0d817878ea spec.py: fix comparison with multivalued variants (#47485) 2024-11-07 19:29:37 +00:00
826 changed files with 14498 additions and 8389 deletions

View File

@@ -57,7 +57,13 @@ jobs:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
- name: Determine latest release tag
id: latest
run: |
git fetch --quiet --tags
echo "tag=$(git tag --list --sort=-v:refname | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1)" | tee -a $GITHUB_OUTPUT
- uses: docker/metadata-action@369eb591f429131d6889c46b94e711f089e6ca96
id: docker_meta
with:
images: |
@@ -71,6 +77,7 @@ jobs:
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
type=raw,value=latest,enable=${{ github.ref == format('refs/tags/{0}', steps.latest.outputs.tag) }}
- name: Generate the Dockerfile
env:
@@ -113,7 +120,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@4f58ea79222b3b9dc2c8bbdd6debcef730109a75
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -83,10 +83,17 @@ jobs:
all-prechecks:
needs: [ prechecks ]
if: ${{ always() }}
runs-on: ubuntu-latest
steps:
- name: Success
run: "true"
run: |
if [ "${{ needs.prechecks.result }}" == "failure" ] || [ "${{ needs.prechecks.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
else
exit 0
fi
coverage:
needs: [ unit-tests, prechecks ]
@@ -94,8 +101,19 @@ jobs:
secrets: inherit
all:
needs: [ coverage, bootstrap ]
needs: [ unit-tests, coverage, bootstrap ]
if: ${{ always() }}
runs-on: ubuntu-latest
# See https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/accessing-contextual-information-about-workflow-runs#needs-context
steps:
- name: Success
run: "true"
- name: Status summary
run: |
if [ "${{ needs.unit-tests.result }}" == "failure" ] || [ "${{ needs.unit-tests.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
elif [ "${{ needs.bootstrap.result }}" == "failure" ] || [ "${{ needs.bootstrap.result }}" == "canceled" ]; then
echo "Bootstrap tests failed."
exit 1
else
exit 0
fi

View File

@@ -29,6 +29,7 @@ jobs:
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@b9fd7d16f6d7d1b5d2bec1a2887e65ceed900238
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
with:
verbose: true
fail_ci_if_error: true

View File

@@ -3,5 +3,5 @@ clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20241009
types-six==1.17.0.20241205
vermin==1.6.0

View File

@@ -174,7 +174,7 @@ jobs:
spack bootstrap disable github-actions-v0.6
spack bootstrap status
spack solve zlib
spack unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml lib/spack/spack/test/concretize.py
spack unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml lib/spack/spack/test/concretization/core.py
- uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882
with:
name: coverage-clingo-cffi

View File

@@ -70,7 +70,7 @@ Tutorial
----------------
We maintain a
[**hands-on tutorial**](https://spack.readthedocs.io/en/latest/tutorial.html).
[**hands-on tutorial**](https://spack-tutorial.readthedocs.io/).
It covers basic to advanced usage, packaging, developer features, and large HPC
deployments. You can do all of the exercises on your own laptop using a
Docker container.

View File

@@ -39,7 +39,8 @@ concretizer:
# Option to deal with possible duplicate nodes (i.e. different nodes from the same package) in the DAG.
duplicates:
# "none": allows a single node for any package in the DAG.
# "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
# "minimal": allows the duplication of 'build-tools' nodes only
# (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: minimal
# Option to specify compatibility between operating systems for reuse of compilers and packages
@@ -47,3 +48,18 @@ concretizer:
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
# requires two entries i.e. os_compatible: {sonoma: [monterey], monterey: [sonoma]}
os_compatible: {}
# Option to specify whether to support splicing. Splicing allows for
# the relinking of concrete package dependencies in order to better
# reuse already built packages with ABI compatible dependencies
splice:
explicit: []
automatic: false
# Maximum time, in seconds, allowed for the 'solve' phase. If set to 0, there is no time limit.
timeout: 0
# If set to true, exceeding the timeout will always result in a concretization error. If false,
# the best (suboptimal) model computed before the timeout is used.
#
# Setting this to false yields unreproducible results, so we advise to use that value only
# for debugging purposes (e.g. check which constraints can help Spack concretize faster).
error_on_timeout: true

View File

@@ -76,6 +76,8 @@ packages:
buildable: false
cray-mvapich2:
buildable: false
egl:
buildable: false
fujitsu-mpi:
buildable: false
hpcx-mpi:

View File

@@ -237,3 +237,35 @@ is optional -- by default, splices will be transitive.
``mpich/abcdef`` instead of ``mvapich2`` as the MPI provider. Spack
will warn the user in this case, but will not fail the
concretization.
.. _automatic_splicing:
^^^^^^^^^^^^^^^^^^
Automatic Splicing
^^^^^^^^^^^^^^^^^^
The Spack solver can be configured to do automatic splicing for
ABI-compatible packages. Automatic splices are enabled in the concretizer
config section
.. code-block:: yaml
concretizer:
splice:
automatic: True
Packages can include ABI-compatibility information using the
``can_splice`` directive. See :ref:`the packaging
guide<abi_compatibility>` for instructions on specifying ABI
compatibility using the ``can_splice`` directive.
.. note::
The ``can_splice`` directive is experimental and may be changed in
future versions.
When automatic splicing is enabled, the concretizer will combine any
number of ABI-compatible specs if possible to reuse installed packages
and packages available from binary caches. The end result of these
specs is equivalent to a series of transitive/intransitive splices,
but the series may be non-obvious.

View File

@@ -210,7 +210,7 @@ def setup(sphinx):
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),
("py:class", "spack.build_systems._checks.BaseBuilder"),
("py:class", "spack.build_systems._checks.BuilderWithDefaults"),
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.StandardVersion"),
("py:class", "spack.spec.DependencySpec"),
@@ -221,6 +221,7 @@ def setup(sphinx):
("py:class", "spack.filesystem_view.SimpleFilesystemView"),
("py:class", "spack.traverse.EdgeAndDepth"),
("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
("py:class", "spack.compiler.CompilerCache"),
# TypeVar that is not handled correctly
("py:class", "llnl.util.lang.T"),
]

View File

@@ -1042,7 +1042,7 @@ file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given
package. This view selects all packages that depend on MPI, and
excludes those built with the PGI compiler at version 18.5.
excludes those built with the GCC compiler at version 18.5.
The root specs with their (transitive) link and run type dependencies
will be put in the view due to the ``link: all`` option,
and the files in the view will be symlinks to the spack install
@@ -1056,7 +1056,7 @@ directories.
mpis:
root: /path/to/view
select: [^mpi]
exclude: ['%pgi@18.5']
exclude: ['%gcc@18.5']
projections:
all: '{name}/{version}-{compiler.name}'
link: all

View File

@@ -35,7 +35,7 @@ A build matrix showing which packages are working on which systems is shown belo
.. code-block:: console
apt update
apt install build-essential ca-certificates coreutils curl environment-modules gfortran git gpg lsb-release python3 python3-distutils python3-venv unzip zip
apt install bzip2 ca-certificates file g++ gcc gfortran git gzip lsb-release patch python3 tar unzip xz-utils zstd
.. tab-item:: RHEL
@@ -43,14 +43,14 @@ A build matrix showing which packages are working on which systems is shown belo
dnf install epel-release
dnf group install "Development Tools"
dnf install curl findutils gcc-gfortran gnupg2 hostname iproute redhat-lsb-core python3 python3-pip python3-setuptools unzip python3-boto3
dnf install gcc-gfortran redhat-lsb-core python3 unzip
.. tab-item:: macOS Brew
.. code-block:: console
brew update
brew install curl gcc git gnupg zip
brew install gcc git zip
------------
Installation
@@ -283,10 +283,6 @@ compilers`` or ``spack compiler list``:
intel@14.0.1 intel@13.0.1 intel@12.1.2 intel@10.1
-- clang -------------------------------------------------------
clang@3.4 clang@3.3 clang@3.2 clang@3.1
-- pgi ---------------------------------------------------------
pgi@14.3-0 pgi@13.2-0 pgi@12.1-0 pgi@10.9-0 pgi@8.0-1
pgi@13.10-0 pgi@13.1-1 pgi@11.10-0 pgi@10.2-0 pgi@7.1-3
pgi@13.6-0 pgi@12.8-0 pgi@11.1-0 pgi@9.0-4 pgi@7.0-6
Any of these compilers can be used to build Spack packages. More on
how this is done is in :ref:`sec-specs`.
@@ -806,65 +802,6 @@ flags to the ``icc`` command:
spec: intel@15.0.24.4.9.3
^^^
PGI
^^^
PGI comes with two sets of compilers for C++ and Fortran,
distinguishable by their names. "Old" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgCC
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgf77
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgf90
"New" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgc++
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
Older installations of PGI contains just the old compilers; whereas
newer installations contain the old and the new. The new compiler is
considered preferable, as some packages
(``hdf``) will not build with the old compiler.
When auto-detecting a PGI compiler, there are cases where Spack will
find the old compilers, when you really want it to find the new
compilers. It is best to check this ``compilers.yaml``; and if the old
compilers are being used, change ``pgf77`` and ``pgf90`` to
``pgfortran``.
Other issues:
* There are reports that some packages will not build with PGI,
including ``libpciaccess`` and ``openssl``. A workaround is to
build these packages with another compiler and then use them as
dependencies for PGI-build packages. For example:
.. code-block:: console
$ spack install openmpi%pgi ^libpciaccess%gcc
* PGI requires a license to use; see :ref:`licensed-compilers` for more
information on installation.
.. note::
It is believed the problem with HDF 4 is that everything is
compiled with the ``F77`` compiler, but at some point some Fortran
90 code slipped in there. So compilers that can handle both FORTRAN
77 and Fortran 90 (``gfortran``, ``pgfortran``, etc) are fine. But
compilers specific to one or the other (``pgf77``, ``pgf90``) won't
work.
^^^
NAG
^^^
@@ -1389,6 +1326,7 @@ Required:
* Microsoft Visual Studio
* Python
* Git
* 7z
Optional:
* Intel Fortran (needed for some packages)
@@ -1454,6 +1392,13 @@ as the project providing Git support on Windows. This is additionally the recomm
for installing Git on Windows, a link to which can be found above. Spack requires the
utilities vendored by this project.
"""
7zip
"""
A tool for extracting ``.xz`` files is required for extracting source tarballs. The latest 7zip
can be located at https://sourceforge.net/projects/sevenzip/.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 2: Install and setup Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -1267,7 +1267,7 @@ Git fetching supports the following parameters to ``version``:
This feature requires ``git`` to be version ``2.25.0`` or later but is useful for
large repositories that have separate portions that can be built independently.
If paths provided are directories then all the subdirectories and associated files
will also be cloned.
will also be cloned.
Only one of ``tag``, ``branch``, or ``commit`` can be used at a time.
@@ -1367,8 +1367,8 @@ Submodules
git-submodule``.
Sparse-Checkout
You can supply ``git_sparse_paths`` at the package or version level to utilize git's
sparse-checkout feature. This will only clone the paths that are specified in the
You can supply ``git_sparse_paths`` at the package or version level to utilize git's
sparse-checkout feature. This will only clone the paths that are specified in the
``git_sparse_paths`` attribute for the package along with the files in the top level directory.
This feature allows you to only clone what you need from a large repository.
Note that this is a newer feature in git and requires git ``2.25.0`` or greater.
@@ -1928,71 +1928,29 @@ to the empty list.
String. A URL pointing to license setup instructions for the software.
Defaults to the empty string.
For example, let's take a look at the package for the PGI compilers.
For example, let's take a look at the Arm Forge package.
.. code-block:: python
# Licensing
license_required = True
license_comment = "#"
license_files = ["license.dat"]
license_vars = ["PGROUPD_LICENSE_FILE", "LM_LICENSE_FILE"]
license_url = "http://www.pgroup.com/doc/pgiinstall.pdf"
license_comment = "#"
license_files = ["licences/Licence"]
license_vars = [
"ALLINEA_LICENSE_DIR",
"ALLINEA_LICENCE_DIR",
"ALLINEA_LICENSE_FILE",
"ALLINEA_LICENCE_FILE",
]
license_url = "https://developer.arm.com/documentation/101169/latest/Use-Arm-Licence-Server"
As you can see, PGI requires a license. Its license manager, FlexNet, uses
the ``#`` symbol to denote a comment. It expects the license file to be
named ``license.dat`` and to be located directly in the installation prefix.
If you would like the installation file to be located elsewhere, simply set
``PGROUPD_LICENSE_FILE`` or ``LM_LICENSE_FILE`` after installation. For
further instructions on installation and licensing, see the URL provided.
Arm Forge requires a license. Its license manager uses the ``#`` symbol to denote a comment.
It expects the license file to be named ``Licence`` and to be located in a ``licences`` directory
in the installation prefix.
Let's walk through a sample PGI installation to see exactly what Spack is
and isn't capable of. Since PGI does not provide a download URL, it must
be downloaded manually. It can either be added to a mirror or located in
the current directory when ``spack install pgi`` is run. See :ref:`mirrors`
for instructions on setting up a mirror.
After running ``spack install pgi``, the first thing that will happen is
Spack will create a global license file located at
``$SPACK_ROOT/etc/spack/licenses/pgi/license.dat``. It will then open up the
file using :ref:`your favorite editor <controlling-the-editor>`. It will look like
this:
.. code-block:: sh
# A license is required to use pgi.
#
# The recommended solution is to store your license key in this global
# license file. After installation, the following symlink(s) will be
# added to point to this file (relative to the installation prefix):
#
# license.dat
#
# Alternatively, use one of the following environment variable(s):
#
# PGROUPD_LICENSE_FILE
# LM_LICENSE_FILE
#
# If you choose to store your license in a non-standard location, you may
# set one of these variable(s) to the full pathname to the license file, or
# port@host if you store your license keys on a dedicated license server.
# You will likely want to set this variable in a module file so that it
# gets loaded every time someone tries to use pgi.
#
# For further information on how to acquire a license, please refer to:
#
# http://www.pgroup.com/doc/pgiinstall.pdf
#
# You may enter your license below.
You can add your license directly to this file, or tell FlexNet to use a
license stored on a separate license server. Here is an example that
points to a license server called licman1:
.. code-block:: none
SERVER licman1.mcs.anl.gov 00163eb7fba5 27200
USE_SERVER
If you would like the installation file to be located elsewhere, simply set ``ALLINEA_LICENSE_DIR`` or
one of the other license variables after installation. For further instructions on installation and
licensing, see the URL provided.
If your package requires the license to install, you can reference the
location of this global license using ``self.global_license_file``.
@@ -2392,7 +2350,7 @@ by the ``--jobs`` option:
.. code-block:: python
:emphasize-lines: 7, 11
:linenos:
class Xios(Package):
...
def install(self, spec, prefix):
@@ -2967,9 +2925,9 @@ make sense during the build phase may not be needed at runtime, and vice versa.
it makes sense to let a dependency set the environment variables for its dependents. To allow all
this, Spack provides four different methods that can be overridden in a package:
1. :meth:`setup_build_environment <spack.builder.Builder.setup_build_environment>`
1. :meth:`setup_build_environment <spack.builder.BaseBuilder.setup_build_environment>`
2. :meth:`setup_run_environment <spack.package_base.PackageBase.setup_run_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.Builder.setup_dependent_build_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.BaseBuilder.setup_dependent_build_environment>`
4. :meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
The Qt package, for instance, uses this call:
@@ -5420,7 +5378,7 @@ by build recipes. Examples of checking :ref:`variant settings <variants>` and
determine whether it needs to also set up build dependencies (see
:ref:`test-build-tests`).
The ``MyPackage`` package below provides two basic test examples:
The ``MyPackage`` package below provides two basic test examples:
``test_example`` and ``test_example2``. The first runs the installed
``example`` and ensures its output contains an expected string. The second
runs ``example2`` without checking output so is only concerned with confirming
@@ -5737,7 +5695,7 @@ subdirectory of the installation prefix. They are automatically copied to
the appropriate relative paths under the test stage directory prior to
executing stand-alone tests.
.. tip::
.. tip::
*Perform test-related conversions once when copying files.*
@@ -7113,6 +7071,46 @@ might write:
CXXFLAGS += -I$DWARF_PREFIX/include
CXXFLAGS += -L$DWARF_PREFIX/lib
.. _abi_compatibility:
----------------------------
Specifying ABI Compatibility
----------------------------
Packages can include ABI-compatibility information using the
``can_splice`` directive. For example, if ``Foo`` version 1.1 can
always replace version 1.0, then the package could have:
.. code-block:: python
can_splice("foo@1.0", when="@1.1")
For virtual packages, packages can also specify ABI-compatibility with
other packages providing the same virtual. For example, ``zlib-ng``
could specify:
.. code-block:: python
can_splice("zlib@1.3.1", when="@2.2+compat")
Some packages have ABI-compatibility that is dependent on matching
variant values, either for all variants or for some set of
ABI-relevant variants. In those cases, it is not necessary to specify
the full combinatorial explosion. The ``match_variants`` keyword can
cover all single-value variants.
.. code-block:: python
can_splice("foo@1.1", when="@1.2", match_variants=["bar"]) # any value for bar as long as they're the same
can_splice("foo@1.2", when="@1.3", match_variants="*") # any variant values if all single-value variants match
The concretizer will use ABI compatibility to determine automatic
splices when :ref:`automatic splicing<automatic_splicing>` is enabled.
.. note::
The ``can_splice`` directive is experimental, and may be replaced
by a higher-level interface in future versions of Spack.
.. _package_class_structure:

View File

@@ -1,12 +1,12 @@
sphinx==8.1.3
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.1
sphinx-rtd-theme==3.0.2
python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.18.0
urllib3==2.2.3
pytest==8.3.3
pytest==8.3.4
isort==5.13.2
black==24.10.0
flake8==7.1.1

View File

@@ -20,7 +20,20 @@
import tempfile
from contextlib import contextmanager
from itertools import accumulate
from typing import Callable, Deque, Dict, Iterable, List, Match, Optional, Set, Tuple, Union
from typing import (
Callable,
Deque,
Dict,
Generator,
Iterable,
List,
Match,
Optional,
Sequence,
Set,
Tuple,
Union,
)
import llnl.util.symlink
from llnl.util import tty
@@ -85,6 +98,8 @@
"visit_directory_tree",
]
Path = Union[str, pathlib.Path]
if sys.version_info < (3, 7, 4):
# monkeypatch shutil.copystat to fix PermissionError when copying read-only
# files on Lustre when using Python < 3.7.4
@@ -1674,51 +1689,44 @@ def find_first(root: str, files: Union[Iterable[str], str], bfs_depth: int = 2)
def find(
root: Union[str, List[str]],
files: Union[str, List[str]],
root: Union[Path, Sequence[Path]],
files: Union[str, Sequence[str]],
recursive: bool = True,
max_depth: Optional[int] = None,
) -> List[str]:
"""Finds all non-directory files matching the filename patterns from ``files`` starting from
``root``. This function returns a deterministic result for the same input and directory
structure when run multiple times. Symlinked directories are followed, and unique directories
are searched only once. Each matching file is returned only once at lowest depth in case
multiple paths exist due to symlinked directories. The function has similarities to the Unix
``find`` utility.
Examples:
.. code-block:: console
$ find -L /usr -name python3 -type f
is roughly equivalent to
>>> find("/usr", "python3")
with the notable difference that this function only lists a single path to each file in case of
"""Finds all files matching the patterns from ``files`` starting from ``root``. This function
returns a deterministic result for the same input and directory structure when run multiple
times. Symlinked directories are followed, and unique directories are searched only once. Each
matching file is returned only once at lowest depth in case multiple paths exist due to
symlinked directories.
.. code-block:: console
$ find -L /usr/local/bin /usr/local/sbin -maxdepth 1 '(' -name python3 -o -name getcap \\
')' -type f
is roughly equivalent to:
>>> find(["/usr/local/bin", "/usr/local/sbin"], ["python3", "getcap"], recursive=False)
Accepts any glob characters accepted by fnmatch:
========== ====================================
Pattern Meaning
========== ====================================
``*`` matches everything
``*`` matches one or more characters
``?`` matches any single character
``[seq]`` matches any character in ``seq``
``[!seq]`` matches any character not in ``seq``
========== ====================================
Examples:
>>> find("/usr", "*.txt", recursive=True, max_depth=2)
finds all files with the extension ``.txt`` in the directory ``/usr`` and subdirectories up to
depth 2.
>>> find(["/usr", "/var"], ["*.txt", "*.log"], recursive=True)
finds all files with the extension ``.txt`` or ``.log`` in the directories ``/usr`` and
``/var`` at any depth.
>>> find("/usr", "GL/*.h", recursive=True)
finds all header files in a directory GL at any depth in the directory ``/usr``.
Parameters:
root: One or more root directories to start searching from
files: One or more filename patterns to search for
@@ -1727,22 +1735,25 @@ def find(
Returns a list of absolute, matching file paths.
"""
if not isinstance(root, list):
if isinstance(root, (str, pathlib.Path)):
root = [root]
elif not isinstance(root, collections.abc.Sequence):
raise TypeError(f"'root' arg must be a path or a sequence of paths, not '{type(root)}']")
if not isinstance(files, list):
if isinstance(files, str):
files = [files]
elif not isinstance(files, collections.abc.Sequence):
raise TypeError(f"'files' arg must be str or a sequence of str, not '{type(files)}']")
# If recursive is false, max_depth can only be None or 0
if max_depth and not recursive:
raise ValueError(f"max_depth ({max_depth}) cannot be set if recursive is False")
tty.debug(f"Find (max depth = {max_depth}): {root} {files}")
if not recursive:
max_depth = 0
elif max_depth is None:
max_depth = sys.maxsize
tty.debug(f"Find (max depth = {max_depth}): {root} {files}")
result = _find_max_depth(root, files, max_depth)
tty.debug(f"Find complete: {root} {files}")
return result
@@ -1753,21 +1764,47 @@ def _log_file_access_issue(e: OSError, path: str) -> None:
tty.debug(f"find must skip {path}: {errno_name} {e}")
def _dir_id(s: os.stat_result) -> Tuple[int, int]:
def _file_id(s: os.stat_result) -> Tuple[int, int]:
# Note: on windows, st_ino is the file index and st_dev is the volume serial number. See
# https://github.com/python/cpython/blob/3.9/Python/fileutils.c
return (s.st_ino, s.st_dev)
def _find_max_depth(roots: List[str], globs: List[str], max_depth: int = sys.maxsize) -> List[str]:
def _dedupe_files(paths: List[str]) -> List[str]:
"""Deduplicate files by inode and device, dropping files that cannot be accessed."""
unique_files: List[str] = []
# tuple of (inode, device) for each file without following symlinks
visited: Set[Tuple[int, int]] = set()
for path in paths:
try:
stat_info = os.lstat(path)
except OSError as e:
_log_file_access_issue(e, path)
continue
file_id = _file_id(stat_info)
if file_id not in visited:
unique_files.append(path)
visited.add(file_id)
return unique_files
def _find_max_depth(
roots: Sequence[Path], globs: Sequence[str], max_depth: int = sys.maxsize
) -> List[str]:
"""See ``find`` for the public API."""
# Apply normcase to file patterns and filenames to respect case insensitive filesystems
regex, groups = fnmatch_translate_multiple([os.path.normcase(x) for x in globs])
# Ordered dictionary that keeps track of the files found for each pattern
capture_group_to_paths: Dict[str, List[str]] = {group: [] for group in groups}
# We optimize for the common case of simple filename only patterns: a single, combined regex
# is used. For complex patterns that include path components, we use a slower glob call from
# every directory we visit within max_depth.
filename_only_patterns = {
f"pattern_{i}": os.path.normcase(x) for i, x in enumerate(globs) if "/" not in x
}
complex_patterns = {f"pattern_{i}": x for i, x in enumerate(globs) if "/" in x}
regex = re.compile(fnmatch_translate_multiple(filename_only_patterns))
# Ordered dictionary that keeps track of what pattern found which files
matched_paths: Dict[str, List[str]] = {f"pattern_{i}": [] for i, _ in enumerate(globs)}
# Ensure returned paths are always absolute
roots = [os.path.abspath(r) for r in roots]
# Breadth-first search queue. Each element is a tuple of (depth, directory)
# Breadth-first search queue. Each element is a tuple of (depth, dir)
dir_queue: Deque[Tuple[int, str]] = collections.deque()
# Set of visited directories. Each element is a tuple of (inode, device)
visited_dirs: Set[Tuple[int, int]] = set()
@@ -1778,58 +1815,76 @@ def _find_max_depth(roots: List[str], globs: List[str], max_depth: int = sys.max
except OSError as e:
_log_file_access_issue(e, root)
continue
dir_id = _dir_id(stat_root)
dir_id = _file_id(stat_root)
if dir_id not in visited_dirs:
dir_queue.appendleft((0, root))
visited_dirs.add(dir_id)
while dir_queue:
depth, next_dir = dir_queue.pop()
depth, curr_dir = dir_queue.pop()
try:
dir_iter = os.scandir(next_dir)
dir_iter = os.scandir(curr_dir)
except OSError as e:
_log_file_access_issue(e, next_dir)
_log_file_access_issue(e, curr_dir)
continue
# Use glob.glob for complex patterns.
for pattern_name, pattern in complex_patterns.items():
matched_paths[pattern_name].extend(
path for path in glob.glob(os.path.join(curr_dir, pattern))
)
# List of subdirectories by path and (inode, device) tuple
subdirs: List[Tuple[str, Tuple[int, int]]] = []
with dir_iter:
ordered_entries = sorted(dir_iter, key=lambda x: x.name)
for dir_entry in ordered_entries:
for dir_entry in dir_iter:
# Match filename only patterns
if filename_only_patterns:
m = regex.match(os.path.normcase(dir_entry.name))
if m:
for pattern_name in filename_only_patterns:
if m.group(pattern_name):
matched_paths[pattern_name].append(dir_entry.path)
break
# Collect subdirectories
if depth >= max_depth:
continue
try:
it_is_a_dir = dir_entry.is_dir(follow_symlinks=True)
if not dir_entry.is_dir(follow_symlinks=True):
continue
if sys.platform == "win32":
# Note: st_ino/st_dev on DirEntry.stat are not set on Windows, so we have
# to call os.stat
stat_info = os.stat(dir_entry.path, follow_symlinks=True)
else:
stat_info = dir_entry.stat(follow_symlinks=True)
except OSError as e:
# Possible permission issue, or a symlink that cannot be resolved (ELOOP).
_log_file_access_issue(e, dir_entry.path)
continue
if it_is_a_dir and depth < max_depth:
try:
# The stat should be performed in a try/except block. We repeat that here
# vs. moving to the above block because we only want to call `stat` if we
# haven't exceeded our max_depth
if sys.platform == "win32":
# Note: st_ino/st_dev on DirEntry.stat are not set on Windows, so we
# have to call os.stat
stat_info = os.stat(dir_entry.path, follow_symlinks=True)
else:
stat_info = dir_entry.stat(follow_symlinks=True)
except OSError as e:
_log_file_access_issue(e, dir_entry.path)
continue
subdirs.append((dir_entry.path, _file_id(stat_info)))
dir_id = _dir_id(stat_info)
if dir_id not in visited_dirs:
dir_queue.appendleft((depth + 1, dir_entry.path))
visited_dirs.add(dir_id)
else:
m = regex.match(os.path.normcase(os.path.basename(dir_entry.path)))
if not m:
continue
for group in capture_group_to_paths:
if m.group(group):
capture_group_to_paths[group].append(dir_entry.path)
break
# Enqueue subdirectories in a deterministic order
if subdirs:
subdirs.sort(key=lambda s: os.path.basename(s[0]))
for subdir, subdir_id in subdirs:
if subdir_id not in visited_dirs:
dir_queue.appendleft((depth + 1, subdir))
visited_dirs.add(subdir_id)
return [path for paths in capture_group_to_paths.values() for path in paths]
# Sort the matched paths for deterministic output
for paths in matched_paths.values():
paths.sort()
all_matching_paths = [path for paths in matched_paths.values() for path in paths]
# We only dedupe files if we have any complex patterns, since only they can match the same file
# multiple times
return _dedupe_files(all_matching_paths) if complex_patterns else all_matching_paths
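A minimal usage sketch of the pattern split described above (paths hypothetical; signature taken from the hunk header):

# "*.la" has no "/" and is matched per directory via the combined regex,
# while "lib*/libfoo.so" contains "/" and is delegated to glob.glob.
paths = _find_max_depth(["/opt/stack"], ["*.la", "lib*/libfoo.so"], max_depth=2)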
# Utilities for libraries and headers
@@ -2718,22 +2773,6 @@ def prefixes(path):
return paths
@system_path_filter
def md5sum(file):
"""Compute the MD5 sum of a file.
Args:
file (str): file to be checksummed
Returns:
MD5 sum of the file's content
"""
md5 = hashlib.md5()
with open(file, "rb") as f:
md5.update(f.read())
return md5.digest()
@system_path_filter
def remove_directory_contents(dir):
"""Remove all contents of a directory."""
@@ -2784,6 +2823,25 @@ def temporary_dir(
remove_directory_contents(tmp_dir)
@contextmanager
def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
"""Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
for functions or external tools that do not support in-place editing. Notice that this function
is unsafe in that it works with paths instead of file descriptors, but this is by design,
since we assume the call site will create a new inode at the same path."""
tmp_fd, tmp_path = tempfile.mkstemp(
dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
)
# Windows cannot replace a file with open file descriptors, so close it; the call site replaces the file.
os.close(tmp_fd)
try:
shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
yield tmp_path
shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
finally:
os.unlink(tmp_path)
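A usage sketch for the context manager above, assuming a hypothetical external tool that rewrites its argument by replacing the file:

from spack.util.executable import Executable

tool = Executable("some-rewriting-tool")  # hypothetical: replaces its input file
with edit_in_place_through_temporary_file("/opt/stack/bin/app") as tmp:
    tool(tmp)  # the tool may replace tmp; contents are copied back afterwards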
def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
"""Create a small summary of the given file. Does not error
when file does not exist.

View File

@@ -15,7 +15,7 @@
import typing
import warnings
from datetime import datetime, timedelta
from typing import Callable, Iterable, List, Tuple, TypeVar
from typing import Callable, Dict, Iterable, List, Tuple, TypeVar
# Ignore emacs backups when listing modules
ignore_modules = r"^\.#|~$"
@@ -867,24 +867,11 @@ def elide_list(line_list: List[str], max_num: int = 10) -> List[str]:
PatternStr = typing.Pattern[str]
def fnmatch_translate_multiple(patterns: List[str]) -> Tuple[PatternStr, List[str]]:
"""Same as fnmatch.translate, but creates a single regex of the form
``(?P<pattern0>...)|(?P<pattern1>...)|...`` for each pattern in the iterable, where
``patternN`` is a named capture group that matches the corresponding pattern translated by
``fnmatch.translate``. This can be used to match multiple patterns in a single pass. No case
normalization is performed on the patterns.
Args:
patterns: list of fnmatch patterns
Returns:
Tuple of the combined regex and the list of named capture groups corresponding to each
pattern in the input list.
"""
groups = [f"pattern{i}" for i in range(len(patterns))]
regexes = (fnmatch.translate(p) for p in patterns)
combined = re.compile("|".join(f"(?P<{g}>{r})" for g, r in zip(groups, regexes)))
return combined, groups
def fnmatch_translate_multiple(named_patterns: Dict[str, str]) -> str:
"""Similar to ``fnmatch.translate``, but takes an ordered dictionary where keys are pattern
names, and values are filename patterns. The output is a regex that matches any of the
patterns in order, and named capture groups are used to identify which pattern matched."""
return "|".join(f"(?P<{n}>{fnmatch.translate(p)})" for n, p in named_patterns.items())
@contextlib.contextmanager

View File

@@ -11,7 +11,7 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.23.0.dev0"
__version__ = "0.24.0.dev0"
spack_version = __version__

View File

@@ -571,8 +571,13 @@ def _search_for_deprecated_package_methods(pkgs, error_cls):
@package_properties
def _ensure_all_package_names_are_lowercase(pkgs, error_cls):
"""Ensure package names are lowercase and consistent"""
reserved_names = ("all",)
badname_regex, errors = re.compile(r"[_A-Z]"), []
for pkg_name in pkgs:
if pkg_name in reserved_names:
error_msg = f"The name '{pkg_name}' is reserved, and cannot be used for packages"
errors.append(error_cls(error_msg, []))
if badname_regex.search(pkg_name):
error_msg = f"Package name '{pkg_name}' should be lowercase and must not contain '_'"
errors.append(error_cls(error_msg, []))
@@ -688,19 +693,19 @@ def invalid_sha256_digest(fetcher):
return h, True
return None, False
error_msg = "Package '{}' does not use sha256 checksum".format(pkg_name)
error_msg = f"Package '{pkg_name}' does not use sha256 checksum"
details = []
for v, args in pkg.versions.items():
fetcher = spack.fetch_strategy.for_package_version(pkg, v)
digest, is_bad = invalid_sha256_digest(fetcher)
if is_bad:
details.append("{}@{} uses {}".format(pkg_name, v, digest))
details.append(f"{pkg_name}@{v} uses {digest}")
for _, resources in pkg.resources.items():
for resource in resources:
digest, is_bad = invalid_sha256_digest(resource.fetcher)
if is_bad:
details.append("Resource in '{}' uses {}".format(pkg_name, digest))
details.append(f"Resource in '{pkg_name}' uses {digest}")
if details:
errors.append(error_cls(error_msg, details))
@@ -714,9 +719,9 @@ def _ensure_env_methods_are_ported_to_builders(pkgs, error_cls):
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
# values are either Value objects (for conditional values) or the values themselves
# values are either ConditionalValue objects or the values themselves
build_system_names = set(
v.value if isinstance(v, spack.variant.Value) else v
v.value if isinstance(v, spack.variant.ConditionalValue) else v
for _, variant in pkg_cls.variant_definitions("build_system")
for v in variant.values
)

View File

@@ -40,7 +40,7 @@
import spack.hash_types as ht
import spack.hooks
import spack.hooks.sbang
import spack.mirror
import spack.mirrors.mirror
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
@@ -87,6 +87,8 @@
from spack.stage import Stage
from spack.util.executable import which
from .enums import InstallRecordStatus
BUILD_CACHE_RELATIVE_PATH = "build_cache"
BUILD_CACHE_KEYS_RELATIVE_PATH = "_pgp"
@@ -252,7 +254,7 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
spec_list = [
s
for s in db.query_local(installed=any)
for s in db.query_local(installed=InstallRecordStatus.ANY)
if s.external or db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
]
@@ -367,7 +369,7 @@ def update(self, with_cooldown=False):
on disk under ``_index_cache_root``)."""
self._init_local_index_cache()
configured_mirror_urls = [
m.fetch_url for m in spack.mirror.MirrorCollection(binary=True).values()
m.fetch_url for m in spack.mirrors.mirror.MirrorCollection(binary=True).values()
]
items_to_remove = []
spec_cache_clear_needed = False
@@ -1174,7 +1176,7 @@ def _url_upload_tarball_and_specfile(
class Uploader:
def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool):
def __init__(self, mirror: spack.mirrors.mirror.Mirror, force: bool, update_index: bool):
self.mirror = mirror
self.force = force
self.update_index = update_index
@@ -1182,6 +1184,9 @@ def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool)
self.tmpdir: str
self.executor: concurrent.futures.Executor
# Verify if the mirror meets the requirements to push
self.mirror.ensure_mirror_usable("push")
def __enter__(self):
self._tmpdir = tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root())
self._executor = spack.util.parallel.make_concurrent_executor()
@@ -1219,7 +1224,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class OCIUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
base_image: Optional[str],
@@ -1268,7 +1273,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class URLUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
signing_key: Optional[str],
@@ -1292,7 +1297,7 @@ def push(
def make_uploader(
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool = False,
update_index: bool = False,
signing_key: Optional[str] = None,
@@ -1948,9 +1953,9 @@ def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=No
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
configured_mirrors: Iterable[spack.mirrors.mirror.Mirror] = (
spack.mirrors.mirror.MirrorCollection(binary=True).values()
)
if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.")
@@ -1975,7 +1980,7 @@ def fetch_url_to_mirror(url):
for mirror in configured_mirrors:
if mirror.fetch_url == url:
return mirror
return spack.mirror.Mirror(url)
return spack.mirrors.mirror.Mirror(url)
mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]
@@ -2329,7 +2334,9 @@ def is_backup_file(file):
if not codesign:
return
for binary in changed_files:
codesign("-fs-", binary)
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
@@ -2643,7 +2650,7 @@ def try_direct_fetch(spec, mirrors=None):
specfile_is_signed = False
found_specs = []
binary_mirrors = spack.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
binary_mirrors = spack.mirrors.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
for mirror in binary_mirrors:
buildcache_fetch_url_json = url_util.join(
@@ -2704,7 +2711,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
if spec is None:
return []
if not spack.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
if not spack.mirrors.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
tty.debug("No Spack mirrors are currently configured")
return {}
@@ -2743,7 +2750,7 @@ def clear_spec_cache():
def get_keys(install=False, trust=False, force=False, mirrors=None):
"""Get pgp public keys available on mirror with suffix .pub"""
mirror_collection = mirrors or spack.mirror.MirrorCollection(binary=True)
mirror_collection = mirrors or spack.mirrors.mirror.MirrorCollection(binary=True)
if not mirror_collection:
tty.die("Please add a spack mirror to allow " + "download of build caches.")
@@ -2798,7 +2805,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
def _url_push_keys(
*mirrors: Union[spack.mirror.Mirror, str],
*mirrors: Union[spack.mirrors.mirror.Mirror, str],
keys: List[str],
tmpdir: str,
update_index: bool = False,
@@ -2865,7 +2872,7 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None):
"""
rebuilds = {}
for mirror in spack.mirror.MirrorCollection(mirrors, binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(mirrors, binary=True).values():
tty.debug("Checking for built specs at {0}".format(mirror.fetch_url))
rebuild_list = []
@@ -2909,7 +2916,7 @@ def _download_buildcache_entry(mirror_root, descriptions):
def download_buildcache_entry(file_descriptions, mirror_url=None):
if not mirror_url and not spack.mirror.MirrorCollection(binary=True):
if not mirror_url and not spack.mirrors.mirror.MirrorCollection(binary=True):
tty.die(
"Please provide or add a spack mirror to allow " + "download of buildcache entries."
)
@@ -2918,7 +2925,7 @@ def download_buildcache_entry(file_descriptions, mirror_url=None):
mirror_root = os.path.join(mirror_url, BUILD_CACHE_RELATIVE_PATH)
return _download_buildcache_entry(mirror_root, file_descriptions)
for mirror in spack.mirror.MirrorCollection(binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True).values():
mirror_root = os.path.join(mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH)
if _download_buildcache_entry(mirror_root, file_descriptions):

View File

@@ -35,9 +35,10 @@
from llnl.util.lang import GroupedExceptionHandler
import spack.binary_distribution
import spack.concretize
import spack.config
import spack.detection
import spack.mirror
import spack.mirrors.mirror
import spack.platforms
import spack.spec
import spack.store
@@ -91,7 +92,7 @@ def __init__(self, conf: ConfigDictionary) -> None:
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
# Promote (relative) paths to file urls
self.url = spack.mirror.Mirror(conf["info"]["url"]).fetch_url
self.url = spack.mirrors.mirror.Mirror(conf["info"]["url"]).fetch_url
@property
def mirror_scope(self) -> spack.config.InternalConfigScope:
@@ -271,10 +272,10 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
concrete_spec = bootstrapper.concretize()
else:
concrete_spec = spack.spec.Spec(
abstract_spec = spack.spec.Spec(
abstract_spec_str + " ^" + spec_for_current_python()
)
concrete_spec.concretize()
concrete_spec = spack.concretize.concretized(abstract_spec)
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
tty.debug(msg.format(module, abstract_spec_str))
@@ -300,7 +301,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
# might reduce compilation time by a fair amount
_add_externals_if_missing()
concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
concrete_spec = spack.concretize.concretized(spack.spec.Spec(abstract_spec_str))
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):

View File

@@ -56,7 +56,6 @@
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
import spack.build_systems._checks
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
@@ -883,6 +882,9 @@ def __init__(self, *roots: spack.spec.Spec, context: Context):
elif context == Context.RUN:
self.root_depflag = dt.RUN | dt.LINK
def accept(self, item):
return True
def neighbors(self, item):
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
@@ -920,19 +922,19 @@ def effective_deptypes(
a flag specifying in what way they do so. The list is ordered topologically
from root to leaf, meaning that environment modifications should be applied
in reverse so that dependents override dependencies, not the other way around."""
visitor = traverse.TopoVisitor(
EnvironmentVisitor(*specs, context=context),
key=lambda x: x.dag_hash(),
topo_sorted_edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges(specs),
visitor=EnvironmentVisitor(*specs, context=context),
key=traverse.by_dag_hash,
root=True,
all_edges=True,
)
traverse.traverse_depth_first_with_visitor(traverse.with_artificial_edges(specs), visitor)
# Dictionary with "no mode" as default value, so it's easy to write modes[x] |= flag.
use_modes = defaultdict(lambda: UseMode(0))
nodes_with_type = []
for edge in visitor.edges:
for edge in topo_sorted_edges:
parent, child, depflag = edge.parent, edge.spec, edge.depflag
# Mark the starting point
@@ -1375,7 +1377,7 @@ def exitcode_msg(p):
return child_result
CONTEXT_BASES = (spack.package_base.PackageBase, spack.build_systems._checks.BaseBuilder)
CONTEXT_BASES = (spack.package_base.PackageBase, spack.builder.Builder)
def get_package_context(traceback, context=3):
@@ -1424,27 +1426,20 @@ def make_stack(tb, stack=None):
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
filename = inspect.getfile(frame.f_code)
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1
lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
lines = [f"{filename}:{frame.f_lineno}, in {frame.f_code.co_name}:"]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
fun_lineno = lineno - start
fun_lineno = frame.f_lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
for i, line in enumerate(sourcelines):
is_error = start_ctx + i == fun_lineno
mark = ">> " if is_error else " "
# Add start to get lineno relative to start of file, not function.
marked = " {0}{1:-6d}{2}".format(mark, start + start_ctx + i, line.rstrip())
marked = f" {'>> ' if is_error else ' '}{start + start_ctx + i:-6d}{line.rstrip()}"
if is_error:
marked = colorize("@R{%s}" % cescape(marked))
lines.append(marked)

View File

@@ -9,6 +9,7 @@
import spack.builder
import spack.error
import spack.phase_callbacks
import spack.relocate
import spack.spec
import spack.store
@@ -63,7 +64,7 @@ def apply_macos_rpath_fixups(builder: spack.builder.Builder):
def ensure_build_dependencies_or_raise(
spec: spack.spec.Spec, dependencies: List[spack.spec.Spec], error_msg: str
spec: spack.spec.Spec, dependencies: List[str], error_msg: str
):
"""Ensure that some build dependencies are present in the concrete spec.
@@ -71,7 +72,7 @@ def ensure_build_dependencies_or_raise(
Args:
spec: concrete spec to be checked.
dependencies: list of abstract specs to be satisfied
dependencies: list of package names of required build dependencies
error_msg: brief error message to be prepended to a longer description
Raises:
@@ -127,8 +128,8 @@ def execute_install_time_tests(builder: spack.builder.Builder):
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
class BaseBuilder(spack.builder.Builder):
"""Base class for builders to register common checks"""
class BuilderWithDefaults(spack.builder.Builder):
"""Base class for all specific builders with common callbacks registered."""
# Check that self.prefix is there after installation
spack.builder.run_after("install")(sanity_check_prefix)
spack.phase_callbacks.run_after("install")(sanity_check_prefix)

View File

@@ -6,7 +6,7 @@
import os.path
import stat
import subprocess
from typing import List
from typing import Callable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -15,6 +15,9 @@
import spack.builder
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from spack.operating_systems.mac_os import macos_version
@@ -22,7 +25,7 @@
from spack.version import Version
from ._checks import (
BaseBuilder,
BuilderWithDefaults,
apply_macos_rpath_fixups,
ensure_build_dependencies_or_raise,
execute_build_time_tests,
@@ -69,14 +72,14 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def enable_or_disable(self, *args, **kwargs):
return self.builder.enable_or_disable(*args, **kwargs)
return spack.builder.create(self).enable_or_disable(*args, **kwargs)
def with_or_without(self, *args, **kwargs):
return self.builder.with_or_without(*args, **kwargs)
return spack.builder.create(self).with_or_without(*args, **kwargs)
@spack.builder.builder("autotools")
class AutotoolsBuilder(BaseBuilder):
class AutotoolsBuilder(BuilderWithDefaults):
"""The autotools builder encodes the default way of installing software built
with autotools. It has four phases that can be overridden, if need be:
@@ -157,7 +160,7 @@ class AutotoolsBuilder(BaseBuilder):
install_libtool_archives = False
@property
def patch_config_files(self):
def patch_config_files(self) -> bool:
"""Whether to update old ``config.guess`` and ``config.sub`` files
distributed with the tarball.
@@ -177,7 +180,7 @@ def patch_config_files(self):
)
@property
def _removed_la_files_log(self):
def _removed_la_files_log(self) -> str:
"""File containing the list of removed libtool archives"""
build_dir = self.build_directory
if not os.path.isabs(self.build_directory):
@@ -185,15 +188,15 @@ def _removed_la_files_log(self):
return os.path.join(build_dir, "removed_la_files.txt")
@property
def archive_files(self):
def archive_files(self) -> List[str]:
"""Files to archive for packages based on autotools"""
files = [os.path.join(self.build_directory, "config.log")]
if not self.install_libtool_archives:
files.append(self._removed_la_files_log)
return files
@spack.builder.run_after("autoreconf")
def _do_patch_config_files(self):
@spack.phase_callbacks.run_after("autoreconf")
def _do_patch_config_files(self) -> None:
"""Some packages ship with older config.guess/config.sub files and need to
have these updated when installed on a newer architecture.
@@ -294,7 +297,7 @@ def runs_ok(script_abs_path):
and set the prefix to the directory containing the `config.guess` and
`config.sub` files.
"""
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.name))
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
# Copy the good files over the bad ones
for abs_path in to_be_patched:
@@ -304,8 +307,8 @@ def runs_ok(script_abs_path):
fs.copy(substitutes[name], abs_path)
os.chmod(abs_path, mode)
@spack.builder.run_before("configure")
def _patch_usr_bin_file(self):
@spack.phase_callbacks.run_before("configure")
def _patch_usr_bin_file(self) -> None:
"""On NixOS file is not available in /usr/bin/file. Patch configure
scripts to use file from path."""
@@ -316,8 +319,8 @@ def _patch_usr_bin_file(self):
with fs.keep_modification_time(*x.filenames):
x.filter(regex="/usr/bin/file", repl="file", string=True)
@spack.builder.run_before("configure")
def _set_autotools_environment_variables(self):
@spack.phase_callbacks.run_before("configure")
def _set_autotools_environment_variables(self) -> None:
"""Many autotools builds use a version of mknod.m4 that fails when
running as root unless FORCE_UNSAFE_CONFIGURE is set to 1.
@@ -330,8 +333,8 @@ def _set_autotools_environment_variables(self):
"""
os.environ["FORCE_UNSAFE_CONFIGURE"] = "1"
@spack.builder.run_before("configure")
def _do_patch_libtool_configure(self):
@spack.phase_callbacks.run_before("configure")
def _do_patch_libtool_configure(self) -> None:
"""Patch bugs that propagate from libtool macros into "configure" and
further into "libtool". Note that patches that can be fixed by patching
"libtool" directly should be implemented in the _do_patch_libtool method
@@ -358,8 +361,8 @@ def _do_patch_libtool_configure(self):
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
@spack.builder.run_after("configure")
def _do_patch_libtool(self):
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:
"""If configure generates a "libtool" script that does not correctly
detect the compiler (and patch_libtool is set), patch in the correct
values for libtool variables.
@@ -507,27 +510,64 @@ def _do_patch_libtool(self):
)
@property
def configure_directory(self):
def configure_directory(self) -> str:
"""Return the directory where 'configure' resides."""
return self.pkg.stage.source_path
@property
def configure_abs_path(self):
def configure_abs_path(self) -> str:
# Absolute path to configure
configure_abs_path = os.path.join(os.path.abspath(self.configure_directory), "configure")
return configure_abs_path
@property
def build_directory(self):
def build_directory(self) -> str:
"""Override to provide another place to build the package"""
return self.configure_directory
@spack.builder.run_before("autoreconf")
def delete_configure_to_force_update(self):
@spack.phase_callbacks.run_before("autoreconf")
def delete_configure_to_force_update(self) -> None:
if self.force_autoreconf:
fs.force_remove(self.configure_abs_path)
def autoreconf(self, pkg, spec, prefix):
@property
def autoreconf_search_path_args(self) -> List[str]:
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skips the default path of automake, move external include
flags to the back, since they might pull in unrelated m4 files shadowing
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
@spack.phase_callbacks.run_after("autoreconf")
def set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self) -> List[str]:
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def autoreconf(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
@@ -554,39 +594,12 @@ def autoreconf(self, pkg, spec, prefix):
autoreconf_args += self.autoreconf_extra_args
self.pkg.module.autoreconf(*autoreconf_args)
@property
def autoreconf_search_path_args(self):
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skips the default path of automake, move external include
flags to the back, since they might pull in unrelated m4 files shadowing
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
@spack.builder.run_after("autoreconf")
def set_configure_or_die(self):
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self):
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def configure(self, pkg, spec, prefix):
def configure(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "configure", with the arguments specified by the builder and an
appropriately set prefix.
"""
@@ -597,7 +610,12 @@ def configure(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.configure(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the build targets specified by the builder."""
# See https://autotools.io/automake/silent.html
params = ["V=1"]
@@ -605,41 +623,49 @@ def build(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
pkg.module.make(*params)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
def _activate_or_not(
self, name, activation_word, deactivation_word, activation_value=None, variant=None
):
self,
name: str,
activation_word: str,
deactivation_word: str,
activation_value: Optional[Union[Callable, str]] = None,
variant=None,
) -> List[str]:
"""This function contain the current implementation details of
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without` and
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.enable_or_disable`.
Args:
name (str): name of the option that is being activated or not
activation_word (str): the default activation word ('with' in the
case of ``with_or_without``)
deactivation_word (str): the default deactivation word ('without'
in the case of ``with_or_without``)
activation_value (typing.Callable): callable that accepts a single
value. This value is either one of the allowed values for a
multi-valued variant or the name of a bool-valued variant.
name: name of the option that is being activated or not
activation_word: the default activation word ('with' in the case of
``with_or_without``)
deactivation_word: the default deactivation word ('without' in the case of
``with_or_without``)
activation_value: callable that accepts a single value. This value is either one of the
allowed values for a multi-valued variant or the name of a bool-valued variant.
Returns the parameter to be used when the value is activated.
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
variant (str): name of the variant that is being processed
(if different from option name)
variant: name of the variant that is being processed (if different from option name)
Examples:
@@ -647,19 +673,19 @@ def _activate_or_not(
.. code-block:: python
variant('foo', values=('x', 'y'), description='')
variant('bar', default=True, description='')
variant('ba_z', default=True, description='')
variant("foo", values=("x", "y"), description=")
variant("bar", default=True, description=")
variant("ba_z", default=True, description=")
calling this function like:
.. code-block:: python
_activate_or_not(
'foo', 'with', 'without', activation_value='prefix'
"foo", "with", "without", activation_value="prefix"
)
_activate_or_not('bar', 'with', 'without')
_activate_or_not('ba-z', 'with', 'without', variant='ba_z')
_activate_or_not("bar", "with", "without")
_activate_or_not("ba-z", "with", "without", variant="ba_z")
will generate the following configuration options:
@@ -679,8 +705,8 @@ def _activate_or_not(
Raises:
KeyError: if name is not among known variants
"""
spec = self.pkg.spec
args = []
spec: spack.spec.Spec = self.pkg.spec
args: List[str] = []
if activation_value == "prefix":
activation_value = lambda x: spec[x].prefix
@@ -698,7 +724,7 @@ def _activate_or_not(
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
vdef = self.pkg.get_variant(variant)
if set(vdef.values) == set((True, False)):
if set(vdef.values) == set((True, False)): # type: ignore
# BoolValuedVariant carry information about a single option.
# Nonetheless, for uniformity of treatment we'll package them
# in an iterable of one element.
@@ -709,14 +735,12 @@ def _activate_or_not(
# package's build system. It excludes values which have special
# meanings and do not correspond to features (e.g. "none")
feature_values = getattr(vdef.values, "feature_values", None) or vdef.values
options = [(value, f"{variant}={value}" in spec) for value in feature_values]
options = [(v, f"{variant}={v}" in spec) for v in feature_values] # type: ignore
# For each allowed value in the list of values
for option_value, activated in options:
# Search for an override in the package for this value
override_name = "{0}_or_{1}_{2}".format(
activation_word, deactivation_word, option_value
)
override_name = f"{activation_word}_or_{deactivation_word}_{option_value}"
line_generator = getattr(self, override_name, None) or getattr(
self.pkg, override_name, None
)
@@ -725,19 +749,24 @@ def _activate_or_not(
def _default_generator(is_activated):
if is_activated:
line = "--{0}-{1}".format(activation_word, option_value)
line = f"--{activation_word}-{option_value}"
if activation_value is not None and activation_value(
option_value
): # NOQA=ignore=E501
line += "={0}".format(activation_value(option_value))
line = f"{line}={activation_value(option_value)}"
return line
return "--{0}-{1}".format(deactivation_word, option_value)
return f"--{deactivation_word}-{option_value}"
line_generator = _default_generator
args.append(line_generator(activated))
return args
def with_or_without(self, name, activation_value=None, variant=None):
def with_or_without(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Inspects a variant and returns the arguments that activate
or deactivate the selected feature(s) for the configure options.
@@ -752,12 +781,11 @@ def with_or_without(self, name, activation_value=None, variant=None):
``variant=value`` is in the spec.
Args:
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): callable that accepts a single
value and returns the parameter to be used leading to an entry
of the type ``--with-{name}={parameter}``.
name: name of a valid multi-valued variant
activation_value: callable that accepts a single value and returns the parameter to be
used leading to an entry of the type ``--with-{name}={parameter}``.
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -765,18 +793,22 @@ def with_or_without(self, name, activation_value=None, variant=None):
"""
return self._activate_or_not(name, "with", "without", activation_value, variant)
def enable_or_disable(self, name, activation_value=None, variant=None):
def enable_or_disable(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Same as
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`
but substitute ``with`` with ``enable`` and ``without`` with ``disable``.
Args:
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): if present accepts a single value
and returns the parameter to be used leading to an entry of the
type ``--enable-{name}={parameter}``
name: name of a valid multi-valued variant
activation_value: if present accepts a single value and returns the parameter to be
used leading to an entry of the type ``--enable-{name}={parameter}``
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -784,15 +816,15 @@ def enable_or_disable(self, name, activation_value=None, variant=None):
"""
return self._activate_or_not(name, "enable", "disable", activation_value, variant)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self):
def installcheck(self) -> None:
"""Run "make" on the ``installcheck`` target, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("installcheck")
@spack.builder.run_after("install")
def remove_libtool_archives(self):
@spack.phase_callbacks.run_after("install")
def remove_libtool_archives(self) -> None:
"""Remove all .la files in prefix sub-folders if the package sets
``install_libtool_archives`` to be False.
"""
@@ -814,12 +846,13 @@ def setup_build_environment(self, env):
env.set("MACOSX_DEPLOYMENT_TARGET", "10.16")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
def _autoreconf_search_path_args(spec):
dirs_seen = set()
flags_spack, flags_external = [], []
def _autoreconf_search_path_args(spec: spack.spec.Spec) -> List[str]:
dirs_seen: Set[Tuple[int, int]] = set()
flags_spack: List[str] = []
flags_external: List[str] = []
# We don't want to add an include flag for automake's default search path.
for automake in spec.dependencies(name="automake", deptype="build"):

View File

@@ -10,7 +10,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.builder
import spack.phase_callbacks
from .cmake import CMakeBuilder, CMakePackage
@@ -192,7 +192,10 @@ def initconfig_mpi_entries(self):
entries.append(cmake_cache_path("MPI_C_COMPILER", spec["mpi"].mpicc))
entries.append(cmake_cache_path("MPI_CXX_COMPILER", spec["mpi"].mpicxx))
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# not all MPIs have Fortran wrappers
if hasattr(spec["mpi"], "mpifc"):
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# Check for slurm
using_slurm = False
@@ -332,7 +335,7 @@ def std_cmake_args(self):
args.extend(["-C", self.cache_path])
return args
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def install_cmake_cache(self):
fs.mkdirp(self.pkg.spec.prefix.share.cmake)
fs.install(self.cache_path, self.pkg.spec.prefix.share.cmake)

View File

@@ -7,10 +7,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class CargoPackage(spack.package_base.PackageBase):
@@ -27,7 +28,7 @@ class CargoPackage(spack.package_base.PackageBase):
@spack.builder.builder("cargo")
class CargoBuilder(BaseBuilder):
class CargoBuilder(BuilderWithDefaults):
"""The Cargo builder encodes the most common way of building software with
a rust Cargo.toml file. It has two phases that can be overridden, if need be:
@@ -77,7 +78,7 @@ def install(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""

View File

@@ -9,7 +9,7 @@
import re
import sys
from itertools import chain
from typing import List, Optional, Set, Tuple
from typing import Any, List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util.lang import stable_partition
@@ -18,11 +18,15 @@
import spack.deptypes as dt
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack import traverse
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
# Regex to extract the primary generator from the CMake generator
# string.
@@ -48,9 +52,9 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
python_executable = pkg.spec["python"].command.path
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),
CMakeBuilder.define("Python_EXECUTABLE", python_executable),
CMakeBuilder.define("Python3_EXECUTABLE", python_executable),
define("PYTHON_EXECUTABLE", python_executable),
define("Python_EXECUTABLE", python_executable),
define("Python3_EXECUTABLE", python_executable),
]
)
@@ -85,7 +89,7 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
ipo = False
if cmake.satisfies("@3.9:"):
args.append(CMakeBuilder.define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
args.append(define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
# Disable Package Registry: export(PACKAGE) may put files in the user's home directory, and
# find_package may search there. This is not what we want.
@@ -93,30 +97,36 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
# Do not populate CMake User Package Registry
if cmake.satisfies("@3.15:"):
# see https://cmake.org/cmake/help/latest/policy/CMP0090.html
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
args.append(define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
elif cmake.satisfies("@3.1:"):
# see https://cmake.org/cmake/help/latest/variable/CMAKE_EXPORT_NO_PACKAGE_REGISTRY.html
args.append(CMakeBuilder.define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
args.append(define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
# Do not use CMake User/System Package Registry
# https://cmake.org/cmake/help/latest/manual/cmake-packages.7.html#disabling-the-package-registry
if cmake.satisfies("@3.16:"):
args.append(CMakeBuilder.define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
elif cmake.satisfies("@3.1:3.15"):
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
# Export a compilation database if supported.
if _supports_compilation_databases(pkg):
args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
args.append(define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
# Enable MACOSX_RPATH by default when cmake_minimum_required < 3
# https://cmake.org/cmake/help/latest/policy/CMP0042.html
if pkg.spec.satisfies("platform=darwin") and cmake.satisfies("@3:"):
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
args.append(define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
# Disable find package's config mode for versions of Boost that
# didn't provide it. See https://github.com/spack/spack/issues/20169
# and https://cmake.org/cmake/help/latest/module/FindBoost.html
if pkg.spec.satisfies("^boost@:1.69.0"):
args.append(define("Boost_NO_BOOST_CMAKE", True))
def generator(*names: str, default: Optional[str] = None):
def generator(*names: str, default: Optional[str] = None) -> None:
"""The build system generator to use.
See ``cmake --help`` for a list of valid generators.
@@ -157,15 +167,18 @@ def _values(x):
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
attribute of direct build/test and transitive link dependencies."""
# Add direct build/test deps
selected: Set[str] = {s.dag_hash() for s in pkg.spec.dependencies(deptype=dt.BUILD | dt.TEST)}
# Add transitive link deps
selected.update(s.dag_hash() for s in pkg.spec.traverse(root=False, deptype=dt.LINK))
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition(
(s for s in pkg.spec.traverse(root=False, order="topo") if s.dag_hash() in selected),
lambda x: x.external,
edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges([pkg.spec]),
visitor=traverse.MixedDepthVisitor(
direct=dt.BUILD | dt.TEST, transitive=dt.LINK, key=traverse.by_dag_hash
),
key=traverse.by_dag_hash,
root=False,
all_edges=False, # cover all nodes, not all edges
)
ordered_specs = [edge.spec for edge in edges]
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition((s for s in ordered_specs), lambda x: x.external)
return filter_system_paths(
path for spec in chain(spack_built, externals) for path in spec.package.cmake_prefix_paths
@@ -263,15 +276,15 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def define(self, *args, **kwargs):
return self.builder.define(*args, **kwargs)
def define(self, cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
def define_from_variant(self, *args, **kwargs):
return self.builder.define_from_variant(*args, **kwargs)
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self, cmake_var, variant)
@spack.builder.builder("cmake")
class CMakeBuilder(BaseBuilder):
class CMakeBuilder(BuilderWithDefaults):
"""The cmake builder encodes the default way of building software with CMake. IT
has three phases that can be overridden:
@@ -321,15 +334,15 @@ class CMakeBuilder(BaseBuilder):
build_time_test_callbacks = ["check"]
@property
def archive_files(self):
def archive_files(self) -> List[str]:
"""Files to archive for packages based on CMake"""
files = [os.path.join(self.build_directory, "CMakeCache.txt")]
if _supports_compilation_databases(self):
if _supports_compilation_databases(self.pkg):
files.append(os.path.join(self.build_directory, "compile_commands.json"))
return files
@property
def root_cmakelists_dir(self):
def root_cmakelists_dir(self) -> str:
"""The relative path to the directory containing CMakeLists.txt
This path is relative to the root of the extracted tarball,
@@ -338,16 +351,17 @@ def root_cmakelists_dir(self):
return self.pkg.stage.source_path
@property
def generator(self):
def generator(self) -> str:
if self.spec.satisfies("generator=make"):
return "Unix Makefiles"
if self.spec.satisfies("generator=ninja"):
return "Ninja"
msg = f'{self.spec.format()} has an unsupported value for the "generator" variant'
raise ValueError(msg)
raise ValueError(
f'{self.spec.format()} has an unsupported value for the "generator" variant'
)
@property
def std_cmake_args(self):
def std_cmake_args(self) -> List[str]:
"""Standard cmake arguments provided as a property for
convenience of package writers
"""
@@ -356,7 +370,9 @@ def std_cmake_args(self):
return args
@staticmethod
def std_args(pkg, generator=None):
def std_args(
pkg: spack.package_base.PackageBase, generator: Optional[str] = None
) -> List[str]:
"""Computes the standard cmake arguments for a generic package"""
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
generator = generator or default_generator
@@ -373,7 +389,6 @@ def std_args(pkg, generator=None):
except KeyError:
build_type = "RelWithDebInfo"
define = CMakeBuilder.define
args = [
"-G",
generator,
@@ -405,152 +420,31 @@ def std_args(pkg, generator=None):
return args
@staticmethod
def define_cuda_architectures(pkg):
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures and
it is declared in the cuda package.
This method is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is not set.
"""
cmake_flag = str()
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value
)
return cmake_flag
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_cuda_architectures(pkg)
@staticmethod
def define_hip_architectures(pkg):
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures and it is declared in the rocm package.
This method is a no-op for cmake<3.21 and when the ``amdgpu_target`` variant is
not set.
"""
cmake_flag = str()
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value
)
return cmake_flag
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_hip_architectures(pkg)
@staticmethod
def define(cmake_var, value):
"""Return a CMake command line argument that defines a variable.
def define(cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
The resulting argument will convert boolean values to OFF/ON
and lists/tuples to CMake semicolon-separated string lists. All other
values will be interpreted as strings.
Examples:
.. code-block:: python
[define('BUILD_SHARED_LIBS', True),
define('CMAKE_CXX_STANDARD', 14),
define('swr', ['avx', 'avx2'])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
"""
# Convert booleans to ON/OFF and non-string sequences to CMake
# semicolon-separated lists; everything else is stringified
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(self, cmake_var, variant=None):
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
This utility function is similar to
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`.
Examples:
Given a package with:
.. code-block:: python
variant('cxxstd', default='11', values=('11', '14'),
multi=False, description='')
variant('shared', default=True, description='')
variant('swr', values=any_combination_of('avx', 'avx2'),
description='')
calling this function like:
.. code-block:: python
[self.define_from_variant('BUILD_SHARED_LIBS', 'shared'),
self.define_from_variant('CMAKE_CXX_STANDARD', 'cxxstd'),
self.define_from_variant('SWR')]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met,
this function returns an empty string. CMake discards empty strings
provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not self.pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))
if variant not in self.pkg.spec.variants:
return ""
value = self.pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return self.define(cmake_var, value)
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self.pkg, cmake_var, variant)
@property
def build_dirname(self):
def build_dirname(self) -> str:
"""Directory name to use when building the package."""
return "spack-build-%s" % self.pkg.spec.dag_hash(7)
return f"spack-build-{self.pkg.spec.dag_hash(7)}"
@property
def build_directory(self):
def build_directory(self) -> str:
"""Full-path to the directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def cmake_args(self):
def cmake_args(self) -> List[str]:
"""List of all the arguments that must be passed to cmake, except:
* CMAKE_INSTALL_PREFIX
@@ -560,7 +454,12 @@ def cmake_args(self):
"""
return []
def cmake(self, pkg, spec, prefix):
def cmake(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Runs ``cmake`` in the build directory"""
# skip cmake phase if it is an incremental develop build
@@ -575,7 +474,12 @@ def cmake(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.cmake(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the build targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -584,7 +488,12 @@ def build(self, pkg, spec, prefix):
self.build_targets.append("-v")
pkg.module.ninja(*self.build_targets)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -592,9 +501,9 @@ def install(self, pkg, spec, prefix):
elif self.generator == "Ninja":
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Search the CMake-generated files for the targets ``test`` and ``check``,
and runs them if found.
"""
@@ -605,3 +514,133 @@ def check(self):
elif self.generator == "Ninja":
self.pkg._if_ninja_target_execute("test", jobs_env="CTEST_PARALLEL_LEVEL")
self.pkg._if_ninja_target_execute("check")
def define(cmake_var: str, value: Any) -> str:
"""Return a CMake command line argument that defines a variable.
The resulting argument will convert boolean values to OFF/ON and lists/tuples to CMake
semicolon-separated string lists. All other values will be interpreted as strings.
Examples:
.. code-block:: python
[define("BUILD_SHARED_LIBS", True),
define("CMAKE_CXX_STANDARD", 14),
define("swr", ["avx", "avx2"])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
"""
# Convert booleans to ON/OFF and non-string sequences to CMake
# semicolon-separated lists; everything else is stringified
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(
pkg: spack.package_base.PackageBase, cmake_var: str, variant: Optional[str] = None
) -> str:
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
Examples:
Given a package with:
.. code-block:: python
variant("cxxstd", default="11", values=("11", "14"),
multi=False, description="")
variant("shared", default=True, description="")
variant("swr", values=any_combination_of("avx", "avx2"),
description="")
calling this function like:
.. code-block:: python
[
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
self.define_from_variant("SWR"),
]
will generate the following configuration options:
.. code-block:: console
[
"-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2",
]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met, this function
returns an empty string. CMake discards empty strings provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, pkg.name))
if variant not in pkg.spec.variants:
return ""
value = pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return define(cmake_var, value)
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures and it is declared in the rocm package.
This method is a no-op for cmake<3.21 and when the ``amdgpu_target`` variant is
not set.
"""
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
return define("CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value)
return ""
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures;
it is declared in the cuda package.
This method is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is
not set.
"""
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
return define("CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value)
return ""

View File

@@ -180,13 +180,6 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@7:", when="+cuda ^cuda@:9.1 target=x86_64:")
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=x86_64:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.2.89 target=x86_64:")
conflicts("%pgi@:14.8", when="+cuda ^cuda@:7.0.27 target=x86_64:")
conflicts("%pgi@:15.3,15.5:", when="+cuda ^cuda@7.5 target=x86_64:")
conflicts("%pgi@:16.2,16.0:16.3", when="+cuda ^cuda@8 target=x86_64:")
conflicts("%pgi@:15,18:", when="+cuda ^cuda@9.0:9.1 target=x86_64:")
conflicts("%pgi@:16,19:", when="+cuda ^cuda@9.2.88:10.0 target=x86_64:")
conflicts("%pgi@:17,20:", when="+cuda ^cuda@10.1.105:10.2.89 target=x86_64:")
conflicts("%pgi@:17,21:", when="+cuda ^cuda@11.0.2:11.1.0 target=x86_64:")
conflicts("%clang@:3.4", when="+cuda ^cuda@:7.5 target=x86_64:")
conflicts("%clang@:3.7,4:", when="+cuda ^cuda@8.0:9.0 target=x86_64:")
conflicts("%clang@:3.7,4.1:", when="+cuda ^cuda@9.1 target=x86_64:")
@@ -212,9 +205,6 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=ppc64le:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.1.243 target=ppc64le:")
# officially, CUDA 11.0.2 only supports the system GCC 8.3 on ppc64le
conflicts("%pgi", when="+cuda ^cuda@:8 target=ppc64le:")
conflicts("%pgi@:16", when="+cuda ^cuda@:9.1.185 target=ppc64le:")
conflicts("%pgi@:17", when="+cuda ^cuda@:10 target=ppc64le:")
conflicts("%clang@4:", when="+cuda ^cuda@:9.0.176 target=ppc64le:")
conflicts("%clang@5:", when="+cuda ^cuda@:9.1 target=ppc64le:")
conflicts("%clang@6:", when="+cuda ^cuda@:9.2 target=ppc64le:")

View File

@@ -7,8 +7,9 @@
import spack.builder
import spack.directives
import spack.package_base
import spack.phase_callbacks
from ._checks import BaseBuilder, apply_macos_rpath_fixups, execute_install_time_tests
from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests
class Package(spack.package_base.PackageBase):
@@ -26,7 +27,7 @@ class Package(spack.package_base.PackageBase):
@spack.builder.builder("generic")
class GenericBuilder(BaseBuilder):
class GenericBuilder(BuilderWithDefaults):
"""A builder for a generic build system, that require packagers
to implement an "install" phase.
"""
@@ -44,7 +45,7 @@ class GenericBuilder(BaseBuilder):
install_time_test_callbacks = []
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
# unconditionally perform any post-install phase tests
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)

View File

@@ -7,10 +7,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class GoPackage(spack.package_base.PackageBase):
@@ -32,7 +33,7 @@ class GoPackage(spack.package_base.PackageBase):
@spack.builder.builder("go")
class GoBuilder(BaseBuilder):
class GoBuilder(BuilderWithDefaults):
"""The Go builder encodes the most common way of building software with
a golang go.mod file. It has two phases that can be overridden, if need be:
@@ -99,7 +100,7 @@ def install(self, pkg, spec, prefix):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""

View File

@@ -22,8 +22,8 @@
install,
)
import spack.builder
import spack.error
import spack.phase_callbacks
from spack.build_environment import dso_suffix
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
@@ -1163,7 +1163,7 @@ def _determine_license_type(self):
debug_print(license_type)
return license_type
@spack.builder.run_before("install")
@spack.phase_callbacks.run_before("install")
def configure(self):
"""Generates the silent.cfg file to pass to installer.sh.
@@ -1250,7 +1250,7 @@ def install(self, spec, prefix):
for f in glob.glob("%s/intel*log" % tmpdir):
install(f, dst)
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def validate_install(self):
# Sometimes the installer exits with an error but doesn't pass a
# non-zero exit code to spack. Check for the existence of a 'bin'
@@ -1258,7 +1258,7 @@ def validate_install(self):
if not os.path.exists(self.prefix.bin):
raise InstallError("The installer has failed to install anything.")
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def configure_rpath(self):
if "+rpath" not in self.spec:
return
@@ -1276,7 +1276,7 @@ def configure_rpath(self):
with open(compiler_cfg, "w") as fh:
fh.write("-Xlinker -rpath={0}\n".format(compilers_lib_dir))
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def configure_auto_dispatch(self):
if self._has_compilers:
if "auto_dispatch=none" in self.spec:
@@ -1300,7 +1300,7 @@ def configure_auto_dispatch(self):
with open(compiler_cfg, "a") as fh:
fh.write("-ax{0}\n".format(",".join(ad)))
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def filter_compiler_wrappers(self):
if ("+mpi" in self.spec or self.provides("mpi")) and "~newdtags" in self.spec:
bin_dir = self.component_bin_dir("mpi")
@@ -1308,7 +1308,7 @@ def filter_compiler_wrappers(self):
f = os.path.join(bin_dir, f)
filter_file("-Xlinker --enable-new-dtags", " ", f, string=True)
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def uninstall_ism(self):
# The "Intel(R) Software Improvement Program" [ahem] gets installed,
# apparently regardless of PHONEHOME_SEND_USAGE_DATA.
@@ -1340,7 +1340,7 @@ def base_lib_dir(self):
debug_print(d)
return d
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def modify_LLVMgold_rpath(self):
"""Add libimf.so and other required libraries to the RUNPATH of LLVMgold.so.

View File

@@ -8,11 +8,14 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from ._checks import (
BaseBuilder,
BuilderWithDefaults,
apply_macos_rpath_fixups,
execute_build_time_tests,
execute_install_time_tests,
@@ -36,7 +39,7 @@ class MakefilePackage(spack.package_base.PackageBase):
@spack.builder.builder("makefile")
class MakefileBuilder(BaseBuilder):
class MakefileBuilder(BuilderWithDefaults):
"""The Makefile builder encodes the most common way of building software with
Makefiles. It has three phases that can be overridden, if need be:
@@ -91,35 +94,50 @@ class MakefileBuilder(BaseBuilder):
install_time_test_callbacks = ["installcheck"]
@property
def build_directory(self):
def build_directory(self) -> str:
"""Return the directory containing the main Makefile."""
return self.pkg.stage.source_path
def edit(self, pkg, spec, prefix):
def edit(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Edit the Makefile before calling make. The default is a no-op."""
pass
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.build_targets)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self):
def installcheck(self) -> None:
"""Searches the Makefile for an ``installcheck`` target
and runs it if found.
"""
@@ -127,4 +145,4 @@ def installcheck(self):
self.pkg._if_make_target_execute("installcheck")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)

View File

@@ -10,7 +10,7 @@
from spack.multimethod import when
from spack.util.executable import which
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class MavenPackage(spack.package_base.PackageBase):
@@ -34,7 +34,7 @@ class MavenPackage(spack.package_base.PackageBase):
@spack.builder.builder("maven")
class MavenBuilder(BaseBuilder):
class MavenBuilder(BuilderWithDefaults):
"""The Maven builder encodes the default way to build software with Maven.
It has two phases that can be overridden, if need be:

View File

@@ -9,10 +9,13 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class MesonPackage(spack.package_base.PackageBase):
@@ -62,7 +65,7 @@ def flags_to_build_system_args(self, flags):
@spack.builder.builder("meson")
class MesonBuilder(BaseBuilder):
class MesonBuilder(BuilderWithDefaults):
"""The Meson builder encodes the default way to build software with Meson.
The builder has three phases that can be overridden, if need be:
@@ -112,7 +115,7 @@ def archive_files(self):
return [os.path.join(self.build_directory, "meson-logs", "meson-log.txt")]
@property
def root_mesonlists_dir(self):
def root_mesonlists_dir(self) -> str:
"""Relative path to the directory containing meson.build
This path is relative to the root of the extracted tarball,
@@ -121,7 +124,7 @@ def root_mesonlists_dir(self):
return self.pkg.stage.source_path
@property
def std_meson_args(self):
def std_meson_args(self) -> List[str]:
"""Standard meson arguments provided as a property for convenience
of package writers.
"""
@@ -132,7 +135,7 @@ def std_meson_args(self):
return std_meson_args
@staticmethod
def std_args(pkg):
def std_args(pkg) -> List[str]:
"""Standard meson arguments for a generic package."""
try:
build_type = pkg.spec.variants["buildtype"].value
@@ -172,7 +175,7 @@ def build_directory(self):
"""Directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def meson_args(self):
def meson_args(self) -> List[str]:
"""List of arguments that must be passed to meson, except:
* ``--prefix``
@@ -185,7 +188,12 @@ def meson_args(self):
"""
return []
def meson(self, pkg, spec, prefix):
def meson(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run ``meson`` in the build directory"""
options = []
if self.spec["meson"].satisfies("@0.64:"):
@@ -196,21 +204,31 @@ def meson(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.meson(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the build targets"""
options = ["-v"]
options += self.build_targets
with fs.working_dir(self.build_directory):
pkg.module.ninja(*options)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Search Meson-generated files for the target ``test`` and run it if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_ninja_target_execute("test")

View File

@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class MSBuildPackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class MSBuildPackage(spack.package_base.PackageBase):
@spack.builder.builder("msbuild")
class MSBuildBuilder(BaseBuilder):
class MSBuildBuilder(BuilderWithDefaults):
"""The MSBuild builder encodes the most common way of building software with
Microsoft's MSBuild tool. It has two phases that can be overridden, if need be:

View File

@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class NMakePackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class NMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("nmake")
class NMakeBuilder(BaseBuilder):
class NMakeBuilder(BuilderWithDefaults):
"""The NMake builder encodes the most common way of building software with
Microsoft's NMake tool. It has two phases that can be overridden, if need be:

View File

@@ -7,7 +7,7 @@
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class OctavePackage(spack.package_base.PackageBase):
@@ -29,7 +29,7 @@ class OctavePackage(spack.package_base.PackageBase):
@spack.builder.builder("octave")
class OctaveBuilder(BaseBuilder):
class OctaveBuilder(BuilderWithDefaults):
"""The octave builder provides the following phases that can be overridden:
1. :py:meth:`~.OctaveBuilder.install`

View File

@@ -255,7 +255,7 @@ def libs(self):
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
class IntelOneApiLibraryPackageWithSdk(IntelOneApiLibraryPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries

View File

@@ -10,11 +10,12 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.install_test import SkipTest, test_part
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class PerlPackage(spack.package_base.PackageBase):
@@ -84,7 +85,7 @@ def test_use(self):
@spack.builder.builder("perl")
class PerlBuilder(BaseBuilder):
class PerlBuilder(BuilderWithDefaults):
"""The perl builder provides four phases that can be overridden, if required:
1. :py:meth:`~.PerlBuilder.configure`
@@ -163,7 +164,7 @@ def configure(self, pkg, spec, prefix):
# Build.PL may be too long causing the build to fail. Patching the shebang
# does not happen until after install so set '/usr/bin/env perl' here in
# the Build script.
@spack.builder.run_after("configure")
@spack.phase_callbacks.run_after("configure")
def fix_shebang(self):
if self.build_method == "Build.PL":
pattern = "#!{0}".format(self.spec["perl"].command.path)
@@ -175,7 +176,7 @@ def build(self, pkg, spec, prefix):
self.build_executable()
# Ensure that tests run after build (if requested):
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
"""Runs built-in tests of a Perl package."""

View File

@@ -24,6 +24,7 @@
import spack.detection
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.platforms
import spack.repo
import spack.spec
@@ -34,7 +35,7 @@
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
@@ -374,7 +375,7 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return None
@property
def python_spec(self):
def python_spec(self) -> Spec:
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
@@ -425,7 +426,7 @@ def libs(self) -> LibraryList:
@spack.builder.builder("python_pip")
class PythonPipBuilder(BaseBuilder):
class PythonPipBuilder(BuilderWithDefaults):
phases = ("install",)
#: Names associated with package methods in the old build-system format
@@ -543,4 +544,4 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
with fs.working_dir(self.build_directory):
pip(*args)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)

View File

@@ -6,9 +6,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class QMakePackage(spack.package_base.PackageBase):
@@ -30,7 +31,7 @@ class QMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("qmake")
class QMakeBuilder(BaseBuilder):
class QMakeBuilder(BuilderWithDefaults):
"""The qmake builder provides three phases that can be overridden:
1. :py:meth:`~.QMakeBuilder.qmake`
@@ -81,4 +82,4 @@ def check(self):
with working_dir(self.build_directory):
self.pkg._if_make_target_execute("check")
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)

View File

@@ -8,7 +8,7 @@
import spack.package_base
from spack.directives import build_system, extends, maintainers
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class RubyPackage(spack.package_base.PackageBase):
@@ -28,7 +28,7 @@ class RubyPackage(spack.package_base.PackageBase):
@spack.builder.builder("ruby")
class RubyBuilder(BaseBuilder):
class RubyBuilder(BuilderWithDefaults):
"""The Ruby builder provides two phases that can be overridden if required:
#. :py:meth:`~.RubyBuilder.build`

View File

@@ -4,9 +4,10 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class SConsPackage(spack.package_base.PackageBase):
@@ -28,7 +29,7 @@ class SConsPackage(spack.package_base.PackageBase):
@spack.builder.builder("scons")
class SConsBuilder(BaseBuilder):
class SConsBuilder(BuilderWithDefaults):
"""The Scons builder provides the following phases that can be overridden:
1. :py:meth:`~.SConsBuilder.build`
@@ -79,4 +80,4 @@ def build_test(self):
"""
pass
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)

View File

@@ -11,11 +11,12 @@
import spack.builder
import spack.install_test
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class SIPPackage(spack.package_base.PackageBase):
@@ -103,7 +104,7 @@ def test_imports(self):
@spack.builder.builder("sip")
class SIPBuilder(BaseBuilder):
class SIPBuilder(BuilderWithDefaults):
"""The SIP builder provides the following phases that can be overridden:
* configure
@@ -170,4 +171,4 @@ def install_args(self):
"""Arguments to pass to install."""
return []
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)

View File

@@ -6,9 +6,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests, execute_install_time_tests
class WafPackage(spack.package_base.PackageBase):
@@ -30,7 +31,7 @@ class WafPackage(spack.package_base.PackageBase):
@spack.builder.builder("waf")
class WafBuilder(BaseBuilder):
class WafBuilder(BuilderWithDefaults):
"""The WAF builder provides the following phases that can be overridden:
* configure
@@ -136,7 +137,7 @@ def build_test(self):
"""
pass
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def install_test(self):
"""Run unit tests after install.
@@ -146,4 +147,4 @@ def install_test(self):
"""
pass
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)

View File

@@ -6,44 +6,30 @@
import collections.abc
import copy
import functools
from typing import List, Optional, Tuple
from llnl.util import lang
from typing import Dict, List, Optional, Tuple, Type
import spack.error
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.util.environment
#: Builder classes, as registered by the "builder" decorator
BUILDER_CLS = {}
#: An object of this kind is a shared global state used to collect callbacks during
#: class definition time, and is flushed when the class object is created at the end
#: of the class definition
#:
#: Args:
#: attribute_name (str): name of the attribute that will be attached to the builder
#: callbacks (list): container used to temporarily aggregate the callbacks
CallbackTemporaryStage = collections.namedtuple(
"CallbackTemporaryStage", ["attribute_name", "callbacks"]
)
#: Shared global state to aggregate "@run_before" callbacks
_RUN_BEFORE = CallbackTemporaryStage(attribute_name="run_before_callbacks", callbacks=[])
#: Shared global state to aggregate "@run_after" callbacks
_RUN_AFTER = CallbackTemporaryStage(attribute_name="run_after_callbacks", callbacks=[])
BUILDER_CLS: Dict[str, Type["Builder"]] = {}
#: Map id(pkg) to a builder, to avoid creating multiple
#: builders for the same package object.
_BUILDERS = {}
_BUILDERS: Dict[int, "Builder"] = {}
def builder(build_system_name):
def builder(build_system_name: str):
"""Class decorator used to register the default builder
for a given build-system.
Args:
build_system_name (str): name of the build-system
build_system_name: name of the build-system
"""
def _decorator(cls):
@@ -54,13 +40,9 @@ def _decorator(cls):
return _decorator
def create(pkg):
"""Given a package object with an associated concrete spec,
return the builder object that can install it.
Args:
pkg (spack.package_base.PackageBase): package for which we want the builder
"""
def create(pkg: spack.package_base.PackageBase) -> "Builder":
"""Given a package object with an associated concrete spec, return the builder object that can
install it."""
if id(pkg) not in _BUILDERS:
_BUILDERS[id(pkg)] = _create(pkg)
return _BUILDERS[id(pkg)]
@@ -75,7 +57,7 @@ def __call__(self, spec, prefix):
return self.phase_fn(self.builder.pkg, spec, prefix)
def get_builder_class(pkg, name: str) -> Optional[type]:
def get_builder_class(pkg, name: str) -> Optional[Type["Builder"]]:
"""Return the builder class if a package module defines it."""
cls = getattr(pkg.module, name, None)
if cls and cls.__module__.startswith(spack.repo.ROOT_PYTHON_NAMESPACE):
@@ -83,7 +65,7 @@ def get_builder_class(pkg, name: str) -> Optional[type]:
return None
def _create(pkg):
def _create(pkg: spack.package_base.PackageBase) -> "Builder":
"""Return a new builder object for the package object being passed as argument.
The function inspects the build-system used by the package object and tries to:
@@ -103,7 +85,7 @@ class hierarchy (look at AspellDictPackage for an example of that)
to look for build-related methods in the ``*Package``.
Args:
pkg (spack.package_base.PackageBase): package object for which we need a builder
pkg: package object for which we need a builder
"""
package_buildsystem = buildsystem_name(pkg)
default_builder_cls = BUILDER_CLS[package_buildsystem]
@@ -168,8 +150,8 @@ def __forward(self, *args, **kwargs):
# with the same name is defined in the Package, it will override this definition
# (when _ForwardToBaseBuilder is initialized)
for method_name in (
base_cls.phases
+ base_cls.legacy_methods
base_cls.phases # type: ignore
+ base_cls.legacy_methods # type: ignore
+ getattr(base_cls, "legacy_long_methods", tuple())
+ ("setup_build_environment", "setup_dependent_build_environment")
):
@@ -181,14 +163,14 @@ def __forward(self):
return __forward
for attribute_name in base_cls.legacy_attributes:
for attribute_name in base_cls.legacy_attributes: # type: ignore
setattr(
_ForwardToBaseBuilder,
attribute_name,
property(forward_property_to_getattr(attribute_name)),
)
class Adapter(base_cls, metaclass=_PackageAdapterMeta):
class Adapter(base_cls, metaclass=_PackageAdapterMeta): # type: ignore
def __init__(self, pkg):
# Deal with custom phases in packages here
if hasattr(pkg, "phases"):
@@ -213,99 +195,18 @@ def setup_dependent_build_environment(self, env, dependent_spec):
return Adapter(pkg)
def buildsystem_name(pkg):
def buildsystem_name(pkg: spack.package_base.PackageBase) -> str:
"""Given a package object with an associated concrete spec,
return the name of its build system.
Args:
pkg (spack.package_base.PackageBase): package for which we want
the build system name
"""
return the name of its build system."""
try:
return pkg.spec.variants["build_system"].value
except KeyError:
# We are reading an old spec without the build_system variant
return pkg.legacy_buildsystem
class PhaseCallbacksMeta(type):
"""Permit to register arbitrary functions during class definition and run them
later, before or after a given install phase.
Each method decorated with ``run_before`` or ``run_after`` gets temporarily
stored in a global shared state when a class being defined is parsed by the Python
interpreter. At class definition time that temporary storage gets flushed and a list
of callbacks is attached to the class being defined.
"""
def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, we deduplicate them, then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]
return super(PhaseCallbacksMeta, mcs).__new__(mcs, name, bases, attr_dict)
@staticmethod
def run_after(phase, when=None):
"""Decorator to register a function for running after a given phase.
Args:
phase (str): phase after which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_AFTER.callbacks.append(item)
return fn
return _decorator
@staticmethod
def run_before(phase, when=None):
"""Decorator to register a function for running before a given phase.
Args:
phase (str): phase before which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_BEFORE.callbacks.append(item)
return fn
return _decorator
return pkg.legacy_buildsystem # type: ignore
class BuilderMeta(
PhaseCallbacksMeta,
spack.phase_callbacks.PhaseCallbacksMeta,
spack.multimethod.MultiMethodMeta,
type(collections.abc.Sequence), # type: ignore
):
@@ -400,8 +301,12 @@ def __new__(mcs, name, bases, attr_dict):
)
combine_callbacks = _PackageAdapterMeta.combine_callbacks
attr_dict[_RUN_BEFORE.attribute_name] = combine_callbacks(_RUN_BEFORE.attribute_name)
attr_dict[_RUN_AFTER.attribute_name] = combine_callbacks(_RUN_AFTER.attribute_name)
attr_dict[spack.phase_callbacks._RUN_BEFORE.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_BEFORE.attribute_name
)
attr_dict[spack.phase_callbacks._RUN_AFTER.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_AFTER.attribute_name
)
return super(_PackageAdapterMeta, mcs).__new__(mcs, name, bases, attr_dict)
@@ -421,8 +326,8 @@ def __init__(self, name, builder):
self.name = name
self.builder = builder
self.phase_fn = self._select_phase_fn()
self.run_before = self._make_callbacks(_RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(_RUN_AFTER.attribute_name)
self.run_before = self._make_callbacks(spack.phase_callbacks._RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(spack.phase_callbacks._RUN_AFTER.attribute_name)
def _make_callbacks(self, callbacks_attribute):
result = []
@@ -483,15 +388,103 @@ def copy(self):
return copy.deepcopy(self)
class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
"""A builder is a class that, given a package object (i.e. associated with
concrete spec), knows how to install it.
class BaseBuilder(metaclass=BuilderMeta):
"""An interface for builders, without any phases defined. This class is exposed in the package
API, so that packagers can create a single class to define ``setup_build_environment`` and
``@run_before`` and ``@run_after`` callbacks that can be shared among different builders.
The builder behaves like a sequence, and when iterated over returns the
"phases" of the installation in the correct order.
Example:
Args:
pkg (spack.package_base.PackageBase): package object to be built
.. code-block:: python
class AnyBuilder(BaseBuilder):
@run_after("install")
def fixup_install(self):
# do something after the package is installed
pass
def setup_build_environment(self, env):
env.set("MY_ENV_VAR", "my_value")
class CMakeBuilder(cmake.CMakeBuilder, AnyBuilder):
pass
class AutotoolsBuilder(autotools.AutotoolsBuilder, AnyBuilder):
pass
"""
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
self.pkg = pkg
@property
def spec(self) -> spack.spec.Spec:
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env: environment modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env) # type: ignore
def setup_dependent_build_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
"""Sets up the build environment of a package that depends on this one.
This is similar to ``setup_build_environment``, but it is used to modify the build
environment of a package that *depends* on this one.
This gives packages the ability to set environment variables for the build of the
dependent, which can be useful to provide search hints for headers or libraries if they are
not in standard locations.
This method will be called before the dependent package prefix exists in Spack's store.
Args:
env: environment modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec: the spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec) # type: ignore
def __repr__(self):
fmt = "{name}{/hash:7}"
return f"{self.__class__.__name__}({self.spec.format(fmt)})"
def __str__(self):
fmt = "{name}{/hash:7}"
return f'"{self.__class__.__name__}" builder for "{self.spec.format(fmt)}"'
class Builder(BaseBuilder, collections.abc.Sequence):
"""A builder is a class that, given a package object (i.e. associated with concrete spec),
knows how to install it.
The builder behaves like a sequence, and when iterated over return the "phases" of the
installation in the correct order.
"""
#: Sequence of phases. Must be defined in derived classes
@@ -506,95 +499,22 @@ class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
build_time_test_callbacks: List[str]
install_time_test_callbacks: List[str]
#: List of glob expressions. Each expression must either be
#: absolute or relative to the package source path.
#: Matching artifacts found at the end of the build process will be
#: copied in the same directory tree as _spack_build_logfile and
#: _spack_build_envfile.
archive_files: List[str] = []
#: List of glob expressions. Each expression must either be absolute or relative to the package
#: source path. Matching artifacts found at the end of the build process will be copied in the
#: same directory tree as _spack_build_logfile and _spack_build_envfile.
@property
def archive_files(self) -> List[str]:
return []
def __init__(self, pkg):
self.pkg = pkg
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
super().__init__(pkg)
self.callbacks = {}
for phase in self.phases:
self.callbacks[phase] = InstallationPhase(phase, self)
@property
def spec(self):
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(self, env):
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env)
def setup_dependent_build_environment(self, env, dependent_spec):
"""Sets up the build environment of packages that depend on this one.
This is similar to ``setup_build_environment``, but it is used to
modify the build environments of packages that *depend* on this one.
This gives packages like Python and others that follow the extension
model a way to implement common environment or compile-time settings
for dependencies.
This method will be called before the dependent package prefix exists
in Spack's store.
Examples:
1. Installing python modules generally requires ``PYTHONPATH``
to point to the ``lib/pythonX.Y/site-packages`` directory in the
module's install prefix. This method could be used to set that
variable.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec): the spec of the dependent package
about to be built. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec)
def __getitem__(self, idx):
key = self.phases[idx]
return self.callbacks[key]
def __len__(self):
return len(self.phases)
def __repr__(self):
msg = "{0}({1})"
return msg.format(type(self).__name__, self.pkg.spec.format("{name}/{hash:7}"))
def __str__(self):
msg = '"{0}" builder for "{1}"'
return msg.format(type(self).build_system, self.pkg.spec.format("{name}/{hash:7}"))
# Export these names as standalone to be used in packages
run_after = PhaseCallbacksMeta.run_after
run_before = PhaseCallbacksMeta.run_before
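# A minimal sketch of how a package consumes these standalone exports; the
# package class is hypothetical, and each decorator registers the method to
# run around the named phase, optionally gated by a ``when`` spec condition:
#
#   class Example(Package):
#       @run_before("install")
#       def sanity_check(self):
#           ...
#
#       @run_after("install", when="platform=darwin")
#       def fix_rpaths(self):
#           ...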

View File

@@ -32,11 +32,13 @@
import spack
import spack.binary_distribution as bindist
import spack.builder
import spack.concretize
import spack.config as cfg
import spack.error
import spack.main
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.paths
import spack.repo
import spack.spec
@@ -203,7 +205,7 @@ def _print_staging_summary(spec_labels, stages, rebuild_decisions):
if not stages:
return
mirrors = spack.mirror.MirrorCollection(binary=True)
mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
tty.msg("Checked the following mirrors for binaries:")
for m in mirrors.values():
tty.msg(f" {m.fetch_url}")
@@ -796,7 +798,7 @@ def ensure_expected_target_path(path):
path = path.replace("\\", "/")
return path
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
raise SpackCIError("spack ci generate requires a mirror named 'buildcache-destination'")
@@ -1322,7 +1324,7 @@ def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: b
"""
tty.debug(f"Pushing to build cache ({'signed' if sign_binaries else 'unsigned'})")
signing_key = bindist.select_signing_key() if sign_binaries else None
mirror = spack.mirror.Mirror.from_url(mirror_url)
mirror = spack.mirrors.mirror.Mirror.from_url(mirror_url)
try:
with bindist.make_uploader(mirror, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])
@@ -1342,7 +1344,7 @@ def remove_other_mirrors(mirrors_to_keep, scope=None):
mirrors_to_remove.append(name)
for mirror_name in mirrors_to_remove:
spack.mirror.remove(mirror_name, scope)
spack.mirrors.utils.remove(mirror_name, scope)
def copy_files_to_artifacts(src, artifacts_dir):
@@ -1387,7 +1389,11 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
stage_dir = job_pkg.stage.path
tty.debug(f"stage dir: {stage_dir}")
for file in [job_pkg.log_path, job_pkg.env_mods_path, *job_pkg.builder.archive_files]:
for file in [
job_pkg.log_path,
job_pkg.env_mods_path,
*spack.builder.create(job_pkg).archive_files,
]:
copy_files_to_artifacts(file, job_log_dir)

View File

@@ -4,11 +4,13 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import difflib
import importlib
import os
import re
import sys
from typing import List, Union
from collections import Counter
from typing import List, Optional, Union
import llnl.string
import llnl.util.tty as tty
@@ -24,6 +26,7 @@
import spack.extensions
import spack.parser
import spack.paths
import spack.repo
import spack.spec
import spack.store
import spack.traverse as traverse
@@ -31,6 +34,8 @@
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
from ..enums import InstallRecordStatus
# cmd has a submodule called "list" so preserve the python list module
python_list = list
@@ -121,6 +126,8 @@ def get_module(cmd_name):
tty.debug("Imported {0} from built-in commands".format(pname))
except ImportError:
module = spack.extensions.get_module(cmd_name)
if not module:
raise CommandNotFoundError(cmd_name)
attr_setdefault(module, SETUP_PARSER, lambda *args: None) # null-op
attr_setdefault(module, DESCRIPTION, "")
@@ -189,6 +196,43 @@ def _concretize_spec_pairs(to_concretize, tests=False):
rules from config."""
unify = spack.config.get("concretizer:unify", False)
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or spack.concretize.concretized(abstract)]
# Special case if every spec is either concrete or has an abstract hash
if all(
concrete or abstract.concrete or abstract.abstract_hash
for abstract, concrete in to_concretize
):
# Get all the concrete specs
ret = [
concrete or (abstract if abstract.concrete else abstract.lookup_hash())
for abstract, concrete in to_concretize
]
# If unify: true, check that specs don't conflict
# Since all concrete, "when_possible" is not relevant
if unify is True: # True, "when_possible", False are possible values
runtimes = spack.repo.PATH.packages_with_tags("runtime")
specs_per_name = Counter(
spec.name
for spec in traverse.traverse_nodes(
ret, deptype=("link", "run"), key=traverse.by_dag_hash
)
if spec.name not in runtimes # runtimes are allowed multiple times
)
conflicts = sorted(name for name, count in specs_per_name.items() if count > 1)
if conflicts:
raise spack.error.SpecError(
"Specs conflict and `concretizer:unify` is configured true.",
f" specs depend on multiple versions of {', '.join(conflicts)}",
)
return ret
# Standard case
concretize_method = spack.concretize.concretize_separately # unify: false
if unify is True:
concretize_method = spack.concretize.concretize_together
@@ -207,9 +251,9 @@ def matching_spec_from_env(spec):
"""
env = ev.active_environment()
if env:
return env.matching_spec(spec) or spec.concretized()
return env.matching_spec(spec) or spack.concretize.concretized(spec)
else:
return spec.concretized()
return spack.concretize.concretized(spec)
def matching_specs_from_env(specs):
@@ -228,39 +272,48 @@ def matching_specs_from_env(specs):
return _concretize_spec_pairs(spec_pairs + additional_concrete_specs)[: len(spec_pairs)]
def disambiguate_spec(spec, env, local=False, installed=True, first=False):
def disambiguate_spec(
spec: spack.spec.Spec,
env: Optional[ev.Environment],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
"""Given a spec, figure out which installed package it refers to.
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
env (spack.environment.Environment): a spack environment,
if one is active, or None if no environment is active
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
Args:
spec: a spec to disambiguate
env: a spack environment, if one is active, or None if no environment is active
local: do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
"""
hashes = env.all_hashes() if env else None
return disambiguate_spec_from_hashes(spec, hashes, local, installed, first)
def disambiguate_spec_from_hashes(spec, hashes, local=False, installed=True, first=False):
def disambiguate_spec_from_hashes(
spec: spack.spec.Spec,
hashes: List[str],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
"""Given a spec and a list of hashes, get concrete spec the spec refers to.
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
hashes (typing.Iterable): a set of hashes of specs among which to disambiguate
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
spec: a spec to disambiguate
hashes: a set of hashes of specs among which to disambiguate
local: if True, do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
"""
if local:
matching_specs = spack.store.STORE.db.query_local(spec, hashes=hashes, installed=installed)
else:
matching_specs = spack.store.STORE.db.query(spec, hashes=hashes, installed=installed)
if not matching_specs:
tty.die("Spec '%s' matches no installed packages." % spec)
tty.die(f"Spec '{spec}' matches no installed packages.")
elif first:
return matching_specs[0]
@@ -545,6 +598,18 @@ def __init__(self, name):
super().__init__("{0} is not a permissible Spack command name.".format(name))
class MultipleSpecsMatch(Exception):
"""Raised when multiple specs match a constraint, in a context where
this is not allowed.
"""
class NoSpecMatches(Exception):
"""Raised when no spec matches a constraint, in a context where
this is not allowed.
"""
########################################
# argparse types for argument validation
########################################
@@ -629,3 +694,24 @@ def find_environment(args):
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split("\n")[0]
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
f"{cmd_name} is not a recognized Spack command or extension command; "
"check with `spack commands`."
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
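# The suggestions above come straight from the stdlib matcher; a quick sketch
# of its behavior on hypothetical input:
#
#   import difflib
#   difflib.get_close_matches("instal", ["install", "info", "uninstall"])
#   # -> ['install', 'uninstall']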

View File

@@ -15,8 +15,9 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.concretize
import spack.config
import spack.mirror
import spack.mirrors.utils
import spack.spec
import spack.stage
import spack.util.path
@@ -398,9 +399,9 @@ def _mirror(args):
llnl.util.tty.msg(msg.format(spec_str, mirror_dir))
# Suppress tty from the call below for terser messages
llnl.util.tty.set_msg_enabled(False)
spec = spack.spec.Spec(spec_str).concretized()
spec = spack.concretize.concretized(spack.spec.Spec(spec_str))
for node in spec.traverse():
spack.mirror.create(mirror_dir, [node])
spack.mirrors.utils.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)
if args.binary_packages:

View File

@@ -17,11 +17,12 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.mirror
import spack.mirrors.mirror
import spack.oci.oci
import spack.spec
import spack.stage
@@ -34,6 +35,8 @@
from spack.cmd.common import arguments
from spack.spec import Spec, save_dependency_specfiles
from ..enums import InstallRecordStatus
description = "create, download and install binary packages"
section = "packaging"
level = "long"
@@ -308,7 +311,10 @@ def setup_parser(subparser: argparse.ArgumentParser):
def _matching_specs(specs: List[Spec]) -> List[Spec]:
"""Disambiguate specs and return a list of matching specs"""
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
return [
spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=InstallRecordStatus.ANY)
for s in specs
]
def _format_spec(spec: Spec) -> str:
@@ -387,7 +393,7 @@ def push_fn(args):
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
mirror = args.mirror
assert isinstance(mirror, spack.mirror.Mirror)
assert isinstance(mirror, spack.mirrors.mirror.Mirror)
push_url = mirror.push_url
@@ -550,8 +556,7 @@ def check_fn(args: argparse.Namespace):
tty.msg("No specs provided, exiting.")
return
for spec in specs:
spec.concretize()
specs = [spack.concretize.concretized(s) for s in specs]
# Next see if there are any configured binary mirrors
configured_mirrors = spack.config.get("mirrors", scope=args.scope)
@@ -619,7 +624,7 @@ def save_specfile_fn(args):
root = specs[0]
if not root.concrete:
root.concretize()
root = spack.concretize.concretized(root)
save_dependency_specfiles(
root, args.specfile_dir, dependencies=spack.cmd.parse_specs(args.specs)
@@ -745,7 +750,7 @@ def manifest_copy(manifest_file_list, dest_mirror=None):
copy_buildcache_file(copy_file["src"], dest)
def update_index(mirror: spack.mirror.Mirror, update_keys=False):
def update_index(mirror: spack.mirrors.mirror.Mirror, update_keys=False):
# Special case OCI images for now.
try:
image_ref = spack.oci.oci.image_from_mirror(mirror)

View File

@@ -20,7 +20,7 @@
import spack.config as cfg
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.mirrors.mirror
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
@@ -240,7 +240,7 @@ def ci_reindex(args):
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
mirror = spack.mirror.Mirror(remote_mirror_url)
mirror = spack.mirrors.mirror.Mirror(remote_mirror_url)
buildcache.update_index(mirror, update_keys=True)
@@ -328,7 +328,7 @@ def ci_rebuild(args):
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
tty.die("spack ci rebuild requires a mirror named 'buildcache-destination")

View File

@@ -14,7 +14,8 @@
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.reporters
import spack.spec
import spack.store
@@ -581,23 +582,51 @@ def add_concretizer_args(subparser):
def add_connection_args(subparser, add_help):
subparser.add_argument(
"--s3-access-key-id", help="ID string to use to connect to this S3 mirror"
def add_argument_string_or_variable(parser, arg: str, *, deprecate_str: bool = True, **kwargs):
group = parser.add_mutually_exclusive_group()
group.add_argument(arg, **kwargs)
# Update help string
if "help" in kwargs:
kwargs["help"] = "environment variable containing " + kwargs["help"]
group.add_argument(arg + "-variable", **kwargs)
s3_connection_parser = subparser.add_argument_group("S3 Connection")
add_argument_string_or_variable(
s3_connection_parser,
"--s3-access-key-id",
help="ID string to use to connect to this S3 mirror",
)
subparser.add_argument(
"--s3-access-key-secret", help="secret string to use to connect to this S3 mirror"
add_argument_string_or_variable(
s3_connection_parser,
"--s3-access-key-secret",
help="secret string to use to connect to this S3 mirror",
)
subparser.add_argument(
"--s3-access-token", help="access token to use to connect to this S3 mirror"
add_argument_string_or_variable(
s3_connection_parser,
"--s3-access-token",
help="access token to use to connect to this S3 mirror",
)
subparser.add_argument(
s3_connection_parser.add_argument(
"--s3-profile", help="S3 profile name to use to connect to this S3 mirror", default=None
)
subparser.add_argument(
s3_connection_parser.add_argument(
"--s3-endpoint-url", help="endpoint URL to use to connect to this S3 mirror"
)
subparser.add_argument("--oci-username", help="username to use to connect to this OCI mirror")
subparser.add_argument("--oci-password", help="password to use to connect to this OCI mirror")
oci_connection_parser = subparser.add_argument_group("OCI Connection")
add_argument_string_or_variable(
oci_connection_parser,
"--oci-username",
deprecate_str=False,
help="username to use to connect to this OCI mirror",
)
add_argument_string_or_variable(
oci_connection_parser,
"--oci-password",
help="password to use to connect to this OCI mirror",
)
def use_buildcache(cli_arg_value):
@@ -661,31 +690,31 @@ def mirror_name_or_url(m):
# If there's a \ or / in the name, it's interpreted as a path or url.
if "/" in m or "\\" in m or m in (".", ".."):
return spack.mirror.Mirror(m)
return spack.mirrors.mirror.Mirror(m)
# Otherwise, the named mirror is required to exist.
try:
return spack.mirror.require_mirror_name(m)
return spack.mirrors.utils.require_mirror_name(m)
except ValueError as e:
raise argparse.ArgumentTypeError(f"{e}. Did you mean {os.path.join('.', m)}?") from e
def mirror_url(url):
try:
return spack.mirror.Mirror.from_url(url)
return spack.mirrors.mirror.Mirror.from_url(url)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_directory(path):
try:
return spack.mirror.Mirror.from_local_path(path)
return spack.mirrors.mirror.Mirror.from_local_path(path)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_name(name):
try:
return spack.mirror.require_mirror_name(name)
return spack.mirrors.utils.require_mirror_name(name)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e

View File

@@ -19,13 +19,15 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.concretize
import spack.environment as ev
import spack.installer
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from spack.error import SpackError
from ..enums import InstallRecordStatus
description = "replace one package with another via symlinks"
section = "admin"
level = "long"
@@ -95,11 +97,15 @@ def deprecate(parser, args):
if len(specs) != 2:
raise SpackError("spack deprecate requires exactly two specs")
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
deprecate = spack.cmd.disambiguate_spec(specs[0], env, local=True, installed=install_query)
deprecate = spack.cmd.disambiguate_spec(
specs[0],
env,
local=True,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
)
if args.install:
deprecator = specs[1].concretized()
deprecator = spack.concretize.concretized(specs[1])
else:
deprecator = spack.cmd.disambiguate_spec(specs[1], env, local=True)

View File

@@ -11,6 +11,7 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments
import spack.concretize
import spack.config
import spack.repo
from spack.cmd.common import arguments
@@ -115,7 +116,7 @@ def dev_build(self, args):
# Forces the build to run out of the source directory.
spec.constrain("dev_path=%s" % source_path)
spec.concretize()
spec = spack.concretize.concretized(spec)
if spec.installed:
tty.error("Already installed in %s" % spec.prefix)

View File

@@ -10,11 +10,12 @@
import sys
import tempfile
from pathlib import Path
from typing import List, Optional
from typing import List, Optional, Set
import llnl.string as string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.symlink import islink, symlink
from llnl.util.tty.colify import colify
from llnl.util.tty.color import cescape, colorize
@@ -50,6 +51,8 @@
"update",
"revert",
"depfile",
"track",
"untrack",
]
@@ -446,6 +449,193 @@ def env_deactivate(args):
sys.stdout.write(cmds)
#
# env track
#
def env_track_setup_parser(subparser):
"""track an environment from a directory in Spack"""
subparser.add_argument("-n", "--name", help="custom environment name")
subparser.add_argument("dir", help="path to environment")
arguments.add_common_arguments(subparser, ["yes_to_all"])
def env_track(args):
src_path = os.path.abspath(args.dir)
if not ev.is_env_dir(src_path):
tty.die("Cannot track environment. Path doesn't contain an environment")
if args.name:
name = args.name
else:
name = os.path.basename(src_path)
try:
dst_path = ev.environment_dir_from_name(name, exists_ok=False)
except ev.SpackEnvironmentError:
tty.die(
f"An environment named {name} already exists. Set a name with:"
"\n\n"
f" spack env track --name NAME {src_path}\n"
)
symlink(src_path, dst_path)
tty.msg(f"Tracking environment in {src_path}")
tty.msg(
"You can now activate this environment with the following command:\n\n"
f" spack env activate {name}\n"
)
#
# env remove & untrack helpers
#
def filter_managed_env_names(env_names: Set[str]) -> Set[str]:
tracked_env_names = {e for e in env_names if islink(ev.environment_dir_from_name(e))}
managed_env_names = env_names - tracked_env_names
num_managed_envs = len(managed_env_names)
managed_envs_str = " ".join(managed_env_names)
if num_managed_envs >= 2:
tty.error(
f"The following are not tracked environments. "
"To remove them completely run,"
"\n\n"
f" spack env rm {managed_envs_str}\n"
)
elif num_managed_envs > 0:
tty.error(
f"'{managed_envs_str}' is not a tracked env. "
"To remove it completely run,"
"\n\n"
f" spack env rm {managed_envs_str}\n"
)
return tracked_env_names
def get_valid_envs(env_names: Set[str]) -> Set[ev.Environment]:
valid_envs = set()
for env_name in env_names:
try:
env = ev.read(env_name)
valid_envs.add(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
pass
return valid_envs
def _env_untrack_or_remove(
env_names: List[str], remove: bool = False, force: bool = False, yes_to_all: bool = False
):
all_env_names = set(ev.all_environment_names())
known_env_names = set(env_names).intersection(all_env_names)
unknown_env_names = set(env_names) - known_env_names
# print error for unknown environments
for env_name in unknown_env_names:
tty.error(f"Environment '{env_name}' does not exist")
# if only untracking is allowed, keep only environments
# whose directories are symlinks (i.e. tracked environments)
if not remove:
env_names_to_remove = filter_managed_env_names(known_env_names)
else:
env_names_to_remove = known_env_names
# initialize all environments with valid spack.yaml configs
all_valid_envs = get_valid_envs(all_env_names)
# build a task list of environments and bad env names to remove
envs_to_remove = [e for e in all_valid_envs if e.name in env_names_to_remove]
bad_env_names_to_remove = env_names_to_remove - {e.name for e in envs_to_remove}
for remove_env in envs_to_remove:
for env in all_valid_envs:
# don't check if an environment is included in itself
if env.name == remove_env.name:
continue
# check if an environment is included in another
if remove_env.path in env.included_concrete_envs:
msg = f"Environment '{remove_env.name}' is used by environment '{env.name}'"
if force:
tty.warn(msg)
else:
tty.error(msg)
envs_to_remove.remove(remove_env)
# ask the user if they really want to remove the known environments
# force should do the same as yes_to_all here, following the semantics of rm
if not (yes_to_all or force) and (envs_to_remove or bad_env_names_to_remove):
environments = string.plural(len(env_names_to_remove), "environment", show_n=False)
envs = string.comma_and(list(env_names_to_remove))
answer = tty.get_yes_or_no(
f"Really {'remove' if remove else 'untrack'} {environments} {envs}?", default=False
)
if not answer:
tty.die("Will not remove any environments")
# keep track of the environments we remove for later printing the exit code
removed_env_names = []
for env in envs_to_remove:
name = env.name
if not force and env.active:
tty.error(
f"Environment '{name}' can't be "
f"{'removed' if remove else 'untracked'} while activated."
)
continue
# Get path to check if environment is a tracked / symlinked environment
if islink(env.path):
real_env_path = os.path.realpath(env.path)
os.unlink(env.path)
tty.msg(
f"Sucessfully untracked environment '{name}', "
"but it can still be found at:\n\n"
f" {real_env_path}\n"
)
else:
env.destroy()
tty.msg(f"Successfully removed environment '{name}'")
removed_env_names.append(env.name)
for bad_env_name in bad_env_names_to_remove:
shutil.rmtree(
spack.environment.environment.environment_dir_from_name(bad_env_name, exists_ok=True)
)
tty.msg(f"Successfully removed environment '{bad_env_name}'")
removed_env_names.append(bad_env_name)
# Following the design of Linux rm, we should exit with a status of 1
# anytime we cannot delete every environment the user asks for.
# However, we should still process all the environments we know about
# and delete them instead of failing on the first unknown environment.
if len(removed_env_names) < len(known_env_names):
sys.exit(1)
#
# env untrack
#
def env_untrack_setup_parser(subparser):
"""track an environment from a directory in Spack"""
subparser.add_argument("env", nargs="+", help="tracked environment name")
subparser.add_argument(
"-f", "--force", action="store_true", help="force unlink even when environment is active"
)
arguments.add_common_arguments(subparser, ["yes_to_all"])
def env_untrack(args):
_env_untrack_or_remove(
env_names=args.env, force=args.force, yes_to_all=args.yes_to_all, remove=False
)
#
# env remove
#
@@ -471,54 +661,9 @@ def env_remove_setup_parser(subparser):
def env_remove(args):
"""remove existing environment(s)"""
remove_envs = []
valid_envs = []
bad_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
valid_envs.append(env)
if env_name in args.rm_env:
remove_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if remove_env is included from another env before trying to remove
for env in valid_envs:
for remove_env in remove_envs:
# don't check if environment is included to itself
if env.name == remove_env.name:
continue
if remove_env.path in env.included_concrete_envs:
msg = f'Environment "{remove_env.name}" is being used by environment "{env.name}"'
if args.force:
tty.warn(msg)
else:
tty.die(msg)
if not args.yes_to_all:
environments = string.plural(len(args.rm_env), "environment", show_n=False)
envs = string.comma_and(args.rm_env)
answer = tty.get_yes_or_no(f"Really remove {environments} {envs}?", default=False)
if not answer:
tty.die("Will not remove any environments")
for env in remove_envs:
name = env.name
if env.active:
tty.die(f"Environment {name} can't be removed while activated.")
env.destroy()
tty.msg(f"Successfully removed environment '{name}'")
for bad_env_name in bad_envs:
shutil.rmtree(
spack.environment.environment.environment_dir_from_name(bad_env_name, exists_ok=True)
)
tty.msg(f"Successfully removed environment '{bad_env_name}'")
_env_untrack_or_remove(
env_names=args.rm_env, remove=True, force=args.force, yes_to_all=args.yes_to_all
)
#


@@ -17,7 +17,8 @@
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "list and search installed packages"
section = "basic"
@@ -137,21 +138,22 @@ def setup_parser(subparser):
subparser.add_argument(
"--loaded", action="store_true", help="show only packages loaded in the user environment"
)
subparser.add_argument(
only_missing_or_deprecated = subparser.add_mutually_exclusive_group()
only_missing_or_deprecated.add_argument(
"-M",
"--only-missing",
action="store_true",
dest="only_missing",
help="show only missing dependencies",
)
only_missing_or_deprecated.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--deprecated",
action="store_true",
help="show deprecated packages as well as installed specs",
)
subparser.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--install-tree",
action="store",
@@ -165,14 +167,23 @@ def setup_parser(subparser):
def query_arguments(args):
# Set up query arguments.
installed = []
if not (args.only_missing or args.only_deprecated):
installed.append(InstallStatuses.INSTALLED)
if (args.deprecated or args.only_deprecated) and not args.only_missing:
installed.append(InstallStatuses.DEPRECATED)
if (args.missing or args.only_missing) and not args.only_deprecated:
installed.append(InstallStatuses.MISSING)
if args.only_missing and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-missing with --deprecated or --missing")
if args.only_deprecated and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-deprecated with --deprecated or --missing")
installed = InstallRecordStatus.INSTALLED
if args.only_missing:
installed = InstallRecordStatus.MISSING
elif args.only_deprecated:
installed = InstallRecordStatus.DEPRECATED
if args.missing:
installed |= InstallRecordStatus.MISSING
if args.deprecated:
installed |= InstallRecordStatus.DEPRECATED
predicate_fn = None
if args.unknown:
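A short sketch of the flag composition this new logic relies on, assuming InstallRecordStatus behaves like a standard enum.Flag (the class below is an illustrative stand-in, not the real definition):

    import enum

    class InstallRecordStatus(enum.Flag):  # illustrative stand-in
        INSTALLED = enum.auto()
        DEPRECATED = enum.auto()
        MISSING = enum.auto()

    installed = InstallRecordStatus.INSTALLED
    installed |= InstallRecordStatus.DEPRECATED  # --deprecated adds a second status
    assert InstallRecordStatus.DEPRECATED in installed  # single membership test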
@@ -222,11 +233,9 @@ def decorator(spec, fmt):
def display_env(env, args, decorator, results):
"""Display extra find output when running in an environment.
Find in an environment outputs 2 or 3 sections:
1. Root specs
2. Concretized roots (if asked for with -c)
3. Installed specs
In an environment, `spack find` outputs a preliminary section
showing the root specs of the environment (this is in addition
to the section listing out specs matching the query parameters).
"""
tty.msg("In environment %s" % env.name)
@@ -299,6 +308,56 @@ def root_decorator(spec, string):
print()
def _find_query(args, env):
q_args = query_arguments(args)
concretized_but_not_installed = list()
if env:
all_env_specs = env.all_specs()
if args.constraint:
init_specs = cmd.parse_specs(args.constraint)
env_specs = env.all_matching_specs(*init_specs)
else:
env_specs = all_env_specs
spec_hashes = set(x.dag_hash() for x in env_specs)
specs_meeting_q_args = set(spack.store.STORE.db.query(hashes=spec_hashes, **q_args))
results = list()
with spack.store.STORE.db.read_transaction():
for spec in env_specs:
if not spec.installed:
concretized_but_not_installed.append(spec)
if spec in specs_meeting_q_args:
results.append(spec)
else:
results = args.specs(**q_args)
# use groups by default except with format.
if args.groups is None:
args.groups = not args.format
# Exit early with an error code if no package matches the constraint
if concretized_but_not_installed and args.show_concretized:
pass
elif results:
pass
elif args.constraint:
raise cmd.NoSpecMatches()
# If tags have been specified on the command line, filter by tags
if args.tags:
packages_with_tags = spack.repo.PATH.packages_with_tags(*args.tags)
results = [x for x in results if x.name in packages_with_tags]
concretized_but_not_installed = [
x for x in concretized_but_not_installed if x.name in packages_with_tags
]
if args.loaded:
results = cmd.filter_loaded_specs(results)
return results, concretized_but_not_installed
def find(parser, args):
env = ev.active_environment()
@@ -307,34 +366,12 @@ def find(parser, args):
if not env and args.show_concretized:
tty.die("-c / --show-concretized requires an active environment")
if env:
if args.constraint:
init_specs = spack.cmd.parse_specs(args.constraint)
results = env.all_matching_specs(*init_specs)
else:
results = env.all_specs()
else:
q_args = query_arguments(args)
results = args.specs(**q_args)
decorator = make_env_decorator(env) if env else lambda s, f: f
# use groups by default except with format.
if args.groups is None:
args.groups = not args.format
# Exit early with an error code if no package matches the constraint
if not results and args.constraint:
constraint_str = " ".join(str(s) for s in args.constraint_specs)
tty.die(f"No package matches the query: {constraint_str}")
# If tags have been specified on the command line, filter by tags
if args.tags:
packages_with_tags = spack.repo.PATH.packages_with_tags(*args.tags)
results = [x for x in results if x.name in packages_with_tags]
if args.loaded:
results = spack.cmd.filter_loaded_specs(results)
try:
results, concretized_but_not_installed = _find_query(args, env)
except cmd.NoSpecMatches:
# Note: this uses args.constraint vs. args.constraint_specs because
# the latter only exists if you call args.specs()
tty.die(f"No package matches the query: {' '.join(args.constraint)}")
if args.install_status or args.show_concretized:
status_fn = spack.spec.Spec.install_status
@@ -345,14 +382,16 @@ def find(parser, args):
if args.json:
cmd.display_specs_as_json(results, deps=args.deps)
else:
decorator = make_env_decorator(env) if env else lambda s, f: f
if not args.format:
if env:
display_env(env, args, decorator, results)
if not args.only_roots:
display_results = results
if not args.show_concretized:
display_results = list(x for x in results if x.installed)
display_results = list(results)
if args.show_concretized:
display_results += concretized_but_not_installed
cmd.display_specs(
display_results, args, decorator=decorator, all_headers=True, status_fn=status_fn
)
@@ -370,13 +409,9 @@ def find(parser, args):
concretized_suffix += " (show with `spack find -c`)"
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(
list(x for x in results if x.installed), pkg_type, suffix=installed_suffix
)
cmd.print_how_many_pkgs(results, pkg_type, suffix=installed_suffix)
if env:
spack.cmd.print_how_many_pkgs(
list(x for x in results if not x.installed),
"concretized",
suffix=concretized_suffix,
cmd.print_how_many_pkgs(
concretized_but_not_installed, "concretized", suffix=concretized_suffix
)


@@ -8,7 +8,7 @@
import tempfile
import spack.binary_distribution
import spack.mirror
import spack.mirrors.mirror
import spack.paths
import spack.stage
import spack.util.gpg
@@ -217,11 +217,11 @@ def gpg_publish(args):
mirror = None
if args.directory:
url = spack.util.url.path_to_file_url(args.directory)
mirror = spack.mirror.Mirror(url, url)
mirror = spack.mirrors.mirror.Mirror(url, url)
elif args.mirror_name:
mirror = spack.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
mirror = spack.mirrors.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
elif args.mirror_url:
mirror = spack.mirror.Mirror(args.mirror_url, args.mirror_url)
mirror = spack.mirrors.mirror.Mirror(args.mirror_url, args.mirror_url)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
spack.binary_distribution._url_push_keys(


@@ -78,8 +78,8 @@
boxlib @B{dim=2} boxlib built for 2 dimensions
libdwarf @g{%intel} ^libelf@g{%gcc}
libdwarf, built with intel compiler, linked to libelf built with gcc
mvapich2 @g{%pgi} @B{fabrics=psm,mrail,sock}
mvapich2, built with pgi compiler, with support for multiple fabrics
mvapich2 @g{%gcc} @B{fabrics=psm,mrail,sock}
mvapich2, built with gcc compiler, with support for multiple fabrics
"""


@@ -11,6 +11,7 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.builder
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
@@ -202,11 +203,13 @@ def print_namespace(pkg, args):
def print_phases(pkg, args):
"""output installation phases"""
if hasattr(pkg.builder, "phases") and pkg.builder.phases:
builder = spack.builder.create(pkg)
if hasattr(builder, "phases") and builder.phases:
color.cprint("")
color.cprint(section_title("Installation Phases:"))
phase_str = ""
for phase in pkg.builder.phases:
for phase in builder.phases:
phase_str += " {0}".format(phase)
color.cprint(phase_str)


@@ -14,6 +14,7 @@
from llnl.util import lang, tty
import spack.cmd
import spack.concretize
import spack.config
import spack.environment as ev
import spack.paths
@@ -451,7 +452,7 @@ def concrete_specs_from_file(args):
else:
s = spack.spec.Spec.from_json(f)
concretized = s.concretized()
concretized = spack.concretize.concretized(s)
if concretized.dag_hash() != s.dag_hash():
msg = 'skipped invalid file "{0}". '
msg += "The file does not contain a concrete spec."


@@ -8,6 +8,7 @@
from llnl.path import convert_to_posix_path
import spack.concretize
import spack.paths
import spack.util.executable
from spack.spec import Spec
@@ -66,8 +67,7 @@ def make_installer(parser, args):
"""
if sys.platform == "win32":
output_dir = args.output_dir
cmake_spec = Spec("cmake")
cmake_spec.concretize()
cmake_spec = spack.concretize.concretized(Spec("cmake"))
cmake_path = os.path.join(cmake_spec.prefix, "bin", "cmake.exe")
cpack_path = os.path.join(cmake_spec.prefix, "bin", "cpack.exe")
spack_source = args.spack_source


@@ -10,7 +10,8 @@
import spack.cmd
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "mark packages as explicitly or implicitly installed"
section = "admin"
@@ -67,8 +68,7 @@ def find_matching_specs(specs, allow_multiple_matches=False):
has_errors = False
for spec in specs:
install_query = [InstallStatuses.INSTALLED]
matching = spack.store.STORE.db.query_local(spec, installed=install_query)
matching = spack.store.STORE.db.query_local(spec, installed=InstallRecordStatus.INSTALLED)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't
if not allow_multiple_matches and len(matching) > 1:


@@ -14,7 +14,8 @@
import spack.concretize
import spack.config
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.repo
import spack.spec
import spack.util.web as web_util
@@ -231,31 +232,133 @@ def setup_parser(subparser):
)
def _configure_access_pair(
args, id_tok, id_variable_tok, secret_tok, secret_variable_tok, default=None
):
"""Configure the access_pair options"""
# Check if any of the arguments are set to update this access_pair.
# If none are set, then skip computing the new access pair
args_id = getattr(args, id_tok)
args_id_variable = getattr(args, id_variable_tok)
args_secret = getattr(args, secret_tok)
args_secret_variable = getattr(args, secret_variable_tok)
if not any([args_id, args_id_variable, args_secret, args_secret_variable]):
return None
def _default_value(id_):
if isinstance(default, list):
return default[0] if id_ == "id" else default[1]
elif isinstance(default, dict):
return default.get(id_)
else:
return None
def _default_variable(id_):
if isinstance(default, dict):
return default.get(id_ + "_variable")
else:
return None
id_ = None
id_variable = None
secret = None
secret_variable = None
# Get the value (or default) for each part only if its inverse argument was not set
if not args_id_variable:
id_ = getattr(args, id_tok) or _default_value("id")
if not args_id:
id_variable = getattr(args, id_variable_tok) or _default_variable("id")
if not args_secret_variable:
secret = getattr(args, secret_tok) or _default_value("secret")
if not args_secret:
secret_variable = getattr(args, secret_variable_tok) or _default_variable("secret")
if (id_ or id_variable) and (secret or secret_variable):
if secret:
if not id_:
raise SpackError("Cannot add mirror with a variable id and text secret")
return [id_, secret]
else:
return dict(
[
(("id", id_) if id_ else ("id_variable", id_variable)),
("secret_variable", secret_variable),
]
)
else:
if id_ or id_variable or secret or secret_variable is not None:
id_arg_tok = id_tok.replace("_", "-")
secret_arg_tok = secret_tok.replace("_", "-")
tty.warn(
"Expected both parts of the access pair to be specified. "
f"(i.e. --{id_arg_tok} and --{secret_arg_tok})"
)
return None
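Illustrative inputs and the two return shapes of _configure_access_pair: a plain [id, secret] list when both parts are literal text, and a dict once an environment variable is involved (the values below are hypothetical):

    from types import SimpleNamespace

    # both parts given as literal text -> list form
    args = SimpleNamespace(
        s3_access_key_id="AKIAEXAMPLE",
        s3_access_key_id_variable=None,
        s3_access_key_secret="hunter2",
        s3_access_key_secret_variable=None,
    )
    pair = _configure_access_pair(
        args,
        "s3_access_key_id",
        "s3_access_key_id_variable",
        "s3_access_key_secret",
        "s3_access_key_secret_variable",
    )
    assert pair == ["AKIAEXAMPLE", "hunter2"]

    # secret supplied via an environment variable -> dict form
    args.s3_access_key_secret = None
    args.s3_access_key_secret_variable = "S3_SECRET_ENV_VAR"
    pair = _configure_access_pair(
        args,
        "s3_access_key_id",
        "s3_access_key_id_variable",
        "s3_access_key_secret",
        "s3_access_key_secret_variable",
    )
    assert pair == {"id": "AKIAEXAMPLE", "secret_variable": "S3_SECRET_ENV_VAR"}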
def mirror_add(args):
"""add a mirror to Spack"""
if (
args.s3_access_key_id
or args.s3_access_key_secret
or args.s3_access_token
or args.s3_access_key_id_variable
or args.s3_access_key_secret_variable
or args.s3_access_token_variable
or args.s3_profile
or args.s3_endpoint_url
or args.type
or args.oci_username
or args.oci_password
or args.oci_username_variable
or args.oci_password_variable
or args.autopush
or args.signed is not None
):
connection = {"url": args.url}
if args.s3_access_key_id and args.s3_access_key_secret:
connection["access_pair"] = [args.s3_access_key_id, args.s3_access_key_secret]
# S3 Connection
if args.s3_access_key_secret:
tty.warn(
"Configuring mirror secrets as plain text with --s3-access-key-secret is "
"deprecated. Use --s3-access-key-secret-variable instead"
)
if args.oci_password:
tty.warn(
"Configuring mirror secrets as plain text with --oci-password is deprecated. "
"Use --oci-password-variable instead"
)
access_pair = _configure_access_pair(
args,
"s3_access_key_id",
"s3_access_key_id_variable",
"s3_access_key_secret",
"s3_access_key_secret_variable",
)
if access_pair:
connection["access_pair"] = access_pair
if args.s3_access_token:
connection["access_token"] = args.s3_access_token
elif args.s3_access_token_variable:
connection["access_token_variable"] = args.s3_access_token_variable
if args.s3_profile:
connection["profile"] = args.s3_profile
if args.s3_endpoint_url:
connection["endpoint_url"] = args.s3_endpoint_url
if args.oci_username and args.oci_password:
connection["access_pair"] = [args.oci_username, args.oci_password]
# OCI Connection
access_pair = _configure_access_pair(
args, "oci_username", "oci_username_variable", "oci_password", "oci_password_variable"
)
if access_pair:
connection["access_pair"] = access_pair
if args.type:
connection["binary"] = "binary" in args.type
connection["source"] = "source" in args.type
@@ -263,15 +366,15 @@ def mirror_add(args):
connection["autopush"] = args.autopush
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirror.Mirror(connection, name=args.name)
mirror = spack.mirrors.mirror.Mirror(connection, name=args.name)
else:
mirror = spack.mirror.Mirror(args.url, name=args.name)
spack.mirror.add(mirror, args.scope)
mirror = spack.mirrors.mirror.Mirror(args.url, name=args.name)
spack.mirrors.utils.add(mirror, args.scope)
def mirror_remove(args):
"""remove a mirror by name"""
spack.mirror.remove(args.name, args.scope)
spack.mirrors.utils.remove(args.name, args.scope)
def _configure_mirror(args):
@@ -280,21 +383,40 @@ def _configure_mirror(args):
if args.name not in mirrors:
tty.die(f"No mirror found with name {args.name}.")
entry = spack.mirror.Mirror(mirrors[args.name], args.name)
entry = spack.mirrors.mirror.Mirror(mirrors[args.name], args.name)
direction = "fetch" if args.fetch else "push" if args.push else None
changes = {}
if args.url:
changes["url"] = args.url
if args.s3_access_key_id and args.s3_access_key_secret:
changes["access_pair"] = [args.s3_access_key_id, args.s3_access_key_secret]
default_access_pair = entry._get_value("access_pair", direction or "fetch")
# TODO: Init access_pair args with the fetch/push/base values in the current mirror state
access_pair = _configure_access_pair(
args,
"s3_access_key_id",
"s3_access_key_id_variable",
"s3_access_key_secret",
"s3_access_key_secret_variable",
default=default_access_pair,
)
if access_pair:
changes["access_pair"] = access_pair
if args.s3_access_token:
changes["access_token"] = args.s3_access_token
if args.s3_profile:
changes["profile"] = args.s3_profile
if args.s3_endpoint_url:
changes["endpoint_url"] = args.s3_endpoint_url
if args.oci_username and args.oci_password:
changes["access_pair"] = [args.oci_username, args.oci_password]
access_pair = _configure_access_pair(
args,
"oci_username",
"oci_username_variable",
"oci_password",
"oci_password_variable",
default=default_access_pair,
)
if access_pair:
changes["access_pair"] = access_pair
if getattr(args, "signed", None) is not None:
changes["signed"] = args.signed
if getattr(args, "autopush", None) is not None:
@@ -328,7 +450,7 @@ def mirror_set_url(args):
def mirror_list(args):
"""print out available mirrors to the console"""
mirrors = spack.mirror.MirrorCollection(scope=args.scope)
mirrors = spack.mirrors.mirror.MirrorCollection(scope=args.scope)
if not mirrors:
tty.msg("No mirrors configured.")
return
@@ -368,10 +490,10 @@ def concrete_specs_from_user(args):
def extend_with_additional_versions(specs, num_versions):
if num_versions == "all":
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
else:
mirror_specs = spack.mirror.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [x.concretized() for x in mirror_specs]
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [spack.concretize.concretized(x) for x in mirror_specs]
return mirror_specs
@@ -449,7 +571,7 @@ def concrete_specs_from_environment():
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
return mirror_specs
@@ -538,19 +660,21 @@ def _specs_and_action(args):
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
mirror_cache, mirror_stats = spack.mirrors.utils.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
for candidate in mirror_specs:
pkg_cls = spack.repo.PATH.get_pkg_class(candidate.name)
pkg_obj = pkg_cls(spack.spec.Spec(candidate))
mirror_stats.next_spec(pkg_obj.spec)
spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
spack.mirrors.utils.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
present, mirrored, error = spack.mirrors.utils.create(
path, mirror_specs, skip_unstable_versions
)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
@@ -560,7 +684,7 @@ def mirror_destroy(args):
mirror_url = None
if args.mirror_name:
result = spack.mirror.MirrorCollection().lookup(args.mirror_name)
result = spack.mirrors.mirror.MirrorCollection().lookup(args.mirror_name)
mirror_url = result.push_url
elif args.mirror_url:
mirror_url = args.mirror_url


@@ -19,6 +19,7 @@
import spack.modules
import spack.modules.common
import spack.repo
from spack.cmd import MultipleSpecsMatch, NoSpecMatches
from spack.cmd.common import arguments
description = "manipulate module files"
@@ -91,18 +92,6 @@ def add_loads_arguments(subparser):
arguments.add_common_arguments(subparser, ["recurse_dependencies"])
class MultipleSpecsMatch(Exception):
"""Raised when multiple specs match a constraint, in a context where
this is not allowed.
"""
class NoSpecMatches(Exception):
"""Raised when no spec matches a constraint, in a context where
this is not allowed.
"""
def one_spec_or_raise(specs):
"""Ensures exactly one spec has been selected, or raises the appropriate
exception.


@@ -8,6 +8,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.lmod


@@ -7,6 +7,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.tcl


@@ -82,14 +82,6 @@ def spec(parser, args):
if args.namespaces:
fmt = "{namespace}." + fmt
tree_kwargs = {
"cover": args.cover,
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"status_fn": install_status_fn if args.install_status else None,
}
# use a read transaction if we are getting install status for every
# spec in the DAG. This avoids repeatedly querying the DB.
tree_context = lang.nullcontext
@@ -99,46 +91,35 @@ def spec(parser, args):
env = ev.active_environment()
if args.specs:
input_specs = spack.cmd.parse_specs(args.specs)
concretized_specs = spack.cmd.parse_specs(args.specs, concretize=True)
specs = list(zip(input_specs, concretized_specs))
concrete_specs = spack.cmd.parse_specs(args.specs, concretize=True)
elif env:
env.concretize()
specs = env.concretized_specs()
if not args.format:
# environments are printed together in a combined tree() invocation,
# except when using --yaml or --json, which we print spec by spec below.
tree_kwargs["key"] = spack.traverse.by_dag_hash
tree_kwargs["hashes"] = args.long or args.very_long
print(spack.spec.tree([concrete for _, concrete in specs], **tree_kwargs))
return
concrete_specs = env.concrete_roots()
else:
tty.die("spack spec requires at least one spec or an active environment")
for input, output in specs:
# With --yaml or --json, just print the raw specs to output
if args.format:
# With --yaml, --json, or --format, just print the raw specs to output
if args.format:
for spec in concrete_specs:
if args.format == "yaml":
# use write because to_yaml already has a newline.
sys.stdout.write(output.to_yaml(hash=ht.dag_hash))
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif args.format == "json":
print(output.to_json(hash=ht.dag_hash))
print(spec.to_json(hash=ht.dag_hash))
else:
print(output.format(args.format))
continue
print(spec.format(args.format))
return
with tree_context():
# Only show the headers for input specs that are not concrete to avoid
# repeated output. This happens because parse_specs outputs concrete
# specs for `/hash` inputs.
if not input.concrete:
tree_kwargs["hashes"] = False # Always False for input spec
print("Input spec")
print("--------------------------------")
print(input.tree(**tree_kwargs))
print("Concretized")
print("--------------------------------")
tree_kwargs["hashes"] = args.long or args.very_long
print(output.tree(**tree_kwargs))
with tree_context():
print(
spack.spec.tree(
concrete_specs,
cover=args.cover,
format=fmt,
hashlen=None if args.very_long else 7,
show_types=args.types,
status_fn=install_status_fn if args.install_status else None,
hashes=args.long or args.very_long,
key=spack.traverse.by_dag_hash,
)
)


@@ -3,18 +3,21 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import ast
import os
import re
import sys
from itertools import zip_longest
from typing import Dict, List, Optional
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.filesystem import working_dir
import spack.paths
import spack.repo
import spack.util.git
from spack.util.executable import which
from spack.util.executable import Executable, which
description = "runs source code style checks on spack"
section = "developer"
@@ -36,10 +39,7 @@ def grouper(iterable, n, fillvalue=None):
#: double-check the results of other tools (if, e.g., --fix was provided)
#: The list maps an executable name to a method to ensure the tool is
#: bootstrapped or present in the environment.
tool_names = ["isort", "black", "flake8", "mypy"]
#: tools we run in spack style
tools = {}
tool_names = ["import", "isort", "black", "flake8", "mypy"]
#: warnings to ignore in mypy
mypy_ignores = [
@@ -61,14 +61,28 @@ def is_package(f):
#: decorator for adding tools to the list
class tool:
def __init__(self, name, required=False):
def __init__(self, name: str, required: bool = False, external: bool = True) -> None:
self.name = name
self.external = external
self.required = required
def __call__(self, fun):
tools[self.name] = (fun, self.required)
self.fun = fun
tools[self.name] = self
return fun
@property
def installed(self) -> bool:
return bool(which(self.name)) if self.external else True
@property
def executable(self) -> Optional[Executable]:
return which(self.name) if self.external else None
#: tools we run in spack style
tools: Dict[str, tool] = {}
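A hedged sketch of how the reworked decorator registers a check; the tool name and function below are hypothetical:

    @tool("example-linter", required=False)  # hypothetical external tool
    def run_example_linter(linter_cmd, file_list, args):
        # linter_cmd is which("example-linter") for external tools, None otherwise
        return 0

    t = tools["example-linter"]
    print(t.required, t.installed)  # installed is True only if `which` finds it on PATH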
def changed_files(base="develop", untracked=True, all_files=False, root=None):
"""Get list of changed files in the Spack repository.
@@ -176,22 +190,22 @@ def setup_parser(subparser):
"-t",
"--tool",
action="append",
help="specify which tools to run (default: %s)" % ",".join(tool_names),
help="specify which tools to run (default: %s)" % ", ".join(tool_names),
)
tool_group.add_argument(
"-s",
"--skip",
metavar="TOOL",
action="append",
help="specify tools to skip (choose from %s)" % ",".join(tool_names),
help="specify tools to skip (choose from %s)" % ", ".join(tool_names),
)
subparser.add_argument("files", nargs=argparse.REMAINDER, help="specific files to check")
def cwd_relative(path, args):
def cwd_relative(path, root, initial_working_dir):
"""Translate prefix-relative path to current working directory-relative."""
return os.path.relpath(os.path.join(args.root, path), args.initial_working_dir)
return os.path.relpath(os.path.join(root, path), initial_working_dir)
def rewrite_and_print_output(
@@ -201,7 +215,10 @@ def rewrite_and_print_output(
# print results relative to current working directory
def translate(match):
return replacement.format(cwd_relative(match.group(1), args), *list(match.groups()[1:]))
return replacement.format(
cwd_relative(match.group(1), args.root, args.initial_working_dir),
*list(match.groups()[1:]),
)
for line in output.split("\n"):
if not line:
@@ -220,7 +237,7 @@ def print_style_header(file_list, args, tools_to_run):
# translate modified paths to cwd_relative if needed
paths = [filename.strip() for filename in file_list]
if not args.root_relative:
paths = [cwd_relative(filename, args) for filename in paths]
paths = [cwd_relative(filename, args.root, args.initial_working_dir) for filename in paths]
tty.msg("Modified files", *paths)
sys.stdout.flush()
@@ -306,8 +323,6 @@ def process_files(file_list, is_args):
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = (
"--rm",
"spack",
"--rm",
"spack.pkgkit",
"--rm",
@@ -352,17 +367,137 @@ def run_black(black_cmd, file_list, args):
return returncode
def _module_part(root: str, expr: str):
parts = expr.split(".")
# spack.pkg is for repositories, don't try to resolve it here.
if ".".join(parts[:2]) == spack.repo.ROOT_PYTHON_NAMESPACE:
return None
while parts:
f1 = os.path.join(root, "lib", "spack", *parts) + ".py"
f2 = os.path.join(root, "lib", "spack", *parts, "__init__.py")
if (
os.path.exists(f1)
# ensure case sensitive match
and f"{parts[-1]}.py" in os.listdir(os.path.dirname(f1))
or os.path.exists(f2)
):
return ".".join(parts)
parts.pop()
return None
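For example, a sketch assuming a standard prefix layout in which lib/spack/spack/util/executable.py exists:

    import spack.paths

    # "spack.util.executable.which" is an attribute access, not a module, so the
    # trailing component is peeled off until an actual module file is found
    assert _module_part(spack.paths.prefix, "spack.util.executable.which") == "spack.util.executable"
    # package expressions under spack.pkg are intentionally left unresolved
    assert _module_part(spack.paths.prefix, "spack.pkg.builtin.zlib") is None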
def _run_import_check(
file_list: List[str],
*,
fix: bool,
root_relative: bool,
root=spack.paths.prefix,
working_dir=spack.paths.prefix,
out=sys.stdout,
):
if sys.version_info < (3, 9):
print("import check requires Python 3.9 or later")
return 0
is_use = re.compile(r"(?<!from )(?<!import )(?:llnl|spack)\.[a-zA-Z0-9_\.]+")
# redundant imports followed by a `# comment` are ignored, because there can be a legitimate
# reason to import a module: to execute module-scope init code, or to deal with circular imports.
is_abs_import = re.compile(r"^import ((?:llnl|spack)\.[a-zA-Z0-9_\.]+)$", re.MULTILINE)
exit_code = 0
for file in file_list:
to_add = set()
to_remove = []
pretty_path = file if root_relative else cwd_relative(file, root, working_dir)
try:
with open(file, "r") as f:
contents = f.read()
parsed = ast.parse(contents)
except Exception:
exit_code = 1
print(f"{pretty_path}: could not parse", file=out)
continue
for m in is_abs_import.finditer(contents):
if contents.count(m.group(1)) == 1:
to_remove.append(m.group(0))
exit_code = 1
print(f"{pretty_path}: redundant import: {m.group(1)}", file=out)
# Clear all strings to avoid matching comments/strings etc.
for node in ast.walk(parsed):
if isinstance(node, ast.Constant) and isinstance(node.value, str):
node.value = ""
filtered_contents = ast.unparse(parsed) # novermin
for m in is_use.finditer(filtered_contents):
module = _module_part(root, m.group(0))
if not module or module in to_add:
continue
if re.search(rf"import {re.escape(module)}\b(?!\.)", contents):
continue
to_add.add(module)
exit_code = 1
print(f"{pretty_path}: missing import: {module} ({m.group(0)})", file=out)
if not fix or (not to_add and not to_remove):
continue
with open(file, "r") as f:
lines = f.readlines()
if to_add:
# insert missing imports before the first import, delegate ordering to isort
for node in parsed.body:
if isinstance(node, (ast.Import, ast.ImportFrom)):
first_line = node.lineno
break
else:
print(f"{pretty_path}: could not fix", file=out)
continue
lines.insert(first_line, "\n".join(f"import {x}" for x in to_add) + "\n")
new_contents = "".join(lines)
# remove redundant imports
for statement in to_remove:
new_contents = new_contents.replace(f"{statement}\n", "")
with open(file, "w") as f:
f.write(new_contents)
return exit_code
@tool("import", external=False)
def run_import_check(import_check_cmd, file_list, args):
exit_code = _run_import_check(
file_list,
fix=args.fix,
root_relative=args.root_relative,
root=args.root,
working_dir=args.initial_working_dir,
)
print_tool_result("import", exit_code)
return exit_code
def validate_toolset(arg_value):
"""Validate --tool and --skip arguments (sets of optionally comma-separated tools)."""
tools = set(",".join(arg_value).split(",")) # allow args like 'isort,flake8'
for tool in tools:
if tool not in tool_names:
tty.die("Invaild tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
tty.die("Invalid tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
return tools
def missing_tools(tools_to_run):
return [t for t in tools_to_run if which(t) is None]
def missing_tools(tools_to_run: List[str]) -> List[str]:
return [t for t in tools_to_run if not tools[t].installed]
def _bootstrap_dev_dependencies():
@@ -417,9 +552,9 @@ def prefix_relative(path):
print_style_header(file_list, args, tools_to_run)
for tool_name in tools_to_run:
run_function, required = tools[tool_name]
tool = tools[tool_name]
print_tool_header(tool_name)
return_code |= run_function(which(tool_name), file_list, args)
return_code |= tool.fun(tool.executable, file_list, args)
if return_code == 0:
tty.msg(color.colorize("@*{spack style checks were clean}"))


@@ -24,7 +24,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.22"
tutorial_branch = "releases/v0.23"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")


@@ -17,7 +17,8 @@
import spack.store
import spack.traverse as traverse
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "remove installed packages"
section = "build"
@@ -99,12 +100,14 @@ def find_matching_specs(
hashes = env.all_hashes() if env else None
# List of specs that match expressions given via command line
specs_from_cli: List["spack.spec.Spec"] = []
specs_from_cli: List[spack.spec.Spec] = []
has_errors = False
for spec in specs:
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
matching = spack.store.STORE.db.query_local(
spec, hashes=hashes, installed=install_query, origin=origin
spec,
hashes=hashes,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
origin=origin,
)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't

View File

@@ -4,20 +4,23 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import hashlib
import itertools
import json
import os
import platform
import re
import shutil
import sys
import tempfile
from typing import List, Optional, Sequence
from typing import Dict, List, Optional, Sequence
import llnl.path
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
import spack.caches
import spack.error
import spack.schema.environment
import spack.spec
@@ -26,6 +29,7 @@
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
from spack.util.file_cache import FileCache
__all__ = ["Compiler"]
@@ -34,7 +38,7 @@
@llnl.util.lang.memoized
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()) -> str:
"""Invokes the compiler at a given path passing a single
version argument and returns the output.
@@ -57,7 +61,7 @@ def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
return output
def get_compiler_version_output(compiler_path, *args, **kwargs):
def get_compiler_version_output(compiler_path, *args, **kwargs) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
@@ -290,6 +294,7 @@ def __init__(
self.environment = environment or {}
self.extra_rpaths = extra_rpaths or []
self.enable_implicit_rpaths = enable_implicit_rpaths
self.cache = COMPILER_CACHE
self.cc = paths[0]
self.cxx = paths[1]
@@ -390,15 +395,11 @@ def real_version(self):
E.g. C++11 flag checks.
"""
if not self._real_version:
try:
real_version = spack.version.Version(self.get_real_version())
if real_version == spack.version.Version("unknown"):
return self.version
self._real_version = real_version
except spack.util.executable.ProcessError:
self._real_version = self.version
return self._real_version
real_version_str = self.cache.get(self).real_version
if not real_version_str or real_version_str == "unknown":
return self.version
return spack.version.StandardVersion.from_string(real_version_str)
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
@@ -427,6 +428,11 @@ def default_dynamic_linker(self) -> Optional[str]:
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
# technically this should be testing the target platform of the compiler, but we don't have
# that, so stick to host platform for now.
if sys.platform in ("darwin", "win32"):
return None
dynamic_linker = self.default_dynamic_linker
if not dynamic_linker:
@@ -445,19 +451,23 @@ def required_libs(self):
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
if not hasattr(self, "_compile_c_source_output"):
self._compile_c_source_output = self._compile_dummy_c_source()
return self._compile_c_source_output
return self.cache.get(self).c_compiler_output
def _compile_dummy_c_source(self) -> Optional[str]:
cc = self.cc if self.cc else self.cxx
if self.cc:
cc = self.cc
ext = "c"
else:
cc = self.cxx
ext = "cc"
if not cc or not self.verbose_flag:
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
fout = os.path.join(tmpdir, "output")
fin = os.path.join(tmpdir, "main.c")
fin = os.path.join(tmpdir, f"main.{ext}")
with open(fin, "w") as csource:
csource.write(
@@ -559,7 +569,7 @@ def fc_pic_flag(self):
# Note: This is not a class method. The class methods are used to detect
# compilers on PATH based systems, and do not set up the run environment of
# the compiler. This method can be called on `module` based systems as well
def get_real_version(self):
def get_real_version(self) -> str:
"""Query the compiler for its version.
This is the "real" compiler version, regardless of what is in the
@@ -569,14 +579,17 @@ def get_real_version(self):
modifications) to enable the compiler to run properly on any platform.
"""
cc = spack.util.executable.Executable(self.cc)
with self.compiler_environment():
output = cc(
self.version_argument,
output=str,
error=str,
ignore_errors=tuple(self.ignore_version_errors),
)
return self.extract_version_from_output(output)
try:
with self.compiler_environment():
output = cc(
self.version_argument,
output=str,
error=str,
ignore_errors=tuple(self.ignore_version_errors),
)
return self.extract_version_from_output(output)
except spack.util.executable.ProcessError:
return "unknown"
@property
def prefix(self):
@@ -603,7 +616,7 @@ def default_version(cls, cc):
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output):
def extract_version_from_output(cls, output: str) -> str:
"""Extracts the version from compiler's output."""
match = re.search(cls.version_regex, output)
return match.group(1) if match else "unknown"
@@ -732,3 +745,106 @@ def __init__(self, compiler, feature, flag_name, ver_string=None):
)
+ " implement the {0} property and submit a pull request or issue.".format(flag_name),
)
class CompilerCacheEntry:
"""Deserialized cache entry for a compiler"""
__slots__ = ["c_compiler_output", "real_version"]
def __init__(self, c_compiler_output: Optional[str], real_version: str):
self.c_compiler_output = c_compiler_output
self.real_version = real_version
@classmethod
def from_dict(cls, data: Dict[str, Optional[str]]):
if not isinstance(data, dict):
raise ValueError(f"Invalid {cls.__name__} data")
c_compiler_output = data.get("c_compiler_output")
real_version = data.get("real_version")
if not isinstance(real_version, str) or not isinstance(
c_compiler_output, (str, type(None))
):
raise ValueError(f"Invalid {cls.__name__} data")
return cls(c_compiler_output, real_version)
class CompilerCache:
"""Base class for compiler output cache. Default implementation does not cache anything."""
def value(self, compiler: Compiler) -> Dict[str, Optional[str]]:
return {
"c_compiler_output": compiler._compile_dummy_c_source(),
"real_version": compiler.get_real_version(),
}
def get(self, compiler: Compiler) -> CompilerCacheEntry:
return CompilerCacheEntry.from_dict(self.value(compiler))
class FileCompilerCache(CompilerCache):
"""Cache for compiler output, which is used to determine implicit link paths, the default libc
version, and the compiler version."""
name = os.path.join("compilers", "compilers.json")
def __init__(self, cache: "FileCache") -> None:
self.cache = cache
self.cache.init_entry(self.name)
self._data: Dict[str, Dict[str, Optional[str]]] = {}
def _get_entry(self, key: str) -> Optional[CompilerCacheEntry]:
try:
return CompilerCacheEntry.from_dict(self._data[key])
except ValueError:
del self._data[key]
except KeyError:
pass
return None
def get(self, compiler: Compiler) -> CompilerCacheEntry:
# Cache hit
try:
with self.cache.read_transaction(self.name) as f:
assert f is not None
self._data = json.loads(f.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
key = self._key(compiler)
value = self._get_entry(key)
if value is not None:
return value
# Cache miss
with self.cache.write_transaction(self.name) as (old, new):
try:
assert old is not None
self._data = json.loads(old.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
# Use cache entry that may have been created by another process in the meantime.
entry = self._get_entry(key)
# Finally compute the cache entry
if entry is None:
self._data[key] = self.value(compiler)
entry = CompilerCacheEntry.from_dict(self._data[key])
new.write(json.dumps(self._data, separators=(",", ":")))
return entry
def _key(self, compiler: Compiler) -> str:
as_bytes = json.dumps(compiler.to_dict(), separators=(",", ":")).encode("utf-8")
return hashlib.sha256(as_bytes).hexdigest()
def _make_compiler_cache():
return FileCompilerCache(spack.caches.MISC_CACHE)
COMPILER_CACHE: CompilerCache = llnl.util.lang.Singleton(_make_compiler_cache) # type: ignore
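The on-disk cache is a flat JSON object keyed by a sha256 of the compiler's dict form; a sketch of one entry and how it round-trips (the values are illustrative):

    entry_dict = {
        "c_compiler_output": "...verbose output captured from the dummy compile...",
        "real_version": "13.2.0",
    }
    entry = CompilerCacheEntry.from_dict(entry_dict)
    assert entry.real_version == "13.2.0"
    # real_version may legitimately be "unknown" when the compiler cannot be run;
    # Compiler.real_version then falls back to the configured version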


@@ -116,5 +116,5 @@ def fflags(self):
def _handle_default_flag_addtions(self):
# This is a known issue for AOCC 3.0 see:
# https://developer.amd.com/wp-content/resources/AOCC-3.0-Install-Guide.pdf
if self.real_version.satisfies(ver("3.0.0")):
if self.version.satisfies(ver("3.0.0")):
return "-Wno-unused-command-line-argument " "-mllvm -eliminate-similar-expr=false"


@@ -16,7 +16,6 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf_r", os.path.join("xl_r", "xlf_r")),
("xlf", os.path.join("xl", "xlf")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]
@@ -25,7 +24,6 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf90_r", os.path.join("xl_r", "xlf90_r")),
("xlf90", os.path.join("xl", "xlf90")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]


@@ -124,8 +124,8 @@ def setup_custom_environment(self, pkg, env):
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")


@@ -155,10 +155,10 @@ def setup_custom_environment(self, pkg, env):
# icx+icpx+ifx or icx+icpx+ifort. But to be on the safe side (some users may
# want to try to swap icpx against icpc, for example), and since the Intel LLVM
# compilers accept these diag-disable flags, we apply them for all compilers.
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
# 2024 release bumped the libsycl version because of an ABI


@@ -1,77 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
class Pgi(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("pgi", "pgcc"),
"cxx": os.path.join("pgi", "pgc++"),
"f77": os.path.join("pgi", "pgfortran"),
"fc": os.path.join("pgi", "pgfortran"),
}
version_argument = "-V"
ignore_version_errors = [2] # `pgcc -V` on PowerPC annoyingly returns 2
version_regex = r"pg[^ ]* ([0-9.]+)-[0-9]+ (LLVM )?[^ ]+ target on "
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return ["-g", "-gopt"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-O4"]
@property
def openmp_flag(self):
return "-mp"
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cc_pic_flag(self):
return "-fpic"
@property
def cxx_pic_flag(self):
return "-fpic"
@property
def f77_pic_flag(self):
return "-fpic"
@property
def fc_pic_flag(self):
return "-fpic"
required_libs = ["libpgc", "libpgf90"]
@property
def c99_flag(self):
if self.real_version >= Version("12.10"):
return "-c99"
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag", "< 12.10")
@property
def c11_flag(self):
if self.real_version >= Version("15.3"):
return "-c11"
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 15.3")
@property
def stdcxx_libs(self):
return ("-pgc++libs",)


@@ -51,11 +51,10 @@ def concretize_specs_together(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
from spack.solver.asp import Solver
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
result = Solver().solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
return [s.copy() for s in result.specs]
@@ -90,7 +89,7 @@ def concretize_together_when_possible(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
from spack.solver.asp import Solver
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
old_concrete_to_abstract = {
@@ -98,9 +97,8 @@ def concretize_together_when_possible(
}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
allow_deprecated = spack.config.get("config:deprecated", False)
for result in solver.solve_in_rounds(
for result in Solver().solve_in_rounds(
to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
@@ -124,7 +122,7 @@ def concretize_separately(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.bootstrap
from spack.bootstrap import ensure_bootstrap_configuration, ensure_clingo_importable_or_raise
to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
args = [
@@ -134,8 +132,8 @@ def concretize_separately(
]
ret = [(i, abstract) for i, abstract in enumerate(to_concretize) if abstract.concrete]
# Ensure we don't try to bootstrap clingo in parallel
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
with ensure_bootstrap_configuration():
ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -160,6 +158,11 @@ def concretize_separately(
# TODO: support parallel concretization on macOS and Windows
num_procs = min(len(args), spack.config.determine_number_of_jobs(parallel=True))
msg = "Starting concretization"
if sys.platform not in ("darwin", "win32") and num_procs > 1:
msg += f" pool with {num_procs} processes"
tty.msg(msg)
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1
@@ -185,10 +188,50 @@ def _concretize_task(packed_arguments: Tuple[int, str, TestsType]) -> Tuple[int,
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = Spec(spec_str).concretized(tests=tests)
spec = concretized(Spec(spec_str), tests=tests)
return index, spec, time.time() - start
def concretized(spec: Spec, tests: Union[bool, Iterable[str]] = False) -> Spec:
"""Return a concretized copy of the given spec.
Args:
tests: if False disregard 'test' dependencies, if a list of names activate them for
the packages in the list, if True activate 'test' dependencies for all packages.
"""
from spack.solver.asp import Solver, SpecBuilder
spec.replace_hash()
for node in spec.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
if spec._concrete:
return spec.copy()
allow_deprecated = spack.config.get("config:deprecated", False)
result = Solver().solve([spec], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = spec.name
# TODO: Consolidate this code with similar code in solve.py
if spec.virtual:
providers = [s.name for s in answer.values() if s.package.provides(name)]
name = providers[0]
node = SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
return concretized
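A sketch of the anonymous-spec guard in action (the spec string is illustrative):

    import spack.concretize
    import spack.error
    from spack.spec import Spec

    try:
        spack.concretize.concretized(Spec("@1.0"))  # version constraint, but no name
    except spack.error.SpecError as e:
        print(e)  # cannot concretize an anonymous spec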
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""


@@ -69,6 +69,8 @@
from spack.error import SpackError
from spack.util.crypto import bit_length
from .enums import InstallRecordStatus
# TODO: Provide an API for automatically retrying a build after detecting and
# TODO: clearing a failure.
@@ -160,36 +162,12 @@ def converter(self, spec_like, *args, **kwargs):
return converter
class InstallStatus(str):
pass
class InstallStatuses:
INSTALLED = InstallStatus("installed")
DEPRECATED = InstallStatus("deprecated")
MISSING = InstallStatus("missing")
@classmethod
def canonicalize(cls, query_arg):
if query_arg is True:
return [cls.INSTALLED]
if query_arg is False:
return [cls.MISSING]
if query_arg is any:
return [cls.INSTALLED, cls.DEPRECATED, cls.MISSING]
if isinstance(query_arg, InstallStatus):
return [query_arg]
try:
statuses = list(query_arg)
if all(isinstance(x, InstallStatus) for x in statuses):
return statuses
except TypeError:
pass
raise TypeError(
"installation query must be `any`, boolean, "
"InstallStatus, or iterable of InstallStatus"
)
def normalize_query(installed: Union[bool, InstallRecordStatus]) -> InstallRecordStatus:
if installed is True:
installed = InstallRecordStatus.INSTALLED
elif installed is False:
installed = InstallRecordStatus.MISSING
return installed
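The boolean shorthands map onto the new flag enum; any flag value passes through unchanged, as in this sketch:

```python
from spack.enums import InstallRecordStatus

assert normalize_query(True) is InstallRecordStatus.INSTALLED
assert normalize_query(False) is InstallRecordStatus.MISSING
flags = InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED
assert normalize_query(flags) is flags  # flag values pass through untouched
```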
class InstallRecord:
@@ -227,8 +205,8 @@ def __init__(
installation_time: Optional[float] = None,
deprecated_for: Optional[str] = None,
in_buildcache: bool = False,
origin=None,
):
origin: Optional[str] = None,
) -> None:
self.spec = spec
self.path = str(path) if path else None
self.installed = bool(installed)
@@ -239,14 +217,12 @@ def __init__(
self.in_buildcache = in_buildcache
self.origin = origin
def install_type_matches(self, installed):
installed = InstallStatuses.canonicalize(installed)
def install_type_matches(self, installed: InstallRecordStatus) -> bool:
if self.installed:
return InstallStatuses.INSTALLED in installed
return InstallRecordStatus.INSTALLED in installed
elif self.deprecated_for:
return InstallStatuses.DEPRECATED in installed
else:
return InstallStatuses.MISSING in installed
return InstallRecordStatus.DEPRECATED in installed
return InstallRecordStatus.MISSING in installed
def to_dict(self, include_fields=DEFAULT_INSTALL_RECORD_FIELDS):
rec_dict = {}
@@ -1396,7 +1372,13 @@ def installed_extensions_for(self, extendee_spec: "spack.spec.Spec"):
if spec.package.extends(extendee_spec):
yield spec.package
def _get_by_hash_local(self, dag_hash, default=None, installed=any):
def _get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
installed = normalize_query(installed)
# hash is a full hash and is in the data somewhere
if dag_hash in self._data:
rec = self._data[dag_hash]
@@ -1405,8 +1387,7 @@ def _get_by_hash_local(self, dag_hash, default=None, installed=any):
else:
return default
# check if hash is a prefix of some installed (or previously
# installed) spec.
# check if hash is a prefix of some installed (or previously installed) spec.
matches = [
record.spec
for h, record in self._data.items()
@@ -1418,52 +1399,43 @@ def _get_by_hash_local(self, dag_hash, default=None, installed=any):
# nothing found
return default
def get_by_hash_local(self, dag_hash, default=None, installed=any):
def get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
"""Look up a spec in *this DB* by DAG hash, or by a DAG hash prefix.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed
specs in the search; if ``False`` only missing specs, and if
``any``, all specs in database. If an InstallStatus or iterable
of InstallStatus, returns specs whose install status
(installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an ``InstallRecordStatus`` flag.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
"""
with self.read_transaction():
return self._get_by_hash_local(dag_hash, default=default, installed=installed)
def get_by_hash(self, dag_hash, default=None, installed=any):
def get_by_hash(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
"""Look up a spec by DAG hash, or by a DAG hash prefix.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed specs in the search; if ``False``
only missing specs, and if ``any``, all specs in database. If an
InstallStatus or iterable of InstallStatus, returns specs whose install
status (installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an ``InstallRecordStatus`` flag.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
"""
@@ -1483,7 +1455,7 @@ def _query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1491,6 +1463,7 @@ def _query(
in_buildcache: Optional[bool] = None,
origin: Optional[str] = None,
) -> List["spack.spec.Spec"]:
installed = normalize_query(installed)
# Restrict the set of records over which we iterate first
matching_hashes = self._data
@@ -1560,7 +1533,7 @@ def query_local(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1620,7 +1593,7 @@ def query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1628,7 +1601,7 @@ def query(
hashes: Optional[List[str]] = None,
origin: Optional[str] = None,
install_tree: str = "all",
):
) -> List["spack.spec.Spec"]:
"""Queries the Spack database including all upstream databases.
Args:
@@ -1709,13 +1682,14 @@ def query(
)
results = list(local_results) + list(x for x in upstream_results if x not in local_results)
return sorted(results)
results.sort()
return results
def query_one(
self,
query_spec: Optional[Union[str, "spack.spec.Spec"]],
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
) -> Optional["spack.spec.Spec"]:
"""Query for exactly one spec that matches the query spec.

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Data structures that represent Spack's dependency relationships."""
from typing import Dict, List
from typing import Dict, List, Type
import spack.deptypes as dt
import spack.spec
@@ -38,7 +38,7 @@ class Dependency:
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: Type["spack.package_base.PackageBase"],
spec: "spack.spec.Spec",
depflag: dt.DepFlag = dt.DEFAULT,
):

View File

@@ -21,6 +21,7 @@ class OpenMpi(Package):
* ``conflicts``
* ``depends_on``
* ``extends``
* ``license``
* ``patch``
* ``provides``
* ``resource``
@@ -34,19 +35,19 @@ class OpenMpi(Package):
import collections.abc
import os.path
import re
from typing import TYPE_CHECKING, Any, Callable, List, Optional, Tuple, Union
from typing import Any, Callable, List, Optional, Tuple, Type, Union
import llnl.util.lang
import llnl.util.tty.color
import spack.deptypes as dt
import spack.fetch_strategy
import spack.package_base
import spack.patch
import spack.spec
import spack.util.crypto
import spack.variant
from spack.dependency import Dependency
from spack.directives_meta import DirectiveError, DirectiveMeta
from spack.fetch_strategy import from_kwargs
from spack.resource import Resource
from spack.version import (
GitVersion,
@@ -56,14 +57,10 @@ class OpenMpi(Package):
VersionLookupError,
)
if TYPE_CHECKING:
import spack.package_base
__all__ = [
"DirectiveError",
"DirectiveMeta",
"DisableRedistribute",
"version",
"conditional",
"conflicts",
"depends_on",
"extends",
@@ -76,6 +73,7 @@ class OpenMpi(Package):
"build_system",
"requires",
"redistribute",
"can_splice",
]
_patch_order_index = 0
@@ -83,15 +81,15 @@ class OpenMpi(Package):
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union["spack.spec.Spec", str, bool]]
Patcher = Callable[[Union["spack.package_base.PackageBase", Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
WhenType = Optional[Union[spack.spec.Spec, str, bool]]
Patcher = Callable[[Union[Type[spack.package_base.PackageBase], Dependency]], None]
PatchesType = Union[Patcher, str, List[Union[Patcher, str]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
def _make_when_spec(value: WhenType) -> Optional[spack.spec.Spec]:
"""Create a ``Spec`` that indicates when a directive should be applied.
Directives with ``when`` specs, e.g.:
@@ -136,7 +134,7 @@ def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
return spack.spec.Spec(value)
SubmoduleCallback = Callable[["spack.package_base.PackageBase"], Union[str, List[str], bool]]
SubmoduleCallback = Callable[[spack.package_base.PackageBase], Union[str, List[str], bool]]
directive = DirectiveMeta.directive
@@ -221,7 +219,7 @@ def version(
return lambda pkg: _execute_version(pkg, ver, **kwargs)
def _execute_version(pkg, ver, **kwargs):
def _execute_version(pkg: Type[spack.package_base.PackageBase], ver: Union[str, int], **kwargs):
if (
(any(s in kwargs for s in spack.util.crypto.hashes) or "checksum" in kwargs)
and hasattr(pkg, "has_code")
@@ -252,12 +250,12 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: "spack.package_base.PackageBase",
spec: "spack.spec.Spec",
pkg: Type[spack.package_base.PackageBase],
spec: spack.spec.Spec,
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
when_spec = _make_when_spec(when)
if not when_spec:
@@ -332,7 +330,7 @@ def conflicts(conflict_spec: SpecType, when: WhenType = None, msg: Optional[str]
msg (str): optional user defined message
"""
def _execute_conflicts(pkg: "spack.package_base.PackageBase"):
def _execute_conflicts(pkg: Type[spack.package_base.PackageBase]):
# If when is not specified the conflict always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -351,7 +349,7 @@ def depends_on(
spec: SpecType,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
"""Creates a dict of deps with specs defining when they apply.
@@ -373,21 +371,16 @@ def depends_on(
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
def _execute_depends_on(pkg: Type[spack.package_base.PackageBase]):
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
def redistribute(
source: Optional[bool] = None, binary: Optional[bool] = None, when: WhenType = None
):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
@@ -402,7 +395,10 @@ def redistribute(source=None, binary=None, when: WhenType = None):
def _execute_redistribute(
pkg: "spack.package_base.PackageBase", source=None, binary=None, when: WhenType = None
pkg: Type[spack.package_base.PackageBase],
source: Optional[bool],
binary: Optional[bool],
when: WhenType,
):
if source is None and binary is None:
return
@@ -432,7 +428,7 @@ def _execute_redistribute(
if not binary:
disable.binary = True
else:
pkg.disable_redistribute[when_spec] = DisableRedistribute(
pkg.disable_redistribute[when_spec] = spack.package_base.DisableRedistribute(
source=not source, binary=not binary
)
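For context, a package would invoke the directive like this hypothetical sketch (package name and variant are invented):

```python
class ProprietarySim(Package):
    """Hypothetical package whose vendor terms forbid redistribution."""

    # Neither sources nor binaries may be cached or mirrored for +vendor builds:
    redistribute(source=False, binary=False, when="+vendor")
```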
@@ -478,9 +474,7 @@ def provides(*specs: SpecType, when: WhenType = None):
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg: "spack.package_base.PackageBase"):
import spack.parser # Avoid circular dependency
def _execute_provides(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if not when_spec:
return
@@ -504,6 +498,43 @@ def _execute_provides(pkg: "spack.package_base.PackageBase"):
return _execute_provides
@directive("splice_specs")
def can_splice(
target: SpecType, *, when: SpecType, match_variants: Union[None, str, List[str]] = None
):
"""Packages can declare whether they are ABI-compatible with another package
and thus can be spliced into concrete versions of that package.
Args:
target: The spec that the current package is ABI-compatible with.
when: An anonymous spec constraining current package for when it is
ABI-compatible with target.
match_variants: A list of variants that must match
between target spec and current package, with special value '*'
which matches all variants. Example: a variant is defined on both
packages called json, and they are ABI-compatible whenever they agree on
the json variant (regardless of whether it is turned on or off). Note
that this cannot be applied to multi-valued variants; multi-valued
variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
"* is the only valid string for match_variants "
"if looking to provide a single variant, use "
f"[{match_variants}] instead"
)
if when_spec is None:
return
pkg.splice_specs[when_spec] = (spack.spec.Spec(target), match_variants)
return _execute_can_splice
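A hedged sketch of how a package might use the new directive (package name, variant, and version ranges are all invented):

```python
class ZlibNg(Package):
    """Hypothetical ABI-compatible zlib replacement."""

    variant("compat", default=True, description="zlib API compatibility mode")

    # Concrete zlib specs may be spliced with this package when compat is on:
    can_splice("zlib@1.2.11:", when="@2.0:+compat")
```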
@directive("patches")
def patch(
url_or_filename: str,
@@ -530,10 +561,10 @@ def patch(
compressed URL patches)
"""
def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependency]):
pkg = pkg_or_dep
if isinstance(pkg, Dependency):
pkg = pkg.pkg
def _execute_patch(
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency]
) -> None:
pkg = pkg_or_dep.pkg if isinstance(pkg_or_dep, Dependency) else pkg_or_dep
if hasattr(pkg, "has_code") and not pkg.has_code:
raise UnsupportedPackageDirective(
@@ -577,6 +608,15 @@ def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependenc
return _execute_patch
def conditional(*values: List[Any], when: Optional[WhenType] = None):
"""Conditional values that can be used in variant declarations."""
# _make_when_spec returns None when the condition is statically false.
when = _make_when_spec(when)
return spack.variant.ConditionalVariantValues(
spack.variant.ConditionalValue(x, when=when) for x in values
)
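`conditional` feeds variant declarations; an illustrative use (the version bound is invented):

```python
variant(
    "build_type",
    default="Release",
    values=("Release", conditional("Debug", when="@2.0:")),
    description="Build type; Debug only exists from v2.0 on",
)
```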
@directive("variants")
def variant(
name: str,
@@ -698,58 +738,55 @@ def _execute_variant(pkg):
@directive("resources")
def resource(**kwargs):
"""Define an external resource to be fetched and staged when building the
package. Based on the keywords present in the dictionary the appropriate
FetchStrategy will be used for the resource. Resources are fetched and
staged in their own folder inside spack stage area, and then moved into
the stage area of the package that needs them.
def resource(
*,
name: Optional[str] = None,
destination: str = "",
placement: Optional[str] = None,
when: WhenType = None,
# additional kwargs are as for `version()`
**kwargs,
):
"""Define an external resource to be fetched and staged when building the package.
Based on the keywords present in the dictionary the appropriate FetchStrategy will
be used for the resource. Resources are fetched and staged in their own folder
inside spack stage area, and then moved into the stage area of the package that
needs them.
List of recognized keywords:
Keyword Arguments:
name: name for the resource
when: condition defining when the resource is needed
destination: path, relative to the package stage area, to which resource should be moved
placement: optionally rename the expanded resource inside the destination directory
* 'when' : (optional) represents the condition upon which the resource is
needed
* 'destination' : (optional) path where to move the resource. This path
must be relative to the main package stage area.
* 'placement' : (optional) gives the possibility to fine tune how the
resource is moved into the main package stage area.
"""
def _execute_resource(pkg):
when = kwargs.get("when")
when_spec = _make_when_spec(when)
if not when_spec:
return
destination = kwargs.get("destination", "")
placement = kwargs.get("placement", None)
# Check if the path is relative
if os.path.isabs(destination):
message = (
"The destination keyword of a resource directive " "can't be an absolute path.\n"
)
message += "\tdestination : '{dest}\n'".format(dest=destination)
raise RuntimeError(message)
msg = "The destination keyword of a resource directive can't be an absolute path.\n"
msg += f"\tdestination : '{destination}\n'"
raise RuntimeError(msg)
# Check if the path falls within the main package stage area
test_path = "stage_folder_root"
normalized_destination = os.path.normpath(
os.path.join(test_path, destination)
) # Normalized absolute path
# Normalized absolute path
normalized_destination = os.path.normpath(os.path.join(test_path, destination))
if test_path not in normalized_destination:
message = (
"The destination folder of a resource must fall "
"within the main package stage directory.\n"
)
message += "\tdestination : '{dest}'\n".format(dest=destination)
raise RuntimeError(message)
msg = "Destination of a resource must be within the package stage directory.\n"
msg += f"\tdestination : '{destination}'\n"
raise RuntimeError(msg)
resources = pkg.resources.setdefault(when_spec, [])
name = kwargs.get("name")
fetcher = from_kwargs(**kwargs)
resources.append(Resource(name, fetcher, destination, placement))
resources.append(
Resource(name, spack.fetch_strategy.from_kwargs(**kwargs), destination, placement)
)
return _execute_resource
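With the keywords now explicit in the signature, a package-side sketch (URL and checksum are placeholders):

```python
class MyApp(Package):
    """Hypothetical package that stages an extra tarball."""

    resource(
        name="test-data",
        url="https://example.com/test-data.tar.gz",
        sha256="0" * 64,      # placeholder checksum
        destination="data",   # relative to the package stage area
        placement="inputs",   # rename the expanded dir inside 'data'
        when="+tests",
    )
```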
@@ -781,7 +818,9 @@ def _execute_maintainer(pkg):
return _execute_maintainer
def _execute_license(pkg, license_identifier: str, when):
def _execute_license(
pkg: Type[spack.package_base.PackageBase], license_identifier: str, when: WhenType
):
# If when is not specified the license always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -845,7 +884,7 @@ def requires(*requirement_specs: str, policy="one_of", when=None, msg=None):
msg: optional user defined message
"""
def _execute_requires(pkg: "spack.package_base.PackageBase"):
def _execute_requires(pkg: Type[spack.package_base.PackageBase]):
if policy not in ("one_of", "any_of"):
err_msg = (
f"the 'policy' argument of the 'requires' directive in {pkg.name} is set "
@@ -870,7 +909,7 @@ def _execute_requires(pkg: "spack.package_base.PackageBase"):
def _language(lang_spec_str: str, *, when: Optional[Union[str, bool]] = None):
"""Temporary implementation of language virtuals, until compilers are proper dependencies."""
def _execute_languages(pkg: "spack.package_base.PackageBase"):
def _execute_languages(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if not when_spec:
return

View File

@@ -5,7 +5,7 @@
import collections.abc
import functools
from typing import List, Set
from typing import Any, Callable, Dict, List, Optional, Sequence, Set, Type, Union
import llnl.util.lang
@@ -25,11 +25,13 @@ class DirectiveMeta(type):
# Set of all known directives
_directive_dict_names: Set[str] = set()
_directives_to_be_executed: List[str] = []
_when_constraints_from_context: List[str] = []
_directives_to_be_executed: List[Callable] = []
_when_constraints_from_context: List[spack.spec.Spec] = []
_default_args: List[dict] = []
def __new__(cls, name, bases, attr_dict):
def __new__(
cls: Type["DirectiveMeta"], name: str, bases: tuple, attr_dict: dict
) -> "DirectiveMeta":
# Initialize the attribute containing the list of directives
# to be executed. Here we go reversed because we want to execute
# commands:
@@ -60,7 +62,7 @@ def __new__(cls, name, bases, attr_dict):
return super(DirectiveMeta, cls).__new__(cls, name, bases, attr_dict)
def __init__(cls, name, bases, attr_dict):
def __init__(cls: "DirectiveMeta", name: str, bases: tuple, attr_dict: dict):
# The instance is being initialized: if it is a package we must ensure
# that the directives are called to set it up.
@@ -81,27 +83,27 @@ def __init__(cls, name, bases, attr_dict):
super(DirectiveMeta, cls).__init__(name, bases, attr_dict)
@staticmethod
def push_to_context(when_spec):
def push_to_context(when_spec: spack.spec.Spec) -> None:
"""Add a spec to the context constraints."""
DirectiveMeta._when_constraints_from_context.append(when_spec)
@staticmethod
def pop_from_context():
def pop_from_context() -> spack.spec.Spec:
"""Pop the last constraint from the context"""
return DirectiveMeta._when_constraints_from_context.pop()
@staticmethod
def push_default_args(default_args):
def push_default_args(default_args: Dict[str, Any]) -> None:
"""Push default arguments"""
DirectiveMeta._default_args.append(default_args)
@staticmethod
def pop_default_args():
def pop_default_args() -> dict:
"""Pop default arguments"""
return DirectiveMeta._default_args.pop()
@staticmethod
def directive(dicts=None):
def directive(dicts: Optional[Union[Sequence[str], str]] = None) -> Callable:
"""Decorator for Spack directives.
Spack directives allow you to modify a package while it is being
@@ -156,7 +158,7 @@ class Foo(Package):
DirectiveMeta._directive_dict_names |= set(dicts)
# This decorator just returns the directive functions
def _decorator(decorated_function):
def _decorator(decorated_function: Callable) -> Callable:
directive_names.append(decorated_function.__name__)
@functools.wraps(decorated_function)

lib/spack/spack/enums.py Normal file
View File

@@ -0,0 +1,15 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Enumerations used throughout Spack"""
import enum
class InstallRecordStatus(enum.Flag):
"""Enum flag to facilitate querying status from the DB"""
INSTALLED = enum.auto()
DEPRECATED = enum.auto()
MISSING = enum.auto()
ANY = INSTALLED | DEPRECATED | MISSING
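Because this is an ``enum.Flag``, statuses combine with bitwise OR, which is what replaces the old list-of-InstallStatus queries; a quick sketch:

```python
from spack.enums import InstallRecordStatus

wanted = InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED
assert InstallRecordStatus.INSTALLED in wanted   # Flag membership test
assert InstallRecordStatus.MISSING not in wanted
assert InstallRecordStatus.ANY & wanted          # ANY covers all three statuses
```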

View File

@@ -20,7 +20,7 @@
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import readlink, symlink
from llnl.util.symlink import islink, readlink, symlink
import spack
import spack.caches
@@ -668,7 +668,7 @@ def from_dict(base_path, d):
@property
def _current_root(self):
if not os.path.islink(self.root):
if not islink(self.root):
return None
root = readlink(self.root)

View File

@@ -192,3 +192,10 @@ def __reduce__(self):
def _make_stop_phase(msg, long_msg):
return StopPhase(msg, long_msg)
class MirrorError(SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -5,7 +5,6 @@
"""Service functions and classes to implement the hooks
for Spack's command extensions.
"""
import difflib
import glob
import importlib
import os
@@ -17,7 +16,6 @@
import llnl.util.lang
import spack.cmd
import spack.config
import spack.error
import spack.util.path
@@ -25,9 +23,6 @@
_extension_regexp = re.compile(r"spack-(\w[-\w]*)$")
# TODO: For consistency we should use spack.cmd.python_name(), but
# currently this would create a circular relationship between
# spack.cmd and spack.extensions.
def _python_name(cmd_name):
return cmd_name.replace("-", "_")
@@ -211,8 +206,7 @@ def get_module(cmd_name):
module = load_command_extension(cmd_name, folder)
if module:
return module
else:
raise CommandNotFoundError(cmd_name)
return None
def get_template_dirs():
@@ -224,27 +218,6 @@ def get_template_dirs():
return extensions
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
"{0} is not a recognized Spack command or extension command;"
" check with `spack commands`.".format(cmd_name)
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, spack.cmd.all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
class ExtensionNamingError(spack.error.SpackError):
"""Exception class thrown when a configured extension does not follow
the expected naming convention.

View File

@@ -325,12 +325,7 @@ def write(self, spec, color=None, out=None):
self._out = llnl.util.tty.color.ColorStream(out, color=color)
# We'll traverse the spec in topological order as we graph it.
nodes_in_topological_order = [
edge.spec
for edge in spack.traverse.traverse_edges_topo(
[spec], direction="children", deptype=self.depflag
)
]
nodes_in_topological_order = list(spec.traverse(order="topo", deptype=self.depflag))
nodes_in_topological_order.reverse()
# Work on a copy to be nondestructive

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.binary_distribution as bindist
import spack.mirror
import spack.mirrors.mirror
def post_install(spec, explicit):
@@ -22,7 +22,7 @@ def post_install(spec, explicit):
return
# Push the package to all autopush mirrors
for mirror in spack.mirror.MirrorCollection(binary=True, autopush=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True, autopush=True).values():
signing_key = bindist.select_signing_key() if mirror.signed else None
with bindist.make_uploader(mirror=mirror, force=True, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])
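The hook relies on `MirrorCollection`'s constructor filters under the renamed module path; a minimal sketch:

```python
import spack.mirrors.mirror

# Keep only binary mirrors that have autopush enabled:
autopush = spack.mirrors.mirror.MirrorCollection(binary=True, autopush=True)
for name, mirror in autopush.items():
    print(name, mirror.push_url)
```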

View File

@@ -23,7 +23,6 @@
from llnl.util.tty.color import colorize
import spack.build_environment
import spack.builder
import spack.config
import spack.error
import spack.package_base
@@ -353,9 +352,7 @@ def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
self.test_parts[part_name] = status
self.counts[status] += 1
def phase_tests(
self, builder: spack.builder.Builder, phase_name: str, method_names: List[str]
):
def phase_tests(self, builder, phase_name: str, method_names: List[str]):
"""Execute the builder's package phase-time tests.
Args:
@@ -378,23 +375,16 @@ def phase_tests(
for name in method_names:
try:
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(builder.pkg, name, getattr(builder, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
print_message(logger, msg, verbose)
fn()
fn = getattr(builder, name, None) or getattr(builder.pkg, name)
except AttributeError as e:
msg = f"RUN-TESTS: method not implemented [{name}]"
print_message(logger, msg, verbose)
self.add_failure(e, msg)
print_message(logger, f"RUN-TESTS: method not implemented [{name}]", verbose)
self.add_failure(e, f"RUN-TESTS: method not implemented [{name}]")
if fail_fast:
break
continue
print_message(logger, f"RUN-TESTS: {phase_name}-time tests [{name}]", verbose)
fn()
if have_tests:
print_message(logger, "Completed testing", verbose)
@@ -764,7 +754,7 @@ def virtuals(pkg):
# hack for compilers that are not dependencies (yet)
# TODO: this all eventually goes away
c_names = ("gcc", "intel", "intel-parallel-studio", "pgi")
c_names = ("gcc", "intel", "intel-parallel-studio")
if pkg.name in c_names:
v_names.extend(["c", "cxx", "fortran"])
if pkg.spec.satisfies("llvm+clang"):

View File

@@ -50,12 +50,13 @@
import spack.binary_distribution as binary_distribution
import spack.build_environment
import spack.builder
import spack.config
import spack.database
import spack.deptypes as dt
import spack.error
import spack.hooks
import spack.mirror
import spack.mirrors.mirror
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
@@ -212,7 +213,7 @@ def _check_last_phase(pkg: "spack.package_base.PackageBase") -> None:
Raises:
``BadInstallPhase`` if stop_before or last phase is invalid
"""
phases = pkg.builder.phases # type: ignore[attr-defined]
phases = spack.builder.create(pkg).phases # type: ignore[attr-defined]
if pkg.stop_before_phase and pkg.stop_before_phase not in phases: # type: ignore[attr-defined]
raise BadInstallPhase(pkg.name, pkg.stop_before_phase) # type: ignore[attr-defined]
@@ -490,7 +491,7 @@ def _try_install_from_binary_cache(
timer: timer to keep track of binary install phases.
"""
# Early exit if no binary mirrors are configured.
if not spack.mirror.MirrorCollection(binary=True):
if not spack.mirrors.mirror.MirrorCollection(binary=True):
return False
tty.debug(f"Searching for binary cache of {package_id(pkg.spec)}")
@@ -661,7 +662,7 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
spack.store.STORE.layout.metadata_path(pkg.spec), "archived-files"
)
for glob_expr in pkg.builder.archive_files:
for glob_expr in spack.builder.create(pkg).archive_files:
# Check that we are trying to copy things that are
# in the stage tree (not arbitrary files)
abs_expr = os.path.realpath(glob_expr)
@@ -2394,7 +2395,6 @@ def _install_source(self) -> None:
fs.install_tree(pkg.stage.source_path, src_target)
def _real_install(self) -> None:
import spack.builder
pkg = self.pkg

View File

@@ -1,768 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
This file contains code for creating spack mirror directories. A
mirror is an organized hierarchy containing specially named archive
files. This enables spack to know where to find files in a mirror if
the main server for a particular package is down. Or, if the computer
where spack is run is not connected to the internet, it allows spack
to download packages directly from a mirror (e.g., on an intranet).
"""
import collections
import collections.abc
import operator
import os
import os.path
import sys
import traceback
import urllib.parse
from typing import List, Optional, Union
import llnl.url
import llnl.util.symlink
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.fetch_strategy
import spack.oci.image
import spack.repo
import spack.spec
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.version
#: What schemes do we support
supported_url_schemes = ("file", "http", "https", "sftp", "ftp", "s3", "gs", "oci")
def _url_or_path_to_url(url_or_path: str) -> str:
"""For simplicity we allow mirror URLs in config files to be local, relative paths.
This helper function takes care of distinguishing between URLs and paths, and
canonicalizes paths before transforming them into file:// URLs."""
# Is it a supported URL already? Then don't do path-related canonicalization.
parsed = urllib.parse.urlparse(url_or_path)
if parsed.scheme in supported_url_schemes:
return url_or_path
# Otherwise we interpret it as path, and we should promote it to file:// URL.
return url_util.path_to_file_url(spack.util.path.canonicalize_path(url_or_path))
class Mirror:
"""Represents a named location for storing source tarballs and binary
packages.
Mirrors have a fetch_url that indicates where and how artifacts are fetched
from them, and a push_url that indicates where and how artifacts are pushed
to them. These two URLs are usually the same.
"""
def __init__(self, data: Union[str, dict], name: Optional[str] = None):
self._data = data
self._name = name
@staticmethod
def from_yaml(stream, name=None):
return Mirror(syaml.load(stream), name)
@staticmethod
def from_json(stream, name=None):
try:
return Mirror(sjson.load(stream), name)
except Exception as e:
raise sjson.SpackJSONError("error parsing JSON mirror:", str(e)) from e
@staticmethod
def from_local_path(path: str):
return Mirror(url_util.path_to_file_url(path))
@staticmethod
def from_url(url: str):
"""Create an anonymous mirror by URL. This method validates the URL."""
if not urllib.parse.urlparse(url).scheme in supported_url_schemes:
raise ValueError(
f'"{url}" is not a valid mirror URL. '
f"Scheme must be one of {supported_url_schemes}."
)
return Mirror(url)
def __eq__(self, other):
if not isinstance(other, Mirror):
return NotImplemented
return self._data == other._data and self._name == other._name
def __str__(self):
return f"{self._name}: {self.push_url} {self.fetch_url}"
def __repr__(self):
return f"Mirror(name={self._name!r}, data={self._data!r})"
def to_json(self, stream=None):
return sjson.dump(self.to_dict(), stream)
def to_yaml(self, stream=None):
return syaml.dump(self.to_dict(), stream)
def to_dict(self):
return self._data
def display(self, max_len=0):
fetch, push = self.fetch_url, self.push_url
# don't print the same URL twice
url = fetch if fetch == push else f"fetch: {fetch} push: {push}"
source = "s" if self.source else " "
binary = "b" if self.binary else " "
print(f"{self.name: <{max_len}} [{source}{binary}] {url}")
@property
def name(self):
return self._name or "<unnamed>"
@property
def binary(self):
return isinstance(self._data, str) or self._data.get("binary", True)
@property
def source(self):
return isinstance(self._data, str) or self._data.get("source", True)
@property
def signed(self) -> bool:
return isinstance(self._data, str) or self._data.get("signed", True)
@property
def autopush(self) -> bool:
if isinstance(self._data, str):
return False
return self._data.get("autopush", False)
@property
def fetch_url(self):
"""Get the valid, canonicalized fetch URL"""
return self.get_url("fetch")
@property
def push_url(self):
"""Get the valid, canonicalized fetch URL"""
return self.get_url("push")
def _update_connection_dict(self, current_data: dict, new_data: dict, top_level: bool):
keys = ["url", "access_pair", "access_token", "profile", "endpoint_url"]
if top_level:
keys += ["binary", "source", "signed", "autopush"]
changed = False
for key in keys:
if key in new_data and current_data.get(key) != new_data[key]:
current_data[key] = new_data[key]
changed = True
return changed
def update(self, data: dict, direction: Optional[str] = None) -> bool:
"""Modify the mirror with the given data. This takes care
of expanding trivial mirror definitions given by URL into a richer
dict representation if necessary
Args:
data (dict): The data to update the mirror with.
direction (str): The direction to update the mirror in (fetch
or push or None for top-level update)
Returns:
bool: True if the mirror was updated, False otherwise."""
# Modify the top-level entry when no direction is given.
if not data:
return False
# If we only update a URL, there's typically no need to expand things to a dict.
set_url = data["url"] if len(data) == 1 and "url" in data else None
if direction is None:
# First deal with the case where the current top-level entry is just a string.
if isinstance(self._data, str):
# Can we replace that string with something new?
if set_url:
if self._data == set_url:
return False
self._data = set_url
return True
# Otherwise promote to a dict
self._data = {"url": self._data}
# And update the dictionary accordingly.
return self._update_connection_dict(self._data, data, top_level=True)
# Otherwise, update the fetch / push entry; turn top-level
# url string into a dict if necessary.
if isinstance(self._data, str):
self._data = {"url": self._data}
# Create a new fetch / push entry if necessary
if direction not in self._data:
# Keep config minimal if we're just setting the URL.
if set_url:
self._data[direction] = set_url
return True
self._data[direction] = {}
entry = self._data[direction]
# Keep the entry simple if we're just swapping out the URL.
if isinstance(entry, str):
if set_url:
if entry == set_url:
return False
self._data[direction] = set_url
return True
# Otherwise promote to a dict
self._data[direction] = {"url": entry}
return self._update_connection_dict(self._data[direction], data, top_level=False)
def _get_value(self, attribute: str, direction: str):
"""Returns the most specific value for a given attribute (either push/fetch or global)"""
if direction not in ("fetch", "push"):
raise ValueError(f"direction must be either 'fetch' or 'push', not {direction}")
if isinstance(self._data, str):
return None
# Either a string (url) or a dictionary, we care about the dict here.
value = self._data.get(direction, {})
# Return top-level entry if only a URL was set.
if isinstance(value, str) or attribute not in value:
return self._data.get(attribute)
return value[attribute]
def get_url(self, direction: str) -> str:
if direction not in ("fetch", "push"):
raise ValueError(f"direction must be either 'fetch' or 'push', not {direction}")
# Whole mirror config is just a url.
if isinstance(self._data, str):
return _url_or_path_to_url(self._data)
# Default value
url = self._data.get("url")
# Override it with a direction-specific value
if direction in self._data:
# Either a url as string or a dict with url key
info = self._data[direction]
if isinstance(info, str):
url = info
elif "url" in info:
url = info["url"]
if not url:
raise ValueError(f"Mirror {self.name} has no URL configured")
return _url_or_path_to_url(url)
def get_access_token(self, direction: str) -> Optional[str]:
return self._get_value("access_token", direction)
def get_access_pair(self, direction: str) -> Optional[List]:
return self._get_value("access_pair", direction)
def get_profile(self, direction: str) -> Optional[str]:
return self._get_value("profile", direction)
def get_endpoint_url(self, direction: str) -> Optional[str]:
return self._get_value("endpoint_url", direction)
class MirrorCollection(collections.abc.Mapping):
"""A mapping of mirror names to mirrors."""
def __init__(
self,
mirrors=None,
scope=None,
binary: Optional[bool] = None,
source: Optional[bool] = None,
autopush: Optional[bool] = None,
):
"""Initialize a mirror collection.
Args:
mirrors: A name-to-mirror mapping to initialize the collection with.
scope: The scope to use when looking up mirrors from the config.
binary: If True, only include binary mirrors.
If False, omit binary mirrors.
If None, do not filter on binary mirrors.
source: If True, only include source mirrors.
If False, omit source mirrors.
If None, do not filter on source mirrors.
autopush: If True, only include mirrors that have autopush enabled.
If False, omit mirrors that have autopush enabled.
If None, do not filter on autopush."""
mirrors_data = (
mirrors.items()
if mirrors is not None
else spack.config.get("mirrors", scope=scope).items()
)
mirrors = (Mirror(data=mirror, name=name) for name, mirror in mirrors_data)
def _filter(m: Mirror):
if source is not None and m.source != source:
return False
if binary is not None and m.binary != binary:
return False
if autopush is not None and m.autopush != autopush:
return False
return True
self._mirrors = {m.name: m for m in mirrors if _filter(m)}
def __eq__(self, other):
return self._mirrors == other._mirrors
def to_json(self, stream=None):
return sjson.dump(self.to_dict(True), stream)
def to_yaml(self, stream=None):
return syaml.dump(self.to_dict(True), stream)
# TODO: this isn't called anywhere
@staticmethod
def from_yaml(stream, name=None):
data = syaml.load(stream)
return MirrorCollection(data)
@staticmethod
def from_json(stream, name=None):
try:
d = sjson.load(stream)
return MirrorCollection(d)
except Exception as e:
raise sjson.SpackJSONError("error parsing JSON mirror collection:", str(e)) from e
def to_dict(self, recursive=False):
return syaml.syaml_dict(
sorted(
((k, (v.to_dict() if recursive else v)) for (k, v) in self._mirrors.items()),
key=operator.itemgetter(0),
)
)
@staticmethod
def from_dict(d):
return MirrorCollection(d)
def __getitem__(self, item):
return self._mirrors[item]
def display(self):
max_len = max(len(mirror.name) for mirror in self._mirrors.values())
for mirror in self._mirrors.values():
mirror.display(max_len)
def lookup(self, name_or_url):
"""Looks up and returns a Mirror.
If this MirrorCollection contains a named Mirror under the name
[name_or_url], then that mirror is returned. Otherwise, [name_or_url]
is assumed to be a mirror URL, and an anonymous mirror with the given
URL is returned.
"""
result = self.get(name_or_url)
if result is None:
result = Mirror(fetch=name_or_url)
return result
def __iter__(self):
return iter(self._mirrors)
def __len__(self):
return len(self._mirrors)
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
# a digest, that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example; the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
def create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats):
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache (spack.caches.MirrorCache): mirror where to add the spec.
mirror_stats (spack.mirror.MirrorStats): statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
num_retries = 3
while num_retries > 0:
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
exception = None
break
except Exception as e:
exc_tuple = sys.exc_info()
exception = e
num_retries -= 1
if exception:
if spack.config.get("config:debug"):
traceback.print_exception(file=sys.stderr, *exc_tuple)
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.cformat("{name}{@version}"),
getattr(exception, "message", exception),
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror
class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -0,0 +1,4 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -0,0 +1,146 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
from typing import Optional
import llnl.url
import llnl.util.symlink
from llnl.util.filesystem import mkdirp
import spack.fetch_strategy
import spack.oci.image
import spack.repo
import spack.spec
from spack.error import MirrorError
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
# a digest, that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
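A sketch of the result for a primary source archive, assuming ``fetcher`` is a ``URLFetchStrategy`` with a sha256 checksum (all values illustrative):

layout = default_mirror_layout(fetcher, os.path.join("zlib", "zlib-1.3.0"))
layout.alias        # "zlib/zlib-1.3.0.tar.gz"
layout.digest_path  # "_source-cache/archive/9a/9a93b2b7....tar.gz"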

View File

@@ -0,0 +1,470 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
import operator
import os
import urllib.parse
from typing import Any, Dict, Optional, Tuple, Union
import llnl.util.tty as tty
import spack.config
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
from spack.error import MirrorError
#: What schemes do we support
supported_url_schemes = ("file", "http", "https", "sftp", "ftp", "s3", "gs", "oci")
def _url_or_path_to_url(url_or_path: str) -> str:
"""For simplicity we allow mirror URLs in config files to be local, relative paths.
This helper function takes care of distinguishing between URLs and paths, and
canonicalizes paths before transforming them into file:// URLs."""
# Is it a supported URL already? Then don't do path-related canonicalization.
parsed = urllib.parse.urlparse(url_or_path)
if parsed.scheme in supported_url_schemes:
return url_or_path
# Otherwise we interpret it as path, and we should promote it to file:// URL.
return url_util.path_to_file_url(spack.util.path.canonicalize_path(url_or_path))
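For example (the values below are illustrative):

_url_or_path_to_url("s3://example-bucket/mirror")  # returned as-is: the scheme is supported
_url_or_path_to_url("~/my-mirror")                 # canonicalized, e.g. "file:///home/user/my-mirror"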
class Mirror:
"""Represents a named location for storing source tarballs and binary
packages.
    Mirrors have a fetch_url that indicates where and how artifacts are fetched
    from them, and a push_url that indicates where and how artifacts are pushed
    to them. These two URLs are usually the same.
"""
def __init__(self, data: Union[str, dict], name: Optional[str] = None):
self._data = data
self._name = name
@staticmethod
def from_yaml(stream, name=None):
return Mirror(syaml.load(stream), name)
@staticmethod
def from_json(stream, name=None):
try:
return Mirror(sjson.load(stream), name)
except Exception as e:
raise sjson.SpackJSONError("error parsing JSON mirror:", str(e)) from e
@staticmethod
def from_local_path(path: str):
return Mirror(url_util.path_to_file_url(path))
@staticmethod
def from_url(url: str):
"""Create an anonymous mirror by URL. This method validates the URL."""
        if urllib.parse.urlparse(url).scheme not in supported_url_schemes:
raise ValueError(
f'"{url}" is not a valid mirror URL. '
f"Scheme must be one of {supported_url_schemes}."
)
return Mirror(url)
def __eq__(self, other):
if not isinstance(other, Mirror):
return NotImplemented
return self._data == other._data and self._name == other._name
def __str__(self):
return f"{self._name}: {self.push_url} {self.fetch_url}"
def __repr__(self):
return f"Mirror(name={self._name!r}, data={self._data!r})"
def to_json(self, stream=None):
return sjson.dump(self.to_dict(), stream)
def to_yaml(self, stream=None):
return syaml.dump(self.to_dict(), stream)
def to_dict(self):
return self._data
def display(self, max_len=0):
fetch, push = self.fetch_url, self.push_url
# don't print the same URL twice
url = fetch if fetch == push else f"fetch: {fetch} push: {push}"
source = "s" if self.source else " "
binary = "b" if self.binary else " "
print(f"{self.name: <{max_len}} [{source}{binary}] {url}")
@property
def name(self):
return self._name or "<unnamed>"
@property
def binary(self):
return isinstance(self._data, str) or self._data.get("binary", True)
@property
def source(self):
return isinstance(self._data, str) or self._data.get("source", True)
@property
def signed(self) -> bool:
return isinstance(self._data, str) or self._data.get("signed", True)
@property
def autopush(self) -> bool:
if isinstance(self._data, str):
return False
return self._data.get("autopush", False)
@property
def fetch_url(self):
"""Get the valid, canonicalized fetch URL"""
return self.get_url("fetch")
@property
def push_url(self):
"""Get the valid, canonicalized fetch URL"""
return self.get_url("push")
def ensure_mirror_usable(self, direction: str = "push"):
access_pair = self._get_value("access_pair", direction)
access_token_variable = self._get_value("access_token_variable", direction)
errors = []
# Verify that the credentials that are variables expand
if access_pair and isinstance(access_pair, dict):
if "id_variable" in access_pair and access_pair["id_variable"] not in os.environ:
errors.append(f"id_variable {access_pair['id_variable']} not set in environment")
if "secret_variable" in access_pair:
if access_pair["secret_variable"] not in os.environ:
errors.append(
f"environment variable `{access_pair['secret_variable']}` "
"(secret_variable) not set"
)
if access_token_variable:
if access_token_variable not in os.environ:
errors.append(
f"environment variable `{access_pair['access_token_variable']}` "
"(access_token_variable) not set"
)
if errors:
msg = f"invalid {direction} configuration for mirror {self.name}: "
msg += "\n ".join(errors)
raise MirrorError(msg)
def _update_connection_dict(self, current_data: dict, new_data: dict, top_level: bool):
# Only allow one to exist in the config
if "access_token" in current_data and "access_token_variable" in new_data:
current_data.pop("access_token")
elif "access_token_variable" in current_data and "access_token" in new_data:
current_data.pop("access_token_variable")
# If updating to a new access_pair that is the deprecated list, warn
warn_deprecated_access_pair = False
if "access_pair" in new_data:
warn_deprecated_access_pair = isinstance(new_data["access_pair"], list)
        # If not updating the current access_pair, and it is the deprecated list, warn
elif "access_pair" in current_data:
warn_deprecated_access_pair = isinstance(current_data["access_pair"], list)
if warn_deprecated_access_pair:
tty.warn(
f"in mirror {self.name}: support for plain text secrets in config files "
"(access_pair: [id, secret]) is deprecated and will be removed in a future Spack "
"version. Use environment variables instead (access_pair: "
"{id: ..., secret_variable: ...})"
)
keys = [
"url",
"access_pair",
"access_token",
"access_token_variable",
"profile",
"endpoint_url",
]
if top_level:
keys += ["binary", "source", "signed", "autopush"]
changed = False
for key in keys:
if key in new_data and current_data.get(key) != new_data[key]:
current_data[key] = new_data[key]
changed = True
return changed
def update(self, data: dict, direction: Optional[str] = None) -> bool:
"""Modify the mirror with the given data. This takes care
of expanding trivial mirror definitions by URL to something more
rich with a dict if necessary
Args:
data (dict): The data to update the mirror with.
direction (str): The direction to update the mirror in (fetch
or push or None for top-level update)
Returns:
bool: True if the mirror was updated, False otherwise."""
        if not data:
            return False
        # If we only update a URL, there's typically no need to expand things to a dict.
        set_url = data["url"] if len(data) == 1 and "url" in data else None
        # Modify the top-level entry when no direction is given.
        if direction is None:
# First deal with the case where the current top-level entry is just a string.
if isinstance(self._data, str):
# Can we replace that string with something new?
if set_url:
if self._data == set_url:
return False
self._data = set_url
return True
# Otherwise promote to a dict
self._data = {"url": self._data}
# And update the dictionary accordingly.
return self._update_connection_dict(self._data, data, top_level=True)
# Otherwise, update the fetch / push entry; turn top-level
# url string into a dict if necessary.
if isinstance(self._data, str):
self._data = {"url": self._data}
# Create a new fetch / push entry if necessary
if direction not in self._data:
# Keep config minimal if we're just setting the URL.
if set_url:
self._data[direction] = set_url
return True
self._data[direction] = {}
entry = self._data[direction]
# Keep the entry simple if we're just swapping out the URL.
if isinstance(entry, str):
if set_url:
if entry == set_url:
return False
self._data[direction] = set_url
return True
# Otherwise promote to a dict
self._data[direction] = {"url": entry}
return self._update_connection_dict(self._data[direction], data, top_level=False)
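A sketch of the promotion logic (all values hypothetical): a URL-only update keeps the entry a plain string, while any richer update promotes it to a dict.

m = Mirror("https://mirror.example.com")
m.update({"url": "s3://example-bucket"})                  # still a plain string entry
m.update({"access_token_variable": "MY_TOKEN"}, "fetch")  # creates a fetch-specific dict
m.to_dict()  # {"url": "s3://example-bucket", "fetch": {"access_token_variable": "MY_TOKEN"}}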
def _get_value(self, attribute: str, direction: str):
"""Returns the most specific value for a given attribute (either push/fetch or global)"""
if direction not in ("fetch", "push"):
raise ValueError(f"direction must be either 'fetch' or 'push', not {direction}")
if isinstance(self._data, str):
return None
# Either a string (url) or a dictionary, we care about the dict here.
value = self._data.get(direction, {})
# Return top-level entry if only a URL was set.
if isinstance(value, str) or attribute not in value:
return self._data.get(attribute)
return value[attribute]
def get_url(self, direction: str) -> str:
if direction not in ("fetch", "push"):
raise ValueError(f"direction must be either 'fetch' or 'push', not {direction}")
# Whole mirror config is just a url.
if isinstance(self._data, str):
return _url_or_path_to_url(self._data)
# Default value
url = self._data.get("url")
# Override it with a direction-specific value
if direction in self._data:
# Either a url as string or a dict with url key
info = self._data[direction]
if isinstance(info, str):
url = info
elif "url" in info:
url = info["url"]
if not url:
raise ValueError(f"Mirror {self.name} has no URL configured")
return _url_or_path_to_url(url)
def get_credentials(self, direction: str) -> Dict[str, Any]:
"""Get the mirror credentials from the mirror config
Args:
direction: fetch or push mirror config
Returns:
Dictionary from credential type string to value
Credential Type Map:
access_token -> str
access_pair -> tuple(str,str)
profile -> str
"""
creddict: Dict[str, Any] = {}
access_token = self.get_access_token(direction)
if access_token:
creddict["access_token"] = access_token
access_pair = self.get_access_pair(direction)
if access_pair:
creddict.update({"access_pair": access_pair})
profile = self.get_profile(direction)
if profile:
creddict["profile"] = profile
return creddict
    def get_access_token(self, direction: str) -> Optional[str]:
        tok = self._get_value("access_token_variable", direction)
        if tok:
            return os.environ.get(tok)
        return self._get_value("access_token", direction)
def get_access_pair(self, direction: str) -> Optional[Tuple[str, str]]:
pair = self._get_value("access_pair", direction)
if isinstance(pair, (tuple, list)) and len(pair) == 2:
return (pair[0], pair[1]) if all(pair) else None
elif isinstance(pair, dict):
id_ = os.environ.get(pair["id_variable"]) if "id_variable" in pair else pair["id"]
secret = os.environ.get(pair["secret_variable"])
return (id_, secret) if id_ and secret else None
else:
return None
def get_profile(self, direction: str) -> Optional[str]:
return self._get_value("profile", direction)
def get_endpoint_url(self, direction: str) -> Optional[str]:
return self._get_value("endpoint_url", direction)
class MirrorCollection(collections.abc.Mapping):
"""A mapping of mirror names to mirrors."""
def __init__(
self,
mirrors=None,
scope=None,
binary: Optional[bool] = None,
source: Optional[bool] = None,
autopush: Optional[bool] = None,
):
"""Initialize a mirror collection.
Args:
mirrors: A name-to-mirror mapping to initialize the collection with.
scope: The scope to use when looking up mirrors from the config.
binary: If True, only include binary mirrors.
If False, omit binary mirrors.
If None, do not filter on binary mirrors.
source: If True, only include source mirrors.
If False, omit source mirrors.
If None, do not filter on source mirrors.
autopush: If True, only include mirrors that have autopush enabled.
If False, omit mirrors that have autopush enabled.
If None, do not filter on autopush."""
mirrors_data = (
mirrors.items()
if mirrors is not None
else spack.config.get("mirrors", scope=scope).items()
)
mirrors = (Mirror(data=mirror, name=name) for name, mirror in mirrors_data)
def _filter(m: Mirror):
if source is not None and m.source != source:
return False
if binary is not None and m.binary != binary:
return False
if autopush is not None and m.autopush != autopush:
return False
return True
self._mirrors = {m.name: m for m in mirrors if _filter(m)}
def __eq__(self, other):
return self._mirrors == other._mirrors
def to_json(self, stream=None):
return sjson.dump(self.to_dict(True), stream)
def to_yaml(self, stream=None):
return syaml.dump(self.to_dict(True), stream)
# TODO: this isn't called anywhere
@staticmethod
def from_yaml(stream, name=None):
data = syaml.load(stream)
return MirrorCollection(data)
@staticmethod
def from_json(stream, name=None):
try:
d = sjson.load(stream)
return MirrorCollection(d)
except Exception as e:
raise sjson.SpackJSONError("error parsing JSON mirror collection:", str(e)) from e
def to_dict(self, recursive=False):
return syaml.syaml_dict(
sorted(
((k, (v.to_dict() if recursive else v)) for (k, v) in self._mirrors.items()),
key=operator.itemgetter(0),
)
)
@staticmethod
def from_dict(d):
return MirrorCollection(d)
def __getitem__(self, item):
return self._mirrors[item]
def display(self):
max_len = max(len(mirror.name) for mirror in self._mirrors.values())
for mirror in self._mirrors.values():
mirror.display(max_len)
def lookup(self, name_or_url):
"""Looks up and returns a Mirror.
If this MirrorCollection contains a named Mirror under the name
[name_or_url], then that mirror is returned. Otherwise, [name_or_url]
is assumed to be a mirror URL, and an anonymous mirror with the given
URL is returned.
"""
result = self.get(name_or_url)
if result is None:
            result = Mirror(name_or_url)
return result
def __iter__(self):
return iter(self._mirrors)
def __len__(self):
return len(self._mirrors)
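A filtering sketch (the mirror data is hypothetical): only mirrors whose properties match the requested flags survive construction.

mc = MirrorCollection(
    mirrors={"up": "https://mirror.example.com", "dl": {"url": "s3://b", "binary": False}},
    binary=True,
)
list(mc)  # ["up"]  -- "dl" is excluded because binary filtering is on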

View File

@@ -0,0 +1,258 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import traceback
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.repo
import spack.spec
import spack.util.spack_yaml as syaml
import spack.version
from spack.error import MirrorError
from spack.mirrors.mirror import Mirror, MirrorCollection
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
    Note that if any spec in the original set specifies properties other than
    version, this information will be omitted in the new set; for example, the
    new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
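For example (assuming a package repo with checksummed zlib versions is on the path):

specs = get_all_versions([spack.spec.Spec("zlib")])
# one abstract spec per known version, e.g. [zlib@1.3.1, zlib@1.3, zlib@1.2.13, ...]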
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
        specs: Any package versions matching these specs will be added to the
            mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
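A usage sketch (path and spec are illustrative; fetching requires network access and a package repo):

present, mirrored, errors = create("./my-mirror", [spack.spec.Spec("zlib@1.3.1")])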
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
def create_mirror_from_package_object(
pkg_obj, mirror_cache: "spack.caches.MirrorCache", mirror_stats: MirrorStats
) -> bool:
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
        pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache: mirror where to add the spec.
mirror_stats: statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
max_retries = 3
for num_retries in range(max_retries):
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
break
except Exception as e:
if num_retries + 1 == max_retries:
if spack.config.get("config:debug"):
traceback.print_exc()
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.format("{name}{@version}"), str(e)
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror

View File

@@ -10,7 +10,7 @@
import llnl.util.filesystem
import spack.builder
import spack.phase_callbacks
def filter_compiler_wrappers(*files, **kwargs):
@@ -111,4 +111,4 @@ def _filter_compiler_wrappers_impl(pkg_or_builder):
if pkg.compiler.name == "nag":
x.filter("-Wl,--enable-new-dtags", "", **filter_kwargs)
spack.builder.run_after(after)(_filter_compiler_wrappers_impl)
spack.phase_callbacks.run_after(after)(_filter_compiler_wrappers_impl)

View File

@@ -39,7 +39,7 @@
import llnl.util.filesystem
import llnl.util.tty as tty
from llnl.util.lang import dedupe, memoized
from llnl.util.lang import Singleton, dedupe, memoized
import spack.build_environment
import spack.config
@@ -246,7 +246,7 @@ def _generate_upstream_module_index():
return UpstreamModuleIndex(spack.store.STORE.db, module_indices)
upstream_module_index = llnl.util.lang.Singleton(_generate_upstream_module_index)
upstream_module_index = Singleton(_generate_upstream_module_index)
ModuleIndexEntry = collections.namedtuple("ModuleIndexEntry", ["path", "use_name"])

View File

@@ -16,7 +16,8 @@
import llnl.util.tty as tty
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.oci.opener
import spack.stage
import spack.util.url
@@ -213,7 +214,7 @@ def upload_manifest(
return digest, size
def image_from_mirror(mirror: spack.mirror.Mirror) -> ImageReference:
def image_from_mirror(mirror: spack.mirrors.mirror.Mirror) -> ImageReference:
"""Given an OCI based mirror, extract the URL and image name from it"""
url = mirror.push_url
if not url.startswith("oci://"):
@@ -385,5 +386,8 @@ def make_stage(
# is the `oci-layout` and `index.json` files, which are
# required by the spec.
return spack.stage.Stage(
fetch_strategy, mirror_paths=spack.mirror.OCILayout(digest), name=digest.digest, keep=keep
fetch_strategy,
mirror_paths=spack.mirrors.layout.OCILayout(digest),
name=digest.digest,
keep=keep,
)

View File

@@ -20,7 +20,7 @@
import llnl.util.lang
import spack.config
import spack.mirror
import spack.mirrors.mirror
import spack.parser
import spack.util.web
@@ -367,19 +367,20 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
def credentials_from_mirrors(
domain: str, *, mirrors: Optional[Iterable[spack.mirror.Mirror]] = None
domain: str, *, mirrors: Optional[Iterable[spack.mirrors.mirror.Mirror]] = None
) -> Optional[UsernamePassword]:
"""Filter out OCI registry credentials from a list of mirrors."""
mirrors = mirrors or spack.mirror.MirrorCollection().values()
mirrors = mirrors or spack.mirrors.mirror.MirrorCollection().values()
for mirror in mirrors:
# Prefer push credentials over fetch. Unlikely that those are different
# but our config format allows it.
for direction in ("push", "fetch"):
pair = mirror.get_access_pair(direction)
if pair is None:
pair = mirror.get_credentials(direction).get("access_pair")
if not pair:
continue
url = mirror.get_url(direction)
if not url.startswith("oci://"):
continue

View File

@@ -11,7 +11,7 @@
from os import chdir, environ, getcwd, makedirs, mkdir, remove, removedirs
from shutil import move, rmtree
from spack.error import InstallError
from spack.error import InstallError, NoHeadersError, NoLibrariesError
# Emulate some shell commands for convenience
env = environ
@@ -74,7 +74,7 @@
from spack.build_systems.sourceware import SourcewarePackage
from spack.build_systems.waf import WafPackage
from spack.build_systems.xorg import XorgPackage
from spack.builder import run_after, run_before
from spack.builder import BaseBuilder
from spack.config import determine_number_of_jobs
from spack.deptypes import ALL_TYPES as all_deptypes
from spack.directives import *
@@ -100,15 +100,11 @@
on_package_attributes,
)
from spack.package_completions import *
from spack.phase_callbacks import run_after, run_before
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.util.filesystem import file_command, fix_darwin_install_name, mime_type
from spack.variant import (
any_combination_of,
auto_or_any_combination_of,
conditional,
disjoint_sets,
)
from spack.variant import any_combination_of, auto_or_any_combination_of, disjoint_sets
from spack.version import Version, ver
# These are just here for editor support; they will be replaced when the build env

View File

@@ -32,18 +32,19 @@
from llnl.util.lang import classproperty, memoized
from llnl.util.link_tree import LinkTree
import spack.builder
import spack.compilers
import spack.config
import spack.dependency
import spack.deptypes as dt
import spack.directives
import spack.directives_meta
import spack.error
import spack.fetch_strategy as fs
import spack.hooks
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.multimethod
import spack.patch
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.store
@@ -51,9 +52,10 @@
import spack.util.environment
import spack.util.path
import spack.util.web
import spack.variant
from spack.error import InstallError, NoURLError, PackageError
from spack.filesystem_view import YamlFilesystemView
from spack.install_test import PackageTest, TestSuite
from spack.resource import Resource
from spack.solver.version_order import concretization_version_order
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.package_hash import package_hash
@@ -299,9 +301,9 @@ def determine_variants(cls, objs, version_str):
class PackageMeta(
spack.builder.PhaseCallbacksMeta,
spack.phase_callbacks.PhaseCallbacksMeta,
DetectablePackageMeta,
spack.directives.DirectiveMeta,
spack.directives_meta.DirectiveMeta,
spack.multimethod.MultiMethodMeta,
):
"""
@@ -453,7 +455,7 @@ def _names(when_indexed_dictionary: WhenDict) -> List[str]:
return sorted(all_names)
WhenVariantList = List[Tuple["spack.spec.Spec", "spack.variant.Variant"]]
WhenVariantList = List[Tuple[spack.spec.Spec, spack.variant.Variant]]
def _remove_overridden_vdefs(variant_defs: WhenVariantList) -> None:
@@ -492,41 +494,14 @@ class Hipblas:
i += 1
class RedistributionMixin:
"""Logic for determining whether a Package is source/binary
redistributable.
"""
#: Store whether a given Spec source/binary should not be
#: redistributed.
disable_redistribute: Dict["spack.spec.Spec", "spack.directives.DisableRedistribute"]
# Source redistribution must be determined before concretization
# (because source mirrors work with un-concretized Specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror.
"""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an
installed instance of this package.
"""
for when_spec, disable_redistribute in self.__class__.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass=PackageMeta):
class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
"""This is the superclass for all spack packages.
***The Package class***
@@ -612,16 +587,22 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
# Declare versions dictionary as placeholder for values.
# This allows analysis tools to correctly interpret the class attributes.
versions: dict
dependencies: Dict["spack.spec.Spec", Dict[str, "spack.dependency.Dependency"]]
conflicts: Dict["spack.spec.Spec", List[Tuple["spack.spec.Spec", Optional[str]]]]
resources: Dict[spack.spec.Spec, List[Resource]]
dependencies: Dict[spack.spec.Spec, Dict[str, spack.dependency.Dependency]]
conflicts: Dict[spack.spec.Spec, List[Tuple[spack.spec.Spec, Optional[str]]]]
requirements: Dict[
"spack.spec.Spec", List[Tuple[Tuple["spack.spec.Spec", ...], str, Optional[str]]]
spack.spec.Spec, List[Tuple[Tuple[spack.spec.Spec, ...], str, Optional[str]]]
]
provided: Dict["spack.spec.Spec", Set["spack.spec.Spec"]]
provided_together: Dict["spack.spec.Spec", List[Set[str]]]
patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
variants: Dict["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]
languages: Dict["spack.spec.Spec", Set[str]]
provided: Dict[spack.spec.Spec, Set[spack.spec.Spec]]
provided_together: Dict[spack.spec.Spec, List[Set[str]]]
patches: Dict[spack.spec.Spec, List[spack.patch.Patch]]
variants: Dict[spack.spec.Spec, Dict[str, spack.variant.Variant]]
languages: Dict[spack.spec.Spec, Set[str]]
licenses: Dict[spack.spec.Spec, str]
splice_specs: Dict[spack.spec.Spec, Tuple[spack.spec.Spec, Union[None, str, List[str]]]]
#: Store whether a given Spec source/binary should not be redistributed.
disable_redistribute: Dict[spack.spec.Spec, DisableRedistribute]
#: By default, packages are not virtual
#: Virtual packages override this attribute
@@ -736,11 +717,11 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
test_requires_compiler: bool = False
#: TestSuite instance used to manage stand-alone tests for 1+ specs.
test_suite: Optional["TestSuite"] = None
test_suite: Optional[Any] = None
def __init__(self, spec):
# this determines how the package should be built.
self.spec: "spack.spec.Spec" = spec
self.spec: spack.spec.Spec = spec
# Allow custom staging paths for packages
self.path = None
@@ -758,7 +739,7 @@ def __init__(self, spec):
# init internal variables
self._stage: Optional[StageComposite] = None
self._fetcher = None
self._tester: Optional["PackageTest"] = None
self._tester: Optional[Any] = None
# Set up timing variables
self._fetch_time = 0.0
@@ -808,9 +789,7 @@ def variant_definitions(cls, name: str) -> WhenVariantList:
return defs
@classmethod
def variant_items(
cls,
) -> Iterable[Tuple["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]]:
def variant_items(cls) -> Iterable[Tuple[spack.spec.Spec, Dict[str, spack.variant.Variant]]]:
"""Iterate over ``cls.variants.items()`` with overridden definitions removed."""
# Note: This is quadratic in the average number of variant definitions per name.
# That is likely close to linear in practice, as there are few variants with
@@ -828,7 +807,7 @@ def variant_items(
if filtered_variants_by_name:
yield when, filtered_variants_by_name
def get_variant(self, name: str) -> "spack.variant.Variant":
def get_variant(self, name: str) -> spack.variant.Variant:
"""Get the highest precedence variant definition matching this package's spec.
Arguments:
@@ -1003,6 +982,26 @@ def global_license_file(self):
self.global_license_dir, self.name, os.path.basename(self.license_files[0])
)
# Source redistribution must be determined before concretization (because source mirrors work
# with abstract specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror."""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an installed instance of this
package."""
for when_spec, disable_redistribute in self.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
# NOTE: return type should be Optional[Literal['all', 'specific', 'none']] in
# Python 3.8+, but we still support 3.6.
@property
@@ -1015,7 +1014,7 @@ def keep_werror(self) -> Optional[str]:
* ``"none"``: filter out all ``-Werror*`` flags.
* ``None``: respect the user's configuration (``"none"`` by default).
"""
if self.spec.satisfies("%nvhpc@:23.3") or self.spec.satisfies("%pgi"):
if self.spec.satisfies("%nvhpc@:23.3"):
# Filtering works by replacing -Werror with -Wno-error, but older nvhpc and
# PGI do not understand -Wno-error, so we disable filtering.
return "all"
@@ -1189,10 +1188,10 @@ def _make_resource_stage(self, root_stage, resource):
root=root_stage,
resource=resource,
name=self._resource_stage(resource),
mirror_paths=spack.mirror.default_mirror_layout(
mirror_paths=spack.mirrors.layout.default_mirror_layout(
resource.fetcher, os.path.join(self.name, pretty_resource_name)
),
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
path=self.path,
)
@@ -1204,7 +1203,7 @@ def _make_root_stage(self, fetcher):
# Construct a mirror path (TODO: get this out of package.py)
format_string = "{name}-{version}"
pretty_name = self.spec.format_path(format_string)
mirror_paths = spack.mirror.default_mirror_layout(
mirror_paths = spack.mirrors.layout.default_mirror_layout(
fetcher, os.path.join(self.name, pretty_name), self.spec
)
        # Construct a path where the stage should build.
@@ -1213,7 +1212,7 @@ def _make_root_stage(self, fetcher):
stage = Stage(
fetcher,
mirror_paths=mirror_paths,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
name=stage_name,
path=self.path,
search_fn=self._download_search,
@@ -1352,11 +1351,13 @@ def archive_install_test_log(self):
@property
def tester(self):
import spack.install_test
if not self.spec.versions.concrete:
raise ValueError("Cannot retrieve tester for package without concrete version.")
if not self._tester:
self._tester = PackageTest(self)
self._tester = spack.install_test.PackageTest(self)
return self._tester
@property
@@ -2013,72 +2014,58 @@ def build_system_flags(
"""
return None, None, flags
def setup_run_environment(self, env):
def setup_run_environment(self, env: spack.util.environment.EnvironmentModifications) -> None:
"""Sets up the run environment for a package.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is run. Package authors
env: environment modifications to be applied when the package is run. Package authors
can call methods on it to alter the run environment.
"""
pass
def setup_dependent_run_environment(self, env, dependent_spec):
def setup_dependent_run_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
"""Sets up the run environment of packages that depend on this one.
This is similar to ``setup_run_environment``, but it is used to
modify the run environments of packages that *depend* on this one.
This is similar to ``setup_run_environment``, but it is used to modify the run environment
of a package that *depends* on this one.
This gives packages like Python and others that follow the extension
model a way to implement common environment or run-time settings
for dependencies.
This gives packages like Python and others that follow the extension model a way to
implement common environment or run-time settings for dependencies.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is run.
Package authors can call methods on it to alter the build environment.
env: environment modifications to be applied when the dependent package is run. Package
authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be run. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
dependent_spec: The spec of the dependent package about to be run. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
pass
def setup_dependent_package(self, module, dependent_spec):
"""Set up Python module-scope variables for dependent packages.
def setup_dependent_package(self, module, dependent_spec: spack.spec.Spec) -> None:
"""Set up module-scope global variables for dependent packages.
Called before the install() method of dependents.
Default implementation does nothing, but this can be
overridden by an extendable package to set up the module of
its extensions. This is useful if there are some common steps
to installing all extensions for a certain package.
This function is called when setting up the build and run environments of a DAG.
Examples:
1. Extensions often need to invoke the ``python`` interpreter
from the Python installation being extended. This routine
can put a ``python()`` Executable object in the module scope
for the extension package to simplify extension installs.
1. Extensions often need to invoke the ``python`` interpreter from the Python installation
being extended. This routine can put a ``python`` Executable as a global in the module
scope for the extension package to simplify extension installs.
2. MPI compilers could set some variables in the dependent's
scope that point to ``mpicc``, ``mpicxx``, etc., allowing
them to be called by common name regardless of which MPI is used.
3. BLAS/LAPACK implementations can set some variables
indicating the path to their libraries, since these
paths differ by BLAS/LAPACK implementation.
2. MPI compilers could set some variables in the dependent's scope that point to ``mpicc``,
``mpicxx``, etc., allowing them to be called by common name regardless of which MPI is
used.
Args:
module (spack.package_base.PackageBase.module): The Python ``module``
object of the dependent package. Packages can use this to set
module-scope variables for the dependent to use.
module: The Python ``module`` object of the dependent package. Packages can use this to
set module-scope variables for the dependent to use.
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be built. This allows the extendee (self) to
query the dependent's state. Note that *this*
package's spec is available as ``self.spec``.
dependent_spec: The spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``.
"""
pass
@@ -2105,7 +2092,7 @@ def flag_handler(self, var: FLAG_HANDLER_TYPE) -> None:
# arguments. This is implemented for build system classes where
# appropriate and will otherwise raise a NotImplementedError.
def flags_to_build_system_args(self, flags):
def flags_to_build_system_args(self, flags: Dict[str, List[str]]) -> None:
# Takes flags as a dict name: list of values
if any(v for v in flags.values()):
msg = "The {0} build system".format(self.__class__.__name__)
@@ -2308,10 +2295,6 @@ def rpath_args(self):
"""
return " ".join("-Wl,-rpath,%s" % p for p in self.rpath)
@property
def builder(self):
return spack.builder.create(self)
inject_flags = PackageBase.inject_flags
env_flags = PackageBase.env_flags

View File

@@ -16,7 +16,8 @@
import spack
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.repo
import spack.stage
import spack.util.spack_json as sjson
@@ -329,12 +330,12 @@ def stage(self) -> "spack.stage.Stage":
name = "{0}-{1}".format(os.path.basename(self.url), fetch_digest[:7])
per_package_ref = os.path.join(self.owner.split(".")[-1], name)
mirror_ref = spack.mirror.default_mirror_layout(fetcher, per_package_ref)
mirror_ref = spack.mirrors.layout.default_mirror_layout(fetcher, per_package_ref)
self._stage = spack.stage.Stage(
fetcher,
name=f"{spack.stage.stage_prefix}patch-{fetch_digest}",
mirror_paths=mirror_ref,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
)
return self._stage

Some files were not shown because too many files have changed in this diff.