Compare commits


164 Commits

Author SHA1 Message Date
Gregory Becker
4a595fb216 remove deprecated methods 2024-12-11 10:59:37 -08:00
Gregory Becker
cf3addc69e attach methods to correct class 2024-12-11 10:58:19 -08:00
Gregory Becker
760fbd4bac missing import 2024-12-11 10:51:46 -08:00
Gregory Becker
fbedaa7854 re-add removed methods with deprecation warning 2024-12-11 10:50:08 -08:00
Harmen Stoppels
74a6b61b75 Partially revert f3ccb8e095121d4bccd562f1c851a2373da1b40a 2024-12-09 10:08:28 +01:00
Harmen Stoppels
e995d0543e fix test that held reference to abstract spec 2024-12-09 10:08:28 +01:00
Harmen Stoppels
3827ebb592 remove quotes from quotes type hints 2024-12-09 10:08:28 +01:00
Harmen Stoppels
0239d77842 also break circular import concretize.py and solver/asp.py 2024-12-09 10:08:28 +01:00
Gregory Becker
4e9fe1fc5d concretization: move single-spec concretization logic to spack.concretize
This resolves a circular import issue between spack.spec and spack.concretize.
It requires removing the `Spec.concretize` and `Spec.concretized` methods and
updating all call-sites to use `spack.concretize.concretized` instead.

This will help with potential future efforts to separate AbstractSpec and
ConcreteSpec classes.

New import relationship is `spack.concretize` imports from `spack.spec`, but
not the other way around.
2024-12-09 10:08:28 +01:00
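
A minimal sketch of the call-site migration described above (module and function names come from the commit message; the spec string is illustrative):

```python
import spack.concretize
import spack.spec

abstract = spack.spec.Spec("zlib")

# before: deprecated instance methods on Spec
# concrete = abstract.concretized()

# after: module-level function, so spack.spec no longer imports spack.concretize
concrete = spack.concretize.concretized(abstract)
```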
dependabot[bot]
fc105a1a26 build(deps): bump types-six in /.github/workflows/requirements/style (#47954)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.20241105 to 1.17.0.20241205.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-08 19:34:20 -06:00
Stephen Sachs
8a9e16dc3b aws-pcluster stacks: static spack.yaml (#47918) 2024-12-08 20:26:08 +01:00
eugeneswalker
0b7fc360fa e4s ci: add lammps +rocm (#47929)
* e4s ci: add lammps +rocm

* e4s rocm external stack: add def for rocm-openmp-extras external

* lammps +rocm external rocm has errors, comment out
2024-12-08 10:21:28 -08:00
Wouter Deconinck
79d79969bb celeritas: patch 0.5.0 for geant4@11.3.0: (#47976) 2024-12-08 09:12:14 -05:00
Harmen Stoppels
422f829e4e mirrors: add missing init file (#47977) 2024-12-08 09:31:22 +01:00
Alec Scott
f54c101b44 py-jedi: add v0.19.2 (#47569) 2024-12-07 16:26:31 +01:00
Harmen Stoppels
05acd29f38 extensions.py: remove import of spack.cmd (#47963) 2024-12-07 10:08:04 +01:00
Wouter Deconinck
77e2187e13 coverage.yml: fail_ci_if_error = true (#47731) 2024-12-06 11:01:10 -08:00
Harmen Stoppels
5c88e035f2 directives.py: remove redundant import (#47965) 2024-12-06 19:18:12 +01:00
Harmen Stoppels
94bd7b9afb build_environment: drop off by one fix (#47960) 2024-12-06 17:01:46 +01:00
Stephen Herbener
f181ac199a Upgraded version specs for ECMWF packages: eckit, atlas, ectrans, fckit, fiat (#47749) 2024-12-05 18:46:56 -08:00
Sreenivasa Murthy Kolam
a8da7993ad Bump up the version for rocm-6.2.4 release (#47707)
* Bump up the version for rocm-6.2.4 release
2024-12-05 18:41:02 -08:00
Dom Heinzeller
b808338792 py-uxarray: new package plus dependencies (#47573)
* Add py-param@2.1.1
* Add py-panel@1.5.2
* Add py-bokeh@3.5.2
* New package py-datashader
* New package py-geoviews
* New package py-holoviews
* WIP: new package py-uxarray
* New package py-antimeridian
* New package py-dask-expr
* New package py-spatialpandas
* New package py-hvplot
* Add dependency on py-dask-expr for 'py-dask@2024.3: +dataframe'
* Added all dependencies for py-uxarray; still having problems with py-dask +dataframe / py-dask-expr
* Fix style errors in many packages
* Clean up comments and fix style errors in var/spack/repos/builtin/packages/py-dask-expr/package.py
* In var/spack/repos/builtin/packages/py-dask/package.py: since 2023.8, the dataframe variant requires the array variant
* Fix style errors in py-uxarray package
2024-12-05 18:20:55 -08:00
Massimiliano Culpo
112e47cc23 Don't inject import statements in package recipes
Remove a hack done by RepoLoader, which was injecting an extra
```
from spack.package import *
```
at the beginning of each package.py
2024-12-05 12:48:00 -08:00
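
With the injection gone, each recipe must carry the import itself; a minimal, hypothetical package.py:

```python
# var/spack/repos/builtin/packages/example/package.py
from spack.package import *  # previously injected by RepoLoader, now explicit

class Example(Package):
    """Hypothetical recipe illustrating the now-explicit import."""

    homepage = "https://example.com"
    url = "https://example.com/example-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum
```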
Dom Heinzeller
901cea7a54 Add conflict for pixman with Intel Classic (#47922) 2024-12-05 18:14:57 +01:00
Massimiliano Culpo
c1b2ac549d solver: partition classes related to requirement parsing into their own file (#47915) 2024-12-05 18:10:06 +01:00
Harmen Stoppels
4693b323ac spack.mirror: split into submodules (#47936) 2024-12-05 18:09:08 +01:00
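The renames in the file diffs below all follow one pattern; a hedged before/after sketch:

```python
# before the split:
# import spack.mirror
# collection = spack.mirror.MirrorCollection(binary=True)

# after the split: Mirror and MirrorCollection live in spack.mirrors.mirror,
# while helpers such as add/remove/require_mirror_name moved to spack.mirrors.utils
import spack.mirrors.mirror

collection = spack.mirrors.mirror.MirrorCollection(binary=True)
```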
Kin Fai Tse
1f2a68f2b6 tar: conditionally link iconv (#47933)
* fix broken packages requiring iconv

* tar: -liconv only when libiconv

* Revert "fix broken packages requiring iconv"

This reverts commit 5fa426b52f.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-12-05 10:09:18 -06:00
Juan Miguel Carceller
3fcc38ef04 pandoramonitoring,pandorasdk: change docstrings that are wrong (#47937)
and are copied from the pandorapfa package

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-12-05 08:53:09 -07:00
Harmen Stoppels
22d104d7a9 ci: add bootstrap stack for python@3.6:3.13 (#47719)
Resurrect latest Python 3.6
Add clingo-bootstrap to Gitlab CI.
2024-12-05 10:07:24 +01:00
Todd Gamblin
8b1009a4a0 resource: clean up arguments and typing
- [x] Clean up arguments on the `resource` directive.
- [x] Add type annotations
- [x] Add `resource` to type annotations on `PackageBase`
- [x] Fix up `resource` docstrings

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
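
For context, a hedged example of the directive whose arguments were cleaned up (package name, URL, and checksum are hypothetical):

```python
from spack.package import *

class Example(Package):
    """Hypothetical package using the resource directive."""

    resource(
        name="extra-data",
        url="https://example.com/extra-1.0.tar.gz",
        sha256="0" * 64,    # placeholder checksum
        destination="ext",  # staged relative to the package's source root
        placement="extra",  # directory name the resource expands into
        when="@1.0:",       # only fetched for matching package versions
    )
```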
Todd Gamblin
f54526957a directives: add type annotations to DirectiveMeta class
Some of the class-level annotations were wrong, and some were missing. Annotate all the
functions here and fix the class properties to match what's actually happening.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
175a4bf101 directives: use Type[PackageBase] instead of PackageBase
The first argument to each Spack directive is not a `PackageBase` instance but a
`PackageBase` class object, so fix the type annotations to reflect this.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
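
In other words, a directive receives the class object itself, so annotations now follow this sketch (names are illustrative):

```python
from typing import Type

def _execute_directive(pkg: Type["PackageBase"]) -> None:
    # ``pkg`` is the PackageBase subclass itself, not an instance of it
    assert isinstance(pkg, type)
```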
Todd Gamblin
aa81d59958 directives: don't include Optional in PatchesType
`Optional` shouldn't be part of `PatchesType` -- it's clearer to specify `Optional`
in the methods that need their arguments to be optional.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
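
A sketch of the resulting pattern (the real alias lives in `spack.directives`; this is a simplified approximation):

```python
from typing import List, Optional, Union

# the alias itself no longer wraps everything in Optional
PatchesType = Union[str, List[str]]  # simplified; the real alias also admits Patch objects

def depends_on(spec: str, patches: Optional[PatchesType] = None) -> None:
    # optionality is declared at the call signature, not baked into the alias
    ...
```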
James Taliaferro
6aafefd43d package version: Neovim 0.10.2 (#47925) 2024-12-04 23:17:55 +01:00
Satish Balay
ac82f344bd trilinos@develop: update kokkos dependency (#47838) 2024-12-04 19:53:38 +01:00
Harmen Stoppels
16fd77f9da rust-bootstrap: fix zlib dependency (#47894)
x
2024-12-04 02:28:19 -08:00
Harmen Stoppels
f82554a39b stage.py: improve path to url (#47898) 2024-12-04 09:41:38 +01:00
Massimiliano Culpo
2aaf50b8f7 eigen: remove unnecessary dependency on fortran (#47866) 2024-12-04 08:18:40 +01:00
Mathew Cleveland
b0b9cf15f7 add a '+no_warning' variant to METIS to prevent pervasive warning (#47452)
* add a '+no_warning' variant to metis to prevent pervasive warning
* fix formatting

---------

Co-authored-by: Cleveland <cleveland@lanl.gov>
Co-authored-by: mcourtois <mathieu.courtois@gmail.com>
2024-12-03 17:02:36 -08:00
v
8898e14e69 update py-numl and py-nugraph recipes (#47680)
* update py-numl and py-nugraph recipes

This commit adds the develop branch as a valid option for each of these two packages. To enable this, package tarballs are now retrieved from the GitHub source repository instead of PyPI, and their checksums and the build system have been updated accordingly.

* rename versions "develop" -> "main" to be consistent with branch name
2024-12-03 16:59:33 -08:00
Buldram
63c72634ea nim: add latest versions (#47844)
* nim: add latest versions
  In addition:
  - Create separate build and install phases.
  - Remove koch nimble call as it's redundant with koch tools.
  - Install all additional tools bundled with Nim instead of only Nimble.
* Fix 1.6 version
* nim: add devel
  In addition:
  - Fix build accessing user config/cache
2024-12-03 16:57:59 -08:00
Carson Woods
a7eacd77e3 bug fix: updated warning message to reflect impending v1.0 release (#47887) 2024-12-03 17:16:36 +01:00
Cédric Chevalier
09b7ea0400 Bump Kokkos and Kokkos-kernels to 4.5.00 (#47809)
* Bump Kokkos and Kokkos-kernels to 4.5.00

* petsc@:3.22 add a conflict with this new version of kokkos

* Update kokkos/kokkos-kernel dependency

---------

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-12-03 09:09:25 -07:00
Harmen Stoppels
b31dd46ab8 style.py: do not remove import spack in packages (#47895) 2024-12-03 16:04:18 +01:00
Harmen Stoppels
ad7417dee9 nwchem: add resource, remove patch (#47892)
fixes a build failure due to a broken URL and improves the nwchem build without internet access
2024-12-03 14:09:05 +01:00
Wouter Deconinck
c3de3b0b6f tar: add v1.35 (fix CVEs) (#47426) 2024-12-03 13:26:04 +01:00
Harmen Stoppels
6da9bf226a python: drop nis module also for < 3.13 (#47862)
the nis module was removed in Python 3.13
we had it default to ~nis
no package requires +nis
required dependencies for +nis were missing

so it is better to remove the nis module entirely.
2024-12-03 13:01:08 +01:00
Auriane R.
b3ee954e5b Remove duplicate version (#47880) 2024-12-03 10:14:47 +01:00
napulath
db090b0cad Update package.py (#47885) 2024-12-03 08:24:28 +01:00
Massimiliano Culpo
3a6c361a85 cgns: make fortran dependency optional (#47867) 2024-12-03 06:18:37 +01:00
Adam J. Stewart
bb5bd030d4 py-rasterio: add v1.4.3 (#47881) 2024-12-03 06:10:20 +01:00
dependabot[bot]
b9c60f96ea build(deps): bump pytest from 8.3.3 to 8.3.4 in /lib/spack/docs (#47882)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.3 to 8.3.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.3...8.3.4)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-03 06:07:27 +01:00
Stephen Nicholas Swatman
6b16c64c0e acts dependencies: new versions as of 2024/12/02 (#47787)
* acts dependencies: new versions as of 2024/11/25

This commit adds a new version of detray and two new versions of vecmem.

* acts dependencies: new versions as of 2024/12/02

This commit adds version 38 of ACTS and a new version of detray.
2024-12-02 19:50:25 -06:00
Andrey Perestoronin
3ea970746d add compilers packages (#47877) 2024-12-02 15:53:56 -07:00
Satish Balay
d8f2e080e6 petsc, py-petsc4py: add v3.22.2 (#47845) 2024-12-02 14:21:31 -08:00
Harmen Stoppels
ecb8a48376 libseccomp: python forward compat bound (#47876)
* libseccomp: python forward compat bound

* include 2.5.5

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-12-02 14:59:40 -07:00
Massimiliano Culpo
30176582e4 py-torchvision: add dependency on c (#47873) 2024-12-02 22:23:58 +01:00
Massimiliano Culpo
ac17e8bea4 utf8cpp: move to GitHub, make it a CMake package (#47870) 2024-12-02 14:14:24 -07:00
Massimiliano Culpo
c30c85a99c seacas: add a conditional dependency on fortran (#47871)
* seacas: remove unnecessary dependency on fortran
* seacas: add a conditional dependency on fortran
2024-12-02 13:13:14 -08:00
Michael Schlottke-Lakemper
2ae8eb6686 Update HOHQmesh package with newer versions (#47861) 2024-12-02 12:29:45 -08:00
Jose E. Roman
b5cc5b701c New patch release SLEPc 3.22.2 (#47859) 2024-12-02 12:06:52 -08:00
Wouter Deconinck
8e7641e584 onnx: set CMAKE_CXX_STANDARD to abseil-cpp cxxstd value (#47858) 2024-12-02 11:56:33 -08:00
Weiqun Zhang
e692d401eb amrex: add v24.12 (#47857) 2024-12-02 11:55:08 -08:00
Massimiliano Culpo
99319b1d91 oneapi-level-zero: add dependency on c (#47874) 2024-12-02 12:48:49 -07:00
Satish Balay
839ed9447c trilinos@14.4.0 revert kokkos-kernel dependency - as this breaks builds (#47852) 2024-12-02 11:44:37 -08:00
afzpatel
8e5a040985 ucc: add ROCm and rccl support (#46580) 2024-12-02 20:43:53 +01:00
Stephen Nicholas Swatman
5ddbb1566d benchmark: add version 1.9.1 (#47860)
This commit adds version 1.9.1 of Google Benchmark.
2024-12-02 11:42:38 -08:00
Massimiliano Culpo
eb17680d28 double-conversion: add dependency on c, and c++ (#47869) 2024-12-02 12:38:16 -07:00
Massimiliano Culpo
f4d81be9cf py-torch-nvidia-apex: add dependency on C (#47868) 2024-12-02 20:37:33 +01:00
Massimiliano Culpo
ea5ffe35f5 configuration: set egl as buildable:false (#47865) 2024-12-02 11:33:01 -08:00
Wouter Deconinck
1e37a77e72 mlpack: depends_on py-setuptools (#47828) 2024-12-02 12:04:53 +01:00
Todd Gamblin
29427d3e9e ruff: add v0.8.1 (#47851)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 10:49:47 +01:00
Todd Gamblin
2a2d1989c1 version_types: clean up type hierarchy and add annotations (#47781)
In preparation for adding `when=` to `version()`, I'm cleaning up the types in
`version_types` and making sure the methods here pass `mypy` checks. This started as an
attempt to use `ConcreteVersion` outside of `spack.version` and grew into a larger type
refactor.

The hierarchy now looks like this:

* `VersionType`
  * `ConcreteVersion`
    * `StandardVersion`
    * `GitVersion`
  * `ClosedOpenRange`
  * `VersionList`

Note that the top-level thing can't easily be `Version` as that is a method and it
returns only `ConcreteVersion` right now. I *could* do something fancy with `__new__` to
make `Version` a synonym for the `ConcreteVersion` constructor, which would allow it to
be used as a type. I could also do something similar with `VersionRange` but not sure if
it's worth it just to make these into types.

There are still some places where I think `GitVersion` might not be handled properly,
but I have not attempted to fix those here.

- [x] Add a top-level `VersionType` class that all version types extend from
- [x] Define and document common methods and rich comparisons on `VersionType`
- [x] Replace complicated `Union` types with `VersionType` and `ConcreteVersion` as needed
- [x] Annotate most methods (skipping `__getitem__` and friends as the typing is a pain)
- [x] Fix up the `VersionList` constructor a bit
- [x] Add cases to methods that weren't handling all `VersionType`s
- [x] Rework some places to clarify typing for `mypy`
- [x] Simplify / optimize _next_version
- [x] Make StandardVersion.string a property to enable lazy comparison

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 08:21:07 +01:00
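
The bullet hierarchy above, rendered as class stubs:

```python
class VersionType:
    """Top-level base: common methods and rich comparisons for all version types."""

class ConcreteVersion(VersionType):
    """A single, fully specified version."""

class StandardVersion(ConcreteVersion): ...

class GitVersion(ConcreteVersion): ...

class ClosedOpenRange(VersionType): ...

class VersionList(VersionType): ...
```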
Wouter Deconinck
c6e292f55f py-nbdime: add v3.2.1 (#47445) 2024-11-29 15:59:11 -07:00
teddy
bf5e6b4aaf py-mpi4py: create mpi.cfg file; this file was removed in v4.0.0, but the API is retained #47584
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-29 13:28:21 -06:00
Adam J. Stewart
9760089089 VTK: mark Python version compatibility (#47814)
* VTK: mark Python version compatibility

* VTK 8.2.0 also only supports Python 3.7
2024-11-29 13:04:56 -06:00
dmagdavector
da7c5c551d py-pip: add v23.2.1 -> v24.3.1 (#47753)
* py-pip: update to latest version 24.3.1 (plus some others)

* py-pip: note Python version dependency for new PIP versions
2024-11-29 17:18:19 +01:00
Harmen Stoppels
a575fa8529 gcc: add missing patches from Iain Sandoe's branch (#47843) 2024-11-29 08:10:04 +01:00
Massimiliano Culpo
39a65d88f6 fpm: add a dependency on c, and fortran (#47839)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13871774

Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
2024-11-29 08:07:50 +01:00
Massimiliano Culpo
06ff8c88ac py-torch-sparse: add a dependency on c (#47841)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870876
2024-11-29 08:06:46 +01:00
Massimiliano Culpo
a96b67ce3d miopen-hip: add a dependency on c (#47842)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870957
2024-11-29 07:25:43 +01:00
Harmen Stoppels
67d494fa0b filesystem.py: remove unused md5sum (#47832) 2024-11-28 18:43:21 +01:00
Harmen Stoppels
e37e53cfe8 traverse: add MixedDepthVisitor, use in cmake (#47750)
This visitor accepts the sub-dag of all nodes and unique edges that have
deptype X directly from given roots, or deptype Y transitively for any
of the roots.
2024-11-28 17:48:48 +01:00
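
The cmake change further down in this diff uses the visitor like so (copied from that hunk, where X is BUILD|TEST and Y is LINK; `pkg` is a PackageBase instance assumed in scope):

```python
import spack.deptypes as dt
from spack import traverse

edges = traverse.traverse_topo_edges_generator(
    traverse.with_artificial_edges([pkg.spec]),
    visitor=traverse.MixedDepthVisitor(
        direct=dt.BUILD | dt.TEST, transitive=dt.LINK, key=traverse.by_dag_hash
    ),
    key=traverse.by_dag_hash,
    root=False,
    all_edges=False,  # cover all nodes, not all edges
)
ordered_specs = [edge.spec for edge in edges]
```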
Andrey Perestoronin
cf31d20d4c add new packages (#47817) 2024-11-28 09:49:52 -05:00
Harmen Stoppels
b74db341c8 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2024-11-28 14:57:28 +01:00
Daryl W. Grunau
e88a3f6f85 eospac: version 6.5.12 (#47826)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2024-11-28 12:32:35 +01:00
Massimiliano Culpo
9bd7483e73 Add further C and C++ dependencies to packages (#47821) 2024-11-28 10:50:35 +01:00
Harmen Stoppels
04c76fab63 hip: hints for find_package llvm/clang (#47788)
LLVM can be a transitive link dependency of hip through gl's dependency mesa, which uses it for software rendering.

In this case make sure llvm-amdgpu is found with find_package(LLVM) and
find_package(Clang) by setting LLVM_ROOT and Clang_ROOT.

That makes the patch of find_package's HINTS redundant, so remove it.
It did not work anyway, because CMAKE_PREFIX_PATH has higher precedence
than HINTS.
2024-11-28 10:23:09 +01:00
Adam J. Stewart
ecbf9fcacf py-scooby: add v0.10.0 (#47790) 2024-11-28 10:21:36 +01:00
Victor A. P. Magri
69fb594699 hypre: add a variant to allow using internal lapack functions (#47780) 2024-11-28 10:15:12 +01:00
Howard Pritchard
d28614151f nghttp2: add v1.64.0 (#47800)
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2024-11-28 10:12:41 +01:00
etiennemlb
f1d6af6c94 netlib-scalapack: fix for some clang derivative (cce/rocmcc) (#45434) 2024-11-28 09:25:33 +01:00
Adam J. Stewart
192821f361 py-river: mark numpy 2 compatibility (#47813) 2024-11-28 09:24:21 +01:00
Adam J. Stewart
18790ca397 py-pyvista: VTK 9.4 not yet supported (#47815) 2024-11-28 09:23:41 +01:00
BOUDAOUD34
c22d77a38e dbcsr: patch for resolving .mod file conflicts in ROCm by implementing USE, INTRINSIC (#46181)
Co-authored-by: U-PALLAS\boudaoud <boudaoud@pc44.pallas.cines.fr>
2024-11-28 09:20:48 +01:00
Tom Payerle
d82bdb3bf7 seacas: update recipe to find faodel (#40239)
Explicitly sets the CMake variables Faodel_INCLUDE_DIRS and Faodel_LIBRARY_DIRS when +faodel.
This seems to be needed for recent versions of seacas (seacas@2021-04-02:), but should be safe
to do for all versions.

For Faodel_INCLUDE_DIRS, it looks like Faodel has header files under $(Faodel_Prefix)/include/faodel,
but seacas is not including the "faodel" part in #includes. So add both $(Faodel_Prefix)/include
and $(Faodel_Prefix)/include/faodel

Co-authored-by: payerle <payerle@users.noreply.github.com>
2024-11-28 09:17:44 +01:00
Matt Thompson
a042bdfe0b mapl: add hpcx-mpi (#47793) 2024-11-28 09:15:25 +01:00
Adam J. Stewart
60e3e645e8 py-joblib: add v1.4.2 (#47789) 2024-11-28 08:28:44 +01:00
Chris Marsh
51785437bc Patch to fix building gcc@14.2 on darwin. Fixes #45628 (#47830) 2024-11-27 20:58:18 -07:00
dependabot[bot]
2e8db0815d build(deps): bump docker/build-push-action from 6.9.0 to 6.10.0 (#47819)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.9.0 to 6.10.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](4f58ea7922...48aba3b46d)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-27 16:29:41 -07:00
George Malerbo
8a6428746f raylib: add v5.5 (#47708)
* Add version 5.5 in package.py

* Update package.py
2024-11-27 16:25:22 -07:00
Adam J. Stewart
6b9c099af8 py-keras: add v3.7.0 (#47816) 2024-11-27 16:12:47 -07:00
Derek Ryan Strong
30814fb4e0 Deprecate rsync releases before v3.2.5 (#47820) 2024-11-27 16:14:34 -06:00
Harmen Stoppels
3194be2e92 gcc-runtime: remove libz.so from libgfortran.so if present (#47812) 2024-11-27 22:32:37 +01:00
snehring
41be2f5899 ltr-retriever: changing directory layout (#38513) 2024-11-27 14:16:57 -07:00
kwryankrattiger
02af41ebb3 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2024-11-27 15:13:55 -06:00
snehring
9d33c89030 r-rsamtools: add -lz to Makevars (#38649) 2024-11-27 13:44:48 -07:00
Erik Heeren
51ab7bad3b julia: conflict for %gcc@12: support (#35931) 2024-11-27 04:31:44 -07:00
kwryankrattiger
0b094f2473 Docs: Reference 7z requirement on Windows (#35943) 2024-11-26 17:11:12 -05:00
Christoph Junghans
cd306d0bc6 all-libary: add voronoi support and git version (#47798)
* all-libary: add voronoi support and git version

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-26 14:56:22 -07:00
Dom Heinzeller
fdb9cf2412 Intel/oneapi compilers: correct version ranges for diab-disable flag (#47428)
* C/C++ flags should have been modified for all 2023.x.y versions, but the
  upper bound was too low
* Fortran flags should have been modified for all 2024.x.y versions, but
  likewise the upper bound was too low
2024-11-26 12:34:37 -07:00
etiennemlb
a546441d2e siesta: remove link args on a non-declared dependency (#46080) 2024-11-26 20:25:04 +01:00
IHuismann
141cdb6810 adol-c: fix libs property (#36614) 2024-11-26 17:01:18 +01:00
Brian Van Essen
f2ab74efe5 cray: add further versions of Cray packages. (#37733) 2024-11-26 16:59:53 +01:00
Martin Aumüller
38b838e405 openscenegraph: remove X11 dependencies for macos (#39037) 2024-11-26 16:59:10 +01:00
Mark Abraham
c037188b59 gromacs: announce deprecation policy and start to implement (#47804)
* gromacs: announce deprecation policy and start to implement

* Style it up

* [@spackbot] updating style on behalf of mabraham

* Bump versions used in CI

---------

Co-authored-by: mabraham <mabraham@users.noreply.github.com>
2024-11-26 05:54:07 -07:00
Mark Abraham
0835a3c5f2 gromacs: obtain SYCL from either ACpp or intel-oneapi-runtime (#47806) 2024-11-26 05:51:54 -07:00
Mark Abraham
38a2f9c2f2 gromacs: Improve HeFFTe dependency (#47805)
GROMACS supports HeFFTe with either a SYCL or a CUDA build and requires
a matching HeFFTe build.
2024-11-26 05:50:41 -07:00
Massimiliano Culpo
eecd4afe58 gromacs: fix the value used for the ITT directory (#47795) 2024-11-26 08:14:45 +01:00
Seth R. Johnson
83624551e0 ROOT: default to +aqua~x on macOS (#47792) 2024-11-25 14:27:38 -06:00
Victor A. P. Magri
741652caa1 caliper: add "tools" variant (#47779) 2024-11-25 18:26:53 +01:00
Mark Abraham
8e914308f0 gromacs: add itt variant (#47764)
Permit configuring GROMACS with support for mdrun to trace its timing
regions by calling the ITT API. This allows tools like VTune and
unitrace to augment their analysis with GROMACS-specific annotations.
2024-11-25 16:12:55 +01:00
Mikael Simberg
3c220d0989 apex: add 2.7.0 (#47736) 2024-11-25 13:22:16 +01:00
Wouter Deconinck
8094fa1e2f py-gradio: add v5.1.0 (and add/update dependencies) (fix CVEs) (#47504)
* py-pdm-backend: add v2.4.3
* py-starlette: add v0.28.0, v0.32.0, v0.35.1, v0.36.3, v0.37.2, v0.41.2
* py-fastapi: add v0.110.2, v0.115.4
* py-pydantic-extra-types: add v2.10.0
* py-pydantic-settings: add v2.6.1
* py-python-multipart: add v0.0.17
* py-email-validator: add v2.2.0
2024-11-25 13:07:56 +01:00
Massimiliano Culpo
5c67051980 Add missing C/C++ dependencies (#47782) 2024-11-25 12:56:39 +01:00
John W. Parent
c01fb9a6d2 Add CMake 3.31 minor release (#47676) 2024-11-25 04:32:57 -07:00
Harmen Stoppels
bf12bb57e7 install_test: first look at builder, then package (#47735) 2024-11-25 11:53:28 +01:00
Wouter Deconinck
406c73ae11 py-boto*: add v1.34.162 (#47528)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:39:09 +01:00
Wouter Deconinck
3f50ccfcdd py-azure-*: updated versions (#47525) 2024-11-25 11:38:49 +01:00
Wouter Deconinck
9883a2144d py-quart: add v0.19.8 (#47508)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:38:22 +01:00
Wouter Deconinck
94815d2227 py-netifaces: add v0.10.9, v0.11.0 (#47451) 2024-11-25 11:37:41 +01:00
Wouter Deconinck
a15563f890 py-werkzeug: add v3.1.3 (and deprecate older versions) (#47507)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:28:01 +01:00
Wouter Deconinck
ac2ede8d2f py-pyzmq: add v25.1.2, v26.0.3, v26.1.1, v26.2.0 (switch to py-scikit-build-core) (#44493) 2024-11-25 11:00:00 +01:00
david-edwards-linaro
b256a7c50d linaro-forge: remove v21.1.3 (#47688) 2024-11-25 10:53:27 +01:00
Szabolcs Horvát
21e10d6d98 igraph: add v0.10.15 (#47692) 2024-11-25 10:51:24 +01:00
afzpatel
ed39967848 rocm-tensile: add 6.2.1 (#47702) 2024-11-25 10:40:21 +01:00
Alex Richert
eda0c6888e ip: add cmake version requirement for @5.1: (#47754) 2024-11-25 02:38:08 -07:00
pauleonix
66055f903c cuda: Add v12.6.3 (#47721) 2024-11-25 10:36:11 +01:00
Dave Keeshan
a1c57d86c3 libusb: add v1.0.23:1.0.27 (#47727) 2024-11-25 10:33:08 +01:00
Dave Keeshan
9da8dcae97 verible: add v0.0.3841 (#47729) 2024-11-25 10:30:48 +01:00
jflrichard
c93f223a73 postgis: add version 3.1.2 (#47743) 2024-11-25 10:24:03 +01:00
Wouter Deconinck
f1faf31735 build-containers: determine latest release tag and push that as latest (#47742) 2024-11-25 10:20:58 +01:00
Stephen Herbener
8957ef0df5 Updated version specs for bufr-query package. (#47752) 2024-11-25 10:14:16 +01:00
Veselin Dobrev
347ec87fc5 mfem: add logic for the C++ standard level when using rocPRIM (#47751) 2024-11-25 10:13:22 +01:00
Adam J. Stewart
cd8c46e54e py-ruff: add v0.8.0 (#47758) 2024-11-25 10:02:31 +01:00
Adam J. Stewart
75b03bc12f glib: add v2.82.2 (#47766) 2024-11-24 20:55:18 +01:00
Adam J. Stewart
58511a3352 py-pandas: correct Python version constraint (#47770) 2024-11-24 17:48:16 +01:00
Adam J. Stewart
325873a4c7 py-fsspec: add v2024.10.0 (#47778) 2024-11-24 15:42:30 +01:00
Adam J. Stewart
9156e4be04 awscli-v2: add v2.22.4 (#47765) 2024-11-24 15:42:06 +01:00
Adam J. Stewart
12d3abc736 py-pytz: add v2024.2 (#47777) 2024-11-24 15:40:45 +01:00
Adam J. Stewart
4208aa6291 py-torchvision: add Python 3.13 support (#47776) 2024-11-24 15:40:11 +01:00
Adam J. Stewart
0bad754e23 py-scikit-learn: add Python 3.13 support (#47775) 2024-11-24 15:39:36 +01:00
Adam J. Stewart
cde2620f41 py-safetensors: add v0.4.5 (#47774) 2024-11-24 15:38:05 +01:00
Adam J. Stewart
a35aa038b0 py-pystac: add support for Python 3.12+ (#47772) 2024-11-24 15:37:43 +01:00
Adam J. Stewart
150416919e py-pydantic-core: add v2.27.1 (#47771) 2024-11-24 15:37:06 +01:00
Adam J. Stewart
281c274e0b py-jupyter-packaging: add Python 3.13 support (#47769) 2024-11-24 15:31:31 +01:00
Adam J. Stewart
16e130ece1 py-cryptography: mark Python 3.13 support (#47768) 2024-11-24 15:31:08 +01:00
Adam J. Stewart
7586303fba py-ruamel-yaml-clib: add Python compatibility bounds (#47773) 2024-11-24 15:28:45 +01:00
Harmen Stoppels
6501880fbf py-node-env: add v1.9.1 (#47762) 2024-11-24 15:27:16 +01:00
Harmen Stoppels
c76098038c py-ipykernel: split forward and backward compat bounds (#47763) 2024-11-24 15:26:15 +01:00
Harmen Stoppels
124b616b27 add a few forward compat bounds with python (#47761) 2024-11-24 15:23:11 +01:00
Adam J. Stewart
1148c8f195 gobject-introspection: Python 3.12 still not supported (#47767) 2024-11-24 03:53:32 -07:00
Adam J. Stewart
c57452dd08 py-cffi: support Python 3.12+ (#47713) 2024-11-24 08:41:29 +01:00
Harmen Stoppels
a7e57c9a14 py-opt-einsum: add v3.4.0 (#47759) 2024-11-24 08:40:29 +01:00
381 changed files with 5229 additions and 3217 deletions

View File

@@ -57,6 +57,12 @@ jobs:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Determine latest release tag
id: latest
run: |
git fetch --quiet --tags
echo "tag=$(git tag --list --sort=-v:refname | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1)" | tee -a $GITHUB_OUTPUT
- uses: docker/metadata-action@369eb591f429131d6889c46b94e711f089e6ca96
id: docker_meta
with:
@@ -71,6 +77,7 @@ jobs:
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
type=raw,value=latest,enable=${{ github.ref == format('refs/tags/{0}', steps.latest.outputs.tag) }}
- name: Generate the Dockerfile
env:
@@ -113,7 +120,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@4f58ea79222b3b9dc2c8bbdd6debcef730109a75
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -32,3 +32,4 @@ jobs:
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
with:
verbose: true
fail_ci_if_error: true

View File

@@ -3,5 +3,5 @@ clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20241105
types-six==1.17.0.20241205
vermin==1.6.0

View File

@@ -76,6 +76,8 @@ packages:
buildable: false
cray-mvapich2:
buildable: false
egl:
buildable: false
fujitsu-mpi:
buildable: false
hpcx-mpi:

View File

@@ -1326,6 +1326,7 @@ Required:
* Microsoft Visual Studio
* Python
* Git
* 7z
Optional:
* Intel Fortran (needed for some packages)
@@ -1391,6 +1392,13 @@ as the project providing Git support on Windows. This is additionally the recomm
for installing Git on Windows, a link to which can be found above. Spack requires the
utilities vendored by this project.
"""
7zip
"""
A tool for extracting ``.xz`` files is required for extracting source tarballs. The latest 7zip
can be located at https://sourceforge.net/projects/sevenzip/.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 2: Install and setup Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -6,7 +6,7 @@ python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.18.0
urllib3==2.2.3
pytest==8.3.3
pytest==8.3.4
isort==5.13.2
black==24.10.0
flake8==7.1.1

View File

@@ -24,6 +24,7 @@
Callable,
Deque,
Dict,
Generator,
Iterable,
List,
Match,
@@ -2772,22 +2773,6 @@ def prefixes(path):
return paths
@system_path_filter
def md5sum(file):
"""Compute the MD5 sum of a file.
Args:
file (str): file to be checksummed
Returns:
MD5 sum of the file's content
"""
md5 = hashlib.md5()
with open(file, "rb") as f:
md5.update(f.read())
return md5.digest()
@system_path_filter
def remove_directory_contents(dir):
"""Remove all contents of a directory."""
@@ -2838,6 +2823,25 @@ def temporary_dir(
remove_directory_contents(tmp_dir)
@contextmanager
def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
"""Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
for functions or external tools that do not support in-place editing. Notice that this function
is unsafe in that it works with paths instead of a file descriptors, but this is by design,
since we assume the call site will create a new inode at the same path."""
tmp_fd, tmp_path = tempfile.mkstemp(
dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
)
# windows cannot replace a file with open fds, so close since the call site needs to replace.
os.close(tmp_fd)
try:
shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
yield tmp_path
shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
finally:
os.unlink(tmp_path)
def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
"""Create a small summary of the given file. Does not error
when file does not exist.

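A hedged usage sketch for the new context manager (``run_external_tool`` is hypothetical; the codesign hunk below is the real call site):

```python
import llnl.util.filesystem as fsys

def rewrite_preserving_inode(path: str) -> None:
    # run the tool on a temporary copy; copying the result back afterwards
    # keeps the original inode, and therefore any hardlinks, of ``path``
    with fsys.edit_in_place_through_temporary_file(path) as tmp:
        run_external_tool(tmp)  # hypothetical tool that rewrites its input file
```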
View File

@@ -693,19 +693,19 @@ def invalid_sha256_digest(fetcher):
return h, True
return None, False
error_msg = "Package '{}' does not use sha256 checksum".format(pkg_name)
error_msg = f"Package '{pkg_name}' does not use sha256 checksum"
details = []
for v, args in pkg.versions.items():
fetcher = spack.fetch_strategy.for_package_version(pkg, v)
digest, is_bad = invalid_sha256_digest(fetcher)
if is_bad:
details.append("{}@{} uses {}".format(pkg_name, v, digest))
details.append(f"{pkg_name}@{v} uses {digest}")
for _, resources in pkg.resources.items():
for resource in resources:
digest, is_bad = invalid_sha256_digest(resource.fetcher)
if is_bad:
details.append("Resource in '{}' uses {}".format(pkg_name, digest))
details.append(f"Resource in '{pkg_name}' uses {digest}")
if details:
errors.append(error_cls(error_msg, details))

View File

@@ -40,7 +40,7 @@
import spack.hash_types as ht
import spack.hooks
import spack.hooks.sbang
import spack.mirror
import spack.mirrors.mirror
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
@@ -369,7 +369,7 @@ def update(self, with_cooldown=False):
on disk under ``_index_cache_root``)."""
self._init_local_index_cache()
configured_mirror_urls = [
m.fetch_url for m in spack.mirror.MirrorCollection(binary=True).values()
m.fetch_url for m in spack.mirrors.mirror.MirrorCollection(binary=True).values()
]
items_to_remove = []
spec_cache_clear_needed = False
@@ -1176,7 +1176,7 @@ def _url_upload_tarball_and_specfile(
class Uploader:
def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool):
def __init__(self, mirror: spack.mirrors.mirror.Mirror, force: bool, update_index: bool):
self.mirror = mirror
self.force = force
self.update_index = update_index
@@ -1224,7 +1224,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class OCIUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
base_image: Optional[str],
@@ -1273,7 +1273,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class URLUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
signing_key: Optional[str],
@@ -1297,7 +1297,7 @@ def push(
def make_uploader(
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool = False,
update_index: bool = False,
signing_key: Optional[str] = None,
@@ -1953,9 +1953,9 @@ def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=No
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
configured_mirrors: Iterable[spack.mirrors.mirror.Mirror] = (
spack.mirrors.mirror.MirrorCollection(binary=True).values()
)
if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.")
@@ -1980,7 +1980,7 @@ def fetch_url_to_mirror(url):
for mirror in configured_mirrors:
if mirror.fetch_url == url:
return mirror
return spack.mirror.Mirror(url)
return spack.mirrors.mirror.Mirror(url)
mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]
@@ -2334,7 +2334,9 @@ def is_backup_file(file):
if not codesign:
return
for binary in changed_files:
codesign("-fs-", binary)
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
@@ -2648,7 +2650,7 @@ def try_direct_fetch(spec, mirrors=None):
specfile_is_signed = False
found_specs = []
binary_mirrors = spack.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
binary_mirrors = spack.mirrors.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
for mirror in binary_mirrors:
buildcache_fetch_url_json = url_util.join(
@@ -2709,7 +2711,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
if spec is None:
return []
if not spack.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
if not spack.mirrors.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
tty.debug("No Spack mirrors are currently configured")
return {}
@@ -2748,7 +2750,7 @@ def clear_spec_cache():
def get_keys(install=False, trust=False, force=False, mirrors=None):
"""Get pgp public keys available on mirror with suffix .pub"""
mirror_collection = mirrors or spack.mirror.MirrorCollection(binary=True)
mirror_collection = mirrors or spack.mirrors.mirror.MirrorCollection(binary=True)
if not mirror_collection:
tty.die("Please add a spack mirror to allow " + "download of build caches.")
@@ -2803,7 +2805,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
def _url_push_keys(
*mirrors: Union[spack.mirror.Mirror, str],
*mirrors: Union[spack.mirrors.mirror.Mirror, str],
keys: List[str],
tmpdir: str,
update_index: bool = False,
@@ -2870,7 +2872,7 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None):
"""
rebuilds = {}
for mirror in spack.mirror.MirrorCollection(mirrors, binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(mirrors, binary=True).values():
tty.debug("Checking for built specs at {0}".format(mirror.fetch_url))
rebuild_list = []
@@ -2914,7 +2916,7 @@ def _download_buildcache_entry(mirror_root, descriptions):
def download_buildcache_entry(file_descriptions, mirror_url=None):
if not mirror_url and not spack.mirror.MirrorCollection(binary=True):
if not mirror_url and not spack.mirrors.mirror.MirrorCollection(binary=True):
tty.die(
"Please provide or add a spack mirror to allow " + "download of buildcache entries."
)
@@ -2923,7 +2925,7 @@ def download_buildcache_entry(file_descriptions, mirror_url=None):
mirror_root = os.path.join(mirror_url, BUILD_CACHE_RELATIVE_PATH)
return _download_buildcache_entry(mirror_root, file_descriptions)
for mirror in spack.mirror.MirrorCollection(binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True).values():
mirror_root = os.path.join(mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH)
if _download_buildcache_entry(mirror_root, file_descriptions):

View File

@@ -35,9 +35,10 @@
from llnl.util.lang import GroupedExceptionHandler
import spack.binary_distribution
import spack.concretize
import spack.config
import spack.detection
import spack.mirror
import spack.mirrors.mirror
import spack.platforms
import spack.spec
import spack.store
@@ -91,7 +92,7 @@ def __init__(self, conf: ConfigDictionary) -> None:
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
# Promote (relative) paths to file urls
self.url = spack.mirror.Mirror(conf["info"]["url"]).fetch_url
self.url = spack.mirrors.mirror.Mirror(conf["info"]["url"]).fetch_url
@property
def mirror_scope(self) -> spack.config.InternalConfigScope:
@@ -271,10 +272,10 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
concrete_spec = bootstrapper.concretize()
else:
concrete_spec = spack.spec.Spec(
abstract_spec = spack.spec.Spec(
abstract_spec_str + " ^" + spec_for_current_python()
)
concrete_spec.concretize()
concrete_spec = spack.concretize.concretized(abstract_spec)
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
tty.debug(msg.format(module, abstract_spec_str))
@@ -300,7 +301,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
# might reduce compilation time by a fair amount
_add_externals_if_missing()
concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
concrete_spec = spack.concretize.concretized(spack.spec.Spec(abstract_spec_str))
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):

View File

@@ -1426,27 +1426,20 @@ def make_stack(tb, stack=None):
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
filename = inspect.getfile(frame.f_code)
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1
lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
lines = [f"{filename}:{frame.f_lineno}, in {frame.f_code.co_name}:"]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
fun_lineno = lineno - start
fun_lineno = frame.f_lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
for i, line in enumerate(sourcelines):
is_error = start_ctx + i == fun_lineno
mark = ">> " if is_error else " "
# Add start to get lineno relative to start of file, not function.
marked = " {0}{1:-6d}{2}".format(mark, start + start_ctx + i, line.rstrip())
marked = f" {'>> ' if is_error else ' '}{start + start_ctx + i:-6d}{line.rstrip()}"
if is_error:
marked = colorize("@R{%s}" % cescape(marked))
lines.append(marked)

View File

@@ -9,7 +9,7 @@
import re
import sys
from itertools import chain
from typing import Any, List, Optional, Set, Tuple
from typing import Any, List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util.lang import stable_partition
@@ -21,6 +21,7 @@
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack import traverse
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
@@ -166,15 +167,18 @@ def _values(x):
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
attribute of direct build/test and transitive link dependencies."""
# Add direct build/test deps
selected: Set[str] = {s.dag_hash() for s in pkg.spec.dependencies(deptype=dt.BUILD | dt.TEST)}
# Add transitive link deps
selected.update(s.dag_hash() for s in pkg.spec.traverse(root=False, deptype=dt.LINK))
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition(
(s for s in pkg.spec.traverse(root=False, order="topo") if s.dag_hash() in selected),
lambda x: x.external,
edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges([pkg.spec]),
visitor=traverse.MixedDepthVisitor(
direct=dt.BUILD | dt.TEST, transitive=dt.LINK, key=traverse.by_dag_hash
),
key=traverse.by_dag_hash,
root=False,
all_edges=False, # cover all nodes, not all edges
)
ordered_specs = [edge.spec for edge in edges]
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition((s for s in ordered_specs), lambda x: x.external)
return filter_system_paths(
path for spec in chain(spack_built, externals) for path in spec.package.cmake_prefix_paths

View File

@@ -37,7 +37,8 @@
import spack.config as cfg
import spack.error
import spack.main
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.paths
import spack.repo
import spack.spec
@@ -204,7 +205,7 @@ def _print_staging_summary(spec_labels, stages, rebuild_decisions):
if not stages:
return
mirrors = spack.mirror.MirrorCollection(binary=True)
mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
tty.msg("Checked the following mirrors for binaries:")
for m in mirrors.values():
tty.msg(f" {m.fetch_url}")
@@ -797,7 +798,7 @@ def ensure_expected_target_path(path):
path = path.replace("\\", "/")
return path
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
raise SpackCIError("spack ci generate requires a mirror named 'buildcache-destination'")
@@ -1323,7 +1324,7 @@ def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: b
"""
tty.debug(f"Pushing to build cache ({'signed' if sign_binaries else 'unsigned'})")
signing_key = bindist.select_signing_key() if sign_binaries else None
mirror = spack.mirror.Mirror.from_url(mirror_url)
mirror = spack.mirrors.mirror.Mirror.from_url(mirror_url)
try:
with bindist.make_uploader(mirror, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])
@@ -1343,7 +1344,7 @@ def remove_other_mirrors(mirrors_to_keep, scope=None):
mirrors_to_remove.append(name)
for mirror_name in mirrors_to_remove:
spack.mirror.remove(mirror_name, scope)
spack.mirrors.utils.remove(mirror_name, scope)
def copy_files_to_artifacts(src, artifacts_dir):

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import difflib
import importlib
import os
import re
@@ -125,6 +126,8 @@ def get_module(cmd_name):
tty.debug("Imported {0} from built-in commands".format(pname))
except ImportError:
module = spack.extensions.get_module(cmd_name)
if not module:
raise CommandNotFoundError(cmd_name)
attr_setdefault(module, SETUP_PARSER, lambda *args: None) # null-op
attr_setdefault(module, DESCRIPTION, "")
@@ -196,7 +199,7 @@ def _concretize_spec_pairs(to_concretize, tests=False):
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or abstract.concretized()]
return [concrete or spack.concretize.concretized(abstract)]
# Special case if every spec is either concrete or has an abstract hash
if all(
@@ -248,9 +251,9 @@ def matching_spec_from_env(spec):
"""
env = ev.active_environment()
if env:
return env.matching_spec(spec) or spec.concretized()
return env.matching_spec(spec) or spack.concretize.concretized(spec)
else:
return spec.concretized()
return spack.concretize.concretized(spec)
def matching_specs_from_env(specs):
@@ -691,3 +694,24 @@ def find_environment(args):
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split("\n")[0]
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
f"{cmd_name} is not a recognized Spack command or extension command; "
"check with `spack commands`."
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
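The suggestion logic is plain `difflib`; a quick illustration with a truncated command list:

```python
import difflib

all_commands = ["install", "uninstall", "info", "list"]  # illustrative subset
print(difflib.get_close_matches("instal", all_commands))
# ['install', 'uninstall']
```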

View File

@@ -15,8 +15,9 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.concretize
import spack.config
import spack.mirror
import spack.mirrors.utils
import spack.spec
import spack.stage
import spack.util.path
@@ -398,9 +399,9 @@ def _mirror(args):
llnl.util.tty.msg(msg.format(spec_str, mirror_dir))
# Suppress tty from the call below for terser messages
llnl.util.tty.set_msg_enabled(False)
spec = spack.spec.Spec(spec_str).concretized()
spec = spack.concretize.concretized(spack.spec.Spec(spec_str))
for node in spec.traverse():
spack.mirror.create(mirror_dir, [node])
spack.mirrors.utils.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)
if args.binary_packages:

View File

@@ -17,11 +17,12 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.mirror
import spack.mirrors.mirror
import spack.oci.oci
import spack.spec
import spack.stage
@@ -392,7 +393,7 @@ def push_fn(args):
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
mirror = args.mirror
assert isinstance(mirror, spack.mirror.Mirror)
assert isinstance(mirror, spack.mirrors.mirror.Mirror)
push_url = mirror.push_url
@@ -555,8 +556,7 @@ def check_fn(args: argparse.Namespace):
tty.msg("No specs provided, exiting.")
return
for spec in specs:
spec.concretize()
specs = [spack.concretize.concretized(s) for s in specs]
# Next see if there are any configured binary mirrors
configured_mirrors = spack.config.get("mirrors", scope=args.scope)
@@ -624,7 +624,7 @@ def save_specfile_fn(args):
root = specs[0]
if not root.concrete:
root.concretize()
root = spack.concretize.concretized(root)
save_dependency_specfiles(
root, args.specfile_dir, dependencies=spack.cmd.parse_specs(args.specs)
@@ -750,7 +750,7 @@ def manifest_copy(manifest_file_list, dest_mirror=None):
copy_buildcache_file(copy_file["src"], dest)
def update_index(mirror: spack.mirror.Mirror, update_keys=False):
def update_index(mirror: spack.mirrors.mirror.Mirror, update_keys=False):
# Special case OCI images for now.
try:
image_ref = spack.oci.oci.image_from_mirror(mirror)

View File

@@ -20,7 +20,7 @@
import spack.config as cfg
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.mirrors.mirror
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
@@ -240,7 +240,7 @@ def ci_reindex(args):
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
mirror = spack.mirror.Mirror(remote_mirror_url)
mirror = spack.mirrors.mirror.Mirror(remote_mirror_url)
buildcache.update_index(mirror, update_keys=True)
@@ -328,7 +328,7 @@ def ci_rebuild(args):
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
tty.die("spack ci rebuild requires a mirror named 'buildcache-destination")

View File

@@ -14,7 +14,8 @@
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.reporters
import spack.spec
import spack.store
@@ -689,31 +690,31 @@ def mirror_name_or_url(m):
# If there's a \ or / in the name, it's interpreted as a path or url.
if "/" in m or "\\" in m or m in (".", ".."):
return spack.mirror.Mirror(m)
return spack.mirrors.mirror.Mirror(m)
# Otherwise, the named mirror is required to exist.
try:
return spack.mirror.require_mirror_name(m)
return spack.mirrors.utils.require_mirror_name(m)
except ValueError as e:
raise argparse.ArgumentTypeError(f"{e}. Did you mean {os.path.join('.', m)}?") from e
def mirror_url(url):
try:
return spack.mirror.Mirror.from_url(url)
return spack.mirrors.mirror.Mirror.from_url(url)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_directory(path):
try:
return spack.mirror.Mirror.from_local_path(path)
return spack.mirrors.mirror.Mirror.from_local_path(path)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_name(name):
try:
return spack.mirror.require_mirror_name(name)
return spack.mirrors.utils.require_mirror_name(name)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e

View File

@@ -19,6 +19,7 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.concretize
import spack.environment as ev
import spack.installer
import spack.store
@@ -104,7 +105,7 @@ def deprecate(parser, args):
)
if args.install:
deprecator = specs[1].concretized()
deprecator = spack.concretize.concretized(specs[1])
else:
deprecator = spack.cmd.disambiguate_spec(specs[1], env, local=True)

View File

@@ -11,6 +11,7 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments
import spack.concretize
import spack.config
import spack.repo
from spack.cmd.common import arguments
@@ -115,7 +116,7 @@ def dev_build(self, args):
# Forces the build to run out of the source directory.
spec.constrain("dev_path=%s" % source_path)
spec.concretize()
spec = spack.concretize.concretized(spec)
if spec.installed:
tty.error("Already installed in %s" % spec.prefix)

View File

@@ -8,7 +8,7 @@
import tempfile
import spack.binary_distribution
import spack.mirror
import spack.mirrors.mirror
import spack.paths
import spack.stage
import spack.util.gpg
@@ -217,11 +217,11 @@ def gpg_publish(args):
mirror = None
if args.directory:
url = spack.util.url.path_to_file_url(args.directory)
mirror = spack.mirror.Mirror(url, url)
mirror = spack.mirrors.mirror.Mirror(url, url)
elif args.mirror_name:
mirror = spack.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
mirror = spack.mirrors.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
elif args.mirror_url:
mirror = spack.mirror.Mirror(args.mirror_url, args.mirror_url)
mirror = spack.mirrors.mirror.Mirror(args.mirror_url, args.mirror_url)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
spack.binary_distribution._url_push_keys(

View File

@@ -14,6 +14,7 @@
from llnl.util import lang, tty
import spack.cmd
import spack.concretize
import spack.config
import spack.environment as ev
import spack.paths
@@ -451,7 +452,7 @@ def concrete_specs_from_file(args):
else:
s = spack.spec.Spec.from_json(f)
concretized = s.concretized()
concretized = spack.concretize.concretized(s)
if concretized.dag_hash() != s.dag_hash():
msg = 'skipped invalid file "{0}". '
msg += "The file does not contain a concrete spec."

View File

@@ -8,6 +8,7 @@
from llnl.path import convert_to_posix_path
import spack.concretize
import spack.paths
import spack.util.executable
from spack.spec import Spec
@@ -66,8 +67,7 @@ def make_installer(parser, args):
"""
if sys.platform == "win32":
output_dir = args.output_dir
cmake_spec = Spec("cmake")
cmake_spec.concretize()
cmake_spec = spack.concretize.concretized(Spec("cmake"))
cmake_path = os.path.join(cmake_spec.prefix, "bin", "cmake.exe")
cpack_path = os.path.join(cmake_spec.prefix, "bin", "cpack.exe")
spack_source = args.spack_source

View File

@@ -14,7 +14,8 @@
import spack.concretize
import spack.config
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.repo
import spack.spec
import spack.util.web as web_util
@@ -365,15 +366,15 @@ def mirror_add(args):
connection["autopush"] = args.autopush
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirror.Mirror(connection, name=args.name)
mirror = spack.mirrors.mirror.Mirror(connection, name=args.name)
else:
mirror = spack.mirror.Mirror(args.url, name=args.name)
spack.mirror.add(mirror, args.scope)
mirror = spack.mirrors.mirror.Mirror(args.url, name=args.name)
spack.mirrors.utils.add(mirror, args.scope)
def mirror_remove(args):
"""remove a mirror by name"""
spack.mirror.remove(args.name, args.scope)
spack.mirrors.utils.remove(args.name, args.scope)
def _configure_mirror(args):
@@ -382,7 +383,7 @@ def _configure_mirror(args):
if args.name not in mirrors:
tty.die(f"No mirror found with name {args.name}.")
entry = spack.mirror.Mirror(mirrors[args.name], args.name)
entry = spack.mirrors.mirror.Mirror(mirrors[args.name], args.name)
direction = "fetch" if args.fetch else "push" if args.push else None
changes = {}
if args.url:
@@ -449,7 +450,7 @@ def mirror_set_url(args):
def mirror_list(args):
"""print out available mirrors to the console"""
mirrors = spack.mirror.MirrorCollection(scope=args.scope)
mirrors = spack.mirrors.mirror.MirrorCollection(scope=args.scope)
if not mirrors:
tty.msg("No mirrors configured.")
return
@@ -489,10 +490,10 @@ def concrete_specs_from_user(args):
def extend_with_additional_versions(specs, num_versions):
if num_versions == "all":
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
else:
mirror_specs = spack.mirror.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [x.concretized() for x in mirror_specs]
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [spack.concretize.concretized(x) for x in mirror_specs]
return mirror_specs
@@ -570,7 +571,7 @@ def concrete_specs_from_environment():
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
return mirror_specs
@@ -659,19 +660,21 @@ def _specs_and_action(args):
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
mirror_cache, mirror_stats = spack.mirrors.utils.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
for candidate in mirror_specs:
pkg_cls = spack.repo.PATH.get_pkg_class(candidate.name)
pkg_obj = pkg_cls(spack.spec.Spec(candidate))
mirror_stats.next_spec(pkg_obj.spec)
spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
spack.mirrors.utils.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
present, mirrored, error = spack.mirrors.utils.create(
path, mirror_specs, skip_unstable_versions
)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
@@ -681,7 +684,7 @@ def mirror_destroy(args):
mirror_url = None
if args.mirror_name:
result = spack.mirror.MirrorCollection().lookup(args.mirror_name)
result = spack.mirrors.mirror.MirrorCollection().lookup(args.mirror_name)
mirror_url = result.push_url
elif args.mirror_url:
mirror_url = args.mirror_url
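
For orientation, the former spack.mirror module now splits in two: Mirror and MirrorCollection live in spack.mirrors.mirror, while the config-editing and mirror-building helpers (add, remove, create, get_all_versions, ...) live in spack.mirrors.utils. A minimal sketch of the new call pattern, assuming a writable Spack configuration; the mirror name and URL are made up:

import spack.mirrors.mirror
import spack.mirrors.utils

# Construct a mirror and register it in the configuration
mirror = spack.mirrors.mirror.Mirror("file:///tmp/demo-mirror", name="demo")
spack.mirrors.utils.add(mirror, scope=None)

# Read it back through the collection, then remove it again
found = spack.mirrors.mirror.MirrorCollection().lookup("demo")
assert found.push_url == "file:///tmp/demo-mirror"
spack.mirrors.utils.remove("demo", scope=None)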

View File

@@ -323,8 +323,6 @@ def process_files(file_list, is_args):
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = (
"--rm",
"spack",
"--rm",
"spack.pkgkit",
"--rm",

View File

@@ -124,8 +124,8 @@ def setup_custom_environment(self, pkg, env):
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
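
The boundary change matters because of how Spack versions compare: a point release such as 2023.2.1 sorts greater than the bare 2023, so the old "<= Version("2023")" bound excluded late point releases that still need the flag. A small sketch, assuming the usual spack.version ordering:

from spack.version import Version

v = Version("2023.2.1")
print(v <= Version("2023"))  # False: point releases compare greater than the bare year
print(v < Version("2024"))   # True: the new bound covers them

The same fix is applied to the analogous block in the next file.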

View File

@@ -155,10 +155,10 @@ def setup_custom_environment(self, pkg, env):
# icx+icpx+ifx or icx+icpx+ifort. But to be on the safe side (some users may
# want to try to swap icpx against icpc, for example), and since the Intel LLVM
# compilers accept these diag-disable flags, we apply them for all compilers.
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
# 2024 release bumped the libsycl version because of an ABI

View File

@@ -51,11 +51,10 @@ def concretize_specs_together(
tests: list of package names for which to consider test dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
from spack.solver.asp import Solver
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
result = Solver().solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
return [s.copy() for s in result.specs]
@@ -90,7 +89,7 @@ def concretize_together_when_possible(
tests: list of package names for which to consider test dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
from spack.solver.asp import Solver
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
old_concrete_to_abstract = {
@@ -98,9 +97,8 @@ def concretize_together_when_possible(
}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
allow_deprecated = spack.config.get("config:deprecated", False)
for result in solver.solve_in_rounds(
for result in Solver().solve_in_rounds(
to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
@@ -124,7 +122,7 @@ def concretize_separately(
tests: list of package names for which to consider test dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.bootstrap
from spack.bootstrap import ensure_bootstrap_configuration, ensure_clingo_importable_or_raise
to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
args = [
@@ -134,8 +132,8 @@ def concretize_separately(
]
ret = [(i, abstract) for i, abstract in enumerate(to_concretize) if abstract.concrete]
# Ensure we don't try to bootstrap clingo in parallel
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
with ensure_bootstrap_configuration():
ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -190,10 +188,50 @@ def _concretize_task(packed_arguments: Tuple[int, str, TestsType]) -> Tuple[int,
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = Spec(spec_str).concretized(tests=tests)
spec = concretized(Spec(spec_str), tests=tests)
return index, spec, time.time() - start
def concretized(spec: Spec, tests: Union[bool, Iterable[str]] = False) -> Spec:
"""Return a concretized copy of the given spec.
Args:
tests: if False, disregard 'test' dependencies; if a list of names, activate them for
the packages in the list; if True, activate 'test' dependencies for all packages.
"""
from spack.solver.asp import Solver, SpecBuilder
spec.replace_hash()
for node in spec.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
if spec._concrete:
return spec.copy()
allow_deprecated = spack.config.get("config:deprecated", False)
result = Solver().solve([spec], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = spec.name
# TODO: Consolidate this code with similar code in solve.py
if spec.virtual:
providers = [s.name for s in answer.values() if s.package.provides(name)]
name = providers[0]
node = SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
return concretized
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""
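
Call sites migrate as in the make_installer hunk above: the removed Spec.concretize()/Spec.concretized() methods are replaced by this module-level function. A minimal sketch:

import spack.concretize
from spack.spec import Spec

# Old style (removed):  spec = Spec("zlib").concretized()
# New style:
spec = spack.concretize.concretized(Spec("zlib"))
assert spec.concrete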

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Data structures that represent Spack's dependency relationships."""
from typing import Dict, List
from typing import Dict, List, Type
import spack.deptypes as dt
import spack.spec
@@ -38,7 +38,7 @@ class Dependency:
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: Type["spack.package_base.PackageBase"],
spec: "spack.spec.Spec",
depflag: dt.DepFlag = dt.DEFAULT,
):

View File

@@ -21,6 +21,7 @@ class OpenMpi(Package):
* ``conflicts``
* ``depends_on``
* ``extends``
* ``license``
* ``patch``
* ``provides``
* ``resource``
@@ -34,11 +35,12 @@ class OpenMpi(Package):
import collections.abc
import os.path
import re
from typing import Any, Callable, List, Optional, Tuple, Union
from typing import Any, Callable, List, Optional, Tuple, Type, Union
import llnl.util.tty.color
import spack.deptypes as dt
import spack.fetch_strategy
import spack.package_base
import spack.patch
import spack.spec
@@ -46,7 +48,6 @@ class OpenMpi(Package):
import spack.variant
from spack.dependency import Dependency
from spack.directives_meta import DirectiveError, DirectiveMeta
from spack.fetch_strategy import from_kwargs
from spack.resource import Resource
from spack.version import (
GitVersion,
@@ -81,8 +82,8 @@ class OpenMpi(Package):
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union[spack.spec.Spec, str, bool]]
Patcher = Callable[[Union[spack.package_base.PackageBase, Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
Patcher = Callable[[Union[Type[spack.package_base.PackageBase], Dependency]], None]
PatchesType = Union[Patcher, str, List[Union[Patcher, str]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
@@ -218,7 +219,7 @@ def version(
return lambda pkg: _execute_version(pkg, ver, **kwargs)
def _execute_version(pkg, ver, **kwargs):
def _execute_version(pkg: Type[spack.package_base.PackageBase], ver: Union[str, int], **kwargs):
if (
(any(s in kwargs for s in spack.util.crypto.hashes) or "checksum" in kwargs)
and hasattr(pkg, "has_code")
@@ -249,12 +250,12 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: spack.package_base.PackageBase,
pkg: Type[spack.package_base.PackageBase],
spec: spack.spec.Spec,
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
when_spec = _make_when_spec(when)
if not when_spec:
@@ -329,7 +330,7 @@ def conflicts(conflict_spec: SpecType, when: WhenType = None, msg: Optional[str]
msg (str): optional user defined message
"""
def _execute_conflicts(pkg: spack.package_base.PackageBase):
def _execute_conflicts(pkg: Type[spack.package_base.PackageBase]):
# If when is not specified the conflict always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -348,7 +349,7 @@ def depends_on(
spec: SpecType,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
"""Creates a dict of deps with specs defining when they apply.
@@ -370,14 +371,16 @@ def depends_on(
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: spack.package_base.PackageBase):
def _execute_depends_on(pkg: Type[spack.package_base.PackageBase]):
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
def redistribute(
source: Optional[bool] = None, binary: Optional[bool] = None, when: WhenType = None
):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
@@ -392,7 +395,10 @@ def redistribute(source=None, binary=None, when: WhenType = None):
def _execute_redistribute(
pkg: spack.package_base.PackageBase, source=None, binary=None, when: WhenType = None
pkg: Type[spack.package_base.PackageBase],
source: Optional[bool],
binary: Optional[bool],
when: WhenType,
):
if source is None and binary is None:
return
@@ -468,9 +474,7 @@ def provides(*specs: SpecType, when: WhenType = None):
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg: spack.package_base.PackageBase):
import spack.parser # Avoid circular dependency
def _execute_provides(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if not when_spec:
return
@@ -516,7 +520,7 @@ def can_splice(
variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: spack.package_base.PackageBase):
def _execute_can_splice(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
@@ -557,10 +561,10 @@ def patch(
compressed URL patches)
"""
def _execute_patch(pkg_or_dep: Union[spack.package_base.PackageBase, Dependency]):
pkg = pkg_or_dep
if isinstance(pkg, Dependency):
pkg = pkg.pkg
def _execute_patch(
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency]
) -> None:
pkg = pkg_or_dep.pkg if isinstance(pkg_or_dep, Dependency) else pkg_or_dep
if hasattr(pkg, "has_code") and not pkg.has_code:
raise UnsupportedPackageDirective(
@@ -734,58 +738,55 @@ def _execute_variant(pkg):
@directive("resources")
def resource(**kwargs):
"""Define an external resource to be fetched and staged when building the
package. Based on the keywords present in the dictionary the appropriate
FetchStrategy will be used for the resource. Resources are fetched and
staged in their own folder inside spack stage area, and then moved into
the stage area of the package that needs them.
def resource(
*,
name: Optional[str] = None,
destination: str = "",
placement: Optional[str] = None,
when: WhenType = None,
# additional kwargs are as for `version()`
**kwargs,
):
"""Define an external resource to be fetched and staged when building the package.
Based on the keywords present in the dictionary, the appropriate FetchStrategy will
be used for the resource. Resources are fetched and staged in their own folder
inside the spack stage area, and then moved into the stage area of the package that
needs them.
List of recognized keywords:
Keyword Arguments:
name: name for the resource
when: condition defining when the resource is needed
destination: path, relative to the package stage area, to which resource should be moved
placement: optionally rename the expanded resource inside the destination directory
* 'when' : (optional) represents the condition upon which the resource is
needed
* 'destination' : (optional) path where to move the resource. This path
must be relative to the main package stage area.
* 'placement' : (optional) gives the possibility to fine tune how the
resource is moved into the main package stage area.
"""
def _execute_resource(pkg):
when = kwargs.get("when")
when_spec = _make_when_spec(when)
if not when_spec:
return
destination = kwargs.get("destination", "")
placement = kwargs.get("placement", None)
# Check if the path is relative
if os.path.isabs(destination):
message = (
"The destination keyword of a resource directive " "can't be an absolute path.\n"
)
message += "\tdestination : '{dest}\n'".format(dest=destination)
raise RuntimeError(message)
msg = "The destination keyword of a resource directive can't be an absolute path.\n"
msg += f"\tdestination : '{destination}\n'"
raise RuntimeError(msg)
# Check if the path falls within the main package stage area
test_path = "stage_folder_root"
normalized_destination = os.path.normpath(
os.path.join(test_path, destination)
) # Normalized absolute path
# Normalized absolute path
normalized_destination = os.path.normpath(os.path.join(test_path, destination))
if test_path not in normalized_destination:
message = (
"The destination folder of a resource must fall "
"within the main package stage directory.\n"
)
message += "\tdestination : '{dest}'\n".format(dest=destination)
raise RuntimeError(message)
msg = "Destination of a resource must be within the package stage directory.\n"
msg += f"\tdestination : '{destination}'\n"
raise RuntimeError(msg)
resources = pkg.resources.setdefault(when_spec, [])
name = kwargs.get("name")
fetcher = from_kwargs(**kwargs)
resources.append(Resource(name, fetcher, destination, placement))
resources.append(
Resource(name, spack.fetch_strategy.from_kwargs(**kwargs), destination, placement)
)
return _execute_resource
@@ -817,7 +818,9 @@ def _execute_maintainer(pkg):
return _execute_maintainer
def _execute_license(pkg, license_identifier: str, when):
def _execute_license(
pkg: Type[spack.package_base.PackageBase], license_identifier: str, when: WhenType
):
# If when is not specified the license always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -881,7 +884,7 @@ def requires(*requirement_specs: str, policy="one_of", when=None, msg=None):
msg: optional user defined message
"""
def _execute_requires(pkg: spack.package_base.PackageBase):
def _execute_requires(pkg: Type[spack.package_base.PackageBase]):
if policy not in ("one_of", "any_of"):
err_msg = (
f"the 'policy' argument of the 'requires' directive in {pkg.name} is set "
@@ -906,7 +909,7 @@ def _execute_requires(pkg: spack.package_base.PackageBase):
def _language(lang_spec_str: str, *, when: Optional[Union[str, bool]] = None):
"""Temporary implementation of language virtuals, until compilers are proper dependencies."""
def _execute_languages(pkg: spack.package_base.PackageBase):
def _execute_languages(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if not when_spec:
return
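
The recurring pkg: Type[spack.package_base.PackageBase] annotations make explicit that directive callbacks run against the package class while it is being defined, not against an instance. A self-contained miniature of that pattern; the names here are illustrative, not Spack's real machinery:

from typing import Callable, Dict, Type

class FakePackage:
    licenses: Dict[str, str] = {}

def license_directive(identifier: str) -> Callable[[Type[FakePackage]], None]:
    def _execute_license(pkg: Type[FakePackage]) -> None:
        pkg.licenses["always"] = identifier  # mutates the class itself
    return _execute_license

license_directive("Apache-2.0")(FakePackage)
print(FakePackage.licenses)  # {'always': 'Apache-2.0'}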

View File

@@ -5,7 +5,7 @@
import collections.abc
import functools
from typing import List, Set
from typing import Any, Callable, Dict, List, Optional, Sequence, Set, Type, Union
import llnl.util.lang
@@ -25,11 +25,13 @@ class DirectiveMeta(type):
# Set of all known directives
_directive_dict_names: Set[str] = set()
_directives_to_be_executed: List[str] = []
_when_constraints_from_context: List[str] = []
_directives_to_be_executed: List[Callable] = []
_when_constraints_from_context: List[spack.spec.Spec] = []
_default_args: List[dict] = []
def __new__(cls, name, bases, attr_dict):
def __new__(
cls: Type["DirectiveMeta"], name: str, bases: tuple, attr_dict: dict
) -> "DirectiveMeta":
# Initialize the attribute containing the list of directives
# to be executed. Here we go reversed because we want to execute
# commands:
@@ -60,7 +62,7 @@ def __new__(cls, name, bases, attr_dict):
return super(DirectiveMeta, cls).__new__(cls, name, bases, attr_dict)
def __init__(cls, name, bases, attr_dict):
def __init__(cls: "DirectiveMeta", name: str, bases: tuple, attr_dict: dict):
# The instance is being initialized: if it is a package we must ensure
# that the directives are called to set it up.
@@ -81,27 +83,27 @@ def __init__(cls, name, bases, attr_dict):
super(DirectiveMeta, cls).__init__(name, bases, attr_dict)
@staticmethod
def push_to_context(when_spec):
def push_to_context(when_spec: spack.spec.Spec) -> None:
"""Add a spec to the context constraints."""
DirectiveMeta._when_constraints_from_context.append(when_spec)
@staticmethod
def pop_from_context():
def pop_from_context() -> spack.spec.Spec:
"""Pop the last constraint from the context"""
return DirectiveMeta._when_constraints_from_context.pop()
@staticmethod
def push_default_args(default_args):
def push_default_args(default_args: Dict[str, Any]) -> None:
"""Push default arguments"""
DirectiveMeta._default_args.append(default_args)
@staticmethod
def pop_default_args():
def pop_default_args() -> dict:
"""Pop default arguments"""
return DirectiveMeta._default_args.pop()
@staticmethod
def directive(dicts=None):
def directive(dicts: Optional[Union[Sequence[str], str]] = None) -> Callable:
"""Decorator for Spack directives.
Spack directives allow you to modify a package while it is being
@@ -156,7 +158,7 @@ class Foo(Package):
DirectiveMeta._directive_dict_names |= set(dicts)
# This decorator just returns the directive functions
def _decorator(decorated_function):
def _decorator(decorated_function: Callable) -> Callable:
directive_names.append(decorated_function.__name__)
@functools.wraps(decorated_function)
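
The newly typed context stack is the mechanism behind scoped directives: a constraint spec is pushed, directives in the block execute against it, and the constraint is popped afterwards. A sketch under the assumption that "with when(...):" blocks in package recipes map onto these calls:

import spack.spec
from spack.directives_meta import DirectiveMeta

cond = spack.spec.Spec("+mpi")
DirectiveMeta.push_to_context(cond)
try:
    pass  # directives executed here would also carry the +mpi constraint
finally:
    assert DirectiveMeta.pop_from_context() is cond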

View File

@@ -192,3 +192,10 @@ def __reduce__(self):
def _make_stop_phase(msg, long_msg):
return StopPhase(msg, long_msg)
class MirrorError(SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -5,7 +5,6 @@
"""Service functions and classes to implement the hooks
for Spack's command extensions.
"""
import difflib
import glob
import importlib
import os
@@ -17,7 +16,6 @@
import llnl.util.lang
import spack.cmd
import spack.config
import spack.error
import spack.util.path
@@ -25,9 +23,6 @@
_extension_regexp = re.compile(r"spack-(\w[-\w]*)$")
# TODO: For consistency we should use spack.cmd.python_name(), but
# currently this would create a circular relationship between
# spack.cmd and spack.extensions.
def _python_name(cmd_name):
return cmd_name.replace("-", "_")
@@ -211,8 +206,7 @@ def get_module(cmd_name):
module = load_command_extension(cmd_name, folder)
if module:
return module
else:
raise CommandNotFoundError(cmd_name)
return None
def get_template_dirs():
@@ -224,27 +218,6 @@ def get_template_dirs():
return extensions
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
"{0} is not a recognized Spack command or extension command;"
" check with `spack commands`.".format(cmd_name)
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, spack.cmd.all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
class ExtensionNamingError(spack.error.SpackError):
"""Exception class thrown when a configured extension does not follow
the expected naming convention.

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.binary_distribution as bindist
import spack.mirror
import spack.mirrors.mirror
def post_install(spec, explicit):
@@ -22,7 +22,7 @@ def post_install(spec, explicit):
return
# Push the package to all autopush mirrors
for mirror in spack.mirror.MirrorCollection(binary=True, autopush=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True, autopush=True).values():
signing_key = bindist.select_signing_key() if mirror.signed else None
with bindist.make_uploader(mirror=mirror, force=True, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])

View File

@@ -375,23 +375,16 @@ def phase_tests(self, builder, phase_name: str, method_names: List[str]):
for name in method_names:
try:
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(builder.pkg, name, getattr(builder, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
print_message(logger, msg, verbose)
fn()
fn = getattr(builder, name, None) or getattr(builder.pkg, name)
except AttributeError as e:
msg = f"RUN-TESTS: method not implemented [{name}]"
print_message(logger, msg, verbose)
self.add_failure(e, msg)
print_message(logger, f"RUN-TESTS: method not implemented [{name}]", verbose)
self.add_failure(e, f"RUN-TESTS: method not implemented [{name}]")
if fail_fast:
break
continue
print_message(logger, f"RUN-TESTS: {phase_name}-time tests [{name}]", verbose)
fn()
if have_tests:
print_message(logger, "Completed testing", verbose)

View File

@@ -56,7 +56,7 @@
import spack.deptypes as dt
import spack.error
import spack.hooks
import spack.mirror
import spack.mirrors.mirror
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
@@ -491,7 +491,7 @@ def _try_install_from_binary_cache(
timer: timer to keep track of binary install phases.
"""
# Early exit if no binary mirrors are configured.
if not spack.mirror.MirrorCollection(binary=True):
if not spack.mirrors.mirror.MirrorCollection(binary=True):
return False
tty.debug(f"Searching for binary cache of {package_id(pkg.spec)}")

View File

@@ -0,0 +1,4 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -0,0 +1,146 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
from typing import Optional
import llnl.url
import llnl.util.symlink
from llnl.util.filesystem import mkdirp
import spack.fetch_strategy
import spack.oci.image
import spack.repo
import spack.spec
from spack.error import MirrorError
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
a digest, which is why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
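
A usage sketch of the layout contract above: when a digest path exists it is the primary storage location and the alias becomes a symlink, and iteration yields the digest first. Paths are made up for illustration:

from spack.mirrors.layout import DefaultLayout

layout = DefaultLayout(
    alias_path="zlib/zlib-1.3.tar.gz",
    digest_path="_source-cache/archive/ab/abcd1234.tar.gz",
)
print(layout.path)   # the digest path is the primary location
print(list(layout))  # digest path first, then the human readable alias
# layout.make_alias("/path/to/mirror") would symlink the alias to the digest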

View File

@@ -2,42 +2,20 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
This file contains code for creating spack mirror directories. A
mirror is an organized hierarchy containing specially named archive
files. This enables spack to know where to find files in a mirror if
the main server for a particular package is down. Or, if the computer
where spack is run is not connected to the internet, it allows spack
to download packages directly from a mirror (e.g., on an intranet).
"""
import collections
import collections.abc
import operator
import os
import os.path
import sys
import traceback
import urllib.parse
from typing import Any, Dict, Optional, Tuple, Union
import llnl.url
import llnl.util.symlink
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.oci.image
import spack.repo
import spack.spec
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.version
from spack.error import MirrorError
#: What schemes do we support
supported_url_schemes = ("file", "http", "https", "sftp", "ftp", "s3", "gs", "oci")
@@ -490,380 +468,3 @@ def __iter__(self):
def __len__(self):
return len(self._mirrors)
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
a digest, which is why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
def create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats):
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache (spack.caches.MirrorCache): mirror to add the spec to.
mirror_stats (spack.mirror.MirrorStats): statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
num_retries = 3
while num_retries > 0:
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
exception = None
break
except Exception as e:
exc_tuple = sys.exc_info()
exception = e
num_retries -= 1
if exception:
if spack.config.get("config:debug"):
traceback.print_exception(file=sys.stderr, *exc_tuple)
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.cformat("{name}{@version}"),
getattr(exception, "message", exception),
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror
class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -0,0 +1,258 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import traceback
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.repo
import spack.spec
import spack.util.spack_yaml as syaml
import spack.version
from spack.error import MirrorError
from spack.mirrors.mirror import Mirror, MirrorCollection
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
def create_mirror_from_package_object(
pkg_obj, mirror_cache: "spack.caches.MirrorCache", mirror_stats: MirrorStats
) -> bool:
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache: mirror to add the spec to.
mirror_stats: statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
max_retries = 3
for num_retries in range(max_retries):
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
break
except Exception as e:
if num_retries + 1 == max_retries:
if spack.config.get("config:debug"):
traceback.print_exc()
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.format("{name}{@version}"), str(e)
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror
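
Taken together, the helpers in this new spack.mirrors.utils module implement the flow below; the spec and path are illustrative:

import spack.spec
import spack.mirrors.utils as mirror_utils

specs = [spack.spec.Spec("zlib")]
# Expand to (up to) two matching versions per spec, then mirror them
mirror_specs = mirror_utils.get_matching_versions(specs, num_versions=2)
present, mirrored, error = mirror_utils.create(
    "/tmp/demo-mirror", mirror_specs, skip_unstable_versions=True
)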

View File

@@ -16,7 +16,8 @@
import llnl.util.tty as tty
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.oci.opener
import spack.stage
import spack.util.url
@@ -213,7 +214,7 @@ def upload_manifest(
return digest, size
def image_from_mirror(mirror: spack.mirror.Mirror) -> ImageReference:
def image_from_mirror(mirror: spack.mirrors.mirror.Mirror) -> ImageReference:
"""Given an OCI based mirror, extract the URL and image name from it"""
url = mirror.push_url
if not url.startswith("oci://"):
@@ -385,5 +386,8 @@ def make_stage(
# is the `oci-layout` and `index.json` files, which are
# required by the spec.
return spack.stage.Stage(
fetch_strategy, mirror_paths=spack.mirror.OCILayout(digest), name=digest.digest, keep=keep
fetch_strategy,
mirror_paths=spack.mirrors.layout.OCILayout(digest),
name=digest.digest,
keep=keep,
)

View File

@@ -20,7 +20,7 @@
import llnl.util.lang
import spack.config
import spack.mirror
import spack.mirrors.mirror
import spack.parser
import spack.util.web
@@ -367,11 +367,11 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
def credentials_from_mirrors(
domain: str, *, mirrors: Optional[Iterable[spack.mirror.Mirror]] = None
domain: str, *, mirrors: Optional[Iterable[spack.mirrors.mirror.Mirror]] = None
) -> Optional[UsernamePassword]:
"""Filter out OCI registry credentials from a list of mirrors."""
mirrors = mirrors or spack.mirror.MirrorCollection().values()
mirrors = mirrors or spack.mirrors.mirror.MirrorCollection().values()
for mirror in mirrors:
# Prefer push credentials over fetch. Unlikely that those are different

View File

@@ -40,7 +40,8 @@
import spack.error
import spack.fetch_strategy as fs
import spack.hooks
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.multimethod
import spack.patch
import spack.phase_callbacks
@@ -54,6 +55,7 @@
import spack.variant
from spack.error import InstallError, NoURLError, PackageError
from spack.filesystem_view import YamlFilesystemView
from spack.resource import Resource
from spack.solver.version_order import concretization_version_order
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.package_hash import package_hash
@@ -585,6 +587,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
# Declare versions dictionary as placeholder for values.
# This allows analysis tools to correctly interpret the class attributes.
versions: dict
resources: Dict[spack.spec.Spec, List[Resource]]
dependencies: Dict[spack.spec.Spec, Dict[str, spack.dependency.Dependency]]
conflicts: Dict[spack.spec.Spec, List[Tuple[spack.spec.Spec, Optional[str]]]]
requirements: Dict[
@@ -595,6 +598,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
patches: Dict[spack.spec.Spec, List[spack.patch.Patch]]
variants: Dict[spack.spec.Spec, Dict[str, spack.variant.Variant]]
languages: Dict[spack.spec.Spec, Set[str]]
licenses: Dict[spack.spec.Spec, str]
splice_specs: Dict[spack.spec.Spec, Tuple[spack.spec.Spec, Union[None, str, List[str]]]]
#: Store whether a given Spec source/binary should not be redistributed.
@@ -1184,10 +1188,10 @@ def _make_resource_stage(self, root_stage, resource):
root=root_stage,
resource=resource,
name=self._resource_stage(resource),
mirror_paths=spack.mirror.default_mirror_layout(
mirror_paths=spack.mirrors.layout.default_mirror_layout(
resource.fetcher, os.path.join(self.name, pretty_resource_name)
),
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
path=self.path,
)
@@ -1199,7 +1203,7 @@ def _make_root_stage(self, fetcher):
# Construct a mirror path (TODO: get this out of package.py)
format_string = "{name}-{version}"
pretty_name = self.spec.format_path(format_string)
mirror_paths = spack.mirror.default_mirror_layout(
mirror_paths = spack.mirrors.layout.default_mirror_layout(
fetcher, os.path.join(self.name, pretty_name), self.spec
)
# Construct a path where the stage should build..
@@ -1208,7 +1212,7 @@ def _make_root_stage(self, fetcher):
stage = Stage(
fetcher,
mirror_paths=mirror_paths,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
name=stage_name,
path=self.path,
search_fn=self._download_search,

View File

@@ -16,7 +16,8 @@
import spack
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.repo
import spack.stage
import spack.util.spack_json as sjson
@@ -329,12 +330,12 @@ def stage(self) -> "spack.stage.Stage":
name = "{0}-{1}".format(os.path.basename(self.url), fetch_digest[:7])
per_package_ref = os.path.join(self.owner.split(".")[-1], name)
mirror_ref = spack.mirror.default_mirror_layout(fetcher, per_package_ref)
mirror_ref = spack.mirrors.layout.default_mirror_layout(fetcher, per_package_ref)
self._stage = spack.stage.Stage(
fetcher,
name=f"{spack.stage.stage_prefix}patch-{fetch_digest}",
mirror_paths=mirror_ref,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
)
return self._stage

View File

@@ -13,6 +13,7 @@
import macholib.mach_o
import macholib.MachO
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -275,10 +276,10 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
# Deduplicate and flatten
args = list(itertools.chain.from_iterable(llnl.util.lang.dedupe(args)))
install_name_tool = executable.Executable("install_name_tool")
if args:
args.append(str(cur_path))
install_name_tool = executable.Executable("install_name_tool")
install_name_tool(*args)
with fs.edit_in_place_through_temporary_file(cur_path) as temp_path:
install_name_tool(*args, temp_path)
def macholib_get_paths(cur_path):
@@ -717,8 +718,8 @@ def fixup_macos_rpath(root, filename):
# No fixes needed
return False
args.append(abspath)
executable.Executable("install_name_tool")(*args)
with fs.edit_in_place_through_temporary_file(abspath) as temp_path:
executable.Executable("install_name_tool")(*args, temp_path)
return True
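
Both call sites now route install_name_tool through a temporary copy instead of editing the binary in place. A minimal reimplementation of what a context manager like fs.edit_in_place_through_temporary_file plausibly does, written from its call sites rather than from Spack's actual code:

import contextlib
import os
import shutil
import tempfile

@contextlib.contextmanager
def edit_in_place_through_temporary_file(path):
    # Create the temporary next to the original so os.rename stays on
    # one filesystem and the final replacement is atomic.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(path)))
    os.close(fd)
    try:
        shutil.copy2(path, tmp)  # the tool edits the copy...
        yield tmp
        os.rename(tmp, path)     # ...which then replaces the original
    except BaseException:
        os.unlink(tmp)
        raise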

View File

@@ -41,6 +41,7 @@
import spack.provider_index
import spack.spec
import spack.tag
import spack.tengine
import spack.util.file_cache
import spack.util.git
import spack.util.naming as nm
@@ -81,43 +82,6 @@ def namespace_from_fullname(fullname):
return namespace
class _PrependFileLoader(importlib.machinery.SourceFileLoader):
def __init__(self, fullname, path, prepend=None):
super(_PrependFileLoader, self).__init__(fullname, path)
self.prepend = prepend
def path_stats(self, path):
stats = super(_PrependFileLoader, self).path_stats(path)
if self.prepend:
stats["size"] += len(self.prepend) + 1
return stats
def get_data(self, path):
data = super(_PrependFileLoader, self).get_data(path)
if path != self.path or self.prepend is None:
return data
else:
return self.prepend.encode() + b"\n" + data
class RepoLoader(_PrependFileLoader):
"""Loads a Python module associated with a package in specific repository"""
#: Code in ``_package_prepend`` is prepended to imported packages.
#:
#: Spack packages are expected to call `from spack.package import *`
#: themselves, but we are allowing a deprecation period before breaking
#: external repos that don't do this yet.
_package_prepend = "from spack.package import *"
def __init__(self, fullname, repo, package_name):
self.repo = repo
self.package_name = package_name
self.package_py = repo.filename_for_package_name(package_name)
self.fullname = fullname
super().__init__(self.fullname, self.package_py, prepend=self._package_prepend)
class SpackNamespaceLoader:
def create_module(self, spec):
return SpackNamespace(spec.name)
@@ -187,7 +151,8 @@ def compute_loader(self, fullname):
# With 2 nested conditionals we can call "repo.real_name" only once
package_name = repo.real_name(module_name)
if package_name:
return RepoLoader(fullname, repo, package_name)
module_path = repo.filename_for_package_name(package_name)
return importlib.machinery.SourceFileLoader(fullname, module_path)
# We are importing a full namespace like 'spack.pkg.builtin'
if fullname == repo.full_namespace:
@@ -1521,8 +1486,6 @@ def add_package(self, name, dependencies=None):
Both "dep_type" and "condition" can default to ``None`` in which case
``spack.dependency.default_deptype`` and ``spack.spec.Spec()`` are used.
"""
import spack.tengine # avoid circular import
dependencies = dependencies or []
context = {"cls_name": nm.mod_to_class(name), "dependencies": dependencies}
template = spack.tengine.make_environment().get_template("mock-repository/package.pyt")
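
With the _PrependFileLoader shim gone, package modules load through stock importlib machinery and must import the Spack package API themselves. A sketch of what the new code path amounts to; the module name and file path are illustrative:

import importlib.machinery
import importlib.util

loader = importlib.machinery.SourceFileLoader(
    "spack.pkg.builtin.zlib", "/path/to/repo/packages/zlib/package.py"
)
spec = importlib.util.spec_from_loader(loader.name, loader)
module = importlib.util.module_from_spec(spec)
# loader.exec_module(module) would run package.py, which now has to do
# `from spack.package import *` itself instead of having it prepended.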

View File

@@ -12,7 +12,10 @@
class Resource:
"""Represents an optional resource to be fetched by a package.
"""Represents any resource to be fetched by a package.
This includes the main tarball or source archive, as well as extra archives defined
by the resource() directive.
Aggregates a name, a fetcher, a destination and a placement.
"""

View File

@@ -106,8 +106,8 @@
{
"names": ["install_missing_compilers"],
"message": "The config:install_missing_compilers option has been deprecated in "
"Spack v0.23, and is currently ignored. It will be removed from config in "
"Spack v0.25.",
"Spack v0.23, and is currently ignored. It will be removed from config after "
"Spack v1.0.",
"error": False,
},
],

View File

@@ -48,8 +48,6 @@
import spack.version as vn
import spack.version.git_ref_lookup
from spack import traverse
from spack.config import get_mark_from_yaml_data
from spack.error import SpecSyntaxError
from .core import (
AspFunction,
@@ -65,11 +63,12 @@
parse_term,
)
from .counter import FullDuplicatesCounter, MinimalDuplicatesCounter, NoDuplicatesCounter
from .requirements import RequirementKind, RequirementParser, RequirementRule
from .version_order import concretization_version_order
GitOrStandardVersion = Union[spack.version.GitVersion, spack.version.StandardVersion]
TransformFunction = Callable[["spack.spec.Spec", List[AspFunction]], List[AspFunction]]
TransformFunction = Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]
#: Enable the addition of a runtime node
WITH_RUNTIME = sys.platform != "win32"
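Unquoting these annotations is safe now that the spack.spec/spack.concretize cycle is broken and spack.spec is importable at module load time. A minimal illustration of the two equivalent spellings, using stand-in classes rather than the real ones:

    from typing import Callable, List

    class Spec: ...          # stand-in for spack.spec.Spec
    class AspFunction: ...   # stand-in for the ASP function wrapper

    # With a circular import, the names are only resolvable as strings:
    QuotedTransform = Callable[["Spec", List["AspFunction"]], List["AspFunction"]]
    # Once the classes are importable at module load, they can be referenced directly:
    TransformFunction = Callable[[Spec, List[AspFunction]], List[AspFunction]]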
@@ -129,8 +128,8 @@ def __str__(self):
@contextmanager
def named_spec(
spec: Optional["spack.spec.Spec"], name: Optional[str]
) -> Iterator[Optional["spack.spec.Spec"]]:
spec: Optional[spack.spec.Spec], name: Optional[str]
) -> Iterator[Optional[spack.spec.Spec]]:
"""Context manager to temporarily set the name of a spec"""
if spec is None or name is None:
yield spec
@@ -144,17 +143,6 @@ def named_spec(
spec.name = old_name
class RequirementKind(enum.Enum):
"""Purpose / provenance of a requirement"""
#: Default requirement expressed under the 'all' attribute of packages.yaml
DEFAULT = enum.auto()
#: Requirement expressed on a virtual package
VIRTUAL = enum.auto()
#: Requirement expressed on a specific package
PACKAGE = enum.auto()
class DeclaredVersion(NamedTuple):
"""Data class to contain information on declared versions used in the solve"""
@@ -757,25 +745,14 @@ def on_model(model):
raise UnsatisfiableSpecError(msg)
class RequirementRule(NamedTuple):
"""Data class to collect information on a requirement"""
pkg_name: str
policy: str
requirements: List["spack.spec.Spec"]
condition: "spack.spec.Spec"
kind: RequirementKind
message: Optional[str]
class KnownCompiler(NamedTuple):
"""Data class to collect information on compilers"""
spec: "spack.spec.Spec"
spec: spack.spec.Spec
os: str
target: str
available: bool
compiler_obj: Optional["spack.compiler.Compiler"]
compiler_obj: Optional[spack.compiler.Compiler]
def _key(self):
return self.spec, self.os, self.target
@@ -1146,6 +1123,7 @@ class SpackSolverSetup:
def __init__(self, tests: bool = False):
# these are all initialized in setup()
self.gen: "ProblemInstanceBuilder" = ProblemInstanceBuilder()
self.requirement_parser = RequirementParser(spack.config.CONFIG)
self.possible_virtuals: Set[str] = set()
self.assumptions: List[Tuple["clingo.Symbol", bool]] = [] # type: ignore[name-defined]
@@ -1332,8 +1310,7 @@ def compiler_facts(self):
self.gen.newline()
def package_requirement_rules(self, pkg):
parser = RequirementParser(spack.config.CONFIG)
self.emit_facts_from_requirement_rules(parser.rules(pkg))
self.emit_facts_from_requirement_rules(self.requirement_parser.rules(pkg))
def pkg_rules(self, pkg, tests):
pkg = self.pkg_class(pkg)
@@ -1410,7 +1387,7 @@ def effect_rules(self):
def define_variant(
self,
pkg: "Type[spack.package_base.PackageBase]",
pkg: Type[spack.package_base.PackageBase],
name: str,
when: spack.spec.Spec,
variant_def: vt.Variant,
@@ -1514,7 +1491,7 @@ def define_auto_variant(self, name: str, multi: bool):
)
)
def variant_rules(self, pkg: "Type[spack.package_base.PackageBase]"):
def variant_rules(self, pkg: Type[spack.package_base.PackageBase]):
for name in pkg.variant_names():
self.gen.h3(f"Variant {name} in package {pkg.name}")
for when, variant_def in pkg.variant_definitions(name):
@@ -1705,8 +1682,8 @@ def dependency_holds(input_spec, requirements):
def _gen_match_variant_splice_constraints(
self,
pkg,
cond_spec: "spack.spec.Spec",
splice_spec: "spack.spec.Spec",
cond_spec: spack.spec.Spec,
splice_spec: spack.spec.Spec,
hash_asp_var: "AspVar",
splice_node,
match_variants: List[str],
@@ -1811,9 +1788,8 @@ def provider_defaults(self):
def provider_requirements(self):
self.gen.h2("Requirements on virtual providers")
parser = RequirementParser(spack.config.CONFIG)
for virtual_str in sorted(self.possible_virtuals):
rules = parser.rules_from_virtual(virtual_str)
rules = self.requirement_parser.rules_from_virtual(virtual_str)
if rules:
self.emit_facts_from_requirement_rules(rules)
self.trigger_rules()
@@ -3002,7 +2978,7 @@ def _specs_from_requires(self, pkg_name, section):
for s in spec_group[key]:
yield _spec_with_default_name(s, pkg_name)
def pkg_class(self, pkg_name: str) -> typing.Type["spack.package_base.PackageBase"]:
def pkg_class(self, pkg_name: str) -> typing.Type[spack.package_base.PackageBase]:
request = pkg_name
if pkg_name in self.explicitly_required_namespaces:
namespace = self.explicitly_required_namespaces[pkg_name]
@@ -3088,202 +3064,6 @@ def value(self) -> str:
return "".join(self.asp_problem)
def parse_spec_from_yaml_string(string: str) -> "spack.spec.Spec":
"""Parse a spec from YAML and add file/line info to errors, if it's available.
Parse a ``Spec`` from the supplied string, but also intercept any syntax errors and
add file/line information for debugging using file/line annotations from the string.
Arguments:
string: a string representing a ``Spec`` from config YAML.
"""
try:
return spack.spec.Spec(string)
except SpecSyntaxError as e:
mark = get_mark_from_yaml_data(string)
if mark:
msg = f"{mark.name}:{mark.line + 1}: {str(e)}"
raise SpecSyntaxError(msg) from e
raise e
class RequirementParser:
"""Parses requirements from package.py files and configuration, and returns rules."""
def __init__(self, configuration):
self.config = configuration
def rules(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
result = []
result.extend(self.rules_from_package_py(pkg))
result.extend(self.rules_from_require(pkg))
result.extend(self.rules_from_prefer(pkg))
result.extend(self.rules_from_conflict(pkg))
return result
def rules_from_package_py(self, pkg) -> List[RequirementRule]:
rules = []
for when_spec, requirement_list in pkg.requirements.items():
for requirements, policy, message in requirement_list:
rules.append(
RequirementRule(
pkg_name=pkg.name,
policy=policy,
requirements=requirements,
kind=RequirementKind.PACKAGE,
condition=when_spec,
message=message,
)
)
return rules
def rules_from_virtual(self, virtual_str: str) -> List[RequirementRule]:
requirements = self.config.get("packages", {}).get(virtual_str, {}).get("require", [])
return self._rules_from_requirements(
virtual_str, requirements, kind=RequirementKind.VIRTUAL
)
def rules_from_require(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
kind, requirements = self._raw_yaml_data(pkg, section="require")
return self._rules_from_requirements(pkg.name, requirements, kind=kind)
def rules_from_prefer(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
result = []
kind, preferences = self._raw_yaml_data(pkg, section="prefer")
for item in preferences:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A strong preference is defined as:
#
# require:
# - any_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="any_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
def rules_from_conflict(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
result = []
kind, conflicts = self._raw_yaml_data(pkg, section="conflict")
for item in conflicts:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A conflict is defined as:
#
# require:
# - one_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="one_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
def _parse_prefer_conflict_item(self, item):
# The item is either a string or an object with at least a "spec" attribute
if isinstance(item, str):
spec = parse_spec_from_yaml_string(item)
condition = spack.spec.Spec()
message = None
else:
spec = parse_spec_from_yaml_string(item["spec"])
condition = spack.spec.Spec(item.get("when"))
message = item.get("message")
return spec, condition, message
def _raw_yaml_data(self, pkg: "spack.package_base.PackageBase", *, section: str):
config = self.config.get("packages")
data = config.get(pkg.name, {}).get(section, [])
kind = RequirementKind.PACKAGE
if not data:
data = config.get("all", {}).get(section, [])
kind = RequirementKind.DEFAULT
return kind, data
def _rules_from_requirements(
self, pkg_name: str, requirements, *, kind: RequirementKind
) -> List[RequirementRule]:
"""Manipulate requirements from packages.yaml, and return a list of tuples
with a uniform structure (name, policy, requirements).
"""
if isinstance(requirements, str):
requirements = [requirements]
rules = []
for requirement in requirements:
# A string is equivalent to a one_of group with a single element
if isinstance(requirement, str):
requirement = {"one_of": [requirement]}
for policy in ("spec", "one_of", "any_of"):
if policy not in requirement:
continue
constraints = requirement[policy]
# "spec" is for specifying a single spec
if policy == "spec":
constraints = [constraints]
policy = "one_of"
# validate specs from YAML first, and fail with line numbers if parsing fails.
constraints = [
parse_spec_from_yaml_string(constraint) for constraint in constraints
]
when_str = requirement.get("when")
when = parse_spec_from_yaml_string(when_str) if when_str else spack.spec.Spec()
constraints = [
x
for x in constraints
if not self.reject_requirement_constraint(pkg_name, constraint=x, kind=kind)
]
if not constraints:
continue
rules.append(
RequirementRule(
pkg_name=pkg_name,
policy=policy,
requirements=constraints,
kind=kind,
message=requirement.get("message"),
condition=when,
)
)
return rules
def reject_requirement_constraint(
self, pkg_name: str, *, constraint: spack.spec.Spec, kind: RequirementKind
) -> bool:
"""Returns True if a requirement constraint should be rejected"""
if kind == RequirementKind.DEFAULT:
# Requirements under all: are applied only if they are satisfiable considering only
# package rules, so e.g. variants must exist etc. Otherwise, they are rejected.
try:
s = spack.spec.Spec(pkg_name)
s.constrain(constraint)
s.validate_or_raise()
except spack.error.SpackError as e:
tty.debug(
f"[SETUP] Rejecting the default '{constraint}' requirement "
f"on '{pkg_name}': {str(e)}",
level=2,
)
return True
return False
class CompilerParser:
"""Parses configuration files, and builds a list of possible compilers for the solve."""
@@ -3317,7 +3097,7 @@ def __init__(self, configuration) -> None:
self.compilers.add(candidate)
def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerParser":
def with_input_specs(self, input_specs: List[spack.spec.Spec]) -> "CompilerParser":
"""Accounts for input specs when building the list of possible compilers.
Args:
@@ -3357,7 +3137,7 @@ def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerPar
return self
def add_compiler_from_concrete_spec(self, spec: "spack.spec.Spec") -> None:
def add_compiler_from_concrete_spec(self, spec: spack.spec.Spec) -> None:
"""Account for compilers that are coming from concrete specs, through reuse.
Args:

View File

@@ -1003,6 +1003,8 @@ variant_default_not_used(node(ID, Package), Variant, Value)
node_has_variant(node(ID, Package), Variant, _),
not attr("variant_value", node(ID, Package), Variant, Value),
not propagate(node(ID, Package), variant_value(Variant, _, _)),
% variants set explicitly don't count for this metric

not attr("variant_set", node(ID, Package), Variant, _),
attr("node", node(ID, Package)).
% The variant is set in an external spec

View File

@@ -0,0 +1,232 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import enum
from typing import List, NamedTuple, Optional, Sequence
from llnl.util import tty
import spack.config
import spack.error
import spack.package_base
import spack.spec
from spack.config import get_mark_from_yaml_data
class RequirementKind(enum.Enum):
"""Purpose / provenance of a requirement"""
#: Default requirement expressed under the 'all' attribute of packages.yaml
DEFAULT = enum.auto()
#: Requirement expressed on a virtual package
VIRTUAL = enum.auto()
#: Requirement expressed on a specific package
PACKAGE = enum.auto()
class RequirementRule(NamedTuple):
"""Data class to collect information on a requirement"""
pkg_name: str
policy: str
requirements: Sequence[spack.spec.Spec]
condition: spack.spec.Spec
kind: RequirementKind
message: Optional[str]
class RequirementParser:
"""Parses requirements from package.py files and configuration, and returns rules."""
def __init__(self, configuration: spack.config.Configuration):
self.config = configuration
def rules(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
result = []
result.extend(self.rules_from_package_py(pkg))
result.extend(self.rules_from_require(pkg))
result.extend(self.rules_from_prefer(pkg))
result.extend(self.rules_from_conflict(pkg))
return result
def rules_from_package_py(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
rules = []
for when_spec, requirement_list in pkg.requirements.items():
for requirements, policy, message in requirement_list:
rules.append(
RequirementRule(
pkg_name=pkg.name,
policy=policy,
requirements=requirements,
kind=RequirementKind.PACKAGE,
condition=when_spec,
message=message,
)
)
return rules
def rules_from_virtual(self, virtual_str: str) -> List[RequirementRule]:
requirements = self.config.get("packages", {}).get(virtual_str, {}).get("require", [])
return self._rules_from_requirements(
virtual_str, requirements, kind=RequirementKind.VIRTUAL
)
def rules_from_require(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, requirements = self._raw_yaml_data(pkg, section="require")
return self._rules_from_requirements(pkg.name, requirements, kind=kind)
def rules_from_prefer(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
result = []
kind, preferences = self._raw_yaml_data(pkg, section="prefer")
for item in preferences:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A strong preference is defined as:
#
# require:
# - any_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="any_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
def rules_from_conflict(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
result = []
kind, conflicts = self._raw_yaml_data(pkg, section="conflict")
for item in conflicts:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A conflict is defined as:
#
# require:
# - one_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="one_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
def _parse_prefer_conflict_item(self, item):
# The item is either a string or an object with at least a "spec" attribute
if isinstance(item, str):
spec = parse_spec_from_yaml_string(item)
condition = spack.spec.Spec()
message = None
else:
spec = parse_spec_from_yaml_string(item["spec"])
condition = spack.spec.Spec(item.get("when"))
message = item.get("message")
return spec, condition, message
def _raw_yaml_data(self, pkg: spack.package_base.PackageBase, *, section: str):
config = self.config.get("packages")
data = config.get(pkg.name, {}).get(section, [])
kind = RequirementKind.PACKAGE
if not data:
data = config.get("all", {}).get(section, [])
kind = RequirementKind.DEFAULT
return kind, data
def _rules_from_requirements(
self, pkg_name: str, requirements, *, kind: RequirementKind
) -> List[RequirementRule]:
"""Manipulate requirements from packages.yaml, and return a list of tuples
with a uniform structure (name, policy, requirements).
"""
if isinstance(requirements, str):
requirements = [requirements]
rules = []
for requirement in requirements:
# A string is equivalent to a one_of group with a single element
if isinstance(requirement, str):
requirement = {"one_of": [requirement]}
for policy in ("spec", "one_of", "any_of"):
if policy not in requirement:
continue
constraints = requirement[policy]
# "spec" is for specifying a single spec
if policy == "spec":
constraints = [constraints]
policy = "one_of"
# validate specs from YAML first, and fail with line numbers if parsing fails.
constraints = [
parse_spec_from_yaml_string(constraint) for constraint in constraints
]
when_str = requirement.get("when")
when = parse_spec_from_yaml_string(when_str) if when_str else spack.spec.Spec()
constraints = [
x
for x in constraints
if not self.reject_requirement_constraint(pkg_name, constraint=x, kind=kind)
]
if not constraints:
continue
rules.append(
RequirementRule(
pkg_name=pkg_name,
policy=policy,
requirements=constraints,
kind=kind,
message=requirement.get("message"),
condition=when,
)
)
return rules
def reject_requirement_constraint(
self, pkg_name: str, *, constraint: spack.spec.Spec, kind: RequirementKind
) -> bool:
"""Returns True if a requirement constraint should be rejected"""
if kind == RequirementKind.DEFAULT:
# Requirements under all: are applied only if they are satisfiable considering only
# package rules, so e.g. variants must exist etc. Otherwise, they are rejected.
try:
s = spack.spec.Spec(pkg_name)
s.constrain(constraint)
s.validate_or_raise()
except spack.error.SpackError as e:
tty.debug(
f"[SETUP] Rejecting the default '{constraint}' requirement "
f"on '{pkg_name}': {str(e)}",
level=2,
)
return True
return False
def parse_spec_from_yaml_string(string: str) -> spack.spec.Spec:
"""Parse a spec from YAML and add file/line info to errors, if it's available.
Parse a ``Spec`` from the supplied string, but also intercept any syntax errors and
add file/line information for debugging using file/line annotations from the string.
Arguments:
string: a string representing a ``Spec`` from config YAML.
"""
try:
return spack.spec.Spec(string)
except spack.error.SpecSyntaxError as e:
mark = get_mark_from_yaml_data(string)
if mark:
msg = f"{mark.name}:{mark.line + 1}: {str(e)}"
raise spack.error.SpecSyntaxError(msg) from e
raise e
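For context, _rules_from_requirements consumes the raw require data from packages.yaml. An assumed example of the shapes it normalizes, where a bare string becomes a one-element one_of group and spec/one_of/any_of entries carry optional when and message keys:

    # Equivalent YAML (illustrative):
    #   packages:
    #     mpich:
    #       require:
    #       - one_of: ["@4.1", "@4.0"]
    #         when: "%gcc"
    #         message: "prefer a recent mpich with gcc"
    #       - "+shared"
    requirements = [
        {"one_of": ["@4.1", "@4.0"], "when": "%gcc", "message": "prefer a recent mpich with gcc"},
        "+shared",  # shorthand: parsed as {"one_of": ["+shared"]}
    ]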

View File

@@ -73,7 +73,6 @@
import spack
import spack.compiler
import spack.compilers
import spack.config
import spack.deptypes as dt
import spack.error
import spack.hash_types as ht
@@ -82,7 +81,6 @@
import spack.platforms
import spack.provider_index
import spack.repo
import spack.solver
import spack.store
import spack.traverse as traverse
import spack.util.executable
@@ -2830,46 +2828,6 @@ def ensure_no_deprecated(root):
msg += " For each package listed, choose another spec\n"
raise SpecDeprecatedError(msg)
def concretize(self, tests: Union[bool, Iterable[str]] = False) -> None:
"""Concretize the current spec.
Args:
tests: if False disregard 'test' dependencies, if a list of names activate them for
the packages in the list, if True activate 'test' dependencies for all packages.
"""
import spack.solver.asp
self.replace_hash()
for node in self.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
if self._concrete:
return
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve([self], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = self.name
# TODO: Consolidate this code with similar code in solve.py
if self.virtual:
providers = [spec.name for spec in answer.values() if spec.package.provides(name)]
name = providers[0]
node = spack.solver.asp.SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
self._dup(concretized)
def _mark_root_concrete(self, value=True):
"""Mark just this spec (not dependencies) concrete."""
if (not value) and self.concrete and self.installed:
@@ -2958,21 +2916,6 @@ def _finalize_concretization(self):
for spec in self.traverse():
spec._cached_hash(ht.dag_hash)
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "spack.spec.Spec":
"""This is a non-destructive version of concretize().
First clones, then returns a concrete version of this package
without modifying this package.
Args:
tests (bool or list): if False disregard 'test' dependencies,
if a list of names activate them for the packages in the list,
if True activate 'test' dependencies for all packages.
"""
clone = self.copy()
clone.concretize(tests=tests)
return clone
def index(self, deptype="all"):
"""Return a dictionary that points to all the dependencies in this
spec.
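This removal is the heart of the changeset: both Spec.concretize() and Spec.concretized() move out of spack.spec, and every call site in the test diffs below migrates to the module-level function. A minimal before/after sketch (the package name is illustrative):

    import spack.concretize
    from spack.spec import Spec

    # Before (methods removed above):
    #   spec = Spec("zlib")
    #   spec.concretize()                       # destructive, in place
    #   concrete = Spec("zlib").concretized()   # non-destructive copy
    # After:
    concrete = spack.concretize.concretized(Spec("zlib"))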

View File

@@ -34,7 +34,8 @@
import spack.caches
import spack.config
import spack.error
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.utils
import spack.resource
import spack.spec
import spack.util.crypto
@@ -353,8 +354,8 @@ def __init__(
url_or_fetch_strategy,
*,
name=None,
mirror_paths: Optional["spack.mirror.MirrorLayout"] = None,
mirrors: Optional[Iterable["spack.mirror.Mirror"]] = None,
mirror_paths: Optional["spack.mirrors.layout.MirrorLayout"] = None,
mirrors: Optional[Iterable["spack.mirrors.mirror.Mirror"]] = None,
keep=False,
path=None,
lock=True,
@@ -488,7 +489,7 @@ def _generate_fetchers(self, mirror_only=False) -> Generator["fs.FetchStrategy",
# Insert fetchers in the order that the URLs are provided.
fetchers[:0] = (
fs.from_url_scheme(
url_util.join(mirror.fetch_url, self.mirror_layout.path),
url_util.join(mirror.fetch_url, *self.mirror_layout.path.split(os.sep)),
checksum=digest,
expand=expand,
extension=extension,
@@ -601,7 +602,7 @@ def cache_local(self):
spack.caches.FETCH_CACHE.store(self.fetcher, self.mirror_layout.path)
def cache_mirror(
self, mirror: "spack.caches.MirrorCache", stats: "spack.mirror.MirrorStats"
self, mirror: "spack.caches.MirrorCache", stats: "spack.mirrors.utils.MirrorStats"
) -> None:
"""Perform a fetch if the resource is not already cached

View File

@@ -8,6 +8,7 @@
import pytest
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.solver.asp
@@ -22,7 +23,7 @@ def __init__(self, specs: List[str]) -> None:
self.concr_specs = []
def __enter__(self):
self.concr_specs = [Spec(s).concretized() for s in self.req_specs]
self.concr_specs = [spack.concretize.concretized(Spec(s)) for s in self.req_specs]
for s in self.concr_specs:
PackageInstaller([s.package], fake=True, explicit=True).install()
@@ -63,13 +64,13 @@ def _has_build_dependency(spec: Spec, name: str):
def test_simple_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-z").concretized().satisfies(Spec("splice-z"))
assert spack.concretize.concretized(Spec("splice-z")).satisfies(Spec("splice-z"))
def test_simple_dep_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-h@1").concretized().satisfies(Spec("splice-h@1"))
assert spack.concretize.concretized(Spec("splice-h@1")).satisfies(Spec("splice-h@1"))
def test_splice_installed_hash(splicing_setup):
@@ -82,9 +83,9 @@ def test_splice_installed_hash(splicing_setup):
spack.config.set("packages", packages_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0")
with pytest.raises(Exception):
goal_spec.concretized()
spack.concretize.concretized(goal_spec)
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
assert spack.concretize.concretized(goal_spec).satisfies(goal_spec)
def test_splice_build_splice_node(splicing_setup):
@@ -92,9 +93,9 @@ def test_splice_build_splice_node(splicing_setup):
spack.config.set("packages", _make_specs_non_buildable(["splice-t"]))
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat")
with pytest.raises(Exception):
goal_spec.concretized()
spack.concretize.concretized(goal_spec)
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
assert spack.concretize.concretized(goal_spec).satisfies(goal_spec)
def test_double_splice(splicing_setup):
@@ -108,9 +109,9 @@ def test_double_splice(splicing_setup):
spack.config.set("packages", freeze_builds_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat")
with pytest.raises(Exception):
goal_spec.concretized()
spack.concretize.concretized(goal_spec)
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
assert spack.concretize.concretized(goal_spec).satisfies(goal_spec)
# The next two tests are mirrors of one another
@@ -127,10 +128,10 @@ def test_virtual_multi_splices_in(splicing_setup):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
for gs in goal_specs:
with pytest.raises(Exception):
Spec(gs).concretized()
spack.concretize.concretized(Spec(gs))
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
assert spack.concretize.concretized(Spec(gs)).satisfies(gs)
def test_virtual_multi_can_be_spliced(splicing_setup):
@@ -144,12 +145,12 @@ def test_virtual_multi_can_be_spliced(splicing_setup):
]
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
with pytest.raises(Exception):
for gs in goal_specs:
Spec(gs).concretized()
for gs in goal_specs:
with pytest.raises(Exception):
spack.concretize.concretized(Spec(gs))
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
assert spack.concretize.concretized(Spec(gs)).satisfies(gs)
def test_manyvariant_star_matching_variant_splice(splicing_setup):
@@ -167,10 +168,10 @@ def test_manyvariant_star_matching_variant_splice(splicing_setup):
spack.config.set("packages", freeze_build_config)
for goal in goal_specs:
with pytest.raises(Exception):
goal.concretized()
spack.concretize.concretized(goal)
_enable_splicing()
for goal in goal_specs:
assert goal.concretized().satisfies(goal)
assert spack.concretize.concretized(goal).satisfies(goal)
def test_manyvariant_limited_matching(splicing_setup):
@@ -189,10 +190,10 @@ def test_manyvariant_limited_matching(splicing_setup):
spack.config.set("packages", freeze_build_config)
for s in goal_specs:
with pytest.raises(Exception):
s.concretized()
spack.concretize.concretized(s)
_enable_splicing()
for s in goal_specs:
assert s.concretized().satisfies(s)
assert spack.concretize.concretized(s).satisfies(s)
def test_external_splice_same_name(splicing_setup):
@@ -211,7 +212,7 @@ def test_external_splice_same_name(splicing_setup):
spack.config.set("packages", packages_yaml)
_enable_splicing()
for s in goal_specs:
assert s.concretized().satisfies(s)
assert spack.concretize.concretized(s).satisfies(s)
def test_spliced_build_deps_only_in_build_spec(splicing_setup):
@@ -220,7 +221,7 @@ def test_spliced_build_deps_only_in_build_spec(splicing_setup):
with CacheManager(cache):
_enable_splicing()
concr_goal = goal_spec.concretized()
concr_goal = spack.concretize.concretized(goal_spec)
build_spec = concr_goal._build_spec
# Spec has been spliced
assert build_spec is not None
@@ -238,7 +239,7 @@ def test_spliced_transitive_dependency(splicing_setup):
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
_enable_splicing()
concr_goal = goal_spec.concretized()
concr_goal = spack.concretize.concretized(goal_spec)
# Spec has been spliced
assert concr_goal._build_spec is not None
assert concr_goal["splice-t"]._build_spec is not None

View File

@@ -134,5 +134,5 @@ def test_concretize_target_ranges(root_target_range, dep_target_range, result, m
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
spec = spack.concretize.concretized(spec)
assert spec.target == spec["pkg-b"].target == result

View File

@@ -28,11 +28,12 @@
import spack.binary_distribution as bindist
import spack.caches
import spack.compilers
import spack.concretize
import spack.config
import spack.fetch_strategy
import spack.hooks.sbang as sbang
import spack.main
import spack.mirror
import spack.mirrors.mirror
import spack.paths
import spack.spec
import spack.stage
@@ -213,8 +214,9 @@ def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with default rpaths
into the default directory layout scheme.
"""
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
sy_spec = Spec("symly").concretized()
gspec = spack.concretize.concretized(Spec("garply"))
cspec = spack.concretize.concretized(Spec("corge"))
sy_spec = spack.concretize.concretized(Spec("symly"))
# Install 'corge' without using a cache
install_cmd("--no-cache", cspec.name)
@@ -261,9 +263,9 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with default rpaths
into the non-default directory layout scheme.
"""
cspec = Spec("corge").concretized()
cspec = spack.concretize.concretized(Spec("corge"))
# This guy tests for symlink relocation
sy_spec = Spec("symly").concretized()
sy_spec = spack.concretize.concretized(Spec("symly"))
# Install some packages with dependent packages
# test install in non-default install path scheme
@@ -284,7 +286,8 @@ def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with relative
rpaths into the default directory layout scheme.
"""
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
gspec = spack.concretize.concretized(Spec("garply"))
cspec = spack.concretize.concretized(Spec("corge"))
# Install buildcache created with relativized rpaths
buildcache_cmd("install", "-uf", cspec.name)
@@ -313,7 +316,7 @@ def test_relative_rpaths_install_nondefault(temporary_mirror_dir):
Test the installation of buildcaches with relativized rpaths
into the non-default directory layout scheme.
"""
cspec = Spec("corge").concretized()
cspec = spack.concretize.concretized(Spec("corge"))
# Test install in non-default install path scheme and relative path
buildcache_cmd("install", "-uf", cspec.name)
@@ -324,8 +327,8 @@ def test_push_and_fetch_keys(mock_gnupghome, tmp_path):
mirror = os.path.join(testpath, "mirror")
mirrors = {"test-mirror": url_util.path_to_file_url(mirror)}
mirrors = spack.mirror.MirrorCollection(mirrors)
mirror = spack.mirror.Mirror(url_util.path_to_file_url(mirror))
mirrors = spack.mirrors.mirror.MirrorCollection(mirrors)
mirror = spack.mirrors.mirror.Mirror(url_util.path_to_file_url(mirror))
gpg_dir1 = os.path.join(testpath, "gpg1")
gpg_dir2 = os.path.join(testpath, "gpg2")
@@ -366,7 +369,8 @@ def test_built_spec_cache(temporary_mirror_dir):
that cache from a buildcache index."""
buildcache_cmd("list", "-a", "-l")
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
gspec = spack.concretize.concretized(Spec("garply"))
cspec = spack.concretize.concretized(Spec("corge"))
for s in [gspec, cspec]:
results = bindist.get_mirrors_for_spec(s)
@@ -389,7 +393,7 @@ def test_spec_needs_rebuild(monkeypatch, tmpdir):
mirror_dir = tmpdir.join("mirror_dir")
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
s = Spec("libdwarf").concretized()
s = spack.concretize.concretized(Spec("libdwarf"))
# Install a package
install_cmd(s.name)
@@ -418,7 +422,7 @@ def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
spack.config.set("mirrors", {"test": mirror_url})
s = Spec("libdwarf").concretized()
s = spack.concretize.concretized(Spec("libdwarf"))
# Install a package
install_cmd("--no-cache", s.name)
@@ -508,7 +512,7 @@ def test_update_sbang(tmpdir, temporary_mirror):
"""
spec_str = "old-sbang"
# Concretize a package with some old-fashioned sbang lines.
old_spec = Spec(spec_str).concretized()
old_spec = spack.concretize.concretized(Spec(spec_str))
old_spec_hash_str = "/{0}".format(old_spec.dag_hash())
# Need a fake mirror with *function* scope.
@@ -529,7 +533,7 @@ def test_update_sbang(tmpdir, temporary_mirror):
# Switch the store to the new install tree locations
newtree_dir = tmpdir.join("newtree")
with spack.store.use_store(str(newtree_dir)):
new_spec = Spec("old-sbang").concretized()
new_spec = spack.concretize.concretized(Spec("old-sbang"))
assert new_spec.dag_hash() == old_spec.dag_hash()
# Install package from buildcache

View File

@@ -9,7 +9,8 @@
import pytest
import spack.binary_distribution as bd
import spack.mirror
import spack.concretize
import spack.mirrors.mirror
import spack.spec
from spack.installer import PackageInstaller
@@ -17,13 +18,13 @@
def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmp_path):
spec = spack.spec.Spec("trivial-install-test-package").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("trivial-install-test-package"))
PackageInstaller([spec.package], fake=True).install()
specs = [spec]
# populate cache, everything is new
mirror = spack.mirror.Mirror.from_local_path(str(tmp_path))
mirror = spack.mirrors.mirror.Mirror.from_local_path(str(tmp_path))
with bd.make_uploader(mirror) as uploader:
skipped = uploader.push_or_raise(specs)
assert not skipped

View File

@@ -17,6 +17,7 @@
import spack.build_environment
import spack.compiler
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.package_base
@@ -164,8 +165,7 @@ def test_static_to_shared_library(build_environment):
@pytest.mark.regression("8345")
@pytest.mark.usefixtures("config", "mock_packages")
def test_cc_not_changed_by_modules(monkeypatch, working_env):
s = spack.spec.Spec("cmake")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("cmake"))
pkg = s.package
def _set_wrong_cc(x):
@@ -185,7 +185,7 @@ def test_setup_dependent_package_inherited_modules(
working_env, mock_packages, install_mockery, mock_fetch
):
# This will raise on regression
s = spack.spec.Spec("cmake-client-inheritor").concretized()
s = spack.concretize.concretized(spack.spec.Spec("cmake-client-inheritor"))
PackageInstaller([s.package]).install()
@@ -278,7 +278,7 @@ def platform_pathsep(pathlist):
return convert_to_platform_path(pathlist)
# Monkeypatch a pkg.compiler.environment with the required modifications
pkg = spack.spec.Spec("cmake").concretized().package
pkg = spack.concretize.concretized(spack.spec.Spec("cmake")).package
monkeypatch.setattr(pkg.compiler, "environment", modifications)
# Trigger the modifications
spack.build_environment.setup_package(pkg, False)
@@ -302,7 +302,7 @@ def custom_env(pkg, env):
env.prepend_path("PATH", test_path)
env.append_flags("ENV_CUSTOM_CC_FLAGS", "--custom-env-flag1")
pkg = spack.spec.Spec("cmake").concretized().package
pkg = spack.concretize.concretized(spack.spec.Spec("cmake")).package
monkeypatch.setattr(pkg.compiler, "setup_custom_environment", custom_env)
spack.build_environment.setup_package(pkg, False)
@@ -323,7 +323,7 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
}
spack.config.set("packages:cmake", cmake_config)
cmake_client = spack.spec.Spec("cmake-client").concretized()
cmake_client = spack.concretize.concretized(spack.spec.Spec("cmake-client"))
spack.build_environment.setup_package(cmake_client.package, False)
assert os.environ["TEST_ENV_VAR_SET"] == "yes it's set"
@@ -331,8 +331,7 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
@pytest.mark.regression("9107")
def test_spack_paths_before_module_paths(config, mock_packages, monkeypatch, working_env):
s = spack.spec.Spec("cmake")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("cmake"))
pkg = s.package
module_path = os.path.join("path", "to", "module")
@@ -353,8 +352,7 @@ def _set_wrong_cc(x):
def test_package_inheritance_module_setup(config, mock_packages, working_env):
s = spack.spec.Spec("multimodule-inheritance")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("multimodule-inheritance"))
pkg = s.package
spack.build_environment.setup_package(pkg, False)
@@ -388,8 +386,7 @@ def test_wrapper_variables(
not in cuda_include_dirs
)
root = spack.spec.Spec("dt-diamond")
root.concretize()
root = spack.concretize.concretized(spack.spec.Spec("dt-diamond"))
for s in root.traverse():
s.prefix = "/{0}-prefix/".format(s.name)
@@ -454,7 +451,7 @@ def test_external_prefixes_last(mutable_config, mock_packages, working_env, monk
"""
)
spack.config.set("packages", cfg_data)
top = spack.spec.Spec("dt-diamond").concretized()
top = spack.concretize.concretized(spack.spec.Spec("dt-diamond"))
def _trust_me_its_a_dir(path):
return True
@@ -501,8 +498,7 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
)
def test_setting_dtags_based_on_config(config_setting, expected_flag, config, mock_packages):
# Pick a random package to be able to set compiler's variables
s = spack.spec.Spec("cmake")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("cmake"))
pkg = s.package
env = EnvironmentModifications()
@@ -534,7 +530,7 @@ def setup_dependent_package(module, dependent_spec):
assert dependent_module.ninja is not None
dependent_spec.package.test_attr = True
externaltool = spack.spec.Spec("externaltest").concretized()
externaltool = spack.concretize.concretized(spack.spec.Spec("externaltest"))
monkeypatch.setattr(
externaltool["externaltool"].package, "setup_dependent_package", setup_dependent_package
)
@@ -729,7 +725,7 @@ def test_build_system_globals_only_set_on_root_during_build(default_mock_concret
But obviously it can lead to very hard to find bugs... We should get rid of those globals and
define them instead as a property on the package instance.
"""
root = spack.spec.Spec("mpileaks").concretized()
root = spack.concretize.concretized(spack.spec.Spec("mpileaks"))
build_variables = ("std_cmake_args", "std_meson_args", "std_pip_args")
# See the TODO above: we clear out any properties that may have been set by the previous test.

View File

@@ -16,6 +16,7 @@
import spack.build_systems.autotools
import spack.build_systems.cmake
import spack.builder
import spack.concretize
import spack.environment
import spack.error
import spack.paths
@@ -145,7 +146,7 @@ def test_none_is_allowed(self, default_mock_concretization):
def test_libtool_archive_files_are_deleted_by_default(self, mutable_database):
# Install a package that creates a mock libtool archive
s = Spec("libtool-deletion").concretized()
s = spack.concretize.concretized(Spec("libtool-deletion"))
PackageInstaller([s.package], explicit=True).install()
# Assert the libtool archive is not there and we have
@@ -160,7 +161,7 @@ def test_libtool_archive_files_might_be_installed_on_demand(
):
# Install a package that creates a mock libtool archive,
# patch its package to preserve the installation
s = Spec("libtool-deletion").concretized()
s = spack.concretize.concretized(Spec("libtool-deletion"))
monkeypatch.setattr(
type(spack.builder.create(s.package)), "install_libtool_archives", True
)
@@ -174,7 +175,9 @@ def test_autotools_gnuconfig_replacement(self, mutable_database):
Tests whether only broken config.sub and config.guess are replaced with
files from working alternatives from the gnuconfig package.
"""
s = Spec("autotools-config-replacement +patch_config_files +gnuconfig").concretized()
s = spack.concretize.concretized(
Spec("autotools-config-replacement +patch_config_files +gnuconfig")
)
PackageInstaller([s.package]).install()
with open(os.path.join(s.prefix.broken, "config.sub")) as f:
@@ -193,7 +196,9 @@ def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):
"""
Tests that config.sub and config.guess are not replaced when patch_config_files is disabled.
"""
s = Spec("autotools-config-replacement ~patch_config_files +gnuconfig").concretized()
s = spack.concretize.concretized(
Spec("autotools-config-replacement ~patch_config_files +gnuconfig")
)
PackageInstaller([s.package]).install()
with open(os.path.join(s.prefix.broken, "config.sub")) as f:
@@ -218,8 +223,9 @@ def test_autotools_gnuconfig_replacement_no_gnuconfig(self, mutable_database, mo
enabled, but gnuconfig is not listed as a direct build dependency.
"""
monkeypatch.setattr(spack.platforms.test.Test, "default", "x86_64")
s = Spec("autotools-config-replacement +patch_config_files ~gnuconfig")
s.concretize()
s = spack.concretize.concretized(
Spec("autotools-config-replacement +patch_config_files ~gnuconfig")
)
msg = "Cannot patch config files: missing dependencies: gnuconfig"
with pytest.raises(ChildError, match=msg):
@@ -299,7 +305,7 @@ def test_define(self, default_mock_concretization):
assert define("SINGLE", "red") == "-DSINGLE:STRING=red"
def test_define_from_variant(self):
s = Spec("cmake-client multi=up,right ~truthy single=red").concretized()
s = spack.concretize.concretized(Spec("cmake-client multi=up,right ~truthy single=red"))
arg = s.package.define_from_variant("MULTI")
assert arg == "-DMULTI:STRING=right;up"

View File

@@ -9,6 +9,7 @@
from llnl.util.filesystem import touch
import spack.builder
import spack.concretize
import spack.paths
import spack.repo
import spack.spec
@@ -79,7 +80,7 @@ def builder_test_repository():
@pytest.mark.disable_clean_stage_check
def test_callbacks_and_installation_procedure(spec_str, expected_values, working_env):
"""Test the correct execution of callbacks and installation procedures for packages."""
s = spack.spec.Spec(spec_str).concretized()
s = spack.concretize.concretized(spack.spec.Spec(spec_str))
builder = spack.builder.create(s.package)
for phase_fn in builder:
phase_fn.execute()
@@ -102,7 +103,7 @@ def test_callbacks_and_installation_procedure(spec_str, expected_values, working
],
)
def test_old_style_compatibility_with_super(spec_str, method_name, expected):
s = spack.spec.Spec(spec_str).concretized()
s = spack.concretize.concretized(spack.spec.Spec(spec_str))
builder = spack.builder.create(s.package)
value = getattr(builder, method_name)()
assert value == expected
@@ -113,7 +114,7 @@ def test_old_style_compatibility_with_super(spec_str, method_name, expected):
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
@pytest.mark.disable_clean_stage_check
def test_build_time_tests_are_executed_from_default_builder():
s = spack.spec.Spec("old-style-autotools").concretized()
s = spack.concretize.concretized(spack.spec.Spec("old-style-autotools"))
builder = spack.builder.create(s.package)
builder.pkg.run_tests = True
for phase_fn in builder:
@@ -127,7 +128,7 @@ def test_build_time_tests_are_executed_from_default_builder():
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
def test_monkey_patching_wrapped_pkg():
"""Confirm 'run_tests' is accessible through wrappers."""
s = spack.spec.Spec("old-style-autotools").concretized()
s = spack.concretize.concretized(spack.spec.Spec("old-style-autotools"))
builder = spack.builder.create(s.package)
assert s.package.run_tests is False
assert builder.pkg.run_tests is False
@@ -142,7 +143,7 @@ def test_monkey_patching_wrapped_pkg():
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
def test_monkey_patching_test_log_file():
"""Confirm 'test_log_file' is accessible through wrappers."""
s = spack.spec.Spec("old-style-autotools").concretized()
s = spack.concretize.concretized(spack.spec.Spec("old-style-autotools"))
builder = spack.builder.create(s.package)
s.package.tester.test_log_file = "/some/file"
@@ -155,7 +156,7 @@ def test_monkey_patching_test_log_file():
@pytest.mark.not_on_windows("Does not run on windows")
def test_install_time_test_callback(tmpdir, config, mock_packages, mock_stage):
"""Confirm able to run stand-alone test as a post-install callback."""
s = spack.spec.Spec("py-test-callback").concretized()
s = spack.concretize.concretized(spack.spec.Spec("py-test-callback"))
builder = spack.builder.create(s.package)
builder.pkg.run_tests = True
s.package.tester.test_log_file = tmpdir.join("install_test.log")
@@ -175,7 +176,7 @@ def test_mixins_with_builders(working_env):
"""Tests that run_after and run_before callbacks are accumulated correctly,
when mixins are used with builders.
"""
s = spack.spec.Spec("builder-and-mixins").concretized()
s = spack.concretize.concretized(spack.spec.Spec("builder-and-mixins"))
builder = spack.builder.create(s.package)
# Check that callbacks added by the mixin are in the list

View File

@@ -5,6 +5,7 @@
import pytest
import spack.concretize
import spack.deptypes as dt
import spack.installer as inst
import spack.repo
@@ -22,8 +23,7 @@ def test_build_request_errors(install_mockery):
def test_build_request_basics(install_mockery):
spec = spack.spec.Spec("dependent-install")
spec.concretize()
spec = spack.concretize.concretized(spack.spec.Spec("dependent-install"))
assert spec.concrete
# Ensure key properties match expectations
@@ -40,8 +40,7 @@ def test_build_request_basics(install_mockery):
def test_build_request_strings(install_mockery):
"""Tests of BuildRequest repr and str for coverage purposes."""
# Using a package with one dependency
spec = spack.spec.Spec("dependent-install")
spec.concretize()
spec = spack.concretize.concretized(spack.spec.Spec("dependent-install"))
assert spec.concrete
# Ensure key properties match expectations
@@ -73,7 +72,7 @@ def test_build_request_deptypes(
package_deptypes,
dependencies_deptypes,
):
s = spack.spec.Spec("dependent-install").concretized()
s = spack.concretize.concretized(spack.spec.Spec("dependent-install"))
build_request = inst.BuildRequest(
s.package,

View File

@@ -5,6 +5,7 @@
import pytest
import spack.concretize
import spack.error
import spack.installer as inst
import spack.repo
@@ -25,7 +26,7 @@ def test_build_task_errors(install_mockery):
inst.BuildTask(pkg_cls(spec), None)
# Using a concretized package now means the request argument is checked.
spec.concretize()
spec = spack.concretize.concretized(spec)
assert spec.concrete
with pytest.raises(TypeError, match="is not a valid build request"):
@@ -48,8 +49,7 @@ def test_build_task_errors(install_mockery):
def test_build_task_basics(install_mockery):
spec = spack.spec.Spec("dependent-install")
spec.concretize()
spec = spack.concretize.concretized(spack.spec.Spec("dependent-install"))
assert spec.concrete
# Ensure key properties match expectations
@@ -70,8 +70,7 @@ def test_build_task_basics(install_mockery):
def test_build_task_strings(install_mockery):
"""Tests of build_task repr and str for coverage purposes."""
# Using a package with one dependency
spec = spack.spec.Spec("dependent-install")
spec.concretize()
spec = spack.concretize.concretized(spack.spec.Spec("dependent-install"))
assert spec.concrete
# Ensure key properties match expectations

View File

@@ -10,6 +10,7 @@
import llnl.util.filesystem as fs
import spack.ci as ci
import spack.concretize
import spack.environment as ev
import spack.error
import spack.paths as spack_paths
@@ -326,7 +327,7 @@ def test_ci_run_standalone_tests_not_installed_junit(
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"job_spec": spack.concretize.concretized(spack.spec.Spec("printing-package")),
"repro_dir": str(repro_dir),
"fail_fast": True,
}
@@ -345,7 +346,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"job_spec": spack.concretize.concretized(spack.spec.Spec("printing-package")),
"repro_dir": str(repro_dir),
}
@@ -378,7 +379,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
def test_ci_skipped_report(tmpdir, mock_packages, config):
"""Test explicit skipping of report as well as CI's 'package' arg."""
pkg = "trivial-smoke-test"
spec = spack.spec.Spec(pkg).concretized()
spec = spack.concretize.concretized(spack.spec.Spec(pkg))
ci_cdash = {
"url": "file://fake",
"build-group": "fake-group",

View File

@@ -11,11 +11,11 @@
import spack.bootstrap
import spack.bootstrap.core
import spack.concretize
import spack.config
import spack.environment as ev
import spack.main
import spack.mirror
import spack.spec
import spack.mirrors.utils
_bootstrap = spack.main.SpackCommand("bootstrap")
@@ -182,9 +182,9 @@ def test_bootstrap_mirror_metadata(mutable_config, linux_os, monkeypatch, tmpdir
`spack bootstrap add`. Here we don't download data, since that would be an
expensive operation for a unit test.
"""
old_create = spack.mirror.create
monkeypatch.setattr(spack.mirror, "create", lambda p, s: old_create(p, []))
monkeypatch.setattr(spack.spec.Spec, "concretized", lambda p: p)
old_create = spack.mirrors.utils.create
monkeypatch.setattr(spack.mirrors.utils, "create", lambda p, s: old_create(p, []))
monkeypatch.setattr(spack.concretize, "concretized", lambda p: p)
# Create the mirror in a temporary folder
compilers = [

View File

@@ -13,10 +13,11 @@
import spack.binary_distribution
import spack.cmd.buildcache
import spack.concretize
import spack.environment as ev
import spack.error
import spack.main
import spack.mirror
import spack.mirrors.mirror
import spack.spec
import spack.util.url
from spack.installer import PackageInstaller
@@ -82,7 +83,7 @@ def tests_buildcache_create(install_mockery, mock_fetch, monkeypatch, tmpdir):
buildcache("push", "--unsigned", str(tmpdir), pkg)
spec = Spec(pkg).concretized()
spec = spack.concretize.concretized(Spec(pkg))
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -102,7 +103,7 @@ def tests_buildcache_create_env(
buildcache("push", "--unsigned", str(tmpdir))
spec = Spec(pkg).concretized()
spec = spack.concretize.concretized(Spec(pkg))
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -146,7 +147,7 @@ def test_update_key_index(
gpg("create", "Test Signing Key", "nobody@nowhere.com")
s = Spec("libdwarf").concretized()
s = spack.concretize.concretized(Spec("libdwarf"))
# Install a package
install(s.name)
@@ -176,7 +177,7 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
mirror("add", "--unsigned", "mirror", mirror_dir.as_uri())
mirror("add", "--autopush", "--unsigned", "mirror-autopush", mirror_autopush_dir.as_uri())
s = Spec("libdwarf").concretized()
s = spack.concretize.concretized(Spec("libdwarf"))
# Install and generate build cache index
PackageInstaller([s.package], explicit=True).install()
@@ -220,7 +221,7 @@ def verify_mirror_contents():
assert False
# Install a package and put it in the buildcache
s = Spec(out_env_pkg).concretized()
s = spack.concretize.concretized(Spec(out_env_pkg))
install(s.name)
buildcache("push", "-u", "-f", src_mirror_url, s.name)
@@ -330,7 +331,7 @@ def test_buildcache_create_install(
buildcache("push", "--unsigned", str(tmpdir), pkg)
spec = Spec(pkg).concretized()
spec = spack.concretize.concretized(Spec(pkg))
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -385,7 +386,9 @@ def test_correct_specs_are_pushed(
class DontUpload(spack.binary_distribution.Uploader):
def __init__(self):
super().__init__(spack.mirror.Mirror.from_local_path(str(tmpdir)), False, False)
super().__init__(
spack.mirrors.mirror.Mirror.from_local_path(str(tmpdir)), False, False
)
self.pushed = []
def push(self, specs: List[spack.spec.Spec]):
@@ -449,7 +452,7 @@ def test_push_and_install_with_mirror_marked_unsigned_does_not_require_extra_fla
def test_skip_no_redistribute(mock_packages, config):
specs = list(Spec("no-redistribute-dependent").concretized().traverse())
specs = list(spack.concretize.concretized(Spec("no-redistribute-dependent")).traverse())
filtered = spack.cmd.buildcache._skip_no_redistribute_for_public(specs)
assert not any(s.name == "no-redistribute" for s in filtered)
assert any(s.name == "no-redistribute-dependent" for s in filtered)
@@ -489,7 +492,7 @@ def test_push_without_build_deps(tmp_path, temporary_store, mock_packages, mutab
mirror("add", "--unsigned", "my-mirror", str(tmp_path))
s = spack.spec.Spec("dtrun3").concretized()
s = spack.concretize.concretized(spack.spec.Spec("dtrun3"))
PackageInstaller([s.package], explicit=True, fake=True).install()
s["dtbuild3"].package.do_uninstall()

View File

@@ -8,6 +8,7 @@
import pytest
import spack.cmd.checksum
import spack.concretize
import spack.error
import spack.package_base
import spack.repo
@@ -309,7 +310,7 @@ def test_checksum_url(mock_packages, config):
def test_checksum_verification_fails(default_mock_concretization, capsys, can_fetch_versions):
spec = spack.spec.Spec("zlib").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("zlib"))
pkg = spec.package
versions = list(pkg.versions.keys())
version_hashes = {versions[0]: "abadhash", Version("0.1"): "123456789"}

View File

@@ -19,6 +19,7 @@
import spack.ci as ci
import spack.cmd
import spack.cmd.ci
import spack.concretize
import spack.environment as ev
import spack.hash_types as ht
import spack.main
@@ -145,7 +146,7 @@ def test_specs_staging(config, tmpdir):
builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)])
with repo.use_repositories(builder.root):
spec_a = Spec("pkg-a").concretized()
spec_a = spack.concretize.concretized(Spec("pkg-a"))
spec_a_label = ci._spec_ci_label(spec_a)
spec_b_label = ci._spec_ci_label(spec_a["pkg-b"])
@@ -1108,7 +1109,7 @@ def test_ci_rebuild_index(
with working_dir(tmp_path):
env_cmd("create", "test", "./spack.yaml")
with ev.read("test"):
concrete_spec = Spec("callpath").concretized()
concrete_spec = spack.concretize.concretized(Spec("callpath"))
with open(tmp_path / "spec.json", "w") as f:
f.write(concrete_spec.to_json(hash=ht.dag_hash))
@@ -1229,12 +1230,10 @@ def test_ci_generate_read_broken_specs_url(
ci_base_environment,
):
"""Verify that `broken-specs-url` works as intended"""
spec_a = Spec("pkg-a")
spec_a.concretize()
spec_a = spack.concretize.concretized(Spec("pkg-a"))
a_dag_hash = spec_a.dag_hash()
spec_flattendeps = Spec("flatten-deps")
spec_flattendeps.concretize()
spec_flattendeps = spack.concretize.concretized(Spec("flatten-deps"))
flattendeps_dag_hash = spec_flattendeps.dag_hash()
broken_specs_url = tmp_path.as_uri()
@@ -1585,8 +1584,7 @@ def dynamic_mapping_setup(tmpdir):
"""
)
spec_a = Spec("pkg-a")
spec_a.concretize()
spec_a = spack.concretize.concretized(Spec("pkg-a"))
return ci.get_job_name(spec_a)

View File

@@ -11,10 +11,10 @@
import spack.caches
import spack.cmd.clean
import spack.concretize
import spack.environment as ev
import spack.main
import spack.package_base
import spack.spec
import spack.stage
import spack.store
@@ -78,7 +78,7 @@ def test_env_aware_clean(mock_stage, install_mockery, mutable_mock_env_path, mon
def fail(*args, **kwargs):
raise Exception("This should not have been called")
monkeypatch.setattr(spack.spec.Spec, "concretize", fail)
monkeypatch.setattr(spack.concretize, "concretized", fail)
with e:
clean("mpileaks")

View File

@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.config
import spack.database
import spack.environment as ev
@@ -594,8 +595,7 @@ def test_config_prefer_upstream(
prepared_db = spack.database.Database(mock_db_root, layout=gen_mock_layout("/a/"))
for spec in ["hdf5 +mpi", "hdf5 ~mpi", "boost+debug~icu+graph", "dependency-install", "patch"]:
dep = spack.spec.Spec(spec)
dep.concretize()
dep = spack.concretize.concretized(spack.spec.Spec(spec))
prepared_db.add(dep)
downstream_db_root = str(tmpdir_factory.mktemp("mock_downstream_db_root"))

View File

@@ -5,6 +5,7 @@
import pytest
import spack.concretize
import spack.spec
import spack.store
from spack.enums import InstallRecordStatus
@@ -67,8 +68,8 @@ def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery
install("libdwarf@20130729 ^libelf@0.8.13")
install("libdwarf@20130207 ^libelf@0.8.10")
new_spec = spack.spec.Spec("libdwarf@20130729^libelf@0.8.13").concretized()
old_spec = spack.spec.Spec("libdwarf@20130207^libelf@0.8.10").concretized()
new_spec = spack.concretize.concretized(spack.spec.Spec("libdwarf@20130729^libelf@0.8.13"))
old_spec = spack.concretize.concretized(spack.spec.Spec("libdwarf@20130207^libelf@0.8.10"))
all_installed = spack.store.STORE.db.query()
@@ -108,12 +109,12 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
install("libelf@0.8.12")
install("libelf@0.8.10")
deprecated_spec = spack.spec.Spec("libelf@0.8.10").concretized()
deprecated_spec = spack.concretize.concretized(spack.spec.Spec("libelf@0.8.10"))
deprecate("-y", "libelf@0.8.10", "libelf@0.8.12")
deprecator = spack.store.STORE.db.deprecator(deprecated_spec)
assert deprecator == spack.spec.Spec("libelf@0.8.12").concretized()
assert deprecator == spack.concretize.concretized(spack.spec.Spec("libelf@0.8.12"))
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
@@ -123,7 +124,7 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
assert len(all_available) == 3
deprecator = spack.store.STORE.db.deprecator(deprecated_spec)
assert deprecator == spack.spec.Spec("libelf@0.8.13").concretized()
assert deprecator == spack.concretize.concretized(spack.spec.Spec("libelf@0.8.13"))
def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_mockery):
@@ -133,9 +134,9 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
install("libelf@0.8.12")
install("libelf@0.8.10")
first_deprecated_spec = spack.spec.Spec("libelf@0.8.10").concretized()
second_deprecated_spec = spack.spec.Spec("libelf@0.8.12").concretized()
final_deprecator = spack.spec.Spec("libelf@0.8.13").concretized()
first_deprecated_spec = spack.concretize.concretized(spack.spec.Spec("libelf@0.8.10"))
second_deprecated_spec = spack.concretize.concretized(spack.spec.Spec("libelf@0.8.12"))
final_deprecator = spack.concretize.concretized(spack.spec.Spec("libelf@0.8.13"))
deprecate("-y", "libelf@0.8.10", "libelf@0.8.12")
@@ -165,7 +166,7 @@ def test_concretize_deprecated(mock_packages, mock_archive, mock_fetch, install_
spec = spack.spec.Spec("libelf@0.8.10")
with pytest.raises(spack.spec.SpecDeprecatedError):
spec.concretize()
spack.concretize.concretized(spec)
@pytest.mark.usefixtures("mock_packages", "mock_archive", "mock_fetch", "install_mockery")
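
Error paths follow the same shape: where a test previously concretized in place inside `pytest.raises`, it now calls the function and ignores the return value. A sketch based on the deprecation test above:

```python
import pytest
import spack.concretize
import spack.spec

spec = spack.spec.Spec("libelf@0.8.10")
with pytest.raises(spack.spec.SpecDeprecatedError):
    spack.concretize.concretized(spec)  # expected to raise; result unused
```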

View File

@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.environment as ev
import spack.error
import spack.repo
@@ -24,7 +25,9 @@
def test_dev_build_basics(tmpdir, install_mockery):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
spec = spack.concretize.concretized(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
assert "dev_path" in spec.variants
@@ -42,7 +45,9 @@ def test_dev_build_basics(tmpdir, install_mockery):
def test_dev_build_before(tmpdir, install_mockery):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
spec = spack.concretize.concretized(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
with tmpdir.as_cwd():
with open(spec.package.filename, "w") as f:
@@ -58,7 +63,9 @@ def test_dev_build_before(tmpdir, install_mockery):
def test_dev_build_until(tmpdir, install_mockery):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
spec = spack.concretize.concretized(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
with tmpdir.as_cwd():
with open(spec.package.filename, "w") as f:
@@ -76,7 +83,9 @@ def test_dev_build_until(tmpdir, install_mockery):
def test_dev_build_until_last_phase(tmpdir, install_mockery):
# Test that we ignore the last_phase argument if it is already last
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
spec = spack.concretize.concretized(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
with tmpdir.as_cwd():
with open(spec.package.filename, "w") as f:
@@ -94,7 +103,9 @@ def test_dev_build_until_last_phase(tmpdir, install_mockery):
def test_dev_build_before_until(tmpdir, install_mockery):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
spec = spack.concretize.concretized(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
with tmpdir.as_cwd():
with open(spec.package.filename, "w") as f:
@@ -130,8 +141,9 @@ def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery,
def test_dev_build_fails_already_installed(tmpdir, install_mockery):
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
spec.concretize()
spec = spack.concretize.concretized(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
)
with tmpdir.as_cwd():
with open(spec.package.filename, "w") as f:
@@ -173,8 +185,9 @@ def test_dev_build_env(tmpdir, install_mockery, mutable_mock_env_path):
"""Test Spack does dev builds for packages in develop section of env."""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
spec.concretize()
spec = spack.concretize.concretized(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
)
with build_dir.as_cwd():
with open(spec.package.filename, "w") as f:
@@ -209,8 +222,9 @@ def test_dev_build_env_with_vars(tmpdir, install_mockery, mutable_mock_env_path,
"""Test Spack does dev builds for packages in develop section of env (path with variables)."""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
spec.concretize()
spec = spack.concretize.concretized(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
)
# store the build path in an environment variable that will be used in the environment
monkeypatch.setenv("CUSTOM_BUILD_PATH", build_dir)
@@ -247,8 +261,9 @@ def test_dev_build_env_version_mismatch(tmpdir, install_mockery, mutable_mock_en
"""Test Spack constraints concretization by develop specs."""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
spec.concretize()
spec = spack.concretize.concretized(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
)
with build_dir.as_cwd():
with open(spec.package.filename, "w") as f:
@@ -328,8 +343,8 @@ def test_dev_build_multiple(tmpdir, install_mockery, mutable_mock_env_path, mock
with ev.read("test"):
# Do concretization inside environment for dev info
# These specs are the source of truth to compare against the installs
leaf_spec.concretize()
root_spec.concretize()
leaf_spec = spack.concretize.concretized(leaf_spec)
root_spec = spack.concretize.concretized(root_spec)
# Do install
install()
@@ -375,8 +390,8 @@ def test_dev_build_env_dependency(tmpdir, install_mockery, mock_fetch, mutable_m
# concretize in the environment to get the dev build info
# equivalent to setting dev_build and dev_path variants
# on all specs above
spec.concretize()
dep_spec.concretize()
spec = spack.concretize.concretized(spec)
dep_spec = spack.concretize.concretized(dep_spec)
install()
# Ensure that both specs installed properly
@@ -400,8 +415,9 @@ def test_dev_build_rebuild_on_source_changes(
"""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
spec.concretize()
spec = spack.concretize.concretized(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
)
def reset_string():
with build_dir.as_cwd():
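
Long spec strings that previously fit a one-line method chain now wrap across the function call; the resulting concrete spec still carries the `dev_path` variant these tests assert on. A sketch, with a hypothetical `build_dir` standing in for the pytest tmpdir fixture:

```python
import spack.concretize
import spack.spec

build_dir = "/tmp/example-build"  # hypothetical; the tests use a tmpdir fixture
spec = spack.concretize.concretized(
    spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
)
assert "dev_path" in spec.variants
```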

View File

@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.config
import spack.environment as ev
import spack.package_base
@@ -139,7 +140,8 @@ def check_path(stage, dest):
self.check_develop(e, spack.spec.Spec("mpich@=1.0"), path)
# Check modifications actually worked
assert spack.spec.Spec("mpich@1.0").concretized().satisfies("dev_path=%s" % abspath)
result = spack.concretize.concretized(spack.spec.Spec("mpich@1.0"))
assert result.satisfies("dev_path=%s" % abspath)
def test_develop_canonicalize_path_no_args(self, monkeypatch):
env("create", "test")
@@ -166,7 +168,8 @@ def check_path(stage, dest):
self.check_develop(e, spack.spec.Spec("mpich@=1.0"), path)
# Check modifications actually worked
assert spack.spec.Spec("mpich@1.0").concretized().satisfies("dev_path=%s" % abspath)
result = spack.concretize.concretized(spack.spec.Spec("mpich@1.0"))
assert result.satisfies("dev_path=%s" % abspath)
def _git_commit_list(git_repo_dir):
@@ -191,7 +194,7 @@ def test_develop_full_git_repo(
spack.package_base.PackageBase, "git", "file://%s" % repo_path, raising=False
)
spec = spack.spec.Spec("git-test-commit@1.2").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("git-test-commit@1.2"))
try:
spec.package.do_stage()
commits = _git_commit_list(spec.package.stage[0].source_path)

View File

@@ -6,6 +6,7 @@
import pytest
import spack.cmd.diff
import spack.concretize
import spack.main
import spack.repo
import spack.spec
@@ -20,6 +21,8 @@
_p1 = (
"p1",
"""\
from spack.package import *
class P1(Package):
version("1.0")
@@ -35,6 +38,8 @@ class P1(Package):
_p2 = (
"p2",
"""\
from spack.package import *
class P2(Package):
version("1.0")
@@ -48,6 +53,8 @@ class P2(Package):
_p3 = (
"p3",
"""\
from spack.package import *
class P3(Package):
version("1.0")
@@ -58,6 +65,8 @@ class P3(Package):
_i1 = (
"i1",
"""\
from spack.package import *
class I1(Package):
version("1.0")
@@ -73,6 +82,8 @@ class I1(Package):
_i2 = (
"i2",
"""\
from spack.package import *
class I2(Package):
version("1.0")
@@ -89,6 +100,8 @@ class I2(Package):
_p4 = (
"p4",
"""\
from spack.package import *
class P4(Package):
version("1.0")
@@ -122,8 +135,8 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
def test_diff_ignore(test_repo):
specA = spack.spec.Spec("p1+usev1").concretized()
specB = spack.spec.Spec("p1~usev1").concretized()
specA = spack.concretize.concretized(spack.spec.Spec("p1+usev1"))
specB = spack.concretize.concretized(spack.spec.Spec("p1~usev1"))
c1 = spack.cmd.diff.compare_specs(specA, specB, to_string=False)
@@ -143,8 +156,8 @@ def find(function_list, name, args):
# Check ignoring changes on multiple packages
specA = spack.spec.Spec("p1+usev1 ^p3+p3var").concretized()
specA = spack.spec.Spec("p1~usev1 ^p3~p3var").concretized()
specA = spack.concretize.concretized(spack.spec.Spec("p1+usev1 ^p3+p3var"))
specA = spack.concretize.concretized(spack.spec.Spec("p1~usev1 ^p3~p3var"))
c3 = spack.cmd.diff.compare_specs(specA, specB, to_string=False)
assert find(c3["a_not_b"], "variant_value", ["p3", "p3var"])
@@ -157,8 +170,8 @@ def find(function_list, name, args):
def test_diff_cmd(install_mockery, mock_fetch, mock_archive, mock_packages):
"""Test that we can install two packages and diff them"""
specA = spack.spec.Spec("mpileaks").concretized()
specB = spack.spec.Spec("mpileaks+debug").concretized()
specA = spack.concretize.concretized(spack.spec.Spec("mpileaks"))
specB = spack.concretize.concretized(spack.spec.Spec("mpileaks+debug"))
# Specs should be the same as themselves
c = spack.cmd.diff.compare_specs(specA, specA, to_string=True)

View File

@@ -19,6 +19,7 @@
from llnl.util.symlink import readlink
import spack.cmd.env
import spack.concretize
import spack.config
import spack.environment as ev
import spack.environment.depfile as depfile
@@ -914,7 +915,7 @@ def test_lockfile_spliced_specs(environment_from_manifest, install_mockery):
"""Test that an environment can round-trip a spliced spec."""
# Create a local install for zmpi to splice in
# Default concretization is not using zmpi
zmpi = spack.spec.Spec("zmpi").concretized()
zmpi = spack.concretize.concretized(spack.spec.Spec("zmpi"))
PackageInstaller([zmpi.package], fake=True).install()
e1 = environment_from_manifest(
@@ -1277,39 +1278,43 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
with e:
# List of requirements, flip a variant
config("change", "packages:mpich:require:~debug")
test_spec = spack.spec.Spec("mpich").concretized()
test_spec = spack.concretize.concretized(spack.spec.Spec("mpich"))
assert test_spec.satisfies("@3.0.2~debug")
# List of requirements, change the version (in a different scope)
config("change", "packages:mpich:require:@3.0.3")
test_spec = spack.spec.Spec("mpich").concretized()
test_spec = spack.concretize.concretized(spack.spec.Spec("mpich"))
assert test_spec.satisfies("@3.0.3")
# "require:" as a single string, also try specifying
# a spec string that requires enclosing in quotes as
# part of the config path
config("change", 'packages:libelf:require:"@0.8.12:"')
spack.spec.Spec("libelf@0.8.12").concretized()
spack.concretize.concretized(spack.spec.Spec("libelf@0.8.12"))
# No need for assert, if there wasn't a failure, we
# changed the requirement successfully.
# Use change to add a requirement for a package that
# has no requirements defined
config("change", "packages:fftw:require:+mpi")
test_spec = spack.spec.Spec("fftw").concretized()
test_spec = spack.concretize.concretized(spack.spec.Spec("fftw"))
assert test_spec.satisfies("+mpi")
config("change", "packages:fftw:require:~mpi")
test_spec = spack.spec.Spec("fftw").concretized()
test_spec = spack.concretize.concretized(spack.spec.Spec("fftw"))
assert test_spec.satisfies("~mpi")
config("change", "packages:fftw:require:@1.0")
test_spec = spack.spec.Spec("fftw").concretized()
test_spec = spack.concretize.concretized(spack.spec.Spec("fftw"))
assert test_spec.satisfies("@1.0~mpi")
# Use "--match-spec" to change one spec in a "one_of"
# list
config("change", "packages:bowtie:require:@1.2.2", "--match-spec", "@1.2.0")
spack.spec.Spec("bowtie@1.3.0").concretize()
spack.spec.Spec("bowtie@1.2.2").concretized()
# confirm that we can concretize to either value
spack.concretize.concretized(spack.spec.Spec("bowtie@1.3.0"))
spack.concretize.concretized(spack.spec.Spec("bowtie@1.2.2"))
# confirm that we cannot concretize to the old value
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.concretize.concretized(spack.spec.Spec("bowtie@1.2.0"))
def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutable_config):
@@ -1324,8 +1329,8 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
with ev.Environment(tmp_path):
config("change", "packages:mpich:require:~debug")
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.spec.Spec("mpich+debug").concretized()
spack.spec.Spec("mpich~debug").concretized()
spack.concretize.concretized(spack.spec.Spec("mpich+debug"))
spack.concretize.concretized(spack.spec.Spec("mpich~debug"))
# Now check that we raise an error if we need to add a require: constraint
# when preexisting config manually specified it as a singular spec
@@ -1339,7 +1344,7 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
"""
)
with ev.Environment(tmp_path):
assert spack.spec.Spec("mpich").concretized().satisfies("@3.0.3")
assert spack.concretize.concretized(spack.spec.Spec("mpich")).satisfies("@3.0.3")
with pytest.raises(spack.error.ConfigError, match="not a list"):
config("change", "packages:mpich:require:~debug")
@@ -1647,7 +1652,7 @@ def test_stage(mock_stage, mock_fetch, install_mockery):
root = str(mock_stage)
def check_stage(spec):
spec = Spec(spec).concretized()
spec = spack.concretize.concretized(Spec(spec))
for dep in spec.traverse():
stage_name = "{0}{1}-{2}-{3}".format(
stage_prefix, dep.name, dep.version, dep.dag_hash()
@@ -1750,7 +1755,7 @@ def test_indirect_build_dep(tmp_path):
with spack.repo.use_repositories(builder.root):
x_spec = Spec("x")
x_concretized = x_spec.concretized()
x_concretized = spack.concretize.concretized(x_spec)
_env_create("test", with_view=False)
e = ev.read("test")
@@ -1783,10 +1788,10 @@ def test_store_different_build_deps(tmp_path):
with spack.repo.use_repositories(builder.root):
y_spec = Spec("y ^z@3")
y_concretized = y_spec.concretized()
y_concretized = spack.concretize.concretized(y_spec)
x_spec = Spec("x ^z@2")
x_concretized = x_spec.concretized()
x_concretized = spack.concretize.concretized(x_spec)
# Even though x chose a different 'z', the y it chooses should be identical
# *aside* from the dependency on 'z'. The dag_hash() will show the difference

View File

@@ -6,6 +6,7 @@
import pytest
import spack.concretize
from spack.installer import PackageInstaller
from spack.main import SpackCommand, SpackCommandError
from spack.spec import Spec
@@ -15,7 +16,9 @@
@pytest.fixture
def python_database(mock_packages, mutable_database):
specs = [Spec(s).concretized() for s in ["python", "py-extension1", "py-extension2"]]
specs = [
spack.concretize.concretized(Spec(s)) for s in ["python", "py-extension1", "py-extension2"]
]
PackageInstaller([s.package for s in specs], explicit=True, fake=True).install()
yield
@@ -23,7 +26,7 @@ def python_database(mock_packages, mutable_database):
@pytest.mark.not_on_windows("All Fetchers Failed")
@pytest.mark.db
def test_extensions(mock_packages, python_database, capsys):
ext2 = Spec("py-extension2").concretized()
ext2 = spack.concretize.concretized(Spec("py-extension2"))
def check_output(ni):
with capsys.disabled():
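
Fixtures that concretize several specs at once move the call into the comprehension, then hand the packages to the installer in one batch. A sketch following the fixture above:

```python
import spack.concretize
from spack.installer import PackageInstaller
from spack.spec import Spec

specs = [
    spack.concretize.concretized(Spec(s))
    for s in ["python", "py-extension1", "py-extension2"]
]
PackageInstaller([s.package for s in specs], explicit=True, fake=True).install()
```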

View File

@@ -13,6 +13,7 @@
import spack.cmd as cmd
import spack.cmd.find
import spack.concretize
import spack.environment as ev
import spack.repo
import spack.store
@@ -202,7 +203,8 @@ def test_find_json_deps(database):
@pytest.mark.db
def test_display_json(database, capsys):
specs = [
Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
spack.concretize.concretized(Spec(s))
for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
]
cmd.display_specs_as_json(specs)
@@ -217,7 +219,8 @@ def test_display_json(database, capsys):
@pytest.mark.db
def test_display_json_deps(database, capsys):
specs = [
Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
spack.concretize.concretized(Spec(s))
for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
]
cmd.display_specs_as_json(specs, deps=True)
@@ -276,7 +279,7 @@ def test_find_format_deps(database, config):
def test_find_format_deps_paths(database, config):
output = find("-dp", "--format", "{name}-{version}", "mpileaks", "^zmpi")
spec = Spec("mpileaks ^zmpi").concretized()
spec = spack.concretize.concretized(Spec("mpileaks ^zmpi"))
prefixes = [s.prefix for s in spec.traverse()]
assert (
@@ -301,7 +304,8 @@ def test_find_very_long(database, config):
output = find("-L", "--no-groups", "mpileaks")
specs = [
Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
spack.concretize.concretized(Spec(s))
for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
]
assert set(output.strip().split("\n")) == set(
@@ -462,6 +466,8 @@ def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path):
_pkga = (
"a0",
"""\
from spack.package import *
class A0(Package):
version("1.2")
version("1.1")
@@ -475,6 +481,8 @@ class A0(Package):
_pkgb = (
"b0",
"""\
from spack.package import *
class B0(Package):
version("1.2")
version("1.1")
@@ -485,6 +493,8 @@ class B0(Package):
_pkgc = (
"c0",
"""\
from spack.package import *
class C0(Package):
version("1.2")
version("1.1")
@@ -497,6 +507,8 @@ class C0(Package):
_pkgd = (
"d0",
"""\
from spack.package import *
class D0(Package):
version("1.2")
version("1.1")
@@ -510,6 +522,8 @@ class D0(Package):
_pkge = (
"e0",
"""\
from spack.package import *
class E0(Package):
tags = ["tag1", "tag2"]

View File

@@ -6,6 +6,7 @@
import pytest
import spack.concretize
import spack.deptypes as dt
import spack.environment as ev
import spack.main
@@ -26,8 +27,7 @@ def test_gc_without_build_dependency(mutable_database):
@pytest.mark.db
def test_gc_with_build_dependency(mutable_database):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("simple-inheritance"))
PackageInstaller([s.package], explicit=True, fake=True).install()
assert "There are no unused specs." in gc("-yb")
@@ -37,8 +37,8 @@ def test_gc_with_build_dependency(mutable_database):
@pytest.mark.db
def test_gc_with_constraints(mutable_database):
s_cmake1 = spack.spec.Spec("simple-inheritance ^cmake@3.4.3").concretized()
s_cmake2 = spack.spec.Spec("simple-inheritance ^cmake@3.23.1").concretized()
s_cmake1 = spack.concretize.concretized(spack.spec.Spec("simple-inheritance ^cmake@3.4.3"))
s_cmake2 = spack.concretize.concretized(spack.spec.Spec("simple-inheritance ^cmake@3.23.1"))
PackageInstaller([s_cmake1.package], explicit=True, fake=True).install()
PackageInstaller([s_cmake2.package], explicit=True, fake=True).install()
@@ -53,8 +53,7 @@ def test_gc_with_constraints(mutable_database):
@pytest.mark.db
def test_gc_with_environment(mutable_database, mutable_mock_env_path):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("simple-inheritance"))
PackageInstaller([s.package], explicit=True, fake=True).install()
e = ev.create("test_gc")
@@ -69,8 +68,7 @@ def test_gc_with_environment(mutable_database, mutable_mock_env_path):
@pytest.mark.db
def test_gc_with_build_dependency_in_environment(mutable_database, mutable_mock_env_path):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("simple-inheritance"))
PackageInstaller([s.package], explicit=True, fake=True).install()
e = ev.create("test_gc")
@@ -121,8 +119,7 @@ def test_gc_except_any_environments(mutable_database, mutable_mock_env_path):
@pytest.mark.db
def test_gc_except_specific_environments(mutable_database, mutable_mock_env_path):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("simple-inheritance"))
PackageInstaller([s.package], explicit=True, fake=True).install()
assert mutable_database.query_local("zmpi")
@@ -148,8 +145,7 @@ def test_gc_except_nonexisting_dir_env(mutable_database, mutable_mock_env_path,
@pytest.mark.db
def test_gc_except_specific_dir_env(mutable_database, mutable_mock_env_path, tmpdir):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("simple-inheritance"))
PackageInstaller([s.package], explicit=True, fake=True).install()
assert mutable_database.query_local("zmpi")

View File

@@ -20,6 +20,7 @@
import spack.build_environment
import spack.cmd.common.arguments
import spack.cmd.install
import spack.concretize
import spack.config
import spack.environment as ev
import spack.error
@@ -135,7 +136,7 @@ def test_package_output(tmpdir, capsys, install_mockery, mock_fetch):
# we can't use output capture here because it interferes with Spack's
# logging. TODO: see whether we can get multiple log_outputs to work
# when nested AND in pytest
spec = Spec("printing-package").concretized()
spec = spack.concretize.concretized(Spec("printing-package"))
pkg = spec.package
PackageInstaller([pkg], explicit=True, verbose=True).install()
@@ -175,7 +176,7 @@ def test_install_output_on_python_error(mock_packages, mock_archive, mock_fetch,
def test_install_with_source(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Verify that source has been copied into place."""
install("--source", "--keep-stage", "trivial-install-test-package")
spec = Spec("trivial-install-test-package").concretized()
spec = spack.concretize.concretized(Spec("trivial-install-test-package"))
src = os.path.join(spec.prefix.share, "trivial-install-test-package", "src")
assert filecmp.cmp(
os.path.join(mock_archive.path, "configure"), os.path.join(src, "configure")
@@ -183,8 +184,7 @@ def test_install_with_source(mock_packages, mock_archive, mock_fetch, install_mo
def test_install_env_variables(mock_packages, mock_archive, mock_fetch, install_mockery):
spec = Spec("libdwarf")
spec.concretize()
spec = spack.concretize.concretized(Spec("libdwarf"))
install("libdwarf")
assert os.path.isfile(spec.package.install_env_path)
@@ -205,8 +205,7 @@ def test_show_log_on_error(mock_packages, mock_archive, mock_fetch, install_mock
def test_install_overwrite(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
spec = Spec("libdwarf")
spec.concretize()
spec = spack.concretize.concretized(Spec("libdwarf"))
install("libdwarf")
@@ -239,8 +238,7 @@ def test_install_overwrite(mock_packages, mock_archive, mock_fetch, install_mock
def test_install_overwrite_not_installed(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
spec = Spec("libdwarf")
spec.concretize()
spec = spack.concretize.concretized(Spec("libdwarf"))
assert not os.path.exists(spec.prefix)
@@ -261,7 +259,7 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
monkeypatch.setattr(spack.package_base.PackageBase, "git", file_url, raising=False)
# Use the earliest commit in the repository
spec = Spec(f"git-test-commit@{commits[-1]}").concretized()
spec = spack.concretize.concretized(Spec(f"git-test-commit@{commits[-1]}"))
PackageInstaller([spec.package], explicit=True).install()
# Ensure first commit file contents were written
@@ -274,13 +272,11 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
def test_install_overwrite_multiple(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
libdwarf = Spec("libdwarf")
libdwarf.concretize()
libdwarf = spack.concretize.concretized(Spec("libdwarf"))
install("libdwarf")
cmake = Spec("cmake")
cmake.concretize()
cmake = spack.concretize.concretized(Spec("cmake"))
install("cmake")
@@ -356,7 +352,7 @@ def test_install_invalid_spec(invalid_spec):
)
def test_install_from_file(spec, concretize, error_code, tmpdir):
if concretize:
spec.concretize()
spec = spack.concretize.concretized(spec)
specfile = tmpdir.join("spec.yaml")
@@ -486,8 +482,7 @@ def test_install_mix_cli_and_files(clispecs, filespecs, tmpdir):
for spec in filespecs:
filepath = tmpdir.join(spec + ".yaml")
args = ["-f", str(filepath)] + args
s = Spec(spec)
s.concretize()
s = spack.concretize.concretized(Spec(spec))
with filepath.open("w") as f:
s.to_yaml(f)
@@ -496,8 +491,7 @@ def test_install_mix_cli_and_files(clispecs, filespecs, tmpdir):
def test_extra_files_are_archived(mock_packages, mock_archive, mock_fetch, install_mockery):
s = Spec("archive-files")
s.concretize()
s = spack.concretize.concretized(Spec("archive-files"))
install("archive-files")
@@ -616,8 +610,7 @@ def test_cdash_install_from_spec_json(
with capfd.disabled(), tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
pkg_spec = Spec("pkg-a")
pkg_spec.concretize()
pkg_spec = spack.concretize.concretized(Spec("pkg-a"))
with open(spec_json_path, "w") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
@@ -693,8 +686,8 @@ def test_cache_only_fails(tmpdir, mock_fetch, install_mockery, capfd):
def test_install_only_dependencies(tmpdir, mock_fetch, install_mockery):
dep = Spec("dependency-install").concretized()
root = Spec("dependent-install").concretized()
dep = spack.concretize.concretized(Spec("dependency-install"))
root = spack.concretize.concretized(Spec("dependent-install"))
install("--only", "dependencies", "dependent-install")
@@ -715,8 +708,8 @@ def test_install_only_package(tmpdir, mock_fetch, install_mockery, capfd):
def test_install_deps_then_package(tmpdir, mock_fetch, install_mockery):
dep = Spec("dependency-install").concretized()
root = Spec("dependent-install").concretized()
dep = spack.concretize.concretized(Spec("dependency-install"))
root = spack.concretize.concretized(Spec("dependent-install"))
install("--only", "dependencies", "dependent-install")
assert os.path.exists(dep.prefix)
@@ -734,8 +727,8 @@ def test_install_only_dependencies_in_env(
env("create", "test")
with ev.read("test"):
dep = Spec("dependency-install").concretized()
root = Spec("dependent-install").concretized()
dep = spack.concretize.concretized(Spec("dependency-install"))
root = spack.concretize.concretized(Spec("dependent-install"))
install("-v", "--only", "dependencies", "--add", "dependent-install")
@@ -751,8 +744,8 @@ def test_install_only_dependencies_of_all_in_env(
with ev.read("test"):
roots = [
Spec("dependent-install@1.0").concretized(),
Spec("dependent-install@2.0").concretized(),
spack.concretize.concretized(Spec("dependent-install@1.0")),
spack.concretize.concretized(Spec("dependent-install@2.0")),
]
add("dependent-install@1.0")
@@ -901,7 +894,7 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = Spec("configure-warning").concretized()
spec = spack.concretize.concretized(Spec("configure-warning"))
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w") as f:
@@ -947,7 +940,7 @@ def test_install_env_with_tests_all(
):
env("create", "test")
with ev.read("test"):
test_dep = Spec("test-dependency").concretized()
test_dep = spack.concretize.concretized(Spec("test-dependency"))
add("depb")
install("--test", "all")
assert os.path.exists(test_dep.prefix)
@@ -959,7 +952,7 @@ def test_install_env_with_tests_root(
):
env("create", "test")
with ev.read("test"):
test_dep = Spec("test-dependency").concretized()
test_dep = spack.concretize.concretized(Spec("test-dependency"))
add("depb")
install("--test", "root")
assert not os.path.exists(test_dep.prefix)

View File

@@ -8,6 +8,7 @@
import pytest
import spack.concretize
import spack.spec
import spack.user_environment as uenv
from spack.main import SpackCommand
@@ -50,7 +51,7 @@ def test_load_shell(shell, set_command):
"""Test that `spack load` applies prefix inspections of its required runtime deps in
topo-order"""
install("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
mpileaks_spec = spack.concretize.concretized(spack.spec.Spec("mpileaks"))
# Ensure our reference variable is clean.
os.environ["CMAKE_PREFIX_PATH"] = "/hello" + os.pathsep + "/world"
@@ -167,7 +168,7 @@ def test_unload(
"""Tests that any variables set in the user environment are undone by the
unload command"""
install("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
mpileaks_spec = spack.concretize.concretized(spack.spec.Spec("mpileaks"))
# Set so unload has something to do
os.environ["FOOBAR"] = "mpileaks"
@@ -188,7 +189,7 @@ def test_unload_fails_no_shell(
):
"""Test that spack unload prints an error message without a shell."""
install("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
mpileaks_spec = spack.concretize.concretized(spack.spec.Spec("mpileaks"))
os.environ[uenv.spack_loaded_hashes_var] = mpileaks_spec.dag_hash()
out = unload("mpileaks", fail_on_error=False)

View File

@@ -9,6 +9,7 @@
from llnl.util.filesystem import mkdirp
import spack.concretize
import spack.environment as ev
import spack.paths
import spack.spec
@@ -26,7 +27,7 @@
@pytest.fixture
def mock_spec():
# Make it look like the source was actually expanded.
s = spack.spec.Spec("externaltest").concretized()
s = spack.concretize.concretized(spack.spec.Spec("externaltest"))
source_path = s.package.stage.source_path
mkdirp(source_path)
yield s, s.package

View File

@@ -14,6 +14,7 @@
import spack
import spack.cmd.logs
import spack.concretize
import spack.main
import spack.spec
from spack.main import SpackCommand
@@ -54,7 +55,7 @@ def disable_capture(capfd):
def test_logs_cmd_errors(install_mockery, mock_fetch, mock_archive, mock_packages):
spec = spack.spec.Spec("libelf").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("libelf"))
assert not spec.installed
with pytest.raises(spack.main.SpackCommandError, match="is not installed or staged"):
@@ -83,7 +84,7 @@ def test_dump_logs(install_mockery, mock_fetch, mock_archive, mock_packages, dis
decompress them.
"""
cmdline_spec = spack.spec.Spec("libelf")
concrete_spec = cmdline_spec.concretized()
concrete_spec = spack.concretize.concretized(cmdline_spec)
# Sanity check, make sure this test is checking what we want: to
# start with

View File

@@ -8,10 +8,11 @@
import pytest
import spack.cmd.mirror
import spack.concretize
import spack.config
import spack.environment as ev
import spack.error
import spack.mirror
import spack.mirrors.utils
import spack.spec
import spack.util.url as url_util
import spack.version
@@ -61,7 +62,7 @@ def test_mirror_from_env(tmp_path, mock_packages, mock_fetch, mutable_mock_env_p
@pytest.fixture
def source_for_pkg_with_hash(mock_packages, tmpdir):
s = spack.spec.Spec("trivial-pkg-with-valid-hash").concretized()
s = spack.concretize.concretized(spack.spec.Spec("trivial-pkg-with-valid-hash"))
local_url_basename = os.path.basename(s.package.url)
local_path = os.path.join(str(tmpdir), local_url_basename)
with open(local_path, "w") as f:
@@ -73,8 +74,11 @@ def source_for_pkg_with_hash(mock_packages, tmpdir):
def test_mirror_skip_unstable(tmpdir_factory, mock_packages, config, source_for_pkg_with_hash):
mirror_dir = str(tmpdir_factory.mktemp("mirror-dir"))
specs = [spack.spec.Spec(x).concretized() for x in ["git-test", "trivial-pkg-with-valid-hash"]]
spack.mirror.create(mirror_dir, specs, skip_unstable_versions=True)
specs = [
spack.concretize.concretized(spack.spec.Spec(x))
for x in ["git-test", "trivial-pkg-with-valid-hash"]
]
spack.mirrors.utils.create(mirror_dir, specs, skip_unstable_versions=True)
assert set(os.listdir(mirror_dir)) - set(["_source-cache"]) == set(
["trivial-pkg-with-valid-hash"]
@@ -112,7 +116,8 @@ def test_exclude_specs(mock_packages, config):
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
expected_include = set(
spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
spack.concretize.concretized(spack.spec.Spec(x))
for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
)
expected_exclude = set(spack.spec.Spec(x) for x in ["mpich@3.0.1", "mpich@3.0.2", "mpich@1.0"])
assert expected_include <= set(mirror_specs)
@@ -146,7 +151,8 @@ def test_exclude_file(mock_packages, tmpdir, config):
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
expected_include = set(
spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
spack.concretize.concretized(spack.spec.Spec(x))
for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
)
expected_exclude = set(spack.spec.Spec(x) for x in ["mpich@3.0.1", "mpich@3.0.2", "mpich@1.0"])
assert expected_include <= set(mirror_specs)

View File

@@ -8,6 +8,7 @@
import pytest
import spack.concretize
import spack.config
import spack.main
import spack.modules
@@ -34,7 +35,7 @@ def ensure_module_files_are_there(mock_repo_path, mock_store, mock_configuration
def _module_files(module_type, *specs):
specs = [spack.spec.Spec(x).concretized() for x in specs]
specs = [spack.concretize.concretized(spack.spec.Spec(x)) for x in specs]
writer_cls = spack.modules.module_types[module_type]
return [writer_cls(spec, "default").layout.filename for spec in specs]
@@ -185,12 +186,15 @@ def test_setdefault_command(mutable_database, mutable_config):
# Install two different versions of pkg-a
other_spec, preferred = "pkg-a@1.0", "pkg-a@2.0"
specs = [spack.spec.Spec(other_spec).concretized(), spack.spec.Spec(preferred).concretized()]
specs = [
spack.concretize.concretized(spack.spec.Spec(other_spec)),
spack.concretize.concretized(spack.spec.Spec(preferred)),
]
PackageInstaller([s.package for s in specs], explicit=True, fake=True).install()
writers = {
preferred: writer_cls(spack.spec.Spec(preferred).concretized(), "default"),
other_spec: writer_cls(spack.spec.Spec(other_spec).concretized(), "default"),
preferred: writer_cls(specs[1], "default"),
other_spec: writer_cls(specs[0], "default"),
}
# Create two module files for the same software

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.concretize
import spack.main
import spack.repo
import spack.spec
@@ -48,7 +49,7 @@ class tag_path:
def test_tags_installed(install_mockery, mock_fetch):
s = spack.spec.Spec("mpich").concretized()
s = spack.concretize.concretized(spack.spec.Spec("mpich"))
PackageInstaller([s.package], explicit=True, fake=True).install()
out = tags("-i")

View File

@@ -12,6 +12,7 @@
import spack.cmd.common.arguments
import spack.cmd.test
import spack.concretize
import spack.config
import spack.install_test
import spack.paths
@@ -241,7 +242,7 @@ def test_read_old_results(mock_packages, mock_test_stage):
def test_test_results_none(mock_packages, mock_test_stage):
name = "trivial"
spec = spack.spec.Spec("trivial-smoke-test").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("trivial-smoke-test"))
suite = spack.install_test.TestSuite([spec], name)
suite.ensure_stage()
spack.install_test.write_test_suite_file(suite)
@@ -256,7 +257,7 @@ def test_test_results_none(mock_packages, mock_test_stage):
def test_test_results_status(mock_packages, mock_test_stage, status):
"""Confirm 'spack test results' returns expected status."""
name = "trivial"
spec = spack.spec.Spec("trivial-smoke-test").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("trivial-smoke-test"))
suite = spack.install_test.TestSuite([spec], name)
suite.ensure_stage()
spack.install_test.write_test_suite_file(suite)
@@ -279,7 +280,7 @@ def test_test_results_status(mock_packages, mock_test_stage, status):
def test_report_filename_for_cdash(install_mockery, mock_fetch):
"""Test that the temporary file used to write Testing.xml for CDash is not the upload URL"""
name = "trivial"
spec = spack.spec.Spec("trivial-smoke-test").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("trivial-smoke-test"))
suite = spack.install_test.TestSuite([spec], name)
suite.ensure_stage()

View File

@@ -2,6 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.concretize
import spack.environment as ev
import spack.spec
from spack.main import SpackCommand
@@ -31,9 +32,9 @@ def test_undevelop(tmpdir, mutable_config, mock_packages, mutable_mock_env_path)
env("create", "test", "./spack.yaml")
with ev.read("test"):
before = spack.spec.Spec("mpich").concretized()
before = spack.concretize.concretized(spack.spec.Spec("mpich"))
undevelop("mpich")
after = spack.spec.Spec("mpich").concretized()
after = spack.concretize.concretized(spack.spec.Spec("mpich"))
# Removing dev spec from environment changes concretization
assert before.satisfies("dev_path=*")

View File

@@ -8,6 +8,7 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.spec
import spack.store
import spack.util.spack_json as sjson
@@ -66,7 +67,7 @@ def test_single_file_verify_cmd(tmpdir):
def test_single_spec_verify_cmd(tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery):
# Test the verify command interface to verify a single spec
install("libelf")
s = spack.spec.Spec("libelf").concretized()
s = spack.concretize.concretized(spack.spec.Spec("libelf"))
prefix = s.prefix
hash = s.dag_hash()

View File

@@ -10,6 +10,7 @@
from llnl.util.symlink import _windows_can_symlink
import spack.concretize
import spack.util.spack_yaml as s_yaml
from spack.installer import PackageInstaller
from spack.main import SpackCommand
@@ -191,7 +192,7 @@ def test_view_fails_with_missing_projections_file(tmpdir):
def test_view_files_not_ignored(
tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery, cmd, with_projection
):
spec = Spec("view-not-ignored").concretized()
spec = spack.concretize.concretized(Spec("view-not-ignored"))
pkg = spec.package
PackageInstaller([pkg], explicit=True).install()
pkg.assert_installed(spec.prefix)

View File

@@ -210,7 +210,7 @@ def test_missing_command():
"""Ensure that we raise the expected exception if the desired command is
not present.
"""
with pytest.raises(spack.extensions.CommandNotFoundError):
with pytest.raises(spack.cmd.CommandNotFoundError):
spack.cmd.get_module("no-such-command")
@@ -220,9 +220,9 @@ def test_missing_command():
("/my/bad/extension", spack.extensions.ExtensionNamingError),
("", spack.extensions.ExtensionNamingError),
("/my/bad/spack--extra-hyphen", spack.extensions.ExtensionNamingError),
("/my/good/spack-extension", spack.extensions.CommandNotFoundError),
("/my/still/good/spack-extension/", spack.extensions.CommandNotFoundError),
("/my/spack-hyphenated-extension", spack.extensions.CommandNotFoundError),
("/my/good/spack-extension", spack.cmd.CommandNotFoundError),
("/my/still/good/spack-extension/", spack.cmd.CommandNotFoundError),
("/my/spack-hyphenated-extension", spack.cmd.CommandNotFoundError),
],
ids=["no_stem", "vacuous", "leading_hyphen", "basic_good", "trailing_slash", "hyphenated"],
)
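
Separately from the concretization rework, `CommandNotFoundError` moves from `spack.extensions` to `spack.cmd`, so callers that catch it must update the import path. A sketch of the updated call site, assuming `get_module` keeps the signature used in the test above:

```python
import spack.cmd

try:
    spack.cmd.get_module("no-such-command")
except spack.cmd.CommandNotFoundError:
    pass  # the command is missing; handle or report it
```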

View File

@@ -9,6 +9,7 @@
import archspec.cpu
import spack.concretize
import spack.config
import spack.paths
import spack.repo
@@ -21,7 +22,7 @@
def _concretize_with_reuse(*, root_str, reused_str):
reused_spec = spack.spec.Spec(reused_str).concretized()
reused_spec = spack.concretize.concretized(spack.spec.Spec(reused_str))
setup = spack.solver.asp.SpackSolverSetup(tests=False)
driver = spack.solver.asp.PyclingoDriver()
result, _, _ = driver.solve(setup, [spack.spec.Spec(f"{root_str}")], reuse=[reused_spec])
@@ -45,7 +46,7 @@ def enable_runtimes():
def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
s = spack.spec.Spec("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0").concretized()
s = spack.concretize.concretized(spack.spec.Spec("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0"))
a, b = s["pkg-a"], s["pkg-b"]
# Both a and b should depend on the same gcc-runtime directly
@@ -62,7 +63,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
packages_yaml = {"pkg-b": {"externals": [{"spec": "pkg-b@1.0", "prefix": f"{str(tmp_path)}"}]}}
spack.config.set("packages", packages_yaml)
s = spack.spec.Spec("pkg-a%gcc@10.2.1").concretized()
s = spack.concretize.concretized(spack.spec.Spec("pkg-a%gcc@10.2.1"))
a, b = s["pkg-a"], s["pkg-b"]

File diff suppressed because it is too large

View File

@@ -5,6 +5,7 @@
import pytest
import spack.concretize
import spack.config
import spack.solver.asp
import spack.spec
@@ -58,7 +59,7 @@ def test_error_messages(error_messages, config_set, spec, mock_packages, mutable
spack.config.set(path, conf)
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError) as e:
_ = spack.spec.Spec(spec).concretized()
_ = spack.concretize.concretized(spack.spec.Spec(spec))
for em in error_messages:
assert em in str(e.value)

View File

@@ -6,6 +6,7 @@
import pytest
import spack.concretize
import spack.config
import spack.environment as ev
import spack.paths
@@ -63,12 +64,12 @@ def test_mix_spec_and_requirements(concretize_scope, test_repo):
"""
update_concretize_scope(conf_str, "packages")
s1 = Spec('y cflags="-a"').concretized()
s1 = spack.concretize.concretized(Spec('y cflags="-a"'))
assert s1.satisfies('cflags="-a -c"')
def test_mix_spec_and_dependent(concretize_scope, test_repo):
s1 = Spec('x ^y cflags="-a"').concretized()
s1 = spack.concretize.concretized(Spec('x ^y cflags="-a"'))
assert s1["y"].satisfies('cflags="-a -d1"')
@@ -93,7 +94,7 @@ def test_mix_spec_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
update_concretize_scope(conf_str, "compilers")
s1 = Spec('y %gcc@12.100.100 cflags="-O2"').concretized()
s1 = spack.concretize.concretized(Spec('y %gcc@12.100.100 cflags="-O2"'))
assert s1.satisfies('cflags="-Wall -O2"')
@@ -148,7 +149,7 @@ def test_flag_order_and_grouping(
if cmd_flags:
spec_str += f' cflags="{cmd_flags}"'
root_spec = Spec(spec_str).concretized()
root_spec = spack.concretize.concretized(Spec(spec_str))
spec = root_spec["y"]
satisfy_flags = " ".join(x for x in [cmd_flags, req_flags, cmp_flags, expected_dflags] if x)
assert spec.satisfies(f'cflags="{satisfy_flags}"')
@@ -156,11 +157,11 @@ def test_flag_order_and_grouping(
def test_two_dependents_flag_mixing(concretize_scope, test_repo):
root_spec1 = Spec("w~moveflaglater").concretized()
root_spec1 = spack.concretize.concretized(Spec("w~moveflaglater"))
spec1 = root_spec1["y"]
assert spec1.compiler_flags["cflags"] == "-d0 -d1 -d2".split()
root_spec2 = Spec("w+moveflaglater").concretized()
root_spec2 = spack.concretize.concretized(Spec("w+moveflaglater"))
spec2 = root_spec2["y"]
assert spec2.compiler_flags["cflags"] == "-d3 -d1 -d2".split()
@@ -169,7 +170,7 @@ def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-f2")
update_concretize_scope(conf_str, "compilers")
root_spec = Spec("v %gcc@12.100.100 cflags=='-f1'").concretized()
root_spec = spack.concretize.concretized(Spec("v %gcc@12.100.100 cflags=='-f1'"))
assert root_spec["y"].satisfies("cflags='-f1 -f2'")
@@ -178,7 +179,7 @@ def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
def test_propagate_and_pkg_dep(concretize_scope, test_repo):
root_spec1 = Spec("x ~activatemultiflag cflags=='-f1'").concretized()
root_spec1 = spack.concretize.concretized(Spec("x ~activatemultiflag cflags=='-f1'"))
assert root_spec1["y"].satisfies("cflags='-f1 -d1'")
@@ -190,7 +191,7 @@ def test_propagate_and_require(concretize_scope, test_repo):
"""
update_concretize_scope(conf_str, "packages")
root_spec1 = Spec("v cflags=='-f1'").concretized()
root_spec1 = spack.concretize.concretized(Spec("v cflags=='-f1'"))
assert root_spec1["y"].satisfies("cflags='-f1 -f2'")
# Next, check that a requirement does not "undo" a request for
@@ -202,7 +203,7 @@ def test_propagate_and_require(concretize_scope, test_repo):
"""
update_concretize_scope(conf_str, "packages")
root_spec2 = Spec("v cflags=='-f1'").concretized()
root_spec2 = spack.concretize.concretized(Spec("v cflags=='-f1'"))
assert root_spec2["y"].satisfies("cflags='-f1'")
# Note: requirements cannot enforce propagation: any attempt to do
@@ -246,7 +247,7 @@ def test_diamond_dep_flag_mixing(concretize_scope, test_repo):
nodes of the diamond always appear in the same order).
`Spec.traverse` is responsible for handling both of these needs.
"""
root_spec1 = Spec("t").concretized()
root_spec1 = spack.concretize.concretized(Spec("t"))
spec1 = root_spec1["y"]
assert spec1.satisfies('cflags="-c1 -c2 -d1 -d2 -e1 -e2"')
assert spec1.compiler_flags["cflags"] == "-c1 -c2 -e1 -e2 -d1 -d2".split()

View File

@@ -8,6 +8,7 @@
import pytest
import spack.concretize
import spack.config
import spack.package_prefs
import spack.repo
@@ -47,7 +48,7 @@ def configure_permissions():
def concretize(abstract_spec):
return Spec(abstract_spec).concretized()
return spack.concretize.concretized(Spec(abstract_spec))
def update_packages(pkgname, section, value):
@@ -112,7 +113,7 @@ def test_preferred_variants_from_wildcard(self):
def test_preferred_compilers(self, compiler_str, spec_str):
"""Test preferred compilers are applied correctly"""
update_packages("all", "compiler", [compiler_str])
spec = spack.spec.Spec(spec_str).concretized()
spec = spack.concretize.concretized(spack.spec.Spec(spec_str))
assert spec.compiler == CompilerSpec(compiler_str)
def test_preferred_target(self, mutable_mock_repo):
@@ -214,15 +215,13 @@ def test_config_set_pkg_property_new(self, mock_repo_path):
def test_preferred(self):
""" "Test packages with some version marked as preferred=True"""
spec = Spec("python")
spec.concretize()
spec = spack.concretize.concretized(Spec("python"))
assert spec.version == Version("2.7.11")
# now add packages.yaml with versions other than preferred
# ensure that once config is in place, non-preferred version is used
update_packages("python", "version", ["3.5.0"])
spec = Spec("python")
spec.concretize()
spec = spack.concretize.concretized(Spec("python"))
assert spec.version == Version("3.5.0")
def test_preferred_undefined_raises(self):
@@ -230,7 +229,7 @@ def test_preferred_undefined_raises(self):
update_packages("python", "version", ["3.5.0.1"])
spec = Spec("python")
with pytest.raises(ConfigError):
spec.concretize()
spack.concretize.concretized(spec)
def test_preferred_truncated(self):
"""Versions without "=" are treated as version ranges: if there is
@@ -238,35 +237,29 @@ def test_preferred_truncated(self):
(don't define a new version).
"""
update_packages("python", "version", ["3.5"])
spec = Spec("python")
spec.concretize()
spec = spack.concretize.concretized(Spec("python"))
assert spec.satisfies("@3.5.1")
def test_develop(self):
"""Test concretization with develop-like versions"""
spec = Spec("develop-test")
spec.concretize()
spec = spack.concretize.concretized(Spec("develop-test"))
assert spec.version == Version("0.2.15")
spec = Spec("develop-test2")
spec.concretize()
spec = spack.concretize.concretized(Spec("develop-test2"))
assert spec.version == Version("0.2.15")
# now add packages.yaml with develop-like versions
# ensure that once config is in place, develop-like version is used
update_packages("develop-test", "version", ["develop"])
spec = Spec("develop-test")
spec.concretize()
spec = spack.concretize.concretized(Spec("develop-test"))
assert spec.version == Version("develop")
update_packages("develop-test2", "version", ["0.2.15.develop"])
spec = Spec("develop-test2")
spec.concretize()
spec = spack.concretize.concretized(Spec("develop-test2"))
assert spec.version == Version("0.2.15.develop")
def test_external_mpi(self):
# make sure this doesn't give us an external first.
spec = Spec("mpi")
spec.concretize()
spec = spack.concretize.concretized(Spec("mpi"))
assert not spec["mpi"].external
# load config
@@ -285,8 +278,7 @@ def test_external_mpi(self):
spack.config.set("packages", conf, scope="concretize")
# ensure that once config is in place, external is used
spec = Spec("mpi")
spec.concretize()
spec = spack.concretize.concretized(Spec("mpi"))
assert spec["mpich"].external_path == os.path.sep + os.path.join("dummy", "path")
def test_external_module(self, monkeypatch):
@@ -301,8 +293,7 @@ def mock_module(cmd, module):
monkeypatch.setattr(spack.util.module_cmd, "module", mock_module)
spec = Spec("mpi")
spec.concretize()
spec = spack.concretize.concretized(Spec("mpi"))
assert not spec["mpi"].external
# load config
@@ -321,8 +312,7 @@ def mock_module(cmd, module):
spack.config.set("packages", conf, scope="concretize")
# ensure that once config is in place, external is used
spec = Spec("mpi")
spec.concretize()
spec = spack.concretize.concretized(Spec("mpi"))
assert spec["mpich"].external_path == os.path.sep + os.path.join("dummy", "path")
def test_buildable_false(self):
@@ -468,7 +458,7 @@ def test_variant_not_flipped_to_pull_externals(self):
"""Test that a package doesn't prefer pulling in an
external to using the default value of a variant.
"""
s = Spec("vdefault-or-external-root").concretized()
s = spack.concretize.concretized(Spec("vdefault-or-external-root"))
assert "~external" in s["vdefault-or-external"]
assert "externaltool" not in s
@@ -480,7 +470,7 @@ def test_dependencies_cant_make_version_parent_score_better(self):
that makes the overall version score even or better and maybe
has a better score in some lower priority criteria.
"""
s = Spec("version-test-root").concretized()
s = spack.concretize.concretized(Spec("version-test-root"))
assert s.satisfies("^version-test-pkg@2.4.6")
assert "version-test-dependency-preferred" not in s
@@ -498,13 +488,13 @@ def test_multivalued_variants_are_lower_priority_than_providers(self):
with spack.config.override(
"packages:all", {"providers": {"somevirtual": ["some-virtual-preferred"]}}
):
s = Spec("somevirtual").concretized()
s = spack.concretize.concretized(Spec("somevirtual"))
assert s.name == "some-virtual-preferred"
@pytest.mark.regression("26721,19736")
def test_sticky_variant_accounts_for_packages_yaml(self):
with spack.config.override("packages:sticky-variant", {"variants": "+allow-gcc"}):
s = Spec("sticky-variant %gcc").concretized()
s = spack.concretize.concretized(Spec("sticky-variant %gcc"))
assert s.satisfies("%gcc") and s.satisfies("+allow-gcc")
@pytest.mark.regression("41134")
@@ -513,5 +503,5 @@ def test_default_preference_variant_different_type_does_not_error(self):
packages.yaml doesn't fail with an error.
"""
with spack.config.override("packages:all", {"variants": "+foo"}):
s = Spec("pkg-a").concretized()
s = spack.concretize.concretized(Spec("pkg-a"))
assert s.satisfies("foo=bar")

View File

@@ -7,6 +7,7 @@
import pytest
import spack.concretize
import spack.config
import spack.error
import spack.package_base
@@ -43,7 +44,7 @@ def test_one_package_multiple_reqs(concretize_scope, test_repo):
- "~shared"
"""
update_packages_config(conf_str)
y_spec = Spec("y").concretized()
y_spec = spack.concretize.concretized(Spec("y"))
assert y_spec.satisfies("@2.4~shared")
@@ -58,7 +59,7 @@ def test_requirement_isnt_optional(concretize_scope, test_repo):
"""
update_packages_config(conf_str)
with pytest.raises(UnsatisfiableSpecError):
Spec("x@1.1").concretize()
spack.concretize.concretized(Spec("x@1.1"))
def test_require_undefined_version(concretize_scope, test_repo):
@@ -75,7 +76,7 @@ def test_require_undefined_version(concretize_scope, test_repo):
"""
update_packages_config(conf_str)
with pytest.raises(spack.error.ConfigError):
Spec("x").concretize()
spack.concretize.concretized(Spec("x"))
def test_require_truncated(concretize_scope, test_repo):
@@ -90,7 +91,7 @@ def test_require_truncated(concretize_scope, test_repo):
require: "@1"
"""
update_packages_config(conf_str)
xspec = Spec("x").concretized()
xspec = spack.concretize.concretized(Spec("x"))
assert xspec.satisfies("@1.1")
@@ -160,7 +161,7 @@ def test_requirement_adds_new_version(
)
update_packages_config(conf_str)
s1 = Spec("v").concretized()
s1 = spack.concretize.concretized(Spec("v"))
assert s1.satisfies("@2.2")
# Make sure the git commit info is retained
assert isinstance(s1.version, spack.version.GitVersion)
@@ -181,7 +182,7 @@ def test_requirement_adds_version_satisfies(
)
# Sanity check: early version of T does not include U
s0 = Spec("t@2.0").concretized()
s0 = spack.concretize.concretized(Spec("t@2.0"))
assert not ("u" in s0)
conf_str = """\
@@ -193,7 +194,7 @@ def test_requirement_adds_version_satisfies(
)
update_packages_config(conf_str)
s1 = Spec("t").concretized()
s1 = spack.concretize.concretized(Spec("t"))
assert "u" in s1
assert s1.satisfies("@2.2")
@@ -219,7 +220,7 @@ def test_requirement_adds_git_hash_version(
"""
update_packages_config(conf_str)
s1 = Spec("v").concretized()
s1 = spack.concretize.concretized(Spec("v"))
assert isinstance(s1.version, spack.version.GitVersion)
assert s1.satisfies(f"v@{a_commit_hash}")
@@ -240,8 +241,8 @@ def test_requirement_adds_multiple_new_versions(
"""
update_packages_config(conf_str)
assert Spec("v").concretized().satisfies(f"@{commits[0]}=2.2")
assert Spec("v@2.3").concretized().satisfies(f"v@{commits[1]}=2.3")
assert spack.concretize.concretized(Spec("v")).satisfies(f"@{commits[0]}=2.2")
assert spack.concretize.concretized(Spec("v@2.3")).satisfies(f"v@{commits[1]}=2.3")
# TODO: this belongs in the concretize_preferences test module but uses
@@ -264,11 +265,11 @@ def test_preference_adds_new_version(
"""
update_packages_config(conf_str)
assert Spec("v").concretized().satisfies(f"@{commits[0]}=2.2")
assert Spec("v@2.3").concretized().satisfies(f"@{commits[1]}=2.3")
assert spack.concretize.concretized(Spec("v")).satisfies(f"@{commits[0]}=2.2")
assert spack.concretize.concretized(Spec("v@2.3")).satisfies(f"@{commits[1]}=2.3")
# When installing by hash, a lookup is triggered, so it's not mapped to =2.3.
s3 = Spec(f"v@{commits[1]}").concretized()
s3 = spack.concretize.concretized(Spec(f"v@{commits[1]}"))
assert s3.satisfies(f"v@{commits[1]}")
assert not s3.satisfies("@2.3")
@@ -288,7 +289,7 @@ def test_external_adds_new_version_that_is_preferred(concretize_scope, test_repo
"""
update_packages_config(conf_str)
spec = Spec("x").concretized()
spec = spack.concretize.concretized(Spec("x"))
assert spec["y"].satisfies("@2.7")
assert spack.version.Version("2.7") not in spec["y"].package.versions
@@ -297,7 +298,7 @@ def test_requirement_is_successfully_applied(concretize_scope, test_repo):
"""If a simple requirement can be satisfied, make sure the
concretization succeeds and the requirement spec is applied.
"""
s1 = Spec("x").concretized()
s1 = spack.concretize.concretized(Spec("x"))
# Without any requirements/preferences, the later version is preferred
assert s1.satisfies("@1.1")
@@ -307,7 +308,7 @@ def test_requirement_is_successfully_applied(concretize_scope, test_repo):
require: "@1.0"
"""
update_packages_config(conf_str)
s2 = Spec("x").concretized()
s2 = spack.concretize.concretized(Spec("x"))
# The requirement forces choosing the earlier version
assert s2.satisfies("@1.0")
@@ -324,7 +325,7 @@ def test_multiple_packages_requirements_are_respected(concretize_scope, test_rep
require: "@2.4"
"""
update_packages_config(conf_str)
spec = Spec("x").concretized()
spec = spack.concretize.concretized(Spec("x"))
assert spec["x"].satisfies("@1.0")
assert spec["y"].satisfies("@2.4")
@@ -340,7 +341,7 @@ def test_oneof(concretize_scope, test_repo):
- one_of: ["@2.4", "~shared"]
"""
update_packages_config(conf_str)
spec = Spec("x").concretized()
spec = spack.concretize.concretized(Spec("x"))
# The concretizer only has to satisfy one of @2.4/~shared, and @2.4
# comes first so it is prioritized
assert spec["y"].satisfies("@2.4+shared")
@@ -359,10 +360,10 @@ def test_one_package_multiple_oneof_groups(concretize_scope, test_repo):
"""
update_packages_config(conf_str)
s1 = Spec("y@2.5").concretized()
s1 = spack.concretize.concretized(Spec("y@2.5"))
assert s1.satisfies("%clang~shared")
s2 = Spec("y@2.4").concretized()
s2 = spack.concretize.concretized(Spec("y@2.4"))
assert s2.satisfies("%gcc+shared")
@@ -378,10 +379,10 @@ def test_require_cflags(concretize_scope, mock_packages):
"""
update_packages_config(conf_str)
spec_mpich2 = Spec("mpich2").concretized()
spec_mpich2 = spack.concretize.concretized(Spec("mpich2"))
assert spec_mpich2.satisfies("cflags=-g")
spec_mpi = Spec("mpi").concretized()
spec_mpi = spack.concretize.concretized(Spec("mpi"))
assert spec_mpi.satisfies("mpich cflags=-O1")
@@ -404,7 +405,7 @@ def test_requirements_for_package_that_is_not_needed(concretize_scope, test_repo
"""
update_packages_config(conf_str)
s1 = Spec("v").concretized()
s1 = spack.concretize.concretized(Spec("v"))
assert s1.satisfies("@2.1")
@@ -421,10 +422,10 @@ def test_oneof_ordering(concretize_scope, test_repo):
"""
update_packages_config(conf_str)
s1 = Spec("y").concretized()
s1 = spack.concretize.concretized(Spec("y"))
assert s1.satisfies("@2.4")
s2 = Spec("y@2.5").concretized()
s2 = spack.concretize.concretized(Spec("y@2.5"))
assert s2.satisfies("@2.5")
@@ -438,14 +439,14 @@ def test_reuse_oneof(concretize_scope, test_repo, tmp_path, mock_fetch):
store_dir = tmp_path / "store"
with spack.store.use_store(str(store_dir)):
s1 = Spec("y@2.5 ~shared").concretized()
s1 = spack.concretize.concretized(Spec("y@2.5~shared"))
PackageInstaller([s1.package], fake=True, explicit=True).install()
update_packages_config(conf_str)
with spack.config.override("concretizer:reuse", True):
s2 = Spec("y").concretized()
assert not s2.satisfies("@2.5 ~shared")
s2 = spack.concretize.concretized(Spec("y"))
assert not s2.satisfies("@2.5~shared")
@pytest.mark.parametrize(
@@ -473,7 +474,7 @@ def test_requirements_and_deprecated_versions(
update_packages_config(conf_str)
with spack.config.override("config:deprecated", allow_deprecated):
s1 = Spec("y").concretized()
s1 = spack.concretize.concretized(Spec("y"))
for constrain in expected:
assert s1.satisfies(constrain)
@@ -491,7 +492,7 @@ def test_default_requirements_with_all(spec_str, requirement_str, concretize_sco
"""
update_packages_config(conf_str)
spec = Spec(spec_str).concretized()
spec = spack.concretize.concretized(Spec(spec_str))
for s in spec.traverse():
assert s.satisfies(requirement_str)
@@ -500,7 +501,7 @@ def test_default_requirements_with_all(spec_str, requirement_str, concretize_sco
"requirements,expectations",
[
(("%gcc", "%clang"), ("%gcc", "%clang")),
(("%gcc ~shared", "@1.0"), ("%gcc ~shared", "@1.0 +shared")),
(("%gcc~shared", "@1.0"), ("%gcc~shared", "@1.0+shared")),
],
)
def test_default_and_package_specific_requirements(
@@ -518,7 +519,7 @@ def test_default_and_package_specific_requirements(
"""
update_packages_config(conf_str)
spec = Spec("x").concretized()
spec = spack.concretize.concretized(Spec("x"))
assert spec.satisfies(specific_exp)
for s in spec.traverse(root=False):
assert s.satisfies(generic_exp)
@@ -533,7 +534,7 @@ def test_requirements_on_virtual(mpi_requirement, concretize_scope, mock_package
"""
update_packages_config(conf_str)
spec = Spec("callpath").concretized()
spec = spack.concretize.concretized(Spec("callpath"))
assert "mpi" in spec
assert mpi_requirement in spec
@@ -554,7 +555,7 @@ def test_requirements_on_virtual_and_on_package(
"""
update_packages_config(conf_str)
spec = Spec("callpath").concretized()
spec = spack.concretize.concretized(Spec("callpath"))
assert "mpi" in spec
assert mpi_requirement in spec
assert spec["mpi"].satisfies(specific_requirement)
@@ -568,10 +569,10 @@ def test_incompatible_virtual_requirements_raise(concretize_scope, mock_packages
"""
update_packages_config(conf_str)
spec = Spec("callpath ^zmpi")
spec = Spec("callpath^zmpi")
# TODO (multiple nodes): recover a better error message later
with pytest.raises((UnsatisfiableSpecError, InternalConcretizerError)):
spec.concretize()
spack.concretize.concretized(spec)
def test_non_existing_variants_under_all(concretize_scope, mock_packages):
@@ -583,7 +584,7 @@ def test_non_existing_variants_under_all(concretize_scope, mock_packages):
"""
update_packages_config(conf_str)
spec = Spec("callpath ^zmpi").concretized()
spec = spack.concretize.concretized(Spec("callpath^zmpi"))
assert "~foo" not in spec
@@ -658,7 +659,7 @@ def test_conditional_requirements_from_packages_yaml(
and optional when the condition is not met.
"""
update_packages_config(packages_yaml)
spec = Spec(spec_str).concretized()
spec = spack.concretize.concretized(Spec(spec_str))
for match_str, expected in expected_satisfies:
assert spec.satisfies(match_str) is expected
@@ -734,7 +735,7 @@ def test_requirements_fail_with_custom_message(
"""
update_packages_config(packages_yaml)
with pytest.raises(spack.error.SpackError, match=expected_message):
Spec(spec_str).concretized()
spack.concretize.concretized(Spec(spec_str))
def test_skip_requirement_when_default_requirement_condition_cannot_be_met(
@@ -753,9 +754,9 @@ def test_skip_requirement_when_default_requirement_condition_cannot_be_met(
when: "+shared"
"""
update_packages_config(packages_yaml)
s = Spec("mpileaks").concretized()
s = spack.concretize.concretized(Spec("mpileaks"))
assert s.satisfies("%clang +shared")
assert s.satisfies("%clang+shared")
# Sanity checks that 'callpath' doesn't have the shared variant, but that didn't
# cause failures during concretization.
assert "shared" not in s["callpath"].variants
@@ -782,12 +783,12 @@ def test_requires_directive(concretize_scope, mock_packages):
spack.config.CONFIG.clear_caches()
# This package requires either clang or gcc
s = Spec("requires_clang_or_gcc").concretized()
s = spack.concretize.concretized(Spec("requires_clang_or_gcc"))
assert s.satisfies("%gcc@12.0.0")
# This package can only be compiled with clang
with pytest.raises(spack.error.SpackError, match="can only be compiled with Clang"):
Spec("requires_clang").concretized()
spack.concretize.concretized(Spec("requires_clang"))
@pytest.mark.parametrize(
@@ -840,20 +841,20 @@ def test_default_requirements_semantic(packages_yaml, concretize_scope, mock_pac
"""
update_packages_config(packages_yaml)
# Regular zlib concretizes to +shared
s = Spec("zlib").concretized()
# Regular zlib concretizes to+shared
s = spack.concretize.concretized(Spec("zlib"))
assert s.satisfies("+shared")
# If we specify the variant we can concretize only the one matching the constraint
s = Spec("zlib +shared").concretized()
s = spack.concretize.concretized(Spec("zlib+shared"))
assert s.satisfies("+shared")
with pytest.raises(UnsatisfiableSpecError):
Spec("zlib ~shared").concretized()
spack.concretize.concretized(Spec("zlib~shared"))
# A spec without the shared variant still concretizes
s = Spec("pkg-a").concretized()
assert not s.satisfies("pkg-a +shared")
assert not s.satisfies("pkg-a ~shared")
s = spack.concretize.concretized(Spec("pkg-a"))
assert not s.satisfies("pkg-a+shared")
assert not s.satisfies("pkg-a~shared")
@pytest.mark.parametrize(
@@ -897,7 +898,7 @@ def test_default_requirements_semantic(packages_yaml, concretize_scope, mock_pac
"""
packages:
all:
require: "libs=static +feefoo"
require: "libs=static+feefoo"
""",
"multivalue-variant",
["libs=shared"],
@@ -912,7 +913,7 @@ def test_default_requirements_semantic_with_mv_variants(
from MV variants.
"""
update_packages_config(packages_yaml)
s = Spec(spec_str).concretized()
s = spack.concretize.concretized(Spec(spec_str))
for constraint in expected:
assert s.satisfies(constraint), constraint
@@ -937,7 +938,7 @@ def test_requiring_package_on_multiple_virtuals(concretize_scope, mock_packages)
require: intel-parallel-studio
"""
)
s = Spec("dla-future").concretized()
s = spack.concretize.concretized(Spec("dla-future"))
assert s["blas"].name == "intel-parallel-studio"
assert s["lapack"].name == "intel-parallel-studio"
@@ -990,7 +991,7 @@ def test_strong_preferences_packages_yaml(
):
"""Tests that "preferred" specs are stronger than usual preferences, but can be overridden."""
update_packages_config(packages_yaml)
s = Spec(spec_str).concretized()
s = spack.concretize.concretized(Spec(spec_str))
for constraint in expected:
assert s.satisfies(constraint), constraint
@@ -1039,29 +1040,29 @@ def test_conflict_packages_yaml(packages_yaml, spec_str, concretize_scope, mock_
"""Tests conflicts that are specified from configuration files."""
update_packages_config(packages_yaml)
with pytest.raises(UnsatisfiableSpecError):
Spec(spec_str).concretized()
spack.concretize.concretized(Spec(spec_str))
@pytest.mark.parametrize(
"spec_str,expected,not_expected",
[
(
"forward-multi-value +cuda cuda_arch=10 ^dependency-mv~cuda",
"forward-multi-value+cuda cuda_arch=10^dependency-mv~cuda",
["cuda_arch=10", "^dependency-mv~cuda"],
["cuda_arch=11", "^dependency-mv cuda_arch=10", "^dependency-mv cuda_arch=11"],
),
(
"forward-multi-value +cuda cuda_arch=10 ^dependency-mv+cuda",
"forward-multi-value+cuda cuda_arch=10^dependency-mv+cuda",
["cuda_arch=10", "^dependency-mv cuda_arch=10"],
["cuda_arch=11", "^dependency-mv cuda_arch=11"],
),
(
"forward-multi-value +cuda cuda_arch=11 ^dependency-mv+cuda",
"forward-multi-value+cuda cuda_arch=11^dependency-mv+cuda",
["cuda_arch=11", "^dependency-mv cuda_arch=11"],
["cuda_arch=10", "^dependency-mv cuda_arch=10"],
),
(
"forward-multi-value +cuda cuda_arch=10,11 ^dependency-mv+cuda",
"forward-multi-value+cuda cuda_arch=10,11^dependency-mv+cuda",
["cuda_arch=10,11", "^dependency-mv cuda_arch=10,11"],
[],
),
@@ -1074,9 +1075,9 @@ def test_forward_multi_valued_variant_using_requires(
`requires` directives of the form:
for _val in ("shared", "static"):
requires(f"^some-virtual-mv libs={_val}", when=f"libs={_val} ^some-virtual-mv")
requires(f"^some-virtual-mv libs={_val}", when=f"libs={_val}^some-virtual-mv")
"""
s = Spec(spec_str).concretized()
s = spack.concretize.concretized(Spec(spec_str))
for constraint in expected:
assert s.satisfies(constraint)
@@ -1087,7 +1088,7 @@ def test_forward_multi_valued_variant_using_requires(
def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_packages):
"""Tests that strong preferences have a higher priority than reusing specs."""
reused_spec = Spec("adios2~bzip2").concretized()
reused_spec = spack.concretize.concretized(Spec("adios2~bzip2"))
reuse_nodes = list(reused_spec.traverse())
root_specs = [Spec("ascent+adios2")]
@@ -1122,7 +1123,7 @@ def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_pa
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(
setup, [Spec("ascent+adios2 ^adios2~bzip2")], reuse=reuse_nodes
setup, [Spec("ascent+adios2^adios2~bzip2")], reuse=reuse_nodes
)
ascent = result.specs[0]
assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent


@@ -5,6 +5,7 @@
import pytest
import spack.concretize
import spack.spec
import spack.store
@@ -14,7 +15,7 @@
def test_set_install_hash_length(hash_length, mutable_config, tmpdir):
mutable_config.set("config:install_hash_length", hash_length)
with spack.store.use_store(str(tmpdir)):
spec = spack.spec.Spec("libelf").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("libelf"))
prefix = spec.prefix
hash_str = prefix.rsplit("-")[-1]
assert len(hash_str) == hash_length
@@ -24,7 +25,7 @@ def test_set_install_hash_length(hash_length, mutable_config, tmpdir):
def test_set_install_hash_length_upper_case(mutable_config, tmpdir):
mutable_config.set("config:install_hash_length", 5)
with spack.store.use_store(str(tmpdir), extra_data={"projections": {"all": "{name}-{HASH}"}}):
spec = spack.spec.Spec("libelf").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("libelf"))
prefix = spec.prefix
hash_str = prefix.rsplit("-")[-1]
assert len(hash_str) == 5
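Both hunks above share one pattern: concretize inside a temporary store, then read the hash off the installation prefix. A condensed sketch assuming the same fixtures (the helper name is made up):

import spack.concretize
import spack.spec
import spack.store

def assert_hash_length(tmpdir, expected_length):
    # The last dash-separated component of the prefix is the spec
    # hash, truncated to config:install_hash_length characters.
    with spack.store.use_store(str(tmpdir)):
        spec = spack.concretize.concretized(spack.spec.Spec("libelf"))
        hash_str = spec.prefix.rsplit("-")[-1]
        assert len(hash_str) == expected_length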


@@ -37,6 +37,7 @@
import spack.caches
import spack.compiler
import spack.compilers
import spack.concretize
import spack.config
import spack.directives_meta
import spack.environment as ev
@@ -855,7 +856,7 @@ def _populate(mock_db):
"""
def _install(spec):
s = spack.spec.Spec(spec).concretized()
s = spack.concretize.concretized(spack.spec.Spec(spec))
PackageInstaller([s.package], fake=True, explicit=True).install()
_install("mpileaks ^mpich")
@@ -1989,7 +1990,9 @@ def default_mock_concretization(config, mock_packages, concretized_specs_cache):
def _func(spec_str, tests=False):
key = spec_str, tests
if key not in concretized_specs_cache:
concretized_specs_cache[key] = spack.spec.Spec(spec_str).concretized(tests=tests)
concretized_specs_cache[key] = spack.concretize.concretized(
spack.spec.Spec(spec_str), tests=tests
)
return concretized_specs_cache[key].copy()
return _func
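The fixture above memoizes concretization per (spec_str, tests) key and hands each caller a copy, so a test that mutates its spec cannot contaminate the shared cache. The same idea in isolation (names are illustrative):

import spack.concretize
import spack.spec

_cache = {}

def cached_concretization(spec_str, tests=False):
    key = (spec_str, tests)
    if key not in _cache:
        _cache[key] = spack.concretize.concretized(
            spack.spec.Spec(spec_str), tests=tests
        )
    # Return a copy so callers stay isolated from the cache.
    return _cache[key].copy()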


@@ -9,6 +9,7 @@
from llnl.util.filesystem import mkdirp, touch, working_dir
import spack.concretize
from spack.fetch_strategy import CvsFetchStrategy
from spack.spec import Spec
from spack.stage import Stage
@@ -38,7 +39,7 @@ def test_fetch(type_of_test, mock_cvs_repository, config, mutable_mock_repo):
get_date = mock_cvs_repository.get_date
# Construct the package under test
spec = Spec("cvs-test").concretized()
spec = spack.concretize.concretized(Spec("cvs-test"))
spec.package.versions[Version("cvs")] = test.args
# Enter the stage directory and check some properties


@@ -27,6 +27,7 @@
import llnl.util.lock as lk
from llnl.util.tty.colify import colify
import spack.concretize
import spack.database
import spack.deptypes as dt
import spack.package_base
@@ -78,8 +79,8 @@ def test_query_by_install_tree(
up_write_db, up_db, down_db = upstream_and_downstream_db
# Set the upstream DB to contain "pkg-c" and downstream to contain "pkg-b"
b = spack.spec.Spec("pkg-b").concretized()
c = spack.spec.Spec("pkg-c").concretized()
b = spack.concretize.concretized(spack.spec.Spec("pkg-b"))
c = spack.concretize.concretized(spack.spec.Spec("pkg-c"))
up_write_db.add(c)
up_db._read()
down_db.add(b)
@@ -96,7 +97,7 @@ def test_spec_installed_upstream(
# a known installed spec should say that it's installed
with spack.repo.use_repositories(mock_custom_repository):
spec = spack.spec.Spec("pkg-c").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("pkg-c"))
assert not spec.installed
assert not spec.installed_upstream
@@ -125,7 +126,7 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
builder.add_package("w", dependencies=[("x", None, None), ("y", None, None)])
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("w").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("w"))
for dep in spec.traverse(root=False):
upstream_write_db.add(dep)
upstream_db._read()
@@ -136,7 +137,7 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
with pytest.raises(spack.database.ForbiddenLockError):
upstream_db.get_by_hash(dep.dag_hash())
new_spec = spack.spec.Spec("w").concretized()
new_spec = spack.concretize.concretized(spack.spec.Spec("w"))
downstream_db.add(new_spec)
for dep in new_spec.traverse(root=False):
upstream, record = downstream_db.query_by_spec_hash(dep.dag_hash())
@@ -158,7 +159,7 @@ def test_removed_upstream_dep(upstream_and_downstream_db, tmpdir, capsys, config
builder.add_package("y", dependencies=[("z", None, None)])
with spack.repo.use_repositories(builder):
y = spack.spec.Spec("y").concretized()
y = spack.concretize.concretized(spack.spec.Spec("y"))
z = y["z"]
# add dependency to upstream, dependents to downstream
@@ -190,7 +191,7 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
builder.add_package("x")
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("x"))
downstream_db.add(spec)
upstream_write_db.add(spec)
@@ -222,7 +223,7 @@ def test_cannot_write_upstream(tmp_path, mock_packages, config):
db = spack.database.Database(str(tmp_path), is_upstream=True)
with pytest.raises(spack.database.ForbiddenLockError):
db.add(spack.spec.Spec("pkg-a").concretized())
db.add(spack.concretize.concretized(spack.spec.Spec("pkg-a")))
@pytest.mark.usefixtures("config", "temporary_store")
@@ -236,7 +237,7 @@ def test_recursive_upstream_dbs(tmpdir, gen_mock_layout):
builder.add_package("x", dependencies=[("y", None, None)])
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x").concretized()
spec = spack.concretize.concretized(spack.spec.Spec("x"))
db_c = spack.database.Database(roots[2], layout=layouts[2])
db_c.add(spec["z"])
@@ -386,7 +387,7 @@ def _check_remove_and_add_package(database: spack.database.Database, spec):
def _mock_install(spec: str):
s = spack.spec.Spec(spec).concretized()
s = spack.concretize.concretized(spack.spec.Spec(spec))
PackageInstaller([s.package], fake=True, explicit=True).install()
@@ -731,8 +732,7 @@ def test_regression_issue_8036(mutable_database, usr_folder_exists):
# existing. Even when the package prefix exists, the package should
# not be considered installed until it is added to the database by
# the installer with install().
s = spack.spec.Spec("externaltool@0.9")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("externaltool@0.9"))
assert not s.installed
# Now install the external package and check again the `installed` property
@@ -747,8 +747,7 @@ def test_old_external_entries_prefix(mutable_database):
jsonschema.validate(db_obj, schema)
s = spack.spec.Spec("externaltool")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("externaltool"))
db_obj["database"]["installs"][s.dag_hash()]["path"] = "None"
@@ -777,8 +776,7 @@ def test_uninstall_by_spec(mutable_database):
def test_query_unused_specs(mutable_database):
# This spec installs a fake cmake as a build only dependency
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("simple-inheritance"))
PackageInstaller([s.package], fake=True, explicit=True).install()
si = s.dag_hash()
@@ -820,8 +818,7 @@ def check_unused(roots, deptype, expected):
def test_query_spec_with_conditional_dependency(mutable_database):
# The issue is triggered by having dependencies that are
# conditional on a Boolean variant
s = spack.spec.Spec("hdf5~mpi")
s.concretize()
s = spack.concretize.concretized(spack.spec.Spec("hdf5~mpi"))
PackageInstaller([s.package], fake=True, explicit=True).install()
results = spack.store.STORE.db.query_local("hdf5 ^mpich")
@@ -861,7 +858,7 @@ def _is(self, spec):
# Pretend the spec has been failure locked
monkeypatch.setattr(spack.database.FailureTracker, "lock_taken", _is)
s = spack.spec.Spec("pkg-a").concretized()
s = spack.concretize.concretized(spack.spec.Spec("pkg-a"))
spack.store.STORE.failure_tracker.clear(s)
out = capfd.readouterr()[0]
assert "Retaining failure marking" in out
@@ -879,7 +876,7 @@ def _is(self, spec):
# Ensure raise OSError when try to remove the non-existent marking
monkeypatch.setattr(spack.database.FailureTracker, "persistent_mark", _is)
s = spack.spec.Spec("pkg-a").concretized()
s = spack.concretize.concretized(spack.spec.Spec("pkg-a"))
spack.store.STORE.failure_tracker.clear(s, force=True)
out = capfd.readouterr()[1]
assert "Removing failure marking despite lock" in out
@@ -894,7 +891,7 @@ def _raise_exc(lock):
raise lk.LockTimeoutError("write", "/mock-lock", 1.234, 10)
with tmpdir.as_cwd():
s = spack.spec.Spec("pkg-a").concretized()
s = spack.concretize.concretized(spack.spec.Spec("pkg-a"))
# Ensure attempt to acquire write lock on the mark raises the exception
monkeypatch.setattr(lk.Lock, "acquire_write", _raise_exc)
@@ -910,7 +907,7 @@ def _raise_exc(lock):
def test_prefix_failed(mutable_database, monkeypatch):
"""Add coverage to failed operation."""
s = spack.spec.Spec("pkg-a").concretized()
s = spack.concretize.concretized(spack.spec.Spec("pkg-a"))
# Confirm the spec is not already marked as failed
assert not spack.store.STORE.failure_tracker.has_failed(s)
@@ -934,7 +931,7 @@ def test_prefix_write_lock_error(mutable_database, monkeypatch):
def _raise(db, spec):
raise lk.LockError("Mock lock error")
s = spack.spec.Spec("pkg-a").concretized()
s = spack.concretize.concretized(spack.spec.Spec("pkg-a"))
# Ensure subsequent lock operations fail
monkeypatch.setattr(lk.Lock, "acquire_write", _raise)
@@ -1142,7 +1139,7 @@ def test_reindex_with_upstreams(tmp_path, monkeypatch, mock_packages, config):
# we install `mpileaks` locally with dependencies in the upstream. And we even install
# `mpileaks` with the same hash in the upstream. After reindexing, `mpileaks` should still be
# in the local db, and `callpath` should not.
mpileaks = spack.spec.Spec("mpileaks").concretized()
mpileaks = spack.concretize.concretized(spack.spec.Spec("mpileaks"))
callpath = mpileaks.dependencies("callpath")[0]
upstream_store = spack.store.create(
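
A recurring assertion in the database hunks: an upstream database is read-only, and any write attempt raises ForbiddenLockError. A minimal sketch, assuming the mock pkg-a package used above:

import pytest

import spack.concretize
import spack.database
import spack.spec

def check_upstream_is_read_only(tmp_path):
    db = spack.database.Database(str(tmp_path), is_upstream=True)
    with pytest.raises(spack.database.ForbiddenLockError):
        db.add(spack.concretize.concretized(spack.spec.Spec("pkg-a")))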


@@ -6,6 +6,7 @@
import pytest
import spack.concretize
import spack.directives
import spack.repo
import spack.spec
@@ -60,8 +61,8 @@ def test_constraints_from_context_are_merged(mock_packages):
@pytest.mark.regression("27754")
def test_extends_spec(config, mock_packages):
extender = spack.spec.Spec("extends-spec").concretized()
extendee = spack.spec.Spec("extendee").concretized()
extender = spack.concretize.concretized(spack.spec.Spec("extends-spec"))
extendee = spack.concretize.concretized(spack.spec.Spec("extendee"))
assert extender.dependencies
assert extender.package.extends(extendee)
@@ -148,6 +149,8 @@ def test_version_type_validation():
_pkgx = (
"x",
"""\
from spack.package import *
class X(Package):
version("1.3")
version("1.2")
@@ -166,6 +169,8 @@ class X(Package):
_pkgy = (
"y",
"""\
from spack.package import *
class Y(Package):
version("2.1")
version("2.0")
@@ -203,7 +208,7 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
def test_redistribute_directive(test_repo, spec_str, distribute_src, distribute_bin):
spec = spack.spec.Spec(spec_str)
assert spec.package_class.redistribute_source(spec) == distribute_src
concretized_spec = spec.concretized()
concretized_spec = spack.concretize.concretized(spec)
assert concretized_spec.package.redistribute_binary == distribute_bin
@@ -219,10 +224,10 @@ class MockPackage:
disable_redistribute = {}
cls = MockPackage
spack.directives._execute_redistribute(cls, source=False, when="@1.0")
spack.directives._execute_redistribute(cls, source=False, binary=None, when="@1.0")
spec_key = spack.directives._make_when_spec("@1.0")
assert not cls.disable_redistribute[spec_key].binary
assert cls.disable_redistribute[spec_key].source
spack.directives._execute_redistribute(cls, binary=False, when="@1.0")
spack.directives._execute_redistribute(cls, source=None, binary=False, when="@1.0")
assert cls.disable_redistribute[spec_key].binary
assert cls.disable_redistribute[spec_key].source

Some files were not shown because too many files have changed in this diff.