Compare commits

..

151 Commits

Author SHA1 Message Date
Wouter Deconinck
81e6dcd95c gaudi: add v39.2; CUDA support (#48557)
* gaudi: add v39.2; CUDA support
* gaudi: CUDAPackage -> CudaPackage
2025-01-23 08:41:38 +01:00
Tara Drwenski
518572e710 Use gcc toolchain when using hip and gcc host compiler (#48632) 2025-01-22 23:37:24 -08:00
Sinan
6f4ac31a67 alps: add new package (#48183) 2025-01-22 14:57:19 -07:00
Victor A. P. Magri
e291daaa17 Add OpenMP dependency to Kokkos and KokkosKernels (#48455)
* Add OpenMP dependency to Kokkos and KokkosKernels

* Add Chris' suggestions
2025-01-22 12:53:41 -07:00
Seth R. Johnson
58f1e791a0 Celeritas: new version 0.5.1 (#48678)
* Celeritas: add version 0.5

* Fix style

* Un-deprecate non-awful versions
2025-01-22 11:03:48 -07:00
jnhealy2
aba0a740c2 Address quoting issue impacting dev_paths containing @ symbols (#48555)
* Address quoting issue that causes dev_paths containing @ symbols to parse as versions

* Fix additional location where dev_path was improperly constructed

The dev_path must be quoted to avoid parsing issues when
paths contain '@' symbols. Also add tests to catch
regression of this behavior

* Format to fix line length

* fix failing tests

* Fix whitespace error

* Add binary_compatibility fixture to test

* Change string formatting to avoid multiline f string

* Update lib/spack/spack/test/concretization/core.py

* Add tmate debug session

* Revert "Add tmate debug session"

This reverts commit 24e2f77e3c.

* Move test and refactor to use env methods

* Update lib/spack/spack/test/cmd/develop.py

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2025-01-22 18:03:13 +01:00
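The quoting problem this commit fixes can be illustrated with a toy reconstruction (the helper below is hypothetical, not Spack's actual code):

```python
# Hypothetical helper illustrating the fix: a literal '@' in a dev_path reads
# like a version sigil to a spec parser, so the value must be quoted when the
# spec string is reconstructed.
def spec_string(name: str, dev_path: str) -> str:
    # quoting keeps the '@' inside the path from being parsed as a version
    return f"{name} dev_path='{dev_path}'"


s = spec_string("mypkg", "/home/user@host/src/mypkg")
assert s == "mypkg dev_path='/home/user@host/src/mypkg'"

# unquoted, the same value exposes a bare '@' to the tokenizer
unquoted = "mypkg dev_path=/home/user@host/src/mypkg"
assert "@" in unquoted.split("=", 1)[1]
```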
Harmen Stoppels
0fe8e763c3 bootstrap: do not show disabled sources as errors if enabled sources fail (#48676) 2025-01-22 17:02:23 +01:00
Harmen Stoppels
0e2d261b7e bootstrap/status.py: remove make (#48677) 2025-01-22 17:00:25 +01:00
Matthieu Dorier
85cb234861 pmdk: add cmake dependency back (#48664) 2025-01-22 14:44:04 +01:00
Harmen Stoppels
87a83db623 autotools.py: set lt_cv_apple_cc_single_mod=yes (#48671)
Since macOS 15 `ld -single_module` warns with a deprecation message,
which makes configure scripts believe the flag is unsupported. That
in turn triggers a code path where `archive_cmds` is set to

```
$CC -r -keep_private_externs -nostdlib ... -dynamiclib
```

instead of just

```
$CC -dynamiclib ...
```

This code path was meant to trigger only on ancient macOS <= 14.4 where
libtool had to add `-single_module`, which is the default since macOS
14.4, and is now apparently deprecated because the flag is a no-op for
more than 15 years.

The wrong `archive_cmds` causes actual problems combined with a bug in
OpenMPI's compiler wrapper (`CC=mpicc`), which appends `-rpath` flags,
which cause an error when combined with the `-r` flag added by the
autotools.

Spack's compiler wrapper doesn't do this, but it's likely there are
other compiler wrappers out there that are not aware that `-r` and
`-rpath` cannot be combined.

The fix is to change defaults: `lt_cv_apple_cc_single_mod=yes`.
2025-01-22 11:29:50 +01:00
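The cache-variable name comes from the commit above; the helper below is an illustrative sketch of how a build wrapper could pre-seed it, since autoconf accepts `VAR=VALUE` arguments on the configure command line:

```python
# Hedged sketch: pre-seed the libtool cache variable so configure skips the
# broken -single_module probe on macOS 15. Only the variable name is from the
# commit message; the helper itself is hypothetical.
def configure_args(base_args):
    """Append the cache override to a configure invocation's argument list."""
    return [*base_args, "lt_cv_apple_cc_single_mod=yes"]


args = configure_args(["--prefix=/opt/foo"])
assert args == ["--prefix=/opt/foo", "lt_cv_apple_cc_single_mod=yes"]
```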
Taillefumier Mathieu
e1e17786c5 sirius: libxc 7 forward compat (#48663) 2025-01-22 11:27:19 +01:00
eugeneswalker
68af5cc4c0 e4s ci stacks: add libceed (#48668) 2025-01-21 14:13:55 -08:00
Harmen Stoppels
70df460fa7 PackageBase.detect_dev_src_change: speed up (#48618) 2025-01-21 16:48:37 +01:00
Harmen Stoppels
31a1b2fd6c relocate.py, binary_distribution.py: cleanup (#48651) 2025-01-21 15:45:08 +01:00
moloney
f8fd51e12f BF+ENH: Add wxwidgets 3.2.6 and py-wxpython 4.2.2, missing pkgconfig dep (#48597)
* BF+ENH: Add wxwidgets 3.2.6 and py-wxpython 4.2.2

Improves compat with newer Python (>3.9) and numpy.

Fix error during configure step (even on at least some older
versions) by including 'pkgconfig' as a build dep so gtk+ is found.

* BF: Make wxpython use spack built wxwidgets instead of rebuilding

* Update var/spack/repos/builtin/packages/py-wxpython/package.py

Avoid using too new version of py-setuptools as license file format is currently not compatible.

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-01-20 21:01:47 -06:00
Mirko Rahn
12784594aa gpi-space: add v24.12 (#48361) 2025-01-20 11:47:56 -07:00
Massimiliano Culpo
e0eb0aba37 netlib-lapack: add v3.12.1 (#48318)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-01-20 18:23:54 +01:00
Stephen Nicholas Swatman
f47bf5f6b8 indicators: new package (#47786)
* indicators: new package

Indicators is a new package which aims to add easy-to-use progress bars
to modern C++.

* Update header

* Update dependencies
2025-01-20 09:00:37 -06:00
Seth R. Johnson
9296527775 npm, node-js, typescript: add external find (#48587)
* npm: add external find

* node-js: add external find

* Add external for typescript

* Match full executable string

* Fix style
2025-01-20 07:26:01 -06:00
Wouter Deconinck
08c53fa405 pythia8: correct with_or_without prefix for +openmpi (#48131)
* pythia8: correct with_or_without prefix for +openmpi

* pythia8: fix style

* pythia8: fix style
2025-01-20 06:50:29 -06:00
Harmen Stoppels
0c6f0c090d openldap: fix build (#48646) 2025-01-20 13:16:52 +01:00
Nathan Hanford
c623448f81 add affinity package (#48589) 2025-01-20 04:44:25 -07:00
Wouter Deconinck
df71341972 py-mplhep: add v0.3.55 (#48338) 2025-01-20 10:26:09 +01:00
Wouter Deconinck
75862c456d lhapdf: add v6.5.5 (#48336) 2025-01-20 10:25:38 +01:00
dmagdavector
e680a0c153 libslirp: add v4.8.0 (#48561) 2025-01-20 10:23:47 +01:00
Wouter Deconinck
9ad36080ca r: add v4.4.2 (#48329)
* r: add v4.4.2

* r-rcpp: add v1.0.13-1 spot release; conflict 1.0.13 with r@4.4.2
2025-01-20 10:23:23 +01:00
Harmen Stoppels
ecd14f0ad9 archive.py: fix include_parent_directories=True with path_to_name (#48570) 2025-01-20 10:21:32 +01:00
Rocco Meli
c44edf1e8d gnina: add v1.3 (#47711)
* gnina: add v1.3

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* use github patch

* update

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2025-01-20 10:19:49 +01:00
Paul R. C. Kent
1eacdca5aa py-pyscf: add v2.8.0 (#48571) 2025-01-20 10:15:41 +01:00
Keita Iwabuchi
4a8f5efb38 Metall: add v0.29 and v0.30 (#48564)
Co-authored-by: Keita Iwabuchi <iwabuchi1@lln.gov>
2025-01-20 10:14:42 +01:00
Mickael PHILIT
2e753571bd cgns: add v4.5.0 (#48490) 2025-01-20 10:09:45 +01:00
Thomas Bouvier
da16336550 nvtx: fix missing import (#48580) 2025-01-20 10:08:47 +01:00
Alberto Sartori
1818e70e74 justbuild: add version 1.4.2 (#48581) 2025-01-20 10:07:55 +01:00
Chang Liu
1dde785e9a fusion-io: new package (#47699) 2025-01-20 10:06:13 +01:00
Alberto Invernizzi
a7af32c23b py-virtualenvwrapper: add v6.0.0, v6.1.0, v6.1.1 (#47785) 2025-01-20 10:04:40 +01:00
Simon Frasch
6c92ad439b spfft: add v1.1.1 (#48643) 2025-01-20 10:03:50 +01:00
snehring
93f555eb14 dorado: fixing typo on version restriction for hdf5 (#48591)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-01-20 10:01:29 +01:00
Vanessasaurus
fa3725e9de flux-pmix: add v0.6.0 (#48600)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-01-20 10:00:35 +01:00
Vanessasaurus
870dd6206f flux-sched: add v0.41.0 (#48599)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-01-20 09:57:22 +01:00
Olivier Cessenat
b1d411ab06 octave: new version 9.3.0 (#48606) 2025-01-20 09:56:55 +01:00
Massimiliano Culpo
783eccfbd5 jsonschema: use draft7 validator and simplify schemas (#48621) 2025-01-20 09:51:29 +01:00
Wouter Deconinck
a842332b1b rsync: add v3.4.0, v3.4.1 (#48607) 2025-01-20 09:50:44 +01:00
Harmen Stoppels
7e41288ca6 spack_yaml.py / spec.py: use dict instead of OrderedDict (#48616)
* for config: let `syaml_dict` inherit from `dict` instead of `OrderedDict`. `syaml_dict` now only exists as a mutable wrapper for yaml related metadata.
* for spec serialization / hashing: use `dict` directly

This is possible since we only support cpython 3.6+ in which dicts are ordered.

This improves performance of hash computation a bit. For a larger spec I'm getting 9.22ms instead of 11.9ms, so 22.5% reduction in runtime.
2025-01-20 17:33:44 +09:00
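The safety argument in the commit message can be shown in a couple of lines: CPython preserves insertion order for plain dicts (an implementation detail in 3.6, a language guarantee since 3.7), so serialization, and therefore hashing, stays deterministic without `OrderedDict`:

```python
# Why dropping OrderedDict is safe: plain dicts keep insertion order in
# CPython 3.6+, so the serialized form (and hence any hash of it) is stable.
import json

a = {"name": "zlib", "version": "1.3"}
b = dict(a)  # copies preserve insertion order as well

assert list(a) == list(b) == ["name", "version"]
# identical ordering means identical serialized form, hence identical hashes
assert json.dumps(a) == json.dumps(b)
```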
Tara Drwenski
3bb375a47f Change llvm-amdgpu to a build dependency of rocm (#48612) 2025-01-20 17:28:15 +09:00
Olivier Cessenat
478855728f ngspice: add v44 (#48611) 2025-01-20 09:27:51 +01:00
jgraciahlrs
5e3baeabfa scorep: update configure opts for libbfd (#48614) 2025-01-20 09:24:12 +01:00
Axel Huebl
58b9b54066 openPMD-api: add v0.16.1 (#48601)
* openpmd-api: add v0.16.1

Add the latest release of openPMD-api.

* TOML11: Latest Versions
2025-01-20 09:22:30 +01:00
Luca Heltai
3918deab74 dealii: add v9.6.1, and v9.6.2 (#48620) 2025-01-20 09:20:49 +01:00
Matt Thompson
ceb2ce352f mapl: add v2.52.0 (#48629) 2025-01-20 09:10:29 +01:00
Wouter Deconinck
7dc6bff7b1 root: depends_on r-rcpp@:1.0.12 when @:6.32.02 (#48625) 2025-01-20 09:05:55 +01:00
Adam J. Stewart
05fbbd7164 py-numpy: add v2.1.3, v2.2.1, v2.2.2 (#48640) 2025-01-20 08:48:56 +01:00
Juan Miguel Carceller
58421866c2 hepmc3: extend python when the python bindings are enabled (#47836)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-20 08:46:08 +01:00
Harmen Stoppels
962498095d compat with mypy 1.14 (#48626)
* OSError.errno apparently is int | None, but already part of __str__ anyway so drop

* fix disambiguate_spec_from_hashes

* List[Spec].sort() complains, ignore it

* fix typing of KnownCompiler, and use it in asp.py
2025-01-18 21:16:05 +01:00
Anna Rift
d0217cf04e Fix two instances of duplicate 'https://' in URLs (#48628) 2025-01-17 22:59:02 -07:00
Olivier Cessenat
65745fa0df poppler: many improvements (#48627)
* poppler: many improvements

* Strangely I did not have the proper comment header

* Add a maintainer
2025-01-17 22:42:50 -07:00
John Gouwar
9f7cff1780 Stable splice-topo order for resolving splicing (#48605)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-01-17 16:27:16 +01:00
Harmen Stoppels
bb43fa5444 spec.py: make hashing of extra_attributes order independent (#48615) 2025-01-17 13:50:36 +01:00
Julien Loiseau
847f560a6e hard: new package (#48595) 2025-01-16 15:47:59 -07:00
Tamara Dahlgren
623ff835fc mypy: update python version to avoid error/warning (#48593)
* pyproject: remove mypy python version option since defaults to python used to run it (#48593)
2025-01-16 11:24:29 -07:00
Harmen Stoppels
ca19790ff2 py-executing: support python 3.13 (#48604) 2025-01-16 19:00:10 +01:00
acastanedam
f23366e4f8 Update elk to versions 8.8, 9.6, 10.2 (#48583) 2025-01-16 08:46:18 -08:00
Chris Marsh
42fb689501 py-pint-xarray: add version 0.4 (#47599) 2025-01-16 16:57:49 +01:00
arezaii
c33bbdb77d add Arkouda server and client packages (#48054)
Adds packages for the Arkouda server (`arkouda`) and Arkouda python client  (`py-arkouda`).

Arkouda server and client are divided into separate packages to allow for them to be
installed independently of one another. 

Future work remains to add a `+dev` variant to `py-arkouda`, but that will require additional
supporting packages made available through spack (e.g. for `py-pytest-env`).
Laurent Chardon
1af6aa22c1 py-basemap: add v1.4.1 (#48548)
* py-basemap: add v1.4.1
* py-basemap: remove broken v1.2.1
* py-basemap: add hires variant
2025-01-15 18:05:05 -08:00
Laurent Chardon
03d9373e5c py-pycuda: add version 2024.1.2 (#48547)
* py-pycuda: add version 2024.1.2
* py-pycuda: add version 2024.1.2
* py-pycuda: Improve dependencies versions
2025-01-15 17:33:14 -08:00
HELICS-bot
f469a3d6ab helics: Add version 3.6.0 (#48000)
* helics: Add version 3.6.0
* helics: Add constraints for new minimum CMake (3.22+), Boost (1.75+), GCC (11+), Clang (15+), and ICC (21+) versions in HELICS 3.6+

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Ryan Mast <mast9@llnl.gov>
2025-01-15 17:32:27 -07:00
Xuefeng Ding
25f24d947a add root to rpath of python (#48491)
* add root to rpath of python?

* fix #48446

* [@spackbot] updating style on behalf of DingXuefeng

---------

Co-authored-by: DingXuefeng <DingXuefeng@users.noreply.github.com>
Co-authored-by: Patrick Gartung <gartung@fnal.gov>
2025-01-15 14:27:59 -07:00
Sergio Sánchez Ramírez
af25a84a56 julia: bump 1.11 and 1.10 patch versions (#48149) 2025-01-15 20:23:09 +01:00
Massimiliano Culpo
59a71959e7 Fix wrong hash in spliced specs, and improve unit-tests (#48574)
Modifications:
* Fix a severe bug where a spliced spec has the same hash as the original build spec
* Removed CacheManager in favor of install_mockery. The latter sets up a temporary store, and removes it at the end 
   of the test.
* Deleted a couple of helper functions e.g. _has_dependencies
* Checked that SolverException is raised, rather than Exception (more specific)
* Extended, and renamed the splicing_setup fixture (now called install_specs)
* Added more specific assertions in each test

One test is currently flaky, due to some instability when sorting multiple specs. It's currently marked xfail
2025-01-15 19:43:55 +01:00
Harmen Stoppels
00e804a94b package.py: expose types of common globals unconditionally (#48588) 2025-01-15 17:33:38 +01:00
Harmen Stoppels
db997229f2 julia: fix forward compat with curl (#48585) 2025-01-15 16:13:37 +01:00
Massimiliano Culpo
6fac041d40 Add type-hints to which and which_string (#48554) 2025-01-15 15:24:18 +01:00
Paul R. C. Kent
f8b2c65ddf llvm: add v19.1.7 (#48572) 2025-01-15 15:22:14 +01:00
Jordan Galby
c504304d39 Add version attributes up_to_1, up_to_2, up_to_3 (#38410)
Making them also available in spec format string
2025-01-15 13:07:17 +01:00
Harmen Stoppels
976f1c2198 flag_mixing.py: add missing import (#48579) 2025-01-15 10:48:34 +01:00
Greg Becker
e7c591a8b8 Deprecate Spec.concretize/Spec.concretized in favor of spack.concretize.concretize_one (#47971)
The methods spack.spec.Spec.concretize and spack.spec.Spec.concretized
are deprecated in favor of spack.concretize.concretize_one.

This will resolve a circular dependency between the spack.spec and
spack.concretize in the next Spack release.
2025-01-15 10:13:19 +01:00
arezaii
f3522cba74 Chapel 2.3 (#48053)
* add python bindings variant to chapel
* fix chpl_home, update chapel frontend rpath in venv
* fix chpl frontend shared lib rpath hack
* patch chapel env var line length limit
* patch chapel python shared lib location by chapel version
* unhack chpl frontend lib rpath
* fix postinstall main version number file path
* update chapel version number to 2.3
* use chapel releases instead of source tarballs, remove dead code
* Chapel 2.3 adds support for LLVM 19
* Bundled LLVM always requires CMake 3.20:
* Apply 2.3 patch for LLVM include search path

Fixes build errors for `chapel@2.3+rocm` of the form:

```
compiler/llvm/llvmDebug.cpp:174:9: error: cannot convert 'const char*' to 'llvm::dwarf::MemorySpace'
compiler/llvm/llvmDebug.cpp:254:15: error: cannot convert 'const char*' to 'llvm::dwarf::MemorySpace'
```

* fix style

* Fix misreporting of test_hello failures

`test_part` was aliasing the enclosing test name, leading to confusing and incorrect reporting on test failure.

* Ensure `libxml2` optional dep of `hwloc` is also added to `PKG_CONFIG_PATH` in the run env

* patch chplCheck for GPU + multilocale configs, install docs

* Adjust patch for checkChplInstall

* Ensure `CHPL_DEVELOPER` is unset for `~developer`

---------

Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
2025-01-14 21:48:44 -08:00
Hubertus van Dam
0bd9c235a0 lammps: enable scafacos (#47638)
* Enable Scafacos in LAMMPS
* lammps: make scafacos work with +lib

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-14 21:33:44 -07:00
jclause
335fca7049 Improve support using external modules with zsh (#48115)
* Improve support using external modules with zsh

Spack integrates with external modules by launching a python subprocess
to scrape the path from the module command. In zsh, subshells do not
inheret functions defined in the parent shell (only environment
variables). This breaks the module function in module_cmd.py.

As a workaround, source the module commands using $MODULESHOME prior to
running the module command.

* Fix formatting

* Fix flake error

* Add guard around sourcing module file

* Add improved unit testing to module src command

* Correct style

* Another attempt at style

* formatting again

* formatting again

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2025-01-14 21:29:15 -07:00
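A minimal sketch of the workaround described above, in the spirit of `module_cmd.py`: because zsh subshells do not inherit shell functions, the subprocess command first sources the init script under `$MODULESHOME` before invoking `module`. The `init/sh` path is the conventional Environment Modules layout, assumed here; a throwaway fake `MODULESHOME` stands in for a real installation:

```python
# Sketch (assumptions noted above): source $MODULESHOME/init/sh in the
# subshell so the `module` function exists before we call it.
import os
import subprocess
import tempfile


def module_output(*args: str) -> str:
    init = os.path.join(os.environ.get("MODULESHOME", ""), "init", "sh")
    prefix = f". {init} 2>/dev/null; " if os.path.isfile(init) else ""
    cmd = prefix + "module " + " ".join(args)
    return subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True).stdout


# demo with a fake MODULESHOME standing in for a real install
home = tempfile.mkdtemp()
os.makedirs(os.path.join(home, "init"))
with open(os.path.join(home, "init", "sh"), "w") as f:
    f.write('module() { echo "module called: $*"; }\n')
os.environ["MODULESHOME"] = home

assert module_output("avail").strip() == "module called: avail"
```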
Alex Richert
ce5ef14fdb crtm: add v3.1.1-build1 (#48451)
* add crtm@3.1.1-build1
* fix url_for_version's version check

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-01-14 19:03:50 -07:00
Greg Becker
2447d16e55 bugfix: associate nodes directly with specs for compiler flag reordering (#48496) 2025-01-14 10:15:57 -08:00
Wouter Deconinck
8196c68ff3 py-dask: fix py-versioneer version pin (#47980)
* py-dask: fix py-versioneer version pin
* py-dask: depends_on py-click@8.1 when @2023.11.0:
* py-dask: reorder dependency lines

Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>

---------

Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>
2025-01-14 09:44:24 -08:00
Taillefumier Mathieu
308f74fe8b trexio: add v2.2.3 -> master (#48543)
* Update trexio
   - update version
   - add cmake support

* Fix formatting

* hdf5+hl only needed when < 2.3.0

---------

Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
2025-01-14 09:39:43 -08:00
Ludovic Räss
864f09fef0 pism: add v2.0.7, v2.1.1 (#47921)
* Update recipe

* Update

* Update

* Cleanup

* Fixup
2025-01-14 09:20:16 -08:00
Juan Miguel Carceller
29b53581e2 bdsim: update to point to the new location in github and add 1.7.7 (#48458)
* bdsim: update to point to the new location in github and add 1.7.7

* Remove the C++ standard patch

* Remove the cmake_args and add a comment instead

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-14 09:18:22 -08:00
Wouter Deconinck
e441e780b9 flatbuffers: add v24.12.23 (#48563)
* flatbuffers: add v24.12.23

* flatbuffers: fix style
2025-01-14 09:12:08 -08:00
Jean Luca Bez
4c642df5ae drishti: add v0.5, v0.6 (#39262)
* include new versions, update dependencies, add test
* Update package.py
* requested changes
* link dependency
* include commit
* fix style
2025-01-14 08:48:35 -07:00
Matt Thompson
217774c972 mepo: add v2.2.1, v2.3.0 (#48552) 2025-01-14 08:44:57 -07:00
Matthieu Dorier
8ec89fc54c pmdk: add gmake dependency, remove cmake dependency (#48544) 2025-01-14 08:44:43 -07:00
Marc T. Henry de Frahan
66ce93a2e3 Openfast version update (#48558) 2025-01-14 08:44:26 -07:00
AMD Toolchain Support
116ffe5809 ICON: Fix missing ocean variant logic handling (#48550)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2025-01-14 08:44:01 -07:00
Hans Fangohr
6b0ea2db1d octopus: add v15.1 (#48514) 2025-01-14 08:43:45 -07:00
Harmen Stoppels
d72b371c8a relocate_text: fix return value (#48568) 2025-01-14 13:29:06 +01:00
Mikael Simberg
aa88ced154 Set CMake policy CMP0074 to the new behaviour in hip package (#48549) 2025-01-14 04:48:09 -07:00
Todd Gamblin
d89ae7bcde Database: make constructor safe for read-only filesystems (#48454)
`spack find` and other commands that read the DB will fail if the database is in a
read-only location, because `FailureTracker` and `Database` can try to create their
parent directories when they are constructed, even if the DB root is read-only.

Instead, we should only write to the filesystem when needed.

- [x] don't create parent directories in constructors
- [x] have write operations create parents if needed
- [x] in tests, `chmod` databases to be read-only unless absolutely needed.

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-01-14 00:10:51 -08:00
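The pattern the commit describes can be sketched with a toy database class (illustrative only, not Spack's `Database`): the constructor never touches the filesystem, and only write operations create parent directories.

```python
# Sketch: defer directory creation from __init__ to write(), so a read-only
# DB root can still be opened and read.
import os
import tempfile


class TinyDB:
    def __init__(self, root: str):
        self.root = root  # no mkdir here: reads must work on read-only roots
        self.index = os.path.join(root, "index.json")

    def read(self) -> str:
        try:
            with open(self.index) as f:
                return f.read()
        except FileNotFoundError:
            return "{}"

    def write(self, data: str) -> None:
        os.makedirs(self.root, exist_ok=True)  # parents created only when needed
        with open(self.index, "w") as f:
            f.write(data)


root = os.path.join(tempfile.mkdtemp(), "db")
db = TinyDB(root)
assert db.read() == "{}"  # reading a missing DB needs no directory creation
db.write('{"specs": []}')
assert db.read() == '{"specs": []}'
```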
Harmen Stoppels
53d1665a8b traverse: use deque and popleft (#48353)
Avoids a quadratic time operation. In practice it is not really felt, because it has a tiny constant.
But maybe in very large DAGs it is.
2025-01-14 00:09:57 -08:00
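The change swaps `list.pop(0)` (which shifts every remaining element, O(n) per pop and quadratic overall) for `collections.deque.popleft()` (O(1)). A toy stand-in for the traversal pattern:

```python
# deque.popleft() is O(1); list.pop(0) would shift every element on each pop.
from collections import deque


def bfs_order(edges, start):
    """Yield nodes of a DAG in BFS order from `start` (toy spec-traversal stand-in)."""
    queue, seen = deque([start]), {start}
    while queue:
        node = queue.popleft()  # constant-time removal from the front
        yield node
        for child in edges.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)


dag = {"root": ["a", "b"], "a": ["c"], "b": ["c"]}
assert list(bfs_order(dag, "root")) == ["root", "a", "b", "c"]
```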
Harmen Stoppels
9fa1654102 builtin: add missing deps on gmake (#48545) 2025-01-13 22:51:20 -07:00
Julien Bigot
2c692a5755 doxygen: new versions 1.13 (#48541) 2025-01-13 21:29:40 -07:00
Hubertus van Dam
c0df012b18 py-mdi: new package (#47636)
* Adding MDI
* py-mdi: update package
* lammps: py-mdi needs to be usable at runtime
* py-mdi: update dependency constraints

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-13 20:54:06 -07:00
Cody Balos
66e8523e14 add long_spec property to get fully enumerated spec string (#48389) 2025-01-13 16:43:39 -08:00
Paul R. C. Kent
3932299768 Add v0.3.29 (#48536) 2025-01-13 16:36:29 -08:00
Buldram
3eba6b8379 qemacs, texi2html: new packages (#48537)
* qemacs, texi2html: new packages
* Make plugin support variant
2025-01-13 16:23:40 -08:00
Wouter Deconinck
369928200a py-histoprint: add v2.5.0, v2.6.0 (switch to hatchling) (#48452)
* py-histoprint: add v2.5.0, v2.6.0 (switch to hatchling)
* py-histoprint: add tag hep
2025-01-13 13:44:39 -08:00
Olivier Cessenat
81ed0f8d87 visit: external find requires import re (#48469) 2025-01-13 13:27:31 -07:00
Adam J. Stewart
194b6311e9 GDAL: add v3.10.1 (#48551) 2025-01-13 12:04:47 -08:00
Vicente Bolea
8420898f79 vtk: add v9.4.1 (#48325) 2025-01-13 11:49:41 -08:00
Tamara Dahlgren
f556ba46d9 bugfix/r: git property needs to be classproperty (#48228) 2025-01-13 12:14:26 -06:00
Hubertus van Dam
ddaa9d5d81 pace: new package (#47561)
* Adding the PACE library to the LAMMPS build
* Adding the PACE library for ML-PACE in LAMMPS

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-13 10:28:05 -07:00
Wouter Deconinck
b878fe5555 boost: add v1.87.0, update depdendents' constraints (#48475)
* boost: add 1.87.0

* boost: revert addition of parser variant

* precice: forward compatibility boost@:1.86 when @:3.1.2

* faodel: forward compatibility boost@:1.86 when @:1.2108.1

* boost: conflicts ~python when +mpi @1.87.0:
2025-01-13 09:19:30 -08:00
afzpatel
b600bfc779 hipsparselt, rocalution and omniperf: update hipsparselt and rocalution to 6.3.0 and fix omniperf typo (#48374)
* hipsparselt and omniperf: update hipsparselt to 6.3.0 and fix typo in omniperf

* fix style

* update rocalution to 6.3.0
2025-01-13 09:18:28 -08:00
Wouter Deconinck
612c289c41 dbus: add v1.16.0 (#48531) 2025-01-13 16:41:16 +01:00
Harmen Stoppels
e42c76cccf py-ipython: relax py-jedi forward compat (#48542) 2025-01-13 15:12:24 +01:00
Wouter Deconinck
25013bacf2 hep stack: add pandora PFA suite (#48497) 2025-01-13 13:41:01 +01:00
Carlos Bederián
3d554db198 hpl: build v2.3 with timers (#48206)
Co-authored-by: zzzoom <zzzoom@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-01-13 13:04:49 +01:00
Wouter Deconinck
b6def50dcb py-webcolors: add v24.8.0, v24.11.1 (switch to pdm-backend) (#48398) 2025-01-13 13:03:56 +01:00
Wouter Deconinck
bf591c96bd py-llvmlite: add v0.42.0, v0.43.0; py-numba: add v0.59.1, v0.60.0 (#48399) 2025-01-13 13:03:33 +01:00
Wouter Deconinck
edf1d2ec40 py-pre-commit: add v3.6.2, v3.7.1, v3.8.0, v4.0.1 (#48400) 2025-01-13 13:03:06 +01:00
Adam J. Stewart
07f607ec9f PyTorch: remove +onnx_ml variant (#48406) 2025-01-13 13:02:21 +01:00
Olivier Cessenat
93747c5e24 gxsview: making sure -lstdc++fs is added (#47703)
Co-authored-by: Olivier Cessenat <cessenat@jliana.magic>
2025-01-13 13:01:03 +01:00
Thomas Helfer
b746d4596a tfel: update package to transition from boost/python to pybind11 (#48237) 2025-01-13 12:59:22 +01:00
G-Ragghianti
8814705936 papi: fix erroneous patch application (#48428) 2025-01-13 12:56:28 +01:00
Harmen Stoppels
c989541ebc lmod: fix build on macOS (#48438) 2025-01-13 12:53:11 +01:00
Krishna Chilleri
1759ce05dd py-graphviz: add v0.20.3 (#48280) 2025-01-13 12:49:57 +01:00
Wouter Deconinck
c0c1a4aea1 podio: add v1.2; conflicts +rntuple ^root@6.32: when @:0.99 (#47991)
Co-authored-by: Thomas Madlener <thomas.madlener@desy.de>
2025-01-13 12:48:25 +01:00
Wouter Deconinck
53353ae64e hep stack: add scikit-hep packages (#48412) 2025-01-13 12:40:29 +01:00
Wouter Deconinck
62f7a4c9b1 py-wxpython: depends_on wxwidgets +gui (#48499) 2025-01-13 12:31:13 +01:00
Massimiliano Culpo
39679d0882 otf2: add a dependency on Fortran (#48506)
See https://gitlab.spack.io/spack/spack/-/jobs/14423941

From https://gitlab.com/score-p/scorep/-/blob/master/INSTALL?ref_type=heads

> Score-P requires a full compiler suite with language support for C99,
> C++11 and optionally Fortran 77 and Fortran 90.
2025-01-13 12:28:14 +01:00
Ufuk Turunçoğlu
50e6bf9979 esmf: add v8.8.0 (#48495) 2025-01-13 12:25:04 +01:00
Olivier Cessenat
b874c31cc8 keepassxc: adding versions 2.7.8 and 2.7.9 (#48518) 2025-01-13 12:22:55 +01:00
Adam J. Stewart
04baad90f5 py-lightning-uq-box: add v0.2.0 (#48305) 2025-01-13 12:22:04 +01:00
Adam J. Stewart
1022527923 py-ruff: add v0.9.1 (#48519) 2025-01-13 12:20:53 +01:00
Alberto Invernizzi
7ef19ec1d8 gnupg: bump versions and update libassuan requirement (#48520) 2025-01-13 12:19:38 +01:00
Wouter Deconinck
6e45b51f27 cepgen: add v1.2.5 (#48524) 2025-01-13 12:16:31 +01:00
Adam J. Stewart
5f9cd0991b py-timm: add v1.0.13 (#48526) 2025-01-13 12:15:19 +01:00
Wouter Deconinck
98c44fc351 hep stack: root +postgres (#48498) 2025-01-13 12:11:19 +01:00
Adam J. Stewart
b99f850c8e py-scipy: add v1.15.1 (#48523) 2025-01-13 12:07:30 +01:00
Adam J. Stewart
cbbd68d16b py-scikit-learn: add v1.6.1 (#48525) 2025-01-13 12:07:06 +01:00
Wouter Deconinck
e4fbf99497 dcap: depends_on openssl and zlib-api (#48529) 2025-01-13 12:06:29 +01:00
Wouter Deconinck
6a225d5405 dpmjet: add v19.3.6, v19.3.7; add to hep stack (#48530) 2025-01-13 12:05:45 +01:00
Wouter Deconinck
af9fd82476 acts: conflict ~svg ~json when +traccc (#48534) 2025-01-13 11:51:05 +01:00
Carson Woods
29c1152484 py-rich-click: added new versions and dependencies (#48270) 2025-01-13 11:48:10 +01:00
Wouter Deconinck
d6a8af6a1d py-asdf: depends_on py-setuptools-scm@8: (#48538) 2025-01-13 11:44:44 +01:00
Adam J. Stewart
3c3dad0a7a py-kornia: add v0.8.0 (#48522) 2025-01-13 11:43:33 +01:00
Wouter Deconinck
109efdff88 vbfnlo: deprecated 3.0.0beta* since 3.0 < 3.0.0beta (#48516) 2025-01-13 11:35:38 +01:00
Wouter Deconinck
fa318e2c92 lcio: add v2.22.3 (#48532) 2025-01-13 10:45:28 +01:00
Harmen Stoppels
064e70990d binary_distribution.py: stop relocating old /bin/bash sbang shebang (#48502) 2025-01-13 09:30:18 +01:00
Michael Kuhn
c40139b7d6 builtin: fix pkgconfig dependencies (#48533)
`pkgconfig` is the virtual dependency, `pkg-config` and `pkgconf` are
implementations.
2025-01-13 08:40:49 +01:00
Wouter Deconinck
c302e1a768 builtin: add undeclared gmake dependencies (#48535) 2025-01-13 08:38:37 +01:00
Harmen Stoppels
7171015f1c py-mypy: add v1.13.0 and v1.14.1 (#48508)
* py-setuptools: add v75.3.0, v75.8.0
* py-radical-utils: forward compat with setuptools
2025-01-12 17:18:09 +01:00
Massimiliano Culpo
8ab6f33eb6 Deprecate make, gmake and ninja globals in package API (#48450)
Instead, depends_on("gmake", type="build") should be used, which makes the global available through `setup_dependent_package`.
2025-01-11 22:43:22 +01:00
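A hedged sketch of the replacement pattern in a recipe, with placeholder package metadata (name, URL and checksum are illustrative, not a real package):

```python
from spack.package import *


class Example(Package):
    """Illustrative only: metadata below is a placeholder."""

    homepage = "https://example.com"
    url = "https://example.com/example-1.0.tar.gz"

    version("1.0", sha256="0" * 64)

    # replaces reliance on the deprecated implicit `gmake` global
    depends_on("gmake", type="build")

    def install(self, spec, prefix):
        # with the dependency declared, `make` is provided to this module
        # via setup_dependent_package, as the commit describes
        make("install", f"PREFIX={prefix}")
```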
John W. Parent
a66ab9cc6c CMake: add versions 3.31.3 and 3.31.4 (#48509) 2025-01-11 11:38:36 -07:00
435 changed files with 4636 additions and 2723 deletions


```diff
@@ -543,10 +543,10 @@ With either interpreter you can run a single command:
 .. code-block:: console

-   $ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
+   $ spack python -c 'from spack.concretize import concretize_one; concretize_one("python")'
    ...

-   $ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
+   $ spack python -i ipython -c 'from spack.concretize import concretize_one; concretize_one("python")'
    Out[1]: ...

 or a file:
```


```diff
@@ -456,14 +456,13 @@ For instance, the following config options,
    tcl:
      all:
        suffixes:
-         ^python@3: 'python{^python.version}'
+         ^python@3: 'python{^python.version.up_to_2}'
          ^openblas: 'openblas'

-will add a ``python-3.12.1`` version string to any packages compiled with
-Python matching the spec, ``python@3``. This is useful to know which
-version of Python a set of Python extensions is associated with. Likewise, the
-``openblas`` string is attached to any program that has openblas in the spec,
-most likely via the ``+blas`` variant specification.
+will add a ``python3.12`` to module names of packages compiled with Python 3.12, and similarly for
+all specs depending on ``python@3``. This is useful to know which version of Python a set of Python
+extensions is associated with. Likewise, the ``openblas`` string is attached to any program that
+has openblas in the spec, most likely via the ``+blas`` variant specification.

 The most heavyweight solution to module naming is to change the entire
 naming convention for module files. This uses the projections format
```


```diff
@@ -75,7 +75,6 @@
     "install_tree",
     "is_exe",
     "join_path",
-    "last_modification_time_recursive",
     "library_extensions",
     "mkdirp",
     "partition_path",
@@ -1470,15 +1469,36 @@ def set_executable(path):
 @system_path_filter
-def last_modification_time_recursive(path):
-    path = os.path.abspath(path)
-    times = [os.stat(path).st_mtime]
-    times.extend(
-        os.lstat(os.path.join(root, name)).st_mtime
-        for root, dirs, files in os.walk(path)
-        for name in dirs + files
-    )
-    return max(times)
+def recursive_mtime_greater_than(path: str, time: float) -> bool:
+    """Returns true if any file or dir recursively under `path` has mtime greater than `time`."""
+    # use bfs order to increase likelihood of early return
+    queue: Deque[str] = collections.deque([path])
+
+    if os.stat(path).st_mtime > time:
+        return True
+
+    while queue:
+        current = queue.popleft()
+        try:
+            entries = os.scandir(current)
+        except OSError:
+            continue
+        with entries:
+            for entry in entries:
+                try:
+                    st = entry.stat(follow_symlinks=False)
+                except OSError:
+                    continue
+                if st.st_mtime > time:
+                    return True
+                if entry.is_dir(follow_symlinks=False):
+                    queue.append(entry.path)
+    return False

 @system_path_filter
@@ -1740,8 +1760,7 @@ def find(
 def _log_file_access_issue(e: OSError, path: str) -> None:
-    errno_name = errno.errorcode.get(e.errno, "UNKNOWN")
-    tty.debug(f"find must skip {path}: {errno_name} {e}")
+    tty.debug(f"find must skip {path}: {e}")

 def _file_id(s: os.stat_result) -> Tuple[int, int]:
```
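A standalone version of the function shown in the hunk above (same logic, without Spack's `@system_path_filter` decorator), with a quick sanity check:

```python
# Self-contained sketch of recursive_mtime_greater_than: BFS over the tree,
# returning early as soon as any entry's mtime exceeds the threshold.
import collections
import os
import tempfile
import time
from typing import Deque


def recursive_mtime_greater_than(path: str, t: float) -> bool:
    """True if any file or dir recursively under `path` has mtime greater than `t`."""
    queue: Deque[str] = collections.deque([path])  # BFS for a likely early return
    if os.stat(path).st_mtime > t:
        return True
    while queue:
        current = queue.popleft()
        try:
            entries = os.scandir(current)
        except OSError:
            continue
        with entries:
            for entry in entries:
                try:
                    st = entry.stat(follow_symlinks=False)
                except OSError:
                    continue
                if st.st_mtime > t:
                    return True
                if entry.is_dir(follow_symlinks=False):
                    queue.append(entry.path)
    return False


root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
open(os.path.join(root, "sub", "f.txt"), "w").close()
assert recursive_mtime_greater_than(root, 0.0)  # everything is newer than the epoch
assert not recursive_mtime_greater_than(root, time.time() + 3600)  # nothing is from the future
```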


@@ -1356,14 +1356,8 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
     def _compare_extra_attribute(_expected, _detected, *, _spec):
         result = []
-        # Check items are of the same type
-        if not isinstance(_detected, type(_expected)):
-            _summary = f'{pkg_name}: error when trying to detect "{_expected}"'
-            _details = [f"{_detected} was detected instead"]
-            return [error_cls(summary=_summary, details=_details)]
-
         # If they are string expected is a regex
-        if isinstance(_expected, str):
+        if isinstance(_expected, str) and isinstance(_detected, str):
             try:
                 _regex = re.compile(_expected)
             except re.error:
@@ -1379,7 +1373,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
                 _details = [f"{_detected} does not match the regex"]
                 return [error_cls(summary=_summary, details=_details)]
-        if isinstance(_expected, dict):
+        elif isinstance(_expected, dict) and isinstance(_detected, dict):
             _not_detected = set(_expected.keys()) - set(_detected.keys())
             if _not_detected:
                 _summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@@ -1394,6 +1388,10 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
                 result.extend(
                     _compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
                 )
+        else:
+            _summary = f'{pkg_name}: error when trying to detect "{_expected}"'
+            _details = [f"{_detected} was detected instead"]
+            return [error_cls(summary=_summary, details=_details)]
         return result

View File

@@ -23,7 +23,7 @@
 import urllib.request
 import warnings
 from contextlib import closing
-from typing import IO, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union
+from typing import IO, Callable, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union

 import llnl.util.filesystem as fsys
 import llnl.util.lang
@@ -669,19 +669,24 @@ def sign_specfile(key: str, specfile_path: str) -> str:
 def _read_specs_and_push_index(
-    file_list, read_method, cache_prefix, db: BuildCacheDatabase, temp_dir, concurrency
+    file_list: List[str],
+    read_method: Callable,
+    cache_prefix: str,
+    db: BuildCacheDatabase,
+    temp_dir: str,
+    concurrency: int,
 ):
     """Read all the specs listed in the provided list, using thread given thread parallelism,
     generate the index, and push it to the mirror.

     Args:
-        file_list (list(str)): List of urls or file paths pointing at spec files to read
+        file_list: List of urls or file paths pointing at spec files to read
         read_method: A function taking a single argument, either a url or a file path,
             and which reads the spec file at that location, and returns the spec.
-        cache_prefix (str): prefix of the build cache on s3 where index should be pushed.
+        cache_prefix: prefix of the build cache on s3 where index should be pushed.
         db: A spack database used for adding specs and then writing the index.
-        temp_dir (str): Location to write index.json and hash for pushing
-        concurrency (int): Number of parallel processes to use when fetching
+        temp_dir: Location to write index.json and hash for pushing
+        concurrency: Number of parallel processes to use when fetching
     """
     for file in file_list:
         contents = read_method(file)
@@ -861,9 +866,12 @@ def _url_generate_package_index(url: str, tmpdir: str, concurrency: int = 32):
     tty.debug(f"Retrieving spec descriptor files from {url} to build index")

     db = BuildCacheDatabase(tmpdir)
+    db._write()

     try:
-        _read_specs_and_push_index(file_list, read_fn, url, db, db.database_directory, concurrency)
+        _read_specs_and_push_index(
+            file_list, read_fn, url, db, str(db.database_directory), concurrency
+        )
     except Exception as e:
         raise GenerateIndexError(f"Encountered problem pushing package index to {url}: {e}") from e
@@ -2125,10 +2133,9 @@ def fetch_url_to_mirror(url):
 def dedupe_hardlinks_if_necessary(root, buildinfo):
-    """Updates a buildinfo dict for old archives that did
-    not dedupe hardlinks. De-duping hardlinks is necessary
-    when relocating files in parallel and in-place. This
-    means we must preserve inodes when relocating."""
+    """Updates a buildinfo dict for old archives that did not dedupe hardlinks. De-duping hardlinks
+    is necessary when relocating files in parallel and in-place. This means we must preserve inodes
+    when relocating."""

     # New archives don't need this.
     if buildinfo.get("hardlinks_deduped", False):
@@ -2157,69 +2164,47 @@ def dedupe_hardlinks_if_necessary(root, buildinfo):
         buildinfo[key] = new_list


-def relocate_package(spec):
-    """
-    Relocate the given package
-    """
-    workdir = str(spec.prefix)
-    buildinfo = read_buildinfo_file(workdir)
-    new_layout_root = str(spack.store.STORE.layout.root)
-    new_prefix = str(spec.prefix)
-    new_rel_prefix = str(os.path.relpath(new_prefix, new_layout_root))
-    new_spack_prefix = str(spack.paths.prefix)
-
-    old_sbang_install_path = None
-    if "sbang_install_path" in buildinfo:
-        old_sbang_install_path = str(buildinfo["sbang_install_path"])
+def relocate_package(spec: spack.spec.Spec) -> None:
+    """Relocate binaries and text files in the given spec prefix, based on its buildinfo file."""
+    spec_prefix = str(spec.prefix)
+    buildinfo = read_buildinfo_file(spec_prefix)
     old_layout_root = str(buildinfo["buildpath"])
-    old_spack_prefix = str(buildinfo.get("spackprefix"))
-    old_rel_prefix = buildinfo.get("relative_prefix")
-    old_prefix = os.path.join(old_layout_root, old_rel_prefix)

-    # Warn about old style tarballs created with the now removed --rel flag.
+    # Warn about old style tarballs created with the --rel flag (removed in Spack v0.20)
     if buildinfo.get("relative_rpaths", False):
         tty.warn(
-            f"Tarball for {spec} uses relative rpaths, " "which can cause library loading issues."
+            f"Tarball for {spec} uses relative rpaths, which can cause library loading issues."
         )

-    # In the past prefix_to_hash was the default and externals were not dropped, so prefixes
-    # were not unique.
+    # In Spack 0.19 and older prefix_to_hash was the default and externals were not dropped, so
+    # prefixes were not unique.
     if "hash_to_prefix" in buildinfo:
         hash_to_old_prefix = buildinfo["hash_to_prefix"]
     elif "prefix_to_hash" in buildinfo:
-        hash_to_old_prefix = dict((v, k) for (k, v) in buildinfo["prefix_to_hash"].items())
+        hash_to_old_prefix = {v: k for (k, v) in buildinfo["prefix_to_hash"].items()}
     else:
-        hash_to_old_prefix = dict()
-
-    if old_rel_prefix != new_rel_prefix and not hash_to_old_prefix:
-        msg = "Package tarball was created from an install "
-        msg += "prefix with a different directory layout and an older "
-        msg += "buildcache create implementation. It cannot be relocated."
-        raise NewLayoutException(msg)
+        raise NewLayoutException(
+            "Package tarball was created from an install prefix with a different directory layout "
+            "and an older buildcache create implementation. It cannot be relocated."
+        )

-    # Spurious replacements (e.g. sbang) will cause issues with binaries
-    # For example, the new sbang can be longer than the old one.
-    # Hence 2 dictionaries are maintained here.
-    prefix_to_prefix_text = collections.OrderedDict()
-    prefix_to_prefix_bin = collections.OrderedDict()
+    prefix_to_prefix: Dict[str, str] = {}

-    if old_sbang_install_path:
-        install_path = spack.hooks.sbang.sbang_install_path()
-        prefix_to_prefix_text[old_sbang_install_path] = install_path
+    if "sbang_install_path" in buildinfo:
+        old_sbang_install_path = str(buildinfo["sbang_install_path"])
+        prefix_to_prefix[old_sbang_install_path] = spack.hooks.sbang.sbang_install_path()

-    # First match specific prefix paths. Possibly the *local* install prefix
-    # of some dependency is in an upstream, so we cannot assume the original
-    # spack store root can be mapped uniformly to the new spack store root.
-    #
-    # If the spec is spliced, we need to handle the simultaneous mapping
-    # from the old install_tree to the new install_tree and from the build_spec
-    # to the spliced spec.
-    # Because foo.build_spec is foo for any non-spliced spec, we can simplify
-    # by checking for spliced-in nodes by checking for nodes not in the build_spec
-    # without any explicit check for whether the spec is spliced.
-    # An analog in this algorithm is any spec that shares a name or provides the same virtuals
-    # in the context of the relevant root spec. This ensures that the analog for a spec s
-    # is the spec that s replaced when we spliced.
+    # First match specific prefix paths. Possibly the *local* install prefix of some dependency is
+    # in an upstream, so we cannot assume the original spack store root can be mapped uniformly to
+    # the new spack store root.
+    #
+    # If the spec is spliced, we need to handle the simultaneous mapping from the old install_tree
+    # to the new install_tree and from the build_spec to the spliced spec. Because foo.build_spec
+    # is foo for any non-spliced spec, we can simplify by checking for spliced-in nodes by checking
+    # for nodes not in the build_spec without any explicit check for whether the spec is spliced.
+    # An analog in this algorithm is any spec that shares a name or provides the same virtuals in
+    # the context of the relevant root spec. This ensures that the analog for a spec s is the spec
+    # that s replaced when we spliced.
     relocation_specs = specs_to_relocate(spec)
     build_spec_ids = set(id(s) for s in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD))
     for s in relocation_specs:
@@ -2239,72 +2224,48 @@ def relocate_package(spec):
         lookup_dag_hash = analog.dag_hash()
         if lookup_dag_hash in hash_to_old_prefix:
             old_dep_prefix = hash_to_old_prefix[lookup_dag_hash]
-            prefix_to_prefix_bin[old_dep_prefix] = str(s.prefix)
-            prefix_to_prefix_text[old_dep_prefix] = str(s.prefix)
+            prefix_to_prefix[old_dep_prefix] = str(s.prefix)

     # Only then add the generic fallback of install prefix -> install prefix.
-    prefix_to_prefix_text[old_prefix] = new_prefix
-    prefix_to_prefix_bin[old_prefix] = new_prefix
-    prefix_to_prefix_text[old_layout_root] = new_layout_root
-    prefix_to_prefix_bin[old_layout_root] = new_layout_root
-
-    # This is vestigial code for the *old* location of sbang. Previously,
-    # sbang was a bash script, and it lived in the spack prefix. It is
-    # now a POSIX script that lives in the install prefix. Old packages
-    # will have the old sbang location in their shebangs.
-    orig_sbang = "#!/bin/bash {0}/bin/sbang".format(old_spack_prefix)
-    new_sbang = spack.hooks.sbang.sbang_shebang_line()
-    prefix_to_prefix_text[orig_sbang] = new_sbang
+    prefix_to_prefix[old_layout_root] = str(spack.store.STORE.layout.root)

-    tty.debug("Relocating package from", "%s to %s." % (old_layout_root, new_layout_root))
+    # Delete identity mappings from prefix_to_prefix
+    prefix_to_prefix = {k: v for k, v in prefix_to_prefix.items() if k != v}
+
+    # If there's nothing to relocate, we're done.
+    if not prefix_to_prefix:
+        return
+
+    for old, new in prefix_to_prefix.items():
+        tty.debug(f"Relocating: {old} => {new}.")

     # Old archives may have hardlinks repeated.
-    dedupe_hardlinks_if_necessary(workdir, buildinfo)
+    dedupe_hardlinks_if_necessary(spec_prefix, buildinfo)

     # Text files containing the prefix text
-    text_names = [os.path.join(workdir, f) for f in buildinfo["relocate_textfiles"]]
-
-    # If we are not installing back to the same install tree do the relocation
-    if old_prefix != new_prefix:
-        files_to_relocate = [
-            os.path.join(workdir, filename) for filename in buildinfo.get("relocate_binaries")
-        ]
-        # If the buildcache was not created with relativized rpaths
-        # do the relocation of path in binaries
-        platform = spack.platforms.by_name(spec.platform)
-        if "macho" in platform.binary_formats:
-            relocate.relocate_macho_binaries(files_to_relocate, prefix_to_prefix_bin)
-        elif "elf" in platform.binary_formats:
-            # The new ELF dynamic section relocation logic only handles absolute to
-            # absolute relocation.
-            relocate.relocate_elf_binaries(files_to_relocate, prefix_to_prefix_bin)
-
-        # Relocate links to the new install prefix
-        links = [os.path.join(workdir, f) for f in buildinfo.get("relocate_links", [])]
-        relocate.relocate_links(links, prefix_to_prefix_bin)
-
-        # For all buildcaches
-        # relocate the install prefixes in text files including dependencies
-        relocate.relocate_text(text_names, prefix_to_prefix_text)
-
-        # relocate the install prefixes in binary files including dependencies
-        changed_files = relocate.relocate_text_bin(files_to_relocate, prefix_to_prefix_bin)
-
-        # Add ad-hoc signatures to patched macho files when on macOS.
-        if "macho" in platform.binary_formats and sys.platform == "darwin":
-            codesign = which("codesign")
-            if not codesign:
-                return
-            for binary in changed_files:
-                # preserve the original inode by running codesign on a copy
-                with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
-                    codesign("-fs-", tmp_binary)
-
-    # If we are installing back to the same location
-    # relocate the sbang location if the spack directory changed
-    else:
-        if old_spack_prefix != new_spack_prefix:
-            relocate.relocate_text(text_names, prefix_to_prefix_text)
+    textfiles = [os.path.join(spec_prefix, f) for f in buildinfo["relocate_textfiles"]]
+    binaries = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_binaries")]
+    links = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_links", [])]
+
+    platform = spack.platforms.by_name(spec.platform)
+    if "macho" in platform.binary_formats:
+        relocate.relocate_macho_binaries(binaries, prefix_to_prefix)
+    elif "elf" in platform.binary_formats:
+        relocate.relocate_elf_binaries(binaries, prefix_to_prefix)
+
+    relocate.relocate_links(links, prefix_to_prefix)
+    relocate.relocate_text(textfiles, prefix_to_prefix)
+    changed_files = relocate.relocate_text_bin(binaries, prefix_to_prefix)
+
+    # Add ad-hoc signatures to patched macho files when on macOS.
+    if "macho" in platform.binary_formats and sys.platform == "darwin":
+        codesign = which("codesign")
+        if not codesign:
+            return
+        for binary in changed_files:
+            # preserve the original inode by running codesign on a copy
+            with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
+                codesign("-fs-", tmp_binary)
 def _extract_inner_tarball(spec, filename, extract_to, signature_required: bool, remote_checksum):

View File

@@ -10,7 +10,9 @@
 import sys
 import sysconfig
 import warnings
-from typing import Dict, Optional, Sequence, Union
+from typing import Optional, Sequence, Union
+
+from typing_extensions import TypedDict

 import archspec.cpu
@@ -18,13 +20,17 @@
 from llnl.util import tty

 import spack.platforms
+import spack.spec
 import spack.store
 import spack.util.environment
 import spack.util.executable

 from .config import spec_for_current_python

-QueryInfo = Dict[str, "spack.spec.Spec"]
+
+class QueryInfo(TypedDict, total=False):
+    spec: spack.spec.Spec
+    command: spack.util.executable.Executable


 def _python_import(module: str) -> bool:
@@ -211,7 +217,9 @@ def _executables_in_store(
     ):
         spack.util.environment.path_put_first("PATH", [bin_dir])
         if query_info is not None:
-            query_info["command"] = spack.util.executable.which(*executables, path=bin_dir)
+            query_info["command"] = spack.util.executable.which(
+                *executables, path=bin_dir, required=True
+            )
             query_info["spec"] = concrete_spec
         return True
     return False
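The switch from a plain `Dict[str, "spack.spec.Spec"]` alias to a `TypedDict` lets a type checker see that the query dict can hold both a spec and a command, with distinct value types, and that either key may be absent until a search succeeds. A standalone sketch with string stand-ins for the Spack types (the diff imports `TypedDict` from `typing_extensions` for older Pythons; plain `typing` works on 3.8+):

```python
from typing import TypedDict


class QueryInfo(TypedDict, total=False):
    spec: str  # stand-in for spack.spec.Spec
    command: str  # stand-in for spack.util.executable.Executable


def executables_in_store(found: bool, query_info: QueryInfo) -> bool:
    """Fill in query_info only on success, mirroring _executables_in_store."""
    if found:
        query_info["command"] = "/spack/store/python/bin/python3"
        query_info["spec"] = "python@3.11"
    return found


info: QueryInfo = {}  # total=False makes the empty dict well-typed
```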

View File

@@ -34,8 +34,10 @@
 from llnl.util.lang import GroupedExceptionHandler

 import spack.binary_distribution
+import spack.concretize
 import spack.config
 import spack.detection
+import spack.error
 import spack.mirrors.mirror
 import spack.platforms
 import spack.spec
@@ -47,7 +49,13 @@
 import spack.version
 from spack.installer import PackageInstaller

-from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
+from ._common import (
+    QueryInfo,
+    _executables_in_store,
+    _python_import,
+    _root_spec,
+    _try_import_from_store,
+)
 from .clingo import ClingoBootstrapConcretizer
 from .config import spack_python_interpreter, spec_for_current_python
@@ -134,7 +142,7 @@ class BuildcacheBootstrapper(Bootstrapper):
     def __init__(self, conf) -> None:
         super().__init__(conf)
-        self.last_search: Optional[ConfigDictionary] = None
+        self.last_search: Optional[QueryInfo] = None
         self.config_scope_name = f"bootstrap_buildcache-{uuid.uuid4()}"

     @staticmethod
@@ -211,14 +219,14 @@ def _install_and_test(
             for _, pkg_hash, pkg_sha256 in item["binaries"]:
                 self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)

-            info: ConfigDictionary = {}
+            info: QueryInfo = {}
             if test_fn(query_spec=abstract_spec, query_info=info):
                 self.last_search = info
                 return True
         return False

     def try_import(self, module: str, abstract_spec_str: str) -> bool:
-        info: ConfigDictionary
+        info: QueryInfo
         test_fn, info = functools.partial(_try_import_from_store, module), {}
         if test_fn(query_spec=abstract_spec_str, query_info=info):
             return True
@@ -231,7 +239,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
         return self._install_and_test(abstract_spec, bincache_platform, data, test_fn)

     def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
-        info: ConfigDictionary
+        info: QueryInfo
         test_fn, info = functools.partial(_executables_in_store, executables), {}
         if test_fn(query_spec=abstract_spec_str, query_info=info):
             self.last_search = info
@@ -249,11 +257,11 @@ class SourceBootstrapper(Bootstrapper):
     def __init__(self, conf) -> None:
         super().__init__(conf)
-        self.last_search: Optional[ConfigDictionary] = None
+        self.last_search: Optional[QueryInfo] = None
         self.config_scope_name = f"bootstrap_source-{uuid.uuid4()}"

     def try_import(self, module: str, abstract_spec_str: str) -> bool:
-        info: ConfigDictionary = {}
+        info: QueryInfo = {}
         if _try_import_from_store(module, abstract_spec_str, query_info=info):
             self.last_search = info
             return True
@@ -270,10 +278,10 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
             bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
             concrete_spec = bootstrapper.concretize()
         else:
-            concrete_spec = spack.spec.Spec(
+            abstract_spec = spack.spec.Spec(
                 abstract_spec_str + " ^" + spec_for_current_python()
             )
-            concrete_spec.concretize()
+            concrete_spec = spack.concretize.concretize_one(abstract_spec)

         msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
         tty.debug(msg.format(module, abstract_spec_str))
@@ -288,7 +296,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
         return False

     def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
-        info: ConfigDictionary = {}
+        info: QueryInfo = {}
         if _executables_in_store(executables, abstract_spec_str, query_info=info):
             self.last_search = info
             return True
@@ -299,7 +307,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
         # might reduce compilation time by a fair amount
         _add_externals_if_missing()

-        concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
+        concrete_spec = spack.concretize.concretize_one(abstract_spec_str)
         msg = "[BOOTSTRAP] Try installing '{0}' from sources"
         tty.debug(msg.format(abstract_spec_str))
         with spack.config.override(self.mirror_scope):
@@ -316,11 +324,9 @@ def create_bootstrapper(conf: ConfigDictionary):
     return _bootstrap_methods[btype](conf)


-def source_is_enabled_or_raise(conf: ConfigDictionary):
-    """Raise ValueError if the source is not enabled for bootstrapping"""
-    trusted, name = spack.config.get("bootstrap:trusted"), conf["name"]
-    if not trusted.get(name, False):
-        raise ValueError("source is not trusted")
+def source_is_enabled(conf: ConfigDictionary) -> bool:
+    """Returns true if the source is enabled for bootstrapping"""
+    return spack.config.get("bootstrap:trusted").get(conf["name"], False)


 def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str] = None):
@@ -350,24 +356,23 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
     exception_handler = GroupedExceptionHandler()

     for current_config in bootstrapping_sources():
+        if not source_is_enabled(current_config):
+            continue
         with exception_handler.forward(current_config["name"], Exception):
-            source_is_enabled_or_raise(current_config)
-            current_bootstrapper = create_bootstrapper(current_config)
-            if current_bootstrapper.try_import(module, abstract_spec):
+            if create_bootstrapper(current_config).try_import(module, abstract_spec):
                 return

-    assert exception_handler, (
-        f"expected at least one exception to have been raised at this point: "
-        f"while bootstrapping {module}"
-    )
     msg = f'cannot bootstrap the "{module}" Python module '
     if abstract_spec:
         msg += f'from spec "{abstract_spec}" '
-    if tty.is_debug():
+    if not exception_handler:
+        msg += ": no bootstrapping sources are enabled"
+    elif spack.error.debug or spack.error.SHOW_BACKTRACE:
         msg += exception_handler.grouped_message(with_tracebacks=True)
     else:
         msg += exception_handler.grouped_message(with_tracebacks=False)
-    msg += "\nRun `spack --debug ...` for more detailed errors"
+        msg += "\nRun `spack --backtrace ...` for more detailed errors"
     raise ImportError(msg)
@@ -405,8 +410,9 @@ def ensure_executables_in_path_or_raise(
     exception_handler = GroupedExceptionHandler()

     for current_config in bootstrapping_sources():
+        if not source_is_enabled(current_config):
+            continue
         with exception_handler.forward(current_config["name"], Exception):
-            source_is_enabled_or_raise(current_config)
             current_bootstrapper = create_bootstrapper(current_config)
             if current_bootstrapper.try_search_path(executables, abstract_spec):
                 # Additional environment variables needed
@@ -414,6 +420,7 @@ def ensure_executables_in_path_or_raise(
                     current_bootstrapper.last_search["spec"],
                     current_bootstrapper.last_search["command"],
                 )
+                assert cmd is not None, "expected an Executable"
                 cmd.add_default_envmod(
                     spack.user_environment.environment_modifications_for_specs(
                         concrete_spec, set_package_py_globals=False
@@ -421,18 +428,17 @@ def ensure_executables_in_path_or_raise(
                     )
                 )
                 return cmd

-    assert exception_handler, (
-        f"expected at least one exception to have been raised at this point: "
-        f"while bootstrapping {executables_str}"
-    )
     msg = f"cannot bootstrap any of the {executables_str} executables "
     if abstract_spec:
         msg += f'from spec "{abstract_spec}" '
-    if tty.is_debug():
+    if not exception_handler:
+        msg += ": no bootstrapping sources are enabled"
+    elif spack.error.debug or spack.error.SHOW_BACKTRACE:
         msg += exception_handler.grouped_message(with_tracebacks=True)
     else:
         msg += exception_handler.grouped_message(with_tracebacks=False)
-    msg += "\nRun `spack --debug ...` for more detailed errors"
+        msg += "\nRun `spack --backtrace ...` for more detailed errors"
     raise RuntimeError(msg)

View File

@@ -63,7 +63,6 @@ def _missing(name: str, purpose: str, system_only: bool = True) -> str:
 def _core_requirements() -> List[RequiredResponseType]:
     _core_system_exes = {
-        "make": _missing("make", "required to build software from sources"),
         "patch": _missing("patch", "required to patch source code before building"),
         "tar": _missing("tar", "required to manage code archives"),
         "gzip": _missing("gzip", "required to compress/decompress code archives"),

View File

@@ -44,7 +44,19 @@
from enum import Flag, auto from enum import Flag, auto
from itertools import chain from itertools import chain
from multiprocessing.connection import Connection from multiprocessing.connection import Connection
from typing import Callable, Dict, List, Optional, Set, Tuple from typing import (
Callable,
Dict,
List,
Optional,
Sequence,
Set,
TextIO,
Tuple,
Type,
Union,
overload,
)
import archspec.cpu import archspec.cpu
@@ -146,48 +158,128 @@ def get_effective_jobs(jobs, parallel=True, supports_jobserver=False):
class MakeExecutable(Executable): class MakeExecutable(Executable):
"""Special callable executable object for make so the user can specify """Special callable executable object for make so the user can specify parallelism options
parallelism options on a per-invocation basis. Specifying on a per-invocation basis.
'parallel' to the call will override whatever the package's
global setting is, so you can either default to true or false and
override particular calls. Specifying 'jobs_env' to a particular
call will name an environment variable which will be set to the
parallelism level (without affecting the normal invocation with
-j).
""" """
def __init__(self, name, jobs, **kwargs): def __init__(self, name: str, *, jobs: int, supports_jobserver: bool = True) -> None:
supports_jobserver = kwargs.pop("supports_jobserver", True) super().__init__(name)
super().__init__(name, **kwargs)
self.supports_jobserver = supports_jobserver self.supports_jobserver = supports_jobserver
self.jobs = jobs self.jobs = jobs
def __call__(self, *args, **kwargs): @overload
"""parallel, and jobs_env from kwargs are swallowed and used here; def __call__(
remaining arguments are passed through to the superclass. self,
""" *args: str,
parallel = kwargs.pop("parallel", True) parallel: bool = ...,
jobs_env = kwargs.pop("jobs_env", None) jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver = kwargs.pop("jobs_env_supports_jobserver", False) jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str] = ...,
error: Union[Optional[TextIO], str] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> None: ...
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Type[str], Callable] = ...,
error: Union[Optional[TextIO], str, Type[str], Callable] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str, Type[str], Callable] = ...,
error: Union[Type[str], Callable] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
def __call__(
self,
*args: str,
parallel: bool = True,
jobs_env: Optional[str] = None,
jobs_env_supports_jobserver: bool = False,
**kwargs,
) -> Optional[str]:
"""Runs this "make" executable in a subprocess.
Args:
parallel: if False, parallelism is disabled
jobs_env: environment variable that will be set to the current level of parallelism
jobs_env_supports_jobserver: whether the jobs env supports a job server
For all the other **kwargs, refer to the base class.
"""
jobs = get_effective_jobs(
self.jobs, parallel=parallel, supports_jobserver=self.supports_jobserver
)
if jobs is not None:
args = (f"-j{jobs}",) + args
if jobs_env:
# Caller wants us to set an environment variable to control the parallelism
jobs_env_jobs = get_effective_jobs(
self.jobs, parallel=parallel, supports_jobserver=jobs_env_supports_jobserver
)
if jobs_env_jobs is not None:
extra_env = kwargs.setdefault("extra_env", {})
extra_env.update({jobs_env: str(jobs_env_jobs)})
return super().__call__(*args, **kwargs)
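The switch from assigning `kwargs["extra_env"]` to `kwargs.setdefault("extra_env", {})` merges the jobs variable into any `extra_env` mapping the caller already passed instead of silently replacing it. A minimal standalone sketch of that pattern (the helper name here is illustrative, not Spack API):

```python
def add_jobs_env(kwargs, jobs_env, jobs):
    # setdefault returns the existing "extra_env" dict if present,
    # otherwise inserts and returns a fresh one -- prior entries survive
    extra_env = kwargs.setdefault("extra_env", {})
    extra_env.update({jobs_env: str(jobs)})
    return kwargs

call_kwargs = {"extra_env": {"CC": "gcc"}}
add_jobs_env(call_kwargs, "MAKEFLAGS", 8)
```

With plain assignment the pre-existing `CC` entry would have been lost; with `setdefault` both keys end up in the merged mapping.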
class UndeclaredDependencyError(spack.error.SpackError):
"""Raised if a dependency is invoking an executable through a module global, without
declaring a dependency on it.
"""
class DeprecatedExecutable:
def __init__(self, pkg: str, exe: str, exe_pkg: str) -> None:
self.pkg = pkg
self.exe = exe
self.exe_pkg = exe_pkg
def __call__(self, *args, **kwargs):
raise UndeclaredDependencyError(
f"{self.pkg} is using {self.exe} without declaring a dependency on {self.exe_pkg}"
)
def add_default_env(self, key: str, value: str):
self.__call__()
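`DeprecatedExecutable` is a fail-loudly proxy: it sits where a real executable used to be injected into package modules, and any attempt to call it raises. A self-contained sketch of the same pattern, using a plain `RuntimeError` as a stand-in for `UndeclaredDependencyError`:

```python
class MissingDependencyProxy:
    """Stand-in placed where a real executable used to live; any call raises."""

    def __init__(self, pkg, exe, exe_pkg):
        self.pkg, self.exe, self.exe_pkg = pkg, exe, exe_pkg

    def __call__(self, *args, **kwargs):
        # Fail at use time, not at import time, with an actionable message
        raise RuntimeError(
            f"{self.pkg} is using {self.exe} without declaring "
            f"a dependency on {self.exe_pkg}"
        )

make = MissingDependencyProxy("mypkg", "make", "gmake")
try:
    make("-j4")
except RuntimeError as e:
    message = str(e)
```

Deferring the error to call time means packages that never invoke the tool are unaffected, while offenders get told exactly which dependency to declare.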
def clean_environment():
# Stuff in here sanitizes the build environment to eliminate
# anything the user has set that may interfere. We apply it immediately
@@ -621,10 +713,9 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
module.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
module.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
module.make = DeprecatedExecutable(pkg.name, "make", "gmake")
module.gmake = DeprecatedExecutable(pkg.name, "gmake", "gmake")
module.ninja = DeprecatedExecutable(pkg.name, "ninja", "ninja")
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own


@@ -356,6 +356,13 @@ def _do_patch_libtool_configure(self) -> None:
)
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
# Configure scripts generated with libtool < 2.5.4 have a faulty test for the
# -single_module linker flag. A deprecation warning makes it think the default is
# -multi_module, triggering it to use problematic linker flags (such as ld -r). The
# linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
# `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
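The `x.filter(..., string=True)` call above performs a literal (non-regex) substitution in the generated configure script. Its observable effect can be sketched with plain string replacement (the real `FileFilter` API lives in `llnl.util.filesystem`; this helper is illustrative only):

```python
def filter_literal(text, old, new):
    # string=True in FileFilter means: treat old/new as literal text
    return text.replace(old, new)

configure_text = (
    "lt_cv_apple_cc_single_mod=no\n"
    "lt_cv_ld_exported_symbols_list=yes\n"
)
patched = filter_literal(
    configure_text,
    "lt_cv_apple_cc_single_mod=no",
    "lt_cv_apple_cc_single_mod=yes",
)
```

Overriding the cached `lt_cv_apple_cc_single_mod` result skips libtool's faulty runtime probe entirely, which is why a one-line filter is enough.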
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:


@@ -293,6 +293,13 @@ def initconfig_hardware_entries(self):
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
if spec.satisfies("%gcc"):
entries.append(
cmake_cache_string(
"CMAKE_HIP_FLAGS", f"--gcc-toolchain={self.pkg.compiler.prefix}"
)
)
return entries
def std_initconfig_entries(self):
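The `cmake_cache_string` helper used above (defined elsewhere in Spack's cached-CMake support, not in this hunk) emits a `set(... CACHE STRING ...)` line for the initial cache file. A rough stand-in, assuming that output format:

```python
def cmake_cache_string(name, value, comment=""):
    # Approximates the entry CachedCMakeBuilder writes into the initial
    # cache; exact spacing/comment handling in Spack may differ.
    return f'set({name} "{value}" CACHE STRING "{comment}")'

entry = cmake_cache_string("CMAKE_HIP_FLAGS", "--gcc-toolchain=/opt/gcc")
```

So the new `%gcc` branch simply pins the HIP compiler to the host GCC's toolchain directory via one extra cache entry.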


@@ -27,6 +27,7 @@ class QMakePackage(spack.package_base.PackageBase):
build_system("qmake")
depends_on("qmake", type="build", when="build_system=qmake")
depends_on("gmake", type="build")
@spack.builder.builder("qmake")


@@ -94,7 +94,7 @@ def list_url(cls):
if cls.cran:
return f"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
@lang.classproperty
def git(cls):
if cls.bioc:
return f"https://git.bioconductor.org/packages/{cls.bioc}"
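The change swaps `@property` for `@lang.classproperty` so `git` can be read from the class itself, without an instance. A minimal descriptor sketch of what such a `classproperty` does (Spack's real implementation in `llnl.util.lang` may differ in detail):

```python
class classproperty:
    """Read-only property computed from the class, usable without an instance."""

    def __init__(self, fget):
        self.fget = fget

    def __get__(self, instance, owner):
        # Called on attribute access; owner is the class, even when
        # instance is None (i.e. access via the class itself)
        return self.fget(owner)

class RPackage:
    bioc = "BiocGenerics"  # illustrative package attribute

    @classproperty
    def git(cls):
        return f"https://git.bioconductor.org/packages/{cls.bioc}"

url = RPackage.git  # no instance needed
```

A plain `@property` accessed on the class returns the property object itself, which is why the descriptor form is needed here.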


@@ -140,7 +140,7 @@ class ROCmPackage(PackageBase):
when="+rocm",
)
depends_on("llvm-amdgpu", type="build", when="+rocm")
depends_on("hsa-rocr-dev", when="+rocm")
depends_on("hip +rocm", when="+rocm")


@@ -202,7 +202,7 @@ def _concretize_spec_pairs(
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or spack.concretize.concretize_one(abstract, tests=tests)]
# Special case if every spec is either concrete or has an abstract hash
if all(
@@ -254,9 +254,9 @@ def matching_spec_from_env(spec):
"""
env = ev.active_environment()
if env:
return env.matching_spec(spec) or spack.concretize.concretize_one(spec)
else:
return spack.concretize.concretize_one(spec)
def matching_specs_from_env(specs):
@@ -297,7 +297,7 @@ def disambiguate_spec(
def disambiguate_spec_from_hashes(
spec: spack.spec.Spec,
hashes: Optional[List[str]],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,


@@ -14,9 +14,9 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.concretize
import spack.config
import spack.mirrors.utils
import spack.stage
import spack.util.path
import spack.util.spack_yaml
@@ -397,7 +397,7 @@ def _mirror(args):
llnl.util.tty.msg(msg.format(spec_str, mirror_dir))
# Suppress tty from the call below for terser messages
llnl.util.tty.set_msg_enabled(False)
spec = spack.concretize.concretize_one(spec_str)
for node in spec.traverse():
spack.mirrors.utils.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)


@@ -16,6 +16,7 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.environment as ev
@@ -554,8 +555,7 @@ def check_fn(args: argparse.Namespace):
tty.msg("No specs provided, exiting.")
return
specs = [spack.concretize.concretize_one(s) for s in specs]
# Next see if there are any configured binary mirrors
configured_mirrors = spack.config.get("mirrors", scope=args.scope)
@@ -623,7 +623,7 @@ def save_specfile_fn(args):
root = specs[0]
if not root.concrete:
root = spack.concretize.concretize_one(root)
save_dependency_specfiles(
root, args.specfile_dir, dependencies=spack.cmd.parse_specs(args.specs)


@@ -18,6 +18,7 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.concretize
import spack.environment as ev
import spack.installer
import spack.store
@@ -103,7 +104,7 @@ def deprecate(parser, args):
)
if args.install:
deprecator = spack.concretize.concretize_one(specs[1])
else:
deprecator = spack.cmd.disambiguate_spec(specs[1], env, local=True)


@@ -10,6 +10,7 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments
import spack.concretize
import spack.config
import spack.repo
from spack.cmd.common import arguments
@@ -113,8 +114,8 @@ def dev_build(self, args):
source_path = os.path.abspath(source_path)
# Forces the build to run out of the source directory.
spec.constrain(f'dev_path="{source_path}"')
spec = spack.concretize.concretize_one(spec)
if spec.installed:
tty.error("Already installed in %s" % spec.prefix)


@@ -13,6 +13,7 @@
from llnl.util import lang, tty
import spack.cmd
import spack.concretize
import spack.config
import spack.environment as ev
import spack.paths
@@ -450,7 +451,7 @@ def concrete_specs_from_file(args):
else:
s = spack.spec.Spec.from_json(f)
concretized = spack.concretize.concretize_one(s)
if concretized.dag_hash() != s.dag_hash():
msg = 'skipped invalid file "{0}". '
msg += "The file does not contain a concrete spec."


@@ -7,9 +7,9 @@
from llnl.path import convert_to_posix_path
import spack.concretize
import spack.paths
import spack.util.executable
description = "generate Windows installer"
section = "admin"
@@ -65,8 +65,7 @@ def make_installer(parser, args):
"""
if sys.platform == "win32":
output_dir = args.output_dir
cmake_spec = spack.concretize.concretize_one("cmake")
cmake_path = os.path.join(cmake_spec.prefix, "bin", "cmake.exe")
cpack_path = os.path.join(cmake_spec.prefix, "bin", "cpack.exe")
spack_source = args.spack_source


@@ -492,7 +492,7 @@ def extend_with_additional_versions(specs, num_versions):
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
else:
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [spack.concretize.concretize_one(x) for x in mirror_specs]
return mirror_specs


@@ -37,13 +37,12 @@ def enable_compiler_existence_check():
SpecPairInput = Tuple[Spec, Optional[Spec]]
SpecPair = Tuple[Spec, Spec]
TestsType = Union[bool, Iterable[str]]
def _concretize_specs_together(
abstract_specs: Sequence[Spec], tests: TestsType = False
) -> List[Spec]:
"""Given a number of specs as input, tries to concretize them together.
Args:
@@ -51,11 +50,10 @@ def concretize_specs_together(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
from spack.solver.asp import Solver
allow_deprecated = spack.config.get("config:deprecated", False)
result = Solver().solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
return [s.copy() for s in result.specs]
@@ -72,7 +70,7 @@ def concretize_together(
"""
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
abstract_specs = [abstract for abstract, _ in spec_list]
concrete_specs = _concretize_specs_together(to_concretize, tests=tests)
return list(zip(abstract_specs, concrete_specs))
@@ -90,7 +88,7 @@ def concretize_together_when_possible(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
from spack.solver.asp import Solver
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
old_concrete_to_abstract = {
@@ -98,9 +96,8 @@ def concretize_together_when_possible(
}
result_by_user_spec = {}
allow_deprecated = spack.config.get("config:deprecated", False)
for result in Solver().solve_in_rounds(
to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
@@ -124,7 +121,7 @@ def concretize_separately(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
from spack.bootstrap import ensure_bootstrap_configuration, ensure_clingo_importable_or_raise
to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
args = [
@@ -134,8 +131,8 @@ def concretize_separately(
]
ret = [(i, abstract) for i, abstract in enumerate(to_concretize) if abstract.concrete]
# Ensure we don't try to bootstrap clingo in parallel
with ensure_bootstrap_configuration():
ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -190,10 +187,52 @@ def _concretize_task(packed_arguments: Tuple[int, str, TestsType]) -> Tuple[int,
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = concretize_one(Spec(spec_str), tests=tests)
return index, spec, time.time() - start
def concretize_one(spec: Union[str, Spec], tests: TestsType = False) -> Spec:
"""Return a concretized copy of the given spec.
Args:
tests: if False disregard 'test' dependencies, if a list of names activate them for
the packages in the list, if True activate 'test' dependencies for all packages.
"""
from spack.solver.asp import Solver, SpecBuilder
if isinstance(spec, str):
spec = Spec(spec)
spec = spec.lookup_hash()
if spec.concrete:
return spec.copy()
for node in spec.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
allow_deprecated = spack.config.get("config:deprecated", False)
result = Solver().solve([spec], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = spec.name
# TODO: Consolidate this code with similar code in solve.py
if spec.virtual:
providers = [s.name for s in answer.values() if s.package.provides(name)]
name = providers[0]
node = SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
return concretized
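The `min(result.answers)` line in `concretize_one` relies on Python's lexicographic tuple comparison: answers are `(optimization_cost, index, answer)` triples, so the lowest-cost answer wins, with the index as a tie-breaker. A self-contained illustration (the triples below are made-up data, not real solver output):

```python
# Tuples compare element by element: cost first, then index, then the rest.
answers = [
    ((3, 1), 2, "worse"),           # higher optimization cost
    ((1, 0), 0, "best"),            # lowest cost, lowest index
    ((1, 0), 1, "tie-broken-out"),  # same cost, loses the index tie-break
]
opt, i, answer = min(answers)
```

This is why no explicit sort key is needed: the tuple layout itself encodes "best answer first".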
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""


@@ -36,6 +36,8 @@
import sys
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
import jsonschema
from llnl.util import filesystem, lang, tty
import spack.error
@@ -1048,8 +1050,6 @@ def validate(
This leverages the line information (start_mark, end_mark) stored
on Spack YAML structures.
"""
try:
spack.schema.Validator(schema).validate(data)
except jsonschema.ValidationError as e:


@@ -6,6 +6,8 @@
"""
import warnings
import jsonschema
import spack.environment as ev
import spack.schema.env as env
import spack.util.spack_yaml as syaml
@@ -30,8 +32,6 @@ def validate(configuration_file):
Returns:
A sanitized copy of the configuration stored in the input file
"""
with open(configuration_file, encoding="utf-8") as f:
config = syaml.load(f)


@@ -9,6 +9,8 @@
from collections import namedtuple
from typing import Optional
import jsonschema
import spack.environment as ev
import spack.error
import spack.schema.env
@@ -188,8 +190,6 @@ def paths(self):
@tengine.context_property
def manifest(self):
"""The spack.yaml file that should be used in the image"""
# Copy in the part of spack.yaml prescribed in the configuration file
manifest = copy.deepcopy(self.config)
manifest.pop("container")


@@ -419,14 +419,25 @@ class FailureTracker:
the likelihood of collision very low with no cleanup required.
"""
#: root directory of the failure tracker
dir: pathlib.Path
#: File for locking particular concrete spec hashes
locker: SpecLocker
def __init__(self, root_dir: Union[str, pathlib.Path], default_timeout: Optional[float]):
#: Ensure a persistent location for dealing with parallel installation
#: failures (e.g., across near-concurrent processes).
self.dir = pathlib.Path(root_dir) / _DB_DIRNAME / "failures"
self.locker = SpecLocker(failures_lock_path(root_dir), default_timeout=default_timeout)
def _ensure_parent_directories(self) -> None:
"""Ensure that parent directories of the FailureTracker exist.
Accesses the filesystem only once, the first time it's called on a given FailureTracker.
"""
self.dir.mkdir(parents=True, exist_ok=True)
def clear(self, spec: "spack.spec.Spec", force: bool = False) -> None: def clear(self, spec: "spack.spec.Spec", force: bool = False) -> None:
"""Removes any persistent and cached failure tracking for the spec. """Removes any persistent and cached failure tracking for the spec.
@@ -469,13 +480,18 @@ def clear_all(self) -> None:
tty.debug("Removing prefix failure tracking files")
try:
marks = os.listdir(str(self.dir))
except FileNotFoundError:
return  # directory doesn't exist yet
except OSError as exc:
tty.warn(f"Unable to remove failure marking files: {str(exc)}")
return
for fail_mark in marks:
try:
(self.dir / fail_mark).unlink()
except OSError as exc:
tty.warn(f"Unable to remove failure marking file {fail_mark}: {str(exc)}")
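The reworked `clear_all` is the EAFP idiom: attempt the listing and treat a missing directory as "nothing to clean", rather than pre-checking existence (which would race with concurrent processes). A standalone sketch, with `tty.warn` replaced by silent best-effort handling:

```python
import os

def clear_all(directory):
    # EAFP: try the listing; a directory that was never created simply
    # means there are no failure marks to remove.
    try:
        marks = os.listdir(directory)
    except FileNotFoundError:
        return 0  # directory doesn't exist yet: nothing to do
    removed = 0
    for name in marks:
        try:
            os.unlink(os.path.join(directory, name))
            removed += 1
        except OSError:
            pass  # best-effort, mirroring the warn-and-continue behavior
    return removed
```

This pairs with the lazy `_ensure_parent_directories` change: since the directory may legitimately not exist yet, readers must tolerate its absence.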
def mark(self, spec: "spack.spec.Spec") -> lk.Lock:
"""Marks a spec as failing to install.
@@ -483,6 +499,8 @@ def mark(self, spec: "spack.spec.Spec") -> lk.Lock:
Args:
spec: spec that failed to install
"""
self._ensure_parent_directories()
# Dump the spec to the failure file for (manual) debugging purposes
path = self._path(spec)
path.write_text(spec.to_json())
@@ -567,17 +585,13 @@ def __init__(
Relevant only if the repository is not an upstream.
"""
self.root = root
self.database_directory = pathlib.Path(self.root) / _DB_DIRNAME
self.layout = layout
# Set up layout of database files within the db dir
self._index_path = self.database_directory / "index.json"
self._verifier_path = self.database_directory / "index_verifier"
self._lock_path = self.database_directory / "lock"
self.is_upstream = is_upstream
self.last_seen_verifier = ""
@@ -599,7 +613,7 @@ def __init__(
self.lock = ForbiddenLock()
else:
self.lock = lk.Lock(
str(self._lock_path),
default_timeout=self.db_lock_timeout,
desc="database",
enable=lock_cfg.enable,
@@ -616,6 +630,11 @@ def __init__(
self._write_transaction_impl = lk.WriteTransaction
self._read_transaction_impl = lk.ReadTransaction
def _ensure_parent_directories(self):
"""Create the parent directory for the DB, if necessary."""
if not self.is_upstream:
self.database_directory.mkdir(parents=True, exist_ok=True)
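`mkdir(parents=True, exist_ok=True)` is idempotent, which is what makes this lazy-creation refactor safe: `_ensure_parent_directories` can be called before every write instead of once in `__init__`, and repeated calls are no-ops. A quick self-contained check (directory names are illustrative):

```python
import pathlib
import tempfile

with tempfile.TemporaryDirectory() as root:
    db_dir = pathlib.Path(root) / ".spack-db" / "failures"
    db_dir.mkdir(parents=True, exist_ok=True)  # creates the whole chain
    db_dir.mkdir(parents=True, exist_ok=True)  # second call: silent no-op
    existed = db_dir.is_dir()
```

Without `exist_ok=True` the second call would raise `FileExistsError`; without `parents=True` the missing intermediate `.spack-db` directory would raise `FileNotFoundError`.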
def write_transaction(self):
"""Get a write lock context manager for use in a `with` block."""
return self._write_transaction_impl(self.lock, acquire=self._read, release=self._write)
@@ -630,6 +649,8 @@ def _write_to_file(self, stream):
This function does not do any locking or transactions.
"""
self._ensure_parent_directories()
# map from per-spec hash code to installation record.
installs = dict(
(k, v.to_dict(include_fields=self.record_fields)) for k, v in self._data.items()
@@ -759,7 +780,7 @@ def _read_from_file(self, filename):
Does not do any locking.
"""
try:
with open(str(filename), "r", encoding="utf-8") as f:
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
@@ -860,11 +881,13 @@ def reindex(self):
if self.is_upstream:
raise UpstreamDatabaseLockingError("Cannot reindex an upstream database")
self._ensure_parent_directories()
# Special transaction to avoid recursive reindex calls and to
# ignore errors if we need to rebuild a corrupt database.
def _read_suppress_error():
try:
if self._index_path.is_file():
self._read_from_file(self._index_path)
except CorruptDatabaseError as e:
tty.warn(f"Reindexing corrupt database, error was: {e}")
@@ -1007,7 +1030,7 @@ def _check_ref_counts(self):
% (key, found, expected, self._index_path)
)
def _write(self, type=None, value=None, traceback=None):
"""Write the in-memory database index to its file path.
This is a helper function called by the WriteTransaction context
@@ -1018,6 +1041,8 @@ def _write(self, type, value, traceback):
This routine does no locking.
"""
self._ensure_parent_directories()
# Do not write if exceptions were raised
if type is not None:
# A failure interrupted a transaction, so we should record that
@@ -1026,16 +1051,16 @@ def _write(self, type, value, traceback):
self._state_is_inconsistent = True
return
temp_file = str(self._index_path) + (".%s.%s.temp" % (_getfqdn(), os.getpid()))
# Write a temporary database file then move it into place
try:
with open(temp_file, "w", encoding="utf-8") as f:
self._write_to_file(f)
fs.rename(temp_file, str(self._index_path))
if _use_uuid:
with self._verifier_path.open("w", encoding="utf-8") as f:
new_verifier = str(uuid.uuid4())
f.write(new_verifier)
self.last_seen_verifier = new_verifier
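`_write` uses the classic write-then-rename pattern: the full index is written to a uniquely named temp file (hostname + PID) and then swapped into place, so concurrent readers never observe a half-written `index.json`. A minimal sketch using stdlib `os.replace` (Spack uses its own `fs.rename` helper; the function name here is illustrative):

```python
import os
import tempfile

def atomic_write(path, data):
    # Unique temp name per writer avoids clobbering between processes
    tmp = f"{path}.{os.getpid()}.temp"
    try:
        with open(tmp, "w", encoding="utf-8") as f:
            f.write(data)
        os.replace(tmp, path)  # atomically swap the finished file into place
    finally:
        if os.path.exists(tmp):
            os.unlink(tmp)  # clean up only if the swap never happened

with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "index.json")
    atomic_write(target, "{}")
    with open(target, encoding="utf-8") as f:
        content = f.read()
```

On POSIX, `os.replace` is an atomic rename within a filesystem, so readers see either the old index or the new one, never a partial file.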
@@ -1048,11 +1073,11 @@ def _write(self, type, value, traceback):
def _read(self):
"""Re-read Database from the data in the set location. This does no locking."""
if self._index_path.is_file():
current_verifier = ""
if _use_uuid:
try:
with self._verifier_path.open("r", encoding="utf-8") as f:
current_verifier = f.read()
except BaseException:
pass
@@ -1681,7 +1706,7 @@ def query(
) )
results = list(local_results) + list(x for x in upstream_results if x not in local_results) results = list(local_results) + list(x for x in upstream_results if x not in local_results)
results.sort() results.sort() # type: ignore[call-overload]
return results return results
def query_one( def query_one(
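The `_write` change keeps the existing write-temp-then-rename pattern while coercing `self._index_path` (now a `pathlib.Path`) back to `str` where string concatenation is needed. The pattern itself can be sketched standalone; `atomic_write` and the file names here are illustrative, not Spack API:

```python
import os
import tempfile

def atomic_write(path: str, data: str) -> None:
    # Write to a temp file in the same directory, then rename into place.
    # Readers either see the old index or the new one, never a torn write.
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory, suffix=".temp")
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            f.write(data)
        os.replace(tmp, path)  # atomic within one filesystem
    except BaseException:
        os.unlink(tmp)
        raise
```

Creating the temp file in the same directory as the target matters: `os.replace` is only atomic when source and destination live on the same filesystem.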


@@ -35,7 +35,6 @@
 import spack.config
 import spack.directory_layout
-import spack.paths
 import spack.projections
 import spack.relocate
 import spack.schema.projections
@@ -44,7 +43,6 @@
 import spack.util.spack_json as s_json
 import spack.util.spack_yaml as s_yaml
 from spack.error import SpackError
-from spack.hooks import sbang

 __all__ = ["FilesystemView", "YamlFilesystemView"]
@@ -91,16 +89,10 @@ def view_copy(
     if stat.S_ISLNK(src_stat.st_mode):
         spack.relocate.relocate_links(links=[dst], prefix_to_prefix=prefix_to_projection)
     elif spack.relocate.is_binary(dst):
-        spack.relocate.relocate_text_bin(binaries=[dst], prefixes=prefix_to_projection)
+        spack.relocate.relocate_text_bin(binaries=[dst], prefix_to_prefix=prefix_to_projection)
     else:
         prefix_to_projection[spack.store.STORE.layout.root] = view._root
-        # This is vestigial code for the *old* location of sbang.
-        prefix_to_projection[f"#!/bin/bash {spack.paths.spack_root}/bin/sbang"] = (
-            sbang.sbang_shebang_line()
-        )
-        spack.relocate.relocate_text(files=[dst], prefixes=prefix_to_projection)
+        spack.relocate.relocate_text(files=[dst], prefix_to_prefix=prefix_to_projection)

     # The os module on Windows does not have a chown function.
     if sys.platform != "win32":


@@ -275,7 +275,7 @@ def _do_fake_install(pkg: "spack.package_base.PackageBase") -> None:
     fs.mkdirp(pkg.prefix.bin)
     fs.touch(os.path.join(pkg.prefix.bin, command))
     if sys.platform != "win32":
-        chmod = which("chmod")
+        chmod = which("chmod", required=True)
         chmod("+x", os.path.join(pkg.prefix.bin, command))

     # Install fake header file
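`which("chmod", required=True)` makes the lookup fail fast instead of returning `None` and crashing later at the call site. A minimal sketch of that semantics using `shutil.which` (the helper name here is illustrative, not Spack's):

```python
import shutil

def which_required(name: str) -> str:
    # Raise immediately if the executable is missing, rather than letting a
    # later `None(...)` call produce a confusing TypeError.
    path = shutil.which(name)
    if path is None:
        raise RuntimeError(f"executable {name!r} not found in PATH")
    return path
```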


@@ -3,8 +3,6 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import llnl.util.lang
-
-import spack.util.spack_yaml as syaml

 @llnl.util.lang.lazy_lexicographic_ordering
 class OperatingSystem:
@@ -42,4 +40,4 @@ def _cmp_iter(self):
         yield self.version

     def to_dict(self):
-        return syaml.syaml_dict([("name", self.name), ("version", self.version)])
+        return {"name": self.name, "version": self.version}
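The `to_dict` change relies on plain `dict` preserving insertion order (guaranteed since Python 3.7), so the `syaml_dict` wrapper is no longer needed for stable serialization:

```python
import json

# Plain dicts keep insertion order, both for iteration and serialization,
# so an ordered-dict wrapper adds nothing here.
os_dict = {"name": "ubuntu", "version": "22.04"}

assert list(os_dict) == ["name", "version"]
assert json.dumps(os_dict) == '{"name": "ubuntu", "version": "22.04"}'
```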


@@ -106,8 +106,16 @@
 from spack.variant import any_combination_of, auto_or_any_combination_of, disjoint_sets
 from spack.version import Version, ver

-# These are just here for editor support; they will be replaced when the build env
-# is set up.
-make = MakeExecutable("make", jobs=1)
-ninja = MakeExecutable("ninja", jobs=1)
-configure = Executable(join_path(".", "configure"))
+# These are just here for editor support; they may be set when the build env is set up.
+configure: Executable
+make_jobs: int
+make: MakeExecutable
+ninja: MakeExecutable
+python_include: str
+python_platlib: str
+python_purelib: str
+python: Executable
+spack_cc: str
+spack_cxx: str
+spack_f77: str
+spack_fc: str


@@ -1099,14 +1099,14 @@ def update_external_dependencies(self, extendee_spec=None):
         """
         pass

-    def detect_dev_src_change(self):
+    def detect_dev_src_change(self) -> bool:
         """
         Method for checking for source code changes to trigger rebuild/reinstall
         """
         dev_path_var = self.spec.variants.get("dev_path", None)
         _, record = spack.store.STORE.db.query_by_spec_hash(self.spec.dag_hash())
-        mtime = fsys.last_modification_time_recursive(dev_path_var.value)
-        return mtime > record.installation_time
+        assert dev_path_var and record, "dev_path variant and record must be present"
+        return fsys.recursive_mtime_greater_than(dev_path_var.value, record.installation_time)

     def all_urls_for_version(self, version: StandardVersion) -> List[str]:
         """Return all URLs derived from version_urls(), url, urls, and
@@ -1819,12 +1819,6 @@ def _has_make_target(self, target):
         Returns:
             bool: True if 'target' is found, else False
         """
-        # Prevent altering LC_ALL for 'make' outside this function
-        make = copy.deepcopy(self.module.make)
-
-        # Use English locale for missing target message comparison
-        make.add_default_env("LC_ALL", "C")
-
         # Check if we have a Makefile
         for makefile in ["GNUmakefile", "Makefile", "makefile"]:
             if os.path.exists(makefile):
@@ -1833,6 +1827,12 @@ def _has_make_target(self, target):
             tty.debug("No Makefile found in the build directory")
             return False

+        # Prevent altering LC_ALL for 'make' outside this function
+        make = copy.deepcopy(self.module.make)
+
+        # Use English locale for missing target message comparison
+        make.add_default_env("LC_ALL", "C")
+
         # Check if 'target' is a valid target.
         #
         # `make -n target` performs a "dry run". It prints the commands that


@@ -6,8 +6,7 @@
 import os
 import re
 import sys
-from collections import OrderedDict
-from typing import List, Optional
+from typing import Dict, Iterable, List, Optional

 import macholib.mach_o
 import macholib.MachO
@@ -18,28 +17,11 @@
 from llnl.util.lang import memoized
 from llnl.util.symlink import readlink, symlink

-import spack.error
 import spack.store
 import spack.util.elf as elf
 import spack.util.executable as executable

-from .relocate_text import BinaryFilePrefixReplacer, TextFilePrefixReplacer
+from .relocate_text import BinaryFilePrefixReplacer, PrefixToPrefix, TextFilePrefixReplacer

-class InstallRootStringError(spack.error.SpackError):
-    def __init__(self, file_path, root_path):
-        """Signal that the relocated binary still has the original
-        Spack's store root string
-
-        Args:
-            file_path (str): path of the binary
-            root_path (str): original Spack's store root string
-        """
-        super().__init__(
-            "\n %s \ncontains string\n %s \n"
-            "after replacing it in rpaths.\n"
-            "Package should not be relocated.\n Use -a to override." % (file_path, root_path)
-        )

 @memoized
@@ -58,7 +40,7 @@ def _decode_macho_data(bytestring):
     return bytestring.rstrip(b"\x00").decode("ascii")

-def macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
+def _macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
     """
     Inputs
     original rpaths from mach-o binaries
@@ -103,7 +85,7 @@ def macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
     return paths_to_paths

-def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
+def _modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
     """
     This function is used to make machO buildcaches on macOS by
     replacing old paths with new paths using install_name_tool
@@ -146,7 +128,7 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
     install_name_tool(*args, temp_path)

-def macholib_get_paths(cur_path):
+def _macholib_get_paths(cur_path):
     """Get rpaths, dependent libraries, and library id of mach-o objects."""
     headers = []
     try:
@@ -228,25 +210,25 @@ def relocate_macho_binaries(path_names, prefix_to_prefix):
         if path_name.endswith(".o"):
             continue
         # get the paths in the old prefix
-        rpaths, deps, idpath = macholib_get_paths(path_name)
+        rpaths, deps, idpath = _macholib_get_paths(path_name)
         # get the mapping of paths in the old prerix to the new prefix
-        paths_to_paths = macho_find_paths(rpaths, deps, idpath, prefix_to_prefix)
+        paths_to_paths = _macho_find_paths(rpaths, deps, idpath, prefix_to_prefix)
         # replace the old paths with new paths
-        modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
+        _modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)

-def relocate_elf_binaries(binaries, prefix_to_prefix):
-    """Take a list of binaries, and an ordered dictionary of
-    prefix to prefix mapping, and update the rpaths accordingly."""
+def relocate_elf_binaries(binaries: Iterable[str], prefix_to_prefix: Dict[str, str]) -> None:
+    """Take a list of binaries, and an ordered prefix to prefix mapping, and update the rpaths
+    accordingly."""
     # Transform to binary string
-    prefix_to_prefix = OrderedDict(
-        (k.encode("utf-8"), v.encode("utf-8")) for (k, v) in prefix_to_prefix.items()
-    )
+    prefix_to_prefix_bin = {
+        k.encode("utf-8"): v.encode("utf-8") for k, v in prefix_to_prefix.items()
+    }

     for path in binaries:
         try:
-            elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix)
+            elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix_bin)
         except elf.ElfCStringUpdatesFailed as e:
             # Fall back to `patchelf --set-rpath ... --set-interpreter ...`
             rpaths = e.rpath.new_value.decode("utf-8").split(":") if e.rpath else []
@@ -254,13 +236,13 @@ def relocate_elf_binaries(binaries, prefix_to_prefix):
             _set_elf_rpaths_and_interpreter(path, rpaths=rpaths, interpreter=interpreter)

-def warn_if_link_cant_be_relocated(link, target):
+def _warn_if_link_cant_be_relocated(link: str, target: str):
     if not os.path.isabs(target):
         return
-    tty.warn('Symbolic link at "{}" to "{}" cannot be relocated'.format(link, target))
+    tty.warn(f'Symbolic link at "{link}" to "{target}" cannot be relocated')

-def relocate_links(links, prefix_to_prefix):
+def relocate_links(links: Iterable[str], prefix_to_prefix: Dict[str, str]) -> None:
     """Relocate links to a new install prefix."""
     regex = re.compile("|".join(re.escape(p) for p in prefix_to_prefix.keys()))
     for link in links:
@@ -269,7 +251,7 @@ def relocate_links(links, prefix_to_prefix):
         # No match.
         if match is None:
-            warn_if_link_cant_be_relocated(link, old_target)
+            _warn_if_link_cant_be_relocated(link, old_target)
             continue

         new_target = prefix_to_prefix[match.group()] + old_target[match.end() :]
@@ -277,32 +259,32 @@ def relocate_links(links, prefix_to_prefix):
         symlink(new_target, link)

-def relocate_text(files, prefixes):
+def relocate_text(files: Iterable[str], prefix_to_prefix: PrefixToPrefix) -> None:
     """Relocate text file from the original installation prefix to the
     new prefix.

     Relocation also affects the the path in Spack's sbang script.

     Args:
-        files (list): Text files to be relocated
-        prefixes (OrderedDict): String prefixes which need to be changed
+        files: Text files to be relocated
+        prefix_to_prefix: ordered prefix to prefix mapping
     """
-    TextFilePrefixReplacer.from_strings_or_bytes(prefixes).apply(files)
+    TextFilePrefixReplacer.from_strings_or_bytes(prefix_to_prefix).apply(files)

-def relocate_text_bin(binaries, prefixes):
+def relocate_text_bin(binaries: Iterable[str], prefix_to_prefix: PrefixToPrefix) -> List[str]:
     """Replace null terminated path strings hard-coded into binaries.

     The new install prefix must be shorter than the original one.

     Args:
-        binaries (list): binaries to be relocated
-        prefixes (OrderedDict): String prefixes which need to be changed.
+        binaries: paths to binaries to be relocated
+        prefix_to_prefix: ordered prefix to prefix mapping

     Raises:
         spack.relocate_text.BinaryTextReplaceError: when the new path is longer than the old path
     """
-    return BinaryFilePrefixReplacer.from_strings_or_bytes(prefixes).apply(binaries)
+    return BinaryFilePrefixReplacer.from_strings_or_bytes(prefix_to_prefix).apply(binaries)

 def is_macho_magic(magic: bytes) -> bool:
@@ -339,7 +321,7 @@ def _exists_dir(dirname):
     return os.path.isdir(dirname)

-def is_macho_binary(path):
+def is_macho_binary(path: str) -> bool:
     try:
         with open(path, "rb") as f:
             return is_macho_magic(f.read(4))
@@ -363,7 +345,7 @@ def fixup_macos_rpath(root, filename):
         return False

     # Get Mach-O header commands
-    (rpath_list, deps, id_dylib) = macholib_get_paths(abspath)
+    (rpath_list, deps, id_dylib) = _macholib_get_paths(abspath)

     # Convert rpaths list to (name -> number of occurrences)
     add_rpaths = set()
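The `relocate_links` logic above joins all old prefixes into one alternation regex and rewrites each symlink target. Because Python's `re` tries alternatives left to right, the ordered mapping guarantees that `/old/sub` wins over `/old`. A self-contained sketch of the retargeting step (the paths and helper name are made up):

```python
import re

prefix_to_prefix = {"/old/sub": "/a", "/old": "/b"}
regex = re.compile("|".join(re.escape(p) for p in prefix_to_prefix))

def retarget(old_target: str) -> str:
    # Rewrite an absolute symlink target; longer prefixes are listed first
    # in the mapping, so they match before their own sub-prefixes.
    match = regex.match(old_target)
    if match is None:
        return old_target  # target outside any known prefix: can't relocate
    return prefix_to_prefix[match.group()] + old_target[match.end():]
```

For example, `retarget("/old/sub/lib/libz.so")` maps through `/old/sub`, not `/old`.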


@@ -6,64 +6,61 @@
 paths inside text files and binaries."""

 import re
-from collections import OrderedDict
-from typing import Dict, Union
+from typing import IO, Dict, Iterable, List, Union

+from llnl.util.lang import PatternBytes

 import spack.error

 Prefix = Union[str, bytes]
+PrefixToPrefix = Union[Dict[str, str], Dict[bytes, bytes]]

 def encode_path(p: Prefix) -> bytes:
     return p if isinstance(p, bytes) else p.encode("utf-8")

-def _prefix_to_prefix_as_bytes(prefix_to_prefix) -> Dict[bytes, bytes]:
-    return OrderedDict((encode_path(k), encode_path(v)) for (k, v) in prefix_to_prefix.items())
+def _prefix_to_prefix_as_bytes(prefix_to_prefix: PrefixToPrefix) -> Dict[bytes, bytes]:
+    return {encode_path(k): encode_path(v) for (k, v) in prefix_to_prefix.items()}

-def utf8_path_to_binary_regex(prefix: str):
+def utf8_path_to_binary_regex(prefix: str) -> PatternBytes:
     """Create a binary regex that matches the input path in utf8"""
     prefix_bytes = re.escape(prefix).encode("utf-8")
     return re.compile(b"(?<![\\w\\-_/])([\\w\\-_]*?)%s([\\w\\-_/]*)" % prefix_bytes)

-def _byte_strings_to_single_binary_regex(prefixes):
+def _byte_strings_to_single_binary_regex(prefixes: Iterable[bytes]) -> PatternBytes:
     all_prefixes = b"|".join(re.escape(p) for p in prefixes)
     return re.compile(b"(?<![\\w\\-_/])([\\w\\-_]*?)(%s)([\\w\\-_/]*)" % all_prefixes)

-def utf8_paths_to_single_binary_regex(prefixes):
+def utf8_paths_to_single_binary_regex(prefixes: Iterable[str]) -> PatternBytes:
     """Create a (binary) regex that matches any input path in utf8"""
     return _byte_strings_to_single_binary_regex(p.encode("utf-8") for p in prefixes)
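The combined binary regex captures any identifier characters glued onto the front of the prefix (group 1), the matched prefix itself (group 2), and the path remainder (group 3), while the negative lookbehind prevents matching in the middle of a longer path component. Under the same construction (the store path below is made up):

```python
import re

def utf8_paths_to_single_binary_regex(prefixes):
    # One alternation regex for all prefixes, as in relocate_text.py.
    all_prefixes = b"|".join(re.escape(p.encode("utf-8")) for p in prefixes)
    return re.compile(b"(?<![\\w\\-_/])([\\w\\-_]*?)(%s)([\\w\\-_/]*)" % all_prefixes)

regex = utf8_paths_to_single_binary_regex(["/spack/store/zlib"])
m = regex.search(b"prefix=/spack/store/zlib/lib\x00")
assert m is not None
assert m.group(2) == b"/spack/store/zlib"  # the prefix to replace
assert m.group(3) == b"/lib"               # the preserved path tail
```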
-def filter_identity_mappings(prefix_to_prefix):
+def filter_identity_mappings(prefix_to_prefix: Dict[bytes, bytes]) -> Dict[bytes, bytes]:
     """Drop mappings that are not changed."""
     # NOTE: we don't guard against the following case:
     # [/abc/def -> /abc/def, /abc -> /x] *will* be simplified to
     # [/abc -> /x], meaning that after this simplification /abc/def will be
     # mapped to /x/def instead of /abc/def. This should not be a problem.
-    return OrderedDict((k, v) for (k, v) in prefix_to_prefix.items() if k != v)
+    return {k: v for k, v in prefix_to_prefix.items() if k != v}

 class PrefixReplacer:
-    """Base class for applying a prefix to prefix map
-    to a list of binaries or text files.
-
-    Child classes implement _apply_to_file to do the
-    actual work, which is different when it comes to
-    binaries and text files."""
+    """Base class for applying a prefix to prefix map to a list of binaries or text files. Derived
+    classes implement _apply_to_file to do the actual work, which is different when it comes to
+    binaries and text files."""

-    def __init__(self, prefix_to_prefix: Dict[bytes, bytes]):
+    def __init__(self, prefix_to_prefix: Dict[bytes, bytes]) -> None:
         """
         Arguments:
-            prefix_to_prefix (OrderedDict):
-
-                A ordered mapping from prefix to prefix. The order is
-                relevant to support substring fallbacks, for example
-                [("/first/sub", "/x"), ("/first", "/y")] will ensure
-                /first/sub is matched and replaced before /first.
+            prefix_to_prefix: An ordered mapping from prefix to prefix. The order is relevant to
+                support substring fallbacks, for example
+                ``[("/first/sub", "/x"), ("/first", "/y")]`` will ensure /first/sub is matched and
+                replaced before /first.
         """
         self.prefix_to_prefix = filter_identity_mappings(prefix_to_prefix)
@@ -74,7 +71,7 @@ def is_noop(self) -> bool:
         or there are no prefixes to replace."""
         return not self.prefix_to_prefix

-    def apply(self, filenames: list):
+    def apply(self, filenames: Iterable[str]) -> List[str]:
         """Returns a list of files that were modified"""
         changed_files = []
         if self.is_noop:
@@ -84,17 +81,20 @@ def apply(self, filenames: list):
                 changed_files.append(filename)
         return changed_files

-    def apply_to_filename(self, filename):
+    def apply_to_filename(self, filename: str) -> bool:
         if self.is_noop:
             return False
         with open(filename, "rb+") as f:
             return self.apply_to_file(f)

-    def apply_to_file(self, f):
+    def apply_to_file(self, f: IO[bytes]) -> bool:
         if self.is_noop:
             return False
         return self._apply_to_file(f)

+    def _apply_to_file(self, f: IO) -> bool:
+        raise NotImplementedError("Derived classes must implement this method")

 class TextFilePrefixReplacer(PrefixReplacer):
     """This class applies prefix to prefix mappings for relocation
@@ -112,13 +112,11 @@ def __init__(self, prefix_to_prefix: Dict[bytes, bytes]):
         self.regex = _byte_strings_to_single_binary_regex(self.prefix_to_prefix.keys())

     @classmethod
-    def from_strings_or_bytes(
-        cls, prefix_to_prefix: Dict[Prefix, Prefix]
-    ) -> "TextFilePrefixReplacer":
+    def from_strings_or_bytes(cls, prefix_to_prefix: PrefixToPrefix) -> "TextFilePrefixReplacer":
         """Create a TextFilePrefixReplacer from an ordered prefix to prefix map."""
         return cls(_prefix_to_prefix_as_bytes(prefix_to_prefix))

-    def _apply_to_file(self, f):
+    def _apply_to_file(self, f: IO) -> bool:
         """Text replacement implementation simply reads the entire file
         in memory and applies the combined regex."""
         replacement = lambda m: m.group(1) + self.prefix_to_prefix[m.group(2)] + m.group(3)
@@ -133,12 +131,12 @@ def _apply_to_file(self, f):

 class BinaryFilePrefixReplacer(PrefixReplacer):
-    def __init__(self, prefix_to_prefix, suffix_safety_size=7):
+    def __init__(self, prefix_to_prefix: Dict[bytes, bytes], suffix_safety_size: int = 7) -> None:
         """
-        prefix_to_prefix (OrderedDict): OrderedDictionary where the keys are
-            bytes representing the old prefixes and the values are the new
-        suffix_safety_size (int): in case of null terminated strings, what size
-            of the suffix should remain to avoid aliasing issues?
+        prefix_to_prefix: Ordered dictionary where the keys are bytes representing the old prefixes
+            and the values are the new
+        suffix_safety_size: in case of null terminated strings, what size of the suffix should
+            remain to avoid aliasing issues?
         """
         assert suffix_safety_size >= 0
         super().__init__(prefix_to_prefix)
@@ -146,17 +144,18 @@ def __init__(self, prefix_to_prefix, suffix_safety_size=7):
         self.regex = self.binary_text_regex(self.prefix_to_prefix.keys(), suffix_safety_size)

     @classmethod
-    def binary_text_regex(cls, binary_prefixes, suffix_safety_size=7):
-        """
-        Create a regex that looks for exact matches of prefixes, and also tries to
-        match a C-string type null terminator in a small lookahead window.
+    def binary_text_regex(
+        cls, binary_prefixes: Iterable[bytes], suffix_safety_size: int = 7
+    ) -> PatternBytes:
+        """Create a regex that looks for exact matches of prefixes, and also tries to match a
+        C-string type null terminator in a small lookahead window.

         Arguments:
-            binary_prefixes (list): List of byte strings of prefixes to match
-            suffix_safety_size (int): Sizeof the lookahed for null-terminated string.
-
-        Returns: compiled regex
+            binary_prefixes: Iterable of byte strings of prefixes to match
+            suffix_safety_size: Sizeof the lookahed for null-terminated string.
         """
+        # Note: it's important not to use capture groups for the prefix, since it destroys
+        # performance due to common prefix optimization.
         return re.compile(
             b"("
             + b"|".join(re.escape(p) for p in binary_prefixes)
@@ -165,36 +164,34 @@ def binary_text_regex(cls, binary_prefixes, suffix_safety_size=7):

     @classmethod
     def from_strings_or_bytes(
-        cls, prefix_to_prefix: Dict[Prefix, Prefix], suffix_safety_size: int = 7
+        cls, prefix_to_prefix: PrefixToPrefix, suffix_safety_size: int = 7
     ) -> "BinaryFilePrefixReplacer":
         """Create a BinaryFilePrefixReplacer from an ordered prefix to prefix map.

         Arguments:
-            prefix_to_prefix (OrderedDict): Ordered mapping of prefix to prefix.
-            suffix_safety_size (int): Number of bytes to retain at the end of a C-string
-                to avoid binary string-aliasing issues.
+            prefix_to_prefix: Ordered mapping of prefix to prefix.
+            suffix_safety_size: Number of bytes to retain at the end of a C-string to avoid binary
+                string-aliasing issues.
         """
         return cls(_prefix_to_prefix_as_bytes(prefix_to_prefix), suffix_safety_size)

-    def _apply_to_file(self, f):
-        """
-        Given a file opened in rb+ mode, apply the string replacements as
-        specified by an ordered dictionary of prefix to prefix mappings. This
-        method takes special care of null-terminated C-strings. C-string constants
-        are problematic because compilers and linkers optimize readonly strings for
-        space by aliasing those that share a common suffix (only suffix since all
-        of them are null terminated). See https://github.com/spack/spack/pull/31739
-        and https://github.com/spack/spack/pull/32253 for details. Our logic matches
-        the original prefix with a ``suffix_safety_size + 1`` lookahead for null bytes.
-        If no null terminator is found, we simply pad with leading /, assuming that
-        it's a long C-string; the full C-string after replacement has a large suffix
-        in common with its original value.
-        If there *is* a null terminator we can do the same as long as the replacement
-        has a sufficiently long common suffix with the original prefix.
-        As a last resort when the replacement does not have a long enough common suffix,
-        we can try to shorten the string, but this only works if the new length is
-        sufficiently short (typically the case when going from large padding -> normal path)
-        If the replacement string is longer, or all of the above fails, we error out.
+    def _apply_to_file(self, f: IO[bytes]) -> bool:
+        """Given a file opened in rb+ mode, apply the string replacements as specified by an
+        ordered dictionary of prefix to prefix mappings. This method takes special care of
+        null-terminated C-strings. C-string constants are problematic because compilers and
+        linkers optimize readonly strings for space by aliasing those that share a common suffix
+        (only suffix since all of them are null terminated). See
+        https://github.com/spack/spack/pull/31739 and https://github.com/spack/spack/pull/32253
+        for details. Our logic matches the original prefix with a ``suffix_safety_size + 1``
+        lookahead for null bytes. If no null terminator is found, we simply pad with leading /,
+        assuming that it's a long C-string; the full C-string after replacement has a large suffix
+        in common with its original value. If there *is* a null terminator we can do the same as
+        long as the replacement has a sufficiently long common suffix with the original prefix.
+        As a last resort when the replacement does not have a long enough common suffix, we can
+        try to shorten the string, but this only works if the new length is sufficiently short
+        (typically the case when going from large padding -> normal path) If the replacement
+        string is longer, or all of the above fails, we error out.

         Arguments:
             f: file opened in rb+ mode
@@ -204,11 +201,10 @@ def _apply_to_file(self, f):
         """
         assert f.tell() == 0

-        # We *could* read binary data in chunks to avoid loading all in memory,
-        # but it's nasty to deal with matches across boundaries, so let's stick to
-        # something simple.
+        # We *could* read binary data in chunks to avoid loading all in memory, but it's nasty to
+        # deal with matches across boundaries, so let's stick to something simple.

-        modified = True
+        modified = False

         for match in self.regex.finditer(f.read()):
             # The matching prefix (old) and its replacement (new)
@@ -218,8 +214,7 @@ def _apply_to_file(self, f):
             # Did we find a trailing null within a N + 1 bytes window after the prefix?
             null_terminated = match.end(0) > match.end(1)

-            # Suffix string length, excluding the null byte
-            # Only makes sense if null_terminated
+            # Suffix string length, excluding the null byte. Only makes sense if null_terminated
             suffix_strlen = match.end(0) - match.end(1) - 1

             # How many bytes are we shrinking our string?
@@ -229,9 +224,9 @@ def _apply_to_file(self, f):
             if bytes_shorter < 0:
                 raise CannotGrowString(old, new)

-            # If we don't know whether this is a null terminated C-string (we're looking
-            # only N + 1 bytes ahead), or if it is and we have a common suffix, we can
-            # simply pad with leading dir separators.
+            # If we don't know whether this is a null terminated C-string (we're looking only N + 1
+            # bytes ahead), or if it is and we have a common suffix, we can simply pad with leading
+            # dir separators.
             elif (
                 not null_terminated
                 or suffix_strlen >= self.suffix_safety_size  # == is enough, but let's be defensive
@@ -240,9 +235,9 @@ def _apply_to_file(self, f):
             ):
                 replacement = b"/" * bytes_shorter + new

-            # If it *was* null terminated, all that matters is that we can leave N bytes
-            # of old suffix in place. Note that > is required since we also insert an
-            # additional null terminator.
+            # If it *was* null terminated, all that matters is that we can leave N bytes of old
+            # suffix in place. Note that > is required since we also insert an additional null
+            # terminator.
             elif bytes_shorter > self.suffix_safety_size:
                 replacement = new + match.group(2)  # includes the trailing null
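The padding branch above (`replacement = b"/" * bytes_shorter + new`) works because a run of slashes collapses to a single separator in a path, so the replacement can keep the original byte length of the C-string without shifting the rest of the binary. Illustrated with made-up prefixes:

```python
old = b"/spack/padded-long-prefix-xyz/store/zlib"
new = b"/opt/spack/store/zlib"

bytes_shorter = len(old) - len(new)
# Pad with leading dir separators so the byte length is unchanged; the
# padded path still resolves to the same location as `new`.
replacement = b"/" * bytes_shorter + new

assert len(replacement) == len(old)
assert replacement.lstrip(b"/") == new.lstrip(b"/")
```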
@@ -257,22 +252,6 @@ def _apply_to_file(self, f):
return modified return modified
class BinaryStringReplacementError(spack.error.SpackError):
def __init__(self, file_path, old_len, new_len):
"""The size of the file changed after binary path substitution
Args:
file_path (str): file with changing size
old_len (str): original length of the file
new_len (str): length of the file after substitution
"""
super().__init__(
"Doing a binary string replacement in %s failed.\n"
"The size of the file changed from %s to %s\n"
"when it should have remanined the same." % (file_path, old_len, new_len)
)
class BinaryTextReplaceError(spack.error.SpackError): class BinaryTextReplaceError(spack.error.SpackError):
def __init__(self, msg): def __init__(self, msg):
msg += ( msg += (
@@ -284,17 +263,16 @@ def __init__(self, msg):
class CannotGrowString(BinaryTextReplaceError): class CannotGrowString(BinaryTextReplaceError):
def __init__(self, old, new): def __init__(self, old, new):
msg = "Cannot replace {!r} with {!r} because the new prefix is longer.".format(old, new) return super().__init__(
super().__init__(msg) f"Cannot replace {old!r} with {new!r} because the new prefix is longer."
)
class CannotShrinkCString(BinaryTextReplaceError): class CannotShrinkCString(BinaryTextReplaceError):
def __init__(self, old, new, full_old_string): def __init__(self, old, new, full_old_string):
# Just interpolate binary string to not risk issues with invalid # Just interpolate binary string to not risk issues with invalid unicode, which would be
# unicode, which would be really bad user experience: error in error. # really bad user experience: error in error. We have no clue if we actually deal with a
# We have no clue if we actually deal with a real C-string nor what # real C-string nor what encoding it has.
# encoding it has. super().__init__(
msg = "Cannot replace {!r} with {!r} in the C-string {!r}.".format( f"Cannot replace {old!r} with {new!r} in the C-string {full_old_string!r}."
old, new, full_old_string
) )
super().__init__(msg)
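The branch above keeps a binary's layout intact by padding a shorter replacement path with leading directory separators so the field's byte length never changes. A minimal sketch of the idea (the paths are hypothetical, not taken from the diff):

```python
# When a shorter path replaces a longer one inside a fixed-width binary
# field, pad with extra leading "/" so the total byte length is unchanged.
# Repeated leading separators still resolve to the same file on most systems.
old = b"/long/build/prefix/lib/libfoo.so"
new = b"/opt/app/lib/libfoo.so"

padded = b"/" * (len(old) - len(new)) + new
assert len(padded) == len(old)  # the binary's layout is preserved
```

This is why growing a prefix raises `CannotGrowString`: padding only works in the shrinking direction.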

View File

@@ -69,7 +69,7 @@ def rewire_node(spec, explicit):
         os.path.join(spec.prefix, rel_path) for rel_path in buildinfo["relocate_textfiles"]
     ]
     if text_to_relocate:
-        relocate.relocate_text(files=text_to_relocate, prefixes=prefix_to_prefix)
+        relocate.relocate_text(files=text_to_relocate, prefix_to_prefix=prefix_to_prefix)
     links = [os.path.join(spec.prefix, f) for f in buildinfo["relocate_links"]]
     relocate.relocate_links(links, prefix_to_prefix)
     bins_to_relocate = [
@@ -80,7 +80,7 @@ def rewire_node(spec, explicit):
         relocate.relocate_macho_binaries(bins_to_relocate, prefix_to_prefix)
     if "elf" in platform.binary_formats:
         relocate.relocate_elf_binaries(bins_to_relocate, prefix_to_prefix)
-    relocate.relocate_text_bin(binaries=bins_to_relocate, prefixes=prefix_to_prefix)
+    relocate.relocate_text_bin(binaries=bins_to_relocate, prefix_to_prefix=prefix_to_prefix)
     shutil.rmtree(tempdir)
     install_manifest = os.path.join(
         spec.prefix,

View File

@@ -6,6 +6,8 @@
 import typing
 import warnings

+import jsonschema
+
 import llnl.util.lang

 from spack.error import SpecSyntaxError
@@ -19,12 +21,8 @@ class DeprecationMessage(typing.NamedTuple):
 # jsonschema is imported lazily as it is heavy to import
 # and increases the start-up time
 def _make_validator():
-    import jsonschema
-
     def _validate_spec(validator, is_spec, instance, schema):
         """Check if the attributes on instance are valid specs."""
-        import jsonschema
-
         import spack.spec_parser

         if not validator.is_type(instance, "object"):
@@ -33,8 +31,8 @@ def _validate_spec(validator, is_spec, instance, schema):
         for spec_str in instance:
             try:
                 spack.spec_parser.parse(spec_str)
-            except SpecSyntaxError as e:
-                yield jsonschema.ValidationError(str(e))
+            except SpecSyntaxError:
+                yield jsonschema.ValidationError(f"the key '{spec_str}' is not a valid spec")

     def _deprecated_properties(validator, deprecated, instance, schema):
         if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
@@ -67,7 +65,7 @@ def _deprecated_properties(validator, deprecated, instance, schema):
         yield jsonschema.ValidationError("\n".join(errors))

     return jsonschema.validators.extend(
-        jsonschema.Draft4Validator,
+        jsonschema.Draft7Validator,
         {"validate_spec": _validate_spec, "deprecatedProperties": _deprecated_properties},
     )

View File

@@ -19,7 +19,7 @@
         "items": {
             "type": "object",
             "properties": {"when": {"type": "string"}},
-            "patternProperties": {r"^(?!when$)\w*": spec_list_schema},
+            "additionalProperties": spec_list_schema,
         },
     }
 }
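The removed pattern relies on a negative lookahead to match any word-like key except the literal `when`; `additionalProperties` achieves the same exclusion implicitly, because keys listed under `properties` (here, `when`) are exempt from it. Setting jsonschema aside, plain `re` shows how the old regex behaved:

```python
import re

# The pattern being replaced above: match word-like keys, but never the
# reserved keyword "when" on its own.
pattern = re.compile(r"^(?!when$)\w*")

assert pattern.search("when") is None          # reserved keyword excluded
assert pattern.search("whenever") is not None  # longer names still match
assert pattern.search("gcc-12") is not None    # matches the leading word chars
```

The `additionalProperties` form avoids having to keep such a lookahead in sync with the list of reserved keywords.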

View File

@@ -9,6 +9,8 @@
 """
 from typing import Any, Dict

+import jsonschema
+
 #: Common properties for connection specification
 connection = {
     "url": {"type": "string"},
@@ -102,8 +104,6 @@
 def update(data):
-    import jsonschema
-
     errors = []

     def check_access_pair(name, section):

View File

@@ -12,22 +12,6 @@
 import spack.schema.environment
 import spack.schema.projections

-#: Matches a spec or a multi-valued variant but not another
-#: valid keyword.
-#:
-#: THIS NEEDS TO BE UPDATED FOR EVERY NEW KEYWORD THAT
-#: IS ADDED IMMEDIATELY BELOW THE MODULE TYPE ATTRIBUTE
-spec_regex = (
-    r"(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|hide|"
-    r"include|exclude|projections|naming_scheme|core_compilers|all)(^\w[\w-]*)"
-)
-
-#: Matches a valid name for a module set
-valid_module_set_name = r"^(?!prefix_inspections$)\w[\w-]*$"
-
-#: Matches an anonymous spec, i.e. a spec without a root name
-anonymous_spec_regex = r"^[\^@%+~]"
-
 #: Definitions for parts of module schema
 array_of_strings = {"type": "array", "default": [], "items": {"type": "string"}}
@@ -56,7 +40,7 @@
         "suffixes": {
             "type": "object",
             "validate_spec": True,
-            "patternProperties": {r"\w[\w-]*": {"type": "string"}},  # key
+            "additionalProperties": {"type": "string"},  # key
         },
         "environment": spack.schema.environment.definition,
     },
@@ -64,34 +48,40 @@
 projections_scheme = spack.schema.projections.properties["projections"]

-module_type_configuration = {
+module_type_configuration: Dict = {
     "type": "object",
     "default": {},
-    "allOf": [
-        {
-            "properties": {
-                "verbose": {"type": "boolean", "default": False},
-                "hash_length": {"type": "integer", "minimum": 0, "default": 7},
-                "include": array_of_strings,
-                "exclude": array_of_strings,
-                "exclude_implicits": {"type": "boolean", "default": False},
-                "defaults": array_of_strings,
-                "hide_implicits": {"type": "boolean", "default": False},
-                "naming_scheme": {"type": "string"},  # Can we be more specific here?
-                "projections": projections_scheme,
-                "all": module_file_configuration,
-            }
-        },
-        {
-            "validate_spec": True,
-            "patternProperties": {
-                spec_regex: module_file_configuration,
-                anonymous_spec_regex: module_file_configuration,
-            },
-        },
-    ],
+    "validate_spec": True,
+    "properties": {
+        "verbose": {"type": "boolean", "default": False},
+        "hash_length": {"type": "integer", "minimum": 0, "default": 7},
+        "include": array_of_strings,
+        "exclude": array_of_strings,
+        "exclude_implicits": {"type": "boolean", "default": False},
+        "defaults": array_of_strings,
+        "hide_implicits": {"type": "boolean", "default": False},
+        "naming_scheme": {"type": "string"},
+        "projections": projections_scheme,
+        "all": module_file_configuration,
+    },
+    "additionalProperties": module_file_configuration,
 }

+tcl_configuration = module_type_configuration.copy()
+
+lmod_configuration = module_type_configuration.copy()
+lmod_configuration["properties"].update(
+    {
+        "core_compilers": array_of_strings,
+        "hierarchy": array_of_strings,
+        "core_specs": array_of_strings,
+        "filter_hierarchy_specs": {
+            "type": "object",
+            "validate_spec": True,
+            "additionalProperties": array_of_strings,
+        },
+    }
+)

 module_config_properties = {
     "use_view": {"anyOf": [{"type": "string"}, {"type": "boolean"}]},
@@ -105,31 +95,8 @@
         "default": [],
         "items": {"type": "string", "enum": ["tcl", "lmod"]},
     },
-    "lmod": {
-        "allOf": [
-            # Base configuration
-            module_type_configuration,
-            {
-                "type": "object",
-                "properties": {
-                    "core_compilers": array_of_strings,
-                    "hierarchy": array_of_strings,
-                    "core_specs": array_of_strings,
-                    "filter_hierarchy_specs": {
-                        "type": "object",
-                        "patternProperties": {spec_regex: array_of_strings},
-                    },
-                },
-            },  # Specific lmod extensions
-        ]
-    },
-    "tcl": {
-        "allOf": [
-            # Base configuration
-            module_type_configuration,
-            {},  # Specific tcl extensions
-        ]
-    },
+    "lmod": lmod_configuration,
+    "tcl": tcl_configuration,
     "prefix_inspections": {
         "type": "object",
         "additionalProperties": False,
@@ -145,7 +112,6 @@
 properties: Dict[str, Any] = {
     "modules": {
         "type": "object",
-        "additionalProperties": False,
         "properties": {
             "prefix_inspections": {
                 "type": "object",
@@ -156,13 +122,11 @@
                 },
             }
         },
-        "patternProperties": {
-            valid_module_set_name: {
-                "type": "object",
-                "default": {},
-                "additionalProperties": False,
-                "properties": module_config_properties,
-            }
-        },
+        "additionalProperties": {
+            "type": "object",
+            "default": {},
+            "additionalProperties": False,
+            "properties": module_config_properties,
+        },
     }
 }

View File

@@ -98,7 +98,6 @@
     "packages": {
         "type": "object",
         "default": {},
-        "additionalProperties": False,
         "properties": {
             "all": {  # package name
                 "type": "object",
@@ -140,58 +139,54 @@
                 },
             }
         },
-        "patternProperties": {
-            r"(?!^all$)(^\w[\w-]*)": {  # package name
-                "type": "object",
-                "default": {},
-                "additionalProperties": False,
-                "properties": {
-                    "require": requirements,
-                    "prefer": prefer_and_conflict,
-                    "conflict": prefer_and_conflict,
-                    "version": {
-                        "type": "array",
-                        "default": [],
-                        # version strings
-                        "items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
-                    },
-                    "buildable": {"type": "boolean", "default": True},
-                    "permissions": permissions,
-                    # If 'get_full_repo' is promoted to a Package-level
-                    # attribute, it could be useful to set it here
-                    "package_attributes": package_attributes,
-                    "variants": variants,
-                    "externals": {
-                        "type": "array",
-                        "items": {
-                            "type": "object",
-                            "properties": {
-                                "spec": {"type": "string"},
-                                "prefix": {"type": "string"},
-                                "modules": {"type": "array", "items": {"type": "string"}},
-                                "extra_attributes": {
-                                    "type": "object",
-                                    "additionalProperties": True,
-                                    "properties": {
-                                        "compilers": {
-                                            "type": "object",
-                                            "patternProperties": {
-                                                r"(^\w[\w-]*)": {"type": "string"}
-                                            },
-                                        },
-                                        "environment": spack.schema.environment.definition,
-                                        "extra_rpaths": extra_rpaths,
-                                        "implicit_rpaths": implicit_rpaths,
-                                        "flags": flags,
-                                    },
-                                },
-                            },
-                            "additionalProperties": True,
-                            "required": ["spec"],
-                        },
-                    },
-                },
-            }
-        },
+        "additionalProperties": {  # package name
+            "type": "object",
+            "default": {},
+            "additionalProperties": False,
+            "properties": {
+                "require": requirements,
+                "prefer": prefer_and_conflict,
+                "conflict": prefer_and_conflict,
+                "version": {
+                    "type": "array",
+                    "default": [],
+                    # version strings
+                    "items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
+                },
+                "buildable": {"type": "boolean", "default": True},
+                "permissions": permissions,
+                # If 'get_full_repo' is promoted to a Package-level
+                # attribute, it could be useful to set it here
+                "package_attributes": package_attributes,
+                "variants": variants,
+                "externals": {
+                    "type": "array",
+                    "items": {
+                        "type": "object",
+                        "properties": {
+                            "spec": {"type": "string"},
+                            "prefix": {"type": "string"},
+                            "modules": {"type": "array", "items": {"type": "string"}},
+                            "extra_attributes": {
+                                "type": "object",
+                                "additionalProperties": {"type": "string"},
+                                "properties": {
+                                    "compilers": {
+                                        "type": "object",
+                                        "patternProperties": {r"(^\w[\w-]*)": {"type": "string"}},
+                                    },
+                                    "environment": spack.schema.environment.definition,
+                                    "extra_rpaths": extra_rpaths,
+                                    "implicit_rpaths": implicit_rpaths,
+                                    "flags": flags,
+                                },
+                            },
+                        },
+                        "additionalProperties": True,
+                        "required": ["spec"],
+                    },
+                },
+            },
+        },
     }
 }

View File

@@ -37,6 +37,7 @@
 import spack.package_prefs
 import spack.platforms
 import spack.repo
+import spack.solver.splicing
 import spack.spec
 import spack.store
 import spack.util.crypto
@@ -67,7 +68,7 @@
 GitOrStandardVersion = Union[spack.version.GitVersion, spack.version.StandardVersion]

-TransformFunction = Callable[["spack.spec.Spec", List[AspFunction]], List[AspFunction]]
+TransformFunction = Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]

 #: Enable the addition of a runtime node
 WITH_RUNTIME = sys.platform != "win32"
@@ -127,8 +128,8 @@ def __str__(self):
 @contextmanager
 def named_spec(
-    spec: Optional["spack.spec.Spec"], name: Optional[str]
-) -> Iterator[Optional["spack.spec.Spec"]]:
+    spec: Optional[spack.spec.Spec], name: Optional[str]
+) -> Iterator[Optional[spack.spec.Spec]]:
     """Context manager to temporarily set the name of a spec"""
     if spec is None or name is None:
         yield spec
@@ -747,11 +748,11 @@ def on_model(model):
 class KnownCompiler(NamedTuple):
     """Data class to collect information on compilers"""

-    spec: "spack.spec.Spec"
+    spec: spack.spec.Spec
     os: str
-    target: str
+    target: Optional[str]
     available: bool
-    compiler_obj: Optional["spack.compiler.Compiler"]
+    compiler_obj: Optional[spack.compiler.Compiler]

     def _key(self):
         return self.spec, self.os, self.target
@@ -1132,7 +1133,7 @@ def __init__(self, tests: bool = False):
             set
         )
-        self.possible_compilers: List = []
+        self.possible_compilers: List[KnownCompiler] = []
         self.possible_oses: Set = set()
         self.variant_values_from_specs: Set = set()
         self.version_constraints: Set = set()
@@ -1386,7 +1387,7 @@ def effect_rules(self):
     def define_variant(
         self,
-        pkg: "Type[spack.package_base.PackageBase]",
+        pkg: Type[spack.package_base.PackageBase],
         name: str,
         when: spack.spec.Spec,
         variant_def: vt.Variant,
@@ -1490,7 +1491,7 @@ def define_auto_variant(self, name: str, multi: bool):
             )
         )

-    def variant_rules(self, pkg: "Type[spack.package_base.PackageBase]"):
+    def variant_rules(self, pkg: Type[spack.package_base.PackageBase]):
         for name in pkg.variant_names():
             self.gen.h3(f"Variant {name} in package {pkg.name}")
             for when, variant_def in pkg.variant_definitions(name):
@@ -1681,8 +1682,8 @@ def dependency_holds(input_spec, requirements):
     def _gen_match_variant_splice_constraints(
         self,
         pkg,
-        cond_spec: "spack.spec.Spec",
-        splice_spec: "spack.spec.Spec",
+        cond_spec: spack.spec.Spec,
+        splice_spec: spack.spec.Spec,
         hash_asp_var: "AspVar",
         splice_node,
         match_variants: List[str],
@@ -1740,7 +1741,7 @@ def package_splice_rules(self, pkg):
                 if any(
                     v in cond.variants or v in spec_to_splice.variants for v in match_variants
                 ):
-                    raise Exception(
+                    raise spack.error.PackageError(
                         "Overlap between match_variants and explicitly set variants"
                     )
                 variant_constraints = self._gen_match_variant_splice_constraints(
@@ -2710,7 +2711,7 @@ def setup(
         if env:
             dev_specs = tuple(
                 spack.spec.Spec(info["spec"]).constrained(
-                    "dev_path=%s"
+                    'dev_path="%s"'
                     % spack.util.path.canonicalize_path(info["path"], default_wd=env.path)
                 )
                 for name, info in env.dev_specs.items()
@@ -2977,7 +2978,7 @@ def _specs_from_requires(self, pkg_name, section):
             for s in spec_group[key]:
                 yield _spec_with_default_name(s, pkg_name)

-    def pkg_class(self, pkg_name: str) -> typing.Type["spack.package_base.PackageBase"]:
+    def pkg_class(self, pkg_name: str) -> typing.Type[spack.package_base.PackageBase]:
         request = pkg_name
         if pkg_name in self.explicitly_required_namespaces:
             namespace = self.explicitly_required_namespaces[pkg_name]
@@ -3096,7 +3097,7 @@ def __init__(self, configuration) -> None:
                 self.compilers.add(candidate)

-    def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerParser":
+    def with_input_specs(self, input_specs: List[spack.spec.Spec]) -> "CompilerParser":
         """Accounts for input specs when building the list of possible compilers.

         Args:
@@ -3136,7 +3137,7 @@ def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerPar
         return self

-    def add_compiler_from_concrete_spec(self, spec: "spack.spec.Spec") -> None:
+    def add_compiler_from_concrete_spec(self, spec: spack.spec.Spec) -> None:
         """Account for compilers that are coming from concrete specs, through reuse.

         Args:
@@ -3374,14 +3375,6 @@ def consume_facts(self):
         self._setup.effect_rules()


-# This should be a dataclass, but dataclasses don't work on Python 3.6
-class Splice:
-    def __init__(self, splice_node: NodeArgument, child_name: str, child_hash: str):
-        self.splice_node = splice_node
-        self.child_name = child_name
-        self.child_hash = child_hash
-
-
 class SpecBuilder:
     """Class with actions to rebuild a spec from ASP results."""
@@ -3421,7 +3414,7 @@ def __init__(self, specs, hash_lookup=None):
         self._specs: Dict[NodeArgument, spack.spec.Spec] = {}
         # Matches parent nodes to splice node
-        self._splices: Dict[NodeArgument, List[Splice]] = {}
+        self._splices: Dict[spack.spec.Spec, List[spack.solver.splicing.Splice]] = {}
         self._result = None
         self._command_line_specs = specs
         self._flag_sources: Dict[Tuple[NodeArgument, str], Set[str]] = collections.defaultdict(
@@ -3540,15 +3533,13 @@ def reorder_flags(self):
         )
         cmd_specs = dict((s.name, s) for spec in self._command_line_specs for s in spec.traverse())

-        for spec in self._specs.values():
+        for node, spec in self._specs.items():
             # if bootstrapping, compiler is not in config and has no flags
             flagmap_from_compiler = {}
             if spec.compiler in compilers:
                 flagmap_from_compiler = compilers[spec.compiler].flags

             for flag_type in spec.compiler_flags.valid_compiler_flags():
-                node = SpecBuilder.make_node(pkg=spec.name)
                 ordered_flags = []

                 # 1. Put compiler flags first
@@ -3630,49 +3621,12 @@ def splice_at_hash(
         child_name: str,
         child_hash: str,
     ):
-        splice = Splice(splice_node, child_name=child_name, child_hash=child_hash)
-        self._splices.setdefault(parent_node, []).append(splice)
-
-    def _resolve_automatic_splices(self):
-        """After all of the specs have been concretized, apply all immediate splices.
-
-        Use reverse topological order to ensure that all dependencies are resolved
-        before their parents, allowing for maximal sharing and minimal copying.
-        """
-        fixed_specs = {}
-
-        # create a mapping from dag hash to an integer representing position in reverse topo order.
-        specs = self._specs.values()
-        topo_order = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
-        topo_lookup = {spec.dag_hash(): index for index, spec in enumerate(reversed(topo_order))}
-
-        # iterate over specs, children before parents
-        for node, spec in sorted(self._specs.items(), key=lambda x: topo_lookup[x[1].dag_hash()]):
-            immediate = self._splices.get(node, [])
-            if not immediate and not any(
-                edge.spec in fixed_specs for edge in spec.edges_to_dependencies()
-            ):
-                continue
-            new_spec = spec.copy(deps=False)
-            new_spec.build_spec = spec
-            for edge in spec.edges_to_dependencies():
-                depflag = edge.depflag & ~dt.BUILD
-                if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
-                    splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
-                    new_spec.add_dependency_edge(
-                        self._specs[splice.splice_node], depflag=depflag, virtuals=edge.virtuals
-                    )
-                elif edge.spec in fixed_specs:
-                    new_spec.add_dependency_edge(
-                        fixed_specs[edge.spec], depflag=depflag, virtuals=edge.virtuals
-                    )
-                else:
-                    new_spec.add_dependency_edge(
-                        edge.spec, depflag=depflag, virtuals=edge.virtuals
-                    )
-            self._specs[node] = new_spec
-            fixed_specs[spec] = new_spec
+        parent_spec = self._specs[parent_node]
+        splice_spec = self._specs[splice_node]
+        splice = spack.solver.splicing.Splice(
+            splice_spec, child_name=child_name, child_hash=child_hash
+        )
+        self._splices.setdefault(parent_spec, []).append(splice)

     @staticmethod
     def sort_fn(function_tuple) -> Tuple[int, int]:
@@ -3765,7 +3719,15 @@ def build_specs(self, function_tuples):
         for root in roots.values():
             root._finalize_concretization()

-        self._resolve_automatic_splices()
+        # Only attempt to resolve automatic splices if the solver produced any
+        if self._splices:
+            resolved_splices = spack.solver.splicing._resolve_collected_splices(
+                list(self._specs.values()), self._splices
+            )
+            new_specs = {}
+            for node, spec in self._specs.items():
+                new_specs[node] = resolved_splices.get(spec, spec)
+            self._specs = new_specs

         for s in self._specs.values():
             spack.spec.Spec.ensure_no_deprecated(s)

View File

@@ -0,0 +1,73 @@
+# Copyright Spack Project Developers. See COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+from functools import cmp_to_key
+from typing import Dict, List, NamedTuple
+
+import spack.deptypes as dt
+from spack.spec import Spec
+from spack.traverse import by_dag_hash, traverse_nodes
+
+
+class Splice(NamedTuple):
+    #: The spec being spliced into a parent
+    splice_spec: Spec
+    #: The name of the child that splice spec is replacing
+    child_name: str
+    #: The hash of the child that `splice_spec` is replacing
+    child_hash: str
+
+
+def _resolve_collected_splices(
+    specs: List[Spec], splices: Dict[Spec, List[Splice]]
+) -> Dict[Spec, Spec]:
+    """After all of the specs have been concretized, apply all immediate splices.
+
+    Returns a dict mapping original specs to their resolved counterparts
+    """
+
+    def splice_cmp(s1: Spec, s2: Spec):
+        """This function can be used to sort a list of specs such that any
+        spec which will be spliced into a parent comes after the parent it will
+        be spliced into. This order ensures that transitive splices will be
+        executed in the correct order.
+        """
+        s1_splices = splices.get(s1, [])
+        s2_splices = splices.get(s2, [])
+        if any([s2.dag_hash() == splice.splice_spec.dag_hash() for splice in s1_splices]):
+            return -1
+        elif any([s1.dag_hash() == splice.splice_spec.dag_hash() for splice in s2_splices]):
+            return 1
+        else:
+            return 0
+
+    splice_order = sorted(specs, key=cmp_to_key(splice_cmp))
+    reverse_topo_order = reversed(
+        [x for x in traverse_nodes(splice_order, order="topo", key=by_dag_hash) if x in specs]
+    )
+
+    already_resolved: Dict[Spec, Spec] = {}
+    for spec in reverse_topo_order:
+        immediate = splices.get(spec, [])
+        if not immediate and not any(
+            edge.spec in already_resolved for edge in spec.edges_to_dependencies()
+        ):
+            continue
+        new_spec = spec.copy(deps=False)
+        new_spec.clear_caches(ignore=("package_hash",))
+        new_spec.build_spec = spec
+        for edge in spec.edges_to_dependencies():
+            depflag = edge.depflag & ~dt.BUILD
+            if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
+                splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
+                # If the spec being spliced in is itself spliced, use its resolved form
+                splice_spec = already_resolved.get(splice.splice_spec, splice.splice_spec)
+                new_spec.add_dependency_edge(splice_spec, depflag=depflag, virtuals=edge.virtuals)
+            elif edge.spec in already_resolved:
+                new_spec.add_dependency_edge(
+                    already_resolved[edge.spec], depflag=depflag, virtuals=edge.virtuals
+                )
+            else:
+                new_spec.add_dependency_edge(edge.spec, depflag=depflag, virtuals=edge.virtuals)
+        already_resolved[spec] = new_spec
+    return already_resolved
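The `splice_cmp` comparator in the new file only orders pairs that are directly related by a splice; unrelated specs compare equal, which is why the code follows the sort with a proper topological traversal. A toy illustration of the comparator pattern with `functools.cmp_to_key` (the table and names are hypothetical):

```python
from functools import cmp_to_key

# Hypothetical table: maps a parent to the specs that will be spliced into it.
splices = {"app": ["zlib-spliced"]}

def splice_cmp(s1, s2):
    # A parent sorts before anything spliced into it; unrelated names tie.
    if s2 in splices.get(s1, []):
        return -1
    if s1 in splices.get(s2, []):
        return 1
    return 0

order = sorted(["zlib-spliced", "app"], key=cmp_to_key(splice_cmp))
# order == ["app", "zlib-spliced"]
```

Note that such a comparator defines only a partial order, so the sort alone does not guarantee a globally consistent ordering of unrelated chains; combining it with the reverse topological traversal, as the code above does, is what ensures children are resolved before their parents.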

View File

@@ -86,7 +86,6 @@
 import spack
 import spack.compiler
 import spack.compilers
-import spack.config
 import spack.deptypes as dt
 import spack.error
 import spack.hash_types as ht
@@ -94,7 +93,6 @@
 import spack.platforms
 import spack.provider_index
 import spack.repo
-import spack.solver
 import spack.spec_parser
 import spack.store
 import spack.traverse
@@ -580,14 +578,9 @@ def to_dict(self):
             target_data = str(self.target)
         else:
             # Get rid of compiler flag information before turning the uarch into a dict
-            uarch_dict = self.target.to_dict()
-            uarch_dict.pop("compilers", None)
-            target_data = syaml.syaml_dict(uarch_dict.items())
-
-        d = syaml.syaml_dict(
-            [("platform", self.platform), ("platform_os", self.os), ("target", target_data)]
-        )
-        return syaml.syaml_dict([("arch", d)])
+            target_data = self.target.to_dict()
+            target_data.pop("compilers", None)
+        return {"arch": {"platform": self.platform, "platform_os": self.os, "target": target_data}}

     @staticmethod
     def from_dict(d):
@@ -712,10 +705,7 @@ def _cmp_iter(self):
         yield self.versions

     def to_dict(self):
-        d = syaml.syaml_dict([("name", self.name)])
-        d.update(self.versions.to_dict())
-
-        return syaml.syaml_dict([("compiler", d)])
+        return {"compiler": {"name": self.name, **self.versions.to_dict()}}

     @staticmethod
     def from_dict(d):
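The hunks above (and the serialization changes that follow) all replace ordered `syaml.syaml_dict` constructions with plain dict literals. That works because since Python 3.7 the language guarantees that plain dicts preserve insertion order, so an ordered serialization no longer needs a special dict subclass:

```python
import json

# Plain dicts keep keys in insertion order (guaranteed since Python 3.7),
# so ordered output needs no special dict type. The entry is hypothetical.
d = {"name": "zlib"}            # first key
d.update({"version": "1.3.1"})  # appended after "name"

assert list(d) == ["name", "version"]
assert json.dumps(d) == '{"name": "zlib", "version": "1.3.1"}'
```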
@@ -2052,6 +2042,20 @@ def traverse_edges(
visited=visited, visited=visited,
) )
@property
def long_spec(self):
"""Returns a string of the spec with the dependencies completely
enumerated."""
root_str = [self.format()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
sorted_dependencies = [
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
@property @property
def short_spec(self): def short_spec(self):
"""Returns a version of the spec with the dependencies hashed """Returns a version of the spec with the dependencies hashed
@@ -2278,9 +2282,7 @@ def to_node_dict(self, hash=ht.dag_hash):
Arguments: Arguments:
hash (spack.hash_types.SpecHashDescriptor) type of hash to generate. hash (spack.hash_types.SpecHashDescriptor) type of hash to generate.
""" """
d = syaml.syaml_dict() d = {"name": self.name}
d["name"] = self.name
if self.versions: if self.versions:
d.update(self.versions.to_dict()) d.update(self.versions.to_dict())
@@ -2294,7 +2296,7 @@ def to_node_dict(self, hash=ht.dag_hash):
if self.namespace: if self.namespace:
d["namespace"] = self.namespace d["namespace"] = self.namespace
params = syaml.syaml_dict(sorted(v.yaml_entry() for _, v in self.variants.items())) params = dict(sorted(v.yaml_entry() for v in self.variants.values()))
# Only need the string compiler flag for yaml file # Only need the string compiler flag for yaml file
params.update( params.update(
@@ -2320,13 +2322,16 @@ def to_node_dict(self, hash=ht.dag_hash):
) )
if self.external: if self.external:
d["external"] = syaml.syaml_dict( if self.extra_attributes:
[ extra_attributes = syaml.sorted_dict(self.extra_attributes)
("path", self.external_path), else:
("module", self.external_modules), extra_attributes = None
("extra_attributes", self.extra_attributes),
] d["external"] = {
) "path": self.external_path,
"module": self.external_modules,
"extra_attributes": extra_attributes,
}
if not self._concrete: if not self._concrete:
d["concrete"] = False d["concrete"] = False
@@ -2357,29 +2362,25 @@ def to_node_dict(self, hash=ht.dag_hash):
# Note: Relies on sorting dict by keys later in algorithm. # Note: Relies on sorting dict by keys later in algorithm.
deps = self._dependencies_dict(depflag=hash.depflag) deps = self._dependencies_dict(depflag=hash.depflag)
if deps: if deps:
deps_list = [] d["dependencies"] = [
for name, edges_for_name in sorted(deps.items()): {
name_tuple = ("name", name) "name": name,
for dspec in edges_for_name: hash.name: dspec.spec._cached_hash(hash),
hash_tuple = (hash.name, dspec.spec._cached_hash(hash)) "parameters": {
parameters_tuple = ( "deptypes": dt.flag_to_tuple(dspec.depflag),
"parameters", "virtuals": dspec.virtuals,
syaml.syaml_dict( },
( }
("deptypes", dt.flag_to_tuple(dspec.depflag)), for name, edges_for_name in sorted(deps.items())
("virtuals", dspec.virtuals), for dspec in edges_for_name
) ]
),
)
ordered_entries = [name_tuple, hash_tuple, parameters_tuple]
deps_list.append(syaml.syaml_dict(ordered_entries))
d["dependencies"] = deps_list
# Name is included in case this is replacing a virtual. # Name is included in case this is replacing a virtual.
if self._build_spec: if self._build_spec:
d["build_spec"] = syaml.syaml_dict( d["build_spec"] = {
[("name", self.build_spec.name), (hash.name, self.build_spec._cached_hash(hash))] "name": self.build_spec.name,
) hash.name: self.build_spec._cached_hash(hash),
}
return d return d
     def to_dict(self, hash=ht.dag_hash):
@@ -2481,10 +2482,7 @@ def to_dict(self, hash=ht.dag_hash):
                 node_list.append(node)
                 hash_set.add(node_hash)

-        meta_dict = syaml.syaml_dict([("version", SPECFILE_FORMAT_VERSION)])
-        inner_dict = syaml.syaml_dict([("_meta", meta_dict), ("nodes", node_list)])
-        spec_dict = syaml.syaml_dict([("spec", inner_dict)])
-        return spec_dict
+        return {"spec": {"_meta": {"version": SPECFILE_FORMAT_VERSION}, "nodes": node_list}}
     def node_dict_with_hashes(self, hash=ht.dag_hash):
         """Returns a node_dict of this spec with the dag hash added.  If this
@@ -2935,44 +2933,16 @@ def ensure_no_deprecated(root):
         raise SpecDeprecatedError(msg)

     def concretize(self, tests: Union[bool, Iterable[str]] = False) -> None:
-        """Concretize the current spec.
-
-        Args:
-            tests: if False disregard 'test' dependencies, if a list of names activate them for
-                the packages in the list, if True activate 'test' dependencies for all packages.
-        """
-        import spack.solver.asp
-
-        self.replace_hash()
-
-        for node in self.traverse():
-            if not node.name:
-                raise spack.error.SpecError(
-                    f"Spec {node} has no name; cannot concretize an anonymous spec"
-                )
-
-        if self._concrete:
-            return
-
-        allow_deprecated = spack.config.get("config:deprecated", False)
-        solver = spack.solver.asp.Solver()
-        result = solver.solve([self], tests=tests, allow_deprecated=allow_deprecated)
-
-        # take the best answer
-        opt, i, answer = min(result.answers)
-        name = self.name
-        # TODO: Consolidate this code with similar code in solve.py
-        if self.virtual:
-            providers = [spec.name for spec in answer.values() if spec.package.provides(name)]
-            name = providers[0]
-
-        node = spack.solver.asp.SpecBuilder.make_node(pkg=name)
-        assert (
-            node in answer
-        ), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
-
-        concretized = answer[node]
-        self._dup(concretized)
+        from spack.concretize import concretize_one
+
+        warnings.warn(
+            "`Spec.concretize` is deprecated and will be removed in version 1.0.0. Use "
+            "`spack.concretize.concretize_one` instead.",
+            category=spack.error.SpackAPIWarning,
+            stacklevel=2,
+        )
+
+        self._dup(concretize_one(self, tests))
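The replacement body above follows Python's standard soft-deprecation pattern: emit a warning, then delegate to the new entry point. A minimal standalone sketch of that pattern (the names here are illustrative, not Spack's):

```python
import warnings


class APIWarning(DeprecationWarning):
    """Illustrative stand-in for a project-specific warning category."""


def new_api(x):
    return x * 2


def old_api(x):
    # stacklevel=2 attributes the warning to the *caller* of old_api,
    # which points users at the line they need to change.
    warnings.warn(
        "`old_api` is deprecated, use `new_api` instead.",
        category=APIWarning,
        stacklevel=2,
    )
    return new_api(x)


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = old_api(21)

assert result == 42
assert issubclass(caught[0].category, APIWarning)
```

Using a dedicated warning category lets downstream test suites filter or escalate these warnings selectively.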
     def _mark_root_concrete(self, value=True):
         """Mark just this spec (not dependencies) concrete."""
@@ -3062,19 +3032,16 @@ def _finalize_concretization(self):
             spec._cached_hash(ht.dag_hash)

     def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
-        """This is a non-destructive version of concretize().
-
-        First clones, then returns a concrete version of this package
-        without modifying this package.
-
-        Args:
-            tests (bool or list): if False disregard 'test' dependencies,
-                if a list of names activate them for the packages in the list,
-                if True activate 'test' dependencies for all packages.
-        """
-        clone = self.copy()
-        clone.concretize(tests=tests)
-        return clone
+        from spack.concretize import concretize_one
+
+        warnings.warn(
+            "`Spec.concretized` is deprecated and will be removed in version 1.0.0. Use "
+            "`spack.concretize.concretize_one` instead.",
+            category=spack.error.SpackAPIWarning,
+            stacklevel=2,
+        )
+
+        return concretize_one(self, tests)
def index(self, deptype="all"): def index(self, deptype="all"):
"""Return a dictionary that points to all the dependencies in this """Return a dictionary that points to all the dependencies in this
@@ -3610,25 +3577,16 @@ def patches(self):
return self._patches return self._patches
def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, cleardeps=True): def _dup(self, other: "Spec", deps: Union[bool, dt.DepTypes, dt.DepFlag] = True) -> bool:
"""Copy the spec other into self. This is an overwriting """Copies "other" into self, by overwriting all attributes.
copy. It does not copy any dependents (parents), but by default
copies dependencies.
To duplicate an entire DAG, call _dup() on the root of the DAG.
Args: Args:
other (Spec): spec to be copied onto ``self`` other: spec to be copied onto ``self``
deps: if True copies all the dependencies. If deps: if True copies all the dependencies. If False copies None.
False copies None. If deptype/depflag, copy matching types. If deptype, or depflag, copy matching types.
cleardeps (bool): if True clears the dependencies of ``self``,
before possibly copying the dependencies of ``other`` onto
``self``
Returns: Returns:
True if ``self`` changed because of the copy operation, True if ``self`` changed because of the copy operation, False otherwise.
False otherwise.
""" """
# We don't count dependencies as changes here # We don't count dependencies as changes here
changed = True changed = True
@@ -3653,14 +3611,15 @@ def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, clearde
self.versions = other.versions.copy() self.versions = other.versions.copy()
self.architecture = other.architecture.copy() if other.architecture else None self.architecture = other.architecture.copy() if other.architecture else None
self.compiler = other.compiler.copy() if other.compiler else None self.compiler = other.compiler.copy() if other.compiler else None
if cleardeps:
self._dependents = _EdgeMap(store_by_child=False)
self._dependencies = _EdgeMap(store_by_child=True)
self.compiler_flags = other.compiler_flags.copy() self.compiler_flags = other.compiler_flags.copy()
self.compiler_flags.spec = self self.compiler_flags.spec = self
self.variants = other.variants.copy() self.variants = other.variants.copy()
self._build_spec = other._build_spec self._build_spec = other._build_spec
# Clear dependencies
self._dependents = _EdgeMap(store_by_child=False)
self._dependencies = _EdgeMap(store_by_child=True)
# FIXME: we manage _patches_in_order_of_appearance specially here # FIXME: we manage _patches_in_order_of_appearance specially here
# to keep it from leaking out of spec.py, but we should figure # to keep it from leaking out of spec.py, but we should figure
# out how to handle it more elegantly in the Variant classes. # out how to handle it more elegantly in the Variant classes.
@@ -4165,15 +4124,7 @@ def __str__(self):
         if not self._dependencies:
             return self.format()

-        root_str = [self.format()]
-        sorted_dependencies = sorted(
-            self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
-        )
-        sorted_dependencies = [
-            d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
-        ]
-        spec_str = " ^".join(root_str + sorted_dependencies)
-        return spec_str.strip()
+        return self.long_spec
     @property
     def colored_str(self):
@@ -4551,7 +4502,7 @@ def mask_build_deps(in_spec):

         return spec

-    def clear_caches(self, ignore=()):
+    def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
         """
         Clears all cached hashes in a Spec, while preserving other properties.
         """
@@ -4936,9 +4887,7 @@ def from_node_dict(cls, node):
                 spec.external_modules = node["external"]["module"]
                 if spec.external_modules is False:
                     spec.external_modules = None
-                spec.extra_attributes = node["external"].get(
-                    "extra_attributes", syaml.syaml_dict()
-                )
+                spec.extra_attributes = node["external"].get("extra_attributes", {})

         # specs read in are concrete unless marked abstract
         if node.get("concrete", True):
View File
@@ -7,35 +7,14 @@
 import pytest

+import spack.concretize
 import spack.config
 import spack.deptypes as dt
-import spack.solver.asp
 from spack.installer import PackageInstaller
+from spack.solver.asp import SolverError
 from spack.spec import Spec


-class CacheManager:
-    def __init__(self, specs: List[str]) -> None:
-        self.req_specs = specs
-        self.concr_specs: List[Spec]
-        self.concr_specs = []
-
-    def __enter__(self):
-        self.concr_specs = [Spec(s).concretized() for s in self.req_specs]
-        for s in self.concr_specs:
-            PackageInstaller([s.package], fake=True, explicit=True).install()
-
-    def __exit__(self, exc_type, exc_val, exc_tb):
-        for s in self.concr_specs:
-            s.package.do_uninstall()
-
-
-# MacOS and Windows only work if you pass this function pointer rather than a
-# closure
-def _mock_has_runtime_dependencies(_x):
-    return True
-
-
 def _make_specs_non_buildable(specs: List[str]):
     output_config = {}
     for spec in specs:
@@ -44,203 +23,262 @@ def _make_specs_non_buildable(specs: List[str]):

 @pytest.fixture
-def splicing_setup(mutable_database, mock_packages, monkeypatch):
-    spack.config.set("concretizer:reuse", True)
-    monkeypatch.setattr(
-        spack.solver.asp, "_has_runtime_dependencies", _mock_has_runtime_dependencies
-    )
+def install_specs(
+    mutable_database,
+    mock_packages,
+    mutable_config,
+    do_not_check_runtimes_on_reuse,
+    install_mockery,
+):
+    """Returns a function that concretizes and installs a list of abstract specs"""
+    mutable_config.set("concretizer:reuse", True)
+
+    def _impl(*specs_str):
+        concrete_specs = [Spec(s).concretized() for s in specs_str]
+        PackageInstaller([s.package for s in concrete_specs], fake=True, explicit=True).install()
+        return concrete_specs
+
+    return _impl


 def _enable_splicing():
     spack.config.set("concretizer:splice", {"automatic": True})
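The `install_specs` fixture above is a "factory fixture": instead of yielding a value, it returns a callable so each test chooses its own inputs. A dependency-free sketch of the same pattern (illustrative names only, no Spack or pytest required):

```python
def make_install_specs(database):
    """Return a function that 'installs' specs into a fake database."""

    def _impl(*specs_str):
        # Stand-in for concretize + install; real code would build packages.
        installed = [f"concrete({s})" for s in specs_str]
        database.extend(installed)
        return installed

    return _impl


db = []
install_specs = make_install_specs(db)
first, second = install_specs("splice-z@1.0.0+compat", "splice-h@1.0.2+compat")
assert first == "concrete(splice-z@1.0.0+compat)"
assert db == [first, second]
```

Returning a closure keeps the expensive setup (database, config) in the fixture while leaving the per-test inputs to the test body.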
-def _has_build_dependency(spec: Spec, name: str):
-    return any(s.name == name for s in spec.dependencies(None, dt.BUILD))
-
-
-def test_simple_reuse(splicing_setup):
-    with CacheManager(["splice-z@1.0.0+compat"]):
-        spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
-        assert Spec("splice-z").concretized().satisfies(Spec("splice-z"))
-
-
-def test_simple_dep_reuse(splicing_setup):
-    with CacheManager(["splice-z@1.0.0+compat"]):
-        spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
-        assert Spec("splice-h@1").concretized().satisfies(Spec("splice-h@1"))
+@pytest.mark.parametrize("spec_str", ["splice-z", "splice-h@1"])
+def test_spec_reuse(spec_str, install_specs, mutable_config):
+    """Tests reuse of splice-z, without splicing, as a root and as a dependency of splice-h"""
+    splice_z = install_specs("splice-z@1.0.0+compat")[0]
+    mutable_config.set("packages", _make_specs_non_buildable(["splice-z"]))
+    concrete = spack.concretize.concretize_one(spec_str)
+    assert concrete["splice-z"].satisfies(splice_z)


-def test_splice_installed_hash(splicing_setup):
-    cache = [
-        "splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0",
-        "splice-h@1.0.2+compat ^splice-z@1.0.0",
-    ]
-    with CacheManager(cache):
-        packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
-        spack.config.set("packages", packages_config)
-        goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0")
-        with pytest.raises(Exception):
-            goal_spec.concretized()
-        _enable_splicing()
-        assert goal_spec.concretized().satisfies(goal_spec)
+@pytest.mark.regression("48578")
+def test_splice_installed_hash(install_specs, mutable_config):
+    """Tests splicing the dependency of an installed spec, for another installed spec"""
+    splice_t, splice_h = install_specs(
+        "splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0",
+        "splice-h@1.0.2+compat ^splice-z@1.0.0",
+    )
+    packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
+    mutable_config.set("packages", packages_config)
+
+    goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0"
+    with pytest.raises(SolverError):
+        spack.concretize.concretize_one(goal_spec)
+
+    _enable_splicing()
+    concrete = spack.concretize.concretize_one(goal_spec)
+
+    # splice-t has a dependency that is changing, thus its hash should be different
+    assert concrete.dag_hash() != splice_t.dag_hash()
+    assert concrete.build_spec.satisfies(splice_t)
+    assert not concrete.satisfies(splice_t)
+
+    # splice-h is reused, so the hash should stay the same
+    assert concrete["splice-h"].satisfies(splice_h)
+    assert concrete["splice-h"].build_spec.satisfies(splice_h)
+    assert concrete["splice-h"].dag_hash() == splice_h.dag_hash()
-def test_splice_build_splice_node(splicing_setup):
-    with CacheManager(["splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat"]):
-        spack.config.set("packages", _make_specs_non_buildable(["splice-t"]))
-        goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat")
-        with pytest.raises(Exception):
-            goal_spec.concretized()
-        _enable_splicing()
-        assert goal_spec.concretized().satisfies(goal_spec)
+def test_splice_build_splice_node(install_specs, mutable_config):
+    """Tests splicing the dependency of an installed spec, for a spec that is yet to be built"""
+    splice_t = install_specs("splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat")[0]
+    mutable_config.set("packages", _make_specs_non_buildable(["splice-t"]))
+
+    goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat"
+    with pytest.raises(SolverError):
+        spack.concretize.concretize_one(goal_spec)
+
+    _enable_splicing()
+    concrete = spack.concretize.concretize_one(goal_spec)
+
+    # splice-t has a dependency that is changing, thus its hash should be different
+    assert concrete.dag_hash() != splice_t.dag_hash()
+    assert concrete.build_spec.satisfies(splice_t)
+    assert not concrete.satisfies(splice_t)
+
+    # splice-h should be different
+    assert concrete["splice-h"].dag_hash() != splice_t["splice-h"].dag_hash()
+    assert concrete["splice-h"].build_spec.dag_hash() == concrete["splice-h"].dag_hash()
-def test_double_splice(splicing_setup):
-    cache = [
-        "splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat",
-        "splice-h@1.0.2+compat ^splice-z@1.0.1+compat",
-        "splice-z@1.0.2+compat",
-    ]
-    with CacheManager(cache):
-        freeze_builds_config = _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"])
-        spack.config.set("packages", freeze_builds_config)
-        goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat")
-        with pytest.raises(Exception):
-            goal_spec.concretized()
-        _enable_splicing()
-        assert goal_spec.concretized().satisfies(goal_spec)
+def test_double_splice(install_specs, mutable_config):
+    """Tests splicing two dependencies of an installed spec, for other installed specs"""
+    splice_t, splice_h, splice_z = install_specs(
+        "splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat",
+        "splice-h@1.0.2+compat ^splice-z@1.0.1+compat",
+        "splice-z@1.0.2+compat",
+    )
+    mutable_config.set("packages", _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"]))
+
+    goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat"
+    with pytest.raises(SolverError):
+        spack.concretize.concretize_one(goal_spec)
+
+    _enable_splicing()
+    concrete = spack.concretize.concretize_one(goal_spec)
+
+    # splice-t and splice-h have a dependency that is changing, thus its hash should be different
+    assert concrete.dag_hash() != splice_t.dag_hash()
+    assert concrete.build_spec.satisfies(splice_t)
+    assert not concrete.satisfies(splice_t)
+    assert concrete["splice-h"].dag_hash() != splice_h.dag_hash()
+    assert concrete["splice-h"].build_spec.satisfies(splice_h)
+    assert not concrete["splice-h"].satisfies(splice_h)
+
+    # splice-z is reused, so the hash should stay the same
+    assert concrete["splice-z"].dag_hash() == splice_z.dag_hash()
-# The next two tests are mirrors of one another
-def test_virtual_multi_splices_in(splicing_setup):
-    cache = [
-        "depends-on-virtual-with-abi ^virtual-abi-1",
-        "depends-on-virtual-with-abi ^virtual-abi-2",
-    ]
-    goal_specs = [
-        "depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
-        "depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
-    ]
-    with CacheManager(cache):
-        spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
-        for gs in goal_specs:
-            with pytest.raises(Exception):
-                Spec(gs).concretized()
-        _enable_splicing()
-        for gs in goal_specs:
-            assert Spec(gs).concretized().satisfies(gs)
-
-
-def test_virtual_multi_can_be_spliced(splicing_setup):
-    cache = [
-        "depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
-        "depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
-    ]
-    goal_specs = [
-        "depends-on-virtual-with-abi ^virtual-abi-1",
-        "depends-on-virtual-with-abi ^virtual-abi-2",
-    ]
-    with CacheManager(cache):
-        spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
-        with pytest.raises(Exception):
-            for gs in goal_specs:
-                Spec(gs).concretized()
-        _enable_splicing()
-        for gs in goal_specs:
-            assert Spec(gs).concretized().satisfies(gs)
-
-
-def test_manyvariant_star_matching_variant_splice(splicing_setup):
-    cache = [
-        # can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
-        "depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
-        "depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
-    ]
-    goal_specs = [
-        Spec("depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2"),
-        Spec("depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3"),
-    ]
-    with CacheManager(cache):
-        freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
-        spack.config.set("packages", freeze_build_config)
-        for goal in goal_specs:
-            with pytest.raises(Exception):
-                goal.concretized()
-        _enable_splicing()
-        for goal in goal_specs:
-            assert goal.concretized().satisfies(goal)
-
-
-def test_manyvariant_limited_matching(splicing_setup):
-    cache = [
-        # can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
-        "depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
-        # can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")
-        "depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
-    ]
-    goal_specs = [
-        Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2"),
-        Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3"),
-    ]
-    with CacheManager(cache):
-        freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
-        spack.config.set("packages", freeze_build_config)
-        for s in goal_specs:
-            with pytest.raises(Exception):
-                s.concretized()
-        _enable_splicing()
-        for s in goal_specs:
-            assert s.concretized().satisfies(s)
+@pytest.mark.parametrize(
+    "original_spec,goal_spec",
+    [
+        # `virtual-abi-1` can be spliced for `virtual-abi-multi abi=one` and vice-versa
+        (
+            "depends-on-virtual-with-abi ^virtual-abi-1",
+            "depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
+        ),
+        (
+            "depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
+            "depends-on-virtual-with-abi ^virtual-abi-1",
+        ),
+        # `virtual-abi-2` can be spliced for `virtual-abi-multi abi=two` and vice-versa
+        (
+            "depends-on-virtual-with-abi ^virtual-abi-2",
+            "depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
+        ),
+        (
+            "depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
+            "depends-on-virtual-with-abi ^virtual-abi-2",
+        ),
+    ],
+)
+def test_virtual_multi_splices_in(original_spec, goal_spec, install_specs, mutable_config):
+    """Tests that we can splice a virtual dependency with a different, but compatible, provider."""
+    original = install_specs(original_spec)[0]
+    mutable_config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
+
+    with pytest.raises(SolverError):
+        spack.concretize.concretize_one(goal_spec)
+
+    _enable_splicing()
+    spliced = spack.concretize.concretize_one(goal_spec)
+
+    assert spliced.dag_hash() != original.dag_hash()
+    assert spliced.build_spec.dag_hash() == original.dag_hash()
+    assert spliced["virtual-with-abi"].name != spliced.build_spec["virtual-with-abi"].name
+
+
+@pytest.mark.parametrize(
+    "original_spec,goal_spec",
+    [
+        # can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
+        (
+            "depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
+            "depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2",
+        ),
+        (
+            "depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
+            "depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3",
+        ),
+        # can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
+        (
+            "depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
+            "depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2",
+        ),
+        # can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")
+        (
+            "depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
+            "depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3",
+        ),
+    ],
+)
+def test_manyvariant_matching_variant_splice(
+    original_spec, goal_spec, install_specs, mutable_config
+):
+    """Tests splicing with different kind of matching on variants"""
+    original = install_specs(original_spec)[0]
+    mutable_config.set("packages", {"depends-on-manyvariants": {"buildable": False}})
+
+    with pytest.raises(SolverError):
+        spack.concretize.concretize_one(goal_spec)
+
+    _enable_splicing()
+    spliced = spack.concretize.concretize_one(goal_spec)
+
+    assert spliced.dag_hash() != original.dag_hash()
+    assert spliced.build_spec.dag_hash() == original.dag_hash()
+    # The spliced 'manyvariants' is yet to be built
+    assert spliced["manyvariants"].dag_hash() != original["manyvariants"].dag_hash()
+    assert spliced["manyvariants"].build_spec.dag_hash() == spliced["manyvariants"].dag_hash()
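The refactor above replaces hand-rolled loops over `goal_specs` with `@pytest.mark.parametrize`, so each (original, goal) pair is collected as its own test case and one failing pair cannot mask the others. A minimal generic sketch of the mechanism (illustrative example, not the Spack tests):

```python
import pytest


@pytest.mark.parametrize(
    "original,goal",
    [
        ("virtual-abi-1", "virtual-abi-multi abi=one"),
        ("virtual-abi-2", "virtual-abi-multi abi=two"),
    ],
)
def test_pair(original, goal):
    # Each tuple becomes an independent test case at collection time.
    assert original != goal


# The decorator only attaches metadata; the function stays directly callable.
test_pair("a", "b")
mark = test_pair.pytestmark[0]
assert mark.name == "parametrize"
assert len(mark.args[1]) == 2
```

Parametrization also gives each case a readable id in the test report, which is why the comments documenting each `can_splice` rule moved next to their tuples.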
-def test_external_splice_same_name(splicing_setup):
-    cache = [
-        "splice-h@1.0.0 ^splice-z@1.0.0+compat",
-        "splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.1+compat",
-    ]
-    packages_yaml = {
-        "splice-z": {"externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}]}
-    }
-    goal_specs = [
-        Spec("splice-h@1.0.0 ^splice-z@1.0.2"),
-        Spec("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"),
-    ]
-    with CacheManager(cache):
-        spack.config.set("packages", packages_yaml)
-        _enable_splicing()
-        for s in goal_specs:
-            assert s.concretized().satisfies(s)
+def test_external_splice_same_name(install_specs, mutable_config):
+    """Tests that externals can be spliced for non-external specs"""
+    original_splice_h, original_splice_t = install_specs(
+        "splice-h@1.0.0 ^splice-z@1.0.0+compat",
+        "splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.1+compat",
+    )
+    mutable_config.set("packages", _make_specs_non_buildable(["splice-t", "splice-h"]))
+    mutable_config.set(
+        "packages",
+        {
+            "splice-z": {
+                "externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}],
+                "buildable": False,
+            }
+        },
+    )
+
+    _enable_splicing()
+    concrete_splice_h = spack.concretize.concretize_one("splice-h@1.0.0 ^splice-z@1.0.2")
+    concrete_splice_t = spack.concretize.concretize_one(
+        "splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"
+    )
+
+    assert concrete_splice_h.dag_hash() != original_splice_h.dag_hash()
+    assert concrete_splice_h.build_spec.dag_hash() == original_splice_h.dag_hash()
+    assert concrete_splice_h["splice-z"].external
+
+    assert concrete_splice_t.dag_hash() != original_splice_t.dag_hash()
+    assert concrete_splice_t.build_spec.dag_hash() == original_splice_t.dag_hash()
+    assert concrete_splice_t["splice-z"].external
+
+    assert concrete_splice_t["splice-z"].dag_hash() == concrete_splice_h["splice-z"].dag_hash()
-def test_spliced_build_deps_only_in_build_spec(splicing_setup):
-    cache = ["splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0"]
-    goal_spec = Spec("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")
-
-    with CacheManager(cache):
-        _enable_splicing()
-        concr_goal = goal_spec.concretized()
-
-        build_spec = concr_goal._build_spec
-        # Spec has been spliced
-        assert build_spec is not None
-        # Build spec has spliced build dependencies
-        assert _has_build_dependency(build_spec, "splice-h")
-        assert _has_build_dependency(build_spec, "splice-z")
-        # Spliced build dependencies are removed
-        assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
+def test_spliced_build_deps_only_in_build_spec(install_specs):
+    """Tests that build specs are not reported in the spliced spec"""
+    install_specs("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0")
+    _enable_splicing()
+
+    spliced = spack.concretize.concretize_one("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")
+    build_spec = spliced.build_spec
+
+    # Spec has been spliced
+    assert build_spec.dag_hash() != spliced.dag_hash()
+    # Build spec has spliced build dependencies
+    assert build_spec.dependencies("splice-h", dt.BUILD)
+    assert build_spec.dependencies("splice-z", dt.BUILD)
+    # Spliced build dependencies are removed
+    assert len(spliced.dependencies(None, dt.BUILD)) == 0
-def test_spliced_transitive_dependency(splicing_setup):
-    cache = ["splice-depends-on-t@1.0 ^splice-h@1.0.1"]
-    goal_spec = Spec("splice-depends-on-t^splice-h@1.0.2")
-
-    with CacheManager(cache):
-        spack.config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
-        _enable_splicing()
-        concr_goal = goal_spec.concretized()
-        # Spec has been spliced
-        assert concr_goal._build_spec is not None
-        assert concr_goal["splice-t"]._build_spec is not None
-        assert concr_goal.satisfies(goal_spec)
-        # Spliced build dependencies are removed
-        assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
+def test_spliced_transitive_dependency(install_specs, mutable_config):
+    """Tests that build specs are not reported, even for spliced transitive dependencies"""
+    install_specs("splice-depends-on-t@1.0 ^splice-h@1.0.1")
+    mutable_config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
+
+    _enable_splicing()
+    spliced = spack.concretize.concretize_one("splice-depends-on-t^splice-h@1.0.2")
+
+    # Spec has been spliced
+    assert spliced.build_spec.dag_hash() != spliced.dag_hash()
+    assert spliced["splice-t"].build_spec.dag_hash() != spliced["splice-t"].dag_hash()
+
+    # Spliced build dependencies are removed
+    assert len(spliced.dependencies(None, dt.BUILD)) == 0
+    assert len(spliced["splice-t"].dependencies(None, dt.BUILD)) == 0
View File
@@ -133,5 +133,5 @@ def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
         f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
     )
     with spack.concretize.disable_compiler_existence_check():
-        spec.concretize()
+        spec = spack.concretize.concretize_one(spec)
     assert spec.target == spec["pkg-b"].target == result
View File
@@ -28,6 +28,7 @@
 import spack.binary_distribution as bindist
 import spack.caches
 import spack.compilers
+import spack.concretize
 import spack.config
 import spack.fetch_strategy
 import spack.hooks.sbang as sbang
@@ -36,13 +37,13 @@
 import spack.oci.image
 import spack.paths
 import spack.spec
-import spack.stage
 import spack.store
 import spack.util.gpg
 import spack.util.spack_yaml as syaml
 import spack.util.url as url_util
 import spack.util.web as web_util
 from spack.binary_distribution import CannotListKeys, GenerateIndexError
+from spack.installer import PackageInstaller
 from spack.paths import test_path
 from spack.spec import Spec
@@ -205,8 +206,9 @@ def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
     Test the creation and installation of buildcaches with default rpaths
     into the default directory layout scheme.
     """
-    gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
-    sy_spec = Spec("symly").concretized()
+    gspec = spack.concretize.concretize_one("garply")
+    cspec = spack.concretize.concretize_one("corge")
+    sy_spec = spack.concretize.concretize_one("symly")

     # Install 'corge' without using a cache
     install_cmd("--no-cache", cspec.name)
@@ -253,9 +255,9 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
     Test the creation and installation of buildcaches with default rpaths
     into the non-default directory layout scheme.
     """
-    cspec = Spec("corge").concretized()
+    cspec = spack.concretize.concretize_one("corge")
     # This guy tests for symlink relocation
-    sy_spec = Spec("symly").concretized()
+    sy_spec = spack.concretize.concretize_one("symly")

     # Install some packages with dependent packages
     # test install in non-default install path scheme
@@ -276,7 +278,8 @@ def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
     Test the creation and installation of buildcaches with relative
     rpaths into the default directory layout scheme.
     """
-    gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
+    gspec = spack.concretize.concretize_one("garply")
+    cspec = spack.concretize.concretize_one("corge")

     # Install buildcache created with relativized rpaths
     buildcache_cmd("install", "-uf", cspec.name)
@@ -305,7 +308,7 @@ def test_relative_rpaths_install_nondefault(temporary_mirror_dir):
     Test the installation of buildcaches with relativized rpaths
     into the non-default directory layout scheme.
     """
-    cspec = Spec("corge").concretized()
+    cspec = spack.concretize.concretize_one("corge")

     # Test install in non-default install path scheme and relative path
     buildcache_cmd("install", "-uf", cspec.name)
@@ -358,7 +361,8 @@ def test_built_spec_cache(temporary_mirror_dir):
     that cache from a buildcache index."""
     buildcache_cmd("list", "-a", "-l")

-    gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
+    gspec = spack.concretize.concretize_one("garply")
+    cspec = spack.concretize.concretize_one("corge")

     for s in [gspec, cspec]:
         results = bindist.get_mirrors_for_spec(s)
@@ -381,7 +385,7 @@ def test_spec_needs_rebuild(monkeypatch, tmpdir):
     mirror_dir = tmpdir.join("mirror_dir")
     mirror_url = url_util.path_to_file_url(mirror_dir.strpath)

-    s = Spec("libdwarf").concretized()
+    s = spack.concretize.concretize_one("libdwarf")

     # Install a package
     install_cmd(s.name)
@@ -410,7 +414,7 @@ def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
     mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
     spack.config.set("mirrors", {"test": mirror_url})

-    s = Spec("libdwarf").concretized()
+    s = spack.concretize.concretize_one("libdwarf")

     # Install a package
     install_cmd("--no-cache", s.name)
@@ -492,74 +496,40 @@ def mock_list_url(url, recursive=False):
     assert f"Encountered problem listing packages at {url}" in capfd.readouterr().err
-@pytest.mark.usefixtures("mock_fetch", "install_mockery")
-def test_update_sbang(tmpdir, temporary_mirror):
-    """Test the creation and installation of buildcaches with default rpaths
-    into the non-default directory layout scheme, triggering an update of the
-    sbang.
-    """
-    spec_str = "old-sbang"
-    # Concretize a package with some old-fashioned sbang lines.
-    old_spec = Spec(spec_str).concretized()
-    old_spec_hash_str = "/{0}".format(old_spec.dag_hash())
-    # Need a fake mirror with *function* scope.
-    mirror_dir = temporary_mirror
-    # Assume all commands will concretize old_spec the same way.
-    install_cmd("--no-cache", old_spec.name)
+def test_update_sbang(tmp_path, temporary_mirror, mock_fetch, install_mockery):
+    """Test relocation of the sbang shebang line in a package script"""
+    s = spack.concretize.concretize_one("old-sbang")
+    PackageInstaller([s.package]).install()
+    old_prefix, old_sbang_shebang = s.prefix, sbang.sbang_shebang_line()
+    old_contents = f"""\
+{old_sbang_shebang}
+#!/usr/bin/env python3
+{s.prefix.bin}
+"""
+    with open(os.path.join(s.prefix.bin, "script.sh"), encoding="utf-8") as f:
+        assert f.read() == old_contents
     # Create a buildcache with the installed spec.
-    buildcache_cmd("push", "-u", mirror_dir, old_spec_hash_str)
-    # Need to force an update of the buildcache index
-    buildcache_cmd("update-index", mirror_dir)
-    # Uninstall the original package.
-    uninstall_cmd("-y", old_spec_hash_str)
+    buildcache_cmd("push", "--update-index", "--unsigned", temporary_mirror, f"/{s.dag_hash()}")
     # Switch the store to the new install tree locations
-    newtree_dir = tmpdir.join("newtree")
-    with spack.store.use_store(str(newtree_dir)):
-        new_spec = Spec("old-sbang").concretized()
-        assert new_spec.dag_hash() == old_spec.dag_hash()
-        # Install package from buildcache
-        buildcache_cmd("install", "-u", "-f", new_spec.name)
-        # Continue blowing away caches
-        bindist.clear_spec_cache()
-        spack.stage.purge()
-        # test that the sbang was updated by the move
-        sbang_style_1_expected = """{0}
-#!/usr/bin/env python
-{1}
-""".format(
-            sbang.sbang_shebang_line(), new_spec.prefix.bin
-        )
-        sbang_style_2_expected = """{0}
-#!/usr/bin/env python
-{1}
-""".format(
-            sbang.sbang_shebang_line(), new_spec.prefix.bin
-        )
-        installed_script_style_1_path = new_spec.prefix.bin.join("sbang-style-1.sh")
-        assert (
-            sbang_style_1_expected
-            == open(str(installed_script_style_1_path), encoding="utf-8").read()
-        )
-        installed_script_style_2_path = new_spec.prefix.bin.join("sbang-style-2.sh")
-        assert (
-            sbang_style_2_expected
-            == open(str(installed_script_style_2_path), encoding="utf-8").read()
-        )
-        uninstall_cmd("-y", "/%s" % new_spec.dag_hash())
+    with spack.store.use_store(str(tmp_path)):
+        s._prefix = None  # clear the cached old prefix
+        new_prefix, new_sbang_shebang = s.prefix, sbang.sbang_shebang_line()
+        assert old_prefix != new_prefix
+        assert old_sbang_shebang != new_sbang_shebang
+        PackageInstaller([s.package], cache_only=True, unsigned=True).install()
+        # Check that the sbang line refers to the new install tree
+        new_contents = f"""\
+{sbang.sbang_shebang_line()}
+#!/usr/bin/env python3
+{s.prefix.bin}
+"""
+        with open(os.path.join(s.prefix.bin, "script.sh"), encoding="utf-8") as f:
+            assert f.read() == new_contents
 @pytest.mark.skipif(
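The rewritten `test_update_sbang` verifies that the sbang shebang line of an installed script is rewritten when the install tree moves. The core mechanic — rewriting a leading `#!<prefix>/...` line when the prefix changes — can be sketched independently of Spack. The helper below is hypothetical; Spack's real relocation logic lives elsewhere (in its relocation and sbang hook code) and handles many more cases.

```python
import os
import tempfile


def update_shebang(script_path, old_prefix, new_prefix):
    """Rewrite a leading '#!<old_prefix>/...' shebang to point at new_prefix.

    Hypothetical helper for illustration only.
    """
    with open(script_path, encoding="utf-8") as f:
        lines = f.read().splitlines(keepends=True)
    if lines and lines[0].startswith(f"#!{old_prefix}"):
        # Keep everything after the old prefix, swap in the new one.
        lines[0] = "#!" + new_prefix + lines[0][len("#!") + len(old_prefix):]
    with open(script_path, "w", encoding="utf-8") as f:
        f.writelines(lines)


with tempfile.TemporaryDirectory() as tmp:
    script = os.path.join(tmp, "script.sh")
    with open(script, "w", encoding="utf-8") as f:
        f.write("#!/old/tree/bin/sbang\necho hello\n")
    update_shebang(script, "/old/tree", "/new/tree")
    with open(script, encoding="utf-8") as f:
        assert f.read().splitlines()[0] == "#!/new/tree/bin/sbang"
```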


@@ -220,14 +220,12 @@ def test_source_is_disabled(mutable_config):
     # The source is not explicitly enabled or disabled, so the following
     # call should raise to skip using it for bootstrapping
-    with pytest.raises(ValueError):
-        spack.bootstrap.core.source_is_enabled_or_raise(conf)
+    assert not spack.bootstrap.core.source_is_enabled(conf)
     # Try to explicitly disable the source and verify that the behavior
     # is the same as above
     spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
-    with pytest.raises(ValueError):
-        spack.bootstrap.core.source_is_enabled_or_raise(conf)
+    assert not spack.bootstrap.core.source_is_enabled(conf)
 @pytest.mark.regression("45247")
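This hunk swaps a raising API (`source_is_enabled_or_raise`) for a boolean predicate (`source_is_enabled`), so callers branch or assert instead of catching `ValueError`. A minimal sketch of that refactor, with a made-up config shape (Spack reads the real values from its `bootstrap:trusted` config section):

```python
def source_is_enabled(conf, trusted):
    """Boolean predicate: is this bootstrap source explicitly trusted?

    `conf` and `trusted` shapes are invented for illustration.
    """
    return bool(trusted.get(conf["name"], False))


conf = {"name": "github-actions"}
assert not source_is_enabled(conf, trusted={})                        # unspecified -> disabled
assert not source_is_enabled(conf, trusted={"github-actions": False})  # explicitly disabled
assert source_is_enabled(conf, trusted={"github-actions": True})       # explicitly enabled
```

Returning a bool rather than raising keeps control flow out of exceptions, which also lets the bootstrap code report failures of *enabled* sources without showing disabled sources as errors, as the commit message above describes.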


@@ -8,15 +8,15 @@
 import pytest
 import spack.binary_distribution as bd
+import spack.concretize
 import spack.mirrors.mirror
-import spack.spec
 from spack.installer import PackageInstaller
 pytestmark = pytest.mark.not_on_windows("does not run on windows")
 def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmp_path):
-    spec = spack.spec.Spec("trivial-install-test-package").concretized()
+    spec = spack.concretize.concretize_one("trivial-install-test-package")
     PackageInstaller([spec.package], fake=True).install()
     specs = [spec]


@@ -16,6 +16,7 @@
 import spack.build_environment
 import spack.compiler
 import spack.compilers
+import spack.concretize
 import spack.config
 import spack.deptypes as dt
 import spack.package_base
@@ -163,8 +164,7 @@ def test_static_to_shared_library(build_environment):
 @pytest.mark.regression("8345")
 @pytest.mark.usefixtures("config", "mock_packages")
 def test_cc_not_changed_by_modules(monkeypatch, working_env):
-    s = spack.spec.Spec("cmake")
-    s.concretize()
+    s = spack.concretize.concretize_one("cmake")
     pkg = s.package
     def _set_wrong_cc(x):
@@ -184,7 +184,7 @@ def test_setup_dependent_package_inherited_modules(
     working_env, mock_packages, install_mockery, mock_fetch
 ):
     # This will raise on regression
-    s = spack.spec.Spec("cmake-client-inheritor").concretized()
+    s = spack.concretize.concretize_one("cmake-client-inheritor")
     PackageInstaller([s.package]).install()
@@ -277,7 +277,7 @@ def platform_pathsep(pathlist):
         return convert_to_platform_path(pathlist)
     # Monkeypatch a pkg.compiler.environment with the required modifications
-    pkg = spack.spec.Spec("cmake").concretized().package
+    pkg = spack.concretize.concretize_one("cmake").package
     monkeypatch.setattr(pkg.compiler, "environment", modifications)
     # Trigger the modifications
     spack.build_environment.setup_package(pkg, False)
@@ -301,7 +301,7 @@ def custom_env(pkg, env):
         env.prepend_path("PATH", test_path)
         env.append_flags("ENV_CUSTOM_CC_FLAGS", "--custom-env-flag1")
-    pkg = spack.spec.Spec("cmake").concretized().package
+    pkg = spack.concretize.concretize_one("cmake").package
     monkeypatch.setattr(pkg.compiler, "setup_custom_environment", custom_env)
     spack.build_environment.setup_package(pkg, False)
@@ -322,7 +322,7 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
     }
     spack.config.set("packages:cmake", cmake_config)
-    cmake_client = spack.spec.Spec("cmake-client").concretized()
+    cmake_client = spack.concretize.concretize_one("cmake-client")
     spack.build_environment.setup_package(cmake_client.package, False)
     assert os.environ["TEST_ENV_VAR_SET"] == "yes it's set"
@@ -330,8 +330,7 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
 @pytest.mark.regression("9107")
 def test_spack_paths_before_module_paths(config, mock_packages, monkeypatch, working_env):
-    s = spack.spec.Spec("cmake")
-    s.concretize()
+    s = spack.concretize.concretize_one("cmake")
     pkg = s.package
     module_path = os.path.join("path", "to", "module")
@@ -352,8 +351,7 @@ def _set_wrong_cc(x):
 def test_package_inheritance_module_setup(config, mock_packages, working_env):
-    s = spack.spec.Spec("multimodule-inheritance")
-    s.concretize()
+    s = spack.concretize.concretize_one("multimodule-inheritance")
     pkg = s.package
     spack.build_environment.setup_package(pkg, False)
@@ -387,8 +385,7 @@ def test_wrapper_variables(
         not in cuda_include_dirs
     )
-    root = spack.spec.Spec("dt-diamond")
-    root.concretize()
+    root = spack.concretize.concretize_one("dt-diamond")
     for s in root.traverse():
         s.prefix = "/{0}-prefix/".format(s.name)
@@ -453,7 +450,7 @@ def test_external_prefixes_last(mutable_config, mock_packages, working_env, monk
     """
     )
     spack.config.set("packages", cfg_data)
-    top = spack.spec.Spec("dt-diamond").concretized()
+    top = spack.concretize.concretize_one("dt-diamond")
     def _trust_me_its_a_dir(path):
         return True
@@ -500,8 +497,7 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
 )
 def test_setting_dtags_based_on_config(config_setting, expected_flag, config, mock_packages):
     # Pick a random package to be able to set compiler's variables
-    s = spack.spec.Spec("cmake")
-    s.concretize()
+    s = spack.concretize.concretize_one("cmake")
     pkg = s.package
     env = EnvironmentModifications()
@@ -533,7 +529,7 @@ def setup_dependent_package(module, dependent_spec):
         assert dependent_module.ninja is not None
         dependent_spec.package.test_attr = True
-    externaltool = spack.spec.Spec("externaltest").concretized()
+    externaltool = spack.concretize.concretize_one("externaltest")
     monkeypatch.setattr(
         externaltool["externaltool"].package, "setup_dependent_package", setup_dependent_package
     )
@@ -728,7 +724,7 @@ def test_build_system_globals_only_set_on_root_during_build(default_mock_concret
     But obviously it can lead to very hard to find bugs... We should get rid of those globals and
     define them instead as a property on the package instance.
     """
-    root = spack.spec.Spec("mpileaks").concretized()
+    root = spack.concretize.concretize_one("mpileaks")
     build_variables = ("std_cmake_args", "std_meson_args", "std_pip_args")
     # See todo above, we clear out any properties that may have been set by the previous test.


@@ -15,12 +15,13 @@
 import spack.build_systems.autotools
 import spack.build_systems.cmake
 import spack.builder
+import spack.concretize
 import spack.environment
 import spack.error
 import spack.paths
 import spack.platforms
 import spack.platforms.test
-from spack.build_environment import ChildError, setup_package
+from spack.build_environment import ChildError, MakeExecutable, setup_package
 from spack.installer import PackageInstaller
 from spack.spec import Spec
 from spack.util.executable import which
@@ -29,10 +30,12 @@
 @pytest.fixture()
-def concretize_and_setup(default_mock_concretization):
+def concretize_and_setup(default_mock_concretization, monkeypatch):
     def _func(spec_str):
         s = default_mock_concretization(spec_str)
         setup_package(s.package, False)
+        monkeypatch.setattr(s.package.module, "make", MakeExecutable("make", jobs=1))
+        monkeypatch.setattr(s.package.module, "ninja", MakeExecutable("ninja", jobs=1))
         return s
     return _func
@@ -144,7 +147,7 @@ def test_none_is_allowed(self, default_mock_concretization):
     def test_libtool_archive_files_are_deleted_by_default(self, mutable_database):
         # Install a package that creates a mock libtool archive
-        s = Spec("libtool-deletion").concretized()
+        s = spack.concretize.concretize_one("libtool-deletion")
         PackageInstaller([s.package], explicit=True).install()
         # Assert the libtool archive is not there and we have
@@ -159,7 +162,7 @@ def test_libtool_archive_files_might_be_installed_on_demand(
     ):
         # Install a package that creates a mock libtool archive,
         # patch its package to preserve the installation
-        s = Spec("libtool-deletion").concretized()
+        s = spack.concretize.concretize_one("libtool-deletion")
         monkeypatch.setattr(
             type(spack.builder.create(s.package)), "install_libtool_archives", True
         )
@@ -173,7 +176,9 @@ def test_autotools_gnuconfig_replacement(self, mutable_database):
        Tests whether only broken config.sub and config.guess are replaced with
        files from working alternatives from the gnuconfig package.
        """
-        s = Spec("autotools-config-replacement +patch_config_files +gnuconfig").concretized()
+        s = spack.concretize.concretize_one(
+            Spec("autotools-config-replacement +patch_config_files +gnuconfig")
+        )
         PackageInstaller([s.package]).install()
         with open(os.path.join(s.prefix.broken, "config.sub"), encoding="utf-8") as f:
@@ -192,7 +197,9 @@ def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):
        """
        Tests whether disabling patch_config_files
        """
-        s = Spec("autotools-config-replacement ~patch_config_files +gnuconfig").concretized()
+        s = spack.concretize.concretize_one(
+            Spec("autotools-config-replacement ~patch_config_files +gnuconfig")
+        )
         PackageInstaller([s.package]).install()
         with open(os.path.join(s.prefix.broken, "config.sub"), encoding="utf-8") as f:
@@ -217,8 +224,9 @@ def test_autotools_gnuconfig_replacement_no_gnuconfig(self, mutable_database, mo
        enabled, but gnuconfig is not listed as a direct build dependency.
        """
         monkeypatch.setattr(spack.platforms.test.Test, "default", "x86_64")
-        s = Spec("autotools-config-replacement +patch_config_files ~gnuconfig")
-        s.concretize()
+        s = spack.concretize.concretize_one(
+            Spec("autotools-config-replacement +patch_config_files ~gnuconfig")
+        )
         msg = "Cannot patch config files: missing dependencies: gnuconfig"
         with pytest.raises(ChildError, match=msg):
@@ -298,7 +306,7 @@ def test_define(self, default_mock_concretization):
         assert define("SINGLE", "red") == "-DSINGLE:STRING=red"
     def test_define_from_variant(self):
-        s = Spec("cmake-client multi=up,right ~truthy single=red").concretized()
+        s = spack.concretize.concretize_one("cmake-client multi=up,right ~truthy single=red")
         arg = s.package.define_from_variant("MULTI")
         assert arg == "-DMULTI:STRING=right;up"
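The updated `concretize_and_setup` fixture above monkeypatches `make` and `ninja` onto the package module so builds run with a fixed job count, relying on pytest's automatic restore at teardown. The patch-and-restore mechanism behind `monkeypatch.setattr` can be sketched with a tiny hand-rolled version (all names here are hypothetical, not Spack or pytest internals):

```python
class SimpleMonkeyPatch:
    """Minimal stand-in for pytest's monkeypatch fixture (illustration only)."""

    def __init__(self):
        self._saved = []

    def setattr(self, target, name, value):
        # Remember the original value before overwriting it.
        self._saved.append((target, name, getattr(target, name)))
        setattr(target, name, value)

    def undo(self):
        # Restore in reverse order, like pytest does at test teardown.
        for target, name, old in reversed(self._saved):
            setattr(target, name, old)
        self._saved.clear()


class FakePackageModule:
    make = "system make"


mp = SimpleMonkeyPatch()
mp.setattr(FakePackageModule, "make", "make -j1")  # like MakeExecutable("make", jobs=1)
assert FakePackageModule.make == "make -j1"
mp.undo()
assert FakePackageModule.make == "system make"
```

Doing the patching inside the fixture keeps every test that calls `concretize_and_setup` isolated: the stubbed tools never leak into other tests.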


@@ -8,9 +8,9 @@
 from llnl.util.filesystem import touch
 import spack.builder
+import spack.concretize
 import spack.paths
 import spack.repo
-import spack.spec
 @pytest.fixture()
@@ -78,7 +78,7 @@ def builder_test_repository():
 @pytest.mark.disable_clean_stage_check
 def test_callbacks_and_installation_procedure(spec_str, expected_values, working_env):
     """Test the correct execution of callbacks and installation procedures for packages."""
-    s = spack.spec.Spec(spec_str).concretized()
+    s = spack.concretize.concretize_one(spec_str)
     builder = spack.builder.create(s.package)
     for phase_fn in builder:
         phase_fn.execute()
@@ -101,7 +101,7 @@ def test_callbacks_and_installation_procedure(spec_str, expected_values, working
     ],
 )
 def test_old_style_compatibility_with_super(spec_str, method_name, expected):
-    s = spack.spec.Spec(spec_str).concretized()
+    s = spack.concretize.concretize_one(spec_str)
     builder = spack.builder.create(s.package)
     value = getattr(builder, method_name)()
     assert value == expected
@@ -112,7 +112,7 @@ def test_old_style_compatibility_with_super(spec_str, method_name, expected):
 @pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
 @pytest.mark.disable_clean_stage_check
 def test_build_time_tests_are_executed_from_default_builder():
-    s = spack.spec.Spec("old-style-autotools").concretized()
+    s = spack.concretize.concretize_one("old-style-autotools")
     builder = spack.builder.create(s.package)
     builder.pkg.run_tests = True
     for phase_fn in builder:
@@ -126,7 +126,7 @@ def test_build_time_tests_are_executed_from_default_builder():
 @pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
 def test_monkey_patching_wrapped_pkg():
     """Confirm 'run_tests' is accessible through wrappers."""
-    s = spack.spec.Spec("old-style-autotools").concretized()
+    s = spack.concretize.concretize_one("old-style-autotools")
     builder = spack.builder.create(s.package)
     assert s.package.run_tests is False
     assert builder.pkg.run_tests is False
@@ -141,7 +141,7 @@ def test_monkey_patching_wrapped_pkg():
 @pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
 def test_monkey_patching_test_log_file():
     """Confirm 'test_log_file' is accessible through wrappers."""
-    s = spack.spec.Spec("old-style-autotools").concretized()
+    s = spack.concretize.concretize_one("old-style-autotools")
     builder = spack.builder.create(s.package)
     s.package.tester.test_log_file = "/some/file"
@@ -154,7 +154,7 @@ def test_monkey_patching_test_log_file():
 @pytest.mark.not_on_windows("Does not run on windows")
 def test_install_time_test_callback(tmpdir, config, mock_packages, mock_stage):
     """Confirm able to run stand-alone test as a post-install callback."""
-    s = spack.spec.Spec("py-test-callback").concretized()
+    s = spack.concretize.concretize_one("py-test-callback")
     builder = spack.builder.create(s.package)
     builder.pkg.run_tests = True
     s.package.tester.test_log_file = tmpdir.join("install_test.log")
@@ -174,7 +174,7 @@ def test_mixins_with_builders(working_env):
     """Tests that run_after and run_before callbacks are accumulated correctly,
     when mixins are used with builders.
     """
-    s = spack.spec.Spec("builder-and-mixins").concretized()
+    s = spack.concretize.concretize_one("builder-and-mixins")
     builder = spack.builder.create(s.package)
     # Check that callbacks added by the mixin are in the list


@@ -4,6 +4,7 @@
 import pytest
+import spack.concretize
 import spack.deptypes as dt
 import spack.installer as inst
 import spack.repo
@@ -21,8 +22,7 @@ def test_build_request_errors(install_mockery):
 def test_build_request_basics(install_mockery):
-    spec = spack.spec.Spec("dependent-install")
-    spec.concretize()
+    spec = spack.concretize.concretize_one("dependent-install")
     assert spec.concrete
     # Ensure key properties match expectations
@@ -39,8 +39,7 @@ def test_build_request_basics(install_mockery):
 def test_build_request_strings(install_mockery):
     """Tests of BuildRequest repr and str for coverage purposes."""
     # Using a package with one dependency
-    spec = spack.spec.Spec("dependent-install")
-    spec.concretize()
+    spec = spack.concretize.concretize_one("dependent-install")
     assert spec.concrete
     # Ensure key properties match expectations
@@ -72,7 +71,7 @@ def test_build_request_deptypes(
     package_deptypes,
     dependencies_deptypes,
 ):
-    s = spack.spec.Spec("dependent-install").concretized()
+    s = spack.concretize.concretize_one("dependent-install")
     build_request = inst.BuildRequest(
         s.package,


@@ -4,6 +4,7 @@
 import pytest
+import spack.concretize
 import spack.error
 import spack.installer as inst
 import spack.repo
@@ -24,7 +25,7 @@ def test_build_task_errors(install_mockery):
         inst.BuildTask(pkg_cls(spec), None)
     # Using a concretized package now means the request argument is checked.
-    spec.concretize()
+    spec = spack.concretize.concretize_one(spec)
     assert spec.concrete
     with pytest.raises(TypeError, match="is not a valid build request"):
@@ -47,8 +48,7 @@ def test_build_task_errors(install_mockery):
 def test_build_task_basics(install_mockery):
-    spec = spack.spec.Spec("dependent-install")
-    spec.concretize()
+    spec = spack.concretize.concretize_one("dependent-install")
     assert spec.concrete
     # Ensure key properties match expectations
@@ -69,8 +69,7 @@ def test_build_task_basics(install_mockery):
 def test_build_task_strings(install_mockery):
     """Tests of build_task repr and str for coverage purposes."""
     # Using a package with one dependency
-    spec = spack.spec.Spec("dependent-install")
-    spec.concretize()
+    spec = spack.concretize.concretize_one("dependent-install")
     assert spec.concrete
     # Ensure key properties match expectations


@@ -9,13 +9,12 @@
 import llnl.util.filesystem as fs
 import spack.ci as ci
+import spack.concretize
 import spack.environment as ev
 import spack.error
 import spack.paths as spack_paths
 import spack.repo as repo
-import spack.spec
 import spack.util.git
-from spack.spec import Spec
 pytestmark = [pytest.mark.usefixtures("mock_packages")]
@@ -54,7 +53,7 @@ def test_pipeline_dag(config, tmpdir):
     builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)])
     with repo.use_repositories(builder.root):
-        spec_a = Spec("pkg-a").concretized()
+        spec_a = spack.concretize.concretize_one("pkg-a")
         key_a = ci.common.PipelineDag.key(spec_a)
         key_b = ci.common.PipelineDag.key(spec_a["pkg-b"])
@@ -449,7 +448,7 @@ def test_ci_run_standalone_tests_not_installed_junit(
     log_file = tmp_path / "junit.xml"
     args = {
         "log_file": str(log_file),
-        "job_spec": spack.spec.Spec("printing-package").concretized(),
+        "job_spec": spack.concretize.concretize_one("printing-package"),
         "repro_dir": str(repro_dir),
         "fail_fast": True,
     }
@@ -468,7 +467,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
     log_file = tmp_path / "junit.xml"
     args = {
         "log_file": str(log_file),
-        "job_spec": spack.spec.Spec("printing-package").concretized(),
+        "job_spec": spack.concretize.concretize_one("printing-package"),
         "repro_dir": str(repro_dir),
     }
@@ -501,7 +500,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
 def test_ci_skipped_report(tmpdir, mock_packages, config):
     """Test explicit skipping of report as well as CI's 'package' arg."""
     pkg = "trivial-smoke-test"
-    spec = spack.spec.Spec(pkg).concretized()
+    spec = spack.concretize.concretize_one(pkg)
     ci_cdash = {
         "url": "file://fake",
         "build-group": "fake-group",


@@ -10,6 +10,7 @@
 import spack.bootstrap
 import spack.bootstrap.core
+import spack.concretize
 import spack.config
 import spack.environment as ev
 import spack.main
@@ -183,7 +184,7 @@ def test_bootstrap_mirror_metadata(mutable_config, linux_os, monkeypatch, tmpdir
     """
     old_create = spack.mirrors.utils.create
     monkeypatch.setattr(spack.mirrors.utils, "create", lambda p, s: old_create(p, []))
-    monkeypatch.setattr(spack.spec.Spec, "concretized", lambda p: p)
+    monkeypatch.setattr(spack.concretize, "concretize_one", lambda p: spack.spec.Spec(p))
     # Create the mirror in a temporary folder
     compilers = [


@@ -12,6 +12,7 @@
 import spack.binary_distribution
 import spack.cmd.buildcache
+import spack.concretize
 import spack.environment as ev
 import spack.error
 import spack.main
@@ -19,7 +20,6 @@
 import spack.spec
 import spack.util.url
 from spack.installer import PackageInstaller
-from spack.spec import Spec
 buildcache = spack.main.SpackCommand("buildcache")
 install = spack.main.SpackCommand("install")
@@ -81,7 +81,7 @@ def tests_buildcache_create(install_mockery, mock_fetch, monkeypatch, tmpdir):
     buildcache("push", "--unsigned", str(tmpdir), pkg)
-    spec = Spec(pkg).concretized()
+    spec = spack.concretize.concretize_one(pkg)
     tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
     tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
     assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -101,7 +101,7 @@ def tests_buildcache_create_env(
     buildcache("push", "--unsigned", str(tmpdir))
-    spec = Spec(pkg).concretized()
+    spec = spack.concretize.concretize_one(pkg)
     tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
     tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
     assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -145,7 +145,7 @@ def test_update_key_index(
     gpg("create", "Test Signing Key", "nobody@nowhere.com")
-    s = Spec("libdwarf").concretized()
+    s = spack.concretize.concretize_one("libdwarf")
     # Install a package
     install(s.name)
@@ -175,7 +175,7 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
     mirror("add", "--unsigned", "mirror", mirror_dir.as_uri())
     mirror("add", "--autopush", "--unsigned", "mirror-autopush", mirror_autopush_dir.as_uri())
-    s = Spec("libdwarf").concretized()
+    s = spack.concretize.concretize_one("libdwarf")
     # Install and generate build cache index
     PackageInstaller([s.package], explicit=True).install()
@@ -219,7 +219,7 @@ def verify_mirror_contents():
             assert False
     # Install a package and put it in the buildcache
-    s = Spec(out_env_pkg).concretized()
+    s = spack.concretize.concretize_one(out_env_pkg)
     install(s.name)
     buildcache("push", "-u", "-f", src_mirror_url, s.name)
@@ -329,7 +329,7 @@ def test_buildcache_create_install(
     buildcache("push", "--unsigned", str(tmpdir), pkg)
-    spec = Spec(pkg).concretized()
+    spec = spack.concretize.concretize_one(pkg)
     tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
     tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
     assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -450,7 +450,7 @@ def test_push_and_install_with_mirror_marked_unsigned_does_not_require_extra_fla
 def test_skip_no_redistribute(mock_packages, config):
-    specs = list(Spec("no-redistribute-dependent").concretized().traverse())
+    specs = list(spack.concretize.concretize_one("no-redistribute-dependent").traverse())
     filtered = spack.cmd.buildcache._skip_no_redistribute_for_public(specs)
assert not any(s.name == "no-redistribute" for s in filtered) assert not any(s.name == "no-redistribute" for s in filtered)
assert any(s.name == "no-redistribute-dependent" for s in filtered) assert any(s.name == "no-redistribute-dependent" for s in filtered)
@@ -490,7 +490,7 @@ def test_push_without_build_deps(tmp_path, temporary_store, mock_packages, mutab
mirror("add", "--unsigned", "my-mirror", str(tmp_path)) mirror("add", "--unsigned", "my-mirror", str(tmp_path))
s = spack.spec.Spec("dtrun3").concretized() s = spack.concretize.concretize_one("dtrun3")
PackageInstaller([s.package], explicit=True, fake=True).install() PackageInstaller([s.package], explicit=True, fake=True).install()
s["dtbuild3"].package.do_uninstall() s["dtbuild3"].package.do_uninstall()

View File
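The hunks above (and throughout this compare view) apply one mechanical rewrite: the `Spec(...).concretized()` instance method becomes a call to the module-level `spack.concretize.concretize_one(...)`, which also accepts a plain spec string. A rough, self-contained sketch of that refactoring pattern (toy classes only, not Spack's actual implementation):

```python
from typing import Union


class Spec:
    """Toy stand-in for a package spec; only stores a name and a flag."""

    def __init__(self, name: str):
        self.name = name
        self.concrete = False


def concretize_one(spec: Union[Spec, str]) -> Spec:
    """Module-level entry point: accepts a Spec or a plain string and
    returns a new, concretized Spec (the input is left untouched)."""
    if isinstance(spec, str):
        spec = Spec(spec)
    result = Spec(spec.name)
    result.concrete = True
    return result


# Old style (instance method):  spec = Spec("libdwarf").concretized()
# New style (free function, string accepted directly):
spec = concretize_one("libdwarf")
assert spec.concrete
```

Moving the operation out of the class keeps `Spec` a passive data holder and puts the solver-facing entry point in one module, which is what the renamed calls in these test files rely on.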

@@ -7,10 +7,10 @@
 import pytest
 import spack.cmd.checksum
+import spack.concretize
 import spack.error
 import spack.package_base
 import spack.repo
-import spack.spec
 import spack.stage
 import spack.util.web
 from spack.main import SpackCommand
@@ -308,7 +308,7 @@ def test_checksum_url(mock_packages, config):
 def test_checksum_verification_fails(default_mock_concretization, capsys, can_fetch_versions):
-spec = spack.spec.Spec("zlib").concretized()
+spec = spack.concretize.concretize_one("zlib")
 pkg = spec.package
 versions = list(pkg.versions.keys())
 version_hashes = {versions[0]: "abadhash", Version("0.1"): "123456789"}

View File

@@ -18,6 +18,7 @@
 import spack.ci as ci
 import spack.cmd
 import spack.cmd.ci
+import spack.concretize
 import spack.environment as ev
 import spack.hash_types as ht
 import spack.main
@@ -1056,7 +1057,7 @@ def test_ci_rebuild_index(
 with working_dir(tmp_path):
 env_cmd("create", "test", "./spack.yaml")
 with ev.read("test"):
-concrete_spec = Spec("callpath").concretized()
+concrete_spec = spack.concretize.concretize_one("callpath")
 with open(tmp_path / "spec.json", "w", encoding="utf-8") as f:
 f.write(concrete_spec.to_json(hash=ht.dag_hash))
@@ -1177,12 +1178,10 @@ def test_ci_generate_read_broken_specs_url(
 ci_base_environment,
 ):
 """Verify that `broken-specs-url` works as intended"""
-spec_a = Spec("pkg-a")
-spec_a.concretize()
+spec_a = spack.concretize.concretize_one("pkg-a")
 a_dag_hash = spec_a.dag_hash()
-spec_flattendeps = Spec("flatten-deps")
-spec_flattendeps.concretize()
+spec_flattendeps = spack.concretize.concretize_one("flatten-deps")
 flattendeps_dag_hash = spec_flattendeps.dag_hash()
 broken_specs_url = tmp_path.as_uri()
@@ -1533,8 +1532,7 @@ def dynamic_mapping_setup(tmpdir):
 """
 )
-spec_a = Spec("pkg-a")
-spec_a.concretize()
+spec_a = spack.concretize.concretize_one("pkg-a")
 return gitlab_generator.get_job_name(spec_a)

View File

@@ -10,10 +10,8 @@
 import spack.caches
 import spack.cmd.clean
-import spack.environment as ev
 import spack.main
 import spack.package_base
-import spack.spec
 import spack.stage
 import spack.store
@@ -69,20 +67,6 @@ def test_function_calls(command_line, effects, mock_calls_for_clean):
 assert mock_calls_for_clean[name] == (1 if name in effects else 0)
-def test_env_aware_clean(mock_stage, install_mockery, mutable_mock_env_path, monkeypatch):
-e = ev.create("test", with_view=False)
-e.add("mpileaks")
-e.concretize()
-def fail(*args, **kwargs):
-raise Exception("This should not have been called")
-monkeypatch.setattr(spack.spec.Spec, "concretize", fail)
-with e:
-clean("mpileaks")
 def test_remove_python_cache(tmpdir, monkeypatch):
 cache_files = ["file1.pyo", "file2.pyc"]
 source_file = "file1.py"

View File

@@ -8,12 +8,12 @@
 import llnl.util.filesystem as fs
+import spack.concretize
 import spack.config
 import spack.database
 import spack.environment as ev
 import spack.main
 import spack.schema.config
-import spack.spec
 import spack.store
 import spack.util.spack_yaml as syaml
@@ -593,8 +593,7 @@ def test_config_prefer_upstream(
 prepared_db = spack.database.Database(mock_db_root, layout=gen_mock_layout("/a/"))
 for spec in ["hdf5 +mpi", "hdf5 ~mpi", "boost+debug~icu+graph", "dependency-install", "patch"]:
-dep = spack.spec.Spec(spec)
-dep.concretize()
+dep = spack.concretize.concretize_one(spec)
 prepared_db.add(dep)
 downstream_db_root = str(tmpdir_factory.mktemp("mock_downstream_db_root"))

View File

@@ -4,6 +4,7 @@
 import pytest
+import spack.concretize
 import spack.spec
 import spack.store
 from spack.enums import InstallRecordStatus
@@ -66,8 +67,8 @@ def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery
 install("libdwarf@20130729 ^libelf@0.8.13")
 install("libdwarf@20130207 ^libelf@0.8.10")
-new_spec = spack.spec.Spec("libdwarf@20130729^libelf@0.8.13").concretized()
-old_spec = spack.spec.Spec("libdwarf@20130207^libelf@0.8.10").concretized()
+new_spec = spack.concretize.concretize_one("libdwarf@20130729^libelf@0.8.13")
+old_spec = spack.concretize.concretize_one("libdwarf@20130207^libelf@0.8.10")
 all_installed = spack.store.STORE.db.query()
@@ -107,12 +108,12 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
 install("libelf@0.8.12")
 install("libelf@0.8.10")
-deprecated_spec = spack.spec.Spec("libelf@0.8.10").concretized()
+deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
 deprecate("-y", "libelf@0.8.10", "libelf@0.8.12")
 deprecator = spack.store.STORE.db.deprecator(deprecated_spec)
-assert deprecator == spack.spec.Spec("libelf@0.8.12").concretized()
+assert deprecator == spack.concretize.concretize_one("libelf@0.8.12")
 deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
@@ -122,7 +123,7 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
 assert len(all_available) == 3
 deprecator = spack.store.STORE.db.deprecator(deprecated_spec)
-assert deprecator == spack.spec.Spec("libelf@0.8.13").concretized()
+assert deprecator == spack.concretize.concretize_one("libelf@0.8.13")
 def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_mockery):
@@ -132,9 +133,9 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
 install("libelf@0.8.12")
 install("libelf@0.8.10")
-first_deprecated_spec = spack.spec.Spec("libelf@0.8.10").concretized()
-second_deprecated_spec = spack.spec.Spec("libelf@0.8.12").concretized()
-final_deprecator = spack.spec.Spec("libelf@0.8.13").concretized()
+first_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
+second_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.12")
+final_deprecator = spack.concretize.concretize_one("libelf@0.8.13")
 deprecate("-y", "libelf@0.8.10", "libelf@0.8.12")
@@ -164,7 +165,7 @@ def test_concretize_deprecated(mock_packages, mock_archive, mock_fetch, install_
 spec = spack.spec.Spec("libelf@0.8.10")
 with pytest.raises(spack.spec.SpecDeprecatedError):
-spec.concretize()
+spack.concretize.concretize_one(spec)
 @pytest.mark.usefixtures("mock_packages", "mock_archive", "mock_fetch", "install_mockery")

View File

@@ -8,6 +8,7 @@
 import llnl.util.filesystem as fs
+import spack.concretize
 import spack.environment as ev
 import spack.error
 import spack.repo
@@ -23,7 +24,9 @@
 def test_dev_build_basics(tmpdir, install_mockery):
-spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
+)
 assert "dev_path" in spec.variants
@@ -41,7 +44,9 @@ def test_dev_build_basics(tmpdir, install_mockery):
 def test_dev_build_before(tmpdir, install_mockery):
-spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
+)
 with tmpdir.as_cwd():
 with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -57,7 +62,9 @@ def test_dev_build_before(tmpdir, install_mockery):
 def test_dev_build_until(tmpdir, install_mockery):
-spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
+)
 with tmpdir.as_cwd():
 with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -75,7 +82,9 @@ def test_dev_build_until(tmpdir, install_mockery):
 def test_dev_build_until_last_phase(tmpdir, install_mockery):
 # Test that we ignore the last_phase argument if it is already last
-spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
+)
 with tmpdir.as_cwd():
 with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -93,7 +102,9 @@ def test_dev_build_until_last_phase(tmpdir, install_mockery):
 def test_dev_build_before_until(tmpdir, install_mockery):
-spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
+)
 with tmpdir.as_cwd():
 with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -129,8 +140,9 @@ def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery,
 def test_dev_build_fails_already_installed(tmpdir, install_mockery):
-spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
-spec.concretize()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
+)
 with tmpdir.as_cwd():
 with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -168,12 +180,25 @@ def test_dev_build_fails_no_version(mock_packages):
 assert "dev-build spec must have a single, concrete version" in output
+def test_dev_build_can_parse_path_with_at_symbol(tmpdir, install_mockery):
+special_char_dir = tmpdir.mkdir("tmp@place")
+spec = spack.spec.Spec(f'dev-build-test-install@0.0.0 dev_path="{special_char_dir}"')
+spec.concretize()
+with special_char_dir.as_cwd():
+with open(spec.package.filename, "w", encoding="utf-8") as f:
+f.write(spec.package.original_string)
+dev_build("dev-build-test-install@0.0.0")
+assert spec.package.filename in os.listdir(spec.prefix)
 def test_dev_build_env(tmpdir, install_mockery, mutable_mock_env_path):
 """Test Spack does dev builds for packages in develop section of env."""
 # setup dev-build-test-install package for dev build
 build_dir = tmpdir.mkdir("build")
-spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
-spec.concretize()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
+)
 with build_dir.as_cwd():
 with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -208,8 +233,9 @@ def test_dev_build_env_with_vars(tmpdir, install_mockery, mutable_mock_env_path,
 """Test Spack does dev builds for packages in develop section of env (path with variables)."""
 # setup dev-build-test-install package for dev build
 build_dir = tmpdir.mkdir("build")
-spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
-spec.concretize()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
+)
 # store the build path in an environment variable that will be used in the environment
 monkeypatch.setenv("CUSTOM_BUILD_PATH", build_dir)
@@ -246,8 +272,9 @@ def test_dev_build_env_version_mismatch(tmpdir, install_mockery, mutable_mock_en
 """Test Spack constraints concretization by develop specs."""
 # setup dev-build-test-install package for dev build
 build_dir = tmpdir.mkdir("build")
-spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
-spec.concretize()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
+)
 with build_dir.as_cwd():
 with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -327,8 +354,8 @@ def test_dev_build_multiple(tmpdir, install_mockery, mutable_mock_env_path, mock
 with ev.read("test"):
 # Do concretization inside environment for dev info
 # These specs are the source of truth to compare against the installs
-leaf_spec.concretize()
-root_spec.concretize()
+leaf_spec = spack.concretize.concretize_one(leaf_spec)
+root_spec = spack.concretize.concretize_one(root_spec)
 # Do install
 install()
@@ -374,8 +401,8 @@ def test_dev_build_env_dependency(tmpdir, install_mockery, mock_fetch, mutable_m
 # concretize in the environment to get the dev build info
 # equivalent to setting dev_build and dev_path variants
 # on all specs above
-spec.concretize()
-dep_spec.concretize()
+spec = spack.concretize.concretize_one(spec)
+dep_spec = spack.concretize.concretize_one(dep_spec)
 install()
 # Ensure that both specs installed properly
@@ -399,8 +426,9 @@ def test_dev_build_rebuild_on_source_changes(
 """
 # setup dev-build-test-install package for dev build
 build_dir = tmpdir.mkdir("build")
-spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
-spec.concretize()
+spec = spack.concretize.concretize_one(
+    spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
+)
 def reset_string():
 with build_dir.as_cwd():

View File

@@ -8,6 +8,7 @@
 import llnl.util.filesystem as fs
+import spack.concretize
 import spack.config
 import spack.environment as ev
 import spack.package_base
@@ -138,7 +139,8 @@ def check_path(stage, dest):
 self.check_develop(e, spack.spec.Spec("mpich@=1.0"), path)
 # Check modifications actually worked
-assert spack.spec.Spec("mpich@1.0").concretized().satisfies("dev_path=%s" % abspath)
+result = spack.concretize.concretize_one("mpich@1.0")
+assert result.satisfies("dev_path=%s" % abspath)
 def test_develop_canonicalize_path_no_args(self, monkeypatch):
 env("create", "test")
@@ -165,7 +167,8 @@ def check_path(stage, dest):
 self.check_develop(e, spack.spec.Spec("mpich@=1.0"), path)
 # Check modifications actually worked
-assert spack.spec.Spec("mpich@1.0").concretized().satisfies("dev_path=%s" % abspath)
+result = spack.concretize.concretize_one("mpich@1.0")
+assert result.satisfies("dev_path=%s" % abspath)
 def _git_commit_list(git_repo_dir):
@@ -190,7 +193,7 @@ def test_develop_full_git_repo(
 spack.package_base.PackageBase, "git", "file://%s" % repo_path, raising=False
 )
-spec = spack.spec.Spec("git-test-commit@1.2").concretized()
+spec = spack.concretize.concretize_one("git-test-commit@1.2")
 try:
 spec.package.do_stage()
 commits = _git_commit_list(spec.package.stage[0].source_path)
@@ -213,3 +216,22 @@ def test_develop_full_git_repo(
 develop_dir = spec.variants["dev_path"].value
 commits = _git_commit_list(develop_dir)
 assert len(commits) > 1
+def test_concretize_dev_path_with_at_symbol_in_env(mutable_mock_env_path, tmpdir, mock_packages):
+spec_like = "develop-test@develop"
+develop_dir = tmpdir.mkdir("build@location")
+env("create", "test_at_sym")
+with ev.read("test_at_sym") as e:
+add(spec_like)
+develop(f"--path={develop_dir}", spec_like)
+e.concretize()
+result = e.concrete_roots()
+assert len(result) == 1
+cspec = result[0]
+assert cspec.satisfies(spec_like), cspec
+assert cspec.is_develop, cspec
+assert develop_dir in cspec.variants["dev_path"], cspec

View File
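Both files above gain regression tests for paths containing `@` (`tmp@place`, `build@location`): in a spec string an unquoted `@` reads as a version separator, which is why `dev_path` values must be quoted. The ambiguity can be illustrated with a toy tokenizer (hypothetical grammar for illustration only, not Spack's real parser):

```python
import re


def parse_spec(text: str) -> dict:
    """Toy spec parser for `name@version key=value ...`. A double-quoted
    value keeps '@' literal; a bare value is truncated at the first '@'
    because this toy grammar treats '@' as a version separator."""
    m = re.match(r"(?P<name>[\w-]+)(?:@(?P<version>[\w.]+))?", text)
    spec = {"name": m.group("name"), "version": m.group("version"), "variants": {}}
    for key, quoted, bare in re.findall(r'(\w+)=(?:"([^"]*)"|(\S+))', text):
        spec["variants"][key] = quoted if quoted else bare.split("@", 1)[0]
    return spec


# Unquoted: the path is mangled at the '@' (the bug class being tested).
broken = parse_spec("dev-build-test-install@0.0.0 dev_path=/tmp@place")
# Quoted: the full path survives.
ok = parse_spec('dev-build-test-install@0.0.0 dev_path="/tmp@place"')
```

Under this sketch, `broken["variants"]["dev_path"]` loses everything after the `@`, while the quoted form round-trips the path intact, mirroring what the new tests assert.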

@@ -5,9 +5,9 @@
 import pytest
 import spack.cmd.diff
+import spack.concretize
 import spack.main
 import spack.repo
-import spack.spec
 import spack.util.spack_json as sjson
 from spack.test.conftest import create_test_repo
@@ -133,8 +133,8 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
 def test_diff_ignore(test_repo):
-specA = spack.spec.Spec("p1+usev1").concretized()
-specB = spack.spec.Spec("p1~usev1").concretized()
+specA = spack.concretize.concretize_one("p1+usev1")
+specB = spack.concretize.concretize_one("p1~usev1")
 c1 = spack.cmd.diff.compare_specs(specA, specB, to_string=False)
@@ -154,8 +154,8 @@ def find(function_list, name, args):
 # Check ignoring changes on multiple packages
-specA = spack.spec.Spec("p1+usev1 ^p3+p3var").concretized()
-specA = spack.spec.Spec("p1~usev1 ^p3~p3var").concretized()
+specA = spack.concretize.concretize_one("p1+usev1 ^p3+p3var")
+specA = spack.concretize.concretize_one("p1~usev1 ^p3~p3var")
 c3 = spack.cmd.diff.compare_specs(specA, specB, to_string=False)
 assert find(c3["a_not_b"], "variant_value", ["p3", "p3var"])
@@ -168,8 +168,8 @@ def find(function_list, name, args):
 def test_diff_cmd(install_mockery, mock_fetch, mock_archive, mock_packages):
 """Test that we can install two packages and diff them"""
-specA = spack.spec.Spec("mpileaks").concretized()
-specB = spack.spec.Spec("mpileaks+debug").concretized()
+specA = spack.concretize.concretize_one("mpileaks")
+specB = spack.concretize.concretize_one("mpileaks+debug")
 # Specs should be the same as themselves
 c = spack.cmd.diff.compare_specs(specA, specA, to_string=True)

View File

@@ -19,6 +19,7 @@
from llnl.util.symlink import readlink from llnl.util.symlink import readlink
import spack.cmd.env import spack.cmd.env
import spack.concretize
import spack.config import spack.config
import spack.environment as ev import spack.environment as ev
import spack.environment.depfile as depfile import spack.environment.depfile as depfile
@@ -957,7 +958,7 @@ def test_lockfile_spliced_specs(environment_from_manifest, install_mockery):
"""Test that an environment can round-trip a spliced spec.""" """Test that an environment can round-trip a spliced spec."""
# Create a local install for zmpi to splice in # Create a local install for zmpi to splice in
# Default concretization is not using zmpi # Default concretization is not using zmpi
zmpi = spack.spec.Spec("zmpi").concretized() zmpi = spack.concretize.concretize_one("zmpi")
PackageInstaller([zmpi.package], fake=True).install() PackageInstaller([zmpi.package], fake=True).install()
e1 = environment_from_manifest( e1 = environment_from_manifest(
@@ -1320,39 +1321,43 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
with e: with e:
# List of requirements, flip a variant # List of requirements, flip a variant
config("change", "packages:mpich:require:~debug") config("change", "packages:mpich:require:~debug")
test_spec = spack.spec.Spec("mpich").concretized() test_spec = spack.concretize.concretize_one("mpich")
assert test_spec.satisfies("@3.0.2~debug") assert test_spec.satisfies("@3.0.2~debug")
# List of requirements, change the version (in a different scope) # List of requirements, change the version (in a different scope)
config("change", "packages:mpich:require:@3.0.3") config("change", "packages:mpich:require:@3.0.3")
test_spec = spack.spec.Spec("mpich").concretized() test_spec = spack.concretize.concretize_one("mpich")
assert test_spec.satisfies("@3.0.3") assert test_spec.satisfies("@3.0.3")
# "require:" as a single string, also try specifying # "require:" as a single string, also try specifying
# a spec string that requires enclosing in quotes as # a spec string that requires enclosing in quotes as
# part of the config path # part of the config path
config("change", 'packages:libelf:require:"@0.8.12:"') config("change", 'packages:libelf:require:"@0.8.12:"')
spack.spec.Spec("libelf@0.8.12").concretized() spack.concretize.concretize_one("libelf@0.8.12")
# No need for assert, if there wasn't a failure, we # No need for assert, if there wasn't a failure, we
# changed the requirement successfully. # changed the requirement successfully.
# Use change to add a requirement for a package that # Use change to add a requirement for a package that
# has no requirements defined # has no requirements defined
config("change", "packages:fftw:require:+mpi") config("change", "packages:fftw:require:+mpi")
test_spec = spack.spec.Spec("fftw").concretized() test_spec = spack.concretize.concretize_one("fftw")
assert test_spec.satisfies("+mpi") assert test_spec.satisfies("+mpi")
config("change", "packages:fftw:require:~mpi") config("change", "packages:fftw:require:~mpi")
test_spec = spack.spec.Spec("fftw").concretized() test_spec = spack.concretize.concretize_one("fftw")
assert test_spec.satisfies("~mpi") assert test_spec.satisfies("~mpi")
config("change", "packages:fftw:require:@1.0") config("change", "packages:fftw:require:@1.0")
test_spec = spack.spec.Spec("fftw").concretized() test_spec = spack.concretize.concretize_one("fftw")
assert test_spec.satisfies("@1.0~mpi") assert test_spec.satisfies("@1.0~mpi")
# Use "--match-spec" to change one spec in a "one_of" # Use "--match-spec" to change one spec in a "one_of"
# list # list
config("change", "packages:bowtie:require:@1.2.2", "--match-spec", "@1.2.0") config("change", "packages:bowtie:require:@1.2.2", "--match-spec", "@1.2.0")
spack.spec.Spec("bowtie@1.3.0").concretize() # confirm that we can concretize to either value
spack.spec.Spec("bowtie@1.2.2").concretized() spack.concretize.concretize_one("bowtie@1.3.0")
spack.concretize.concretize_one("bowtie@1.2.2")
# confirm that we cannot concretize to the old value
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.concretize.concretize_one("bowtie@1.2.0")
def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutable_config): def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutable_config):
@@ -1367,8 +1372,8 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
with ev.Environment(tmp_path): with ev.Environment(tmp_path):
config("change", "packages:mpich:require:~debug") config("change", "packages:mpich:require:~debug")
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError): with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.spec.Spec("mpich+debug").concretized() spack.concretize.concretize_one("mpich+debug")
-        spack.spec.Spec("mpich~debug").concretized()
+        spack.concretize.concretize_one("mpich~debug")
     # Now check that we raise an error if we need to add a require: constraint
     # when preexisting config manually specified it as a singular spec
@@ -1382,7 +1387,7 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
     """
     )
     with ev.Environment(tmp_path):
-        assert spack.spec.Spec("mpich").concretized().satisfies("@3.0.3")
+        assert spack.concretize.concretize_one("mpich").satisfies("@3.0.3")
         with pytest.raises(spack.error.ConfigError, match="not a list"):
             config("change", "packages:mpich:require:~debug")
@@ -1690,7 +1695,7 @@ def test_stage(mock_stage, mock_fetch, install_mockery):
     root = str(mock_stage)
     def check_stage(spec):
-        spec = Spec(spec).concretized()
+        spec = spack.concretize.concretize_one(spec)
         for dep in spec.traverse():
             stage_name = f"{stage_prefix}{dep.name}-{dep.version}-{dep.dag_hash()}"
             assert os.path.isdir(os.path.join(root, stage_name))
@@ -1791,7 +1796,7 @@ def test_indirect_build_dep(tmp_path):
    with spack.repo.use_repositories(builder.root):
        x_spec = Spec("x")
-       x_concretized = x_spec.concretized()
+       x_concretized = spack.concretize.concretize_one(x_spec)
        _env_create("test", with_view=False)
        e = ev.read("test")
@@ -1824,10 +1829,10 @@ def test_store_different_build_deps(tmp_path):
    with spack.repo.use_repositories(builder.root):
        y_spec = Spec("y ^z@3")
-       y_concretized = y_spec.concretized()
+       y_concretized = spack.concretize.concretize_one(y_spec)
        x_spec = Spec("x ^z@2")
-       x_concretized = x_spec.concretized()
+       x_concretized = spack.concretize.concretize_one(x_spec)
        # Even though x chose a different 'z', the y it chooses should be identical
        # *aside* from the dependency on 'z'. The dag_hash() will show the difference
@@ -2120,15 +2125,7 @@ def configure_reuse(reuse_mode, combined_env) -> Optional[ev.Environment]:
         "from_environment_raise",
     ],
 )
-def test_env_include_concrete_reuse(monkeypatch, reuse_mode):
+def test_env_include_concrete_reuse(do_not_check_runtimes_on_reuse, reuse_mode):
-    # The mock packages do not use the gcc-runtime
-    def mock_has_runtime_dependencies(*args, **kwargs):
-        return True
-    monkeypatch.setattr(
-        spack.solver.asp, "_has_runtime_dependencies", mock_has_runtime_dependencies
-    )
     # The default mpi version is 3.x provided by mpich in the mock repo.
     # This test verifies that concretizing with an included concrete
     # environment with "concretizer:reuse:true" the included
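
The refactor running through this change set replaces the two-step `Spec(...).concretize()` / `.concretized()` idiom with a single module-level `concretize_one(...)` call that also accepts plain strings. A minimal, self-contained sketch of that API shape (toy stand-ins, not Spack's actual classes):

```python
class Spec:
    """Toy stand-in for a package spec that can be 'concretized'."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.concrete = False

    def _concretize(self) -> None:
        # In the real tool this resolves versions, variants, and dependencies.
        self.concrete = True


def concretize_one(spec_like) -> "Spec":
    """Module-level helper: accepts a Spec or a plain string and returns a
    concrete Spec, so call sites collapse from two statements to one."""
    spec = spec_like if isinstance(spec_like, Spec) else Spec(spec_like)
    spec._concretize()
    return spec


# Old idiom: construct, then mutate in place (two steps at every call site).
old = Spec("mpich")
old._concretize()

# New idiom: one call, string input allowed.
new = concretize_one("mpich")
```

Besides shortening call sites, moving the entry point out of the `Spec` class keeps the spec object itself free of solver logic, which is consistent with the `import spack.concretize` lines added throughout these test files.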


@@ -5,16 +5,18 @@
 import pytest
+import spack.concretize
 from spack.installer import PackageInstaller
 from spack.main import SpackCommand, SpackCommandError
-from spack.spec import Spec
 extensions = SpackCommand("extensions")
 @pytest.fixture
 def python_database(mock_packages, mutable_database):
-    specs = [Spec(s).concretized() for s in ["python", "py-extension1", "py-extension2"]]
+    specs = [
+        spack.concretize.concretize_one(s) for s in ["python", "py-extension1", "py-extension2"]
+    ]
     PackageInstaller([s.package for s in specs], explicit=True, fake=True).install()
     yield
@@ -22,7 +24,7 @@ def python_database(mock_packages, mutable_database):
 @pytest.mark.not_on_windows("All Fetchers Failed")
 @pytest.mark.db
 def test_extensions(mock_packages, python_database, capsys):
-    ext2 = Spec("py-extension2").concretized()
+    ext2 = spack.concretize.concretize_one("py-extension2")
     def check_output(ni):
         with capsys.disabled():


@@ -12,13 +12,13 @@
 import spack.cmd as cmd
 import spack.cmd.find
+import spack.concretize
 import spack.environment as ev
 import spack.repo
 import spack.store
 import spack.user_environment as uenv
 from spack.enums import InstallRecordStatus
 from spack.main import SpackCommand
-from spack.spec import Spec
 from spack.test.conftest import create_test_repo
 from spack.test.utilities import SpackCommandArgs
 from spack.util.pattern import Bunch
@@ -201,7 +201,8 @@ def test_find_json_deps(database):
 @pytest.mark.db
 def test_display_json(database, capsys):
     specs = [
-        Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
+        spack.concretize.concretize_one(s)
+        for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
     ]
     cmd.display_specs_as_json(specs)
@@ -216,7 +217,8 @@ def test_display_json(database, capsys):
 @pytest.mark.db
 def test_display_json_deps(database, capsys):
     specs = [
-        Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
+        spack.concretize.concretize_one(s)
+        for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
     ]
     cmd.display_specs_as_json(specs, deps=True)
@@ -275,7 +277,7 @@ def test_find_format_deps(database, config):
 def test_find_format_deps_paths(database, config):
     output = find("-dp", "--format", "{name}-{version}", "mpileaks", "^zmpi")
-    spec = Spec("mpileaks ^zmpi").concretized()
+    spec = spack.concretize.concretize_one("mpileaks ^zmpi")
     prefixes = [s.prefix for s in spec.traverse()]
     assert (
@@ -300,7 +302,8 @@ def test_find_very_long(database, config):
     output = find("-L", "--no-groups", "mpileaks")
     specs = [
-        Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
+        spack.concretize.concretize_one(s)
+        for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
     ]
     assert set(output.strip().split("\n")) == set(


@@ -5,6 +5,7 @@
 import pytest
+import spack.concretize
 import spack.deptypes as dt
 import spack.environment as ev
 import spack.main
@@ -25,8 +26,7 @@ def test_gc_without_build_dependency(mutable_database):
 @pytest.mark.db
 def test_gc_with_build_dependency(mutable_database):
-    s = spack.spec.Spec("simple-inheritance")
-    s.concretize()
+    s = spack.concretize.concretize_one("simple-inheritance")
     PackageInstaller([s.package], explicit=True, fake=True).install()
     assert "There are no unused specs." in gc("-yb")
@@ -36,8 +36,8 @@ def test_gc_with_build_dependency(mutable_database):
 @pytest.mark.db
 def test_gc_with_constraints(mutable_database):
-    s_cmake1 = spack.spec.Spec("simple-inheritance ^cmake@3.4.3").concretized()
-    s_cmake2 = spack.spec.Spec("simple-inheritance ^cmake@3.23.1").concretized()
+    s_cmake1 = spack.concretize.concretize_one("simple-inheritance ^cmake@3.4.3")
+    s_cmake2 = spack.concretize.concretize_one("simple-inheritance ^cmake@3.23.1")
     PackageInstaller([s_cmake1.package], explicit=True, fake=True).install()
     PackageInstaller([s_cmake2.package], explicit=True, fake=True).install()
@@ -52,8 +52,7 @@ def test_gc_with_constraints(mutable_database):
 @pytest.mark.db
 def test_gc_with_environment(mutable_database, mutable_mock_env_path):
-    s = spack.spec.Spec("simple-inheritance")
-    s.concretize()
+    s = spack.concretize.concretize_one("simple-inheritance")
     PackageInstaller([s.package], explicit=True, fake=True).install()
     e = ev.create("test_gc")
@@ -68,8 +67,7 @@ def test_gc_with_environment(mutable_database, mutable_mock_env_path):
 @pytest.mark.db
 def test_gc_with_build_dependency_in_environment(mutable_database, mutable_mock_env_path):
-    s = spack.spec.Spec("simple-inheritance")
-    s.concretize()
+    s = spack.concretize.concretize_one("simple-inheritance")
     PackageInstaller([s.package], explicit=True, fake=True).install()
     e = ev.create("test_gc")
@@ -120,8 +118,7 @@ def test_gc_except_any_environments(mutable_database, mutable_mock_env_path):
 @pytest.mark.db
 def test_gc_except_specific_environments(mutable_database, mutable_mock_env_path):
-    s = spack.spec.Spec("simple-inheritance")
-    s.concretize()
+    s = spack.concretize.concretize_one("simple-inheritance")
     PackageInstaller([s.package], explicit=True, fake=True).install()
     assert mutable_database.query_local("zmpi")
@@ -147,8 +144,7 @@ def test_gc_except_nonexisting_dir_env(mutable_database, mutable_mock_env_path,
 @pytest.mark.db
 def test_gc_except_specific_dir_env(mutable_database, mutable_mock_env_path, tmpdir):
-    s = spack.spec.Spec("simple-inheritance")
-    s.concretize()
+    s = spack.concretize.concretize_one("simple-inheritance")
     PackageInstaller([s.package], explicit=True, fake=True).install()
     assert mutable_database.query_local("zmpi")


@@ -19,6 +19,7 @@
 import spack.build_environment
 import spack.cmd.common.arguments
 import spack.cmd.install
+import spack.concretize
 import spack.config
 import spack.environment as ev
 import spack.error
@@ -134,7 +135,7 @@ def test_package_output(tmpdir, capsys, install_mockery, mock_fetch):
     # we can't use output capture here because it interferes with Spack's
     # logging. TODO: see whether we can get multiple log_outputs to work
     # when nested AND in pytest
-    spec = Spec("printing-package").concretized()
+    spec = spack.concretize.concretize_one("printing-package")
     pkg = spec.package
     PackageInstaller([pkg], explicit=True, verbose=True).install()
@@ -174,7 +175,7 @@ def test_install_output_on_python_error(mock_packages, mock_archive, mock_fetch,
 def test_install_with_source(mock_packages, mock_archive, mock_fetch, install_mockery):
     """Verify that source has been copied into place."""
     install("--source", "--keep-stage", "trivial-install-test-package")
-    spec = Spec("trivial-install-test-package").concretized()
+    spec = spack.concretize.concretize_one("trivial-install-test-package")
     src = os.path.join(spec.prefix.share, "trivial-install-test-package", "src")
     assert filecmp.cmp(
         os.path.join(mock_archive.path, "configure"), os.path.join(src, "configure")
@@ -182,8 +183,7 @@ def test_install_with_source(mock_packages, mock_archive, mock_fetch, install_mo
 def test_install_env_variables(mock_packages, mock_archive, mock_fetch, install_mockery):
-    spec = Spec("libdwarf")
-    spec.concretize()
+    spec = spack.concretize.concretize_one("libdwarf")
     install("libdwarf")
     assert os.path.isfile(spec.package.install_env_path)
@@ -204,8 +204,7 @@ def test_show_log_on_error(mock_packages, mock_archive, mock_fetch, install_mock
 def test_install_overwrite(mock_packages, mock_archive, mock_fetch, install_mockery):
     # Try to install a spec and then to reinstall it.
-    spec = Spec("libdwarf")
-    spec.concretize()
+    spec = spack.concretize.concretize_one("libdwarf")
     install("libdwarf")
@@ -238,8 +237,7 @@ def test_install_overwrite(mock_packages, mock_archive, mock_fetch, install_mock
 def test_install_overwrite_not_installed(mock_packages, mock_archive, mock_fetch, install_mockery):
     # Try to install a spec and then to reinstall it.
-    spec = Spec("libdwarf")
-    spec.concretize()
+    spec = spack.concretize.concretize_one("libdwarf")
     assert not os.path.exists(spec.prefix)
@@ -260,7 +258,7 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
     monkeypatch.setattr(spack.package_base.PackageBase, "git", file_url, raising=False)
     # Use the earliest commit in the respository
-    spec = Spec(f"git-test-commit@{commits[-1]}").concretized()
+    spec = spack.concretize.concretize_one(f"git-test-commit@{commits[-1]}")
     PackageInstaller([spec.package], explicit=True).install()
     # Ensure first commit file contents were written
@@ -273,13 +271,11 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
 def test_install_overwrite_multiple(mock_packages, mock_archive, mock_fetch, install_mockery):
     # Try to install a spec and then to reinstall it.
-    libdwarf = Spec("libdwarf")
-    libdwarf.concretize()
+    libdwarf = spack.concretize.concretize_one("libdwarf")
     install("libdwarf")
-    cmake = Spec("cmake")
-    cmake.concretize()
+    cmake = spack.concretize.concretize_one("cmake")
     install("cmake")
@@ -355,7 +351,7 @@ def test_install_invalid_spec():
 )
 def test_install_from_file(spec, concretize, error_code, tmpdir):
     if concretize:
-        spec.concretize()
+        spec = spack.concretize.concretize_one(spec)
     specfile = tmpdir.join("spec.yaml")
@@ -485,8 +481,7 @@ def test_install_mix_cli_and_files(clispecs, filespecs, tmpdir):
     for spec in filespecs:
         filepath = tmpdir.join(spec + ".yaml")
         args = ["-f", str(filepath)] + args
-        s = Spec(spec)
-        s.concretize()
+        s = spack.concretize.concretize_one(spec)
         with filepath.open("w") as f:
             s.to_yaml(f)
@@ -495,8 +490,7 @@ def test_install_mix_cli_and_files(clispecs, filespecs, tmpdir):
 def test_extra_files_are_archived(mock_packages, mock_archive, mock_fetch, install_mockery):
-    s = Spec("archive-files")
-    s.concretize()
+    s = spack.concretize.concretize_one("archive-files")
     install("archive-files")
@@ -615,8 +609,7 @@ def test_cdash_install_from_spec_json(
     with capfd.disabled(), tmpdir.as_cwd():
         spec_json_path = str(tmpdir.join("spec.json"))
-        pkg_spec = Spec("pkg-a")
-        pkg_spec.concretize()
+        pkg_spec = spack.concretize.concretize_one("pkg-a")
         with open(spec_json_path, "w", encoding="utf-8") as fd:
             fd.write(pkg_spec.to_json(hash=ht.dag_hash))
@@ -692,8 +685,8 @@ def test_cache_only_fails(tmpdir, mock_fetch, install_mockery, capfd):
 def test_install_only_dependencies(tmpdir, mock_fetch, install_mockery):
-    dep = Spec("dependency-install").concretized()
-    root = Spec("dependent-install").concretized()
+    dep = spack.concretize.concretize_one("dependency-install")
+    root = spack.concretize.concretize_one("dependent-install")
     install("--only", "dependencies", "dependent-install")
@@ -714,8 +707,8 @@ def test_install_only_package(tmpdir, mock_fetch, install_mockery, capfd):
 def test_install_deps_then_package(tmpdir, mock_fetch, install_mockery):
-    dep = Spec("dependency-install").concretized()
-    root = Spec("dependent-install").concretized()
+    dep = spack.concretize.concretize_one("dependency-install")
+    root = spack.concretize.concretize_one("dependent-install")
     install("--only", "dependencies", "dependent-install")
     assert os.path.exists(dep.prefix)
@@ -733,8 +726,8 @@ def test_install_only_dependencies_in_env(
     env("create", "test")
     with ev.read("test"):
-        dep = Spec("dependency-install").concretized()
-        root = Spec("dependent-install").concretized()
+        dep = spack.concretize.concretize_one("dependency-install")
+        root = spack.concretize.concretize_one("dependent-install")
         install("-v", "--only", "dependencies", "--add", "dependent-install")
@@ -750,8 +743,8 @@ def test_install_only_dependencies_of_all_in_env(
     with ev.read("test"):
         roots = [
-            Spec("dependent-install@1.0").concretized(),
-            Spec("dependent-install@2.0").concretized(),
+            spack.concretize.concretize_one("dependent-install@1.0"),
+            spack.concretize.concretize_one("dependent-install@2.0"),
         ]
         add("dependent-install@1.0")
@@ -900,7 +893,7 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
     # Ensure that even on non-x86_64 architectures, there are no
     # dependencies installed
-    spec = Spec("configure-warning").concretized()
+    spec = spack.concretize.concretize_one("configure-warning")
     spec.clear_dependencies()
     specfile = "./spec.json"
     with open(specfile, "w", encoding="utf-8") as f:
@@ -946,7 +939,7 @@ def test_install_env_with_tests_all(
 ):
     env("create", "test")
     with ev.read("test"):
-        test_dep = Spec("test-dependency").concretized()
+        test_dep = spack.concretize.concretize_one("test-dependency")
         add("depb")
         install("--test", "all")
         assert os.path.exists(test_dep.prefix)
@@ -958,7 +951,7 @@ def test_install_env_with_tests_root(
 ):
     env("create", "test")
     with ev.read("test"):
-        test_dep = Spec("test-dependency").concretized()
+        test_dep = spack.concretize.concretize_one("test-dependency")
        add("depb")
        install("--test", "root")
        assert not os.path.exists(test_dep.prefix)


@@ -7,7 +7,7 @@
 import pytest
-import spack.spec
+import spack.concretize
 import spack.user_environment as uenv
 from spack.main import SpackCommand
@@ -49,7 +49,7 @@ def test_load_shell(shell, set_command):
    """Test that `spack load` applies prefix inspections of its required runtime deps in
    topo-order"""
    install("mpileaks")
-   mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
+   mpileaks_spec = spack.concretize.concretize_one("mpileaks")
    # Ensure our reference variable is clean.
    os.environ["CMAKE_PREFIX_PATH"] = "/hello" + os.pathsep + "/world"
@@ -166,7 +166,7 @@ def test_unload(
    """Tests that any variables set in the user environment are undone by the
    unload command"""
    install("mpileaks")
-   mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
+   mpileaks_spec = spack.concretize.concretize_one("mpileaks")
    # Set so unload has something to do
    os.environ["FOOBAR"] = "mpileaks"
@@ -187,7 +187,7 @@ def test_unload_fails_no_shell(
 ):
    """Test that spack unload prints an error message without a shell."""
    install("mpileaks")
-   mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
+   mpileaks_spec = spack.concretize.concretize_one("mpileaks")
    os.environ[uenv.spack_loaded_hashes_var] = mpileaks_spec.dag_hash()
    out = unload("mpileaks", fail_on_error=False)


@@ -8,9 +8,9 @@
 from llnl.util.filesystem import mkdirp
+import spack.concretize
 import spack.environment as ev
 import spack.paths
-import spack.spec
 import spack.stage
 from spack.main import SpackCommand, SpackCommandError
@@ -25,7 +25,7 @@
 @pytest.fixture
 def mock_spec():
     # Make it look like the source was actually expanded.
-    s = spack.spec.Spec("externaltest").concretized()
+    s = spack.concretize.concretize_one("externaltest")
     source_path = s.package.stage.source_path
     mkdirp(source_path)
     yield s, s.package


@@ -13,6 +13,7 @@
 import spack
 import spack.cmd.logs
+import spack.concretize
 import spack.main
 import spack.spec
 from spack.main import SpackCommand
@@ -53,7 +54,7 @@ def disable_capture(capfd):
 def test_logs_cmd_errors(install_mockery, mock_fetch, mock_archive, mock_packages):
-    spec = spack.spec.Spec("libelf").concretized()
+    spec = spack.concretize.concretize_one("libelf")
     assert not spec.installed
     with pytest.raises(spack.main.SpackCommandError, match="is not installed or staged"):
@@ -82,7 +83,7 @@ def test_dump_logs(install_mockery, mock_fetch, mock_archive, mock_packages, dis
     decompress them.
     """
     cmdline_spec = spack.spec.Spec("libelf")
-    concrete_spec = cmdline_spec.concretized()
+    concrete_spec = spack.concretize.concretize_one(cmdline_spec)
     # Sanity check, make sure this test is checking what we want: to
     # start with

@@ -7,6 +7,7 @@
 import pytest
 import spack.cmd.mirror
+import spack.concretize
 import spack.config
 import spack.environment as ev
 import spack.error
@@ -60,7 +61,7 @@ def test_mirror_from_env(tmp_path, mock_packages, mock_fetch, mutable_mock_env_p
 @pytest.fixture
 def source_for_pkg_with_hash(mock_packages, tmpdir):
-    s = spack.spec.Spec("trivial-pkg-with-valid-hash").concretized()
+    s = spack.concretize.concretize_one("trivial-pkg-with-valid-hash")
     local_url_basename = os.path.basename(s.package.url)
     local_path = os.path.join(str(tmpdir), local_url_basename)
     with open(local_path, "w", encoding="utf-8") as f:
@@ -72,7 +73,9 @@ def source_for_pkg_with_hash(mock_packages, tmpdir):
 def test_mirror_skip_unstable(tmpdir_factory, mock_packages, config, source_for_pkg_with_hash):
     mirror_dir = str(tmpdir_factory.mktemp("mirror-dir"))
-    specs = [spack.spec.Spec(x).concretized() for x in ["git-test", "trivial-pkg-with-valid-hash"]]
+    specs = [
+        spack.concretize.concretize_one(x) for x in ["git-test", "trivial-pkg-with-valid-hash"]
+    ]
     spack.mirrors.utils.create(mirror_dir, specs, skip_unstable_versions=True)
     assert set(os.listdir(mirror_dir)) - set(["_source-cache"]) == set(
@@ -111,7 +114,7 @@ def test_exclude_specs(mock_packages, config):
     mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
     expected_include = set(
-        spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
+        spack.concretize.concretize_one(x) for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
     )
     expected_exclude = set(spack.spec.Spec(x) for x in ["mpich@3.0.1", "mpich@3.0.2", "mpich@1.0"])
     assert expected_include <= set(mirror_specs)
@@ -145,7 +148,7 @@ def test_exclude_file(mock_packages, tmpdir, config):
     mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
     expected_include = set(
-        spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
+        spack.concretize.concretize_one(x) for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
     )
     expected_exclude = set(spack.spec.Spec(x) for x in ["mpich@3.0.1", "mpich@3.0.2", "mpich@1.0"])
     assert expected_include <= set(mirror_specs)


@@ -7,12 +7,12 @@
 import pytest
+import spack.concretize
 import spack.config
 import spack.main
 import spack.modules
 import spack.modules.lmod
 import spack.repo
-import spack.spec
 import spack.store
 from spack.installer import PackageInstaller
@@ -33,7 +33,7 @@ def ensure_module_files_are_there(mock_repo_path, mock_store, mock_configuration
 def _module_files(module_type, *specs):
-    specs = [spack.spec.Spec(x).concretized() for x in specs]
+    specs = [spack.concretize.concretize_one(x) for x in specs]
     writer_cls = spack.modules.module_types[module_type]
     return [writer_cls(spec, "default").layout.filename for spec in specs]
@@ -184,12 +184,15 @@ def test_setdefault_command(mutable_database, mutable_config):
     # Install two different versions of pkg-a
     other_spec, preferred = "pkg-a@1.0", "pkg-a@2.0"
-    specs = [spack.spec.Spec(other_spec).concretized(), spack.spec.Spec(preferred).concretized()]
+    specs = [
+        spack.concretize.concretize_one(other_spec),
+        spack.concretize.concretize_one(preferred),
+    ]
     PackageInstaller([s.package for s in specs], explicit=True, fake=True).install()
     writers = {
-        preferred: writer_cls(spack.spec.Spec(preferred).concretized(), "default"),
-        other_spec: writer_cls(spack.spec.Spec(other_spec).concretized(), "default"),
+        preferred: writer_cls(specs[1], "default"),
+        other_spec: writer_cls(specs[0], "default"),
     }
     # Create two module files for the same software

@@ -2,9 +2,9 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import spack.concretize
 import spack.main
 import spack.repo
-import spack.spec
 from spack.installer import PackageInstaller
 tags = spack.main.SpackCommand("tags")
@@ -47,7 +47,7 @@ class tag_path:
 def test_tags_installed(install_mockery, mock_fetch):
-    s = spack.spec.Spec("mpich").concretized()
+    s = spack.concretize.concretize_one("mpich")
     PackageInstaller([s.package], explicit=True, fake=True).install()
     out = tags("-i")


@@ -11,10 +11,10 @@
 import spack.cmd.common.arguments
 import spack.cmd.test
+import spack.concretize
 import spack.config
 import spack.install_test
 import spack.paths
-import spack.spec

 from spack.install_test import TestStatus
 from spack.main import SpackCommand
@@ -240,7 +240,7 @@ def test_read_old_results(mock_packages, mock_test_stage):
 def test_test_results_none(mock_packages, mock_test_stage):
     name = "trivial"
-    spec = spack.spec.Spec("trivial-smoke-test").concretized()
+    spec = spack.concretize.concretize_one("trivial-smoke-test")
     suite = spack.install_test.TestSuite([spec], name)
     suite.ensure_stage()
     spack.install_test.write_test_suite_file(suite)
@@ -255,7 +255,7 @@ def test_test_results_none(mock_packages, mock_test_stage):
 def test_test_results_status(mock_packages, mock_test_stage, status):
     """Confirm 'spack test results' returns expected status."""
     name = "trivial"
-    spec = spack.spec.Spec("trivial-smoke-test").concretized()
+    spec = spack.concretize.concretize_one("trivial-smoke-test")
     suite = spack.install_test.TestSuite([spec], name)
     suite.ensure_stage()
     spack.install_test.write_test_suite_file(suite)
@@ -278,7 +278,7 @@ def test_test_results_status(mock_packages, mock_test_stage, status):
 def test_report_filename_for_cdash(install_mockery, mock_fetch):
     """Test that the temporary file used to write Testing.xml for CDash is not the upload URL"""
     name = "trivial"
-    spec = spack.spec.Spec("trivial-smoke-test").concretized()
+    spec = spack.concretize.concretize_one("trivial-smoke-test")
     suite = spack.install_test.TestSuite([spec], name)
     suite.ensure_stage()
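Every hunk in these test files applies the same mechanical migration: the fluent call `spack.spec.Spec(s).concretized()` becomes the module-level `spack.concretize.concretize_one(s)`. A minimal stand-in sketch of that refactor shape (the `Spec`/`_solve` bodies here are illustrative toys, not Spack's real internals):

```python
# Old style: behavior hangs off the data object as a fluent method.
class Spec:
    def __init__(self, name: str):
        self.name = name
        self.concrete = False

    def concretized(self) -> "Spec":
        return _solve(self)


# New style: a free function that accepts either a string or a Spec,
# mirroring how spack.concretize.concretize_one is called in the diff
# (some call sites pass "pkg" strings, others pass Spec objects).
def concretize_one(spec) -> "Spec":
    if isinstance(spec, str):
        spec = Spec(spec)
    return _solve(spec)


def _solve(spec: "Spec") -> "Spec":
    spec.concrete = True  # stand-in for the real concretizer
    return spec


print(concretize_one("mpich").name)
```

Moving the entry point off the `Spec` class lets the concretizer module own the solve, which is why the hunks also swap `import spack.spec` for `import spack.concretize` where `Spec` is no longer referenced.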


@@ -1,8 +1,8 @@
 # Copyright Spack Project Developers. See COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import spack.concretize
 import spack.environment as ev
-import spack.spec
 from spack.main import SpackCommand

 undevelop = SpackCommand("undevelop")
@@ -30,9 +30,9 @@ def test_undevelop(tmpdir, mutable_config, mock_packages, mutable_mock_env_path)
     env("create", "test", "./spack.yaml")

     with ev.read("test"):
-        before = spack.spec.Spec("mpich").concretized()
+        before = spack.concretize.concretize_one("mpich")
         undevelop("mpich")
-        after = spack.spec.Spec("mpich").concretized()
+        after = spack.concretize.concretize_one("mpich")

     # Removing dev spec from environment changes concretization
     assert before.satisfies("dev_path=*")


@@ -7,7 +7,7 @@
 import llnl.util.filesystem as fs

-import spack.spec
+import spack.concretize
 import spack.store
 import spack.util.spack_json as sjson
 import spack.verify
@@ -65,7 +65,7 @@ def test_single_file_verify_cmd(tmpdir):
 def test_single_spec_verify_cmd(tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery):
     # Test the verify command interface to verify a single spec
     install("libelf")
-    s = spack.spec.Spec("libelf").concretized()
+    s = spack.concretize.concretize_one("libelf")
     prefix = s.prefix
     hash = s.dag_hash()


@@ -9,10 +9,10 @@
 from llnl.util.symlink import _windows_can_symlink

+import spack.concretize
 import spack.util.spack_yaml as s_yaml
 from spack.installer import PackageInstaller
 from spack.main import SpackCommand
-from spack.spec import Spec

 extensions = SpackCommand("extensions")
 install = SpackCommand("install")
@@ -190,7 +190,7 @@ def test_view_fails_with_missing_projections_file(tmpdir):
 def test_view_files_not_ignored(
     tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery, cmd, with_projection
 ):
-    spec = Spec("view-not-ignored").concretized()
+    spec = spack.concretize.concretize_one("view-not-ignored")
     pkg = spec.package
     PackageInstaller([pkg], explicit=True).install()
     pkg.assert_installed(spec.prefix)


@@ -8,6 +8,7 @@
 import archspec.cpu

+import spack.concretize
 import spack.config
 import spack.paths
 import spack.repo
@@ -20,7 +21,7 @@
 def _concretize_with_reuse(*, root_str, reused_str):
-    reused_spec = spack.spec.Spec(reused_str).concretized()
+    reused_spec = spack.concretize.concretize_one(reused_str)
     setup = spack.solver.asp.SpackSolverSetup(tests=False)
     driver = spack.solver.asp.PyclingoDriver()
     result, _, _ = driver.solve(setup, [spack.spec.Spec(f"{root_str}")], reuse=[reused_spec])
@@ -44,7 +45,7 @@ def enable_runtimes():
 def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
-    s = spack.spec.Spec("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0").concretized()
+    s = spack.concretize.concretize_one("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0")
     a, b = s["pkg-a"], s["pkg-b"]

     # Both a and b should depend on the same gcc-runtime directly
@@ -61,7 +62,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
     packages_yaml = {"pkg-b": {"externals": [{"spec": "pkg-b@1.0", "prefix": f"{str(tmp_path)}"}]}}
     spack.config.set("packages", packages_yaml)
-    s = spack.spec.Spec("pkg-a%gcc@10.2.1").concretized()
+    s = spack.concretize.concretize_one("pkg-a%gcc@10.2.1")
     a, b = s["pkg-a"], s["pkg-b"]

(File diff suppressed because it is too large)


@@ -4,9 +4,9 @@
 import pytest

+import spack.concretize
 import spack.config
 import spack.solver.asp
-import spack.spec

 version_error_messages = [
     "Cannot satisfy 'fftw@:1.0' and 'fftw@1.1:",
@@ -57,7 +57,7 @@ def test_error_messages(error_messages, config_set, spec, mock_packages, mutable
         spack.config.set(path, conf)

     with pytest.raises(spack.solver.asp.UnsatisfiableSpecError) as e:
-        _ = spack.spec.Spec(spec).concretized()
+        _ = spack.concretize.concretize_one(spec)

     for em in error_messages:
         assert em in str(e.value)


@@ -5,12 +5,13 @@
 import pytest

+import spack.concretize
 import spack.config
 import spack.environment as ev
 import spack.paths
 import spack.repo
+import spack.spec
 import spack.util.spack_yaml as syaml
-from spack.spec import Spec

 """
 These tests include the following package DAGs:
@@ -62,12 +63,12 @@ def test_mix_spec_and_requirements(concretize_scope, test_repo):
     """
     update_concretize_scope(conf_str, "packages")

-    s1 = Spec('y cflags="-a"').concretized()
+    s1 = spack.concretize.concretize_one('y cflags="-a"')
     assert s1.satisfies('cflags="-a -c"')


 def test_mix_spec_and_dependent(concretize_scope, test_repo):
-    s1 = Spec('x ^y cflags="-a"').concretized()
+    s1 = spack.concretize.concretize_one('x ^y cflags="-a"')
     assert s1["y"].satisfies('cflags="-a -d1"')
@@ -92,10 +93,22 @@ def test_mix_spec_and_compiler_cfg(concretize_scope, test_repo):
     conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
     update_concretize_scope(conf_str, "compilers")

-    s1 = Spec('y %gcc@12.100.100 cflags="-O2"').concretized()
+    s1 = spack.concretize.concretize_one('y %gcc@12.100.100 cflags="-O2"')
     assert s1.satisfies('cflags="-Wall -O2"')


+def test_pkg_flags_from_compiler_and_none(concretize_scope, mock_packages):
+    conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
+    update_concretize_scope(conf_str, "compilers")
+
+    s1 = spack.spec.Spec("cmake%gcc@12.100.100")
+    s2 = spack.spec.Spec("cmake-client^cmake%clang")
+
+    concrete = dict(spack.concretize.concretize_together([(s1, None), (s2, None)]))
+
+    assert concrete[s1].compiler_flags["cflags"] == ["-Wall"]
+    assert concrete[s2].compiler_flags["cflags"] == []
+
+
 @pytest.mark.parametrize(
     "cmd_flags,req_flags,cmp_flags,dflags,expected_order",
     [
@@ -147,7 +160,7 @@ def test_flag_order_and_grouping(
     if cmd_flags:
         spec_str += f' cflags="{cmd_flags}"'

-    root_spec = Spec(spec_str).concretized()
+    root_spec = spack.concretize.concretize_one(spec_str)
     spec = root_spec["y"]
     satisfy_flags = " ".join(x for x in [cmd_flags, req_flags, cmp_flags, expected_dflags] if x)
     assert spec.satisfies(f'cflags="{satisfy_flags}"')
@@ -155,11 +168,11 @@ def test_flag_order_and_grouping(
 def test_two_dependents_flag_mixing(concretize_scope, test_repo):
-    root_spec1 = Spec("w~moveflaglater").concretized()
+    root_spec1 = spack.concretize.concretize_one("w~moveflaglater")
     spec1 = root_spec1["y"]
     assert spec1.compiler_flags["cflags"] == "-d0 -d1 -d2".split()

-    root_spec2 = Spec("w+moveflaglater").concretized()
+    root_spec2 = spack.concretize.concretize_one("w+moveflaglater")
     spec2 = root_spec2["y"]
     assert spec2.compiler_flags["cflags"] == "-d3 -d1 -d2".split()
@@ -168,7 +181,7 @@ def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
     conf_str = _compiler_cfg_one_entry_with_cflags("-f2")
     update_concretize_scope(conf_str, "compilers")

-    root_spec = Spec("v %gcc@12.100.100 cflags=='-f1'").concretized()
+    root_spec = spack.concretize.concretize_one("v %gcc@12.100.100 cflags=='-f1'")
     assert root_spec["y"].satisfies("cflags='-f1 -f2'")
@@ -177,7 +190,7 @@ def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
 def test_propagate_and_pkg_dep(concretize_scope, test_repo):
-    root_spec1 = Spec("x ~activatemultiflag cflags=='-f1'").concretized()
+    root_spec1 = spack.concretize.concretize_one("x ~activatemultiflag cflags=='-f1'")
     assert root_spec1["y"].satisfies("cflags='-f1 -d1'")
@@ -189,7 +202,7 @@ def test_propagate_and_require(concretize_scope, test_repo):
     """
     update_concretize_scope(conf_str, "packages")

-    root_spec1 = Spec("v cflags=='-f1'").concretized()
+    root_spec1 = spack.concretize.concretize_one("v cflags=='-f1'")
     assert root_spec1["y"].satisfies("cflags='-f1 -f2'")

     # Next, check that a requirement does not "undo" a request for
@@ -201,7 +214,7 @@ def test_propagate_and_require(concretize_scope, test_repo):
     """
     update_concretize_scope(conf_str, "packages")

-    root_spec2 = Spec("v cflags=='-f1'").concretized()
+    root_spec2 = spack.concretize.concretize_one("v cflags=='-f1'")
     assert root_spec2["y"].satisfies("cflags='-f1'")

     # Note: requirements cannot enforce propagation: any attempt to do
@@ -245,7 +258,7 @@ def test_diamond_dep_flag_mixing(concretize_scope, test_repo):
     nodes of the diamond always appear in the same order).
     `Spec.traverse` is responsible for handling both of these needs.
     """
-    root_spec1 = Spec("t").concretized()
+    root_spec1 = spack.concretize.concretize_one("t")
     spec1 = root_spec1["y"]
     assert spec1.satisfies('cflags="-c1 -c2 -d1 -d2 -e1 -e2"')
     assert spec1.compiler_flags["cflags"] == "-c1 -c2 -e1 -e2 -d1 -d2".split()
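The new test added in this file builds `dict(spack.concretize.concretize_together([(s1, None), (s2, None)]))`, which implies the function yields `(input_spec, concrete_spec)` pairs. A toy model of that contract (the names and the "solve" here are illustrative stand-ins, not Spack's implementation):

```python
# Sketch: concretize several abstract specs in one solve and return
# (input, result) pairs, so callers can build a lookup dict keyed by the
# original inputs -- the idiom used in test_pkg_flags_from_compiler_and_none.
def concretize_together(spec_list):
    results = []
    for abstract, _tests in spec_list:  # second tuple member mirrors the None slot
        concrete = {"name": abstract, "concrete": True}  # stand-in solve result
        results.append((abstract, concrete))
    return results


pairs = dict(concretize_together([("cmake", None), ("zlib", None)]))
print(pairs["cmake"]["concrete"])
```

Returning pairs rather than a bare list lets callers recover which result belongs to which request even when the solver reorders or unifies nodes.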


@@ -7,6 +7,7 @@
 import pytest

+import spack.concretize
 import spack.config
 import spack.package_prefs
 import spack.repo
@@ -46,7 +47,7 @@ def configure_permissions():
 def concretize(abstract_spec):
-    return Spec(abstract_spec).concretized()
+    return spack.concretize.concretize_one(abstract_spec)


 def update_packages(pkgname, section, value):
@@ -111,7 +112,7 @@ def test_preferred_variants_from_wildcard(self):
     def test_preferred_compilers(self, compiler_str, spec_str):
         """Test preferred compilers are applied correctly"""
         update_packages("all", "compiler", [compiler_str])
-        spec = spack.spec.Spec(spec_str).concretized()
+        spec = spack.concretize.concretize_one(spec_str)
         assert spec.compiler == CompilerSpec(compiler_str)

     def test_preferred_target(self, mutable_mock_repo):
@@ -213,15 +214,13 @@ def test_config_set_pkg_property_new(self, mock_repo_path):
     def test_preferred(self):
         """ "Test packages with some version marked as preferred=True"""
-        spec = Spec("python")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("python")
         assert spec.version == Version("2.7.11")

         # now add packages.yaml with versions other than preferred
         # ensure that once config is in place, non-preferred version is used
         update_packages("python", "version", ["3.5.0"])
-        spec = Spec("python")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("python")
         assert spec.version == Version("3.5.0")

     def test_preferred_undefined_raises(self):
@@ -229,7 +228,7 @@ def test_preferred_undefined_raises(self):
         update_packages("python", "version", ["3.5.0.1"])
         spec = Spec("python")
         with pytest.raises(ConfigError):
-            spec.concretize()
+            spack.concretize.concretize_one(spec)

     def test_preferred_truncated(self):
         """Versions without "=" are treated as version ranges: if there is
@@ -237,35 +236,29 @@ def test_preferred_truncated(self):
         (don't define a new version).
         """
         update_packages("python", "version", ["3.5"])
-        spec = Spec("python")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("python")
         assert spec.satisfies("@3.5.1")

     def test_develop(self):
         """Test concretization with develop-like versions"""
-        spec = Spec("develop-test")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("develop-test")
         assert spec.version == Version("0.2.15")
-        spec = Spec("develop-test2")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("develop-test2")
         assert spec.version == Version("0.2.15")

         # now add packages.yaml with develop-like versions
         # ensure that once config is in place, develop-like version is used
         update_packages("develop-test", "version", ["develop"])
-        spec = Spec("develop-test")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("develop-test")
         assert spec.version == Version("develop")

         update_packages("develop-test2", "version", ["0.2.15.develop"])
-        spec = Spec("develop-test2")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("develop-test2")
         assert spec.version == Version("0.2.15.develop")

     def test_external_mpi(self):
         # make sure this doesn't give us an external first.
-        spec = Spec("mpi")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("mpi")
         assert not spec["mpi"].external

         # load config
@@ -284,8 +277,7 @@ def test_external_mpi(self):
         spack.config.set("packages", conf, scope="concretize")

         # ensure that once config is in place, external is used
-        spec = Spec("mpi")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("mpi")
         assert spec["mpich"].external_path == os.path.sep + os.path.join("dummy", "path")

     def test_external_module(self, monkeypatch):
@@ -300,8 +292,7 @@ def mock_module(cmd, module):
         monkeypatch.setattr(spack.util.module_cmd, "module", mock_module)

-        spec = Spec("mpi")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("mpi")
         assert not spec["mpi"].external

         # load config
@@ -320,8 +311,7 @@ def mock_module(cmd, module):
         spack.config.set("packages", conf, scope="concretize")

         # ensure that once config is in place, external is used
-        spec = Spec("mpi")
-        spec.concretize()
+        spec = spack.concretize.concretize_one("mpi")
         assert spec["mpich"].external_path == os.path.sep + os.path.join("dummy", "path")

     def test_buildable_false(self):
@@ -467,7 +457,7 @@ def test_variant_not_flipped_to_pull_externals(self):
         """Test that a package doesn't prefer pulling in an
         external to using the default value of a variant.
         """
-        s = Spec("vdefault-or-external-root").concretized()
+        s = spack.concretize.concretize_one("vdefault-or-external-root")
         assert "~external" in s["vdefault-or-external"]
         assert "externaltool" not in s
@@ -479,7 +469,7 @@ def test_dependencies_cant_make_version_parent_score_better(self):
         that makes the overall version score even or better and maybe
         has a better score in some lower priority criteria.
         """
-        s = Spec("version-test-root").concretized()
+        s = spack.concretize.concretize_one("version-test-root")
         assert s.satisfies("^version-test-pkg@2.4.6")
         assert "version-test-dependency-preferred" not in s
@@ -497,13 +487,13 @@ def test_multivalued_variants_are_lower_priority_than_providers(self):
         with spack.config.override(
             "packages:all", {"providers": {"somevirtual": ["some-virtual-preferred"]}}
         ):
-            s = Spec("somevirtual").concretized()
+            s = spack.concretize.concretize_one("somevirtual")
             assert s.name == "some-virtual-preferred"

     @pytest.mark.regression("26721,19736")
     def test_sticky_variant_accounts_for_packages_yaml(self):
         with spack.config.override("packages:sticky-variant", {"variants": "+allow-gcc"}):
-            s = Spec("sticky-variant %gcc").concretized()
+            s = spack.concretize.concretize_one("sticky-variant %gcc")
             assert s.satisfies("%gcc") and s.satisfies("+allow-gcc")

     @pytest.mark.regression("41134")
@@ -512,5 +502,5 @@ def test_default_preference_variant_different_type_does_not_error(self):
         packages.yaml doesn't fail with an error.
         """
         with spack.config.override("packages:all", {"variants": "+foo"}):
-            s = Spec("pkg-a").concretized()
+            s = spack.concretize.concretize_one("pkg-a")
             assert s.satisfies("foo=bar")


@@ -6,6 +6,7 @@
 import pytest

+import spack.concretize
 import spack.config
 import spack.error
 import spack.package_base
@@ -42,7 +43,7 @@ def test_one_package_multiple_reqs(concretize_scope, test_repo):
     - "~shared"
 """
     update_packages_config(conf_str)
-    y_spec = Spec("y").concretized()
+    y_spec = spack.concretize.concretize_one("y")
     assert y_spec.satisfies("@2.4~shared")
@@ -57,7 +58,7 @@ def test_requirement_isnt_optional(concretize_scope, test_repo):
 """
     update_packages_config(conf_str)
     with pytest.raises(UnsatisfiableSpecError):
-        Spec("x@1.1").concretize()
+        spack.concretize.concretize_one("x@1.1")


 def test_require_undefined_version(concretize_scope, test_repo):
@@ -74,7 +75,7 @@ def test_require_undefined_version(concretize_scope, test_repo):
 """
     update_packages_config(conf_str)
     with pytest.raises(spack.error.ConfigError):
-        Spec("x").concretize()
+        spack.concretize.concretize_one("x")


 def test_require_truncated(concretize_scope, test_repo):
@@ -89,7 +90,7 @@ def test_require_truncated(concretize_scope, test_repo):
     require: "@1"
 """
     update_packages_config(conf_str)
-    xspec = Spec("x").concretized()
+    xspec = spack.concretize.concretize_one("x")
     assert xspec.satisfies("@1.1")
@@ -159,7 +160,7 @@ def test_requirement_adds_new_version(
     )
     update_packages_config(conf_str)

-    s1 = Spec("v").concretized()
+    s1 = spack.concretize.concretize_one("v")
     assert s1.satisfies("@2.2")
     # Make sure the git commit info is retained
     assert isinstance(s1.version, spack.version.GitVersion)
@@ -180,7 +181,7 @@ def test_requirement_adds_version_satisfies(
     )
     # Sanity check: early version of T does not include U
-    s0 = Spec("t@2.0").concretized()
+    s0 = spack.concretize.concretize_one("t@2.0")
     assert not ("u" in s0)

     conf_str = """\
@@ -192,7 +193,7 @@ def test_requirement_adds_version_satisfies(
     )
     update_packages_config(conf_str)
-    s1 = Spec("t").concretized()
+    s1 = spack.concretize.concretize_one("t")
     assert "u" in s1
     assert s1.satisfies("@2.2")
@@ -218,7 +219,7 @@ def test_requirement_adds_git_hash_version(
 """
     update_packages_config(conf_str)
-    s1 = Spec("v").concretized()
+    s1 = spack.concretize.concretize_one("v")
     assert isinstance(s1.version, spack.version.GitVersion)
     assert s1.satisfies(f"v@{a_commit_hash}")
@@ -239,8 +240,8 @@ def test_requirement_adds_multiple_new_versions(
 """
     update_packages_config(conf_str)

-    assert Spec("v").concretized().satisfies(f"@{commits[0]}=2.2")
-    assert Spec("v@2.3").concretized().satisfies(f"v@{commits[1]}=2.3")
+    assert spack.concretize.concretize_one("v").satisfies(f"@{commits[0]}=2.2")
+    assert spack.concretize.concretize_one("v@2.3").satisfies(f"v@{commits[1]}=2.3")


 # TODO: this belongs in the concretize_preferences test module but uses
@@ -263,11 +264,11 @@ def test_preference_adds_new_version(
 """
     update_packages_config(conf_str)

-    assert Spec("v").concretized().satisfies(f"@{commits[0]}=2.2")
-    assert Spec("v@2.3").concretized().satisfies(f"@{commits[1]}=2.3")
+    assert spack.concretize.concretize_one("v").satisfies(f"@{commits[0]}=2.2")
+    assert spack.concretize.concretize_one("v@2.3").satisfies(f"@{commits[1]}=2.3")

     # When installing by hash, a lookup is triggered, so it's not mapped to =2.3.
-    s3 = Spec(f"v@{commits[1]}").concretized()
+    s3 = spack.concretize.concretize_one(f"v@{commits[1]}")
     assert s3.satisfies(f"v@{commits[1]}")
     assert not s3.satisfies("@2.3")
@@ -287,7 +288,7 @@ def test_external_adds_new_version_that_is_preferred(concretize_scope, test_repo
 """
     update_packages_config(conf_str)

-    spec = Spec("x").concretized()
+    spec = spack.concretize.concretize_one("x")
     assert spec["y"].satisfies("@2.7")
     assert spack.version.Version("2.7") not in spec["y"].package.versions
@@ -296,7 +297,7 @@ def test_requirement_is_successfully_applied(concretize_scope, test_repo):
     """If a simple requirement can be satisfied, make sure the
     concretization succeeds and the requirement spec is applied.
     """
-    s1 = Spec("x").concretized()
+    s1 = spack.concretize.concretize_one("x")
     # Without any requirements/preferences, the later version is preferred
     assert s1.satisfies("@1.1")
@@ -306,7 +307,7 @@ def test_requirement_is_successfully_applied(concretize_scope, test_repo):
     require: "@1.0"
 """
     update_packages_config(conf_str)
-    s2 = Spec("x").concretized()
+    s2 = spack.concretize.concretize_one("x")
     # The requirement forces choosing the eariler version
     assert s2.satisfies("@1.0")
@@ -323,7 +324,7 @@ def test_multiple_packages_requirements_are_respected(concretize_scope, test_rep
     require: "@2.4"
 """
     update_packages_config(conf_str)
-    spec = Spec("x").concretized()
+    spec = spack.concretize.concretize_one("x")
     assert spec["x"].satisfies("@1.0")
     assert spec["y"].satisfies("@2.4")
@@ -339,7 +340,7 @@ def test_oneof(concretize_scope, test_repo):
     - one_of: ["@2.4", "~shared"]
 """
     update_packages_config(conf_str)
-    spec = Spec("x").concretized()
+    spec = spack.concretize.concretize_one("x")
     # The concretizer only has to satisfy one of @2.4/~shared, and @2.4
     # comes first so it is prioritized
     assert spec["y"].satisfies("@2.4+shared")
@@ -358,10 +359,10 @@ def test_one_package_multiple_oneof_groups(concretize_scope, test_repo):
 """
     update_packages_config(conf_str)

-    s1 = Spec("y@2.5").concretized()
+    s1 = spack.concretize.concretize_one("y@2.5")
     assert s1.satisfies("%clang~shared")

-    s2 = Spec("y@2.4").concretized()
+    s2 = spack.concretize.concretize_one("y@2.4")
     assert s2.satisfies("%gcc+shared")
@@ -377,10 +378,10 @@ def test_require_cflags(concretize_scope, mock_packages):
 """
     update_packages_config(conf_str)

-    spec_mpich2 = Spec("mpich2").concretized()
+    spec_mpich2 = spack.concretize.concretize_one("mpich2")
     assert spec_mpich2.satisfies("cflags=-g")

-    spec_mpi = Spec("mpi").concretized()
+    spec_mpi = spack.concretize.concretize_one("mpi")
     assert spec_mpi.satisfies("mpich cflags=-O1")
@@ -403,7 +404,7 @@ def test_requirements_for_package_that_is_not_needed(concretize_scope, test_repo
 """
     update_packages_config(conf_str)

-    s1 = Spec("v").concretized()
+    s1 = spack.concretize.concretize_one("v")
     assert s1.satisfies("@2.1")
@@ -420,10 +421,10 @@ def test_oneof_ordering(concretize_scope, test_repo):
 """
     update_packages_config(conf_str)

-    s1 = Spec("y").concretized()
+    s1 = spack.concretize.concretize_one("y")
     assert s1.satisfies("@2.4")

-    s2 = Spec("y@2.5").concretized()
+    s2 = spack.concretize.concretize_one("y@2.5")
     assert s2.satisfies("@2.5")
@@ -437,14 +438,14 @@ def test_reuse_oneof(concretize_scope, test_repo, tmp_path, mock_fetch):
     store_dir = tmp_path / "store"
     with spack.store.use_store(str(store_dir)):
-        s1 = Spec("y@2.5 ~shared").concretized()
+        s1 = spack.concretize.concretize_one("y@2.5~shared")
         PackageInstaller([s1.package], fake=True, explicit=True).install()

         update_packages_config(conf_str)
         with spack.config.override("concretizer:reuse", True):
-            s2 = Spec("y").concretized()
-        assert not s2.satisfies("@2.5 ~shared")
+            s2 = spack.concretize.concretize_one("y")
+        assert not s2.satisfies("@2.5~shared")


 @pytest.mark.parametrize(
@@ -472,7 +473,7 @@ def test_requirements_and_deprecated_versions(
     update_packages_config(conf_str)
     with spack.config.override("config:deprecated", allow_deprecated):
-        s1 = Spec("y").concretized()
+        s1 = spack.concretize.concretize_one("y")
         for constrain in expected:
             assert s1.satisfies(constrain)
@@ -490,7 +491,7 @@ def test_default_requirements_with_all(spec_str, requirement_str, concretize_sco
 """
     update_packages_config(conf_str)

-    spec = Spec(spec_str).concretized()
+    spec = spack.concretize.concretize_one(spec_str)
     for s in spec.traverse():
         assert s.satisfies(requirement_str)
@@ -499,7 +500,7 @@
     "requirements,expectations",
     [
         (("%gcc", "%clang"), ("%gcc", "%clang")),
-        (("%gcc ~shared", "@1.0"), ("%gcc ~shared", "@1.0 +shared")),
+        (("%gcc~shared", "@1.0"), ("%gcc~shared", "@1.0+shared")),
     ],
 )
 def test_default_and_package_specific_requirements(
@@ -517,7 +518,7 @@ def test_default_and_package_specific_requirements(
 """
     update_packages_config(conf_str)

-    spec = Spec("x").concretized()
+    spec = spack.concretize.concretize_one("x")
     assert spec.satisfies(specific_exp)
     for s in spec.traverse(root=False):
         assert s.satisfies(generic_exp)
@@ -532,7 +533,7 @@ def test_requirements_on_virtual(mpi_requirement, concretize_scope, mock_package
 """
     update_packages_config(conf_str)

-    spec = Spec("callpath").concretized()
+    spec = spack.concretize.concretize_one("callpath")
     assert "mpi" in spec
     assert mpi_requirement in spec
@@ -553,7 +554,7 @@ def test_requirements_on_virtual_and_on_package(
 """
     update_packages_config(conf_str)

-    spec = Spec("callpath").concretized()
+    spec = spack.concretize.concretize_one("callpath")
assert "mpi" in spec assert "mpi" in spec
assert mpi_requirement in spec assert mpi_requirement in spec
assert spec["mpi"].satisfies(specific_requirement) assert spec["mpi"].satisfies(specific_requirement)
@@ -567,10 +568,10 @@ def test_incompatible_virtual_requirements_raise(concretize_scope, mock_packages
""" """
update_packages_config(conf_str) update_packages_config(conf_str)
spec = Spec("callpath ^zmpi") spec = Spec("callpath^zmpi")
# TODO (multiple nodes): recover a better error message later # TODO (multiple nodes): recover a better error message later
with pytest.raises((UnsatisfiableSpecError, InternalConcretizerError)): with pytest.raises((UnsatisfiableSpecError, InternalConcretizerError)):
spec.concretize() spack.concretize.concretize_one(spec)
def test_non_existing_variants_under_all(concretize_scope, mock_packages): def test_non_existing_variants_under_all(concretize_scope, mock_packages):
@@ -582,7 +583,7 @@ def test_non_existing_variants_under_all(concretize_scope, mock_packages):
""" """
update_packages_config(conf_str) update_packages_config(conf_str)
spec = Spec("callpath ^zmpi").concretized() spec = spack.concretize.concretize_one("callpath^zmpi")
assert "~foo" not in spec assert "~foo" not in spec
@@ -657,7 +658,7 @@ def test_conditional_requirements_from_packages_yaml(
and optional when the condition is not met. and optional when the condition is not met.
""" """
update_packages_config(packages_yaml) update_packages_config(packages_yaml)
spec = Spec(spec_str).concretized() spec = spack.concretize.concretize_one(spec_str)
for match_str, expected in expected_satisfies: for match_str, expected in expected_satisfies:
assert spec.satisfies(match_str) is expected assert spec.satisfies(match_str) is expected
@@ -733,7 +734,7 @@ def test_requirements_fail_with_custom_message(
""" """
update_packages_config(packages_yaml) update_packages_config(packages_yaml)
with pytest.raises(spack.error.SpackError, match=expected_message): with pytest.raises(spack.error.SpackError, match=expected_message):
Spec(spec_str).concretized() spack.concretize.concretize_one(spec_str)
def test_skip_requirement_when_default_requirement_condition_cannot_be_met( def test_skip_requirement_when_default_requirement_condition_cannot_be_met(
@@ -752,9 +753,9 @@ def test_skip_requirement_when_default_requirement_condition_cannot_be_met(
when: "+shared" when: "+shared"
""" """
update_packages_config(packages_yaml) update_packages_config(packages_yaml)
s = Spec("mpileaks").concretized() s = spack.concretize.concretize_one("mpileaks")
assert s.satisfies("%clang +shared") assert s.satisfies("%clang+shared")
# Sanity checks that 'callpath' doesn't have the shared variant, but that didn't # Sanity checks that 'callpath' doesn't have the shared variant, but that didn't
# cause failures during concretization. # cause failures during concretization.
assert "shared" not in s["callpath"].variants assert "shared" not in s["callpath"].variants
@@ -781,12 +782,12 @@ def test_requires_directive(concretize_scope, mock_packages):
spack.config.CONFIG.clear_caches() spack.config.CONFIG.clear_caches()
# This package requires either clang or gcc # This package requires either clang or gcc
s = Spec("requires_clang_or_gcc").concretized() s = spack.concretize.concretize_one("requires_clang_or_gcc")
assert s.satisfies("%gcc@12.0.0") assert s.satisfies("%gcc@12.0.0")
# This package can only be compiled with clang # This package can only be compiled with clang
with pytest.raises(spack.error.SpackError, match="can only be compiled with Clang"): with pytest.raises(spack.error.SpackError, match="can only be compiled with Clang"):
Spec("requires_clang").concretized() spack.concretize.concretize_one("requires_clang")
@pytest.mark.parametrize( @pytest.mark.parametrize(
@@ -839,20 +840,20 @@ def test_default_requirements_semantic(packages_yaml, concretize_scope, mock_pac
""" """
update_packages_config(packages_yaml) update_packages_config(packages_yaml)
# Regular zlib concretize to +shared # Regular zlib concretize to+shared
s = Spec("zlib").concretized() s = spack.concretize.concretize_one("zlib")
assert s.satisfies("+shared") assert s.satisfies("+shared")
# If we specify the variant we can concretize only the one matching the constraint # If we specify the variant we can concretize only the one matching the constraint
s = Spec("zlib +shared").concretized() s = spack.concretize.concretize_one("zlib+shared")
assert s.satisfies("+shared") assert s.satisfies("+shared")
with pytest.raises(UnsatisfiableSpecError): with pytest.raises(UnsatisfiableSpecError):
Spec("zlib ~shared").concretized() spack.concretize.concretize_one("zlib~shared")
# A spec without the shared variant still concretize # A spec without the shared variant still concretize
s = Spec("pkg-a").concretized() s = spack.concretize.concretize_one("pkg-a")
assert not s.satisfies("pkg-a +shared") assert not s.satisfies("pkg-a+shared")
assert not s.satisfies("pkg-a ~shared") assert not s.satisfies("pkg-a~shared")
@pytest.mark.parametrize( @pytest.mark.parametrize(
@@ -896,7 +897,7 @@ def test_default_requirements_semantic(packages_yaml, concretize_scope, mock_pac
""" """
packages: packages:
all: all:
require: "libs=static +feefoo" require: "libs=static+feefoo"
""", """,
"multivalue-variant", "multivalue-variant",
["libs=shared"], ["libs=shared"],
@@ -911,7 +912,7 @@ def test_default_requirements_semantic_with_mv_variants(
from MV variants. from MV variants.
""" """
update_packages_config(packages_yaml) update_packages_config(packages_yaml)
s = Spec(spec_str).concretized() s = spack.concretize.concretize_one(spec_str)
for constraint in expected: for constraint in expected:
assert s.satisfies(constraint), constraint assert s.satisfies(constraint), constraint
@@ -936,7 +937,7 @@ def test_requiring_package_on_multiple_virtuals(concretize_scope, mock_packages)
require: intel-parallel-studio require: intel-parallel-studio
""" """
) )
s = Spec("dla-future").concretized() s = spack.concretize.concretize_one("dla-future")
assert s["blas"].name == "intel-parallel-studio" assert s["blas"].name == "intel-parallel-studio"
assert s["lapack"].name == "intel-parallel-studio" assert s["lapack"].name == "intel-parallel-studio"
@@ -989,7 +990,7 @@ def test_strong_preferences_packages_yaml(
): ):
"""Tests that "preferred" specs are stronger than usual preferences, but can be overridden.""" """Tests that "preferred" specs are stronger than usual preferences, but can be overridden."""
update_packages_config(packages_yaml) update_packages_config(packages_yaml)
s = Spec(spec_str).concretized() s = spack.concretize.concretize_one(spec_str)
for constraint in expected: for constraint in expected:
assert s.satisfies(constraint), constraint assert s.satisfies(constraint), constraint
@@ -1038,29 +1039,29 @@ def test_conflict_packages_yaml(packages_yaml, spec_str, concretize_scope, mock_
"""Tests conflicts that are specified from configuration files.""" """Tests conflicts that are specified from configuration files."""
update_packages_config(packages_yaml) update_packages_config(packages_yaml)
with pytest.raises(UnsatisfiableSpecError): with pytest.raises(UnsatisfiableSpecError):
Spec(spec_str).concretized() spack.concretize.concretize_one(spec_str)
@pytest.mark.parametrize( @pytest.mark.parametrize(
"spec_str,expected,not_expected", "spec_str,expected,not_expected",
[ [
( (
"forward-multi-value +cuda cuda_arch=10 ^dependency-mv~cuda", "forward-multi-value+cuda cuda_arch=10^dependency-mv~cuda",
["cuda_arch=10", "^dependency-mv~cuda"], ["cuda_arch=10", "^dependency-mv~cuda"],
["cuda_arch=11", "^dependency-mv cuda_arch=10", "^dependency-mv cuda_arch=11"], ["cuda_arch=11", "^dependency-mv cuda_arch=10", "^dependency-mv cuda_arch=11"],
), ),
( (
"forward-multi-value +cuda cuda_arch=10 ^dependency-mv+cuda", "forward-multi-value+cuda cuda_arch=10^dependency-mv+cuda",
["cuda_arch=10", "^dependency-mv cuda_arch=10"], ["cuda_arch=10", "^dependency-mv cuda_arch=10"],
["cuda_arch=11", "^dependency-mv cuda_arch=11"], ["cuda_arch=11", "^dependency-mv cuda_arch=11"],
), ),
( (
"forward-multi-value +cuda cuda_arch=11 ^dependency-mv+cuda", "forward-multi-value+cuda cuda_arch=11^dependency-mv+cuda",
["cuda_arch=11", "^dependency-mv cuda_arch=11"], ["cuda_arch=11", "^dependency-mv cuda_arch=11"],
["cuda_arch=10", "^dependency-mv cuda_arch=10"], ["cuda_arch=10", "^dependency-mv cuda_arch=10"],
), ),
( (
"forward-multi-value +cuda cuda_arch=10,11 ^dependency-mv+cuda", "forward-multi-value+cuda cuda_arch=10,11^dependency-mv+cuda",
["cuda_arch=10,11", "^dependency-mv cuda_arch=10,11"], ["cuda_arch=10,11", "^dependency-mv cuda_arch=10,11"],
[], [],
), ),
@@ -1073,9 +1074,9 @@ def test_forward_multi_valued_variant_using_requires(
`requires` directives of the form: `requires` directives of the form:
for _val in ("shared", "static"): for _val in ("shared", "static"):
requires(f"^some-virtual-mv libs={_val}", when=f"libs={_val} ^some-virtual-mv") requires(f"^some-virtual-mv libs={_val}", when=f"libs={_val}^some-virtual-mv")
""" """
s = Spec(spec_str).concretized() s = spack.concretize.concretize_one(spec_str)
for constraint in expected: for constraint in expected:
assert s.satisfies(constraint) assert s.satisfies(constraint)
@@ -1086,7 +1087,7 @@ def test_forward_multi_valued_variant_using_requires(
def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_packages): def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_packages):
"""Tests that strong preferences have a higher priority than reusing specs.""" """Tests that strong preferences have a higher priority than reusing specs."""
reused_spec = Spec("adios2~bzip2").concretized() reused_spec = spack.concretize.concretize_one("adios2~bzip2")
reuse_nodes = list(reused_spec.traverse()) reuse_nodes = list(reused_spec.traverse())
root_specs = [Spec("ascent+adios2")] root_specs = [Spec("ascent+adios2")]
@@ -1121,7 +1122,7 @@ def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_pa
solver = spack.solver.asp.Solver() solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup() setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve( result, _, _ = solver.driver.solve(
setup, [Spec("ascent+adios2 ^adios2~bzip2")], reuse=reuse_nodes setup, [Spec("ascent+adios2^adios2~bzip2")], reuse=reuse_nodes
) )
ascent = result.specs[0] ascent = result.specs[0]
assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent
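The hunks above apply one mechanical rewrite throughout: `Spec(...).concretized()` becomes `spack.concretize.concretize_one(...)`. A rough sketch of that rewrite as a regex pass (this pattern is an illustration of the transformation, not the script actually used for the change, and it only covers the simple single-argument form seen in these hunks):

```python
import re

# Rewrite Spec("<spec>").concretized() -> spack.concretize.concretize_one("<spec>").
# Calls built from expressions or using keyword arguments would need manual attention.
PATTERN = re.compile(
    r'(?:spack\.spec\.)?Spec\((?P<arg>"[^"]*"|[A-Za-z_][A-Za-z_0-9]*)\)\.concretized\(\)'
)

def rewrite(line: str) -> str:
    """Apply the old-API -> new-API substitution to one source line."""
    return PATTERN.sub(r"spack.concretize.concretize_one(\g<arg>)", line)

print(rewrite('    s1 = Spec("v").concretized()'))
print(rewrite("    spec = spack.spec.Spec(spec_str).concretized()"))
```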


@@ -4,7 +4,7 @@
 import pytest
-import spack.spec
+import spack.concretize
 import spack.store
@@ -13,7 +13,7 @@
 def test_set_install_hash_length(hash_length, mutable_config, tmpdir):
     mutable_config.set("config:install_hash_length", hash_length)
     with spack.store.use_store(str(tmpdir)):
-        spec = spack.spec.Spec("libelf").concretized()
+        spec = spack.concretize.concretize_one("libelf")
         prefix = spec.prefix
         hash_str = prefix.rsplit("-")[-1]
         assert len(hash_str) == hash_length
@@ -23,7 +23,7 @@ def test_set_install_hash_length(hash_length, mutable_config, tmpdir):
 def test_set_install_hash_length_upper_case(mutable_config, tmpdir):
     mutable_config.set("config:install_hash_length", 5)
     with spack.store.use_store(str(tmpdir), extra_data={"projections": {"all": "{name}-{HASH}"}}):
-        spec = spack.spec.Spec("libelf").concretized()
+        spec = spack.concretize.concretize_one("libelf")
         prefix = spec.prefix
         hash_str = prefix.rsplit("-")[-1]
         assert len(hash_str) == 5
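Both tests above recover the hash suffix by splitting the install prefix on `-` and taking the last field. A standalone sketch of that parsing (the prefix and hash below are made-up example values, not real Spack output):

```python
# A projected install prefix ends in "-<hash>"; rsplit("-")[-1] recovers the hash,
# whose length is controlled by config:install_hash_length.
prefix = "/tmp/store/libelf-0.8.13-abcde"  # hypothetical prefix with hash length 5

hash_str = prefix.rsplit("-")[-1]
assert hash_str == "abcde"
assert len(hash_str) == 5
```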


@@ -36,6 +36,7 @@
 import spack.caches
 import spack.compiler
 import spack.compilers
+import spack.concretize
 import spack.config
 import spack.directives_meta
 import spack.environment as ev
@@ -849,7 +850,7 @@ def _populate(mock_db):
     """
     def _install(spec):
-        s = spack.spec.Spec(spec).concretized()
+        s = spack.concretize.concretize_one(spec)
         PackageInstaller([s.package], fake=True, explicit=True).install()
     _install("mpileaks ^mpich")
@@ -887,21 +888,26 @@ def mock_store(
     """
     store_path, store_cache = _store_dir_and_cache
+    # Make the DB filesystem read-only to ensure constructors don't modify anything in it.
+    # We want Spack to be able to point to a DB on a read-only filesystem easily.
+    store_path.chmod(mode=0o555, rec=1)
     # If the cache does not exist populate the store and create it
     if not os.path.exists(str(store_cache.join(".spack-db"))):
         with spack.config.use_configuration(*mock_configuration_scopes):
             with spack.store.use_store(str(store_path)) as store:
                 with spack.repo.use_repositories(mock_repo_path):
+                    # make the DB filesystem writable only while we populate it
+                    store_path.chmod(mode=0o755, rec=1)
                     _populate(store.db)
-        copy_tree(str(store_path), str(store_cache))
-    # Make the DB filesystem read-only to ensure we can't modify entries
-    store_path.join(".spack-db").chmod(mode=0o555, rec=1)
+                    store_path.chmod(mode=0o555, rec=1)
+        store_cache.chmod(mode=0o755, rec=1)
+        copy_tree(str(store_path), str(store_cache))
+        store_cache.chmod(mode=0o555, rec=1)
     yield store_path
-    store_path.join(".spack-db").chmod(mode=0o755, rec=1)
 @pytest.fixture(scope="function")
 def database(mock_store, mock_packages, config):
@@ -927,7 +933,7 @@ def mutable_database(database_mutable_config, _store_dir_and_cache):
     """
     # Make the database writeable, as we are going to modify it
     store_path, store_cache = _store_dir_and_cache
-    store_path.join(".spack-db").chmod(mode=0o755, rec=1)
+    store_path.chmod(mode=0o755, rec=1)
     yield database_mutable_config
@@ -935,7 +941,7 @@ def mutable_database(database_mutable_config, _store_dir_and_cache):
     # the store and making the database read-only
     store_path.remove(rec=1)
     copy_tree(str(store_cache), str(store_path))
-    store_path.join(".spack-db").chmod(mode=0o555, rec=1)
+    store_path.chmod(mode=0o555, rec=1)
 @pytest.fixture()
@@ -1039,7 +1045,8 @@ def temporary_store(tmpdir, request):
     temporary_store_path = tmpdir.join("opt")
     with spack.store.use_store(str(temporary_store_path)) as s:
         yield s
-    temporary_store_path.remove()
+    if temporary_store_path.exists():
+        temporary_store_path.remove()
 @pytest.fixture()
@@ -1977,7 +1984,9 @@ def default_mock_concretization(config, mock_packages, concretized_specs_cache):
     def _func(spec_str, tests=False):
         key = spec_str, tests
         if key not in concretized_specs_cache:
-            concretized_specs_cache[key] = spack.spec.Spec(spec_str).concretized(tests=tests)
+            concretized_specs_cache[key] = spack.concretize.concretize_one(
+                spack.spec.Spec(spec_str), tests=tests
+            )
         return concretized_specs_cache[key].copy()
     return _func
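The `mock_store` changes above flip the whole store tree between read-only (`0o555`) and writable (`0o755`) around population. A self-contained sketch of that chmod-toggle as a context manager; `make_writable` is an illustrative name, not a Spack helper, and POSIX permission bits behave differently on Windows or when running as root:

```python
import contextlib
import os
import pathlib
import tempfile

@contextlib.contextmanager
def make_writable(root):
    """Temporarily chmod a directory tree to 0o755, restoring 0o555 afterwards."""
    dirs = [root] + [os.path.join(base, d) for base, subdirs, _ in os.walk(root) for d in subdirs]
    for d in dirs:
        os.chmod(d, 0o755)
    try:
        yield
    finally:
        for d in dirs:
            os.chmod(d, 0o555)

with tempfile.TemporaryDirectory() as tmp:
    store = pathlib.Path(tmp) / "store"
    store.mkdir()
    os.chmod(store, 0o555)                    # the store starts read-only, like the fixture
    with make_writable(str(store)):
        (store / "db.json").write_text("{}")  # population works while writable
    assert (store / "db.json").read_text() == "{}"
    os.chmod(store, 0o755)                    # let TemporaryDirectory clean up
```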


@@ -8,8 +8,8 @@
 from llnl.util.filesystem import mkdirp, touch, working_dir
+import spack.concretize
 from spack.fetch_strategy import CvsFetchStrategy
-from spack.spec import Spec
 from spack.stage import Stage
 from spack.util.executable import which
 from spack.version import Version
@@ -37,7 +37,7 @@ def test_fetch(type_of_test, mock_cvs_repository, config, mutable_mock_repo):
     get_date = mock_cvs_repository.get_date
     # Construct the package under test
-    spec = Spec("cvs-test").concretized()
+    spec = spack.concretize.concretize_one("cvs-test")
     spec.package.versions[Version("cvs")] = test.args
     # Enter the stage directory and check some properties


@@ -2,10 +2,12 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 """Check the database is functioning properly, both in memory and in its file."""
+import contextlib
 import datetime
 import functools
 import json
 import os
+import pathlib
 import re
 import shutil
 import sys
@@ -26,12 +28,14 @@
 import llnl.util.lock as lk
 from llnl.util.tty.colify import colify
+import spack.concretize
 import spack.database
 import spack.deptypes as dt
 import spack.package_base
 import spack.repo
 import spack.spec
 import spack.store
+import spack.util.lock
 import spack.version as vn
 from spack.enums import InstallRecordStatus
 from spack.installer import PackageInstaller
@@ -41,24 +45,51 @@
 pytestmark = pytest.mark.db
+@contextlib.contextmanager
+def writable(database):
+    """Allow a database to be written inside this context manager."""
+    old_lock, old_is_upstream = database.lock, database.is_upstream
+    db_root = pathlib.Path(database.root)
+    try:
+        # this is safe on all platforms during tests (tests get their own tmpdirs)
+        database.lock = spack.util.lock.Lock(str(database._lock_path), enable=False)
+        database.is_upstream = False
+        db_root.chmod(mode=0o755)
+        with database.write_transaction():
+            yield
+    finally:
+        db_root.chmod(mode=0o555)
+        database.lock = old_lock
+        database.is_upstream = old_is_upstream
 @pytest.fixture()
 def upstream_and_downstream_db(tmpdir, gen_mock_layout):
-    mock_db_root = str(tmpdir.mkdir("mock_db_root"))
-    upstream_layout = gen_mock_layout("/a/")
-    upstream_write_db = spack.database.Database(mock_db_root, layout=upstream_layout)
-    upstream_db = spack.database.Database(mock_db_root, is_upstream=True, layout=upstream_layout)
-    # Generate initial DB file to avoid reindex
-    with open(upstream_write_db._index_path, "w", encoding="utf-8") as db_file:
-        upstream_write_db._write_to_file(db_file)
-    downstream_db_root = str(tmpdir.mkdir("mock_downstream_db_root"))
-    downstream_db = spack.database.Database(
-        downstream_db_root, upstream_dbs=[upstream_db], layout=gen_mock_layout("/b/")
-    )
-    with open(downstream_db._index_path, "w", encoding="utf-8") as db_file:
-        downstream_db._write_to_file(db_file)
-    yield upstream_write_db, upstream_db, downstream_db
+    """Fixture for a pair of stores: upstream and downstream.
+
+    Upstream API prohibits writing to an upstream, so we also return a writable version
+    of the upstream DB for tests to use.
+    """
+    mock_db_root = tmpdir.mkdir("mock_db_root")
+    mock_db_root.chmod(0o555)
+    upstream_db = spack.database.Database(
+        str(mock_db_root), is_upstream=True, layout=gen_mock_layout("/a/")
+    )
+    with writable(upstream_db):
+        upstream_db._write()
+    downstream_db_root = tmpdir.mkdir("mock_downstream_db_root")
+    downstream_db_root.chmod(0o755)
+    downstream_db = spack.database.Database(
+        str(downstream_db_root), upstream_dbs=[upstream_db], layout=gen_mock_layout("/b/")
+    )
+    downstream_db._write()
+    yield upstream_db, downstream_db
 @pytest.mark.parametrize(
@@ -70,16 +101,18 @@ def upstream_and_downstream_db(tmpdir, gen_mock_layout):
         ("{u}", ["pkg-c"]),
         ("{d}", ["pkg-b"]),
     ],
+    ids=["all", "upstream", "local", "upstream_path", "downstream_path"],
 )
 def test_query_by_install_tree(
     install_tree, result, upstream_and_downstream_db, mock_packages, monkeypatch, config
 ):
-    up_write_db, up_db, down_db = upstream_and_downstream_db
+    up_db, down_db = upstream_and_downstream_db
     # Set the upstream DB to contain "pkg-c" and downstream to contain "pkg-b")
-    b = spack.spec.Spec("pkg-b").concretized()
-    c = spack.spec.Spec("pkg-c").concretized()
-    up_write_db.add(c)
+    b = spack.concretize.concretize_one("pkg-b")
+    c = spack.concretize.concretize_one("pkg-c")
+    with writable(up_db):
+        up_db.add(c)
     up_db._read()
     down_db.add(b)
@@ -91,15 +124,16 @@ def test_spec_installed_upstream(
     upstream_and_downstream_db, mock_custom_repository, config, monkeypatch
 ):
     """Test whether Spec.installed_upstream() works."""
-    upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
+    upstream_db, downstream_db = upstream_and_downstream_db
     # a known installed spec should say that it's installed
     with spack.repo.use_repositories(mock_custom_repository):
-        spec = spack.spec.Spec("pkg-c").concretized()
+        spec = spack.concretize.concretize_one("pkg-c")
         assert not spec.installed
         assert not spec.installed_upstream
-        upstream_write_db.add(spec)
+        with writable(upstream_db):
+            upstream_db.add(spec)
         upstream_db._read()
     monkeypatch.setattr(spack.store.STORE, "db", downstream_db)
@@ -115,7 +149,7 @@ def test_spec_installed_upstream(
 @pytest.mark.usefixtures("config")
 def test_installed_upstream(upstream_and_downstream_db, tmpdir):
-    upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
+    upstream_db, downstream_db = upstream_and_downstream_db
     builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
     builder.add_package("x")
@@ -124,9 +158,10 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
     builder.add_package("w", dependencies=[("x", None, None), ("y", None, None)])
     with spack.repo.use_repositories(builder.root):
-        spec = spack.spec.Spec("w").concretized()
-        for dep in spec.traverse(root=False):
-            upstream_write_db.add(dep)
+        spec = spack.concretize.concretize_one("w")
+        with writable(upstream_db):
+            for dep in spec.traverse(root=False):
+                upstream_db.add(dep)
         upstream_db._read()
         for dep in spec.traverse(root=False):
@@ -135,7 +170,7 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
         with pytest.raises(spack.database.ForbiddenLockError):
             upstream_db.get_by_hash(dep.dag_hash())
-        new_spec = spack.spec.Spec("w").concretized()
+        new_spec = spack.concretize.concretize_one("w")
         downstream_db.add(new_spec)
         for dep in new_spec.traverse(root=False):
             upstream, record = downstream_db.query_by_spec_hash(dep.dag_hash())
@@ -150,23 +185,25 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
 def test_removed_upstream_dep(upstream_and_downstream_db, tmpdir, capsys, config):
-    upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
+    upstream_db, downstream_db = upstream_and_downstream_db
     builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
     builder.add_package("z")
     builder.add_package("y", dependencies=[("z", None, None)])
     with spack.repo.use_repositories(builder):
-        y = spack.spec.Spec("y").concretized()
+        y = spack.concretize.concretize_one("y")
         z = y["z"]
         # add dependency to upstream, dependents to downstream
-        upstream_write_db.add(z)
+        with writable(upstream_db):
+            upstream_db.add(z)
         upstream_db._read()
         downstream_db.add(y)
         # remove the dependency from the upstream DB
-        upstream_write_db.remove(z)
+        with writable(upstream_db):
+            upstream_db.remove(z)
         upstream_db._read()
         # then rereading the downstream DB should warn about the missing dep
@@ -183,16 +220,17 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
     DB. When a package is recorded as installed in both, the results should
     refer to the downstream DB.
     """
-    upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
+    upstream_db, downstream_db = upstream_and_downstream_db
     builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
     builder.add_package("x")
     with spack.repo.use_repositories(builder.root):
-        spec = spack.spec.Spec("x").concretized()
+        spec = spack.concretize.concretize_one("x")
         downstream_db.add(spec)
-        upstream_write_db.add(spec)
+        with writable(upstream_db):
+            upstream_db.add(spec)
         upstream_db._read()
         upstream, record = downstream_db.query_by_spec_hash(spec.dag_hash())
@@ -221,7 +259,7 @@ def test_cannot_write_upstream(tmp_path, mock_packages, config):
     db = spack.database.Database(str(tmp_path), is_upstream=True)
     with pytest.raises(spack.database.ForbiddenLockError):
-        db.add(spack.spec.Spec("pkg-a").concretized())
+        db.add(spack.concretize.concretize_one("pkg-a"))
 @pytest.mark.usefixtures("config", "temporary_store")
@@ -235,7 +273,7 @@ def test_recursive_upstream_dbs(tmpdir, gen_mock_layout):
     builder.add_package("x", dependencies=[("y", None, None)])
     with spack.repo.use_repositories(builder.root):
-        spec = spack.spec.Spec("x").concretized()
+        spec = spack.concretize.concretize_one("x")
         db_c = spack.database.Database(roots[2], layout=layouts[2])
         db_c.add(spec["z"])
@@ -385,7 +423,7 @@ def _check_remove_and_add_package(database: spack.database.Database, spec):
 def _mock_install(spec: str):
-    s = spack.spec.Spec(spec).concretized()
+    s = spack.concretize.concretize_one(spec)
     PackageInstaller([s.package], fake=True, explicit=True).install()
@@ -730,8 +768,7 @@ def test_regression_issue_8036(mutable_database, usr_folder_exists):
     # existing. Even when the package prefix exists, the package should
     # not be considered installed until it is added to the database by
     # the installer with install().
-    s = spack.spec.Spec("externaltool@0.9")
-    s.concretize()
+    s = spack.concretize.concretize_one("externaltool@0.9")
     assert not s.installed
     # Now install the external package and check again the `installed` property
@@ -746,8 +783,7 @@ def test_old_external_entries_prefix(mutable_database):
     jsonschema.validate(db_obj, schema)
-    s = spack.spec.Spec("externaltool")
-    s.concretize()
+    s = spack.concretize.concretize_one("externaltool")
     db_obj["database"]["installs"][s.dag_hash()]["path"] = "None"
@@ -776,8 +812,7 @@ def test_uninstall_by_spec(mutable_database):
 def test_query_unused_specs(mutable_database):
     # This spec installs a fake cmake as a build only dependency
-    s = spack.spec.Spec("simple-inheritance")
-    s.concretize()
+    s = spack.concretize.concretize_one("simple-inheritance")
     PackageInstaller([s.package], fake=True, explicit=True).install()
     si = s.dag_hash()
@@ -819,8 +854,7 @@ def check_unused(roots, deptype, expected):
 def test_query_spec_with_conditional_dependency(mutable_database):
     # The issue is triggered by having dependencies that are
     # conditional on a Boolean variant
-    s = spack.spec.Spec("hdf5~mpi")
-    s.concretize()
+    s = spack.concretize.concretize_one("hdf5~mpi")
     PackageInstaller([s.package], fake=True, explicit=True).install()
     results = spack.store.STORE.db.query_local("hdf5 ^mpich")
@@ -843,7 +877,7 @@ def test_query_virtual_spec(database):
     assert all(name in names for name in ["mpich", "mpich2", "zmpi"])
-def test_failed_spec_path_error(database):
+def test_failed_spec_path_error(mutable_database):
     """Ensure spec not concrete check is covered."""
     s = spack.spec.Spec("pkg-a")
     with pytest.raises(AssertionError, match="concrete spec required"):
@@ -860,7 +894,7 @@ def _is(self, spec):
# Pretend the spec has been failure locked # Pretend the spec has been failure locked
monkeypatch.setattr(spack.database.FailureTracker, "lock_taken", _is) monkeypatch.setattr(spack.database.FailureTracker, "lock_taken", _is)
s = spack.spec.Spec("pkg-a").concretized() s = spack.concretize.concretize_one("pkg-a")
spack.store.STORE.failure_tracker.clear(s) spack.store.STORE.failure_tracker.clear(s)
out = capfd.readouterr()[0] out = capfd.readouterr()[0]
assert "Retaining failure marking" in out assert "Retaining failure marking" in out
@@ -878,7 +912,7 @@ def _is(self, spec):
# Ensure raise OSError when try to remove the non-existent marking # Ensure raise OSError when try to remove the non-existent marking
monkeypatch.setattr(spack.database.FailureTracker, "persistent_mark", _is) monkeypatch.setattr(spack.database.FailureTracker, "persistent_mark", _is)
s = spack.spec.Spec("pkg-a").concretized() s = spack.concretize.concretize_one("pkg-a")
spack.store.STORE.failure_tracker.clear(s, force=True) spack.store.STORE.failure_tracker.clear(s, force=True)
out = capfd.readouterr()[1] out = capfd.readouterr()[1]
assert "Removing failure marking despite lock" in out assert "Removing failure marking despite lock" in out
@@ -893,7 +927,7 @@ def _raise_exc(lock):
raise lk.LockTimeoutError("write", "/mock-lock", 1.234, 10) raise lk.LockTimeoutError("write", "/mock-lock", 1.234, 10)
with tmpdir.as_cwd(): with tmpdir.as_cwd():
s = spack.spec.Spec("pkg-a").concretized() s = spack.concretize.concretize_one("pkg-a")
# Ensure attempt to acquire write lock on the mark raises the exception # Ensure attempt to acquire write lock on the mark raises the exception
monkeypatch.setattr(lk.Lock, "acquire_write", _raise_exc) monkeypatch.setattr(lk.Lock, "acquire_write", _raise_exc)
@@ -909,7 +943,7 @@ def _raise_exc(lock):
def test_prefix_failed(mutable_database, monkeypatch): def test_prefix_failed(mutable_database, monkeypatch):
"""Add coverage to failed operation.""" """Add coverage to failed operation."""
s = spack.spec.Spec("pkg-a").concretized() s = spack.concretize.concretize_one("pkg-a")
# Confirm the spec is not already marked as failed # Confirm the spec is not already marked as failed
assert not spack.store.STORE.failure_tracker.has_failed(s) assert not spack.store.STORE.failure_tracker.has_failed(s)
@@ -933,7 +967,7 @@ def test_prefix_write_lock_error(mutable_database, monkeypatch):
def _raise(db, spec): def _raise(db, spec):
raise lk.LockError("Mock lock error") raise lk.LockError("Mock lock error")
s = spack.spec.Spec("pkg-a").concretized() s = spack.concretize.concretize_one("pkg-a")
# Ensure subsequent lock operations fail # Ensure subsequent lock operations fail
monkeypatch.setattr(lk.Lock, "acquire_write", _raise) monkeypatch.setattr(lk.Lock, "acquire_write", _raise)
@@ -950,7 +984,7 @@ def test_database_works_with_empty_dir(tmpdir):
db_dir = tmpdir.ensure_dir(".spack-db") db_dir = tmpdir.ensure_dir(".spack-db")
db_dir.ensure("lock") db_dir.ensure("lock")
db_dir.ensure_dir("failures") db_dir.ensure_dir("failures")
tmpdir.chmod(mode=0o555, rec=1) tmpdir.chmod(mode=0o555)
db = spack.database.Database(str(tmpdir)) db = spack.database.Database(str(tmpdir))
with db.read_transaction(): with db.read_transaction():
db.query() db.query()
@@ -1105,6 +1139,8 @@ def test_error_message_when_using_too_new_db(database, monkeypatch):
def test_database_construction_doesnt_use_globals(tmpdir, config, nullify_globals, lock_cfg): def test_database_construction_doesnt_use_globals(tmpdir, config, nullify_globals, lock_cfg):
lock_cfg = lock_cfg or spack.database.lock_configuration(config) lock_cfg = lock_cfg or spack.database.lock_configuration(config)
db = spack.database.Database(str(tmpdir), lock_cfg=lock_cfg) db = spack.database.Database(str(tmpdir), lock_cfg=lock_cfg)
with db.write_transaction():
pass # ensure the DB is written
assert os.path.exists(db.database_directory) assert os.path.exists(db.database_directory)
@@ -1125,15 +1161,13 @@ def test_database_read_works_with_trailing_data(tmp_path, default_mock_concretiz
assert spack.database.Database(root).query_local() == specs_in_db assert spack.database.Database(root).query_local() == specs_in_db
def test_database_errors_with_just_a_version_key(tmp_path): def test_database_errors_with_just_a_version_key(mutable_database):
root = str(tmp_path)
db = spack.database.Database(root)
next_version = f"{spack.database._DB_VERSION}.next" next_version = f"{spack.database._DB_VERSION}.next"
with open(db._index_path, "w", encoding="utf-8") as f: with open(mutable_database._index_path, "w", encoding="utf-8") as f:
f.write(json.dumps({"database": {"version": next_version}})) f.write(json.dumps({"database": {"version": next_version}}))
with pytest.raises(spack.database.InvalidDatabaseVersionError): with pytest.raises(spack.database.InvalidDatabaseVersionError):
spack.database.Database(root).query_local() spack.database.Database(mutable_database.root).query_local()
def test_reindex_with_upstreams(tmp_path, monkeypatch, mock_packages, config): def test_reindex_with_upstreams(tmp_path, monkeypatch, mock_packages, config):
@@ -1141,7 +1175,7 @@ def test_reindex_with_upstreams(tmp_path, monkeypatch, mock_packages, config):
# we install `mpileaks` locally with dependencies in the upstream. And we even install # we install `mpileaks` locally with dependencies in the upstream. And we even install
# `mpileaks` with the same hash in the upstream. After reindexing, `mpileaks` should still be # `mpileaks` with the same hash in the upstream. After reindexing, `mpileaks` should still be
# in the local db, and `callpath` should not. # in the local db, and `callpath` should not.
mpileaks = spack.spec.Spec("mpileaks").concretized() mpileaks = spack.concretize.concretize_one("mpileaks")
callpath = mpileaks.dependencies("callpath")[0] callpath = mpileaks.dependencies("callpath")[0]
upstream_store = spack.store.create( upstream_store = spack.store.create(


@@ -5,6 +5,7 @@
import pytest
+import spack.concretize
import spack.directives
import spack.repo
import spack.spec
@@ -59,8 +60,8 @@ def test_constraints_from_context_are_merged(mock_packages):
@pytest.mark.regression("27754")
def test_extends_spec(config, mock_packages):
-extender = spack.spec.Spec("extends-spec").concretized()
-extendee = spack.spec.Spec("extendee").concretized()
+extender = spack.concretize.concretize_one("extends-spec")
+extendee = spack.concretize.concretize_one("extendee")
assert extender.dependencies
assert extender.package.extends(extendee)
@@ -206,7 +207,7 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
def test_redistribute_directive(test_repo, spec_str, distribute_src, distribute_bin):
spec = spack.spec.Spec(spec_str)
assert spec.package_class.redistribute_source(spec) == distribute_src
-concretized_spec = spec.concretized()
+concretized_spec = spack.concretize.concretize_one(spec)
assert concretized_spec.package.redistribute_binary == distribute_bin


@@ -13,6 +13,7 @@
from llnl.path import path_to_os_path
+import spack.concretize
import spack.hash_types
import spack.paths
import spack.repo
@@ -59,7 +60,7 @@ def test_yaml_directory_layout_parameters(tmpdir, default_mock_concretization):
assert package7 == path_package7
# Test separation of architecture or namespace
-spec2 = Spec("libelf").concretized()
+spec2 = spack.concretize.concretize_one("libelf")
arch_scheme = (
"{architecture.platform}/{architecture.target}/{architecture.os}/{name}/{version}/{hash:7}"
@@ -97,7 +98,7 @@ def test_read_and_write_spec(temporary_store, config, mock_packages):
# If a spec fails to concretize, just skip it. If it is a
# real error, it will be caught by concretization tests.
try:
-spec = spack.spec.Spec(name).concretized()
+spec = spack.concretize.concretize_one(name)
except Exception:
continue
@@ -136,7 +137,7 @@ def test_read_and_write_spec(temporary_store, config, mock_packages):
assert norm.eq_dag(spec_from_file)
# TODO: revise this when build deps are in dag_hash
-conc = read_separately.concretized().copy(deps=stored_deptypes)
+conc = spack.concretize.concretize_one(read_separately).copy(deps=stored_deptypes)
assert conc == spec_from_file
assert conc.eq_dag(spec_from_file)
@@ -172,12 +173,10 @@ def test_handle_unknown_package(temporary_store, config, mock_packages, tmp_path
# Create all the packages that are not in mock.
installed_specs = {}
for pkg_name in packages:
-spec = spack.spec.Spec(pkg_name)
# If a spec fails to concretize, just skip it. If it is a
# real error, it will be caught by concretization tests.
try:
-spec.concretize()
+spec = spack.concretize.concretize_one(pkg_name)
except Exception:
continue
@@ -209,7 +208,7 @@ def test_find(temporary_store, config, mock_packages):
if name.startswith("external"):
# External package tests cannot be installed
continue
-spec = spack.spec.Spec(name).concretized()
+spec = spack.concretize.concretize_one(name)
installed_specs[spec.name] = spec
layout.create_install_directory(spec)


@@ -7,7 +7,7 @@
import pytest
import spack.build_environment
-import spack.spec
+import spack.concretize
from spack.package import build_system_flags, env_flags, inject_flags
@@ -30,10 +30,10 @@ def add_o3_to_build_system_cflags(pkg, name, flags):
class TestFlagHandlers:
def test_no_build_system_flags(self, temp_env):
# Test that both autotools and cmake work getting no build_system flags
-s1 = spack.spec.Spec("cmake-client").concretized()
+s1 = spack.concretize.concretize_one("cmake-client")
spack.build_environment.setup_package(s1.package, False)
-s2 = spack.spec.Spec("patchelf").concretized()
+s2 = spack.concretize.concretize_one("patchelf")
spack.build_environment.setup_package(s2.package, False)
# Use cppflags as a canary
@@ -43,28 +43,28 @@ def test_no_build_system_flags(self, temp_env):
def test_unbound_method(self, temp_env):
# Other tests test flag_handlers set as bound methods and functions.
# This tests an unbound method in python2 (no change in python3).
-s = spack.spec.Spec("mpileaks cppflags=-g").concretized()
+s = spack.concretize.concretize_one("mpileaks cppflags=-g")
s.package.flag_handler = s.package.__class__.inject_flags
spack.build_environment.setup_package(s.package, False)
assert os.environ["SPACK_CPPFLAGS"] == "-g"
assert "CPPFLAGS" not in os.environ
def test_inject_flags(self, temp_env):
-s = spack.spec.Spec("mpileaks cppflags=-g").concretized()
+s = spack.concretize.concretize_one("mpileaks cppflags=-g")
s.package.flag_handler = inject_flags
spack.build_environment.setup_package(s.package, False)
assert os.environ["SPACK_CPPFLAGS"] == "-g"
assert "CPPFLAGS" not in os.environ
def test_env_flags(self, temp_env):
-s = spack.spec.Spec("mpileaks cppflags=-g").concretized()
+s = spack.concretize.concretize_one("mpileaks cppflags=-g")
s.package.flag_handler = env_flags
spack.build_environment.setup_package(s.package, False)
assert os.environ["CPPFLAGS"] == "-g"
assert "SPACK_CPPFLAGS" not in os.environ
def test_build_system_flags_cmake(self, temp_env):
-s = spack.spec.Spec("cmake-client cppflags=-g").concretized()
+s = spack.concretize.concretize_one("cmake-client cppflags=-g")
s.package.flag_handler = build_system_flags
spack.build_environment.setup_package(s.package, False)
assert "SPACK_CPPFLAGS" not in os.environ
@@ -76,7 +76,7 @@ def test_build_system_flags_cmake(self, temp_env):
}
def test_build_system_flags_autotools(self, temp_env):
-s = spack.spec.Spec("patchelf cppflags=-g").concretized()
+s = spack.concretize.concretize_one("patchelf cppflags=-g")
s.package.flag_handler = build_system_flags
spack.build_environment.setup_package(s.package, False)
assert "SPACK_CPPFLAGS" not in os.environ
@@ -85,7 +85,7 @@ def test_build_system_flags_autotools(self, temp_env):
def test_build_system_flags_not_implemented(self, temp_env):
"""Test the command line flags method raises a NotImplementedError"""
-s = spack.spec.Spec("mpileaks cppflags=-g").concretized()
+s = spack.concretize.concretize_one("mpileaks cppflags=-g")
s.package.flag_handler = build_system_flags
try:
spack.build_environment.setup_package(s.package, False)
@@ -94,7 +94,7 @@ def test_build_system_flags_not_implemented(self, temp_env):
assert True
def test_add_build_system_flags_autotools(self, temp_env):
-s = spack.spec.Spec("patchelf cppflags=-g").concretized()
+s = spack.concretize.concretize_one("patchelf cppflags=-g")
s.package.flag_handler = add_o3_to_build_system_cflags
spack.build_environment.setup_package(s.package, False)
assert "-g" in os.environ["SPACK_CPPFLAGS"]
@@ -102,7 +102,7 @@ def test_add_build_system_flags_autotools(self, temp_env):
assert s.package.configure_flag_args == ["CFLAGS=-O3"]
def test_add_build_system_flags_cmake(self, temp_env):
-s = spack.spec.Spec("cmake-client cppflags=-g").concretized()
+s = spack.concretize.concretize_one("cmake-client cppflags=-g")
s.package.flag_handler = add_o3_to_build_system_cflags
spack.build_environment.setup_package(s.package, False)
assert "-g" in os.environ["SPACK_CPPFLAGS"]
@@ -110,7 +110,7 @@ def test_add_build_system_flags_cmake(self, temp_env):
assert s.package.cmake_flag_args == ["-DCMAKE_C_FLAGS=-O3"]
def test_ld_flags_cmake(self, temp_env):
-s = spack.spec.Spec("cmake-client ldflags=-mthreads").concretized()
+s = spack.concretize.concretize_one("cmake-client ldflags=-mthreads")
s.package.flag_handler = build_system_flags
spack.build_environment.setup_package(s.package, False)
assert "SPACK_LDFLAGS" not in os.environ
@@ -122,7 +122,7 @@ def test_ld_flags_cmake(self, temp_env):
}
def test_ld_libs_cmake(self, temp_env):
-s = spack.spec.Spec("cmake-client ldlibs=-lfoo").concretized()
+s = spack.concretize.concretize_one("cmake-client ldlibs=-lfoo")
s.package.flag_handler = build_system_flags
spack.build_environment.setup_package(s.package, False)
assert "SPACK_LDLIBS" not in os.environ
@@ -138,7 +138,7 @@ def test_flag_handler(self, name, flags):
flags.append("-foo")
return (flags, None, None)
-s = spack.spec.Spec("cmake-client").concretized()
+s = spack.concretize.concretize_one("cmake-client")
s.package.flag_handler = test_flag_handler
spack.build_environment.setup_package(s.package, False)


@@ -10,6 +10,7 @@
from llnl.util.filesystem import mkdirp, touch, working_dir
+import spack.concretize
import spack.config
import spack.error
import spack.fetch_strategy
@@ -185,8 +186,9 @@ def test_adhoc_version_submodules(
monkeypatch.setitem(pkg_class.versions, Version("git"), t.args)
monkeypatch.setattr(pkg_class, "git", "file://%s" % mock_git_repository.path, raising=False)
-spec = Spec("git-test@{0}".format(mock_git_repository.unversioned_commit))
-spec.concretize()
+spec = spack.concretize.concretize_one(
+    Spec("git-test@{0}".format(mock_git_repository.unversioned_commit))
+)
spec.package.do_stage()
collected_fnames = set()
for root, dirs, files in os.walk(spec.package.stage.source_path):


@@ -3,8 +3,8 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import io
+import spack.concretize
import spack.graph
-import spack.spec
def test_dynamic_dot_graph_mpileaks(default_mock_concretization):
@@ -38,7 +38,7 @@ def test_dynamic_dot_graph_mpileaks(default_mock_concretization):
def test_ascii_graph_mpileaks(config, mock_packages, monkeypatch):
monkeypatch.setattr(spack.graph.AsciiGraph, "_node_label", lambda self, node: node.name)
-s = spack.spec.Spec("mpileaks").concretized()
+s = spack.concretize.concretize_one("mpileaks")
stream = io.StringIO()
graph = spack.graph.AsciiGraph()


@@ -8,9 +8,9 @@
from llnl.util.filesystem import mkdirp, touch, working_dir
+import spack.concretize
import spack.config
from spack.fetch_strategy import HgFetchStrategy
-from spack.spec import Spec
from spack.stage import Stage
from spack.util.executable import which
from spack.version import Version
@@ -40,7 +40,7 @@ def test_fetch(type_of_test, secure, mock_hg_repository, config, mutable_mock_re
h = mock_hg_repository.hash
# Construct the package under test
-s = Spec("hg-test").concretized()
+s = spack.concretize.concretize_one("hg-test")
monkeypatch.setitem(s.package.versions, Version("hg"), t.args)
# Enter the stage directory and check some properties


@@ -11,6 +11,7 @@
import llnl.util.filesystem as fs
import spack.build_environment
+import spack.concretize
import spack.config
import spack.database
import spack.error
@@ -41,7 +42,7 @@ def find_nothing(*args):
def test_install_and_uninstall(install_mockery, mock_fetch, monkeypatch):
-spec = Spec("trivial-install-test-package").concretized()
+spec = spack.concretize.concretize_one("trivial-install-test-package")
PackageInstaller([spec.package], explicit=True).install()
assert spec.installed
@@ -53,7 +54,7 @@ def test_install_and_uninstall(install_mockery, mock_fetch, monkeypatch):
@pytest.mark.regression("11870")
def test_uninstall_non_existing_package(install_mockery, mock_fetch, monkeypatch):
"""Ensure that we can uninstall a package that has been deleted from the repo"""
-spec = Spec("trivial-install-test-package").concretized()
+spec = spack.concretize.concretize_one("trivial-install-test-package")
PackageInstaller([spec.package], explicit=True).install()
assert spec.installed
@@ -71,8 +72,7 @@ def test_uninstall_non_existing_package(install_mockery, mock_fetch, monkeypatch
def test_pkg_attributes(install_mockery, mock_fetch, monkeypatch):
# Get a basic concrete spec for the dummy package.
-spec = Spec("attributes-foo-app ^attributes-foo")
-spec.concretize()
+spec = spack.concretize.concretize_one("attributes-foo-app ^attributes-foo")
assert spec.concrete
pkg = spec.package
@@ -127,7 +127,7 @@ def remove_prefix(self):
def test_partial_install_delete_prefix_and_stage(install_mockery, mock_fetch, working_env):
-s = Spec("canfail").concretized()
+s = spack.concretize.concretize_one("canfail")
instance_rm_prefix = s.package.remove_prefix
@@ -157,7 +157,7 @@ def test_failing_overwrite_install_should_keep_previous_installation(
the original install prefix instead of cleaning it.
"""
# Do a successful install
-s = Spec("canfail").concretized()
+s = spack.concretize.concretize_one("canfail")
s.package.set_install_succeed()
# Do a failing overwrite install
@@ -173,13 +173,11 @@ def test_failing_overwrite_install_should_keep_previous_installation(
def test_dont_add_patches_to_installed_package(install_mockery, mock_fetch, monkeypatch):
-dependency = Spec("dependency-install")
-dependency.concretize()
+dependency = spack.concretize.concretize_one("dependency-install")
PackageInstaller([dependency.package], explicit=True).install()
dependency_hash = dependency.dag_hash()
-dependent = Spec("dependent-install ^/" + dependency_hash)
-dependent.concretize()
+dependent = spack.concretize.concretize_one("dependent-install ^/" + dependency_hash)
monkeypatch.setitem(
dependency.package.patches,
@@ -191,20 +189,18 @@ def test_dont_add_patches_to_installed_package(install_mockery, mock_fetch, monk
def test_installed_dependency_request_conflicts(install_mockery, mock_fetch, mutable_mock_repo):
-dependency = Spec("dependency-install")
-dependency.concretize()
+dependency = spack.concretize.concretize_one("dependency-install")
PackageInstaller([dependency.package], explicit=True).install()
dependency_hash = dependency.dag_hash()
dependent = Spec("conflicting-dependent ^/" + dependency_hash)
with pytest.raises(spack.error.UnsatisfiableSpecError):
-dependent.concretize()
+spack.concretize.concretize_one(dependent)
def test_install_dependency_symlinks_pkg(install_mockery, mock_fetch, mutable_mock_repo):
"""Test dependency flattening/symlinks mock package."""
-spec = Spec("flatten-deps")
-spec.concretize()
+spec = spack.concretize.concretize_one("flatten-deps")
pkg = spec.package
PackageInstaller([pkg], explicit=True).install()
@@ -215,7 +211,7 @@ def test_install_dependency_symlinks_pkg(install_mockery, mock_fetch, mutable_mo
def test_install_times(install_mockery, mock_fetch, mutable_mock_repo):
"""Test install times added."""
-spec = Spec("dev-build-test-install-phases").concretized()
+spec = spack.concretize.concretize_one("dev-build-test-install-phases")
PackageInstaller([spec.package], explicit=True).install()
# Ensure dependency directory exists after the installation.
@@ -236,8 +232,7 @@ def test_flatten_deps(install_mockery, mock_fetch, mutable_mock_repo):
"""Explicitly test the flattening code for coverage purposes."""
# Unfortunately, executing the 'flatten-deps' spec's installation does
# not affect code coverage results, so be explicit here.
-spec = Spec("dependent-install")
-spec.concretize()
+spec = spack.concretize.concretize_one("dependent-install")
pkg = spec.package
PackageInstaller([pkg], explicit=True).install()
@@ -272,7 +267,7 @@ def install_upstream(tmpdir_factory, gen_mock_layout, install_mockery):
def _install_upstream(*specs):
for spec_str in specs:
-prepared_db.add(Spec(spec_str).concretized())
+prepared_db.add(spack.concretize.concretize_one(spec_str))
downstream_root = str(tmpdir_factory.mktemp("mock_downstream_db_root"))
return downstream_root, upstream_layout
@@ -285,8 +280,7 @@ def test_installed_upstream_external(install_upstream, mock_fetch):
"""
store_root, _ = install_upstream("externaltool")
with spack.store.use_store(store_root):
-dependent = Spec("externaltest")
-dependent.concretize()
+dependent = spack.concretize.concretize_one("externaltest")
new_dependency = dependent["externaltool"]
assert new_dependency.external
@@ -304,8 +298,8 @@ def test_installed_upstream(install_upstream, mock_fetch):
"""
store_root, upstream_layout = install_upstream("dependency-install")
with spack.store.use_store(store_root):
-dependency = Spec("dependency-install").concretized()
-dependent = Spec("dependent-install").concretized()
+dependency = spack.concretize.concretize_one("dependency-install")
+dependent = spack.concretize.concretize_one("dependent-install")
new_dependency = dependent["dependency-install"]
assert new_dependency.installed_upstream
@@ -319,7 +313,7 @@ def test_installed_upstream(install_upstream, mock_fetch):
@pytest.mark.disable_clean_stage_check
def test_partial_install_keep_prefix(install_mockery, mock_fetch, monkeypatch, working_env):
-s = Spec("canfail").concretized()
+s = spack.concretize.concretize_one("canfail")
# If remove_prefix is called at any point in this test, that is an error
monkeypatch.setattr(spack.package_base.PackageBase, "remove_prefix", mock_remove_prefix)
@@ -336,7 +330,7 @@ def test_partial_install_keep_prefix(install_mockery, mock_fetch, monkeypatch, w
def test_second_install_no_overwrite_first(install_mockery, mock_fetch, monkeypatch):
-s = Spec("canfail").concretized()
+s = spack.concretize.concretize_one("canfail")
monkeypatch.setattr(spack.package_base.PackageBase, "remove_prefix", mock_remove_prefix)
s.package.set_install_succeed()
@@ -356,8 +350,8 @@ def test_install_prefix_collision_fails(config, mock_fetch, mock_packages, tmpdi
projections = {"projections": {"all": "one-prefix-per-package-{name}"}}
with spack.store.use_store(str(tmpdir), extra_data=projections):
with spack.config.override("config:checksum", False):
-pkg_a = Spec("libelf@0.8.13").concretized().package
+pkg_a = spack.concretize.concretize_one("libelf@0.8.13").package
pkg_b = Spec("libelf@0.8.12").concretized().package pkg_b = spack.concretize.concretize_one("libelf@0.8.12").package
PackageInstaller([pkg_a], explicit=True, fake=True).install() PackageInstaller([pkg_a], explicit=True, fake=True).install()
with pytest.raises(InstallError, match="Install prefix collision"): with pytest.raises(InstallError, match="Install prefix collision"):
@@ -365,14 +359,14 @@ def test_install_prefix_collision_fails(config, mock_fetch, mock_packages, tmpdi
def test_store(install_mockery, mock_fetch): def test_store(install_mockery, mock_fetch):
spec = Spec("cmake-client").concretized() spec = spack.concretize.concretize_one("cmake-client")
pkg = spec.package pkg = spec.package
PackageInstaller([pkg], fake=True, explicit=True).install() PackageInstaller([pkg], fake=True, explicit=True).install()
@pytest.mark.disable_clean_stage_check @pytest.mark.disable_clean_stage_check
def test_failing_build(install_mockery, mock_fetch, capfd): def test_failing_build(install_mockery, mock_fetch, capfd):
spec = Spec("failing-build").concretized() spec = spack.concretize.concretize_one("failing-build")
pkg = spec.package pkg = spec.package
with pytest.raises(spack.build_environment.ChildError, match="Expected failure"): with pytest.raises(spack.build_environment.ChildError, match="Expected failure"):
@@ -387,8 +381,7 @@ def test_uninstall_by_spec_errors(mutable_database):
"""Test exceptional cases with the uninstall command.""" """Test exceptional cases with the uninstall command."""
# Try to uninstall a spec that has not been installed # Try to uninstall a spec that has not been installed
spec = Spec("dependent-install") spec = spack.concretize.concretize_one("dependent-install")
spec.concretize()
with pytest.raises(InstallError, match="is not installed"): with pytest.raises(InstallError, match="is not installed"):
PackageBase.uninstall_by_spec(spec) PackageBase.uninstall_by_spec(spec)
@@ -401,7 +394,7 @@ def test_uninstall_by_spec_errors(mutable_database):
@pytest.mark.disable_clean_stage_check @pytest.mark.disable_clean_stage_check
def test_nosource_pkg_install(install_mockery, mock_fetch, mock_packages, capfd, ensure_debug): def test_nosource_pkg_install(install_mockery, mock_fetch, mock_packages, capfd, ensure_debug):
"""Test install phases with the nosource package.""" """Test install phases with the nosource package."""
spec = Spec("nosource").concretized() spec = spack.concretize.concretize_one("nosource")
pkg = spec.package pkg = spec.package
# Make sure install works even though there is no associated code. # Make sure install works even though there is no associated code.
@@ -418,7 +411,7 @@ def test_nosource_bundle_pkg_install(
install_mockery, mock_fetch, mock_packages, capfd, ensure_debug install_mockery, mock_fetch, mock_packages, capfd, ensure_debug
): ):
"""Test install phases with the nosource-bundle package.""" """Test install phases with the nosource-bundle package."""
spec = Spec("nosource-bundle").concretized() spec = spack.concretize.concretize_one("nosource-bundle")
pkg = spec.package pkg = spec.package
# Make sure install works even though there is no associated code. # Make sure install works even though there is no associated code.
@@ -432,7 +425,7 @@ def test_nosource_bundle_pkg_install(
def test_nosource_pkg_install_post_install(install_mockery, mock_fetch, mock_packages): def test_nosource_pkg_install_post_install(install_mockery, mock_fetch, mock_packages):
"""Test install phases with the nosource package with post-install.""" """Test install phases with the nosource package with post-install."""
spec = Spec("nosource-install").concretized() spec = spack.concretize.concretize_one("nosource-install")
pkg = spec.package pkg = spec.package
# Make sure both the install and post-install package methods work. # Make sure both the install and post-install package methods work.
@@ -449,14 +442,14 @@ def test_nosource_pkg_install_post_install(install_mockery, mock_fetch, mock_pac
def test_pkg_build_paths(install_mockery): def test_pkg_build_paths(install_mockery):
# Get a basic concrete spec for the trivial install package. # Get a basic concrete spec for the trivial install package.
spec = Spec("trivial-install-test-package").concretized() spec = spack.concretize.concretize_one("trivial-install-test-package")
assert spec.package.log_path.endswith(_spack_build_logfile) assert spec.package.log_path.endswith(_spack_build_logfile)
assert spec.package.env_path.endswith(_spack_build_envfile) assert spec.package.env_path.endswith(_spack_build_envfile)
def test_pkg_install_paths(install_mockery): def test_pkg_install_paths(install_mockery):
# Get a basic concrete spec for the trivial install package. # Get a basic concrete spec for the trivial install package.
spec = Spec("trivial-install-test-package").concretized() spec = spack.concretize.concretize_one("trivial-install-test-package")
log_path = os.path.join(spec.prefix, ".spack", _spack_build_logfile + ".gz") log_path = os.path.join(spec.prefix, ".spack", _spack_build_logfile + ".gz")
assert spec.package.install_log_path == log_path assert spec.package.install_log_path == log_path
@@ -493,7 +486,7 @@ def test_pkg_install_paths(install_mockery):
def test_log_install_without_build_files(install_mockery): def test_log_install_without_build_files(install_mockery):
"""Test the installer log function when no build files are present.""" """Test the installer log function when no build files are present."""
# Get a basic concrete spec for the trivial install package. # Get a basic concrete spec for the trivial install package.
spec = Spec("trivial-install-test-package").concretized() spec = spack.concretize.concretize_one("trivial-install-test-package")
# Attempt installing log without the build log file # Attempt installing log without the build log file
with pytest.raises(IOError, match="No such file or directory"): with pytest.raises(IOError, match="No such file or directory"):
@@ -515,7 +508,7 @@ def _install(src, dest):
monkeypatch.setattr(fs, "install", _install) monkeypatch.setattr(fs, "install", _install)
spec = Spec("trivial-install-test-package").concretized() spec = spack.concretize.concretize_one("trivial-install-test-package")
# Set up mock build files and try again to include archive failure # Set up mock build files and try again to include archive failure
log_path = spec.package.log_path log_path = spec.package.log_path
@@ -587,7 +580,7 @@ def test_empty_install_sanity_check_prefix(
monkeypatch, install_mockery, mock_fetch, mock_packages monkeypatch, install_mockery, mock_fetch, mock_packages
): ):
"""Test empty install triggers sanity_check_prefix.""" """Test empty install triggers sanity_check_prefix."""
spec = Spec("failing-empty-install").concretized() spec = spack.concretize.concretize_one("failing-empty-install")
with pytest.raises(spack.build_environment.ChildError, match="Nothing was installed"): with pytest.raises(spack.build_environment.ChildError, match="Nothing was installed"):
PackageInstaller([spec.package], explicit=True).install() PackageInstaller([spec.package], explicit=True).install()
@@ -599,7 +592,7 @@ def test_install_from_binary_with_missing_patch_succeeds(
pushing the package to a binary cache, installation from that binary cache shouldn't error out pushing the package to a binary cache, installation from that binary cache shouldn't error out
because of the missing patch.""" because of the missing patch."""
# Create a spec s with non-existing patches # Create a spec s with non-existing patches
s = Spec("trivial-install-test-package").concretized() s = spack.concretize.concretize_one("trivial-install-test-package")
patches = ["a" * 64] patches = ["a" * 64]
s_dict = s.to_dict() s_dict = s.to_dict()
s_dict["spec"]["nodes"][0]["patches"] = patches s_dict["spec"]["nodes"][0]["patches"] = patches


@@ -16,6 +16,7 @@
 import llnl.util.tty as tty
 import spack.binary_distribution
+import spack.concretize
 import spack.database
 import spack.deptypes as dt
 import spack.error
@@ -81,7 +82,7 @@ def create_installer(
 ) -> inst.PackageInstaller:
 """Create an installer instance for a list of specs or package names that will be
 concretized."""
-_specs = [spack.spec.Spec(s).concretized() if isinstance(s, str) else s for s in specs]
+_specs = [spack.concretize.concretize_one(s) if isinstance(s, str) else s for s in specs]
 _install_args = {} if install_args is None else install_args
 return inst.PackageInstaller([spec.package for spec in _specs], **_install_args)
@@ -96,8 +97,7 @@ def test_hms(sec, result):
 def test_get_dependent_ids(install_mockery, mock_packages):
 # Concretize the parent package, which handle dependency too
-spec = spack.spec.Spec("pkg-a")
-spec.concretize()
+spec = spack.concretize.concretize_one("pkg-a")
 assert spec.concrete
 pkg_id = inst.package_id(spec)
@@ -133,8 +133,7 @@ def test_install_msg(monkeypatch):
 def test_install_from_cache_errors(install_mockery):
 """Test to ensure cover install from cache errors."""
-spec = spack.spec.Spec("trivial-install-test-package")
-spec.concretize()
+spec = spack.concretize.concretize_one("trivial-install-test-package")
 assert spec.concrete
 # Check with cache-only
@@ -153,8 +152,7 @@ def test_install_from_cache_errors(install_mockery):
 def test_install_from_cache_ok(install_mockery, monkeypatch):
 """Test to ensure cover _install_from_cache to the return."""
-spec = spack.spec.Spec("trivial-install-test-package")
-spec.concretize()
+spec = spack.concretize.concretize_one("trivial-install-test-package")
 monkeypatch.setattr(inst, "_try_install_from_binary_cache", _true)
 monkeypatch.setattr(spack.hooks, "post_install", _noop)
@@ -163,8 +161,7 @@ def test_install_from_cache_ok(install_mockery, monkeypatch):
 def test_process_external_package_module(install_mockery, monkeypatch, capfd):
 """Test to simply cover the external module message path."""
-spec = spack.spec.Spec("trivial-install-test-package")
-spec.concretize()
+spec = spack.concretize.concretize_one("trivial-install-test-package")
 assert spec.concrete
 # Ensure take the external module path WITHOUT any changes to the database
@@ -191,7 +188,7 @@ def _spec(spec, unsigned=False, mirrors_for_spec=None):
 # Skip database updates
 monkeypatch.setattr(spack.database.Database, "add", _noop)
-spec = spack.spec.Spec("pkg-a").concretized()
+spec = spack.concretize.concretize_one("pkg-a")
 assert inst._process_binary_cache_tarball(spec.package, explicit=False, unsigned=False)
 out = capfd.readouterr()[0]
@@ -201,8 +198,7 @@ def _spec(spec, unsigned=False, mirrors_for_spec=None):
 def test_try_install_from_binary_cache(install_mockery, mock_packages, monkeypatch):
 """Test return false when no match exists in the mirror"""
-spec = spack.spec.Spec("mpich")
-spec.concretize()
+spec = spack.concretize.concretize_one("mpich")
 result = inst._try_install_from_binary_cache(spec.package, False, False)
 assert not result
@@ -274,7 +270,7 @@ def _mock_installed(self):
 def test_check_before_phase_error(install_mockery):
-s = spack.spec.Spec("trivial-install-test-package").concretized()
+s = spack.concretize.concretize_one("trivial-install-test-package")
 s.package.stop_before_phase = "beforephase"
 with pytest.raises(inst.BadInstallPhase) as exc_info:
 inst._check_last_phase(s.package)
@@ -285,7 +281,7 @@ def test_check_before_phase_error(install_mockery):
 def test_check_last_phase_error(install_mockery):
-s = spack.spec.Spec("trivial-install-test-package").concretized()
+s = spack.concretize.concretize_one("trivial-install-test-package")
 s.package.stop_before_phase = None
 s.package.last_phase = "badphase"
 with pytest.raises(inst.BadInstallPhase) as exc_info:
@@ -420,15 +416,13 @@ def test_package_id_err(install_mockery):
 def test_package_id_ok(install_mockery):
-spec = spack.spec.Spec("trivial-install-test-package")
-spec.concretize()
+spec = spack.concretize.concretize_one("trivial-install-test-package")
 assert spec.concrete
 assert spec.name in inst.package_id(spec)
 def test_fake_install(install_mockery):
-spec = spack.spec.Spec("trivial-install-test-package")
-spec.concretize()
+spec = spack.concretize.concretize_one("trivial-install-test-package")
 assert spec.concrete
 pkg = spec.package
@@ -440,7 +434,7 @@ def test_dump_packages_deps_ok(install_mockery, tmpdir, mock_packages):
 """Test happy path for dump_packages with dependencies."""
 spec_name = "simple-inheritance"
-spec = spack.spec.Spec(spec_name).concretized()
+spec = spack.concretize.concretize_one(spec_name)
 inst.dump_packages(spec, str(tmpdir))
 repo = mock_packages.repos[0]
@@ -471,7 +465,7 @@ def _repoerr(repo, name):
 # the try-except block
 monkeypatch.setattr(spack.store.STORE.layout, "build_packages_path", bpp_path)
-spec = spack.spec.Spec("simple-inheritance").concretized()
+spec = spack.concretize.concretize_one("simple-inheritance")
 path = str(tmpdir)
 # The call to install_tree will raise the exception since not mocking
@@ -581,7 +575,7 @@ def test_check_deps_status_install_failure(install_mockery):
 """Tests that checking the dependency status on a request to install
 'a' fails, if we mark the dependency as failed.
 """
-s = spack.spec.Spec("pkg-a").concretized()
+s = spack.concretize.concretize_one("pkg-a")
 for dep in s.traverse(root=False):
 spack.store.STORE.failure_tracker.mark(dep)
@@ -654,8 +648,8 @@ def test_installer_init_requests(install_mockery):
 @pytest.mark.parametrize("transitive", [True, False])
 def test_install_spliced(install_mockery, mock_fetch, monkeypatch, capsys, transitive):
 """Test installing a spliced spec"""
-spec = spack.spec.Spec("splice-t").concretized()
-dep = spack.spec.Spec("splice-h+foo").concretized()
+spec = spack.concretize.concretize_one("splice-t")
+dep = spack.concretize.concretize_one("splice-h+foo")
 # Do the splice.
 out = spec.splice(dep, transitive)
@@ -669,8 +663,8 @@ def test_install_spliced(install_mockery, mock_fetch, monkeypatch, capsys, trans
 @pytest.mark.parametrize("transitive", [True, False])
 def test_install_spliced_build_spec_installed(install_mockery, capfd, mock_fetch, transitive):
 """Test installing a spliced spec with the build spec already installed"""
-spec = spack.spec.Spec("splice-t").concretized()
-dep = spack.spec.Spec("splice-h+foo").concretized()
+spec = spack.concretize.concretize_one("splice-t")
+dep = spack.concretize.concretize_one("splice-h+foo")
 # Do the splice.
 out = spec.splice(dep, transitive)
@@ -696,8 +690,8 @@ def test_install_splice_root_from_binary(
 ):
 """Test installing a spliced spec with the root available in binary cache"""
 # Test splicing and rewiring a spec with the same name, different hash.
-original_spec = spack.spec.Spec(root_str).concretized()
-spec_to_splice = spack.spec.Spec("splice-h+foo").concretized()
+original_spec = spack.concretize.concretize_one(root_str)
+spec_to_splice = spack.concretize.concretize_one("splice-h+foo")
 PackageInstaller([original_spec.package, spec_to_splice.package]).install()
@@ -853,7 +847,7 @@ def _chgrp(path, group, follow_symlinks=True):
 monkeypatch.setattr(fs, "chgrp", _chgrp)
 build_task = create_build_task(
-spack.spec.Spec("trivial-install-test-package").concretized().package
+spack.concretize.concretize_one("trivial-install-test-package").package
 )
 spec = build_task.request.pkg.spec
@@ -1024,7 +1018,8 @@ def test_install_fail_multi(install_mockery, mock_fetch, monkeypatch):
 def test_install_fail_fast_on_detect(install_mockery, monkeypatch, capsys):
 """Test fail_fast install when an install failure is detected."""
-b, c = spack.spec.Spec("pkg-b").concretized(), spack.spec.Spec("pkg-c").concretized()
+b = spack.concretize.concretize_one("pkg-b")
+c = spack.concretize.concretize_one("pkg-c")
 b_id, c_id = inst.package_id(b), inst.package_id(c)
 installer = create_installer([b, c], {"fail_fast": True})
@@ -1093,7 +1088,7 @@ def _requeued(installer, task, install_status):
 def test_install_lock_installed_requeue(install_mockery, monkeypatch, capfd):
 """Cover basic install handling for installed package."""
-b = spack.spec.Spec("pkg-b").concretized()
+b = spack.concretize.concretize_one("pkg-b")
 b_pkg_id = inst.package_id(b)
 installer = create_installer([b])
@@ -1279,7 +1274,7 @@ def test_term_status_line():
 @pytest.mark.parametrize("explicit", [True, False])
 def test_single_external_implicit_install(install_mockery, explicit):
 pkg = "trivial-install-test-package"
-s = spack.spec.Spec(pkg).concretized()
+s = spack.concretize.concretize_one(pkg)
 s.external_path = "/usr"
 args = {"explicit": [s.dag_hash()] if explicit else []}
 create_installer([s], args).install()
@@ -1288,7 +1283,7 @@ def test_single_external_implicit_install(install_mockery, explicit):
 def test_overwrite_install_does_install_build_deps(install_mockery, mock_fetch):
 """When overwrite installing something from sources, build deps should be installed."""
-s = spack.spec.Spec("dtrun3").concretized()
+s = spack.concretize.concretize_one("dtrun3")
 create_installer([s]).install()
 # Verify there is a pure build dep
@@ -1310,7 +1305,7 @@ def test_overwrite_install_does_install_build_deps(install_mockery, mock_fetch):
 def test_print_install_test_log_skipped(install_mockery, mock_packages, capfd, run_tests):
 """Confirm printing of install log skipped if not run/no failures."""
 name = "trivial-install-test-package"
-s = spack.spec.Spec(name).concretized()
+s = spack.concretize.concretize_one(name)
 pkg = s.package
 pkg.run_tests = run_tests
@@ -1324,7 +1319,7 @@ def test_print_install_test_log_failures(
 ):
 """Confirm expected outputs when there are test failures."""
 name = "trivial-install-test-package"
-s = spack.spec.Spec(name).concretized()
+s = spack.concretize.concretize_one(name)
 pkg = s.package
 # Missing test log is an error


@@ -29,31 +29,31 @@ def make_executable(tmp_path, working_env):
 def test_make_normal():
-make = MakeExecutable("make", 8)
+make = MakeExecutable("make", jobs=8)
 assert make(output=str).strip() == "-j8"
 assert make("install", output=str).strip() == "-j8 install"
 def test_make_explicit():
-make = MakeExecutable("make", 8)
+make = MakeExecutable("make", jobs=8)
 assert make(parallel=True, output=str).strip() == "-j8"
 assert make("install", parallel=True, output=str).strip() == "-j8 install"
 def test_make_one_job():
-make = MakeExecutable("make", 1)
+make = MakeExecutable("make", jobs=1)
 assert make(output=str).strip() == "-j1"
 assert make("install", output=str).strip() == "-j1 install"
 def test_make_parallel_false():
-make = MakeExecutable("make", 8)
+make = MakeExecutable("make", jobs=8)
 assert make(parallel=False, output=str).strip() == "-j1"
 assert make("install", parallel=False, output=str).strip() == "-j1 install"
 def test_make_parallel_disabled(monkeypatch):
-make = MakeExecutable("make", 8)
+make = MakeExecutable("make", jobs=8)
 monkeypatch.setenv("SPACK_NO_PARALLEL_MAKE", "true")
 assert make(output=str).strip() == "-j1"
@@ -74,7 +74,7 @@ def test_make_parallel_disabled(monkeypatch):
 def test_make_parallel_precedence(monkeypatch):
-make = MakeExecutable("make", 8)
+make = MakeExecutable("make", jobs=8)
 # These should work
 monkeypatch.setenv("SPACK_NO_PARALLEL_MAKE", "true")
@@ -96,21 +96,21 @@ def test_make_parallel_precedence(monkeypatch):
 def test_make_jobs_env():
-make = MakeExecutable("make", 8)
+make = MakeExecutable("make", jobs=8)
 dump_env = {}
 assert make(output=str, jobs_env="MAKE_PARALLELISM", _dump_env=dump_env).strip() == "-j8"
 assert dump_env["MAKE_PARALLELISM"] == "8"
 def test_make_jobserver(monkeypatch):
-make = MakeExecutable("make", 8)
+make = MakeExecutable("make", jobs=8)
 monkeypatch.setenv("MAKEFLAGS", "--jobserver-auth=X,Y")
 assert make(output=str).strip() == ""
 assert make(parallel=False, output=str).strip() == "-j1"
 def test_make_jobserver_not_supported(monkeypatch):
-make = MakeExecutable("make", 8, supports_jobserver=False)
+make = MakeExecutable("make", jobs=8, supports_jobserver=False)
 monkeypatch.setenv("MAKEFLAGS", "--jobserver-auth=X,Y")
 # Currently fallback on default job count, Maybe it should force -j1 ?
 assert make(output=str).strip() == "-j8"


@@ -11,6 +11,7 @@
 from llnl.util.symlink import resolve_link_target_relative_to_the_link
 import spack.caches
+import spack.concretize
 import spack.config
 import spack.fetch_strategy
 import spack.mirrors.layout
@@ -43,7 +44,7 @@ def set_up_package(name, repository, url_attr):
 2. Point the package's version args at that repo.
 """
 # Set up packages to point at mock repos.
-s = Spec(name).concretized()
+s = spack.concretize.concretize_one(name)
 repos[name] = repository
 # change the fetch args of the first (only) version.
@@ -60,7 +61,7 @@ def check_mirror():
 mirrors = {"spack-mirror-test": url_util.path_to_file_url(mirror_root)}
 with spack.config.override("mirrors", mirrors):
 with spack.config.override("config:checksum", False):
-specs = [Spec(x).concretized() for x in repos]
+specs = [spack.concretize.concretize_one(x) for x in repos]
 spack.mirrors.utils.create(mirror_root, specs)
 # Stage directory exists
@@ -77,7 +78,7 @@ def check_mirror():
 # Now try to fetch each package.
 for name, mock_repo in repos.items():
-spec = Spec(name).concretized()
+spec = spack.concretize.concretize_one(name)
 pkg = spec.package
 with spack.config.override("config:checksum", False):
@@ -212,13 +213,15 @@ def test_invalid_json_mirror_collection(invalid_json, error_message):
 def test_mirror_archive_paths_no_version(mock_packages, mock_archive):
-spec = Spec("trivial-install-test-package@=nonexistingversion").concretized()
+spec = spack.concretize.concretize_one(
+Spec("trivial-install-test-package@=nonexistingversion")
+)
 fetcher = spack.fetch_strategy.URLFetchStrategy(url=mock_archive.url)
 spack.mirrors.layout.default_mirror_layout(fetcher, "per-package-ref", spec)
 def test_mirror_with_url_patches(mock_packages, monkeypatch):
-spec = Spec("patch-several-dependencies").concretized()
+spec = spack.concretize.concretize_one("patch-several-dependencies")
 files_cached_in_mirror = set()
 def record_store(_class, fetcher, relative_dst, cosmetic_path=None):


@@ -35,6 +35,53 @@ def test_module_function_change_env(tmp_path):
 assert environb[b"NOT_AFFECTED"] == b"NOT_AFFECTED"
+def test_module_function_change_env_with_module_src_cmd(tmp_path):
+environb = {
+b"MODULESHOME": b"here",
+b"TEST_MODULE_ENV_VAR": b"TEST_FAIL",
+b"TEST_ANOTHER_MODULE_ENV_VAR": b"TEST_FAIL",
+b"NOT_AFFECTED": b"NOT_AFFECTED",
+}
+src_file = tmp_path / "src_me"
+src_file.write_text("export TEST_MODULE_ENV_VAR=TEST_SUCCESS\n")
+module_src_file = tmp_path / "src_me_too"
+module_src_file.write_text("export TEST_ANOTHER_MODULE_ENV_VAR=TEST_SUCCESS\n")
+module("load", str(src_file), module_template=f". {src_file} 2>&1", environb=environb)
+module(
+"load",
+str(src_file),
+module_template=f". {src_file} 2>&1",
+module_src_cmd=f". {module_src_file} 2>&1; ",
+environb=environb,
+)
+assert environb[b"TEST_MODULE_ENV_VAR"] == b"TEST_SUCCESS"
+assert environb[b"TEST_ANOTHER_MODULE_ENV_VAR"] == b"TEST_SUCCESS"
+assert environb[b"NOT_AFFECTED"] == b"NOT_AFFECTED"
+def test_module_function_change_env_without_moduleshome_no_module_src_cmd(tmp_path):
+environb = {
+b"TEST_MODULE_ENV_VAR": b"TEST_FAIL",
+b"TEST_ANOTHER_MODULE_ENV_VAR": b"TEST_FAIL",
+b"NOT_AFFECTED": b"NOT_AFFECTED",
+}
+src_file = tmp_path / "src_me"
+src_file.write_text("export TEST_MODULE_ENV_VAR=TEST_SUCCESS\n")
+module_src_file = tmp_path / "src_me_too"
+module_src_file.write_text("export TEST_ANOTHER_MODULE_ENV_VAR=TEST_SUCCESS\n")
+module("load", str(src_file), module_template=f". {src_file} 2>&1", environb=environb)
+module(
+"load",
+str(src_file),
+module_template=f". {src_file} 2>&1",
+module_src_cmd=f". {module_src_file} 2>&1; ",
+environb=environb,
+)
+assert environb[b"TEST_MODULE_ENV_VAR"] == b"TEST_SUCCESS"
+assert environb[b"TEST_ANOTHER_MODULE_ENV_VAR"] == b"TEST_FAIL"
+assert environb[b"NOT_AFFECTED"] == b"NOT_AFFECTED"
 def test_module_function_no_change(tmpdir):
 src_file = str(tmpdir.join("src_me"))
 with open(src_file, "w", encoding="utf-8") as f:

Some files were not shown because too many files have changed in this diff.