Compare commits

...

162 Commits

Author SHA1 Message Date
Harmen Stoppels
8b020067f4 spec.py: further arch related fixes 2025-01-28 16:59:53 +01:00
Massimiliano Culpo
40a1da4a73 spec.py: fix ArchSpec.intersects (#48741)
Fixes a bug where `x86_64:` and `ppc64le:` intersected, while `x86_64:` and `:haswell` did not.
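
A minimal sketch of the corrected semantics (hedged; assumes a recent Spack checkout where `Spec.intersects` compares target ranges):

```python
from spack.spec import Spec

# Disjoint microarchitecture families: open-ended ranges must not intersect.
assert not Spec("target=x86_64:").intersects(Spec("target=ppc64le:"))

# Ranges overlapping within one family (x86_64 .. haswell) must intersect.
assert Spec("target=x86_64:").intersects(Spec("target=:haswell"))
```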
2025-01-28 16:46:09 +01:00
Satish Balay
82e091e2c2 petsc+rocm: add dependency on hipblas-common (#48644) 2025-01-28 09:45:12 -06:00
Thomas-Ulrich
c86112b0e8 update hypre version and add new memalign for petsc (#47831) 2025-01-28 09:42:42 -06:00
jmuddnv
bb25c04845 Changes for NVIDIA HPC SDK 25.1 (#48696) 2025-01-28 07:05:33 -07:00
Buldram
d69d26d9ce toybox: add v0.8.12 (#48657) 2025-01-28 14:47:22 +01:00
snehring
06d660b9ba autodock-vina: adding version 1.2.6 (#48684)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-01-28 14:28:28 +01:00
snehring
40b3196412 py-ipyrad: adding version 0.9.102 (#48686)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-01-28 14:27:16 +01:00
Juan Miguel Carceller
7e893da4a6 thepeg: extend the rivet@:3 dependency up to version 2.3 (#48691)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-28 14:25:02 +01:00
David Schneller
13aa8b6867 easi: add v1.5.1; relax yaml-cpp and lua requirements (#48675) 2025-01-28 14:23:57 +01:00
Adam J. Stewart
b0afb619de py-geemap: add new package (#48602) 2025-01-28 14:12:25 +01:00
Adam J. Stewart
7a82c703c7 JAX: add v0.4.32+ (#46346)
* JAX: add v0.4.34

* Disable search for clang

* Update CUDA flags

* Add py-jax 0.4.33, comment out until py-jaxlib 0.4.33 is also released

* Fix GCC build

* Try TF_NVCC_CLANG

* py-jax: add v0.4.34

* jax no longer has separate tags for jaxlib

* Install compiled wheel

* Join path before glob

* Wheel is in spack stage, not tmp path

* Add 0.4.35

* Add newer versions

* Build system has been refactored yet again

* Drop clang

* Fix build with source tarball, rocm support

* Support GCC

* Remove clang-specific compiler flags

* enable_cuda flag was removed

* Fix logic

* py-jax: add v0.4.38

* Add patch to fix GCC support

* Patch no longer needed

* Skip patching, directly pass flags

* New flags

* Remove unused import

* Patch changed

* Use older version of patch

* Newer patch

* Add CUDA symlink

* Symlink more directories

* Recursive symlink

* Import function

* Recursive search

* Undo cuda changes

* Add v0.5.0

* I quit
2025-01-28 13:37:50 +01:00
G-Ragghianti
0d3667175a papi: fix error finding gmake during post-install testing (#48592) 2025-01-28 12:51:10 +01:00
Wouter Deconinck
a754341f6c hep stack: additional event generator packages (#48565)
* hep stack: additional event generator packages

* hep: additional packages

* hep: collier doesn't have +pic +shared

* py-awkward-cpp: fix scikit-build-core range of applicability

* hep: disable agile

* hep: disable garfieldpp and genie

* py-wxpython: depends_on pkgconfig even if using external wxwidgets

* hep: disable professor
2025-01-28 05:38:58 -06:00
Thomas Bouvier
a50c45f00c py-flash-attn: add missing triton dependency (#48645) 2025-01-28 11:43:27 +01:00
Sreenivasa Murthy Kolam
87e65e5377 Bump up the version for rocm-6.3.1 release (#48440)
This PR updates the versions for the rocm recipes for rocm-6.3.1 release.
2025-01-28 01:53:55 -08:00
Etienne Ndamlabin
50fe96aaf6 damaris: add v1.12.0, update maintainers (#48674)
Co-authored-by: Etienne Ndamlabin <jean-etienne.ndamlabin-mboula@inria.fr>
2025-01-28 10:13:21 +01:00
Rocco Meli
56495a8cd8 dftd4, mctc-lib: enable cmake builds and add multicharge package (#48594)
* enable cmake builds

* [@spackbot] updating style on behalf of RMeli

* Update var/spack/repos/builtin/packages/dftd4/package.py

Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>

* Update var/spack/repos/builtin/packages/mctc-lib/package.py

* include

* fix

* use sha256

* update

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
2025-01-28 09:53:22 +01:00
Harmen Stoppels
c054cb818d import os.path -> os (#48709) 2025-01-28 09:45:43 +01:00
Benjamin Meyers
bc28ec35d1 py-ogb: update version (#48740) 2025-01-27 17:10:53 -07:00
Massimiliano Culpo
e47a6059a7 Improve definition of a few placeholder packages (#48730)
* Improve definition of a few placeholder packages
   These packages are placeholders for vendor provided software,
   that is not buildable, and should be declared as external.
* ibm-java: remove package, as asked by maintainer
2025-01-27 15:34:34 -08:00
Matt Thompson
0d170b9ef3 mapl: fix too strict oneapi conflict (#48743)
* mapl: fix too strict oneapi conflict
* Use older style
* Add diag-disable flag
* Updates from Dom
2025-01-27 15:14:38 -08:00
Garth N. Wells
5174cb9180 Add nanobind 2.4.0 (#48721) 2025-01-27 15:07:11 -08:00
Dom Heinzeller
22ba366e85 Fix creating bootstrap mirrors (#48252)
Regressed in #47126 

Spack was not interpreting mirrors specified with paths relative to the
metadata directory.
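
A hedged sketch of the case this fixes (the file layout and relative path are made up, though they mirror the bootstrap layout shown in the docs change further below; per the diff, a relative `info: url:` is now resolved against the metadata directory):

```yaml
# /opt/bootstrap/metadata/binaries/metadata.yaml (hypothetical)
info:
  url: ../../bootstrap_cache   # resolved relative to this metadata directory
```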

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-01-27 22:37:29 +00:00
Massimiliano Culpo
13558269b5 Fix a few minor issues in tests (#48735)
* test_no_matching_compiler_specs: does not need mock_low_high_config,
  since mutable_config is already used at class level
* bindist.py: setup a configuration that doesn't super-impose builtin.mock
  over builtin
* builder.py: use a mock configuration for the tests
2025-01-27 23:36:27 +01:00
Stephen Nicholas Swatman
615b7a6ddb geomodel: Add version 6.8.0 (#48727)
This commit adds version 6.8.0 of GeoModel. As far as I can tell from
the change notes, there are no changes required to the build
configuration or dependencies.
2025-01-27 14:09:12 -08:00
Dennis Klein
0415b21d3d fairmq: add v1.9.1 (#48724) 2025-01-27 14:08:32 -08:00
Matthieu Dorier
053c9d2846 librdkafka: added version 2.6.1 and 2.8.0 (#48725) 2025-01-27 14:06:58 -08:00
Harmen Stoppels
1e763629f6 dealii: add vtk backward compat bound (#48744) 2025-01-27 15:00:32 -07:00
Harmen Stoppels
7568687f1e vtk: fix incorrect detection of -fvisibility (#48737) 2025-01-27 22:12:40 +01:00
Vicente Bolea
3b81c0e6b7 zfp: add smoke test (#48598) 2025-01-27 12:33:11 -08:00
Joe
c764400338 Allowing environment variables to be set in a spack.yaml (#47587)
This adds a new configuration section called `env_vars:` that can be set in an environment.

It looks very similar to the existing `environment:` section that can be added to `modules.yaml`,
but it is global for an entire spack environment. It's called `env_vars:` to avoid conflating it
with spack environments (the term was too overloaded).

The syntax looks like this:

```yaml
spack:
  specs:
    - cmake%gcc
  env_vars:
    set:
      ENVAR_SET_IN_ENV_LOAD: "True"
```

Any of our standard environment modifications can be added to the `env_vars` section, e.g.
`prepend_path:`, `unset:`, `append_path:`, etc.  Operations in `env_vars:` are performed
on `spack env activate` and undone on `spack env deactivate`.
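
For example, a hedged sketch combining several operations (the paths and variable names are made up; the value shapes follow the existing `environment:` schema):

```yaml
spack:
  specs:
    - cmake%gcc
  env_vars:
    set:
      ENVAR_SET_IN_ENV_LOAD: "True"
    prepend_path:
      PATH: /opt/site/tools/bin
    unset:
      - SCRATCH_DIR
```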
2025-01-27 12:20:22 -08:00
Mikael Simberg
4e8a6eec1a fmt: Add 11.1.3 (#48726) 2025-01-27 13:03:43 -07:00
Alec Scott
ebc9f03dda Go build system: reduce resulting installation sizes (#47943)
* Reduce the size of outputted go built binaries
* Remove unused import from go package
* go: remove comment from setup dependents build env
* Add back missing imports after rebase
2025-01-27 11:32:56 -08:00
Alec Scott
8ac0bd2825 GoPackage: respect -j concurrency (#48421) 2025-01-27 12:58:29 -05:00
Tara Drwenski
cc9e0137df Initialize deque with path string (#48734) 2025-01-27 17:42:29 +01:00
Adam J. Stewart
b8e448afa0 py-timm: add v1.0.14 (#48648) 2025-01-27 11:16:37 -05:00
Adam J. Stewart
209d670bf3 py-scikit-image: add v0.25.1 (#48728) 2025-01-27 11:13:41 -05:00
Harmen Stoppels
c6202842ed Improve error handling in urlopen / socket read (#48707)
* Backward compat with Python 3.9 for socket.timeout
* Forward compat with Python [unknown] as HTTPResponse.geturl is deprecated
* Catch timeout etc from .read()
* Some minor simplifications: json.load(...) takes file object in binary mode.
* Fix CDash code which does error handling wrong: non-2XX responses raise.
2025-01-27 16:59:05 +01:00
Taillefumier Mathieu
b2a75db030 Force rocm dependency on hipfft when +rocm is given (#48211)
Make sure that hipfft+rocm is explicit when +rocm is true.

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2025-01-27 11:33:31 +01:00
Greg Becker
0ec00a9c9a rewiring.py: eliminate code duplication from bindist (#48723) 2025-01-27 10:09:04 +01:00
psakievich
5e3020ad02 filesystem.py: fix recursive_mtime_greater_than (#48718) 2025-01-26 08:29:29 +01:00
Dave Keeshan
a0d0e6321f verible: v0.0.3929 (#48701) 2025-01-25 13:32:21 -07:00
Dave Keeshan
0afac0beaa yosys: add v0.49 (#48700) 2025-01-24 12:36:19 -08:00
Thomas-Ulrich
6155be8548 New Package: PUMGen (#48705)
* add pumgen package
* remove netcdf variant
* remove scorec variant, move zoltan dep
* fix license
2025-01-24 12:32:21 -08:00
Matt Thompson
611cb98b02 mapl: add v2.53.0 (#48712) 2025-01-24 13:09:14 -07:00
Mark W. Krentel
ea5742853f hpcviewer: add version 2025.01 (#48670) 2025-01-24 10:52:18 -08:00
Harmen Stoppels
25a3e8ba59 Remove unused Tokenizer.full_match (#48650) 2025-01-24 15:53:42 +01:00
Mikael Simberg
7fbb3df6b0 Revert "pika: Add conflict between HIP and GCC (libstdc++) >= 13 (#46291)" (#48693)
The conflict is too strict and prevents usable combinations from being
concretized. Some packages depend on pika, but do not include pika
headers in HIP device code files, and can thus be compiled even with the
bad combination of HIP and GCC versions.
2025-01-24 07:35:11 -07:00
AMD Toolchain Support
a728db95de Allow scipy to be built with AOCC again (#48505)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2025-01-24 11:13:47 +01:00
Juan Miguel Carceller
7bc4069b9e py-awkward: add dependency for py-importlib-metadata (#47027)
* py-awkward: add dependency on py-importlib-metadata and py-fsspec
* Implement suggested constraint and dependency type changes

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-23 16:22:28 -08:00
Dom Heinzeller
51fc195d14 Fix var/spack/repos/builtin/packages/ectrans/package.py: instead of declaring a conflict with Intel LLVM compilers, apply patch from upstream (#48687) 2025-01-23 16:18:48 -08:00
Harmen Stoppels
27a0593104 openmpi: do not pass --with-wrapper-ldflags (#48673)
On macOS you cannot unconditionally pass `-rpath`, since it can conflict with
`-r`.

mpicc is not the place to inject rpaths to compiler runtime
libraries. That's up to the compiler's config files (spec files in gcc,
config files in llvm).

Also remove some patches that are redundant in newer versions of
openmpi.
2025-01-23 15:53:49 +01:00
Juan Miguel Carceller
f95e27a159 SIPPackage: add gmake as a build dependency to be able to build py-pyqt5 (#48660) 2025-01-23 15:51:59 +01:00
Valentin Volkl
effe433c96 gaudi: set GAUDI_PLUGIN_PATH for runtime on macos (#48039)
* set GAUDI_PLUGIN_PATH

* [@spackbot] updating style on behalf of vvolkl

---------

Co-authored-by: vvolkl <vvolkl@users.noreply.github.com>
2025-01-23 07:55:21 -06:00
Juan Miguel Carceller
21988fbb18 pandoramonitoring: add patch for compiling with C++20 (#48659)
* pandoramonitoring: add patch for compiling with C++20

* Fix patch for PandoraMonitoring

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-23 06:25:26 -06:00
Massimiliano Culpo
2db654bf5a Define the DB index file name in a single place (#48681) 2025-01-23 12:17:34 +01:00
Harmen Stoppels
9992b563db Improve spack.build_systems typing (#48590) 2025-01-23 12:15:43 +01:00
Harmen Stoppels
daba1a805e build_environment.py: clear CONFIG_SITE for configure scripts (#48690)
Setting CONFIG_SITE to /dev/null before running configure should be sufficient to
prevent any config site file from being loaded, which currently causes non-determinism in builds.

Setting the variable CONFIG_SITE prevents configure scripts from checking $prefix/share/config.site,
$prefix/etc/config.site, $ac_default_prefix/share/config.site, and $ac_default_prefix/etc/config.site,
so we don't have to patch a block of code.
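
In effect, builds now run configure as if one had typed the following (an illustrative shell sketch; the comment is a simplified view of autoconf's site-file lookup):

```sh
# Simplified view of autoconf's lookup when CONFIG_SITE is unset:
#   for ac_site_file in $prefix/share/config.site $prefix/etc/config.site; do
#     test -r "$ac_site_file" && . "$ac_site_file"
#   done
# Spack now short-circuits it:
CONFIG_SITE=/dev/null ./configure
```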
2025-01-23 11:45:52 +01:00
Wouter Deconinck
832bf95aa4 fmt: add v11.1.{1,2}; spdlog: add v1.15.0 (#48402)
* fmt: add v11.1.1

* spdlog: add v1.15.0

* spdlog: terminate string

* fmt: add v11.1.2

* spdlog: apply fmt-11.1 support patch up to 1.15.0

* spdlog: add prereq patch to ensure intended patch applies cleanly

* spdlog: fix patch checksum
2025-01-23 09:34:20 +01:00
Wouter Deconinck
81e6dcd95c gaudi: add v39.2; CUDA support (#48557)
* gaudi: add v39.2; CUDA support
* gaudi: CUDAPackage -> CudaPackage
2025-01-23 08:41:38 +01:00
Tara Drwenski
518572e710 Use gcc toolchain when using hip and gcc host compiler (#48632) 2025-01-22 23:37:24 -08:00
Sinan
6f4ac31a67 alps: add new package (#48183) 2025-01-22 14:57:19 -07:00
Victor A. P. Magri
e291daaa17 Add OpenMP dependency to Kokkos and KokkosKernels (#48455)
* Add OpenMP dependency to Kokkos and KokkosKernels

* Add Chris' suggestions
2025-01-22 12:53:41 -07:00
Seth R. Johnson
58f1e791a0 Celeritas: new version 0.5.1 (#48678)
* Celeritas: add version 0.5

* Fix style

* Un-deprecate non-awful versions
2025-01-22 11:03:48 -07:00
jnhealy2
aba0a740c2 Address quoting issue impacting dev_paths containing @ symbols (#48555)
* Address quoting issue that causes dev_paths containing @ symbols to parse as versions

* Fix additional location where dev_path was improperly constructed

The dev_path must be quoted to avoid parsing issues when paths contain '@'
symbols (illustrated after this list). Also add tests to catch regressions
of this behavior.

* Format to fix line length

* fix failing tests

* Fix whitespace error

* Add binary_compatibility fixture to test

* Change string formatting to avoid multiline f string

* Update lib/spack/spack/test/concretization/core.py

* Add tmate debug session

* Revert "Add tmate debug session"

This reverts commit 24e2f77e3c.

* Move test and refactor to use env methods

* Update lib/spack/spack/test/cmd/develop.py
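
A hedged sketch of the parsing issue the quoting avoids (the package name and path are made up):

```python
from spack.spec import Spec

# Unquoted, the '@' inside the path would be taken as a version separator:
#     mypkg dev_path=/tmp/feature@v2/src
# Quoted, the whole path survives as the variant value:
s = Spec('mypkg dev_path="/tmp/feature@v2/src"')
assert s.variants["dev_path"].value == "/tmp/feature@v2/src"
```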

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2025-01-22 18:03:13 +01:00
Harmen Stoppels
0fe8e763c3 bootstrap: do not show disabled sources as errors if enabled sources fail (#48676) 2025-01-22 17:02:23 +01:00
Harmen Stoppels
0e2d261b7e bootstrap/status.py: remove make (#48677) 2025-01-22 17:00:25 +01:00
Matthieu Dorier
85cb234861 pmdk: add cmake dependency back (#48664) 2025-01-22 14:44:04 +01:00
Harmen Stoppels
87a83db623 autotools.py: set lt_cv_apple_cc_single_mod=yes (#48671)
Since macOS 15 `ld -single_module` warns with a deprecation message,
which makes configure scripts believe the flag is unsupported. That
in turn triggers a code path where `archive_cmds` is set to

```
$CC -r -keep_private_externs -nostdlib ... -dynamiclib
```

instead of just

```
$CC -dynamiclib ...
```

This code path was meant to trigger only on ancient macOS <= 14.4, where
libtool had to add `-single_module`. That flag has been the default since macOS
14.4 and is now apparently deprecated because it has been a no-op for more
than 15 years.

The wrong `archive_cmds` causes actual problems combined with a bug in
OpenMPI's compiler wrapper (`CC=mpicc`), which appends `-rpath` flags,
which cause an error when combined with the `-r` flag added by the
autotools.

Spack's compiler wrapper doesn't do this, but it's likely there are
other compiler wrappers out there that are not aware that `-r` and
`-rpath` cannot be combined.

The fix is to change defaults: `lt_cv_apple_cc_single_mod=yes`.
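
This amounts to pre-seeding libtool's configure cache variable, e.g. (illustrative; configure accepts cache variables as `VAR=VALUE` arguments):

```sh
./configure lt_cv_apple_cc_single_mod=yes
```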
2025-01-22 11:29:50 +01:00
Taillefumier Mathieu
e1e17786c5 sirius: libxc 7 forward compat (#48663) 2025-01-22 11:27:19 +01:00
eugeneswalker
68af5cc4c0 e4s ci stacks: add libceed (#48668) 2025-01-21 14:13:55 -08:00
Harmen Stoppels
70df460fa7 PackageBase.detect_dev_src_change: speed up (#48618) 2025-01-21 16:48:37 +01:00
Harmen Stoppels
31a1b2fd6c relocate.py, binary_distribution.py: cleanup (#48651) 2025-01-21 15:45:08 +01:00
moloney
f8fd51e12f BF+ENH: Add wxwidgets 3.2.6 and py-wxpython 4.2.2, missing pkgconfig dep (#48597)
* BF+ENH: Add wxwidgets 3.2.6 and py-wxpython 4.2.2

Improves compat with newer Python (>3.9) and numpy.

Fix error during configure step (even on at least some older
versions) by including 'pkgconfig' as a build dep so gtk+ is found.

* BF: Make wxpython use spack built wxwidgets instead of rebuilding

* Update var/spack/repos/builtin/packages/py-wxpython/package.py

Avoid using too new a version of py-setuptools, as the license file format is currently not compatible.

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-01-20 21:01:47 -06:00
Mirko Rahn
12784594aa gpi-space: add v24.12 (#48361) 2025-01-20 11:47:56 -07:00
Massimiliano Culpo
e0eb0aba37 netlib-lapack: add v3.12.1 (#48318)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-01-20 18:23:54 +01:00
Stephen Nicholas Swatman
f47bf5f6b8 indicators: new package (#47786)
* indicators: new package

Indicators is a new package which aims to add easy-to-use progress bars
to modern C++.

* Update header

* Update dependencies
2025-01-20 09:00:37 -06:00
Seth R. Johnson
9296527775 npm, node-js, typescript: add external find (#48587)
* npm: add external find

* node-js: add external find

* Add external for typescript

* Match full executable string

* Fix style
2025-01-20 07:26:01 -06:00
Wouter Deconinck
08c53fa405 pythia8: correct with_or_without prefix for +openmpi (#48131)
* pythia8: correct with_or_without prefix for +openmpi

* pythia8: fix style

* pythia8: fix style
2025-01-20 06:50:29 -06:00
Harmen Stoppels
0c6f0c090d openldap: fix build (#48646) 2025-01-20 13:16:52 +01:00
Nathan Hanford
c623448f81 add affinity package (#48589) 2025-01-20 04:44:25 -07:00
Wouter Deconinck
df71341972 py-mplhep: add v0.3.55 (#48338) 2025-01-20 10:26:09 +01:00
Wouter Deconinck
75862c456d lhapdf: add v6.5.5 (#48336) 2025-01-20 10:25:38 +01:00
dmagdavector
e680a0c153 libslirp: add v4.8.0 (#48561) 2025-01-20 10:23:47 +01:00
Wouter Deconinck
9ad36080ca r: add v4.4.2 (#48329)
* r: add v4.4.2

* r-rcpp: add v1.0.13-1 spot release; conflict 1.0.13 with r@4.4.2
2025-01-20 10:23:23 +01:00
Harmen Stoppels
ecd14f0ad9 archive.py: fix include_parent_directories=True with path_to_name (#48570) 2025-01-20 10:21:32 +01:00
Rocco Meli
c44edf1e8d gnina: add v1.3 (#47711)
* gnina: add v1.3

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* use github patch

* update

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2025-01-20 10:19:49 +01:00
Paul R. C. Kent
1eacdca5aa py-pyscf: add v2.8.0 (#48571) 2025-01-20 10:15:41 +01:00
Keita Iwabuchi
4a8f5efb38 Metall: add v0.29 and v0.30 (#48564)
Co-authored-by: Keita Iwabuchi <iwabuchi1@lln.gov>
2025-01-20 10:14:42 +01:00
Mickael PHILIT
2e753571bd cgns: add v4.5.0 (#48490) 2025-01-20 10:09:45 +01:00
Thomas Bouvier
da16336550 nvtx: fix missing import (#48580) 2025-01-20 10:08:47 +01:00
Alberto Sartori
1818e70e74 justbuild: add version 1.4.2 (#48581) 2025-01-20 10:07:55 +01:00
Chang Liu
1dde785e9a fusion-io: new package (#47699) 2025-01-20 10:06:13 +01:00
Alberto Invernizzi
a7af32c23b py-virtualenvwrapper: add v6.0.0, v6.1.0, v6.1.1 (#47785) 2025-01-20 10:04:40 +01:00
Simon Frasch
6c92ad439b spfft: add v1.1.1 (#48643) 2025-01-20 10:03:50 +01:00
snehring
93f555eb14 dorado: fixing typo on version restriction for hdf5 (#48591)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-01-20 10:01:29 +01:00
Vanessasaurus
fa3725e9de flux-pmix: add v0.6.0 (#48600)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-01-20 10:00:35 +01:00
Vanessasaurus
870dd6206f flux-sched: add v0.41.0 (#48599)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-01-20 09:57:22 +01:00
Olivier Cessenat
b1d411ab06 octave: new version 9.3.0 (#48606) 2025-01-20 09:56:55 +01:00
Massimiliano Culpo
783eccfbd5 jsonschema: use draft7 validator and simplify schemas (#48621) 2025-01-20 09:51:29 +01:00
Wouter Deconinck
a842332b1b rsync: add v3.4.0, v3.4.1 (#48607) 2025-01-20 09:50:44 +01:00
Harmen Stoppels
7e41288ca6 spack_yaml.py / spec.py: use dict instead of OrderedDict (#48616)
* for config: let `syaml_dict` inherit from `dict` instead of `OrderedDict`. `syaml_dict` now only exists as a mutable wrapper for yaml related metadata.
* for spec serialization / hashing: use `dict` directly

This is possible since we only support cpython 3.6+ in which dicts are ordered.

This improves performance of hash computation a bit. For a larger spec I'm getting 9.22ms instead of 11.9ms, so 22.5% reduction in runtime.
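
The ordering guarantee being relied on, in brief (CPython 3.6+, and part of the language spec since 3.7):

```python
import json

d = {"b": 1, "a": 2}
assert list(d) == ["b", "a"]  # insertion order is preserved
json.dumps(d)  # '{"b": 1, "a": 2}': stable serialization, hence stable hashes
```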
2025-01-20 17:33:44 +09:00
Tara Drwenski
3bb375a47f Change llvm-amdgpu to a build dependency of rocm (#48612) 2025-01-20 17:28:15 +09:00
Olivier Cessenat
478855728f ngspice: add v44 (#48611) 2025-01-20 09:27:51 +01:00
jgraciahlrs
5e3baeabfa scorep: update configure opts for libbfd (#48614) 2025-01-20 09:24:12 +01:00
Axel Huebl
58b9b54066 openPMD-api: add v0.16.1 (#48601)
* openpmd-api: add v0.16.1

Add the latest release of openPMD-api.

* TOML11: Latest Versions
2025-01-20 09:22:30 +01:00
Luca Heltai
3918deab74 dealii: add v9.6.1, and v9.6.2 (#48620) 2025-01-20 09:20:49 +01:00
Matt Thompson
ceb2ce352f mapl: add v2.52.0 (#48629) 2025-01-20 09:10:29 +01:00
Wouter Deconinck
7dc6bff7b1 root: depends_on r-rcpp@:1.0.12 when @:6.32.02 (#48625) 2025-01-20 09:05:55 +01:00
Adam J. Stewart
05fbbd7164 py-numpy: add v2.1.3, v2.2.1, v2.2.2 (#48640) 2025-01-20 08:48:56 +01:00
Juan Miguel Carceller
58421866c2 hepmc3: extend python when the python bindings are enabled (#47836)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-20 08:46:08 +01:00
Harmen Stoppels
962498095d compat with mypy 1.14 (#48626)
* OSError.errno apparently is int | None, but already part of __str__ anyway so drop

* fix disambiguate_spec_from_hashes

* List[Spec].sort() complains, ignore it

* fix typing of KnownCompiler, and use it in asp.py
2025-01-18 21:16:05 +01:00
Anna Rift
d0217cf04e Fix two instances of duplicate 'https://' in URLs (#48628) 2025-01-17 22:59:02 -07:00
Olivier Cessenat
65745fa0df poppler: many improvements (#48627)
* poppler: many improvements

* Strangely I did not have the proper comment header

* Add a maintainer
2025-01-17 22:42:50 -07:00
John Gouwar
9f7cff1780 Stable splice-topo order for resolving splicing (#48605)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-01-17 16:27:16 +01:00
Harmen Stoppels
bb43fa5444 spec.py: make hashing of extra_attributes order independent (#48615) 2025-01-17 13:50:36 +01:00
Julien Loiseau
847f560a6e hard: new package (#48595) 2025-01-16 15:47:59 -07:00
Tamara Dahlgren
623ff835fc mypy: update python version to avoid error/warning (#48593)
* pyproject: remove mypy python version option, since it defaults to the python used to run it (#48593)
2025-01-16 11:24:29 -07:00
Harmen Stoppels
ca19790ff2 py-executing: support python 3.13 (#48604) 2025-01-16 19:00:10 +01:00
acastanedam
f23366e4f8 Update elk to versions 8.8, 9.6, 10.2 (#48583) 2025-01-16 08:46:18 -08:00
Chris Marsh
42fb689501 py-pint-xarray: add version 0.4 (#47599) 2025-01-16 16:57:49 +01:00
arezaii
c33bbdb77d add Arkouda server and client packages (#48054)
Adds packages for the Arkouda server (`arkouda`) and the Arkouda Python client (`py-arkouda`).

The server and client are split into separate packages so that they can be
installed independently of one another.

Future work remains to add a `+dev` variant to `py-arkouda`, but that will require additional
supporting packages to be made available through Spack (e.g. `py-pytest-env`).
2025-01-15 21:39:23 -08:00
Laurent Chardon
1af6aa22c1 py-basemap: add v1.4.1 (#48548)
* py-basemap: add v1.4.1
* py-basemap: remove broken v1.2.1
* py-basemap: add hires variant
2025-01-15 18:05:05 -08:00
Laurent Chardon
03d9373e5c py-pycuda: add version 2024.1.2 (#48547)
* py-pycuda: add version 2024.1.2
* py-pycuda: add version 2024.1.2
* py-pycuda: Improve dependencies versions
2025-01-15 17:33:14 -08:00
HELICS-bot
f469a3d6ab helics: Add version 3.6.0 (#48000)
* helics: Add version 3.6.0
* helics: Add constraints for new minimum CMake (3.22+), Boost (1.75+), GCC (11+), Clang (15+), and ICC (21+) versions in HELICS 3.6+

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Ryan Mast <mast9@llnl.gov>
2025-01-15 17:32:27 -07:00
Xuefeng Ding
25f24d947a add root to rpath of python (#48491)
* add root to rpath of python?

* fix #48446

* [@spackbot] updating style on behalf of DingXuefeng

---------

Co-authored-by: DingXuefeng <DingXuefeng@users.noreply.github.com>
Co-authored-by: Patrick Gartung <gartung@fnal.gov>
2025-01-15 14:27:59 -07:00
Sergio Sánchez Ramírez
af25a84a56 julia: bump 1.11 and 1.10 patch versions (#48149) 2025-01-15 20:23:09 +01:00
Massimiliano Culpo
59a71959e7 Fix wrong hash in spliced specs, and improve unit-tests (#48574)
Modifications:
* Fix a severe bug where a spliced spec has the same hash as the original build spec
* Removed CacheManager in favor of install_mockery. The latter sets up a temporary store, and removes it at the end 
   of the test.
* Deleted a couple of helper functions e.g. _has_dependencies
* Checked that SolverException is raised, rather than Exception (more specific)
* Extended, and renamed the splicing_setup fixture (now called install_specs)
* Added more specific assertions in each test

One test is currently flaky, due to some instability when sorting multiple specs. It's currently marked xfail
2025-01-15 19:43:55 +01:00
Harmen Stoppels
00e804a94b package.py: expose types of common globals unconditionally (#48588) 2025-01-15 17:33:38 +01:00
Harmen Stoppels
db997229f2 julia: fix forward compat with curl (#48585) 2025-01-15 16:13:37 +01:00
Massimiliano Culpo
6fac041d40 Add type-hints to which and which_string (#48554) 2025-01-15 15:24:18 +01:00
Paul R. C. Kent
f8b2c65ddf llvm: add v19.1.7 (#48572) 2025-01-15 15:22:14 +01:00
Jordan Galby
c504304d39 Add version attributes up_to_1, up_to_2, up_to_3 (#38410)
Making them also available in spec format strings
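
For example (a hedged sketch; the printed value is illustrative):

```console
$ spack find --format "{name}-{version.up_to_2}" python
python-3.12
```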
2025-01-15 13:07:17 +01:00
Harmen Stoppels
976f1c2198 flag_mixing.py: add missing import (#48579) 2025-01-15 10:48:34 +01:00
Greg Becker
e7c591a8b8 Deprecate Spec.concretize/Spec.concretized in favor of spack.concretize.concretize_one (#47971)
The methods spack.spec.Spec.concretize and spack.spec.Spec.concretized
are deprecated in favor of spack.concretize.concretize_one.

This will resolve a circular dependency between the spack.spec and
spack.concretize modules in the next Spack release.
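
The migration, as also shown in the documentation changes further below:

```python
# Deprecated:
from spack.spec import Spec
spec = Spec("python").concretized()

# Preferred:
from spack.concretize import concretize_one
spec = concretize_one("python")
```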
2025-01-15 10:13:19 +01:00
arezaii
f3522cba74 Chapel 2.3 (#48053)
* add python bindings variant to chapel
* fix chpl_home, update chapel frontend rpath in venv
* fix chpl frontend shared lib rpath hack
* patch chapel env var line length limit
* patch chapel python shared lib location by chapel version
* unhack chpl frontend lib rpath
* fix postinstall main version number file path
* update chapel version number to 2.3
* use chapel releases instead of source tarballs, remove dead code
* Chapel 2.3 adds support for LLVM 19
* Bundled LLVM always requires CMake 3.20:
* Apply 2.3 patch for LLVM include search path

Fixes build errors for `chapel@2.3+rocm` of the form:

```
compiler/llvm/llvmDebug.cpp:174:9: error: cannot convert 'const char*' to 'llvm::dwarf::MemorySpace'
compiler/llvm/llvmDebug.cpp:254:15: error: cannot convert 'const char*' to 'llvm::dwarf::MemorySpace'
```

* fix style

* Fix misreporting of test_hello failures

`test_part` was aliasing the enclosing test name, leading to confusing and incorrect reporting on test failure.

* Ensure `libxml2` optional dep of `hwloc` is also added to `PKG_CONFIG_PATH` in the run env

* patch chplCheck for GPU + multilocale configs, install docs

* Adjust patch for checkChplInstall

* Ensure `CHPL_DEVELOPER` is unset for `~developer`

---------

Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
2025-01-14 21:48:44 -08:00
Hubertus van Dam
0bd9c235a0 lammps: enable scafacos (#47638)
* Enable Scafacos in LAMMPS
* lammps: make scafacos work with +lib

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-14 21:33:44 -07:00
jclause
335fca7049 Improve support using external modules with zsh (#48115)
* Improve support using external modules with zsh

Spack integrates with external modules by launching a Python subprocess
to scrape the path from the module command. In zsh, subshells do not
inherit functions defined in the parent shell (only environment
variables), which breaks the module function in module_cmd.py.

As a workaround, source the module commands using $MODULESHOME prior to
running the module command.
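
A hedged shell sketch of that workaround (the init-script path is the conventional one; the actual detection lives in module_cmd.py):

```sh
# zsh subshells don't inherit functions, so re-create `module` first:
if [ -n "${MODULESHOME:-}" ] && [ -f "$MODULESHOME/init/sh" ]; then
    . "$MODULESHOME/init/sh"
fi
module avail
```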

* Fix formatting

* Fix flake error

* Add guard around sourcing module file

* Add improved unit testing to module src command

* Correct style

* Another attempt at style

* formatting again

* formatting again

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2025-01-14 21:29:15 -07:00
Alex Richert
ce5ef14fdb crtm: add v3.1.1-build1 (#48451)
* add crtm@3.1.1-build1
* fix url_for_version's version check

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-01-14 19:03:50 -07:00
Greg Becker
2447d16e55 bugfix: associate nodes directly with specs for compiler flag reordering (#48496) 2025-01-14 10:15:57 -08:00
Wouter Deconinck
8196c68ff3 py-dask: fix py-versioneer version pin (#47980)
* py-dask: fix py-versioneer version pin
* py-dask: depends_on py-click@8.1 when @2023.11.0:
* py-dask: reorder dependency lines

Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>

---------

Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>
2025-01-14 09:44:24 -08:00
Taillefumier Mathieu
308f74fe8b trexio: add v2.2.3 -> master (#48543)
* Update trexio
   - update version
   - add cmake support

* Fix formatting

* hdf5+hl only needed when < 2.3.0

---------

Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
2025-01-14 09:39:43 -08:00
Ludovic Räss
864f09fef0 pism: add v2.0.7, v2.1.1 (#47921)
* Update recipe

* Update

* Update

* Cleanup

* Fixup
2025-01-14 09:20:16 -08:00
Juan Miguel Carceller
29b53581e2 bdsim: update to point to the new location in github and add 1.7.7 (#48458)
* bdsim: update to point to the new location in github and add 1.7.7

* Remove the C++ standard patch

* Remove the cmake_args and add a comment instead

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-14 09:18:22 -08:00
Wouter Deconinck
e441e780b9 flatbuffers: add v24.12.23 (#48563)
* flatbuffers: add v24.12.23

* flatbuffers: fix style
2025-01-14 09:12:08 -08:00
Jean Luca Bez
4c642df5ae drishti: add v0.5, v0.6 (#39262)
* include new versions, update dependencies, add test
* Update package.py
* requested changes
* link dependency
* include commit
* fix style
2025-01-14 08:48:35 -07:00
Matt Thompson
217774c972 mepo: add v2.2.1, v2.3.0 (#48552) 2025-01-14 08:44:57 -07:00
Matthieu Dorier
8ec89fc54c pmdk: add gmake dependency, remove cmake dependency (#48544) 2025-01-14 08:44:43 -07:00
Marc T. Henry de Frahan
66ce93a2e3 Openfast version update (#48558) 2025-01-14 08:44:26 -07:00
AMD Toolchain Support
116ffe5809 ICON: Fix missing ocean variant logic handling (#48550)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2025-01-14 08:44:01 -07:00
Hans Fangohr
6b0ea2db1d octopus: add v15.1 (#48514) 2025-01-14 08:43:45 -07:00
Harmen Stoppels
d72b371c8a relocate_text: fix return value (#48568) 2025-01-14 13:29:06 +01:00
Mikael Simberg
aa88ced154 Set CMake policy CMP0074 to the new behaviour in hip package (#48549) 2025-01-14 04:48:09 -07:00
Todd Gamblin
d89ae7bcde Database: make constructor safe for read-only filesystems (#48454)
`spack find` and other commands that read the DB will fail if the database is in a
read-only location, because `FailureTracker` and `Database` can try to create their
parent directories when they are constructed, even if the DB root is read-only.

Instead, we should only write to the filesystem when needed (see the sketch after the list below).

- [x] don't create parent directories in constructors
- [x] have write operations create parents if needed
- [x] in tests, `chmod` databases to be read-only unless absolutely needed.
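
A generic illustration of the lazy-creation pattern (hypothetical class, not Spack's actual code):

```python
import os

class Tracker:
    def __init__(self, root: str) -> None:
        # no mkdir here: construction stays safe on a read-only filesystem
        self.dir = os.path.join(root, "failures")

    def mark(self, name: str) -> None:
        # create parent directories only when actually writing
        os.makedirs(self.dir, exist_ok=True)
        with open(os.path.join(self.dir, name), "w", encoding="utf-8"):
            pass
```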

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-01-14 00:10:51 -08:00
Harmen Stoppels
53d1665a8b traverse: use deque and popleft (#48353)
Avoids a quadratic-time operation. In practice the cost is barely felt, because the
constant factor is tiny, but it may matter in very large DAGs.
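
A generic illustration of the difference (not Spack's traversal code):

```python
import collections

def bfs(adjacency, start):
    queue, seen = collections.deque([start]), {start}
    while queue:
        node = queue.popleft()  # O(1); list.pop(0) would shift all items, O(n)
        yield node
        for child in adjacency.get(node, ()):
            if child not in seen:
                seen.add(child)
                queue.append(child)
```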
2025-01-14 00:09:57 -08:00
Harmen Stoppels
9fa1654102 builtin: add missing deps on gmake (#48545) 2025-01-13 22:51:20 -07:00
Julien Bigot
2c692a5755 doxygen: new versions 1.13 (#48541) 2025-01-13 21:29:40 -07:00
Hubertus van Dam
c0df012b18 py-mdi: new package (#47636)
* Adding MDI
* py-mdi: update package
* lammps: py-mdi needs to be usable at runtime
* py-mdi: update dependency constraints

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-13 20:54:06 -07:00
Cody Balos
66e8523e14 add long_spec property to get fully enumerated spec string (#48389) 2025-01-13 16:43:39 -08:00
Paul R. C. Kent
3932299768 Add v0.3.29 (#48536) 2025-01-13 16:36:29 -08:00
Buldram
3eba6b8379 qemacs, texi2html: new packages (#48537)
* qemacs, texi2html: new packages
* Make plugin support variant
2025-01-13 16:23:40 -08:00
579 changed files with 5748 additions and 3355 deletions

View File

@@ -25,7 +25,6 @@ exit 1
# The code above runs this file with our preferred python interpreter.
import os
import os.path
import sys
min_python3 = (3, 6)

View File

@@ -36,7 +36,7 @@ packages:
go-or-gccgo-bootstrap: [go-bootstrap, gcc]
iconv: [libiconv]
ipp: [intel-oneapi-ipp]
java: [openjdk, jdk, ibm-java]
java: [openjdk, jdk]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libc: [glibc, musl]
@@ -73,15 +73,27 @@ packages:
permissions:
read: world
write: user
cray-fftw:
buildable: false
cray-libsci:
buildable: false
cray-mpich:
buildable: false
cray-mvapich2:
buildable: false
cray-pmi:
buildable: false
egl:
buildable: false
essl:
buildable: false
fujitsu-mpi:
buildable: false
fujitsu-ssl2:
buildable: false
hpcx-mpi:
buildable: false
mpt:
buildable: false
spectrum-mpi:
buildable: false

View File

@@ -170,7 +170,7 @@ bootstrapping.
To register the mirror on the platform where it's supposed to be used run the following command(s):
% spack bootstrap add --trust local-sources /opt/bootstrap/metadata/sources
% spack bootstrap add --trust local-binaries /opt/bootstrap/metadata/binaries
% spack buildcache update-index /opt/bootstrap/bootstrap_cache
This command needs to be run on a machine with internet access and the resulting folder
has to be moved over to the air-gapped system. Once the local sources are added using the

View File

@@ -56,13 +56,13 @@ If you look at the ``perl`` package, you'll see:
.. code-block:: python
phases = ["configure", "build", "install"]
phases = ("configure", "build", "install")
Similarly, ``cmake`` defines:
.. code-block:: python
phases = ["bootstrap", "build", "install"]
phases = ("bootstrap", "build", "install")
If we look at the ``cmake`` example, this tells Spack's ``PackageBase``
class to run the ``bootstrap``, ``build``, and ``install`` functions

View File

@@ -543,10 +543,10 @@ With either interpreter you can run a single command:
.. code-block:: console
$ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
$ spack python -c 'from spack.concretize import concretize_one; concretize_one("python")'
...
$ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
$ spack python -i ipython -c 'from spack.concretize import concretize_one; concretize_one("python")'
Out[1]: ...
or a file:

View File

@@ -456,14 +456,13 @@ For instance, the following config options,
tcl:
all:
suffixes:
^python@3: 'python{^python.version}'
^python@3: 'python{^python.version.up_to_2}'
^openblas: 'openblas'
will add a ``python-3.12.1`` version string to any packages compiled with
Python matching the spec, ``python@3``. This is useful to know which
version of Python a set of Python extensions is associated with. Likewise, the
``openblas`` string is attached to any program that has openblas in the spec,
most likely via the ``+blas`` variant specification.
will add a ``python3.12`` to module names of packages compiled with Python 3.12, and similarly for
all specs depending on ``python@3``. This is useful to know which version of Python a set of Python
extensions is associated with. Likewise, the ``openblas`` string is attached to any program that
has openblas in the spec, most likely via the ``+blas`` variant specification.
The most heavyweight solution to module naming is to change the entire
naming convention for module files. This uses the projections format

View File

@@ -3,7 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""URL primitives that just require Python standard library."""
import itertools
import os.path
import os
import re
from typing import Optional, Set, Tuple
from urllib.parse import urlsplit, urlunsplit

View File

@@ -75,7 +75,6 @@
"install_tree",
"is_exe",
"join_path",
"last_modification_time_recursive",
"library_extensions",
"mkdirp",
"partition_path",
@@ -1470,15 +1469,36 @@ def set_executable(path):
@system_path_filter
def last_modification_time_recursive(path):
    path = os.path.abspath(path)
    times = [os.stat(path).st_mtime]
    times.extend(
        os.lstat(os.path.join(root, name)).st_mtime
        for root, dirs, files in os.walk(path)
        for name in dirs + files
    )
    return max(times)

def recursive_mtime_greater_than(path: str, time: float) -> bool:
    """Returns true if any file or dir recursively under `path` has mtime greater than `time`."""
    # use bfs order to increase likelihood of early return
    queue: Deque[str] = collections.deque([path])
    if os.stat(path).st_mtime > time:
        return True
    while queue:
        current = queue.popleft()
        try:
            entries = os.scandir(current)
        except OSError:
            continue
        with entries:
            for entry in entries:
                try:
                    st = entry.stat(follow_symlinks=False)
                except OSError:
                    continue
                if st.st_mtime > time:
                    return True
                if entry.is_dir(follow_symlinks=False):
                    queue.append(entry.path)
    return False
@system_path_filter
@@ -1740,8 +1760,7 @@ def find(
def _log_file_access_issue(e: OSError, path: str) -> None:
errno_name = errno.errorcode.get(e.errno, "UNKNOWN")
tty.debug(f"find must skip {path}: {errno_name} {e}")
tty.debug(f"find must skip {path}: {e}")
def _file_id(s: os.stat_result) -> Tuple[int, int]:

View File

@@ -1356,14 +1356,8 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str):
if isinstance(_expected, str) and isinstance(_detected, str):
try:
_regex = re.compile(_expected)
except re.error:
@@ -1379,7 +1373,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
if isinstance(_expected, dict):
elif isinstance(_expected, dict) and isinstance(_detected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@@ -1394,6 +1388,10 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
else:
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
return result

View File

@@ -5,6 +5,7 @@
import codecs
import collections
import concurrent.futures
import contextlib
import copy
import hashlib
import io
@@ -23,7 +24,7 @@
import urllib.request
import warnings
from contextlib import closing
from typing import IO, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union
from typing import IO, Callable, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union
import llnl.util.filesystem as fsys
import llnl.util.lang
@@ -91,6 +92,9 @@
CURRENT_BUILD_CACHE_LAYOUT_VERSION = 2
INDEX_HASH_FILE = "index.json.hash"
class BuildCacheDatabase(spack_db.Database):
"""A database for binary buildcaches.
@@ -502,7 +506,7 @@ def _fetch_and_cache_index(self, mirror_url, cache_entry={}):
scheme = urllib.parse.urlparse(mirror_url).scheme
if scheme != "oci" and not web_util.url_exists(
url_util.join(mirror_url, BUILD_CACHE_RELATIVE_PATH, "index.json")
url_util.join(mirror_url, BUILD_CACHE_RELATIVE_PATH, spack_db.INDEX_JSON_FILE)
):
return False
@@ -669,19 +673,24 @@ def sign_specfile(key: str, specfile_path: str) -> str:
def _read_specs_and_push_index(
file_list, read_method, cache_prefix, db: BuildCacheDatabase, temp_dir, concurrency
file_list: List[str],
read_method: Callable,
cache_prefix: str,
db: BuildCacheDatabase,
temp_dir: str,
concurrency: int,
):
"""Read all the specs listed in the provided list, using thread given thread parallelism,
generate the index, and push it to the mirror.
Args:
file_list (list(str)): List of urls or file paths pointing at spec files to read
file_list: List of urls or file paths pointing at spec files to read
read_method: A function taking a single argument, either a url or a file path,
and which reads the spec file at that location, and returns the spec.
cache_prefix (str): prefix of the build cache on s3 where index should be pushed.
cache_prefix: prefix of the build cache on s3 where index should be pushed.
db: A spack database used for adding specs and then writing the index.
temp_dir (str): Location to write index.json and hash for pushing
concurrency (int): Number of parallel processes to use when fetching
temp_dir: Location to write index.json and hash for pushing
concurrency: Number of parallel processes to use when fetching
"""
for file in file_list:
contents = read_method(file)
@@ -699,7 +708,7 @@ def _read_specs_and_push_index(
# Now generate the index, compute its hash, and push the two files to
# the mirror.
index_json_path = os.path.join(temp_dir, "index.json")
index_json_path = os.path.join(temp_dir, spack_db.INDEX_JSON_FILE)
with open(index_json_path, "w", encoding="utf-8") as f:
db._write_to_file(f)
@@ -709,14 +718,14 @@ def _read_specs_and_push_index(
index_hash = compute_hash(index_string)
# Write the hash out to a local file
index_hash_path = os.path.join(temp_dir, "index.json.hash")
index_hash_path = os.path.join(temp_dir, INDEX_HASH_FILE)
with open(index_hash_path, "w", encoding="utf-8") as f:
f.write(index_hash)
# Push the index itself
web_util.push_to_url(
index_json_path,
url_util.join(cache_prefix, "index.json"),
url_util.join(cache_prefix, spack_db.INDEX_JSON_FILE),
keep_original=False,
extra_args={"ContentType": "application/json", "CacheControl": "no-cache"},
)
@@ -724,7 +733,7 @@ def _read_specs_and_push_index(
# Push the hash
web_util.push_to_url(
index_hash_path,
url_util.join(cache_prefix, "index.json.hash"),
url_util.join(cache_prefix, INDEX_HASH_FILE),
keep_original=False,
extra_args={"ContentType": "text/plain", "CacheControl": "no-cache"},
)
@@ -793,7 +802,7 @@ def url_read_method(url):
try:
_, _, spec_file = web_util.read_from_url(url)
contents = codecs.getreader("utf-8")(spec_file).read()
except web_util.SpackWebError as e:
except (web_util.SpackWebError, OSError) as e:
tty.error(f"Error reading specfile: {url}: {e}")
return contents
@@ -861,9 +870,12 @@ def _url_generate_package_index(url: str, tmpdir: str, concurrency: int = 32):
tty.debug(f"Retrieving spec descriptor files from {url} to build index")
db = BuildCacheDatabase(tmpdir)
db._write()
try:
_read_specs_and_push_index(file_list, read_fn, url, db, db.database_directory, concurrency)
_read_specs_and_push_index(
file_list, read_fn, url, db, str(db.database_directory), concurrency
)
except Exception as e:
raise GenerateIndexError(f"Encountered problem pushing package index to {url}: {e}") from e
@@ -1777,7 +1789,7 @@ def _oci_update_index(
db.mark(spec, "in_buildcache", True)
# Create the index.json file
index_json_path = os.path.join(tmpdir, "index.json")
index_json_path = os.path.join(tmpdir, spack_db.INDEX_JSON_FILE)
with open(index_json_path, "w", encoding="utf-8") as f:
db._write_to_file(f)
@@ -1998,7 +2010,7 @@ def fetch_url_to_mirror(url):
# Download the config = spec.json and the relevant tarball
try:
manifest = json.loads(response.read())
manifest = json.load(response)
spec_digest = spack.oci.image.Digest.from_string(manifest["config"]["digest"])
tarball_digest = spack.oci.image.Digest.from_string(
manifest["layers"][-1]["digest"]
@@ -2158,7 +2170,8 @@ def dedupe_hardlinks_if_necessary(root, buildinfo):
def relocate_package(spec: spack.spec.Spec) -> None:
"""Relocate binaries and text files in the given spec prefix, based on its buildinfo file."""
buildinfo = read_buildinfo_file(spec.prefix)
spec_prefix = str(spec.prefix)
buildinfo = read_buildinfo_file(spec_prefix)
old_layout_root = str(buildinfo["buildpath"])
# Warn about old style tarballs created with the --rel flag (removed in Spack v0.20)
@@ -2179,7 +2192,7 @@ def relocate_package(spec: spack.spec.Spec) -> None:
"and an older buildcache create implementation. It cannot be relocated."
)
prefix_to_prefix = {}
prefix_to_prefix: Dict[str, str] = {}
if "sbang_install_path" in buildinfo:
old_sbang_install_path = str(buildinfo["sbang_install_path"])
@@ -2231,12 +2244,12 @@ def relocate_package(spec: spack.spec.Spec) -> None:
tty.debug(f"Relocating: {old} => {new}.")
# Old archives may have hardlinks repeated.
dedupe_hardlinks_if_necessary(spec.prefix, buildinfo)
dedupe_hardlinks_if_necessary(spec_prefix, buildinfo)
# Text files containing the prefix text
textfiles = [os.path.join(spec.prefix, f) for f in buildinfo["relocate_textfiles"]]
binaries = [os.path.join(spec.prefix, f) for f in buildinfo.get("relocate_binaries")]
links = [os.path.join(spec.prefix, f) for f in buildinfo.get("relocate_links", [])]
textfiles = [os.path.join(spec_prefix, f) for f in buildinfo["relocate_textfiles"]]
binaries = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_binaries")]
links = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_links", [])]
platform = spack.platforms.by_name(spec.platform)
if "macho" in platform.binary_formats:
@@ -2258,6 +2271,24 @@ def relocate_package(spec: spack.spec.Spec) -> None:
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
install_manifest = os.path.join(
spec.prefix,
spack.store.STORE.layout.metadata_dir,
spack.store.STORE.layout.manifest_file_name,
)
if not os.path.exists(install_manifest):
spec_id = spec.format("{name}/{hash:7}")
tty.warn("No manifest file in tarball for spec %s" % spec_id)
# overwrite old metadata with new
if spec.spliced:
# rewrite spec on disk
spack.store.STORE.layout.write_spec(spec, spack.store.STORE.layout.spec_file_path(spec))
# de-cache the install manifest
with contextlib.suppress(FileNotFoundError):
os.unlink(install_manifest)
def _extract_inner_tarball(spec, filename, extract_to, signature_required: bool, remote_checksum):
stagepath = os.path.dirname(filename)
@@ -2424,15 +2455,6 @@ def extract_tarball(spec, download_result, force=False, timer=timer.NULL_TIMER):
except Exception as e:
shutil.rmtree(spec.prefix, ignore_errors=True)
raise e
else:
manifest_file = os.path.join(
spec.prefix,
spack.store.STORE.layout.metadata_dir,
spack.store.STORE.layout.manifest_file_name,
)
if not os.path.exists(manifest_file):
spec_id = spec.format("{name}/{hash:7}")
tty.warn("No manifest file in tarball for spec %s" % spec_id)
finally:
if tmpdir:
shutil.rmtree(tmpdir, ignore_errors=True)
@@ -2537,10 +2559,6 @@ def install_root_node(
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, force)
spec.package.windows_establish_runtime_linkage()
if spec.spliced: # overwrite old metadata with new
spack.store.STORE.layout.write_spec(
spec, spack.store.STORE.layout.spec_file_path(spec)
)
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, allow_missing=allow_missing)
@@ -2578,11 +2596,14 @@ def try_direct_fetch(spec, mirrors=None):
)
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_signed_json)
specfile_contents = codecs.getreader("utf-8")(fs).read()
specfile_is_signed = True
except web_util.SpackWebError as e1:
except (web_util.SpackWebError, OSError) as e1:
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
except web_util.SpackWebError as e2:
specfile_contents = codecs.getreader("utf-8")(fs).read()
specfile_is_signed = False
except (web_util.SpackWebError, OSError) as e2:
tty.debug(
f"Did not find {specfile_name} on {buildcache_fetch_url_signed_json}",
e1,
@@ -2592,7 +2613,6 @@ def try_direct_fetch(spec, mirrors=None):
f"Did not find {specfile_name} on {buildcache_fetch_url_json}", e2, level=2
)
continue
specfile_contents = codecs.getreader("utf-8")(fs).read()
# read the spec from the build cache file. All specs in build caches
# are concrete (as they are built) so we need to mark this spec
@@ -2686,8 +2706,9 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
try:
_, _, json_file = web_util.read_from_url(keys_index)
json_index = sjson.load(codecs.getreader("utf-8")(json_file))
except web_util.SpackWebError as url_err:
json_index = sjson.load(json_file)
except (web_util.SpackWebError, OSError, ValueError) as url_err:
# TODO: avoid repeated request
if web_util.url_exists(keys_index):
tty.error(
f"Unable to find public keys in {url_util.format(fetch_url)},"
@@ -2934,14 +2955,14 @@ def __init__(self, url, local_hash, urlopen=web_util.urlopen):
def get_remote_hash(self):
# Failure to fetch index.json.hash is not fatal
url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json.hash")
url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, INDEX_HASH_FILE)
try:
response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
except (TimeoutError, urllib.error.URLError):
remote_hash = response.read(64)
except OSError:
return None
# Validate the hash
remote_hash = response.read(64)
if not re.match(rb"[a-f\d]{64}$", remote_hash):
return None
return remote_hash.decode("utf-8")
@@ -2955,17 +2976,17 @@ def conditional_fetch(self) -> FetchIndexResult:
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
# Otherwise, download index.json
url_index = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
url_index = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, spack_db.INDEX_JSON_FILE)
try:
response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
except (TimeoutError, urllib.error.URLError) as e:
raise FetchIndexError("Could not fetch index from {}".format(url_index), e) from e
except OSError as e:
raise FetchIndexError(f"Could not fetch index from {url_index}", e) from e
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
raise FetchIndexError("Remote index {} is invalid".format(url_index), e) from e
except (ValueError, OSError) as e:
raise FetchIndexError(f"Remote index {url_index} is invalid") from e
computed_hash = compute_hash(result)
@@ -2999,7 +3020,7 @@ def __init__(self, url, etag, urlopen=web_util.urlopen):
def conditional_fetch(self) -> FetchIndexResult:
# Just do a conditional fetch immediately
url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, spack_db.INDEX_JSON_FILE)
headers = {"User-Agent": web_util.SPACK_USER_AGENT, "If-None-Match": f'"{self.etag}"'}
try:
@@ -3009,12 +3030,12 @@ def conditional_fetch(self) -> FetchIndexResult:
# Not modified; that means fresh.
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
raise FetchIndexError(f"Could not fetch index {url}", e) from e
except (TimeoutError, urllib.error.URLError) as e:
except OSError as e: # URLError, socket.timeout, etc.
raise FetchIndexError(f"Could not fetch index {url}", e) from e
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
except (ValueError, OSError) as e:
raise FetchIndexError(f"Remote index {url} is invalid", e) from e
headers = response.headers
@@ -3046,11 +3067,11 @@ def conditional_fetch(self) -> FetchIndexResult:
headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
)
)
except (TimeoutError, urllib.error.URLError) as e:
except OSError as e:
raise FetchIndexError(f"Could not fetch manifest from {url_manifest}", e) from e
try:
manifest = json.loads(response.read())
manifest = json.load(response)
except Exception as e:
raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
@@ -3065,14 +3086,16 @@ def conditional_fetch(self) -> FetchIndexResult:
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
# Otherwise fetch the blob / index.json
response = self.urlopen(
urllib.request.Request(
url=self.ref.blob_url(index_digest),
headers={"Accept": "application/vnd.oci.image.layer.v1.tar+gzip"},
try:
response = self.urlopen(
urllib.request.Request(
url=self.ref.blob_url(index_digest),
headers={"Accept": "application/vnd.oci.image.layer.v1.tar+gzip"},
)
)
)
result = codecs.getreader("utf-8")(response).read()
result = codecs.getreader("utf-8")(response).read()
except (OSError, ValueError) as e:
raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
# Make sure the blob we download has the advertised hash
if compute_hash(result) != index_digest.digest:

View File

@@ -5,12 +5,14 @@
import fnmatch
import glob
import importlib
import os.path
import os
import re
import sys
import sysconfig
import warnings
from typing import Dict, Optional, Sequence, Union
from typing import Optional, Sequence, Union
from typing_extensions import TypedDict
import archspec.cpu
@@ -18,13 +20,17 @@
from llnl.util import tty
import spack.platforms
import spack.spec
import spack.store
import spack.util.environment
import spack.util.executable
from .config import spec_for_current_python
QueryInfo = Dict[str, "spack.spec.Spec"]
class QueryInfo(TypedDict, total=False):
    spec: spack.spec.Spec
    command: spack.util.executable.Executable
def _python_import(module: str) -> bool:
@@ -211,7 +217,9 @@ def _executables_in_store(
):
spack.util.environment.path_put_first("PATH", [bin_dir])
if query_info is not None:
query_info["command"] = spack.util.executable.which(*executables, path=bin_dir)
query_info["command"] = spack.util.executable.which(
*executables, path=bin_dir, required=True
)
query_info["spec"] = concrete_spec
return True
return False

View File

@@ -4,7 +4,7 @@
"""Manage configuration swapping for bootstrapping purposes"""
import contextlib
import os.path
import os
import sys
from typing import Any, Dict, Generator, MutableSequence, Sequence

View File

@@ -25,7 +25,6 @@
import functools
import json
import os
import os.path
import sys
import uuid
from typing import Any, Callable, Dict, List, Optional, Tuple
@@ -34,8 +33,10 @@
from llnl.util.lang import GroupedExceptionHandler
import spack.binary_distribution
import spack.concretize
import spack.config
import spack.detection
import spack.error
import spack.mirrors.mirror
import spack.platforms
import spack.spec
@@ -44,10 +45,17 @@
import spack.util.executable
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
import spack.version
from spack.installer import PackageInstaller
from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
from ._common import (
QueryInfo,
_executables_in_store,
_python_import,
_root_spec,
_try_import_from_store,
)
from .clingo import ClingoBootstrapConcretizer
from .config import spack_python_interpreter, spec_for_current_python
@@ -89,8 +97,12 @@ def __init__(self, conf: ConfigDictionary) -> None:
self.name = conf["name"]
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
# Promote (relative) paths to file urls
self.url = spack.mirrors.mirror.Mirror(conf["info"]["url"]).fetch_url
# Check for relative paths, and turn them into absolute paths
# root is the metadata_dir
maybe_url = conf["info"]["url"]
if spack.util.url.is_path_instead_of_url(maybe_url) and not os.path.isabs(maybe_url):
maybe_url = os.path.join(self.metadata_dir, maybe_url)
self.url = spack.mirrors.mirror.Mirror(maybe_url).fetch_url
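The hunk above stops assuming the configured URL is absolute: a relative path is now resolved against the source's `metadata_dir` before being handed to `Mirror`. A hedged sketch of that resolution, where `looks_like_path` is a simplified stand-in for `spack.util.url.is_path_instead_of_url`:

```python
import os
from urllib.parse import urlparse


def looks_like_path(value: str) -> bool:
    # Simplified stand-in: treat anything without a URL scheme as a path.
    return urlparse(value).scheme == ""


def resolve_mirror_url(metadata_dir: str, maybe_url: str) -> str:
    """Promote a relative path to an absolute one rooted at metadata_dir."""
    if looks_like_path(maybe_url) and not os.path.isabs(maybe_url):
        return os.path.join(metadata_dir, maybe_url)
    return maybe_url


# A relative entry resolves next to the source's metadata:
print(resolve_mirror_url("/opt/bootstrap/metadata", "../binaries"))
# Absolute paths and real URLs pass through unchanged:
print(resolve_mirror_url("/opt/bootstrap/metadata", "https://mirror.example.com"))
```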
@property
def mirror_scope(self) -> spack.config.InternalConfigScope:
@@ -134,7 +146,7 @@ class BuildcacheBootstrapper(Bootstrapper):
def __init__(self, conf) -> None:
super().__init__(conf)
self.last_search: Optional[ConfigDictionary] = None
self.last_search: Optional[QueryInfo] = None
self.config_scope_name = f"bootstrap_buildcache-{uuid.uuid4()}"
@staticmethod
@@ -211,14 +223,14 @@ def _install_and_test(
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)
info: ConfigDictionary = {}
info: QueryInfo = {}
if test_fn(query_spec=abstract_spec, query_info=info):
self.last_search = info
return True
return False
def try_import(self, module: str, abstract_spec_str: str) -> bool:
info: ConfigDictionary
info: QueryInfo
test_fn, info = functools.partial(_try_import_from_store, module), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
return True
@@ -231,7 +243,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
return self._install_and_test(abstract_spec, bincache_platform, data, test_fn)
def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
info: ConfigDictionary
info: QueryInfo
test_fn, info = functools.partial(_executables_in_store, executables), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
self.last_search = info
@@ -249,11 +261,11 @@ class SourceBootstrapper(Bootstrapper):
def __init__(self, conf) -> None:
super().__init__(conf)
self.last_search: Optional[ConfigDictionary] = None
self.last_search: Optional[QueryInfo] = None
self.config_scope_name = f"bootstrap_source-{uuid.uuid4()}"
def try_import(self, module: str, abstract_spec_str: str) -> bool:
info: ConfigDictionary = {}
info: QueryInfo = {}
if _try_import_from_store(module, abstract_spec_str, query_info=info):
self.last_search = info
return True
@@ -270,10 +282,10 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
concrete_spec = bootstrapper.concretize()
else:
concrete_spec = spack.spec.Spec(
abstract_spec = spack.spec.Spec(
abstract_spec_str + " ^" + spec_for_current_python()
)
concrete_spec.concretize()
concrete_spec = spack.concretize.concretize_one(abstract_spec)
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
tty.debug(msg.format(module, abstract_spec_str))
@@ -288,7 +300,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
return False
def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
info: ConfigDictionary = {}
info: QueryInfo = {}
if _executables_in_store(executables, abstract_spec_str, query_info=info):
self.last_search = info
return True
@@ -299,7 +311,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
# might reduce compilation time by a fair amount
_add_externals_if_missing()
concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
concrete_spec = spack.concretize.concretize_one(abstract_spec_str)
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):
@@ -316,11 +328,9 @@ def create_bootstrapper(conf: ConfigDictionary):
return _bootstrap_methods[btype](conf)
def source_is_enabled_or_raise(conf: ConfigDictionary):
"""Raise ValueError if the source is not enabled for bootstrapping"""
trusted, name = spack.config.get("bootstrap:trusted"), conf["name"]
if not trusted.get(name, False):
raise ValueError("source is not trusted")
def source_is_enabled(conf: ConfigDictionary) -> bool:
"""Returns true if the source is not enabled for bootstrapping"""
return spack.config.get("bootstrap:trusted").get(conf["name"], False)
def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str] = None):
@@ -350,24 +360,23 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
if create_bootstrapper(current_config).try_import(module, abstract_spec):
return
assert exception_handler, (
f"expected at least one exception to have been raised at this point: "
f"while bootstrapping {module}"
)
msg = f'cannot bootstrap the "{module}" Python module '
if abstract_spec:
msg += f'from spec "{abstract_spec}" '
if tty.is_debug():
if not exception_handler:
msg += ": no bootstrapping sources are enabled"
elif spack.error.debug or spack.error.SHOW_BACKTRACE:
msg += exception_handler.grouped_message(with_tracebacks=True)
else:
msg += exception_handler.grouped_message(with_tracebacks=False)
msg += "\nRun `spack --debug ...` for more detailed errors"
msg += "\nRun `spack --backtrace ...` for more detailed errors"
raise ImportError(msg)
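Taken together, these hunks replace the raise-based gate (`source_is_enabled_or_raise` inside the exception handler) with a plain boolean filter, so disabled sources are skipped before any exception machinery runs and the "no sources enabled" case gets its own message. A rough sketch of the resulting control flow, with hypothetical `is_enabled`/`attempt` callables standing in for the bootstrapper plumbing:

```python
from typing import Callable, Dict, List


def bootstrap_first_enabled(
    sources: List[Dict[str, str]],
    is_enabled: Callable[[Dict[str, str]], bool],
    attempt: Callable[[Dict[str, str]], bool],
) -> None:
    """Try enabled sources in order; collect failures; raise a summary."""
    errors: List[str] = []
    for source in sources:
        if not is_enabled(source):
            continue  # disabled sources are skipped, never raised on
        try:
            if attempt(source):
                return
        except Exception as e:  # forwarded and grouped per source
            errors.append(f"{source['name']}: {e}")

    if not errors:
        raise RuntimeError("cannot bootstrap: no bootstrapping sources are enabled")
    raise RuntimeError("cannot bootstrap:\n" + "\n".join(errors))
```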
@@ -405,8 +414,9 @@ def ensure_executables_in_path_or_raise(
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):
# Additional environment variables needed
@@ -414,6 +424,7 @@ def ensure_executables_in_path_or_raise(
current_bootstrapper.last_search["spec"],
current_bootstrapper.last_search["command"],
)
assert cmd is not None, "expected an Executable"
cmd.add_default_envmod(
spack.user_environment.environment_modifications_for_specs(
concrete_spec, set_package_py_globals=False
@@ -421,18 +432,17 @@ def ensure_executables_in_path_or_raise(
)
return cmd
assert exception_handler, (
f"expected at least one exception to have been raised at this point: "
f"while bootstrapping {executables_str}"
)
msg = f"cannot bootstrap any of the {executables_str} executables "
if abstract_spec:
msg += f'from spec "{abstract_spec}" '
if tty.is_debug():
if not exception_handler:
msg += ": no bootstrapping sources are enabled"
elif spack.error.debug or spack.error.SHOW_BACKTRACE:
msg += exception_handler.grouped_message(with_tracebacks=True)
else:
msg += exception_handler.grouped_message(with_tracebacks=False)
msg += "\nRun `spack --debug ...` for more detailed errors"
msg += "\nRun `spack --backtrace ...` for more detailed errors"
raise RuntimeError(msg)

View File

@@ -63,7 +63,6 @@ def _missing(name: str, purpose: str, system_only: bool = True) -> str:
def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),

View File

@@ -301,11 +301,13 @@ def clean_environment():
env.unset("CPLUS_INCLUDE_PATH")
env.unset("OBJC_INCLUDE_PATH")
# prevent configure scripts from sourcing variables from config site file (AC_SITE_LOAD).
env.set("CONFIG_SITE", os.devnull)
env.unset("CMAKE_PREFIX_PATH")
env.unset("PYTHONPATH")
env.unset("R_HOME")
env.unset("R_ENVIRON")
env.unset("LUA_PATH")
env.unset("LUA_CPATH")

View File

@@ -6,7 +6,9 @@
import llnl.util.filesystem as fs
import spack.directives
import spack.spec
import spack.util.executable
import spack.util.prefix
from .autotools import AutotoolsBuilder, AutotoolsPackage
@@ -17,19 +19,18 @@ class AspellBuilder(AutotoolsBuilder):
to the Aspell extensions.
"""
def configure(self, pkg, spec, prefix):
def configure(
self,
pkg: "AspellDictPackage", # type: ignore[override]
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
):
aspell = spec["aspell"].prefix.bin.aspell
prezip = spec["aspell"].prefix.bin.prezip
destdir = prefix
sh = spack.util.executable.which("sh")
sh(
"./configure",
"--vars",
"ASPELL={0}".format(aspell),
"PREZIP={0}".format(prezip),
"DESTDIR={0}".format(destdir),
)
sh = spack.util.executable.Executable("/bin/sh")
sh("./configure", "--vars", f"ASPELL={aspell}", f"PREZIP={prezip}", f"DESTDIR={destdir}")
# Aspell dictionaries install their bits into their prefix.lib

View File

@@ -2,7 +2,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import stat
import subprocess
from typing import Callable, List, Optional, Set, Tuple, Union
@@ -356,6 +355,13 @@ def _do_patch_libtool_configure(self) -> None:
)
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
# Configure scripts generated with libtool < 2.5.4 have a faulty test for the
# -single_module linker flag. A deprecation warning makes it think the default is
# -multi_module, triggering it to use problematic linker flags (such as ld -r). The
# linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
# `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:
@@ -527,7 +533,7 @@ def build_directory(self) -> str:
return build_dir
@spack.phase_callbacks.run_before("autoreconf")
def delete_configure_to_force_update(self) -> None:
def _delete_configure_to_force_update(self) -> None:
if self.force_autoreconf:
fs.force_remove(self.configure_abs_path)
@@ -540,7 +546,7 @@ def autoreconf_search_path_args(self) -> List[str]:
return _autoreconf_search_path_args(self.spec)
@spack.phase_callbacks.run_after("autoreconf")
def set_configure_or_die(self) -> None:
def _set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
@@ -564,10 +570,7 @@ def configure_args(self) -> List[str]:
return []
def autoreconf(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Not needed usually, configure should be already there"""
@@ -596,10 +599,7 @@ def autoreconf(
self.pkg.module.autoreconf(*autoreconf_args)
def configure(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "configure", with the arguments specified by the builder and an
appropriately set prefix.
@@ -612,10 +612,7 @@ def configure(
pkg.module.configure(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the build targets specified by the builder."""
# See https://autotools.io/automake/silent.html
@@ -625,10 +622,7 @@ def build(
pkg.module.make(*params)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
@@ -825,7 +819,7 @@ def installcheck(self) -> None:
self.pkg._if_make_target_execute("installcheck")
@spack.phase_callbacks.run_after("install")
def remove_libtool_archives(self) -> None:
def _remove_libtool_archives(self) -> None:
"""Remove all .la files in prefix sub-folders if the package sets
``install_libtool_archives`` to be False.
"""

View File

@@ -10,6 +10,8 @@
import llnl.util.tty as tty
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from .cmake import CMakeBuilder, CMakePackage
@@ -293,6 +295,13 @@ def initconfig_hardware_entries(self):
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
if spec.satisfies("%gcc"):
entries.append(
cmake_cache_string(
"CMAKE_HIP_FLAGS", f"--gcc-toolchain={self.pkg.compiler.prefix}"
)
)
return entries
def std_initconfig_entries(self):
@@ -323,7 +332,9 @@ def initconfig_package_entries(self):
"""This method is to be overwritten by the package"""
return []
def initconfig(self, pkg, spec, prefix):
def initconfig(
self, pkg: "CachedCMakePackage", spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
cache_entries = (
self.std_initconfig_entries()
+ self.initconfig_compiler_entries()

View File

@@ -7,6 +7,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
@@ -81,12 +83,16 @@ def check_args(self):
def setup_build_environment(self, env):
env.set("CARGO_HOME", self.stage.path)
def build(self, pkg, spec, prefix):
def build(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
def install(self, pkg, spec, prefix):
def install(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Copy build files into package prefix."""
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)

View File

@@ -454,10 +454,7 @@ def cmake_args(self) -> List[str]:
return []
def cmake(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``cmake`` in the build directory"""
@@ -474,10 +471,7 @@ def cmake(
pkg.module.cmake(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
with fs.working_dir(self.build_directory):
@@ -488,10 +482,7 @@ def build(
pkg.module.ninja(*self.build_targets)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):

View File

@@ -7,6 +7,8 @@
import spack.directives
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests
@@ -48,3 +50,8 @@ class GenericBuilder(BuilderWithDefaults):
# unconditionally perform any post-install phase tests
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def install(
self, pkg: Package, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
raise NotImplementedError

View File

@@ -7,7 +7,9 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BuilderWithDefaults, execute_install_time_tests
@@ -26,9 +28,7 @@ class GoPackage(spack.package_base.PackageBase):
build_system("go")
with when("build_system=go"):
# TODO: this seems like it should be depends_on, see
# setup_dependent_build_environment in go for why I kept it like this
extends("go@1.14:", type="build")
depends_on("go", type="build")
@spack.builder.builder("go")
@@ -71,6 +71,7 @@ class GoBuilder(BuilderWithDefaults):
def setup_build_environment(self, env):
env.set("GO111MODULE", "on")
env.set("GOTOOLCHAIN", "local")
env.set("GOPATH", fs.join_path(self.pkg.stage.path, "go"))
@property
def build_directory(self):
@@ -81,19 +82,31 @@ def build_directory(self):
def build_args(self):
"""Arguments for ``go build``."""
# Pass -ldflags "-s -w" by default: -s omits the symbol table, -w omits DWARF debug info
return ["-modcacherw", "-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
return [
"-p",
str(self.pkg.module.make_jobs),
"-modcacherw",
"-ldflags",
"-s -w",
"-o",
f"{self.pkg.name}",
]
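The `build_args` change threads Spack's job count into `go build` via `-p`, alongside the existing flags. A hypothetical sketch of the equivalent invocation (assumes `go` on `PATH`; `go_build` is an illustrative name):

```python
import subprocess


def go_build(package_name: str, jobs: int) -> None:
    """Illustrative equivalent of the builder's `go build` invocation."""
    args = [
        "go", "build",
        "-p", str(jobs),        # cap build parallelism at Spack's job count
        "-modcacherw",          # leave the module cache writable for cleanup
        "-ldflags", "-s -w",    # -s omits the symbol table, -w omits DWARF
        "-o", package_name,
    ]
    subprocess.run(args, check=True)
```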
@property
def check_args(self):
"""Argument for ``go test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``go build`` in the source directory"""
with fs.working_dir(self.build_directory):
pkg.module.go("build", *self.build_args)
def install(self, pkg, spec, prefix):
def install(
self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install built binaries into prefix bin."""
with fs.working_dir(self.build_directory):
fs.mkdirp(prefix.bin)

View File

@@ -7,7 +7,9 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.executable
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
@@ -55,7 +57,9 @@ class LuaBuilder(spack.builder.Builder):
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()
def unpack(self, pkg, spec, prefix):
def unpack(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
if os.path.splitext(pkg.stage.archive_file)[1] == ".rock":
directory = pkg.luarocks("unpack", pkg.stage.archive_file, output=str)
dirlines = directory.split("\n")
@@ -66,15 +70,16 @@ def unpack(self, pkg, spec, prefix):
def _generate_tree_line(name, prefix):
return """{{ name = "{name}", root = "{prefix}" }};""".format(name=name, prefix=prefix)
def generate_luarocks_config(self, pkg, spec, prefix):
def generate_luarocks_config(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
spec = self.pkg.spec
table_entries = []
for d in spec.traverse(deptype=("build", "run")):
if d.package.extends(self.pkg.extendee_spec):
table_entries.append(self._generate_tree_line(d.name, d.prefix))
path = self._luarocks_config_path()
with open(path, "w", encoding="utf-8") as config:
with open(self._luarocks_config_path(), "w", encoding="utf-8") as config:
config.write(
"""
deps_mode="all"
@@ -85,23 +90,26 @@ def generate_luarocks_config(self, pkg, spec, prefix):
"\n".join(table_entries)
)
)
return path
def preprocess(self, pkg, spec, prefix):
def preprocess(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Override this to preprocess source before building with luarocks"""
pass
def luarocks_args(self):
return []
def install(self, pkg, spec, prefix):
def install(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
rock = "."
specs = find(".", "*.rockspec", recursive=False)
if specs:
rock = specs[0]
rocks_args = self.luarocks_args()
rocks_args.append(rock)
self.pkg.luarocks("--tree=" + prefix, "make", *rocks_args)
pkg.luarocks("--tree=" + prefix, "make", *rocks_args)
def _luarocks_config_path(self):
return os.path.join(self.pkg.stage.source_path, "spack_luarocks.lua")

View File

@@ -98,29 +98,20 @@ def build_directory(self) -> str:
return self.pkg.stage.source_path
def edit(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Edit the Makefile before calling make. The default is a no-op."""
pass
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.build_targets)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):

View File

@@ -5,6 +5,8 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
from spack.util.executable import which
@@ -58,16 +60,20 @@ def build_args(self):
"""List of args to pass to build phase."""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Compile code and package into a JAR file."""
with fs.working_dir(self.build_directory):
mvn = which("mvn")
mvn = which("mvn", required=True)
if self.pkg.run_tests:
mvn("verify", *self.build_args())
else:
mvn("package", "-DskipTests", *self.build_args())
def install(self, pkg, spec, prefix):
def install(
self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Copy to installation prefix."""
with fs.working_dir(self.build_directory):
fs.install_tree(".", prefix)

View File

@@ -188,10 +188,7 @@ def meson_args(self) -> List[str]:
return []
def meson(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run ``meson`` in the build directory"""
options = []
@@ -204,10 +201,7 @@ def meson(
pkg.module.meson(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
options = ["-v"]
@@ -216,10 +210,7 @@ def build(
pkg.module.ninja(*options)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):

View File

@@ -7,6 +7,8 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts
from ._checks import BuilderWithDefaults
@@ -99,7 +101,9 @@ def msbuild_install_args(self):
as `msbuild_args` by default."""
return self.msbuild_args()
def build(self, pkg, spec, prefix):
def build(
self, pkg: MSBuildPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "msbuild" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.msbuild(
@@ -108,7 +112,9 @@ def build(self, pkg, spec, prefix):
self.define_targets(*self.build_targets),
)
def install(self, pkg, spec, prefix):
def install(
self, pkg: MSBuildPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "msbuild" on the install targets specified by the builder.
This is INSTALL by default"""
with fs.working_dir(self.build_directory):

View File

@@ -7,6 +7,8 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts
from ._checks import BuilderWithDefaults
@@ -123,7 +125,9 @@ def nmake_install_args(self):
Individual packages should override to specify NMake args to command line"""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: NMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "nmake" on the build targets specified by the builder."""
opts = self.std_nmake_args
opts += self.nmake_args()
@@ -132,7 +136,9 @@ def build(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
pkg.module.nmake(*opts, *self.build_targets, ignore_quotes=self.ignore_quotes)
def install(self, pkg, spec, prefix):
def install(
self, pkg: NMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "nmake" on the install targets specified by the builder.
This is INSTALL by default"""
opts = self.std_nmake_args

View File

@@ -3,6 +3,8 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, extends
from spack.multimethod import when
@@ -42,7 +44,9 @@ class OctaveBuilder(BuilderWithDefaults):
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()
def install(self, pkg, spec, prefix):
def install(
self, pkg: OctavePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install the package from the archive file"""
pkg.module.octave(
"--quiet",

View File

@@ -10,6 +10,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.install_test import SkipTest, test_part
from spack.multimethod import when
@@ -149,7 +151,9 @@ def configure_args(self):
"""
return []
def configure(self, pkg, spec, prefix):
def configure(
self, pkg: PerlPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run Makefile.PL or Build.PL with arguments consisting of
an appropriate installation base directory followed by the
list returned by :py:meth:`~.PerlBuilder.configure_args`.
@@ -173,7 +177,9 @@ def fix_shebang(self):
repl = "#!/usr/bin/env perl"
filter_file(pattern, repl, "Build", backup=False)
def build(self, pkg, spec, prefix):
def build(
self, pkg: PerlPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Builds a Perl package."""
self.build_executable()
@@ -184,6 +190,8 @@ def check(self):
"""Runs built-in tests of a Perl package."""
self.build_executable("test")
def install(self, pkg, spec, prefix):
def install(
self, pkg: PerlPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Installs a Perl package."""
self.build_executable("install")

View File

@@ -28,6 +28,7 @@
import spack.repo
import spack.spec
import spack.store
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part

View File

@@ -6,6 +6,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests
@@ -62,17 +64,23 @@ def qmake_args(self):
"""List of arguments passed to qmake."""
return []
def qmake(self, pkg, spec, prefix):
def qmake(
self, pkg: QMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run ``qmake`` to configure the project and generate a Makefile."""
with working_dir(self.build_directory):
pkg.module.qmake(*self.qmake_args())
def build(self, pkg, spec, prefix):
def build(
self, pkg: QMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
with working_dir(self.build_directory):
pkg.module.make()
def install(self, pkg, spec, prefix):
def install(
self, pkg: QMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with working_dir(self.build_directory):
pkg.module.make("install")

View File

@@ -9,6 +9,8 @@
import llnl.util.tty as tty
import spack.builder
import spack.spec
import spack.util.prefix
from spack.build_environment import SPACK_NO_PARALLEL_MAKE
from spack.config import determine_number_of_jobs
from spack.directives import build_system, extends, maintainers
@@ -74,18 +76,22 @@ def build_directory(self):
ret = os.path.join(ret, self.subdirectory)
return ret
def install(self, pkg, spec, prefix):
def install(
self, pkg: RacketPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install everything from build directory."""
raco = Executable("raco")
with fs.working_dir(self.build_directory):
parallel = self.pkg.parallel and (not env_flag(SPACK_NO_PARALLEL_MAKE))
parallel = pkg.parallel and (not env_flag(SPACK_NO_PARALLEL_MAKE))
name = pkg.racket_name
assert name is not None, "Racket package name is not set"
args = [
"pkg",
"install",
"-t",
"dir",
"-n",
self.pkg.racket_name,
name,
"--deps",
"fail",
"--ignore-implies",
@@ -101,8 +107,7 @@ def install(self, pkg, spec, prefix):
except ProcessError:
args.insert(-2, "--skip-installed")
raco(*args)
msg = (
"Racket package {0} was already installed, uninstalling via "
tty.warn(
f"Racket package {name} was already installed, uninstalling via "
"Spack may make someone unhappy!"
)
tty.warn(msg.format(self.pkg.racket_name))

View File

@@ -140,7 +140,7 @@ class ROCmPackage(PackageBase):
when="+rocm",
)
depends_on("llvm-amdgpu", when="+rocm")
depends_on("llvm-amdgpu", type="build", when="+rocm")
depends_on("hsa-rocr-dev", when="+rocm")
depends_on("hip +rocm", when="+rocm")

View File

@@ -5,6 +5,8 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, extends, maintainers
from ._checks import BuilderWithDefaults
@@ -42,7 +44,9 @@ class RubyBuilder(BuilderWithDefaults):
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()
def build(self, pkg, spec, prefix):
def build(
self, pkg: RubyPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Build a Ruby gem."""
# ruby-rake provides both rake.gemspec and Rakefile, but only
@@ -58,7 +62,9 @@ def build(self, pkg, spec, prefix):
# Some Ruby packages only ship `*.gem` files, so nothing to build
pass
def install(self, pkg, spec, prefix):
def install(
self, pkg: RubyPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install a Ruby gem.
The ruby package sets ``GEM_HOME`` to tell gem where to install to."""

View File

@@ -4,6 +4,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests
@@ -59,7 +61,9 @@ def build_args(self, spec, prefix):
"""Arguments to pass to build."""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: SConsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Build the package."""
pkg.module.scons(*self.build_args(spec, prefix))
@@ -67,7 +71,9 @@ def install_args(self, spec, prefix):
"""Arguments to pass to install."""
return []
def install(self, pkg, spec, prefix):
def install(
self, pkg: SConsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install the package."""
pkg.module.scons("install", *self.install_args(spec, prefix))

View File

@@ -11,6 +11,8 @@
import spack.install_test
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
from spack.util.executable import Executable
@@ -41,6 +43,7 @@ class SIPPackage(spack.package_base.PackageBase):
with when("build_system=sip"):
extends("python", type=("build", "link", "run"))
depends_on("py-sip", type="build")
depends_on("gmake", type="build")
@property
def import_modules(self):
@@ -130,7 +133,9 @@ class SIPBuilder(BuilderWithDefaults):
build_directory = "build"
def configure(self, pkg, spec, prefix):
def configure(
self, pkg: SIPPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Configure the package."""
# https://www.riverbankcomputing.com/static/Docs/sip/command_line_tools.html
@@ -148,7 +153,9 @@ def configure_args(self):
"""Arguments to pass to configure."""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: SIPPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Build the package."""
args = self.build_args()
@@ -159,7 +166,9 @@ def build_args(self):
"""Arguments to pass to build."""
return []
def install(self, pkg, spec, prefix):
def install(
self, pkg: SIPPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install the package."""
args = self.install_args()

View File

@@ -6,6 +6,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests, execute_install_time_tests
@@ -97,7 +99,9 @@ def waf(self, *args, **kwargs):
with working_dir(self.build_directory):
self.python("waf", "-j{0}".format(jobs), *args, **kwargs)
def configure(self, pkg, spec, prefix):
def configure(
self, pkg: WafPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Configures the project."""
args = ["--prefix={0}".format(self.pkg.prefix)]
args += self.configure_args()
@@ -108,7 +112,9 @@ def configure_args(self):
"""Arguments to pass to configure."""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: WafPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Executes the build."""
args = self.build_args()
@@ -118,7 +124,9 @@ def build_args(self):
"""Arguments to pass to build."""
return []
def install(self, pkg, spec, prefix):
def install(
self, pkg: WafPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Installs the targets on the system."""
args = self.install_args()

View File

@@ -14,7 +14,6 @@
import zipfile
from collections import namedtuple
from typing import Callable, Dict, List, Set
from urllib.error import HTTPError, URLError
from urllib.request import HTTPHandler, Request, build_opener
import llnl.util.filesystem as fs
@@ -472,12 +471,9 @@ def generate_pipeline(env: ev.Environment, args) -> None:
# Use all unpruned specs to populate the build group for this set
cdash_config = cfg.get("cdash")
if options.cdash_handler and options.cdash_handler.auth_token:
try:
options.cdash_handler.populate_buildgroup(
[options.cdash_handler.build_name(s) for s in pipeline_specs]
)
except (SpackError, HTTPError, URLError, TimeoutError) as err:
tty.warn(f"Problem populating buildgroup: {err}")
options.cdash_handler.populate_buildgroup(
[options.cdash_handler.build_name(s) for s in pipeline_specs]
)
elif cdash_config:
# warn only if there was actually a CDash configuration.
tty.warn("Unable to populate buildgroup without CDash credentials")

View File

@@ -1,23 +1,21 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import codecs
import copy
import json
import os
import re
import ssl
import sys
import time
from collections import deque
from enum import Enum
from typing import Dict, Generator, List, Optional, Set, Tuple
from urllib.parse import quote, urlencode, urlparse
from urllib.request import HTTPHandler, HTTPSHandler, Request, build_opener
from urllib.request import Request
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import Singleton, memoized
from llnl.util.lang import memoized
import spack.binary_distribution as bindist
import spack.config as cfg
@@ -35,32 +33,11 @@
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.reporters.cdash import build_stamp as cdash_build_stamp
def _urlopen():
error_handler = web_util.SpackHTTPDefaultErrorHandler()
# One opener with HTTPS ssl enabled
with_ssl = build_opener(
HTTPHandler(), HTTPSHandler(context=web_util.ssl_create_default_context()), error_handler
)
# One opener with HTTPS ssl disabled
without_ssl = build_opener(
HTTPHandler(), HTTPSHandler(context=ssl._create_unverified_context()), error_handler
)
# And dynamically dispatch based on the config:verify_ssl.
def dispatch_open(fullurl, data=None, timeout=None, verify_ssl=True):
opener = with_ssl if verify_ssl else without_ssl
timeout = timeout or cfg.get("config:connect_timeout", 1)
return opener.open(fullurl, data, timeout)
return dispatch_open
IS_WINDOWS = sys.platform == "win32"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
_dyn_mapping_urlopener = Singleton(_urlopen)
# this exists purely for testing purposes
_urlopen = web_util.urlopen
def copy_files_to_artifacts(src, artifacts_dir):
@@ -279,26 +256,25 @@ def copy_test_results(self, source, dest):
reports = fs.join_path(source, "*_Test*.xml")
copy_files_to_artifacts(reports, dest)
def create_buildgroup(self, opener, headers, url, group_name, group_type):
def create_buildgroup(self, headers, url, group_name, group_type):
data = {"newbuildgroup": group_name, "project": self.project, "type": group_type}
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code not in [200, 201]:
msg = f"Creating buildgroup failed (response code = {response_code})"
tty.warn(msg)
try:
response_text = _urlopen(request, timeout=SPACK_CDASH_TIMEOUT).read()
except OSError as e:
tty.warn(f"Failed to create CDash buildgroup: {e}")
return None
response_text = response.read()
response_json = json.loads(response_text)
build_group_id = response_json["id"]
return build_group_id
try:
response_json = json.loads(response_text)
return response_json["id"]
except (json.JSONDecodeError, KeyError) as e:
tty.warn(f"Failed to parse CDash response: {e}")
return None
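`create_buildgroup` now separates transport failures (`OSError`) from malformed responses (`JSONDecodeError`/`KeyError`), warning and returning `None` in either case rather than crashing the pipeline. A standalone sketch of that two-stage defensive pattern using plain `urllib` (names and timeout are illustrative):

```python
import json
import urllib.request
from typing import Optional


def create_buildgroup(url: str, headers: dict, payload: dict) -> Optional[int]:
    """POST JSON; warn and return None on transport or parse failures."""
    request = urllib.request.Request(
        url, data=json.dumps(payload).encode("utf-8"), headers=headers
    )
    try:
        body = urllib.request.urlopen(request, timeout=30).read()
    except OSError as e:
        print(f"warning: failed to create buildgroup: {e}")
        return None
    try:
        return json.loads(body)["id"]
    except (json.JSONDecodeError, KeyError) as e:
        print(f"warning: unexpected buildgroup response: {e}")
        return None
```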
def populate_buildgroup(self, job_names):
url = f"{self.url}/api/v1/buildgroup.php"
@@ -308,16 +284,11 @@ def populate_buildgroup(self, job_names):
"Content-Type": "application/json",
}
opener = build_opener(HTTPHandler)
parent_group_id = self.create_buildgroup(opener, headers, url, self.build_group, "Daily")
group_id = self.create_buildgroup(
opener, headers, url, f"Latest {self.build_group}", "Latest"
)
parent_group_id = self.create_buildgroup(headers, url, self.build_group, "Daily")
group_id = self.create_buildgroup(headers, url, f"Latest {self.build_group}", "Latest")
if not parent_group_id or not group_id:
msg = f"Failed to create or retrieve buildgroups for {self.build_group}"
tty.warn(msg)
tty.warn(f"Failed to create or retrieve buildgroups for {self.build_group}")
return
data = {
@@ -329,15 +300,12 @@ def populate_buildgroup(self, job_names):
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
request.get_method = lambda: "PUT"
request = Request(url, data=enc_data, headers=headers, method="PUT")
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:
msg = f"Error response code ({response_code}) in populate_buildgroup"
tty.warn(msg)
try:
_urlopen(request, timeout=SPACK_CDASH_TIMEOUT)
except OSError as e:
tty.warn(f"Failed to populate CDash buildgroup: {e}")
def report_skipped(self, spec: spack.spec.Spec, report_dir: str, reason: Optional[str]):
"""Explicitly report skipping testing of a spec (e.g., it's CI
@@ -735,9 +703,6 @@ def _apply_section(dest, src):
for value in header.values():
value = os.path.expandvars(value)
verify_ssl = mapping.get("verify_ssl", spack.config.get("config:verify_ssl", True))
timeout = mapping.get("timeout", spack.config.get("config:connect_timeout", 1))
required = mapping.get("require", [])
allowed = mapping.get("allow", [])
ignored = mapping.get("ignore", [])
@@ -771,19 +736,15 @@ def job_query(job):
endpoint_url._replace(query=query).geturl(), headers=header, method="GET"
)
try:
response = _dyn_mapping_urlopener(
request, verify_ssl=verify_ssl, timeout=timeout
)
response = _urlopen(request)
config = json.load(response)
except Exception as e:
# For now just ignore any errors from dynamic mapping and continue
# This is still experimental, and failures should not stop CI
# from running normally
tty.warn(f"Failed to fetch dynamic mapping for query:\n\t{query}")
tty.warn(f"{e}")
tty.warn(f"Failed to fetch dynamic mapping for query:\n\t{query}: {e}")
continue
config = json.load(codecs.getreader("utf-8")(response))
# Strip ignore keys
if ignored:
for key in ignored:

View File

@@ -202,7 +202,7 @@ def _concretize_spec_pairs(
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or abstract.concretized(tests=tests)]
return [concrete or spack.concretize.concretize_one(abstract, tests=tests)]
# Special case if every spec is either concrete or has an abstract hash
if all(
@@ -254,9 +254,9 @@ def matching_spec_from_env(spec):
"""
env = ev.active_environment()
if env:
return env.matching_spec(spec) or spec.concretized()
return env.matching_spec(spec) or spack.concretize.concretize_one(spec)
else:
return spec.concretized()
return spack.concretize.concretize_one(spec)
def matching_specs_from_env(specs):
@@ -297,7 +297,7 @@ def disambiguate_spec(
def disambiguate_spec_from_hashes(
spec: spack.spec.Spec,
hashes: List[str],
hashes: Optional[List[str]],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,

View File

@@ -1,7 +1,7 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import os
import shutil
import sys
import tempfile
@@ -14,9 +14,9 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.concretize
import spack.config
import spack.mirrors.utils
import spack.spec
import spack.stage
import spack.util.path
import spack.util.spack_yaml
@@ -397,7 +397,7 @@ def _mirror(args):
llnl.util.tty.msg(msg.format(spec_str, mirror_dir))
# Suppress tty from the call below for terser messages
llnl.util.tty.set_msg_enabled(False)
spec = spack.spec.Spec(spec_str).concretized()
spec = spack.concretize.concretize_one(spec_str)
for node in spec.traverse():
spack.mirrors.utils.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)
@@ -436,6 +436,7 @@ def write_metadata(subdir, metadata):
shutil.copy(spack.util.path.canonicalize_path(GNUPG_JSON), abs_directory)
shutil.copy(spack.util.path.canonicalize_path(PATCHELF_JSON), abs_directory)
instructions += cmd.format("local-binaries", rel_directory)
instructions += " % spack buildcache update-index <final-path>/bootstrap_cache\n"
print(instructions)

View File

@@ -16,6 +16,7 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.environment as ev
@@ -554,8 +555,7 @@ def check_fn(args: argparse.Namespace):
tty.msg("No specs provided, exiting.")
return
for spec in specs:
spec.concretize()
specs = [spack.concretize.concretize_one(s) for s in specs]
# Next see if there are any configured binary mirrors
configured_mirrors = spack.config.get("mirrors", scope=args.scope)
@@ -623,7 +623,7 @@ def save_specfile_fn(args):
root = specs[0]
if not root.concrete:
root.concretize()
root = spack.concretize.concretize_one(root)
save_dependency_specfiles(
root, args.specfile_dir, dependencies=spack.cmd.parse_specs(args.specs)

View File

@@ -4,7 +4,7 @@
import argparse
import os.path
import os
import textwrap
from llnl.util.lang import stable_partition

View File

@@ -2,7 +2,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import llnl.util.tty

View File

@@ -18,6 +18,7 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.concretize
import spack.environment as ev
import spack.installer
import spack.store
@@ -103,7 +104,7 @@ def deprecate(parser, args):
)
if args.install:
deprecator = specs[1].concretized()
deprecator = spack.concretize.concretize_one(specs[1])
else:
deprecator = spack.cmd.disambiguate_spec(specs[1], env, local=True)

View File

@@ -10,6 +10,7 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments
import spack.concretize
import spack.config
import spack.repo
from spack.cmd.common import arguments
@@ -113,8 +114,8 @@ def dev_build(self, args):
source_path = os.path.abspath(source_path)
# Forces the build to run out of the source directory.
spec.constrain("dev_path=%s" % source_path)
spec.concretize()
spec.constrain(f'dev_path="{source_path}"')
spec = spack.concretize.concretize_one(spec)
if spec.installed:
tty.error("Already installed in %s" % spec.prefix)

View File

@@ -13,6 +13,7 @@
from llnl.util import lang, tty
import spack.cmd
import spack.concretize
import spack.config
import spack.environment as ev
import spack.paths
@@ -450,7 +451,7 @@ def concrete_specs_from_file(args):
else:
s = spack.spec.Spec.from_json(f)
concretized = s.concretized()
concretized = spack.concretize.concretize_one(s)
if concretized.dag_hash() != s.dag_hash():
msg = 'skipped invalid file "{0}". '
msg += "The file does not contain a concrete spec."

View File

@@ -7,9 +7,9 @@
from llnl.path import convert_to_posix_path
import spack.concretize
import spack.paths
import spack.util.executable
from spack.spec import Spec
description = "generate Windows installer"
section = "admin"
@@ -65,8 +65,7 @@ def make_installer(parser, args):
"""
if sys.platform == "win32":
output_dir = args.output_dir
cmake_spec = Spec("cmake")
cmake_spec.concretize()
cmake_spec = spack.concretize.concretize_one("cmake")
cmake_path = os.path.join(cmake_spec.prefix, "bin", "cmake.exe")
cpack_path = os.path.join(cmake_spec.prefix, "bin", "cpack.exe")
spack_source = args.spack_source

View File

@@ -492,7 +492,7 @@ def extend_with_additional_versions(specs, num_versions):
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
else:
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [x.concretized() for x in mirror_specs]
mirror_specs = [spack.concretize.concretize_one(x) for x in mirror_specs]
return mirror_specs

View File

@@ -5,7 +5,7 @@
"""Implementation details of the ``spack module`` command."""
import collections
import os.path
import os
import shutil
import sys

View File

@@ -2,7 +2,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import os
import shutil
import llnl.util.tty as tty

View File

@@ -5,7 +5,7 @@
import argparse
import collections
import io
import os.path
import os
import re
import sys

View File

@@ -37,13 +37,12 @@ def enable_compiler_existence_check():
SpecPairInput = Tuple[Spec, Optional[Spec]]
SpecPair = Tuple[Spec, Spec]
SpecLike = Union[Spec, str]
TestsType = Union[bool, Iterable[str]]
def concretize_specs_together(
abstract_specs: Sequence[SpecLike], tests: TestsType = False
) -> Sequence[Spec]:
def _concretize_specs_together(
abstract_specs: Sequence[Spec], tests: TestsType = False
) -> List[Spec]:
"""Given a number of specs as input, tries to concretize them together.
Args:
@@ -51,11 +50,10 @@ def concretize_specs_together(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
from spack.solver.asp import Solver
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
result = Solver().solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
return [s.copy() for s in result.specs]
@@ -72,7 +70,7 @@ def concretize_together(
"""
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
abstract_specs = [abstract for abstract, _ in spec_list]
concrete_specs = concretize_specs_together(to_concretize, tests=tests)
concrete_specs = _concretize_specs_together(to_concretize, tests=tests)
return list(zip(abstract_specs, concrete_specs))
@@ -90,7 +88,7 @@ def concretize_together_when_possible(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
from spack.solver.asp import Solver
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
old_concrete_to_abstract = {
@@ -98,9 +96,8 @@ def concretize_together_when_possible(
}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
allow_deprecated = spack.config.get("config:deprecated", False)
for result in solver.solve_in_rounds(
for result in Solver().solve_in_rounds(
to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
@@ -124,7 +121,7 @@ def concretize_separately(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.bootstrap
from spack.bootstrap import ensure_bootstrap_configuration, ensure_clingo_importable_or_raise
to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
args = [
@@ -134,8 +131,8 @@ def concretize_separately(
]
ret = [(i, abstract) for i, abstract in enumerate(to_concretize) if abstract.concrete]
# Ensure we don't try to bootstrap clingo in parallel
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
with ensure_bootstrap_configuration():
ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -190,10 +187,52 @@ def _concretize_task(packed_arguments: Tuple[int, str, TestsType]) -> Tuple[int,
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = Spec(spec_str).concretized(tests=tests)
spec = concretize_one(Spec(spec_str), tests=tests)
return index, spec, time.time() - start
def concretize_one(spec: Union[str, Spec], tests: TestsType = False) -> Spec:
"""Return a concretized copy of the given spec.
Args:
spec: abstract spec (or spec string) to concretize.
tests: if False, disregard 'test' dependencies; if a list of names, activate
them for the listed packages; if True, activate 'test' dependencies for
all packages.
"""
from spack.solver.asp import Solver, SpecBuilder
if isinstance(spec, str):
spec = Spec(spec)
spec = spec.lookup_hash()
if spec.concrete:
return spec.copy()
for node in spec.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
allow_deprecated = spack.config.get("config:deprecated", False)
result = Solver().solve([spec], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = spec.name
# TODO: Consolidate this code with similar code in solve.py
if spec.virtual:
providers = [s.name for s in answer.values() if s.package.provides(name)]
name = providers[0]
node = SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
return concretized
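The new module-level `concretize_one` replaces the `Spec.concretized()` call sites updated throughout this changeset. Usage, as those call sites suggest (assumes a Spack checkout on `PYTHONPATH`):

```python
import spack.concretize
from spack.spec import Spec

# Accepts either a spec string or a Spec; returns a concrete spec.
concrete = spack.concretize.concretize_one("zlib")

# Already-concrete specs are returned as copies instead of being re-solved,
# and `tests` mirrors the old Spec.concretized(tests=...) behavior.
concrete_hdf5 = spack.concretize.concretize_one(Spec("hdf5"), tests=False)
```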
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""

View File

@@ -36,6 +36,8 @@
import sys
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
import jsonschema
from llnl.util import filesystem, lang, tty
import spack.error
@@ -51,6 +53,7 @@
import spack.schema.definitions
import spack.schema.develop
import spack.schema.env
import spack.schema.env_vars
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
@@ -68,6 +71,7 @@
"compilers": spack.schema.compilers.schema,
"concretizer": spack.schema.concretizer.schema,
"definitions": spack.schema.definitions.schema,
"env_vars": spack.schema.env_vars.schema,
"view": spack.schema.view.schema,
"develop": spack.schema.develop.schema,
"mirrors": spack.schema.mirrors.schema,
@@ -1048,8 +1052,6 @@ def validate(
This leverages the line information (start_mark, end_mark) stored
on Spack YAML structures.
"""
import jsonschema
try:
spack.schema.Validator(schema).validate(data)
except jsonschema.ValidationError as e:

View File

@@ -6,6 +6,8 @@
"""
import warnings
import jsonschema
import spack.environment as ev
import spack.schema.env as env
import spack.util.spack_yaml as syaml
@@ -30,8 +32,6 @@ def validate(configuration_file):
Returns:
A sanitized copy of the configuration stored in the input file
"""
import jsonschema
with open(configuration_file, encoding="utf-8") as f:
config = syaml.load(f)

View File

@@ -3,7 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Manages the details on the images used in the various stages."""
import json
import os.path
import os
import shlex
import sys

View File

@@ -9,6 +9,8 @@
from collections import namedtuple
from typing import Optional
import jsonschema
import spack.environment as ev
import spack.error
import spack.schema.env
@@ -188,8 +190,6 @@ def paths(self):
@tengine.context_property
def manifest(self):
"""The spack.yaml file that should be used in the image"""
import jsonschema
# Copy in the part of spack.yaml prescribed in the configuration file
manifest = copy.deepcopy(self.config)
manifest.pop("container")

View File

@@ -123,6 +123,15 @@
"deprecated_for",
)
#: File where the database is written
INDEX_JSON_FILE = "index.json"
# Verifier file to check last modification of the DB
_INDEX_VERIFIER_FILE = "index_verifier"
# Lockfile for the database
_LOCK_FILE = "lock"
@llnl.util.lang.memoized
def _getfqdn():
@@ -260,7 +269,7 @@ class ForbiddenLockError(SpackError):
class ForbiddenLock:
def __getattr__(self, name):
raise ForbiddenLockError("Cannot access attribute '{0}' of lock".format(name))
raise ForbiddenLockError(f"Cannot access attribute '{name}' of lock")
def __reduce__(self):
return ForbiddenLock, tuple()
@@ -419,14 +428,25 @@ class FailureTracker:
the likelihood of collision very low with no cleanup required.
"""
#: root directory of the failure tracker
dir: pathlib.Path
#: File for locking particular concrete spec hashes
locker: SpecLocker
def __init__(self, root_dir: Union[str, pathlib.Path], default_timeout: Optional[float]):
#: Ensure a persistent location for dealing with parallel installation
#: failures (e.g., across near-concurrent processes).
self.dir = pathlib.Path(root_dir) / _DB_DIRNAME / "failures"
self.dir.mkdir(parents=True, exist_ok=True)
self.locker = SpecLocker(failures_lock_path(root_dir), default_timeout=default_timeout)
def _ensure_parent_directories(self) -> None:
"""Ensure that parent directories of the FailureTracker exist.
This is idempotent, so it is safe to call before every write to the directory.
"""
self.dir.mkdir(parents=True, exist_ok=True)
def clear(self, spec: "spack.spec.Spec", force: bool = False) -> None:
"""Removes any persistent and cached failure tracking for the spec.
@@ -469,13 +489,18 @@ def clear_all(self) -> None:
tty.debug("Removing prefix failure tracking files")
try:
for fail_mark in os.listdir(str(self.dir)):
try:
(self.dir / fail_mark).unlink()
except OSError as exc:
tty.warn(f"Unable to remove failure marking file {fail_mark}: {str(exc)}")
marks = os.listdir(str(self.dir))
except FileNotFoundError:
return # directory doesn't exist yet
except OSError as exc:
tty.warn(f"Unable to remove failure marking files: {str(exc)}")
return
for fail_mark in marks:
try:
(self.dir / fail_mark).unlink()
except OSError as exc:
tty.warn(f"Unable to remove failure marking file {fail_mark}: {str(exc)}")
def mark(self, spec: "spack.spec.Spec") -> lk.Lock:
"""Marks a spec as failing to install.
@@ -483,6 +508,8 @@ def mark(self, spec: "spack.spec.Spec") -> lk.Lock:
Args:
spec: spec that failed to install
"""
self._ensure_parent_directories()
# Dump the spec to the failure file for (manual) debugging purposes
path = self._path(spec)
path.write_text(spec.to_json())
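`FailureTracker` now defers directory creation from `__init__` to `_ensure_parent_directories`, called immediately before writes, so merely constructing a tracker no longer touches the filesystem. A small sketch of that lazy, idempotent pattern (illustrative class, not the real implementation):

```python
import pathlib


class LazyFailureTracker:
    """Defer mkdir until the first write; construction touches no disk."""

    def __init__(self, root_dir: str) -> None:
        self.dir = pathlib.Path(root_dir) / "failures"  # no mkdir here

    def _ensure_parent_directories(self) -> None:
        # exist_ok makes this safe to call before every write.
        self.dir.mkdir(parents=True, exist_ok=True)

    def mark(self, name: str, payload: str) -> None:
        self._ensure_parent_directories()
        (self.dir / name).write_text(payload)
```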
@@ -567,17 +594,13 @@ def __init__(
Relevant only if the repository is not an upstream.
"""
self.root = root
self.database_directory = os.path.join(self.root, _DB_DIRNAME)
self.database_directory = pathlib.Path(self.root) / _DB_DIRNAME
self.layout = layout
# Set up layout of database files within the db dir
self._index_path = os.path.join(self.database_directory, "index.json")
self._verifier_path = os.path.join(self.database_directory, "index_verifier")
self._lock_path = os.path.join(self.database_directory, "lock")
# Create needed directories and files
if not is_upstream and not os.path.exists(self.database_directory):
fs.mkdirp(self.database_directory)
self._index_path = self.database_directory / INDEX_JSON_FILE
self._verifier_path = self.database_directory / _INDEX_VERIFIER_FILE
self._lock_path = self.database_directory / _LOCK_FILE
self.is_upstream = is_upstream
self.last_seen_verifier = ""
@@ -592,14 +615,14 @@ def __init__(
# initialize rest of state.
self.db_lock_timeout = lock_cfg.database_timeout
tty.debug("DATABASE LOCK TIMEOUT: {0}s".format(str(self.db_lock_timeout)))
tty.debug(f"DATABASE LOCK TIMEOUT: {str(self.db_lock_timeout)}s")
self.lock: Union[ForbiddenLock, lk.Lock]
if self.is_upstream:
self.lock = ForbiddenLock()
else:
self.lock = lk.Lock(
self._lock_path,
str(self._lock_path),
default_timeout=self.db_lock_timeout,
desc="database",
enable=lock_cfg.enable,
@@ -616,6 +639,11 @@ def __init__(
self._write_transaction_impl = lk.WriteTransaction
self._read_transaction_impl = lk.ReadTransaction
def _ensure_parent_directories(self):
"""Create the parent directory for the DB, if necessary."""
if not self.is_upstream:
self.database_directory.mkdir(parents=True, exist_ok=True)
def write_transaction(self):
"""Get a write lock context manager for use in a `with` block."""
return self._write_transaction_impl(self.lock, acquire=self._read, release=self._write)
@@ -630,6 +658,8 @@ def _write_to_file(self, stream):
This function does not do any locking or transactions.
"""
self._ensure_parent_directories()
# map from per-spec hash code to installation record.
installs = dict(
(k, v.to_dict(include_fields=self.record_fields)) for k, v in self._data.items()
@@ -759,7 +789,7 @@ def _read_from_file(self, filename):
Does not do any locking.
"""
try:
with open(filename, "r", encoding="utf-8") as f:
with open(str(filename), "r", encoding="utf-8") as f:
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
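For context on the raw_decode comment above, a standalone stdlib illustration: raw_decode parses the first JSON document in a string and reports where it stopped, which tolerates trailing data such as a stream of concatenated objects.

from json import JSONDecoder

decoder = JSONDecoder()
text = '{"database": {"installs": {}}}\n{"next": "object"}'
first, end = decoder.raw_decode(text)
print(first)       # {'database': {'installs': {}}}
print(text[end:])  # the unparsed tail: '\n{"next": "object"}'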
@@ -860,11 +890,13 @@ def reindex(self):
if self.is_upstream:
raise UpstreamDatabaseLockingError("Cannot reindex an upstream database")
self._ensure_parent_directories()
# Special transaction to avoid recursive reindex calls and to
# ignore errors if we need to rebuild a corrupt database.
def _read_suppress_error():
try:
if os.path.isfile(self._index_path):
if self._index_path.is_file():
self._read_from_file(self._index_path)
except CorruptDatabaseError as e:
tty.warn(f"Reindexing corrupt database, error was: {e}")
@@ -1007,7 +1039,7 @@ def _check_ref_counts(self):
% (key, found, expected, self._index_path)
)
def _write(self, type, value, traceback):
def _write(self, type=None, value=None, traceback=None):
"""Write the in-memory database index to its file path.
This is a helper function called by the WriteTransaction context
@@ -1018,6 +1050,8 @@ def _write(self, type, value, traceback):
This routine does no locking.
"""
self._ensure_parent_directories()
# Do not write if exceptions were raised
if type is not None:
# A failure interrupted a transaction, so we should record that
@@ -1026,16 +1060,16 @@ def _write(self, type, value, traceback):
self._state_is_inconsistent = True
return
temp_file = self._index_path + (".%s.%s.temp" % (_getfqdn(), os.getpid()))
temp_file = str(self._index_path) + (".%s.%s.temp" % (_getfqdn(), os.getpid()))
# Write a temporary database file, then move it into place
try:
with open(temp_file, "w", encoding="utf-8") as f:
self._write_to_file(f)
fs.rename(temp_file, self._index_path)
fs.rename(temp_file, str(self._index_path))
if _use_uuid:
with open(self._verifier_path, "w", encoding="utf-8") as f:
with self._verifier_path.open("w", encoding="utf-8") as f:
new_verifier = str(uuid.uuid4())
f.write(new_verifier)
self.last_seen_verifier = new_verifier
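The surrounding code follows the usual write-to-temp-then-rename recipe (Spack's fs.rename plays the role the stdlib's os.replace plays below). A self-contained sketch of the pattern, with made-up file names:

import os

def atomic_write(path: str, data: str) -> None:
    # Write to a sibling temp file so the rename stays on one filesystem;
    # os.replace atomically overwrites the destination on POSIX and Windows.
    tmp = f"{path}.{os.getpid()}.temp"
    try:
        with open(tmp, "w", encoding="utf-8") as f:
            f.write(data)
        os.replace(tmp, path)
    except BaseException:
        if os.path.exists(tmp):
            os.remove(tmp)
        raise

atomic_write("index.json", '{"database": {}}')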
@@ -1048,11 +1082,11 @@ def _write(self, type, value, traceback):
def _read(self):
"""Re-read Database from the data in the set location. This does no locking."""
if os.path.isfile(self._index_path):
if self._index_path.is_file():
current_verifier = ""
if _use_uuid:
try:
with open(self._verifier_path, "r", encoding="utf-8") as f:
with self._verifier_path.open("r", encoding="utf-8") as f:
current_verifier = f.read()
except BaseException:
pass
@@ -1065,7 +1099,7 @@ def _read(self):
self._state_is_inconsistent = False
return
elif self.is_upstream:
tty.warn("upstream not found: {0}".format(self._index_path))
tty.warn(f"upstream not found: {self._index_path}")
def _add(
self,
@@ -1681,7 +1715,7 @@ def query(
)
results = list(local_results) + list(x for x in upstream_results if x not in local_results)
results.sort()
results.sort() # type: ignore[call-overload]
return results
def query_one(


@@ -15,7 +15,6 @@
import glob
import itertools
import os
import os.path
import pathlib
import re
import sys


@@ -7,7 +7,6 @@
import collections
import concurrent.futures
import os
import os.path
import re
import sys
import traceback


@@ -32,7 +32,7 @@ class OpenMpi(Package):
"""
import collections
import collections.abc
import os.path
import os
import re
from typing import Any, Callable, List, Optional, Tuple, Type, Union


@@ -10,6 +10,7 @@
import spack.environment as ev
import spack.repo
import spack.schema.environment
import spack.store
from spack.util.environment import EnvironmentModifications
@@ -156,6 +157,11 @@ def activate(
# MANPATH, PYTHONPATH, etc. All variables that end in PATH (case-sensitive)
# become PATH variables.
#
env_vars_yaml = env.manifest.configuration.get("env_vars", None)
if env_vars_yaml:
env_mods.extend(spack.schema.environment.parse(env_vars_yaml))
try:
if view and env.has_view(view):
with spack.store.STORE.db.read_transaction():
@@ -189,6 +195,10 @@ def deactivate() -> EnvironmentModifications:
if active is None:
return env_mods
env_vars_yaml = active.manifest.configuration.get("env_vars", None)
if env_vars_yaml:
env_mods.extend(spack.schema.environment.parse(env_vars_yaml).reversed())
active_view = os.getenv(ev.spack_env_view_var)
if active_view and active.has_view(active_view):
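EnvironmentModifications and spack.schema.environment.parse are Spack internals not shown here; the property being relied on is only that deactivation applies the reverse of the activation edits, in reverse order. A toy standalone analogue:

env = {}
mods = [("FOO", "bar"), ("BAZ", "qux")]

for name, value in mods:        # activate: apply each modification
    env[name] = value
for name, _ in reversed(mods):  # deactivate: undo them, like .reversed()
    env.pop(name, None)

assert env == {}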


@@ -25,7 +25,6 @@
import functools
import http.client
import os
import os.path
import re
import shutil
import urllib.error
@@ -321,9 +320,15 @@ def _fetch_urllib(self, url):
request = urllib.request.Request(url, headers={"User-Agent": web_util.SPACK_USER_AGENT})
if os.path.lexists(save_file):
os.remove(save_file)
try:
response = web_util.urlopen(request)
except (TimeoutError, urllib.error.URLError) as e:
tty.msg(f"Fetching {url}")
with open(save_file, "wb") as f:
shutil.copyfileobj(response, f)
except OSError as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
@@ -331,14 +336,6 @@ def _fetch_urllib(self, url):
os.remove(save_file)
raise FailedDownloadError(e) from e
tty.msg(f"Fetching {url}")
if os.path.lexists(save_file):
os.remove(save_file)
with open(save_file, "wb") as f:
shutil.copyfileobj(response, f)
# Save the redirected URL for error messages. Sometimes we're redirected to an arbitrary
# mirror that is broken, leading to spurious download failures. In that case it's helpful
# for users to know which URL was actually fetched.
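The reordering above funnels network and local I/O failures into one cleanup path, which works because urllib.error.URLError subclasses OSError. A standalone sketch of the same shape (the URL and timeout are placeholders):

import os
import shutil
import urllib.request

def fetch(url: str, save_file: str) -> None:
    if os.path.lexists(save_file):
        os.remove(save_file)
    try:
        response = urllib.request.urlopen(url, timeout=10)
        with open(save_file, "wb") as f:
            shutil.copyfileobj(response, f)
    except OSError:
        # Covers URLError, TimeoutError, and local file errors alike;
        # remove the partial download before re-raising.
        if os.path.lexists(save_file):
            os.remove(save_file)
        raise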
@@ -535,11 +532,16 @@ def __init__(self, *, url: str, checksum: Optional[str] = None, **kwargs):
@_needs_stage
def fetch(self):
file = self.stage.save_filename
tty.msg(f"Fetching {self.url}")
if os.path.lexists(file):
os.remove(file)
try:
response = self._urlopen(self.url)
except (TimeoutError, urllib.error.URLError) as e:
tty.msg(f"Fetching {self.url}")
with open(file, "wb") as f:
shutil.copyfileobj(response, f)
except OSError as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
@@ -547,12 +549,6 @@ def fetch(self):
os.remove(file)
raise FailedDownloadError(e) from e
if os.path.lexists(file):
os.remove(file)
with open(file, "wb") as f:
shutil.copyfileobj(response, f)
class VCSFetchStrategy(FetchStrategy):
"""Superclass for version control system fetch strategies.


@@ -89,10 +89,10 @@ def view_copy(
if stat.S_ISLNK(src_stat.st_mode):
spack.relocate.relocate_links(links=[dst], prefix_to_prefix=prefix_to_projection)
elif spack.relocate.is_binary(dst):
spack.relocate.relocate_text_bin(binaries=[dst], prefixes=prefix_to_projection)
spack.relocate.relocate_text_bin(binaries=[dst], prefix_to_prefix=prefix_to_projection)
else:
prefix_to_projection[spack.store.STORE.layout.root] = view._root
spack.relocate.relocate_text(files=[dst], prefixes=prefix_to_projection)
spack.relocate.relocate_text(files=[dst], prefix_to_prefix=prefix_to_projection)
# The os module on Windows does not have a chown function.
if sys.platform != "win32":


@@ -275,7 +275,7 @@ def _do_fake_install(pkg: "spack.package_base.PackageBase") -> None:
fs.mkdirp(pkg.prefix.bin)
fs.touch(os.path.join(pkg.prefix.bin, command))
if sys.platform != "win32":
chmod = which("chmod")
chmod = which("chmod", required=True)
chmod("+x", os.path.join(pkg.prefix.bin, command))
# Install fake header file


@@ -14,7 +14,6 @@
import io
import operator
import os
import os.path
import pstats
import re
import shlex


@@ -2,7 +2,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
from typing import Optional
import llnl.url


@@ -2,7 +2,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import traceback
import llnl.util.tty as tty


@@ -31,7 +31,7 @@
import copy
import datetime
import inspect
import os.path
import os
import re
import string
from typing import List, Optional


@@ -4,7 +4,7 @@
import collections
import itertools
import os.path
import os
from typing import Dict, List, Optional, Tuple
import llnl.util.filesystem as fs


@@ -5,7 +5,7 @@
"""This module implements the classes necessary to generate Tcl
non-hierarchical modules.
"""
import os.path
import os
from typing import Dict, Optional, Tuple
import spack.config


@@ -7,6 +7,7 @@
import base64
import json
import re
import socket
import time
import urllib.error
import urllib.parse
@@ -410,7 +411,7 @@ def wrapper(*args, **kwargs):
for i in range(retries):
try:
return f(*args, **kwargs)
except (urllib.error.URLError, TimeoutError) as e:
except OSError as e:
# Retry on internal server errors, and rate limit errors
# Potentially this could take into account the Retry-After header
# if registries support it
@@ -420,9 +421,10 @@ def wrapper(*args, **kwargs):
and (500 <= e.code < 600 or e.code == 429)
)
or (
isinstance(e, urllib.error.URLError) and isinstance(e.reason, TimeoutError)
isinstance(e, urllib.error.URLError)
and isinstance(e.reason, socket.timeout)
)
or isinstance(e, TimeoutError)
or isinstance(e, socket.timeout)
):
# Exponential backoff
sleep(2**i)
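A minimal standalone version of this retry shape (the retry count, backoff base, and fetch function are illustrative):

import functools
import socket
import time
import urllib.error
import urllib.request

def retry_on_timeout(retries: int = 3):
    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            for i in range(retries):
                try:
                    return f(*args, **kwargs)
                except OSError as e:
                    timed_out = isinstance(e, (TimeoutError, socket.timeout)) or (
                        isinstance(e, urllib.error.URLError)
                        and isinstance(e.reason, socket.timeout)
                    )
                    if not timed_out or i == retries - 1:
                        raise
                    time.sleep(2**i)  # exponential backoff
        return wrapper
    return decorator

@retry_on_timeout()
def fetch(url: str) -> bytes:
    return urllib.request.urlopen(url, timeout=5).read()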


@@ -3,8 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.lang
import spack.util.spack_yaml as syaml
@llnl.util.lang.lazy_lexicographic_ordering
class OperatingSystem:
@@ -42,4 +40,4 @@ def _cmp_iter(self):
yield self.version
def to_dict(self):
return syaml.syaml_dict([("name", self.name), ("version", self.version)])
return {"name": self.name, "version": self.version}


@@ -106,8 +106,16 @@
from spack.variant import any_combination_of, auto_or_any_combination_of, disjoint_sets
from spack.version import Version, ver
# These are just here for editor support; they will be replaced when the build env
# is set up.
make = MakeExecutable("make", jobs=1)
ninja = MakeExecutable("ninja", jobs=1)
configure = Executable(join_path(".", "configure"))
# These are just here for editor support; they may be set when the build env is set up.
configure: Executable
make_jobs: int
make: MakeExecutable
ninja: MakeExecutable
python_include: str
python_platlib: str
python_purelib: str
python: Executable
spack_cc: str
spack_cxx: str
spack_f77: str
spack_fc: str


@@ -1099,14 +1099,14 @@ def update_external_dependencies(self, extendee_spec=None):
"""
pass
def detect_dev_src_change(self):
def detect_dev_src_change(self) -> bool:
"""
Method for checking for source code changes to trigger rebuild/reinstall
"""
dev_path_var = self.spec.variants.get("dev_path", None)
_, record = spack.store.STORE.db.query_by_spec_hash(self.spec.dag_hash())
mtime = fsys.last_modification_time_recursive(dev_path_var.value)
return mtime > record.installation_time
assert dev_path_var and record, "dev_path variant and record must be present"
return fsys.recursive_mtime_greater_than(dev_path_var.value, record.installation_time)
def all_urls_for_version(self, version: StandardVersion) -> List[str]:
"""Return all URLs derived from version_urls(), url, urls, and


@@ -4,7 +4,6 @@
import hashlib
import os
import os.path
import pathlib
import sys
from typing import Any, Dict, Optional, Tuple, Type, Union


@@ -1,7 +1,7 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import os
def slingshot_network():


@@ -6,8 +6,7 @@
import os
import re
import sys
from collections import OrderedDict
from typing import List, Optional
from typing import Dict, Iterable, List, Optional
import macholib.mach_o
import macholib.MachO
@@ -18,28 +17,11 @@
from llnl.util.lang import memoized
from llnl.util.symlink import readlink, symlink
import spack.error
import spack.store
import spack.util.elf as elf
import spack.util.executable as executable
from .relocate_text import BinaryFilePrefixReplacer, TextFilePrefixReplacer
class InstallRootStringError(spack.error.SpackError):
def __init__(self, file_path, root_path):
"""Signal that the relocated binary still has the original
Spack's store root string
Args:
file_path (str): path of the binary
root_path (str): original Spack's store root string
"""
super().__init__(
"\n %s \ncontains string\n %s \n"
"after replacing it in rpaths.\n"
"Package should not be relocated.\n Use -a to override." % (file_path, root_path)
)
from .relocate_text import BinaryFilePrefixReplacer, PrefixToPrefix, TextFilePrefixReplacer
@memoized
@@ -58,7 +40,7 @@ def _decode_macho_data(bytestring):
return bytestring.rstrip(b"\x00").decode("ascii")
def macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
def _macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
"""
Inputs
original rpaths from mach-o binaries
@@ -103,7 +85,7 @@ def macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
return paths_to_paths
def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
def _modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
"""
This function is used to make Mach-O buildcaches on macOS by
replacing old paths with new paths using install_name_tool
@@ -146,7 +128,7 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
install_name_tool(*args, temp_path)
def macholib_get_paths(cur_path):
def _macholib_get_paths(cur_path):
"""Get rpaths, dependent libraries, and library id of mach-o objects."""
headers = []
try:
@@ -228,25 +210,25 @@ def relocate_macho_binaries(path_names, prefix_to_prefix):
if path_name.endswith(".o"):
continue
# get the paths in the old prefix
rpaths, deps, idpath = macholib_get_paths(path_name)
rpaths, deps, idpath = _macholib_get_paths(path_name)
# get the mapping of paths in the old prefix to the new prefix
paths_to_paths = macho_find_paths(rpaths, deps, idpath, prefix_to_prefix)
paths_to_paths = _macho_find_paths(rpaths, deps, idpath, prefix_to_prefix)
# replace the old paths with new paths
modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
_modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
def relocate_elf_binaries(binaries, prefix_to_prefix):
"""Take a list of binaries, and an ordered dictionary of
prefix to prefix mapping, and update the rpaths accordingly."""
def relocate_elf_binaries(binaries: Iterable[str], prefix_to_prefix: Dict[str, str]) -> None:
"""Take a list of binaries, and an ordered prefix to prefix mapping, and update the rpaths
accordingly."""
# Transform to binary string
prefix_to_prefix = OrderedDict(
(k.encode("utf-8"), v.encode("utf-8")) for (k, v) in prefix_to_prefix.items()
)
prefix_to_prefix_bin = {
k.encode("utf-8"): v.encode("utf-8") for k, v in prefix_to_prefix.items()
}
for path in binaries:
try:
elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix)
elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix_bin)
except elf.ElfCStringUpdatesFailed as e:
# Fall back to `patchelf --set-rpath ... --set-interpreter ...`
rpaths = e.rpath.new_value.decode("utf-8").split(":") if e.rpath else []
@@ -254,13 +236,13 @@ def relocate_elf_binaries(binaries, prefix_to_prefix):
_set_elf_rpaths_and_interpreter(path, rpaths=rpaths, interpreter=interpreter)
def warn_if_link_cant_be_relocated(link, target):
def _warn_if_link_cant_be_relocated(link: str, target: str):
if not os.path.isabs(target):
return
tty.warn('Symbolic link at "{}" to "{}" cannot be relocated'.format(link, target))
tty.warn(f'Symbolic link at "{link}" to "{target}" cannot be relocated')
def relocate_links(links, prefix_to_prefix):
def relocate_links(links: Iterable[str], prefix_to_prefix: Dict[str, str]) -> None:
"""Relocate links to a new install prefix."""
regex = re.compile("|".join(re.escape(p) for p in prefix_to_prefix.keys()))
for link in links:
@@ -269,7 +251,7 @@ def relocate_links(links, prefix_to_prefix):
# No match.
if match is None:
warn_if_link_cant_be_relocated(link, old_target)
_warn_if_link_cant_be_relocated(link, old_target)
continue
new_target = prefix_to_prefix[match.group()] + old_target[match.end() :]
@@ -277,32 +259,32 @@ def relocate_links(links, prefix_to_prefix):
symlink(new_target, link)
def relocate_text(files, prefixes):
def relocate_text(files: Iterable[str], prefix_to_prefix: PrefixToPrefix) -> None:
"""Relocate text file from the original installation prefix to the
new prefix.
Relocation also affects the the path in Spack's sbang script.
Args:
files (list): Text files to be relocated
prefixes (OrderedDict): String prefixes which need to be changed
files: Text files to be relocated
prefix_to_prefix: ordered prefix to prefix mapping
"""
TextFilePrefixReplacer.from_strings_or_bytes(prefixes).apply(files)
TextFilePrefixReplacer.from_strings_or_bytes(prefix_to_prefix).apply(files)
def relocate_text_bin(binaries, prefixes):
def relocate_text_bin(binaries: Iterable[str], prefix_to_prefix: PrefixToPrefix) -> List[str]:
"""Replace null terminated path strings hard-coded into binaries.
The new install prefix must be shorter than the original one.
Args:
binaries (list): binaries to be relocated
prefixes (OrderedDict): String prefixes which need to be changed.
binaries: paths to binaries to be relocated
prefix_to_prefix: ordered prefix to prefix mapping
Raises:
spack.relocate_text.BinaryTextReplaceError: when the new path is longer than the old path
"""
return BinaryFilePrefixReplacer.from_strings_or_bytes(prefixes).apply(binaries)
return BinaryFilePrefixReplacer.from_strings_or_bytes(prefix_to_prefix).apply(binaries)
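To make the ordered-mapping contract concrete, here is a toy, dependency-free replacer in the spirit of TextFilePrefixReplacer (not the real class). Python's re module tries alternatives left to right, so longer prefixes must come first in the mapping:

import re
from typing import Dict

def relocate_strings(text: bytes, prefix_to_prefix: Dict[bytes, bytes]) -> bytes:
    # Build one alternation regex; dict insertion order decides which
    # prefix wins when one is a substring of another.
    pattern = re.compile(b"|".join(re.escape(p) for p in prefix_to_prefix))
    return pattern.sub(lambda m: prefix_to_prefix[m.group(0)], text)

mapping = {b"/first/sub": b"/x", b"/first": b"/y"}
assert relocate_strings(b"cd /first/sub; cd /first", mapping) == b"cd /x; cd /y"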
def is_macho_magic(magic: bytes) -> bool:
@@ -339,7 +321,7 @@ def _exists_dir(dirname):
return os.path.isdir(dirname)
def is_macho_binary(path):
def is_macho_binary(path: str) -> bool:
try:
with open(path, "rb") as f:
return is_macho_magic(f.read(4))
@@ -363,7 +345,7 @@ def fixup_macos_rpath(root, filename):
return False
# Get Mach-O header commands
(rpath_list, deps, id_dylib) = macholib_get_paths(abspath)
(rpath_list, deps, id_dylib) = _macholib_get_paths(abspath)
# Convert rpaths list to (name -> number of occurrences)
add_rpaths = set()


@@ -6,64 +6,61 @@
paths inside text files and binaries."""
import re
from collections import OrderedDict
from typing import Dict, Union
from typing import IO, Dict, Iterable, List, Union
from llnl.util.lang import PatternBytes
import spack.error
Prefix = Union[str, bytes]
PrefixToPrefix = Union[Dict[str, str], Dict[bytes, bytes]]
def encode_path(p: Prefix) -> bytes:
return p if isinstance(p, bytes) else p.encode("utf-8")
def _prefix_to_prefix_as_bytes(prefix_to_prefix) -> Dict[bytes, bytes]:
return OrderedDict((encode_path(k), encode_path(v)) for (k, v) in prefix_to_prefix.items())
def _prefix_to_prefix_as_bytes(prefix_to_prefix: PrefixToPrefix) -> Dict[bytes, bytes]:
return {encode_path(k): encode_path(v) for (k, v) in prefix_to_prefix.items()}
def utf8_path_to_binary_regex(prefix: str):
def utf8_path_to_binary_regex(prefix: str) -> PatternBytes:
"""Create a binary regex that matches the input path in utf8"""
prefix_bytes = re.escape(prefix).encode("utf-8")
return re.compile(b"(?<![\\w\\-_/])([\\w\\-_]*?)%s([\\w\\-_/]*)" % prefix_bytes)
def _byte_strings_to_single_binary_regex(prefixes):
def _byte_strings_to_single_binary_regex(prefixes: Iterable[bytes]) -> PatternBytes:
all_prefixes = b"|".join(re.escape(p) for p in prefixes)
return re.compile(b"(?<![\\w\\-_/])([\\w\\-_]*?)(%s)([\\w\\-_/]*)" % all_prefixes)
def utf8_paths_to_single_binary_regex(prefixes):
def utf8_paths_to_single_binary_regex(prefixes: Iterable[str]) -> PatternBytes:
"""Create a (binary) regex that matches any input path in utf8"""
return _byte_strings_to_single_binary_regex(p.encode("utf-8") for p in prefixes)
def filter_identity_mappings(prefix_to_prefix):
def filter_identity_mappings(prefix_to_prefix: Dict[bytes, bytes]) -> Dict[bytes, bytes]:
"""Drop mappings that are not changed."""
# NOTE: we don't guard against the following case:
# [/abc/def -> /abc/def, /abc -> /x] *will* be simplified to
# [/abc -> /x], meaning that after this simplification /abc/def will be
# mapped to /x/def instead of /abc/def. This should not be a problem.
return OrderedDict((k, v) for (k, v) in prefix_to_prefix.items() if k != v)
return {k: v for k, v in prefix_to_prefix.items() if k != v}
class PrefixReplacer:
"""Base class for applying a prefix to prefix map
to a list of binaries or text files.
Child classes implement _apply_to_file to do the
actual work, which is different when it comes to
"""Base class for applying a prefix to prefix map to a list of binaries or text files. Derived
classes implement _apply_to_file to do the actual work, which is different when it comes to
binaries and text files."""
def __init__(self, prefix_to_prefix: Dict[bytes, bytes]):
def __init__(self, prefix_to_prefix: Dict[bytes, bytes]) -> None:
"""
Arguments:
prefix_to_prefix (OrderedDict):
A ordered mapping from prefix to prefix. The order is
relevant to support substring fallbacks, for example
[("/first/sub", "/x"), ("/first", "/y")] will ensure
/first/sub is matched and replaced before /first.
prefix_to_prefix: An ordered mapping from prefix to prefix. The order is relevant to
support substring fallbacks, for example
``[("/first/sub", "/x"), ("/first", "/y")]`` will ensure /first/sub is matched and
replaced before /first.
"""
self.prefix_to_prefix = filter_identity_mappings(prefix_to_prefix)
@@ -74,7 +71,7 @@ def is_noop(self) -> bool:
or there are no prefixes to replace."""
return not self.prefix_to_prefix
def apply(self, filenames: list):
def apply(self, filenames: Iterable[str]) -> List[str]:
"""Returns a list of files that were modified"""
changed_files = []
if self.is_noop:
@@ -84,17 +81,20 @@ def apply(self, filenames: list):
changed_files.append(filename)
return changed_files
def apply_to_filename(self, filename):
def apply_to_filename(self, filename: str) -> bool:
if self.is_noop:
return False
with open(filename, "rb+") as f:
return self.apply_to_file(f)
def apply_to_file(self, f):
def apply_to_file(self, f: IO[bytes]) -> bool:
if self.is_noop:
return False
return self._apply_to_file(f)
def _apply_to_file(self, f: IO) -> bool:
raise NotImplementedError("Derived classes must implement this method")
class TextFilePrefixReplacer(PrefixReplacer):
"""This class applies prefix to prefix mappings for relocation
@@ -112,13 +112,11 @@ def __init__(self, prefix_to_prefix: Dict[bytes, bytes]):
self.regex = _byte_strings_to_single_binary_regex(self.prefix_to_prefix.keys())
@classmethod
def from_strings_or_bytes(
cls, prefix_to_prefix: Dict[Prefix, Prefix]
) -> "TextFilePrefixReplacer":
def from_strings_or_bytes(cls, prefix_to_prefix: PrefixToPrefix) -> "TextFilePrefixReplacer":
"""Create a TextFilePrefixReplacer from an ordered prefix to prefix map."""
return cls(_prefix_to_prefix_as_bytes(prefix_to_prefix))
def _apply_to_file(self, f):
def _apply_to_file(self, f: IO) -> bool:
"""Text replacement implementation simply reads the entire file
in memory and applies the combined regex."""
replacement = lambda m: m.group(1) + self.prefix_to_prefix[m.group(2)] + m.group(3)
@@ -133,12 +131,12 @@ def _apply_to_file(self, f):
class BinaryFilePrefixReplacer(PrefixReplacer):
def __init__(self, prefix_to_prefix, suffix_safety_size=7):
def __init__(self, prefix_to_prefix: Dict[bytes, bytes], suffix_safety_size: int = 7) -> None:
"""
prefix_to_prefix (OrderedDict): OrderedDictionary where the keys are
bytes representing the old prefixes and the values are the new
suffix_safety_size (int): in case of null terminated strings, what size
of the suffix should remain to avoid aliasing issues?
prefix_to_prefix: Ordered dictionary where the keys are bytes representing the old prefixes
and the values are the new prefixes
suffix_safety_size: in case of null terminated strings, what size of the suffix should
remain to avoid aliasing issues?
"""
assert suffix_safety_size >= 0
super().__init__(prefix_to_prefix)
@@ -146,17 +144,18 @@ def __init__(self, prefix_to_prefix, suffix_safety_size=7):
self.regex = self.binary_text_regex(self.prefix_to_prefix.keys(), suffix_safety_size)
@classmethod
def binary_text_regex(cls, binary_prefixes, suffix_safety_size=7):
"""
Create a regex that looks for exact matches of prefixes, and also tries to
match a C-string type null terminator in a small lookahead window.
def binary_text_regex(
cls, binary_prefixes: Iterable[bytes], suffix_safety_size: int = 7
) -> PatternBytes:
"""Create a regex that looks for exact matches of prefixes, and also tries to match a
C-string type null terminator in a small lookahead window.
Arguments:
binary_prefixes (list): List of byte strings of prefixes to match
suffix_safety_size (int): Size of the lookahead for null-terminated string.
Returns: compiled regex
binary_prefixes: Iterable of byte strings of prefixes to match
suffix_safety_size: Size of the lookahead for null-terminated string.
"""
# Note: it's important not to use capture groups for the prefix, since it destroys
# performance due to common prefix optimization.
return re.compile(
b"("
+ b"|".join(re.escape(p) for p in binary_prefixes)
@@ -165,36 +164,34 @@ def binary_text_regex(cls, binary_prefixes, suffix_safety_size=7):
@classmethod
def from_strings_or_bytes(
cls, prefix_to_prefix: Dict[Prefix, Prefix], suffix_safety_size: int = 7
cls, prefix_to_prefix: PrefixToPrefix, suffix_safety_size: int = 7
) -> "BinaryFilePrefixReplacer":
"""Create a BinaryFilePrefixReplacer from an ordered prefix to prefix map.
Arguments:
prefix_to_prefix (OrderedDict): Ordered mapping of prefix to prefix.
suffix_safety_size (int): Number of bytes to retain at the end of a C-string
to avoid binary string-aliasing issues.
prefix_to_prefix: Ordered mapping of prefix to prefix.
suffix_safety_size: Number of bytes to retain at the end of a C-string to avoid binary
string-aliasing issues.
"""
return cls(_prefix_to_prefix_as_bytes(prefix_to_prefix), suffix_safety_size)
def _apply_to_file(self, f):
def _apply_to_file(self, f: IO[bytes]) -> bool:
"""
Given a file opened in rb+ mode, apply the string replacements as
specified by an ordered dictionary of prefix to prefix mappings. This
method takes special care of null-terminated C-strings. C-string constants
are problematic because compilers and linkers optimize readonly strings for
space by aliasing those that share a common suffix (only suffix since all
of them are null terminated). See https://github.com/spack/spack/pull/31739
and https://github.com/spack/spack/pull/32253 for details. Our logic matches
the original prefix with a ``suffix_safety_size + 1`` lookahead for null bytes.
If no null terminator is found, we simply pad with leading /, assuming that
it's a long C-string; the full C-string after replacement has a large suffix
in common with its original value.
If there *is* a null terminator we can do the same as long as the replacement
has a sufficiently long common suffix with the original prefix.
As a last resort when the replacement does not have a long enough common suffix,
we can try to shorten the string, but this only works if the new length is
sufficiently short (typically the case when going from large padding -> normal path).
If the replacement string is longer, or all of the above fails, we error out.
Given a file opened in rb+ mode, apply the string replacements as specified by an ordered
dictionary of prefix to prefix mappings. This method takes special care of null-terminated
C-strings. C-string constants are problematic because compilers and linkers optimize
readonly strings for space by aliasing those that share a common suffix (only suffix since
all of them are null terminated). See https://github.com/spack/spack/pull/31739 and
https://github.com/spack/spack/pull/32253 for details. Our logic matches the original
prefix with a ``suffix_safety_size + 1`` lookahead for null bytes. If no null terminator
is found, we simply pad with leading /, assuming that it's a long C-string; the full
C-string after replacement has a large suffix in common with its original value. If there
*is* a null terminator we can do the same as long as the replacement has a sufficiently
long common suffix with the original prefix. As a last resort when the replacement does
not have a long enough common suffix, we can try to shorten the string, but this only
works if the new length is sufficiently short (typically the case when going from large
padding -> normal path) If the replacement string is longer, or all of the above fails,
we error out.
Arguments:
f: file opened in rb+ mode
@@ -204,11 +201,10 @@ def _apply_to_file(self, f):
"""
assert f.tell() == 0
# We *could* read binary data in chunks to avoid loading all in memory,
# but it's nasty to deal with matches across boundaries, so let's stick to
# something simple.
# We *could* read binary data in chunks to avoid loading all in memory, but it's nasty to
# deal with matches across boundaries, so let's stick to something simple.
modified = True
modified = False
for match in self.regex.finditer(f.read()):
# The matching prefix (old) and its replacement (new)
@@ -218,8 +214,7 @@ def _apply_to_file(self, f):
# Did we find a trailing null within a N + 1 bytes window after the prefix?
null_terminated = match.end(0) > match.end(1)
# Suffix string length, excluding the null byte
# Only makes sense if null_terminated
# Suffix string length, excluding the null byte. Only makes sense if null_terminated
suffix_strlen = match.end(0) - match.end(1) - 1
# How many bytes are we shrinking our string?
@@ -229,9 +224,9 @@ def _apply_to_file(self, f):
if bytes_shorter < 0:
raise CannotGrowString(old, new)
# If we don't know whether this is a null terminated C-string (we're looking
# only N + 1 bytes ahead), or if it is and we have a common suffix, we can
# simply pad with leading dir separators.
# If we don't know whether this is a null terminated C-string (we're looking only N + 1
# bytes ahead), or if it is and we have a common suffix, we can simply pad with leading
# dir separators.
elif (
not null_terminated
or suffix_strlen >= self.suffix_safety_size # == is enough, but let's be defensive
@@ -240,9 +235,9 @@ def _apply_to_file(self, f):
):
replacement = b"/" * bytes_shorter + new
# If it *was* null terminated, all that matters is that we can leave N bytes
# of old suffix in place. Note that > is required since we also insert an
# additional null terminator.
# If it *was* null terminated, all that matters is that we can leave N bytes of old
# suffix in place. Note that > is required since we also insert an additional null
# terminator.
elif bytes_shorter > self.suffix_safety_size:
replacement = new + match.group(2) # includes the trailing null
@@ -257,22 +252,6 @@ def _apply_to_file(self, f):
return modified
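A self-contained illustration of the padding trick described above, with toy values: when the new prefix is shorter, leading '/' bytes keep the replacement exactly as long as the original, so all byte offsets after the string survive.

old = b"/very/long/build/prefix"
new = b"/opt/sw"

bytes_shorter = len(old) - len(new)       # 16 here
replacement = b"/" * bytes_shorter + new  # 16 pad bytes + b"/opt/sw"

# Same byte length as the original, and the extra leading separators
# are harmless in a path.
assert len(replacement) == len(old)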
class BinaryStringReplacementError(spack.error.SpackError):
def __init__(self, file_path, old_len, new_len):
"""The size of the file changed after binary path substitution
Args:
file_path (str): file with changing size
old_len (str): original length of the file
new_len (str): length of the file after substitution
"""
super().__init__(
"Doing a binary string replacement in %s failed.\n"
"The size of the file changed from %s to %s\n"
"when it should have remanined the same." % (file_path, old_len, new_len)
)
class BinaryTextReplaceError(spack.error.SpackError):
def __init__(self, msg):
msg += (
@@ -284,17 +263,16 @@ def __init__(self, msg):
class CannotGrowString(BinaryTextReplaceError):
def __init__(self, old, new):
msg = "Cannot replace {!r} with {!r} because the new prefix is longer.".format(old, new)
super().__init__(msg)
super().__init__(
f"Cannot replace {old!r} with {new!r} because the new prefix is longer."
)
class CannotShrinkCString(BinaryTextReplaceError):
def __init__(self, old, new, full_old_string):
# Just interpolate binary string to not risk issues with invalid
# unicode, which would be really bad user experience: error in error.
# We have no clue if we actually deal with a real C-string nor what
# encoding it has.
msg = "Cannot replace {!r} with {!r} in the C-string {!r}.".format(
old, new, full_old_string
# Just interpolate binary string to not risk issues with invalid unicode, which would be
# really bad user experience: error in error. We have no clue if we actually deal with a
# real C-string nor what encoding it has.
super().__init__(
f"Cannot replace {old!r} with {new!r} in the C-string {full_old_string!r}."
)
super().__init__(msg)


@@ -14,7 +14,6 @@
import inspect
import itertools
import os
import os.path
import random
import re
import shutil


@@ -3,7 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import hashlib
import os.path
import os
import platform
import posixpath
import re


@@ -1,7 +1,7 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import os
import spack.tengine


@@ -3,15 +3,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
import tempfile
import spack.binary_distribution as bindist
import spack.deptypes as dt
import spack.error
import spack.hooks
import spack.platforms
import spack.relocate as relocate
import spack.store
@@ -42,63 +38,11 @@ def rewire_node(spec, explicit):
spack.hooks.pre_install(spec)
bindist.extract_buildcache_tarball(tarball, destination=spec.prefix)
buildinfo = bindist.read_buildinfo_file(spec.prefix)
bindist.relocate_package(spec)
# compute prefix-to-prefix for every node from the build spec to the spliced
# spec
prefix_to_prefix = {spec.build_spec.prefix: spec.prefix}
build_spec_ids = set(id(s) for s in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD))
for s in bindist.specs_to_relocate(spec):
analog = s
if id(s) not in build_spec_ids:
analogs = [
d
for d in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD)
if s._splice_match(d, self_root=spec, other_root=spec.build_spec)
]
if analogs:
# Prefer same-name analogs and prefer higher versions
# This matches the preferences in Spec.splice, so we will find same node
analog = max(analogs, key=lambda a: (a.name == s.name, a.version))
prefix_to_prefix[analog.prefix] = s.prefix
platform = spack.platforms.by_name(spec.platform)
text_to_relocate = [
os.path.join(spec.prefix, rel_path) for rel_path in buildinfo["relocate_textfiles"]
]
if text_to_relocate:
relocate.relocate_text(files=text_to_relocate, prefixes=prefix_to_prefix)
links = [os.path.join(spec.prefix, f) for f in buildinfo["relocate_links"]]
relocate.relocate_links(links, prefix_to_prefix)
bins_to_relocate = [
os.path.join(spec.prefix, rel_path) for rel_path in buildinfo["relocate_binaries"]
]
if bins_to_relocate:
if "macho" in platform.binary_formats:
relocate.relocate_macho_binaries(bins_to_relocate, prefix_to_prefix)
if "elf" in platform.binary_formats:
relocate.relocate_elf_binaries(bins_to_relocate, prefix_to_prefix)
relocate.relocate_text_bin(binaries=bins_to_relocate, prefixes=prefix_to_prefix)
shutil.rmtree(tempdir)
install_manifest = os.path.join(
spec.prefix,
spack.store.STORE.layout.metadata_dir,
spack.store.STORE.layout.manifest_file_name,
)
try:
os.unlink(install_manifest)
except FileNotFoundError:
pass
# Write the spliced spec into spec.json. Without this, Database.add would fail because it
# checks the spec.json in the prefix against the spec being added to look for mismatches
spack.store.STORE.layout.write_spec(spec, spack.store.STORE.layout.spec_file_path(spec))
# add to database, not sure about explicit
spack.store.STORE.db.add(spec, explicit=explicit)
# run post install hooks
# run post install hooks and add to db
spack.hooks.post_install(spec, explicit)
spack.store.STORE.db.add(spec, explicit=explicit)
class RewireError(spack.error.SpackError):


@@ -6,6 +6,8 @@
import typing
import warnings
import jsonschema
import llnl.util.lang
from spack.error import SpecSyntaxError
@@ -19,12 +21,8 @@ class DeprecationMessage(typing.NamedTuple):
# jsonschema is imported lazily as it is heavy to import
# and increases the start-up time
def _make_validator():
import jsonschema
def _validate_spec(validator, is_spec, instance, schema):
"""Check if the attributes on instance are valid specs."""
import jsonschema
import spack.spec_parser
if not validator.is_type(instance, "object"):
@@ -33,8 +31,8 @@ def _validate_spec(validator, is_spec, instance, schema):
for spec_str in instance:
try:
spack.spec_parser.parse(spec_str)
except SpecSyntaxError as e:
yield jsonschema.ValidationError(str(e))
except SpecSyntaxError:
yield jsonschema.ValidationError(f"the key '{spec_str}' is not a valid spec")
def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
@@ -67,7 +65,7 @@ def _deprecated_properties(validator, deprecated, instance, schema):
yield jsonschema.ValidationError("\n".join(errors))
return jsonschema.validators.extend(
jsonschema.Draft4Validator,
jsonschema.Draft7Validator,
{"validate_spec": _validate_spec, "deprecatedProperties": _deprecated_properties},
)
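For readers unfamiliar with this jsonschema extension point, a minimal standalone example of registering a custom keyword (the keyword name is invented):

import jsonschema

def _validate_even(validator, value, instance, schema):
    # Keyword callables yield ValidationError instances instead of raising.
    if value and isinstance(instance, int) and instance % 2:
        yield jsonschema.ValidationError(f"{instance} is not even")

EvenValidator = jsonschema.validators.extend(
    jsonschema.Draft7Validator, {"mustBeEven": _validate_even}
)

v = EvenValidator({"mustBeEven": True})
print([e.message for e in v.iter_errors(3)])  # ['3 is not even']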


@@ -19,7 +19,7 @@
"items": {
"type": "object",
"properties": {"when": {"type": "string"}},
"patternProperties": {r"^(?!when$)\w*": spec_list_schema},
"additionalProperties": spec_list_schema,
},
}
}


@@ -0,0 +1,22 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for env_vars.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/env_vars.py
:lines: 15-
"""
from typing import Any, Dict
import spack.schema.environment
properties: Dict[str, Any] = {"env_vars": spack.schema.environment.definition}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack env_vars configuration file schema",
"type": "object",
"additionalProperties": False,
"properties": properties,
}
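Validating a document against a schema assembled this way looks roughly like the sketch below; the stand-in definition is an assumption, since the real spack.schema.environment.definition is richer:

import jsonschema

definition = {"type": "object", "additionalProperties": {"type": "string"}}

schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "additionalProperties": False,
    "properties": {"env_vars": definition},
}

jsonschema.validate({"env_vars": {"FOO": "bar"}}, schema)  # passes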


@@ -20,6 +20,7 @@
import spack.schema.container
import spack.schema.definitions
import spack.schema.develop
import spack.schema.env_vars
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
@@ -38,6 +39,7 @@
spack.schema.ci.properties,
spack.schema.definitions.properties,
spack.schema.develop.properties,
spack.schema.env_vars.properties,
spack.schema.mirrors.properties,
spack.schema.modules.properties,
spack.schema.packages.properties,


@@ -9,6 +9,8 @@
"""
from typing import Any, Dict
import jsonschema
#: Common properties for connection specification
connection = {
"url": {"type": "string"},
@@ -102,8 +104,6 @@
def update(data):
import jsonschema
errors = []
def check_access_pair(name, section):


@@ -12,22 +12,6 @@
import spack.schema.environment
import spack.schema.projections
#: Matches a spec or a multi-valued variant but not another
#: valid keyword.
#:
#: THIS NEEDS TO BE UPDATED FOR EVERY NEW KEYWORD THAT
#: IS ADDED IMMEDIATELY BELOW THE MODULE TYPE ATTRIBUTE
spec_regex = (
r"(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|hide|"
r"include|exclude|projections|naming_scheme|core_compilers|all)(^\w[\w-]*)"
)
#: Matches a valid name for a module set
valid_module_set_name = r"^(?!prefix_inspections$)\w[\w-]*$"
#: Matches an anonymous spec, i.e. a spec without a root name
anonymous_spec_regex = r"^[\^@%+~]"
#: Definitions for parts of module schema
array_of_strings = {"type": "array", "default": [], "items": {"type": "string"}}
@@ -56,7 +40,7 @@
"suffixes": {
"type": "object",
"validate_spec": True,
"patternProperties": {r"\w[\w-]*": {"type": "string"}}, # key
"additionalProperties": {"type": "string"}, # key
},
"environment": spack.schema.environment.definition,
},
@@ -64,34 +48,40 @@
projections_scheme = spack.schema.projections.properties["projections"]
module_type_configuration = {
module_type_configuration: Dict = {
"type": "object",
"default": {},
"allOf": [
{
"properties": {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"}, # Can we be more specific here?
"projections": projections_scheme,
"all": module_file_configuration,
}
},
{
"validate_spec": True,
"patternProperties": {
spec_regex: module_file_configuration,
anonymous_spec_regex: module_file_configuration,
},
},
],
"validate_spec": True,
"properties": {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"},
"projections": projections_scheme,
"all": module_file_configuration,
},
"additionalProperties": module_file_configuration,
}
tcl_configuration = module_type_configuration.copy()
lmod_configuration = module_type_configuration.copy()
lmod_configuration["properties"].update(
{
"core_compilers": array_of_strings,
"hierarchy": array_of_strings,
"core_specs": array_of_strings,
"filter_hierarchy_specs": {
"type": "object",
"validate_spec": True,
"additionalProperties": array_of_strings,
},
}
)
module_config_properties = {
"use_view": {"anyOf": [{"type": "string"}, {"type": "boolean"}]},
@@ -105,31 +95,8 @@
"default": [],
"items": {"type": "string", "enum": ["tcl", "lmod"]},
},
"lmod": {
"allOf": [
# Base configuration
module_type_configuration,
{
"type": "object",
"properties": {
"core_compilers": array_of_strings,
"hierarchy": array_of_strings,
"core_specs": array_of_strings,
"filter_hierarchy_specs": {
"type": "object",
"patternProperties": {spec_regex: array_of_strings},
},
},
}, # Specific lmod extensions
]
},
"tcl": {
"allOf": [
# Base configuration
module_type_configuration,
{}, # Specific tcl extensions
]
},
"lmod": lmod_configuration,
"tcl": tcl_configuration,
"prefix_inspections": {
"type": "object",
"additionalProperties": False,
@@ -145,7 +112,6 @@
properties: Dict[str, Any] = {
"modules": {
"type": "object",
"additionalProperties": False,
"properties": {
"prefix_inspections": {
"type": "object",
@@ -156,13 +122,11 @@
},
}
},
"patternProperties": {
valid_module_set_name: {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": module_config_properties,
}
"additionalProperties": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": module_config_properties,
},
}
}
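The net effect of swapping patternProperties for additionalProperties: declared properties are matched first, and every other key falls through to a single sub-schema. A small standalone check:

import jsonschema

schema = {
    "type": "object",
    "properties": {"verbose": {"type": "boolean"}},
    # Keys that are not declared properties are validated against this.
    "additionalProperties": {"type": "object"},
}

jsonschema.validate({"verbose": True, "gcc@12": {}}, schema)  # passes
errors = jsonschema.Draft7Validator(schema).iter_errors({"gcc@12": 1})
print([e.message for e in errors])  # ["1 is not of type 'object'"]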


@@ -98,7 +98,6 @@
"packages": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"all": { # package name
"type": "object",
@@ -140,58 +139,54 @@
},
}
},
"patternProperties": {
r"(?!^all$)(^\w[\w-]*)": { # package name
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"require": requirements,
"prefer": prefer_and_conflict,
"conflict": prefer_and_conflict,
"version": {
"type": "array",
"default": [],
# version strings
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"buildable": {"type": "boolean", "default": True},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": package_attributes,
"variants": variants,
"externals": {
"type": "array",
"items": {
"type": "object",
"properties": {
"spec": {"type": "string"},
"prefix": {"type": "string"},
"modules": {"type": "array", "items": {"type": "string"}},
"extra_attributes": {
"type": "object",
"additionalProperties": True,
"properties": {
"compilers": {
"type": "object",
"patternProperties": {
r"(^\w[\w-]*)": {"type": "string"}
},
},
"environment": spack.schema.environment.definition,
"extra_rpaths": extra_rpaths,
"implicit_rpaths": implicit_rpaths,
"flags": flags,
"additionalProperties": { # package name
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"require": requirements,
"prefer": prefer_and_conflict,
"conflict": prefer_and_conflict,
"version": {
"type": "array",
"default": [],
# version strings
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"buildable": {"type": "boolean", "default": True},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": package_attributes,
"variants": variants,
"externals": {
"type": "array",
"items": {
"type": "object",
"properties": {
"spec": {"type": "string"},
"prefix": {"type": "string"},
"modules": {"type": "array", "items": {"type": "string"}},
"extra_attributes": {
"type": "object",
"additionalProperties": {"type": "string"},
"properties": {
"compilers": {
"type": "object",
"patternProperties": {r"(^\w[\w-]*)": {"type": "string"}},
},
"environment": spack.schema.environment.definition,
"extra_rpaths": extra_rpaths,
"implicit_rpaths": implicit_rpaths,
"flags": flags,
},
},
"additionalProperties": True,
"required": ["spec"],
},
"additionalProperties": True,
"required": ["spec"],
},
},
}
},
},
}
}


@@ -37,6 +37,7 @@
import spack.package_prefs
import spack.platforms
import spack.repo
import spack.solver.splicing
import spack.spec
import spack.store
import spack.util.crypto
@@ -67,7 +68,7 @@
GitOrStandardVersion = Union[spack.version.GitVersion, spack.version.StandardVersion]
TransformFunction = Callable[["spack.spec.Spec", List[AspFunction]], List[AspFunction]]
TransformFunction = Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]
#: Enable the addition of a runtime node
WITH_RUNTIME = sys.platform != "win32"
@@ -127,8 +128,8 @@ def __str__(self):
@contextmanager
def named_spec(
spec: Optional["spack.spec.Spec"], name: Optional[str]
) -> Iterator[Optional["spack.spec.Spec"]]:
spec: Optional[spack.spec.Spec], name: Optional[str]
) -> Iterator[Optional[spack.spec.Spec]]:
"""Context manager to temporarily set the name of a spec"""
if spec is None or name is None:
yield spec
@@ -747,11 +748,11 @@ def on_model(model):
class KnownCompiler(NamedTuple):
"""Data class to collect information on compilers"""
spec: "spack.spec.Spec"
spec: spack.spec.Spec
os: str
target: str
target: Optional[str]
available: bool
compiler_obj: Optional["spack.compiler.Compiler"]
compiler_obj: Optional[spack.compiler.Compiler]
def _key(self):
return self.spec, self.os, self.target
@@ -1132,7 +1133,7 @@ def __init__(self, tests: bool = False):
set
)
self.possible_compilers: List = []
self.possible_compilers: List[KnownCompiler] = []
self.possible_oses: Set = set()
self.variant_values_from_specs: Set = set()
self.version_constraints: Set = set()
@@ -1386,7 +1387,7 @@ def effect_rules(self):
def define_variant(
self,
pkg: "Type[spack.package_base.PackageBase]",
pkg: Type[spack.package_base.PackageBase],
name: str,
when: spack.spec.Spec,
variant_def: vt.Variant,
@@ -1490,7 +1491,7 @@ def define_auto_variant(self, name: str, multi: bool):
)
)
def variant_rules(self, pkg: "Type[spack.package_base.PackageBase]"):
def variant_rules(self, pkg: Type[spack.package_base.PackageBase]):
for name in pkg.variant_names():
self.gen.h3(f"Variant {name} in package {pkg.name}")
for when, variant_def in pkg.variant_definitions(name):
@@ -1681,8 +1682,8 @@ def dependency_holds(input_spec, requirements):
def _gen_match_variant_splice_constraints(
self,
pkg,
cond_spec: "spack.spec.Spec",
splice_spec: "spack.spec.Spec",
cond_spec: spack.spec.Spec,
splice_spec: spack.spec.Spec,
hash_asp_var: "AspVar",
splice_node,
match_variants: List[str],
@@ -1740,7 +1741,7 @@ def package_splice_rules(self, pkg):
if any(
v in cond.variants or v in spec_to_splice.variants for v in match_variants
):
raise Exception(
raise spack.error.PackageError(
"Overlap between match_variants and explicitly set variants"
)
variant_constraints = self._gen_match_variant_splice_constraints(
@@ -2710,7 +2711,7 @@ def setup(
if env:
dev_specs = tuple(
spack.spec.Spec(info["spec"]).constrained(
"dev_path=%s"
'dev_path="%s"'
% spack.util.path.canonicalize_path(info["path"], default_wd=env.path)
)
for name, info in env.dev_specs.items()
@@ -2977,7 +2978,7 @@ def _specs_from_requires(self, pkg_name, section):
for s in spec_group[key]:
yield _spec_with_default_name(s, pkg_name)
def pkg_class(self, pkg_name: str) -> typing.Type["spack.package_base.PackageBase"]:
def pkg_class(self, pkg_name: str) -> typing.Type[spack.package_base.PackageBase]:
request = pkg_name
if pkg_name in self.explicitly_required_namespaces:
namespace = self.explicitly_required_namespaces[pkg_name]
@@ -3096,7 +3097,7 @@ def __init__(self, configuration) -> None:
self.compilers.add(candidate)
def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerParser":
def with_input_specs(self, input_specs: List[spack.spec.Spec]) -> "CompilerParser":
"""Accounts for input specs when building the list of possible compilers.
Args:
@@ -3136,7 +3137,7 @@ def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerPar
return self
def add_compiler_from_concrete_spec(self, spec: "spack.spec.Spec") -> None:
def add_compiler_from_concrete_spec(self, spec: spack.spec.Spec) -> None:
"""Account for compilers that are coming from concrete specs, through reuse.
Args:
@@ -3374,14 +3375,6 @@ def consume_facts(self):
self._setup.effect_rules()
# This should be a dataclass, but dataclasses don't work on Python 3.6
class Splice:
def __init__(self, splice_node: NodeArgument, child_name: str, child_hash: str):
self.splice_node = splice_node
self.child_name = child_name
self.child_hash = child_hash
class SpecBuilder:
"""Class with actions to rebuild a spec from ASP results."""
@@ -3421,7 +3414,7 @@ def __init__(self, specs, hash_lookup=None):
self._specs: Dict[NodeArgument, spack.spec.Spec] = {}
# Matches parent nodes to splice node
self._splices: Dict[NodeArgument, List[Splice]] = {}
self._splices: Dict[spack.spec.Spec, List[spack.solver.splicing.Splice]] = {}
self._result = None
self._command_line_specs = specs
self._flag_sources: Dict[Tuple[NodeArgument, str], Set[str]] = collections.defaultdict(
@@ -3540,15 +3533,13 @@ def reorder_flags(self):
)
cmd_specs = dict((s.name, s) for spec in self._command_line_specs for s in spec.traverse())
for spec in self._specs.values():
for node, spec in self._specs.items():
# if bootstrapping, compiler is not in config and has no flags
flagmap_from_compiler = {}
if spec.compiler in compilers:
flagmap_from_compiler = compilers[spec.compiler].flags
for flag_type in spec.compiler_flags.valid_compiler_flags():
node = SpecBuilder.make_node(pkg=spec.name)
ordered_flags = []
# 1. Put compiler flags first
@@ -3630,49 +3621,12 @@ def splice_at_hash(
child_name: str,
child_hash: str,
):
splice = Splice(splice_node, child_name=child_name, child_hash=child_hash)
self._splices.setdefault(parent_node, []).append(splice)
def _resolve_automatic_splices(self):
"""After all of the specs have been concretized, apply all immediate splices.
Use reverse topological order to ensure that all dependencies are resolved
before their parents, allowing for maximal sharing and minimal copying.
"""
fixed_specs = {}
# create a mapping from dag hash to an integer representing position in reverse topo order.
specs = self._specs.values()
topo_order = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
topo_lookup = {spec.dag_hash(): index for index, spec in enumerate(reversed(topo_order))}
# iterate over specs, children before parents
for node, spec in sorted(self._specs.items(), key=lambda x: topo_lookup[x[1].dag_hash()]):
immediate = self._splices.get(node, [])
if not immediate and not any(
edge.spec in fixed_specs for edge in spec.edges_to_dependencies()
):
continue
new_spec = spec.copy(deps=False)
new_spec.build_spec = spec
for edge in spec.edges_to_dependencies():
depflag = edge.depflag & ~dt.BUILD
if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
new_spec.add_dependency_edge(
self._specs[splice.splice_node], depflag=depflag, virtuals=edge.virtuals
)
elif edge.spec in fixed_specs:
new_spec.add_dependency_edge(
fixed_specs[edge.spec], depflag=depflag, virtuals=edge.virtuals
)
else:
new_spec.add_dependency_edge(
edge.spec, depflag=depflag, virtuals=edge.virtuals
)
self._specs[node] = new_spec
fixed_specs[spec] = new_spec
parent_spec = self._specs[parent_node]
splice_spec = self._specs[splice_node]
splice = spack.solver.splicing.Splice(
splice_spec, child_name=child_name, child_hash=child_hash
)
self._splices.setdefault(parent_spec, []).append(splice)
@staticmethod
def sort_fn(function_tuple) -> Tuple[int, int]:
@@ -3765,7 +3719,15 @@ def build_specs(self, function_tuples):
for root in roots.values():
root._finalize_concretization()
self._resolve_automatic_splices()
# Only attempt to resolve automatic splices if the solver produced any
if self._splices:
resolved_splices = spack.solver.splicing._resolve_collected_splices(
list(self._specs.values()), self._splices
)
new_specs = {}
for node, spec in self._specs.items():
new_specs[node] = resolved_splices.get(spec, spec)
self._specs = new_specs
for s in self._specs.values():
spack.spec.Spec.ensure_no_deprecated(s)


@@ -0,0 +1,73 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from functools import cmp_to_key
from typing import Dict, List, NamedTuple
import spack.deptypes as dt
from spack.spec import Spec
from spack.traverse import by_dag_hash, traverse_nodes
class Splice(NamedTuple):
#: The spec being spliced into a parent
splice_spec: Spec
#: The name of the child that splice spec is replacing
child_name: str
#: The hash of the child that `splice_spec` is replacing
child_hash: str
def _resolve_collected_splices(
specs: List[Spec], splices: Dict[Spec, List[Splice]]
) -> Dict[Spec, Spec]:
"""After all of the specs have been concretized, apply all immediate splices.
Returns a dict mapping original specs to their resolved counterparts
"""
def splice_cmp(s1: Spec, s2: Spec):
"""This function can be used to sort a list of specs such that that any
spec which will be spliced into a parent comes after the parent it will
be spliced into. This order ensures that transitive splices will be
executed in the correct order.
"""
s1_splices = splices.get(s1, [])
s2_splices = splices.get(s2, [])
if any([s2.dag_hash() == splice.splice_spec.dag_hash() for splice in s1_splices]):
return -1
elif any([s1.dag_hash() == splice.splice_spec.dag_hash() for splice in s2_splices]):
return 1
else:
return 0
splice_order = sorted(specs, key=cmp_to_key(splice_cmp))
reverse_topo_order = reversed(
[x for x in traverse_nodes(splice_order, order="topo", key=by_dag_hash) if x in specs]
)
already_resolved: Dict[Spec, Spec] = {}
for spec in reverse_topo_order:
immediate = splices.get(spec, [])
if not immediate and not any(
edge.spec in already_resolved for edge in spec.edges_to_dependencies()
):
continue
new_spec = spec.copy(deps=False)
new_spec.clear_caches(ignore=("package_hash",))
new_spec.build_spec = spec
for edge in spec.edges_to_dependencies():
depflag = edge.depflag & ~dt.BUILD
if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
# If the spec being spliced in has itself been spliced, use its resolved form
splice_spec = already_resolved.get(splice.splice_spec, splice.splice_spec)
new_spec.add_dependency_edge(splice_spec, depflag=depflag, virtuals=edge.virtuals)
elif edge.spec in already_resolved:
new_spec.add_dependency_edge(
already_resolved[edge.spec], depflag=depflag, virtuals=edge.virtuals
)
else:
new_spec.add_dependency_edge(edge.spec, depflag=depflag, virtuals=edge.virtuals)
already_resolved[spec] = new_spec
return already_resolved
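# A minimal sketch of how the resolved map is meant to be consumed, mirroring
# the `build_specs` hunk earlier in this diff; `all_specs`, `collected` and
# `specs_by_node` are hypothetical names:
#
#   resolved = _resolve_collected_splices(all_specs, collected)
#   final = {node: resolved.get(spec, spec) for node, spec in specs_by_node.items()}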

View File

@@ -86,7 +86,6 @@
import spack
import spack.compiler
import spack.compilers
import spack.config
import spack.deptypes as dt
import spack.error
import spack.hash_types as ht
@@ -94,7 +93,6 @@
import spack.platforms
import spack.provider_index
import spack.repo
import spack.solver
import spack.spec_parser
import spack.store
import spack.traverse
@@ -232,7 +230,7 @@ def ensure_modern_format_string(fmt: str) -> None:
def _make_microarchitecture(name: str) -> archspec.cpu.Microarchitecture:
if isinstance(name, archspec.cpu.Microarchitecture):
return name
return archspec.cpu.TARGETS.get(name, archspec.cpu.generic_microarchitecture(name))
return archspec.cpu.TARGETS.get(name) or archspec.cpu.generic_microarchitecture(name)
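# Note: `TARGETS.get(name, default)` evaluates its default argument eagerly, so
# a generic microarchitecture would be constructed on every call; the
# `TARGETS.get(name) or ...` form builds the fallback only when the lookup misses.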
@lang.lazy_lexicographic_ordering
@@ -463,13 +461,16 @@ def _target_satisfies(self, other: "ArchSpec", strict: bool) -> bool:
return bool(self._target_intersection(other))
def _target_constrain(self, other: "ArchSpec") -> bool:
if self.target is None and other.target is None:
return False
if not other._target_satisfies(self, strict=False):
raise UnsatisfiableArchitectureSpecError(self, other)
if self.target_concrete:
if self._target_concrete:
return False
elif other.target_concrete:
elif other._target_concrete:
self.target = other.target
return True
@@ -484,8 +485,8 @@ def _target_constrain(self, other: "ArchSpec") -> bool:
self.target = intersection_target
return True
def _target_intersection(self, other):
results = []
def _target_intersection(self, other: "ArchSpec") -> List[str]:
results: List[str] = []
if not self.target or not other.target:
return results
@@ -511,21 +512,56 @@ def _target_intersection(self, other):
if (not s_min or o_comp >= s_min) and (not s_max or o_comp <= s_max):
results.append(o_min)
else:
# Take intersection of two ranges
# Lots of comparisons needed
_s_min = _make_microarchitecture(s_min)
_s_max = _make_microarchitecture(s_max)
_o_min = _make_microarchitecture(o_min)
_o_max = _make_microarchitecture(o_max)
# Take the "min" of the two max, if there is a partial ordering.
n_max = ""
if s_max and o_max:
_s_max = _make_microarchitecture(s_max)
_o_max = _make_microarchitecture(o_max)
if _s_max.family != _o_max.family:
continue
if _s_max <= _o_max:
n_max = s_max
elif _o_max < _s_max:
n_max = o_max
else:
continue
elif s_max:
n_max = s_max
elif o_max:
n_max = o_max
# Take the "max" of the two min.
n_min = ""
if s_min and o_min:
_s_min = _make_microarchitecture(s_min)
_o_min = _make_microarchitecture(o_min)
if _s_min.family != _o_min.family:
continue
if _s_min >= _o_min:
n_min = s_min
elif _o_min > _s_min:
n_min = o_min
else:
continue
elif s_min:
n_min = s_min
elif o_min:
n_min = o_min
if n_min and n_max:
_n_min = _make_microarchitecture(n_min)
_n_max = _make_microarchitecture(n_max)
if _n_min.family != _n_max.family or not _n_min <= _n_max:
continue
if n_min == n_max:
results.append(n_min)
else:
results.append(f"{n_min}:{n_max}")
elif n_min:
results.append(f"{n_min}:")
elif n_max:
results.append(f":{n_max}")
n_min = s_min if _s_min >= _o_min else o_min
n_max = s_max if _s_max <= _o_max else o_max
_n_min = _make_microarchitecture(n_min)
_n_max = _make_microarchitecture(n_max)
if _n_min == _n_max:
results.append(n_min)
elif not n_min or not n_max or _n_min < _n_max:
results.append("%s:%s" % (n_min, n_max))
return results
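# A worked example of the range logic above, as a sketch via the public
# ArchSpec API used elsewhere in this diff (platform/OS values illustrative):
#
#   from spack.spec import ArchSpec
#   a = ArchSpec(("linux", "ubuntu22.04", "x86_64:"))
#   b = ArchSpec(("linux", "ubuntu22.04", ":haswell"))
#   assert a.intersects(b)      # same family: bounds combine to "x86_64:haswell"
#   c = ArchSpec(("linux", "ubuntu22.04", "ppc64le:"))
#   assert not a.intersects(c)  # different families: the `continue` branches
#                               # above produce no intersection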
def constrain(self, other: "ArchSpec") -> bool:
@@ -557,40 +593,35 @@ def constrain(self, other: "ArchSpec") -> bool:
return constrained
def copy(self):
def copy(self) -> "ArchSpec":
"""Copy the current instance and returns the clone."""
return ArchSpec(self)
@property
def concrete(self):
"""True if the spec is concrete, False otherwise"""
return self.platform and self.os and self.target and self.target_concrete
return self.platform and self.os and self.target and self._target_concrete
@property
def target_concrete(self):
def _target_concrete(self) -> bool:
"""True if the target is not a range or list."""
return (
self.target is not None and ":" not in str(self.target) and "," not in str(self.target)
)
def to_dict(self):
def to_dict(self) -> dict:
# Generic targets represent either an architecture family (like x86_64)
# or a custom micro-architecture
if self.target.vendor == "generic":
target_data = str(self.target)
else:
# Get rid of compiler flag information before turning the uarch into a dict
uarch_dict = self.target.to_dict()
uarch_dict.pop("compilers", None)
target_data = syaml.syaml_dict(uarch_dict.items())
d = syaml.syaml_dict(
[("platform", self.platform), ("platform_os", self.os), ("target", target_data)]
)
return syaml.syaml_dict([("arch", d)])
target_data = self.target.to_dict()
target_data.pop("compilers", None)
return {"arch": {"platform": self.platform, "platform_os": self.os, "target": target_data}}
@staticmethod
def from_dict(d):
def from_dict(d: dict) -> "ArchSpec":
"""Import an ArchSpec from raw YAML/JSON data"""
arch = d["arch"]
target_name = arch["target"]
@@ -600,13 +631,12 @@ def from_dict(d):
return ArchSpec((arch["platform"], arch["platform_os"], target))
def __str__(self):
return "%s-%s-%s" % (self.platform, self.os, self.target)
return f"{self.platform}-{self.os}-{self.target}"
def __repr__(self):
fmt = "ArchSpec(({0.platform!r}, {0.os!r}, {1!r}))"
return fmt.format(self, str(self.target))
return f"ArchSpec(({self.platform!r}, {self.os!r}, {str(self.target)!r}))"
def __contains__(self, string):
def __contains__(self, string) -> bool:
return string in str(self) or string in self.target
@@ -712,10 +742,7 @@ def _cmp_iter(self):
yield self.versions
def to_dict(self):
d = syaml.syaml_dict([("name", self.name)])
d.update(self.versions.to_dict())
return syaml.syaml_dict([("compiler", d)])
return {"compiler": {"name": self.name, **self.versions.to_dict()}}
@staticmethod
def from_dict(d):
@@ -2052,6 +2079,20 @@ def traverse_edges(
visited=visited,
)
@property
def long_spec(self):
"""Returns a string of the spec with the dependencies completely
enumerated."""
root_str = [self.format()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
sorted_dependencies = [
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
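# Hypothetical rendering: for a spec with two dependencies, `spec.long_spec`
# yields something like "mpileaks@2.3 ^callpath@1.0 ^mpich@3.0.4", i.e. the
# formatted root, then each dependency sorted by name, joined with " ^".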
@property
def short_spec(self):
"""Returns a version of the spec with the dependencies hashed
@@ -2278,9 +2319,7 @@ def to_node_dict(self, hash=ht.dag_hash):
Arguments:
hash (spack.hash_types.SpecHashDescriptor) type of hash to generate.
"""
d = syaml.syaml_dict()
d["name"] = self.name
d = {"name": self.name}
if self.versions:
d.update(self.versions.to_dict())
@@ -2294,7 +2333,7 @@ def to_node_dict(self, hash=ht.dag_hash):
if self.namespace:
d["namespace"] = self.namespace
params = syaml.syaml_dict(sorted(v.yaml_entry() for _, v in self.variants.items()))
params = dict(sorted(v.yaml_entry() for v in self.variants.values()))
# Only need the string compiler flag for yaml file
params.update(
@@ -2320,13 +2359,16 @@ def to_node_dict(self, hash=ht.dag_hash):
)
if self.external:
d["external"] = syaml.syaml_dict(
[
("path", self.external_path),
("module", self.external_modules),
("extra_attributes", self.extra_attributes),
]
)
if self.extra_attributes:
extra_attributes = syaml.sorted_dict(self.extra_attributes)
else:
extra_attributes = None
d["external"] = {
"path": self.external_path,
"module": self.external_modules,
"extra_attributes": extra_attributes,
}
if not self._concrete:
d["concrete"] = False
@@ -2357,29 +2399,25 @@ def to_node_dict(self, hash=ht.dag_hash):
# Note: Relies on sorting dict by keys later in algorithm.
deps = self._dependencies_dict(depflag=hash.depflag)
if deps:
deps_list = []
for name, edges_for_name in sorted(deps.items()):
name_tuple = ("name", name)
for dspec in edges_for_name:
hash_tuple = (hash.name, dspec.spec._cached_hash(hash))
parameters_tuple = (
"parameters",
syaml.syaml_dict(
(
("deptypes", dt.flag_to_tuple(dspec.depflag)),
("virtuals", dspec.virtuals),
)
),
)
ordered_entries = [name_tuple, hash_tuple, parameters_tuple]
deps_list.append(syaml.syaml_dict(ordered_entries))
d["dependencies"] = deps_list
d["dependencies"] = [
{
"name": name,
hash.name: dspec.spec._cached_hash(hash),
"parameters": {
"deptypes": dt.flag_to_tuple(dspec.depflag),
"virtuals": dspec.virtuals,
},
}
for name, edges_for_name in sorted(deps.items())
for dspec in edges_for_name
]
# Name is included in case this is replacing a virtual.
if self._build_spec:
d["build_spec"] = syaml.syaml_dict(
[("name", self.build_spec.name), (hash.name, self.build_spec._cached_hash(hash))]
)
d["build_spec"] = {
"name": self.build_spec.name,
hash.name: self.build_spec._cached_hash(hash),
}
return d
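# One entry of the `dependencies` list built above, with an illustrative
# name and hash (deptypes and virtuals come from the edge):
#   {"name": "zlib", "hash": "abc123...",
#    "parameters": {"deptypes": ("build", "link"), "virtuals": ()}}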
def to_dict(self, hash=ht.dag_hash):
@@ -2481,10 +2519,7 @@ def to_dict(self, hash=ht.dag_hash):
node_list.append(node)
hash_set.add(node_hash)
meta_dict = syaml.syaml_dict([("version", SPECFILE_FORMAT_VERSION)])
inner_dict = syaml.syaml_dict([("_meta", meta_dict), ("nodes", node_list)])
spec_dict = syaml.syaml_dict([("spec", inner_dict)])
return spec_dict
return {"spec": {"_meta": {"version": SPECFILE_FORMAT_VERSION}, "nodes": node_list}}
def node_dict_with_hashes(self, hash=ht.dag_hash):
"""Returns a node_dict of this spec with the dag hash added. If this
@@ -2935,44 +2970,16 @@ def ensure_no_deprecated(root):
raise SpecDeprecatedError(msg)
def concretize(self, tests: Union[bool, Iterable[str]] = False) -> None:
"""Concretize the current spec.
from spack.concretize import concretize_one
Args:
tests: if False disregard 'test' dependencies, if a list of names activate them for
the packages in the list, if True activate 'test' dependencies for all packages.
"""
import spack.solver.asp
warnings.warn(
"`Spec.concretize` is deprecated and will be removed in version 1.0.0. Use "
"`spack.concretize.concretize_one` instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
self.replace_hash()
for node in self.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
if self._concrete:
return
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve([self], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = self.name
# TODO: Consolidate this code with similar code in solve.py
if self.virtual:
providers = [spec.name for spec in answer.values() if spec.package.provides(name)]
name = providers[0]
node = spack.solver.asp.SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
self._dup(concretized)
self._dup(concretize_one(self, tests))
def _mark_root_concrete(self, value=True):
"""Mark just this spec (not dependencies) concrete."""
@@ -3062,19 +3069,16 @@ def _finalize_concretization(self):
spec._cached_hash(ht.dag_hash)
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
"""This is a non-destructive version of concretize().
from spack.concretize import concretize_one
First clones, then returns a concrete version of this package
without modifying this package.
warnings.warn(
"`Spec.concretized` is deprecated and will be removed in version 1.0.0. Use "
"`spack.concretize.concretize_one` instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
Args:
tests (bool or list): if False disregard 'test' dependencies,
if a list of names activate them for the packages in the list,
if True activate 'test' dependencies for all packages.
"""
clone = self.copy()
clone.concretize(tests=tests)
return clone
return concretize_one(self, tests)
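# A minimal migration sketch for the two deprecated methods above
# (the package name is hypothetical):
#
#   import spack.concretize
#   from spack.spec import Spec
#
#   spec = Spec("zlib")
#   # was: spec.concretize()          (in-place, deprecated)
#   # was: spec = spec.concretized()  (copying, deprecated)
#   spec = spack.concretize.concretize_one(spec)  # returns a new concrete Spec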
def index(self, deptype="all"):
"""Return a dictionary that points to all the dependencies in this
@@ -3184,18 +3188,13 @@ def constrain(self, other, deps=True):
if not self.variants[v].compatible(other.variants[v]):
raise vt.UnsatisfiableVariantSpecError(self.variants[v], other.variants[v])
# TODO: Check out the logic here
sarch, oarch = self.architecture, other.architecture
if sarch is not None and oarch is not None:
if sarch.platform is not None and oarch.platform is not None:
if sarch.platform != oarch.platform:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if sarch.os is not None and oarch.os is not None:
if sarch.os != oarch.os:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if sarch.target is not None and oarch.target is not None:
if sarch.target != oarch.target:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if (
sarch is not None
and oarch is not None
and not self.architecture.intersects(other.architecture)
):
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
changed = False
@@ -3218,18 +3217,12 @@ def constrain(self, other, deps=True):
changed |= self.compiler_flags.constrain(other.compiler_flags)
old = str(self.architecture)
sarch, oarch = self.architecture, other.architecture
if sarch is None or other.architecture is None:
self.architecture = sarch or oarch
else:
if sarch.platform is None or oarch.platform is None:
self.architecture.platform = sarch.platform or oarch.platform
if sarch.os is None or oarch.os is None:
sarch.os = sarch.os or oarch.os
if sarch.target is None or oarch.target is None:
sarch.target = sarch.target or oarch.target
changed |= str(self.architecture) != old
if sarch is not None and oarch is not None:
changed |= self.architecture.constrain(other.architecture)
elif oarch is not None:
self.architecture = oarch
changed = True
if deps:
changed |= self._constrain_dependencies(other)
@@ -3610,25 +3603,16 @@ def patches(self):
return self._patches
def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, cleardeps=True):
"""Copy the spec other into self. This is an overwriting
copy. It does not copy any dependents (parents), but by default
copies dependencies.
To duplicate an entire DAG, call _dup() on the root of the DAG.
def _dup(self, other: "Spec", deps: Union[bool, dt.DepTypes, dt.DepFlag] = True) -> bool:
"""Copies "other" into self, by overwriting all attributes.
Args:
other (Spec): spec to be copied onto ``self``
deps: if True copies all the dependencies. If
False copies None. If deptype/depflag, copy matching types.
cleardeps (bool): if True clears the dependencies of ``self``,
before possibly copying the dependencies of ``other`` onto
``self``
other: spec to be copied onto ``self``
deps: if True copies all the dependencies. If False copies None.
If a deptype or depflag is given, copy only the matching types.
Returns:
True if ``self`` changed because of the copy operation,
False otherwise.
True if ``self`` changed because of the copy operation, False otherwise.
"""
# We don't count dependencies as changes here
changed = True
@@ -3653,14 +3637,15 @@ def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, clearde
self.versions = other.versions.copy()
self.architecture = other.architecture.copy() if other.architecture else None
self.compiler = other.compiler.copy() if other.compiler else None
if cleardeps:
self._dependents = _EdgeMap(store_by_child=False)
self._dependencies = _EdgeMap(store_by_child=True)
self.compiler_flags = other.compiler_flags.copy()
self.compiler_flags.spec = self
self.variants = other.variants.copy()
self._build_spec = other._build_spec
# Clear dependencies
self._dependents = _EdgeMap(store_by_child=False)
self._dependencies = _EdgeMap(store_by_child=True)
# FIXME: we manage _patches_in_order_of_appearance specially here
# to keep it from leaking out of spec.py, but we should figure
# out how to handle it more elegantly in the Variant classes.
@@ -4165,15 +4150,7 @@ def __str__(self):
if not self._dependencies:
return self.format()
root_str = [self.format()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
sorted_dependencies = [
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
return self.long_spec
@property
def colored_str(self):
@@ -4551,7 +4528,7 @@ def mask_build_deps(in_spec):
return spec
def clear_caches(self, ignore=()):
def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
"""
Clears all cached hashes in a Spec, while preserving other properties.
"""
@@ -4936,9 +4913,7 @@ def from_node_dict(cls, node):
spec.external_modules = node["external"]["module"]
if spec.external_modules is False:
spec.external_modules = None
spec.extra_attributes = node["external"].get(
"extra_attributes", syaml.syaml_dict()
)
spec.extra_attributes = node["external"].get("extra_attributes", {})
# specs read in are concrete unless marked abstract
if node.get("concrete", True):

View File

@@ -7,35 +7,14 @@
import pytest
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.solver.asp
from spack.installer import PackageInstaller
from spack.solver.asp import SolverError
from spack.spec import Spec
class CacheManager:
def __init__(self, specs: List[str]) -> None:
self.req_specs = specs
self.concr_specs: List[Spec]
self.concr_specs = []
def __enter__(self):
self.concr_specs = [Spec(s).concretized() for s in self.req_specs]
for s in self.concr_specs:
PackageInstaller([s.package], fake=True, explicit=True).install()
def __exit__(self, exc_type, exc_val, exc_tb):
for s in self.concr_specs:
s.package.do_uninstall()
# MacOS and Windows only work if you pass this function pointer rather than a
# closure
def _mock_has_runtime_dependencies(_x):
return True
def _make_specs_non_buildable(specs: List[str]):
output_config = {}
for spec in specs:
@@ -44,203 +23,262 @@ def _make_specs_non_buildable(specs: List[str]):
@pytest.fixture
def splicing_setup(mutable_database, mock_packages, monkeypatch):
spack.config.set("concretizer:reuse", True)
monkeypatch.setattr(
spack.solver.asp, "_has_runtime_dependencies", _mock_has_runtime_dependencies
)
def install_specs(
mutable_database,
mock_packages,
mutable_config,
do_not_check_runtimes_on_reuse,
install_mockery,
):
"""Returns a function that concretizes and installs a list of abstract specs"""
mutable_config.set("concretizer:reuse", True)
def _impl(*specs_str):
concrete_specs = [Spec(s).concretized() for s in specs_str]
PackageInstaller([s.package for s in concrete_specs], fake=True, explicit=True).install()
return concrete_specs
return _impl
def _enable_splicing():
spack.config.set("concretizer:splice", {"automatic": True})
def _has_build_dependency(spec: Spec, name: str):
return any(s.name == name for s in spec.dependencies(None, dt.BUILD))
@pytest.mark.parametrize("spec_str", ["splice-z", "splice-h@1"])
def test_spec_reuse(spec_str, install_specs, mutable_config):
"""Tests reuse of splice-z, without splicing, as a root and as a dependency of splice-h"""
splice_z = install_specs("splice-z@1.0.0+compat")[0]
mutable_config.set("packages", _make_specs_non_buildable(["splice-z"]))
concrete = spack.concretize.concretize_one(spec_str)
assert concrete["splice-z"].satisfies(splice_z)
def test_simple_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-z").concretized().satisfies(Spec("splice-z"))
def test_simple_dep_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-h@1").concretized().satisfies(Spec("splice-h@1"))
def test_splice_installed_hash(splicing_setup):
cache = [
@pytest.mark.regression("48578")
def test_splice_installed_hash(install_specs, mutable_config):
"""Tests splicing the dependency of an installed spec, for another installed spec"""
splice_t, splice_h = install_specs(
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0",
"splice-h@1.0.2+compat ^splice-z@1.0.0",
]
with CacheManager(cache):
packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
spack.config.set("packages", packages_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
)
packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
mutable_config.set("packages", packages_config)
goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0"
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
concrete = spack.concretize.concretize_one(goal_spec)
# splice-t has a dependency that is changing, thus its hash should be different
assert concrete.dag_hash() != splice_t.dag_hash()
assert concrete.build_spec.satisfies(splice_t)
assert not concrete.satisfies(splice_t)
# splice-h is reused, so the hash should stay the same
assert concrete["splice-h"].satisfies(splice_h)
assert concrete["splice-h"].build_spec.satisfies(splice_h)
assert concrete["splice-h"].dag_hash() == splice_h.dag_hash()
def test_splice_build_splice_node(splicing_setup):
with CacheManager(["splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-t"]))
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
def test_splice_build_splice_node(install_specs, mutable_config):
"""Tests splicing the dependency of an installed spec, for a spec that is yet to be built"""
splice_t = install_specs("splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat")[0]
mutable_config.set("packages", _make_specs_non_buildable(["splice-t"]))
goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat"
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
concrete = spack.concretize.concretize_one(goal_spec)
# splice-t has a dependency that is changing, thus its hash should be different
assert concrete.dag_hash() != splice_t.dag_hash()
assert concrete.build_spec.satisfies(splice_t)
assert not concrete.satisfies(splice_t)
# splice-h should be different
assert concrete["splice-h"].dag_hash() != splice_t["splice-h"].dag_hash()
assert concrete["splice-h"].build_spec.dag_hash() == concrete["splice-h"].dag_hash()
def test_double_splice(splicing_setup):
cache = [
def test_double_splice(install_specs, mutable_config):
"""Tests splicing two dependencies of an installed spec, for other installed specs"""
splice_t, splice_h, splice_z = install_specs(
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat",
"splice-h@1.0.2+compat ^splice-z@1.0.1+compat",
"splice-z@1.0.2+compat",
]
with CacheManager(cache):
freeze_builds_config = _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"])
spack.config.set("packages", freeze_builds_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
)
mutable_config.set("packages", _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"]))
goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat"
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
concrete = spack.concretize.concretize_one(goal_spec)
# splice-t and splice-h have a dependency that is changing, thus their hashes should differ
assert concrete.dag_hash() != splice_t.dag_hash()
assert concrete.build_spec.satisfies(splice_t)
assert not concrete.satisfies(splice_t)
assert concrete["splice-h"].dag_hash() != splice_h.dag_hash()
assert concrete["splice-h"].build_spec.satisfies(splice_h)
assert not concrete["splice-h"].satisfies(splice_h)
# splice-z is reused, so the hash should stay the same
assert concrete["splice-z"].dag_hash() == splice_z.dag_hash()
# The next two tests are mirrors of one another
def test_virtual_multi_splices_in(splicing_setup):
cache = [
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-2",
]
goal_specs = [
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
]
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
for gs in goal_specs:
with pytest.raises(Exception):
Spec(gs).concretized()
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
@pytest.mark.parametrize(
"original_spec,goal_spec",
[
# `virtual-abi-1` can be spliced for `virtual-abi-multi abi=one` and vice-versa
(
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
),
(
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-1",
),
# `virtual-abi-2` can be spliced for `virtual-abi-multi abi=two` and vice-versa
(
"depends-on-virtual-with-abi ^virtual-abi-2",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
),
(
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
"depends-on-virtual-with-abi ^virtual-abi-2",
),
],
)
def test_virtual_multi_splices_in(original_spec, goal_spec, install_specs, mutable_config):
"""Tests that we can splice a virtual dependency with a different, but compatible, provider."""
original = install_specs(original_spec)[0]
mutable_config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
spliced = spack.concretize.concretize_one(goal_spec)
assert spliced.dag_hash() != original.dag_hash()
assert spliced.build_spec.dag_hash() == original.dag_hash()
assert spliced["virtual-with-abi"].name != spliced.build_spec["virtual-with-abi"].name
def test_virtual_multi_can_be_spliced(splicing_setup):
cache = [
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
]
goal_specs = [
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-2",
]
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
with pytest.raises(Exception):
for gs in goal_specs:
Spec(gs).concretized()
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
def test_manyvariant_star_matching_variant_splice(splicing_setup):
cache = [
@pytest.mark.parametrize(
"original_spec,goal_spec",
[
# can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
"depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
"depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
]
goal_specs = [
Spec("depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2"),
Spec("depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3"),
]
with CacheManager(cache):
freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
spack.config.set("packages", freeze_build_config)
for goal in goal_specs:
with pytest.raises(Exception):
goal.concretized()
_enable_splicing()
for goal in goal_specs:
assert goal.concretized().satisfies(goal)
def test_manyvariant_limited_matching(splicing_setup):
cache = [
(
"depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
"depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2",
),
(
"depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
"depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3",
),
# can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
(
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
"depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2",
),
# can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
]
goal_specs = [
Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2"),
Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3"),
]
with CacheManager(cache):
freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
spack.config.set("packages", freeze_build_config)
for s in goal_specs:
with pytest.raises(Exception):
s.concretized()
_enable_splicing()
for s in goal_specs:
assert s.concretized().satisfies(s)
(
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
"depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3",
),
],
)
def test_manyvariant_matching_variant_splice(
original_spec, goal_spec, install_specs, mutable_config
):
"""Tests splicing with different kind of matching on variants"""
original = install_specs(original_spec)[0]
mutable_config.set("packages", {"depends-on-manyvariants": {"buildable": False}})
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
spliced = spack.concretize.concretize_one(goal_spec)
assert spliced.dag_hash() != original.dag_hash()
assert spliced.build_spec.dag_hash() == original.dag_hash()
# The spliced 'manyvariants' is yet to be built
assert spliced["manyvariants"].dag_hash() != original["manyvariants"].dag_hash()
assert spliced["manyvariants"].build_spec.dag_hash() == spliced["manyvariants"].dag_hash()
def test_external_splice_same_name(splicing_setup):
cache = [
def test_external_splice_same_name(install_specs, mutable_config):
"""Tests that externals can be spliced for non-external specs"""
original_splice_h, original_splice_t = install_specs(
"splice-h@1.0.0 ^splice-z@1.0.0+compat",
"splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.1+compat",
]
packages_yaml = {
"splice-z": {"externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}]}
}
goal_specs = [
Spec("splice-h@1.0.0 ^splice-z@1.0.2"),
Spec("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"),
]
with CacheManager(cache):
spack.config.set("packages", packages_yaml)
_enable_splicing()
for s in goal_specs:
assert s.concretized().satisfies(s)
)
mutable_config.set("packages", _make_specs_non_buildable(["splice-t", "splice-h"]))
mutable_config.set(
"packages",
{
"splice-z": {
"externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}],
"buildable": False,
}
},
)
_enable_splicing()
concrete_splice_h = spack.concretize.concretize_one("splice-h@1.0.0 ^splice-z@1.0.2")
concrete_splice_t = spack.concretize.concretize_one(
"splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"
)
assert concrete_splice_h.dag_hash() != original_splice_h.dag_hash()
assert concrete_splice_h.build_spec.dag_hash() == original_splice_h.dag_hash()
assert concrete_splice_h["splice-z"].external
assert concrete_splice_t.dag_hash() != original_splice_t.dag_hash()
assert concrete_splice_t.build_spec.dag_hash() == original_splice_t.dag_hash()
assert concrete_splice_t["splice-z"].external
assert concrete_splice_t["splice-z"].dag_hash() == concrete_splice_h["splice-z"].dag_hash()
def test_spliced_build_deps_only_in_build_spec(splicing_setup):
cache = ["splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0"]
goal_spec = Spec("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")
def test_spliced_build_deps_only_in_build_spec(install_specs):
"""Tests that build specs are not reported in the spliced spec"""
install_specs("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0")
with CacheManager(cache):
_enable_splicing()
concr_goal = goal_spec.concretized()
build_spec = concr_goal._build_spec
# Spec has been spliced
assert build_spec is not None
# Build spec has spliced build dependencies
assert _has_build_dependency(build_spec, "splice-h")
assert _has_build_dependency(build_spec, "splice-z")
# Spliced build dependencies are removed
assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
_enable_splicing()
spliced = spack.concretize.concretize_one("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")
build_spec = spliced.build_spec
# Spec has been spliced
assert build_spec.dag_hash() != spliced.dag_hash()
# Build spec has spliced build dependencies
assert build_spec.dependencies("splice-h", dt.BUILD)
assert build_spec.dependencies("splice-z", dt.BUILD)
# Spliced build dependencies are removed
assert len(spliced.dependencies(None, dt.BUILD)) == 0
def test_spliced_transitive_dependency(splicing_setup):
cache = ["splice-depends-on-t@1.0 ^splice-h@1.0.1"]
goal_spec = Spec("splice-depends-on-t^splice-h@1.0.2")
def test_spliced_transitive_dependency(install_specs, mutable_config):
"""Tests that build specs are not reported, even for spliced transitive dependencies"""
install_specs("splice-depends-on-t@1.0 ^splice-h@1.0.1")
mutable_config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
_enable_splicing()
concr_goal = goal_spec.concretized()
# Spec has been spliced
assert concr_goal._build_spec is not None
assert concr_goal["splice-t"]._build_spec is not None
assert concr_goal.satisfies(goal_spec)
# Spliced build dependencies are removed
assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
_enable_splicing()
spliced = spack.concretize.concretize_one("splice-depends-on-t^splice-h@1.0.2")
# Spec has been spliced
assert spliced.build_spec.dag_hash() != spliced.dag_hash()
assert spliced["splice-t"].build_spec.dag_hash() != spliced["splice-t"].dag_hash()
# Spliced build dependencies are removed
assert len(spliced.dependencies(None, dt.BUILD)) == 0
assert len(spliced["splice-t"].dependencies(None, dt.BUILD)) == 0

View File

@@ -133,5 +133,5 @@ def test_concretize_target_ranges(root_target_range, dep_target_range, result, m
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
spec = spack.concretize.concretize_one(spec)
assert spec.target == spec["pkg-b"].target == result

View File

@@ -28,6 +28,7 @@
import spack.binary_distribution as bindist
import spack.caches
import spack.compilers
import spack.concretize
import spack.config
import spack.fetch_strategy
import spack.hooks.sbang as sbang
@@ -35,13 +36,15 @@
import spack.mirrors.mirror
import spack.oci.image
import spack.paths
import spack.repo
import spack.spec
import spack.store
import spack.util.gpg
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.binary_distribution import CannotListKeys, GenerateIndexError
from spack.binary_distribution import INDEX_HASH_FILE, CannotListKeys, GenerateIndexError
from spack.database import INDEX_JSON_FILE
from spack.installer import PackageInstaller
from spack.paths import test_path
from spack.spec import Spec
@@ -92,7 +95,7 @@ def config_directory(tmp_path_factory):
@pytest.fixture(scope="function")
def default_config(tmp_path, config_directory, monkeypatch, install_mockery):
def default_config(tmp_path, config_directory, mock_repo_path, install_mockery):
# This fixture depends on install_mockery to ensure
# there is a clear order of initialization. The substitution of the
# config scopes here is done on top of the substitution that comes with
@@ -107,7 +110,6 @@ def default_config(tmp_path, config_directory, monkeypatch, install_mockery):
]
with spack.config.use_configuration(*scopes):
spack.config.CONFIG.set("repos", [spack.paths.mock_packages_path])
njobs = spack.config.get("config:build_jobs")
if not njobs:
spack.config.set("config:build_jobs", 4, scope="user")
@@ -128,8 +130,8 @@ def default_config(tmp_path, config_directory, monkeypatch, install_mockery):
timeout = spack.config.get("config:connect_timeout")
if not timeout:
spack.config.set("config:connect_timeout", 10, scope="user")
yield spack.config.CONFIG
with spack.repo.use_repositories(mock_repo_path):
yield spack.config.CONFIG
@pytest.fixture(scope="function")
@@ -205,8 +207,9 @@ def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with default rpaths
into the default directory layout scheme.
"""
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
sy_spec = Spec("symly").concretized()
gspec = spack.concretize.concretize_one("garply")
cspec = spack.concretize.concretize_one("corge")
sy_spec = spack.concretize.concretize_one("symly")
# Install 'corge' without using a cache
install_cmd("--no-cache", cspec.name)
@@ -253,9 +256,9 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with default rpaths
into the non-default directory layout scheme.
"""
cspec = Spec("corge").concretized()
cspec = spack.concretize.concretize_one("corge")
# This spec tests for symlink relocation
sy_spec = Spec("symly").concretized()
sy_spec = spack.concretize.concretize_one("symly")
# Install some packages with dependent packages
# test install in non-default install path scheme
@@ -276,7 +279,8 @@ def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with relative
rpaths into the default directory layout scheme.
"""
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
gspec = spack.concretize.concretize_one("garply")
cspec = spack.concretize.concretize_one("corge")
# Install buildcache created with relativized rpaths
buildcache_cmd("install", "-uf", cspec.name)
@@ -305,7 +309,7 @@ def test_relative_rpaths_install_nondefault(temporary_mirror_dir):
Test the installation of buildcaches with relativized rpaths
into the non-default directory layout scheme.
"""
cspec = Spec("corge").concretized()
cspec = spack.concretize.concretize_one("corge")
# Test install in non-default install path scheme and relative path
buildcache_cmd("install", "-uf", cspec.name)
@@ -358,7 +362,8 @@ def test_built_spec_cache(temporary_mirror_dir):
that cache from a buildcache index."""
buildcache_cmd("list", "-a", "-l")
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
gspec = spack.concretize.concretize_one("garply")
cspec = spack.concretize.concretize_one("corge")
for s in [gspec, cspec]:
results = bindist.get_mirrors_for_spec(s)
@@ -381,7 +386,7 @@ def test_spec_needs_rebuild(monkeypatch, tmpdir):
mirror_dir = tmpdir.join("mirror_dir")
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
s = Spec("libdwarf").concretized()
s = spack.concretize.concretize_one("libdwarf")
# Install a package
install_cmd(s.name)
@@ -410,7 +415,7 @@ def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
spack.config.set("mirrors", {"test": mirror_url})
s = Spec("libdwarf").concretized()
s = spack.concretize.concretize_one("libdwarf")
# Install a package
install_cmd("--no-cache", s.name)
@@ -494,7 +499,7 @@ def mock_list_url(url, recursive=False):
def test_update_sbang(tmp_path, temporary_mirror, mock_fetch, install_mockery):
"""Test relocation of the sbang shebang line in a package script"""
s = Spec("old-sbang").concretized()
s = spack.concretize.concretize_one("old-sbang")
PackageInstaller([s.package]).install()
old_prefix, old_sbang_shebang = s.prefix, sbang.sbang_shebang_line()
old_contents = f"""\
@@ -602,7 +607,7 @@ def test_etag_fetching_304():
# handled as success, since it means the local cache is up-to-date.
def response_304(request: urllib.request.Request):
url = request.get_full_url()
if url == "https://www.example.com/build_cache/index.json":
if url == f"https://www.example.com/build_cache/{INDEX_JSON_FILE}":
assert request.get_header("If-none-match") == '"112a8bbc1b3f7f185621c1ee335f0502"'
raise urllib.error.HTTPError(
url, 304, "Not Modified", hdrs={}, fp=None # type: ignore[arg-type]
@@ -624,7 +629,7 @@ def test_etag_fetching_200():
# Test conditional fetch with etags. The remote has modified the file.
def response_200(request: urllib.request.Request):
url = request.get_full_url()
if url == "https://www.example.com/build_cache/index.json":
if url == f"https://www.example.com/build_cache/{INDEX_JSON_FILE}":
assert request.get_header("If-none-match") == '"112a8bbc1b3f7f185621c1ee335f0502"'
return urllib.response.addinfourl(
io.BytesIO(b"Result"),
@@ -675,7 +680,7 @@ def test_default_index_fetch_200():
def urlopen(request: urllib.request.Request):
url = request.get_full_url()
if url.endswith("index.json.hash"):
if url.endswith(INDEX_HASH_FILE):
return urllib.response.addinfourl( # type: ignore[arg-type]
io.BytesIO(index_json_hash.encode()),
headers={}, # type: ignore[arg-type]
@@ -683,7 +688,7 @@ def urlopen(request: urllib.request.Request):
code=200,
)
elif url.endswith("index.json"):
elif url.endswith(INDEX_JSON_FILE):
return urllib.response.addinfourl(
io.BytesIO(index_json.encode()),
headers={"Etag": '"59bcc3ad6775562f845953cf01624225"'}, # type: ignore[arg-type]
@@ -714,7 +719,7 @@ def test_default_index_dont_fetch_index_json_hash_if_no_local_hash():
def urlopen(request: urllib.request.Request):
url = request.get_full_url()
if url.endswith("index.json"):
if url.endswith(INDEX_JSON_FILE):
return urllib.response.addinfourl(
io.BytesIO(index_json.encode()),
headers={"Etag": '"59bcc3ad6775562f845953cf01624225"'}, # type: ignore[arg-type]
@@ -743,7 +748,7 @@ def test_default_index_not_modified():
def urlopen(request: urllib.request.Request):
url = request.get_full_url()
if url.endswith("index.json.hash"):
if url.endswith(INDEX_HASH_FILE):
return urllib.response.addinfourl(
io.BytesIO(index_json_hash.encode()),
headers={}, # type: ignore[arg-type]
@@ -788,7 +793,7 @@ def test_default_index_json_404():
def urlopen(request: urllib.request.Request):
url = request.get_full_url()
if url.endswith("index.json.hash"):
if url.endswith(INDEX_HASH_FILE):
return urllib.response.addinfourl(
io.BytesIO(index_json_hash.encode()),
headers={}, # type: ignore[arg-type]
@@ -796,7 +801,7 @@ def urlopen(request: urllib.request.Request):
code=200,
)
elif url.endswith("index.json"):
elif url.endswith(INDEX_JSON_FILE):
raise urllib.error.HTTPError(
url,
code=404,

View File

@@ -220,14 +220,12 @@ def test_source_is_disabled(mutable_config):
# The source is not explicitly enabled or disabled, so the following
# call should raise to skip using it for bootstrapping
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
assert not spack.bootstrap.core.source_is_enabled(conf)
# Try to explicitly disable the source and verify that the behavior
# is the same as above
spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
assert not spack.bootstrap.core.source_is_enabled(conf)
@pytest.mark.regression("45247")

View File

@@ -3,20 +3,19 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import pytest
import spack.binary_distribution as bd
import spack.concretize
import spack.mirrors.mirror
import spack.spec
from spack.installer import PackageInstaller
pytestmark = pytest.mark.not_on_windows("does not run on windows")
def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmp_path):
spec = spack.spec.Spec("trivial-install-test-package").concretized()
spec = spack.concretize.concretize_one("trivial-install-test-package")
PackageInstaller([spec.package], fake=True).install()
specs = [spec]

View File

@@ -16,6 +16,7 @@
import spack.build_environment
import spack.compiler
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.package_base
@@ -163,8 +164,7 @@ def test_static_to_shared_library(build_environment):
@pytest.mark.regression("8345")
@pytest.mark.usefixtures("config", "mock_packages")
def test_cc_not_changed_by_modules(monkeypatch, working_env):
s = spack.spec.Spec("cmake")
s.concretize()
s = spack.concretize.concretize_one("cmake")
pkg = s.package
def _set_wrong_cc(x):
@@ -184,7 +184,7 @@ def test_setup_dependent_package_inherited_modules(
working_env, mock_packages, install_mockery, mock_fetch
):
# This will raise on regression
s = spack.spec.Spec("cmake-client-inheritor").concretized()
s = spack.concretize.concretize_one("cmake-client-inheritor")
PackageInstaller([s.package]).install()
@@ -277,7 +277,7 @@ def platform_pathsep(pathlist):
return convert_to_platform_path(pathlist)
# Monkeypatch a pkg.compiler.environment with the required modifications
pkg = spack.spec.Spec("cmake").concretized().package
pkg = spack.concretize.concretize_one("cmake").package
monkeypatch.setattr(pkg.compiler, "environment", modifications)
# Trigger the modifications
spack.build_environment.setup_package(pkg, False)
@@ -301,7 +301,7 @@ def custom_env(pkg, env):
env.prepend_path("PATH", test_path)
env.append_flags("ENV_CUSTOM_CC_FLAGS", "--custom-env-flag1")
pkg = spack.spec.Spec("cmake").concretized().package
pkg = spack.concretize.concretize_one("cmake").package
monkeypatch.setattr(pkg.compiler, "setup_custom_environment", custom_env)
spack.build_environment.setup_package(pkg, False)
@@ -322,7 +322,7 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
}
spack.config.set("packages:cmake", cmake_config)
cmake_client = spack.spec.Spec("cmake-client").concretized()
cmake_client = spack.concretize.concretize_one("cmake-client")
spack.build_environment.setup_package(cmake_client.package, False)
assert os.environ["TEST_ENV_VAR_SET"] == "yes it's set"
@@ -330,8 +330,7 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
@pytest.mark.regression("9107")
def test_spack_paths_before_module_paths(config, mock_packages, monkeypatch, working_env):
s = spack.spec.Spec("cmake")
s.concretize()
s = spack.concretize.concretize_one("cmake")
pkg = s.package
module_path = os.path.join("path", "to", "module")
@@ -352,8 +351,7 @@ def _set_wrong_cc(x):
def test_package_inheritance_module_setup(config, mock_packages, working_env):
s = spack.spec.Spec("multimodule-inheritance")
s.concretize()
s = spack.concretize.concretize_one("multimodule-inheritance")
pkg = s.package
spack.build_environment.setup_package(pkg, False)
@@ -387,8 +385,7 @@ def test_wrapper_variables(
not in cuda_include_dirs
)
root = spack.spec.Spec("dt-diamond")
root.concretize()
root = spack.concretize.concretize_one("dt-diamond")
for s in root.traverse():
s.prefix = "/{0}-prefix/".format(s.name)
@@ -453,7 +450,7 @@ def test_external_prefixes_last(mutable_config, mock_packages, working_env, monk
"""
)
spack.config.set("packages", cfg_data)
top = spack.spec.Spec("dt-diamond").concretized()
top = spack.concretize.concretize_one("dt-diamond")
def _trust_me_its_a_dir(path):
return True
@@ -500,8 +497,7 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
)
def test_setting_dtags_based_on_config(config_setting, expected_flag, config, mock_packages):
# Pick a random package to be able to set compiler's variables
s = spack.spec.Spec("cmake")
s.concretize()
s = spack.concretize.concretize_one("cmake")
pkg = s.package
env = EnvironmentModifications()
@@ -533,7 +529,7 @@ def setup_dependent_package(module, dependent_spec):
assert dependent_module.ninja is not None
dependent_spec.package.test_attr = True
externaltool = spack.spec.Spec("externaltest").concretized()
externaltool = spack.concretize.concretize_one("externaltest")
monkeypatch.setattr(
externaltool["externaltool"].package, "setup_dependent_package", setup_dependent_package
)
@@ -728,7 +724,7 @@ def test_build_system_globals_only_set_on_root_during_build(default_mock_concret
But obviously it can lead to very hard to find bugs... We should get rid of those globals and
define them instead as a property on the package instance.
"""
root = spack.spec.Spec("mpileaks").concretized()
root = spack.concretize.concretize_one("mpileaks")
build_variables = ("std_cmake_args", "std_meson_args", "std_pip_args")
# See todo above, we clear out any properties that may have been set by the previous test.

Some files were not shown because too many files have changed in this diff.