Compare commits

..

194 Commits

Author SHA1 Message Date
Satish Balay
998effa8a1 ginkgo: revert mpi@3.1 dependency:
Currently this breaks builds with OpenMPI, as Spack resolves this to 'openmpi@1.7.4'
2024-09-09 14:26:26 -05:00
Jen Herting
5f9c6299d1 [py-langsmith] added version 0.1.81 (#46259) 2024-09-07 04:29:45 -06:00
Jen Herting
541e40e252 py-httpcore: add v1.0.5 (#46123)
* py-httpcore: Added new version

* [py-httpcore]

- added version 0.18.0
- restructured dependencies, since everything has a `when` and the
  type/when ordering was inconsistent

* [py-httpcore] ordered dependencies in the order listed in v1.0.5 pyproject.toml

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-09-07 02:28:15 -06:00
Jen Herting
ddc8790896 [py-lpips] new package (#46256) 2024-09-06 18:54:51 -06:00
Jen Herting
7d6b643b58 [py-torch-fidelity] New package (#46257)
* [py-torch-fidelity] New package
* [py-torch-fidelity] add patch to fix missing requirements.txt
2024-09-06 18:54:36 -06:00
Dan Lipsa
6f08db4631 Set module variables for all packages before running setup_dependent_package (#44327)
When a package is running `setup_dependent_package` on a parent, ensure
that module variables like `spack_cc` are available. This was often
true prior to this commit, but externals were an exception.

---------

Co-authored-by: John Parent <john.parent@kitware.com>
2024-09-06 18:54:09 -06:00
eugeneswalker
7086d6f1ac use updated container with cmake@3.30.2 (#46262) 2024-09-06 18:48:45 -06:00
Robert Underwood
7a313295ac Sz3 fix (#46263)
* Updated version of sz3
  Supersedes #46128
* Add Robertu94 to maintainers for SZ3

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 14:35:55 -07:00
Jen Herting
3bdaaaf47c New package: py-monai (#46140)
* New package: py-monai

* Fixed linked issues with py-monai

* [py-monai] removed extra line

* [py-monai]

- New version 1.3.2
- ran black
- update copyright

* [py-monai] added license

* [py-monai]

- moved build-only dependencies
- specified newer python requirements for newer versions

---------

Co-authored-by: vehrc <vehrc@rit.edu>
2024-09-06 15:19:12 -06:00
Massimiliano Culpo
a3fa54812f Remove deprecated config options (#44061)
These options have been deprecated in v0.21, and
slated for removal in v0.23
2024-09-06 15:14:36 -06:00
Kyle Gerheiser
afc9615abf libfabric: Add versions v1.22.0 and v1.21.1 (#46188)
* Add libfabric v1.22.0 and v1.21.1

* Fix whitespace
2024-09-06 14:21:41 -05:00
Adam J. Stewart
19352af10b GEOS: add v3.13.0 (#46261) 2024-09-06 12:17:41 -07:00
Robert Underwood
e4f8cff286 Updated version of sz3 (#46246)
Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 12:15:03 -07:00
Sam Reeve
76df9de26a Update ExaCA backend handling and add version 2.0 (#46243)
* Add exaca 2.0
* Add kokkos/cuda/hip backend support for exaca
2024-09-06 14:53:21 -04:00
Auriane R.
983e7427f7 Replace if ... in spec with spec.satisfies in f* and g* packages (#46127)
* Replace if ... in spec with spec.satisfies in f* and g* packages

* gromacs: ^amdfftw -> ^[virtuals=fftw-api] amdfftw

* flamemaster: add virtuals lapack for the amdlibflame dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-06 13:22:19 -05:00
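
The idiom swap in this and the companion cleanup commits looks roughly like the sketch below; the package class, variant name, and CMake option are hypothetical stand-ins, not code from the PR:

```python
from spack.package import *


class Foo(CMakePackage):
    """Hypothetical recipe illustrating the idiom change."""

    variant("openmp", default=False, description="Enable OpenMP")

    def cmake_args(self):
        args = []
        # Before: containment test against the concrete spec.
        #     if "+openmp" in self.spec:
        # After: an explicit satisfaction query; the same mechanism also
        # expresses virtual-provider constraints such as
        # spec.satisfies("^[virtuals=fftw-api] amdfftw").
        if self.spec.satisfies("+openmp"):
            args.append(self.define("WITH_OPENMP", True))
        return args
```
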
Adam J. Stewart
d9df520e85 py-pandas: relax build dependencies (#46229) 2024-09-06 11:19:49 -07:00
Adam J. Stewart
b51fd9f5a6 py-torchgeo: zipfile-deflate64 no longer required (#46233) 2024-09-06 11:18:10 -07:00
Robert Underwood
8058cd34e4 Require a newer version of cudnn for cupy (#46248)
cupy 12 needs a newer version of cudnn as documented here. Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 10:56:46 -07:00
Robert Underwood
2f488b9329 Use an HTTP git url for libpressio-opt (#46249)
Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 10:43:23 -07:00
Peter Scheibel
78a4d3e7d2 Mixed-source cflags (#41049)
Allow flags from different sources (compilers, `require:`, command-line
specs, and `depends_on`) to be merged together, and enforce a consistent
order among them.

The order is based on the sources, e.g. flags on specs from the command
line always come last. Some flag order consistency issues are fixed:

1. Flags from `compilers.yaml` and the command line were always intra- and
   inter-source order consistent.
2. Flags from dependents and packages.yaml (introduced via `require:`)
   were not: for `-a -b` from one source and `-c` from another, the final
   result might rearrange `-a -b`, and would also be inconsistent in terms
   of whether `-c` came before or after.

(1) is/was handled by going back to the original source, i.e., flags are
retrieved directly from the command line spec rather than the solver.

(2) is addressed by:

* Keeping track of grouped flags in the solver
* Keeping track of flag sources in the solver on a per-flag basis

The latter info is used in this PR to enforce DAG ordering on flags
applied from multiple dependents to the same package, e.g., for this
graph:

```
   a
  /|\
 b | c
  \|/
   d
```

If `a`, `b`, and `c` impose flags on `d`, the combined flags on `d` will
contain the flags of `a`, `b`, and `c` -- in that order. 

Conflicting flags are allowed (e.g. -O2 and -O3). `Spec.satisfies()` has
been updated such that X satisfies Y as long as X has *at least* all of
the flags that Y has. This is also true in the solver constraints.
`.satisfies` does not account for how order can change behavior (so
`-O2 -O3` can satisfy `-O3 -O2`); it is expected that this can be
addressed later (e.g. by prohibiting flag conflicts).

`Spec.constrain` and `.intersects` have been updated to be consistent
with this new definition of `.satisfies`.
2024-09-06 10:37:33 -07:00
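
A minimal sketch of the order-insensitive flag semantics described above, using only the public `Spec` constructor and `satisfies` (the package name is arbitrary):

```python
from spack.spec import Spec

# X satisfies Y as long as X has at least all of the flags Y has; the
# order the flags appear in does not affect satisfaction.
x = Spec('zlib cflags="-O2 -O3"')
y = Spec('zlib cflags="-O3 -O2"')
assert x.satisfies(y) and y.satisfies(x)

# A spec carrying extra flags still satisfies one asking for a subset.
subset = Spec('zlib cflags="-O2"')
assert x.satisfies(subset)
```
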
Juan Miguel Carceller
2b9a621d19 sleef: add the PIC flag (#46217) 2024-09-06 09:37:20 -06:00
Juan Miguel Carceller
fa5f4f1cab pthreadpool: add the PIC flag (#46216) 2024-09-06 15:11:03 +02:00
John W. Parent
4042afaa99 Bootstrap GnuPG and file on Windows (#41810)
Spack can now bootstrap two new dependencies on Windows: GnuPG and file.

These dependencies are modeled as a separate package, and they install a cross-compiled binary.
Details on how the binaries are built are in https://github.com/spack/windows-bootstrap-resources
2024-09-06 14:26:46 +02:00
Harmen Stoppels
7fdf1029b7 fish: use shlex.quote instead of custom quote (#46251) 2024-09-06 11:38:14 +00:00
Stephen Nicholas Swatman
814f4f20c0 py-pybind11: add v2.13.0-v2.13.4 and new conflict (#45872)
This PR adds py-pybind11 versions 2.13.0, 2.13.1, 2.13.2, 2.13.3, and
2.13.4. It also adds a new conflict between gcc 14 and pybind versions
up to and including 2.13.1.
2024-09-06 10:38:35 +02:00
Harmen Stoppels
7c473937ba db: more type hints (#46242) 2024-09-06 09:13:14 +02:00
Robert Underwood
9d754c127a Require a newer version of cudnn for cupy (#46247)
cupy 12 needs a newer version of cudnn as documented here. Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-05 23:12:46 -06:00
Massimiliano Culpo
9a8bff01ad Allow deprecating more than one property in config (#46221)
* Allow deprecating more than one property in config

This internal change allows the customization of errors
and warnings to be printed when deprecating a property.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* fix

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Use a list comprehension for "issues"

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-05 15:33:20 -07:00
Nicole C.
434a703bcf Windows: Update pytest with echo and remove others (#45620)
* Windows: Update pytest with echo and remove others

* Fix style

* Remove unused pytest import in undevelop

* Update test_test_part_pass to be Windows compatible

* Style
2024-09-05 15:34:19 -05:00
Adam J. Stewart
a42108438d py-contourpy: add v1.3.0 (#46225) 2024-09-05 22:16:14 +02:00
Sebastian Pipping
1be9b7f53c expat: Add 2.6.3 with security fixes + deprecate vulnerable 2.6.2 (#46208) 2024-09-05 14:09:34 -06:00
Dom Heinzeller
6b05a80745 Bug fixes for building py-netcdf4 and py-ruamel-yaml-clib with apple-clang@15 (#46184)
* Fix compiler flags in py-netcdf4 for apple-clang@15:
* Fix compiler flags in py-ruamel-yaml-clib for apple-clang@15:
2024-09-05 11:29:26 -07:00
Fernando Ayats
4d36b0a5ef npb: fix mpi rank mismatch errors (#46075)
The MPI variant has several rank mismatch errors, which are silently
ignored. This change downgrades the errors to warnings.
2024-09-05 11:27:15 -07:00
Adam J. Stewart
636843f330 PyTorch: update ecosystem (#46220) 2024-09-05 11:06:57 -07:00
Adam J. Stewart
d4378b6e09 awscli-v2: remove upperbound on cryptography (#46222) 2024-09-05 11:04:56 -07:00
Adam J. Stewart
69b54d9039 py-imageio: add v2.35.1 (#46227) 2024-09-05 10:57:57 -07:00
Adam J. Stewart
618866b35c py-laspy: add v2.3.5 (#46228) 2024-09-05 10:56:51 -07:00
Adam J. Stewart
47bd8a6b26 py-pyvista: add v0.44.1 (#46231) 2024-09-05 10:53:42 -07:00
Adam J. Stewart
93d31225db py-tifffile: add v2024 (#46232) 2024-09-05 10:52:21 -07:00
Alex Richert
c3a1d1f91e Add when's to some tau dependencies (#46212)
* Add when's to some tau dependencies

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-09-05 10:35:01 -07:00
estewart08
33621a9860 [rocm-openmp-extras] - Add support for flang-legacy in 6.1.2 (#46130)
* [rocm-openmp-extras] - Add support for flang-legacy in 6.1.2
* [rocm-openmp-extras] - Remove unused variable flang_legacy_dir
* [rocm-openmp-extras] - Limit flang-legacy build to 6.1 and newer ROCm versions
2024-09-05 12:27:50 -05:00
etiennemlb
055eb3cd94 PLUMED: Using a C compiler variable (#46082)
* Using a C compiler variable

* homogeneity
2024-09-05 10:43:28 -06:00
Massimiliano Culpo
c4d18671fe Avoid best-effort expansion of stacks (#40792)
fixes #40791

Currently stacks behave differently if used in unify:false
environments, which leads to inconsistencies during concretization.

For instance, we might have two abstract user specs that do not
intersect with each other but map to the same concrete spec in the
environment. This is clearly wrong.

This PR removes the best effort expansion, so that user specs
are always applied strictly.
2024-09-05 10:32:15 -06:00
Thomas Madlener
5d9f0cf44e gaudi: Specify boost dependencies explicitly and cleanup package (#46194)
* gaudi: Specify boost components and add +fiber for v39

* gaudi: Limit fmt version to allow building master branch

* Make boost dependencies a bit more readable

* Remove patches for no longer existing versions
2024-09-05 08:27:04 -06:00
Harmen Stoppels
02faa7b97e reindex: ensure database is empty before reindex (#46199)
fixes two tests that did not clear the in-memory bits of a database
before calling reindex.
2024-09-05 14:53:29 +02:00
Mikael Simberg
d37749cedd pika: Add 0.28.0 (#46207)
* pika: Add 0.28.0

* Add conflict between pika 0.28.0 and dla-future
2024-09-05 14:12:21 +02:00
Harmen Stoppels
7e5e6f2833 Pass Database layout in constructor (#46219)
Ensures that Database instances do not reference a global
`spack.store.STORE.layout`. Simplify Database.{add,reindex} signature.
2024-09-05 10:49:03 +00:00
Massimiliano Culpo
37ea9657cf Remove test_external_package_module (#46218)
This test was possibly meant for the Cray platform, and
currently is a no-op.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-05 12:10:59 +02:00
Harmen Stoppels
2107a88514 spack deprecate: deprecate --link-type flag (#46202) 2024-09-05 11:06:46 +02:00
Auriane R.
1a4b07e730 Replace if ... in spec with spec.satisfies in d* and e* packages (#46126)
* Replace if ... in spec with spec.satisfies in d* and e* packages

* Use virtuals for different mpi implementations in esmf

* esmf: ^[virtuals=mpi] mpt

* extrae: ^[virtuals=mpi] intel-oneapi-mpi

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-04 22:33:51 -06:00
Jack Morrison
c98045e028 libfabric: Add CUDA variant (#46203)
Signed-off-by: Jack Morrison <jack.morrison@cornelisnetworks.com>
2024-09-04 21:43:25 -06:00
mvlopri
bc5456a791 seacas: require +metis and +mpi instead of +parmetis (#46205)
This change aligns the build condition for parmetis with the
depends_on condition.
The current build condition of parmetis looks for "+parmetis" in
the spec, which is not added by the depends_on, as that adds
"^parmetis" instead.
2024-09-04 21:03:02 -06:00
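
The distinction the message draws, sketched in recipe terms (the variants are abridged and the CMake option name is illustrative, not the real seacas recipe):

```python
from spack.package import *


class Seacas(CMakePackage):
    """Abridged sketch of aligning a build condition with depends_on."""

    variant("metis", default=False, description="Enable metis")
    variant("mpi", default=True, description="Enable MPI")

    # depends_on adds a dependency edge, so the concrete spec gains
    # "^parmetis" -- it does not add a "+parmetis" variant to seacas.
    depends_on("parmetis", when="+metis+mpi")

    def cmake_args(self):
        # The build condition must mirror the depends_on condition;
        # checking for "+parmetis" would never match here.
        enable = self.spec.satisfies("+metis+mpi")
        return [self.define("TPL_ENABLE_ParMETIS", enable)]
```
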
Pranav Sivaraman
656720a387 py-schema: add v0.7.7 (#46210)
* py-schema: add v0.7.7

* py-schema: fix when spec

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-04 20:07:35 -06:00
Mikael Simberg
9604c0a9b3 boost: Conditionally include/exclude Boost.Json depending on Boost version (#46200) 2024-09-04 16:50:06 -06:00
Pranav Sivaraman
ee96194486 py-misk: add new package (#46153) 2024-09-04 11:25:01 -07:00
Felix Thaler
ab21fc1daf Added jump package (#46164) 2024-09-04 11:16:20 -07:00
Pranav Sivaraman
d593ad0c06 py-trieregex: new package (#46154)
* py-trieregex: new package
2024-09-04 10:58:08 -07:00
Weiqun Zhang
254fe6ed6e amrex: add v24.09 (#46171) 2024-09-04 10:54:49 -07:00
renjithravindrankannath
7e20874f54 rocm-openmp-extras: Avoiding registration of duplicate check-targets and fix for the failure in hostexec (#45658)
* Adding additional check for omptarget library for amdgpu in nvidia environment
* Avoiding registration of duplicate when built on cuda
* Adding hsa library path in LD_LIBRARY_PATH
* Correction in hsa prefix library path in LD_LIBRARY_PATH
2024-09-04 10:31:28 -07:00
Vanessasaurus
cd4c40fdbd Automated deployment to update package flux-core 2024-09-04 (#46191)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-09-04 10:28:44 -07:00
Harmen Stoppels
c13e8e49fe goaccess: new package (#46193) 2024-09-04 10:23:28 -07:00
Adam J. Stewart
35a2a0b3d0 py-rasterio: add v1.3.11 (#46195)
* py-rasterio: add v1.3.11
* Use default_args
2024-09-04 09:45:16 -07:00
Adam J. Stewart
22d69724f4 py-fiona: add v1.10.0 (#46196) 2024-09-04 09:36:43 -07:00
Adam J. Stewart
f6e3f6eec7 py-numpy: add v2.1.1 (#46197) 2024-09-04 09:33:07 -07:00
G-Ragghianti
866c440f0c Updating git repo location (#46183) 2024-09-04 10:34:05 -05:00
Juan Miguel Carceller
fe6f5b87dc texlive: clean up recipe (#45863)
* texlive: clean up recipe

* Update the poppler dependency

* Fix typo

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-09-04 10:20:23 -05:00
Vicente Bolea
3df3f40984 paraview: add new v5.13.0 release (#46091) 2024-09-04 10:14:52 -05:00
Satish Balay
33f4a40df4 trilinos: update @[master,develop] dependency on kokkos (#46182)
==> Error: InstallError: For Trilinos@[master,develop], ^kokkos version in spec must match version in Trilinos source code. Specify ^kokkos@4.4.00 for trilinos@[master,develop] instead of ^kokkos@4.3.01.
2024-09-04 07:34:20 -06:00
Kyle Knoepfel
7ba0132f66 llvm: improve detection regexes (#46187) 2024-09-04 04:12:42 -06:00
Mikael Simberg
744f034dfb Add conflict between llvm-amdgpu until version 5 and ninja since version 1.12.0 (#45788)
* Add conflict between llvm-amdgpu and ninja since version 1.12.0

* Update llvm-amdgpu and ninja conflict to extend to 6.0
2024-09-04 10:37:48 +02:00
Massimiliano Culpo
d41fb3d542 llvm: be more strict with detection (#46179) 2024-09-03 22:37:30 +02:00
Laurent Aphecetche
9b077a360e root: fix @loader_path on macOS (#44826)
* root: fix @loader_path on macOS

* root: use loader patch from upstream

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* add missing comma

* root: rm unused patch

* root: conflict on macos 12 for @:6.27 when +python

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-03 09:50:31 -05:00
Stephen Nicholas Swatman
5c297d8322 root: fix X11 and OpenGL-related issues on macOS (#45632)
* root: Add dependency on libglx

We have been trying to build the Acts package on MacOS, and in this
process we have been running into problems with the ROOT spec on that
operating system; the primary issue we are encountering is that the
compiler is unable to find the `GL/glx.h` header, which is part of glx.
It seems, therefore, that ROOT depends on libglx, but this is not
currently encoded in the spec. This commit ensures that ROOT depends on
the virtual libglx package when both the OpenGL and X11 variants are
enabled.

* Enable builtin glew on MacOS

* Allow `root+opengl+aqua~x` on macOS
2024-09-03 08:32:17 -05:00
Massimiliano Culpo
9e18e63053 solver: minor cleanup and optimization (#46176)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-03 10:35:54 +00:00
Harmen Stoppels
069286bda7 database: remove a few class properties (#46175) 2024-09-03 11:22:14 +02:00
Richard Berger
1679b5e141 nvpl updates (#45922)
* nvpl-blas: add new version 0.3.0

* nvpl-lapack: add new version 0.2.3.1
2024-09-03 08:07:24 +02:00
Pierre Augier
1f935ac356 Add py-pytest-allclose package (#45877) 2024-09-02 14:34:24 -06:00
Pariksheet Nanda
e66e572656 py-cellprofiler: add 4.2.6 new package (#44824)
* py-cellprofiler: add 4.2.6 new package

* py-mysqlclient: Limit pkg-config patch to @1.4:

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-02 14:21:39 -05:00
JCGoran
17d3d17d46 py-sympy: add version 1.13.0 (#46163)
* sympy: add version 1.13.0

* sympy: update deps
2024-09-02 09:51:44 -06:00
Thomas Madlener
c814fb5fe6 podio: Add the new datasource variant once it is available (#46078)
* podio: Add the new datasource variant once it is available

* Make sure to require a suitable minimal root version
2024-09-02 08:39:43 -05:00
Harmen Stoppels
fe2d06399f db: type hints (#46168) 2024-09-02 14:44:49 +02:00
Harmen Stoppels
b5dec35113 projections: simplify expression (#46167) 2024-09-02 12:38:47 +00:00
Harmen Stoppels
20565ba8ab remove dead code: fs.is_writable_dir was used on file (#46166) 2024-09-02 11:55:40 +00:00
Wouter Deconinck
c47a3ee05b package_base: sort deprecated versions later in preferred_version (#46025) 2024-09-02 10:42:15 +00:00
Wouter Deconinck
aaa7469b92 py-vector: add v1.4.2, v1.5.0; variant awkward (#46039) 2024-09-02 09:41:15 +02:00
Asher Mancinelli
1f8a6d8e8b [neovim] add utf8proc dependency (#46064)
I believe utf8proc was added as a neovim dependency in neovim/neovim#26165
and is only in the master branch.
2024-09-02 09:28:30 +02:00
Richard Berger
654bf45c01 kokkos-nvcc-wrapper: add new 4.4.00 version (#46067) 2024-09-02 09:27:13 +02:00
Wouter Deconinck
daa42be47f ddt: deprecate all versions in favor of linaro-forge (#46115) 2024-09-02 09:15:47 +02:00
Satish Balay
ca179deb8e petsc, py-petsc4py: add v3.21.5 (#46151) 2024-09-02 09:09:05 +02:00
Satish Balay
8f6092bf83 xsdk: remove develop and 0.7.0, and deprecate 0.8.0 (#46121) 2024-09-02 09:04:36 +02:00
pauleonix
6b649ccf4f cuda: add v12.6.1 (#46143)
Update build system conflict between CUDA 12.6 and Clang 18
2024-09-02 08:46:03 +02:00
Georgia Stuart
d463d4566d docs: update conditional definition arch (#46139)
Signed-off-by: Georgia Stuart <gstuart@umass.edu>

Co-authored-by: Jordan Galby <67924449+Jordan474@users.noreply.github.com>
2024-09-02 08:32:26 +02:00
Adam J. Stewart
f79be3022b py-torchgeo: add v0.6.0 (#46158) 2024-09-02 08:30:02 +02:00
eugeneswalker
7327e731b4 e4s ci stacks: add geopm-runtime (#45881) 2024-09-02 08:24:10 +02:00
Adam J. Stewart
ab6d494fe4 py-horovod: py-torch 2.1.0 now supported (#46152) 2024-09-02 08:21:16 +02:00
Christophe Prud'homme
e5aa74e7cb package cln 1.3.7 feelpp/spack#2 (#46162)
* package cln 1.3.7 feelpp/spack#2

* add myself as maintainer

* fix style issue, rm blankline
2024-09-01 22:38:26 -05:00
Gavin John
728f8e2654 nanomath: add version 1.4.0 (#46159) 2024-09-01 17:23:57 -06:00
Stephen Sachs
9a58a6da0d [openmpi] Add optional debug build variant (#45708) 2024-09-01 11:02:04 -05:00
Stephen Nicholas Swatman
395491815a dd4hep: mark conflict with root@6.31.1: (#45855)
dd4hep versions up to and including 1.27 had a conflict with root
versions starting from 6.31.1, as shown in
https://github.com/AIDASoft/DD4hep/issues/1210. This PR explicitly adds
that conflict to the spec.
2024-09-01 10:54:20 -05:00
Joseph Wang
fd98ebed9d add rapidjson conflict for gcc14 (#46007) 2024-09-01 10:01:53 -05:00
Harmen Stoppels
f7de621d0c Remove redundant inspect.getmodule(self) idiom in packages (#46073) 2024-09-01 11:25:51 +02:00
dunatotatos
a5f404cff5 Update py-numcodecs. (#45715) 2024-08-31 15:15:18 -06:00
Martin Pokorny
8100b0d575 casacore: add new versions 3.6.1, 3.6.0, 3.2.1 (#46068) 2024-08-31 15:14:52 -06:00
Juan Miguel Carceller
b38ab54028 whizard: add a patch when using hepmc3 3.3.0 or newer (#45862)
* whizard: add a patch when using hepmc3 3.3.0 or newer

* whizard: comment with patch origin

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-08-31 13:13:42 -06:00
Stephen Nicholas Swatman
412f22b76a podio: apply patch for gcc 14 builds (#45854)
* podio: apply patch for gcc 14 builds

Podio versions after 0.17.0 but before 1.0.0 are broken when using gcc
14 because of a missing include, which is addressed in the podio pull
request at https://github.com/AIDASoft/podio/pull/600. This commit
patches pre-1.0.0 versions of Podio so they can be compiled with gcc 14,
which is important for building Acts.

* Style

* Style 2

* Fixes

* Add comment:

* Add sha256
2024-08-31 13:42:02 -05:00
Jen Herting
d226ef31bd New package: py-jsonlines (#46124)
* py-jsonlines: new package

* py-jsonlines: fix dependency

---------

Co-authored-by: Alex C Leute <acl2809@rit.edu>
2024-08-31 12:30:07 -05:00
Jen Herting
ae32af927d New package: py-ops (#46122)
* New package: py-ops

* [py-ops]

- added version 2.16.0
- ran black
- updated copyright
- added license()

---------

Co-authored-by: vehrc <vehrc@rit.edu>
2024-08-31 12:11:27 -05:00
Alex Richert
400dd40492 sigio: add v2.3.3 (#46116) 2024-08-31 12:01:08 -05:00
dependabot[bot]
04bdff33ad build(deps): bump actions/setup-python from 5.1.1 to 5.2.0 (#46129)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5.1.1 to 5.2.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](39cd14951b...f677139bbe)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-31 11:51:49 -05:00
Pranav Sivaraman
017e3dd417 doctest: add new package (#46138) 2024-08-31 11:48:20 -05:00
Alex Richert
f7e3902ca8 landsfcutil: add v2.4.2 (#46144) 2024-08-31 09:13:10 -05:00
Alex Richert
89da8d4c84 gfsio: add v1.4.2 (#46145) 2024-08-31 09:12:23 -05:00
Alex Richert
8cac74699b sfcio: add v1.4.2 (#46146)
* sfcio: add v1.4.2

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-08-31 09:11:13 -05:00
dependabot[bot]
db311eef46 build(deps): bump actions/upload-artifact from 4.3.6 to 4.4.0 (#46149)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4.3.6 to 4.4.0.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](834a144ee9...50769540e7)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-31 08:38:16 -05:00
etiennemlb
1427735876 unzip: use more generic strip flag for cce (#46087)
* Use more generic strip flag for cce

* [@spackbot] updating style on behalf of etiennemlb

* Apply always
2024-08-30 12:57:24 -05:00
Kacper Kornet
f88ca8cc1f plumed: add v2.9.1 (#46022) 2024-08-30 15:29:49 +02:00
Massimiliano Culpo
bf1f4e15ee boost: remove Compiler.cxx_names (#46037) 2024-08-30 13:25:40 +02:00
Harmen Stoppels
dd756d53de windows-vis: fix failing pipeline (#46135)
* seacas: fix gnu parallel dep

* add vtk@9.0 platform=windows conflict
2024-08-30 12:57:16 +02:00
Massimiliano Culpo
1c1970e727 Put some more constraint on a few mpi providers (#46132)
This should help avoid selecting, by default, some niche implementations that are supposed to be externals.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-08-30 11:16:35 +02:00
Massimiliano Culpo
c283fce487 Remove DetectedPackage class (#46071)
This PR simplifies the code doing external spec detection by removing
the `DetectedPackage` class. Now, functions accepting or returning lists
of `DetectedPackage` will accept or return lists of specs.

Performance doesn't seem to change if we use `Spec.__reduce__` instead
of `DetectedPackage.__reduce__`.
2024-08-30 08:11:17 +00:00
Harmen Stoppels
199cbce5ef windows-vis: vtk~mpi (#46133) 2024-08-30 09:38:41 +02:00
Massimiliano Culpo
82dea7e6ce mpitrampoline: fix wrong use of compiler.cc_names (#46038) 2024-08-30 07:50:09 +02:00
Harmen Stoppels
bd71ce5856 cray: allow failure due to broken blas (#46111)
1. libsci_cray.so is broken, as it fails to list `libdl.so` in
   DT_NEEDED.
2. Cray compilers fail to build a different blas
2024-08-30 07:49:10 +02:00
Mark Abraham
73fc86cbc3 gromacs: support version 2024.3 (#46117) 2024-08-29 17:38:04 -06:00
Chris White
3589edcc6d conduit package: Honor compiler extra_rpaths + extras (#46046)
- Honor compiler extra_rpaths (this build bypasses spack wrappers,
  so the RPATHs are added explicitly as CMake options)
- Use root_cmakelists_dir instead of adding it to the command line
- Add BLT as a dependency, allowing different versions outside of what
  is in the tarball
- Put a copy of host-config in the stage directory: this allows
  examination of the host-config when a build fails (before, the
  host-config was just stored in the install directory, which is
  deleted by default on a failed build)
2024-08-29 13:59:15 -07:00
Chris Marsh
cade66d842 CMakePackage: Set policy CMP0042 NEW on macos (#46114)
so that linking to shared libraries works at runtime on Darwin for all packages with cmake_minimum_required < 3.
2024-08-29 18:49:53 +00:00
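
In effect (a sketch of the mechanism, not the exact change), the generated CMake invocation can default the policy so that even projects pinning an old cmake_minimum_required get MACOSX_RPATH behavior:

```python
import sys


def macos_policy_args():
    """Sketch: extra CMake arguments a build wrapper might inject."""
    args = []
    if sys.platform == "darwin":
        # CMP0042 NEW turns on MACOSX_RPATH, so shared libraries are
        # resolved at runtime without manual install_name fixups.
        args.append("-DCMAKE_POLICY_DEFAULT_CMP0042:STRING=NEW")
    return args
```
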
Louise Spellacy
c5766aa757 linaro-forge: added 24.0.4 version (#46112)
Updated platform.machine() to only match "aarch64", since "arm64" is
not supported by "spack arch".
2024-08-29 12:33:49 -06:00
Harmen Stoppels
c3e9bd4fbf spectrum-mpi: no windows (#46119) 2024-08-29 20:04:33 +02:00
Harmen Stoppels
05357052ac py-greenlet: add missing forward compat bound (#46113) 2024-08-29 17:55:26 +02:00
Harmen Stoppels
dc3f5cd606 windows_establish_runtime_linkage: post install hook (#46103) 2024-08-29 17:16:36 +02:00
Alberto Invernizzi
9a16927993 paraview: add cdi support (#44222)
* add basic CDI package

* add CDI variant to paraview

* [@spackbot] updating style on behalf of albestro

---------

Co-authored-by: albestro <albestro@users.noreply.github.com>
2024-08-29 14:19:04 +02:00
Wouter Deconinck
093b273f5c py-mypy: add v1.11.2 (#46099) 2024-08-29 05:46:36 -06:00
Tamara Dahlgren
b6ff126494 Executable: make the timeout message readable (#46098) 2024-08-29 11:46:15 +00:00
Harmen Stoppels
fe8f631b7d tau: fix (cray) compiler names/paths (#46104)
fixes a build issue on cray ci
2024-08-29 11:24:58 +02:00
Massimiliano Culpo
f9065f0c7e Remove "get_compiler_duplicates" (#46077)
This function is used only in tests.
2024-08-29 06:53:17 +02:00
etiennemlb
699735016f Add more compiler leniency (#46083) 2024-08-28 16:33:26 -07:00
Arne Becker
ec85bbe3f7 perl-compress-bzip2: new package (#46055)
* perl-compress-bzip2: new package
  Adds Compress::Bzip2
* Use bzip2 from Spack, not system
2024-08-28 13:57:01 -07:00
Taillefumier Mathieu
7e1ad8f321 [Update] New version of sirius (#46049) 2024-08-28 13:38:06 -07:00
Derek Ryan Strong
0eb8f4f0f9 pmix: add v5.0.3 and fix variants (#45621)
* Add pmix v5.0.3 and fix variants
* Update pmix homepage link
* pmix: Simplify/update hwloc dependency
* pmix: Update versions for --disable-sphinx configure option
* pmix: Add munge variant
* pmix: Add zlib dependency
* pmix: Fix dependency py-sphinx@5
2024-08-28 12:59:48 -07:00
renjithravindrankannath
ee27dc5d45 llvm-amdgpu: Updating LD_LIBRARY_PATH w.r.t new prefix path (#45940)
* Updating LD_LIBRARY_PATH w.r.t new prefix path

* Updating hsa external path for 6.x
2024-08-28 11:26:05 -07:00
Robert Underwood
ec0a57cba4 py-numcodecs rename git branch to match upstream (#46085)
* py-numcodecs rename git branch to match upstream
2024-08-28 19:16:23 +02:00
Jordan Galby
4c91e6245c Don't check checksums on spack-develop packages (#46076)
Fix regression introduced in spack 0.22.1 where Spack would ask about checksums
on spack-develop packages.
2024-08-28 17:07:30 +00:00
Satish Balay
6a1dfbda97 mfem, pflotran, alquimia: remove old versions with xsdk string (in version) that were used in old/removed xsdk releases (#45837) 2024-08-28 10:03:51 -07:00
Jordan Galby
6b0011c8f1 For "when:" and install_environment.json: Support fully qualified hostname (#45522) 2024-08-28 18:38:54 +02:00
Nicholas Sly
8b5521ec0a Replace unparsable apostrophe character with ASCII "'". (#46069) 2024-08-28 18:34:06 +02:00
Mikael Simberg
b9e4e98f15 boost: install BoostConfig.cmake even when header-only (#46062)
* Install BoostConfig.cmake even when header-only

* boost: Only use --without-libraries when --with-libraries would have an empty list
2024-08-28 17:44:02 +02:00
Harmen Stoppels
85487f23bc buildcache.py: elide string not spec (#46074) 2024-08-28 15:27:44 +02:00
Harmen Stoppels
fb4811ec3f Drop now redundant use of inspect (#46057)
inspect.isclass -> isinstance(..., type)
inspect.getmro -> cls.__mro__
2024-08-28 14:35:08 +02:00
Mark Abraham
202e64872a gromacs: add conflict between NVSHMEM and cuFFTMp (#46065)
* Add conflict between NVSHMEM and cuFFTMp for GROMACS package

These don't work in the same build configuration.

* [@spackbot] updating style on behalf of mabraham

* Update package.py

Also constrain NVSHMEM appropriately

* Update var/spack/repos/builtin/packages/gromacs/package.py

Co-authored-by: Andrey Alekseenko <al42and@gmail.com>

* Update package.py

* [@spackbot] updating style on behalf of mabraham

* Update package.py

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of mabraham

---------

Co-authored-by: mabraham <mabraham@users.noreply.github.com>
Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>
Co-authored-by: Andrey Alekseenko <al42and@gmail.com>
2024-08-28 04:34:37 -06:00
Massimiliano Culpo
25ba3124bd Spec.from_detection now accounts for external prefix (#46063)
Change the signature of Spec.from_detection to set the
external prefix, and the external modules, if they are
present.

Delete "spack.package_prefs.spec_externals" since it
is unused.
2024-08-28 10:51:36 +02:00
Massimiliano Culpo
df57e1ceb3 Remove llnl.util.lang.has_method, use built-in hasattr instead (#46072) 2024-08-28 10:17:12 +02:00
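
For regular attributes the built-in covers the removed helper; the one nuance is that `has_method` skipped names defined only on `object`, which `hasattr` reports (a sketch):

```python
class Base:
    def run(self):
        pass


class Child(Base):
    pass


# Replaces llnl.util.lang.has_method(Child, "run"), which walked the
# MRO by hand looking for "run" in each base class's __dict__.
assert hasattr(Child, "run")
```
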
Adam J. Stewart
59b4b785e0 py-numpy: add v2.0.2 (#46056) 2024-08-27 11:20:42 -07:00
Brian Spilner
b1b21a4d02 cdo-2.4.3 (#46033)
cdo-2.4.3 - fixed hash
2024-08-27 11:17:16 -07:00
Ye Luo
b1af32cb60 Introduce offload variant for llvm >= 19. (#45865) 2024-08-27 13:44:01 -04:00
Harmen Stoppels
9d8f94a7c8 spack_yaml: delete custom deepcopy (#46048) 2024-08-27 18:45:44 +02:00
Massimiliano Culpo
1297673a70 Remove "prevent_cray_detection" context manager (#46060)
This context manager was used to prevent detecting a platform
as "cray". Since now Cray machines are detected as linux, we can
remove the context manager.
2024-08-27 18:43:07 +02:00
Harmen Stoppels
0fee2c234e config.py: tell don't ask (#46040) 2024-08-27 15:55:44 +02:00
Alec Scott
229cf49c71 bfs: add v4.0.1 (#46044)
* bfs: add v4.0.1

* fix style
2024-08-27 08:15:38 -05:00
Massimiliano Culpo
394e6159d6 Remove a few unused classes and globals (#46054) 2024-08-27 15:06:39 +02:00
Stephen Nicholas Swatman
cbe18d9cbc detray: add version 0.73.0 (#46053)
This commit adds version 0.73.0 of the detray package. As this version
drops support for pre-C++20 standards, I had to update the `cxxstd`
variant logic.
2024-08-27 08:05:06 -05:00
Stephen Nicholas Swatman
2d83707f84 acts: add version 36.2.0 (#46052)
This commit adds version 36.2.0 of ACTS. As far as I can tell, there are
no dependency changes.
2024-08-27 07:18:06 -05:00
Massimiliano Culpo
9a91f021a7 Move spack.compilers._to_dict to Compiler (#46051) 2024-08-27 14:01:50 +02:00
Harmen Stoppels
297e43b097 abi.py: remove (#46047) 2024-08-27 10:38:54 +02:00
FrederickDeny
900765901d Added e4s-cl@1.0.4 (#46043) 2024-08-26 23:26:12 -06:00
Nick Hagerty
680d1f2e58 lammps: improve FFT selection and add fft_kokkos variant (#45969) 2024-08-27 07:15:12 +02:00
Richard Berger
76957f19f9 nvpl-fft: new package (#45985) 2024-08-26 21:42:30 -06:00
AcriusWinter
c7001efeb8 sundials: new test API (#45373)
* sundials: new test API

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-08-26 15:34:06 -07:00
Harmen Stoppels
a60d1084b1 jsonschema: remove optional deps (#46041) 2024-08-26 22:15:52 +02:00
Harmen Stoppels
497e19f0e3 distro.py: avoid excessive stat calls (#46030) 2024-08-26 18:55:55 +02:00
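
The saving here is that os.listdir plus os.path.isfile stats every entry a second time, whereas os.scandir yields DirEntry objects whose is_file() can usually answer from data the directory scan already fetched; roughly:

```python
import os

# One pass; typically no extra stat per entry on most platforms.
with os.scandir("/etc") as entries:
    release_files = sorted(e.path for e in entries if e.is_file())
```
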
Pranav Sivaraman
cd6ee96398 parallel-hashmap: add v1.3.12 (#46017)
Co-authored-by: pranav-sivaraman <pranav-sivaraman@users.noreply.github.com>
2024-08-26 10:03:38 -06:00
Wouter Deconinck
904d85b53b fastjson: add v1.2.83, v2.0.52 (#45733)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-08-26 16:29:02 +02:00
Tamara Dahlgren
199653dd31 Bugfix/hsakmt-roct: use correct version for rocm-core and llvm-amdgpu (#45817) 2024-08-26 16:14:20 +02:00
Matthieu Dorier
fdfb4e9893 mruby: add v3.3.0 (#45964) 2024-08-26 15:33:00 +02:00
Adam J. Stewart
afa76ebbdc py-smp: add v0.3.4 (#45967) 2024-08-26 15:31:53 +02:00
Patrick Diehl
e5c045cc1c kokkos: add v4.4.00 (#45758)
Co-authored-by: Patrick Diehl <diehlpk@lanl.gov>
2024-08-26 15:28:35 +02:00
pauleonix
8c92836c39 cuda: add v12.6 (#45558) 2024-08-26 15:27:07 +02:00
Wouter Deconinck
a782e6bc33 r-googlesheets4: fix r-gargle dependency (#45980) 2024-08-26 14:37:00 +02:00
mvlopri
4ede0ae5e3 seacas: add parallel as a dependency (#45981) 2024-08-26 14:31:17 +02:00
Wouter Deconinck
986325eb0d r-pbkrtest: fix typo in dependency (#45997) 2024-08-26 14:25:01 +02:00
Wouter Deconinck
8bcd64ce6c r-diagram: fix dependency on non-existent R version (#46003) 2024-08-26 14:11:41 +02:00
Wouter Deconinck
f079ad3690 r-sf: deprecate unconcretizable 0.5-5 (#46016) 2024-08-26 14:10:25 +02:00
Juan Miguel Carceller
8c1d6188e3 gaudi: remove redundant dependency on cppgsl (#46029)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-08-26 14:09:48 +02:00
Todd Gamblin
1d70ab934c ci generate: don't warn when no cdash config (#46004)
Right now if you run `spack ci generate` you get a warning about CDash credentials even
if there's no CDash configuration specified. We should only warn if there was actually a
CDash config.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-08-26 13:58:41 +02:00
Kacper Kornet
fa704e867c polyml: add new package (#46020) 2024-08-26 13:42:30 +02:00
Kacper Kornet
85939b26ae mrbayes: readline and mpi variants are mutually exclusive (#46021) 2024-08-26 13:38:57 +02:00
Wouter Deconinck
a5436b3962 R: external detection (#46023)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-08-26 13:37:35 +02:00
James Shen
a8e25193e0 root: patch v6.22.08 (#46019) 2024-08-26 13:16:25 +02:00
Wouter Deconinck
480d6f9911 cppunit: add v1.15.1; deprecate custom commit version (#46026) 2024-08-26 13:15:29 +02:00
Harmen Stoppels
02f329a8af compilers: avoid redundant fs operations and cache (#46031) 2024-08-26 12:49:58 +02:00
Wouter Deconinck
2de712b35f netfilter pkgs: avoid 3rd party urls, add latest official version (#46027)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-08-26 12:41:11 +02:00
Wouter Deconinck
aa49b3d8ce lshw: add v02.20 (#46028) 2024-08-26 09:46:41 +02:00
Adam J. Stewart
eccecba39a Python: add v3.12.5, default to latest version (#45712)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-08-25 18:49:01 -07:00
Harmen Stoppels
94c99fc5d4 variant.py: extract spec bits into spec.py (#45941) 2024-08-24 09:45:23 +02:00
John W. Parent
1f1021a47f PythonExtension: use different bin dir on Windows (#45427) 2024-08-24 09:34:24 +02:00
421 changed files with 4909 additions and 3699 deletions


@@ -29,7 +29,7 @@ jobs:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages


@@ -63,7 +63,7 @@ jobs:
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: "3.12"
- name: Bootstrap clingo
@@ -112,10 +112,10 @@ jobs:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
- runner: ['macos-13', 'macos-14', "ubuntu-latest"]
+ runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
steps:
- name: Setup macOS
- if: ${{ matrix.runner != 'ubuntu-latest' }}
+ if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest'}}
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
@@ -124,11 +124,16 @@ jobs:
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
+ - name: Setup Windows
+ if: ${{ matrix.runner == 'windows-latest' }}
+ run: |
+ Remove-Item -Path (Get-Command gpg).Path
+ Remove-Item -Path (Get-Command file).Path
- name: Checkout
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: |
3.8
@@ -137,11 +142,20 @@ jobs:
3.11
3.12
- name: Set bootstrap sources
+ env:
+ SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
+ SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
run: |
+ ${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack bootstrap disable github-actions-v0.4
+ - name: Disable from source bootstrap
+ if: ${{ matrix.runner != 'windows-latest' }}
+ run: |
+ source share/spack/setup-env.sh
+ spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
- name: Bootstrap clingo
+ # No binary clingo on Windows yet
+ if: ${{ matrix.runner != 'windows-latest' }}
run: |
set -e
for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
@@ -164,7 +178,24 @@ jobs:
fi
done
- name: Bootstrap GnuPG
+ env:
+ SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
+ SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
+ USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
+ VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
- source share/spack/setup-env.sh
+ ${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack -d gpg list
- tree ~/.spack/bootstrap/store/
+ ${{ env.VALIDATE_LAST_EXIT }}
+ tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
+ - name: Bootstrap File
+ env:
+ SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
+ SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
+ USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
+ VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
+ run: |
+ ${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
+ spack -d python share/spack/qa/bootstrap-file.py
+ ${{ env.VALIDATE_LAST_EXIT }}
+ tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/


@@ -87,7 +87,7 @@ jobs:
fi
- name: Upload Dockerfile
- uses: actions/upload-artifact@834a144ee995460fba8ed112a2fc961b36a5ec5a
+ uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
@@ -126,7 +126,7 @@ jobs:
needs: deploy-images
steps:
- name: Merge Artifacts
- uses: actions/upload-artifact/merge@834a144ee995460fba8ed112a2fc961b36a5ec5a
+ uses: actions/upload-artifact/merge@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
name: dockerfiles
pattern: dockerfiles_*


@@ -17,7 +17,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: 3.9
- name: Install Python packages


@@ -43,7 +43,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -91,7 +91,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
- name: Install System packages
@@ -151,7 +151,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
- name: Install System packages
@@ -188,7 +188,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -225,7 +225,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: 3.9
- name: Install Python packages


@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- - uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+ - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
cache: 'pip'


@@ -72,3 +72,13 @@ packages:
permissions:
read: world
write: user
+ cray-mpich:
+ buildable: false
+ cray-mvapich2:
+ buildable: false
+ fujitsu-mpi:
+ buildable: false
+ hpcx-mpi:
+ buildable: false
+ spectrum-mpi:
+ buildable: false


@@ -863,7 +863,7 @@ named list ``compilers`` is ``['%gcc', '%clang', '%intel']`` on
spack:
definitions:
- compilers: ['%gcc', '%clang']
- - when: arch.satisfies('x86_64:')
+ - when: arch.satisfies('target=x86_64:')
compilers: ['%intel']
.. note::
@@ -893,8 +893,9 @@ The valid variables for a ``when`` clause are:
#. ``env``. The user environment (usually ``os.environ`` in Python).
- #. ``hostname``. The hostname of the system (if ``hostname`` is an
- executable in the user's PATH).
+ #. ``hostname``. The hostname of the system.
+ #. ``full_hostname``. The fully qualified hostname of the system.
^^^^^^^^^^^^^^^^^^^^^^^^
SpecLists as Constraints


@@ -1265,27 +1265,29 @@ def _distro_release_info(self) -> Dict[str, str]:
match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
else:
try:
- basenames = [
- basename
- for basename in os.listdir(self.etc_dir)
- if basename not in _DISTRO_RELEASE_IGNORE_BASENAMES
- and os.path.isfile(os.path.join(self.etc_dir, basename))
- ]
+ with os.scandir(self.etc_dir) as it:
+ etc_files = [
+ p.path for p in it
+ if p.is_file() and p.name not in _DISTRO_RELEASE_IGNORE_BASENAMES
+ ]
# We sort for repeatability in cases where there are multiple
# distro specific files; e.g. CentOS, Oracle, Enterprise all
# containing `redhat-release` on top of their own.
- basenames.sort()
+ etc_files.sort()
except OSError:
# This may occur when /etc is not readable but we can't be
# sure about the *-release files. Check common entries of
# /etc for information. If they turn out to not be there the
# error is handled in `_parse_distro_release_file()`.
- basenames = _DISTRO_RELEASE_BASENAMES
- for basename in basenames:
- match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
+ etc_files = [
+ os.path.join(self.etc_dir, basename)
+ for basename in _DISTRO_RELEASE_BASENAMES
+ ]
+ for filepath in etc_files:
+ match = _DISTRO_RELEASE_BASENAME_PATTERN.match(os.path.basename(filepath))
if match is None:
continue
- filepath = os.path.join(self.etc_dir, basename)
distro_info = self._parse_distro_release_file(filepath)
# The name is always present if the pattern matches.
if "name" not in distro_info:


@@ -231,96 +231,6 @@ def is_host_name(instance):
return True
- try:
- # The built-in `idna` codec only implements RFC 3890, so we go elsewhere.
- import idna
- except ImportError:
- pass
- else:
- @_checks_drafts(draft7="idn-hostname", raises=idna.IDNAError)
- def is_idn_host_name(instance):
- if not isinstance(instance, str_types):
- return True
- idna.encode(instance)
- return True
- try:
- import rfc3987
- except ImportError:
- try:
- from rfc3986_validator import validate_rfc3986
- except ImportError:
- pass
- else:
- @_checks_drafts(name="uri")
- def is_uri(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3986(instance, rule="URI")
- @_checks_drafts(
- draft6="uri-reference",
- draft7="uri-reference",
- raises=ValueError,
- )
- def is_uri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3986(instance, rule="URI_reference")
- else:
- @_checks_drafts(draft7="iri", raises=ValueError)
- def is_iri(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="IRI")
- @_checks_drafts(draft7="iri-reference", raises=ValueError)
- def is_iri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="IRI_reference")
- @_checks_drafts(name="uri", raises=ValueError)
- def is_uri(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="URI")
- @_checks_drafts(
- draft6="uri-reference",
- draft7="uri-reference",
- raises=ValueError,
- )
- def is_uri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="URI_reference")
- try:
- from strict_rfc3339 import validate_rfc3339
- except ImportError:
- try:
- from rfc3339_validator import validate_rfc3339
- except ImportError:
- validate_rfc3339 = None
- if validate_rfc3339:
- @_checks_drafts(name="date-time")
- def is_datetime(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3339(instance)
- @_checks_drafts(draft7="time")
- def is_time(instance):
- if not isinstance(instance, str_types):
- return True
- return is_datetime("1970-01-01T" + instance)
@_checks_drafts(name="regex", raises=re.error)
def is_regex(instance):
if not isinstance(instance, str_types):
@@ -340,86 +250,3 @@ def is_draft3_time(instance):
if not isinstance(instance, str_types):
return True
return datetime.datetime.strptime(instance, "%H:%M:%S")
- try:
- import webcolors
- except ImportError:
- pass
- else:
- def is_css_color_code(instance):
- return webcolors.normalize_hex(instance)
- @_checks_drafts(draft3="color", raises=(ValueError, TypeError))
- def is_css21_color(instance):
- if (
- not isinstance(instance, str_types) or
- instance.lower() in webcolors.css21_names_to_hex
- ):
- return True
- return is_css_color_code(instance)
- def is_css3_color(instance):
- if instance.lower() in webcolors.css3_names_to_hex:
- return True
- return is_css_color_code(instance)
- try:
- import jsonpointer
- except ImportError:
- pass
- else:
- @_checks_drafts(
- draft6="json-pointer",
- draft7="json-pointer",
- raises=jsonpointer.JsonPointerException,
- )
- def is_json_pointer(instance):
- if not isinstance(instance, str_types):
- return True
- return jsonpointer.JsonPointer(instance)
- # TODO: I don't want to maintain this, so it
- # needs to go either into jsonpointer (pending
- # https://github.com/stefankoegl/python-json-pointer/issues/34) or
- # into a new external library.
- @_checks_drafts(
- draft7="relative-json-pointer",
- raises=jsonpointer.JsonPointerException,
- )
- def is_relative_json_pointer(instance):
- # Definition taken from:
- # https://tools.ietf.org/html/draft-handrews-relative-json-pointer-01#section-3
- if not isinstance(instance, str_types):
- return True
- non_negative_integer, rest = [], ""
- for i, character in enumerate(instance):
- if character.isdigit():
- non_negative_integer.append(character)
- continue
- if not non_negative_integer:
- return False
- rest = instance[i:]
- break
- return (rest == "#") or jsonpointer.JsonPointer(rest)
- try:
- import uritemplate.exceptions
- except ImportError:
- pass
- else:
- @_checks_drafts(
- draft6="uri-template",
- draft7="uri-template",
- raises=uritemplate.exceptions.InvalidTemplate,
- )
- def is_uri_template(
- instance,
- template_validator=uritemplate.Validator().force_balanced_braces(),
- ):
- template = uritemplate.URITemplate(instance)
- return template_validator.validate(template)

lib/spack/external/patches/distro.patch (vendored new file, 45 lines)

@@ -0,0 +1,45 @@
diff --git a/lib/spack/external/_vendoring/distro/distro.py b/lib/spack/external/_vendoring/distro/distro.py
index 89e1868047..50c3b18d4d 100644
--- a/lib/spack/external/_vendoring/distro/distro.py
+++ b/lib/spack/external/_vendoring/distro/distro.py
@@ -1265,27 +1265,29 @@ def _distro_release_info(self) -> Dict[str, str]:
match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
else:
try:
- basenames = [
- basename
- for basename in os.listdir(self.etc_dir)
- if basename not in _DISTRO_RELEASE_IGNORE_BASENAMES
- and os.path.isfile(os.path.join(self.etc_dir, basename))
- ]
+ with os.scandir(self.etc_dir) as it:
+ etc_files = [
+ p.path for p in it
+ if p.is_file() and p.name not in _DISTRO_RELEASE_IGNORE_BASENAMES
+ ]
# We sort for repeatability in cases where there are multiple
# distro specific files; e.g. CentOS, Oracle, Enterprise all
# containing `redhat-release` on top of their own.
- basenames.sort()
+ etc_files.sort()
except OSError:
# This may occur when /etc is not readable but we can't be
# sure about the *-release files. Check common entries of
# /etc for information. If they turn out to not be there the
# error is handled in `_parse_distro_release_file()`.
- basenames = _DISTRO_RELEASE_BASENAMES
- for basename in basenames:
- match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
+ etc_files = [
+ os.path.join(self.etc_dir, basename)
+ for basename in _DISTRO_RELEASE_BASENAMES
+ ]
+
+ for filepath in etc_files:
+ match = _DISTRO_RELEASE_BASENAME_PATTERN.match(os.path.basename(filepath))
if match is None:
continue
- filepath = os.path.join(self.etc_dir, basename)
distro_info = self._parse_distro_release_file(filepath)
# The name is always present if the pattern matches.
if "name" not in distro_info:


@@ -13,3 +13,191 @@ index 6b630cdfbb..1791fe7fbf 100644
-__version__ = metadata.version("jsonschema")
+
+__version__ = "3.2.0"
diff --git a/lib/spack/external/_vendoring/jsonschema/_format.py b/lib/spack/external/_vendoring/jsonschema/_format.py
index 281a7cfcff..29061e3661 100644
--- a/lib/spack/external/_vendoring/jsonschema/_format.py
+++ b/lib/spack/external/_vendoring/jsonschema/_format.py
@@ -231,96 +231,6 @@ def is_host_name(instance):
return True
-try:
- # The built-in `idna` codec only implements RFC 3890, so we go elsewhere.
- import idna
-except ImportError:
- pass
-else:
- @_checks_drafts(draft7="idn-hostname", raises=idna.IDNAError)
- def is_idn_host_name(instance):
- if not isinstance(instance, str_types):
- return True
- idna.encode(instance)
- return True
-
-
-try:
- import rfc3987
-except ImportError:
- try:
- from rfc3986_validator import validate_rfc3986
- except ImportError:
- pass
- else:
- @_checks_drafts(name="uri")
- def is_uri(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3986(instance, rule="URI")
-
- @_checks_drafts(
- draft6="uri-reference",
- draft7="uri-reference",
- raises=ValueError,
- )
- def is_uri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3986(instance, rule="URI_reference")
-
-else:
- @_checks_drafts(draft7="iri", raises=ValueError)
- def is_iri(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="IRI")
-
- @_checks_drafts(draft7="iri-reference", raises=ValueError)
- def is_iri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="IRI_reference")
-
- @_checks_drafts(name="uri", raises=ValueError)
- def is_uri(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="URI")
-
- @_checks_drafts(
- draft6="uri-reference",
- draft7="uri-reference",
- raises=ValueError,
- )
- def is_uri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="URI_reference")
-
-
-try:
- from strict_rfc3339 import validate_rfc3339
-except ImportError:
- try:
- from rfc3339_validator import validate_rfc3339
- except ImportError:
- validate_rfc3339 = None
-
-if validate_rfc3339:
- @_checks_drafts(name="date-time")
- def is_datetime(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3339(instance)
-
- @_checks_drafts(draft7="time")
- def is_time(instance):
- if not isinstance(instance, str_types):
- return True
- return is_datetime("1970-01-01T" + instance)
-
-
@_checks_drafts(name="regex", raises=re.error)
def is_regex(instance):
if not isinstance(instance, str_types):
@@ -340,86 +250,3 @@ def is_draft3_time(instance):
if not isinstance(instance, str_types):
return True
return datetime.datetime.strptime(instance, "%H:%M:%S")
-
-
-try:
- import webcolors
-except ImportError:
- pass
-else:
- def is_css_color_code(instance):
- return webcolors.normalize_hex(instance)
-
- @_checks_drafts(draft3="color", raises=(ValueError, TypeError))
- def is_css21_color(instance):
- if (
- not isinstance(instance, str_types) or
- instance.lower() in webcolors.css21_names_to_hex
- ):
- return True
- return is_css_color_code(instance)
-
- def is_css3_color(instance):
- if instance.lower() in webcolors.css3_names_to_hex:
- return True
- return is_css_color_code(instance)
-
-
-try:
- import jsonpointer
-except ImportError:
- pass
-else:
- @_checks_drafts(
- draft6="json-pointer",
- draft7="json-pointer",
- raises=jsonpointer.JsonPointerException,
- )
- def is_json_pointer(instance):
- if not isinstance(instance, str_types):
- return True
- return jsonpointer.JsonPointer(instance)
-
- # TODO: I don't want to maintain this, so it
- # needs to go either into jsonpointer (pending
- # https://github.com/stefankoegl/python-json-pointer/issues/34) or
- # into a new external library.
- @_checks_drafts(
- draft7="relative-json-pointer",
- raises=jsonpointer.JsonPointerException,
- )
- def is_relative_json_pointer(instance):
- # Definition taken from:
- # https://tools.ietf.org/html/draft-handrews-relative-json-pointer-01#section-3
- if not isinstance(instance, str_types):
- return True
- non_negative_integer, rest = [], ""
- for i, character in enumerate(instance):
- if character.isdigit():
- non_negative_integer.append(character)
- continue
-
- if not non_negative_integer:
- return False
-
- rest = instance[i:]
- break
- return (rest == "#") or jsonpointer.JsonPointer(rest)
-
-
-try:
- import uritemplate.exceptions
-except ImportError:
- pass
-else:
- @_checks_drafts(
- draft6="uri-template",
- draft7="uri-template",
- raises=uritemplate.exceptions.InvalidTemplate,
- )
- def is_uri_template(
- instance,
- template_validator=uritemplate.Validator().force_balanced_braces(),
- ):
- template = uritemplate.URITemplate(instance)
- return template_validator.validate(template)


@@ -27,8 +27,6 @@
from llnl.util.lang import dedupe, memoized
from llnl.util.symlink import islink, readlink, resolve_link_target_relative_to_the_link, symlink
from spack.util.executable import Executable, which
from ..path import path_to_os_path, system_path_filter
if sys.platform != "win32":
@@ -53,7 +51,6 @@
"find_all_headers",
"find_libraries",
"find_system_libraries",
"fix_darwin_install_name",
"force_remove",
"force_symlink",
"getuid",
@@ -248,42 +245,6 @@ def path_contains_subdirectory(path, root):
return norm_path.startswith(norm_root)
@memoized
def file_command(*args):
"""Creates entry point to `file` system command with provided arguments"""
file_cmd = which("file", required=True)
for arg in args:
file_cmd.add_default_arg(arg)
return file_cmd
@memoized
def _get_mime_type():
"""Generate method to call `file` system command to aquire mime type
for a specified path
"""
if sys.platform == "win32":
# -h option (no-dereference) does not exist in Windows
return file_command("-b", "--mime-type")
else:
return file_command("-b", "-h", "--mime-type")
def mime_type(filename):
"""Returns the mime type and subtype of a file.
Args:
filename: file to be analyzed
Returns:
Tuple containing the MIME type and subtype
"""
output = _get_mime_type()(filename, output=str, error=str).strip()
tty.debug("==> " + output)
type, _, subtype = output.partition("/")
return type, subtype
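For reference, a minimal usage sketch of the relocated helper (the move target spack.util.filesystem is confirmed by the binary_distribution hunk further down; the sample result is illustrative):

    import spack.util.filesystem as ssys

    # `file -b [-h] --mime-type PATH` prints e.g. "text/x-shellscript";
    # mime_type() splits that on "/" into a (type, subtype) pair.
    m_type, m_subtype = ssys.mime_type("/bin/sh")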
#: This generates the library filenames that may appear on any OS.
library_extensions = ["a", "la", "so", "tbd", "dylib"]
@@ -1679,41 +1640,6 @@ def safe_remove(*files_or_dirs):
raise
@system_path_filter
def fix_darwin_install_name(path):
"""Fix install name of dynamic libraries on Darwin to have full path.
There are two parts of this task:
1. Use ``install_name('-id', ...)`` to change install name of a single lib
2. Use ``install_name('-change', ...)`` to change the cross linking between
libs. The function assumes that all libraries are in one folder and
currently won't follow subfolders.
Parameters:
path (str): directory in which .dylib files are located
"""
libs = glob.glob(join_path(path, "*.dylib"))
for lib in libs:
# fix install name first:
install_name_tool = Executable("install_name_tool")
install_name_tool("-id", lib, lib)
otool = Executable("otool")
long_deps = otool("-L", lib, output=str).split("\n")
deps = [dep.partition(" ")[0][1::] for dep in long_deps[2:-1]]
# fix all dependencies:
for dep in deps:
for loc in libs:
# We really want to check for either
# dep == os.path.basename(loc) or
# dep == join_path(builddir, os.path.basename(loc)),
# but we don't know builddir (nor how symbolic links look
# in builddir). We thus only compare the basenames.
if os.path.basename(dep) == os.path.basename(loc):
install_name_tool("-change", dep, loc, lib)
break
def find_first(root: str, files: Union[Iterable[str], str], bfs_depth: int = 2) -> Optional[str]:
"""Find the first file matching a pattern.

View File

@@ -6,7 +6,6 @@
import collections.abc
import contextlib
import functools
import inspect
import itertools
import os
import re
@@ -16,7 +15,7 @@
from typing import Any, Callable, Iterable, List, Tuple
# Ignore emacs backups when listing modules
ignore_modules = [r"^\.#", "~$"]
ignore_modules = r"^\.#|~$"
def index_by(objects, *funcs):
@@ -91,15 +90,6 @@ def attr_setdefault(obj, name, value):
return getattr(obj, name)
def has_method(cls, name):
for base in inspect.getmro(cls):
if base is object:
continue
if name in base.__dict__:
return True
return False
def union_dicts(*dicts):
"""Use update() to combine all dicts into one.
@@ -164,19 +154,22 @@ def list_modules(directory, **kwargs):
order."""
list_directories = kwargs.setdefault("directories", True)
for name in os.listdir(directory):
if name == "__init__.py":
continue
ignore = re.compile(ignore_modules)
path = os.path.join(directory, name)
if list_directories and os.path.isdir(path):
init_py = os.path.join(path, "__init__.py")
if os.path.isfile(init_py):
yield name
with os.scandir(directory) as it:
for entry in it:
if entry.name == "__init__.py" or entry.name == "__pycache__":
continue
elif name.endswith(".py"):
if not any(re.search(pattern, name) for pattern in ignore_modules):
yield re.sub(".py$", "", name)
if (
list_directories
and entry.is_dir()
and os.path.isfile(os.path.join(entry.path, "__init__.py"))
):
yield entry.name
elif entry.name.endswith(".py") and entry.is_file() and not ignore.search(entry.name):
yield entry.name[:-3] # strip .py
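Read on its own, the new scandir-based traversal can be sketched as a standalone function (stdlib only; the real code honors the module-level ignore_modules pattern shown above):

    import os
    import re

    def list_py_modules(directory):
        """Yield module names: packages (dirs with __init__.py) and .py files."""
        ignore = re.compile(r"^\.#|~$")  # emacs lock files / backups
        with os.scandir(directory) as it:
            for entry in it:
                if entry.name in ("__init__.py", "__pycache__"):
                    continue
                if entry.is_dir() and os.path.isfile(os.path.join(entry.path, "__init__.py")):
                    yield entry.name
                elif entry.name.endswith(".py") and entry.is_file() and not ignore.search(entry.name):
                    yield entry.name[:-3]  # strip .py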
def decorator_with_or_without_args(decorator):
@@ -223,8 +216,8 @@ def setter(name, value):
value.__name__ = name
setattr(cls, name, value)
if not has_method(cls, "_cmp_key"):
raise TypeError("'%s' doesn't define _cmp_key()." % cls.__name__)
if not hasattr(cls, "_cmp_key"):
raise TypeError(f"'{cls.__name__}' doesn't define _cmp_key().")
setter("__eq__", lambda s, o: (s is o) or (o is not None and s._cmp_key() == o._cmp_key()))
setter("__lt__", lambda s, o: o is not None and s._cmp_key() < o._cmp_key())
@@ -374,8 +367,8 @@ def cd_fun():
TypeError: If the class does not have a ``_cmp_iter`` method
"""
if not has_method(cls, "_cmp_iter"):
raise TypeError("'%s' doesn't define _cmp_iter()." % cls.__name__)
if not hasattr(cls, "_cmp_iter"):
raise TypeError(f"'{cls.__name__}' doesn't define _cmp_iter().")
# comparison operators are implemented in terms of lazy_eq and lazy_lt
def eq(self, other):
@@ -850,20 +843,19 @@ def uniq(sequence):
return uniq_list
def elide_list(line_list, max_num=10):
def elide_list(line_list: List[str], max_num: int = 10) -> List[str]:
"""Takes a long list and limits it to a smaller number of elements,
replacing intervening elements with '...'. For example::
elide_list([1,2,3,4,5,6], 4)
elide_list(["1", "2", "3", "4", "5", "6"], 4)
gives::
[1, 2, 3, '...', 6]
["1", "2", "3", "...", "6"]
"""
if len(line_list) > max_num:
return line_list[: max_num - 1] + ["..."] + line_list[-1:]
else:
return line_list
return [*line_list[: max_num - 1], "...", line_list[-1]]
return line_list
@contextlib.contextmanager

View File

@@ -1,131 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from llnl.util.lang import memoized
import spack.spec
import spack.version
from spack.compilers.clang import Clang
from spack.util.executable import Executable, ProcessError
class ABI:
"""This class provides methods to test ABI compatibility between specs.
The current implementation is rather rough and could be improved."""
def architecture_compatible(
self, target: spack.spec.Spec, constraint: spack.spec.Spec
) -> bool:
"""Return true if architecture of target spec is ABI compatible
to the architecture of constraint spec. If either the target
or constraint specs have no architecture, target is also defined
as architecture ABI compatible to constraint."""
return (
not target.architecture
or not constraint.architecture
or target.architecture.intersects(constraint.architecture)
)
@memoized
def _gcc_get_libstdcxx_version(self, version):
"""Returns gcc ABI compatibility info by getting the library version of
a compiler's libstdc++ or libgcc_s"""
from spack.build_environment import dso_suffix
spec = spack.spec.CompilerSpec("gcc", version)
compilers = spack.compilers.compilers_for_spec(spec)
if not compilers:
return None
compiler = compilers[0]
rungcc = None
libname = None
output = None
if compiler.cxx:
rungcc = Executable(compiler.cxx)
libname = "libstdc++." + dso_suffix
elif compiler.cc:
rungcc = Executable(compiler.cc)
libname = "libgcc_s." + dso_suffix
else:
return None
try:
# Some gcc's are actually clang and don't respond properly to
# --print-file-name (they just print the filename, not the
# full path). Ignore these and expect them to be handled as clang.
if Clang.default_version(rungcc.exe[0]) != "unknown":
return None
output = rungcc("--print-file-name=%s" % libname, output=str)
except ProcessError:
return None
if not output:
return None
libpath = os.path.realpath(output.strip())
if not libpath:
return None
return os.path.basename(libpath)
@memoized
def _gcc_compiler_compare(self, pversion, cversion):
"""Returns true iff the gcc version pversion and cversion
are ABI compatible."""
plib = self._gcc_get_libstdcxx_version(pversion)
clib = self._gcc_get_libstdcxx_version(cversion)
if not plib or not clib:
return False
return plib == clib
def _intel_compiler_compare(
self, pversion: spack.version.ClosedOpenRange, cversion: spack.version.ClosedOpenRange
) -> bool:
"""Returns true iff the intel version pversion and cversion
are ABI compatible"""
# Test major and minor versions. Ignore build version.
pv = pversion.lo
cv = cversion.lo
return pv.up_to(2) == cv.up_to(2)
def compiler_compatible(
self, parent: spack.spec.Spec, child: spack.spec.Spec, loose: bool = False
) -> bool:
"""Return true if compilers for parent and child are ABI compatible."""
if not parent.compiler or not child.compiler:
return True
if parent.compiler.name != child.compiler.name:
# Different compiler families are assumed ABI incompatible
return False
if loose:
return True
# TODO: Can we move the specialized ABI matching stuff
# TODO: into compiler classes?
for pversion in parent.compiler.versions:
for cversion in child.compiler.versions:
# For a few compilers use specialized comparisons.
# Otherwise match on version match.
if pversion.intersects(cversion):
return True
elif parent.compiler.name == "gcc" and self._gcc_compiler_compare(
pversion, cversion
):
return True
elif parent.compiler.name == "intel" and self._intel_compiler_compare(
pversion, cversion
):
return True
return False
def compatible(
self, target: spack.spec.Spec, constraint: spack.spec.Spec, loose: bool = False
) -> bool:
"""Returns true if target spec is ABI compatible to constraint spec"""
return self.architecture_compatible(target, constraint) and self.compiler_compatible(
target, constraint, loose=loose
)

View File

@@ -39,7 +39,6 @@ def _search_duplicate_compilers(error_cls):
import collections
import collections.abc
import glob
import inspect
import io
import itertools
import os
@@ -258,40 +257,6 @@ def _search_duplicate_specs_in_externals(error_cls):
return errors
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.23)"""
# TODO (v0.23): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
def make_error(attribute_name, config_data, summary):
s = io.StringIO()
s.write("Occurring in the following file:\n")
dict_view = syaml.syaml_dict((k, v) for k, v in config_data.items() if k == attribute_name)
syaml.dump_config(dict_view, stream=s, blame=True)
return error_cls(summary=summary, details=[s.getvalue()])
if "all" in packages_yaml and "version" in packages_yaml["all"]:
summary = "Using the deprecated 'version' attribute under 'packages:all'"
errors.append(make_error("version", packages_yaml["all"], summary))
for package_name in packages_yaml:
if package_name == "all":
continue
package_conf = packages_yaml[package_name]
for attribute in ("compiler", "providers", "target"):
if attribute not in package_conf:
continue
summary = (
f"Using the deprecated '{attribute}' attribute " f"under 'packages:{package_name}'"
)
errors.append(make_error(attribute, package_conf, summary))
return errors
@config_packages
def _avoid_mismatched_variants(error_cls):
"""Warns if variant preferences have mismatched types or names."""
@@ -525,7 +490,7 @@ def _search_for_reserved_attributes_names_in_packages(pkgs, error_cls):
name_definitions = collections.defaultdict(list)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for cls_item in inspect.getmro(pkg_cls):
for cls_item in pkg_cls.__mro__:
for name in RESERVED_NAMES:
current_value = cls_item.__dict__.get(name)
if current_value is None:

View File

@@ -54,6 +54,7 @@
import spack.util.archive
import spack.util.crypto
import spack.util.file_cache as file_cache
import spack.util.filesystem as ssys
import spack.util.gpg
import spack.util.parallel
import spack.util.path
@@ -105,7 +106,7 @@ class BuildCacheDatabase(spack_db.Database):
record_fields = ("spec", "ref_count", "in_buildcache")
def __init__(self, root):
super().__init__(root, lock_cfg=spack_db.NO_LOCK)
super().__init__(root, lock_cfg=spack_db.NO_LOCK, layout=None)
self._write_transaction_impl = llnl.util.lang.nullcontext
self._read_transaction_impl = llnl.util.lang.nullcontext
@@ -687,7 +688,7 @@ def get_buildfile_manifest(spec):
# Non-symlinks.
for rel_path in visitor.files:
abs_path = os.path.join(root, rel_path)
m_type, m_subtype = fsys.mime_type(abs_path)
m_type, m_subtype = ssys.mime_type(abs_path)
if relocate.needs_binary_relocation(m_type, m_subtype):
# Why is this branch not part of needs_binary_relocation? :(
@@ -788,7 +789,9 @@ def sign_specfile(key: str, specfile_path: str) -> str:
return signed_specfile_path
def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_dir, concurrency):
def _read_specs_and_push_index(
file_list, read_method, cache_prefix, db: BuildCacheDatabase, temp_dir, concurrency
):
"""Read all the specs listed in the provided list, using thread given thread parallelism,
generate the index, and push it to the mirror.
@@ -812,7 +815,7 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
else:
continue
db.add(fetched_spec, None)
db.add(fetched_spec)
db.mark(fetched_spec, "in_buildcache", True)
# Now generate the index, compute its hash, and push the two files to
@@ -1765,7 +1768,7 @@ def _oci_update_index(
for spec_dict in spec_dicts:
spec = Spec.from_dict(spec_dict)
db.add(spec, directory_layout=None)
db.add(spec)
db.mark(spec, "in_buildcache", True)
# Create the index.json file
@@ -2561,9 +2564,8 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
with spack.util.path.filter_padding():
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, force)
spec.package.windows_establish_runtime_linkage()
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout)
spack.store.STORE.db.add(spec)
def install_single_spec(spec, unsigned=False, force=False):

View File

@@ -9,6 +9,7 @@
all_core_root_specs,
ensure_clingo_importable_or_raise,
ensure_core_dependencies,
ensure_file_in_path_or_raise,
ensure_gpg_in_path_or_raise,
ensure_patchelf_in_path_or_raise,
)
@@ -19,6 +20,7 @@
"is_bootstrapping",
"ensure_bootstrap_configuration",
"ensure_core_dependencies",
"ensure_file_in_path_or_raise",
"ensure_gpg_in_path_or_raise",
"ensure_clingo_importable_or_raise",
"ensure_patchelf_in_path_or_raise",

View File

@@ -152,7 +152,7 @@ def _ensure_bootstrap_configuration() -> Generator:
bootstrap_store_path = store_path()
user_configuration = _read_and_sanitize_configuration()
with spack.environment.no_active_environment():
with spack.platforms.prevent_cray_detection(), spack.platforms.use_platform(
with spack.platforms.use_platform(
spack.platforms.real_host()
), spack.repo.use_repositories(spack.paths.packages_path):
# Default configuration scopes excluding command line

View File

@@ -472,7 +472,8 @@ def ensure_clingo_importable_or_raise() -> None:
def gnupg_root_spec() -> str:
"""Return the root spec used to bootstrap GnuPG"""
return _root_spec("gnupg@2.3:")
root_spec_name = "win-gpg" if IS_WINDOWS else "gnupg"
return _root_spec(f"{root_spec_name}@2.3:")
def ensure_gpg_in_path_or_raise() -> None:
@@ -482,6 +483,19 @@ def ensure_gpg_in_path_or_raise() -> None:
)
def file_root_spec() -> str:
"""Return the root spec used to bootstrap file"""
root_spec_name = "win-file" if IS_WINDOWS else "file"
return _root_spec(root_spec_name)
def ensure_file_in_path_or_raise() -> None:
"""Ensure file is in the PATH or raise"""
return ensure_executables_in_path_or_raise(
executables=["file"], abstract_spec=file_root_spec()
)
def patchelf_root_spec() -> str:
"""Return the root spec used to bootstrap patchelf"""
# 0.13.1 is the last version not to require C++17.
@@ -565,14 +579,15 @@ def ensure_core_dependencies() -> None:
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":
ensure_patchelf_in_path_or_raise()
if not IS_WINDOWS:
ensure_gpg_in_path_or_raise()
elif sys.platform == "win32":
ensure_file_in_path_or_raise()
ensure_gpg_in_path_or_raise()
ensure_clingo_importable_or_raise()
def all_core_root_specs() -> List[str]:
"""Return a list of all the core root specs that may be used to bootstrap Spack"""
return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec()]
return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec(), file_root_spec()]
def bootstrapping_sources(scope: Optional[str] = None):

View File

@@ -88,7 +88,7 @@ def _core_requirements() -> List[RequiredResponseType]:
def _buildcache_requirements() -> List[RequiredResponseType]:
_buildcache_exes = {
"file": _missing("file", "required to analyze files for buildcaches"),
"file": _missing("file", "required to analyze files for buildcaches", system_only=False),
("gpg2", "gpg"): _missing("gpg2", "required to sign/verify buildcaches", False),
}
if platform.system().lower() == "darwin":

View File

@@ -1003,7 +1003,6 @@ def set_all_package_py_globals(self):
"""Set the globals in modules of package.py files."""
for dspec, flag in chain(self.external, self.nonexternal):
pkg = dspec.package
if self.should_set_package_py_globals & flag:
if self.context == Context.BUILD and self.needs_build_context & flag:
set_package_py_globals(pkg, context=Context.BUILD)
@@ -1011,6 +1010,12 @@ def set_all_package_py_globals(self):
# This includes runtime dependencies, also runtime deps of direct build deps.
set_package_py_globals(pkg, context=Context.RUN)
# Looping over the set of packages a second time
# ensures all globals are loaded into the module space prior to
# any package setup. This guarantees package setup methods have
# access to expected module level definitions such as "spack_cc"
for dspec, flag in chain(self.external, self.nonexternal):
pkg = dspec.package
for spec in dspec.dependents():
# Note: some specs have dependents that are unreachable from the root, so avoid
# setting globals for those.
@@ -1553,21 +1558,21 @@ class ModuleChangePropagator:
_PROTECTED_NAMES = ("package", "current_module", "modules_in_mro", "_set_attributes")
def __init__(self, package):
def __init__(self, package: spack.package_base.PackageBase) -> None:
self._set_self_attributes("package", package)
self._set_self_attributes("current_module", package.module)
#: Modules for the classes in the MRO up to PackageBase
modules_in_mro = []
for cls in inspect.getmro(type(package)):
module = cls.module
for cls in package.__class__.__mro__:
module = getattr(cls, "module", None)
if module == self.current_module:
continue
if module == spack.package_base:
if module is None or module is spack.package_base:
break
if module is self.current_module:
continue
modules_in_mro.append(module)
self._set_self_attributes("modules_in_mro", modules_in_mro)
self._set_self_attributes("_set_attributes", {})

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
import os.path
import stat
@@ -549,13 +548,12 @@ def autoreconf(self, pkg, spec, prefix):
tty.warn("* a custom AUTORECONF phase in the package *")
tty.warn("*********************************************************")
with fs.working_dir(self.configure_directory):
m = inspect.getmodule(self.pkg)
# This line is what is needed most of the time
# --install, --verbose, --force
autoreconf_args = ["-ivf"]
autoreconf_args += self.autoreconf_search_path_args
autoreconf_args += self.autoreconf_extra_args
m.autoreconf(*autoreconf_args)
self.pkg.module.autoreconf(*autoreconf_args)
@property
def autoreconf_search_path_args(self):
@@ -579,7 +577,9 @@ def set_configure_or_die(self):
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
inspect.getmodule(self.pkg).configure = Executable(self.configure_abs_path)
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self):
"""Return the list of all the arguments that must be passed to configure,
@@ -596,7 +596,7 @@ def configure(self, pkg, spec, prefix):
options += self.configure_args()
with fs.working_dir(self.build_directory, create=True):
inspect.getmodule(self.pkg).configure(*options)
pkg.module.configure(*options)
def build(self, pkg, spec, prefix):
"""Run "make" on the build targets specified by the builder."""
@@ -604,12 +604,12 @@ def build(self, pkg, spec, prefix):
params = ["V=1"]
params += self.build_targets
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*params)
pkg.module.make(*params)
def install(self, pkg, spec, prefix):
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*self.install_targets)
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)

View File

@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
@@ -72,9 +70,7 @@ def check_args(self):
def build(self, pkg, spec, prefix):
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).cargo(
"install", "--root", "out", "--path", ".", *self.build_args
)
pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
def install(self, pkg, spec, prefix):
"""Copy build files into package prefix."""
@@ -86,4 +82,4 @@ def install(self, pkg, spec, prefix):
def check(self):
"""Run "cargo test"."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).cargo("test", *self.check_args)
self.pkg.module.cargo("test", *self.check_args)

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
import inspect
import os
import pathlib
import platform
@@ -108,6 +107,11 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
if _supports_compilation_databases(pkg):
args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
# Enable MACOSX_RPATH by default when cmake_minimum_required < 3
# https://cmake.org/cmake/help/latest/policy/CMP0042.html
if pkg.spec.satisfies("platform=darwin") and cmake.satisfies("@3:"):
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
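Hedged illustration of what this default adds to the generated command line on macOS (the `:STRING` type hint is assumed from how Spack's define() helper renders string values):

    # appended to the cmake invocation on platform=darwin:
    #   cmake .. -DCMAKE_POLICY_DEFAULT_CMP0042:STRING=NEW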
def generator(*names: str, default: Optional[str] = None):
"""The build system generator to use.
@@ -539,24 +543,24 @@ def cmake(self, pkg, spec, prefix):
options += self.cmake_args()
options.append(os.path.abspath(self.root_cmakelists_dir))
with fs.working_dir(self.build_directory, create=True):
inspect.getmodule(self.pkg).cmake(*options)
pkg.module.cmake(*options)
def build(self, pkg, spec, prefix):
"""Make the build targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
inspect.getmodule(self.pkg).make(*self.build_targets)
pkg.module.make(*self.build_targets)
elif self.generator == "Ninja":
self.build_targets.append("-v")
inspect.getmodule(self.pkg).ninja(*self.build_targets)
pkg.module.ninja(*self.build_targets)
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
inspect.getmodule(self.pkg).make(*self.install_targets)
pkg.module.make(*self.install_targets)
elif self.generator == "Ninja":
inspect.getmodule(self.pkg).ninja(*self.install_targets)
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)

View File

@@ -138,7 +138,7 @@ def cuda_flags(arch_list):
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.5")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.6")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
@@ -146,6 +146,7 @@ def cuda_flags(arch_list):
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.5")
conflicts("%clang@19:", when="+cuda ^cuda@:12.6")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
@@ -82,7 +80,7 @@ def check_args(self):
def build(self, pkg, spec, prefix):
"""Runs ``go build`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).go("build", *self.build_args)
pkg.module.go("build", *self.build_args)
def install(self, pkg, spec, prefix):
"""Install built binaries into prefix bin."""
@@ -95,4 +93,4 @@ def install(self, pkg, spec, prefix):
def check(self):
"""Run ``go test .`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).go("test", *self.check_args)
self.pkg.module.go("test", *self.check_args)

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List
import llnl.util.filesystem as fs
@@ -103,12 +102,12 @@ def edit(self, pkg, spec, prefix):
def build(self, pkg, spec, prefix):
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*self.build_targets)
pkg.module.make(*self.build_targets)
def install(self, pkg, spec, prefix):
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*self.install_targets)
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
from typing import List
@@ -195,19 +194,19 @@ def meson(self, pkg, spec, prefix):
options += self.std_meson_args
options += self.meson_args()
with fs.working_dir(self.build_directory, create=True):
inspect.getmodule(self.pkg).meson(*options)
pkg.module.meson(*options)
def build(self, pkg, spec, prefix):
"""Make the build targets"""
options = ["-v"]
options += self.build_targets
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).ninja(*options)
pkg.module.ninja(*options)
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).ninja(*self.install_targets)
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List # novm
import llnl.util.filesystem as fs
@@ -104,7 +103,7 @@ def msbuild_install_args(self):
def build(self, pkg, spec, prefix):
"""Run "msbuild" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).msbuild(
pkg.module.msbuild(
*self.std_msbuild_args,
*self.msbuild_args(),
self.define_targets(*self.build_targets),
@@ -114,6 +113,6 @@ def install(self, pkg, spec, prefix):
"""Run "msbuild" on the install targets specified by the builder.
This is INSTALL by default"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).msbuild(
pkg.module.msbuild(
*self.msbuild_install_args(), self.define_targets(*self.install_targets)
)

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List # novm
import llnl.util.filesystem as fs
@@ -132,9 +131,7 @@ def build(self, pkg, spec, prefix):
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.build_targets, ignore_quotes=self.ignore_quotes
)
pkg.module.nmake(*opts, *self.build_targets, ignore_quotes=self.ignore_quotes)
def install(self, pkg, spec, prefix):
"""Run "nmake" on the install targets specified by the builder.
@@ -146,6 +143,4 @@ def install(self, pkg, spec, prefix):
opts.append("/F{}".format(self.makefile_name))
opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.install_targets, ignore_quotes=self.ignore_quotes
)
pkg.module.nmake(*opts, *self.install_targets, ignore_quotes=self.ignore_quotes)

View File

@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
@@ -47,7 +45,7 @@ class OctaveBuilder(BaseBuilder):
def install(self, pkg, spec, prefix):
"""Install the package from the archive file"""
inspect.getmodule(self.pkg).octave(
pkg.module.octave(
"--quiet",
"--norc",
"--built-in-docstrings-file=/dev/null",

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
from typing import Iterable
@@ -134,7 +133,7 @@ def build_method(self):
def build_executable(self):
"""Returns the executable method to build the perl package"""
if self.build_method == "Makefile.PL":
build_executable = inspect.getmodule(self.pkg).make
build_executable = self.pkg.module.make
elif self.build_method == "Build.PL":
build_executable = Executable(os.path.join(self.pkg.stage.source_path, "Build"))
return build_executable
@@ -158,7 +157,7 @@ def configure(self, pkg, spec, prefix):
options = ["Build.PL", "--install_base", prefix]
options += self.configure_args()
inspect.getmodule(self.pkg).perl(*options)
pkg.module.perl(*options)
# It is possible that the shebang in the Build script that is created from
# Build.PL may be too long causing the build to fail. Patching the shebang

View File

@@ -4,7 +4,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import functools
import inspect
import operator
import os
import re
@@ -17,7 +16,7 @@
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList
from llnl.util.filesystem import HeaderList, LibraryList, join_path
import spack.builder
import spack.config
@@ -120,6 +119,12 @@ def skip_modules(self) -> Iterable[str]:
"""
return []
@property
def bindir(self) -> str:
"""Path to Python package's bindir, bin on unix like OS's Scripts on Windows"""
windows = self.spec.satisfies("platform=windows")
return join_path(self.spec.prefix, "Scripts" if windows else "bin")
def view_file_conflicts(self, view, merge_map):
"""Report all file conflicts, excepting special cases for python.
Specifically, this does not report errors for duplicate
@@ -222,7 +227,7 @@ def test_imports(self) -> None:
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python # type: ignore[union-attr]
python = self.module.python
for module in self.import_modules:
with test_part(
self,
@@ -309,9 +314,9 @@ def get_external_python_for_prefix(self):
)
python_externals_detected = [
d.spec
for d in python_externals_detection.get("python", [])
if d.prefix == self.spec.external_path
spec
for spec in python_externals_detection.get("python", [])
if spec.external_path == self.spec.external_path
]
if python_externals_detected:
return python_externals_detected[0]

View File

@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from llnl.util.filesystem import working_dir
import spack.builder
@@ -66,17 +64,17 @@ def qmake_args(self):
def qmake(self, pkg, spec, prefix):
"""Run ``qmake`` to configure the project and generate a Makefile."""
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).qmake(*self.qmake_args())
pkg.module.qmake(*self.qmake_args())
def build(self, pkg, spec, prefix):
"""Make the build targets"""
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make()
pkg.module.make()
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make("install")
pkg.module.make("install")
def check(self):
"""Search the Makefile for a ``check:`` target and runs it if found."""

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import Optional, Tuple
import llnl.util.lang as lang
@@ -51,7 +50,7 @@ def install(self, pkg, spec, prefix):
args.extend(["--library={0}".format(self.pkg.module.r_lib_dir), self.stage.source_path])
inspect.getmodule(self.pkg).R(*args)
pkg.module.R(*args)
class RPackage(Package):

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import inspect
import spack.builder
import spack.package_base
@@ -52,10 +51,10 @@ def build(self, pkg, spec, prefix):
gemspecs = glob.glob("*.gemspec")
rakefiles = glob.glob("Rakefile")
if gemspecs:
inspect.getmodule(self.pkg).gem("build", "--norc", gemspecs[0])
pkg.module.gem("build", "--norc", gemspecs[0])
elif rakefiles:
jobs = inspect.getmodule(self.pkg).make_jobs
inspect.getmodule(self.pkg).rake("package", "-j{0}".format(jobs))
jobs = pkg.module.make_jobs
pkg.module.rake("package", "-j{0}".format(jobs))
else:
# Some Ruby packages only ship `*.gem` files, so nothing to build
pass
@@ -70,6 +69,6 @@ def install(self, pkg, spec, prefix):
# if --install-dir is not used, GEM_PATH is deleted from the
# environment, and Gems required to build native extensions will
# not be found. Those extensions are built during `gem install`.
inspect.getmodule(self.pkg).gem(
pkg.module.gem(
"install", "--norc", "--ignore-dependencies", "--install-dir", prefix, gems[0]
)

View File

@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on
@@ -63,8 +61,7 @@ def build_args(self, spec, prefix):
def build(self, pkg, spec, prefix):
"""Build the package."""
args = self.build_args(spec, prefix)
inspect.getmodule(self.pkg).scons(*args)
pkg.module.scons(*self.build_args(spec, prefix))
def install_args(self, spec, prefix):
"""Arguments to pass to install."""
@@ -72,9 +69,7 @@ def install_args(self, spec, prefix):
def install(self, pkg, spec, prefix):
"""Install the package."""
args = self.install_args(spec, prefix)
inspect.getmodule(self.pkg).scons("install", *args)
pkg.module.scons("install", *self.install_args(spec, prefix))
def build_test(self):
"""Run unit tests after build.

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
import re
@@ -86,14 +85,13 @@ def import_modules(self):
def python(self, *args, **kwargs):
"""The python ``Executable``."""
inspect.getmodule(self).python(*args, **kwargs)
self.pkg.module.python(*args, **kwargs)
def test_imports(self):
"""Attempts to import modules of the installed package."""
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python
for module in self.import_modules:
with spack.install_test.test_part(
self,
@@ -101,7 +99,7 @@ def test_imports(self):
purpose="checking import of {0}".format(module),
work_dir="spack-test",
):
python("-c", "import {0}".format(module))
self.python("-c", "import {0}".format(module))
@spack.builder.builder("sip")
@@ -136,7 +134,7 @@ def configure(self, pkg, spec, prefix):
"""Configure the package."""
# https://www.riverbankcomputing.com/static/Docs/sip/command_line_tools.html
args = ["--verbose", "--target-dir", inspect.getmodule(self.pkg).python_platlib]
args = ["--verbose", "--target-dir", pkg.module.python_platlib]
args.extend(self.configure_args())
# https://github.com/Python-SIP/sip/commit/cb0be6cb6e9b756b8b0db3136efb014f6fb9b766
@@ -155,7 +153,7 @@ def build(self, pkg, spec, prefix):
args = self.build_args()
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*args)
pkg.module.make(*args)
def build_args(self):
"""Arguments to pass to build."""
@@ -166,7 +164,7 @@ def install(self, pkg, spec, prefix):
args = self.install_args()
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make("install", *args)
pkg.module.make("install", *args)
def install_args(self):
"""Arguments to pass to install."""

View File

@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from llnl.util.filesystem import working_dir
import spack.builder
@@ -90,11 +88,11 @@ def build_directory(self):
def python(self, *args, **kwargs):
"""The python ``Executable``."""
inspect.getmodule(self.pkg).python(*args, **kwargs)
self.pkg.module.python(*args, **kwargs)
def waf(self, *args, **kwargs):
"""Runs the waf ``Executable``."""
jobs = inspect.getmodule(self.pkg).make_jobs
jobs = self.pkg.module.make_jobs
with working_dir(self.build_directory):
self.python("waf", "-j{0}".format(jobs), *args, **kwargs)

View File

@@ -6,7 +6,6 @@
import collections.abc
import copy
import functools
import inspect
from typing import List, Optional, Tuple
from llnl.util import lang
@@ -97,11 +96,10 @@ class hierarchy (look at AspellDictPackage for an example of that)
Args:
pkg (spack.package_base.PackageBase): package object for which we need a builder
"""
package_module = inspect.getmodule(pkg)
package_buildsystem = buildsystem_name(pkg)
default_builder_cls = BUILDER_CLS[package_buildsystem]
builder_cls_name = default_builder_cls.__name__
builder_cls = getattr(package_module, builder_cls_name, None)
builder_cls = getattr(pkg.module, builder_cls_name, None)
if builder_cls:
return builder_cls(pkg)

View File

@@ -1110,7 +1110,8 @@ def main_script_replacements(cmd):
cdash_handler.populate_buildgroup(all_job_names)
except (SpackError, HTTPError, URLError, TimeoutError) as err:
tty.warn(f"Problem populating buildgroup: {err}")
else:
elif cdash_config:
# warn only if there was actually a CDash configuration.
tty.warn("Unable to populate buildgroup without CDash credentials")
service_job_retries = {

View File

@@ -460,7 +460,7 @@ def push_fn(args):
"The following {} specs were skipped as they already exist in the buildcache:\n"
" {}\n"
" Use --force to overwrite them.".format(
len(skipped), ", ".join(elide_list(skipped, 5))
len(skipped), ", ".join(elide_list([_format_spec(s) for s in skipped], 5))
)
)

View File

@@ -7,6 +7,7 @@
import copy
import os
import re
import shlex
import sys
from argparse import ArgumentParser, Namespace
from typing import IO, Any, Callable, Dict, Iterable, List, Optional, Sequence, Set, Tuple, Union
@@ -18,6 +19,7 @@
import spack.cmd
import spack.main
import spack.paths
import spack.platforms
from spack.main import section_descriptions
description = "list available spack commands"
@@ -139,7 +141,7 @@ def usage(self, usage: str) -> str:
cmd = self.parser.prog.replace(" ", "-")
if cmd in self.documented:
string += "\n:ref:`More documentation <cmd-{0}>`\n".format(cmd)
string = f"{string}\n:ref:`More documentation <cmd-{cmd}>`\n"
return string
@@ -249,33 +251,27 @@ def body(
Function body.
"""
if positionals:
return """
return f"""
if $list_options
then
{0}
{self.optionals(optionals)}
else
{1}
{self.positionals(positionals)}
fi
""".format(
self.optionals(optionals), self.positionals(positionals)
)
"""
elif subcommands:
return """
return f"""
if $list_options
then
{0}
{self.optionals(optionals)}
else
{1}
{self.subcommands(subcommands)}
fi
""".format(
self.optionals(optionals), self.subcommands(subcommands)
)
"""
else:
return """
{0}
""".format(
self.optionals(optionals)
)
return f"""
{self.optionals(optionals)}
"""
def positionals(self, positionals: Sequence[str]) -> str:
"""Return the syntax for reporting positional arguments.
@@ -304,7 +300,7 @@ def optionals(self, optionals: Sequence[str]) -> str:
Returns:
Syntax for optional flags.
"""
return 'SPACK_COMPREPLY="{0}"'.format(" ".join(optionals))
return f'SPACK_COMPREPLY="{" ".join(optionals)}"'
def subcommands(self, subcommands: Sequence[str]) -> str:
"""Return the syntax for reporting subcommands.
@@ -315,7 +311,7 @@ def subcommands(self, subcommands: Sequence[str]) -> str:
Returns:
Syntax for subcommand parsers
"""
return 'SPACK_COMPREPLY="{0}"'.format(" ".join(subcommands))
return f'SPACK_COMPREPLY="{" ".join(subcommands)}"'
# Map argument destination names to their complete commands
@@ -395,7 +391,7 @@ def _fish_dest_get_complete(prog: str, dest: str) -> Optional[str]:
subcmd = s[1] if len(s) == 2 else ""
for (prog_key, pos_key), value in _dest_to_fish_complete.items():
if subcmd.startswith(prog_key) and re.match("^" + pos_key + "$", dest):
if subcmd.startswith(prog_key) and re.match(f"^{pos_key}$", dest):
return value
return None
@@ -427,24 +423,6 @@ def format(self, cmd: Command) -> str:
+ self.complete(cmd.prog, positionals, optionals, subcommands)
)
def _quote(self, string: str) -> str:
"""Quote string and escape special characters if necessary.
Args:
string: Input string.
Returns:
Quoted string.
"""
# Goal here is to match fish_indent behavior
# Strings without spaces (or other special characters) do not need to be escaped
if not any([sub in string for sub in [" ", "'", '"']]):
return string
string = string.replace("'", r"\'")
return f"'{string}'"
def optspecs(
self,
prog: str,
@@ -463,7 +441,7 @@ def optspecs(
optspec_var = "__fish_spack_optspecs_" + prog.replace(" ", "_").replace("-", "_")
if optionals is None:
return "set -g %s\n" % optspec_var
return f"set -g {optspec_var}\n"
# Build optspec by iterating over options
args = []
@@ -490,11 +468,11 @@ def optspecs(
long = [f[2:] for f in flags if f.startswith("--")]
while len(short) > 0 and len(long) > 0:
arg = "%s/%s%s" % (short.pop(), long.pop(), required)
arg = f"{short.pop()}/{long.pop()}{required}"
while len(short) > 0:
arg = "%s/%s" % (short.pop(), required)
arg = f"{short.pop()}/{required}"
while len(long) > 0:
arg = "%s%s" % (long.pop(), required)
arg = f"{long.pop()}{required}"
args.append(arg)
@@ -503,7 +481,7 @@ def optspecs(
# indicate that such subcommand exists.
args = " ".join(args)
return "set -g %s %s\n" % (optspec_var, args)
return f"set -g {optspec_var} {args}\n"
@staticmethod
def complete_head(
@@ -524,12 +502,14 @@ def complete_head(
subcmd = s[1] if len(s) == 2 else ""
if index is None:
return "complete -c %s -n '__fish_spack_using_command %s'" % (s[0], subcmd)
return f"complete -c {s[0]} -n '__fish_spack_using_command {subcmd}'"
elif nargs in [argparse.ZERO_OR_MORE, argparse.ONE_OR_MORE, argparse.REMAINDER]:
head = "complete -c %s -n '__fish_spack_using_command_pos_remainder %d %s'"
return (
f"complete -c {s[0]} -n '__fish_spack_using_command_pos_remainder "
f"{index} {subcmd}'"
)
else:
head = "complete -c %s -n '__fish_spack_using_command_pos %d %s'"
return head % (s[0], index, subcmd)
return f"complete -c {s[0]} -n '__fish_spack_using_command_pos {index} {subcmd}'"
def complete(
self,
@@ -597,25 +577,18 @@ def positionals(
if choices is not None:
# If there are choices, we provide a completion for all possible values.
commands.append(head + " -f -a %s" % self._quote(" ".join(choices)))
commands.append(f"{head} -f -a {shlex.quote(' '.join(choices))}")
else:
# Otherwise, we try to find a predefined completion for it
value = _fish_dest_get_complete(prog, args)
if value is not None:
commands.append(head + " " + value)
commands.append(f"{head} {value}")
return "\n".join(commands) + "\n"
def prog_comment(self, prog: str) -> str:
"""Return a comment line for the command.
Args:
prog: Program name.
Returns:
Comment line.
"""
return "\n# %s\n" % prog
"""Return a comment line for the command."""
return f"\n# {prog}\n"
def optionals(
self,
@@ -658,28 +631,28 @@ def optionals(
for f in flags:
if f.startswith("--"):
long = f[2:]
prefix += " -l %s" % long
prefix = f"{prefix} -l {long}"
elif f.startswith("-"):
short = f[1:]
assert len(short) == 1
prefix += " -s %s" % short
prefix = f"{prefix} -s {short}"
# Check if the option requires an argument.
# Currently multi-argument options are not supported, so we treat it like one argument.
if nargs != 0:
prefix += " -r"
prefix = f"{prefix} -r"
if dest is not None:
# If there are choices, we provide a completion for all possible values.
commands.append(prefix + " -f -a %s" % self._quote(" ".join(dest)))
commands.append(f"{prefix} -f -a {shlex.quote(' '.join(dest))}")
else:
# Otherwise, we try to find a predefined completion for it
value = _fish_dest_get_complete(prog, dest)
if value is not None:
commands.append(prefix + " " + value)
commands.append(f"{prefix} {value}")
if help:
commands.append(prefix + " -d %s" % self._quote(help))
commands.append(f"{prefix} -d {shlex.quote(help)}")
return "\n".join(commands) + "\n"
@@ -697,11 +670,11 @@ def subcommands(self, prog: str, subcommands: List[Tuple[ArgumentParser, str, st
head = self.complete_head(prog, 0)
for _, subcommand, help in subcommands:
command = head + " -f -a %s" % self._quote(subcommand)
command = f"{head} -f -a {shlex.quote(subcommand)}"
if help is not None and len(help) > 0:
help = help.split("\n")[0]
command += " -d %s" % self._quote(help)
command = f"{command} -d {shlex.quote(help)}"
commands.append(command)
@@ -747,7 +720,7 @@ def rst_index(out: IO) -> None:
for i, cmd in enumerate(sorted(commands)):
description = description.capitalize() if i == 0 else ""
ref = ":ref:`%s <spack-%s>`" % (cmd, cmd)
ref = f":ref:`{cmd} <spack-{cmd}>`"
comma = "," if i != len(commands) - 1 else ""
bar = "| " if i % 8 == 0 else " "
out.write(line % (description, bar + ref + comma))
@@ -858,10 +831,10 @@ def _commands(parser: ArgumentParser, args: Namespace) -> None:
# check header first so we don't open out files unnecessarily
if args.header and not os.path.exists(args.header):
tty.die("No such file: '%s'" % args.header)
tty.die(f"No such file: '{args.header}'")
if args.update:
tty.msg("Updating file: %s" % args.update)
tty.msg(f"Updating file: {args.update}")
with open(args.update, "w") as f:
prepend_header(args, f)
formatter(args, f)

View File

@@ -14,7 +14,6 @@
installation and its deprecator.
"""
import argparse
import os
import llnl.util.tty as tty
from llnl.util.symlink import symlink
@@ -76,12 +75,7 @@ def setup_parser(sp):
)
sp.add_argument(
"-l",
"--link-type",
type=str,
default="soft",
choices=["soft", "hard"],
help="type of filesystem link to use for deprecation (default soft)",
"-l", "--link-type", type=str, default=None, choices=["soft", "hard"], help="(deprecated)"
)
sp.add_argument(
@@ -91,6 +85,9 @@ def setup_parser(sp):
def deprecate(parser, args):
"""Deprecate one spec in favor of another"""
if args.link_type is not None:
tty.warn("The --link-type option is deprecated and will be removed in a future release.")
env = ev.active_environment()
specs = spack.cmd.parse_specs(args.specs)
@@ -144,7 +141,5 @@ def deprecate(parser, args):
if not answer:
tty.die("Will not deprecate any packages.")
link_fn = os.link if args.link_type == "hard" else symlink
for dcate, dcator in zip(all_deprecate, all_deprecators):
dcate.package.do_deprecate(dcator, link_fn)
dcate.package.do_deprecate(dcator, symlink)

View File

@@ -29,6 +29,9 @@
__all__ = ["Compiler"]
PATH_INSTANCE_VARS = ["cc", "cxx", "f77", "fc"]
FLAG_INSTANCE_VARS = ["cflags", "cppflags", "cxxflags", "fflags"]
@llnl.util.lang.memoized
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
@@ -700,6 +703,30 @@ def compiler_environment(self):
os.environ.clear()
os.environ.update(backup_env)
def to_dict(self):
flags_dict = {fname: " ".join(fvals) for fname, fvals in self.flags.items()}
flags_dict.update(
{attr: getattr(self, attr, None) for attr in FLAG_INSTANCE_VARS if hasattr(self, attr)}
)
result = {
"spec": str(self.spec),
"paths": {attr: getattr(self, attr, None) for attr in PATH_INSTANCE_VARS},
"flags": flags_dict,
"operating_system": str(self.operating_system),
"target": str(self.target),
"modules": self.modules or [],
"environment": self.environment or {},
"extra_rpaths": self.extra_rpaths or [],
}
if self.enable_implicit_rpaths is not None:
result["implicit_rpaths"] = self.enable_implicit_rpaths
if self.alias:
result["alias"] = self.alias
return result
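An illustrative shape of the dictionary this produces for a hypothetical gcc entry (all values made up for the example):

    {
        "spec": "gcc@12.3.0",
        "paths": {"cc": "/usr/bin/gcc", "cxx": "/usr/bin/g++",
                  "f77": "/usr/bin/gfortran", "fc": "/usr/bin/gfortran"},
        "flags": {"cflags": "-O2"},
        "operating_system": "ubuntu22.04",
        "target": "x86_64",
        "modules": [],
        "environment": {},
        "extra_rpaths": [],
    }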
class CompilerAccessError(spack.error.SpackError):
def __init__(self, compiler, paths):

View File

@@ -6,7 +6,6 @@
"""This module contains functions related to finding compilers on the
system and configuring Spack to use multiple compilers.
"""
import collections
import importlib
import os
import sys
@@ -31,8 +30,6 @@
from spack.util.environment import get_path
from spack.util.naming import mod_to_class
_path_instance_vars = ["cc", "cxx", "f77", "fc"]
_flags_instance_vars = ["cflags", "cppflags", "cxxflags", "fflags"]
_other_instance_vars = [
"modules",
"operating_system",
@@ -90,29 +87,7 @@ def converter(cspec_like, *args, **kwargs):
def _to_dict(compiler):
"""Return a dict version of compiler suitable to insert in YAML."""
d = {}
d["spec"] = str(compiler.spec)
d["paths"] = dict((attr, getattr(compiler, attr, None)) for attr in _path_instance_vars)
d["flags"] = dict((fname, " ".join(fvals)) for fname, fvals in compiler.flags.items())
d["flags"].update(
dict(
(attr, getattr(compiler, attr, None))
for attr in _flags_instance_vars
if hasattr(compiler, attr)
)
)
d["operating_system"] = str(compiler.operating_system)
d["target"] = str(compiler.target)
d["modules"] = compiler.modules or []
d["environment"] = compiler.environment or {}
d["extra_rpaths"] = compiler.extra_rpaths or []
if compiler.enable_implicit_rpaths is not None:
d["implicit_rpaths"] = compiler.enable_implicit_rpaths
if compiler.alias:
d["alias"] = compiler.alias
return {"compiler": d}
return {"compiler": compiler.to_dict()}
def get_compiler_config(
@@ -298,24 +273,24 @@ def find_compilers(
valid_compilers = {}
for name, detected in detected_packages.items():
compilers = [x for x in detected if CompilerConfigFactory.from_external_spec(x.spec)]
compilers = [x for x in detected if CompilerConfigFactory.from_external_spec(x)]
if not compilers:
continue
valid_compilers[name] = compilers
def _has_fortran_compilers(x):
if "compilers" not in x.spec.extra_attributes:
if "compilers" not in x.extra_attributes:
return False
return "fortran" in x.spec.extra_attributes["compilers"]
return "fortran" in x.extra_attributes["compilers"]
if mixed_toolchain:
gccs = [x for x in valid_compilers.get("gcc", []) if _has_fortran_compilers(x)]
if gccs:
best_gcc = sorted(
gccs, key=lambda x: spack.spec.parse_with_version_concrete(x.spec).version
gccs, key=lambda x: spack.spec.parse_with_version_concrete(x).version
)[-1]
gfortran = best_gcc.spec.extra_attributes["compilers"]["fortran"]
gfortran = best_gcc.extra_attributes["compilers"]["fortran"]
for name in ("llvm", "apple-clang"):
if name not in valid_compilers:
continue
@@ -323,11 +298,11 @@ def _has_fortran_compilers(x):
for candidate in candidates:
if _has_fortran_compilers(candidate):
continue
candidate.spec.extra_attributes["compilers"]["fortran"] = gfortran
candidate.extra_attributes["compilers"]["fortran"] = gfortran
new_compilers = []
for name, detected in valid_compilers.items():
for config in CompilerConfigFactory.from_specs([x.spec for x in detected]):
for config in CompilerConfigFactory.from_specs(detected):
c = _compiler_from_config_entry(config["compiler"])
if c in known_compilers:
continue
@@ -394,8 +369,9 @@ def replace_apple_clang(name):
return [replace_apple_clang(name) for name in all_compiler_module_names()]
@llnl.util.lang.memoized
def all_compiler_module_names() -> List[str]:
return [name for name in llnl.util.lang.list_modules(spack.paths.compilers_path)]
return list(llnl.util.lang.list_modules(spack.paths.compilers_path))
@_auto_compiler_spec
@@ -487,13 +463,15 @@ def compiler_from_dict(items):
os = items.get("operating_system", None)
target = items.get("target", None)
if not ("paths" in items and all(n in items["paths"] for n in _path_instance_vars)):
if not (
"paths" in items and all(n in items["paths"] for n in spack.compiler.PATH_INSTANCE_VARS)
):
raise InvalidCompilerConfigurationError(cspec)
cls = class_for_compiler_name(cspec.name)
compiler_paths = []
for c in _path_instance_vars:
for c in spack.compiler.PATH_INSTANCE_VARS:
compiler_path = items["paths"][c]
if compiler_path != "None":
compiler_paths.append(compiler_path)
@@ -621,24 +599,6 @@ def compiler_for_spec(compiler_spec, arch_spec):
return compilers[0]
@_auto_compiler_spec
def get_compiler_duplicates(compiler_spec, arch_spec):
config = spack.config.CONFIG
scope_to_compilers = {}
for scope in config.scopes:
compilers = compilers_for_spec(compiler_spec, arch_spec=arch_spec, scope=scope)
if compilers:
scope_to_compilers[scope] = compilers
cfg_file_to_duplicates = {}
for scope, compilers in scope_to_compilers.items():
config_file = config.get_config_filename(scope, "compilers")
cfg_file_to_duplicates[config_file] = compilers
return cfg_file_to_duplicates
@llnl.util.lang.memoized
def class_for_compiler_name(compiler_name):
"""Given a compiler module name, get the corresponding Compiler class."""
@@ -661,50 +621,10 @@ def class_for_compiler_name(compiler_name):
return cls
def all_os_classes():
"""
Return the list of classes for all operating systems available on
this platform
"""
classes = []
platform = spack.platforms.host()
for os_class in platform.operating_sys.values():
classes.append(os_class)
return classes
def all_compiler_types():
return [class_for_compiler_name(c) for c in supported_compilers()]
#: Gathers the attribute values by which a detected compiler is considered
#: unique in Spack.
#:
#: - os: the operating system
#: - compiler_name: the name of the compiler (e.g. 'gcc', 'clang', etc.)
#: - version: the version of the compiler
#:
CompilerID = collections.namedtuple("CompilerID", ["os", "compiler_name", "version"])
#: Variations on a matched compiler name
NameVariation = collections.namedtuple("NameVariation", ["prefix", "suffix"])
#: Groups together the arguments needed by `detect_version`. The four entries
#: in the tuple are:
#:
#: - id: An instance of the CompilerID named tuple (version can be set to None
#: as it will be detected later)
#: - variation: a NameVariation for file being tested
#: - language: compiler language being tested (one of 'cc', 'cxx', 'fc', 'f77')
#: - path: full path to the executable being tested
#:
DetectVersionArgs = collections.namedtuple(
"DetectVersionArgs", ["id", "variation", "language", "path"]
)
def is_mixed_toolchain(compiler):
"""Returns True if the current compiler is a mixed toolchain,
False otherwise.
@@ -903,17 +823,12 @@ def _extract_os_and_target(spec: "spack.spec.Spec"):
class InvalidCompilerConfigurationError(spack.error.SpackError):
def __init__(self, compiler_spec):
super().__init__(
'Invalid configuration for [compiler "%s"]: ' % compiler_spec,
"Compiler configuration must contain entries for all compilers: %s"
% _path_instance_vars,
f'Invalid configuration for [compiler "{compiler_spec}"]: ',
f"Compiler configuration must contain entries for "
f"all compilers: {spack.compiler.PATH_INSTANCE_VARS}",
)
class NoCompilersError(spack.error.SpackError):
def __init__(self):
super().__init__("Spack could not find any compilers!")
class UnknownCompilerError(spack.error.SpackError):
def __init__(self, compiler_name):
super().__init__("Spack doesn't support the requested compiler: {0}".format(compiler_name))
@@ -924,25 +839,3 @@ def __init__(self, compiler_spec, target):
super().__init__(
"No compilers for operating system %s satisfy spec %s" % (target, compiler_spec)
)
class CompilerDuplicateError(spack.error.SpackError):
def __init__(self, compiler_spec, arch_spec):
config_file_to_duplicates = get_compiler_duplicates(compiler_spec, arch_spec)
duplicate_table = list((x, len(y)) for x, y in config_file_to_duplicates.items())
descriptor = lambda num: "time" if num == 1 else "times"
duplicate_msg = lambda cfgfile, count: "{0}: {1} {2}".format(
cfgfile, str(count), descriptor(count)
)
msg = (
"Compiler configuration contains entries with duplicate"
+ " specification ({0}, {1})".format(compiler_spec, arch_spec)
+ " in the following files:\n\t"
+ "\n\t".join(duplicate_msg(x, y) for x, y in duplicate_table)
)
super().__init__(msg)
class CompilerSpecInsufficientlySpecificError(spack.error.SpackError):
def __init__(self, compiler_spec):
super().__init__("Multiple compilers satisfy spec %s" % compiler_spec)

View File

@@ -8,7 +8,6 @@
from contextlib import contextmanager
from itertools import chain
import spack.abi
import spack.compilers
import spack.config
import spack.environment

View File

@@ -1090,7 +1090,7 @@ def validate(
def read_config_file(
filename: str, schema: Optional[YamlConfigDict] = None
path: str, schema: Optional[YamlConfigDict] = None
) -> Optional[YamlConfigDict]:
"""Read a YAML configuration file.
@@ -1100,21 +1100,9 @@ def read_config_file(
# to preserve flexibility in calling convention (don't need to provide
# schema when it's not necessary) while allowing us to validate against a
# known schema when the top-level key could be incorrect.
if not os.path.exists(filename):
# Ignore nonexistent files.
tty.debug(f"Skipping nonexistent config path {filename}", level=3)
return None
elif not os.path.isfile(filename):
raise ConfigFileError(f"Invalid configuration. {filename} exists but is not a file.")
elif not os.access(filename, os.R_OK):
raise ConfigFileError(f"Config file is not readable: {filename}")
try:
tty.debug(f"Reading config from file {filename}")
with open(filename) as f:
with open(path) as f:
tty.debug(f"Reading config from file {path}")
data = syaml.load_config(f)
if data:
@@ -1125,15 +1113,20 @@ def read_config_file(
return data
except StopIteration:
raise ConfigFileError(f"Config file is empty or is not a valid YAML dict: {filename}")
except FileNotFoundError:
# Ignore nonexistent files.
tty.debug(f"Skipping nonexistent config path {path}", level=3)
return None
except OSError as e:
raise ConfigFileError(f"Path is not a file or is not readable: {path}: {str(e)}") from e
except StopIteration as e:
raise ConfigFileError(f"Config file is empty or is not a valid YAML dict: {path}") from e
except syaml.SpackYAMLError as e:
raise ConfigFileError(str(e)) from e
except OSError as e:
raise ConfigFileError(f"Error reading configuration file {filename}: {str(e)}") from e
def _override(string: str) -> bool:
"""Test if a spack YAML string is an override.

View File

@@ -14,12 +14,14 @@
import llnl.util.tty as tty
import spack.cmd
import spack.compilers
import spack.deptypes as dt
import spack.error
import spack.hash_types as hash_types
import spack.platforms
import spack.repo
import spack.spec
import spack.store
from spack.schema.cray_manifest import schema as manifest_schema
#: Cray systems can store a Spack-compatible description of system
@@ -237,7 +239,7 @@ def read(path, apply_updates):
tty.debug(f"Include this\n{traceback.format_exc()}")
if apply_updates:
for spec in specs.values():
spack.store.STORE.db.add(spec, directory_layout=None)
spack.store.STORE.db.add(spec)
class ManifestValidationError(spack.error.SpackError):

View File

@@ -59,7 +59,11 @@
import spack.util.lock as lk
import spack.util.spack_json as sjson
import spack.version as vn
from spack.directory_layout import DirectoryLayoutError, InconsistentInstallDirectoryError
from spack.directory_layout import (
DirectoryLayout,
DirectoryLayoutError,
InconsistentInstallDirectoryError,
)
from spack.error import SpackError
from spack.util.crypto import bit_length
@@ -208,7 +212,7 @@ def __init__(
ref_count: int = 0,
explicit: bool = False,
installation_time: Optional[float] = None,
deprecated_for: Optional["spack.spec.Spec"] = None,
deprecated_for: Optional[str] = None,
in_buildcache: bool = False,
origin=None,
):
@@ -595,9 +599,11 @@ class Database:
def __init__(
self,
root: str,
*,
upstream_dbs: Optional[List["Database"]] = None,
is_upstream: bool = False,
lock_cfg: LockConfiguration = DEFAULT_LOCK_CFG,
layout: Optional[DirectoryLayout] = None,
) -> None:
"""Database for Spack installations.
@@ -620,6 +626,7 @@ def __init__(
"""
self.root = root
self.database_directory = os.path.join(self.root, _DB_DIRNAME)
self.layout = layout
# Set up layout of database files within the db dir
self._index_path = os.path.join(self.database_directory, "index.json")
@@ -664,14 +671,6 @@ def __init__(
self.upstream_dbs = list(upstream_dbs) if upstream_dbs else []
# whether there was an error at the start of a read transaction
self._error = None
# For testing: if this is true, an exception is thrown when missing
# dependencies are detected (rather than just printing a warning
# message)
self._fail_when_missing_deps = False
self._write_transaction_impl = lk.WriteTransaction
self._read_transaction_impl = lk.ReadTransaction
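The bare `*` in the new `Database.__init__` makes every parameter after `root` keyword-only, so callers cannot silently pass `layout` or `is_upstream` positionally. A toy illustration (class and argument names hypothetical):

    class Db:
        def __init__(self, root, *, layout=None, is_upstream=False):
            self.root, self.layout, self.is_upstream = root, layout, is_upstream

    Db("/opt/spack", layout="layout")   # OK: keyword argument
    # Db("/opt/spack", "layout")        # TypeError: takes 2 positional arguments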
@@ -774,7 +773,13 @@ def query_local_by_spec_hash(self, hash_key):
with self.read_transaction():
return self._data.get(hash_key, None)
def _assign_dependencies(self, spec_reader, hash_key, installs, data):
def _assign_dependencies(
self,
spec_reader: Type["spack.spec.SpecfileReaderBase"],
hash_key: str,
installs: dict,
data: Dict[str, InstallRecord],
):
# Add dependencies from other records in the install DB to
# form a full spec.
spec = data[hash_key].spec
@@ -787,26 +792,20 @@ def _assign_dependencies(self, spec_reader, hash_key, installs, data):
for dname, dhash, dtypes, _, virtuals in spec_reader.read_specfile_dep_specs(
yaml_deps
):
# It is important that we always check upstream installations
# in the same order, and that we always check the local
# installation first: if a downstream Spack installs a package
# then dependents in that installation could be using it.
# If a hash is installed locally and upstream, there isn't
# enough information to determine which one a local package
# depends on, so the convention ensures that this isn't an
# issue.
upstream, record = self.query_by_spec_hash(dhash, data=data)
# It is important that we always check upstream installations in the same order,
# and that we always check the local installation first: if a downstream Spack
# installs a package then dependents in that installation could be using it. If a
# hash is installed locally and upstream, there isn't enough information to
# determine which one a local package depends on, so the convention ensures that
# this isn't an issue.
_, record = self.query_by_spec_hash(dhash, data=data)
child = record.spec if record else None
if not child:
msg = "Missing dependency not in database: " "%s needs %s-%s" % (
spec.cformat("{name}{/hash:7}"),
dname,
dhash[:7],
tty.warn(
f"Missing dependency not in database: "
f"{spec.cformat('{name}{/hash:7}')} needs {dname}-{dhash[:7]}"
)
if self._fail_when_missing_deps:
raise MissingDependenciesError(msg)
tty.warn(msg)
continue
spec._add_dependency(child, depflag=dt.canonicalize(dtypes), virtuals=virtuals)
@@ -873,8 +872,8 @@ def invalid_record(hash_key, error):
# (i.e., its specs are a true Merkle DAG, unlike most specs.)
# Pass 1: Iterate through database and build specs w/o dependencies
data = {}
installed_prefixes = set()
data: Dict[str, InstallRecord] = {}
installed_prefixes: Set[str] = set()
for hash_key, rec in installs.items():
try:
# This constructs a spec DAG from the list of all installs
@@ -911,7 +910,7 @@ def invalid_record(hash_key, error):
self._data = data
self._installed_prefixes = installed_prefixes
def reindex(self, directory_layout):
def reindex(self):
"""Build database index from scratch based on a directory layout.
Locks the DB if it isn't locked already.
@@ -919,6 +918,8 @@ def reindex(self, directory_layout):
if self.is_upstream:
raise UpstreamDatabaseLockingError("Cannot reindex an upstream database")
error: Optional[CorruptDatabaseError] = None
# Special transaction to avoid recursive reindex calls and to
# ignore errors if we need to rebuild a corrupt database.
def _read_suppress_error():
@@ -926,7 +927,8 @@ def _read_suppress_error():
if os.path.isfile(self._index_path):
self._read_from_file(self._index_path)
except CorruptDatabaseError as e:
self._error = e
nonlocal error
error = e
self._data = {}
self._installed_prefixes = set()
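The corrupt-database flag moves from the instance attribute `self._error` to a local variable captured with `nonlocal`, keeping the error scoped to the reindex call. The closure pattern in isolation, with a raised ValueError standing in for reading a corrupt index file:

    def reindex_sketch():
        error = None  # replaces the old self._error attribute

        def _read_suppress_error():
            nonlocal error
            try:
                raise ValueError("corrupt index")  # stand-in for reading the index
            except ValueError as e:
                error = e  # captured in the enclosing scope

        _read_suppress_error()
        if error is not None:
            print(f"Spack database was corrupt. Will rebuild. Error was: {error}")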
@@ -935,14 +937,13 @@ def _read_suppress_error():
)
with transaction:
if self._error:
tty.warn("Spack database was corrupt. Will rebuild. Error was:", str(self._error))
self._error = None
if error is not None:
tty.warn(f"Spack database was corrupt. Will rebuild. Error was: {error}")
old_data = self._data
old_installed_prefixes = self._installed_prefixes
try:
self._construct_from_directory_layout(directory_layout, old_data)
self._construct_from_directory_layout(old_data)
except BaseException:
# If anything explodes, restore old data, skip write.
self._data = old_data
@@ -950,13 +951,16 @@ def _read_suppress_error():
raise
def _construct_entry_from_directory_layout(
self, directory_layout, old_data, spec, deprecator=None
self,
old_data: Dict[str, InstallRecord],
spec: "spack.spec.Spec",
deprecator: Optional["spack.spec.Spec"] = None,
):
# Try to recover explicit value from old DB, but
# default it to True if DB was corrupt. This is
# just to be conservative in case a command like
# "autoremove" is run by the user after a reindex.
tty.debug("RECONSTRUCTING FROM SPEC.YAML: {0}".format(spec))
tty.debug(f"Reconstructing from spec file: {spec}")
explicit = True
inst_time = os.stat(spec.prefix).st_ctime
if old_data is not None:
@@ -965,18 +969,17 @@ def _construct_entry_from_directory_layout(
explicit = old_info.explicit
inst_time = old_info.installation_time
extra_args = {"explicit": explicit, "installation_time": inst_time}
self._add(spec, directory_layout, **extra_args)
self._add(spec, explicit=explicit, installation_time=inst_time)
if deprecator:
self._deprecate(spec, deprecator)
def _construct_from_directory_layout(self, directory_layout, old_data):
# Read first the `spec.yaml` files in the prefixes. They should be
# considered authoritative with respect to DB reindexing, as
# entries in the DB may be corrupted in a way that still makes
# them readable. If we considered DB entries authoritative
# instead, we would perpetuate errors over a reindex.
with directory_layout.disable_upstream_check():
def _construct_from_directory_layout(self, old_data: Dict[str, InstallRecord]):
# Read first the spec files in the prefixes. They should be considered authoritative with
# respect to DB reindexing, as entries in the DB may be corrupted in a way that still makes
# them readable. If we considered DB entries authoritative instead, we would perpetuate
# errors over a reindex.
assert self.layout is not None, "Cannot reindex a database without a known layout"
with self.layout.disable_upstream_check():
# Initialize data in the reconstructed DB
self._data = {}
self._installed_prefixes = set()
@@ -984,44 +987,36 @@ def _construct_from_directory_layout(self, directory_layout, old_data):
# Start inspecting the installed prefixes
processed_specs = set()
for spec in directory_layout.all_specs():
self._construct_entry_from_directory_layout(directory_layout, old_data, spec)
for spec in self.layout.all_specs():
self._construct_entry_from_directory_layout(old_data, spec)
processed_specs.add(spec)
for spec, deprecator in directory_layout.all_deprecated_specs():
self._construct_entry_from_directory_layout(
directory_layout, old_data, spec, deprecator
)
for spec, deprecator in self.layout.all_deprecated_specs():
self._construct_entry_from_directory_layout(old_data, spec, deprecator)
processed_specs.add(spec)
for key, entry in old_data.items():
# We already took care of this spec using
# `spec.yaml` from its prefix.
for entry in old_data.values():
# We already took care of this spec using the spec file from its prefix.
if entry.spec in processed_specs:
msg = "SKIPPING RECONSTRUCTION FROM OLD DB: {0}"
msg += " [already reconstructed from spec.yaml]"
tty.debug(msg.format(entry.spec))
tty.debug(
f"Skipping reconstruction from old db: {entry.spec}"
" [already reconstructed from spec file]"
)
continue
# If we arrived here it very likely means that
# we have external specs that are not dependencies
# of other specs. This may be the case for externally
# installed compilers or externally installed
# applications.
tty.debug("RECONSTRUCTING FROM OLD DB: {0}".format(entry.spec))
# If we arrived here it very likely means that we have external specs that are not
# dependencies of other specs. This may be the case for externally installed
# compilers or externally installed applications.
tty.debug(f"Reconstructing from old db: {entry.spec}")
try:
layout = None if entry.spec.external else directory_layout
kwargs = {
"spec": entry.spec,
"directory_layout": layout,
"explicit": entry.explicit,
"installation_time": entry.installation_time,
}
self._add(**kwargs)
self._add(
spec=entry.spec,
explicit=entry.explicit,
installation_time=entry.installation_time,
)
processed_specs.add(entry.spec)
except Exception as e:
# Something went wrong, so the spec was not restored
# from old data
# Something went wrong, so the spec was not restored from old data
tty.debug(e)
self._check_ref_counts()
@@ -1033,7 +1028,7 @@ def _check_ref_counts(self):
Does no locking.
"""
counts = {}
counts: Dict[str, int] = {}
for key, rec in self._data.items():
counts.setdefault(key, 0)
for dep in rec.spec.dependencies(deptype=_TRACKED_DEPENDENCIES):
@@ -1117,29 +1112,23 @@ def _read(self):
def _add(
self,
spec,
directory_layout=None,
explicit=False,
installation_time=None,
allow_missing=False,
spec: "spack.spec.Spec",
explicit: bool = False,
installation_time: Optional[float] = None,
allow_missing: bool = False,
):
"""Add an install record for this spec to the database.
Assumes spec is installed in ``directory_layout.path_for_spec(spec)``.
Also ensures dependencies are present and updated in the DB as
either installed or missing.
Also ensures dependencies are present and updated in the DB as either installed or missing.
Args:
spec (spack.spec.Spec): spec to be added
directory_layout: layout of the spec installation
spec: spec to be added
explicit:
Possible values: True, False, any
A spec that was installed following a specific user
request is marked as explicit. If instead it was
pulled-in as a dependency of a user requested spec
it's considered implicit.
A spec that was installed following a specific user request is marked as explicit.
If instead it was pulled-in as a dependency of a user requested spec it's
considered implicit.
installation_time:
Date and time of installation
@@ -1150,48 +1139,42 @@ def _add(
raise NonConcreteSpecAddError("Specs added to DB must be concrete.")
key = spec.dag_hash()
spec_pkg_hash = spec._package_hash
spec_pkg_hash = spec._package_hash # type: ignore[attr-defined]
upstream, record = self.query_by_spec_hash(key)
if upstream:
return
# Retrieve optional arguments
installation_time = installation_time or _now()
for edge in spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES):
if edge.spec.dag_hash() in self._data:
continue
# allow missing build-only deps. This prevents excessive
# warnings when a spec is installed, and its build dep
# is missing a build dep; there's no need to install the
# build dep's build dep first, and there's no need to warn
# about it missing.
dep_allow_missing = allow_missing or edge.depflag == dt.BUILD
self._add(
edge.spec,
directory_layout,
explicit=False,
installation_time=installation_time,
allow_missing=dep_allow_missing,
# allow missing build-only deps. This prevents excessive warnings when a spec is
# installed, and its build dep is missing a build dep; there's no need to install
# the build dep's build dep first, and there's no need to warn about it missing.
allow_missing=allow_missing or edge.depflag == dt.BUILD,
)
# Make sure the directory layout agrees whether the spec is installed
if not spec.external and directory_layout:
path = directory_layout.path_for_spec(spec)
if not spec.external and self.layout:
path = self.layout.path_for_spec(spec)
installed = False
try:
directory_layout.ensure_installed(spec)
self.layout.ensure_installed(spec)
installed = True
self._installed_prefixes.add(path)
except DirectoryLayoutError as e:
if not (allow_missing and isinstance(e, InconsistentInstallDirectoryError)):
msg = (
"{0} is being {1} in the database with prefix {2}, "
"but this directory does not contain an installation of "
"the spec, due to: {3}"
)
action = "updated" if key in self._data else "registered"
tty.warn(msg.format(spec.short_spec, action, path, str(e)))
tty.warn(
f"{spec.short_spec} is being {action} in the database with prefix {path}, "
"but this directory does not contain an installation of "
f"the spec, due to: {e}"
)
elif spec.external_path:
path = spec.external_path
installed = True
@@ -1202,23 +1185,27 @@ def _add(
if key not in self._data:
# Create a new install record with no deps initially.
new_spec = spec.copy(deps=False)
extra_args = {"explicit": explicit, "installation_time": installation_time}
# Commands other than 'spack install' may add specs to the DB,
# we can record the source of an installed Spec with 'origin'
if hasattr(spec, "origin"):
extra_args["origin"] = spec.origin
self._data[key] = InstallRecord(new_spec, path, installed, ref_count=0, **extra_args)
self._data[key] = InstallRecord(
new_spec,
path=path,
installed=installed,
ref_count=0,
explicit=explicit,
installation_time=installation_time,
origin=None if not hasattr(spec, "origin") else spec.origin,
)
# Connect dependencies from the DB to the new copy.
for dep in spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES):
dkey = dep.spec.dag_hash()
upstream, record = self.query_by_spec_hash(dkey)
assert record, f"Missing dependency {dep.spec} in DB"
new_spec._add_dependency(record.spec, depflag=dep.depflag, virtuals=dep.virtuals)
if not upstream:
record.ref_count += 1
# Mark concrete once everything is built, and preserve
# the original hashes of concrete specs.
# Mark concrete once everything is built, and preserve the original hashes of concrete
# specs.
new_spec._mark_concrete()
new_spec._hash = key
new_spec._package_hash = spec_pkg_hash
@@ -1231,7 +1218,7 @@ def _add(
self._data[key].explicit = explicit
@_autospec
def add(self, spec, directory_layout, explicit=False):
def add(self, spec: "spack.spec.Spec", *, explicit: bool = False) -> None:
"""Add spec at path to database, locking and reading DB to sync.
``add()`` will lock and read from the DB on disk.
@@ -1240,9 +1227,9 @@ def add(self, spec, directory_layout, explicit=False):
# TODO: ensure that spec is concrete?
# Entire add is transactional.
with self.write_transaction():
self._add(spec, directory_layout, explicit=explicit)
self._add(spec, explicit=explicit)
def _get_matching_spec_key(self, spec, **kwargs):
def _get_matching_spec_key(self, spec: "spack.spec.Spec", **kwargs) -> str:
"""Get the exact spec OR get a single spec that matches."""
key = spec.dag_hash()
upstream, record = self.query_by_spec_hash(key)
@@ -1254,12 +1241,12 @@ def _get_matching_spec_key(self, spec, **kwargs):
return key
@_autospec
def get_record(self, spec, **kwargs):
def get_record(self, spec: "spack.spec.Spec", **kwargs) -> Optional[InstallRecord]:
key = self._get_matching_spec_key(spec, **kwargs)
upstream, record = self.query_by_spec_hash(key)
return record
def _decrement_ref_count(self, spec):
def _decrement_ref_count(self, spec: "spack.spec.Spec") -> None:
key = spec.dag_hash()
if key not in self._data:
@@ -1276,7 +1263,7 @@ def _decrement_ref_count(self, spec):
for dep in spec.dependencies(deptype=_TRACKED_DEPENDENCIES):
self._decrement_ref_count(dep)
def _increment_ref_count(self, spec):
def _increment_ref_count(self, spec: "spack.spec.Spec") -> None:
key = spec.dag_hash()
if key not in self._data:
@@ -1285,14 +1272,14 @@ def _increment_ref_count(self, spec):
rec = self._data[key]
rec.ref_count += 1
def _remove(self, spec):
def _remove(self, spec: "spack.spec.Spec") -> "spack.spec.Spec":
"""Non-locking version of remove(); does real work."""
key = self._get_matching_spec_key(spec)
rec = self._data[key]
# This install prefix is now free for other specs to use, even if the
# spec is only marked uninstalled.
if not rec.spec.external and rec.installed:
if not rec.spec.external and rec.installed and rec.path:
self._installed_prefixes.remove(rec.path)
if rec.ref_count > 0:
@@ -1316,7 +1303,7 @@ def _remove(self, spec):
return rec.spec
@_autospec
def remove(self, spec):
def remove(self, spec: "spack.spec.Spec") -> "spack.spec.Spec":
"""Removes a spec from the database. To be called on uninstall.
Reads the database, then:
@@ -1331,7 +1318,7 @@ def remove(self, spec):
with self.write_transaction():
return self._remove(spec)
def deprecator(self, spec):
def deprecator(self, spec: "spack.spec.Spec") -> Optional["spack.spec.Spec"]:
"""Return the spec that the given spec is deprecated for, or None"""
with self.read_transaction():
spec_key = self._get_matching_spec_key(spec)
@@ -1342,14 +1329,14 @@ def deprecator(self, spec):
else:
return None
def specs_deprecated_by(self, spec):
def specs_deprecated_by(self, spec: "spack.spec.Spec") -> List["spack.spec.Spec"]:
"""Return all specs deprecated in favor of the given spec"""
with self.read_transaction():
return [
rec.spec for rec in self._data.values() if rec.deprecated_for == spec.dag_hash()
]
def _deprecate(self, spec, deprecator):
def _deprecate(self, spec: "spack.spec.Spec", deprecator: "spack.spec.Spec") -> None:
spec_key = self._get_matching_spec_key(spec)
spec_rec = self._data[spec_key]
@@ -1367,17 +1354,17 @@ def _deprecate(self, spec, deprecator):
self._data[spec_key] = spec_rec
@_autospec
def mark(self, spec, key, value):
def mark(self, spec: "spack.spec.Spec", key, value) -> None:
"""Mark an arbitrary record on a spec."""
with self.write_transaction():
return self._mark(spec, key, value)
def _mark(self, spec, key, value):
def _mark(self, spec: "spack.spec.Spec", key, value) -> None:
record = self._data[self._get_matching_spec_key(spec)]
setattr(record, key, value)
@_autospec
def deprecate(self, spec, deprecator):
def deprecate(self, spec: "spack.spec.Spec", deprecator: "spack.spec.Spec") -> None:
"""Marks a spec as deprecated in favor of its deprecator"""
with self.write_transaction():
return self._deprecate(spec, deprecator)
@@ -1385,16 +1372,16 @@ def deprecate(self, spec, deprecator):
@_autospec
def installed_relatives(
self,
spec,
direction="children",
transitive=True,
spec: "spack.spec.Spec",
direction: str = "children",
transitive: bool = True,
deptype: Union[dt.DepFlag, dt.DepTypes] = dt.ALL,
):
) -> Set["spack.spec.Spec"]:
"""Return installed specs related to this one."""
if direction not in ("parents", "children"):
raise ValueError("Invalid direction: %s" % direction)
relatives = set()
relatives: Set[spack.spec.Spec] = set()
for spec in self.query(spec):
if transitive:
to_add = spec.traverse(direction=direction, root=False, deptype=deptype)
@@ -1405,17 +1392,13 @@ def installed_relatives(
for relative in to_add:
hash_key = relative.dag_hash()
upstream, record = self.query_by_spec_hash(hash_key)
_, record = self.query_by_spec_hash(hash_key)
if not record:
reltype = "Dependent" if direction == "parents" else "Dependency"
msg = "Inconsistent state! %s %s of %s not in DB" % (
reltype,
hash_key,
spec.dag_hash(),
tty.warn(
f"Inconsistent state: "
f"{'dependent' if direction == 'parents' else 'dependency'} {hash_key} of "
f"{spec.dag_hash()} not in DB"
)
if self._fail_when_missing_deps:
raise MissingDependenciesError(msg)
tty.warn(msg)
continue
if not record.installed:
@@ -1425,7 +1408,7 @@ def installed_relatives(
return relatives
@_autospec
def installed_extensions_for(self, extendee_spec):
def installed_extensions_for(self, extendee_spec: "spack.spec.Spec"):
"""Returns the specs of all packages that extend the given spec"""
for spec in self.query():
if spec.package.extends(extendee_spec):
@@ -1684,7 +1667,7 @@ def unused_specs(
self,
root_hashes: Optional[Container[str]] = None,
deptype: Union[dt.DepFlag, dt.DepTypes] = dt.LINK | dt.RUN,
) -> "List[spack.spec.Spec]":
) -> List["spack.spec.Spec"]:
"""Return all specs that are currently installed but not needed by root specs.
By default, roots are all explicit specs in the database. If a set of root

View File

@@ -2,17 +2,11 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from .common import (
DetectedPackage,
executable_prefix,
set_virtuals_nonbuildable,
update_configuration,
)
from .common import executable_prefix, set_virtuals_nonbuildable, update_configuration
from .path import by_path, executables_in_path
from .test import detection_tests
__all__ = [
"DetectedPackage",
"by_path",
"executables_in_path",
"executable_prefix",

View File

@@ -6,9 +6,9 @@
function to update packages.yaml given a list of detected packages.
Ideally, each detection method should be placed in a specific subpackage
and implement at least a function that returns a list of DetectedPackage
objects. The update in packages.yaml can then be done using the function
provided here.
and implement at least a function that returns a list of specs.
The update in packages.yaml can then be done using the function provided here.
The module also contains other functions that might be useful across different
detection mechanisms.
@@ -17,9 +17,10 @@
import itertools
import os
import os.path
import pathlib
import re
import sys
from typing import Dict, List, NamedTuple, Optional, Set, Tuple, Union
from typing import Dict, List, Optional, Set, Tuple, Union
import llnl.util.tty
@@ -30,25 +31,6 @@
import spack.util.windows_registry
class DetectedPackage(NamedTuple):
"""Information on a package that has been detected."""
#: Spec that was detected
spec: spack.spec.Spec
#: Prefix of the spec
prefix: str
def __reduce__(self):
return DetectedPackage.restore, (str(self.spec), self.prefix, self.spec.extra_attributes)
@staticmethod
def restore(
spec_str: str, prefix: str, extra_attributes: Optional[Dict[str, str]]
) -> "DetectedPackage":
spec = spack.spec.Spec.from_detection(spec_str=spec_str, extra_attributes=extra_attributes)
return DetectedPackage(spec=spec, prefix=prefix)
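For context on the class being removed: its `__reduce__` let instances cross process boundaries by pickling primitive data and rebuilding the heavier spec object on restore. The pattern, reduced to a self-contained sketch with a plain string standing in for a spack.spec.Spec:

    import pickle
    from typing import NamedTuple

    class Detected(NamedTuple):
        spec: str    # stands in for a spack.spec.Spec
        prefix: str

        def __reduce__(self):
            # Pickle primitives, not the object graph.
            return Detected.restore, (self.spec, self.prefix)

        @staticmethod
        def restore(spec_str, prefix):
            return Detected(spec=spec_str, prefix=prefix)

    original = Detected("gcc@12.3.0", "/usr")
    assert pickle.loads(pickle.dumps(original)) == original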
def _externals_in_packages_yaml() -> Set[spack.spec.Spec]:
"""Returns all the specs mentioned as externals in packages.yaml"""
packages_yaml = spack.config.get("packages")
@@ -63,7 +45,7 @@ def _externals_in_packages_yaml() -> Set[spack.spec.Spec]:
def _pkg_config_dict(
external_pkg_entries: List[DetectedPackage],
external_pkg_entries: List["spack.spec.Spec"],
) -> Dict[str, Union[bool, List[Dict[str, ExternalEntryType]]]]:
"""Generate a package specific config dict according to the packages.yaml schema.
@@ -83,22 +65,19 @@ def _pkg_config_dict(
pkg_dict = spack.util.spack_yaml.syaml_dict()
pkg_dict["externals"] = []
for e in external_pkg_entries:
if not _spec_is_valid(e.spec):
if not _spec_is_valid(e):
continue
external_items: List[Tuple[str, ExternalEntryType]] = [
("spec", str(e.spec)),
("prefix", e.prefix),
("spec", str(e)),
("prefix", pathlib.Path(e.external_path).as_posix()),
]
if e.spec.external_modules:
external_items.append(("modules", e.spec.external_modules))
if e.external_modules:
external_items.append(("modules", e.external_modules))
if e.spec.extra_attributes:
if e.extra_attributes:
external_items.append(
(
"extra_attributes",
spack.util.spack_yaml.syaml_dict(e.spec.extra_attributes.items()),
)
("extra_attributes", spack.util.spack_yaml.syaml_dict(e.extra_attributes.items()))
)
# external_items.extend(e.spec.extra_attributes.items())
@@ -219,33 +198,32 @@ def library_prefix(library_dir: str) -> str:
def update_configuration(
detected_packages: Dict[str, List[DetectedPackage]],
detected_packages: Dict[str, List["spack.spec.Spec"]],
scope: Optional[str] = None,
buildable: bool = True,
) -> List[spack.spec.Spec]:
"""Add the packages passed as arguments to packages.yaml
Args:
detected_packages: list of DetectedPackage objects to be added
detected_packages: list of specs to be added
scope: configuration scope where to add the detected packages
buildable: whether the detected packages are buildable or not
"""
predefined_external_specs = _externals_in_packages_yaml()
pkg_to_cfg, all_new_specs = {}, []
for package_name, entries in detected_packages.items():
new_entries = [e for e in entries if (e.spec not in predefined_external_specs)]
new_entries = [s for s in entries if s not in predefined_external_specs]
pkg_config = _pkg_config_dict(new_entries)
external_entries = pkg_config.get("externals", [])
assert not isinstance(external_entries, bool), "unexpected value for external entry"
all_new_specs.extend([x.spec for x in new_entries])
all_new_specs.extend(new_entries)
if buildable is False:
pkg_config["buildable"] = False
pkg_to_cfg[package_name] = pkg_config
pkgs_cfg = spack.config.get("packages", scope=scope)
pkgs_cfg = spack.config.merge_yaml(pkgs_cfg, pkg_to_cfg)
spack.config.set("packages", pkgs_cfg, scope=scope)
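The per-package entry that `_pkg_config_dict` produces and `update_configuration` merges into packages.yaml has this shape (spec strings, prefixes, and module names hypothetical):

    pkg_to_cfg = {
        "cmake": {
            "externals": [
                {"spec": "cmake@3.27.4", "prefix": "/usr"},
                {"spec": "cmake@3.30.2", "prefix": "/opt/cmake", "modules": ["cmake/3.30.2"]},
            ],
            "buildable": False,  # only when update_configuration(..., buildable=False)
        }
    }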

View File

@@ -24,7 +24,6 @@
import spack.util.ld_so_conf
from .common import (
DetectedPackage,
WindowsCompilerExternalPaths,
WindowsKitExternalPaths,
_convert_to_iterable,
@@ -229,7 +228,7 @@ def prefix_from_path(self, *, path: str) -> str:
def detect_specs(
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
) -> List[DetectedPackage]:
) -> List["spack.spec.Spec"]:
"""Given a list of files matching the search patterns, returns a list of detected specs.
Args:
@@ -295,16 +294,16 @@ def detect_specs(
warnings.warn(msg)
continue
if spec.external_path:
prefix = spec.external_path
if not spec.external_path:
spec.external_path = prefix
result.append(DetectedPackage(spec=spec, prefix=prefix))
result.append(spec)
return result
def find(
self, *, pkg_name: str, repository, initial_guess: Optional[List[str]] = None
) -> List[DetectedPackage]:
) -> List["spack.spec.Spec"]:
"""For a given package, returns a list of detected specs.
Args:
@@ -388,7 +387,7 @@ def by_path(
*,
path_hints: Optional[List[str]] = None,
max_workers: Optional[int] = None,
) -> Dict[str, List[DetectedPackage]]:
) -> Dict[str, List["spack.spec.Spec"]]:
"""Return the list of packages that have been detected on the system, keyed by
unqualified package name.

View File

@@ -68,7 +68,7 @@ def execute(self) -> List[spack.spec.Spec]:
with self._mock_layout() as path_hints:
entries = by_path([self.test.pkg_name], path_hints=path_hints)
_, unqualified_name = spack.repo.partition_package_name(self.test.pkg_name)
specs = set(x.spec for x in entries[unqualified_name])
specs = set(entries[unqualified_name])
return list(specs)
@contextlib.contextmanager
@@ -104,7 +104,9 @@ def _create_executable_scripts(self, mock_executables: MockExecutables) -> List[
@property
def expected_specs(self) -> List[spack.spec.Spec]:
return [
spack.spec.Spec.from_detection(item.spec, extra_attributes=item.extra_attributes)
spack.spec.Spec.from_detection(
item.spec, external_path=self.tmpdir.name, extra_attributes=item.extra_attributes
)
for item in self.test.results
]

View File

@@ -14,7 +14,6 @@
from pathlib import Path
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.config
@@ -23,13 +22,8 @@
import spack.util.spack_json as sjson
from spack.error import SpackError
# Note: Posixpath is used here as opposed to
# os.path.join due to spack.spec.Spec.format
# requiring forward slash path separators at this stage
default_projections = {
"all": posixpath.join(
"{architecture}", "{compiler.name}-{compiler.version}", "{name}-{version}-{hash}"
)
"all": "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
}
@@ -152,20 +146,9 @@ def read_spec(self, path):
def spec_file_path(self, spec):
"""Gets full path to spec file"""
_check_concrete(spec)
# Attempts to convert to JSON if possible.
# Otherwise just returns the YAML.
yaml_path = os.path.join(self.metadata_path(spec), self._spec_file_name_yaml)
json_path = os.path.join(self.metadata_path(spec), self.spec_file_name)
if os.path.exists(yaml_path) and fs.can_write_to_dir(yaml_path):
self.write_spec(spec, json_path)
try:
os.remove(yaml_path)
except OSError as err:
tty.debug("Could not remove deprecated {0}".format(yaml_path))
tty.debug(err)
elif os.path.exists(yaml_path):
return yaml_path
return json_path
return yaml_path if os.path.exists(yaml_path) else json_path
def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
"""Gets full path to spec file for deprecated spec
@@ -199,17 +182,7 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
deprecated_spec.dag_hash() + "_" + self.spec_file_name,
)
if os.path.exists(yaml_path) and fs.can_write_to_dir(yaml_path):
self.write_spec(deprecated_spec, json_path)
try:
os.remove(yaml_path)
except (IOError, OSError) as err:
tty.debug("Could not remove deprecated {0}".format(yaml_path))
tty.debug(err)
elif os.path.exists(yaml_path):
return yaml_path
return json_path
return yaml_path if os.path.exists(yaml_path) else json_path
@contextmanager
def disable_upstream_check(self):

View File

@@ -58,9 +58,8 @@
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
from spack.spec import Spec
from spack.spec_list import InvalidSpecConstraintError, SpecList
from spack.spec_list import SpecList
from spack.util.path import substitute_path_variables
from spack.variant import UnknownVariantError
#: environment variable used to indicate the active environment
spack_env_var = "SPACK_ENV"
@@ -1625,10 +1624,10 @@ def _concretize_separately(self, tests=False):
# Concretize any new user specs that we haven't concretized yet
args, root_specs, i = [], [], 0
for uspec, uspec_constraints in zip(self.user_specs, self.user_specs.specs_as_constraints):
for uspec in self.user_specs:
if uspec not in old_concretized_user_specs:
root_specs.append(uspec)
args.append((i, [str(x) for x in uspec_constraints], tests))
args.append((i, str(uspec), tests))
i += 1
# Ensure we don't try to bootstrap clingo in parallel
@@ -2508,52 +2507,11 @@ def display_specs(specs):
print(tree_string)
def _concretize_from_constraints(spec_constraints, tests=False):
# Accept only valid constraints from list and concretize spec
# Get the named spec even if out of order
root_spec = [s for s in spec_constraints if s.name]
if len(root_spec) != 1:
m = "The constraints %s are not a valid spec " % spec_constraints
m += "concretization target. all specs must have a single name "
m += "constraint for concretization."
raise InvalidSpecConstraintError(m)
spec_constraints.remove(root_spec[0])
invalid_constraints = []
while True:
# Attach all anonymous constraints to one named spec
s = root_spec[0].copy()
for c in spec_constraints:
if c not in invalid_constraints:
s.constrain(c)
try:
return s.concretized(tests=tests)
except spack.spec.InvalidDependencyError as e:
invalid_deps_string = ["^" + d for d in e.invalid_deps]
invalid_deps = [
c
for c in spec_constraints
if any(c.satisfies(invd) for invd in invalid_deps_string)
]
if len(invalid_deps) != len(invalid_deps_string):
raise e
invalid_constraints.extend(invalid_deps)
except UnknownVariantError as e:
invalid_variants = e.unknown_variants
inv_variant_constraints = [
c for c in spec_constraints if any(name in c.variants for name in invalid_variants)
]
if len(inv_variant_constraints) != len(invalid_variants):
raise e
invalid_constraints.extend(inv_variant_constraints)
def _concretize_task(packed_arguments) -> Tuple[int, Spec, float]:
index, spec_constraints, tests = packed_arguments
spec_constraints = [Spec(x) for x in spec_constraints]
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = _concretize_from_constraints(spec_constraints, tests)
spec = Spec(spec_str).concretized(tests=tests)
return index, spec, time.time() - start

View File

@@ -451,7 +451,7 @@ def _process_external_package(pkg: "spack.package_base.PackageBase", explicit: b
# Add to the DB
tty.debug(f"{pre} registering into DB")
spack.store.STORE.db.add(spec, None, explicit=explicit)
spack.store.STORE.db.add(spec, explicit=explicit)
def _process_binary_cache_tarball(
@@ -488,13 +488,12 @@ def _process_binary_cache_tarball(
with timer.measure("install"), spack.util.path.filter_padding():
binary_distribution.extract_tarball(pkg.spec, download_result, force=False, timer=timer)
pkg.windows_establish_runtime_linkage()
if hasattr(pkg, "_post_buildcache_install_hook"):
pkg._post_buildcache_install_hook()
pkg.installed_from_binary_cache = True
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
spack.store.STORE.db.add(pkg.spec, explicit=explicit)
return True
@@ -1669,7 +1668,7 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
)
# Note: PARENT of the build process adds the new package to
# the database, so that we don't need to re-read from file.
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
spack.store.STORE.db.add(pkg.spec, explicit=explicit)
# If a compiler, ensure it is added to the configuration
if task.compiler:

View File

@@ -25,7 +25,6 @@
so package authors should use their judgement.
"""
import functools
import inspect
from contextlib import contextmanager
import spack.directives_meta
@@ -133,7 +132,7 @@ def __call__(self, package_or_builder_self, *args, **kwargs):
# its superclasses for successive calls. We don't have that
# information within `SpecMultiMethod`, because it is not
# associated with the package class.
for cls in inspect.getmro(package_or_builder_self.__class__)[1:]:
for cls in package_or_builder_self.__class__.__mro__[1:]:
superself = cls.__dict__.get(self.__name__, None)
if isinstance(superself, SpecMultiMethod):

View File

@@ -104,6 +104,7 @@
from spack.spec import InvalidSpecDetected, Spec
from spack.util.cpus import determine_number_of_jobs
from spack.util.executable import *
from spack.util.filesystem import file_command, fix_darwin_install_name, mime_type
from spack.variant import (
any_combination_of,
auto_or_any_combination_of,

View File

@@ -16,7 +16,6 @@
import glob
import hashlib
import importlib
import inspect
import io
import os
import re
@@ -117,11 +116,10 @@ def preferred_version(pkg: "PackageBase"):
Arguments:
pkg: The package whose versions are to be assessed.
"""
# Here we sort first on the fact that a version is marked
# as preferred in the package, then on the fact that the
# version is not develop, then lexicographically
key_fn = lambda v: (pkg.versions[v].get("preferred", False), not v.isdevelop(), v)
return max(pkg.versions, key=key_fn)
from spack.solver.asp import concretization_version_order
version, _ = max(pkg.versions.items(), key=concretization_version_order)
return version
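`max` over `dict.items()` with a tuple-valued key selects the best (version, info) pair in one pass. A stand-in key mimicking the old ordering (preferred first, then non-develop, then version order; version strings here compare lexicographically for simplicity, unlike real Spack versions):

    versions = {"2.0": {}, "1.5": {"preferred": True}, "develop": {}}

    def order(item):
        version, info = item
        return (info.get("preferred", False), version != "develop", version)

    best, _ = max(versions.items(), key=order)
    assert best == "1.5"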
class WindowsRPath:
@@ -246,10 +244,7 @@ def determine_spec_details(cls, prefix, objs_in_prefix):
if version_str:
objs_by_version[version_str].append(obj)
except Exception as e:
msg = (
"An error occurred when trying to detect " 'the version of "{0}" [{1}]'
)
tty.debug(msg.format(obj, str(e)))
tty.debug(f"Cannot detect the version of '{obj}' [{str(e)}]")
specs = []
for version_str, objs in objs_by_version.items():
@@ -262,27 +257,23 @@ def determine_spec_details(cls, prefix, objs_in_prefix):
if isinstance(variant, str):
variant = (variant, {})
variant_str, extra_attributes = variant
spec_str = "{0}@{1} {2}".format(cls.name, version_str, variant_str)
spec_str = f"{cls.name}@{version_str} {variant_str}"
# Pop a few reserved keys from extra attributes, since
# they have different semantics
external_path = extra_attributes.pop("prefix", None)
external_modules = extra_attributes.pop("modules", None)
try:
spec = spack.spec.Spec(
spec = spack.spec.Spec.from_detection(
spec_str,
external_path=external_path,
external_modules=external_modules,
extra_attributes=extra_attributes,
)
except Exception as e:
msg = 'Parsing failed [spec_str="{0}", error={1}]'
tty.debug(msg.format(spec_str, str(e)))
tty.debug(f'Parsing failed [spec_str="{spec_str}", error={str(e)}]')
else:
specs.append(
spack.spec.Spec.from_detection(
spec, extra_attributes=extra_attributes
)
)
specs.append(spec)
return sorted(specs)
@@ -885,7 +876,7 @@ def fullname(cls):
def fullnames(cls):
"""Fullnames for this package and any packages from which it inherits."""
fullnames = []
for cls in inspect.getmro(cls):
for cls in cls.__mro__:
namespace = getattr(cls, "namespace", None)
if namespace:
fullnames.append("%s.%s" % (namespace, cls.name))
@@ -1474,6 +1465,7 @@ def do_fetch(self, mirror_only=False):
checksum
and (self.version not in self.versions)
and (not isinstance(self.version, GitVersion))
and ("dev_path" not in self.spec.variants)
):
tty.warn(
"There is no checksum on file to fetch %s safely."
@@ -1750,7 +1742,7 @@ def _has_make_target(self, target):
bool: True if 'target' is found, else False
"""
# Prevent altering LC_ALL for 'make' outside this function
make = copy.deepcopy(inspect.getmodule(self).make)
make = copy.deepcopy(self.module.make)
# Use English locale for missing target message comparison
make.add_default_env("LC_ALL", "C")
@@ -1800,7 +1792,7 @@ def _if_make_target_execute(self, target, *args, **kwargs):
"""
if self._has_make_target(target):
# Execute target
inspect.getmodule(self).make(target, *args, **kwargs)
self.module.make(target, *args, **kwargs)
def _has_ninja_target(self, target):
"""Checks to see if 'target' is a valid target in a Ninja build script.
@@ -1811,7 +1803,7 @@ def _has_ninja_target(self, target):
Returns:
bool: True if 'target' is found, else False
"""
ninja = inspect.getmodule(self).ninja
ninja = self.module.ninja
# Check if we have a Ninja build script
if not os.path.exists("build.ninja"):
@@ -1840,7 +1832,7 @@ def _if_ninja_target_execute(self, target, *args, **kwargs):
"""
if self._has_ninja_target(target):
# Execute target
inspect.getmodule(self).ninja(target, *args, **kwargs)
self.module.ninja(target, *args, **kwargs)
def _get_needed_resources(self):
# We use intersects here cause it would also work if self.spec is abstract

View File

@@ -5,10 +5,11 @@
import stat
import warnings
import spack.config
import spack.error
import spack.repo
import spack.spec
from spack.config import ConfigError
from spack.util.path import canonicalize_path
from spack.version import Version
_lesser_spec_types = {"compiler": spack.spec.CompilerSpec, "version": Version}
@@ -154,44 +155,6 @@ def preferred_variants(cls, pkg_name):
)
def spec_externals(spec):
"""Return a list of external specs (w/external directory path filled in),
one for each known external installation.
"""
# break circular import.
from spack.util.module_cmd import path_from_modules # noqa: F401
def _package(maybe_abstract_spec):
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
return pkg_cls(maybe_abstract_spec)
allpkgs = spack.config.get("packages")
names = set([spec.name])
names |= set(vspec.name for vspec in _package(spec).virtuals_provided)
external_specs = []
for name in names:
pkg_config = allpkgs.get(name, {})
pkg_externals = pkg_config.get("externals", [])
for entry in pkg_externals:
spec_str = entry["spec"]
external_path = entry.get("prefix", None)
if external_path:
external_path = canonicalize_path(external_path)
external_modules = entry.get("modules", None)
external_spec = spack.spec.Spec.from_detection(
spack.spec.Spec(
spec_str, external_path=external_path, external_modules=external_modules
),
extra_attributes=entry.get("extra_attributes", {}),
)
if external_spec.intersects(spec):
external_specs.append(external_spec)
# Defensively copy returned specs
return [s.copy() for s in external_specs]
def is_spec_buildable(spec):
"""Return true if the spec is configured as buildable"""
allpkgs = spack.config.get("packages")

View File

@@ -4,7 +4,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import hashlib
import inspect
import os
import os.path
import pathlib
@@ -185,8 +184,8 @@ def __init__(
# search mro to look for the file
abs_path: Optional[str] = None
# At different times we call FilePatch on instances and classes
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
for cls in inspect.getmro(pkg_cls): # type: ignore
pkg_cls = pkg if isinstance(pkg, type) else pkg.__class__
for cls in pkg_cls.__mro__: # type: ignore
if not hasattr(cls, "module"):
# We've gone too far up the MRO
break

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
from ._functions import _host, by_name, platforms, prevent_cray_detection, reset
from ._functions import _host, by_name, platforms, reset
from ._platform import Platform
from .darwin import Darwin
from .freebsd import FreeBSD
@@ -23,7 +23,6 @@
"host",
"by_name",
"reset",
"prevent_cray_detection",
]
#: The "real" platform of the host running Spack. This should not be changed

View File

@@ -2,12 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import llnl.util.lang
import spack.util.environment
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
@@ -57,14 +53,3 @@ def by_name(name):
"""
platform_cls = cls_by_name(name)
return platform_cls() if platform_cls else None
@contextlib.contextmanager
def prevent_cray_detection():
"""Context manager that prevents the detection of the Cray platform"""
reset()
try:
with spack.util.environment.set_env(MODULEPATH=""):
yield
finally:
reset()

View File

@@ -12,7 +12,6 @@
import macholib.mach_o
import macholib.MachO
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -25,6 +24,7 @@
import spack.store
import spack.util.elf as elf
import spack.util.executable as executable
import spack.util.filesystem as ssys
import spack.util.path
from .relocate_text import BinaryFilePrefixReplacer, TextFilePrefixReplacer
@@ -664,7 +664,7 @@ def is_binary(filename):
Returns:
True or False
"""
m_type, _ = fs.mime_type(filename)
m_type, _ = ssys.mime_type(filename)
msg = "[{0}] -> ".format(filename)
if m_type == "application":
@@ -692,7 +692,7 @@ def fixup_macos_rpath(root, filename):
True if fixups were applied, else False
"""
abspath = os.path.join(root, filename)
if fs.mime_type(abspath) != ("application", "x-mach-binary"):
if ssys.mime_type(abspath) != ("application", "x-mach-binary"):
return False
# Get Mach-O header commands

View File

@@ -1281,7 +1281,7 @@ def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]
raise RepoError(msg) from e
cls = getattr(module, class_name)
if not inspect.isclass(cls):
if not isinstance(cls, type):
tty.die(f"{pkg_name}.{class_name} is not a class")
# Clear any prior changes to class attributes in case the class was loaded from the

View File

@@ -116,7 +116,7 @@ def rewire_node(spec, explicit):
# spec being added to look for mismatches)
spack.store.STORE.layout.write_spec(spec, spack.store.STORE.layout.spec_file_path(spec))
# add to database, not sure about explicit
spack.store.STORE.db.add(spec, spack.store.STORE.layout, explicit=explicit)
spack.store.STORE.db.add(spec, explicit=explicit)
# run post install hooks
spack.hooks.post_install(spec, explicit)

View File

@@ -3,22 +3,28 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module contains jsonschema files for all of Spack's YAML formats."""
import typing
import warnings
import llnl.util.lang
class DeprecationMessage(typing.NamedTuple):
message: str
error: bool
# jsonschema is imported lazily as it is heavy to import
# and increases the start-up time
def _make_validator():
import jsonschema
import spack.parser
def _validate_spec(validator, is_spec, instance, schema):
"""Check if the attributes on instance are valid specs."""
import jsonschema
import spack.parser
if not validator.is_type(instance, "object"):
return
@@ -32,27 +38,31 @@ def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
return
# Get a list of the deprecated properties, return if there is none
deprecated_properties = [x for x in instance if x in deprecated["properties"]]
if not deprecated_properties:
if not deprecated:
return
# Retrieve the template message
msg_str_or_func = deprecated["message"]
if isinstance(msg_str_or_func, str):
msg = msg_str_or_func.format(properties=deprecated_properties)
else:
msg = msg_str_or_func(instance, deprecated_properties)
if msg is None:
return
deprecations = {
name: DeprecationMessage(message=x["message"], error=x["error"])
for x in deprecated
for name in x["names"]
}
is_error = deprecated["error"]
if not is_error:
warnings.warn(msg)
else:
import jsonschema
# Get a list of the deprecated properties, return if there is none
issues = [entry for entry in instance if entry in deprecations]
if not issues:
return
yield jsonschema.ValidationError(msg)
# Process issues
errors = []
for name in issues:
msg = deprecations[name].message.format(name=name)
if deprecations[name].error:
errors.append(msg)
else:
warnings.warn(msg)
if errors:
yield jsonschema.ValidationError("\n".join(errors))
return jsonschema.validators.extend(
jsonschema.Draft4Validator,

View File

@@ -96,12 +96,14 @@
"binary_index_ttl": {"type": "integer", "minimum": 0},
"aliases": {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}},
},
"deprecatedProperties": {
"properties": ["concretizer"],
"message": "Spack supports only clingo as a concretizer from v0.23. "
"The config:concretizer config option is ignored.",
"error": False,
},
"deprecatedProperties": [
{
"names": ["concretizer"],
"message": "Spack supports only clingo as a concretizer from v0.23. "
"The config:concretizer config option is ignored.",
"error": False,
}
],
}
}
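End to end, the new list-based `deprecatedProperties` plugs into a validator extended with a custom keyword, as in `_make_validator` above. A self-contained sketch against the real `jsonschema` API:

    import warnings
    import jsonschema

    def _deprecated(validator, deprecated, instance, schema):
        if not validator.is_type(instance, "object") or not deprecated:
            return
        messages = {name: (d["message"], d["error"]) for d in deprecated for name in d["names"]}
        errors = []
        for name in instance:
            if name not in messages:
                continue
            msg, is_error = messages[name]
            if is_error:
                errors.append(msg.format(name=name))
            else:
                warnings.warn(msg.format(name=name))
        if errors:
            yield jsonschema.ValidationError("\n".join(errors))

    Validator = jsonschema.validators.extend(
        jsonschema.Draft4Validator, {"deprecatedProperties": _deprecated}
    )
    schema = {
        "type": "object",
        "deprecatedProperties": [
            {"names": ["concretizer"], "message": "'{name}' is ignored", "error": False}
        ],
    }
    Validator(schema).validate({"concretizer": "clingo"})  # warns, does not raise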

View File

@@ -109,7 +109,6 @@
"require": requirements,
"prefer": prefer_and_conflict,
"conflict": prefer_and_conflict,
"version": {}, # Here only to warn users on ignored properties
"target": {
"type": "array",
"default": [],
@@ -140,14 +139,6 @@
},
"variants": variants,
},
"deprecatedProperties": {
"properties": ["version"],
"message": "setting version preferences in the 'all' section of packages.yaml "
"is deprecated and will be removed in v0.23\n\n\tThese preferences "
"will be ignored by Spack. You can set them only in package-specific sections "
"of the same file.\n",
"error": False,
},
}
},
"patternProperties": {
@@ -165,14 +156,11 @@
# version strings
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"target": {}, # Here only to warn users on ignored properties
"compiler": {}, # Here only to warn users on ignored properties
"buildable": {"type": "boolean", "default": True},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": package_attributes,
"providers": {}, # Here only to warn users on ignored properties
"variants": variants,
"externals": {
"type": "array",
@@ -204,18 +192,6 @@
},
},
},
"deprecatedProperties": {
"properties": ["target", "compiler", "providers"],
"message": "setting 'compiler:', 'target:' or 'provider:' preferences in "
"a package-specific section of packages.yaml is deprecated, and will be "
"removed in v0.23.\n\n\tThese preferences will be ignored by Spack, and "
"can be set only in the 'all' section of the same file. "
"You can run:\n\n\t\t$ spack audit configs\n\n\tto get better diagnostics, "
"including files:lines where the deprecated attributes are used.\n\n"
"\tUse requirements to enforce conditions on specific packages: "
f"{REQUIREMENT_URL}\n",
"error": False,
},
}
},
}

View File

@@ -579,7 +579,7 @@ def _is_checksummed_version(version_info: Tuple[GitOrStandardVersion, dict]):
return _is_checksummed_git_version(version)
def _concretization_version_order(version_info: Tuple[GitOrStandardVersion, dict]):
def concretization_version_order(version_info: Tuple[GitOrStandardVersion, dict]):
"""Version order key for concretization, where preferred > not preferred,
not deprecated > deprecated, finite > any infinite component; only if all are
the same, do we use default version ordering."""
@@ -1022,6 +1022,102 @@ def __iter__(self):
ConditionSpecCache = Dict[str, Dict[ConditionSpecKey, ConditionIdFunctionPair]]
class ConstraintOrigin(enum.Enum):
"""Generates identifiers that can be pased into the solver attached
to constraints, and then later retrieved to determine the origin of
those constraints when ``SpecBuilder`` creates Specs from the solve
result.
"""
DEPENDS_ON = 1
REQUIRE = 2
@staticmethod
def _SUFFIXES() -> Dict["ConstraintOrigin", str]:
return {ConstraintOrigin.DEPENDS_ON: "_dep", ConstraintOrigin.REQUIRE: "_req"}
@staticmethod
def append_type_suffix(pkg_id: str, kind: "ConstraintOrigin") -> str:
"""Given a package identifier and a constraint kind, generate a string ID."""
suffix = ConstraintOrigin._SUFFIXES()[kind]
return f"{pkg_id}{suffix}"
@staticmethod
def strip_type_suffix(source: str) -> Tuple[int, Optional[str]]:
"""Take a combined package/type ID generated by
``append_type_suffix``, and extract the package ID and
an associated weight.
"""
if not source:
return -1, None
for kind, suffix in ConstraintOrigin._SUFFIXES().items():
if source.endswith(suffix):
return kind.value, source[: -len(suffix)]
return -1, source
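Round-tripping a package identifier through the two helpers above (package name hypothetical):

    tagged = ConstraintOrigin.append_type_suffix("mpileaks", ConstraintOrigin.DEPENDS_ON)
    assert tagged == "mpileaks_dep"
    kind, pkg_id = ConstraintOrigin.strip_type_suffix(tagged)
    assert (kind, pkg_id) == (ConstraintOrigin.DEPENDS_ON.value, "mpileaks")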
class SourceContext:
"""Tracks context in which a Spec's clause-set is generated (i.e.
with ``SpackSolverSetup.spec_clauses``).
Facts generated for the spec may include this context.
"""
def __init__(self):
# This can be "literal" for constraints that come from a user
# spec (e.g. from the command line); it can be the output of
# `ConstraintOrigin.append_type_suffix`; the default is "none"
# (which means it isn't important to keep track of the source
# in that case).
self.source = "none"
class ConditionIdContext(SourceContext):
"""Derived from a ``ConditionContext``: for clause-sets generated by
imposed/required specs, stores an associated transform.
This is primarily used for tracking whether we are generating clauses
in the context of a required spec, or for an imposed spec.
Is not a subclass of ``ConditionContext`` because it exists in a
lower-level context with less information.
"""
def __init__(self):
super().__init__()
self.transform = None
class ConditionContext(SourceContext):
"""Tracks context in which a condition (i.e. ``SpackSolverSetup.condition``)
is generated (e.g. for a `depends_on`).
This may modify the required/imposed specs generated as relevant
for the context.
"""
def __init__(self):
super().__init__()
# transformation applied to facts from the required spec. Defaults
# to leave facts as they are.
self.transform_required = None
# transformation applied to facts from the imposed spec. Defaults
# to removing "node" and "virtual_node" facts.
self.transform_imposed = None
def requirement_context(self) -> ConditionIdContext:
ctxt = ConditionIdContext()
ctxt.source = self.source
ctxt.transform = self.transform_required
return ctxt
def impose_context(self) -> ConditionIdContext:
ctxt = ConditionIdContext()
ctxt.source = self.source
ctxt.transform = self.transform_imposed
return ctxt
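How the two context classes are wired together in practice, mirroring the `depends_on` call site later in this diff; the transforms here are no-op stand-ins for the real fact-rewriting functions:

    def track_dependencies(spec, clauses):   # stand-in: the real one adds tracking facts
        return clauses

    def dependency_holds(spec, clauses):     # stand-in: the real one filters node facts
        return clauses

    context = ConditionContext()
    context.source = ConstraintOrigin.append_type_suffix("mpileaks", ConstraintOrigin.DEPENDS_ON)
    context.transform_required = track_dependencies
    context.transform_imposed = dependency_holds

    requirement_ctx = context.requirement_context()  # carries transform_required
    impose_ctx = context.impose_context()            # carries transform_imposed
    assert requirement_ctx.source == impose_ctx.source == "mpileaks_dep"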
class SpackSolverSetup:
"""Class to set up and run a Spack concretization solve."""
@@ -1197,8 +1293,9 @@ def compiler_facts(self):
if compiler.compiler_obj is not None:
c = compiler.compiler_obj
for flag_type, flags in c.flags.items():
flag_group = " ".join(flags)
for flag in flags:
self.gen.fact(fn.compiler_flag(compiler_id, flag_type, flag))
self.gen.fact(fn.compiler_flag(compiler_id, flag_type, flag, flag_group))
if compiler.available:
self.gen.fact(fn.compiler_available(compiler_id))
@@ -1375,7 +1472,7 @@ def _get_condition_id(
named_cond: spack.spec.Spec,
cache: ConditionSpecCache,
body: bool,
transform: Optional[TransformFunction] = None,
context: ConditionIdContext,
) -> int:
"""Get the id for one half of a condition (either a trigger or an imposed constraint).
@@ -1389,15 +1486,15 @@ def _get_condition_id(
"""
pkg_cache = cache[named_cond.name]
named_cond_key = (str(named_cond), transform)
named_cond_key = (str(named_cond), context.transform)
result = pkg_cache.get(named_cond_key)
if result:
return result[0]
cond_id = next(self._id_counter)
requirements = self.spec_clauses(named_cond, body=body)
if transform:
requirements = transform(named_cond, requirements)
requirements = self.spec_clauses(named_cond, body=body, context=context)
if context.transform:
requirements = context.transform(named_cond, requirements)
pkg_cache[named_cond_key] = (cond_id, requirements)
return cond_id
@@ -1408,8 +1505,7 @@ def condition(
imposed_spec: Optional[spack.spec.Spec] = None,
name: Optional[str] = None,
msg: Optional[str] = None,
transform_required: Optional[TransformFunction] = None,
transform_imposed: Optional[TransformFunction] = remove_node,
context: Optional[ConditionContext] = None,
):
"""Generate facts for a dependency or virtual provider condition.
@@ -1418,10 +1514,8 @@ def condition(
imposed_spec: the constraints that are imposed when this condition is triggered
name: name for `required_spec` (required if required_spec is anonymous, ignored if not)
msg: description of the condition
transform_required: transformation applied to facts from the required spec. Defaults
to leave facts as they are.
transform_imposed: transformation applied to facts from the imposed spec. Defaults
to removing "node" and "virtual_node" facts.
context: if provided, indicates how to modify the clause-sets for the required/imposed
specs based on the type of constraint they are generated for (e.g. `depends_on`)
Returns:
int: id of the condition created by this function
"""
@@ -1429,14 +1523,19 @@ def condition(
if not name:
raise ValueError(f"Must provide a name for anonymous condition: '{required_spec}'")
if not context:
context = ConditionContext()
context.transform_imposed = remove_node
with spec_with_name(required_spec, name):
# Check if we can emit the requirements before updating the condition ID counter.
# In this way, if a condition can't be emitted but the exception is handled in the
# caller, we won't emit partial facts.
condition_id = next(self._id_counter)
requirement_context = context.requirement_context()
trigger_id = self._get_condition_id(
required_spec, cache=self._trigger_cache, body=True, transform=transform_required
required_spec, cache=self._trigger_cache, body=True, context=requirement_context
)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
@@ -1446,8 +1545,9 @@ def condition(
if not imposed_spec:
return condition_id
impose_context = context.impose_context()
effect_id = self._get_condition_id(
imposed_spec, cache=self._effect_cache, body=False, transform=transform_imposed
imposed_spec, cache=self._effect_cache, body=False, context=impose_context
)
self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_effect(condition_id, effect_id))
@@ -1455,8 +1555,8 @@ def condition(
return condition_id
def impose(self, condition_id, imposed_spec, node=True, name=None, body=False):
imposed_constraints = self.spec_clauses(imposed_spec, body=body, required_from=name)
def impose(self, condition_id, imposed_spec, node=True, body=False):
imposed_constraints = self.spec_clauses(imposed_spec, body=body)
for pred in imposed_constraints:
# imposed "node"-like conditions are no-ops
if not node and pred.args[0] in ("node", "virtual_node"):
@@ -1528,14 +1628,14 @@ def dependency_holds(input_spec, requirements):
if t & depflag
]
self.condition(
cond,
dep.spec,
name=pkg.name,
msg=msg,
transform_required=track_dependencies,
transform_imposed=dependency_holds,
context = ConditionContext()
context.source = ConstraintOrigin.append_type_suffix(
pkg.name, ConstraintOrigin.DEPENDS_ON
)
context.transform_required = track_dependencies
context.transform_imposed = dependency_holds
self.condition(cond, dep.spec, name=pkg.name, msg=msg, context=context)
self.gen.newline()
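For orientation, a minimal sketch of the `ConstraintOrigin` helpers called above, inferred from the call sites in this diff (the actual enum is defined earlier in asp.py and may differ in detail):

import enum
from typing import Optional, Tuple

class ConstraintOrigin(enum.Enum):
    # Kinds of package.py constraints whose origin gets tagged onto fact sources.
    DEPENDS_ON = 1
    REQUIRE = 2

    @staticmethod
    def _suffixes() -> dict:
        return {ConstraintOrigin.DEPENDS_ON: "_dep", ConstraintOrigin.REQUIRE: "_req"}

    @staticmethod
    def append_type_suffix(pkg_id: str, kind: "ConstraintOrigin") -> str:
        # Tag a package id with the constraint kind, e.g. "mpich" -> "mpich_dep".
        return pkg_id + ConstraintOrigin._suffixes()[kind]

    @staticmethod
    def strip_type_suffix(source: str) -> Tuple[int, Optional[str]]:
        # Invert append_type_suffix: recover (kind index, bare package id);
        # untagged sources get index -1 so they sort ahead in _order_index.
        for kind, suffix in ConstraintOrigin._suffixes().items():
            if source.endswith(suffix):
                return kind.value, source[: -len(suffix)]
        return -1, source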
@@ -1613,17 +1713,21 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
when_spec = spack.spec.Spec(pkg_name)
try:
# With virtual we want to emit "node" and "virtual_node" in imposed specs
transform: Optional[TransformFunction] = remove_node
if virtual:
transform = None
context = ConditionContext()
context.source = ConstraintOrigin.append_type_suffix(
pkg_name, ConstraintOrigin.REQUIRE
)
if not virtual:
context.transform_imposed = remove_node
# else: for virtuals we want to emit "node" and
# "virtual_node" in imposed specs
member_id = self.condition(
required_spec=when_spec,
imposed_spec=spec,
name=pkg_name,
transform_imposed=transform,
msg=f"{input_spec} is a requirement for package {pkg_name}",
context=context,
)
except Exception as e:
# Do not raise if the rule comes from the 'all' subsection, since usability
@@ -1678,6 +1782,10 @@ def external_packages(self):
if pkg_name not in spack.repo.PATH:
continue
# This package is not among possible dependencies
if pkg_name not in self.pkgs:
continue
# Check if the external package is buildable. If it is
# not then "external(<pkg>)" is a fact, unless we can
# reuse an already installed spec.
@@ -1718,7 +1826,9 @@ def external_imposition(input_spec, requirements):
]
try:
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
context = ConditionContext()
context.transform_imposed = external_imposition
self.condition(spec, spec, msg=msg, context=context)
except (spack.error.SpecError, RuntimeError) as e:
warnings.warn(f"while setting up external spec {spec}: {e}")
continue
@@ -1793,6 +1903,7 @@ def spec_clauses(
expand_hashes: bool = False,
concrete_build_deps=False,
required_from: Optional[str] = None,
context: Optional[SourceContext] = None,
) -> List[AspFunction]:
"""Wrap a call to `_spec_clauses()` into a try/except block with better error handling.
@@ -1808,6 +1919,7 @@ def spec_clauses(
transitive=transitive,
expand_hashes=expand_hashes,
concrete_build_deps=concrete_build_deps,
context=context,
)
except RuntimeError as exc:
msg = str(exc)
@@ -1824,6 +1936,7 @@ def _spec_clauses(
transitive: bool = True,
expand_hashes: bool = False,
concrete_build_deps: bool = False,
context: Optional[SourceContext] = None,
) -> List[AspFunction]:
"""Return a list of clauses for a spec mandates are true.
@@ -1835,6 +1948,8 @@ def _spec_clauses(
expand_hashes: if True, descend into hashes of concrete specs (default False)
concrete_build_deps: if False, do not include pure build deps of concrete specs
(as they have no effect on runtime constraints)
context: tracks what constraint this clause set is generated for (e.g. a
`depends_on` constraint in a package.py file)
Normally, if called with ``transitive=True``, ``spec_clauses()`` just generates
hashes for the dependency requirements of concrete specs. If ``expand_hashes``
@@ -1921,13 +2036,19 @@ def _spec_clauses(
self.compiler_version_constraints.add(spec.compiler)
# compiler flags
source = context.source if context else "none"
for flag_type, flags in spec.compiler_flags.items():
flag_group = " ".join(flags)
for flag in flags:
clauses.append(f.node_flag(spec.name, flag_type, flag))
clauses.append(
f.node_flag(spec.name, fn.node_flag(flag_type, flag, flag_group, source))
)
if not spec.concrete and flag.propagate is True:
clauses.append(
f.propagate(
spec.name, fn.node_flag(flag_type, flag), fn.edge_types("link", "run")
spec.name,
fn.node_flag(flag_type, flag, flag_group, source),
fn.edge_types("link", "run"),
)
)
@@ -2009,6 +2130,7 @@ def _spec_clauses(
body=body,
expand_hashes=expand_hashes,
concrete_build_deps=concrete_build_deps,
context=context,
)
)
@@ -2026,7 +2148,7 @@ def define_package_versions_and_validate_preferences(
# like being a "develop" version or being preferred exist only at a
# package.py level, sort them in this partial list here
package_py_versions = sorted(
pkg_cls.versions.items(), key=_concretization_version_order, reverse=True
pkg_cls.versions.items(), key=concretization_version_order, reverse=True
)
if require_checksum and pkg_cls.has_code:
@@ -2644,7 +2766,9 @@ def literal_specs(self, specs):
effect_id, requirements = cache[imposed_spec_key]
else:
effect_id = next(self._id_counter)
requirements = self.spec_clauses(spec)
context = SourceContext()
context.source = "literal"
requirements = self.spec_clauses(spec, context=context)
root_name = spec.name
for clause in requirements:
clause_name = clause.args[0]
@@ -3344,7 +3468,6 @@ def __init__(self, specs, hash_lookup=None):
self._result = None
self._command_line_specs = specs
self._flag_sources = collections.defaultdict(lambda: set())
self._flag_compiler_defaults = set()
# Pass in as arguments reusable specs and plug them in
# from this dictionary during reconstruction
@@ -3402,14 +3525,10 @@ def node_compiler_version(self, node, compiler, version):
self._specs[node].compiler = spack.spec.CompilerSpec(compiler)
self._specs[node].compiler.versions = vn.VersionList([vn.Version(version)])
def node_flag_compiler_default(self, node):
self._flag_compiler_defaults.add(node)
def node_flag(self, node, flag_type, flag):
self._specs[node].compiler_flags.add_flag(flag_type, flag, False)
def node_flag_source(self, node, flag_type, source):
self._flag_sources[(node, flag_type)].add(source)
def node_flag(self, node, node_flag):
self._specs[node].compiler_flags.add_flag(
node_flag.flag_type, node_flag.flag, False, node_flag.flag_group, node_flag.source
)
def external_spec_selected(self, node, idx):
"""This means that the external spec and index idx has been selected for this package."""
@@ -3450,15 +3569,23 @@ def virtual_on_edge(self, parent_node, provider_node, virtual):
dependencies[0].update_virtuals((virtual,))
def reorder_flags(self):
"""Order compiler flags on specs in predefined order.
We order flags so that any node's flags will take priority over
those of its dependents. That is, the deepest node in the DAG's
flags will appear last on the compile line, in the order they
were specified.
"""For each spec, determine the order of compiler flags applied to it.
The solver determines which flags are on nodes; this routine
imposes order afterwards.
imposes order afterwards. The order is:
1. Flags applied in compiler definitions should come first
2. Flags applied by dependents are ordered topologically (with a
dependency on `traverse` to resolve the partial order into a
stable total order)
3. Flags from requirements are then applied (requirements always
come from the package and never a parent)
4. Command-line flags should come last
Additionally, for each source (requirements, compiler, command line, and
dependents), flags from that source should retain their order and grouping:
e.g. for `y cflags="-z -a"` "-z" and "-a" should never have any intervening
flags inserted, and should always appear in that order.
"""
# reverse compilers so we get highest priority compilers that share a spec
compilers = dict(
@@ -3473,40 +3600,78 @@ def reorder_flags(self):
flagmap_from_compiler = compilers[spec.compiler].flags
for flag_type in spec.compiler_flags.valid_compiler_flags():
from_compiler = flagmap_from_compiler.get(flag_type, [])
from_sources = []
# order is determined by the DAG. A spec's flags come after any of its ancestors
# on the compile line
node = SpecBuilder.make_node(pkg=spec.name)
source_key = (node, flag_type)
if source_key in self._flag_sources:
order = [
SpecBuilder.make_node(pkg=s.name)
for s in spec.traverse(order="post", direction="parents")
]
sorted_sources = sorted(
self._flag_sources[source_key], key=lambda s: order.index(s)
ordered_flags = []
# 1. Put compiler flags first
from_compiler = tuple(flagmap_from_compiler.get(flag_type, []))
extend_flag_list(ordered_flags, from_compiler)
# 2. Add all sources (the compiler is one of them, so skip any
# flag group that matches it exactly)
flag_groups = set()
for flag in self._specs[node].compiler_flags.get(flag_type, []):
flag_groups.add(
spack.spec.CompilerFlag(
flag.flag_group,
propagate=flag.propagate,
flag_group=flag.flag_group,
source=flag.source,
)
)
# add flags from each source, lowest to highest precedence
for node in sorted_sources:
all_src_flags = list()
per_pkg_sources = [self._specs[node]]
if node.pkg in cmd_specs:
per_pkg_sources.append(cmd_specs[node.pkg])
for source in per_pkg_sources:
all_src_flags.extend(source.compiler_flags.get(flag_type, []))
extend_flag_list(from_sources, all_src_flags)
# For flags that are applied by dependents, put flags from parents
# before children; we depend on the stability of traverse() to
# achieve a stable flag order for flags introduced in this manner.
topo_order = list(s.name for s in spec.traverse(order="post", direction="parents"))
lex_order = list(sorted(flag_groups))
def _order_index(flag_group):
source = flag_group.source
# Note: if 'require: ^dependency cflags=...' is ever possible,
# this will topologically sort for require as well
type_index, pkg_source = ConstraintOrigin.strip_type_suffix(source)
if pkg_source in topo_order:
major_index = topo_order.index(pkg_source)
# If for x->y, x has multiple depends_on declarations that
# are activated, and each adds cflags to y, we fall back on
# alphabetical ordering to maintain a total order
minor_index = lex_order.index(flag_group)
else:
major_index = len(topo_order) + lex_order.index(flag_group)
minor_index = 0
return (type_index, major_index, minor_index)
prioritized_groups = sorted(flag_groups, key=lambda x: _order_index(x))
for grp in prioritized_groups:
grp_flags = tuple(
x for (x, y) in spack.compiler.tokenize_flags(grp.flag_group)
)
if grp_flags == from_compiler:
continue
as_compiler_flags = list(
spack.spec.CompilerFlag(
x,
propagate=grp.propagate,
flag_group=grp.flag_group,
source=grp.source,
)
for x in grp_flags
)
extend_flag_list(ordered_flags, as_compiler_flags)
# 3. Now put cmd-line flags last
if node.pkg in cmd_specs:
cmd_flags = cmd_specs[node.pkg].compiler_flags.get(flag_type, [])
extend_flag_list(ordered_flags, cmd_flags)
# compiler flags from compilers config are lowest precedence
ordered_compiler_flags = list(llnl.util.lang.dedupe(from_compiler + from_sources))
compiler_flags = spec.compiler_flags.get(flag_type, [])
msg = "%s does not equal %s" % (set(compiler_flags), set(ordered_flags))
assert set(compiler_flags) == set(ordered_flags), msg
msg = f"{set(compiler_flags)} does not equal {set(ordered_compiler_flags)}"
assert set(compiler_flags) == set(ordered_compiler_flags), msg
spec.compiler_flags.update({flag_type: ordered_compiler_flags})
spec.compiler_flags.update({flag_type: ordered_flags})
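A worked example of the resulting order (hypothetical package names and flags):

# Suppose node y accumulates cflags from four origins:
#   compiler definition:        "-fPIC"
#   dependent x (depends_on):   "-O2 -g"    (source "x_dep", one flag_group)
#   requirement on y itself:    "-DNDEBUG"  (source "y_req")
#   command line:               "-Wall"     (source "literal")
# reorder_flags() should then leave y.compiler_flags["cflags"] as
#   ["-fPIC", "-O2", "-g", "-DNDEBUG", "-Wall"]
# with "-O2" and "-g" contiguous and in their original order, since they
# share a flag_group.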
def deprecated(self, node: NodeArgument, version: str) -> None:
tty.warn(f'using "{node.pkg}@{version}" which is a deprecated version')
@@ -3570,10 +3735,9 @@ def build_specs(self, function_tuples):
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it except for node_flag_source,
# since node_flag_source is tracking information not in the spec itself
# do not bother calling actions on it
spec = self._specs.get(args[0])
if spec and spec.concrete and name != "node_flag_source":
if spec and spec.concrete:
continue
action(*args)
@@ -3633,7 +3797,8 @@ def _develop_specs_from_env(spec, env):
assert spec.variants["dev_path"].value == path, error_msg
else:
spec.variants.setdefault("dev_path", spack.variant.SingleValuedVariant("dev_path", path))
spec.constrain(dev_info["spec"])
assert spec.satisfies(dev_info["spec"])
def _is_reusable(spec: spack.spec.Spec, packages, local: bool) -> bool:

View File

@@ -43,9 +43,7 @@
internal_error("Only nodes can have node_compiler_version").
:- attr("variant_value", PackageNode, _, _), not attr("node", PackageNode),
internal_error("variant_value true for a non-node").
:- attr("node_flag_compiler_default", PackageNode), not attr("node", PackageNode),
internal_error("node_flag_compiler_default true for non-node").
:- attr("node_flag", PackageNode, _, _), not attr("node", PackageNode),
:- attr("node_flag", PackageNode, _), not attr("node", PackageNode),
internal_error("node_flag assigned for non-node").
:- attr("external_spec_selected", PackageNode, _), not attr("node", PackageNode),
internal_error("external_spec_selected for non-node").
@@ -53,10 +51,6 @@
internal_error("non-node depends on something").
:- attr("depends_on", _, ChildNode, _), not attr("node", ChildNode),
internal_error("something depends_on a non-node").
:- attr("node_flag_source", Node, _, _), not attr("node", Node),
internal_error("node_flag_source assigned for a non-node").
:- attr("node_flag_source", _, _, SourceNode), not attr("node", SourceNode),
internal_error("node_flag_source assigned with a non-node source").
:- attr("virtual_node", VirtualNode), not provider(_, VirtualNode),
internal_error("virtual node with no provider").
:- provider(_, VirtualNode), not attr("virtual_node", VirtualNode),
@@ -154,7 +148,6 @@ unification_set(SetID, VirtualNode)
% TODO: literals, at the moment, can only influence the "root" unification set. This needs to be extended later.
% Node attributes that have multiple node arguments (usually, only the first argument is a node)
multiple_nodes_attribute("node_flag_source").
multiple_nodes_attribute("depends_on").
multiple_nodes_attribute("virtual_on_edge").
multiple_nodes_attribute("provider_set").
@@ -392,7 +385,6 @@ trigger_condition_holds(ID, RequestorNode) :-
attr(Name, node(X, A1), A2, A3) : condition_requirement(ID, Name, A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name);
attr(Name, node(X, A1), A2, A3, A4) : condition_requirement(ID, Name, A1, A2, A3, A4), condition_nodes(ID, PackageNode, node(X, A1));
% Special cases
attr("node_flag_source", node(X, A1), A2, node(Y, A3)) : condition_requirement(ID, "node_flag_source", A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), condition_nodes(ID, PackageNode, node(Y, A3));
not cannot_hold(ID, PackageNode).
condition_holds(ConditionID, node(X, Package))
@@ -440,13 +432,6 @@ attr(Name, node(X, A1), A2) :- impose(ID, PackageNode), imposed_constrai
attr(Name, node(X, A1), A2, A3) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3), imposed_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name).
attr(Name, node(X, A1), A2, A3, A4) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3, A4), imposed_nodes(ID, PackageNode, node(X, A1)).
% For node flag sources we need to look at the condition_set of the source, since it is the dependent
% of the package on which I want to impose the constraint
attr("node_flag_source", node(X, A1), A2, node(Y, A3))
:- impose(ID, node(X, A1)),
imposed_constraint(ID, "node_flag_source", A1, A2, A3),
condition_set(node(Y, A3), node(X, A1)).
% Provider set is relevant only for literals, since it's the only place where `^[virtuals=foo] bar`
% might appear in the HEAD of a rule
attr("provider_set", node(min_dupe_id, Provider), node(min_dupe_id, Virtual))
@@ -487,8 +472,8 @@ virtual_condition_holds(node(Y, A2), Virtual)
% we cannot have additional flag values when we are working with concrete specs
:- attr("node", node(ID, Package)),
attr("hash", node(ID, Package), Hash),
attr("node_flag", node(ID, Package), FlagType, Flag),
not imposed_constraint(Hash, "node_flag", Package, FlagType, Flag),
attr("node_flag", node(ID, Package), node_flag(FlagType, Flag, _, _)),
not imposed_constraint(Hash, "node_flag", Package, node_flag(FlagType, Flag, _, _)),
internal_error("imposed hash without imposing all flag values").
#defined condition/2.
@@ -789,22 +774,15 @@ required_provider(Provider, Virtual)
:- provider(node(Y, Package), node(X, Virtual)), required_provider(Provider, Virtual), Package != Provider.
% TODO: the following two choice rules allow the solver to add compiler
% TODO: the following choice rule allows the solver to add compiler
% flags if their only source is from a requirement. This is overly-specific
% and should use a more-generic approach like in https://github.com/spack/spack/pull/37180
{ attr("node_flag", node(ID, Package), FlagType, FlagValue) } :-
{ attr("node_flag", node(ID, Package), NodeFlag) } :-
requirement_group_member(ConditionID, Package, RequirementID),
activate_requirement(node(ID, Package), RequirementID),
pkg_fact(Package, condition_effect(ConditionID, EffectID)),
imposed_constraint(EffectID, "node_flag_set", Package, FlagType, FlagValue).
{ attr("node_flag_source", node(NodeID1, Package1), FlagType, node(NodeID2, Package2)) } :-
requirement_group_member(ConditionID, Package1, RequirementID),
activate_requirement(node(NodeID1, Package1), RequirementID),
pkg_fact(Package1, condition_effect(ConditionID, EffectID)),
imposed_constraint(EffectID, "node_flag_source", Package1, FlagType, Package2),
imposed_nodes(EffectID, node(NodeID2, Package2), node(NodeID1, Package1)).
imposed_constraint(EffectID, "node_flag_set", Package, NodeFlag).
requirement_weight(node(ID, Package), Group, W) :-
W = #min {
@@ -1050,23 +1028,22 @@ variant_is_propagated(PackageNode, Variant) :-
% 1. The same flag type is not set on this node
% 2. This node has the same compiler as the propagation source
propagated_flag(node(PackageID, Package), node_flag(FlagType, Flag), SourceNode) :-
propagate(node(PackageID, Package), node_flag(FlagType, Flag), _),
not attr("node_flag_set", node(PackageID, Package), FlagType, _),
propagated_flag(node(PackageID, Package), node_flag(FlagType, Flag, FlagGroup, Source), SourceNode) :-
propagate(node(PackageID, Package), node_flag(FlagType, Flag, FlagGroup, Source), _),
not attr("node_flag_set", node(PackageID, Package), node_flag(FlagType, _, _, "literal")),
% Same compiler as propagation source
node_compiler(node(PackageID, Package), CompilerID),
node_compiler(SourceNode, CompilerID),
attr("propagate", SourceNode, node_flag(FlagType, Flag), _),
attr("propagate", SourceNode, node_flag(FlagType, Flag, FlagGroup, Source), _),
node(PackageID, Package) != SourceNode,
not runtime(Package).
attr("node_flag", PackageNode, FlagType, Flag) :- propagated_flag(PackageNode, node_flag(FlagType, Flag), _).
attr("node_flag_source", PackageNode, FlagType, SourceNode) :- propagated_flag(PackageNode, node_flag(FlagType, _), SourceNode).
attr("node_flag", PackageNode, NodeFlag) :- propagated_flag(PackageNode, NodeFlag, _).
% Cannot propagate the same flag from two distinct sources
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, Package, FlagType) :-
propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source1)),
propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source2)),
propagated_flag(node(ID, Package), node_flag(FlagType, _, _, _), node(_, Source1)),
propagated_flag(node(ID, Package), node_flag(FlagType, _, _, _), node(_, Source2)),
Source1 < Source2.
%----
@@ -1353,32 +1330,18 @@ error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_miss
% Compiler flags
%-----------------------------------------------------------------------------
% remember where flags came from
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag", PackageNode, FlagType, _), attr("hash", PackageNode, _).
% compiler flags from compilers.yaml are put on nodes if compiler matches
attr("node_flag", PackageNode, FlagType, Flag)
:- compiler_flag(CompilerID, FlagType, Flag),
attr("node_flag", PackageNode, node_flag(FlagType, Flag, FlagGroup, CompilerID))
:- compiler_flag(CompilerID, FlagType, Flag, FlagGroup),
node_compiler(PackageNode, CompilerID),
flag_type(FlagType),
compiler_id(CompilerID),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
attr("node_flag_compiler_default", PackageNode)
:- not attr("node_flag_set", PackageNode, FlagType, _),
compiler_flag(CompilerID, FlagType, Flag),
node_compiler(PackageNode, CompilerID),
flag_type(FlagType),
compiler_id(CompilerID),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
attr("node_flag", PackageNode, NodeFlag) :- attr("node_flag_set", PackageNode, NodeFlag).
% Flag set to something
attr("node_flag", PackageNode, FlagType, Flag) :- attr("node_flag_set", PackageNode, FlagType, Flag).
#defined compiler_flag/3.
#defined compiler_flag/4.
%-----------------------------------------------------------------------------

View File

@@ -230,6 +230,13 @@ class NodeArgument(NamedTuple):
pkg: str
class NodeFlag(NamedTuple):
flag_type: str
flag: str
flag_group: str
source: str
def intermediate_repr(sym):
"""Returns an intermediate representation of clingo models for Spack's spec builder.
@@ -248,6 +255,13 @@ def intermediate_repr(sym):
return NodeArgument(
id=intermediate_repr(sym.arguments[0]), pkg=intermediate_repr(sym.arguments[1])
)
elif sym.name == "node_flag":
return NodeFlag(
flag_type=intermediate_repr(sym.arguments[0]),
flag=intermediate_repr(sym.arguments[1]),
flag_group=intermediate_repr(sym.arguments[2]),
source=intermediate_repr(sym.arguments[3]),
)
except RuntimeError:
# This happens when using clingo w/ CFFI and trying to access ".name" for symbols
# that are not functions
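As an illustration (the symbol below is hypothetical), an arity-four node_flag symbol round-trips to the named tuple:

# node_flag("cflags", "-g", "-g -O2", "literal")
#   -> NodeFlag(flag_type="cflags", flag="-g", flag_group="-g -O2", source="literal")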

View File

@@ -13,6 +13,7 @@
#show attr/2.
#show attr/3.
#show attr/4.
#show attr/5.
% names of optimization criteria
#show opt_criterion/2.

View File

@@ -51,6 +51,7 @@
import collections
import collections.abc
import enum
import io
import itertools
import os
import pathlib
@@ -780,17 +781,49 @@ class CompilerFlag(str):
propagate (bool): if ``True`` the flag value will
be passed to the package's dependencies. If
``False`` it will not.
flag_group (str): if this flag was introduced along
with several flags via a single source, then
this will store all such flags
source (str): identifies the type of constraint that
introduced this flag (e.g. if a package has
``depends_on(... cflags=-g)``, then the ``source``
for "-g" would indicate ``depends_on``.
"""
def __new__(cls, value, **kwargs):
obj = str.__new__(cls, value)
obj.propagate = kwargs.pop("propagate", False)
obj.flag_group = kwargs.pop("flag_group", value)
obj.source = kwargs.pop("source", None)
return obj
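For example, a sketch of the new attributes (values mirror what add_flag passes in below):

flag = CompilerFlag("-g", propagate=False, flag_group="-g -O2", source="literal")
assert flag == "-g"                 # still compares as a plain string
assert flag.flag_group == "-g -O2"  # remembers the group it arrived with
assert flag.source == "literal"     # and the kind of constraint that set it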
_valid_compiler_flags = ["cflags", "cxxflags", "fflags", "ldflags", "ldlibs", "cppflags"]
def _shared_subset_pair_iterate(container1, container2):
"""
[0, a, c, d, f]
[a, d, e, f]
yields [(a, a), (d, d), (f, f)]
no repeated elements
"""
a_idx, b_idx = 0, 0
max_a, max_b = len(container1), len(container2)
while a_idx < max_a and b_idx < max_b:
if container1[a_idx] == container2[b_idx]:
yield (container1[a_idx], container2[b_idx])
a_idx += 1
b_idx += 1
else:
while container1[a_idx] < container2[b_idx]:
a_idx += 1
while container1[a_idx] > container2[b_idx]:
b_idx += 1
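Usage sketch (both inputs must be sorted for the merge-style walk to terminate correctly):

pairs = list(_shared_subset_pair_iterate(["a", "c", "d", "f"], ["a", "d", "e", "f"]))
assert pairs == [("a", "a"), ("d", "d"), ("f", "f")]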
class FlagMap(lang.HashableMap):
__slots__ = ("spec",)
@@ -799,23 +832,9 @@ def __init__(self, spec):
self.spec = spec
def satisfies(self, other):
return all(f in self and self[f] == other[f] for f in other)
return all(f in self and set(self[f]) >= set(other[f]) for f in other)
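Under the new subset semantics, for example (illustrative abstract specs):

# A flag map carrying "-O2 -g" now satisfies one asking only for "-g":
Spec('pkg cflags="-O2 -g"').satisfies('pkg cflags="-g"')  # True
Spec('pkg cflags="-g"').satisfies('pkg cflags="-O2 -g"')  # False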
def intersects(self, other):
common_types = set(self) & set(other)
for flag_type in common_types:
if not self[flag_type] or not other[flag_type]:
# At least one of the two is empty
continue
if self[flag_type] != other[flag_type]:
return False
if not all(
f1.propagate == f2.propagate for f1, f2 in zip(self[flag_type], other[flag_type])
):
# At least one propagation flag didn't match
return False
return True
def constrain(self, other):
@@ -823,28 +842,28 @@ def constrain(self, other):
Return whether the spec changed.
"""
if other.spec and other.spec._concrete:
for k in self:
if k not in other:
raise UnsatisfiableCompilerFlagSpecError(self[k], "<absent>")
changed = False
for k in other:
if k in self and not set(self[k]) <= set(other[k]):
raise UnsatisfiableCompilerFlagSpecError(
" ".join(f for f in self[k]), " ".join(f for f in other[k])
)
elif k not in self:
self[k] = other[k]
for flag_type in other:
if flag_type not in self:
self[flag_type] = other[flag_type]
changed = True
else:
extra_other = set(other[flag_type]) - set(self[flag_type])
if extra_other:
self[flag_type] = list(self[flag_type]) + list(
x for x in other[flag_type] if x in extra_other
)
changed = True
# Next, if any flags in other propagate, we force them to propagate in our case
shared = list(sorted(set(other[flag_type]) - extra_other))
for x, y in _shared_subset_pair_iterate(shared, sorted(self[flag_type])):
if x.propagate:
y.propagate = True
# TODO: what happens if flag groups with a partial (but not complete)
# intersection specify different behaviors for flag propagation?
# Check that the propagation values match
if self[k] == other[k]:
for i in range(len(other[k])):
if self[k][i].propagate != other[k][i].propagate:
raise UnsatisfiableCompilerFlagSpecError(
self[k][i].propagate, other[k][i].propagate
)
return changed
@staticmethod
@@ -857,7 +876,7 @@ def copy(self):
clone[name] = compiler_flag
return clone
def add_flag(self, flag_type, value, propagation):
def add_flag(self, flag_type, value, propagation, flag_group=None, source=None):
"""Stores the flag's value in CompilerFlag and adds it
to the FlagMap
@@ -868,7 +887,8 @@ def add_flag(self, flag_type, value, propagation):
propagation (bool): if ``True`` the flag value will be passed to
the package's dependencies. If ``False`` it will not be passed
"""
flag = CompilerFlag(value, propagate=propagation)
flag_group = flag_group or value
flag = CompilerFlag(value, propagate=propagation, flag_group=flag_group, source=source)
if flag_type not in self:
self[flag_type] = [flag]
@@ -1427,7 +1447,7 @@ def __init__(
# init an empty spec that matches anything.
self.name = None
self.versions = vn.VersionList(":")
self.variants = vt.VariantMap(self)
self.variants = VariantMap(self)
self.architecture = None
self.compiler = None
self.compiler_flags = FlagMap(self)
@@ -1551,7 +1571,9 @@ def _get_dependency(self, name):
raise spack.error.SpecError(err_msg.format(name, len(deps)))
return deps[0]
def edges_from_dependents(self, name=None, depflag: dt.DepFlag = dt.ALL):
def edges_from_dependents(
self, name=None, depflag: dt.DepFlag = dt.ALL
) -> List[DependencySpec]:
"""Return a list of edges connecting this node in the DAG
to parents.
@@ -1561,7 +1583,9 @@ def edges_from_dependents(self, name=None, depflag: dt.DepFlag = dt.ALL):
"""
return [d for d in self._dependents.select(parent=name, depflag=depflag)]
def edges_to_dependencies(self, name=None, depflag: dt.DepFlag = dt.ALL):
def edges_to_dependencies(
self, name=None, depflag: dt.DepFlag = dt.ALL
) -> List[DependencySpec]:
"""Return a list of edges connecting this node in the DAG
to children.
@@ -1660,8 +1684,9 @@ def _add_flag(self, name, value, propagate):
elif name in valid_flags:
assert self.compiler_flags is not None
flags_and_propagation = spack.compiler.tokenize_flags(value, propagate)
flag_group = " ".join(x for (x, y) in flags_and_propagation)
for flag, propagation in flags_and_propagation:
self.compiler_flags.add_flag(name, flag, propagation)
self.compiler_flags.add_flag(name, flag, propagation, flag_group)
else:
# FIXME:
# All other flags represent variants. 'foo=true' and 'foo=false'
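To make the grouping concrete (a sketch; tokenize_flags lives in spack.compiler):

flags_and_propagation = spack.compiler.tokenize_flags("-O2 -g", False)
# -> [("-O2", False), ("-g", False)]; the shared flag_group is "-O2 -g",
# so both resulting CompilerFlag entries remember they arrived together.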
@@ -2577,22 +2602,27 @@ def from_signed_json(stream):
return Spec.from_dict(extracted_json)
@staticmethod
def from_detection(spec_str, extra_attributes=None):
def from_detection(
spec_str: str,
*,
external_path: str,
external_modules: Optional[List[str]] = None,
extra_attributes: Optional[Dict] = None,
) -> "Spec":
"""Construct a spec from a spec string determined during external
detection and attach extra attributes to it.
Args:
spec_str (str): spec string
extra_attributes (dict): dictionary containing extra attributes
Returns:
spack.spec.Spec: external spec
spec_str: spec string
external_path: prefix of the external spec
external_modules: optional module files to be loaded when the external spec is used
extra_attributes: dictionary containing extra attributes
"""
s = Spec(spec_str)
s = Spec(spec_str, external_path=external_path, external_modules=external_modules)
extra_attributes = syaml.sorted_dict(extra_attributes or {})
# This is needed to be able to validate multi-valued variants,
# otherwise they'll still be abstract in the context of detection.
vt.substitute_abstract_variants(s)
substitute_abstract_variants(s)
s.extra_attributes = extra_attributes
return s
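Usage mirrors the detection tests later in this diff:

s = Spec.from_detection("cmake@3.17.2", external_path="/x/y2")
assert s.external and s.external_path == "/x/y2"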
@@ -2915,7 +2945,7 @@ def validate_or_raise(self):
# Ensure correctness of variants (if the spec is not virtual)
if not spec.virtual:
Spec.ensure_valid_variants(spec)
vt.substitute_abstract_variants(spec)
substitute_abstract_variants(spec)
@staticmethod
def ensure_valid_variants(spec):
@@ -3884,7 +3914,7 @@ def format_attribute(match_object: Match) -> str:
if part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute")
else:
if part == "variants" and isinstance(current, vt.VariantMap):
if part == "variants" and isinstance(current, VariantMap):
# subscript instead of getattr for variant names
current = current[part]
else:
@@ -4339,6 +4369,152 @@ def attach_git_version_lookup(self):
v.attach_lookup(spack.version.git_ref_lookup.GitRefLookup(self.fullname))
class VariantMap(lang.HashableMap):
"""Map containing variant instances. New values can be added only
if the key is not already present."""
def __init__(self, spec: Spec):
super().__init__()
self.spec = spec
def __setitem__(self, name, vspec):
# Raise a TypeError if vspec is not of the right type
if not isinstance(vspec, vt.AbstractVariant):
raise TypeError(
"VariantMap accepts only values of variant types "
f"[got {type(vspec).__name__} instead]"
)
# Raise an error if the variant was already in this map
if name in self.dict:
msg = 'Cannot specify variant "{0}" twice'.format(name)
raise vt.DuplicateVariantError(msg)
# Raise an error if name and vspec.name don't match
if name != vspec.name:
raise KeyError(
f'Inconsistent key "{name}", must be "{vspec.name}" to ' "match VariantSpec"
)
# Set the item
super().__setitem__(name, vspec)
def substitute(self, vspec):
"""Substitutes the entry under ``vspec.name`` with ``vspec``.
Args:
vspec: variant spec to be substituted
"""
if vspec.name not in self:
raise KeyError(f"cannot substitute a key that does not exist [{vspec.name}]")
# Set the item
super().__setitem__(vspec.name, vspec)
def satisfies(self, other):
return all(k in self and self[k].satisfies(other[k]) for k in other)
def intersects(self, other):
return all(self[k].intersects(other[k]) for k in other if k in self)
def constrain(self, other: "VariantMap") -> bool:
"""Add all variants in other that aren't in self to self. Also constrain all multi-valued
variants that are already present. Return True iff self changed"""
if other.spec is not None and other.spec._concrete:
for k in self:
if k not in other:
raise vt.UnsatisfiableVariantSpecError(self[k], "<absent>")
changed = False
for k in other:
if k in self:
# If they are not compatible raise an error
if not self[k].compatible(other[k]):
raise vt.UnsatisfiableVariantSpecError(self[k], other[k])
# If they are compatible merge them
changed |= self[k].constrain(other[k])
else:
# If it is not present copy it straight away
self[k] = other[k].copy()
changed = True
return changed
@property
def concrete(self):
"""Returns True if the spec is concrete in terms of variants.
Returns:
bool: True or False
"""
return self.spec._concrete or all(v in self for v in self.spec.package_class.variants)
def copy(self) -> "VariantMap":
clone = VariantMap(self.spec)
for name, variant in self.items():
clone[name] = variant.copy()
return clone
def __str__(self):
if not self:
return ""
# print keys in order
sorted_keys = sorted(self.keys())
# Separate boolean variants from key-value pairs as they print
# differently. All booleans go first to avoid ' ~foo' strings that
# break spec reuse in zsh.
bool_keys = []
kv_keys = []
for key in sorted_keys:
bool_keys.append(key) if isinstance(self[key].value, bool) else kv_keys.append(key)
# add spaces before and after key/value variants.
string = io.StringIO()
for key in bool_keys:
string.write(str(self[key]))
for key in kv_keys:
string.write(" ")
string.write(str(self[key]))
return string.getvalue()
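For instance (illustrative variants): boolean variants print first, unseparated, then key-value pairs:

str(Spec("pkg +shared ~static foo=bar").variants)  # "+shared~static foo=bar"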
def substitute_abstract_variants(spec: Spec):
"""Uses the information in `spec.package` to turn any variant that needs
it into a SingleValuedVariant.
This method is best effort. All variants that can be substituted will be
substituted before any error is raised.
Args:
spec: spec on which to operate the substitution
"""
# This method needs to be best effort so that it works in matrix exclusion
# in $spack/lib/spack/spack/spec_list.py
failed = []
for name, v in spec.variants.items():
if name == "dev_path":
spec.variants.substitute(vt.SingleValuedVariant(name, v._original_value))
continue
elif name in vt.reserved_names:
continue
elif name not in spec.package_class.variants:
failed.append(name)
continue
pkg_variant, _ = spec.package_class.variants[name]
new_variant = pkg_variant.make_variant(v._original_value)
pkg_variant.validate_or_raise(new_variant, spec.package_class)
spec.variants.substitute(new_variant)
# Raise all errors at once
if failed:
raise vt.UnknownVariantError(spec, failed)
def parse_with_version_concrete(spec_like: Union[str, Spec], compiler: bool = False):
"""Same as Spec(string), but interprets @x as @=x"""
s: Union[CompilerSpec, Spec] = CompilerSpec(spec_like) if compiler else Spec(spec_like)
@@ -4530,6 +4706,10 @@ def _load(cls, data):
return hash_dict[root_spec_hash]["node_spec"]
@classmethod
def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
raise NotImplementedError("Subclasses must implement this method.")
class SpecfileV1(SpecfileReaderBase):
@classmethod
@@ -4740,6 +4920,7 @@ def get_host_environment() -> Dict[str, Any]:
"architecture": arch_spec,
"arch_str": str(arch_spec),
"hostname": socket.gethostname(),
"full_hostname": socket.getfqdn(),
}

View File

@@ -5,6 +5,7 @@
import itertools
from typing import List
import spack.spec
import spack.variant
from spack.error import SpackError
from spack.spec import Spec
@@ -225,7 +226,7 @@ def _expand_matrix_constraints(matrix_config):
# Catch exceptions because we want to be able to operate on
# abstract specs without needing package information
try:
spack.variant.substitute_abstract_variants(test_spec)
spack.spec.substitute_abstract_variants(test_spec)
except spack.variant.UnknownVariantError:
pass

View File

@@ -173,7 +173,12 @@ def __init__(
self.hash_length = hash_length
self.upstreams = upstreams
self.lock_cfg = lock_cfg
self.db = spack.database.Database(root, upstream_dbs=upstreams, lock_cfg=lock_cfg)
self.layout = spack.directory_layout.DirectoryLayout(
root, projections=projections, hash_length=hash_length
)
self.db = spack.database.Database(
root, upstream_dbs=upstreams, lock_cfg=lock_cfg, layout=self.layout
)
timeout_format_str = (
f"{str(lock_cfg.package_timeout)}s" if lock_cfg.package_timeout else "No timeout"
@@ -187,13 +192,9 @@ def __init__(
self.root, default_timeout=lock_cfg.package_timeout
)
self.layout = spack.directory_layout.DirectoryLayout(
root, projections=projections, hash_length=hash_length
)
def reindex(self) -> None:
"""Convenience function to reindex the store DB with its own layout."""
return self.db.reindex(self.layout)
return self.db.reindex()
def __reduce__(self):
return Store, (
@@ -261,7 +262,7 @@ def restore(token):
def _construct_upstream_dbs_from_install_roots(
install_roots: List[str], _test: bool = False
install_roots: List[str],
) -> List[spack.database.Database]:
accumulated_upstream_dbs: List[spack.database.Database] = []
for install_root in reversed(install_roots):
@@ -271,7 +272,6 @@ def _construct_upstream_dbs_from_install_roots(
is_upstream=True,
upstream_dbs=upstream_dbs,
)
next_db._fail_when_missing_deps = _test
next_db._read()
accumulated_upstream_dbs.insert(0, next_db)

View File

@@ -1,66 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
""" Test ABI compatibility helpers"""
import pytest
from spack.abi import ABI
from spack.spec import Spec
@pytest.mark.parametrize(
"target,constraint,expected",
[
("foo", "bar", True),
("platform=linux", "foo", True),
("foo", "arch=linux-fedora31-x86_64", True),
("arch=linux-fedora31-skylake", "arch=linux-fedora31-skylake", True),
("arch=linux-fedora31-skylake", "arch=linux-fedora31-x86_64", False),
("platform=linux os=fedora31", "arch=linux-fedora31-x86_64", True),
("platform=linux", "arch=linux-fedora31-x86_64", True),
("platform=linux os=fedora31", "platform=linux", True),
("platform=darwin", "arch=linux-fedora31-x86_64", False),
("os=fedora31", "platform=linux", True),
],
)
def test_architecture_compatibility(target, constraint, expected):
assert ABI().architecture_compatible(Spec(target), Spec(constraint)) == expected
@pytest.mark.parametrize(
"target,constraint,loose,expected",
[
("foo", "bar", False, True),
("%gcc", "foo", False, True),
("foo", "%gcc", False, True),
("%gcc", "%gcc", False, True),
("%gcc", "%intel", False, False),
("%gcc", "%clang", False, False),
("%gcc@9.1", "%gcc@9.2", False, False), # TODO should be true ?
("%gcc@9.2.1", "%gcc@9.2.2", False, False), # TODO should be true ?
("%gcc@4.9", "%gcc@9.2", False, False),
("%clang@5", "%clang@6", False, False),
("%gcc@9.1", "%gcc@9.2", True, True),
("%gcc@9.2.1", "%gcc@9.2.2", True, True),
("%gcc@4.9", "%gcc@9.2", True, True),
("%clang@5", "%clang@6", True, True),
],
)
def test_compiler_compatibility(target, constraint, loose, expected):
assert ABI().compiler_compatible(Spec(target), Spec(constraint), loose=loose) == expected
@pytest.mark.parametrize(
"target,constraint,loose,expected",
[
("foo", "bar", False, True),
("%gcc", "platform=linux", False, True),
("%gcc@9.2.1", "%gcc@8.3.1 platform=linux", False, False),
("%gcc@9.2.1", "%gcc@8.3.1 platform=linux", True, True),
("%gcc@9.2.1 arch=linux-fedora31-skylake", "%gcc@9.2.1 platform=linux", False, True),
],
)
def test_compatibility(target, constraint, loose, expected):
assert ABI().compatible(Spec(target), Spec(constraint), loose=loose) == expected

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
import platform
import posixpath
@@ -514,6 +513,30 @@ def test_setting_dtags_based_on_config(config_setting, expected_flag, config, mo
assert dtags_to_add.value == expected_flag
def test_module_globals_available_at_setup_dependent_time(
monkeypatch, mutable_config, mock_packages, working_env
):
"""Spack built package externaltest depends on an external package
externaltool. Externaltool's setup_dependent_package needs to be able to
access globals on the dependent"""
def setup_dependent_package(module, dependent_spec):
# Make sure set_package_py_globals was already called on
# dependents
# ninja is always set by the setup context and is not None
dependent_module = dependent_spec.package.module
assert hasattr(dependent_module, "ninja")
assert dependent_module.ninja is not None
dependent_spec.package.test_attr = True
externaltool = spack.spec.Spec("externaltest").concretized()
monkeypatch.setattr(
externaltool["externaltool"].package, "setup_dependent_package", setup_dependent_package
)
spack.build_environment.setup_package(externaltool.package, False)
assert externaltool.package.test_attr
def test_build_jobs_sequential_is_sequential():
assert (
determine_number_of_jobs(
@@ -593,7 +616,7 @@ def test_setting_attributes(self, default_mock_concretization):
# We can also propagate the settings to classes in the MRO
module_wrapper.propagate_changes_to_mro()
for cls in inspect.getmro(type(s.package)):
for cls in s.package.__class__.__mro__:
current_module = cls.module
if current_module == spack.package_base:
break

View File

@@ -379,9 +379,8 @@ def test_buildcache_create_install(
def test_correct_specs_are_pushed(
things_to_install, expected, tmpdir, monkeypatch, default_mock_concretization, temporary_store
):
# Concretize dttop and add it to the temporary database (without prefixes)
spec = default_mock_concretization("dttop")
temporary_store.db.add(spec, directory_layout=None)
spec.package.do_install(fake=True)
slash_hash = f"/{spec.dag_hash()}"
class DontUpload(spack.binary_distribution.Uploader):

View File

@@ -591,14 +591,12 @@ def test_config_prefer_upstream(
"""
mock_db_root = str(tmpdir_factory.mktemp("mock_db_root"))
prepared_db = spack.database.Database(mock_db_root)
upstream_layout = gen_mock_layout("/a/")
prepared_db = spack.database.Database(mock_db_root, layout=gen_mock_layout("/a/"))
for spec in ["hdf5 +mpi", "hdf5 ~mpi", "boost+debug~icu+graph", "dependency-install", "patch"]:
dep = spack.spec.Spec(spec)
dep.concretize()
prepared_db.add(dep, upstream_layout)
prepared_db.add(dep)
downstream_db_root = str(tmpdir_factory.mktemp("mock_downstream_db_root"))
db_for_test = spack.database.Database(downstream_db_root, upstream_dbs=[prepared_db])

View File

@@ -18,8 +18,6 @@
develop = SpackCommand("develop")
env = SpackCommand("env")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
@pytest.mark.usefixtures("mutable_mock_env_path", "mock_packages", "mock_fetch", "mutable_config")
class TestDevelop:

View File

@@ -2336,103 +2336,6 @@ def test_stack_yaml_force_remove_from_matrix(tmpdir):
assert mpileaks_spec not in after_conc
def test_stack_concretize_extraneous_deps(tmpdir, mock_packages):
# FIXME: The new concretizer doesn't handle yet soft
# FIXME: constraints for stacks
# FIXME: This now works for statically-determinable invalid deps
# FIXME: But it still does not work for dynamically determined invalid deps
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- ['^zmpi', '^mpich']
specs:
- $install
"""
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
concretize()
test = ev.read("test")
for user, concrete in test.concretized_specs():
assert concrete.concrete
assert not user.concrete
if user.name == "libelf":
assert not concrete.satisfies("^mpi")
elif user.name == "mpileaks":
assert concrete.satisfies("^mpi")
def test_stack_concretize_extraneous_variants(tmpdir, mock_packages):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- ['~shared', '+shared']
specs:
- $install
"""
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
concretize()
test = ev.read("test")
for user, concrete in test.concretized_specs():
assert concrete.concrete
assert not user.concrete
if user.name == "libelf":
assert "shared" not in concrete.variants
if user.name == "mpileaks":
assert concrete.variants["shared"].value == user.variants["shared"].value
def test_stack_concretize_extraneous_variants_with_dash(tmpdir, mock_packages):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- ['shared=False', '+shared-libs']
specs:
- $install
"""
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
concretize()
ev.read("test")
# Regression test for handling of variants with dashes in them
# will fail before this point if code regresses
assert True
def test_stack_definition_extension(tmpdir):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
@@ -4301,9 +4204,6 @@ def test_env_include_mixed_views(tmp_path, mutable_mock_env_path, mutable_config
{''.join(includes)}
specs:
- mpileaks
packages:
mpileaks:
compiler: [gcc]
"""
)

View File

@@ -44,7 +44,7 @@ def test_find_external_single_package(mock_executable):
assert len(specs_by_package) == 1 and "cmake" in specs_by_package
detected_spec = specs_by_package["cmake"]
assert len(detected_spec) == 1 and detected_spec[0].spec == Spec("cmake@1.foo")
assert len(detected_spec) == 1 and detected_spec[0] == Spec("cmake@1.foo")
def test_find_external_two_instances_same_package(mock_executable):
@@ -61,10 +61,10 @@ def test_find_external_two_instances_same_package(mock_executable):
)
assert len(detected_specs) == 2
spec_to_path = {e.spec: e.prefix for e in detected_specs}
spec_to_path = {s: s.external_path for s in detected_specs}
assert spec_to_path[Spec("cmake@1.foo")] == (
spack.detection.executable_prefix(str(cmake1.parent))
)
), spec_to_path
assert spec_to_path[Spec("cmake@3.17.2")] == (
spack.detection.executable_prefix(str(cmake2.parent))
)
@@ -72,8 +72,8 @@ def test_find_external_two_instances_same_package(mock_executable):
def test_find_external_update_config(mutable_config):
entries = [
spack.detection.DetectedPackage(Spec.from_detection("cmake@1.foo"), "/x/y1/"),
spack.detection.DetectedPackage(Spec.from_detection("cmake@3.17.2"), "/x/y2/"),
Spec.from_detection("cmake@1.foo", external_path="/x/y1"),
Spec.from_detection("cmake@3.17.2", external_path="/x/y2"),
]
pkg_to_entries = {"cmake": entries}
@@ -84,8 +84,8 @@ def test_find_external_update_config(mutable_config):
cmake_cfg = pkgs_cfg["cmake"]
cmake_externals = cmake_cfg["externals"]
assert {"spec": "cmake@1.foo", "prefix": "/x/y1/"} in cmake_externals
assert {"spec": "cmake@3.17.2", "prefix": "/x/y2/"} in cmake_externals
assert {"spec": "cmake@1.foo", "prefix": "/x/y1"} in cmake_externals
assert {"spec": "cmake@3.17.2", "prefix": "/x/y2"} in cmake_externals
def test_get_executables(working_env, mock_executable):
@@ -221,21 +221,19 @@ def fail():
assert "Skipping manifest and continuing" in output
def test_find_external_merge(mutable_config, mutable_mock_repo):
"""Check that 'spack find external' doesn't overwrite an existing spec
entry in packages.yaml.
"""
def test_find_external_merge(mutable_config, mutable_mock_repo, tmp_path):
"""Checks that 'spack find external' doesn't overwrite an existing spec in packages.yaml."""
pkgs_cfg_init = {
"find-externals1": {
"externals": [{"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix/"}],
"externals": [{"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix"}],
"buildable": False,
}
}
mutable_config.update_config("packages", pkgs_cfg_init)
entries = [
spack.detection.DetectedPackage(Spec.from_detection("find-externals1@1.1"), "/x/y1/"),
spack.detection.DetectedPackage(Spec.from_detection("find-externals1@1.2"), "/x/y2/"),
Spec.from_detection("find-externals1@1.1", external_path="/x/y1"),
Spec.from_detection("find-externals1@1.2", external_path="/x/y2"),
]
pkg_to_entries = {"find-externals1": entries}
scope = spack.config.default_modify_scope("packages")
@@ -245,8 +243,8 @@ def test_find_external_merge(mutable_config, mutable_mock_repo):
pkg_cfg = pkgs_cfg["find-externals1"]
pkg_externals = pkg_cfg["externals"]
assert {"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix/"} in pkg_externals
assert {"spec": "find-externals1@1.2", "prefix": "/x/y2/"} in pkg_externals
assert {"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix"} in pkg_externals
assert {"spec": "find-externals1@1.2", "prefix": "/x/y2"} in pkg_externals
def test_list_detectable_packages(mutable_config, mutable_mock_repo):
@@ -272,7 +270,7 @@ def _determine_variants(cls, exes, version_str):
assert len(detected_specs) == 1
gcc = detected_specs[0].spec
gcc = detected_specs[0]
assert gcc.name == "gcc"
assert gcc.external_path == os.path.sep + os.path.join("opt", "gcc", "bin")

View File

@@ -334,7 +334,6 @@ def test_find_command_basic_usage(database):
assert "mpileaks" in output
@pytest.mark.not_on_windows("envirnment is not yet supported on windows")
@pytest.mark.regression("9875")
def test_find_prefix_in_env(
mutable_mock_env_path, install_mockery, mock_fetch, mock_packages, mock_archive

View File

@@ -16,8 +16,6 @@
add = spack.main.SpackCommand("add")
install = spack.main.SpackCommand("install")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
@pytest.mark.db
def test_gc_without_build_dependency(mutable_database):

View File

@@ -2,9 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
import spack.store
from spack.database import Database
from spack.main import SpackCommand
install = SpackCommand("install")
@@ -23,20 +24,31 @@ def test_reindex_basic(mock_packages, mock_archive, mock_fetch, install_mockery)
assert spack.store.STORE.db.query() == all_installed
def test_reindex_db_deleted(mock_packages, mock_archive, mock_fetch, install_mockery):
def _clear_db(tmp_path):
empty_db = Database(str(tmp_path))
with empty_db.write_transaction():
pass
shutil.rmtree(spack.store.STORE.db.database_directory)
shutil.copytree(empty_db.database_directory, spack.store.STORE.db.database_directory)
# force a re-read of the database
assert len(spack.store.STORE.db.query()) == 0
def test_reindex_db_deleted(mock_packages, mock_archive, mock_fetch, install_mockery, tmp_path):
install("libelf@0.8.13")
install("libelf@0.8.12")
all_installed = spack.store.STORE.db.query()
os.remove(spack.store.STORE.db._index_path)
_clear_db(tmp_path)
reindex()
assert spack.store.STORE.db.query() == all_installed
def test_reindex_with_deprecated_packages(
mock_packages, mock_archive, mock_fetch, install_mockery
mock_packages, mock_archive, mock_fetch, install_mockery, tmp_path
):
install("libelf@0.8.13")
install("libelf@0.8.12")
@@ -46,7 +58,8 @@ def test_reindex_with_deprecated_packages(
all_installed = spack.store.STORE.db.query(installed=any)
non_deprecated = spack.store.STORE.db.query(installed=True)
os.remove(spack.store.STORE.db._index_path)
_clear_db(tmp_path)
reindex()
assert spack.store.STORE.db.query(installed=any) == all_installed

View File

@@ -50,7 +50,6 @@ def fake_stage(pkg, mirror_only=False):
return expected_path
@pytest.mark.not_on_windows("PermissionError")
def test_stage_path(check_stage_path):
"""Verify that --path only works with single specs."""
stage("--path={0}".format(check_stage_path), "trivial-install-test-package")

View File

@@ -2,10 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
import spack.environment as ev
import spack.spec
from spack.main import SpackCommand
@@ -14,8 +10,6 @@
env = SpackCommand("env")
concretize = SpackCommand("concretize")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
def test_undevelop(tmpdir, mutable_config, mock_packages, mutable_mock_env_path):
# setup environment

View File

@@ -205,7 +205,6 @@ def _warn(*args, **kwargs):
# Note: I want to use https://docs.pytest.org/en/7.1.x/how-to/skipping.html#skip-all-test-functions-of-a-class-or-module
# the style formatter insists on separating these two lines.
@pytest.mark.not_on_windows("Envs unsupported on Windows")
class TestUninstallFromEnv:
"""Tests an installation with two environments e1 and e2, which each have
shared package installations:

View File

@@ -42,25 +42,6 @@ def test_multiple_conflicting_compiler_definitions(mutable_config):
assert cmp.f77 == "f77"
def test_get_compiler_duplicates(mutable_config, compiler_factory):
# In this case there is only one instance of the specified compiler in
# the test configuration (so it is not actually a duplicate), but the
# method behaves the same.
cnl_compiler = compiler_factory(spec="gcc@4.5.0", operating_system="CNL")
# CNL compiler has no target attribute, and this is essential to make detection pass
del cnl_compiler["compiler"]["target"]
mutable_config.set(
"compilers", [compiler_factory(spec="gcc@4.5.0", operating_system="SuSE11"), cnl_compiler]
)
cfg_file_to_duplicates = spack.compilers.get_compiler_duplicates(
"gcc@4.5.0", spack.spec.ArchSpec("cray-CNL-xeon")
)
assert len(cfg_file_to_duplicates) == 1
cfg_file, duplicates = next(iter(cfg_file_to_duplicates.items()))
assert len(duplicates) == 1
def test_compiler_flags_from_config_are_grouped():
compiler_entry = {
"spec": "intel@17.0.2",

View File

@@ -13,6 +13,7 @@
import llnl.util.lang
import spack.binary_distribution
import spack.compiler
import spack.compilers
import spack.concretize
@@ -400,14 +401,6 @@ def test_spec_flags_maintain_order(self, mutable_config, gcc11_with_flags):
s.compiler_flags[x] == ["-O0", "-g"] for x in ("cflags", "cxxflags", "fflags")
)
@pytest.mark.xfail(reason="Broken, needs to be fixed")
def test_compiler_flags_from_compiler_and_dependent(self):
client = Spec("cmake-client %clang@12.2.0 platform=test os=fe target=fe cflags==-g")
client.concretize()
cmake = client["cmake"]
for spec in [client, cmake]:
assert spec.compiler_flags["cflags"] == ["-O3", "-g"]
def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12_with_flags):
mutable_config.set("compilers", [clang12_with_flags])
# Correct arch to use test compiler that has flags
@@ -441,6 +434,13 @@ def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12
["hypre cflags='-g'", "^openblas cflags='-O3'"],
["^openblas cflags='-g'"],
),
# Setting propagation on parent and dependency -> the
# dependency propagation flags override
(
"hypre cflags=='-g' ^openblas cflags=='-O3'",
["hypre cflags='-g'", "^openblas cflags='-O3'"],
["^openblas cflags='-g'"],
),
# Propagation doesn't go across build dependencies
(
"cmake-client cflags=='-O2 -g'",
@@ -648,20 +648,6 @@ def test_external_package(self):
assert "externalprereq" not in spec
assert spec["externaltool"].compiler.satisfies("gcc")
def test_external_package_module(self):
# No tcl modules on darwin/linux machines
# and Windows does not (currently) allow for bash calls
# TODO: improved way to check for this.
platform = spack.platforms.real_host().name
if platform == "darwin" or platform == "linux" or platform == "windows":
return
spec = Spec("externalmodule")
spec.concretize()
assert spec["externalmodule"].external_modules == ["external-module"]
assert "externalprereq" not in spec
assert spec["externalmodule"].compiler.satisfies("gcc")
def test_nobuild_package(self):
"""Test that a non-buildable package raise an error if no specs
in packages.yaml are compatible with the request.
@@ -775,15 +761,15 @@ def test_regression_issue_7239(self):
s = Spec("mpileaks")
s.concretize()
assert llnl.util.lang.ObjectWrapper not in type(s).__mro__
assert llnl.util.lang.ObjectWrapper not in s.__class__.__mro__
# Spec wrapped in a build interface
build_interface = s["mpileaks"]
assert llnl.util.lang.ObjectWrapper in type(build_interface).__mro__
assert llnl.util.lang.ObjectWrapper in build_interface.__class__.__mro__
# Mimics asking the build interface from a build interface
build_interface = s["mpileaks"]["mpileaks"]
assert llnl.util.lang.ObjectWrapper in type(build_interface).__mro__
assert llnl.util.lang.ObjectWrapper in build_interface.__class__.__mro__
@pytest.mark.regression("7705")
def test_regression_issue_7705(self):
@@ -1301,7 +1287,7 @@ def mock_fn(*args, **kwargs):
return [first_spec]
if mock_db:
temporary_store.db.add(first_spec, None)
temporary_store.db.add(first_spec)
else:
monkeypatch.setattr(spack.binary_distribution, "update_cache_and_get_specs", mock_fn)
@@ -1366,7 +1352,7 @@ def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, m
def test_reuse_with_flags(self, mutable_database, mutable_config):
spack.config.set("concretizer:reuse", True)
spec = Spec("pkg-a cflags=-g cxxflags=-g").concretized()
spack.store.STORE.db.add(spec, None)
spec.package.do_install(fake=True)
testspec = Spec("pkg-a cflags=-g")
testspec.concretize()
@@ -2109,11 +2095,13 @@ def test_external_python_extension_find_dependency_from_detection(self, monkeypa
"""Test that python extensions have access to a python dependency
when python isn't otherwise in the DAG"""
python_spec = Spec("python@=detected")
prefix = os.path.sep + "fake"
python_spec = Spec.from_detection("python@=detected", external_path=prefix)
def find_fake_python(classes, path_hints):
return {"python": [spack.detection.DetectedPackage(python_spec, prefix=path_hints[0])]}
return {
"python": [Spec.from_detection("python@=detected", external_path=path_hints[0])]
}
monkeypatch.setattr(spack.detection, "by_path", find_fake_python)
external_conf = {
@@ -2128,7 +2116,8 @@ def find_fake_python(classes, path_hints):
assert "python" in spec["py-extension1"]
assert spec["python"].prefix == prefix
assert spec["python"] == python_spec
assert spec["python"].external
assert spec["python"].satisfies(python_spec)
def test_external_python_extension_find_unified_python(self):
"""Test that python extensions use the same python as other specs in unified env"""
@@ -2959,7 +2948,7 @@ def test_concretization_version_order():
result = [
v
for v, _ in sorted(
versions, key=spack.solver.asp._concretization_version_order, reverse=True
versions, key=spack.solver.asp.concretization_version_order, reverse=True
)
]
assert result == [

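The `cflags==` syntax in the hunks above is flag propagation. A minimal sketch of the distinction, assuming the mock `hypre`/`openblas` packages used by these tests:

# Hedged sketch: '==' marks a flag for propagation to dependencies,
# '=' applies it to the named node only.
from spack.spec import Spec

propagated = Spec("hypre cflags=='-g'")  # after concretization, '-g' also reaches ^openblas
local_only = Spec("hypre cflags='-g'")   # '-g' stays on hypre
# A propagated flag set directly on the dependency overrides the parent's:
override = Spec("hypre cflags=='-g' ^openblas cflags=='-O3'")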

@@ -8,8 +8,6 @@
import spack.solver.asp
import spack.spec
pytestmark = [pytest.mark.not_on_windows("Windows uses old concretizer")]
version_error_messages = [
"Cannot satisfy 'fftw@:1.0' and 'fftw@1.1:",
" required because quantum-espresso depends on fftw@:1.0",


@@ -19,8 +19,6 @@
from spack.test.conftest import create_test_repo
from spack.util.url import path_to_file_url
pytestmark = [pytest.mark.not_on_windows("Windows uses old concretizer")]
def update_packages_config(conf_str):
conf = syaml.load_config(conf_str)


@@ -27,7 +27,6 @@ def test_listing_possible_os():
assert expected_os in output
@pytest.mark.not_on_windows("test unsupported on Windows")
@pytest.mark.maybeslow
@pytest.mark.requires_executables("git")
def test_bootstrap_phase(minimal_configuration, config_dumper, capsys):


@@ -40,20 +40,21 @@
@pytest.fixture()
def upstream_and_downstream_db(tmpdir, gen_mock_layout):
mock_db_root = str(tmpdir.mkdir("mock_db_root"))
upstream_write_db = spack.database.Database(mock_db_root)
upstream_db = spack.database.Database(mock_db_root, is_upstream=True)
upstream_layout = gen_mock_layout("/a/")
upstream_write_db = spack.database.Database(mock_db_root, layout=upstream_layout)
upstream_db = spack.database.Database(mock_db_root, is_upstream=True, layout=upstream_layout)
# Generate initial DB file to avoid reindex
with open(upstream_write_db._index_path, "w") as db_file:
upstream_write_db._write_to_file(db_file)
upstream_layout = gen_mock_layout("/a/")
downstream_db_root = str(tmpdir.mkdir("mock_downstream_db_root"))
downstream_db = spack.database.Database(downstream_db_root, upstream_dbs=[upstream_db])
downstream_db = spack.database.Database(
downstream_db_root, upstream_dbs=[upstream_db], layout=gen_mock_layout("/b/")
)
with open(downstream_db._index_path, "w") as db_file:
downstream_db._write_to_file(db_file)
downstream_layout = gen_mock_layout("/b/")
yield upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout
yield upstream_write_db, upstream_db, downstream_db
@pytest.mark.parametrize(
@@ -69,14 +70,14 @@ def upstream_and_downstream_db(tmpdir, gen_mock_layout):
def test_query_by_install_tree(
install_tree, result, upstream_and_downstream_db, mock_packages, monkeypatch, config
):
up_write_db, up_db, up_layout, down_db, down_layout = upstream_and_downstream_db
up_write_db, up_db, down_db = upstream_and_downstream_db
# Set the upstream DB to contain "pkg-c" and the downstream to contain "pkg-b"
b = spack.spec.Spec("pkg-b").concretized()
c = spack.spec.Spec("pkg-c").concretized()
up_write_db.add(c, up_layout)
up_write_db.add(c)
up_db._read()
down_db.add(b, down_layout)
down_db.add(b)
specs = down_db.query(install_tree=install_tree.format(u=up_db.root, d=down_db.root))
assert [s.name for s in specs] == result
@@ -86,9 +87,7 @@ def test_spec_installed_upstream(
upstream_and_downstream_db, mock_custom_repository, config, monkeypatch
):
"""Test whether Spec.installed_upstream() works."""
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
# a known installed spec should say that it's installed
with spack.repo.use_repositories(mock_custom_repository):
@@ -96,7 +95,7 @@ def test_spec_installed_upstream(
assert not spec.installed
assert not spec.installed_upstream
upstream_write_db.add(spec, upstream_layout)
upstream_write_db.add(spec)
upstream_db._read()
monkeypatch.setattr(spack.store.STORE, "db", downstream_db)
@@ -112,9 +111,7 @@ def test_spec_installed_upstream(
@pytest.mark.usefixtures("config")
def test_installed_upstream(upstream_and_downstream_db, tmpdir):
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")
@@ -125,7 +122,7 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("w").concretized()
for dep in spec.traverse(root=False):
upstream_write_db.add(dep, upstream_layout)
upstream_write_db.add(dep)
upstream_db._read()
for dep in spec.traverse(root=False):
@@ -135,11 +132,11 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
upstream_db.get_by_hash(dep.dag_hash())
new_spec = spack.spec.Spec("w").concretized()
downstream_db.add(new_spec, downstream_layout)
downstream_db.add(new_spec)
for dep in new_spec.traverse(root=False):
upstream, record = downstream_db.query_by_spec_hash(dep.dag_hash())
assert upstream
assert record.path == upstream_layout.path_for_spec(dep)
assert record.path == upstream_db.layout.path_for_spec(dep)
upstream, record = downstream_db.query_by_spec_hash(new_spec.dag_hash())
assert not upstream
assert record.installed
@@ -148,32 +145,32 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
downstream_db._check_ref_counts()
@pytest.mark.usefixtures("config")
def test_removed_upstream_dep(upstream_and_downstream_db, tmpdir):
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
def test_removed_upstream_dep(upstream_and_downstream_db, tmpdir, capsys, config):
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("z")
builder.add_package("y", dependencies=[("z", None, None)])
with spack.repo.use_repositories(builder):
spec = spack.spec.Spec("y").concretized()
y = spack.spec.Spec("y").concretized()
z = y["z"]
upstream_write_db.add(spec["z"], upstream_layout)
# add dependency to upstream, dependents to downstream
upstream_write_db.add(z)
upstream_db._read()
downstream_db.add(y)
# remove the dependency from the upstream DB
upstream_write_db.remove(z)
upstream_db._read()
new_spec = spack.spec.Spec("y").concretized()
downstream_db.add(new_spec, downstream_layout)
upstream_write_db.remove(new_spec["z"])
upstream_db._read()
new_downstream = spack.database.Database(downstream_db.root, upstream_dbs=[upstream_db])
new_downstream._fail_when_missing_deps = True
with pytest.raises(spack.database.MissingDependenciesError):
new_downstream._read()
# then rereading the downstream DB should warn about the missing dep
downstream_db._read_from_file(downstream_db._index_path)
assert (
f"Missing dependency not in database: y/{y.dag_hash(7)} needs z"
in capsys.readouterr().err
)
@pytest.mark.usefixtures("config")
@@ -182,9 +179,7 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
DB. When a package is recorded as installed in both, the results should
refer to the downstream DB.
"""
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")
@@ -192,8 +187,8 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x").concretized()
downstream_db.add(spec, downstream_layout)
upstream_write_db.add(spec, upstream_layout)
downstream_db.add(spec)
upstream_write_db.add(spec)
upstream_db._read()
upstream, record = downstream_db.query_by_spec_hash(spec.dag_hash())
@@ -207,33 +202,22 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
try:
orig_db = spack.store.STORE.db
spack.store.STORE.db = downstream_db
assert queried_spec.prefix == downstream_layout.path_for_spec(spec)
assert queried_spec.prefix == downstream_db.layout.path_for_spec(spec)
finally:
spack.store.STORE.db = orig_db
@pytest.mark.usefixtures("config", "temporary_store")
def test_cannot_write_upstream(tmpdir, gen_mock_layout):
roots = [str(tmpdir.mkdir(x)) for x in ["a", "b"]]
layouts = [gen_mock_layout(x) for x in ["/ra/", "/rb/"]]
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")
def test_cannot_write_upstream(tmp_path, mock_packages, config):
# Instantiate the database that will be used as the upstream DB and make
# sure it has an index file
upstream_db_independent = spack.database.Database(roots[1])
with upstream_db_independent.write_transaction():
with spack.database.Database(str(tmp_path)).write_transaction():
pass
upstream_dbs = spack.store._construct_upstream_dbs_from_install_roots([roots[1]], _test=True)
# Create it as an upstream
db = spack.database.Database(str(tmp_path), is_upstream=True)
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x")
spec.concretize()
with pytest.raises(spack.database.ForbiddenLockError):
upstream_dbs[0].add(spec, layouts[1])
with pytest.raises(spack.database.ForbiddenLockError):
db.add(spack.spec.Spec("pkg-a").concretized())
@pytest.mark.usefixtures("config", "temporary_store")
@@ -248,17 +232,17 @@ def test_recursive_upstream_dbs(tmpdir, gen_mock_layout):
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x").concretized()
db_c = spack.database.Database(roots[2])
db_c.add(spec["z"], layouts[2])
db_c = spack.database.Database(roots[2], layout=layouts[2])
db_c.add(spec["z"])
db_b = spack.database.Database(roots[1], upstream_dbs=[db_c])
db_b.add(spec["y"], layouts[1])
db_b = spack.database.Database(roots[1], upstream_dbs=[db_c], layout=layouts[1])
db_b.add(spec["y"])
db_a = spack.database.Database(roots[0], upstream_dbs=[db_b, db_c])
db_a.add(spec["x"], layouts[0])
db_a = spack.database.Database(roots[0], upstream_dbs=[db_b, db_c], layout=layouts[0])
db_a.add(spec["x"])
upstream_dbs_from_scratch = spack.store._construct_upstream_dbs_from_install_roots(
[roots[1], roots[2]], _test=True
[roots[1], roots[2]]
)
db_a_from_scratch = spack.database.Database(
roots[0], upstream_dbs=upstream_dbs_from_scratch
@@ -366,7 +350,7 @@ def _check_db_sanity(database):
_check_merkleiness()
def _check_remove_and_add_package(database, spec):
def _check_remove_and_add_package(database: spack.database.Database, spec):
"""Remove a spec from the DB, then add it and make sure everything's
still ok once it is added. This checks that it was
removed, that it's back when added again, and that ref
@@ -386,7 +370,7 @@ def _check_remove_and_add_package(database, spec):
assert concrete_spec not in remaining
# add it back and make sure everything is ok.
database.add(concrete_spec, spack.store.STORE.layout)
database.add(concrete_spec)
installed = database.query()
assert concrete_spec in installed
assert installed == original
@@ -396,7 +380,7 @@ def _check_remove_and_add_package(database, spec):
database._check_ref_counts()
def _mock_install(spec):
def _mock_install(spec: str):
s = spack.spec.Spec(spec).concretized()
s.package.do_install(fake=True)
@@ -636,7 +620,7 @@ def test_080_root_ref_counts(mutable_database):
assert mutable_database.get_record("mpich").ref_count == 1
# Put the spec back
mutable_database.add(rec.spec, spack.store.STORE.layout)
mutable_database.add(rec.spec)
# record is present again
assert len(mutable_database.query("mpileaks ^mpich", installed=any)) == 1
@@ -1117,9 +1101,9 @@ def test_database_construction_doesnt_use_globals(tmpdir, config, nullify_global
def test_database_read_works_with_trailing_data(tmp_path, default_mock_concretization):
# Populate a database
root = str(tmp_path)
db = spack.database.Database(root)
db = spack.database.Database(root, layout=None)
spec = default_mock_concretization("pkg-a")
db.add(spec, directory_layout=None)
db.add(spec)
specs_in_db = db.query_local()
assert spec in specs_in_db

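The common thread in the hunks above is that `Database` now binds its directory layout at construction, so `add()` and `remove()` take only a spec. A rough sketch of the new calling convention (the root path and package name are placeholders, and a mock package repo is assumed to be active):

import spack.database
import spack.spec

db = spack.database.Database("/path/to/root", layout=None)  # layout bound here
spec = spack.spec.Spec("pkg-a").concretized()
db.add(spec)     # previously db.add(spec, directory_layout)
db.remove(spec)  # removal likewise needs no layout argument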

@@ -11,11 +11,7 @@
def test_detection_update_config(mutable_config):
# mock detected package
detected_packages = collections.defaultdict(list)
detected_packages["cmake"] = [
spack.detection.common.DetectedPackage(
spec=spack.spec.Spec("cmake@3.27.5"), prefix="/usr/bin"
)
]
detected_packages["cmake"] = [spack.spec.Spec("cmake@3.27.5", external_path="/usr/bin")]
# update config for new package
spack.detection.common.update_configuration(detected_packages)

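The same pattern in miniature: a detected external is now an ordinary `Spec` carrying `external_path`, with no `DetectedPackage` wrapper. A hedged usage sketch (version and prefix are illustrative):

import collections

import spack.detection.common
import spack.spec

detected = collections.defaultdict(list)
detected["cmake"] = [spack.spec.Spec("cmake@3.27.5", external_path="/usr/bin")]
# Record the detected external in the packages configuration
spack.detection.common.update_configuration(detected)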

@@ -860,3 +860,33 @@ def test_env_view_on_non_empty_dir_errors(tmp_path, config, mock_packages, tempo
env.install_all(fake=True)
with pytest.raises(ev.SpackEnvironmentError, match="because it is a non-empty dir"):
env.regenerate_views()
@pytest.mark.parametrize(
"matrix_line", [("^zmpi", "^mpich"), ("~shared", "+shared"), ("shared=False", "+shared-libs")]
)
@pytest.mark.regression("40791")
def test_stack_enforcement_is_strict(tmp_path, matrix_line, config, mock_packages):
"""Ensure that constraints in matrices are applied strictly after expansion, to avoid
inconsistencies between abstract user specs and concrete specs.
"""
manifest = tmp_path / "spack.yaml"
manifest.write_text(
f"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- [{", ".join(item for item in matrix_line)}]
specs:
- $install
concretizer:
unify: false
"""
)
# Here we raise different exceptions depending on whether we solve serially or not
with pytest.raises(Exception):
with ev.Environment(tmp_path) as e:
e.concretize()

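For reference, a matrix expands to the cross product of its rows, and after this change every constraint in a row must hold strictly in the concrete result. An illustrative expansion in plain Python (not spack API):

import itertools

packages = ["libelf", "mpileaks"]
row = ["^zmpi", "^mpich"]
expanded = [f"{p} {c}" for p, c in itertools.product(packages, row)]
# -> ['libelf ^zmpi', 'libelf ^mpich', 'mpileaks ^zmpi', 'mpileaks ^mpich']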

@@ -0,0 +1,333 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
import spack.build_systems.generic
import spack.config
import spack.environment as ev
import spack.error
import spack.package_base
import spack.repo
import spack.util.spack_yaml as syaml
import spack.version
from spack.spec import Spec
from spack.test.conftest import create_test_repo
"""
These tests include the following package DAGs:
Firstly, w, x, y where w and x apply cflags to y.
w
|\
x |
|/
y
Secondly, v and y, where v does not apply cflags to y; this is for testing
mixing with compiler flag propagation in the absence of compiler flags applied
by dependents.
v
|
y
Finally, a diamond DAG to check that the topological order is resolved into
a total order:
t
|\
u x
|/
y
"""
_pkgx = (
"x",
"""\
class X(Package):
version("1.1")
version("1.0")
variant("activatemultiflag", default=False)
depends_on('y cflags="-d1"', when="~activatemultiflag")
depends_on('y cflags="-d1 -d2"', when="+activatemultiflag")
""",
)
_pkgy = (
"y",
"""\
class Y(Package):
version("2.1")
version("2.0")
""",
)
_pkgw = (
"w",
"""\
class W(Package):
version("3.1")
version("3.0")
variant("moveflaglater", default=False)
depends_on('x +activatemultiflag')
depends_on('y cflags="-d0"', when="~moveflaglater")
depends_on('y cflags="-d3"', when="+moveflaglater")
""",
)
_pkgv = (
"v",
"""\
class V(Package):
version("4.1")
version("4.0")
depends_on("y")
""",
)
_pkgt = (
"t",
"""\
class T(Package):
version("5.0")
depends_on("u")
depends_on("x+activatemultiflag")
depends_on("y cflags='-c1 -c2'")
""",
)
_pkgu = (
"u",
"""\
class U(Package):
version("6.0")
depends_on("y cflags='-e1 -e2'")
""",
)
@pytest.fixture
def _create_test_repo(tmpdir, mutable_config):
yield create_test_repo(tmpdir, [_pkgt, _pkgu, _pkgv, _pkgw, _pkgx, _pkgy])
@pytest.fixture
def test_repo(_create_test_repo, monkeypatch, mock_stage):
with spack.repo.use_repositories(_create_test_repo) as mock_repo_path:
yield mock_repo_path
def update_concretize_scope(conf_str, section):
conf = syaml.load_config(conf_str)
spack.config.set(section, conf[section], scope="concretize")
def test_mix_spec_and_requirements(concretize_scope, test_repo):
conf_str = """\
packages:
y:
require: cflags="-c"
"""
update_concretize_scope(conf_str, "packages")
s1 = Spec('y cflags="-a"').concretized()
assert s1.satisfies('cflags="-a -c"')
def test_mix_spec_and_dependent(concretize_scope, test_repo):
s1 = Spec('x ^y cflags="-a"').concretized()
assert s1["y"].satisfies('cflags="-a -d1"')
def _compiler_cfg_one_entry_with_cflags(cflags):
return f"""\
compilers::
- compiler:
spec: gcc@12.100.100
paths:
cc: /usr/bin/fake-gcc
cxx: /usr/bin/fake-g++
f77: null
fc: null
flags:
cflags: {cflags}
operating_system: debian6
modules: []
"""
def test_mix_spec_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
update_concretize_scope(conf_str, "compilers")
s1 = Spec('y %gcc@12.100.100 cflags="-O2"').concretized()
assert s1.satisfies('cflags="-Wall -O2"')
@pytest.mark.parametrize(
"cmd_flags,req_flags,cmp_flags,dflags,expected_order",
[
("-a -b", "-c", None, False, "-c -a -b"),
("-x7 -x4", "-x5 -x6", None, False, "-x5 -x6 -x7 -x4"),
("-x7 -x4", "-x5 -x6", "-x3 -x8", False, "-x3 -x8 -x5 -x6 -x7 -x4"),
("-x7 -x4", "-x5 -x6", "-x3 -x8", True, "-x3 -x8 -d1 -d2 -x5 -x6 -x7 -x4"),
("-x7 -x4", None, "-x3 -x8", False, "-x3 -x8 -x7 -x4"),
("-x7 -x4", None, "-x3 -x8", True, "-x3 -x8 -d1 -d2 -x7 -x4"),
# The remaining tests cover cases of intersection
("-a -b", "-a -c", None, False, "-c -a -b"),
("-a -b", None, "-a -c", False, "-c -a -b"),
("-a -b", "-a -c", "-a -d", False, "-d -c -a -b"),
("-a -d2 -d1", "-d2 -c", "-d1 -b", True, "-b -c -a -d2 -d1"),
("-a", "-d0 -d2 -c", "-d1 -b", True, "-b -d1 -d0 -d2 -c -a"),
],
)
def test_flag_order_and_grouping(
concretize_scope, test_repo, cmd_flags, req_flags, cmp_flags, dflags, expected_order
):
"""Check consistent flag ordering and grouping on a package "y"
with flags introduced from a variety of sources.
The ordering rules are explained in ``asp.SpecBuilder.reorder_flags``.
"""
if req_flags:
conf_str = f"""\
packages:
y:
require: cflags="{req_flags}"
"""
update_concretize_scope(conf_str, "packages")
if cmp_flags:
conf_str = _compiler_cfg_one_entry_with_cflags(cmp_flags)
update_concretize_scope(conf_str, "compilers")
compiler_spec = ""
if cmp_flags:
compiler_spec = "%gcc@12.100.100"
if dflags:
spec_str = f"x+activatemultiflag {compiler_spec} ^y"
expected_dflags = "-d1 -d2"
else:
spec_str = f"y {compiler_spec}"
expected_dflags = None
if cmd_flags:
spec_str += f' cflags="{cmd_flags}"'
root_spec = Spec(spec_str).concretized()
spec = root_spec["y"]
satisfy_flags = " ".join(x for x in [cmd_flags, req_flags, cmp_flags, expected_dflags] if x)
assert spec.satisfies(f'cflags="{satisfy_flags}"')
assert spec.compiler_flags["cflags"] == expected_order.split()
def test_two_dependents_flag_mixing(concretize_scope, test_repo):
root_spec1 = Spec("w~moveflaglater").concretized()
spec1 = root_spec1["y"]
assert spec1.compiler_flags["cflags"] == "-d0 -d1 -d2".split()
root_spec2 = Spec("w+moveflaglater").concretized()
spec2 = root_spec2["y"]
assert spec2.compiler_flags["cflags"] == "-d3 -d1 -d2".split()
def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-f2")
update_concretize_scope(conf_str, "compilers")
root_spec = Spec("v %gcc@12.100.100 cflags=='-f1'").concretized()
assert root_spec["y"].satisfies("cflags='-f1 -f2'")
# Note: setting flags on a dependency overrides propagation, which
# is tested in test/concretize.py:test_compiler_flag_propagation
def test_propagate_and_pkg_dep(concretize_scope, test_repo):
root_spec1 = Spec("x ~activatemultiflag cflags=='-f1'").concretized()
assert root_spec1["y"].satisfies("cflags='-f1 -d1'")
def test_propagate_and_require(concretize_scope, test_repo):
conf_str = """\
packages:
y:
require: cflags="-f2"
"""
update_concretize_scope(conf_str, "packages")
root_spec1 = Spec("v cflags=='-f1'").concretized()
assert root_spec1["y"].satisfies("cflags='-f1 -f2'")
# Next, check that a requirement does not "undo" a request for
# propagation from the command-line spec
conf_str = """\
packages:
v:
require: cflags="-f1"
"""
update_concretize_scope(conf_str, "packages")
root_spec2 = Spec("v cflags=='-f1'").concretized()
assert root_spec2["y"].satisfies("cflags='-f1'")
# Note: requirements cannot enforce propagation: any attempt to do
# so will generate a concretization error; this likely relates to
# the note about #37180 in concretize.lp
def test_dev_mix_flags(tmp_path, concretize_scope, mutable_mock_env_path, test_repo):
src_dir = tmp_path / "x-src"
env_content = f"""\
spack:
specs:
- y %gcc@12.100.100 cflags=='-fsanitize=address'
develop:
y:
spec: y cflags=='-fsanitize=address'
path: {src_dir}
"""
conf_str = _compiler_cfg_one_entry_with_cflags("-f1")
update_concretize_scope(conf_str, "compilers")
manifest_file = tmp_path / ev.manifest_name
manifest_file.write_text(env_content)
e = ev.create("test", manifest_file)
with e:
e.concretize()
e.write()
(result,) = list(j for i, j in e.concretized_specs() if j.name == "y")
assert result["y"].satisfies("cflags='-fsanitize=address -f1'")
def test_diamond_dep_flag_mixing(concretize_scope, test_repo):
"""A diamond where each dependent applies flags to the bottom
dependency. The goal is to ensure that the flag ordering is
(a) topological and (b) repeatable for elements not subject to
this partial ordering (i.e. the flags for the left and right
nodes of the diamond always appear in the same order).
`Spec.traverse` is responsible for handling both of these needs.
"""
root_spec1 = Spec("t").concretized()
spec1 = root_spec1["y"]
assert spec1.satisfies('cflags="-c1 -c2 -d1 -d2 -e1 -e2"')
assert spec1.compiler_flags["cflags"] == "-c1 -c2 -e1 -e2 -d1 -d2".split()
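As a worked example of the ordering these tests pin down, distilled from the parametrized case with all four flag sources active: compiler-config flags come first, then flags applied by dependents, then requirements, then the command line.

# Ordering only; not spack API.
cmp_flags = ["-x3", "-x8"]   # from the compilers.yaml entry
dep_flags = ["-d1", "-d2"]   # applied by the dependent x+activatemultiflag
req_flags = ["-x5", "-x6"]   # from a packages.yaml require:
cmd_flags = ["-x7", "-x4"]   # from the command-line spec
assert cmp_flags + dep_flags + req_flags + cmd_flags == (
    "-x3 -x8 -d1 -d2 -x5 -x6 -x7 -x4".split()
)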

Some files were not shown because too many files have changed in this diff.